The creator of ChatGPT recently unveiled its next venture in generative artificial intelligence.
OpenAI, headquartered in San Francisco, introduced a new text-to-video tool named Sora, designed to swiftly generate brief videos based on written instructions known as prompts.
Sora is not the first technology of its kind; companies such as Google, Meta, and Runway ML have developed similar text-to-video tools.
The quality of OpenAI’s videos astounded observers, particularly after CEO Sam Altman invited social media users to suggest prompt ideas. However, the resulting videos also raised concerns about possible ethical and societal effects.
A photographer from New Hampshire suggested a prompt on X describing the preparation of gnocchi, an Italian dish, in an old countryside kitchen.
The prompt read:
“An instructional session on preparing homemade gnocchi led by a grandmotherly social media influencer in a quaint Tuscan rural kitchen illuminated with cinematic lighting.”
Shortly after, Altman responded with a lifelike video that brought to life the details described in the prompt.
FILE – This illustration shows a video generated by OpenAI’s newly introduced text-to-video tool “Sora” playing on a screen in Washington, D.C., on February 16, 2024.
The tool is not yet available to the public, and OpenAI has released limited information about how it was developed. The company also has not said which image and video sources were used to train Sora.
The New York Times and some writers have taken legal action against OpenAI for using copyrighted written works to train ChatGPT. OpenAI, by contrast, pays The Associated Press to license its collection of news stories.
In a blog post, OpenAI mentioned ongoing consultations with artists, policymakers, and other stakeholders before the official release of the new tool.
The company highlighted its collaboration with “red teamers” – individuals tasked with identifying issues and offering constructive feedback – in refining Sora.
“We are collaborating with red teamers — … experts in domains like misinformation, offensive content, and prejudice — who will rigorously evaluate the model,” the company stated.
“We are also developing tools to identify deceptive content, including a detection algorithm capable of discerning videos generated by Sora.”
I’m John Russell.
John Russell adapted this report from the Associated Press.
Words in This Story
tool – n. an instrument for performing a task
quaint — adj. attractively unusual or old-fashioned
prompt – n. a cue that directs a computer to execute an action or provide data
collection – n. a group of things, such as stories or documents, gathered and kept together
algorithm – n. a set of rules for solving a problem or performing a task