
OpenAI’s Sora terrifies us and many others

  • OpenAI has revealed its text-to-video generation tool, Sora.
  • The tool isn’t publicly available as OpenAI is said to still be tweaking Sora as well as working to ensure it isn’t misused.
  • Sora’s announcement has been met with concern and vitriol.

The development of the large generative models that power artificial intelligence applications such as ChatGPT and Midjourney has been rapid. This is understandable: the compute demands of this technology are massive, so companies that develop AI solutions need to start filling their coffers as soon as they can, lest a data center bill bankrupt them.

The pace of this development is aided somewhat by the fact that regulation in the AI sphere is non-existent, despite the pitfalls of these tools and the potential harm they pose to the public good. We’ve already seen scams that use the voices of celebrities and influencers to lure people into handing over their data, and it’s about to get a lot worse.

On Thursday evening, OpenAI announced Sora, a text-to-video model that “can create videos of up to 60 seconds featuring highly detailed scenes, complex camera motion, and multiple characters with vibrant emotions.”

The response to this announcement, as can be seen in the replies to OpenAI’s post on X (formerly Twitter), has been a mix of horror and alarm.

“I don’t think y’all realize how many artists you’re fu*king over right now,” one user responded.

“This is terrifying and going to steal jobs, not to mention the amount of terrible things this could be used for,” another added.

A few people have praised Sora for how well it performs, but that performance is exactly why so many others are worried about the tool. For one, this sort of short-form video is a staple of stock footage libraries, which are populated by filmmakers who would no longer be needed if Sora becomes good enough.

We must admit that, even with the obvious artifacts and oddness in the images, the content generated by Sora is impressive. But it is precisely because it’s so impressive that the people who make this sort of content are worried. Will Sora be used to make a full movie? Perhaps not immediately, but for the local videographer who creates corporate videos or sells stock footage online, Sora feels very much like the end of that income stream.

One of the other major concerns highlighted by some is that Sora could be put to malicious use. OpenAI says it’s working with “red teamers” to assess critical areas for harm and risk. It’s also said to be working with artists, designers, and filmmakers to gather feedback, not on whether this is a good idea, but on how to improve Sora.

OpenAI does say it’s applying the same guardrails to Sora that it did with DALL-E 3.

“For example, once in an OpenAI product, our text classifier will check and reject text input prompts that are in violation of our usage policies, like those that request extreme violence, sexual content, hateful imagery, celebrity likeness, or the IP of others. We’ve also developed robust image classifiers that are used to review the frames of every video generated to help ensure that it adheres to our usage policies, before it’s shown to the user,” the firm wrote in a blog post.

“We’ll be engaging policymakers, educators and artists around the world to understand their concerns and to identify positive use cases for this new technology. Despite extensive research and testing, we cannot predict all of the beneficial ways people will use our technology, nor all the ways people will abuse it. That’s why we believe that learning from real-world use is a critical component of creating and releasing increasingly safe AI systems over time,” OpenAI added.

This is of little comfort, though, as guardrails can be easily bypassed in most LLMs and generative AI tools. There’s also no telling how this technology will be used once it’s inevitably picked apart and remade by other firms and bad actors, as other tools have been.

Perhaps more concerning is that OpenAI has announced this tool in a year when as many as 90 countries will be hosting elections. While the firm says it’s building the ability to detect videos generated by Sora, how well this will work remains to be seen.

The good news is that Sora isn’t publicly available just yet, though how long it will remain in development is unclear.
