OpenAI Unveils AI Text-to-Video but Could ‘Sora’ Be Used for Crypto Scams?


Artificial intelligence firm OpenAI has unveiled its latest product, a text-to-video model called Sora. However, advances in AI have repeatedly given scammers new fuel, and with crypto markets booming, the technology could be put to use to defraud people.

According to the February 15 announcement, Sora can create videos of up to 60 seconds featuring highly detailed scenes, complex camera motion, and multiple characters with vibrant emotions. 

The firm stated that it would be “taking several important safety steps” ahead of making Sora available in its products.

“We are working with red teamers — domain experts in areas like misinformation, hateful content, and bias — who are adversarially testing the model.”

It showcased several videos created from text prompts that briefly describe the desired scene.

Screenshot from Sora-generated video. Source: OpenAI

The model understands not only what the user has asked for in the prompt but also how those things exist in the physical world, the firm stated.

OpenAI also said that the model has a deep understanding of language. This enables it to accurately interpret prompts and,

“Generate compelling characters that express vibrant emotions.”

However, the firm cautioned:

“Despite extensive research and testing, we cannot predict all of the beneficial ways people will use our technology, nor all the ways people will abuse it.”

Read more: DALL-E 2 Review: Everything You Need To Know

The new technology is also likely to be abused by crypto scammers. It could allow malicious actors to create deepfakes impersonating real people and companies to promote fraudulent crypto projects.

Scammers could also use Sora to generate fake celebrity endorsements of web3 and crypto projects. The synthetic videos already showcased were very convincing. “Wow. This is truly amazing!” podcaster Lex Fridman told his 3.3 million X followers.

Moreover, artificial intelligence-generated text-to-video could automate the creation of large volumes of scam videos at low cost. These could be used to promote crypto pump-and-dump schemes across social media platforms, which already do very little to prevent scams. 

Bad actors could leverage the power of text-to-video AI, such as Sora, to cheaply create manipulative content. This could then be optimized to go viral across social platforms and enable crypto scams to quickly scale if left unchecked. 

Latest AI News 

The artificial intelligence sector has had a busy week beyond the groundbreaking Sora announcement. OpenAI also gave ChatGPT a ‘memory,’ enabling it to remember details from previous conversations so that users do not have to repeat themselves.

Google revealed an upgrade to its AI model, Gemini. Version 1.5 features an innovative 1 million token context window. This enables the large language model to process considerably more data than its competitors.

Read more: ChatGPT Tutorial: How To Use ChatGPT by OpenAI

Moreover, Meta has released V-JEPA. This new learning model enables AI to understand and predict what is going on in a video, even with limited information. 

Slack has also integrated new generative AI features into the workplace platform. These include enhanced search, channel recaps, thread summaries, and more.

Disclaimer

In adherence to the Trust Project guidelines, BeInCrypto is committed to unbiased, transparent reporting. This news article aims to provide accurate, timely information. However, readers are advised to verify facts independently and consult with a professional before making any decisions based on this content. Please note that our Terms and Conditions, Privacy Policy, and Disclaimers have been updated.


