The FBI has recently issued a warning about the growing use of AI-generated deepfakes in sextortion schemes. Deepfakes are synthetic images or videos, created with AI, that depict real people doing things they never actually did, including engaging in sex. These scams have been on the rise as advances in artificial intelligence make it easier to produce convincingly realistic fabricated images and videos.
In these schemes, perpetrators typically target victims with an email containing a link to what appears to be a video of them engaging in sexual activity. The victim is then threatened with public humiliation unless they pay money or hand over sensitive information, such as credit card numbers or passwords.
It is important for everyone using digital media platforms, particularly those who may be vulnerable targets, to understand how easily malicious actors can manipulate photos and videos using AI deepfake technology. Users should also take precautions when interacting online, such as treating unsolicited emails from unknown sources that request personal information or money transfers with suspicion. If you ever receive such a demand, contact law enforcement immediately so they can investigate.
Read more at Ars Technica