Deepfakes — digital forgeries produced by artificial intelligence (AI) — have blurred the line between reality and illusion. On the upside, AI-generated deepfakes have revolutionized special effects in motion pictures and made certain processes in education and health care more effective. Yet deepfakes also carry plenty of risks.
Deepfakes purporting to represent public officials can disseminate disinformation and generate fake news stories. And if fraudsters use deepfake images of a company's owner or senior executives, they can more easily perpetrate phishing schemes and steal sensitive data.
The threat extends beyond visible manipulation to audio. Deepfakes can mimic a specific individual's voice to commit theft. For example, a so-called "business partner" might leave a voicemail instructing someone in your accounting department to wire funds to an overseas account.
AI-based detection tools can help reveal deepfakes by identifying unusual facial movements, unnatural body posture and lighting inconsistencies. Yet this technology is still in its infancy and far from perfect.
Alternative solutions, such as watermarking, show promise. However, watermarking is relatively easy to bypass and has yet to gain widespread acceptance. A small but growing body of law regulates the use of deepfakes, but it does little to prevent their creation; it generally punishes creators only when (and if) they're caught using deepfakes to commit illegal acts.
Recognizing the red flags of deepfake content is vital. You and your employees should be wary of video or audio exhibiting telltale signs of manipulation, such as unusual facial movements, unnatural body posture and lighting inconsistencies.
Until technology makes it easier to uncover deepfakes, exercising a healthy skepticism is the best way to avoid being conned. Before you treat a video or audio file as legitimate, corroborate it with multiple sources. And if employees receive an unusual request via voicemail or video from a supposed manager, they should verify it by phone or by talking to the individual in person.
Contact us with questions and for help training your workers to fight malicious deepfakes and other fraud schemes.