Deepfakes have gotten dangerously close to reality, to the point where the FBI is even warning companies about people using the technology to swindle their way into job interviews.
If you’re not convinced, just take a look at this Christmas video by Britain’s Channel 4 from 2020, which depicts the Queen making a speech before breaking out into a dance for her adoring public. The channel released the clip to warn about the dangers of fake news.
In the video below, Wael AbdAlmageed, Research Director at USC's Information Sciences Institute, scrutinized some of the most viral deepfaked videos. He also divulged some giveaways, deepfake body language cues if you will, that indicate footage has been manipulated with new faces, expressions, or audio.
AbdAlmageed, who's also the Director of the Visual Intelligence and Multimedia Analysis Lab (VIMAL), has been working with his team on technology to detect AI fakery in videos. Along the way, they've noted a few telltale signs of deepfakes that people can spot with the naked eye.
Most notably, the deepfake expert recommended playing videos at a slower speed to check whether a person's lips are in sync with the audio.
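If you'd rather not fiddle with a player's speed controls, here's a minimal sketch of how you might prepare such a check yourself, assuming ffmpeg is installed and the clip is saved as clip.mp4 (both are assumptions, not part of the expert's toolkit). It writes a half-speed copy with the audio slowed to match, so the lips can be compared against the speech:

```python
# Minimal sketch: produce a half-speed copy of a clip, keeping video and audio
# aligned, so lip movements can be checked against the slowed-down speech.
# Assumes ffmpeg is installed and "clip.mp4" exists (placeholder filename).
import subprocess

subprocess.run([
    "ffmpeg", "-i", "clip.mp4",
    "-filter_complex",
    "[0:v]setpts=2.0*PTS[v];[0:a]atempo=0.5[a]",  # double video duration, halve audio tempo
    "-map", "[v]", "-map", "[a]",
    "clip_half_speed.mp4",
], check=True)
```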
In addition, a blank look in the forehead and eye region that doesn't change from one frame to the next is an indicator that the face has been superimposed onto another.
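For the curious, that cue can also be roughed out in code. Below is a minimal sketch using OpenCV, with the clip name and the forehead/eye rectangle as placeholder assumptions, that measures how much the region actually changes from frame to frame; an unnaturally flat value would echo the sign described above:

```python
# Minimal sketch: measure frame-to-frame change inside a forehead/eye box.
# A region whose pixels barely change across frames can be a sign of a
# superimposed face. "clip.mp4" and the rectangle are placeholder assumptions.
import cv2
import numpy as np

x, y, w, h = 200, 80, 160, 90  # hypothetical forehead/eye rectangle

cap = cv2.VideoCapture("clip.mp4")
prev = None
diffs = []

while True:
    ok, frame = cap.read()
    if not ok:
        break
    roi = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2GRAY)
    if prev is not None:
        # mean absolute per-pixel difference between consecutive frames
        diffs.append(np.mean(cv2.absdiff(roi, prev)))
    prev = roi

cap.release()
print(f"average frame-to-frame change in region: {np.mean(diffs):.2f}")
```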
You could also look at the space between the face and the neck, as a blurry patch there might mean the creator struggled to blend two people's faces together.
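A similarly hedged sketch for the blending cue: compare local sharpness around the jaw/neck boundary against the rest of the face using a standard variance-of-Laplacian blur measure. The file name and region coordinates below are placeholders, not the researcher's method; a noticeably lower value near the boundary would be consistent with the blurred seam described above:

```python
# Minimal sketch: compare sharpness of a face region against the face/neck
# boundary in a single extracted frame. Filenames and regions are placeholders.
import cv2

frame = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)  # one frame pulled from the clip

def sharpness(region):
    # variance of the Laplacian: higher = sharper, lower = blurrier
    return cv2.Laplacian(region, cv2.CV_64F).var()

face = frame[80:240, 200:360]       # hypothetical face region
boundary = frame[240:300, 200:360]  # hypothetical face/neck boundary region

print("face sharpness:", sharpness(face))
print("boundary sharpness:", sharpness(boundary))
```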
AbdAlmageed, however, commended Kendrick Lamar's The Heart Part 5 music video, in which the rapper's face was realistically swapped with those of OJ Simpson, Will Smith, and Kanye West. The result was so uncanny that his algorithm sometimes failed to pick up any edits, though he also attributed these shortcomings to the lack of representation of people of color in AI databases and research.
While AI detectors aren’t perfect at spotting deepfakes yet—the researcher estimates their accuracy to be about 60%—they’ll still be significant players in the fight to end misinformation, AbdAlmageed noted.
Also, here’s a tidbit: Those viral videos of Elon Musk’s Chinese “lookalike,” Yi Long Ma, might have been deepfakes all along.
Watch the video below for other clues of footage being deepfaked.
[via DesignTAXI: http://www.designtaxi.com/news/419308/Watch-Deepfake-Expert-Shares-How-You-Can-Mostly-Tell-If-A-Video-Is-Bogus/]