'Deepfake' videos emerge as new weapon threatening security

You may want to think again about the saying that cameras never lie.
The emergence of "deepfakes," or authentic-looking fake videos, makes it harder for people to believe what they see.
Our Kim Ji-yeon tells us more.



We may have entered a world where it's going to be harder to believe what we see.
New technology makes it possible to create realistic videos of people appearing to say things they've never actually said... or to do things they've never actually done.
The videos are known as "deepfakes."
The name comes from the process of deep learning, a form of artificial intelligence.
They are created with face-mapping technology by feeding a computer a set of instructions along with large numbers of images and audio recordings of the target.
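
For readers curious how this works under the hood, here is a minimal, hypothetical Python/PyTorch sketch of the shared-encoder, per-identity-decoder autoencoder commonly associated with face-swap deepfakes. The network sizes, training loop, and random placeholder "face" tensors are illustrative assumptions, not details from the report.

    # Sketch of a face-swap autoencoder: one shared encoder, one decoder per person.
    # All sizes are placeholders; the random tensors stand in for cropped face images.
    import torch
    import torch.nn as nn

    class Encoder(nn.Module):
        """Compresses a 3x64x64 face crop into a shared latent code."""
        def __init__(self, latent_dim=256):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),   # 64 -> 32
                nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
                nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(), # 16 -> 8
                nn.Flatten(),
                nn.Linear(128 * 8 * 8, latent_dim),
            )
        def forward(self, x):
            return self.net(x)

    class Decoder(nn.Module):
        """Reconstructs a face for ONE identity from the shared latent code."""
        def __init__(self, latent_dim=256):
            super().__init__()
            self.fc = nn.Linear(latent_dim, 128 * 8 * 8)
            self.net = nn.Sequential(
                nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),  # 8 -> 16
                nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),   # 16 -> 32
                nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(), # 32 -> 64
            )
        def forward(self, z):
            return self.net(self.fc(z).view(-1, 128, 8, 8))

    encoder = Encoder()
    decoder_a, decoder_b = Decoder(), Decoder()   # one decoder per person
    optimizer = torch.optim.Adam(
        list(encoder.parameters()) + list(decoder_a.parameters()) + list(decoder_b.parameters()),
        lr=1e-4,
    )
    loss_fn = nn.L1Loss()

    # Placeholder batches standing in for aligned face crops of person A and person B.
    faces_a = torch.rand(8, 3, 64, 64)
    faces_b = torch.rand(8, 3, 64, 64)

    for step in range(100):  # real training would run far longer on real data
        optimizer.zero_grad()
        loss = (loss_fn(decoder_a(encoder(faces_a)), faces_a)
                + loss_fn(decoder_b(encoder(faces_b)), faces_b))
        loss.backward()
        optimizer.step()

    # The "swap": encode person A's expression, then decode it with person B's decoder.
    with torch.no_grad():
        swapped = decoder_b(encoder(faces_a))

Because both decoders learn from the same latent space, the second decoder renders person B's face wearing person A's expression and pose, which is what makes the resulting footage so convincing.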

The software has already been used to create fake celebrity pornography.
By the time a targeted public figure can show that a video is not real, it may already have been widely distributed.
And the emergence of deepfakes may not only tarnish someone's reputation.
Intelligence officials warn they could even be used to threaten national and global security or interfere in elections.
The issue got attention earlier this year when U.S.-based website BuzzFeed published a deepfake political video that showed former U.S. President Barack Obama criticizing President Donald Trump.
At delicate times when diplomacy is key to maintaining global peace and stability, imagine what a fake video could do if it appeared to show a U.S. leader or an official from North Korea or Iran threatening the other side with an impending attack... it could create chaos and instability.

In light of this, efforts are being made to identify and remove deepfakes.
AI is being trained to spot them by counting how often the people depicted blink, since subjects in fake videos often do not blink the way people in authentic footage do.
But in the long run, forgers may be able to create more realistic blinking in their fake videos, so experts say other types of physiological signals should also be considered to detect deepfakes in the future.
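
As a purely illustrative sketch, and not a description of any specific detector mentioned in the report, blink counting can be approximated with the eye aspect ratio (EAR) computed from six eye landmarks in each frame. The 0.2 threshold, the synthetic landmark points and the placeholder EAR sequence below are all assumptions; in practice the landmarks would come from a face-landmark detector run on every video frame.

    # Blink counting via the Eye Aspect Ratio (EAR): the ratio collapses toward
    # zero when the eyelid closes, so a blink shows up as a short run of low values.
    import numpy as np

    def eye_aspect_ratio(eye):
        """eye: six (x, y) landmarks ordered around the eye contour."""
        eye = np.asarray(eye, dtype=float)
        vertical = np.linalg.norm(eye[1] - eye[5]) + np.linalg.norm(eye[2] - eye[4])
        horizontal = np.linalg.norm(eye[0] - eye[3])
        return vertical / (2.0 * horizontal)

    def count_blinks(ear_per_frame, closed_threshold=0.2, min_closed_frames=2):
        """Count blinks as runs of consecutive low-EAR frames."""
        blinks, run = 0, 0
        for ear in ear_per_frame:
            if ear < closed_threshold:
                run += 1
            else:
                if run >= min_closed_frames:
                    blinks += 1
                run = 0
        if run >= min_closed_frames:
            blinks += 1
        return blinks

    # Synthetic wide-open eye: EAR well above the closed threshold.
    open_eye = [(0, 0), (1, -1), (2, -1), (3, 0), (2, 1), (1, 1)]
    print(round(eye_aspect_ratio(open_eye), 2))  # ~0.67

    # Synthetic per-frame EAR values: mostly open eyes with two brief closures.
    ears = [0.31] * 30 + [0.12, 0.10, 0.11] + [0.30] * 40 + [0.13, 0.09] + [0.32] * 20
    print(count_blinks(ears))  # -> 2

A suspiciously low blink count over a stretch of footage is one signal, among the physiological cues the experts mention, that a clip may be synthetic.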
Kim Ji-yeon, Arirang News.
