Doctored photos can easily create false memories. What happens when there’s fake video?
Seeing is believing. And because of this fact, we’re screwed.
Due to advances in artificial intelligence, it’s now possible to convincingly map anyone’s face onto the body of another person in a video. As Vox’s Aja Romano has explained, this technique is becoming more common in pornography: A celebrity’s face can be mapped onto a porn actress’s body. These “deepfakes” can be generated with free software, and they’re different from the photoshopping of the past. This is live action — and uncannily real.
On Tuesday, BuzzFeed published a demonstration featuring the actor and director Jordan Peele. Using FakeApp, the same tool used in the celebrity face-swapping porn, BuzzFeed took an old video of President Obama and swapped in Peele’s mouth as he performed an impression of Obama. It’s a creepily powerful PSA with a forceful message: “This is a dangerous time. Moving forward, we need to be more vigilant with what we trust from the internet.”
Combine fake audio with fake video and it’s not hard to imagine a future where forged videos are maddeningly hard to distinguish from the truth. Or a future where a fake video of a president incites a riot or tanks the market. “We’re not so far from the collapse of reality,” as Franklin Foer recently summed up at the Atlantic.
But I fear it’s not just our present and future reality that could collapse; it’s also our past. Fake media could manipulate what we remember, effectively altering the past by seeding the population with false memories.
“The potential for abuse is so severe,” says Elizabeth Loftus at the University of California, Irvine, who pioneered much of the research in false memory formation in the 1990s. “Once you expose people to such a powerful visual presentation, how do they get it out of their minds?”
We don’t have psychological studies directly looking at the ability of AI-faked video to implant false memories. But researchers have been studying the malleability of our memories for decades.
Here’s what they know: The human mind is incredibly susceptible to forming false memories. And that tendency can be kicked into overdrive on the internet, where false ideas spread like viruses among like-minded people. Which means the AI-enhanced forgeries on the horizon will only make planting false memories even easier.
Click the link below for more:
https://www.vox.com/science-and-health/2018/4/20/17109764/deepfake-ai-false-memory-psychology
*************
Does life imitate the media, or does the media imitate life?
.
.
.
This “fakeness” has been going on far longer than most people realize.