While doing a reverse image search, Noelle Martin found a pornographic video of herself.
She didn’t make the video. In fact, it wasn’t even real. It was a “deepfake”, a form of AI-powered video manipulation that—Martin later discovered—was put together by a complete stranger who stole her pictures and selfies from her social media accounts.
“My stomach sank,” she told Earshot, ABC Radio’s flagship programme, in a 2019 interview where she recounted the moment she came across the video. “I felt sick.”
Helpless, she watched the video proliferate widely online, accompanied by sexually explicit commentary. “I remember crying on the phone to random people, just being like ‘this is happening to me. Is there something that can be done?’” she said. She even emailed the webmasters of every site where her images appeared.
When she took her case to the police, they told her there was nothing they could do.
Martin was 18 when it happened.
A 2019 report by DeepTrace Labs found that 96 percent of deepfakes circulating online are pornography, mainly featuring women and garnering well over 134 million views. The remaining four percent include uses of synthetic media for positive educational, entertainment and academic purposes, as this site does. The majority of pornographic deepfakes employ face-swapping technology to replace the faces of porn performers with those of Hollywood actors, without the actors’ consent. But private individuals like Martin—who has gone on to become an activist against online image-based abuse in her home country of Australia—are clearly not immune.
Creating a highly realistic piece of synthetic media is complicated, generally utilizing advanced techniques and a high degree of expertise.
In the case of In Event of Moon Disaster—a “complete deepfake” which includes synthetic audio and visuals backed by significant computing power and financial resources—it took the directors, an actor and two synthetic media production companies three months of directed effort to complete it. This level of believability is hard, if not impossible, for the average person on the street to produce.
However, for deepfakes that depend on face-swapping, it’s a different story. A hobbyist or amateur can whip up a convincing one in a week. A tech-savvy person with access to the right software can do it in days, or even hours. Last December, Timothy B. Lee, senior tech policy reporter at Ars Technica, created his own deepfake video in which he swapped Mark Zuckerberg’s face, as he testified before Congress, with that of Lieutenant Commander Data from Star Trek. Making the video cost him around $500.
In an extensive article on the making of his deepfake, Lee described his face-swapping training process in detail and listed the software he used. The idea behind his experiment was to highlight both the possibilities and the limitations of the technology. But perhaps inadvertently, it also showed how easily accessible the technology has become.