Deepfakes are a real political threat. For now, though, they’re mainly used to degrade women.

Jordan Peele uses deepfake technology to simulate a speech by Barack Obama as an ironic warning against the rise of deepfakes. | BuzzFeed

A new report on deepfakes finds 96 percent involve simulating porn of female celebrities (without their consent).

The concept of “deepfakes,” those uncanny AI-generated videos that simulate real people (or even people who don’t exist) in frequently unrealistic or pornographic situations, is unsettling enough. Now Deeptrace Labs, creator of a service designed to identify deepfakes in the wild, has found that these fake videos are proliferating at an alarming rate.

Deeptrace Labs’ report, released Monday, concludes that the number of deepfakes on the internet has nearly doubled in under a year. The lab identified 14,678 deepfake videos across a range of streaming platforms and porn sites, up from the 7,964 such videos it counted in December 2018.

The study also found that the vast majority of the subjects of these fake videos across the internet, a full 96 percent, are women, mostly celebrities, whose images are being turned into sexual fantasies without their consent.

The study found that 96 percent of all deepfake videos were pornographic and nonconsensual. The top four websites dedicated to hosting deepfakes received a combined 134 million views on such videos, and on those sites, a full 100 percent of the videos’ subjects were women, usually female celebrities whose likenesses were swapped into sexually explicit porn videos without their knowledge or consent. All told, around 850 people were targeted by these videos.

Deepfakes as we know them first grew out of a Reddit forum devoted to swapping the faces of female celebrities onto the bodies of porn performers, so it’s perhaps unsurprising that satisfying sexually driven fantasies has remained the deepfake’s primary purpose. Deeptrace Labs’ findings support what we know of deepfakes in action so far: They are rarely used to usher in a dystopian political nightmare where fact and fiction are interchangeable. They exist to degrade women.

But there certainly are deepfake videos that do blur the lines of reality. While 96 percent of all deepfake content online is porn targeting women, on YouTube 61 percent of the targets are male, and the content isn’t pornographic but commentary-based.

“Subjects featuring in YouTube deepfake videos,” the study notes, “came from a more diverse range of professions, notably including politicians and corporate figures.” That suggests such videos could also play a role in YouTube’s ever-more reactionary political environment, though thankfully not at a chaos-inducing rate. Yet.

The study does note, however, that outside of politics, deepfakes are also being used to undermine cybersecurity, bolster fake digital identities, and target businesses and other organizations, specifically to “enhance social engineering against businesses and governments.”

The study also found a number of other interesting anomalies related to the use of deepfakes around the world. For example, researchers found that non-Western subjects featured in almost a third of videos on deepfake pornography websites, with female K-pop singers making up a quarter of the subjects targeted worldwide.

The new report acknowledges that we’re increasingly living in an age when deepfakes can seriously disrupt the political landscape. In addition to the deepfake, the report notes the rise of what it calls the “shallowfake”: strategically edited and altered videos, such as the doctored video of Nancy Pelosi that went viral earlier this year. Efforts to remove these and other deepfakes from the internet have been lackluster at best; in fact, the typical response has been to use deepfake technology to warn against deepfake technology.

The rise of this dangerously deceptive technology, the report concludes, demands strategic and immediate action, though it stops short of recommending what that action should be. Instead, it emphasizes that many of the threats deepfakes pose to public and private security “are no longer theoretical.”

You can read the full report at the Deeptrace website.

Author: Aja Romano
