Pornhub, Reddit Ban AI-Generated Fake Porn Videos

Over the past two weeks, a new breed of fake porn videos has taken off, courtesy of AI-driven face-swapping tools known as deepfakes. It’s an alarming concept for multiple reasons, given the way revenge porn has been used to target individuals. A user community focused on creating these edits, r/deepfakes, quickly sprang up on Reddit, while deepfake videos have also become popular on Pornhub.

Today, both companies announced they were taking steps to end the practice. For its part, Reddit has suspended the r/deepfakes community. A company spokesperson said:

Reddit strives to be a welcoming, open platform for all by trusting our users to maintain an environment that cultivates genuine conversation.

As of February 7, 2018, we have made two updates to our site-wide policy regarding involuntary pornography and sexual or suggestive content involving minors. These policies were previously combined in a single rule; they will now be broken out into two distinct ones. Communities focused on this content and users who post such content will be banned from the site.

The “involuntary pornography” category is an interesting one. Previously, it would’ve applied to people whose sexual activities were filmed without their consent or knowledge. Now, it can apply to anyone who finds their likeness stapled into a pornographic film or image.

Meanwhile, Pornhub has stated it will delete deepfake videos that punch such a gaping hole in user consent. “We do not tolerate any nonconsensual content on the site and we remove all said content as soon as we are made aware of it,” a site spokesperson said. “Nonconsensual content directly violates our TOS [terms of service] and consists of content such as revenge porn, deepfakes or anything published without a person’s consent or permission.”

Deepfakes are also being used, horrifyingly, to put Nicolas Cage in everything. IN EVERYTHING. INVOLUNTARY NICOLAS CAGE.

At press time, there were still plenty of faked videos on the site, but that should change going forward. Regardless, this kind of video fakery is going to present significant problems for content authenticity. It has always been possible to edit photographs and fake video; Stalin was notorious for this kind of thing long before Photoshop existed. Previously, however, such edits required a substantial level of expertise. Going forward, they very well may not.

That, in turn, could force a reevaluation of the primacy we place on video evidence. Today, having video of something is often treated as tantamount to having proven it. In the future, that may not be the case. As AI and deep learning continue to advance, the sophistication of such fakery will grow, and spotting the differences will become increasingly difficult.
