Thanks to MIT for this:

Deepfakes are AI-generated synthetic media.

The research firm Sensity AI recently estimated that 90% of all deepfake videos online are nonconsensual porn.

A number of easy-to-use, no-code tools have also emerged that let users "strip" the clothes off bodies in photos. The code exists in open-source repositories and has continued to resurface in new forms.

Other single-photo face-swapping apps, like ZAO and ReFace, place users into scenes from mainstream films or pop videos.

But as the first dedicated pornographic face-swapping app, Y takes this to a new level.

It is "tailored" to create pornography of people without their consent.

The rise of Deepfake Porn

Y is extremely easy to use. Once a user uploads a photo of a face, the site opens up a library of porn videos. The user can then select any video to generate a preview of the face-swapped result within seconds, and pay to download the full video.

Y claims its platform is a safe and responsible tool for exploring sexual fantasies. The language on the site encourages users to upload their own face. But nothing prevents users from uploading other people's faces.

The app is likely to fuel a rise in deepfake porn campaigns.

Often it is even more complicated than revenge porn.

Nonconsensual deepfake porn can also have economic and professional impacts on people's careers and livelihoods.

Y's option to create deepfake gay porn, though limited, poses an additional threat to men in countries where homosexuality is criminalized. This is the case in 71 jurisdictions globally, 11 of which punish the offense with death.
