We Are Truly Fucked: Everyone Is Making AI-Generated Fake Porn Now

micromachine

Lieutenant General
Loyal
A user-friendly application has resulted in an explosion of convincing face-swap porn.

In December, Motherboard discovered a redditor named 'deepfakes' quietly enjoying his hobby: swapping celebrity faces onto porn performers’ bodies. He made several convincing porn videos of celebrities—including Gal Gadot, Maisie Williams, and Taylor Swift—using a machine learning algorithm, his home computer, publicly available videos, and some spare time.

Since we first wrote about deepfakes, the practice of producing AI-assisted fake porn has exploded. More people are creating fake celebrity porn using machine learning, and the results have become increasingly convincing. Another redditor even created an app specifically designed to allow users without a computer science background to create AI-assisted fake porn. All the tools one needs to make these videos are free, readily available, and accompanied with instructions that walk novices through the process.

These are developments we and the experts we spoke to warned about in our original article. They have arrived with terrifying speed.

Shortly after Motherboard published its story, deepfakes created a subreddit named after himself and dedicated to his practice. In the two months since, it has already amassed more than 15,000 subscribers. Within the community, the word “deepfake” itself is now a noun for the kind of neural-network-generated fake videos their namesake pioneered.

Another user, called 'deepfakeapp,' created FakeApp, a user-friendly application that allows anyone to recreate these videos with their own datasets. The app is based on deepfakes' algorithm, but deepfakeapp created FakeApp without the help of the original deepfakes. I messaged deepfakes, but he didn’t respond to a request for comment on the newfound popularity of his creation.

Deepfakeapp told me in a Reddit direct message that his goal with creating FakeApp was to make deepfakes’ technology available to people without a technical background or programming experience.

“I think the current version of the app is a good start, but I hope to streamline it even more in the coming days and weeks,” he said. “Eventually, I want to improve it to the point where prospective users can simply select a video on their computer, download a neural network correlated to a certain face from a publicly available library, and swap the video with a different face with the press of one button.”
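For readers curious about what sits underneath an app like this: face-swap tools of this kind are generally understood to train an autoencoder with one shared encoder and a separate decoder per identity, then "swap" by decoding one person's frames with the other person's decoder. The sketch below is an illustrative assumption, not FakeApp's or deepfakes' actual code; the class names, layer sizes, and training loop are invented for clarity, and a real tool would also need face detection, alignment, and blending.

```python
# Minimal sketch of a shared-encoder / per-face-decoder autoencoder
# (assumption: this is the general deepfake-style technique, not the
# actual FakeApp implementation; all sizes and names are illustrative).
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Maps a 64x64 RGB face crop to a latent code shared by both identities."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 4, stride=2, padding=1), nn.ReLU(),    # 64 -> 32
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
            nn.Conv2d(128, 256, 4, stride=2, padding=1), nn.ReLU(), # 16 -> 8
            nn.Flatten(),
            nn.Linear(256 * 8 * 8, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Reconstructs a face for ONE identity from the shared latent code."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 256 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(256, 128, 4, stride=2, padding=1), nn.ReLU(),  # 8 -> 16
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),   # 16 -> 32
            nn.ConvTranspose2d(64, 3, 4, stride=2, padding=1), nn.Sigmoid(),  # 32 -> 64
        )

    def forward(self, z):
        return self.net(self.fc(z).view(-1, 256, 8, 8))

# One shared encoder, one decoder per identity (A: source face, B: target face).
encoder = Encoder()
decoder_a, decoder_b = Decoder(), Decoder()
loss_fn = nn.L1Loss()
opt = torch.optim.Adam(
    list(encoder.parameters()) + list(decoder_a.parameters()) + list(decoder_b.parameters()),
    lr=5e-5,
)

def train_step(faces_a, faces_b):
    """Each decoder learns to reconstruct its own identity from the shared code."""
    opt.zero_grad()
    loss = loss_fn(decoder_a(encoder(faces_a)), faces_a) + \
           loss_fn(decoder_b(encoder(faces_b)), faces_b)
    loss.backward()
    opt.step()
    return loss.item()

# The "swap": encode a frame of identity A, decode it with B's decoder.
# swapped = decoder_b(encoder(face_crop_a))
```

The "one button" workflow deepfakeapp describes would then amount to running each frame's face crop through the shared encoder and the other identity's decoder, and pasting the result back into the frame.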

In early January, shortly after Motherboard’s first deepfakes story broke, I called Peter Eckersley, chief computer scientist for the Electronic Frontier Foundation, to talk about the implications of this technology for society at large: “I think we’re on the cusp of this technology being really easy and widespread,” he told me, adding that deepfakes were pretty difficult to make at the time. “You can make fake videos with neural networks today, but people will be able to tell that you’ve done that if you look closely, and some of the techniques involved remain pretty advanced. That’s not going to stay true for more than a year or two.”

In fact, that barely stayed true for two months. We counted dozens of users who are experimenting with AI-assisted fake porn, some of whom have created incredibly convincing videos.

Redditor UnobtrusiveBot put Jessica Alba’s face on porn performer Melanie Rios’ body using FakeApp. “Super quick one - just learning how to retrain my model. Around 5ish hours - decent for what it is,” they wrote in a comment.

More at https://motherboard.vice.com/amp/en...rn-app-daisy-ridley?__twitter_impression=true
 

tanwahtiu

Alfrescian
Loyal
Nothing new. US George Bush gov can fake the bombing of his double penis towers using controlled demolitions.

Anything from Chai angmoh is possible.

 

ToaPehGong

Alfrescian
Loyal
KNS, later sure got people post you know who photo and make everyone have nightmare.
Give you the first nightmare. KIRSTEN HAN. Next is short hair and wearing slippers.
 

halsey02

Alfrescian (Inf)
Asset
KNS, later sure got people post you know who photo and make everyone have nightmare.

LOL... Oh my gourd! It does give me the creeps to think of it... it will be the mother of all nightmares if you have that "one" & the resident of 38 Oxley cul-de-sac... better check in, you know where, ASAP!
 