A few months ago, many of us were astonished, and amused, by the video in which Jim Carrey's face replaced Jack Nicholson's in some scenes of the famous film The Shining.
We laughed rather less last night, watching a segment on Le Iene that described and showed another use of this now-famous technique, known as deepfake.
Deepfake (a term coined in 2017) is an AI-based technique for synthesizing human images: existing images and videos are combined with and overlaid onto source material using a machine learning method known as a generative adversarial network (GAN). It has been used to create fake pornographic videos depicting celebrities and for revenge porn, and it can also be used to produce fake news, hoaxes, and scams.
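For readers curious about what "adversarial" means here: a GAN trains two models against each other, a generator that fabricates samples and a discriminator that tries to tell fakes from real data. The toy sketch below illustrates only this adversarial idea on one-dimensional numbers, not an actual face-swapping pipeline; all names, formulas, and constants are illustrative choices, not part of any real deepfake system.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Real" data: samples from a 1-D Gaussian the generator must learn to imitate.
REAL_MEAN, REAL_STD = 4.0, 0.5

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Generator g(z) = wg*z + bg maps random noise z to a fake sample.
# Discriminator d(x) = sigmoid(wd*x + bd) estimates the probability x is real.
wg, bg = 1.0, 0.0
wd, bd = 0.0, 0.0
lr = 0.02

def generate(n):
    z = rng.standard_normal(n)
    return wg * z + bg, z

fake_mean_before = generate(1000)[0].mean()

for step in range(3000):
    # Discriminator update: push d(real) toward 1 and d(fake) toward 0.
    real = rng.normal(REAL_MEAN, REAL_STD, 32)
    fake, _ = generate(32)
    dr, df = sigmoid(wd * real + bd), sigmoid(wd * fake + bd)
    grad_wd = (-(1 - dr) * real + df * fake).mean()
    grad_bd = (-(1 - dr) + df).mean()
    wd -= lr * grad_wd
    bd -= lr * grad_bd

    # Generator update: push d(fake) toward 1, i.e. fool the discriminator.
    fake, z = generate(32)
    df = sigmoid(wd * fake + bd)
    grad_wg = (-(1 - df) * wd * z).mean()
    grad_bg = (-(1 - df) * wd).mean()
    wg -= lr * grad_wg
    bg -= lr * grad_bg

fake_mean_after = generate(1000)[0].mean()
print(f"fake mean before training: {fake_mean_before:.2f}, "
      f"after: {fake_mean_after:.2f} (real mean: {REAL_MEAN})")
```

After training, the generator's fakes drift toward the real distribution, which is exactly the dynamic that, scaled up to deep networks and images, makes deepfake faces hard to distinguish from real footage.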
THE BIRTH OF THE PHENOMENON
Deepfake initially emerged in two main areas: academia and web entertainment content (one of the applications that made the technique famous was replacing Nicolas Cage's face with those of other actors in various films).
Naturally, the step from there to pornography was a short one.
Soon, videos began to multiply in which the faces of well-known actresses and actors were superimposed on the bodies of performers in pornographic films; dedicated subreddits sprang up on Reddit where this material was shared, and for some time the technique remained in the web's underworld.
Then, in 2017, Vice published an article about the phenomenon, sparking a genuine controversy and making it clear how deepfakes could open the door to a surge in fake pornography.
In 2018, Reddit banned its deepfake community for sharing involuntary pornography, and from there other platforms where this content circulated (Pornhub in particular) began banning users who shared it and, more generally, adopting much stricter policies on the issue.
THE SEGMENT AIRED ON LE IENE (November 5th)
During the episode aired on November 5th, the editorial staff of Le Iene focused on this phenomenon, discussing a recent case involving Fabiana Pastorino, a curvy model and body-positive influencer.
Fabiana reports that she discovered nude photos of herself online.
The problem: those photos had in fact been created with the deepfake technique, starting from images of her in a swimsuit.
Fabiana faced the episode with strength, but what would happen if the victims were more vulnerable people, or if such images were used for purposes even worse than pornography, such as scams or blackmail?
This phenomenon is increasingly associated with revenge porn (which we will cover in a future article), which in recent years has caused real damage and real victims; with this new evolution it could become even more insidious, since literally anyone could be a victim.