In June 2019, the American outlet Vice reported the existence of DeepNude, an application capable of "undressing" women. The user only had to upload a photo of the victim, pay $50 and wait: the program's artificial intelligence (AI) took care of removing her clothes. The results were so realistic that its use skyrocketed, but the wave of criticism it received forced its creators to shut down the project within hours.
Those involved in creating the simulated nude images of girls in Almendralejo could face up to 9 years in prison
Four years later, the problem is much bigger. DeepNude disappeared, but dozens of similar 'apps' have emerged in its place. One of them has been used by a group of minors from Badajoz to "undress" their underage classmates without consent. Miriam Al Adib, mother of one of the victims, told El Periódico, of the Prensa Ibérica group: "The photos are extremely realistic and, in addition to going viral among them, they may have been uploaded to websites like OnlyFans."
Although the case has already been reported to the National Police, it reflects a new and increasingly widespread form of digital violence against women. A glance at Google Trends is enough to see how searches for these applications have skyrocketed in the last year. "This is a very worrying phenomenon because there is no regulation of any kind and it is very difficult to quantify the use of the images in circulation or to identify those responsible," says Eva Cruells, helpline coordinator of FemBloc, a feminist initiative against digital sexual violence.
From celebrities to girls
Obscene 'deepfakes' of celebrities, political leaders and activists are spreading
Initially, this type of manipulated content, known as 'deepfakes', mainly targeted famous women. The faces of Gal Gadot, Jennifer Lawrence or Scarlett Johansson were cut out and pasted into pornographic scenes, which were then posted on adult sites. According to calculations by the company Sensity, more than 82,000 such illicit videos had been detected by January 2021. Although they are very difficult to count, the current figures are likely far higher.
This rise is driven by the normalization of AI, a field in full expansion. It is in this context that many 'apps' have emerged that try to gain a foothold at the expense of undressing girls without their consent. Their use is becoming simpler and their results more convincing. This combination of factors has made the technology easy to misuse by all kinds of audiences.
"The message that any girl can be exposed is a very clear statement about the sexuality of many teenagers"
Eva Cruells, FemBloc helpline coordinator
Public figures, such as Rosalía or the 'influencer' Laura Escanes, remain the target of these attacks. However, the ease of access to these programs is extending their impact to all kinds of women, including girls, as the Badajoz case shows. "The message that any girl can be exposed is a very clear statement about the sexuality of many teenagers," adds Cruells.
Anxiety and low self-esteem
This form of digital sexual violence can affect victims' mental health. According to a study by the National Observatory of Technology and Society (ONTSI), 54% of women who suffer harassment on the Internet experience panic attacks, anxiety or stress, while 42% suffer emotional distress, low self-esteem and loss of confidence. "This is what it feels like to be humiliated, this is what it feels like to be exploited," denounced the streamer QTCinderella, herself a victim of 'deepfakes', in January.
Be careful with the photos you upload: AI-generated 'deepfakes' targeting Laura Escanes flood social networks
Even though it is a simulation, distributing sexual material of this nature is a crime, and it can also constitute child pornography if it affects a minor. That is why FemBloc maintains that public reporting is good practice to highlight its impact and "appeal to common sense". "We have to raise a lot of awareness (…) and create a community that rejects this type of aggression," says Cruells.
Reproducing fake sexual images is illegal, yet this type of material remains very easy to find on the Internet. A quick Google search turns up all kinds of links, from recommendations of the best AI 'apps' for creating nudes to the portals where they are posted.