Spain can already punish AI-generated nude images of minors

“This is very serious and I had to share it.” That is how gynecologist, obstetrician and communicator Miriam Al Adib raised the alarm on social media this Sunday. Her daughter, a minor, had been the victim of a deepfake: an image created with an artificial intelligence tool had undressed her, and the photo was circulating without her consent.

Miriam’s daughter is one of the many victims of what has happened in Almendralejo, a town in Badajoz (Extremadura). Deepfakes showing girls from the municipality, aged between 12 and 17, naked are being shared in WhatsApp groups.

Many mothers in the town are now joining forces to put an immediate stop to this appalling situation. Many cases are already in the hands of the police, Al Adib told several media outlets on Monday. In the Instagram video in which she sounded the alarm on Sunday, the communicator and mother expressed herself emphatically:

“To those who committed this barbarity: you have no idea how much harm you have done to all these girls. You don’t even know what crime you have committed.”

The case holds many lessons. One of them is how tools such as generative artificial intelligence can aggravate a problem that many women, both minors and adults, already suffer. Pablo Duchement, a computer forensics expert specializing in crimes committed on social networks by and against minors, warned about this in an article on his blog months ago.

The threats of deepfakes to democracy are clear, but those who suffer most from this phenomenon are women who appear in fake porn videos.

“Now, as if that were not enough, AI seems to be making everything even more complicated: from realistic images created entirely from scratch based on a description of their content, to deepfakes, scenes in which the face of a person (in this case, a minor) is superimposed, very believably, onto the bodies of actors (who are not minors).”

Unfortunately, the use of AI to attack the privacy and dignity of third parties is here to stay. Meanwhile, the trilogue on the future AI regulation continues in Brussels. These three-way negotiations between the Parliament, the Council of the EU and the Commission could bear fruit before the end of the year, putting the new regulation in the hands of the member states.

But there is no need to wait for it.

Ibán García, a Socialist MEP and one of the regulation’s shadow rapporteurs, explains as much in a telephone conversation: “These things are already covered by the regulation that exists in the member states, starting with the criminal code. Regardless of the tool used, an AI, a drawing or anything else, these are already criminal images and there are criminal-law tools to address them.”

Duchement himself confirms this in the aforementioned article: Article 189 of the Penal Code treats as “child pornography” any material that “represents a minor or a person with a disability in need of special protection participating in sexually explicit conduct, whether real or simulated.”

“There is ongoing debate as to whether AI-generated child pornography should be included in the category of simulated child pornography,” notes Duchement, who then cites the doctrine of the State Attorney General’s Office: “The creation of images of minors for sexual purposes, even through technology, can pose a real risk to the integrity of minors.”

“For that reason, the fact that the source is an AI will not be an excuse, since it falls among the possible ways of generating child sexual abuse material,” the expert adds. “Other factors, such as realism and purpose, will determine whether or not it is considered child pornography.”

The AI regulation against deepfakes

A British organization warned a few years ago that cases of sextortion and “revenge porn” had multiplied after the outbreak of the coronavirus health crisis. With the popularity of generative AI tools, no woman is safe from this situation anymore.

Stars such as Gal Gadot, Taylor Swift, Scarlett Johansson, Maisie Williams, Daisy Ridley, Sophie Turner and Emma Watson have already been victims of deepfakes in which an AI system inserts their faces into sexually explicit scenes.

When Brussels presented its draft of a specific artificial intelligence regulation for the Union, the biggest political concern at the time was the surveillance and discrimination that tools such as facial recognition models could cause. The world of 2023 is very different from that of 2021.

That is why now, as the future AI regulation takes its final steps before being approved and implemented, new debates are emerging. MEP Ibán García explains to Business Insider España that one of these debates focuses precisely on the problem of deepfakes.

“There is an internal debate over whether restrictions should be imposed on users who create deepfakes and do not label them as such.” The fear in these conversations is that such a proposal amounts to trying to fence in an open field. “Whoever makes fraudulent use of AI is never going to label their deepfake,” the MEP laments.


That is why García is more in favor of making it a matter of national law. “We are debating whether companies that develop this type of technology can be approved if their deepfakes can be created without being labeled.”

He adds: “We are considering whether there should be additional restrictions or a list of restrictions at the European level, whether there should be further restrictions at the national level, or the possibility of urging member states to draw up a list of improper uses of these kinds of tools and impose additional restrictions.”

“We will find cases of deepfakes that do not necessarily involve nudity but that affect people’s integrity in other ways,” García explains.

While the debate is being settled in Brussels, Spain already has a known case of child sexual exploitation material generated with artificial intelligence. As Miriam Al Adib warns: “This is not nonsense. Everything is going to the police, not only those who are dedicating themselves to undressing the girls with AI, but everyone who is sharing the images.”

“And a message for the girls: you are not to blame. Your mothers are going to help you, and if it is hard for you to tell them, just write it down. Rest assured, because this is not going to stay like this; it is going to stop now. Cut the whole chain of WhatsApp groups right now, and if you have uploaded anything to the internet, hurry up and take it down.”
