Over 20 girls in Spain have reported a distressing incident wherein they received AI-generated nude images of themselves on their mobile phones.
However, the question arises: can legal actions be taken against the creation and distribution of such deepfake content?
Haunted by AI-generated deepfakes
More than twenty girls from Almendralejo, a town in southern Spain, were shocked when they received nude photos of themselves on their mobile devices. What made matters worse was that none of these girls had taken these pictures, yet they appeared frighteningly realistic.
These images had been illicitly taken from their Instagram accounts, manipulated using AI, and subsequently circulated in the school’s WhatsApp groups.
In the genuine photographs, the teenagers were fully clothed, but the application had convincingly made them appear nude. Consequently, concerned parents and legal authorities are left wondering whether a crime has been committed, even though the images are technically not real. Could these images be deemed child pornography?
Miriam Al Adib, one of the girls’ mothers, expressed her distress on her Instagram account, stating, “The montages are super realistic, it’s very disturbing and a real outrage.” She added that her daughter was deeply disturbed by the ordeal.
Al Adib even raised concerns that these photos might have found their way onto internet platforms like OnlyFans or adult websites, all while the girls endured hurtful comments from their peers. One of the girls was even told, “Don’t complain, girls upload pictures that almost show their private parts.” It is noteworthy that the youngest among the victims is just 11 years old and hasn’t even reached high school yet.
To address this troubling situation, the mothers have come together to voice their concerns. The National Police have initiated an investigation and have already identified several underage individuals allegedly involved in this incident, some of whom are fellow classmates of the affected girls.
The case has been referred to the Juvenile Prosecutor’s Office, and the town’s mayor has issued a stern warning: “It may have started as a joke, but the implications are much greater and could have serious consequences for those who made these photos.”
Problematic App
The hyper-realistic deepfake images, created with the ClothOff app, have raised alarm. Marketed with the slogan "Undress anybody, undress girls for free," the app lets users digitally remove clothing from people pictured in their phone's image gallery, charging €10 for 25 fake nude images.
Although the nudity depicted in these images is not real, the mothers emphasize that the emotional distress suffered by the girls is indeed very real. Miriam Al Adib issued a stern message on her Instagram account, directed at those who shared the pictures: “You are not aware of the damage you have done to these girls and you’re also unaware of the crime you have committed.”
EU’s toothless laws
Legal experts are grappling with whether the offence can be classified as the distribution of child pornography, which would carry severe penalties, or whether a more cautious legal approach is warranted.
Leandro Nunez, a lawyer specializing in new technologies at the Audens law firm, emphasizes that the critical factor isn’t whether the photo is 100 per cent authentic but whether it appears to be. He suggests that it could potentially be regarded as child pornography, crimes against moral integrity, or the distribution of images containing non-consensual sexual content, with the latter resulting in a lesser sentence of six months to two years in prison.
However, Eloi Font, a lawyer at Font Advocats, a law firm specializing in digital law, contends that it might be categorized as a crime akin to the reproduction of sexual images of minors, carrying a penalty of between five and nine years in prison.
from Firstpost Tech Latest News https://ift.tt/EnrplFx