The UK's Children's Commissioner is calling for a ban on AI deepfake apps that create nude or sexual images of children, according to a new report. The report states that such "nudification" apps have become so prevalent that many girls have stopped posting photos on social media. And although creating or uploading CSAM images is illegal, the apps used to create deepfake nude images remain legal.

"Children have told me they are frightened by the very idea of this technology even being available, let alone used.
They fear that anyone — a stranger, a classmate, or even a friend — could use a smartphone as a way of manipulating them by creating a naked image using these bespoke apps," said Children's Commissioner Dame Rachel de Souza. "There is no positive reason for these [apps] to exist."

De Souza pointed out that nudification AI apps are widely available on mainstream platforms, including the largest search engines and app stores. At the same time, they "disproportionately target girls and young women, and many tools appear only to work on female bodies." She added that young people are demanding action against the misuse of such tools.

To that end, de Souza is calling on the government to introduce a total ban on apps that use artificial intelligence to generate sexually explicit deepfakes.
She also wants the government to create legal responsibilities for GenAI app developers to identify the risks their products pose to children, establish effective systems to remove CSAM from the internet, and recognize deepfake sexual abuse as a form of violence against women and girls.

The UK has already taken steps to ban such technology by introducing new criminal offenses for producing or sharing sexually explicit deepfakes. It has also announced its intention to make taking intimate photos or video without consent a criminal offense. However, the Children's Commissioner is focused more specifically on the harm such technology can do to young people, noting that the report links deepfake abuse to suicidal ideation and PTSD.

"Even before any controversy came out, I could already tell what it was going to be used for, and it was not going to be good things.
I could already tell it was gonna be a technological wonder that's going to be abused," said one 16-year-old girl surveyed by the Commissioner.