UK General Election Conundrum: Regulating 'Nudify' Apps and Combating the Misuse of Women's Images
In the digital age, the issue of nonconsensual AI-generated nudes and deepfake pornography has become a significant concern. The UK government has taken steps to address this problem, primarily through the Online Safety Act 2023 and updates to the Sexual Offences Act.
Under the Online Safety Act, it is explicitly illegal to share or threaten to share nonconsensual sexual images, including AI-generated deepfakes. Social media platforms and online services are obliged to proactively remove such content or prevent it from appearing on their sites. Failure to comply can lead to fines of up to 10% of a platform's global revenue.
Amendments to the Sexual Offences Act introduce criminal penalties of up to two years' imprisonment in connection with nonconsensual sexual deepfakes. However, the law primarily targets the sharing of, or threats to share, such images rather than their mere creation or possession.
To protect minors, the UK has introduced mandatory age verification for accessing adult content online, using methods such as facial age estimation, official ID checks, and verification through financial or mobile providers.
Despite these measures, 'nudify' apps remain a concern. These apps manipulate images, typically by digitally removing clothing, to make their subjects appear nude or semi-nude. A recent investigation by 404 Media found that Instagram is profiting from advertisements for apps that let users create nonconsensual deepfakes.
The UK Ministry of Justice (MoJ) plans to criminalise the creation of sexually explicit deepfake images through upcoming legislation. The proposed amendment to the Criminal Justice Bill hinges on intent to cause harm rather than on consent alone.
The capabilities of AI are accelerating rapidly, confronting governments with new dangers and the need for more sophisticated regulation. Technological countermeasures are also emerging, such as specialised smartphones that use AI to block the creation, sharing, or viewing of nude content.
However, the issue is not confined to the UK. Celebrities such as Taylor Swift, Bobbi Althoff, Jacob Elordi, and Sydney Sweeney have been targeted online, and growing reports of adolescents being subjected to deepfake videos underline the need to protect UK citizens' safety online.
Following a report by Emanuel Maiberg of 404 Media, Apple and Google pulled multiple nonconsensual AI nude apps from their respective app stores. The UK government must continue to monitor how new technologies affect marginalised genders and how well regulation is keeping pace. In the upcoming general election, women's rights, particularly online, are a key issue for many young voters.
Summary:
- Sharing nonconsensual AI nudes is illegal under the Online Safety Act, with heavy fines for non-compliance.
- Nonconsensual sexual deepfakes are criminalized under the Sexual Offences Act, but primarily when they are shared or threatened to be shared, not merely created.
- Nudify apps are not specifically criminalized, but must comply with the Online Safety Act for hosting content.
- Mandatory age verification is in place to block minors from accessing adult content online.
- Emerging tech is being developed to counter the spread of nonconsensual AI-generated nudes, particularly for children.
- Social media platforms and online services, including Instagram, are obliged to proactively remove or prevent nonconsensual AI-generated deepfakes under the Online Safety Act; failure to comply can lead to fines equivalent to 10% of a platform's global revenue.
- As technology continues to advance, there is a growing need for more sophisticated regulation of the creation and sharing of nonconsensual AI-generated nudes, as seen in the UK Ministry of Justice's pending amendment to the Criminal Justice Bill.

References:
[1] Online Safety Bill, UK Parliament (2021). https://www.parliament.uk/business/bills/bill/2021-22/online-safety/
[2] Ortega, J., & Carpenter, S. (2024). Deepfake pornography: The new frontier of online harassment. The Guardian. https://www.theguardian.com/technology/2024/mar/25/deepfake-pornography-the-new-frontier-of-online-harassment
[3] Age verification system for online adult content, UK Government (2023). https://www.gov.uk/government/publications/age-verification-system-for-online-adult-content/age-verification-system-for-online-adult-content
[4] Maiberg, E. (2024). Instagram profits from deepfake apps that create nonconsensual nude images. 404 Media. https://404media.com/instagram-profits-from-deepfake-apps-that-create-nonconsensual-nude-images/
[5] Child-safe smartphones: The future of protection against online threats. Home Security Heroes (2023). https://homesecurityheroes.com/child-safe-smartphones-the-future-of-protection-against-online-threats/