Baroness Charlotte Owen is leading a charge against deepfake pornography, highlighting significant legal gaps.
- The Baroness’s Bill seeks to amend current laws that fail to adequately support victims of deepfake exploitation.
- Research shows the extensive use of apps generating non-consensual explicit deepfakes, affecting both celebrities and ordinary individuals.
- Baroness Owen’s initiative has garnered cross-party support in the House of Lords, reflecting a united stand against this issue.
- The upcoming debate in the House of Lords marks a crucial step in criminalising the solicitation and creation of such explicit content.
Baroness Charlotte Owen is spearheading a legislative move to address the lack of legal protection surrounding deepfake pornography. She points to ‘gaping loopholes’ in existing laws that do not adequately safeguard victims from the trauma inflicted by such exploitation. Her efforts focus on reforming outdated legislation, such as the Sexual Offences Act 2003, to accommodate technological advances in artificial intelligence.
The current legal framework places a heavy burden on victims, requiring them to demonstrate the perpetrator’s intent. Owen describes this process as ‘hugely distressing’ and ‘re-traumatising’, especially as victims must trawl through their personal history to prove non-consent. Recounting her assistance to one victim, she states, “The trauma of having to trawl back through messages to not only prove that she didn’t consent to it, but that the person knew how much they were hurting her when they did this is hugely re-traumatising.”
Recent technological advances have enabled the proliferation of ‘nudification’ apps, which convert ordinary photos into explicit material. Research by software firm Canva indicates that these apps can attract tens of millions of users a month, a significant concern given their capacity to process hundreds of thousands of images rapidly and indiscriminately. While high-profile incidents involving celebrities such as Taylor Swift have drawn attention to the problem, it is ordinary women who most frequently fall victim to these practices.
An important element of Owen’s Bill is criminalising the solicitation of explicit deepfake material. Inspired by discussions with Jodie, a campaign founder and survivor of intimate image abuse, Owen highlights the collective degradation these platforms facilitate. Jodie’s experience shows how images taken from personal social media accounts, such as Instagram, are misused on platforms like Reddit to solicit explicit deepfakes.
As the Non-Consensual Sexually Explicit Images and Videos Bill progresses to its second reading, it has drawn significant interest across party lines in the House of Lords. According to Owen, support spans multiple political parties, a testament to the recognition of this pressing issue. She remains optimistic about further backing in the House of Commons, highlighting the relevance of her Bill to broader governmental goals like combatting violence against women.
The Baroness underscores that online violence is part of a wider spectrum of abuse, requiring comprehensive action across all levels of governance. As the youngest-ever member of the House of Lords, she has committed herself to scrutinising emerging technologies and their societal impacts, and her advocacy forms part of a broader mission to protect individuals from digital exploitation.
The upcoming House of Lords debate on the Bill signifies a pivotal moment in tackling digital exploitation and safeguarding victims.