A pioneering start-up is on the brink of launching an innovative app designed to tackle ‘victim blaming’ in the legal system, focusing on gender-based violence.
The start-up, called herEthical AI, is nearing completion of an artificial intelligence (AI)-backed application designed to identify ‘victim blaming’ within the justice system. Founded in May by Tamara Polajnar and Anthony Joslin, alongside psychologist Ruth Spence and communication expert Hazel Sayer, the company aims to address cultural problems in how crimes are processed and to help survivors through more efficient procedures.
Ms Polajnar, the company’s CEO, has a background in machine learning and natural language processing. ‘We strive to improve the culture around crime, making processes more efficient to help survivors,’ she said. She previously worked on a project evaluating a machine-learning algorithm developed by West Midlands Police to predict when stalking might escalate into violent assault, although that tool has yet to be deployed.
During her previous project, Ms Polajnar connected with Anthony Joslin, now the Chief Innovation Officer, who brought extensive experience from his role as an innovation lead at Devon and Cornwall Police. Together, they established herEthical AI, extending their expertise in AI to public sector and third-sector organisations, particularly to combat gender violence.
The application under development by herEthical AI seeks to ‘identify and extract victim-blaming and misogynistic language’ from legal documents such as court judgements and transcripts. The project is a collaboration with Riverlight, a London-based non-profit that advocates for domestic abuse survivors. Riverlight’s initiative ‘In the Judge’s Words’, launched in February, exposes the ‘dehumanising’ judicial language that victims face in the family courts.
Riverlight’s findings revealed alarming instances of dismissive judicial attitudes, such as judges downplaying abuse, characterising domestic violence as mutual, or blaming victims for the violence they endured. To counter this, herEthical AI and Riverlight are crowdfunding to enable abuse survivors to purchase and feed their court transcripts into an AI model designed to detect such biased language.
Anthony Joslin said the app is ‘80–90% complete’ and already delivering results. It will be built around a ‘victim-blaming taxonomy’ and documented in an academic paper. Discussions are ongoing with law enforcement about using the app to improve the quality of police statements by scoring texts to flag bias and cultural shortcomings.
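The article does not describe how herEthical AI’s taxonomy-based scoring actually works; as a purely illustrative sketch, a minimal version of the idea could map taxonomy categories to indicative phrases and count matches per document. The categories, phrases, and function below are invented for demonstration, and a real system would use a trained language model rather than keyword matching.

```python
# Illustrative sketch only -- NOT herEthical AI's actual model or taxonomy.
# A hypothetical mini "victim-blaming taxonomy": category -> indicative phrases.
TAXONOMY = {
    "minimising": ["just an argument", "only a slap"],
    "mutualising": ["both parties were violent", "mutual abuse"],
    "blaming": ["she provoked", "why didn't she leave"],
}

def score_text(text: str) -> dict:
    """Count taxonomy phrase matches per category in a document."""
    lowered = text.lower()
    return {
        category: sum(lowered.count(phrase) for phrase in phrases)
        for category, phrases in TAXONOMY.items()
    }

sample = "The court heard it was just an argument; why didn't she leave earlier?"
print(score_text(sample))  # {'minimising': 1, 'mutualising': 0, 'blaming': 1}
```

Per-category counts like these could then be aggregated into the kind of document-level score the article mentions, though the actual scoring scheme is not described in the source.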
Additionally, Mr Joslin is advocating for the necessity of AI expertise within public sector leadership, engaging with the police, Crown Prosecution Service, and local authorities.
The start-up is self-funded, with no immediate plans to seek external investment. Its focus, as Ms Polajnar put it, is on driving cultural change around domestic abuse and gender-based violence, an area where she hopes to make significant progress.
This development marks a critical step towards recognising and rectifying victim-blaming attitudes within legal proceedings, driven by technological innovation and collaborative efforts with advocacy groups.