
“AI systems are only as objective as the data that feeds them.”




The View Magazine opinion piece by Verity Butler.

As AI quietly enters the justice system, urgent questions are arising over fairness, transparency, and control.

In recent months, the stealthy adoption of artificial intelligence (AI) within the UK’s criminal justice system has begun to attract sharper public scrutiny. Reports have revealed that the Ministry of Justice (MoJ) is experimenting with machine learning tools to assist with areas such as probation decisions, predictive policing, and prison management. These developments, while often couched in the language of efficiency and innovation, have provoked alarm among rights groups and researchers who warn that the risks of bias, discrimination, and opaque decision-making may far outweigh the promised gains.

In September, it was also reported that the MoJ was piloting AI models aimed at identifying prisoners at risk of reoffending or self-harm. Similar systems, trialled elsewhere in Europe, have been criticised for embedding racial and socioeconomic prejudices into the data they rely on. When historical criminal records are used to “train” algorithms, the outputs inevitably mirror the inequalities of the past. The result is what some campaigners have called “automated injustice”, an invisible but deeply consequential shift in how human freedom is assessed and controlled.

What worries many observers is the lack of transparency. The MoJ has so far refused to publish detailed information about which systems are in use, who built them, or how they are tested for bias. Without that transparency, public trust and justice itself hang in the balance.

To better understand how these concerns are unfolding internationally, The View reached out to the Council on Criminal Justice, a US-based think tank that has established a dedicated Task Force on Artificial Intelligence. The Council’s director of that task force, Jesse Rothman, has been leading an effort to bring together experts from across disciplines to confront the challenges and possibilities of AI in law.

Read Verity Butler’s full article on The View 15 here: The View Magazine Issue 15 Autumn 2025 Digital Edition – The View – for women with conviction