
PROBabLE Futures

Probabilistic AI Systems in Law Enforcement Futures (PROBabLE Futures) is a recently funded RAi UK Keystone Project that brings together academics, law enforcement, government, third-sector and commercial AI industry partners to explore responsible approaches to probabilistic AI across the law enforcement landscape.



PROBabLE Futures (Probabilistic AI Systems in Law Enforcement Futures) is a four-year, £3.4M RAi UK-funded project led by Northumbria University and involving the Universities of Glasgow, Northampton, Leicester, Aberdeen and Cambridge. It draws on the expertise of the Alan Turing Institute, the National Police Chiefs' Council, police forces, the Home Office, JUSTICE (the legal charity) and three industry partners including Microsoft. The project focuses on probabilistic AI in policing and the wider criminal justice and law enforcement system. It will deliver tested and trusted advice and frameworks to commercial system developers and public sector bodies on when and how to deploy AI in this sector effectively, with the long-term trust of communities in mind.

The PROBabLE Futures research team aims to deliver an ‘operational-ready’ blueprint: an evidence-based, contextually informed and future-oriented framework for responsible AI in law enforcement and criminal justice. Activities will include mock trials using AI outputs as evidence, and applying lessons from earlier technological developments to the study of probabilistic AI in policing, intelligence, probation and wider criminal justice contexts.

Probabilistic systems supported by AI, such as facial recognition, predictive tools, large language models (LLMs) and network analysis, are being introduced at pace into law enforcement. Whilst these systems offer potential benefits, decision-making based on probabilistic AI has serious implications for individuals. The key problem for responsible AI is that the uncertain, probabilistic nature of outputs is often obscured or misinterpreted, and the underlying data is sensitive and of varying quality. If AI systems are to be used responsibly, attention must be paid to the chaining of systems and the cumulative effects of AI systems feeding into one another. PROBabLE Futures will review, question and evaluate the use and applications of probabilistic AI across different stages of the criminal justice system.

The four-year interdisciplinary research project is led by Northumbria Law School’s Professor Marion Oswald MBE, in collaboration with multi-disciplinary co-investigators from Northumbria, Glasgow, Northampton, Leicester, Cambridge and Aberdeen universities. The Northumbria University Co-Is are Professor Gita Gill, Dr Kyri Kotsoglou, Dr Kyle Montague and Professor Yifeng Zeng.

If you are interested in getting involved in this research or joining the research team, please complete the form below.
