The development of the AI tool, led by Lecturer in Computer Science Dr Nonso Alexanda Nnamoko, aims to create a model that can risk-assess potential domestic abuse perpetrators by analysing the language they use with partners over text messages and social media.
The project is being carried out by a team of academics from Edge Hill University, London South Bank University, De Montfort University and the University of Brighton, and has received a £115,000 cash boost from the UK Home Office.
When completed, the AI model will help police quickly analyse conversations for red flags through keywords and patterns, without manually reading through a victim’s phone. Major benefits of the tool include faster processing of data for the police and better protection of victims’ data and privacy.
Dr Nonso Alexanda Nnamoko said: “The tool is capable of identifying the presence of coercive language and patterns in conversational text messages. We trained the model by feeding it historic data related to the subject, including digital evidence supplied in domestic abuse court trials as well as curated social media messages involving domestic abuse.”
More data sets will be fed to the AI model to train it further and help it identify what meets the legal threshold for abuse in text or social media messages.
“So far the AI model achieves at least 88% accuracy at identifying coercive and abusive messages, but we can improve its performance if more data is available to train it. As the AI model analyses text using natural language processing techniques, it extracts indicators of coercive behaviour. This in turn helps it learn and make inferences on new, unseen textual data.”
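The keyword-and-pattern screening described above can be illustrated with a simplified sketch. Everything here is an assumption for illustration only: the phrase list, the function name `flag_messages`, and the matching logic are hypothetical, and the project’s actual model learns indicators from training data rather than using a fixed word list.

```python
# Illustrative sketch of red-flag screening over messages.
# The indicator phrases below are assumptions, not the project's real features.
COERCIVE_INDICATORS = {
    "where are you",
    "who were you with",
    "answer me",
    "not allowed",
    "check your phone",
}

def flag_messages(messages):
    """Return the messages containing any indicator phrase.

    A trained NLP classifier would score messages statistically;
    this simple keyword scan only illustrates the red-flag idea.
    """
    flagged = []
    for msg in messages:
        text = msg.lower()
        if any(phrase in text for phrase in COERCIVE_INDICATORS):
            flagged.append(msg)
    return flagged

conversation = [
    "Where are you? Answer me now.",
    "See you at lunch tomorrow!",
]
print(flag_messages(conversation))  # only the first message is flagged
```

In practice, such a screen would run automatically over extracted conversations, so no investigator needs to read the unflagged messages, which is how the tool can protect victims’ privacy while still surfacing potential evidence.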
Professor Vanessa Bettison, who teaches criminal law at De Montfort University Leicester, is a leading consultant on the project, and she believes the tool may encourage more victims of domestic abuse and coercive control to come forward.
She said: “As technology has evolved, so have abusive and coercive behaviours. Perpetrators can now control light settings around the house and monitor the door or central heating systems, leaving victims vulnerable to abusive behaviours while they are by themselves at home.
“However, we still feel that language holds the key to identifying those who are at risk of being coerced or abused. This funding from the Home Office is allowing us to put this theory to the test and enabling us to see if an AI analysis tool is feasible.
“The police forces we have surveyed as part of our research are very much on board with the development of the tool and are keen to see this type of technology introduced, with the hope it can increase the chances of identifying more perpetrators of domestic abuse at an early stage.
“While we are right at the beginning of the project, I’m confident that what we’re developing will protect the privacy of the victim. Having the AI tool analyse conversations instead of an individual may encourage more victims to come forward as their private conversations won’t be manually sifted through.”
Dr Chris Magill from the University of Brighton, together with Denise Harvey and Dr Tirion Havard from London South Bank University, will undertake the survey with police and survivor organisations and document its findings in the research team’s Home Office report.
Once the feasibility study is done, the coalition of universities hopes to secure further funding to carry out more testing, eventually reaching a point where police forces can routinely offer this support to domestic abuse survivors.
To discover more about our courses at Edge Hill, please visit ehu.ac.uk/study.