Artificial Intelligence Is Not the Answer to Preventing the Murder of Women
Several true crime cases illustrate how AI is a poor replacement for educating law enforcement and teaching boys to be better men
By Ellie Taylor and Jessica Cash
One of the hardest things for a victim of domestic abuse to do is to tell someone. Abusers exert such significant psychological control that their victims are likely scared. They may think that no one will believe them about the abuse. The vast majority of domestic abuse victims never report their abusers to the police.
So, when a victim takes the brave step of telling someone, in particular the police, they deserve to be protected. All too often, we see that while the police may arrest an abuser, they'll make bail and come out even angrier with the victim. The victim may be granted a restraining order designed to protect them, and bail terms may even prevent the abuser from having contact with the victim. Still, as we highlighted in Harshita Brella's case, this often isn't enough to protect a woman, let alone save her life. Since reporting on Harshita's murder, we've looked into the murder of another woman by her ex-partner after she reported him to the police.
This is the story of Lina Balbuena and how complex technical solutions to simple law enforcement problems may not always be the best way.

Lina lived in the seaside town of Benalmádena, Spain, where the government has pushed local agencies to use analytics to predict the likelihood of further violence in domestic abuse cases. The approach is meant to solve a real problem - law enforcement officers are not good at predicting domestic violence - but Lina's case illustrates that analytics, machine learning and artificial intelligence are not likely to do any better.
On January 20 this year, Lina's ex-partner, identified only as Augustine, took her phone and refused to return it, raised his fist at her and threatened her. She reported him to the police, telling them he had hit her on previous occasions. Lina then had to answer 35 questions about the nature of the relationship and the abuse. Afterward, the police entered her answers into a web application called VioGén, whose algorithm assigns each victim one of five risk levels: negligible, low, medium, high or extreme. A lower score results in fewer resources; a higher score can mean increased police presence and, in some cases, 24-hour police surveillance of the person putting the victim at risk.
Lina was rated at a medium risk level, which entitled her to a call from the police within 30 days. The following day, Lina appeared in a specialized gender violence court, where prosecutors asked for a restraining order against Augustine. The judge rejected the request and cited VioGén's assessment as a primary reason.
Less than three weeks later, on February 9, Augustine entered Lina's home, where she lived with their three children, a son from a previous marriage and her mother. We don't know how it began, but their 11-year-old son walked in and found Augustine strangling Lina to death. He had punched and possibly stabbed her, and he attempted to hang her from the ceiling - believed to be an effort to make her death look like a suicide. The boy tried to get between his parents to protect Lina, and Augustine hit him as well, leaving him with cuts and bruises. Augustine then set the house on fire with his family inside - thankfully, they all escaped.

At 5:25 a.m., Augustine called the police to report the fire but did not mention that his family lived there or that Lina was still inside. When they arrived, he told them that the fire was accidental - his 11-year-old son said, "Dad, that's not true - you killed mom," and informed police she was still inside.
Lina was denied a restraining order and police protection - refused her only shot at safety by an algorithm. VioGén is supposed to be just one of many tools police and judges can use when assessing risk. In reality, a study by the Eticas Foundation found that officers accepted VioGén's evaluation 95 percent of the time. And the head of the National Police's family and women's unit in Malaga, Chief Inspector Isabel Espejo, would say only that the algorithm's risk calculation is "usually adequate."
Let us state now: Adequate is not a good enough standard when someone's life is under threat. Adequate is okay; it's fine. It's for a cheap wine at dinner or the Sunday matinee movie. But it's not the word you want to hear about an algorithm that carries people's lives through the system.
And Lina is not alone. Lobna Hemid was murdered by her husband in February 2022, two weeks after she was found to be at low risk. Stefany González Escarraman was murdered by her husband in 2016, a month after she was found to be at negligible risk. Eva Jaular and her 11-month-old daughter were murdered by her ex-boyfriend in 2021, six weeks after she was found to be at low risk. In July 2024, a 30-year-old woman, whose name has not been published, was murdered by her ex-partner after being classified as low risk.
Between 2007 and 2024, 990 women were killed in Spain by a current or former partner. Of those, 247 had been assessed by VioGén - Spain's Interior Ministry refused to release information about their risk classifications to The New York Times. Instead, The Times analyzed reports from 2010 to 2022 and learned the risk scores of 98 victims - 55 of them had been assessed as facing low or negligible risk. Yet The Times reported that "8 percent of the women who the algorithm found to be at negligible risk and 14 percent at low risk have reported being harmed again."
The Times series explores VioGén and the cases of several Spanish women who received a "low risk" rating and were then murdered, including 32-year-old Hemid and 26-year-old González Escarraman. In both cases, the police accepted the system's prediction. Hemid was fatally stabbed by her husband before he killed himself. Judges can serve as a check on VioGén. Still, despite coming into court with bruises and reporting that her husband had choked her - and strangulation is the leading predictor of whether a woman will be murdered - González Escarraman was denied protection. A month later, she was stabbed in the heart multiple times by her husband in front of her children.
Astonishingly, VioGén is reportedly programmed to assign only as many "extreme" risk classifications as the department can afford to act on, meaning that a woman can be given a lower risk status than she should get - and it's all because of money.
The secrecy around VioGén's results is nothing new - since the system was introduced more than 20 years ago, the Interior Ministry has never allowed an external audit of it. Like The Times, the Eticas Foundation was denied access to VioGén data after offering to conduct a confidential, pro bono internal audit. Eticas then conducted its own study, which found that 45 percent of those who reported domestic abuse were assessed as "low risk," and only one in seven was granted any form of police protection.
Mistakes can be made with or without an algorithm. Still, it is deeply concerning that VioGén consistently makes them while its recommendations are accepted 95 percent of the time. A computer program is not a substitute for better training on domestic abuse for all police officers, social workers and others in contact with victims.
The Interior Ministry's refusal to allow any independent assessment of VioGén also raises questions. In its full-throated defense of the system, the ministry claimed it was an "incontestable fact" that VioGén has reduced violence against women and pointed to its own figures, which state that repeat attacks now make up 15 percent of domestic abuse offenses, down from 40 percent. One of the system's creators said, "If it weren't for [VioGén], we would have more homicides and gender-based violence."
We're not saying that if Lina had been granted a restraining order, she'd have been safe or that a restraining order alone would have provided adequate protection. Harshita Brella's husband broke his restraining order within weeks, and as we wrote then, "We need to stop throwing pieces of paper like domestic violence protection orders at the problem, and we need to put our hearts, minds and resources behind effective solutions." But a restraining order is better than nothing. Nothing is what Lina got from her brave decision to tell the police - as she was deemed to be at "medium risk," the only action they were required to take was a phone call within 30 days.
Balbuena's murder and the failures that allowed it to happen demonstrate the worst aspects of humanity. The response from the local community represents the best. After the parish priest suggested a collection for Lina's children during Mass, locals donated 800 euros within an hour. In three days, they collected over 6,200 euros, which is about 5,300 pounds or 7,100 dollars. Children sold bracelets in the market to raise money, local schools carried out fundraisers, and a protest was organized with the slogan "Ni Una Menos" - Not One Less.
VioGén is a symptom of a wider problem, not its cause. The societal and law enforcement failures that allowed Lina, Stefany, Lobna, Eva and countless others to die at the hands of their abusers can't be solved by improving or abandoning an algorithm. Solving them requires us all to say Ni Una Menos - Not One Less - the slogan chanted by thousands protesting the murders of young women. We need to teach boys to be better men and create a culture in which victims feel able to tell people they are being abused, alongside a legal system capable of protecting them when they do: a system in which all police officers, prosecutors and judges are trained to assess the risk to those facing domestic violence, rather than leaning on an ill-equipped computer aid.
Jessica Cash and Ellie Taylor are the hosts of the Version of Events true crime podcast. If you are a victim of domestic violence or need resources for someone you care about click here or here.