Researchers in the United States have developed an artificial intelligence system that can help healthcare professionals identify patients at risk of intimate partner violence (IPV) years before they may seek help. The study, published in the journal Nature, highlights the potential of AI to support early intervention and improve patient outcomes.
Intimate partner violence, which includes abuse by current or former partners, can result in severe injuries, chronic pain, and mental health issues. According to a 2021 report by the European Commission, 18 percent of women who had ever had a partner reported experiencing physical or sexual violence. Detecting abuse early is often difficult, as victims may be reluctant to disclose it due to fear, stigma, or safety concerns.
To address this, researchers trained a machine learning model using hospital data from nearly 850 women who had experienced IPV and more than 5,200 patients of similar age who had not. Three AI models were developed to assess risk. The first used structured data such as age, medical history, and prior visits. The second analysed written medical notes, including physicians’ observations and radiology reports. The third combined both structured data and unstructured notes.
All three models demonstrated strong performance, but the combined system proved the most effective, correctly identifying risk in 88 percent of cases. The tool flagged potential abuse more than three years before many patients entered hospital-based intervention programmes.
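The article does not disclose the study's actual pipeline, but the combined approach it describes, structured fields and free-text notes feeding one classifier, can be sketched in a few lines. Everything below is illustrative: the feature names, toy records, labels, and the choice of TF-IDF plus logistic regression are assumptions, not the researchers' method.

```python
# Illustrative sketch of a "combined" risk model: structured features
# (age, prior visits) and unstructured clinical notes routed to separate
# encoders, then fit with a single classifier. All data and model
# choices here are hypothetical.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical toy records standing in for hospital data.
records = pd.DataFrame({
    "age": [34, 41, 29, 52, 38, 45],
    "prior_visits": [6, 1, 5, 0, 7, 2],
    "note": [
        "multiple fractures at different stages of healing",
        "routine check-up, no concerns",
        "bruising inconsistent with reported fall",
        "seasonal allergies",
        "repeat visit, facial injury, delayed presentation",
        "mild back pain after gardening",
    ],
})
labels = [1, 0, 1, 0, 1, 0]  # 1 = later-confirmed IPV (toy labels)

# Scale the structured columns, vectorise the note text, and fit one
# classifier on the concatenated feature matrix.
features = ColumnTransformer([
    ("structured", StandardScaler(), ["age", "prior_visits"]),
    ("notes", TfidfVectorizer(), "note"),
])
model = Pipeline([("features", features), ("clf", LogisticRegression())])
model.fit(records, labels)

# The output is a risk score, not a diagnosis: a signal prompting a
# clinician to look closer, matching the tool's intended role.
risk = model.predict_proba(records)[:, 1]
```

In practice, the per-stream encoders would be far richer (clinical embeddings for notes, temporal features for visit history), but the structural idea, two heterogeneous inputs merged before a single risk estimate, is the same.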
By analysing large volumes of hospital data, the system can detect patterns of physical trauma consistent with abuse. It alerts clinicians when a patient's records resemble those of confirmed IPV cases, enabling earlier support and intervention.
“This clinical decision support tool could make a significant impact on prediction and prevention of intimate partner violence,” said Qi Duan, program director at the US National Institutes of Health’s National Institute of Biomedical Imaging and Bioengineering. “Given the prevalence of cases, the tool could be a game-changing asset to public health.”
The AI system is designed to assist rather than replace clinicians. It does not diagnose abuse or require patients to disclose personal information. Instead, it provides signals that help doctors approach the subject carefully and offer support where appropriate. Bharti Khurana, an emergency radiologist at Mass General Brigham and associate professor at Harvard Medical School, said the tool represents a shift toward recognising risk earlier using information already available in healthcare data.
Researchers plan to integrate the technology into electronic medical record systems so hospitals can receive real-time assessments during routine care. By providing earlier warnings, the AI system could help healthcare professionals intervene sooner, potentially preventing further harm and improving outcomes for survivors of IPV.
