In a cross-disciplinary effort, a University of Iowa engineering professor and a social work professor are creating software that leverages artificial intelligence to detect violent behavior, aiming to enhance intervention strategies.
The technology functions like an advanced nanny cam that uses artificial intelligence to detect physical abuse. The device monitors passively until it detects potential abuse, at which point it begins recording the incident and sends the footage to the caregiver, who can then choose to forward it to authorities.
UI Mechanical Engineering Professor Karim Abdel-Malek said the idea for this technology came from his work as the director of UI’s Virtual Soldier Research Program, which develops predictive models to analyze body movements and interactions in virtual environments for military applications.
“So that led to an idea that if we could use this for soldiers, why not use it for other things?” Abdel-Malek said.
Abdel-Malek and a team of UI researchers patented this technology in 2020, at which time Abdel-Malek said he wanted to collaborate with an expert in child abuse. He envisions the first applications of this technology in settings like daycare, babysitting, and elder care, where instances of abuse can often go undetected and unreported.
He reached out to three UI faculty members in the field of social work and said Aislinn Conrad was the first to respond.
“I got a cold call from Karim last September, and he didn’t even say what it’s about,” Conrad said. “But I just knew, I just had this feeling, this is going to be a big deal.”
Before her appointment at UI, Conrad worked as a child welfare investigator and in foster care reintegration. She later earned a Ph.D. and a Master of Social Work to further specialize in child welfare.
“I decided to dedicate my research life to child abuse prevention because, unfortunately, in society, violence feels like a very intractable problem,” Conrad said, noting that one in nine children will experience abuse or neglect by the age of 17, and one in three women will experience violence in their lifetime.
“And that’s not even accounting for people who are elderly, who can experience horrific acts of violence at the hands of caregivers,” Conrad said.
She believes this technology will transform investigations by overcoming a key challenge: evidence of abuse is typically collected retroactively, only after an incident has been reported.
In Iowa, the Department of Health and Human Services assessed 26,613 reports of child abuse in 2023. Of those, 70 percent, involving more than 18,000 children, resulted in a finding of "not confirmed," indicating there was insufficient evidence to substantiate the abuse allegations.
“This is a game changer,” Conrad said. “This is going to change the paradigm of how we respond to violence in a meaningful way.”
Abdel-Malek added that this technology aims to identify abuse before it escalates and causes noticeable signs.
“The idea here would be before or just right at,” Abdel-Malek said. “So right away, you can interfere.”
Conrad emphasized the importance of this technology for children who are victims of abuse, as they often cannot articulate their experiences or understand the need to report them.
She added that it will create the opportunity to protect children proactively and course-correct with the aggressor.
“Sometimes it’s a lack of emotional regulation. It’s a lack of understanding that the behavior is not okay,” Conrad said. “There’s a lot of ways that this is going to alter the landscape for everybody.”
Abdel-Malek said UI students will have an important role in the ongoing development of this technology.
“We’re going to need actors to replicate different behaviors on dummies,” Abdel-Malek said, explaining how the AI will be trained.
He emphasized that the technology’s effectiveness depends on collecting data from people of diverse heights, weights, hairstyles, and clothing, allowing it to accurately recognize a wide range of abusive actions.