UNO researchers make progress on preventing violent acts through chatbots

UNO researchers are in Phase 2 of developing a chatbot that could be used to ward off potential threats.
Published: Mar. 24, 2023 at 4:55 PM CDT

OMAHA, Neb. (WOWT) - Researchers at the University of Nebraska-Omaha are making progress on a new approach to help stop violence before it even starts. They’ve begun testing a chatbot prototype that will eventually be used to help people report critical information.

Dr. Erin Kearns, a researcher at UNO, says people often suspect something is wrong, but they do not speak up.

“Many people in the public understandably don’t necessarily know what really are things that should be reported.”

Kearns is leading a research project at UNO along with Dr. Joel Elson, aimed at quickly getting more complete information from people who report suspicious activity to the people who need it: law enforcement, mental health professionals, and other first responders.

The research is taking place at the National Counterterrorism Innovation, Technology and Education Center, or NCITE, at UNO. It is funded through a $715,000 grant from the Department of Homeland Security.

“When it comes to suspicious activity reporting, this can often be sensitive or difficult-to-talk-about material, and in some circumstances, using technology instead of a person helps to facilitate trust,” Elson said.

That’s why the research team is developing a chatbot: a system that lets people report suspicious behavior to a computer instead of to a live person.

Now in Phase 2, the team is testing a chatbot prototype.


“We want to put the power of information sharing into the person who is collaborating, who is willing to disclose information, to share information,” said Elson.

The chatbot will be embedded in reporting platforms that already exist, and the information it receives will be fed into existing networks.

Elson says the chatbot will help protect people’s identities.

“We want to make sure that individual feels extremely comfortable and protected. We want to make sure our privacy and societal values are maintained,” he said.

With Kearns’ background in criminology and Elson’s background in psychology and technology, the team brings a unique combination of perspectives that is moving it closer to removing barriers to reporting.