Closer collaboration on health technology

UiS has strengthened its collaboration with Laerdal Medical. One of the projects aims to make emergency calls to 113 easier to handle using artificial intelligence (AI).

Excerpt from the screen an operator at the AMK emergency dispatch centre looks at when using the AI solution AiCOS. Photo: Laerdal Medical

The Department of Electrical Engineering and Computer Science at the University of Stavanger (UiS) has recently strengthened its collaboration with Laerdal Medical. Two new agreements have been signed, covering the next three years. In total, Laerdal is contributing NOK 3 million to research in health technology.

One of the projects benefiting from the collaboration agreement is led by Øyvind Meinich-Bache, Adjunct Associate Professor at UiS and Senior Data Scientist at Laerdal Medical. He works with signal, image and video processing and analysis.

The first few minutes are critical

Meinich-Bache's project, "AI in Chain of Survival", applies artificial intelligence to emergency medical communication. Analysing the caller's words, voice and emotional state will help emergency medical operators identify incidents more precisely. The goal is to make it easier for operators to guide callers and provide life-saving responses in stressful situations. Understanding language, emotions and cognition through AI-powered speech analysis enables more accurate and effective care.

“Research has shown that the greatest opportunity to save lives in time-critical emergencies lies in the first stages, before ambulance personnel arrive on the scene. What the caller and the emergency centres do together plays an important role,” says Meinich-Bache.

The emergency control centres are receiving an increasing number of calls. At the same time, the use of video calls has become more common. This means that the centre employees have more to pay attention to: they must interpret both what is said and what they see on the screen.

“Our system combines speech-to-text, to understand what is being said, with image analysis of the video stream from the caller's phone. The system interprets the patient's symptoms and provides medical advice based on the protocol used by emergency operators. This means that the person receiving the call can focus more on the caller and the dialogue. It becomes easier to decide what is urgent and what kind of help the patient should be given,” says Meinich-Bache.
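The flow described above, transcribed speech in, protocol-based advice out, can be sketched as a toy lookup. Everything here is hypothetical: the keyword phrases, symptom codes and advice strings are invented for illustration, and the real AiCOS system and its medical protocol are not public.

```python
# Toy sketch: map phrases detected in a call transcript to protocol-style
# advice. All keywords, codes and advice below are illustrative inventions.

SYMPTOM_KEYWORDS = {
    "not breathing": "cardiac_arrest",
    "chest pain": "possible_heart_attack",
    "unconscious": "unconscious_patient",
}

PROTOCOL_ADVICE = {
    "cardiac_arrest": "Start chest compressions immediately.",
    "possible_heart_attack": "Keep the patient still and calm.",
    "unconscious_patient": "Check breathing; place in recovery position.",
}

def suggest_advice(transcript: str) -> list[str]:
    """Return advice for each symptom phrase found in the transcript."""
    text = transcript.lower()
    found = {code for phrase, code in SYMPTOM_KEYWORDS.items() if phrase in text}
    return [PROTOCOL_ADVICE[code] for code in sorted(found)]

print(suggest_advice("He collapsed and he is not breathing!"))
```

A production system would of course replace the keyword lookup with trained speech and language models, but the shape of the pipeline, transcript analysis feeding protocol-driven suggestions, is the same.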

Language and Alzheimer’s

“AI in Chain of Survival” is just one of the projects covered by the new collaboration agreement. Another project funded by the agreement is led by Mina Farmanbar, Associate Professor of Computer Science. She is researching how we can use artificial intelligence and speech and language technology to detect symptoms of Alzheimer’s earlier.

"In our project, we investigated a wide range of linguistic features across multiple languages. One of the challenges is multilingual detection of Alzheimer’s. This challenge has paved the way for new research, both language-specific approaches and methods that improve the model and its robustness. Now we are expanding the AI ​​models by incorporating emotional state and so-called prosodic features to explore how it can distinguish Alzheimer’s patients from healthy controls," Farmanbar says.

Prosodic features are the rhythm, melody and stress of speech. They are tied not to individual sounds, but to syllables, words and sentences.
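Two of these cues can be estimated directly from raw audio. The sketch below is illustrative only, not the project's actual feature extraction: it approximates pitch (melody) from the zero-crossing rate and loudness (a crude proxy for stress) from RMS energy, applied here to a synthetic pure tone.

```python
import math

def prosodic_sketch(samples: list[float], sample_rate: int) -> dict[str, float]:
    """Toy estimates of two prosodic cues from raw audio samples:
    pitch via zero-crossing rate, and loudness via RMS energy.
    Real systems use far more robust estimators."""
    # A pure tone crosses zero roughly twice per period,
    # so crossings per second ~ 2 * fundamental frequency.
    crossings = sum(1 for a, b in zip(samples, samples[1:]) if (a < 0) != (b < 0))
    duration = len(samples) / sample_rate
    pitch_hz = crossings / (2 * duration)
    # Root-mean-square energy as a crude proxy for stress/emphasis.
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return {"pitch_hz": pitch_hz, "rms_energy": rms}

# A 200 Hz sine tone, one second at 8 kHz: pitch ~200 Hz, RMS ~0.707.
tone = [math.sin(2 * math.pi * 200 * n / 8000) for n in range(8000)]
print(prosodic_sketch(tone, 8000))
```

In real speech, tracking how these values rise and fall across syllables and sentences, rather than their averages, is what captures rhythm, melody and stress.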

Other research areas covered by the collaboration agreement between UiS and Laerdal are artificial intelligence and empathetic communication between patient and healthcare worker, and signal processing to support the stabilization of newborns with breathing difficulties.

Text: Kjersti Riiber