NewbornTime – participant information

The NewbornTime project aims to improve newborn care by using artificial intelligence (AI) for activity and event recognition in videos recorded during and immediately after birth.

Consent form

Here you can find the consent information. This is the same text as you will find in the digital platform where you can give your consent.

Study registered in ISRCTNregistry, number: ISRCTN12236970

Information video for the NewbornTime project

This QR code leads to the digital platform where consent can be granted or withdrawn.

Approximately 5% of newborns need breathing assistance immediately after birth to cope with the transition from fluid-filled lungs in the mother's womb to breathing on their own. Oxygen deprivation during and after birth can lead to birth asphyxia, a leading cause of death for newborns, as well as cerebral palsy and other long-term damage. To avoid damage, it is important that breathing assistance starts quickly. Therefore, this project will develop a newborn timeline, with automated registration of the time of birth and of the treatment administered until the baby begins breathing. The timeline documents which events took place so that healthcare professionals can learn from them, detect deviations, and identify areas where better routines or training are needed. Respiratory activities include stimulation, clearing airways and performing bag-mask ventilation. NewbornTime is making every resuscitation a learning event.

Data collection

The exact time of birth is recorded using thermal cameras that detect temperature differences, so-called infrared (IR) cameras. These cameras emit no radiation and do not record sound. In a thermal video, people are visible because they are warmer than their surroundings, but it is not possible to recognize them. See the example image taken at Stavanger University Hospital (SUS). IR cameras are installed in four delivery rooms and in the operating room where caesarean sections are performed at SUS.

Newborns who need breathing assistance are moved to treatment rooms where regular video and audio are recorded from the treatment station. In the regular video, only the employees' hands and the newborn are visible. In this room there is also a thermal camera to record the number of people participating in the treatment.

For a complete and accurate description of the newborn timeline, SUS will combine relevant information from the mother's birth record and the child's medical record with the video recordings. This information will be used for teaching, medical research and technology development. All medical treatment of the mother and newborn remains the same and is not affected by the project. There are no advantages or disadvantages for the mother or newborn until the technology has been developed and put into use.

Artificial intelligence in video

The NewbornTime project will produce a timeline describing events and activities performed during birth and newborn resuscitation. The accurate time of birth will be detected using artificial intelligence (AI) models applied to thermal videos collected in the delivery room. Activity recognition will be performed using AI, in the form of deep convolutional neural networks, on thermal and optical video from the resuscitation table. The system will be designed to recognize multiple, time-overlapping activities. The AI models will be made robust, reliable, general and adaptive so that they can be used at different hospitals and in other acute settings.
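To illustrate what "multiple, time-overlapping activities" means in practice, here is a minimal, hypothetical sketch (not the project's actual code): per-frame multi-label predictions from an AI model are merged into a timeline of activity intervals, where activities such as stimulation and ventilation may overlap in time. The label names and frame rate are illustrative only.

```python
# Hypothetical sketch: turn per-frame, multi-label predictions into a
# timeline of possibly overlapping activity intervals (in seconds).

def frames_to_intervals(frame_predictions, fps=1.0):
    """frame_predictions: one set of active activity labels per video frame.
    Returns {label: [(start_s, end_s), ...]}; intervals may overlap across labels."""
    intervals = {}
    open_since = {}  # label -> frame index where the current run started
    for i, labels in enumerate(frame_predictions):
        for label in labels:
            open_since.setdefault(label, i)          # activity starts (or continues)
        for label in list(open_since):
            if label not in labels:                  # activity ended on this frame
                start = open_since.pop(label)
                intervals.setdefault(label, []).append((start / fps, i / fps))
    for label, start in open_since.items():          # close activities still running
        intervals.setdefault(label, []).append((start / fps, len(frame_predictions) / fps))
    return intervals

# Example: stimulation runs frames 0-2, ventilation frames 1-3; they overlap.
preds = [{"stimulation"}, {"stimulation", "ventilation"},
         {"stimulation", "ventilation"}, {"ventilation"}, set()]
timeline = frames_to_intervals(preds, fps=1.0)
```

In this example, the resulting timeline contains stimulation from 0 to 3 seconds and ventilation from 1 to 4 seconds, with a two-second overlap, which is exactly the kind of overlapping structure the recognition system must handle.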

Medical related research and development

The newborn timeline will be used to evaluate compliance with guidelines and to identify successful patterns of resuscitation activities / breathing assistance. By generating automatic timelines for many hundreds, and perhaps thousands, of such breathing assistance events, we can gather objective data on breathing assistance in a way that has not been possible before. By examining statistics and connections in the timelines, linked to how the child responded, it is possible to learn what works best and potentially challenge the currently accepted procedures.

The timelines can also be used as part of a data-driven quality improvement tool where a hospital can see if the statistics improve over time, through training and other measures.

Furthermore, the timelines can be used as an objective debriefing tool, and in the longer term they may also prove useful for treatment support.

Data platform

In the project, we are developing a digital consent tool that is secure and easy to use, both for granting and withdrawing consent. The digital consent system is linked to the video data collection, so recordings are stored automatically only when the mother has given prior consent. Additionally, video data is stored only in de-identified form on the data platform, with strict access control.
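The consent gate described above can be sketched as a simple rule: a recording is kept, in de-identified form, only if consent has been registered beforehand. The following is a purely illustrative sketch; the function, registry and storage names are hypothetical and not the project's real API.

```python
# Hypothetical sketch of consent-gated storage: a recording is stored
# (de-identified, access-restricted) only when prior consent is on record.

def maybe_store(recording_id, consent_registry, storage):
    """Keep the recording only if consent was granted; otherwise discard it."""
    if consent_registry.get(recording_id) == "granted":
        storage[recording_id] = {
            "video": f"deidentified-{recording_id}",  # identity removed before storage
            "access": "restricted",                   # strict access control
        }
        return True
    return False  # no consent (or withdrawn): nothing is stored

# Example: only the case with granted consent ends up in storage.
consent = {"case-1": "granted", "case-2": "withdrawn"}
storage = {}
maybe_store("case-1", consent, storage)  # stored
maybe_store("case-2", consent, storage)  # discarded
```

The design point is that the check happens before anything is written, which mirrors the document's statement that data is "automatically stored when mothers have given their prior consent" rather than stored first and deleted later.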


The project is a collaboration between the University of Stavanger (UiS), Stavanger University Hospital (SUS), Laerdal Medical and BitYoga. UiS, SUS and Laerdal have long experience of collaborative research on newborn resuscitation, and have documented promising results on detecting activities in resuscitation videos from a hospital in Tanzania.

In the NewbornTime project, data collection will be performed at SUS. BitYoga and Laerdal will provide a smart, secure and GDPR-compliant data platform. A solution for digital consent has been developed for piloting by BitYoga and Laerdal Medical. UiS will develop adaptive AI methods for activity recognition in videos.


The Regional Committee for Medical and Health Research Ethics has made a research ethics assessment and approved the project (REK number 222455).

On behalf of the University of Stavanger, NSD - Norwegian Center for Research Data AS has assessed that the processing of personal data in this project is in accordance with privacy regulations (NSD no. 816989).