In Search of Life: Learning From Synthetic Data to Detect Vital Signs in Videos

Sep 29, 2020
Authors: Florin Condrea, Victor-Andrei Ivan, Marius Leordeanu

Description: Automatically detecting vital signs in videos, such as the estimation of heart and respiration rates, is a challenging research problem in computer vision with important applications in the medical field. One of the key difficulties in tackling this task is the lack of sufficient supervised training data, which severely limits the use of powerful deep neural networks. In this paper we address this limitation through a novel deep learning approach, in which a recurrent deep neural network is trained to detect vital signs in the infrared thermal domain from purely synthetic data. What is most surprising is that our novel method for synthetic training data generation is general, relatively simple, and uses almost no prior medical domain knowledge. Moreover, our system, which is trained in a purely automatic manner and needs no human annotation, also learns to predict the respiration or heart intensity signal for each moment in time and to detect the region of interest that is most relevant for the given task, e.g. the nose area in the case of respiration. We demonstrate the effectiveness of our proposed system on the recent LCAS dataset, where it obtains state-of-the-art performance.
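To make the synthetic-supervision idea concrete, the sketch below generates a noisy periodic "respiration intensity" trace and recovers its rate from a spectral peak. This is an illustrative toy, not the paper's actual generator or recurrent network: the function names, the sinusoid-plus-noise model, and the FFT-based rate estimator are all assumptions introduced here, since the abstract does not specify the method's details.

```python
import numpy as np

def synth_respiration_signal(rate_hz=0.25, fps=30.0, seconds=40, noise=0.1, seed=0):
    """Hypothetical synthetic respiration trace: a sinusoid at the target
    breathing rate plus Gaussian noise. A stand-in for the paper's
    (unspecified) synthetic training-data generator."""
    rng = np.random.default_rng(seed)
    t = np.arange(int(seconds * fps)) / fps
    return np.sin(2 * np.pi * rate_hz * t) + noise * rng.standard_normal(t.size)

def estimate_rate(signal, fps=30.0):
    """Recover the dominant frequency via an FFT peak -- a simple spectral
    baseline, not the recurrent network described in the paper."""
    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fps)
    return freqs[spectrum.argmax()]

sig = synth_respiration_signal(rate_hz=0.25)  # 0.25 Hz = 15 breaths/min
print(estimate_rate(sig))
```

In a learning setup, many such traces (with randomized rates, amplitudes, and noise) would serve as labeled training pairs, which is the general appeal of synthetic supervision: labels come for free.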
