In hospitals around the globe, premature babies are fighting for their lives. These newest additions to our world face an uphill battle – born as many as four months premature and confined to incubators in neonatal intensive care units (NICUs). The doctors and nurses around them do everything they can to give them a chance at a full and normal life, while also trying to keep them comfortable and calm. But this can be challenging, as keeping tabs on their vital signs like pulse, breathing rate, blood oxygenation and blood pressure requires wires covering the babies’ arms, legs and chests.
Mayank Kumar, a third-year Ph.D. student at Rice University, working with Profs. Ashok Veeraraghavan and Ashutosh Sabharwal, has set out to change this.
“In January 2013, we visited Texas Children’s Hospital here in Houston, and the biggest challenge we saw was the wires placed on the babies’ extremely sensitive skin,” Mayank said. “These wires have monitored premature babies for decades with little change in the technology. To us, it felt cruel to a baby’s sensitive skin.”
To solve this problem, the Rice team of Mayank, Ashok and Ashu set out to find a way to monitor a baby’s vital signs without any physical connections: using cameras to measure pulse and breathing rate without wires.
When a person’s heart pumps blood, the volume of blood in the arteries and veins directly beneath the skin increases and decreases. This change in blood volume produces a change in skin color that is invisible to the human eye but can be recorded by a camera. Using signal-processing algorithms, the vital-signs system the researchers hoped to create could analyze the camera’s readings to measure a patient’s pulse and breathing rate.
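The core idea can be sketched in a few lines of Python. This is an illustrative example, not the team’s actual algorithm: it takes the average color of a skin region in each video frame, then looks for the dominant frequency in the plausible human pulse band. The function name and the synthetic test signal are hypothetical.

```python
import numpy as np

def estimate_pulse_bpm(mean_skin_signal, fps):
    """Estimate pulse rate from a per-frame average skin-color signal.

    mean_skin_signal: 1-D array, e.g. the mean green-channel value of a
    skin region in each video frame. fps: camera frame rate in Hz.
    """
    x = np.asarray(mean_skin_signal, dtype=float)
    x = x - x.mean()                          # remove the DC (baseline) component
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)
    # Only consider frequencies in a plausible human pulse range (42-240 bpm)
    band = (freqs >= 0.7) & (freqs <= 4.0)
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return peak_freq * 60.0                   # Hz -> beats per minute

# Synthetic example: a 1.5 Hz (90 bpm) "pulse" buried in noise, 30 fps video
np.random.seed(0)
fps = 30
t = np.arange(0, 10, 1.0 / fps)
signal = 0.5 * np.sin(2 * np.pi * 1.5 * t) + np.random.randn(t.size) * 0.2
print(estimate_pulse_bpm(signal, fps))  # → 90.0
```

The same spectral-peak idea extends to breathing rate by searching a lower frequency band (roughly 0.1–0.7 Hz).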
But there were a number of issues that Mayank and his team immediately encountered. First, most NICUs keep the lights dimmed for the babies’ comfort, making it difficult for the camera to capture a clean image in low light (much like trying to take a picture at night without a flash).
The second challenge was skin tone. The darker the skin of the baby, the harder it would be for the camera to pick up the skin color changes as the blood volume increased and decreased.
The final challenge involved motion: if the baby moved in any way, the motion would corrupt the readings.
“Our research boiled down to determining how we can improve the signal strength and be robust against motion,” said Mayank, whose Ph.D. is funded by a TI Fellowship.
To begin their research, the team needed an open, raw signal as a baseline against which to compare the signal strength of different algorithms. Mayank spent time speaking with companies about getting access to this raw data, but the solutions offered were difficult to use, costly or simply unavailable. Only our AFE4490 integrated analog front end for pulse oximeters provided the baseline information Mayank and his team needed – and for less than $50.
“The open nature of TI’s software, with the waveform in raw format, was absolutely critical, since most other companies that make pulse oximeters do not disclose this information. As a result, we could establish a baseline we couldn’t have obtained otherwise,” he said.
Using our technology as a baseline, the team started developing algorithms to enhance the signal strength and robustness of the camera systems. Within months, the signal strength improved by 3-6 decibels, which Mayank said is two to four times better than other known methods for camera-based vital sign estimation. The team’s system is called CameraVitals.
“At TI, we know that great innovations happen when creative people apply technological building blocks to real-world problems in unexpected ways, and in the process, create entirely new capabilities that enhance the quality of life,” said Umit Batur, TI manager of the perception and analytics R&D lab. “Therefore, TI actively supports innovators at universities with its diverse and open portfolio in analog and embedded processing, and helps them overcome the painstaking challenges of innovation. It is all worth it in the end when we see the positive impact of the innovation on human life.”
How does CameraVitals work? Mayank said some parts of the face are better for extracting vital signs than others. It all depends on the depth and density of the veins and arteries in a particular region. For example, veins and arteries are closer to the skin on the forehead than on the cheek or chin, making the forehead a better region for extracting camera-based vital signs.
Previous algorithms gave equal weight to every part of the face, averaging the signal over the whole face regardless of the depth and density of veins and arteries. Rice University’s new algorithm gives more weight to regions like the forehead and less to the cheeks and chin – but because this depth and density varies from person to person, the algorithm must be flexible enough to accommodate each person’s unique features.
“We divide the face into small regions, down to pixels, and assign a number for each pixel, saying how good that pixel is in giving the pulse signal,” Mayank explained. “A zero is not good enough and we will not consider it in the algorithm, but a one will be considered. The result is a virtual ‘facemask’ that can be generated from a 10-second analysis of the video.”
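The article doesn’t spell out exactly how each pixel is scored, so the Python sketch below (all names hypothetical) stands in one plausible metric: each small face region’s quality is measured by its spectral signal-to-noise ratio at an estimated pulse frequency, then thresholded into the 0/1 “facemask” Mayank describes.

```python
import numpy as np

def pulse_facemask(region_signals, fps, pulse_hz, snr_threshold=3.0):
    """Score each face region by how strongly it carries the pulse signal.

    region_signals: array of shape (n_regions, n_frames) - one color
    time series per small face region (down to single pixels).
    Returns a 0/1 mask: 1 = region kept by the algorithm, 0 = discarded.
    """
    n_frames = region_signals.shape[1]
    centered = region_signals - region_signals.mean(axis=1, keepdims=True)
    spectra = np.abs(np.fft.rfft(centered, axis=1))
    freqs = np.fft.rfftfreq(n_frames, d=1.0 / fps)
    pulse_bin = np.argmin(np.abs(freqs - pulse_hz))
    # Signal-to-noise: energy at the pulse frequency vs. average energy elsewhere
    signal_power = spectra[:, pulse_bin]
    noise_power = (spectra.sum(axis=1) - signal_power) / (spectra.shape[1] - 1)
    snr = signal_power / (noise_power + 1e-12)
    return (snr >= snr_threshold).astype(int)

# Example: one region carrying a clean 1.5 Hz pulse, one flat region with none
t = np.arange(300) / 30.0
regions = np.stack([np.sin(2 * np.pi * 1.5 * t), np.zeros(300)])
print(pulse_facemask(regions, fps=30, pulse_hz=1.5))  # → [1 0]
```

The masked regions can then be averaged together to produce a single, stronger pulse waveform, which matches the weighting idea described above.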
An algorithm is developed based on each patient’s virtual facemask, running as real-time software on Mayank’s laptop. The team tested the system’s pulse and breathing-rate readings across all skin tones. But their work is only half done: the team is still developing techniques to determine blood oxygenation and blood pressure from a camera.
“I feel great having solved half of this problem, and the solution is right here in my laptop,” Mayank said. “But I’ll feel even better once we figure out how to read the other two vital signs.”