As the caricatured faces of smartphone emoticons reveal, color is integral to the perception of various ailments. The chartreuse pout perfectly captures the feeling of nausea, while the icy blue grimace effectively conveys hypothermia.
Beyond providing a colorful shorthand to communicate the state of one’s health, smartphones can actually measure specific hues to help diagnose disease. Researchers at University College London and the University of Ghana have developed smartphone colorimetry technology to screen for conditions that give rise to unique pigmentation changes in bodily surfaces. Their color science-based approach quantifies color in specific facial regions to identify children affected by jaundice and anemia. By creating an accessible and affordable screening tool, the team hopes to improve detection, timely intervention, and long-term outcomes for pediatric patients in low-resource communities.
If it’s yellow
Babies with jaundice typically exhibit a yellowish tinge to the skin and the whites of the eyes. This color change results from excess blood levels of bilirubin, a yellow substance produced when red blood cells degrade. While jaundice is common in newborns because their immature livers may not effectively clear bilirubin, infants in sub-Saharan Africa are especially at risk due to the prevalence of glucose-6-phosphate dehydrogenase deficiency, a genetic disorder that causes red blood cells to break down prematurely. Jaundice may require phototherapy or more invasive treatments, but if left unchecked, it can have devastating consequences. “If it gets to a dangerous level, it can cause brain damage, and some babies will even die. Worldwide, it's still a very important cause of death and disability,” said Judith Meek, a neonatal care researcher at University College London.
In many parts of the world, babies often leave the hospital before jaundice can set in. To screen for jaundice, midwives who check on the baby at home may use a transcutaneous bilirubinometer, a device that analyzes the light reflected by the skin and the tissue underneath to provide a measure of blood bilirubin levels. In low-resource areas in Ghana and other countries, however, these expensive instruments are often not available, and community health workers who visit homes must rely on what they can see. “In a lot of these settings, which is where I'm from, our skin color is dark. So, if the baby is yellow, you might not notice,” said Christabel Enweronu-Laryea, a pediatric health researcher at the University of Ghana. Assessing the yellowness of the newborn’s eye with a simple visual examination is also extremely prone to error. “You need tools that are available everywhere, like smartphones, that might have the capacity to, to a large extent, objectively assess the level of jaundice better than the naked eye of a community health worker,” Enweronu-Laryea said.
The researchers set out to develop a smartphone imaging application that could quantify the yellow pigment to screen for jaundice. They decided to focus on the sclera, the white outer layer of the eyeball, as it lacks skin pigmentation and provides an unbiased canvas for babies of all ethnicities. Obtaining a rigorous color measurement, however, wasn’t as simple as snapping a photo of the eye. “With two smartphones, even from the same manufacturer — let's say Samsung smartphones — when you take a color picture, sometimes the color images are slightly different,” said Terence Leung, a biomedical engineer at University College London. “We're trying to use a smartphone camera as a scientific instrument. That's why these subtle differences need to be compensated for.”
The team developed a one-time calibration procedure to standardize images across different smartphones. They use the phone to capture an image of a color checker, a commercially available card containing squares of many different colors that are painted on rather than printed to ensure color consistency. By analyzing how the smartphone images the color checker, they can correct for the characteristics of an individual device.
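The idea of correcting for a device's color response can be sketched as a least-squares fit: given the RGB values a phone records for the checker's patches and the patches' known reference colors, solve for a correction matrix that maps one onto the other. This is a minimal illustration with made-up patch values and a simulated device distortion, not the researchers' actual calibration procedure.

```python
import numpy as np

# Hypothetical reference RGB values for a few color-checker patches
# (commercial cards have many more patches with published values)
reference = np.array([
    [115,  82,  68],   # dark skin
    [194, 150, 130],   # light skin
    [ 98, 122, 157],   # blue sky
    [255, 255, 255],   # white
    [  0,   0,   0],   # black
], dtype=float)

# RGB values the phone "recorded" for the same patches, simulated here
# with an invented per-device distortion matrix
device_distortion = np.array([[0.90, 0.05, 0.00],
                              [0.05, 1.10, 0.02],
                              [0.00, 0.03, 0.85]])
measured = reference @ device_distortion.T

# Least-squares fit of a 3x3 correction matrix M so that measured @ M
# best reproduces the reference colors
M, *_ = np.linalg.lstsq(measured, reference, rcond=None)
corrected = measured @ M

# With a clean linear distortion, the correction recovers the
# reference colors almost exactly
max_error = np.abs(corrected - reference).max()
```

Once fitted, the same matrix can be applied to every subsequent photo from that phone, which is why a one-time calibration suffices.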
Another factor that influences the color of an image is lighting. To account for this source of variation, the researchers capture two pictures of the baby’s eye: one with the smartphone camera flash, and one without. When the flash is on, both the flash and the ambient lighting illuminate the eye. When the flash is off, only the ambient lighting shines on the eye. By processing the two pictures to subtract out the effects of the ambient lighting, the team can normalize the images across different environments. The technology displays a signal-to-noise ratio to pinpoint image pairs that are too similar for the subtraction processing due to insufficient flash illumination or very bright ambient light. In that case, the user can try taking the picture closer to the eye or darkening the room to maximize the difference with and without the flash.
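The flash/no-flash trick rests on the linearity of light: the flash image contains ambient plus flash illumination, so subtracting the no-flash image leaves a picture lit only by the known flash. The sketch below simulates that subtraction and a simplistic similarity check in the spirit of the signal-to-noise ratio the article describes; the specific ratio and threshold are assumptions, not the team's method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated linear-intensity images of the same scene (values in [0, 1]):
# the ambient-only contribution and the extra light added by the flash
ambient = rng.uniform(0.2, 0.4, size=(64, 64, 3))
flash_contribution = rng.uniform(0.3, 0.5, size=(64, 64, 3))

no_flash_img = ambient                     # flash off: ambient light only
flash_img = ambient + flash_contribution   # flash on: ambient + flash

# Subtracting the pair removes the ambient lighting, leaving an image
# illuminated only by the flash, whose color is known and constant
flash_only = flash_img - no_flash_img

# A crude quality check: if the two images are nearly identical (weak
# flash or very bright ambient light), the difference is mostly noise
# and the pair should be rejected
snr = flash_only.mean() / (no_flash_img.std() + 1e-9)
pair_ok = snr > 1.0
```

In this noise-free simulation the subtraction recovers the flash contribution exactly; with real sensor noise, the ratio check is what flags pairs that are too similar to use.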
With the standardized images in hand, the researchers identify a representative region of the sclera to analyze, avoiding areas with interfering light reflections or prominent blood vessels. To explore how to best quantify the color, the team conducted a pilot study in which they collected eye images and blood samples from newborns at University College London Hospital Neonatal Care Unit (1). They built mathematical models that calculate a scleral bilirubin value from a yellowness measure extracted from the red, green, and blue (RGB) coordinates of the digital image and convert it to a predicted blood bilirubin concentration. They compared their results to the newborns’ blood bilirubin levels and identified a model that gave a high correlation, indicating that it could predict jaundice status.
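The modeling step can be illustrated with a toy version of the pipeline: derive a yellowness number from scleral RGB values, fit a model against measured blood bilirubin, and check the correlation. The yellowness measure here (how far the blue channel falls below the red/green average, since yellow is the absence of blue in RGB) and the simulated data are illustrative assumptions, not the measure or model the researchers selected.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical training data: mean scleral RGB per newborn and the
# matched blood bilirubin level (all values simulated)
n = 30
bilirubin = rng.uniform(50, 300, size=n)
r = 200 + rng.normal(0, 2, n)
g = 190 + rng.normal(0, 2, n)
b = 180 - 0.2 * bilirubin + rng.normal(0, 2, n)  # yellower at higher levels

# A simple yellowness measure: blue deficit relative to red/green
yellowness = (r + g) / 2 - b

# Fit a linear model predicting blood bilirubin from yellowness
slope, intercept = np.polyfit(yellowness, bilirubin, 1)
predicted = slope * yellowness + intercept

# Correlation between predicted and measured bilirubin indicates how
# well the model could flag jaundice
corr = np.corrcoef(predicted, bilirubin)[0, 1]
```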
The researchers then evaluated their smartphone application, called neoSCB, in a larger population of newborns in Ghana in both hospital and rural settings (2). They captured images of the newborns’ eyes, typically while the babies were alert during breastfeeding, and calculated a predicted blood bilirubin concentration. They also collected transcutaneous bilirubinometer measures of blood bilirubin levels and used each set of data to identify newborns that should receive a diagnostic blood test based on an established threshold. By comparing their results to blood bilirubin measures from a blood test, they found that the neoSCB application correctly designated approximately 94 percent of positive jaundice cases and approximately 73 percent of negative cases, which is comparable to the sensitivity and specificity of the transcutaneous device.
The researchers found that, compared to the smartphone-predicted blood bilirubin levels, the blood bilirubin measures from the commercial transcutaneous bilirubinometer showed a stronger correlation with the actual blood bilirubin concentration values. “If we want to use this app as a screening device, it’s as good as the commercial device. But then if you want to get a measure of the bilirubin, in that case, the commercial device is more accurate,” Leung said.
The team observed that neoSCB tends to underestimate blood bilirubin levels at higher values. One possible explanation is that at a certain point, the yellowness measure may plateau rather than continue to increase proportionally with bilirubin concentration. While the application would still indicate that the level is above the threshold in these cases, the transcutaneous bilirubinometer, interestingly, did not provide a numerical reading at all under similar conditions. Additionally, just as the transcutaneous device has guidelines for compatible gestational age, the researchers determined that their application should not be used with babies born at less than 37 weeks of gestation because it underestimates yellowness, possibly due to their thinner scleras.
Overall, the mothers involved in the study were receptive to the smartphone technology. “It wasn't a problem, because in the setting where the study was done, the traditional way for us to check for yellowness in babies is by looking at their eyes,” Enweronu-Laryea said. “When we went to rural settings, there were some mothers who thought we were taking pictures of their babies. And in some cultural practices, you shouldn't be taking pictures of the baby before the [outdooring ceremony], which is a traditional timeline.” The researchers reassured the mothers by showing them that they only took pictures of their babies’ eyes.
For Leung, the experience helped him expand his understanding of a successful research endeavor. “Personally, I'm an engineer. I have tunnel vision and am very focused on achieving the aim. But this is a healthcare technology; it involves people. It also involves culture,” he said. “Developing healthcare technology is not just about technology; it’s also about how you use it. ...You need a multidisciplinary team to tackle a problem like this.”
While their published studies used offline analysis on a computer to delineate the scleral region, the researchers have further adapted their technology for real-time processing, developing a feature that can zoom into the eye to select an area of interest and calculate a bilirubin value on the spot. Although neoSCB is an extremely accessible tool, the team doesn’t think it is suited for parental use at this stage. For example, the application might recommend against a blood test if a bilirubin measure is just below the threshold, even though the baby may have other symptoms of jaundice. The researchers are considering integrating a confidence interval into the result as opposed to a binary readout, but “the first stage of the app should only be in the hands of healthcare professionals who can make a 360-degree assessment of a baby,” Meek said. In the study in Ghana, the researchers trained community health workers to use the smartphone application, which required an average of only 30 minutes.
The team is now working to secure approval for neoSCB as a clinical medical device, a regulatory pathway that has only recently emerged for similar smartphone technology in the United Kingdom. They hope to obtain approval to use the application in Ghana and potentially other countries with insufficient access to jaundice screening tools. “If we could get this rolled out around the world, I think it would be a game changer for babies,” Meek said.
As the researchers avoided blood vessels in the sclera that might interfere with their yellowness measures for jaundice screening, they wondered whether they could adapt their technology to another disease. “We realized that actually, it has some information in there, because this blood vessel — how red it is — can be an indication of whether someone has anemia,” Leung said.
Anemia results from impaired oxygen transport throughout the body due to a lower concentration of functional hemoglobin in the blood. Since hemoglobin reflects more red light than other colors, the redness of a blood vessel can serve as a measure of hemoglobin levels, potentially providing a screening tool for anemia.
Pediatric anemia, which can affect cognitive development if left untreated, is a pressing public health concern in Ghana due to the prevalence of sickle cell disease and dietary iron deficiency. Anemia can be diagnosed with a blood test, but it is often not apparent that a child may be anemic, and it can be challenging for people in rural areas of Ghana to access a healthcare clinic where they can be tested.
To develop a readily available smartphone imaging-based screening tool, the researchers looked for parts of the body where redness measures would be unaffected by skin pigmentation and lifestyle factors such as nail polish application. They decided to analyze the sclera, the interior of the lower eyelid, and the interior of the bottom lip. In a pilot study of children in Ghana, the team collected images of these facial regions and measured blood hemoglobin levels using a finger prick test (3). They then tested several mathematical models that predict blood hemoglobin concentration based on different ambient correction methods and redness measures.
The researchers found that areas within the sclera that reflect incoming light could serve as a reliable indicator of the color influence of the ambient lighting. They analyzed these reflective areas to standardize scleral images captured under different lighting conditions. They observed that this white balancing procedure did not effectively correct for variable lighting in images of the eyelid and lip, which are too far from the sclera to experience the light in the same way, and used their previous flash/no flash ambient subtraction method for these regions.
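The principle behind this correction is that specular highlights mirror the incoming light directly, so the brightest scleral pixels approximate the illuminant's color, which can then be divided out. The sketch below simulates that on a synthetic image with an invented yellowish illuminant and an artificial highlight patch; it illustrates the general white-balancing idea, not the team's exact procedure.

```python
import numpy as np

rng = np.random.default_rng(2)

# Assumed ambient light color (RGB gains); yellowish in this simulation
illuminant = np.array([1.0, 0.9, 0.7])

# Simulated linear-intensity sclera image: diffuse reflectance times the
# illuminant, plus a small specular patch that reflects the light directly
h, w = 32, 32
reflectance = rng.uniform(0.4, 0.7, size=(h, w, 3))
reflectance[:4, :4] = 1.0          # specular highlight: reflects everything
image = reflectance * illuminant

# The brightest pixels approximate the illuminant color
brightness = image.sum(axis=2)
idx = np.argsort(brightness.ravel())[-10:]   # 10 brightest pixels
estimated_illuminant = image.reshape(-1, 3)[idx].mean(axis=0)
estimated_illuminant /= estimated_illuminant.max()

# Dividing out the estimated illuminant white-balances the image: the
# highlight now looks white, and tissue colors become comparable
# across different lighting conditions
balanced = image / estimated_illuminant
```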
In comparing different redness measures, the researchers found that analyzing the reddest five percent of pixels in each region gave the most accurate blood hemoglobin prediction. “The theory we have is that if you take the average redness, it depends a lot on how many blood vessels there are, which is just an individual variation,” said Thomas Wemyss, a medical imaging graduate student at University College London and coauthor of the study. “If you look at how red the reddest bit is, what you’re kind of indirectly measuring is how red are the blood vessels because they’re in the reddest part.”
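Wemyss's point can be made concrete with simulated pixels: a region that is mostly pinkish tissue with a minority of very red vessel pixels. The mean redness depends on how many vessel pixels there happen to be, while the reddest five percent tracks the vessel color itself. The data and the specific redness formula (red channel minus the green/blue average) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated region: 1000 pinkish tissue pixels and 50 very red
# blood-vessel pixels (RGB values made up)
tissue = np.column_stack([rng.uniform(0.6, 0.7, 1000),
                          rng.uniform(0.4, 0.5, 1000),
                          rng.uniform(0.4, 0.5, 1000)])
vessels = np.column_stack([rng.uniform(0.8, 0.9, 50),
                           rng.uniform(0.2, 0.3, 50),
                           rng.uniform(0.2, 0.3, 50)])
pixels = np.vstack([tissue, vessels])

# Redness of each pixel: red channel minus the green/blue average
redness = pixels[:, 0] - pixels[:, 1:].mean(axis=1)

# Mean redness is diluted by how much tissue surrounds the vessels;
# the reddest 5% of pixels reflects the vessels' own color
mean_redness = redness.mean()
cutoff = np.quantile(redness, 0.95)
top5_redness = redness[redness >= cutoff].mean()
```

Here the top-5% measure sits near the vessel redness while the mean sits near the tissue redness, which is why the former is less sensitive to individual variation in vessel density.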
The researchers then used their smartphone imaging technology to predict anemia status based on an established blood hemoglobin threshold, which could be used to indicate the need for follow-up bloodwork. They found that when they integrated all three regions into their analysis, they could correctly identify approximately 93 percent of positive cases and approximately 90 percent of negative cases. “It's at a level that if it worked like this in the real world, it probably wouldn't be causing massive issues in healthcare systems. We really didn't want to be saying, ‘Everybody's got anemia,’” Wemyss said. “We found that in this group, we did need to analyze all three areas together to make that decision.” Combining multiple regions may improve accuracy by offsetting error in one of them. For example, while there are confounding causes of redness in the eye, such as hay fever, the lip would not be affected.
With the goal of eventually moving their technology into home settings, the team has built the entire image analysis pipeline into a smartphone application and is working to expand the automated quality control features to ensure that the images are usable. The researchers also hope to investigate if their technology can classify mild, moderate, and severe anemia based on additional thresholds, which will require testing it in a larger and more heterogeneous cohort. Similar to their jaundice application, they are considering the best way to present the results to the user — a yes/no recommendation for a blood test, a category of severity, a numerical value — and how to communicate the uncertainty in the measurement. “We need to do a lot of user studies to work out how to show that to people in a way that they can take action but that doesn't stress them out or cause too much anxiety,” Wemyss said.
Wemyss is interested to see how regulatory, ethical, and security considerations for modern digital health technology evolve outside of a research setting. “I really hope, because we've got these smartphones with these powerful sensors, that we can responsibly use them for healthcare,” he said.
1. Outlaw, F. et al. Smartphone screening for neonatal jaundice via ambient-subtracted sclera chromaticity. PLoS ONE 15, e0216970 (2020).
2. Enweronu-Laryea, C. et al. Validating a sclera-based smartphone application for screening jaundiced newborns in Ghana. Pediatrics 150, e2021053600 (2022).
3. Wemyss, T.A. et al. Feasibility of smartphone colorimetry of the face as an anaemia screening tool for infants and young children in Ghana. PLoS ONE 18, e0281736 (2023).