A new machine learning model can diagnose certain illnesses with striking accuracy simply by looking at a patient’s tongue. The novel technology, while state-of-the-art, draws its inspiration from medical approaches humans have used for more than 2,000 years.
When it comes to diagnosing ailments, traditional Chinese medicine and other practices often turn to the tongue for clues. Based on its color, shape, and thickness, the muscle can reveal a number of possible health issues, from cancer and diabetes to asthma and gastrointestinal problems. Now, after more than two millennia of peering into patient mouths for answers, doctors may soon receive a second opinion from artificial eyes powered by machine learning.
“Human tongues possess unique characteristics and features connected to the body’s internal organs, which effectively detect illnesses and monitor their progress… Among these, tongue color is of the most importance,” wrote a team of engineering researchers collaborating between the University of South Australia (UniSA) and Iraq’s Middle Technical University (MTU) in a recent study published in the journal Technologies.
Ali Al-Naji, the paper’s senior author and an adjunct associate professor in UniSA’s Department of Medical Instrumentation Techniques Engineering, offered a number of scenarios in the study’s August 13 announcement.
“Typically, people with diabetes have a yellow tongue; cancer patients a purple tongue with a thick greasy coating; and acute stroke patients present with an unusually shaped red tongue,” he explained. A white tongue, meanwhile, can be indicative of anemia, while indigo or violet color points to vascular and gastrointestinal issues or asthma. In more recent cases, deep red tongues may provide evidence of severe COVID-19 cases.
Like similar visual machine learning programs, Al-Naji’s team built their system by training it on two data sets. First, they fed it 5,260 images spanning seven colors across different saturations and lighting conditions; of those, 300 “gray” entries represented various unhealthy tongues, while 310 “red” entries represented healthy examples. Next, two Iraqi teaching hospitals in Dhi Qar and Mosul trained the system in real time using 60 photos showing a mix of healthy human tongues and tongues affected by various conditions, including mycotic infection, asthma, COVID-19, fungiform papillae, and anemia.
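The study does not publish its code, but the core idea of mapping a tongue’s dominant color to a health label can be sketched as a nearest-centroid color classifier. Everything here is a hypothetical illustration: the class names, the centroid RGB values, and the `classify_tongue` helper are assumptions for demonstration, not the researchers’ actual model or data.

```python
# Hypothetical sketch of color-based tongue classification. The centroid
# RGB values below are illustrative assumptions, not values from the study.
CLASS_CENTROIDS = {
    "healthy (red)": (170, 60, 70),      # assumed mean RGB for a pink-red tongue
    "unhealthy (gray)": (120, 120, 120), # assumed mean RGB for a grayish coating
    "diabetes (yellow)": (200, 180, 80), # assumed mean RGB for a yellow tongue
}

def mean_rgb(pixels):
    """Average the RGB channels over a list of (r, g, b) pixels."""
    n = len(pixels)
    return tuple(sum(p[i] for p in pixels) / n for i in range(3))

def classify_tongue(pixels):
    """Return the label whose color centroid is nearest the patch's mean RGB."""
    avg = mean_rgb(pixels)
    def sq_dist(centroid):
        return sum((a - c) ** 2 for a, c in zip(avg, centroid))
    return min(CLASS_CENTROIDS, key=lambda label: sq_dist(CLASS_CENTROIDS[label]))

# Usage: a synthetic patch of reddish pixels lands on the healthy centroid.
patch = [(172, 58, 72), (168, 62, 69), (171, 61, 71)]
print(classify_tongue(patch))  # → healthy (red)
```

A real system would of course learn these color boundaries from labeled images rather than hard-code them, and would first segment the tongue from the rest of the frame.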
Finally, it was time to test the algorithm in person. After the team connected the program to a USB webcam, both healthy and ill volunteers were asked to position their tongues 20 cm from the camera for scanning. The results, according to Al-Naji’s team, demonstrated “remarkable precision.”
“The proposed system could efficiently detect different ailments that show apparent changes in tongue color, with the accuracy rate of the trained models exceeding 98 percent,” the researchers write in the study’s conclusion. On the 60 real-time tongue images, the program achieved 96.6 percent accuracy.
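A quick back-of-envelope check shows what that real-time figure implies: 58 correct classifications out of 60 images works out to roughly the reported 96.6 percent (the study does not publish the exact confusion counts, so 58/60 is an inference from the rounded percentage).

```python
def accuracy_pct(correct, total):
    """Percentage of correctly classified images."""
    return 100 * correct / total

# 58 of 60 correct ≈ 96.67%, consistent with the study's reported 96.6%
# once the repeating decimal is truncated.
print(f"{accuracy_pct(58, 60):.2f}")  # → 96.67
```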
The researchers believe the experiments illustrate the feasibility of incorporating similar or improved AI systems into medical facilities to one day provide a “secure, efficient, user-friendly, comfortable and cost-effective method for disease screening.”