Perhaps more than any other field, instrumental diagnostics has benefited the medical profession by enabling physicians to augment and objectify their senses when examining patients. Physicians can precisely measure temperature, blood pressure, lung volume, fracture displacement, inflammation, heart rhythm, and a plethora of other biomarkers that, along with symptoms and patient history, paint a picture of patient health. Thanks to advances in sensors and artificial intelligence, objective tests, measurements, and interpretations are replacing the subjective ones of days past. Let’s look at how these advances are enabling improvements in at-home testing and helping link patient data to empirical studies.
Assessing and diagnosing injuries such as concussions illustrates how objective tests are supplanting subjective ones. According to the Centers for Disease Control and Prevention, more than 300,000 athletes are diagnosed with sports- and recreation-related concussions each year in the US. A primary measure of a concussion and its severity is the pupil light-reflex assessment. Traditionally, the physician shines a light into the eye and observes the pupil’s contraction and dilation in response. The problems with this approach are that pupil responses vary significantly among patients and that different physicians interpret pupil size and response differently.
Sophisticated, computer-assisted alternatives that use a pipeline of sensors and software are now being tested to replace manual assessment in pupillometry. For example, miniaturized, high-resolution cameras built into smartphones or virtual-reality headsets can record the eyes at up to hundreds of frames per second. Computer-vision processes optimized for real-time analysis extract features of eye components, such as the sclera, iris, and pupil, from the images. The infrared spectrum, which is not perceptible to humans, simplifies this segmentation significantly. Mathematical models then map this pixel data onto a three-dimensional eye model to derive real-world measurements in units such as millimeters. Finally, machine learning (ML) algorithms remove noise from the data, recognize patterns in the time series, relate the results to the estimated distribution of these parameters in the general population, and present the findings. This pipeline provides accurate measurements, removes subjectivity, and shifts the physician’s responsibility to validating results, interpreting them, and communicating findings to the patient.
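To make the segmentation step more concrete, the following is a minimal Python sketch of one stage of such a pipeline: isolating the pupil in a grayscale infrared frame and converting its pixel diameter to millimeters. The intensity threshold and the millimeter-per-pixel calibration are illustrative assumptions standing in for what a real camera calibration and 3D eye model would provide.

```python
import cv2
import numpy as np

MM_PER_PIXEL = 0.05      # assumed calibration; a real pipeline derives this from a 3D eye model
PUPIL_THRESHOLD = 40     # assumed intensity cutoff; the pupil appears dark in infrared

def pupil_diameter_mm(ir_frame: np.ndarray):
    """Estimate the pupil diameter (mm) from one 8-bit grayscale IR frame."""
    blurred = cv2.GaussianBlur(ir_frame, (7, 7), 0)
    # The pupil absorbs infrared light, so it shows up as the darkest blob.
    _, mask = cv2.threshold(blurred, PUPIL_THRESHOLD, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    # Fit a circle around the largest dark region and convert pixels to millimeters.
    largest = max(contours, key=cv2.contourArea)
    (_, _), radius_px = cv2.minEnclosingCircle(largest)
    return 2.0 * radius_px * MM_PER_PIXEL
```

Applied frame by frame, such a function yields the time series of pupil diameters that the downstream ML stages then denoise and compare against population statistics.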
Pipelines such as the one described above could be optimized for specific healthcare applications and ultimately enable objective diagnoses that were not previously possible. For example, modern thermal imaging cameras, combined with advanced noise-reduction algorithms, are increasingly being used to detect inflammation and even breast cancer. Unlike classical x-ray mammography, such an examination might be more precise in the early stages of cancer and requires neither potentially harmful radiation nor any physical contact. In times of mass fever screening, such as during the fight against COVID-19, these technologies offer enormous opportunities for efficient, hands-off testing with results far more accurate than those of contact-based methods.
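As a deliberately simplified illustration of the noise-reduction idea (real diagnostic systems use far more sophisticated models), a median filter can suppress isolated sensor artifacts in a thermal image before any anomalous warm regions are flagged. The one-degree threshold below is an assumption chosen only for demonstration.

```python
import numpy as np
from scipy.ndimage import median_filter

def denoise_thermal(temps_celsius: np.ndarray) -> np.ndarray:
    """Smooth a 2D array of surface temperatures (one value per pixel)."""
    return median_filter(temps_celsius, size=3)

def hotspots(temps_celsius: np.ndarray, delta: float = 1.0) -> np.ndarray:
    """Flag pixels noticeably warmer than the image median (illustrative threshold)."""
    smoothed = denoise_thermal(temps_celsius)
    return smoothed > (np.median(smoothed) + delta)
```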
Miniature technology, in particular small wearable test devices, enables testing in non-clinical settings. For example, rather than going to a medical facility, patients needing an electrocardiogram (ECG) might wear a Holter monitor to collect long-term ECG data while going about their daily lives. The Holter monitor is about the size of a small camera with attached electrodes and is worn for 24 to 48 hours. Though the monitors are bulky and can inhibit some patient activities, they can provide broader insights into patients’ cardiac rhythms than testing done in medical facilities.
Further miniaturization of technology continues to accelerate medical progress. A new generation of optical sensors built into fitness bracelets and smartwatches promises significantly improved comfort alongside increasingly useful data. Although the data quality might not yet match that of electrode-based counterparts, Apple recently caused a stir well beyond expert circles when it demonstrated smartwatches’ potential for detecting some forms of cardiac arrhythmia. In addition to such devices measuring parts of the cardiovascular system, other sensors target a wide range of additional modalities. For example, rather than observing patients in an unfamiliar sleep laboratory, modern portable electroencephalogram (EEG) headsets in the form of sleep masks can already capture a rough picture of brain activity in patients’ own bedrooms. With the growing availability of certified open-source sensors, the development of tiny yet powerful medical devices is no longer the domain of a few influential companies.
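Returning to the cardiac example, the core idea behind optical rhythm screening can be sketched in a few lines: beats detected by the wrist sensor yield inter-beat intervals, and unusually high variability in those intervals hints at an irregular rhythm. This toy Python sketch is not any vendor’s algorithm, and the 15 percent threshold is purely an illustrative assumption.

```python
import numpy as np

def irregular_rhythm(beat_times_s: np.ndarray, cv_threshold: float = 0.15) -> bool:
    """Return True if inter-beat intervals vary more than the assumed threshold."""
    intervals = np.diff(beat_times_s)   # seconds between successive detected beats
    if len(intervals) < 10:
        return False                    # too little data to judge
    coefficient_of_variation = intervals.std() / intervals.mean()
    return coefficient_of_variation > cv_threshold
```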
One significant problem facing physicians is drawing on clinical trials, studies, repositories, registries, and so forth for information relevant to their patients. Clinical trials and other resources have some standardization, but not across organizations, medical disciplines, or geographic regions. This lack of standards hinders physicians' ability to find, access, and use research findings. Addressing physicians' needs for this information presents a two-fold challenge: developing systems that let physicians find and access relevant information, and ensuring interoperability across healthcare information systems.
Standards are emerging to meet these demands, drawing on data that are findable, accessible, interoperable, and reusable (FAIR) with minimal human intervention. Systematized Nomenclature of Medicine Clinical Terms (SNOMED CT), for example, is a computer-friendly collection of medical codes, terms, synonyms, findings, symptoms, diseases, procedures, devices, and the like that aims to provide a consistent way to index, store, retrieve, and aggregate medical data. Similarly, Logical Observation Identifiers Names and Codes (LOINC) is a database and standard for identifying medical laboratory observations.
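A small Python sketch shows why such terminologies matter in practice: observations tagged with LOINC codes (and, analogously, diagnoses tagged with SNOMED CT codes) can be indexed and aggregated across systems without parsing free text. The codes and units below are illustrative and should be verified against the official terminologies.

```python
from collections import defaultdict

# Observations from different systems, identified by LOINC code rather than free text.
observations = [
    {"patient": "A", "loinc": "8867-4", "display": "Heart rate", "value": 72, "unit": "/min"},
    {"patient": "B", "loinc": "8867-4", "display": "Heart rate", "value": 81, "unit": "/min"},
    {"patient": "A", "loinc": "8310-5", "display": "Body temperature", "value": 37.1, "unit": "Cel"},
]

by_code = defaultdict(list)
for obs in observations:
    by_code[obs["loinc"]].append(obs["value"])   # index by code, not by label

mean_heart_rate = sum(by_code["8867-4"]) / len(by_code["8867-4"])
print(f"Mean heart rate across systems: {mean_heart_rate:.1f}/min")
```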
Metadata requirements in these standards enable physicians to access and use data and insights drawn from multiple data sets. Projects such as NFDI4Health embed empirical health data, such as biomarkers, in disease and disorder metadata that can subsequently be analyzed with machine-learning algorithms across widely varying patient pools. Even diseases and disorders that were previously difficult or impossible to diagnose objectively, such as those in psychiatry, are now being put on a new, biomarker-based foundation with the help of sensors and machine-learning algorithms.
Patient diagnosis is closely linked to physicians’ ability to use measuring instruments to augment their senses. Only through such instruments can distances be defined unambiguously, temperatures be measured precisely, and objects and organisms invisible to the naked eye be identified. Medical sensors and AI are revolutionizing instrumental diagnostics by providing objective data, enabling testing in patients’ own environments, and integrating patient data with research findings.
Christopher Gundler is a cognitive scientist and medical informatics specialist. His research focuses on the mathematical modeling of sensor data for use in diagnostics. Currently, he is responsible for the machine learning and computer vision behind eyeTrax, a solution for the clinical assessment of mild traumatic brain injuries.