
NEW ORLEANS — Researchers have developed an algorithm using machine learning and smartphone-based photoplethysmography (PPG) to detect diabetes with “reasonable discrimination.”

Because one in three people with diabetes is unaware of the condition, and diabetes is a risk factor for cardiovascular disease, the researchers hope that one day the app can improve rates of diabetes diagnosis.

People would still need to visit their healthcare provider to confirm their findings and have regular blood tests.

The researchers used contact PPG — the same technique apps use to measure heart rate — to obtain waveforms of the color changes in blood vessels with each heartbeat, and from these developed the screening test for diabetes.

Specifically, they fed about 2.6 million PPG waveforms from 54,269 people older than 18 years — transmitted from the Azumio Instant Heart Rate smartphone app to the Health eHeart Study — into a 45-layer deep neural network, training the program to develop an algorithm to predict diabetes.

“Our study is the first [such] proof-of-concept study,” lead author Robert Avram, MD, a cardiology fellow at University of California, San Francisco (UCSF), told Medscape Cardiology.

Avram will present the findings here at the ACC 2019 Scientific Session (ACC.19).

As a next step, the team is validating the algorithm in patients from two cardiovascular prevention clinics at UCSF and in Montreal to see how the diabetes detected from the PPG signal compares with that detected from a blood glucose or HbA1C test.

“Right now we have 40 patients,” Avram said, “and we have an AUC that’s comparable to what was reported in the poster — 0.76 — so it’s performing just as well as in [diabetes] self-report.”

The team is also working with Azumio to integrate this algorithm into its heart rate app, and they are examining what actions a clinician should take when a patient comes in with an app-based positive test for diabetes.

This diabetes-detection app could potentially be available in the marketplace in 2 years, according to Avram.

Still Need Provider Visits

However, these are early days and consumers need to be careful when using healthcare apps, Nathan Wong, PhD, MPH, Heart Disease Prevention Program, University of California, Irvine, and a member of the ACC Prevention of Cardiovascular Disease Council, told Medscape Cardiology in an email.

“While these technological advances are promising, they need validation against gold-standard tests (such as HbA1C fingerstick testing in the case of diabetes) and, depending on the manufacturer and [type of test], they can vary substantially in terms of accuracy,” he noted.

For example, a study published last year in the European Journal of Preventive Cardiology showed a large variation in accuracy among commercially available apps for heart rate.

“As much as consumers like to use apps these days to monitor their health,” he continued, “they should not be seen as a substitute for regular screenings by a reputable healthcare provider, and as such, a fasting blood sugar and/or HbA1C needs to be regularly performed at healthcare provider visits according to guidelines.”

Avram agrees with the need for follow-up testing. “We are hopeful this technology will assist with early diabetes detection,” he said in a statement issued by the ACC. “A positive screening test would still require a physician to confirm the diabetes diagnosis and establish appropriate treatment.”

However, “the potential to transition screening that’s normally done by physicians or nurses to the patient themselves through a smartphone app is a very novel concept and gives us a glimpse into how healthcare might work in the future.”

Different Waveforms in Diabetes

Avram noticed differences in waveforms from patients with and without diabetes while he was doing research related to a heart rate app, and so the team investigated further.

In a previous study in Health eHeart participants, Avram said, the researchers found a good correlation between self-reported diabetes and blood test results for glucose and HbA1C.

The participants in the current study had a mean age of 45 years, 53% were male, and 6.6% had self-reported diabetes.

The app-based screening test had a sensitivity of 67% (true-positive rate), a specificity of 62% (true-negative rate), a negative predictive value of 96%, and a positive predictive value of 13%.

The area under the receiver operating characteristic curve (AUC) was 0.72 for the model using waveforms alone and 0.81 with added information about age, sex, race, and body mass index.

On the basis of the test’s specificity, Wong noted that among people who reported not having diabetes, only 62% would screen negative but 38% would screen positive.

And on the basis of the test’s sensitivity, of the people who self-reported having diabetes, only 67% would screen positive and 33% would screen negative.
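These figures follow directly from Bayes’ rule. As a minimal sketch using only the numbers reported in the study — sensitivity 67%, specificity 62%, and a self-reported diabetes prevalence of 6.6% — the predictive values can be recomputed (the computed positive predictive value of about 11% sits slightly below the reported 13%, presumably because the published metrics are rounded):

```python
# Recompute predictive values from the reported screening metrics.
sensitivity = 0.67   # P(test positive | diabetes)
specificity = 0.62   # P(test negative | no diabetes)
prevalence = 0.066   # self-reported diabetes in the study cohort

# Joint probabilities of the four test/disease outcomes
true_pos = sensitivity * prevalence
false_neg = (1 - sensitivity) * prevalence
true_neg = specificity * (1 - prevalence)
false_pos = (1 - specificity) * (1 - prevalence)

ppv = true_pos / (true_pos + false_pos)   # P(diabetes | test positive)
npv = true_neg / (true_neg + false_neg)   # P(no diabetes | test negative)

print(f"PPV = {ppv:.1%}, NPV = {npv:.1%}")  # → PPV = 11.1%, NPV = 96.4%
```

The low prevalence is what drives the asymmetry Wong highlights below: even with modest sensitivity and specificity, a negative result is reassuring, while most positive results are false alarms.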

“While the number of false positives is high,” the researchers acknowledge, “those patients tended to have twice the prevalence of other cardiometabolic conditions, such as hypertension, hypercholesterolemia, CAD, or sleep apnea, although they did not have diabetes.”

Wong countered that “these are common in those who have diabetes; hence, the necessity of demonstrating accuracy in such individuals.”

False Positives?

“I would worry about these numbers,” Wong said, “in particular, the [low] positive predictive value indicating that 87% of those testing positive on the new test would actually not have diabetes, falsely alarming” many consumers.

“The 96% negative predictive value is more promising,” he conceded, “meaning that the new test is a good one to rule out diabetes (96% of those testing negative would be negative for diabetes).”

“We tuned the algorithm to develop the test as a screening tool,” Avram said, “to make sure you did not miss a condition; that’s the sensitivity part.”

“And the negative predictive value, that means a user getting a negative PPG diabetes screening result has only a 3% chance of having diabetes and a 97% chance of not having diabetes.”

The AUC for this test falls in the range of common medical screening tests, he noted. For example, cervical cytology to screen for cervical cancer has an AUC of 0.74 and mammography to screen for breast cancer has an AUC of 0.62 to 0.94, depending on the experience of the radiologist.

Others have reported that an AUC of “0.7 to 0.8 is considered acceptable, 0.8 to 0.9 is considered excellent, and more than 0.9 is considered an outstanding ability…to diagnose patients with and without the disease or condition based on the test.”

In addition, those other screening tools require administration by a physician, Avram noted, “which is a problem with diabetes patients because at least a fifth of them who have undiagnosed diabetes will never encounter a physician until it’s too late.”

The model performed best in men 42 to 60 years of age and in people with five to 10 recordings.

Azumio provided no financial support for this study and only provided access to the heart rate data. Data analysis and interpretation were performed independently of Azumio. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of this abstract.

The research was supported by an NHLBI award. Avram is supported by a grant from the Fonds de la recherche en santé du Québec. Coauthor Mark Pletcher, MD, MPH, UCSF, is partially supported by a PCORI contract supporting the Health eHeart Alliance and has received support from the National Institutes of Health (NIH). Coauthor Jeffrey Olgin, MD, UCSF, has received support from the NIH. Coauthor Gregory Marcus, MD, UCSF, has received support from the NIH and research funding from Medtronic and Cardiogram Inc; is a consultant for Lifewatch and InCarda; and holds equity in InCarda. Coauthor Peter Kuhar is an employee of Azumio.

American College of Cardiology (ACC) 2019 Scientific Session: Poster 422. To be presented March 17, 2019.

