
Face value

ASU researchers use cutting-edge technology to build a better health care practitioner


person looking at facial recognition technology

July 19, 2016

If you’ve ever lied to your doctor, chances are you’re not alone. But whether you’re ashamed of eating too much bacon or you’re trying to hide a smoking habit, the fact is, the more honest you are, the better equipped a physician is to help you.

So what if your doctors could tell whether you were hiding something? Better yet, what if they could adjust their bedside manner so that you never felt the urge to fib in the first place?

That’s what a couple of researchers at Arizona State University are hoping to make a reality.

At the Motivational Interviewing Laboratory on ASU’s downtown Phoenix campus, Jack Chisum, a clinical associate professor in the School of Nutrition and Health Promotion, an academic unit of the College of Health Solutions at ASU, and tech support analyst Glenn Brown are putting cutting-edge technologies to work in an attempt to build a better health care practitioner.

Using a combination of facial recognition and layered voice analysis software, they have been able to analyze an individual’s emotional responses during conversation with remarkable accuracy.

“In the near future, we’ll tell you what you’re thinking,” Brown said.

Turns out, when you know what someone is really thinking and feeling, you gain insight into ways you can make them feel more comfortable and forthcoming.


ASU graduate students Frank Medina (left) and Neil Soneson conduct a mock interview.

Soon, Chisum and Brown plan to incorporate two more software capabilities that promise to further enhance the accuracy of their emotion analyses: one that measures physiological responses, such as blood pressure and perspiration, and one that measures brain activity.

Although their primary goal is to improve communication between patients and practitioners in the health care sector, the possible applications are manifold; the duo has received inquiries from professionals in fields ranging from law to insurance to business management.

“Anywhere where you can have two people in dialogue, this can be used,” Chisum explained.

How it works

Four hundred ninety-one — that’s the number of points on a human face the Noldus facial recognition software can detect.

So even if you think you can hide your disdain for a certain line of questioning from your doctor, Chisum assures, “You can’t hide from this technology.”

As for the voice analysis software, it splits the voice into slices of 1/10,000th of a second, meaning for every five seconds of vocalization there are 50,000 segments of the voice that can be analyzed. (The software has most notably been used in counter-terrorism efforts, detecting high levels of anxiety in a would-be terrorist’s voice and alerting officials to possible danger.)
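
To put that time resolution in concrete terms, here is a minimal sketch of slicing a recorded voice signal into 1/10,000th-second windows. The sample rate, the NumPy array input and the function name are illustrative assumptions, not details of the actual layered voice analysis software.

```python
import numpy as np

SAMPLE_RATE = 44_100          # audio samples per second (assumed for illustration)
WINDOW_SECONDS = 1 / 10_000   # the 1/10,000th-of-a-second slice cited above

def split_into_windows(signal: np.ndarray) -> list[np.ndarray]:
    """Split a mono voice recording into consecutive 1/10,000-second windows."""
    duration = len(signal) / SAMPLE_RATE
    n_windows = int(np.ceil(duration / WINDOW_SECONDS))
    # Window boundaries expressed as sample indices.
    edges = (np.arange(n_windows + 1) * WINDOW_SECONDS * SAMPLE_RATE).astype(int)
    return [signal[edges[i]:edges[i + 1]] for i in range(n_windows)]

five_seconds = np.random.randn(5 * SAMPLE_RATE)   # stand-in for five seconds of speech
print(len(split_into_windows(five_seconds)))      # 50,000 windows to analyze
```

At a typical audio sample rate each window is only a handful of samples wide, which is why five seconds of speech yields tens of thousands of analyzable slices.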

Using these two very powerful tools to analyze video footage of an individual during a conversation gives you an incredibly accurate, incredibly detailed look at how they are truly reacting.

Noldus uses infrared technology to track minute facial movements, which it translates into a graph depicting different-colored lines for different emotions. Similarly, the voice analysis tracks minute vocal changes that indicate a change in emotion. When a researcher looks at the resulting data and sees a change or spike in emotion, all he or she has to do is back up the video to that point in time to see what caused it.
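
As a rough sketch of that last step, the snippet below scans a per-frame emotion trace for sudden jumps and converts the offending frames back into video timestamps a researcher could rewind to. The frame rate, the threshold and the data layout are assumptions made for illustration; the actual Noldus and voice analysis tools are far more sophisticated.

```python
import numpy as np

FRAME_RATE = 30.0   # video frames per second (assumed)

def find_emotion_spikes(scores: np.ndarray, threshold: float = 0.25) -> list[float]:
    """Return video timestamps (in seconds) where an emotion score jumps sharply.

    A "spike" here is just a frame-to-frame increase larger than `threshold`;
    it stands in for whatever change detection the real software performs.
    """
    jumps = np.diff(scores)                        # change between consecutive frames
    spike_frames = np.flatnonzero(jumps > threshold) + 1
    return [frame / FRAME_RATE for frame in spike_frames]

# Example: a mostly flat "anger" trace that jumps around the two-second mark.
anger = np.concatenate([np.full(60, 0.1), np.full(30, 0.6)])
for t in find_emotion_spikes(anger):
    print(f"Rewind the interview video to {t:.2f} s")   # -> 2.00 s
```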

“Putting the face and the voice together, that’s measurable,” Chisum said. “A very finite measurement.”

In the case of one of Chisum’s clients who agreed to participate in a study about obesity, reviewing her data showed him that she reacted negatively to questions about her ideal body weight and exercise habits — even though she appeared outwardly neutral or even happy. Chisum was then able to share that information with her psychiatrist, who helped her to focus on those areas and work through them.

Chisum and Brown are now in the testing phases of implementing software that can detect changes in a person’s physiological responses. Called Shimmer, it attaches to a person’s fingers to measure things such as blood pressure, oxygen saturation and perspiration levels.

Eventually, they’d also like to incorporate a neurological component: an EEG (electroencephalogram) net, a set of small, flat metal electrodes attached to the scalp, used to detect changes in the brain’s electrical activity.

What’s next

Considering the potential this technology has to effect positive change in any number of professional fields, it may be surprising to learn that Chisum and Brown are the first to research its use for that purpose (as opposed to more stern purposes, such as interrogation).


Glenn Brown analyzes video footage using the facial recognition and layered voice analysis software.

“No one else in the world is doing this,” Chisum said. “And we know that because all the software providers have told us that.”

And the best part?

“We’re not confined to this lab,” said Chisum.

Data captured by the software can be analyzed from anywhere in the world. Theoretically, a hospital in Kathmandu could record a conversation between a doctor and a nurse, send it to Chisum and Brown to be analyzed at ASU, and then make communication improvements based on the results.

There’s only one real obstacle: “We need more people,” Chisum lamented.

He and Brown simply cannot do it alone; they need more students to train in the technologies who can then disseminate them across various professional sectors. But the outlook is good, with the number of interested parties growing daily.

“What we’re trying to do is trying to find out what are you doing well, and how can we augment that? And what are you not doing well, and how can we decrease that?” Chisum said.

Their hope is that the use of this type of technology will eventually become commonplace in the health care sector. Where it goes beyond that is limited only by human ingenuity.

