Trisalyn Nelson joins ASU to lead School of Geographical Sciences and Urban Planning


July 19, 2016

Brainstorm a list of societal challenges — from extreme heat to social equity to transportation — and they all share two characteristics: They have unique distributions in space, and their spatial patterns change over time.

Trisalyn Nelson, the new director of ASU’s School of Geographical Sciences and Urban Planning as of July 1, has focused her research on examining spatial and spatial-temporal patterns in order to better understand the underlying processes behind everything from wildlife movement to hazardous chemical exposure to cycling safety.

“Societal challenges are increasingly complex,” Nelson said. “They include interactions between physical environments, built environments, and people.

“With 54 percent of the world’s population living in cities, health and sustainability require urban planning that is informed by knowledge of earth processes and human interactions with space and place.”

“I chose to move to ASU because I believe society needs solutions that an integrated geographical sciences and urban planning school can provide,” Nelson said.

Nelson feels that the School of Geographical Sciences and Urban Planning, with its strength in areas such as urban climatology and geographic information science (GIS), is uniquely poised to address a wide range of key issues: the impact of climate change on urban sustainability and population health, how best to use newly available geographic data for smart urban design, the role of active transportation like cycling as a mechanism for sustainability, the impacts of human disturbance on earth systems, plants, and wildlife — and more.

“I’m pleased to welcome Dr. Trisalyn Nelson to our college and to ASU,” said Elizabeth Wentz, dean of Social Sciences for the College of Liberal Arts and Sciences. “She brings an impressive research record that crosses disciplinary boundaries to solve societal problems, as well as strong success as a leader.”

Nelson joins ASU from Canada’s University of Victoria, where she founded and directed the Spatial Pattern Analysis and Research Laboratory, was director of the Geomatics Program, and held the Lansdowne Research Chair in Spatial Sciences.

Nelson’s research initiatives build on transdisciplinarity among technical fields — integrating the power of computing, statistics and geographic information science (GIScience). She sees great opportunities to lead the School of Geographical Sciences and Urban Planning to collaborate across campus, within the surrounding community and with industry.

“Location, the heart of geography, provides a mechanism to integrate transdisciplinary knowledge and supports development of nuanced solutions,” she said. “The School is operating at the highest level and I believe by growing ideas we are collectively passionate about I can help build on existing momentum and continue the trajectory of excellence. The School is open for business.”

Barbara Trapido-Lurie

research professional senior, School of Geographical Sciences and Urban Planning

480-965-7449

 
Have trouble being honest with your doctor? ASU researchers may be able to help.

July 19, 2016

ASU researchers use cutting-edge technology to build a better health care practitioner

If you’ve ever lied to your doctor, chances are you’re not alone. But whether you’re ashamed of eating too much bacon or you’re trying to hide a smoking habit, the fact is, the more honest you are, the better equipped a physician is to help you.

So what if your doctors could tell whether you were hiding something? Better yet, what if they could adjust their bedside manner so that you never felt the urge to fib in the first place?

That’s what a couple of researchers at Arizona State University are hoping to make a reality.

At the Motivational Interviewing Laboratory on ASU’s downtown Phoenix campus, Jack Chisum, a clinical associate professor in the School of Nutrition and Health Promotion, an academic unit of the College of Health Solutions at ASU, and tech support analyst Glenn Brown are putting cutting-edge technologies to work in an attempt to build a better health care practitioner.

Using a combination of facial recognition and layered voice analysis software, they have been able to analyze an individual’s emotional responses during conversation with remarkable accuracy.

“In the near future, we’ll tell you what you’re thinking,” Brown said.

Turns out, when you know what someone is really thinking and feeling, you gain insight into ways you can make them feel more comfortable and forthcoming.

ASU graduate students Frank Medina (left) and Neil Soneson conduct a mock interview.

Soon, Chisum and Brown plan to incorporate two more software capabilities that promise to further enhance the accuracy of their emotion analyses: one that measures physiological responses, such as blood pressure and perspiration, and one that measures brain activity.

Although their primary goal is to improve communication between and among patients and practitioners in the health care sector, the possible technical applications are manifold — the duo have received inquiries from professionals in fields ranging from law to insurance to business management.

“Anywhere where you can have two people in dialogue, this can be used,” Chisum explained.

How it works

Four hundred ninety-one — that’s the number of points on a human face the Noldus facial recognition software can detect.

So even if you think you can hide your disdain for a certain line of questioning from your doctor, Chisum assures, “You can’t hide from this technology.”

As for the voice analysis software, it splits the voice into segments of 1/10,000th of a second, meaning every five seconds of vocalization yields 50,000 sections of the voice that can be analyzed. (The software has most notably been used for counter-terrorism efforts to detect high levels of anxiety in a would-be terrorist’s voice, alerting officials to possible danger.)
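
To make that concrete, here is a minimal Python sketch of fixed-window audio segmentation under stated assumptions: the 48 kHz sample rate, the segment_audio helper, and the synthetic signal are all invented for illustration, and this is not the layered voice analysis software’s actual method or API.

```python
# Illustrative sketch only: slicing an audio signal into very short fixed
# windows, in the spirit of the description above. Sample rate and window
# length are assumptions; rounding means the count per five seconds is
# close to, not exactly, 50,000.
import numpy as np

SAMPLE_RATE = 48_000     # assumed samples per second
WINDOW_SEC = 1 / 10_000  # 1/10,000th of a second per segment


def segment_audio(signal: np.ndarray) -> np.ndarray:
    """Split a 1-D audio signal into consecutive fixed-length windows."""
    window_len = max(1, int(round(SAMPLE_RATE * WINDOW_SEC)))
    n_windows = len(signal) // window_len
    trimmed = signal[: n_windows * window_len]
    return trimmed.reshape(n_windows, window_len)


# Five seconds of (synthetic) audio splits into tens of thousands of windows.
audio = np.random.randn(SAMPLE_RATE * 5)
windows = segment_audio(audio)
print(windows.shape)
```

Each short window can then be scored on its own, which is what allows the voice to be analyzed moment by moment through a conversation.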

Using these two very powerful tools to analyze video footage of an individual during a conversation gives you an incredibly accurate, incredibly detailed look at how they are truly reacting.

Noldus uses infrared technology to track minute facial movements, which it translates into a graph depicting different-colored lines for different emotions. Similarly, the voice analysis tracks minute vocal changes that indicate a change in emotion. When a researcher looks at the resulting data and sees a change or spike in emotion, all he or she has to do is back up the video to that point in time to see what caused it.
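
To illustrate that “find the spike, rewind the video” step, the hedged Python sketch below assumes the combined emotion scores have already been exported as a simple per-frame array; the find_emotion_spikes helper, the 30 fps frame rate, the 0.3 threshold, and the sample data are all assumptions for the example, not the Noldus or voice-analysis output format.

```python
# Hedged sketch: flag moments where an emotion score jumps sharply so a
# researcher can rewind the video to that point. Threshold, frame rate,
# and data are invented for illustration.
import numpy as np

FRAME_RATE = 30.0  # assumed video frames per second


def find_emotion_spikes(scores: np.ndarray, threshold: float = 0.3) -> list[float]:
    """Return video timestamps (seconds) where the frame-to-frame change
    in an emotion score exceeds the threshold."""
    jumps = np.abs(np.diff(scores))
    spike_frames = np.flatnonzero(jumps > threshold) + 1
    return [frame / FRAME_RATE for frame in spike_frames]


# Example: a mostly flat "negative affect" score with one sudden spike.
scores = np.concatenate([np.full(90, 0.1), np.full(30, 0.8), np.full(60, 0.1)])
for t in find_emotion_spikes(scores):
    print(f"Review video around {t:.2f} s")
```

A researcher would take those timestamps back to the recording to see which question or remark triggered the reaction, which is exactly the review workflow described above.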

“Putting the face and the voice together, that’s measurable,” Chisum said. “A very finite measurement.”

In the case of one of Chisum’s clients who agreed to participate in a study about obesity, reviewing her data showed him that she reacted negatively to questions about her ideal body weight and exercise habits — even though she appeared outwardly neutral or even happy. Chisum was then able to share that information with her psychiatrist, who helped her to focus on those areas and work through them.

Chisum and Brown are now in the testing phases of implementing software that can detect changes in a person’s physiological responses. Called Shimmer, it attaches to a person’s fingers to measure things such as blood pressure, oxygen saturation and perspiration levels.

Eventually, they’d also like to incorporate a neurological component, using an EEG net to detect changes in brain activity. (An electroencephalogram, or EEG, detects electrical activity in the brain using small, flat metal discs, called electrodes, attached to the scalp.)

What’s next

Considering the potential this technology has to effect positive change in any number of professional fields, it may be surprising to discover that Chisum and Brown are the first to research its use for that purpose (as opposed to sterner applications, such as interrogation).

“No one else in the world is doing this,” Chisum said. “And we know that because all the software providers have told us that.”

Glenn Brown analyzes video footage using the facial recognition and layered voice analysis software.

And the best part?

“We’re not confined to this lab,” said Chisum.

Data captured by the software can be analyzed from anywhere in the world. Theoretically, a hospital in Kathmandu could record a conversation between a doctor and a nurse, send it to Chisum and Brown to be analyzed at ASU, and then make communication improvements based on the results.

There’s only one real obstacle: “We need more people,” Chisum lamented.

He and Brown simply cannot do it alone; they need more students trained in the technologies who can then disseminate them across various professional sectors. But the outlook is good, with the number of interested parties growing daily.

“What we’re trying to do is trying to find out what are you doing well, and how can we augment that? And what are you not doing well, and how can we decrease that?” Chisum said.

Their hope is that the use of this type of technology will eventually become commonplace in the health care sector. Where it goes beyond that is limited only by human ingenuity.
