ASU lab provides insight into user experience for apps, stores, more


May 1, 2015

You’ve created a new mobile app. In addition to opinion-based focus groups and surveys, how do you know it works well for your customers, that they find it easy to use and engaging?

Or suppose you’re charged with creating a storefront display. How do you know people are drawn to the products you hope to sell?

[Photo caption: The iLUX Lab uses an eye-tracking tool to capture user experience.]

What if you could virtually see through their eyes, or know what emotions they are feeling? And what if you could pair that with an exact action in your app, or with the moment that captures their attention in your marketplace?

A newly opened lab at Arizona State University’s Ira A. Fulton Schools of Engineering can provide the evidence-based data and analysis that will give invaluable insight into the user experience.

The iLUX (Innovative Learner and User Experience) lab is designed to conduct user-experience studies ranging from small-scale usability tests during prototype development to large-scale studies for industry and university partners.

The lab is led by Robert Atkinson, engineering and education professor, whose research explores the intersection of cognitive science, informatics, instructional design and educational technology.

The state-of-the-art lab, located in the School of Computing, Informatics and Decision Systems Engineering, includes a complete range of biometric sensors:

• brain-computer interfaces that collect EEG data and provide constructs such as drowsiness/alertness, engagement and mental workload

• eye-tracking systems that track eye movement, gaze, fixation and pupil dilation

• a facial emotion-recognition system that analyzes facial images to infer emotions, such as joy, anger and sadness

• galvanic skin response bracelet that measures arousal and heart rate

The iLUX Lab is unique – it is based in a research environment, yet also completely mobile.

“We have a distinct combination of high-tech mobile biometric hardware and software that allow us to provide an all-in-one quantitative biometric user-experience analysis for our customers, in both the lab and/or onsite,” said Jim Cunningham, a doctoral student at the Mary Lou Fulton Teachers College and an expert in learning analytics and data mining who works in the lab.

“We also can provide all-in-one services from the experimental design to the data analysis and results interpretation, or we can work with the experts from the client side," he said.

So how does it work? Take EEG, for example. Sensors placed on the scalp of a user record electrical signals at 256 samples per second. These responses can be translated into measurements of attention, excitement, engagement, frustration or cognitive load.
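The article does not describe the lab’s analysis software, but the idea can be sketched in a few lines of Python. The sketch below assumes a standard band-power approach, with a beta/(alpha + theta) ratio standing in as an engagement index; aside from the 256-samples-per-second rate quoted above, every name and number is an illustrative assumption, not the iLUX lab’s actual method.

```python
# Minimal sketch: turning a window of raw EEG samples (256 Hz) into a
# crude "engagement" index via spectral band power. The beta/(alpha+theta)
# ratio is a common heuristic from the EEG literature, used here purely
# for illustration -- it is not the iLUX lab's documented algorithm.
import numpy as np
from scipy.signal import welch

FS = 256  # samples per second, as described in the article


def band_power(freqs, psd, low, high):
    """Sum spectral power between low and high Hz."""
    mask = (freqs >= low) & (freqs < high)
    return psd[mask].sum()


def engagement_index(eeg_window):
    """beta / (alpha + theta) power ratio for one channel's window."""
    freqs, psd = welch(eeg_window, fs=FS, nperseg=FS * 2)
    theta = band_power(freqs, psd, 4, 8)
    alpha = band_power(freqs, psd, 8, 13)
    beta = band_power(freqs, psd, 13, 30)
    return beta / (alpha + theta)


# Example: ten seconds of synthetic noise standing in for one electrode.
rng = np.random.default_rng(0)
print(f"engagement index: {engagement_index(rng.normal(size=FS * 10)):.3f}")
```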

Another system uses infrared cameras to capture visual attention. By tracking eye movement, the technology records fixation points and how long a user gazes at certain elements. This data can be used to create heat maps showing where users focus.
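As an illustration only (the article does not name the lab’s eye-tracking software), one way to build such a heat map is to accumulate gaze duration into a grid of screen cells, as in the hypothetical Python sketch below; the coordinates, durations and cell size are invented for the example.

```python
# Illustrative sketch: building a simple fixation heat map from
# eye-tracking output. Each fixation is a screen position plus a dwell
# time; longer dwell adds more "heat" to that cell. All data invented.
import numpy as np

SCREEN_W, SCREEN_H = 1920, 1080  # assumed display resolution
CELL = 40                        # pixels per heat-map cell


def heat_map(fixations):
    """fixations: iterable of (x_px, y_px, duration_ms) tuples."""
    grid = np.zeros((SCREEN_H // CELL, SCREEN_W // CELL))
    for x, y, dur in fixations:
        row = min(int(y) // CELL, grid.shape[0] - 1)
        col = min(int(x) // CELL, grid.shape[1] - 1)
        grid[row, col] += dur
    return grid


# Hypothetical fixations: a user lingering on a product image and
# briefly glancing at the center of the screen.
sample = [(300, 220, 850), (310, 230, 640), (960, 540, 210)]
hm = heat_map(sample)
print("hottest cell (row, col):", np.unravel_index(hm.argmax(), hm.shape))
```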

The systems are used in combination to give researchers information about the range of reactions a stimulus provokes.

You can decide the level of service you need, whether that is use of the equipment alone or a complete suite of services with technicians and consultants. You can supply your own experts or work with those provided by the iLUX lab. The lab supports both small-scale and large-scale studies.

The iLUX lab is located in the Brickyard on ASU’s Tempe campus. For more information, visit ilux.lab.asu.edu or email ilux@asu.edu.

Sharon Keeler

Making sense of smell: Will bio-inspired robots sniff out bombs, drugs and disease?


May 1, 2015

With just a sniff, our noses can detect smells that trigger specific memories, tell us food has gone bad, or even connect us to a potential mate. What if a robot could "smell" as effectively as we do?

In a new study funded by the Human Frontier Science Program, Arizona State University behavioral neuroscientist Brian Smith, along with an international team of researchers, is investigating how the brain is able to separate specific odors when many exist in a natural environment. They are also studying how learning specific scents may create changes in the way the brain perceives odors.

[Photo caption: A honey bee feeds on a flower.]

The goal of the project is to create a computational model that would allow a flying robot to independently detect specific odors in a natural environment. This research could set the stage for bio-inspired robots that may one day be able to detect bombs and drugs, or even uncover odor markers of diseases such as cancer.

“This is a very exciting research project, and we’ve put together a highly skilled, interdisciplinary team to investigate these hypotheses,” said Smith, a professor with ASU’s School of Life Sciences. “In a previous paper, we’ve shown that the process of converting odors into electrical impulses in the central nervous system takes as few as two milliseconds – 20 times faster than previously thought. This leads us to new insights into possible ways a target odor can be separated from background scents.”

Insects, like many mammals, depend on detecting different types of odors to find food, mates and shelter, and even to defend themselves. To do this, insects fly through an airstream that breaks up the "target" odor into thin, wispy filaments. They may contact these filaments for only a few milliseconds.

“Insects detect the odors in the filaments through thousands of odorant receptors on their antennae, which is the functional analog to our nose,” said Smith, a co-investigator on the project. “It's clear that insects do segregate target odors from background odors, but we just don’t know yet how they do it. The ‘background’ problem is a huge one for the development of artificial detectors. If we figure out how insects do it, then maybe we can implement those principles in artificial noses."

Researchers from University of Konstanz (Germany), University of Brighton (United Kingdom), and University of Tokyo (Japan) are co-investigators on this project. Smith and the team from ASU will focus on insect behavior and molecular work. The Japanese team will develop the robotics, while researchers in the U.K. work on computational modeling and scientists in Germany complete the imaging. The group will meet annually to conduct collaborative experiments.

The Human Frontier Science Program, which in March awarded the research group $1.35 million over three years, provides financial support specifically for research on the complex mechanisms of living organisms. Particular emphasis is placed on cutting-edge, risky projects. Smith’s team competed with more than 1,000 applicants to receive one of only 21 program grants awarded in 2015.

The Human Frontier Science Program is an international program of research support implemented by the International Human Frontier Science Program Organization based in Strasbourg, France. It promotes intercontinental collaboration and training in cutting-edge, interdisciplinary research focused on the life sciences.

The School of Life Sciences is an academic unit of ASU’s College of Liberal Arts and Sciences.

Sandra Leander

Manager, Media Relations and Marketing, School of Life Sciences

480-965-9865