ASU urban climatologist reveals hottest and coolest spots on Tempe campus

June 14, 2019

Scientists' three-year heat study reveals how to best combine and place design features to cool pedestrian spaces

“To sit in the shade on a fine day and look upon verdure is the most perfect refreshment.” — Jane Austen

Just in time for face-melting season, a study of shade, heat and the best ways to cool pedestrians is being published by two Arizona State University scientists.

Urban climatologists Ariane Middel and Scott Krayenhoff conducted a three-year study of the Tempe campus, mapping the three coolest (and hottest) spots on campus, taking readings even during record heat, and discovering what works best to stay cool.

“It all boils down to shade,” Middel said. “To generalize, anywhere you have a huge tree that’s over grass. In terms of water use, it’s not the best solution, but it’s one of the coolest solutions. … Just putting grass doesn’t do much. Only planting grass without shade doesn’t make you more comfortable.”

In fact, one of the most miserable summer places on campus is surrounded by a lawn.

The three hottest spots on campus

1. At the center of the "X" sidewalks on Hayden Lawn (pictured above).

2. The walkway between Coor and Payne halls.

ASU hot spot: walkway between Coor and Payne Halls

3. The intersection of Cady and Tyler malls.

(“That was the worst,” said Middel, who would know.)

ASU's hottest spot: intersection of Cady and Tyler Malls

The three coolest spots on campus

1. The breezeway at Coor Hall.

Coolest spot at ASU: Coor Hall breezeway

2. Under the trees on the Old Main lawn.

Cool spot: under trees on the Old Main lawn

3. Under the giant ficus tree just west of the Memorial Union.

Cool ASU spot: under the giant ficus west of the Memorial Union

After shade, the next best way to cool pedestrians is cooler ground surfaces, like grass. Tree-shaded grass felt like 109 degrees Fahrenheit when the temperature was 119 F.

Vertical surfaces like hot building walls make it feel hotter. The walkway between Coor and Payne halls is completely concrete. The peak mean radiant temperature Middel and Krayenhoff measured there was 168 F.

The study’s findings give a look at how to best combine and place design features — green spaces, trees, and shade structures — to cool pedestrian spaces and can inform future construction and landscaping at ASU and in the broader community.

The city of Phoenix is struggling with how to cool the place down as heat-related deaths rise every year. In 2010 the city came up with a tree and shade master plan. Last year the city council voted to ramp up the plan by implementing a goal of having 25% of the city covered in shade by 2030. (Phoenix was at 12.4% in July 2015, according to officials).

Planting trees is great, Middel said, but they need to be planted where people walk. “That’s much more important than the number of trees,” she said.

The combination of rising temperatures, longer summers and the urban heat island effect is not pointing in a good direction. Desert dwellers need to adapt.

“We’re expecting much warmer temperatures and for a longer time too,” Middel said. “A heat wave that might last two or three days now could potentially last for a week or so. … Temperatures we have here might occur elsewhere in the country. … It’s really important to think about how we can use urban form and design to try to mitigate that. We probably will not be able to reduce air temperature by that much, whatever warming we expect. It’s much more important what radiation hits your body — like shade versus sun, that’s much more important.”

To collect data on radiation, the scientists used a garden cart loaded with instruments like you’d find on a weather station. It measures air temperature, humidity, wind speed, wind direction and location with a GPS. The cart has 12 radiometers that measure incoming radiation from six directions. This includes shortwave radiation (think visible sunlight and UV radiation) and longwave radiation (the heat that gets emitted from hot surfaces). The shortwave and longwave radiation can be integrated into mean radiant temperature, the sum of all the radiation that hits a person’s body from 360 degrees. It’s a good measure of the human thermal experience in dry climates.
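The integration of six-directional flux measurements into a mean radiant temperature can be sketched roughly as follows. This is a minimal illustration of the standard six-directional method used in urban climatology, not code from the study; the weighting factors and absorption coefficients are the commonly cited values for a standing person, and the function and variable names are my own.

```python
# Sketch of the standard six-directional mean radiant temperature (T_mrt)
# calculation from paired shortwave/longwave radiometer readings.
# Constants below are typical literature values, not values from this article.

SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W/m^2/K^4
A_K = 0.7         # absorption coefficient for shortwave radiation
A_L = 0.97        # absorption/emission coefficient for longwave radiation

# Angular weighting factors for a standing person: the four lateral
# directions count more than up and down. They sum to 1.0.
WEIGHTS = {"north": 0.22, "east": 0.22, "south": 0.22, "west": 0.22,
           "up": 0.06, "down": 0.06}

def mean_radiant_temperature(shortwave, longwave):
    """shortwave/longwave: dicts mapping direction -> flux density (W/m^2).
    Returns T_mrt in degrees Fahrenheit."""
    # Weighted sum of absorbed shortwave and longwave radiation.
    s_str = sum(WEIGHTS[d] * (A_K * shortwave[d] + A_L * longwave[d])
                for d in WEIGHTS)
    # Invert the Stefan-Boltzmann law to get an equivalent temperature.
    t_mrt_kelvin = (s_str / (A_L * SIGMA)) ** 0.25
    t_mrt_celsius = t_mrt_kelvin - 273.15
    return t_mrt_celsius * 9 / 5 + 32
```

With no sun (all shortwave zero) and uniform longwave flux of 450 W/m² from every direction, this yields roughly 78 F; the extreme surface fluxes measured between Coor and Payne halls are what push readings toward the 168 F peak reported above.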

Ariane Middel with MaRTy mobile weather station

Assistant Professor Ariane Middel shows off MaRTy, the observational platform used in her newly released study. Photo by Charlie Leight/ASU Now 

The cart is named MaRTy, for mean radiant temperature. They took MaRTy on a campus tour on a record-breaking June day in 2016 when the National Weather Service issued an excessive heat warning. The airport reported 119 F that day. It was the fifth-highest temperature ever recorded in the Valley. Four hikers died that day.

They measured 22 locations with MaRTy every hour from 10 a.m. to 9 p.m. The locations are a combination of various sun-exposed and shaded areas (by trees, photovoltaic canopies, tunnels) with different ground covers (grass, concrete, gravel, asphalt). To stay hydrated, they drank all the sports drinks in one vending machine.

Urban planners and landscape architects will find studies like this useful in cooling down cities from the Persian Gulf to the American Southwest, Krayenhoff said.

"Pedestrians are likely to experience more extreme heat in the coming decades than they do today,” he said. “Common adaptation strategies may fall short in terms of their ability to offset projected air temperature warming. However, your exposure to heat as a pedestrian depends on the radiant environment in addition to air temperature, and by providing additional shade and cool surfaces in key pedestrian areas we may be able to keep pedestrians cool despite the coming heat."

Middel is an assistant professor with dual appointments in the School of Arts, Media and Engineering and the School of Computing, Informatics, and Decision Systems Engineering at ASU and a senior sustainability scientist in the Julie Ann Wrigley Global Institute of Sustainability.

Krayenhoff is now an assistant professor in the School of Environmental Sciences at the University of Guelph in Ontario, Canada.

Both authors are affiliates of the Urban Climate Research Center at ASU.

All photos by Charlie Leight, ASU Now

Scott Seckel

Reporter, ASU Now

480-727-4502

Innovations in imagery

Experiential computing research is revealing ways to integrate more realistic virtual objects into actual environments


June 14, 2019

Robert LiKamWa sees a future in which the next generations of visualization technologies will enhance our experiences in an ever-increasing number of pursuits.

Applications of virtual and augmented reality technologies already significantly impact education and technical training of all kinds — especially online teaching and learning.

Robert LiKamWa, an assistant professor of electrical engineering and arts, media and engineering, leads research on embedded optimization for mobile experiential technology at Arizona State University. Presentations on two of his lab team’s projects will be given at MobiSys 2019, the leading international conference on mobile systems science and engineering.

Engineering, science, communications, arts, culture and entertainment media such as movies and video games are all beginning to benefit from advances in tools that generate 2D and 3D images that can be integrated into actual physical settings, says LiKamWa, an assistant professor at Arizona State University.

He directs the METEOR (Mobile Experiential Technology through Embedded Optimization Research) Studio, which focuses on designs for software and hardware systems to improve the performance of smartphones, tablet computers and other mobile communication devices. The designs are also used in applications of experiential computing — virtual and augmented reality systems.

LiKamWa has faculty positions in the School of Electrical, Computer and Energy Engineering, one of the six Ira A. Fulton Schools of Engineering at ASU, and the School of Arts, Media and Engineering, a collaboration of the Fulton Schools and ASU’s Herberger Institute for Design and the Arts.

From those schools he has drawn more than a dozen students to the METEOR Studio lab, where some are at work on projects to develop, fine-tune and expand the capabilities of computer systems to create new and improved forms of integrated sensory immersion.

Examples of the team’s progress will be highlighted at MobiSys 2019, an annual international conference on mobile systems, applications and services June 17–20 in Seoul, South Korea. MobiSys is presented by the Association for Computing Machinery, the world’s largest scientific and educational computing society.

Presentations on two projects directed by LiKamWa are on the agenda. It’s an exceptional recognition of what his group is accomplishing, given that conference leaders have typically accepted fewer than 25% of presentation proposals throughout the event’s 17-year history.

LiKamWa will present “GLEAM: An illumination estimation framework for real-time photorealistic augmented reality on mobile devices,” a project led by graduate student Siddhant Prakash, who earned a master’s degree in computer science from the Fulton Schools in 2018. 

Computer engineering doctoral student Alireza Bahremand, recent computer science graduate Paul Nathan and recent digital culture graduates Linda Nguyen and Marco Castillo also worked on the project.

GLEAM is short for Generates Light Estimation for Augmented Reality on Mobile Systems. The project involves an augmented reality application that places virtual objects into physical spaces and illuminates them as they would be lit in those actual environments.

“This is about estimating the light in a space more accurately and robustly,” LiKamWa said. “People have been doing this for a long time in postproduction for computer-generated imagery in the film industry but we are bringing this capability to real-time mobile augmented reality systems and doing it in a way that improves the scene’s realism.”

With current systems of this kind, using even low-resolution virtual images requires a large amount of energy, he said, “so we have developed some tactics to actually reduce the energy use but still get better performance from the technology.”
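The core idea — estimating the light in a real scene and using it to shade a virtual object — can be illustrated with a deliberately simple sketch. This is not GLEAM’s actual algorithm (which estimates illumination far more accurately and in real time); the average-color light estimate and the shading step below are my own toy simplifications, with hypothetical function names.

```python
def estimate_ambient_light(env_pixels):
    """env_pixels: list of (r, g, b) floats in [0, 1] sampled from a
    captured image of the real scene. Returns the mean RGB triple as a
    crude estimate of the scene's ambient illumination."""
    n = len(env_pixels)
    return tuple(sum(p[c] for p in env_pixels) / n for c in range(3))

def shade_virtual_object(base_color, ambient_rgb):
    """Tint a virtual object's base color by the estimated light, so the
    object appears lit by the real environment."""
    return tuple(min(1.0, b * a) for b, a in zip(base_color, ambient_rgb))

# Example: under a uniformly warm, dim scene, a white virtual object
# takes on the scene's warm tint.
pixels = [(0.8, 0.6, 0.4)] * 4
light = estimate_ambient_light(pixels)        # -> (0.8, 0.6, 0.4)
shaded = shade_virtual_object((1.0, 1.0, 1.0), light)
```

Real systems replace the average-color estimate with structured representations of incoming light (for example, environment maps), which is where the energy costs LiKamWa describes come from.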

Computer engineering doctoral student Jinhan Hu will present “Banner: An Image Sensor Reconfiguration Framework for Seamless Resolution-based Tradeoffs.”

Recent computer science graduate Saranya Rajagopalan and undergraduate electrical engineering student Alexander Shearer also worked on the project.

Banner enables smartphones, headsets, tablet computers and similar devices to adapt camera settings to the needs of computer vision and augmented reality applications, rendering images at resolutions appropriate to each purpose.

Applications may need high-resolution image streams from the camera to capture distant or detail-intensive features of a scene, LiKamWa says, but they would be able to save energy and boost performance by capturing images at low-resolution settings when possible.

However, when applications try to change resolutions of image streams from a camera, mobile operating systems will drop frames, causing interruptions in the imaging.

With the Banner system, images at differing resolutions are captured and rendered seamlessly, with no loss in performance for app developers. At the same time, the system prolongs battery life by enabling low-resolution image streams.

“What we are doing is rearchitecting the parts of the operating systems, the device drivers and the media frameworks, so that we can actually change the resolution without dropping any frames, not even one frame,” LiKamWa said. “So you get a completely smooth resolution without any loss in the user experience.”

LiKamWa and his lab teams have come up with “two very impressive, elegant, technically demanding, well-implemented and well-evaluated solutions to very different problems,” said Rajesh Balan, an associate professor of information systems at Singapore Management University and a program committee co-chair for the MobiSys Conference.

The GLEAM project is a definite step forward in enabling “much more realistic scenes” using augmented reality imaging, while the Banner project “has high value for any application that processes a large number of photo or video images — for example, face recognition applications on mobile phones,” Balan says.

Balan’s fellow MobiSys 2019 program committee co-chair, Nicholas Lane, an associate professor of computer science at the University of Oxford in England, says LiKamWa is working at the forefront of research poised to produce “powerful mobile vision systems with capabilities that were until recently the domain of movies and science fiction. His work stands out because it rethinks how core aspects of these devices must function so they can better cope with the demands of high-fidelity processing and understanding of visual input.”

LiKamWa “has brought great energy to arts, media and engineering,” said the school’s director, Professor Xin Wei Sha. “He is inspiring students and colleagues alike and his METEOR Studio is blossoming with good students and innovative engineering research projects like GLEAM and Banner that are exploring fundamental experiential and technical aspects of mobile technologies and setting the stage for advances five to 10 years into the future.”

LiKamWa’s research, which has earned support from the National Science Foundation and the Samsung Mobile Processing and Innovation Lab, has led to provisional patent applications on the GLEAM and Banner technologies. His team will be releasing software frameworks for application developers to integrate these technologies into their solutions.

Joe Kullman

Science writer, Ira A. Fulton Schools of Engineering

480-965-8122