
Posting Yelp reviews to Facebook changes their nature, ASU study shows

W. P. Carey School of Business professor Yili Hong breaks ground by examining review text, which is harder to quantify


March 07, 2017

Before you spend your hard-earned cash on a restaurant dinner, you want to make sure it’ll be worth it, and online reviews have become a big part of deciding where to eat.

A quick check of an online-review platform can show you how thousands of other people rated the food and service of restaurants, which thrive on the feedback.

There has been a lot of research on online reviews, but an Arizona State University professor’s paper is breaking new ground by looking at the actual words that people use in their reviews. He and his colleagues, Nina (Ni) Huang of the Fox School of Business at Temple University and Gordon Burtch of the Carlson School of Management at the University of Minnesota, found that when the online-review platform Yelp started allowing users to post their reviews simultaneously on Facebook, it changed the nature of those reviews. The paper, “Social Network Integration and User Content Generation: Evidence from Natural Experiments,” was published in MIS Quarterly.

The result was a double-edged sword for the sites — more reviews, which Yelp wants, but more “emotional” language, which users say is less helpful, according to previous research. In other words, more quantity but less quality.

Yili Hong, an ASU assistant professor of information systems, found that online restaurant reviews became more emotional when they also showed up on Facebook.

“Online reviews help consumers make decisions about which products to purchase, and firms want to leverage that to advertise their products and have good word of mouth,” said Yili Hong, an assistant professor of information systems in the W. P. Carey School of Business.

“There is a long stream of research in our discipline looking at user-generated content, but one thing the research really hasn’t delved into is the textual aspects.”

That’s because it’s much more difficult to quantify the words themselves than to count the number of stars or the number of words in a review.

Hong and his colleagues wanted to see how integrating with Facebook — where consumers’ reviews could be seen by their friends — would change their words.

“How will this affect people’s behavior in writing reviews? They don’t want to disagree with their friends,” Hong said.

The team had a natural experiment to work with: Yelp integrated with Facebook in July 2009, while TripAdvisor did not integrate until 15 months later, leaving a window the researchers could examine. They then randomly selected nearly 4,000 restaurants in New York City, Los Angeles, Chicago, Philadelphia and Phoenix that were reviewed on both platforms from 2008 to 2012.
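For readers curious about the mechanics, the sketch below illustrates the kind of before-and-after, treated-versus-control comparison that a staggered rollout like this makes possible. It is a minimal sketch only: the column names, data layout and the simple difference-in-means calculation are assumptions for illustration, not the authors’ actual estimation code.

```python
import pandas as pd

# Yelp added Facebook cross-posting in July 2009; TripAdvisor followed
# roughly 15 months later, which bounds the comparison window.
YELP_FB_INTEGRATION = pd.Timestamp("2009-07-01")
WINDOW_END = pd.Timestamp("2010-10-01")

def did_estimate(reviews: pd.DataFrame, outcome: str = "emotion_score") -> float:
    """Naive difference-in-differences on a per-review linguistic measure.

    Assumes columns: 'platform' ('yelp' or 'tripadvisor'), 'date' (datetime),
    and a numeric outcome column such as an emotional-language score.
    """
    df = reviews[reviews["date"] < WINDOW_END].copy()
    df["treated"] = df["platform"].eq("yelp")          # Yelp reviews are "treated"
    df["post"] = df["date"] >= YELP_FB_INTEGRATION     # after Facebook integration

    means = df.groupby(["treated", "post"])[outcome].mean()
    # Change on Yelp after integration, minus the change on TripAdvisor
    return (means[(True, True)] - means[(True, False)]) - (
        means[(False, True)] - means[(False, False)]
    )
```

The published study’s econometric specification is more involved; this only conveys the treated/control, before/after logic that a natural experiment provides.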

They used automated text-mining software to calculate the presence of words in three linguistic categories — emotional, cognitive and negation (disagreeing) language — in reviews of the restaurants on both Yelp and TripAdvisor. Then they compared the wording. An example of a review with “emotional” words is “yummy shakes and malts … my favorite place.” Cognitive wording: “Worn-out place, trying to make it charming without really succeeding.” Negation: “Not a lot of parking … food is nothing special.”
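As a rough illustration of what that text-mining step involves, the sketch below counts how often words from small example dictionaries appear in a single review and reports each category’s share of the total word count. The tiny word lists here are assumptions for illustration; the researchers’ software would rely on much larger, validated lexicons.

```python
import re

# Tiny illustrative lexicons; real text-mining tools use validated
# dictionaries with many entries per category.
LEXICONS = {
    "emotional": {"yummy", "favorite", "love", "amazing", "terrible"},
    "cognitive": {"because", "think", "trying", "reason", "without"},
    "negation": {"not", "no", "never", "nothing"},
}

def category_rates(review: str) -> dict:
    """Return each linguistic category's share of the words in one review."""
    words = re.findall(r"[a-z']+", review.lower())
    total = len(words) or 1
    return {
        cat: sum(w in lexicon for w in words) / total
        for cat, lexicon in LEXICONS.items()
    }

print(category_rates("Yummy shakes and malts ... my favorite place."))
# roughly {'emotional': 0.29, 'cognitive': 0.0, 'negation': 0.0}
```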

They found that when reviewers knew that their Facebook friends would see their reviews, they used more emotional language — and more positive emotions — and less cognitive language. There was also a big decrease in “negation” words.

That’s not necessarily a good thing for the review sites, Hong said.

“If you’re very emotional, people will think you’re just coming to dump your emotions in the reviews as opposed to being logical,” he said.

One takeaway for review sites: Consider ways to encourage users to be more logical and less emotional when writing reviews.

Online reviews are a huge business, both for the platforms — which make money by selling ads based on the number of views on the site — and for the businesses that are reviewed.

Hong has two other papers on user-generated content that were recently published in the journal Management Science. In one, he and his team measured how people could be persuaded to write online reviews. The best way? A combination of financial incentives and peer pressure — telling them how many of their peers had contributed reviews. In another study, he found that using push alerts to tell reviewers how many “likes” they had compared with other reviewers only prompted them to produce more if they were winning. Low-performing reviewers would slack off when they discovered they weren’t competitive in “likes.”

Analyzing the linguistic features of reviews is the next frontier of research as more sophisticated evaluation techniques are developed, Hong said.

“Nowadays people are thinking about tweets and other social-media things, and they want to look into the text to measure things,” he said.

“Maybe they will even be able to measure sarcasm one day.”
