Assistant Professor of Business Administration
Michael Luca is a faculty member at Harvard Business School. Professor Luca works closely with companies and cities to help them become more data-driven, and has ongoing collaborations with Yelp, Facebook, the UK government, and the City of Boston, among other partners.
Professor Luca teaches The Online Economy, an elective course about the strategic and operational decisions faced when designing and launching an online platform. He also teaches IFC: Behavioral Insights, an elective course in which student teams develop behavioral interventions and experimental designs for government and company clients.
Professor Luca's current work focuses on digital data and platforms, analyzing a variety of companies including Yelp, Amazon, and Airbnb. Professor Luca also works on issues related to the design of information disclosure. Focusing on the behavioral foundations of how people make decisions, he has done work on rankings, expert reviews, online consumer reviews, and quality disclosure laws, among other types of information provision.
His work has been written about in a variety of media outlets including The Wall Street Journal, New York Times, Washington Post, Boston Globe, Guardian, Telegraph, Huffington Post, Harvard Business Review, Atlantic, Quartz, Vox, and Forbes.
My new research on discrimination on Airbnb
Your Company Is Full of Good Experiments (You Just Have to Recognize Them)
How to Design (and Analyze) a Business Experiment
The High School Senior's Dilemma: Where Should I Go to College?
City Governments Are Using Yelp to Tell You Where Not to Eat
Keeping it Fresh: Predict Restaurant Inspections
My collaborators and I co-sponsored a competition with Yelp, with support from the City of Boston, to explore ways to use Yelp review data to improve the city's health inspection process.
The goal for this competition was to use data from social media to narrow the search for health code violations in Boston. Competitors had access to historical hygiene violation records from the City of Boston — a leader in open government data — and Yelp's consumer reviews. The challenge: Figure out the words, phrases, ratings, and patterns that predict violations, to help public health inspectors do their job.
More information, including the results of the competition, is available on the competition page.
On the Facebook and OkCupid experiments
Should companies run experiments?
How to update credit card disclosures for the digital age.
Digital Discrimination: The Case of Airbnb.com
Online marketplaces often contain information not only about products, but also about the people selling the products. In an effort to facilitate trust, many platforms encourage sellers to provide personal profiles and even to post pictures of themselves. However, these features may also facilitate discrimination based on sellers' race, gender, age, or other aspects of appearance. In this paper, we test for racial discrimination against landlords in the online rental marketplace Airbnb.com. Using a new data set combining pictures of all New York City landlords on Airbnb with their rental prices and information about the quality of the rentals, we show that non-black hosts charge approximately 12% more than black hosts for the equivalent rental. These effects are robust when controlling for all information visible in the Airbnb marketplace. These findings highlight the prevalence of discrimination in online marketplaces, suggesting an important unintended consequence of a seemingly routine mechanism for building trust.
Read the recent coverage by The Boston Globe and Forbes.com.
Fake It Till You Make It: Reputation, Competition, and Yelp Review Fraud
Consumer reviews are now a part of everyday decision-making. Yet the credibility of reviews is fundamentally undermined when business owners commit review fraud, either by leaving positive reviews for themselves or negative reviews for their competitors. In this paper, we investigate the extent and patterns of review fraud on the popular consumer review platform Yelp.com. Because one cannot directly observe which reviews are fake, we focus on reviews that Yelp's algorithmic indicator has identified as fraudulent. Using this proxy, we present four main findings. First, roughly 16 percent of restaurant reviews on Yelp are identified as fraudulent; these reviews tend to be more extreme (favorable or unfavorable) than other reviews. Second, a restaurant is more likely to commit review fraud when its reputation is weak, i.e., when it has few reviews or has recently received bad reviews. Third, chain restaurants, which benefit less from Yelp, are also less likely to commit review fraud. Fourth, when restaurants face increased competition, they become more likely to leave unfavorable reviews for competitors. Taken in aggregate, these findings highlight the extent of review fraud and suggest that a business's decision to commit review fraud responds to competition and reputation incentives rather than simply the restaurant's ethics.
Read the Wall Street Journal’s recent blog coverage
Where Not to Eat? Improving Public Policy by Predicting Hygiene Inspections Using Online Reviews
This paper offers an approach for governments to harness the information contained in social media in order to make public inspections and disclosure more efficient. As a case study, we turn to restaurant hygiene inspections, which are conducted for restaurants throughout the United States and in most of the world and are a frequently cited example of public inspections and disclosure. We present the first empirical study that shows the viability of statistical models that learn the mapping between textual signals in restaurant reviews and the hygiene inspection records from the Department of Public Health. The learned model achieves over 82% accuracy in discriminating severe offenders from places with no violation, and provides insights into salient cues in reviews that are indicative of the restaurant's sanitary conditions. Our study suggests that public disclosure policy can be improved by mining public opinions from social media to target inspections and to provide alternative forms of disclosure to customers.
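To give a flavor of the kind of text-to-inspection mapping the paper describes, here is a minimal sketch of a word-based classifier that separates likely violators from clean restaurants. It uses a toy naive Bayes model rather than the paper's actual method, and all reviews and labels below are invented for illustration; the real study trained on Yelp reviews matched to official inspection records.

```python
import math
import re
from collections import Counter

# Toy corpus: (review text, label). Label 1 = restaurant was later cited
# for a severe violation, 0 = no violation. All examples are invented.
TRAIN = [
    ("the bathroom was dirty and the table was sticky", 1),
    ("saw a roach near the kitchen, gross", 1),
    ("food poisoning after eating here, smelled bad", 1),
    ("spotless dining room and friendly staff", 0),
    ("clean, fresh ingredients, great service", 0),
    ("lovely decor and the food was delicious", 0),
]

def tokenize(text):
    return re.findall(r"[a-z']+", text.lower())

def train(examples):
    """Fit a multinomial naive Bayes model with add-one smoothing."""
    word_counts = {0: Counter(), 1: Counter()}
    class_counts = Counter()
    for text, label in examples:
        class_counts[label] += 1
        word_counts[label].update(tokenize(text))
    vocab = set(word_counts[0]) | set(word_counts[1])
    return word_counts, class_counts, vocab

def log_posterior(model, text, label):
    """Unnormalized log-probability of `label` given the review text."""
    word_counts, class_counts, vocab = model
    total = sum(word_counts[label].values())
    score = math.log(class_counts[label] / sum(class_counts.values()))
    for w in tokenize(text):
        score += math.log((word_counts[label][w] + 1) / (total + len(vocab)))
    return score

def predict(model, text):
    return 1 if log_posterior(model, text, 1) > log_posterior(model, text, 0) else 0

model = train(TRAIN)
print(predict(model, "dirty kitchen and a roach on the floor"))  # 1: flag for inspection
print(predict(model, "clean and delicious, great staff"))        # 0: likely fine
```

In the same spirit as the paper's "salient cues," the per-word counts in a model like this reveal which terms ("dirty," "roach," "smelled") carry the predictive signal, which is the kind of output an inspector could use to prioritize visits.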
Read the Atlantic’s recent coverage.
When 3+1>4: Gift Structure and Reciprocity in the Field
Do higher wages elicit reciprocity and hence higher effort? In a field experiment with 266 employees, we find that paying above-market wages, per se, does not increase effort relative to paying market wages. However, structuring a portion of the wage as a clear and unexpected gift (offering a raise, with no further conditions, after the employee has accepted the contract for a one-time job with no future employment) does lead to higher effort for the duration of the job. Targeted gifts are more efficient than hiring more workers. However, this mechanism is unlikely to explain persistent above-market wages.
Read the Washington Post’s recent blog coverage.