
Addressing Racial Discrimination on Airbnb

source link: https://hbr.org/podcast/2023/01/addressing-racial-discrimination-on-airbnb

January 31, 2023

For years, Airbnb gave hosts extensive discretion to accept or reject a guest after seeing little more than a name and a picture, believing that eliminating anonymity was the best way for the company to build trust. However, the apartment rental platform failed to track or account for the possibility that this could facilitate discrimination.

After research published by Harvard Business School associate professor Michael Luca and others provided evidence that Black hosts earned less in rent than hosts of other races, and that guests with distinctively African American-sounding names were more likely to be rejected, the company had to decide what to do.

In the case, “Racial Discrimination on Airbnb,” Luca discusses his research and explores the implications for Airbnb and other platform companies. Should they change the design of the platform to reduce discrimination? And what’s the best way to measure the success of any changes?

BRIAN KENNY: In Romeo and Juliet, William Shakespeare asked, “what’s in a name?” As it turns out, names matter a lot, particularly if your name is common among African Americans and you’re applying for a job or a credit card or admission to college or preschool for that matter. Examples of discrimination in each of these situations and many others have been documented numerous times. A 2018 Gallup poll of Black Americans found that nearly two thirds perceive that Blacks are treated less fairly than whites while shopping in brick and mortar stores, driving many to opt instead to shop online. That approach works well in an anonymous transaction, but what happens when the buyer must reveal their identity? Today on Cold Call, we’ve invited Professor Michael Luca to discuss the case entitled, “Racial Discrimination on Airbnb.” I’m your host Brian Kenny, and you’re listening to Cold Call on the HBR Podcast Network. Mike Luca’s work focuses on the design of online platforms and on helping organizations to leverage data to inform managerial and policy decisions. That’s pretty much exactly what we’re going to talk about today. Mike, thanks for joining me.

MIKE LUCA: Yeah, thanks for having me.

BRIAN KENNY: I think obviously many of our listeners are probably familiar with Airbnb. Many of them have used it, including me. I’ve used it before. I’m a customer, so I was actually pretty taken aback when I read the case and I remember this work. This case actually was written in 2014 about some work that you did back then. Did I get that date wrong?

MIKE LUCA: I had one paper that came out in 2014. Then we had the 2015 paper, and this is actually a new case that puts us in the footsteps of Airbnb after the research.

BRIAN KENNY: After the research. Okay, great. Anyway, I think people will be really interested in hearing about the insights that you bring to this conversation and the steps that they’ve taken as a result of some of your work. Thanks for writing it and thanks for coming on. Why don’t we dive right in? I’m going to ask you to start by telling us what the central issue is in the case, and what your cold call is to start the discussion.

MIKE LUCA: Let me give a little bit of backdrop that’ll give a sense of what the central issue is. If we rewind to some of the early research on the economics of the internet, there were a bunch of predictions about how digitization and online platforms were going to change transactions. A bunch of this relates to efficiency: you can think about a broader variety of products being available, lower search costs, lower prices, and some of these predictions have borne out. But there was also some discussion about how it was going to affect equity. In some of the early research, there was the idea that the internet was going to create more arms-length transactions, and by doing that, it had the possibility of reducing discrimination. Airbnb then came in, in the second wave of platforms, and actually didn’t have an arms-length transaction. Instead, they were making not only the products and services very transparent to customers, but also the people who were renting and the people who were renting out their places. The idea that we had in the research, and this is the jumping-off point of the case, is that this has important implications. On Expedia or earlier platforms, you would just be able to book a place when it’s listed as available and proceed without any intermediate steps. On Airbnb, you have hosts who see a name and a picture of a person and then decide whether or not to reject them. The concern we had is that this actually had the possibility of facilitating further discrimination in an online marketplace, even more so than some of the earlier-generation platforms.

The case now puts students in the situation of Airbnb after they found out about the research that my collaborators and I had done looking at discrimination on the platform, and puts them in the shoes of the CEO, Brian Chesky: what should you do when you see that the platform has facilitated widespread discrimination? What action should you take to help reduce discrimination? Now, what’s the opening question? I like to open with a straightforward question that I think gets at the heart of this issue, which is: does Airbnb facilitate discrimination?

BRIAN KENNY: Yeah, that’s pretty direct.

MIKE LUCA: You can see there are kind of two parts in that question, right? What is the evidence on discrimination? How do you as a CEO understand what the implications of your product design are? The second part is about how actionable this is. What are the things that you as a CEO or a leader in a company could do if you see discrimination? Are there steps you could take to reduce it?

BRIAN KENNY: Yeah, that’s a great way to start the conversation. We always ask our guests what motivates them to write a particular case. And I’m wondering in this instance, how does this relate to the kind of things that you think about as a scholar?

MIKE LUCA: This is a topic I’ve been working on for a long time now. My research focuses broadly on the rise of online platforms, thinking about some of the design choices that platforms make and how we could have an online economy that’s both efficient and inclusive. For me, the case is an extension of that broader agenda around inclusivity on the internet. I actually started this case after the research we had done, when we had written these papers that documented discrimination on Airbnb. I was motivated to say, “Okay, that’s kind of the research landscape that helps us understand what the problem is.” We also put forward some of our own proposals for how Airbnb might fix it, and we’ve been interested in developing a toolkit for that. But then what I thought is that this is a great topic for students to grapple with, because it’s not enough for a leader to sit on the sidelines and say, “Oh, these societal issues are outside of my purview.” You need to bring them in and think about the managerial perspective on what you could do as a leader to address them.

BRIAN KENNY: Yeah. And many of them will probably encounter situations like this wherever they end up. That’s a great point. What’s kind of ironic about the whole thing is that Airbnb had a strategy and their strategy was to build the platform on trust. Describe a little bit more what their approach was and why they chose that, but also how this strategy might have actually facilitated distrust among their clients.

MIKE LUCA: You could think about some of the earlier platforms, Priceline, Expedia: you go on, you see a property, you see a price, you see the dates that might be available, but what you don’t see is a picture of the host or the person who’s managing the property. And what you don’t see is the hotel having the option of whether or not to reject somebody. Now we could think about Airbnb’s strategic decision. That was a pretty wide departure from the way things were being done in the industry until Airbnb and others started to do it. As for what the strategy was, there was kind of an inkling that by connecting people and allowing them to see each other’s information, you might facilitate some extent of trust or a connection between the two individuals.

BRIAN KENNY: Yeah, there’s transparency there, that matters.

MIKE LUCA: There’s at least a kind of perception of transparency or a sense that you might be reducing the social distance between two people.

BRIAN KENNY: I mean, I always thought the very concept of Airbnb was a strange one when it first came out because you’re inviting somebody into your home, to sleep in your bed or however you want to think about that. I think that that whole notion, that strategy of building the platform on trust makes a lot of sense in that context.

MIKE LUCA: You could actually think about some of the early lines coming out from Chesky and others at Airbnb, where they wanted to make hosts feel comfortable. There were marketing campaigns centered around the idea that if you don’t feel comfortable with the looks of a guest, you could just reject them for any reason and not feel like you’re going to be penalized for that. How stringent they’ve been about allowing a host to reject somebody, and why you might be allowed to reject someone, has kind of ebbed and flowed.

BRIAN KENNY: Yeah. I know we talked a little earlier before we started recording today about how some of your work has changed the way people think about discrimination online. I’m wondering if you can give us more of a sense of your work more broadly in this area.

MIKE LUCA: I think it’s fair to say there’s been a big missed opportunity in the tech ecosystem to proactively take steps to create more inclusivity in the design of the products that companies are putting out there. My research has taken as a starting point the idea that the techno-utopian vision of a discrimination-free world online isn’t something we can just take for granted. In fact, some design choices could make discrimination even worse, and I think that’s what we’ve documented in the context of Airbnb and elsewhere. Since then, we’ve been thinking: now that we know this is an extensive problem, and we’re getting a better sense of the contours of the problem, what are the steps a platform could take to reduce bias? That’s kind of the broader agenda around this.

BRIAN KENNY: Well, maybe we can focus on Airbnb in that context. What were some of the things that you did in your experiment that brought this to their attention?

MIKE LUCA: I should say that even before the experiment, we had some non-experimental data that they and others had seen. I think companies were a little bit slow to start thinking about this as an issue, and what was really a wake-up call for the industry was a large-scale experiment where we sent out requests to about 6,400 hosts, varying only the name of the guest and keeping the rest of the request the same. The names followed an audit study methodology that’s used by labor economists and also by policy makers, where the names are statistically indicative of ethnicity. Think about names that are statistically more common among African Americans versus white Americans. And you could see that having a distinctively African American-sounding name led requests to get accepted about 16% less often relative to an equivalent request from a white guest. That’s kind of the first punchline of the paper.
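[Editor’s note: the acceptance-rate gap Luca describes can be summarized with a standard two-proportion comparison. A minimal Python sketch follows; the counts below are illustrative only, not the study’s actual data.]

```python
import math

def acceptance_gap(accepted_a, total_a, accepted_b, total_b):
    """Compare acceptance rates between two guest-name groups
    using a two-proportion z-test (large-sample approximation)."""
    p_a = accepted_a / total_a          # acceptance rate, group A
    p_b = accepted_b / total_b          # acceptance rate, group B
    p_pool = (accepted_a + accepted_b) / (total_a + total_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / total_a + 1 / total_b))
    return p_a - p_b, (p_a - p_b) / se  # gap in rates, z-statistic

# Illustrative numbers only (NOT the study's raw counts):
# 3,200 requests per name group, 50% vs. 42% acceptance.
gap, z = acceptance_gap(1600, 3200, 1344, 3200)
print(f"gap = {gap:.3f}, z = {z:.2f}")
```

With these made-up counts the relative gap is 0.08 / 0.50 = 16%, matching the magnitude Luca cites, and the z-statistic is large enough that the difference would not plausibly be noise.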

BRIAN KENNY: Huge. That’s huge.

MIKE LUCA: These are big effects, right? There’s a pretty big problem at the application phase. Then we started thinking: is this driven by shared properties? What’s going on here? And we found that it was pretty persistent; even property managers who had multiple listings were still discriminating on the platform.

BRIAN KENNY: What was some of the fallout from that study? Because I do remember this making headlines back when it initially came out.

MIKE LUCA: After we wrote the paper, somebody from Airbnb actually flew out to Boston and we met over a pizza and talked about the things a platform might do. That was kind of the impetus for starting to think more, and we made our recommendations to Airbnb. I have an HBR article on the topic, and we’ve put together more toolkit-building type things about what a platform might do. We could pause here and think: what might a platform do? If you’re sitting there at Airbnb and you want to reduce discrimination, well, you might try to automate transactions more than they were being automated.

BRIAN KENNY: Does that sort of fly in the face though, of the trust strategy?

MIKE LUCA: Well, we could think about what does it mean to build trust, and it’s a certain type of trust to say, well, you should trust the platform because we’re making it easy for you to reject someone if you don’t think that they’re trustworthy. I think what we’re pushing them to do, and this builds to another recommendation we made, is that the more robust you can make the rest of the system, the more people are going to trust Airbnb and they’re not going to have to rely on their own biases to determine who they should or shouldn’t be allowing to stay at their place.

BRIAN KENNY: Yeah. Did they take you up on any of the suggestions that you came up with?

MIKE LUCA: Some years have gone by since our initial research. It’s been interesting to watch the arc of this, and we can think about the position of Chesky and other CEOs at these companies trying to figure out how to move forward. At first, I think they were a little bit slow to react. They’ve been very public about the fact that they were slow to take the issue on, and to their credit, they’ve acknowledged that. Now, what have they done? They put together a task force to evaluate proposals, ours and other people’s, and they put together a set of commitments of changes they would make. Out of the changes we proposed in our HBR article, there are a few they’ve taken up and put in place at a reasonable scale across the platform. They’ve increased the amount of what’s referred to as instant booking: things that look a little more like Expedia or Priceline, where you don’t have that intermediate step; a host just allows people to book if there’s listed availability. The second thing they implemented concerns pictures: they’ve now removed pictures of guests until after the booking decision is made. That was a big change on the platform back in 2018. They’ve also tried to build out other systems of trust, to give people confidence. And at kind of a meta level, you could think of their changes as product changes, such as more instant booking and getting rid of pictures, and also process changes: they’ve built out a team focused on looking at bias on an ongoing basis, treating this not as a one-time fix but as something they need to stay aware of on an ongoing basis.

BRIAN KENNY: Do they talk about these efforts at all? Or is this something they’re just kind of doing quietly hoping that it addresses the problem?

MIKE LUCA: No, we could think about what it means to talk about the changes. When they make changes, they’re public about it. Here’s a concrete example: they made people sign new terms of service saying they wouldn’t discriminate. A bunch of people ended up getting kicked off the platform for not wanting to sign it.

BRIAN KENNY: Wow.

MIKE LUCA: And when they make these changes, they tell the world that they made them. Now you could ask, is that enough? One area where it would be nice to see more, in general, is putting out concrete data on how much bias is left and how much of the problem these different changes solve. I think that’s important for a number of reasons. It’s important when you’re thinking about building trust among customers: right now customers just have to take your word about whether these things were very effective, kind of effective, or not effective at all. Are there more things they could have done? Having more transparency about that would be helpful. It’s also relevant for policy makers. California actually reached an agreement with Airbnb to make it easier to do things like audit studies on the platform and to have more reporting. I think policy makers need to keep their eyes open and push platforms to look at what the data is actually showing, rather than just what the changes are. Because it’s hard; the reason these problems take a while to solve is that they’re hard to solve. It’s important to know both what is happening and how effective the different strategies have been.

BRIAN KENNY: What are they measuring? What are the kinds of things they’re looking at and how does it affect the way that they experiment on the platform itself going forward?

MIKE LUCA: So, it’s a great question. Actually, one of the interesting things about this: I’ve taught this in different contexts. It’s going to be taught in our “LCA” course, but I’ve taught it in-

BRIAN KENNY: Which is “leadership and corporate accountability” for people who aren’t familiar, that’s all about business ethics and doing the right thing.

MIKE LUCA: And I think this is a real moment for leaders to think about how you bake that more into your core operations. Actually, this is an example where it became a headline priority for Airbnb; it’s not something that just sits in a different group that’s thinking about accountability. It’s something that everybody at the company needs to think about. But now back to your question: what does this mean for experiments? What does this mean for measurement? These are challenging issues, technically challenging and also managerially relevant ones. I’ve taught this in my “data-driven leadership” course as well. One of the things we think about, which I think is really pertinent for leaders at companies, is: what’s the thing you’re measuring? There’s this old adage, what you measure is what you get, and in the age of experiments, that’s more true than ever, right? It’s not enough to just measure the easiest-to-measure stuff. Airbnb had certainly been running experiments prior to ours; they had run hundreds of experiments at a time, testing different product changes. But what was missing is that if you aren’t measuring something that gives you a read on whether bias is increasing or decreasing, or how much there is, it’s pretty hard to think you’re able to solve the problem. Essentially the company was optimizing to a narrower and shorter-run set of outcomes, and by missing this, they were missing the boat in their experimentation. Now you could ask, how does a company change this? I can talk generally about this. If you’re sitting at Airbnb or another company and saying, “Okay, how do I better measure these types of issues?” you could think, “Can I bake these things more into my core metrics?
Can I do ongoing reports on how much discrimination there is, and can I put in guardrails?” Maybe a change might be good for short-run profitability or conversions, but if it’s actually bad for discrimination, then you should really be asking yourself as a leader, “Is this a change that I want to be making?” even if it looks okay on some of the other metrics you’re interested in.

BRIAN KENNY: I mean, how would a firm know if they’ve got a discrimination problem? I mean, are there certain markers that they can look for, certain indicators?

MIKE LUCA: A lot of companies might be thinking, “Well, I don’t even measure ethnicity,” or there are outcomes or types of discrimination you might be interested in looking at that you don’t have at your fingertips. What are the things you might do? One: just having broad discussions of what types of issues might be going on would allow leaders to get a broader sense of what data they even want to be checking and where they think there’s a problem. More generally, once companies look, they get a sense of, “Oh, here are areas where bias might be creeping in,” and that could allow them to do more targeted deep dives. Let’s look at Airbnb. They could have looked at complaints that were coming in, allegations of bias. Now you might say, “Okay, those are not definitive, or we don’t know how wide-scale that is.” But then you could build a measurement system to estimate or approximate the amount of bias on the platform, which is essentially a version of what we did in our experiment, using things that are indicators of ethnicity, and then you could see how product changes affect that. In fact, we’ve now been putting together some guidelines for what types of things companies might look at. For example, where you don’t know the race or ethnicity, you could look at people’s names or pictures, statistically estimate the likelihood of different ethnicities, and then use that to get some sense of how bias is changing with different product changes. And that’s something that’s been adopted at a couple of companies now. Airbnb has done versions of this. Uber has done versions of this. I think companies are starting to say, “Okay, even if there’s not a perfect solution, there are two or three different things you might be able to do to measure bias on the platform and then use that to guide your decisions.”
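[Editor’s note: a minimal sketch of the kind of name-based proxy metric Luca describes, assuming a hypothetical name-to-group probability lookup. The names, probabilities, and booking records below are invented for illustration; real systems would derive the probabilities from sources like census data.]

```python
# Hypothetical bias-monitoring metric: estimate group-level acceptance
# rates by weighting each booking request by the probability that its
# guest belongs to a given group, inferred from the guest's name.

# Made-up probabilities that a first name belongs to group "B".
P_GROUP_B = {"Lakisha": 0.93, "Jamal": 0.91, "Emily": 0.04, "Greg": 0.05}

# Invented booking log: guest name and whether the host accepted.
bookings = [
    {"name": "Lakisha", "accepted": False},
    {"name": "Jamal",   "accepted": True},
    {"name": "Emily",   "accepted": True},
    {"name": "Greg",    "accepted": True},
]

def weighted_acceptance(records, p_lookup):
    """Return (group-B rate, other-group rate): each request contributes
    to both groups, weighted by its group-membership probability."""
    w_b = w_other = acc_b = acc_other = 0.0
    for r in records:
        p = p_lookup.get(r["name"], 0.5)  # unknown name: uninformative prior
        w_b += p
        w_other += 1 - p
        if r["accepted"]:
            acc_b += p
            acc_other += 1 - p
    return acc_b / w_b, acc_other / w_other

rate_b, rate_other = weighted_acceptance(bookings, P_GROUP_B)
```

Tracked over time, a metric like this would let a platform see whether a product change (say, hiding guest photos) narrows or widens the gap between the two rates, without ever collecting self-reported ethnicity.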

BRIAN KENNY: Just raising the awareness is huge, right? The fact that your work has been able to elevate these firms’ awareness that there might be a problem they should look into.

MIKE LUCA: Absolutely. Once leaders know, then it just takes some creativity and putting some tools in place. And lots of these things are now increasingly off the shelf to start to understand how do we address this?

BRIAN KENNY: Yeah, and it sounds like Airbnb was very responsive and it doesn’t sound like they backed off their original strategy of trust. They’re just trying to find ways to make it more trustworthy.

MIKE LUCA: It depends how you think about trust. There are a couple of lines from the early 2010s where platforms were thinking about the future of anonymity online. If you think removing anonymity on Airbnb means that everybody should see everything about everyone they’re transacting with before deciding whether to accept or reject somebody, then yeah, you might be introducing some targeted anonymity to reduce bias. Or, on the flip side, if it’s really about building connection, you can make the accept-or-reject decision first, and then there are creative ways to help build that sense of connection that don’t make race and ethnicity so salient in a way that’s likely to facilitate bias.

BRIAN KENNY:  I know you look at a lot of different companies as you’re doing this research. Are there other examples you can cite of firms that are doing innovative things to try and get at this issue?

MIKE LUCA: There are a lot of companies trying to mitigate bias in the ways that Airbnb is now taking steps to do. But there are also efforts I find interesting from companies thinking about how to create initiatives that bring attention to historically underrepresented groups. To give one example from a paper we’re working on now, Yelp recently rolled out a feature to let people more easily search for Black-owned businesses. And what we found is that having this feature actually brings more business, both on the platform and in general, to those restaurant owners. As a leader, you’re thinking both about how to mitigate bias and about whether there are groups you want to raise awareness of. Putting these pieces together, it all comes down to having a broad lens: think about the social ecosystem in which you’re engaging and what you could do as a business leader to create an inclusive ecosystem.

BRIAN KENNY: Yeah. This has been a great conversation, Mike. Lots of things to think about. And obviously this is a continuing frontier; everybody’s still learning what it means to do business online and to create these kinds of relationships. Your work is having a great impact there. Before we let you go, I just want to ask you: if there’s one thing you want our listeners to remember about the Airbnb case, what is it?

MIKE LUCA: It’s hard to say just one thing. At a high level, I would say: whether you’re in charge of a product or in charge of a process, think about whether you’re leaving the world with an efficient and inclusive product or process, and whether you’re achieving the goals you want to achieve without these types of unintended consequences.

BRIAN KENNY: Yeah. Mike, thank you for joining me.

MIKE LUCA: Thank you.

BRIAN KENNY: If you enjoy Cold Call, you might like our other podcasts, After Hours, Climate Rising, Deep Purpose, IdeaCast, Managing the Future of Work, Skydeck, and Women at Work. Find them on Apple, Spotify, or wherever you listen. And if you could take a minute to rate and review us, we’d be grateful. If you have any suggestions or just want to say hello, we want to hear from you. Email us at [email protected]. Thanks again for joining us. I’m your host, Brian Kenny, and you’ve been listening to Cold Call, an official podcast of Harvard Business School and part of the HBR Podcast Network.

