- 11 Jul 2018
- Managing the Future of Work
Ep 8: What can businesses learn from the present crisis of trust in tech?
Bill Kerr: Facebook has reportedly shared large amounts of personal information on users and their friends with 60 companies, without getting consent from all parties involved. And Uber has provided a fake version of its app to government officials trying to investigate the company. Actions like these jeopardize the trust placed in business. In fact, only half of Americans today trust companies. This mistrust has been fueled in recent decades by corporate layoffs and outsourcing and by the role of business in the 2008 financial crisis. In the Twitter era, moreover, leaders can wake up in the morning to a raging firestorm due to the poor actions of a single employee. Sandra Sucher from the Harvard Business School is conducting research on the current challenges of tech companies.
Sandra Sucher: Tech companies are under attack at the moment because they don’t understand the different dimensions of trust.
Kerr: Sucher says it isn’t enough for tech companies to be competent. If they cross ethical boundaries, they may face serious penalties. As a result of Uber’s actions, the city of London revoked its license to operate.
Sucher: The language that the city of London used is really pretty expressive. It said that Uber did not deserve a license to operate in the city of London because it was not a fit and proper business.
Kerr: Although trust in tech companies may have taken a hit, Sucher says trust can be rebuilt.
Kerr: Welcome to Harvard Business School’s podcast on Managing the Future of Work. I’m Professor Bill Kerr. I’m pleased to be joined by my colleague and Harvard Business School professor, Sandra Sucher, to discuss her new and important research on how companies can build and maintain trust. Welcome, Sandra.
Sucher: Well, thank you.
Kerr: Sandra, why is talking about trust in business so important now?
Sucher: Well, it’s important for a couple of reasons. One is trust in business is low. So over the past decade, trust in businesses ranged between 43 and 53 percent. So roughly one out of every two people does not trust business. And if you’re in a business, that’s not a good rate.
Kerr: And is the trend up or down?
Sucher: The trend recently has been up, but up to 53 percent is still kind of a low bar.
Kerr: Still a low amount. Probably higher than Congress.
Sucher: Right.
Kerr: But lower than where the businesses would like to be.
Sucher: Exactly. Well, that is true. Yeah.
Kerr: So why is this important?
Sucher: So this is important really because we trust ... We need to trust businesses because they have so much power over our lives.
Kerr: What type of power do companies have over us?
Sucher: As customers, we trust businesses to fill our wants and needs. As employees, they have power. They can hire us. They can develop us. They can promote us. They can lay us off, and they can fire us. As investors, they have the power to make decisions that either put the company in a good place or in a not-so-good place.
Kerr: Are there any other ways that power gets exercised?
Sucher: Oh, sure. So the last way is really just us as members of the general public, and here they have powers over our communities, including the power to determine how much information about us and our friends gets into the public domain.
Kerr: Okay, this sounds familiar. Maybe take us to that step.
Sucher: So Facebook, it turns out, had opened up deals with 60 phone and other kind of device makers to continue to give them access to our personal data after they [Facebook] signed a consent decree with the Federal Trade Commission in 2011 saying that they would protect us. And these 60 companies ... four of them were outside the country. Four of them were Chinese. One of them is on a list as being a national security risk. But what really blew my mind were the statistics that the reporter who worked on this story found. So he used his BlackBerry to see, well, so how bad is this? And what he found was that this app on his BlackBerry could locate personal data about 550 of his friends. Now, he’s a great guy if he’s got 550 friends.
Kerr: And he’s got a BlackBerry, which also is …
Sucher: … says something about him …
Kerr: ... not exactly ...
Sucher: Right. And the other thing is the BlackBerry could find personal data about 300,000 of his friends’ friends. So when you think about whether or not anyone has given permission for that kind of access, that is a way that business has power over our lives, who we are, who knows about us, that really is truly awesome.
Kerr: Beyond these very high-profile examples, what distinguishes a company as trusted versus one that’s not?
Sucher: It’s actually a lot of hard work. There are two things that companies need. So, obviously, you have to be competent in order to be the kind of firm that people do business with. But in addition to that, you actually need to be legitimate. You need to be regarded as a legitimate business, and what I mean by that is legally, a sort of license to operate, so that you’re chartered by the government wherever you do business, but also more generally. Legitimacy is kind of a form of consent.
Kerr: Give us an example of a company that sort of built that consent up.
Sucher: So think about Airbnb. In the beginning, who would have thought that people would trust themselves to go spend the night at somebody else’s house that wasn’t a hotel or their mother’s, right?
Kerr: Yes. Or Uber. I mean, hitchhiking suddenly becomes something of a regulated industry or something we can be a part of.
Sucher: Exactly. And legitimacy has this initiating phase where a company is first getting established and becoming legitimate and offering a certain kind of service. And then there’s a kind of a joining stage where over time people have to keep coming to it, and that’s when they need to pay more attention to how well they’re doing their service and all of that.
And there’s also a phase in legitimacy where people can withdraw their consent. There are many stories that you can read about people saying, “Well, I’m not going to do business with Airbnb because of what I found.” And so consent can be given; it can be taken away.
Kerr: You’ve paid close attention to how the Michelin company was able to build up its trust and then gain advantages from that. Can you tell us a little bit about that story?
Sucher: Yes. So there’s a great example. Michelin is a company that has worked hard both at being trusted and at trusting other people. In 2015, they announced that they had signed an agreement with the employees and unions of a factory that they had in Rouen in France. And this agreement came out of a process that Michelin started where they had a long-range planning process for restructuring. And so they knew that this factory was going to actually be cut.
So they decided to give the factory a chance. And they said, “Look. You’re on the chopping block here, but we’re going to give you a year to see if you can come up with a way to become more economically viable as one of our factories. And if you can, we’ll see what we can do.” So over the course of a year, more than 70 people worked, and this included managers, union representatives, and employees. So first they had to identify what was wrong with the actual factory. Why wasn’t it making money? And then they had to build a plan for what they were going to fix, and this plan was very comprehensive. It involved quality, safety, productivity, morale, and technology. So it was all the things that they felt were preventing them from being more productive. And they presented it to Michelin, and Michelin actually bought it. They said, “We’re going to give you …” and they invested 80 million euros in a new production line for premium tires, which was much, much easier to work with and better from a quality standpoint than their old, inefficient line. And then 95 percent of the employees of this factory signed on to this agreement, as did its unions. If you know anything about the labor situation in France, it truly is miraculous what kind of agreement can come, as people overcame decades of mistrust, to try to figure out how they could work together.
Kerr: And that achieved great operational advantages in terms of the ability to move the factory up or down.
Sucher: Correct. One of the most striking improvements that they made is that they changed their staffing and scheduling so that they could flex their production up or down by 12 percent. And if you know anything about manufacturing businesses, 12 percent is huge in terms of allowing you to be responsive to immediate changes in the market.
Kerr: In your research in this area, you describe several levels or aspects of trust. And I’m hoping that you can take us, and maybe in the context of the tech companies and some of the challenges they face, walk through those levels and say where they’re doing great and then where they’re starting to have troubles. Or where do the problems arise?
Sucher: Tech companies aren’t doing well because they don’t understand that trust isn’t just one thing. It’s multidimensional. So tech companies understand very well one facet of trust, which is competence, but there are actually five dimensions of trust. Beyond competence, companies also need legitimacy, which we’ve talked a little bit about. And then there’s kind of a whole moral domain. Trust is a moral concept, and so it makes sense that, when we evaluate companies based on trust, we’re thinking about them in ethical terms.
Kerr: Describe those levels of ethical terms. What do we mean by the moral ground?
Sucher: This moral domain, it’s got three parts to it. The first is, does this company pursue moral ends? And the second is, are they using moral means to get there? And the third is, what do I think about their motives? Are they in it just for themselves or are they really trying to serve other interests instead of just their own?
Kerr: Okay, let’s go to the moral ends. Can you give us an example of where a company has struggled in this domain?
Sucher: Yeah. So, moral ends, we judge that by impact. Machiavelli famously said that we judge by appearances, and the great student of power understood that most of the time we can’t see inside other people’s heads. What we can see is what we can see. So we judge a company’s ends by the impact that it allows itself to have, and that includes intended impacts and unintended impacts. So under the unintended impacts, we hope, is the story of Facebook and two national elections.
Kerr: Okay, I’m presuming one of those is Donald Trump and the 2016 election.
Sucher: Cambridge Analytica gets access to over 50 million user profiles based on Facebook data. Out of that, they do psychographics that allow them to target individuals with ads that will sway their behavior.
Kerr: What is number two?
Sucher: Number two came out in 2018. And this was the use of WhatsApp, which is another Facebook product, to sway the Indian elections in 2018. And WhatsApp is a messaging service that’s used by 250 million users in India. Big group. And this was used as the platform through which messaging about the election was sent out, and so what people were listening to and getting as feeds were critiques of the different parties and what they were doing. Fake news stories. There was even a fake BBC survey that predicted a huge election win for the governing party.
Kerr: Was it effective?
Sucher: Well, there was one data point that I thought was really interesting. One young political activist said that he was given a task of presiding over 60 people and getting them to vote a certain way. And he thought that messaging was responsible for his ability to commit ... to get 47 of them to commit to the party of his choice, and that included 13 who were uncommitted when he started.
Kerr: Wow. So that was moral ends. Can you go back and talk next about maybe moral means?
Sucher: So, moral means ... Here we are in Uber terrain. Right?
Kerr: You’re going after all the big tech companies.
Sucher: You know, one by one. And so with Uber, moral means really has to do with the rules of engagement. Are we trusting you to set rules of engagement that are fair? So that’s what we mean when we talk about moral means.
So rules of engagement start inside a company, and in 2017 Uber learned how damaging it could be to have a report penned by a disaffected employee that described very clearly the kind of culture that was in existence at Uber at the time. So this was a woman named Susan Fowler who wrote a 3,000-word essay that detailed Uber’s persistent refusal to take any action based on sexual harassment complaints. And this started a snowball of investigations inside the company by the board. It ended up with the forced departure of the CEO and co-founder, Travis Kalanick, and 20 other executives who had to leave because of allegations of sexual harassment against them. So that’s kind of fairness inside the company. What are the rules of engagement like? What’s it like to be there? Outside the company, Uber has been notorious, quite honestly, for the tactics that it uses to try to gain access to new markets. One of these is actually fairly entertaining. It’s a program called “Greyball” that allows them to ...
Kerr: Okay.
Sucher: Yeah. You’ve heard of blackballing someone?
Kerr: Yeah, yeah.
Sucher: This allows you to “greyball” someone, and what it means is ... This is primarily used against government officials who are trying to decide whether or not Uber should be granted a license to operate.
So what Uber does is they find out who these people are, and they send them an app that is not the same app that you and I see. Their app has ghost cars on it that never arrive, and they create a completely discouraging experience where no cars ever show. So this government official who was trying to evaluate Uber’s service is in a position of not being able to make a judgment about whether or not this is a good app. So this is just one of many ways that Uber has tried to make sure that it can gain access to markets where cities have questions about whether or not they should give it to them.
Kerr: It’s a very aggressive behavior. So we’ve talked about the ends and the means. You also said moral motives. Give us an example of moral motives.
Sucher: When it comes to moral motives, we’re really thinking about is this a company that can serve interests beyond its own. Right? So the kind of company we don’t want to do business with is someone that’s just in it for themselves and really doesn’t care about us as customers, its employees, members of the public. And that’s hard to do, by the way. Companies have a very tough row to hoe because they do have to serve multiple interests, and these interests can conflict.
So Google found itself in the middle of one of these rapid-fire crises that emerged in the spring and summer of 2018. What had happened was that Google had signed up to be part of a contract with the Pentagon for a program called Maven. Maven was using artificial intelligence to target drone strikes. Now, on the one hand you could say, “Well, that’s a good thing. I want to have drone strikes that are very precise.”
Kerr: Less collateral damage, unintended consequences.
Sucher: Exactly. On the other hand, when you think about do you really want to be part of a company that is creating the software that allows drone strikes to target individuals. That becomes something that’s a little bit more contentious.
Kerr: Weaponizing artificial intelligence and the implications that brings.
Sucher: Exactly. Exactly. So the Google data scientists were not happy about this. They said, “This is not the kind of company we came to work for.” Google, to its credit, hosted a big debate between well-respected members of both sides of the debate here. Four thousand employees at Google signed a petition asking the company to not weaponize AI. And so, in the end, Google decided that it would withdraw. In other words, it wouldn’t stay in the Maven contract once it came up to be renewed. It would continue to serve the military by allowing it to place advertisements on all of its products, but it was just going to really stay out of this business of weaponizing AI.
Kerr: Let’s go back to where you began on this by saying you use all of these services, and many of us do. If there are these issues with parts of the ethical ends, the ethical motives, how come we are all still customers? Does this really have impact for the companies?
Sucher: The answer is kind of yes and no. Right? And the no part is that we all do use these companies because they’re really good at what they do. Yeah.
Kerr: It goes back to that competency.
Sucher: They understand their business really well and they work really hard, and they’re quite skilled at designing the kinds of experiences that we want to have, but companies can still be hurt when people stop trusting them. And I think that the way that they get hurt can differ, but it basically is kind of a road to the loss of legitimacy. So the first place that it starts is when a company is subject to regulation.
Kerr: Government steps in.
Sucher: Government steps in and says, “There’s something going on here that we don’t like,” and in fact, in 2018, the EU promulgated new regulations to protect the data privacy of EU citizens, and they were the strictest protections that had been enacted anywhere in the world.
Now, all of these companies have to work a little bit harder than they were before on data privacy because there are regulations that are going to affect how they do business.
Kerr: Okay. Beyond the government, what are other ways that companies could be undermined?
Sucher: Well, the second thing that can happen is lawsuits. So 48 minutes after these regulations took effect, an Austrian non-profit called NOYB …
Kerr: Sandra, can you tell us what that acronym stands for?
Sucher: It actually stands for “none of your business.”
... filed lawsuits against Google, Facebook, and two Facebook subsidiaries, WhatsApp and Instagram. So the suits alleged that the four companies failed to give European users specific control over their data and violated the new regulations. The suits asked regulators to impose the maximum fines allowed by law, which for the four companies could total $8.8 billion.
Kerr: That’d be a big check to write.
Sucher: A big check to write even for companies like this. So that’s the second way. So regulation. You could get sued. The third way is you can lose your license to operate. This is what Uber found out when the city of London said that it was not a fit and proper business.
Kerr: And that was the official terminology that was given.
Sucher: Official terminology. This is how they evaluated Uber, and two other British cities actually also revoked its license to operate. And so there are many ways that you can get hurt and hurt profoundly. I think there’s one new way to be hurt that’s trending quite strongly now, and that is through collective action taken against companies.
Kerr: Sandra, tell us a little bit more about this collective action.
Sucher: Okay. So one form of the collective action is internal to a company, and so we saw an example of that with Google when its employees basically rose up and said, “We do not want to weaponize AI. We don’t want to work in the company that does this.” And so Google learned that, if it was going to continue, it was going to have a whole lot of ticked-off employees.
Kerr: Yeah. There’s a boundary internally as to what the company wanted to do.
Sucher: Exactly. The second form of collective action is external. We’re seeing many, many more examples of that in business. A noteworthy example was when ABC discontinued a new TV show that Roseanne Barr, after two decades of being off TV, had started. It was the most popular new show on TV in the spring of 2018. But Roseanne Barr had sent out a very racist tweet about Valerie Jarrett, who was Barack Obama’s ... well, one of his most influential advisers. And there was such widespread revulsion against what she wrote, that basically the company got pressured to say, “Should you really be running a show with this person and about this person?” And ABC canceled the show.
Kerr: We’ve seen earlier episodes with Matt Lauer and the #metoo movement and other places where both the ability of one person to proclaim loudly their discontent, but then also to have the collective action emerge around that, is taking companies to new places. They have to respond in new ways.
Sucher: Exactly. And if anything shows the importance of paying attention to trust, it is this new trend toward collective action.
Kerr: So if a company has lost its trust, can it regain it? And how would it go about regaining it?
Sucher: So trust can be regained. In fact, it’s one of the fallacies where everyone always says, “Trust is so fragile. Once lost, it can never be regained.” That’s just simply not true. Trust is a relationship and relationships go through ups and downs, and trust in a company can do the same thing. So it’s important to understand that this can be made better.
Kerr: Okay.
Sucher: But a company that seriously wants to do this should ask itself three questions. The first is, do we tell the truth? The second is, on whose behalf are we acting? And the third is, how do our actions benefit the people who trust us?
Kerr: Can you put this in action for us? Give us an example of a company that really went through a crisis and was able to rebuild the trust.
Sucher: Yeah. There’s a really good example. It’s in Japan. It’s a company called Recruit Holdings. And Recruit Holdings is a media conglomerate that has lots of advertising platforms. It’s also in the HR placement business, hence the name, Recruit. And in the 1980s, the CEO created one of those shares-for-favors scandals.
Kerr: Doesn’t sound like it’s going the right way.
Sucher: It is so not going the right way. And so this scandal was so destructive that it brought down the prime minister of Japan and his entire cabinet, all of whom had to resign because of this scandal. So, in fact, if you go to Japan now and you ask people about Recruit, one of the first things, if they’re a little bit older, people will say, “Oh, you mean the scandal?”
And so this is something that people are still actually aware of. So let me show you how Recruit answers those three questions, because they’ve done a very good job of coming back. It’s now a global business that has $16 billion in revenues and 45,000 employees, and manages lots of media properties. So they answered the first question, about do we tell the truth, by actually posting on their website the story of the scandal. So they tell what they did. They tell all of the actions that they took afterward and what they think they’ve learned from what happened. In terms of on whose behalf we’re acting, around that time, about a year after the scandal, they created a new mission, which was to contribute to society. Now, that can sound like the kind of mission anybody could write, but this is a mission that they’re quite committed to. And so one of the things that they do is they ask employees pretty much all the time ... “Why are you here?” And that is meant as a seriously clarifying question for employees to think about.
Kerr: “Why are you here?”
Sucher: “Why are you here?” And the employees are reviewed twice a year, and at each of the reviews they’re asked to think about what impact they want to have in society in the next six months and over the next three years. So this is something they’re very serious about, and the proof is in the pudding. In terms of the last question about how our actions benefit those who trust us, you really can see it in the kinds of businesses that they start.
And I’m not talking here about creating charities or non-profits. This is a very well-respected profitable business that’s doing a great job, but here are the kind of things that they do. Let me tell you about how an executive described how he came to create a hair salon booking platform.
Kerr: Okay.
Sucher: In Japan, hairstylists were expected to be onsite throughout the day to wait for customers. This meant the stylists with children would leave their jobs because they never knew when they would be needed. Recruit’s salon booking platform solved that problem for stylists who could come to work when they were needed. On the other side, it also offered customers the ability to book an appointment online, to check immediate availability of stylists, and to compare different salons based on customer ratings.
So other businesses are digital platforms that offer cheap tutoring to rural students, a platform for finding emergency daycare, and acquisitions like the job search sites Indeed and Glassdoor. So what’s so striking about the Recruit example is that it’s essentially a tech company. So there are different paths to follow when it comes to earning trust, no matter what industry you’re in.
Kerr: So how long did the Recruit story span, from the scandal with the prime minister to today? How long did it take the company to rebuild the trust?
Sucher: It took decades. The founder had made some very bad real estate investments, and so the company actually had to be sold for a while. So it finally got out from under debt in the early 2000s and has only recently, within the past five or six years, become a truly global company. So coming back from the dead, coming back from a serious trust problem, takes time and unbelievable dedication, but it definitely can be done.
Kerr: So, Sandra, as we come to the end of the podcast, could you summarize for us or give us a checklist that managers and business leaders can apply for thinking about how their companies are trusted and what they can do more?
Sucher: Yeah. And the checklist goes right back to these five dimensions of trust. Right? And so the first question that you’d ask is, are we competent? Right? That’s the foundation of trust. Trust is when we trust someone to do something for us, and so without that, you’re not going to make too much headway.
The second question is, are we legitimate? And I mean by that, do we have a kind of an uncontested license to operate? Do people consent to do business with us willingly? And is there no contention around that? The third question is, are we pursuing moral ends? And what I mean by that is, are we holding ourselves responsible for the impact of our actions? And that includes unintended impacts as well as intended impacts. And that’s how the world is going to look at us, so that’s how we should look at ourselves. The fourth question is, do we use moral means? And what I mean by that is, are we setting rules of engagement that other people think are fair? And then the fifth question is, are we operating from moral motives? And the test that people are going to be using for that is, are we looking out for the interests of all the people that we do business with, that we affect, or are we just looking out for ourselves?
Kerr: We thank Sandra Sucher for sharing her insights today. Whether a scandal from a single employee or the actions of the company as a whole, business leaders need to think about how they can maintain and improve trust in their organizations. Sandra, we appreciate you joining us.
Sucher: Well, thanks so much. It’s been great.
Kerr: We appreciate everyone else for listening in.