Brian Kenny:
Within days of the 9-11 terrorist attacks on the World Trade towers, while the dust was literally still settling, the US Congress passed an anti-terrorism bill that allowed the government greater liberty to use surveillance technology including internet wiretaps to combat terrorism. Senator Orrin Hatch of Utah, who co-authored the bill, said that, "For now at least privacy is going to have to take a backseat to security." And in the wake of that horrific attack, most Americans agreed with him. Of course, in 2001, it was hard to imagine a future where most everyone would carry a device around that held a limitless amount of personal information, private correspondence, financial and medical details, essentially composites of our lives. And most of us wouldn't have foreseen the extent to which smart connected devices would catalog and share our information in ways that both benefit us and cause us harm.
The truth is that while private information creates the personalized experiences that many crave, it also makes us vulnerable to attack in ways large and small. Today on Cold Call, we welcome Professor Henry McGee and Professor Nien-he Hsieh to discuss their case, “Apple Privacy versus Safety.” I'm your host, Brian Kenny, and you're listening to Cold Call on the HBR podcast network.
Nien-he Hsieh's research looks at whether and how managers ought to balance considerations of economic efficiency with values such as freedom, fairness, and respect for basic rights. Henry McGee studies the governance of nonprofit organizations, especially those in the arts. He is the former CEO of HBO Home Entertainment. They are co-authors of this case. Thank you both for joining me today.
Henry McGee:
Great to be here.
Nien-he Hsieh:
Thank you. It's great to be here.
Brian Kenny:
Great to have you back, Henry, and to have you Nien-he on the show for the first time…I'll ask you to start Nien-he, can you talk about what the central issue is in the case and how you start the class?
Nien-he Hsieh:
So the central issue in the case is how you, as Tim Cook, manage and balance your responsibilities to your users in terms of their privacy and protecting their data, while at the same time acknowledging that doing so may create challenges with regard to your responsibilities to the broader society. So that's the heart of the question: How do you balance your responsibilities to customers and users and, at the same time, your responsibilities to society, whether it's in the form of security with regard to fighting terrorist acts, public health, or more generally protecting minors from child pornography?
Brian Kenny:
What's the cold call that you use to start the case?
Nien-he Hsieh:
The cold call has actually changed over time. The first time, right after San Bernardino, it would've been: if you're Tim Cook, how do you respond to the FBI's request to create a backdoor to allow them access to the iPhone? But now that we have three sets of issues, there's a question of physical safety in San Bernardino, there's a question of public health in the case of contact tracing with Covid-19, and there's a question of protecting minors from child pornography. So I think now the opening question is more general: How important is privacy to users?
Brian Kenny:
Okay. So you wrote the case together. I'm curious, Henry, let me turn to you for a moment, as to why you decided to write the case. Why did you think it was important to document this?
Henry McGee:
Well, believe it or not, this case goes back to 2016.
Brian Kenny:
Okay.
Henry McGee:
And we have updated it twice, once in 2021 during the Covid public health crisis, and more recently in 2023 to address the issue of access to iPhones when it comes to child pornography and other issues. And we felt it was important that this was a device the students had in their pockets. It certainly wasn't an abstract issue to students, and it brought home the fact that as business leaders, they were going to have to balance these responsibilities to their customers and responsibilities to society. And so this case, with an iPhone in everyone's pocket, really jumped out as something that we should tackle.
Brian Kenny:
Apple is sort of the main player here. Can you describe for our listeners who maybe aren't as familiar with Apple as others, what does the company look like and where does it sit in the landscape of devices?
Henry McGee:
Apple and Samsung together represent, on a global basis, about 40% of the worldwide phone market, which, by the way, is going to approach half a trillion dollars in sales this year. It's absolutely enormous. Again, it depends on the quarter; in the first quarter this year, they both had about 20% market share. Several Chinese companies come in right behind them. It's also important to note that share depends on the market. So, for example, in the US the share of Apple phones approaches 60%, but in China it's low double digits. But certainly they and Samsung are the global leaders.
Brian Kenny:
Yeah. Okay.
Nien-he Hsieh:
It's also important to think about whether we're comparing the iPhone as a phone versus the iPhone as an operating system. There the market also starts to look slightly different, because if you look at Apple's operating system, it's maybe about 30%, I think, of all smartphones in terms of market share. But the other amazing thing to keep in mind is that unlike a lot of other products that get cheaper over time, the iPhone gets more expensive over time.
Brian Kenny:
Right. I know I have one.
Henry McGee:
Right.
Nien-he Hsieh:
And it's managed to increase its market share. And that I think is something pretty unusual to keep in mind for the listeners and thinking about just how remarkable in some sense and how successful it's been as a company.
Brian Kenny:
Why did privacy become such an important thing under Tim Cook's leadership? Because I don't hear other companies talking about the privacy issue. It always seems like Apple is front and center on that one.
Nien-he Hsieh:
My view, and I think Henry can talk a little bit about Tim Cook's perspective, is that Apple has actually used privacy as a way to differentiate the product, because in some ways Apple really prides itself on the user experience. And insofar as it's about the user experience, you need to gain user trust. So having privacy at the forefront is one way to think about why it's an important feature for Apple.
Brian Kenny:
Okay.
Henry McGee:
It's clearly a product differentiator for them. In fact, Apple's critics claim that they are emphasizing a product feature over their responsibility to participate in law enforcement, and that Apple is in fact on the wrong side of that issue. But privacy is central to Tim Cook, and it's part of a broader philosophy he has about the role of business in society, which he feels goes well beyond simply making profits. If I had to pick a poster person for stakeholder capitalism, it would be Tim Cook. In fact, he famously stood up at one Apple annual meeting and told investors that if they didn't agree with the way the company was approaching its responsibilities, they should simply sell the stock, which is not something you hear a lot of CEOs say.
Brian Kenny:
And it's kind of remarkable because you don't, at least I don't recall Steve Jobs ever really taking a stand one way or the other on this particular issue. So the case goes into Ed Snowden. People might recall that Ed Snowden was somebody who worked for the US government and he was able to steal a lot of information, a lot of data, particularly about US government surveillance. Can you talk a little bit about what the nature of that data was and what it revealed?
Henry McGee:
That's the key here in terms of the timing. Remember, Cook becomes CEO in 2011.
Brian Kenny:
Yeah.
Henry McGee:
And Snowden as an issue shows up 24 months later, in 2013, when Snowden, who was a contractor for the National Security Agency, came across documents showing that the US and British governments, without the consent of telecom companies, Google, and others, were collecting search data and emails, and that there was actually no privacy at all. He leaked that information and it created a firestorm. Congress would later pass legislation that would increase privacy. Snowden was branded a traitor. He was initially in Hong Kong, then fled to Russia, and is in fact now a Russian citizen. And what he did is often said to be one of the greatest breaches of US security in history. So the security establishment continues to vilify Snowden. On the other hand, it was a catalyst for the beginning of encryption and a real focus on privacy.
Brian Kenny:
Yeah.
Henry McGee:
So just think about the timeline again: 2011, Cook becomes CEO; 2013, the Snowden leaks. There's a huge uproar over the issue of privacy. And then two years after that, in September 2015, Apple releases the new iPhone 6s with iOS 8, in which there's default encryption, so that no one can get access to the message: it's encrypted when you send it, and the person getting the message has a set of keys that unlocks it.
Brian Kenny:
Yeah.
Henry McGee:
And Apple does not have that key. And so it's the first real focus on making a product that is really focused on security.
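The scheme Henry describes, where only the endpoints hold the keys and the company in the middle does not, can be sketched in miniature. This is a deliberately simplified toy, not Apple's actual iMessage protocol: the parameters are far too small and the stream cipher is insecure, but the structure shows why a relay without a private key cannot read the traffic.

```python
# Toy sketch (illustrative only, NOT real cryptography): end-to-end
# encryption means only the sender and recipient hold the keys, so the
# service relaying the message cannot read it.
import hashlib
import secrets

# Toy public parameters; a real system would use X25519 or similar.
P = 2**127 - 1  # a Mersenne prime, far too small for real use
G = 3

def keypair():
    """Each phone generates its own private/public key pair."""
    private = secrets.randbelow(P - 3) + 2
    public = pow(G, private, P)
    return private, public

def shared_key(my_private, their_public):
    """Diffie-Hellman: both sides derive the same secret key."""
    secret = pow(their_public, my_private, P)
    return hashlib.sha256(str(secret).encode()).digest()

def xor_stream(key, data):
    """Toy stream cipher (insecure); the same call encrypts and decrypts."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

# Sender and recipient each make keys; the relay sees only public keys.
a_priv, a_pub = keypair()
b_priv, b_pub = keypair()

ciphertext = xor_stream(shared_key(a_priv, b_pub), b"meet at noon")
# The relay holds only ciphertext; only the recipient can decrypt:
plaintext = xor_stream(shared_key(b_priv, a_pub), ciphertext)
```

The relay in the middle carries only ciphertext; without one of the private keys it cannot recover the message, which is why Apple could truthfully say it had no key to hand over.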
Brian Kenny:
And that becomes a really central issue in the case as we go forward. But we're going to talk about San Bernardino and what happened there in a moment. But before we even go to that, I'll just take myself as an example. I feel like I give away my personal data all the time, signing up for this or asking for information for that. How good are people about protecting their own information on their devices?
Nien-he Hsieh:
It probably depends.
Brian Kenny:
Yeah. You might be better than me. I don't know.
Nien-he Hsieh:
When's the last time you changed your password?
Henry McGee:
Right.
Brian Kenny:
Yeah.
Nien-he Hsieh:
So look, if you look at surveys, people are notoriously bad at protecting their privacy, whether it's responding to phishing emails or opening problematic links and things like that. So I think in that sense it's hard. But at the same time, people at some level also don't always really know just how much they're putting their privacy at risk. That, for me, is the more concerning issue: people don't really understand just how much data is being gathered and collected. And that, for me, is the real issue, as opposed to how good or bad you are at protecting your privacy, if that makes sense. Yeah.
Brian Kenny:
Well, and to the Ed Snowden example, unless somebody tells us that it's being used surreptitiously, we otherwise wouldn't know. But there are laws to protect this, right? We have the Fourth Amendment. What does that cover?
Nien-he Hsieh:
Well, the Fourth Amendment is against the government collecting information. So it doesn't really exactly cover private companies collecting data in the same kind of way. So that's sort of one thing to keep in mind.
Henry McGee:
But to that point, certainly, this is embedded in what we think of as a right to privacy as Americans. It's certainly written vis-à-vis government, but I think people would interpret it as a right to privacy vis-à-vis business as well. And importantly, even in the Universal Declaration of Human Rights, a right to privacy is outlined. So I think it's a global issue, and people feel that privacy is a central issue. Which takes us to San Bernardino, for example: two terrorists committed mass murder, and the information about their contacts was on their iPhones, but residing only on their iPhones because of the security protocols that Apple had. And the government desperately needed to get into the phones in order to make sure there weren't other attacks to come. And Apple took the position that they were not going to do that. For one, there were no such keys, and they certainly weren't going to order their engineers to undo years of work they had done to encrypt the data.
Brian Kenny:
Yeah.
Henry McGee:
So there Cook was wrestling with several things, right? He had his responsibility to maintain the idea of privacy, and he's got issues with his engineers and the culture that he's created around privacy. He refused and took the government to court. Before the issue could be resolved, the government actually hired a third party to hack the phone.
Brian Kenny:
To break the phone, right? Yeah.
Henry McGee:
And so it never came to that. But from Cook's point of view, the data belongs to the consumer. And if the government wants the information, they need to go to the consumer, and not call him up as a third party asking him to damage the software that he's encoded. The United States Supreme Court ended up siding with Apple in a different case, in which they argued that law enforcement could get the information through a subpoena, and someone could comply with a subpoena or not,-
Brian Kenny:
Right.
Henry McGee:
Whereas what was being done in the prior programs was simply harvesting the data without getting a subpoena. So this was a great issue then, as it is now.
Brian Kenny:
Yeah, yeah. And to your point earlier, Nien-he, part of this is, I mean, the whole issue of privacy was a differentiator for Apple. Very hard for Tim Cook to go back on that promise that he's made to his customers, even though I'd be willing to bet a fair number of customers thought, hey, this is a really horrible thing that happened and we need to find out who else might be involved because people's safety is in jeopardy.
Nien-he Hsieh:
That's true. But we have to remember that part of the issue here is not just protecting consumer privacy, it's how it's done. Right. This isn't simply allowing access to an individual account; this isn't simply opening a safe deposit box. Accessing the safe deposit box, the user's data in this case, requires you, as Henry said, to actually re-engineer the software, which in effect creates access to everybody's safe deposit box. And that's a key difference here. That's what makes this, I think, somewhat different from other cases of the government trying to get subpoenas to go after information.
Brian Kenny:
So let's take this out of the US context because there's an expectation, right, of privacy in the United States. If we go to other parts of the world in countries where that's not a given, and we'll take China maybe as an example, how does a company like Apple operate in a place like China?
Henry McGee:
Well, it does operate in China,-
Brian Kenny:
In a big way.
Henry McGee:
It does, in a big way. Not only are most of its devices manufactured in China, but it's also a huge market for them. And I think that Cook is well aware that, just as the United States has banned Huawei, the huge Chinese mobile manufacturer, from shipping to the US, they're sort of one edict away from having Apple phones banned in China. I think what works against that is there's so much manufacturing going on there, though of course Apple's in the process of trying to move some of its manufacturing away from China. So China is a big consumer market for them and an essential manufacturing partner. And they dealt with it in two ways. One, Apple has not admitted this, but it's generally accepted that they have given the Chinese government access to the source code that goes into devices. Apple maintains that simple access to the source code does not permit any ability to change the encryption. But importantly, critics point to two things Apple has done. For example, virtual private networks, which are another level of security, the Chinese government has banned those, and so Apple, in compliance with that, took the VPN apps off its App Store. Secondly, during the protests in Hong Kong, Apple, at the request of the Chinese government, removed maps of the city of Hong Kong. Apple argued it was because people were using them to plan violence; others say it was a push by the Chinese government. And Cook talks about this: he will deal with local laws, but privacy has to remain supreme.
Brian Kenny:
Yeah.
Nien-he Hsieh:
But people do criticize Apple on this. Right. If you think about, for example, Google: Google pulled out of China when they felt that they couldn't offer the same level of service and privacy to their search engine's users without censoring results. So some people have criticized Apple as well for maybe being a little bit too lenient in this case, still wanting to maintain its market share and manufacturing capacity there.
Brian Kenny:
Yeah, you could question motives about all of these. The case does go into the really interesting situation that presented itself with the pandemic where contact tracing became an important way for us to sort of check the spread of Covid around the country, but also inherently seemed to carry with it some concerns about privacy. Can you talk a little bit about that?
Nien-he Hsieh:
There are two things that are remarkable here. One is that in that situation, Google and Apple came together, which is something that doesn't always happen, and developed a contact tracing app, a way for you as a user to know whether you had been in contact with somebody who had Covid.
Brian Kenny:
Yeah.
Nien-he Hsieh:
So I think that was useful information for individuals. Where it became difficult was when the government of France, in this case, wanted to have more of that information, to gather it so that they could target public health interventions. And so that became, I think, a sticking point. But again, some people say, look, from the perspective of public health, it was important for the French government to be able to do that. That's another instance where Apple pushed back and said, we're not going to provide that information to you as the government, but only to users in terms of their own contacts.
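The decentralized design Nien-he describes, where matching happens on the phone rather than in a government database, can be sketched roughly as follows. This is a simplified toy in the spirit of the Apple/Google exposure notification scheme; the real protocol's key schedule, Bluetooth layer, and data formats all differ.

```python
# Simplified sketch of decentralized exposure notification: phones
# broadcast short-lived random identifiers, and matching against the
# identifiers of confirmed cases happens entirely on-device.
import hashlib
import hmac
import secrets

def daily_key():
    """Each phone generates a fresh random key every day."""
    return secrets.token_bytes(16)

def rolling_ids(day_key, intervals=144):
    """Derive short-lived identifiers broadcast over Bluetooth; they
    cannot be linked to a person without the day key itself."""
    return [hmac.new(day_key, i.to_bytes(2, "big"), hashlib.sha256).digest()[:16]
            for i in range(intervals)]

# Each phone broadcasts its own rolling IDs and records the ones it hears.
alice_key = daily_key()
heard_by_bob = set(rolling_ids(alice_key)[:5])  # Bob was near Alice briefly

# If Alice tests positive, she publishes only her random day keys. Bob's
# phone re-derives her rolling IDs locally and checks for a match; no
# location or identity ever leaves his device.
published_keys = [alice_key]
exposed = any(rid in heard_by_bob
              for k in published_keys
              for rid in rolling_ids(k))
```

Because only random day keys of confirmed cases are ever published, and matching happens locally, neither the companies nor a government learns who met whom, which is exactly the property the French government wanted relaxed.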
Brian Kenny:
Yeah. Yeah.
Henry McGee:
Again, this is an important issue for the students. This was being taught during Covid; many students were quarantined themselves. The whole public health issue was front and center. And again, there are nuances here. On the one hand, Apple and Google spent considerable resources to develop the contact tracing app, but felt strongly that it should be shared only privately,-
Brian Kenny:
Yeah.
Henry McGee:
Where many governments, understandably, from a public health point of view, wanted to have that information where there are clusters of people in a particular location. So that sort of data was important, but you can see what a nightmare that would be from a privacy point of view. Because once the government is tracking where you are in terms of the spread of Covid, what if they track you in terms of spread of ideas they don't like?
Brian Kenny:
Of course. Yeah. And we can see things playing out in the world today where there are protests and demonstrations happening where that could be a real problem for people. So what was Apple trying to achieve with iOS 14?
Nien-he Hsieh:
They wanted to limit the amount of information that sort of companies could gather from their users.
Brian Kenny:
Okay.
Henry McGee:
So Apple's business model is very much unlike that of Facebook or Google, which is all based on advertising. Apple's is largely based on hardware sales. And so iOS 14 forced users to make a decision about whether they want their behavior to be tracked, which for Apple is a big privacy issue: do you want to be tracked or not? That ability to link customers and their journey across the web was essential to the ad sales of Meta and others. And they raised a huge ruckus, claiming that it was an attempt by Apple to leverage its huge market share in devices, particularly in the US, to remove a basic cornerstone of their business model.
Nien-he Hsieh:
The small wrinkle in that, to be clear, is that when somebody's on Google or on Facebook or Meta, they tend to stay in that area. So it turns out that even if users weren't being tracked through Apple, there is enough information out there for the larger platforms to gather information on them. A lot of people argue that the biggest impact would've been on smaller companies that relied on being able to connect data across different platforms, and that it would actually harm them more than the big companies. We haven't fully seen that play out, but that was the argument that was made. Yeah.
Brian Kenny:
The bottom line, from what I'm hearing, is that somebody is listening to you no matter what device you're on. One of the things we haven't really talked about here, we talked about Apple's place in the landscape, but we haven't talked about how revolutionary the device itself was, which allowed them to gain such a foothold in the space. Any thoughts on that?
Nien-he Hsieh:
I think it's actually really hard to remember,-
Brian Kenny:
Yeah.
Nien-he Hsieh:
Just how much the iPhone has changed the way we not just communicate but entertain ourselves and each other, buy goods and services, take photographs, around the world. It's really amazing. It's not like the iPhone was the first phone to try to do these sorts of things, but it brought everything together in such a seamless, easy way that, on the consumer side, the lifestyle side, it has made a huge amount of change. Think about Siri, for example. But the other thing is that the whole ecosystem really did change the way that people engage with public activities as well. Henry and I were talking earlier about Black Lives Matter.
All the protests that are captured on camera, engagement with police, what's going on right now in Gaza. So there's a sense in which it basically has opened up the world. And I don't think it's unfair to say that it really has opened up the world to people everywhere.
Henry McGee:
It's also had its negatives. I say this wearing my professorial hat. I noted with some interest that Florida has now passed a law banning cell phone use on campus for high school students, which, in my personal view, is actually a step in the right direction. And you see how, for the younger generation, the iPhone, and cell phones in general, have become so important, whether it's how they get their news or how they connect with people. That's a real revolution. And that's why issues around privacy and security are only going to remain front and center. So even though we wrote this case over seven years ago, I think the issues it raises are as salient today, if not more salient.
Brian Kenny:
Yeah, I would agree completely. We've heard a lot about Tim Cook's point of view on privacy, but we haven't really heard what his philosophy is around safety and security. So I'm wondering where that factors into his philosophy.
Nien-he Hsieh:
If we take the issue that's come up most recently in the UK around child pornography, there is a real concern that I think he and Apple have about the safety of users. Right. So they did try to roll out a set of measures back in 2021: if somebody tried to search for child pornography, they would immediately be redirected to certain resources to try to help them engage with and deal with that issue. They tried to develop a system where they could actually scan people's accounts and emails for various images and check them against a database. So it's not that he's not concerned about safety and security, right, it's just that he really wants to weigh that in the right way against user privacy. And they did actually roll out this program, but it met with a lot of criticism, in part because it might not actually be effective. There was also a concern it would lead to a greater erosion of privacy. And so they pulled back on that point.
Henry McGee:
And false positives.
Nien-he Hsieh:
And false positives. That's right too. Yeah. So it's not like he's not concerned about these sorts of issues.
Henry McGee:
And they spent time engineering that, but in the face of pushback, decided not to do it.
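The database-matching approach Nien-he and Henry describe can be sketched at its simplest. This toy uses an exact cryptographic hash; Apple's proposed NeuralHash was a perceptual hash designed to survive resizing and re-encoding, which is also what opened the door to the false positives Henry mentions.

```python
# Toy sketch of hash-based matching against a database of known images.
# An exact hash (used here for simplicity) is defeated by a one-byte
# edit; a perceptual hash tolerates edits but can collide on innocent
# images, which is the false-positive concern.
import hashlib

# Hypothetical database of hashes of known illegal images (placeholder
# bytes stand in for real image data).
known_bad_hashes = {hashlib.sha256(b"known-image-bytes").hexdigest()}

def flag_image(image_bytes):
    """Return True if the image's hash appears in the database."""
    return hashlib.sha256(image_bytes).hexdigest() in known_bad_hashes

matched = flag_image(b"known-image-bytes")           # exact copy: flagged
evaded = flag_image(b"known-image-bytes" + b"\x00")  # one-byte edit: missed
```

The tension in the design is visible even in this toy: make the hash stricter and trivial edits evade it; make it more tolerant and innocent images risk being flagged, eroding exactly the privacy guarantee Apple markets.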
Brian Kenny:
Well, and I guess I shouldn't pose the question as though the two things are sort of mutually exclusive. Right. I guess you can have privacy and safety at the same time.
Henry McGee:
And that's what we want our students to wrestle with: where along that spectrum is it right for them, as business leaders, to fall? On the one hand, you're going to make pledges to your customers, in this case about privacy, and you're going to make pledges to your employees. On the other hand, you're going to have responsibilities to society at large. And we want students to really wrestle with those issues. That's why we put them in the shoes of Tim Cook and ask them: Where would you come out in terms of San Bernardino? Where would you come out in terms of contact tracing in the middle of a global pandemic?
Brian Kenny:
Yeah.
Henry McGee:
And where would you come out when instances of child abuse are on the rise? And the way that child abusers communicate is via mobile devices. And so we want them to sort of think through all of those issues.
Nien-he Hsieh:
And also, why they're in that position. Right. Part of the reason Tim Cook and Apple are in that position is because they have a very successful product that is used by lots of people. So it's precisely because of that success that these problems arise. If you were a much smaller network, if you didn't have this big network of users, these issues wouldn't be playing out at the same scale. So part of it is, where does that responsibility come from? It comes in part from the success of your business. And then the other question becomes, how will you make money, what's your business model, and to what extent does that either exacerbate or ease these issues? So these issues do often arise,-
Brian Kenny:
Of course.
Nien-he Hsieh:
In individual cases, but in the background we want students to think a little bit about what their business model is, the nature of their success, and how that gives rise to these kinds of problems as well.
Brian Kenny:
Yeah. Yeah, and I think we know there's not going to be less data going forward. There's only going to be more and more and more data and how leaders choose to use that with good judgment is going to continue to be an issue. Yes?
Nien-he Hsieh:
Absolutely. I think in some ways, for me it's almost akin to the discovery of new resources or new potential frontiers for ownership. And we're creating a whole new set of resources, data. And one question we have to ask ourselves is just as we have in the past, whether it's in regard to land or these other kinds of resources, who is going to own that? It makes a big difference.
Brian Kenny:
This has been a great conversation. I have one question left and I'm going to ask both of you, but you can't answer it the same way because you're co-authors. So I'll start with you, Henry. Can you tell me, if you want our listeners to remember one thing about this case, what would it be?
Henry McGee:
I think the one thing that they should remember is that there is a responsibility for business leaders to think about and carefully balance their responsibilities across multiple stakeholders. In this case, most prominently, government, law enforcement, and user privacy.
Brian Kenny:
Nien-he?
Nien-he Hsieh:
Following on from what I said before, and this may be similar to what Henry's saying: I want people to think about just how much a business is a part of society. Our products, services, devices, the things we sell, they're part of society. It's not like there's some place called the market or the economy that's separate from society; it's all part of society. So that's the first thing. And then the second thing is, as a business leader, it's not just thinking about your responsibilities to society; business leaders are also members of society. So part of answering the question "What are your responsibilities to society?" is to think about: What kind of a society do I want to live in, and what kind of society are we helping to create? And for me, that's what I hope this case really highlights, given just how much influence and impact something like the iPhone has had on the world.
Brian Kenny:
Henry, Nien-he, thank you for joining me on Cold Call.
Henry McGee:
Thank you. It's great to be here.
Nien-he Hsieh:
Thank you, Brian. This was great.
Brian Kenny:
If you enjoy Cold Call, you might like our other podcasts, After Hours, Climate Rising, Deep Purpose, IdeaCast, Managing the Future of Work, Skydeck, and Women at Work. Find them on Apple, Spotify, or wherever you listen, and if you could take a minute to rate and review us, we'd be grateful. If you have any suggestions or just want to say hello, we want to hear from you. Email us at coldcall@hbs.edu. Thanks again for joining us. I'm your host, Brian Kenny, and you've been listening to Cold Call, an official podcast of Harvard Business School and part of the HBR Podcast Network.