Our Minds Have Been Hijacked by Our Phones. Tristan Harris Wants to Rescue Them

The founder of a nonprofit aimed at stopping tech companies from “hijacking our minds” says internet users must rise up and reclaim their humanity.

Sometimes our smartphones are our friends, sometimes they seem like our lovers, and sometimes they’re our dope dealers. And no one, in the past 12 months at least, has done more than Tristan Harris to explain the complexity of this relationship. Harris is a former product manager at Google who has gone viral repeatedly by critiquing the way that the big platforms—Apple, Facebook, Google, YouTube, Snapchat, Twitter, Instagram—suck us into their products and take time that, in retrospect, we may wish we did not give. He’s also launched a nonprofit called Time Well Spent, which is devoted to stopping “tech companies from hijacking our minds.” Today, the TED talk he gave last April was released online. In it, he proposes a renaissance in online design that can free us from being controlled and manipulated by apps, websites, advertisers, and notifications. Harris expanded on those ideas in a conversation with WIRED editor in chief Nicholas Thompson. The conversation has been edited for clarity and concision.

Nicholas Thompson: You’ve been making the argument that big internet platforms influence us in ways we don’t understand. How has that idea taken off?

Tristan Harris: It started with 60 Minutes and its piece reviewing the ways the tech industry uses design techniques to keep people hooked to the screen for as long and as frequently as possible. Not because they’re evil but because of this arms race for attention. And that led to an interview on the Sam Harris podcast about all the different ways technology is persuading millions of people in ways they don’t see. And that went viral through Silicon Valley. I think several million people listened to it. So this conversation about how technology is hijacking people is really catching on.

NT: What's the scale of the problem?

TH: Technology steers what 2 billion people are thinking and believing every day. It’s possibly the largest source of influence over 2 billion people’s thoughts that has ever been created. Religions and governments don’t have that much influence over people’s daily thoughts. But we have three technology companies who have this system that frankly they don’t even have control over—with newsfeeds and recommended videos and whatever they put in front of you—which is governing what people do with their time and what they’re looking at.

And when you say “three companies” you mean?

If we’re talking about just your phone, then we’re talking about Apple and Google because they design the operating systems, the phone itself, and the software in the phone. And if we’re talking about where people spend their time on the phone, then we’re talking about Facebook, YouTube, Snapchat and Instagram because that’s where people spend their time.

So you’ve started this big conversation. What's next?

Well, the TED talk I gave in April was only heard by conference attendees, but now it's available online. It basically suggests three radical changes that we need to make to technology. But before understanding what those changes are, we have to understand the problem. Just to reiterate, the problem is the hijacking of the human mind: systems that are better and better at steering what people are paying attention to, and better and better at steering what people do with their time than ever before. These are things like “Snapchat streaks,” which is hooking kids to send messages back and forth with every single one of their contacts every day. These are things like autoplay, which causes people to spend more time on YouTube or on Netflix. These are things like social awareness cues, which by showing you how recently someone has been online or knowing that someone saw your profile, keep people in a panopticon.

The premise of hijacking is that it undermines your control. This system is better at hijacking your instincts than you are at controlling them. You’d have to exert an enormous amount of energy to control whether these things are manipulating you all the time. And so we have to ask: How do we reform this attention economy and the mass hijacking of our mind? And that’s where those three things come in.

OK. How do we reform it?

So the first step is to transform our self-awareness. People often believe that other people can be persuaded, but not me. I’m the smart one. It’s only those other people over there who can’t control their thoughts. So it’s essential to understand that we experience the world through a mind and a meat-suit body operating on evolutionary hardware that’s millions of years old, and that we’re up against thousands of engineers and the most personalized data on exactly how we work on the other end.

Do you feel that about yourself? I tried to reach you last weekend about something, but you went into the woods and turned off your phone. Don’t you think you have control?

Sure, if you turn everything off. But when we’re online, we have to see that some of the world’s smartest minds are working to undermine the agency we have over our own minds.

So step one is awareness. Awareness that people with very high IQs work at Google, and they want to hijack your mind, whether they’re working on doing that deliberately or not. And we don’t realize that?

Yeah. And I don’t mean to be so obtuse about it. YouTube has a hundred engineers who are trying to get the perfect next video to play automatically. And their techniques are only going to get more and more perfect over time, and we will have to resist the perfect. There’s a whole system that’s much more powerful than us, and it’s only going to get stronger. The first step is just understanding that you don’t really get to choose how you react to things.

And where’s that line? I do choose sometimes to use Instagram because it’s immensely valuable to me; I do choose to go on Twitter because it’s a great source of news. I do go to Facebook to connect with my friends. At what point do I stop making the choice? At what point am I being manipulated? At what point is it Nick and at what point is it the machine?

Well I think that’s the million-dollar question. First of all, let’s also say that it’s not necessarily bad to be hijacked, we might be glad if it was time well spent for us. I’m not against technology. And we’re persuaded to do things all the time. It’s just that the premise in the war for attention is that it’s going to get better and better at steering us toward its goals, not ours. We might enjoy the thing it persuades us to do, which makes us feel like we made the choice ourselves. For example, we forget if the next video loaded and we were happy about the video we watched. But, in fact, we were hijacked in that moment. All those people who are working to give you the next perfect thing on YouTube don’t know that it’s 2 am and you might also want to sleep. They’re not on your team. They’re only on the team of what gets you to spend more time on that service.

So step one is, we need to transform our self-awareness. What’s two?

Step two is transforming design. Based on this new understanding of ourselves—of how we’re persuaded and hijacked—we would want to do a massive find-and-replace of all the ways that we are hijacked in ways we don’t want, and replace them with the timeline of how we would have wanted our lives to go. An example of that is today, you look at your phone and you see a Snapchat notification. And it persuades you to think a bunch of things that you wouldn’t have thought. It causes you to get stressed out about whether or not you’ve kept your streak up. It’s filling up your mind. And by responding to that one streak, you get sucked into something else, and it cascades. Twenty minutes later you’re sucked into a YouTube video. And there goes your day.

What we want to do is block those moments that hijack your mind in ways you regret, and replace them with a different timeline—what you would have wanted to have happened instead. The resource we’re conserving is time. Imagine these timelines stretching out in front of people, and right now we’re being tugged and pulled onto these brand-new timelines that are created by technology. Let’s do a massive find-and-replace from the manipulative timeline to the timeline we would’ve wanted to have happened.

How do you do that?

As I say, it has to do with design. An example I gave in the TED talk released today was the idea of replacing the Comment button with a Let’s Meet button. In the last US election, conversations were breaking down on social media. People posted something controversial, and there’s this comment box underneath that basically asks you, Which key do you want to type? It turns into a flame war that keeps people expressing their views in small text boxes and keeps them on the screen. People end up misrepresenting each other’s ideas because their views get compressed into these little boxes of text. So it’s causing people to stress out. It’s causing people to dislike each other.

Internet companies are racing to the bottom to capture our attention, Tristan Harris says in his 2017 TED talk.

Imagine we replace the Comment button with a Let’s Meet button. When we want to post something controversial, we can have the choice to say, “Hey let’s talk about this” in person, not online. And right underneath, there’s an RSVP, so people can coordinate right there to talk about it over a dinner. So you’re still having a conversation about something controversial, but you’re having it at a different place on your timeline. Instead of a fragmented timeline over 20 minutes at work getting interrupted 20 times—while Facebook delivers the messages drip by drip and other notifications come in and you’re getting sucked into Facebook, which is a total mess—you replace that with a clean timeline where you’re having dinner next Tuesday, and you’re having a two-and-a-half-hour conversation in which a very different sequence of events happens.

But how do you know meeting for dinner and talking about things is what you want to happen? Suddenly you’ve created this whole new system where you’re pushing people to meet in person because of your assumption that meeting in person or videoconference is better than talking in chat boxes. Which may be true. Or it may be false. But it’s still a decision made by the person or the social media company.

Yeah, exactly. And so before we ask, Who are we, Nick and Tristan, to say what's better?, let’s ask: Why is Facebook promoting a comment box and Like button in the first place? Were the designers thinking about what’s the best way for humankind to have conversations about controversial topics? No. They don’t get to ask that question. The only question they get to ask is, “What will get people to engage the most on the platform?”

If we really wanted to have a reorientation of the tech industry toward what’s best for people, then we would ask the second question, which is, what would be the most time well spent for the thing that people are trying to get out of that situation? Meeting for dinner is just an example. I’m not saying everyone should meet in person all the time. Another example: On the podcast, Sam Harris and I talked about the idea of a Change My Mind button. Imagine on Facebook there’s an invitation, built right in, to ask to have our minds changed. And maybe there are great places on Facebook where people are already having fantastic conversations that change minds. And we, the designers, would want to ask, “When is that happening, and when would we want to help people have those conversations?” Someone later pointed both Sam and me to a subreddit called “changemyview.” It’s basically a place where people post questions, and the premise is, “I want you to change my mind about this thing.” And it’s really, really good. And that would be more time well spent for people.

So you want all the designers who work at these big companies and on these platforms to stop and think about what’s best for humankind: Hash that out, debate it. And maybe there is no single thing that’s best for humankind. But maybe you get closer to some ideal if you’re having those conversations instead of just thinking about engagement. Is that right?

Yes.

OK, so that’s part two. What is part three?

Part three is transforming business and accountability. We have to have a big conversation about advertising. I think we’re going to look at the advertising model—which has an unbounded interest in getting more of people’s time on a screen—and see it as being as archaic as the era when we got all our power from coal. Advertising is the new coal. It was wonderful for propping up the internet economy. It got us to a certain level of economic prosperity, and that’s fantastic. And it also polluted the inner environment and the cultural environment and the political environment because it enabled anyone to basically pay to get access to your mind. And on Facebook specifically, it allows the hyper-targeting of messages that perfectly persuade and polarize populations. And that’s a dangerous thing. It also gave all these companies an incentive to maximize how much time they have of your life. So we have to get off of this business model. And we haven’t actually invented the alternative yet.

So just like what happened with coal and things like wind power and solar power, if you went back to 1950 and said, “We’ve gotta get off coal,” good luck. We didn’t have any alternative that would’ve gotten us near producing the amount of energy we needed to support society. Same thing with advertising. If you said, “We’ve gotta get off advertising,” subscriptions and micropayments don’t (yet) add up to getting us back to where we are with the advertising model. But just like what’s happened with all of these renewable energy technologies, we can get to that point with technology if we make those investments now. And the background for this third point of transforming business is, the tech platforms are only going to get more and more persuasive.

What I mean by that is, we’re only going to have more information about how Nick’s mind works, not less. We’re only going to have more information about what persuades him to stay on the screen. We’re only going to have more ways to scrape his profile and what he posts to find the keywords and topics that matter to him and then mirror back his sentiments about everything he cares about when we sell him ads. We’re only going to get better and better at undermining his mind. And so the only form of ethical persuasion that exists in the world is when the goals of the persuaders are aligned with the goals of the persuadees. We want those thousand engineers on the other side of the screen working on our team as opposed to on the team whose goal is to keep us glued to the screen. And that means a new business model.

But can’t you make a compelling argument that being able to better target advertising is a way to give people what they want? If an advertiser knows that I need running shoes, they offer a discount on running shoes.

Yeah, so let’s be really specific here. This isn’t about not getting ads for shoes we like; it’s about the advertising model. People say, “I like my ads for shoes!” People say, “And I don’t mind the advertising on the right-hand side of the article.” Exactly, the advertisements themselves are not the problem. The problem is the advertising model. The unbounded desire for more of your time. More of your time means more money for me if I’m Facebook or YouTube or Twitter. That is a perverse relationship.

Again, the energy analogy is useful. Energy companies used to have the same perverse dynamic: I want you to use as much energy as possible. Please just let the water run until you drain the reservoir. Please keep the lights on until there’s no energy left. We, the energy companies, make more money the more energy you use. And that was a perverse relationship. And in many US states, we changed the model to decouple how much money energy companies make from how much energy you use. We need to do something like that for the attention economy, because we can’t afford a world in which this arms race is to get as much attention from you as possible.

And as we start to go into virtual reality using these platforms, we become evermore manipulable and persuadable, right?

Exactly. The real message here is, now is the time to change course. Right now, 2 billion people’s minds are already jacked in to this automated system, and it’s steering people’s thoughts toward either personalized paid advertising or misinformation or conspiracy theories. And it’s all automated; the owners of the system can’t possibly monitor everything that’s going on, and they can’t control it. This isn’t some kind of philosophical conversation. This is an urgent concern happening right now.

Back to the analogy of the energy companies: Their behavior changed because the energy companies are regulated by the state. The government, which acts in the public interest, was able to say, “Now do this.” That’s not the case with tech companies. So how do you get to the point where they come together and make a set of decisions that limit the amount of attention that they take?

Well, I think that’s the conversation we need to have now. Is it going to come through the threat of EU regulation? Or will the companies get ahead of that and self-regulate? There are pros and cons to each of those approaches.

So tomorrow you want Mark Zuckerberg to call up Jack Dorsey, and you want the CEOs of all these companies to get together and say, “OK, we’re going to tell our engineers that they need to think about what’s best for their users, and we need to make a pact among ourselves that we’re going to do XYZ”?

That’s one part of it. And that touches on all sorts of problems having to do with colluding and self-policing and a whole bunch of other things. But we need to have a conversation about the misalignment between the business model and what is best for people; we need a deep and honest conversation among the companies about where these harms are emerging and what it would take to get off the advertising train. And I’m here to help them do that.

Talk to me a little bit about the differences between some of the companies. Apple, Google, Facebook—they have infinite sums of money. If they wanted to change their policies, that would be fine. Twitter—

Twitter not so much, but Apple and Facebook and Google could, yeah.

So you can imagine some kind of agreement between the infinitely profitable companies, but Twitter, Snapchat, and the other companies that don’t have the same financial success presumably wouldn’t join the pact.

Exactly, and that’s why it gets more complicated, because you can’t control, for example, popular companies that are outside the US. What do you do when Weibo swoops in and takes all the attention that Apple and Facebook and Google left on the table when they did their self-policing agreement? That's why it has to be coordinated from the outside.

There are two ways that can happen: One is through regulation, which is unfortunate but something you have to look at; the other, and the opportunity here, is for Apple. Apple is the one company that could actually do it, because their business model does not rely on attention, and they actually define the playing field on which everyone seeking our attention plays. They define the rules. In a sense, they’re like a government. They get to set the rules for everybody else. They set the currency of competition, which is currently attention and engagement. App stores rank things based on number of downloads or how much they get used. Imagine if instead they said, “We’re going to change the currency.” They could move from the current race to the bottom to a race to the top for what most helps people with different parts of their lives. I think they’re in an incredible position to do that.

So you’ve partnered with this app called Moment, and one of the things it does is tell users how much time they've spent in each app, then users rate their satisfaction with each app. So Apple could presumably take that data, or create its own, and at the end of the day ask you, “How satisfied are you?” And if people are very satisfied it could put that app at the top of the App Store.

Yes. That’s one small thing that they could do. They could change the game, change what it means to win and lose in the App Store. So it would not be about who gets the most downloads.

What else could Apple do, specifically?

Change the way that they design the home screen. And notifications. They set the terms. Right now when you wake up in the morning it’s like every app is still competing all at once for your attention. Netflix and Facebook and YouTube want your attention just as much as the morning meditation apps. Imagine if there were zoning laws. So they could set up zoning lines in the attention city that they run and separate your morning from your evening from your on-the-go moments of screen time. So when you wake up, you’d see a morning home screen, in which things compete to help you wake up, which could include there being nothing on there at all. It’s like the stores are closed until 10 am, just like back in the old days. Right now, you don’t have a way to set that up. And there’s no way for there to be a marketplace of alternatives—alternative home screens or notification rules. So this is actually a way in which Apple could either do a really good job themselves or enable a marketplace of competing alternatives so that people could set up these zones, and we could figure out what would really work best for people.

But the incentives don’t work like that now. The reason these companies want you to use everything all the time is so they can serve you the maximum number of ads and get the most revenue and please their shareholders, but also so they can harvest the maximum amount of data.

I think we need to move from a conversation about data to a conversation about what data enables, which is persuasion. If I have data, then I know exactly what’s going to move Nick’s psychology, and I can persuade your mind in ways that you wouldn’t even know were targeted just at you.

So this is the world that we’re already living in. And this is the world that, again, ran away beyond the control of the engineers of the platforms.

But data isn’t only used to persuade me. It’s also used to help me plan my travel route most efficiently and to get me from A to B more quickly. So there’s a lot of good that can come from data, if used carefully.

Yeah, absolutely. And that’s part of why in this TED talk I say we have to have a conversation and whole new language for the difference between ethical and unethical persuasion. We don’t have good language in English for the difference between the words “manipulate” versus “direct” versus “seduce” versus “persuade.” We throw around the words like they refer to the same thing. We need formal definitions of what makes up a persuasive transaction that you want in your life and what makes up something that is nefarious or wrong. And we need a whole new language for that. That is one of the things that I plan to convene a workshop on in the next six months, bringing together basically the leading thinkers on this problem. Part of it is just defining these externalities and these costs, and the other part of it is defining what makes for ethical and unethical persuasion.

Right. I can decide, “Actually, I’ve looked at the data and I wish I spent less time on Facebook and less time on Twitter.” And then I can optimize my phone for that, or Apple can help me optimize my phone for that. But there are all these other ways that what I do on my phone or what I do in my car, all that data is transmitted to the companies and all these other things are made from it that I have no insight into. So determining a system where that is done in a way that is best for me and best for humanity is a more complex problem, right?

We need to think of these services and platforms as public infrastructure, and we need to be able to fund the solving of those problems in advance. If you’re a New Yorker, how much of the taxes you pay go to paying for police, subways, or street repairs? How much goes to sanitation? There are a lot of taxes and resources that are allocated to keep the city working well for people, asking what’s best for people. In contrast, think about how little at these technology companies is spent on “what’s best for people.” If you think of the actual scale of Facebook, just to pick on them a little bit, 2 billion people’s minds are jacked in, more than the followers of most world religions. You need a lot of people—not just 10, 20—working on the misinformation problem. We need a lot more people working on these problems, from cyberbullying to radicalizing content to misinformation and beyond.

So you want many more people looking at this. You want the companies to devote many more resources to identifying these problems, to being transparent about these problems, and you want a lot more effort put into letting users have agency and being made aware of the ways they lack agency.

Yes.

OK. And how are you going to win this war when one of the most important weapons for fighting it is social media itself? How do you win a battle about disengaging from the main weapon used in the battle?

It’s very interesting because this speaks to a related problem, which is the fact that these services are monopolies on the news. If they wanted to, without anyone knowing, they could quash my voice. They could make it so no one reads this article. And that speaks to the problem. I think this is why we’re creating a social movement in which people who care share this with each other and we start coordinating. We need to reach a consensus that there really is a problem with how 2 billion minds are being hijacked. That it’s not happening by accident. We need to talk about that with each other and pressure these companies to change.

All right, I think that is a good note to end on. Is there anything else, Tristan, that you want to say to WIRED’s readers?

I think the core ideas are here. And if people care about convening around the problem, resourcing it, or helping with advocacy—they should get in touch and join the movement for Time Well Spent.