Our Minds Have Been Hijacked by Our Phones. Tristan Harris Wants to Rescue Them

by David Pierce

 

“It’s easier to fool people than to convince them that they’ve been fooled.” – Unknown.

 


Photo courtesy of Tristan Harris

 

Sometimes our smartphones are our friends, sometimes they seem like our lovers, and sometimes they’re our dope dealers. And no one, in the past 12 months at least, has done more than Tristan Harris to explain the complexity of this relationship. Harris is a former product manager at Google who has gone viral repeatedly by critiquing the way the big platforms—Apple, Facebook, Google, YouTube, Snapchat, Twitter, Instagram—suck us into their products and take time that, in retrospect, we may wish we hadn’t given. He has also launched a nonprofit called Time Well Spent, which is devoted to stopping “tech companies from hijacking our minds.” Today, the TED talk he gave last April was released online. In it, he proposes a renaissance in online design that can free us from being controlled and manipulated by apps, websites, advertisers, and notifications. Harris expanded on those ideas in a conversation with WIRED editor in chief Nicholas Thompson. The conversation has been edited for clarity and concision.

Nicholas Thompson: You’ve been making the argument that big internet platforms influence us in ways we don’t understand. How has that idea taken off?

Tristan Harris: It started with 60 Minutes and its piece reviewing the ways the tech industry uses design techniques to keep people hooked to the screen for as long and as frequently as possible. Not because they’re evil but because of this arms race for attention. And that led to an interview on the Sam Harris podcast about all the different ways technology is persuading millions of people in ways they don’t see. And that went viral through Silicon Valley. I think several million people listened to it. So this conversation about how technology is hijacking people is really catching on.

NT: What’s the scale of the problem?

TH: Technology steers what 2 billion people are thinking and believing every day. It’s possibly the largest source of influence over 2 billion people’s thoughts that has ever been created. Religions and governments don’t have that much influence over people’s daily thoughts. But we have three technology companies who have this system that frankly they don’t even have control over—with newsfeeds and recommended videos and whatever they put in front of you—which is governing what people do with their time and what they’re looking at.

And when you say “three companies,” you mean?

If we’re talking about just your phone, then we’re talking about Apple and Google because they design the operating systems, the phone itself, and the software in the phone. And if we’re talking about where people spend their time on the phone, then we’re talking about Facebook, YouTube, Snapchat and Instagram because that’s where people spend their time.

So you’ve started this big conversation. What’s next?

Well, the TED talk I gave in April was only heard by conference attendees, but now it’s available online. It basically suggests three radical changes that we need to make to technology. But before understanding what those changes are, we have to understand the problem. Just to reiterate, the problem is the hijacking of the human mind: systems that are better and better at steering what people pay attention to, and better and better at steering what people do with their time, than ever before. These are things like “Snapchat streaks,” which hook kids into sending messages back and forth with every single one of their contacts every day. These are things like autoplay, which causes people to spend more time on YouTube or on Netflix. These are things like social awareness cues, which, by showing you how recently someone was online or letting you know that someone saw your profile, keep people in a panopticon.

The premise of hijacking is that it undermines your control. This system is better at hijacking your instincts than you are at controlling them. You’d have to exert an enormous amount of energy to control whether these things are manipulating you all the time. And so we have to ask: How do we reform this attention economy and the mass hijacking of our mind? And that’s where those three things come in.

OK. How do we reform it?

So the first step is to transform our self-awareness. People often believe that other people can be persuaded, but not me. I’m the smart one. It’s only those other people over there who can’t control their thoughts. So it’s essential to understand that we experience the world through a mind and a meat-suit body operating on evolutionary hardware that’s millions of years old, and that we’re up against thousands of engineers and the most personalized data on exactly how we work on the other end.

Do you feel that about yourself? I tried to reach you last weekend about something, but you went into the woods and turned off your phone. Don’t you think you have control?

Sure, if you turn everything off. But when we’re online, we have to see that some of the world’s smartest minds are working to undermine the agency we have over our minds.

So step one is awareness. Awareness that people with very high IQs work at Google, and they want to hijack your mind, whether or not they’re doing it deliberately. And we don’t realize that?

Yeah. And I don’t mean to be so obtuse about it. YouTube has a hundred engineers who are trying to get the perfect next video to play automatically. And their techniques are only going to get more and more perfect over time, and we will have to resist the perfect. There’s a whole system that’s much more powerful than us, and it’s only going to get stronger. The first step is just understanding that you don’t really get to choose how you react to things.

And where’s that line? I do choose sometimes to use Instagram because it’s immensely valuable to me; I do choose to go on Twitter because it’s a great source of news. I do go to Facebook to connect with my friends. At what point do I stop making the choice? At what point am I being manipulated? At what point is it Nick and at what point is it the machine?

Well, I think that’s the million-dollar question. First of all, let’s also say that it’s not necessarily bad to be hijacked; we might be glad about it if it was time well spent for us. I’m not against technology. And we’re persuaded to do things all the time. It’s just that the premise in the war for attention is that it’s going to get better and better at steering us toward its goals, not ours. We might enjoy the thing it persuades us to do, which makes us feel like we made the choice ourselves. For example, if the next video loads automatically and we’re happy about the video we watched, we forget that we didn’t choose it. But, in fact, we were hijacked in that moment. All those people who are working to give you the next perfect thing on YouTube don’t know that it’s 2 am and you might also want to sleep. They’re not on your team. They’re only on the team of what gets you to spend more time on that service.

So step one is, we need to transform our self-awareness. What’s two?

Step two is transforming design. Based on this new understanding of ourselves—of how we’re persuaded and hijacked, etc.—we would want to do a massive find-and-replace of all the ways we’re hijacked that we don’t want, and replace them with the timeline of how we would have wanted our lives to go. An example: today, you look at your phone and you see a Snapchat notification. And it persuades you to think a bunch of things that you wouldn’t have thought. It causes you to get stressed out about whether or not you’ve kept your streak up. It’s filling up your mind. And by responding to that one streak, you get sucked into something else, and it cascades. Twenty minutes later you’re sucked into a YouTube video. And there goes your day.

What we want to do is block those moments that hijack your mind in ways you regret, and replace them with a different timeline—what you would have wanted to have happened instead. The resource we’re conserving is time. Imagine these timelines stretching out in front of people, and right now we’re being tugged and pulled onto these brand-new timelines that are created by technology. Let’s do a massive find-and-replace from the manipulative timeline to the timeline we would’ve wanted to have happened.

How do you do that?

As I say, it has to do with design. An example I gave in the TED talk released today was the idea of replacing the Comment button with a Let’s Meet button. In the last US election, conversations were breaking down on social media. People post something controversial, and there’s this comment box underneath that basically asks you, Which key do you want to type? It turns into a flame war that keeps people expressing their views in small text boxes and keeps them on the screen. People end up misrepresenting each other’s ideas because their views get compressed into these little boxes of text. So it’s causing people to stress out. It’s causing people to dislike each other.

 

Internet companies are racing to the bottom to capture our attention, Tristan Harris says in his 2017 TED talk.

 

Imagine we replace the Comment button with a Let’s Meet button. When we want to post something controversial, we can have the choice to say, “Hey let’s talk about this” in person, not online. And right underneath, there’s an RSVP, so people can coordinate right there to talk about it over a dinner. So you’re still having a conversation about something controversial, but you’re having it at a different place on your timeline. Instead of a fragmented timeline over 20 minutes at work getting interrupted 20 times—while Facebook delivers the messages drip by drip and other notifications come in and you’re getting sucked into Facebook, which is a total mess—you replace that with a clean timeline where you’re having dinner next Tuesday, and you’re having a two-and-a-half-hour conversation in which a very different sequence of events happens.

But how do you know meeting for dinner and talking about things is what you want to happen? Suddenly you’ve created this whole new system where you’re pushing people to meet in person because of your assumption that meeting in person or by videoconference is better than talking in chat boxes. Which may be true. Or it may be false. But it’s still a decision made by the person or the social media company.

Yeah, exactly. And so before we ask, Who are we, Nick and Tristan, to say what’s better?, let’s ask: Why is Facebook promoting a comment box and Like button in the first place? Were the designers thinking about what’s the best way for humankind to have conversations about controversial topics? No. They don’t get to ask that question. The only question they get to ask is, “What will get people to engage the most on the platform?”

Source: WIRED

 

 

Further reading:

 

How Technology is Hijacking Your Mind - from a Former Insider

“It’s easier to fool people than to convince them that they’ve been fooled.” - Unknown. I’m an expert on how technology hijacks our psychological vulnerabilities. That’s why I spent the last three years as a Design Ethicist at Google caring about how to design things in a way that defends a billion people’s minds from getting hijacked.

 

Turn Off Your Push Notifications. All of Them

Push notifications are ruining my life. Yours too, I bet. Download more than a few apps and the notifications become a non-stop, cacophonous waterfall of nonsense. Here’s just part of an afternoon on my phone: “Hi David! We found new Crown jewels and Bottle caps Pins for you!”

 

Devastated Snapchatters talk about the heartbreak of losing a Snapstreak after hundreds of days

No one likes losing things. Especially when you’ve worked on them every single day for a year and a half. But that’s the reality college student Amy Strawser is currently facing, as she lost a 571-day “Snapstreak” with her sister on Snapchat.

 

Facebook Messenger is stealing Snapchat’s streaks feature

Facebook loves to steal Snapchat features to block the popular messaging app’s momentum. Ephemeral stories and messaging have arrived in Facebook, Instagram, and WhatsApp previously, and the company has even lifted Snapchat’s popular face filters and camera features.

 

 

What is Technology Doing to Us?

A Conversation with Tristan Harris

 


(Photo via Alan O’Rourke)

 

In this episode of the Waking Up podcast, Sam Harris speaks with Tristan Harris about the arms race for human attention, the ethics of persuasion, the consequences of having an ad-based economy, the dynamics of regret, and other topics.

Tristan Harris has been called “the closest thing Silicon Valley has to a conscience” by The Atlantic magazine. He was the Design Ethicist at Google and left the company to lead Time Well Spent, where he focuses on how better incentives and design practices can create a world that helps us spend our time well. Harris’s work has been featured on 60 Minutes and the PBS NewsHour, and in many journals, websites, and conferences, including The Atlantic, ReCode, TED, the Economist, Wired, the New York Times, Der Spiegel, and the New York Review of Books. He was rated #16 in Inc Magazine’s “Top 30 Entrepreneurs Under 30” in 2009 and holds several patents from his work at Apple, Wikia, Apture, and Google. Harris graduated from Stanford University with a degree in Computer Science.

 
