Disclaimer: I strongly disliked the documentary “The Social Dilemma”, which rode a mimetic (and neurolinguistic programming) wave of popularity during the darker days of the pandemic. But this is not a review, so I’ll set most of my aesthetic and moral observations about the film aside. If you’re looking for a thoughtful one, though, I found “The Social Dilemma is Dangerously Wrong” to be one of the best I’ve come across.
The odd thing about the Center for Humane Technology—founded by former Google “ethicist” Tristan Harris—is that its view of humanity seems entirely materialist. Which isn’t a very noble view of humans.
In this worldview, human beings are at the mercy of the Big Tech companies. Salvation from the technocracy will only come about if the tech companies change their ways and design less addictive products. Otherwise, we’re screwed. We can’t rise above our base instincts.
It strikes me as shallow—a negation of our agency, a cheap and alarmist crusade against the very things that made the documentary “The Social Dilemma” (which the Center was heavily involved in making) popular. The algorithm pushed the film to the top of Netflix for many people for weeks on end—and nobody at the Center seemed to have a problem with that. The film also utilized neurolinguistic and visual hacks to make its narrative as compelling as possible. All fine, apparently, as long as it’s serving the right interests.
My read on all of this today is fairly mimetic: the more they scapegoat technology and technology executives, the less anyone is encouraged to do the hard work of examining the mimetic drives that these platforms exploit.
In the words of my friend Tom Bevan: “You’re not addicted to Twitter; you’re addicted to the fantasy of the person you say you are on Twitter.” The problem is spiritual, as most problems are. But we’re not allowed to talk about those things.
When he was a kid, Tristan Harris loved magic. The secret of magic, he learned, is to exploit the audience’s blind spots and vulnerabilities—to take them to the bleeding edge of perception and understanding. If he worked his magic well, they offered no resistance. They didn’t even know there was anything to resist.
He eventually went on to study cults, hypnosis, and behavioral economics. After graduating from high school, he enrolled at Stanford to study computer science with a focus on Human-Computer Interaction. He then continued into a master’s program and started working in the university’s Persuasive Technology Lab, which specialized in captology—the study of computers as persuasive technologies that can “change people’s beliefs and behaviors.”
The lab was founded in 1998 by Dr. BJ Fogg, a former doctoral student who used experimental psychology to show that computers can change behavior in predictable ways. His dissertation was titled Charismatic Computers.
Where do computers get their “charisma” and their ability to “change behavior in predictable ways”? This was one of the lab’s animating questions.
One of the interesting things about the materialist view of the world is that if you really embrace it, you have to start personifying non-persons—you do things like ascribe “charisma” to machines. The humanity has to go somewhere. When it leaves humans, it just gets projected onto technology.
But computers are reflections of their creator. Behind every “persuasive technology” is a persuasive human. Humans are the only kind of model of desire that we’re actually persuaded by.
(Dr. Andrew Meltzoff, in his studies of “gaze following” in infants, found that babies are only interested in the gaze of their mothers, fathers, and fellow humans—the attention modeled by robots, animals, or any other non-human does nothing for them. The same is true for adults. We care about other people.)
It’s the people on the other side of social media that really interest us. The technology is merely a conduit.
New Titles, New Dilemmas
In 2007, Tristan Harris dropped out of Stanford and left the Persuasive Technology Lab to start a company he called Apture. Google bought the company in November 2011. By the end of December that same year, they’d shut it down. The threat to their search business had been neutralized.
But the magician got a new hat. As part of the acquisition of his company, Google hired Tristan as a “Design Ethicist,” responsible for thinking about the ethical design of its products.
He was haunted by the question: “How do you ethically steer the thoughts and actions of two billion people’s minds every day?” He was becoming worried that successful technology products were tapping into deep-seated human needs in a way that exploited vulnerabilities in human nature. While still at Google, he created a 141-slide presentation, “A Call to Minimize Distraction & Respect Users’ Attention,” which went viral inside the company.
In 2015, he left Google to engage the issue publicly and founded the Center for Humane Technology, which has a mission “to reverse human downgrading by inspiring a new race to the top and realigning technology with humanity.” (Notice the language: human ‘downgrading’, similar to Yuval Noah Harari’s thought of humanity ‘upgrading’ itself into gods—language lifted from computers.)
Western culture is built on the ideas of individual choice and freedom, he says—but this freedom is illusory when it’s under the control of the magicians of Big Tech. If you control the menu, you control the choices. “This is exactly what magicians do,” says Tristan. “They give people the illusion of free choice while architecting the menu so that they win, no matter what you choose.”
[There is a not-so-subtle negation of free will there.]
“The problem is the hijacking of the human mind,” he says, and the problem must be solved by the tech companies themselves through better design. If we are addicted to social approval, it is because Facebook’s most potent dopamine-dropper, the Like button, was designed to hook us.
But like someone at a magic show, it’s easy to pay attention to the wrong things. René Girard discovered the “like” button at the heart of human nature—mimetic desire. It was driving addictive behavior long before the Internet was even invented. Facebook simply translated it into pixels.
But I wouldn’t say that “the problem is the hijacking of human desire.” The problem is that we’re allowing our desires to be hijacked in the first place.
The Technological Version of the Romantic Lie
Wired magazine’s Editor-in-Chief Nick Thompson talked to one unnamed tech executive who summed up Harris’s view like this:
“Tristan sees humans as pawns incapable of managing their own lives. I like to imagine Tristan reviewing the latest restaurant. ‘They have clearly intentionally added flavor to this dish to make me want to come back and visit this business again. What scoundrels!’”
All of Tristan’s warnings have something curious in common: they address the relationship between an individual and the technology.
This is nothing more than the Romantic Lie in mimetic theory—the idea that we are monadic wanting machines, completely independent from the desires of other people. The problems come down to addictive tech—“design” problems. Here’s how social media looks to Tristan Harris:
He sees a physiological and neurological problem—the hijacking of the “mind.” Tech companies are competing in a race to the bottom of the brain stem.
There is no doubt that technology can be designed to be more or less addictive. But all of these warnings about addictive design leave out the most salient and addictive feature of social media altogether—other people.
The danger is not that we have a slot machine in our pockets. The danger is that we have a dream machine in our pockets—one that projects the desires of millions of people around the planet to us continually.
Mimetic desire fuels the engine of social media. Addictive design simply makes the engine run faster.
The real addiction is the same as it’s always been. People have a deep metaphysical desire for being—and that desire is not something that can be contained by or explained well in the material sphere alone. But the new ethicists of tech won’t stop trying.