The Desire Chamber

Clubhouse and the Rise of Sophomoric Solutions to Adult Problems

It happened, as it so often does, that I found myself in a Clubhouse room full of people talking about how great Clubhouse is. 

I don’t know how I got there. I suppose that’s the automagical allure of the place. But there I was on a digital “stage” with the infamous Internet entrepreneur Kim Dotcom, the AI scientist Lex Fridman, and the voices of other Important People, discussing the future of social media.

Each of the seven other speakers on stage qualified their comments about Clubhouse by saying they were optimists before effusively praising the app. It would lead to a more human kind of social media experience, they all said, where people could enter into productive dialogues through the medium of voice.

For a moment, I almost believed them.

Each time I tried to loosen my tongue to break the spiral of silence (I planned to qualify my statement as not “anti-optimist,” in the hope of not offending), a disembodied voice would jump in front of me.

“I—oh—I—no, go ahead,” I responded, ceding the floor, not wanting to break any unknown taboos in this new platform I’d only recently been nominated (and all-too-honored) to join.

I plan to be a scapegoat later in life, but I’m not quite ready for it.

The Power of the Neighbor

In the isolation of the pandemic, our need for social media has been especially acute. It helps us to make sense of who we are. We didn’t know what to desire. So we turned to other people, on our phones, to tell us:

Watch Tiger King, drink negronis, adopt dogs, avoid doing dishes, fight for justice, watch Bridgerton, support local bookstores, Zoom, take extravagant local getaways, take up bird watching, redefine self-care.

We look to other people—as flawed and volatile and contentious as some may be—in order to see ourselves. 


There’s something buried deep within human nature—something we rarely want to acknowledge, and which the French cultural theorist René Girard called “things hidden since the foundation of the world”: our deep-seated propensity to covet our neighbors’ goods.

So perennial is the conflict caused by the concern for what our neighbor has that a prohibition against the desire itself was enshrined as the Tenth Commandment. 

“Thou shalt not covet thy neighbour's house: neither shalt thou desire his wife, nor his servant, nor his handmaid, nor his ox, nor his ass, nor any thing that is his.”

It commands that the ancient Israelites not covet anything of their neighbor’s. The Tenth Commandment suggests that there is a principle of rivalrous conflict at the very heart of human relations. And the emphasis on the neighbor is strange—so strange, in fact, that we normally overlook it.

It is a “fundamental revolution in the understanding of desire,” according to René Girard.

“We assume that desire is objective or subjective, but in reality it rests on a third party who gives value to objects. This third party is usually the one who is closest, the neighbor. To maintain peace between human beings, it is essential to define prohibitions in light of this extremely significant fact: our neighbor is the model for our desires.” 

This is what Girard calls mimetic desire: the idea that desires are generated and shaped by those around us (“models”) through a process of imitation that typically flies beneath conscious awareness.

Not all models are alike: we tend to gravitate toward those who are closest to us—not necessarily geographically, but socially and existentially.

In 2021 everyone is our metaphorical (and metaphysical) neighbor.

Social media has brought all of us into extremely close desirous proximity to one another. It is the world of what Girard calls “internal mediation”—the people who are models of desire for us are “inside” or “internal” to our own worlds because we can talk to them, engage with them, and most importantly compete with them. This is why social media is an engine of mimetic desire. 

This feature of social media has not received sufficient attention. While we tinker with design changes (“human-centered design” etc.) that might make our apps less neurologically addictive, we have overlooked our real addiction: to our neighbor’s desires. We are addicted to mimetic models. 

Our addiction is not primarily to the sounds or frequency of the notification, or the interface, or the “intermittent variable rewards” the apps give us, like slot machines. These are all superficial (materialist) diagnoses of the real problem. 

[I’m not referring to the “feeds” themselves, which are already automated by algorithms, but to the content that makes up the feeds. In other words, if social media were just AI-generated information, it wouldn’t fascinate us.]

Social media relies on people imitating other people: quoting, retweeting, reacting. It encourages mimesis even in its form: the prescriptive, restrictive design of social media profiles causes a crisis of sameness. 

How do you encapsulate your you-ness in a 40-character profile bio? (Someone with a yacht wrote a haiku. Damn.) The ostentatious displays of “follower” counts encourage people to follow those whom other people are already following (a form of social proof). And the most highly rewarded posts—those that get the most engagement and visibility across most platforms—are disproportionately taboo or controversial. Anger and outrageousness are more mimetic than sharing what you ate for breakfast.

Everyone has become our neighbor. It’s as if we’ve found a way, through technology, to fit all seven billion people on planet earth onto the head of a pin. At the same time, we removed all distinctions between them and forced them into the same profile boxes and platforms. Hell, we’re even trying to differentiate ourselves from one another now by the color of the laser eyes on our avatar.

Sure, there are different communities with different shapes of houses. But the McMansions in the Reddit neighborhood all look exactly alike; and the same with the McMansions in the Twitter and Facebook and Clubhouse neighborhoods. 

This is a recipe for conflict. Sameness, in the framework of mimetic theory, leads to escalating competition and rivalry. Geoff Shullenberger, in his excellent piece on Peter Thiel and scapegoating, put it like this:

“Their equalizing structure–what is most widely celebrated about them–converts all users into each other’s potential models, doubles, and rivals, locked in a perpetual game of competition for the intangible objects of desire of the attention economy.” 

Sameness has, throughout history, been risky business. There are five stories of sibling rivalry in the book of Genesis alone. Almost every culture in the world has a myth about birth twins fighting and killing each other, like Romulus and Remus, the founders of Rome. Some cultures even put to death one or both identical twins after birth, seeing them as a harbinger of future violence.  

If human beings are indeed the most imitative creatures in the world—and if we rely on signals to help us determine who to imitate and who not to—then we have set up social media to be a Girardian house of horror.

The Modus Operandi

The greater the internal mediation of desire, the greater is social media’s hold on us.

This is the lure of Clubhouse. It is intoxicating to think that you could land in a room or on a “stage” with a celebrity or “public intellectual” you used to only be able to admire from afar.

Celebristan (the world of external mediation, where we couldn’t come into contact with our models) has collapsed into Freshmanistan (the world of internal mediation, where we can compete directly with our models for Stage Time).

It should be no surprise, then, that social media has made the amplification of internal mediation a key part of its evolution.

Isn’t it interesting that Clubhouse is the social app most like high school or college? There are literal “hallways” built into the UX of the app, with “rooms” and “clubs” that have their cool kids—those speakers who have gained some form of authority (whether mimetic authority or real authority doesn’t matter...and the lines between those things are getting blurred anyway…)

The trend of social media is clear: the social media companies have figured out that more internal mediation of desire means greater engagement. 

Seeing social media for what it is—an engine of desire—is the key to developing better solutions: adult environments, not sophomoric ones; apps that place less importance on information and more on humanity; tech companies that don’t feed our basest mimetic desires and profit from them. 

Here are a few ideas that I am still working out:

First, platforms should consider ‘demetrification.’ This means removing engagement information from a post so that people have to evaluate it on its own terms rather than relying on signals about how many other people liked it. Demetrification might even extend to follower counts. When we meet someone in real life, most of us have no idea how many other people “follow” them or secretly admire them. Relationships have time to develop organically. Respect must usually be earned. I’m not suggesting that a person’s follower count shouldn’t matter in terms of the number of people their posts reach; I’m simply suggesting that the exaggerated importance of “Followers” as a leading metric be deemphasized, so that fake follower-grabs are not rewarded with mimetically driven authority that may or may not be indicative of substance.
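To make the idea concrete, here is a minimal sketch of what demetrification could look like at the display layer. Everything here is hypothetical (the Post structure and field names are invented for illustration), but the principle is simple: the engagement data can still exist behind the scenes, while the viewer never sees it.

```python
from dataclasses import dataclass, asdict

@dataclass
class Post:
    # Hypothetical post record; a real platform stores far more than this.
    author: str
    text: str
    like_count: int
    share_count: int
    author_follower_count: int

def render_for_viewer(post: Post) -> dict:
    """Strip engagement metrics before a post is shown.

    The counts still exist server-side (for reach, ranking, abuse detection),
    but the viewer has to judge the post on its own terms.
    """
    visible = asdict(post)
    for metric in ("like_count", "share_count", "author_follower_count"):
        visible.pop(metric, None)
    return visible

post = Post(author="@example", text="A thought about desire.",
            like_count=12_400, share_count=900, author_follower_count=250_000)
print(render_for_viewer(post))
# {'author': '@example', 'text': 'A thought about desire.'}
```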

Second, algorithms should not optimize information based on what people most “want” to see. Showing people what they most “want” to see is good for business but bad for humans. Humans can opt in to curating sheltered lives for themselves by moving to like-minded communities and erecting barriers against people or ideas that threaten them—but we should not make it the default mechanism by which social media works. If people want to create their own echo chambers, they should encounter some serious friction in order to do it. Social media companies that care about creating a healthier human ecology should randomize the kind of information that people are exposed to in a way that approximates the exposure they would get in the physical world.
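One way to read that last suggestion, as a purely illustrative sketch with invented inputs rather than any real platform’s ranking system: blend an engagement-ranked candidate list with a random sample of posts from outside the user’s usual circle, roughly the way a walk through a physical neighborhood exposes you to people you did not choose.

```python
import random

def build_feed(ranked_posts, outside_posts, exposure_rate=0.3, seed=None):
    """Blend engagement-ranked posts with posts from outside the user's bubble.

    ranked_posts:  posts the engagement model predicts the user "wants" to see,
                   best first.
    outside_posts: posts drawn from communities the user does not follow.
    exposure_rate: fraction of the feed reserved for outside exposure, loosely
                   approximating chance encounters in the physical world.
    """
    rng = random.Random(seed)
    n = len(ranked_posts)
    n_outside = int(n * exposure_rate)
    sampled_outside = rng.sample(outside_posts, min(n_outside, len(outside_posts)))

    feed = ranked_posts[: n - len(sampled_outside)] + sampled_outside
    rng.shuffle(feed)  # don't bury the outside posts at the bottom of the feed
    return feed

# Example: seven in-bubble posts, with roughly 30% swapped for out-of-bubble ones.
print(build_feed([f"in{i}" for i in range(7)],
                 [f"out{i}" for i in range(5)], seed=42))
```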

Rather than having us imitate the machines, the machines should imitate us—we should build these platforms around what real human relationships actually look like. We can do better. We have ‘augmented reality.’ Why not augmented social media? That is: social media that augments our relationships with other humans rather than undermining or replacing them.

Third, identity verification should be the norm and personal responsibility should be rewarded. Anonymity fuels disinhibition, misinformation, and mimetic contagion. Pseudonymous accounts are fine (and good in some cases)—but the person behind them should still have to verify that they’re a real person.

Balaji Srinivasan, the investor and former CTO of Coinbase, said on a recent episode of the Tim Ferriss Show that “pseudonymity stops both discrimination and cancellation.” He suggested that moving to pseudonymity as the norm on social networks might usher in a new truce in the Internet Wars—a kind of mutual disarmament whereby nobody’s personhood can be canceled, only their online persona.

But it would be tragic if our victory in the Internet Wars meant stripping ourselves of our humanity and identities. A Pyrrhic victory.

We’ll achieve a more humane online environment—a healthier human ecology—when people take more personal responsibility for what they say, not less. Social media companies have to reward personal responsibility. I don’t see how moving to an online world of total pseudonymity is going to get us there.

Lastly, we can mitigate disinhibition by incentivizing people to show their “faces” (I mean that metaphorically, in the vein of Emmanuel Levinas), and to dialogue with the faces of others—which, as Levinas had in mind, helps us recognize a moral responsibility toward them.

The recognition of moral responsibility is often lacking when we are dealing with disembodied Hot Takes and Smart-sounding voices.    

These problems cannot be resolved through technological solutions alone. Technology created the problems. The Slayer cannot also be the Healer.

First, we’ll have to develop some real anti-machinery in our guts. Our technology is only as good as we are.