Free Speech, Censorship, and the New Media

The Problem of Mimesis and the New Media Landscape

This is the first full piece of video content released from last month’s NOVITĀTE conference, which I hosted on November 3, 2023, in Washington, DC.

This conversation, titled “EVEN BETTER THAN THE REAL THING: The New Media Landscape”, featured author Walter Kirn, Substack co-founder Hamish McKenzie, and Renée DiResta of the Stanford Internet Observatory. It was moderated by New York Times columnist Ross Douthat.

This was a conversation that needed to happen. A key issue of contention was whether or not media companies can “manage” mimesis effectively—and whether they should be in the business of trying to do so at all.

The release of this video is particularly timely in light of calls for Substack to moderate its content the way that other social media companies have tried, and failed, to do effectively. Today I signed a letter written by a fellow author, along with dozens of other writers on this platform.

A full transcript of the conversation from last month is below.

Full Transcript

Luke Burgis (00:00:00):

I have the pleasure of introducing the second featured panel of the day, titled Even Better Than The Real Thing: The New Media Landscape, with Walter Kirn, Renée DiResta, and Hamish McKenzie.

Instead of the obligatory AI panel, which every conference seems to have, I opted for this one instead because I can't imagine something more important at the moment, especially given the tragic events in Israel and Gaza over the last month, which most of us only really have access to through the mediated reality of tweets and news images and the Discourse, with a capital D, that we see online. In fact, one of our registered attendees is not with us today because she's stuck in Israel at the moment.

(00:00:53):

So this panel is not really about that issue. This topic was landed on months ago before those events happened, but it just seems more important than ever. Media is an industry heavily influenced by mimesis. And all three of our panelists know about the media intimately from the inside, and so does our moderator, Ross Douthat. Ross is a New York Times columnist, in my opinion, one of the most levelheaded, antimimetic, sane, kind, thoughtful writers out there. And he's no stranger to Catholic University because he's been a fellow at our Institute for Human Ecology for quite some time. And he's moderated many panels for IHE. I'm not sure if ever one quite like this, but he's a pro at this by now, and I'm so grateful that he could join us for this featured panel. Ross Douthat, I'll let you take it from here.

Ross Douthat (00:01:49):

Good to see you.

Luke Burgis (00:01:50):

You too.

Ross Douthat (00:01:53):

I think I'm supposed to sit down. I was given very detailed instructions. Yeah? Yeah? Okay, great. Thank you all for being here. It's a real pleasure to be invited to moderate this empty stage, which will actually be populated by AI-based simulacra of the panelists in spite of what the introductory remarks suggested. And actually the last panel that I moderated here at CUA, there was a lot of talk about demons and demonology, and of course our microphones kept breaking throughout. So I'm hoping that the entirely undemonic new media landscape won't inspire any technological snafus.

(00:02:38):

But, I'm going to introduce the panelists. Much like the weather, everybody complains about the media, but nobody does anything about it except for the three panelists that we have here today who all in quite different ways that I'm sort of hoping will spark some interesting debates, have made it their business to not just study but try to figure out how to change, and you could argue, fight some of the Girardian pathologies that beset online media in all its various forms.

(00:03:19):

So we'll be joined first by Hamish McKenzie, who is a co-founder of Substack, a small internet platform with which I imagine many of you are familiar, and a recovering journalist might be the right way to describe him. And there he is. Come on up. Come on up, Hamish. Sorry, I wasn't sure exactly... I want you to sit close to me. Let's go that way. So we have Hamish immediately to my right.

(00:03:45):

And next up is Walter Kirn, who is a celebrated novelist, an American littérateur, a man of letters. No, come, come. Just come. Just come. They're going to clap. A man of letters in the classic sense of the term. That's perfect. Who is most recently the co-founder and editor at large of County Highway, a print-only newspaper that I believe bills itself as America's only newspaper. Is that... Yeah. I have no comment on that description. And he's also the co-host of America This Week, a podcast with Matt Taibbi, who himself presides over a small empire within Hamish's larger imperium.

(00:04:34):

And then finally we have Renée DiResta. Come on, Renée. You get a car and you get a car. Have a seat. This is a bland sounding title, but she's a technical researcher at the Stanford Internet Observatory where she is a self-described expert on rumors and propaganda.

Renée DiResta (00:05:00):

I did not say expert.

Ross Douthat (00:05:01):

She has been described by The New York Times in its manifestation here as an expert on rumors and propaganda. Thanks to all of you for being here with me. And as I promised, I'm going to start with a very open-ended question, which will hopefully then segue us quickly into interesting conversation and occasional disagreement. So I'm going to ask each of you, starting with Hamish, who is the most eager to answer this question, to give us just a brief three-to-five-minute (or 30-second, if you prefer) summary of what you see as either the state of the media landscape right now, or sort of the particular kind of work that you see yourself doing in it, or really anything along those lines that you'd like to say. Hamish, over to you.

Hamish McKenzie (00:05:49):

Thank you. Thank you for the kind introduction. The state of the media is an interesting place, to state the obvious. I think on one hand, we've got traditional media, which is laboring under the failure of the business model that has supported it for so long, and that leads to problems in the traditional media. The main problem with traditional media is that it can't make money, or is not making enough money. The New York Times is doing fine; most others are not.

(00:06:15):

And then there's social media on the other end of that, where the business model is kind of working too well, and they've created these big addiction machines that reward the behaviors and content that keep people engaged but don't have much to do with truth or helping us understand each other, and that leads to a lot of polarization, leads to a lot of conflict. It's reactive, it's disputatious. Substack has positioned itself as an alternative to both of those. An alternative sometimes, a complement sometimes, but something with a different set of rules, where the rewards go to people who are respecting the attention and rewarding the trust of their subscribers over the course of a relationship.

(00:07:04):

There's no one piece of content that is monetized; the relationship is monetized, and that leads to a different set of behaviors. Different types of content are made possible, and a wider range of voices can succeed and be let into the ecosystem. I think our mission here is to try and improve discourse. Our literal mission statement is to create a new economic engine for culture, but this is largely a project to create a media system that has a different set of rules, one that can foster a healthier and more productive discourse.

Ross Douthat (00:07:43):

Before I go on, I'm just going to probe a little bit. What's been the hardest thing so far in that project? Or maybe either the hardest thing or the thing that surprised you the most in terms of challenges relative to when you launched Substack?

Hamish McKenzie (00:07:58):

Yeah, the first three years of Substack, it was three of us most of the time, working kind of in the shadows without a lot of attention and just quietly building, and the thing was growing and there were some writer success stories, but it wasn't a phenomenon by any stretch at that point. The pandemic coincided with our fast growth and a bunch of big-name writers coming to Substack and a lot more attention coming to Substack.

Ross Douthat (00:08:26):

Some turbulence in the media.

Hamish McKenzie (00:08:28):

Some turbulence in the media that we benefited from, some turbulence in the economy that we ended up benefiting from. And then with the pandemic, the culture wars intensifying and these things colliding in a way that put Substack at the middle of a lot of people's opinions and a lot of people's commentary. It's difficult enough trying to build a company. It's difficult to build a startup. It's difficult to build anything that's to do with media or writers. But when there are roiling conflicts and loud chants and tomato throwing, it's difficult for the people building this thing. And it's difficult to keep your head and stay calm when there are so many forces trying to drag you into the not-calm places. So that's something we didn't plan on, but we feel like it's important to go through and it's important to hold the line on.

Ross Douthat (00:09:27):

While you're scapegoated one might say.

Hamish McKenzie (00:09:29):

While we're scapegoated, yeah. There are seasons.

Ross Douthat (00:09:31):

Seasons. Yeah.

Hamish McKenzie (00:09:32):

Sometimes we're people's heroes and sometimes we're the villains.

Ross Douthat (00:09:35):

All right, Walter.

Walter Kirn (00:09:37):

Can everyone hear me? Because I wasn't able to hear at the first panel, and I didn't come all the way from Montana not to be heard.

Hamish McKenzie (00:09:46):

I can hear you.

Walter Kirn (00:09:48):

You can hear me. If I'm overly loud, raise your hands. If I were to give a title to this episode in the evolution of the media, it would be The Empire Strikes Back. What we had in the early 1960s when I was born and Girard started writing was a virtual media cartel supported by corporate advertisement and cozy with the postwar national security state. It was able to do things like suppress the truth about the assassination of an American president, sell the Vietnam War on the basis of a false military incident in the Tonkin Gulf, and on and on through the Iraq War and so on.

(00:10:48):

But when the free internet rose in a way that was unprecedented and in some ways unexpected, that cartel, The New York Times, CBS News, Time Magazine... Now, I don't speak as a rebel, by the way. I've worked for these places. I've written for them. So I have inside knowledge of their mindset, their purposes, their ideals and the constructive purpose they served. But they were taken by surprise by the rebel forces, to overwork the Star Wars meme at this Girard conference.

(00:11:37):

And when they found themselves unable to prevent things like the election of Donald Trump, things like Brexit, when they found themselves especially humbled by the Elon Musk purchase of Twitter and so on, all of these things occurring over several years, but it's especially the Trump election that starts The Empire Strikes Back process, they decided that under the rubric of misinformation and disinformation, a threat that supposedly was almost as bad as the weapons of mass destruction in Iraq that didn't exist, we would have to fight back. And this cartel would use its declining prestige and mount a kind of rear-guard action against things like Substack. Not so much against things like Twitter, because half of the people who worked at Twitter were borrowed from the national security state and the law enforcement organizations, as has been revealed in the Twitter Files by my partner Matt Taibbi, recipient of something called the Dow Journalism Award just the other night in Washington.

(00:13:08):

So this group of legacy media institutions, along with a whole array of academic and what are called civil society organizations, and frankly, Homeland Security clerks of the government, got together, in that revolving doorway by which government works, where you leave government proper and go to work for a corporation, you leave the corporation and you go to work for the NGO, and ganged up to preserve this preferential cartel status for those groups and start shooting down the rebel ships. They did this at Twitter, they did this at Google, they did this at Facebook, very successfully. Some people cooperated more completely, others resisted. We can see their emails to each other now in the Twitter Files. We can see the emails of groups like that represented by the last person who will speak here, the Stanford Internet Observatory, the only observatory I know that has offensive capabilities to shoot down stars and planets-

Renée DiResta (00:14:35):

Darth Vader.

Walter Kirn (00:14:36):

... and which was responsible for the suppression of millions of tweets both around the election...

(00:14:44):

Well, there are a lot of organizations, there's a kind of fog of alphabetical confusion. There's the Virality Project, there's the Election Integrity Project, there's the Stanford Internet Observatory. They all have personnel and goals in common, and funding, by the way. And with this concerted effort, they have tried to fuse corporate, government, and media power against this threat, which has never shown itself to be a harm commensurate with the systematic dismantling of the First Amendment that is being done in its name. The recent renaming of disinformation and misinformation as gossip and rumors is a homely and homespun way of retreating from that Orwellian lexicon with which the whole project was launched. Today we're halfway through The Empire Strikes Back. We don't know how it's going to turn out. We have people like Senator Klobuchar getting up and asking-

Walter Kirn (00:16:03):

We have people like Senator Klobuchar getting up and asking that Jeff Bezos's Washington Post-adjacent, Amazon-adjacent Alexa, basically a search engine, not be allowed to access an entire video platform called Rumble, and maybe Substack, because these are not the trusted partners for the future AI-run information state. To put this in Girardian terms, what these organizations and what these nascent laws and actors would like to do is manage mimesis on behalf of the corporate and state interests that don't believe you are wanting the right things (you might want Donald Trump), or that you aren't wanting the things you should want enough (the Covid vaccine). And so, a word was coined in this universe, malinformation: things that are true but harmful.

(00:17:21):

So stories of other countries having problems with the Covid vaccine, which were true stories, would be labeled malinformation because they promoted something called vaccine hesitancy, which revealed the true nature of this entire project as a behavioral engineering enterprise, no longer having much to do with the truth, no longer having much to do with your right to desire what you wish or not desire what you don't wish, but acting on behalf of some unformed conglomerate of interests to make sure you did what they want. And that's where we stand. And all of these questions hang in the balance, which is why this panel is so important and why this debate is all important for our future.

PART 1 OF 4 ENDS [00:16:04]

Ross Douthat (00:18:16):

I have some follow-ups, but I think I should go to Renée to speak on behalf, perhaps, or not, of the management of mimesis.

Renée DiResta (00:18:25):

Well, as a person who does not manage mimesis, maybe we can wait for that part of the discussion. I feel like to tie this to Girard, though, what you've just witnessed is scapegoating, actually.

Ross Douthat (00:18:35):

I think we're doing really well with the Girard so far.

Renée DiResta (00:18:37):

Thank you.

Ross Douthat (00:18:38):

I just want to say.

Renée DiResta (00:18:40):

Ross had asked us to prepare a couple minutes of comments, and I actually did, on the plane, and I wanted to be careful in how I frame this, because normally I speak at tech conferences, not about philosophy, and so to avoid getting over my skis, I'm going to try to contextualize some things here. A series of technological advances transformed communication. With each successive new technology, the communication environment changes and power restructures. And there are people who gain power, there are people who lose power, and there is, of course, a struggle between those who gain and those who lose. There's an attempt to claw back. There are new kinds of communities that form as people have access to a new capacity to make themselves heard. And so the internet was one of those. And I think the period in which we all gained the potential for voice was actually the early era of the blogosphere, right?

(00:19:27):

I think that social media has been remarkably destructive, but I think that the kind of early days of the internet was this kind of halcyon era where everybody had the capacity to be heard, everybody had the capacity to speak, and we had both this union of free as in speech and free as in beer, to use the kind of way that the valley puts it. Social media followed, but that was primarily a shift in distribution, not in the capacity to create, but in the capacity to reach people. And so what you have in that regard is a shift that comes in incentives. People really want to be heard, but more than that, they now realize that they can monetize being heard. They can become media of one, they can become influential, they can become influencers. The term influencer comes about, it's a term actually that derives from marketing because it's the idea that you can use these people to sell a thing.

(00:20:11):

And so it's inherently based in the belief that you can use an individual to create a desire. We kind of heard that in the last panel a little bit with regard to marketing. And it's the role of the influencer though that becomes pretty remarkable when you realize that the same way that propaganda and marketing are very tightly coupled, you can also use people to sell ideas. And so what starts to happen is there's a platform set of incentives, right? So media in the form of social media reemerges, there's a set of incentives that come from the platforms. There's an incentive to keep people on site, there's an incentive to curate, there's an incentive to connect, and so they reshape the topology of human connection by connecting you not based on who you know, it starts with that, but they create an entire new social graph for you around what you believe or what you want.

(00:20:58):

And what you start to see is pushes, people driving you into communities, platforms, powerful platforms that are monetizing you, pushing you into communities around your interests. These become the people that you form your opinions around, these become the people that you form your social community around. It begins to be divorced from the real world, and it becomes primarily an online experience. And the way that the algorithms do this is around homophily, people who are statistically similar to each other. And based on machine learning, algorithms intuit what you'll want to see and who you'll want to speak to and what you'll want to do. And what starts to happen, we talked a little bit about this in one of the other great panels that Jeff led, is you have the formation of these online crowds and the influencer emerges from the online crowd, and so there's inherently a desire-based relationship.

(00:21:47):

The influencer is somebody who is just maybe a little bit better at intuiting what an algorithm will want, maybe a little more attractive, a little better at taking the right photo or saying the right thing, funnier, wittier, and what you start to see is the rise of these very, very powerful figures. But effectively, it's a systems-based relationship. You have a system, a trinity of the influencer, the algorithm, the crowd. And these three things become the dominant form of shaping public opinion through this very bottom-up model, and that is what the internet, that is what social media in particular, has really enabled: the formation of this trinity, and it's a mimetic desire machine. But again, you have the old system, the top-down model of narrative creation, and it begins to come into conflict with the bottom-up model of narrative creation.

(00:22:33):

And so what starts to happen is this battle for control of the narrative begins to take place from the top-down and also from the bottom-up. But what you start to see, as in any kind of propaganda ecosystem and technological shift, is one of the things that you do is you kill the old gods, right? And so the influencers and the bottom-up media ecosystem is actually quite incentivized to constantly undermine the thing that came before it because it is trying to take power, it is trying to take attention, and it is trying to become that dominant means of opinion shaping. And so then you have the top-down and the bottom-up. And what starts to happen is that the tech platforms become the mediators of these things. Not me, not a cabal, but you have the tech platforms that start to ask the question of, how should we curate? Who should we put together? What should we surface for the public?

(00:23:22):

And so there's a constant roiling battle, not only for control of the narrative as if there is some capital T truth, but what you start to see is a fragmentation of the public where the old idea of propaganda is that of this hegemonic model of top-down uniform opinion shaping across the entirety of the public. And what social media does is it destroys the capacity to have the unified public by creating the factions, by pushing people into the niches, into the small networks, and then ensuring, as the networks and the niches compete for attention, every single post in a Facebook feed is ranked, every single post in a Twitter feed is ranked, the question is only how. And so what you start to see is the battles for attention, the battles between the bottom-up and the top-down model playing out on the social ecosystem and somebody somewhere having to make a determination of what weight should be in that ecosystem.

(00:24:14):

So I think what you wind up seeing at this point is, to make the Girardian reference to Hobbes and Leviathan, the war of all against all, where the war is for capturing attention, the war is for shaping reality. But it's not so much even a matter of truth; it becomes a matter of monetization, it becomes a matter of clout, it becomes a matter of just wanting to ensure that your niche is the one that is heard, that you become the avatar for that particular opinion and that you continue to monetize it. The platforms then find themselves in a position of having to decide, in certain key pivotal areas like elections, like Covid, but there are many, many others, this is a global phenomenon, we over-layer the American political onto it, but this is a state of affairs in a system, and the platforms are asked then to serve as the arbiter, the curator, of what is going to capture your attention.

(00:25:04):

And that is actually what you see, whether in the Twitter Files or wherever else. You see that tension between the old system and the new; you see the platforms trying to say, what of this ecosystem should be seen by the public, and when? And as much as we might like to say that we're going to battle this out in some sort of marketplace of ideas, that's not how the system is designed, that is not how curation works, that is not how filtering works. So Substack is an interesting entrant, I think, and I'm a subscriber to many Substacks, because it offers maybe a little bit of a departure from the competition of the ranked feed, though you're competing for people's attention in their inbox, so there is still that constant competition. The internet offered this sort of premise: the free-as-in-speech model was this idea of the promise of the proliferation of voices, the ability to hear something beyond the hegemonic, top-down narrative.

(00:25:57):

But the peril became that the incentives of the top-down model, the same filters that Chomsky refers to when he describes how information makes it to the public through a series of incentives, those incentives also come into play in the new media environment as well. And so what you start to see is much more scapegoating as one faction pits itself against another, much more mimetic desire, including within the crowds as somebody who is no longer the pure avatar of an identity is attacked by the crowd itself and cast aside and a new influencer emerges and an incoherent series of niche publics largely acting around their own mimetic desires. I think that's the ecosystem that we find ourselves in now.

Ross Douthat (00:26:37):

And in that ecosystem, you mentioned, I won't say in a dismissive way, but as if it's sort of an anachronistic concept, the marketplace of ideas?

Renée DiResta (00:26:48):

I would love the marketplace of ideas to be the marketplace. I'm saying that structurally, we are not in that, in a position where that's dominating.

Ross Douthat (00:26:55):

Right. But that implies that, effectively, technology has ushered in a landscape where the concepts that are very important to liberal ideas, small-l liberal ideas about debate and discourse, this hope that you can have some kind of marketplace where the best ideas somehow compete and win, that that is not going to exist in this landscape independent of some kind of conscious structuring by tech companies who are... I mean, in your view, the tech companies are sort of... We've got top-down and bottom-up, and the tech companies are sort of a filtration layer in between. And from Walter's perspective, they're part of the top, they've merged with the top. But either way, is the idea that in order to achieve a working marketplace of ideas again, you would essentially structure it the way free markets need a set of institutions to work? Is that the conception that you're trying to get at? If you were an advisor to Twitter, hypothetically, or an advisor to... If a person is, the advice is, your advice would be, again, hypothetically, you're not under contract with them, right? To seek the marketplace of ideas.

Renée DiResta (00:28:16):

I think that's what we should want to see, and I think that there's a decentralization happening as people are leaving Twitter and other places and moving to new environments. We started to see it actually in 2017 or so, maybe 2018, a little bit later. You start to see first alt platforms emerge on the right as people are upset about Twitter and Facebook's moderation frameworks; Truth Social, Parler, Gab, Gettr begin to emerge as niche places where people can go and form their infrastructure, form their community and form their spaces. You're seeing the same thing as moderation has flipped on Twitter to be sort of more preferential to certain types of moderation policies that people who are on the left dislike, and now you're seeing Bluesky, Mastodon and Threads take off. But while I appreciate that as a market correction, I actually find it very, very troubling from the standpoint that all it's saying is that people retreat into their corners, and there is no structural place for all the lofty rhetoric about public squares. That's not what we have.

Ross Douthat (00:29:21):

So how is that distinct from a pre-internet world, right? The world that, again, Walter is describing as a sort of overly managed, I think it's fair to say from his description, landscape of debate. So if you get to a world where people are retreating into particular platforms, retreating to particular publications, how is that distinct from the world of, well, either the world of 1957 or the world of 1887, right?

Renée DiResta (00:29:52):

I don't necessarily think that it is. I think it's a bit of a reversion. I think we had this sort of momentary period where everybody was all in one place and we had very high hopes for it, and then I think that that experiment is starting to head in the other direction now.

Ross Douthat (00:30:05):

And Walter, and Hamish, both of you, in certain ways from that description of things in the sort of Star Wars narrative where there's sort of an empire and a rebellion fighting against it, the terrain of social media is the contested terrain, right? It's the galaxy that you're trying to fight over, right?

Walter Kirn (00:30:30):

It was Hamish's company that allowed the exposure of her district of disinformation control through the Twitter Files and through Matt Taibbi's Substack account. You're looking at a direct fight. It's not that Hamish has a dog in the fight, it's that he has enabled writers like Taibbi, Shellenberger, Bari Weiss, etc., to mount an independent non-New York Times, non-CBS, non-Time Magazine challenge to the managers of mimesis. Because in this entire presentation, which was all very passive and abstract, Renée did not tell you what she actually does.

Renée DiResta (00:31:25):

Why don't you tell them?

Walter Kirn (00:31:27):

Why don't you?

Renée DiResta (00:31:28):

Well, because you have a theory.

Walter Kirn (00:31:31):

By the way, my ability to scapegoat someone who works for... Little Walter Kirn, to scapegoat Stanford University, Congress and the State Department, which, as she put in her bio, she advises, places that take money from the Defense Department and Homeland Security...

Renée DiResta (00:31:51):

I don't take money from either.

Walter Kirn (00:31:54):

Is minimal. My ability to go up against something like the Election Integrity Project, all of which you can read about. I'm going to give you the macros. You can decide whether what...

Walter Kirn (00:32:03):

... all of which you can read about. I'm going to give you the macros. You can decide whether what I told you today was accurate or not. Or the Virality Project, which was of course actually the anti-virality project because its job was to keep things from going viral that the powers that be didn't want to go viral.

PART 2 OF 4 ENDS [00:32:04]

Ross Douthat (00:32:20):

But at the risk of pulling you back from the intense conflict that is the great thing about all [inaudible 00:32:26].

Walter Kirn (00:32:27):

It's not conflict. It's not conflict. It was me asserting the truth beside someone who has... I didn't put the panel together. I didn't know that a panel on race relations would include David Duke and Cornel West.

Ross Douthat (00:32:51):

All right. Well, I had further tech questions, but Renée, what is it that you do? What is your description of the work that Walter is condemning, besides the abstract study of misinformation?

Renée DiResta (00:33:05):

Well, what we actually do is not what's in the Twitter Files. Writers picked six words out of an email and decided we did. So what we do at the Stanford Internet Observatory is we study abuse on the internet, and we study it in several different ways. First is trust and safety. So we look at understanding how networks form. We look at understanding brigading and harassment. We look at understanding suicide, self-harm, child exploitation. Bucket two is information integrity. And we do in fact study disinformation, because it is in fact a real thing. And for example, I've done work looking at Chinese influence operations; Russian, Saudi, Egyptian, American. Actually, we did a lot of work exposing some Pentagon-led information operations, because again, when you create a new system that the powerful can use to reach people directly, what you see is that manipulation campaigns do begin to emerge.

(00:33:53):

You see the creation of fake accounts, you see ways that people pretend to be things that they're not. And some of the work that we do, which again is all public and always has been, is not DOD funded or DHS funded or whatever the hell he's saying. You can go and you can look on io.stanford.edu, and all of the reports are out there, including the rebuttals to the Twitter Files.

(00:34:14):

What we did in election 2020 and what we did in the Virality Project was we chronicled the rise of viral narratives. We chronicled them as they happened. In the Virality Project in particular, all of the work that we did tracking what are the viral rumors, and I use rumors very specifically because rumors and disinformation are not the same thing. Rumors are the oral culture way that the internet has let people express a thing that they're concerned about, very concerned, because it's very salient to them. And what they begin to do is they say, "Hey, I saw this thing on the internet. I'm concerned about it. I saw this suitcase at my polling place. I saw this story of a nurse who fainted when she got a vaccine." And those stories go viral because other people are also very concerned by them.

(00:34:56):

And so what we do, the extremely unexciting work, is we actually chronicle them and we write reports and we stick them on the internet publicly at all times. All of our work is public. And so the Virality Project's PDFs sat on the internet for two years because what we were trying to do was enable doctors and physicians to counter-speak, and also sometimes the government because it does turn out that the government has a First Amendment right to counter-speak as well. There was no secret cabal, there were no millions of tweets censored or whatever else. I think if you'd like to see something interesting, Mehdi Hasan really dismantled that. There was an allegation that Walter repeated that we somehow censored 22 million tweets. This is an absolutely staggering number. It did not happen.

(00:35:37):

In terms of number of tweets we looked at, he was off by 21,997,000. So that's where we are. All of this information is out there on the internet and has been for quite some time. But again, the scapegoating happens because the people who are battling for control of the narrative, again, top down and bottom up, see the work that we do to chronicle and explain and categorize as somehow a threat to them. When you say, this is how the system works, these are the people, these are the outlets that are most adept at making rumors go viral, and also, certain things just are not true.

(00:36:15):

Here are the 8 million tweets related to Dominion or making an allegation that Sharpie markers were handed out to keep people from voting. When we say in an after-election study that there were 8 million tweets related to that, that doesn't mean that we had an opinion on 8 million tweets or said anything about them during the election. So unfortunately, very routine academic research then gets recategorized by people who are upset at the outcome and the findings as some sort of effort to suppress the things that we're studying.

Walter Kirn (00:36:46):

I'm going to read from a Virality Project email.

Renée DiResta (00:36:48):

And that email has been cut in half.

Walter Kirn (00:36:51):

I'm going to read from it and you can decide for yourselves.

Ross Douthat (00:36:54):

Well, wait, no, Walter. If she says... Not since I moderated the Sohrab Ahmari and David French debate have I had such a good time on stage. I don't think we can read an email from the Virality Project if she says it's been cut in half.

Walter Kirn (00:37:10):

The paragraph I'm going to read hasn't been cut in half.

Renée DiResta (00:37:14):

The top half explaining the context is gone.

Walter Kirn (00:37:16):

You don't want me to read it.

Ross Douthat (00:37:18):

Well, all right. We're going to do this. Why don't you read it, and then Renée can explain the context, and then I'm going to pull us back slightly and ask Hamish a question.

Walter Kirn (00:37:29):

Well, I'm going to withdraw from the debate after this. "Standard vaccine misinformation on your platform..." This is an email sent to Twitter.

Ross Douthat (00:37:46):

To Twitter?

Walter Kirn (00:37:46):

Its authenticity no one has disputed.

Ross Douthat (00:37:50):

This is an email from who at the Virality Project?

Renée DiResta (00:37:53):

An undergraduate.

Walter Kirn (00:37:55):

Oh, an undergraduate. Talk about scapegoating.

Renée DiResta (00:37:59):

That is literally-

Walter Kirn (00:38:00):

"Known repeat offenders: false or misleading posts from the accounts of well-known repeat offenders such as Robert F. Kennedy Jr. or Sherri Tenpenny. This is a large volume of content that is almost always reportable." Because their job is to do what they call flagging, or reporting to the company, such that they'll take stuff down or modify it or whatever. Also, "RFK posts that 4,000 vaccine adverse reactions were reported to CDC in one week." They want that out. "True content which might promote vaccine hesitancy: viral posts of individuals expressing vaccine hesitancy or stories of vaccine side effects. This content is not clearly mis- or disinformation, but it may be malinformation, exaggerated or misleading."

(00:38:59):

"Also included in this bucket are often-true posts which could fuel hesitancy, such as individual countries banning certain vaccines." You have just heard in one paragraph, whether or not the entire document has been cut in half for readability, a template for censorship of specific people, specific ideas, and on behalf of specific campaigns, meaning the vaccine campaign, for behavioral reasons. Not because they're untrue, but because they might promote a behavior in the public that is ill-desired by whoever backs this apparently completely altruistic project.

Ross Douthat (00:39:50):

Renée.

Renée DiResta (00:39:52):

So the top half of the email said, "Hey guys. These are the categories we're using internally to categorize our work. These are the categories that we're using for the reports that we put out," which again, sit on viralityproject.org, and you can read the PDFs. These are the categories we use to describe the rumors, which sometimes include things like a nurse fainted. That is true. Here's how it's being worked into the public conversation in a misleading way. The thing that Matt claimed that email said was that this was a list of things we demanded Twitter censor. Again, the email said, "These are the lists of categories that we're using internally for our research. Which of these are of interest to you?"

Ross Douthat (00:40:30):

But what interest would Twitter have in them?

Renée DiResta (00:40:33):

Twitter was interested in understanding narratives that were going viral across the entirety of the social media ecosystem, because often what they did was label the content. So one of the things that platforms do... this idea that moderation is censorship misunderstands how content moderation works. There are three different buckets of interventions. There's remove, where content is actually taken down, and you can call that censorship. There's reduce, where it's reduced in distribution, for example, if they're trying to figure out whether something is true and how they should curate it. Again, we were talking about the ranked feed; reduce impacts the weightings that go into the determination of the ranked feed. And then there's inform. And the overwhelming majority of vaccine-related or election-related content, again, this is in all of our reports that have sat publicly on the internet for three years now, was to receive a label.

(00:41:23):

And there are some people who want to tell you that a label is censorship. In my opinion, a label is counterspeech, a label is contextualization. And if you want to have a marketplace of ideas, one of the ways that you can do it from a design perspective, because again, the feed is ranked, is to include something that indicates that information might be contested, or that you might want to go and look for more information about it. And so the overwhelming majority of Twitter's engagements and interactions were actually acts of labeling. None of the emails in the Twitter files showed anybody from Stanford Internet Observatory demanding that Twitter take down or censor anything. And if this is the smoking gun, then I don't know what to tell you, again-

Ross Douthat (00:42:04):

Well, let's pull back from the smoking gun question and turn to Hamish.

Renée DiResta (00:42:09):

Thank you.

Ross Douthat (00:42:11):

But I'm going to stay with this question because I think we can say that independently of the Stanford Observatory Project, what powers it might or might not have, what funding it might or might not have, what specific goals it might or might not have, big and small internet companies occupy a role that doesn't have a precise analog in the world either before the internet, in the world of print, in the world of village rumors. You could say a party line in an old-fashioned telephone system bore some vague relationship to Twitter.com. X.com, excuse me. But not really, right?

(00:42:57):

If you're exchanging rumors in a big city in 1887, there's no entity that can even consider whether it's going to put a label on you or not. So companies like Twitter and companies like Substack are in this position that those of us who work for places like the New York Times think of as sometimes a little bit evasive about what they're really doing. The New York Times has to take responsibility for better or worse for including my columns, all the content that gets published. Digital platforms do not in quite the same way, but I think it's clear that Substack thinks about this question of what your mediating role is differently than Twitter does, did in the pre-Musk era, however you want to think about it. So can you talk briefly after that long-winded bridge about how you think about a company like Substack's weird mediating role in the information cycle?

Hamish McKenzie (00:43:58):

The New York Times is a publication and owns its content and owns responsibility for that. We're not like The New York Times; we're like the platform that The New York Times is built on. So we see ourselves as a platform in that way. We're the host for this, we're not the publisher, and so the order of responsibility is different. We do have content moderation policies where we protect the platform at the extremes: people inciting violence, all the rest.

Ross Douthat (00:44:21):

Wait, but pause just for a moment on that. What are the extremes?

Hamish McKenzie (00:44:24):

Inciting violence, spam, porn.

Ross Douthat (00:44:28):

Who defines porn? I mean, I know it when I see it.

Hamish McKenzie (00:44:35):

Ultimately, I guess we do.

Ross Douthat (00:44:37):

Okay.

Hamish McKenzie (00:44:40):

You can have erotic literature on Substack. You can't have porn.

Ross Douthat (00:44:45):

You can't. What? All right. Okay. I'm just establishing that you are doing some curation.

Hamish McKenzie (00:44:50):

Yeah, doing some moderation at the extremes. Beyond those, we take a hands-off approach to content moderation. We are strong believers in free speech, defenders of free speech and the free press. And I don't think you can really decouple free speech from the free press: if you believe in the free press, you should stand up for free speech. I think that the Substack contribution, or proposed solution, to this problem that we're dealing with, as we see new information-related problems and societal discord happening because of these models, is not to come up with a more sophisticated content moderation apparatus. I think these companies, Twitter and Facebook, because of their models, are in an inescapably difficult position that no labeling system is actually going to get them out of, because the problem's at the root. The problem is their models.

(00:45:53):

And so our proposed solution is not a sophisticated content moderation apparatus. Our solution is a different model, a different system entirely, and a model that gives people much more agency, much more control and ownership, and much more ability to choose what they want to put into their minds and how to populate their media diets. And so Substack, in its spirit and in its actual model, is a lot like that older world of blogging that probably you and I came up in. It's more a network of blogs than it is like Twitter, which is a massive amplification machine that responds to whatever inputs are put into it and whatever incentives rule the day.

(00:46:38):

And so sometimes Substack is conflated with the likes of Twitter or Facebook on moderation questions. But actually, Substack is a totally different system, much more like home to an island of independent publications that people can actively opt into or opt out of. You invite that publication into your inbox; that's a much more controlled space. It turns the temperature down on discourse. It's less reactive, it's much more of a thoughtful space where people have to defend themselves at length, and they're not just looking for a quick drive-by dunk, and they're not ratcheting up the polarization and the tribalism.

Ross Douthat (00:47:17):

But your theory then is that the technological nature of the platform has a logic of its own, right? Because it seems like from your description... well, to put it bluntly: would the Substack model, let's call it a lighter-touch model than the model that Renée thinks bigger platforms are forced into and Walter thinks that they embrace in a... Do you think the Substack model works as an alternative to that for Twitter, for Facebook? Or is it contingent on you having people who are writing? Do you think? Yeah.

Hamish McKenzie (00:48:00):

I'm not a hundred percent sure what the-

Ross Douthat (00:48:03):

The question is if you.

Hamish McKenzie (00:48:03):

I'm not a hundred percent sure what the-

Ross Douthat (00:48:03):

The question is, if you were put in charge of Twitter or Facebook.

PART 3 OF 4 ENDS [00:48:04]

Hamish McKenzie (00:48:06):

I would not want that job.

Ross Douthat (00:48:08):

No, I know.

Hamish McKenzie (00:48:09):

I think they're-

Ross Douthat (00:48:09):

This is speculation. Does the Substack model, can it work on those platforms?

Hamish McKenzie (00:48:15):

Oh, can it work on? Yeah. Well, yes. If Twitter got rid of all its advertising and went all in on subscriptions, that would be a super interesting platform. Super interesting.

Ross Douthat (00:48:23):

But so it's the advertising then, that is the fundamental... The force that requires deeper moderation.

Hamish McKenzie (00:48:29):

At the root of it, the thing's funded by ads. And so it's important to keep people engaged, important to keep people on the feed. Never leaving the feed, always looking for one more hit. And one effective way to do that, because they had brilliant people working on these problems, is to appeal to people's most base instincts and get them into fights and sort them into tribes. Twitter is often amazing. Facebook is often great. Well, it used to be-

Ross Douthat (00:49:05):

I'm not trying to get you to say bad things about your competition.

Hamish McKenzie (00:49:08):

No. But one of the outcomes is you get this thing that's really engaging and entertaining and fun sometimes, edifying sometimes, but doesn't care about truth, doesn't care about healthy relationships, doesn't care about the state of discourse.

Ross Douthat (00:49:22):

And so Walter, just to follow this nature-of-the-platform question. In your narrative, the new media is this sort of emergent force that challenges old media and promises a different landscape, but in certain ways... you are a champion of the work that Matt Taibbi has done with the Twitter Files on Substack, right? You yourself have co-founded a newspaper that withdraws completely from the digital. From your perspective, is the best media landscape one in which platforms like Twitter and Facebook continue to be dominant, but just have much more fundamental freedom of expression than you feel like was imposed in 2016? Or is there something fundamentally wrong with the way those platforms work where... I'm not trying to get agreement here, but where you and Renée could agree that there is some sort of problem, and that we would be better off as Substackers and readers of County Highway, and maybe some print novels too.

Walter Kirn (00:50:41):

The reason Substack doesn't yet face the interventions that Twitter and Facebook did is that it's not yet powerful enough to do things like sway elections or affect the percentage of vaccine uptake. Should it become that, I promise you that Hamish and his business will be in the crosshairs of the same forces. Facebook and Google, all of these media platforms that basically now have a revolving-door relationship with regulatory government and other NGO-style organizations, have near monopolies on certain things. And they were thus able to do things like affect elections. And so in the mimetic rivalry in which the state has the biggest weapons, they aimed them at those places. Those places surrendered without a shot, as you'll see in the Twitter Files. Elon Musk just said the other day on Joe Rogan: people underestimate the extent to which Twitter was an instrument of the state.

(00:52:01):

That's a near-accurate paraphrase. And it was that. And when he came in and he turned over the Slack conversations and the email records and the moderation, the minutes of the moderation meetings, this became absolutely manifest. Now, the reason I publish a paper that doesn't even touch the internet, that you can't retweet, whose pieces you can't share or like, is because I want to get as far away from the damn thing as possible. Because what happened was the barriers to entry for media participation went way down. It used to cost a lot of money to run the printing presses, pay the newsboys, and do the subscriptions of The New York Times. Time Magazine too, where I used to work. But all those costs have gone very low, and Hamish is able to give individual and group creators the ability to compete almost head to head with those publications, at least in terms of the product.

(00:53:14):

You can read it, it's legible, it has illustrations. You can share it, you can like it, you can comment on it. As those barriers to entry fell, new drug dealers came to the block and the old mafia kingpins didn't control the thing, if you can tolerate my metaphor. And as they did, the old kingpins said, "How do we get control of the neighborhood?" That's really all that happened. It's that simple, and it's happening today. And the question is [inaudible 00:53:48] whether the platforms that people like Hamish have created will be allowed to prosper and reach their natural potential, or whether they will become subject to these Amy Klobuchar speeches on the floor of the Senate about whether or not that content should be allowed into the big AI pot that's going to-

Ross Douthat (00:54:08):

But we've gone from a metaphor of the Rebel Alliance fighting the Empire to fentanyl sweeping the streets.

Walter Kirn (00:54:14):

It's not that big a difference when you think about it. No, no, no. It's really simple. Come on. The New York Times, Time Magazine, all these places-

Ross Douthat (00:54:25):

Well, Time Magazine... I mean not to speak ill of our media competitors.

Walter Kirn (00:54:29):

They used to get on the plane with a presidential candidate, and Ben Bradlee would write his speeches for him. Okay, JFK. I know that for a fact. It's not cozy anymore, and I'm not speaking against the elite.

Ross Douthat (00:54:45):

I mean, this is the one point I want to just... We only have a minute left and I want to get Renée back in. But I just want to push you on this. So in my reading of the media landscape, there has been an incredible shrinkage of major entities. Meaning no disrespect to employees of Time Magazine, but nobody thinks of Time Magazine as an entity that bestrides American politics in any way, shape, or form. And much of the sort of disinformation industry, so to speak, that has sprung up, that you are such a pungent critic of, is startups, and it's affiliated with academia and it's affiliated... I guess it's not clear to me that there even exists, apart from The New York Times, The Washington Post, and The Wall Street Journal, enough of a residual old-media hegemony to say the old media is trying to reassert control. I think it's fairer to say that there's some sort of government and academic panic.

Walter Kirn (00:55:46):

The fact that they are diminished is actually proof of my thesis, because they're attempting to come back through this process. In Europe, they have something called the Digital Services Act, and it's coming here, and it's basically a Big Brother, across-the-board governmental attempt to recognize certain trusted institutional partners in the media space and give them special privileges and special prestige. And to give them this through the process of allowing them to flag (they're called trusted flaggers) what they consider to be disinformation.

(00:56:28):

Now, you will not see these things like the Stanford Internet Observatory going after The New York Times for getting Russiagate wrong. You won't find them going after the CDC for over-promoting the Covid vaccine. You won't see them doing anything to hurt the big people, but you will see them all the time running after RFK Jr., who's probably the biggest of the targets, and every little person in Toledo who repeats something about a nurse collapsing or whatever. And why are they doing this? Ask yourself: who would fund this altruistic policing of the internet for truth? And in whose interest is it? Stanford University's? It just one day decided it's a mess out there, and we're going to change that? If you think that's how the world works-

Ross Douthat (00:57:27):

But wait, who is funding it?

Renée DiResta (00:57:28):

Yeah, who is funding it?

Ross Douthat (00:57:29):

Because I don't think The New York Times is funding it, right? I mean, it might be, but I don't think so.

Renée DiResta (00:57:34):

We're not funded by The New York Times either.

Ross Douthat (00:57:39):

I think you can see your co-panelist as a kindly inquisitor or anything, but you should credit the sincerity of people who think that the Covid vaccine works, think that Robert F. Kennedy Jr. is a conspiracy theorist who's telling people things that will make them unhealthy, and want to do something about that.

Walter Kirn (00:57:56):

But there are no doctors at the Stanford Internet Observatory.

Ross Douthat (00:58:00):

Right? But there are-

Walter Kirn (00:58:02):

So how do they know?

Ross Douthat (00:58:02):

Well, I'm going to go to Renée to finish because I've been pressing Walter and now I'm going to press you for a second. Is there any imaginable world in which the kind of work that your project does, the kind of reports that you do and the kind of outreach that you do to places like Twitter, places like Facebook does not inspire the immediate visceral kind of very American reaction that you're getting from the panelists to your left? I guess. Suppose that, yeah, there's no formal censorship. Twitter, the relationship is not cheek by jowl. It's not secretly funded, anything like that. You just want Twitter to curate its algorithm in the best possible way. How are people ever going to have trust in that kind of process? It just seems to me completely implausible, assuming the absolute best intentions that it doesn't just yield this kind of bristling response.

Renée DiResta (00:59:09):

No, I think that's true. I think a lot of the work that we've done over the years is actually advocating for transparency. We've advocated for transparency where, when a government makes a request, it's published. Where, if a platform is using particular types of content controls on users, like shadow banning, that's disclosed. We've advocated for all of these things for years now. There's never been... again, unfortunately, what you're hearing is a particular point of view shaped in a particular media environment, and it's reflective. Actually, your point about trust is very, very valid. Because it is.

Ross Douthat (00:59:40):

But I'm saying, how could that point of view not arise in any media environment? A world where I just said to you: look, there are these companies that are neither newspapers nor individuals, they play this mediating role, and people who work at prominent American universities are sending them advice on how to label tweets. You're always going to get that reaction. It's not that Walter Kirn is trapped in a Taibbian bubble. This is just an American reaction to that kind of story.

Renée DiResta (01:00:10):

Again, my hope would be that we would have the conversation around the actual facts, as opposed to emails cut and taken out of context and the Cardinal Richelieu model of we're going to go cherry-pick some notes and hang you with six words in an email. I think that what we should be talking about, because the point is very valid, is: how do people want their information to be curated? How do people want their feeds to be assembled? So one of the things that I think is actually the best solution is devolving power back into the hands of users, and we've written about that too. Where the user becomes the entity that controls what they see in the feed. The user decides. This is actually really... the Fediverse is... I think not the Fediverse. Well, not the Fediverse... is going to offer this, where you're going to be able to subscribe to particular moderation frameworks that are created by people who you feel aligned with.

(01:00:58):

Or you're going to have an ability to say, this is my tolerance for certain types of offensive content, my tolerance for certain types of offensive language, my tolerance for pornography. What you're going to start to see-

Ross Douthat (01:01:10):

Hypothetically.

Renée DiResta (01:01:10):

There you go. What you're going to start to see is that devolution. I think that because there is such an absolute crisis of trust, you are never going to have... I think, as terrible as the Chomskyite top-down hegemonic narrative was, everybody kind of trusted the same thing at the same time. And that was the Walter Cronkite model of that period in society. As it's fragmented, though, what you're describing is basically a crisis of trust, where I am not going to be able to convince him that we did or didn't do a thing, because nothing that I say is going to be seen as honest, authentic communication. Because he fundamentally doesn't trust me, and I fundamentally don't trust Matt Taibbi. You're seeing this-

Ross Douthat (01:01:59):

But I guess we have to stop. I think it's clear that you guys are not going to achieve a meeting of the minds on what you did or did not do. I'm just suggesting that your own description of what you are doing, while clearly, by your own description, well-intentioned, does not seem to me able to escape this kind of interpretation. That's all. It's not so much that he's getting an email wrong and so on, fine. But you do want Twitter to put labels on posts, right?

Renée DiResta (01:02:36):

No. The majority of the work around labeling has been to say: can you present both points of view in a particular place? Can you say, this is the content, and here is the other vision of the world? And that seems to be, given the design constraints of a social platform, the best possible way to handle that problem. Is it good? I actually don't really think it is. I'm saying that within the design constraints, in the universe we have, as Hamish notes, in this terrible system, this is the best possible way to do it.

Ross Douthat (01:03:09):

All right. On that note, I've carried us too long. I'm not sure this was my most successful moderation ever, but I do want to thank all of our panelists for giving us a great deal to think about. So thank you so much.

Hamish McKenzie (01:03:21):

Thank you.
