Q&A

Steven Brill Mourns The Death of Truth—And Has a Plan to Revive It

The veteran journalist and NewsGuard cofounder talks to Vanity Fair about his new book, pursuing facts at a time when “everything has become a matter of opinion,” and becoming the target of Russian disinformation and Republican lawmakers.
Steven Brill, co-chief executive officer of NewsGuard Technologies Inc., speaks during a Bloomberg Television interview in 2018. By Christopher Goodney/Bloomberg/Getty Images.

When cofounding NewsGuard in 2018, Steven Brill recalls, “We thought we were right in the middle of this total shitstorm of misinformation.” But that was before the 2020 election, the COVID pandemic, January 6, Russia’s invasion of Ukraine, the Israel-Hamas war, and another presidential race playing out in deeply polarized America. “We were really in the calm before the storm even though we thought it was the storm,” Brill told me in an interview. “And with generative AI, it’s only going to get much worse before it gets better.”

It can feel like grim times for trying to live in a fact-based world, as bogus and conspiratorial claims flow freely across the internet and social media, a subject Brill explores in his new book, The Death of Truth. Brill, a veteran journalist and entrepreneur—he launched The American Lawyer, Brill’s Content, and Court TV—diagnoses the infodemic of our age, while also offering solutions, ranging from reforming Section 230, the law that shields internet companies from liability over content on their platforms, to suing social media companies for violating their own terms of service, to reining in programmatic advertising, in which an algorithm places automated ads based on users’ demographic data. A consequence of such advertising is that major brands may inadvertently help prop up purveyors of misinformation. (One of the services offered by NewsGuard, along with providing reliability ratings for news outlets, addresses programmatic advertising.)

In a riveting Vanity Fair excerpt published this week, Brill recounted how his work at NewsGuard led to him being targeted by John Dougan—an American who, as The New York Times reported last week, spreads disinformation from Russia, where he has been given asylum—and House Republicans, led by Representative Jim Jordan. “We got an email from Chairman Jordan in essence accusing us of exactly the same thing that Dougan had accused us of, which was, we were shills for the deep state,” Brill said. “The irony wasn’t lost on me. It’s a tragic comedy, but it’s not funny.”

“Their priority, basically, is to destroy the notion that there actually is truth in the world, that there actually are facts,” Brill added. “What The Death of Truth is really about is the notion that everything has become a matter of opinion.”

In an interview, edited for length and clarity, Brill discusses his motivation for the book, how AI accelerates the misinformation crisis, and where he sees some hope for the news media.

Vanity Fair: I’d love to start with your motivation for the book. You wrote, “If we can understand how truth has been so eviscerated, we can see how to restore it.” To me, that sort of captured it.

Steven Brill: That’s exactly it. I’ve been living in this world, as you know, and what I started to think about, a year and a half ago, is that all these forces seem to combine in a perfect storm…. The combination of the algorithms of social media and [programmatic advertising], which inadvertently finances all this stuff. Together, they’ve created an ecosystem where nobody believes anything. You go online, and if you’re an average person, you just don’t know what to believe. I wanted to sort of dissect how that happened, explain the ramifications of it, but also explain what we can do about it.

Along the way, I’ve come to realize that it’s an even more serious problem than I thought, and that there are some actors like the Russians, who are much more serious about this and much more advanced about this, and much more on the way to using this as a way to really upend the global order than even I realized.

You’ve been looking critically at our information ecosystem for a long time, founding a media watchdog publication, [Brill’s Content], as well as an organization, NewsGuard. Even since 2018—that’s before the misinformation/disinformation around COVID, “Stop the Steal,” and the Ukraine-Russia war—it feels like things are only accelerating. What has been your experience?

That’s absolutely right. When we started NewsGuard, in 2018, we thought we were right in the middle of this total shitstorm of misinformation, and yet it was before COVID, before “Stop the Steal,” before the vaccine misinformation, before January 6, before the Russian invasion of Ukraine, before the Israel-Hamas war, and before what is now the current election. So we were really in the calm before the storm, even though we thought it was the storm. And with generative AI, it’s only going to get much worse before it gets better.

We have a story about a guy named John Dougan who [is] the guy who threatened me at my home. He discovered generative AI as a force multiplier. This guy, in addition to doing all these phony documentaries from his studio in Moscow, has created 167 phony local news sites throughout the United States and the West that are all aimed at spreading Russian disinformation and just generally making everybody not believe anything—just creating all kinds of chaos. [Editor’s note: NewsGuard’s article states that it “discovered 167 Russian disinformation websites that appear to be part of Dougan’s network of websites masquerading as independent local news publishers in the U.S.”]

You mentioned John Dougan, and that chapter in The Death of Truth was especially compelling because it was a very personal chapter. You write about how, when NewsGuard called out Dougan and these videos that were posted on YouTube, and YouTube took them down, you became a target of Russian disinformation, while at pretty much the same time you became a target of the House Republicans, who were also going after NewsGuard for what they see [as] colluding with the US government to censor. Can you tell me about that?

It wasn’t even pretty much the same time. It was literally the same day. I’m not saying they planned it. But it was literally the same day where my wife and I were suffering through this news. We were just starting to understand about this guy who is calling our home and doing a YouTube video with aerial shots of our house and talking to me about my daughter. That afternoon, just as we were settling down, we got an email from Chairman Jordan, in essence accusing us of exactly the same thing that Dougan had accused us of, which was, we were shills for the deep state. The irony wasn’t lost on me. It’s a tragic comedy, but it’s not funny.

What does it say to you that these two forces, Russian [propagandists] and Republicans in Congress, have priorities that seem to align even though they’re not working together, and that those priorities, at least in targeting you and NewsGuard, came together at the same time?

Their priority, basically, is to destroy the notion that there actually is truth in the world, that there actually are facts. What The Death of Truth is really about is the notion that everything has become a matter of opinion. And what the House Judiciary Committee has tried to do with NewsGuard is the same thing that the Russians have tried to do, which is turn it into a matter of opinion over whether we are independent journalists who, in fact, give a higher quality score or reliability score to [Fox News’s website] than we give to [MSNBC’s website]. But they’re happy to ignore that and say we’re tilting against the right, or we’re organs of the Biden administration. But everything becomes a matter of opinion. There’s no such thing as fact.

This presents quite a challenge in covering this current election. You describe Trump as “the most blatant avatar, and beneficiary, of the modern misinformation and disinformation ecosystem that has become so much a part of the world’s [media] diet.” You can look at a number of different polls and studies where 60% of Republicans may say the 2020 election was stolen, or Biden didn’t win legitimately. So you have a large segment [of people], from Donald Trump to members of Congress, who will support this fiction that the election was somehow stolen from Donald Trump. So how do you cover, in 2024, a political party where this seems to be core to their belief system even if it’s not true?

With great difficulty. One of the things I don’t think you should do is call it lying. I prefer the term “misstatement” or “falsehood,” not “lie.” I think when you say someone’s lying, you’re really claiming to understand their state of mind.

For example, I profile some of the people who were arrested in the Capitol on January 6 and I go through what their media diets were. There’s a guy from Ohio, a totally normal, middle-class graduate of Ohio State who was laid off during the pandemic and he’s home, and, long story short, he basically goes down the misinformation rabbit hole on Facebook and YouTube and Twitter. And he ends up in the Capitol on January 6. He sincerely thought, and sincerely thinks, for whatever reason, that a) the system is not working for him. He’s got some pretty good reasons to think that. And b) he believes Trump when Trump says the election was stolen, because he went to sleep that night and Trump was ahead and then he woke up the next morning and Trump was behind. We know all the reasons for that, counting the absentee ballots later, etc. But he doesn’t believe it. That’s not a crazy person. That’s not even a bad person. It’s someone who was misled. And when you try to parse through the state of mind of all those people on January 6—or even all the people who were supporting the claim that the election was stolen—you really can’t get to exactly what their state of mind is. But what you can say is the election was not stolen and they’re wrong. And one of the real tragedies of the age we live in is, again, we’ve created an [ecosystem] for misinformation where nobody believes anything.

So for example, with generative AI, if the Access Hollywood tape were to break right at the end of this election season, instead of 2016, no one would believe it because Trump would say, well, that’s a deep fake. That’s not me. That’s not my voice…. So we’re living in a world where we’re really unmoored from reality, and it’s up to journalists to help people get those moorings back by really trying to stick to what is fact and what is a matter of opinion.

I understand the guy you’re talking about, who went to the Capitol on January 6, who had been misled and maybe fell down certain rabbit holes and sincerely believed that the election was stolen. But you can identify bad actors who are lying.

There are bad actors. For example, J.D. Vance is obviously a fairly smart guy. He went to my law school, so he must be smart. And yet he was quoted in an interview saying one of the reasons not to send money to Ukraine is that Zelensky is using it to buy yachts and mansions all over the place. He’s way too smart to believe that. I just assume he’s way too smart. He can’t believe that. And yet he says it and people believe him.

You mentioned AI, which is a topic I wanted to get into. This really feels like a huge accelerant to misinformation.

That’s exactly the right word. It is an accelerant.

How do we even tackle this? We’ve seen, as we just talked about, that with the advent of social media, misinformation has flowed, and now with AI, if somebody wants to deceive a large number of people very easily, they seem to be able to do it. So how do you even try to wrap your heads around that as an organization tracking this stuff online?

Well, if you’re a social media company, it seems to me you have the same responsibility that Condé Nast has overseeing Vanity Fair, which is, you ought to be responsible for the stuff that’s on your pages and on your website. Right now, they don’t have legal responsibility, but [they] should have legal responsibility. They should have moral and ethical responsibility.

One of the epiphanies I had doing this book was that I was at a conference of some media executives, talking about online disinformation, and someone who is a good friend of mine mumbled, “Well, you know, what’s YouTube or Facebook supposed to do, people are posting thousands of videos a second. How can they screen all that?” And that’s when I had this epiphany. I said, “Well, who died and said they had to post a thousand videos a second? Why do they have to do that?” We were sitting in a conference room, and on the wall of this auditorium there’s a note from the department of buildings which says: Occupancy by more than 280 people is dangerous and unlawful. So you can’t fill that building with more than 280 people, and that doesn’t take away your First Amendment right to free assembly. It’s a safety ordinance. Why is it that the platforms are able to say, “We can’t spend more money to screen things. We have to let as many people in as want to come in because that’s how we maximize our ad revenue, by getting as many eyeballs as possible.”

One of the things I suggest can be done right now, without any legislation. If you go online and you look at the terms of service for Facebook or X or YouTube, they all have terms of service, and the terms of service say: we will not tolerate dangerous misinformation or disinformation. We will not tolerate bullying or hate speech. They say that in the terms of service. The terms of service are a contract. That is a contract between you and them. Right now, without any legislation, the FTC could sue those platforms for violating their terms of service.

I noticed in the book you mentioned how NewsGuard had almost come to an agreement with Twitter before the Elon Musk deal happened. Now, obviously, Elon Musk is somebody who has promoted bogus information on a mass scale, the Paul Pelosi incident being just one example, shared with his 100 million-plus followers. Is there any hope you see with Twitter/X now in terms of trying to vet information, or is it kind of lost now?

The one thing you can say for Musk is he’s much more candid about not caring about that than the other guys. So I don’t see much hope in that. I’m glad you mentioned the Paul Pelosi story because, again, I want to come back to the other technology that is responsible for the death of truth and that’s programmatic advertising.

The night that Paul Pelosi was attacked, there was a story in the Santa Monica Observer, [which] NewsGuard had identified years before as a phony hoax site posing as a local news site in Santa Monica. We identified them that way because they had run a story, among others, that Hillary Clinton had died [and] it was a body double who had shown up for the debates with Trump. So this is a notoriously awful site.

They ran a story that night that said Paul Pelosi had been in an encounter with a gay prostitute and Elon Musk retweeted it…. It got zillions of views, which went back to the Santa Monica Observer website. That’s why you use a social media platform if you’re a publication: so people come back and look at your website. That rang the cash register for the Santa Monica Observer…. That night, and in the days thereafter, you could have looked at that article about Paul Pelosi and the gay prostitute, a totally fabricated article, and seen ads for Hertz, for [all] varieties of big blue-chip consumer brands. That’s because programmatic advertising does not distinguish between anything having to do with the content of the site. They’re just following a demographic.

A couple years back, the biggest single advertiser on Sputnik, the Russian propaganda news site, was Warren Buffett. Warren Buffett doesn’t wake up every morning and say, “How can I sit and give money to Vladimir Putin?” But he [is CEO of Berkshire Hathaway, a publicly traded holding company that] owns Geico, a big programmatic advertiser, and their agency is sending ads to Sputnik.

We have a product that tries to do something about that. So everything I’ve told you is a self-serving speech for that product. But the fact is, you do need to put a filter in front of this massive auction process, because if you don’t, it’s not like the days of Mad Men, where people sit around and have a three-martini lunch and decide, should I advertise in Vanity Fair or Time? Should I advertise on NBC or CBS? [Eighty] percent of all the advertising in the world is done through this automated auction process.

After researching and writing this book, as well as the work you’re doing at NewsGuard, are you hopeful at all that there could be a media-information ecosystem that is not just awash in bogus information? Because it can be hard to feel that way sometimes.

I actually am, because I think there are some things, as I outline in the book, that can be done about it. I think we got past the world where online everything is free. Quality publications like yours charge for their content, and that’s really good news because when publications, whether it’s The New York Times or The New Yorker or Vanity Fair or Wired, you name it, charge for their content, suddenly that means you and I are actually producing the revenue by virtue of what we write. We’re not just a cost center that runs around the ads, which used to be the revenue. If your revenue’s based on people actually reading and buying, paying something for the content, that creates incentives for journalists. I teach a journalism seminar at Yale that I’ve done for years, and I find there’s more demand lately for quality journalists than there had been in the past, precisely because so many publications—in fact, most of them—now depend on reader revenue as much if not more than they depend on advertising. So don’t let your readers be resentful that you charge a subscription.