ARI SHAPIRO, HOST:
Some seemingly unrelated events are all tied to a crisis happening in the U.S. right now - the attack on the U.S. Capitol, people unwilling to get the coronavirus vaccine, legislation aimed at fixing a voting fraud problem that doesn't actually exist in American elections. This crisis is disinformation. And joining us now is Miles Parks, who covers disinformation for NPR.
MILES PARKS, BYLINE: Hey, Ari.
SHAPIRO: Has there actually been more disinformation in the last couple years?
PARKS: There has been. I mean, the quantity is way up. But I think what's really interesting, too, is just how much the landscape has kind of changed. You know, the biggest sources of disinformation in the U.S. right now about politics and about public health are domestic sources. It's coming from other Americans online. I talked to Darren Linvill. He's a Clemson University professor who studies Russian disinformation networks, specifically the so-called Internet Research Agency, or the IRA. These are the guys who played a big role in the 2016 election, but not so much in 2020.
DARREN LINVILL: I'm not even seeing IRA messaging much in English to the same extent that I've seen in the past because they don't need it. I mean, like, the GOP has taken the ball from them and run with it.
PARKS: You know, typically, it's American adversaries who want to spread disinformation about democracy and about elections in the U.S. not being trustworthy, but that narrative is mostly coming from inside the U.S. now.
SHAPIRO: Explain why. I mean, what's made the disinformation environment so much worse recently?
PARKS: The biggest reason is the pandemic. You know, it's hard to overstate just how much more time people are spending online. People are more online than ever before. A tracking company I talked to, Activate Consulting, said it was the biggest year-over-year jump in Internet and media use they'd ever seen.
And what that means in really simple terms is people are getting less information from person-to-person interactions, organic interactions, and they're getting more information kind of filtered through algorithms, through these social media platforms. And what we know about that is these algorithms are known to make people more polarized, and they're known to make people more prone to believing conspiracy theories.
SHAPIRO: But, Miles, it feels like the elephant in the room over the last year was Donald Trump, who spread misinformation and gave it a megaphone and a platform. I mean, now that he has lost his Twitter account and his Facebook account after the January 6 attack on the Capitol, what effect has that had?
PARKS: Yeah, that's definitely helped. The - Trump's account was sort of emblematic of everything that's been wrong with social media platforms thus far. Even as he was posting more and more misinformation last year, that didn't lead to a drop-off in followers. His Twitter audience actually grew by 30% over 2020. So his ban led to an immediate drop-off in misinformation online, according to a number of tracking firms.
But what experts say is that bans like this are short-term fixes and not long-term solutions. Basically, the underlying system that Trump used - posting things that were false, posting things that make people either angry or upset to make them engage with the content - all that is basically still happening. The system's still in place, just waiting to be used by other people.
Here's Joan Donovan, a disinformation researcher I talked to from Harvard University.
JOAN DONOVAN: We are in serious trouble. Disinformation has become an industry, which means the financial incentives and the political gains are now aligned, and we will see more of this.
SHAPIRO: See more of this, unless we do what? What's the path out of this?
PARKS: I don't want to be too bleak, Ari, but we're probably talking about something that's a generation or two from being resolved, according to people I talked to over the last couple of weeks. Whitney Phillips, a Syracuse researcher who just released a book on all this stuff, told me it might take 50 years to resolve this. Basically, she says a redesign of the entire Internet is the only solution. Information just spreads too easily right now, even if it's untrue or hateful. That has her less focused on moderation policies and more on actually educating young people because they're going to be the ones who have to fix this mess.
SHAPIRO: NPR's Miles Parks. Thank you.
PARKS: Thank you, Ari.

Transcript provided by NPR, Copyright NPR.