5 Takeaways From Big Tech's Misinformation Hearing

Mark Zuckerberg, chief executive officer of Facebook Inc., speaks virtually during a House Energy and Commerce subcommittees hearing. (Daniel Acker / Bloomberg via Getty Images)

Big Tech taking questions from Congress is becoming a quarterly event.

The latest edition came Thursday, when Facebook's Mark Zuckerberg, Twitter's Jack Dorsey, and Google's Sundar Pichai appeared virtually before the House Energy and Commerce Committee.

The hearing centered on misinformation. It was the first time the executives took questions from lawmakers since the riot at the U.S. Capitol by pro-Trump supporters on Jan. 6 and since the widespread rollout of the COVID-19 vaccine began.

Here are five takeaways from the virtual hearing (which featured surprisingly few technical issues, other than the usual confusion over finding the mute button):

1) Everyone is mad

"It sounds like everybody on both sides of the aisle is not happy," said Fred Upton, R-Mich.

Indeed, Thursday's hearing laid bare just how frustrated lawmakers are with the social media platforms right now.

And Americans mostly agree: The Pew Research Center recently found that about two-thirds of Americans believe social media is having a negative effect on the country. Just 10% say the platforms are having a positive effect.

On Thursday, however, lawmakers on both sides of the aisle were far from agreeing on what the biggest issues were.

Rep. Mike Doyle, D-Pa., opened the hearing as chair of the House Subcommittee on Communications and Technology by asking all three executives whether they bore responsibility for the attack on the Capitol on Jan. 6.

Nearly half of rioters charged in federal court allegedly used social media to post photos, livestreams and other evidence of their involvement, according to a review of charging documents by George Washington University's Program on Extremism.

And experts say the social media companies allowed narratives falsely questioning the election's legitimacy to fester online for months.

But Twitter's Dorsey was the only executive who said yes — with the caveat that tech companies are not solely to blame.

"I think the responsibility lies with the people who took the action to break the law and do the insurrection," Facebook's Zuckerberg said. "And secondarily with the people who spread that content, including [President Trump]."

Almost entirely absent from the hearing was the far-right conspiracy theory QAnon, which flourished online until social media companies cracked down on accounts tied to the conspiracy after the Capitol riot.

Republicans steered almost completely clear of talking about election disinformation or the Jan. 6 attack at all, choosing mostly to focus on an entirely different line of criticism against the tech industry.

2) The new Republican concern: children

In previous hearings, Republican House members have focused on what they claim is bias by the tech platforms against conservative voices — even though research has actually shown that conservative content makes up a disproportionate amount of the most popular content online.

That line of criticism did make some appearances on Thursday. Rep. Steve Scalise, R-La., questioned Dorsey about moderation decisions made about articles related to Hunter Biden leading up to the 2020 election. But overall, claims of anti-conservative bias took a backseat to GOP concerns about the effect social media is having on America's youth.

"Big Tech is essentially giving our kids a lit cigarette and hoping they stay addicted for life," said Bill Johnson, R-Ohio.

Reports that Facebook-owned Instagram is working on a version of the app for kids under 13 years old drew particular attention on Thursday.

Zuckerberg confirmed the plan and painted it as an effort to offer a version of the platform that includes parental controls, since many children already use Instagram by lying about their age.

But lawmakers were wary of the idea, along with Google's YouTube Kids product, because of the potential downsides that come with kids spending hours every week looking at screens.

Rep. Cathy McMorris Rodgers, R-Wash., noted research linking social media to depression among teenagers.

"I do not want their self-worth defined by your engagement tools," Rodgers said. "What will it take for your business model to stop harming children?"

Zuckerberg defended the platforms, saying he did not think the research about the impact of social media on mental health was "conclusive" at this point.

While the criticisms mostly came from Republicans, a number of Democrats raised them too, suggesting a rare point of bipartisan agreement that could be key when it comes to regulation down the road.

3) The companies didn't give much ground

Whether it came to teenage depression, political polarization or vaccine misinformation, the CEOs were reluctant to admit fault. Instead, they highlighted their policies, how much they spend on monitoring their platforms, and how much rule-breaking content they've removed.

Lawmakers' frustration boiled over several times, as members repeatedly tried — and failed — to get the CEOs to answer "yes or no" questions.

"I think it's irritating all of us," said Rep. Anna Eshoo, D-Calif. "No one seems to know the word 'yes' or the word 'no'. Which one is it? If you don't want to answer, just say 'I don't want to answer.'"

Dorsey even seemed to reference the lawmakers' insistence on such questions with a mid-hearing tweet, which featured a poll with a question mark and two answers: "yes" and "no."

While lawmakers pressed the companies about how their advertising-based business models and automated recommendation systems may be amplifying misleading content, the executives rejected the idea that they benefit from harmful posts.

"If we woke up tomorrow and decided to stop moderating content, we'd end up with a service very few people or advertisers would want to use," Dorsey said. "Ultimately, we're running a business, and a business wants to grow the number of customers it serves."

Zuckerberg turned to a defense Facebook has often used: that the content on its platform reflects problems in broader society.

"The reality is our country is deeply divided right now, and that isn't something that tech companies alone can fix," he said.

However, one notable exception came when Democrat Jan Schakowsky of Illinois asked Zuckerberg about comments his top deputy, Sheryl Sandberg, made shortly after Jan. 6, in which she said the events at the Capitol were "largely organized" on other platforms.

Zuckerberg seemed to back off that assertion, saying: "Certainly, there was content on our services, and from that perspective, I think that there's further work that we need to do to make our services and moderation more effective."

4) Regulation is coming... someday

Congress is clearly closer than ever before to passing some sort of legislation aimed at regulating internet platforms.

The lawmakers talked about it as an impending certainty, and mentioned the plethora of bills circulating that would attempt to tackle some of the problems.

Much of the conversation focused on the companies' business models, which often reward inflammatory or misleading content. An NPR analysis released Thursday, for instance, found that articles falsely connecting vaccines and death have been among the most highly engaged-with content on the platforms this year.

"It is now painfully clear that neither the market nor social pressure will force these companies to take the aggressive action they need to take to eliminate this information and extremism from their platforms," said Rep. Frank Pallone, D-N.J. "Your business model itself has become the problem, and the time for self-regulation is over."

The tech leaders have resigned themselves to the fact that federal mandates are coming. Facebook has even been running a large ad campaign calling for reforms.

Now the platforms are trying to make sure those mandates don't interfere too much with the way they do business.

One thing they warned about on Thursday was mandated content moderation.

Dorsey warned that only the richest, largest platforms may be able to employ staffs large enough to meet standards if they are imposed, which could hurt competition.

Pichai said he has seen "good proposals around transparency and accountability" in some of the proposed reforms, but stopped short of endorsing specific changes.

And Zuckerberg said big platforms should be more transparent about how they deal with content that breaks the law, but that private companies shouldn't be "arbiters of truth."

"I don't think anybody wants a world where people can only say things that a private company judges to be true," the Facebook CEO said.

5) Some weird moments too

If you were wondering about the vaccine status of one of the world's richest men, then this was the hearing for you. Turns out, $100 billion in net worth doesn't mean you get to cut lines when it comes to getting a shot.

Zuckerberg and the other two witnesses were clearly confused when Rep. Billy Long, R-Mo., asked them whether they had been vaccinated yet. Zuckerberg and Dorsey said they had not yet been. Pichai said he got a shot last week.

Long followed up by asking Pichai whether he received a two-dose vaccine or the one-dose Johnson & Johnson vaccine, and Pichai said he will need to go back for a second dose.

Hearing viewers were also enamored with an odd clock-looking device in Dorsey's kitchen.

Jack Dorsey, chief executive officer of Twitter Inc., speaks with a non-clock clock next to his right shoulder. (Daniel Acker / Bloomberg via Getty Images)

As Gizmodo reports, the device actually displays the prices of a variety of cryptocurrencies. Technology!

Editor's note: Facebook and Google are among NPR's financial supporters.

Copyright 2021 NPR. To see more, visit https://www.npr.org.

Miles Parks is a reporter on NPR's Washington Desk. He covers voting and elections, and also reports on breaking news.
Shannon Bond is a business correspondent at NPR, covering technology and how Silicon Valley's biggest companies are transforming how we live, work and communicate.