Shannon Bond

Shannon Bond is a business correspondent at NPR, covering technology and how Silicon Valley's biggest companies are transforming how we live, work and communicate.

Bond joined NPR in September 2019. She previously spent 11 years as a reporter and editor at the Financial Times in New York and San Francisco. At the FT, she covered subjects ranging from the media, beverage and tobacco industries to the Occupy Wall Street protests, student debt, New York City politics and emerging markets. She also co-hosted the FT's award-winning podcast, Alphachat, about business and economics.

Bond has a master's degree in journalism from Northwestern University's Medill School and a bachelor's degree in psychology and religion from Columbia University. She grew up in Washington, D.C., but is enjoying life as a transplant to the West Coast.

Facebook has almost 2 billion daily users, annual revenue that rivals some countries' gross domestic product, and even its own version of a Supreme Court: the Oversight Board, which the company created to review its toughest decisions on what people can post on its platforms.

This week, the board faced its biggest test to date when it ruled on whether Facebook should let former President Donald Trump back on its social network.

Updated May 5, 2021 at 11:36 AM ET

Facebook was justified in its decision to suspend then-President Donald Trump after the Jan. 6 insurrection at the U.S. Capitol, the company's Oversight Board said on Wednesday.

Copyright 2021 NPR. To see more, visit https://www.npr.org.

Last year, in the middle of the pandemic, Sinead Boucher offered $1 to buy Stuff, New Zealand's largest news publisher.

Boucher was already the company's chief executive and was worried that its Australian media owner would shut down the publisher. Things had started to look really grim: The economy had ground to a halt and advertising revenue had evaporated.

"I knew that they ... would potentially just decide to wind us up," said Boucher. "So it was just a punt."

Facebook is making changes to give users more choice over what posts they see in their news feeds, as the social media company defends itself from accusations that it fuels extremism and political polarization.

The changes, announced Wednesday, include making it easier for people to switch their feeds to a "Most Recent" mode, where the newest posts appear first, and allowing users to pick up to 30 friends or pages to prioritize. Users can now limit who can comment on their posts.

Tech workers say they have experienced more harassment based on gender, age and race or ethnicity while working remotely during the pandemic, according to a survey from a nonprofit group that advocates for diversity in Silicon Valley.

The increases were highest among women, transgender and nonbinary people, and Asian, Black, Latinx and Indigenous people.

Big Tech taking questions from Congress is becoming a quarterly event.

The latest edition came Thursday, when Facebook's Mark Zuckerberg, Twitter's Jack Dorsey, and Google's Sundar Pichai appeared virtually before the House Energy and Commerce Committee.

The hearing centered on misinformation. It was the first time the executives took questions from lawmakers since the Jan. 6 riot at the U.S. Capitol by a pro-Trump mob and since the widespread rollout of COVID-19 vaccines began.

Support for the siege on the U.S. Capitol. Bogus promises of COVID-19 cures. Baseless rumors about vaccines.

Who should be held accountable for the spread of extremism and hoaxes online?

Lina Khan, a prominent antitrust scholar who advocates for stricter regulation of Big Tech, may be about to become one of the industry's newest watchdogs.

President Biden on Monday nominated Khan to the Federal Trade Commission, an agency tasked with enforcing competition laws. She is the splashiest addition to Biden's growing roster of Big Tech critics, including fellow Columbia Law School professor Tim Wu, who announced earlier this month he would join the National Economic Council.

Facebook is failing to enforce its own rules against falsehoods about COVID-19, vaccines, election fraud and conspiracy theories when it comes to posts in Spanish, according to a coalition of advocacy groups.

"There is a gap, quite an enormous gap, in fact, in English and Spanish-language content moderation," Jessica González, co-CEO of the advocacy group Free Press, told NPR.

Instagram recommended false claims about COVID-19, vaccines and the 2020 U.S. election to people who appeared interested in related topics, according to a new report from a group that tracks online misinformation.

"The Instagram algorithm is driving people further and further into their own realities, but also splitting those realities apart so that some people are getting no misinformation whatsoever and some people are being driven more and more misinformation," said Imran Ahmed, CEO of the Center for Countering Digital Hate, which conducted the study.

There's a saying in Silicon Valley: Solve your own problems. Tracy Chou didn't have to look further than her social media feeds to see those problems.

"I've experienced a pretty wide range of harassment," she said. "Everything from the casual mansplaining-reply guys to really targeted, persistent harassment and stalking and explicit threats that have led me to have to go to the police and file reports."

On Feb. 1, the editor of an award-winning Indian magazine got a call from his social media manager: The magazine's Twitter account was down.

"I said, 'Are you sure? Can you just refresh, and check again?' " recalled Vinod K. Jose, executive editor of The Caravan, which covers politics and culture. "But she said, 'No, no, it's real.' "

AILSA CHANG, HOST:

All right. Well, for more on this dilemma facing Twitter in India, we're going to turn now to NPR tech correspondent Shannon Bond.

Hey, Shannon.

SHANNON BOND, BYLINE: Hey, Ailsa.

Twitter users aren't known for staying quiet when they see something that's flat-out wrong, or with which they disagree. So why not harness that energy to tackle one of the most vexing problems on social media: misinformation?

With a new pilot program called Birdwatch, Twitter is hoping to crowdsource the fact-checking process, eventually expanding it to all 192 million daily users.

"I think ultimately over time, [misleading information] is a problem best solved by the people using Twitter itself," CEO Jack Dorsey said on a quarterly investor call on Tuesday.

Facebook is expanding its ban on vaccine misinformation and highlighting official information about how and where to get COVID-19 vaccines as governments race to get more people vaccinated.

"Health officials and health authorities are in the early stages of trying to vaccinate the world against COVID-19, and experts agree that rolling this out successfully is going to be helping build confidence in vaccines," said Kang-Xing Jin, Facebook's head of health.

January brought a one-two punch that should have knocked out the fantastical, false QAnon conspiracy theory.

After the Jan. 6 attack on the U.S. Capitol, the social media platforms that had long allowed the falsehoods to spread like wildfire — namely Twitter, Facebook and YouTube — got more aggressive in cracking down on accounts promoting QAnon.

Updated at 3:16 p.m. ET

Facebook's oversight board on Thursday directed the company to restore several posts that the social network had removed for breaking its rules on hate speech, harmful misinformation and other matters.

The decisions are the first rulings for the board, which Facebook created last year as a kind of supreme court, casting the final votes on the hardest calls the company makes about what it does and does not allow users to post.

The alternative social network MeWe had 12 million users at the end of 2020. Barely three weeks into 2021 — and two since a right-wing mob attacked the U.S. Capitol — the company says it's now passed 16 million.

CEO Mark Weinstein says this popularity is a testament to the reason he launched MeWe in 2016 as an alternative to Facebook. MeWe markets itself as privacy forward. It doesn't harness users' data to sell ads or decide what content to show them.

Two weeks ago, Facebook indefinitely suspended former President Donald Trump from its social network and Instagram, after a mob of his supporters stormed the U.S. Capitol. CEO Mark Zuckerberg said the risks of allowing Trump to keep using the social network were "too great."

Now, Facebook wants its newly formed independent oversight board to weigh in and decide whether it should reinstate Trump.

Updated at 3:05 p.m. ET

Willy Solis never saw himself as an activist.

"I'm an introvert, extreme introvert," he said. "That's my nature."

But 2020 changed that — like so many other things.
