Mark Zuckerberg's Big Blind Spot And The Conflict Within Facebook

Facebook CEO Mark Zuckerberg has said connecting the world means bringing people together. But increasingly the platform is being used by some very powerful elements to sow divisions. (Justin Sullivan / Getty Images)

In July 2016, the aftermath of a police shooting of an African-American man was broadcast live on Facebook. Almost instantly, Americans of all stripes used the platform to inflame racial tensions and attack one another. Across the world, in India this past summer, a cartoon of the Prophet Muhammad posted on Facebook sparked mob violence in which one person died.

And now, U.S. lawmakers are looking into Facebook's culpability after evidence that Russia-linked operatives placed ads on its platform during the 2016 presidential election in an effort to disrupt democracy.

In whatever corner of the world Facebook is operating, it has become clear that people are using this powerful platform as a communications tool in ways that Mark Zuckerberg never envisioned. He started the company as a young Harvard undergrad 13 years ago to connect students. It has expanded exponentially since then under his supremely techno-utopian vision of connecting the world.

For Zuckerberg, connecting the world means bringing people together. But increasingly the platform is being used by some very powerful elements to do the exact opposite: sow divisions. That has left Facebook struggling almost every week to offer explanations for misleading and divisive news on its platform.

It's a huge blind spot for Zuckerberg, by his own acknowledgement. Indeed, he is fast coming to terms with the power of his platform to cause harm. Last November, Zuckerberg dismissed as "a pretty crazy idea" the notion that fake news influenced the U.S. presidential election.

But in the face of evidence, he shifted his stance. In September this year, he put up a post on his personal page saying he regretted calling it crazy: "This is too important an issue to be dismissive." He followed that up with another post promising to do better: "For the ways my work was used to divide people rather than bring us together, I ask forgiveness and I will work to do better." According to interviews with current and former employees and other technology leaders close to the company, the 33-year-old CEO is now struggling to course-correct.

"When it was Harvard or just the Ivies, they had norms that worked," says Tarleton Gillespie, a communications expert at Microsoft Research who is writing a book about harmful speech online titled Custodians of the Internet. But once everyone from moms breastfeeding their babies to white supremacists came in, he says, the system broke down. "Facebook was less prepared to make decisions about acceptable speech," he says.

Couple that lack of preparedness with the desire to ramp up a company at unimaginable speed. Facebook has grown in the span of just 13 years from a handful of college students to more than 2 billion monthly users, with advertising revenue close to $27 billion last year alone. Along the way, it has hooked legacy media too. Many news organizations depend on Facebook to drive traffic, and a number of other outlets, including NPR, have paid partnerships with Facebook.

Call it ambition or naïveté. The company's early mantra, "move fast and break things," proved to be a winning approach to software development. Not so much to human relationships. It was a classic Silicon Valley case of product before purpose.

Facebook leaders now say that, perhaps by moving too quickly, they failed to anticipate all the ways in which people could abuse their platform. For a group of engineers focused on data and algorithms, it's like the modern version of Frankenstein: they've created a monster whose powers they're just beginning to understand.

To rein in the problem, the company will need to shift the focus of its metrics. People who've worked at Facebook say that from Day 1, Zuckerberg has been fixated on measuring engagement: how much you like, click and share, and how long you watch a video. In product meetings, current and former employees say, any suggestion to tweak News Feed — Facebook's signature product — must include a deep analysis of how that would increase or decrease engagement. This dogged focus on metrics is also apparent from the company's own blog posts and research.

But it's the negative engagement — Russian operatives, trolls, hate groups — that the CEO and his leadership team have been far slower to register. Now the company is scrambling to figure out how to monitor and quantify the bad. It's hiring more than 4,000 new employees to weed out fake accounts and violators.

One problem: A key data source the company has relied on — user reports of bad content — is chronically unreliable, according to people at Facebook who've had to analyze it. Users flag a lot of things that they shouldn't. For instance, some report a post as fake news simply because they disagree with it; others flag a business page as offensive when it's really just a competitor they want to take down. Facebook's leaders have publicly said they stand behind this crowdsourcing approach. But inside the company, people know the system just doesn't work.

One Facebook employee tells NPR that engineers in-house recently began random sampling of News Feed — proactively looking for violations rather than just waiting for users to flag content. "We're trying," the employee says. "But you can't look for new bad behavior. It's a hopeless exercise." Meaning, you can only find violations that you know to look for. Prior to 2016, political interference in elections wasn't even a category on the radar of the engineers leading these efforts.

There's political pressure to clean up. This week the House and Senate are holding three hearings on election interference by the Russians, calling on Facebook, Twitter and Google representatives to testify.

There's also a notable shift in the zeitgeist of Silicon Valley. In a recent interview with CNN, Twitter co-founder Evan Williams said "Facebook is filled with crap ads," and that all the ad-driven platforms (Twitter included) are benefiting from fake information, "generating attention at pretty much any cost. That's the most broken thing about the information economy we live in."

Inside Facebook, there's a deep conflict. On the one hand, employees who've spoken with NPR feel terrible that Facebook is a space for Russian operatives, hate speech and extremists. Engineers and product managers are racing to fix it, working overtime, reading history books on Russian propaganda techniques — just trying to wrap their minds around how the past is repeating itself through new technologies.

At the same time, they're indignant, believing they've created a beautiful, seamless technology that does more good than harm, and that Facebook engineers aren't responsible for human nature.

What's clear is that Facebook is trying to understand its own platform's ability to cause harm. What's unclear is: Can Facebook stop it?

Copyright 2021 NPR. To see more, visit https://www.npr.org.

Aarti Shahani is a correspondent for NPR. Based in Silicon Valley, she covers the biggest companies on earth. She is also an author. Her first book, Here We Are: American Dreams, American Nightmares (out Oct. 1, 2019), is about the extreme ups and downs her family encountered as immigrants in the U.S. Before journalism, Shahani was a community organizer in her native New York City, helping prisoners and families facing deportation. Even if it looks like she keeps changing careers, she's always doing the same thing: telling stories that matter.