The 'Ugly Truth' About Facebook: How The Company's Policies Cause Its Biggest Problems
In “An Ugly Truth: Inside Facebook’s Battle for Domination,” New York Times reporters Sheera Frenkel and Cecilia Kang chronicle a series of scandals between 2016 and 2021 over how Facebook polices its platform.
The new book explores the inner workings of the company and its top executives in depth. The word “ugly” in the title comes from a memo written by one of Facebook’s own vice presidents, and Frenkel and Kang’s reporting highlights that many of the platform’s perceived flaws are deliberate design choices.
“So many of Facebook’s problems are built into the way that they do business,” Frenkel says. “The very business model that they’re premised on … is to keep you online.”
The trillion-dollar company needs to keep users engaged in order to gather data and sell ads. The book chronicles the long-term impacts of this successful business model.
But for a long time, before the company became profitable, Facebook focused on growth as its primary goal. Co-founder Mark Zuckerberg brought on Sheryl Sandberg as chief operating officer to help scale up.
“An Ugly Truth” takes a close look into this “powerful business partnership” that created a company with 3 billion global users — “unprecedented and historical in its reach,” Kang says.
One striking pattern the book reveals is that corporate leadership consistently comes across as caught off guard or flat-footed when something significant happens. For example, leadership at Facebook didn’t seem to have a plan to address then-presidential candidate Donald Trump using the algorithms of the news feed to dominate attention on the platform.
Facebook’s top leaders didn’t heed warning signs about Trump raised by other executives within the company, Kang says.
“Donald Trump only surfaced a lot of the core problems that had been baked into the technology and the business model,” she says.
And the problem goes far beyond Trump and the United States, Frenkel says.
“There’s only going to continue to be other leaders who take advantage of Facebook’s ability to make things go viral,” Frenkel says. “And I think until we really contend with that as a society, as a world, we’re not going to understand how social media has changed all of our lives.”
Another flaw that Frenkel and Kang describe in the book is the open access that all Facebook engineers — more than 16,000 at one point — had to the back-end profiles of users. The policy gave engineers more information about users than would be publicly available on their profiles.
The policy was intended to help engineers innovate quickly from Facebook’s earliest days, but some employees abused that power: men often used the access to digitally stalk women they were interested in.
Frenkel says 52 people were fired for abusing their access to users’ data, which included real-time, intimate details such as the restaurant a person was dining in, as well as predictions about their future behavior.
Zuckerberg was “appalled” when he learned how many employees abused their access, but if given the chance to interview him, Frenkel says she would ask why he didn’t figure it out sooner.
“Men were getting fired all the time for doing this,” she says. “Couldn’t you put that pattern together yourself and see that this was a problem, given that it was Mark Zuckerberg that instituted this policy of transparency and giving these engineers access to everything?”
Zuckerberg is known for his belief that good speech will prevail on his platform. For example, he took a libertarian approach to Holocaust deniers by allowing users to continue posting such content. He seemed surprised several years later when data showed Holocaust denial gaining traction, particularly among millennials.
“[Zuckerberg] finally kind of gets it, that it’s not that the best speech is winning out or even the truest speech is winning out,” Frenkel says, “but that the platform — the machine he made — is actually making the most emotive and sometimes the worst and conspiratorial speech win out.”
Zuckerberg’s ideas around a libertarian approach to free speech were embedded in him from the early days of Facebook, Frenkel says. The people close to him that the reporters spoke to for the book say those ideas are genuine — but Frenkel says this ideology is also “very good for business.”
Facebook could look different in the future, Kang says, but the 17-year-old company isn’t going anywhere. New technologies may come along, but companies will still see business in people’s desire to connect with others around the world.
Now, many companies are copying Facebook because of the success of the behavioral advertising business model, she says.
“Facebook made $85 billion last year. It’s valued at $1 trillion,” Kang says. “There’s no blaming other companies for wanting to emulate the company’s business model and success.”
The authors want people to read “An Ugly Truth” because the problems it outlines aren’t going away and need solutions, Frenkel says.
“Social media is here,” she says. “It’s going to be around for the rest of our lives.”
Editor’s note: Facebook is a financial supporter of NPR.
Chris Bentley produced and edited this interview for broadcast with Todd Mundt. Allison Hagan adapted it for the web.
Book Excerpt: ‘An Ugly Truth’
By Sheera Frenkel and Cecilia Kang
For nearly a decade, Facebook held an informal, company-wide meeting at the end of each week, known as “Questions and Answers,” or Q&A. Its format was simple, and fairly standard in the industry: Zuckerberg would speak for a short time and then answer questions that had been voted on by employees from among those they’d submitted in the days ahead of the meeting. Once the questions that had received the most votes had been addressed, Zuckerberg would take unfiltered questions from the audience. It was more relaxed than Facebook’s quarterly, company-wide meeting known as the “all-hands,” which had a more rigid agenda and featured programs and presentations.
A couple hundred employees attended the meeting in Menlo Park, and thousands more watched a livestream of the meeting from Facebook’s offices around the world. In the lead-up to the Q&A following Trump’s Muslim ban speech, employees had been complaining in their internal Facebook groups—known as “Tribes”—that the platform should have removed Trump’s speech from the site. In the broader forums where more professional discussions took place—known as “Workplace groups”—people asked for a history of how Facebook had treated government officials on the site. They were angry that Facebook’s leaders hadn’t taken a stand against what they viewed as clearly hate speech.
An employee stepped up to a microphone stand, and people grew quiet. Do you feel an obligation to take down the Trump campaign video calling for the ban on Muslims? he asked. The targeting of Muslims, the employee said, appeared to violate Facebook’s rule against hate speech.
Zuckerberg was used to fielding hard questions at Q&As. He had been confronted about ill-conceived business deals, the lack of diversity in company staff, and his plans to conquer competition. But the employee in front of him posed a question on which his own top ranks could not find agreement. Zuckerberg fell back on one of his core talking points. It was a hard issue, he said. But he was a staunch believer in free expression. Removing the post would be too drastic.
It was a core libertarian refrain Zuckerberg would return to again and again: the all-important protection of free speech as laid out in the First Amendment of the Bill of Rights. His interpretation was that speech should be unimpeded; Facebook would host a cacophony of sparring voices and ideas to help educate and inform its users. But the protection of speech adopted in 1791 had been designed specifically to promote a healthy democracy by ensuring a plurality of ideas without government restraint. The First Amendment was meant to protect society. And ad targeting that prioritized clicks and salacious content and data mining of users was antithetical to the ideals of a healthy society. The dangers present in Facebook’s algorithms were “being co-opted and twisted by politicians and pundits howling about censorship and miscasting content moderation as the demise of free speech online,” in the words of Renée DiResta, a disinformation researcher at Stanford’s Internet Observatory. “There is no right to algorithmic amplification. In fact, that’s the very problem that needs fixing.”
It was a complicated issue, but to some, at least, the solution was simple. In a blog post on the Workplace group open to all employees, Monika Bickert explained that Trump’s post wouldn’t be removed. People, she said, could judge the words for themselves.
Excerpted from “An Ugly Truth: Inside Facebook’s Battle for Domination” by Sheera Frenkel and Cecilia Kang. Copyright © 2021 by Frenkel and Kang. Republished with permission of HarperCollins.
This article was originally published on WBUR.org.
Copyright 2021 NPR. To see more, visit https://www.npr.org.