Another Meta whistleblower. When will it be enough?
Americans must demand accountability from social media executives, Meta especially.
By Yael Eisenstat
Meta (parent company of Facebook, Instagram and WhatsApp) has played an outsized role in shaping political outcomes around the globe for more than a decade while resisting responsibility for its contributions to political violence, extremism, and other antidemocratic activities in the United States and around the world. The company has recklessly scaled to dominate much of the world’s online communication flows, while benefiting from virtually unchecked power in the United States. And yet we still have little insight into this $1.5 trillion company’s operations and business decisions, with virtually no federal transparency requirements and limited access to data for researchers and civil society organizations.
One of the few ways the public, regulators and legislators can understand the real impacts social media companies have on our safety, security, democracy and health is via whistleblower complaints and firsthand accounts from people who worked on the inside.
The latest such account comes from Sarah Wynn-Williams, who worked at Facebook from 2011 to 2017 in public policy roles. Her book, “Careless People: A Cautionary Tale of Power, Greed, and Lost Idealism,” is filled with damning, detailed accounts of the reckless, at times possibly illegal, and certainly harmful business decisions the company’s leadership made over that time.
In 2018, I was hired to be Facebook’s “Global Head of Elections Integrity” in the political ads business. After leaving, I chose to speak publicly about what I saw as anti-democratic behavior and decisions that harmed public discourse and democracy. I do not know Wynn-Williams; she left the company the prior year. I cannot personally confirm her allegations, but the reckless political and business decisions she describes do not surprise me, especially those detailing how the company prioritized growth and political expediency ahead of public safety, democracy, or even free expression.
The book centers on the terrible behavior of the company’s top leaders, including Mark Zuckerberg, Sheryl Sandberg and Joel Kaplan. Across more than 40 chapters, it details sexual harassment, abusive behavior toward subordinates, and what she calls “lethal recklessness.” We have put tech leaders on a pedestal for too long; it is time to see them for who they are and stop excusing such despicable behavior.
I want to focus on three key takeaways I hope our legislators, regulators and the public will consider when evaluating their relationships with social media, Meta in particular, and how we can consider what accountability looks like.
The right to know
Meta is seeking to stop further promotion of Wynn-Williams’s book, arguing the author violated the non-disparagement clause in her severance agreement. The company has long fought against transparency efforts. In August, it shut down CrowdTangle, the tool that researchers, journalists and civil society used to study the platform.
Because we have no way to study this company, we must rely on any information we can glean from those who have seen behind the curtain. We should know how companies and their business decisions affect society. In fact, I wrote in 2021 about the challenges of speaking up, how companies try to silence people, and why we need more former employees to come forward (even if anonymously through whistleblower complaints) if we want to hold them accountable.
Potentially illegal and clearly anti-American behavior
China
In April, as reported by the Washington Post, the author filed a whistleblower complaint with the U.S. Securities and Exchange Commission detailing the lengths to which Facebook leadership was willing to go to enter the Chinese market. The complaint includes internal documents showing Facebook agreed to grant the Chinese government access to data belonging to users in China and Hong Kong. As she described it, Facebook was willing to “bargain Hong Kong users’ data for entry to China.” She details other egregious activities, including covertly launching numerous social apps in 2017 under the name of a China-based company.
Meta argues that these details are outdated—they are; she left the company in 2017—and that it did not enter China. But the details of just how far Facebook leadership was willing to go to accommodate the Chinese government should alarm all national security, human rights, and privacy experts. And Facebook did this all while lying to American lawmakers and the public about its plans, according to the author, and while expressing concerns about TikTok and Chinese surveillance.
Myanmar
Wynn-Williams says that she was the first employee to go to Myanmar (I assume in 2014, although she didn’t specify the year) to ask the military junta to unblock Facebook. She gives an incredible amount of detail about Facebook’s complicity in helping fuel the atrocities against the Rohingya. By now, this has been well documented, including by Amnesty International and the United Nations. But I am not aware that such intimate detail about how various company leaders handled the crisis, and their unconscionable disregard for how their own platform was being used to incite violence, was previously known. I certainly hope Wynn-Williams provided this information to investigators years ago, when it mattered most.
Foreign Corrupt Practices Act (FCPA)
Wynn-Williams writes only a sentence about an internal investigation at the company, prompted by concerns she had raised that Facebook might be violating the Foreign Corrupt Practices Act in the Philippines. She provides no further detail, but this is certainly a topic I hope regulators follow up on.
Political speech and influence
When I was hired by Facebook in 2018, ostensibly to head the company’s election integrity efforts for political advertising, it was trying to recover from the Cambridge Analytica scandal. Since leaving, I have written extensively about the decisions leadership made that prevented us from fixing the problem, including rejecting efforts to combat voter suppression ahead of the 2018 U.S. midterm election.
Wynn-Williams’s book offers color I had not fully grasped at the time. She describes Kaplan’s approach to growing the political advertising revenue stream—not just for the money but also for the power. As she describes it, Kaplan, now president of Global Affairs, thought politicians who saw Facebook as critical to their campaigns would not want to regulate the company. She describes Sandberg as saying that politicians indebted to Facebook would be good for the company. Part of that strategy involved embedding Facebook teams in the Trump campaign ahead of the 2016 elections, helping them use the platform and its targeting tools to spread what we now know was a torrent of mis- and disinformation.
What now?
Some argue that this book is too late, but these details matter, even years later. With the power and influence social media companies have accumulated, it is critical for us to understand their motivations, how they operate, and what the true risks are to our political futures.
After I left the company, I was often asked if I believed Mark Zuckerberg had put his finger on the scales of the 2016 U.S. election. I would always say that, in addition to being somewhat unprovable, it was the wrong question. The more important question is: Why have we allowed any one individual to accumulate enough unchecked power that they could put their finger on the scale of our election? It is a question we have yet to answer, and Meta risks accumulating even more power in the race to dominate the artificial intelligence industry.
Zuckerberg and Kaplan (Sandberg is no longer at the company) have made clear that they have no intention of course-correcting and are seeking even further dominance of the tech and communications spheres globally. So it is up to us, the public, to change our relationship with Meta. We must push our lawmakers to finally prioritize transparency for this industry, and to reckon with accountability for how these companies operate and affect public safety and democracy.
And if all else fails, we can vote with our feet (or in this case, our eyeballs). Social media companies cannot succeed if they do not maintain an active user base, with all of our personal and behavioral data available for them to monetize.
Yael Eisenstat is a director of policy and impact at Cybersecurity for Democracy, based out of New York University. She was Facebook’s global head of elections integrity for political ads in 2018, and previously served as a diplomat, intelligence officer, and White House adviser.