Until Wednesday night, Facebook founder and CEO Mark Zuckerberg had been conspicuously absent from the conversation around the siphoning of user data by researchers working with the data analytics firm Cambridge Analytica — and, more broadly, the company’s alleged role in Russia’s manipulation of the 2016 US presidential election.
An awkward interview subject, Zuckerberg had preferred to let others at Facebook present its perspective on these hot-button issues — and to face the increasingly angry grievances of a public whose distrust of the platform is at an all-time high.
But on Wednesday, with numerous pundits saying the situation had devolved into an existential crisis for the company, Zuckerberg embarked on a whirlwind talking tour with journalists, most prominently in an exclusive TV interview with CNN’s Laurie Segall, which aired on CNN’s “Anderson Cooper 360.”
The beginning of the two-part interview was uncomfortable. Cooper’s introduction included a clip of Aleksandr Kogan — the Russian-American researcher who created the app that allowed Cambridge Analytica to access Facebook users’ private data without their consent — that managed to underscore Kogan and Zuckerberg’s uncannily similar appearance.
Segall opened with a simple question: “What went wrong?” In his very first sentence, Zuckerberg used the term that Facebook has been adamantly rejecting throughout the crisis, referring to the debacle as a “breach.” My guess is that at that moment, there was a loud smacking sound as the high-paid publicists who prepped Zuckerberg for the interview brought their collective palms to their faces.
Things didn’t go much better from there. Apologizing, Zuckerberg said that if Facebook can’t protect data, it “doesn’t deserve the opportunity to serve people.”
He went on to outline “basic things” the company must do to prevent this kind of thing from happening “going forward.” At the same time, though, he referenced major policy changes Facebook had made on the platform to prevent bad actors from stealing data — in 2014, before the incident occurred.
Segall pressed Zuckerberg on Facebook’s actions after discovering the breach, noting that the company chose to “quietly” try to resolve the situation with Cambridge Analytica, hiding the data leakage from both its users and the government, instead of making a broad public announcement: Why the lack of transparency?
Zuckerberg, stammering at times, responded that Facebook executives were committed to “getting in front of things,” and he promised a “full forensic audit” of apps that had access to large amounts of user data. He said Facebook intended to inform every user who was affected by the leak.
He added that in retrospect, they shouldn’t have taken Cambridge Analytica “at their word.”
That’s unsettling: a company as large, as well resourced and as intertwined in the lives of its users as Facebook shouldn’t grant any entity access to its data on the benefit of the doubt alone. And if the company is now prepared to perform thousands of forensic audits after the fact, it would have been far simpler to institute a program of regular checks ensuring that apps weren’t exceeding their privileges in the first place — if, that is, Facebook was more concerned about security than growth.
On that note, responding to Segall’s question about the now well-reported exploitation of Facebook by Russian troll armies, Zuckerberg touted the new AI tools the company used in 2017 to successfully block trolls during the French elections, noting that “it wasn’t rocket science” to build them.
But that statement raises the question, overshadowing the entirety of the interview, of why Facebook hadn’t done these simple things years earlier, when it was already obvious that the platform was being swarmed with dubiously sourced and fraudulent news posts. The unspoken answer is that it wasn’t a priority for them until the activities of these “bad actors” were exposed.
Toward the end of the interview’s second half, Zuckerberg seemed to settle down. He gave reasonable and assured explanations of how Russian trolls actually operated on the platform (primarily by “sowing division”), agreed that social and digital media should be regulated the way that broadcast and print are (as long as the regulations are rational and applied fairly to all internet players) and stumbled only when Segall asked whether he’d be willing to testify before Congress. There, he retreated to the excuse that, despite being the controlling shareholder of Facebook — and one of the most powerful people in the world — he might not be the right person to give our legislators the information they need to “do their important job.”
“But you’re the face of the brand,” Segall noted. Responded Zuckerberg: “We send the right people to answer these questions. … If the right person is me, I’ll go. I imagine at some point, there will be a topic for which I’m the right person.”
There were three things the world was looking for from Zuckerberg, once he finally emerged from the shadows: humility, transparency and a willingness to take responsibility.
He showed a reasonable amount of the first, lingering resistance to the second and palpable ambivalence around the third, acknowledging the need for a “degree” of personal accountability but still framing Facebook as a victim.
There was one moment when the tenor of the conversation shifted in a way that highlighted Zuckerberg in human, rather than business, terms, and showcased what real recognition of obligation looks like. That was when Segall asked him how his view of his business had changed after he became a father.
The question brought a smile to Zuckerberg’s face, the first of the interview: his expression visibly softened, he answered at a slower and more measured pace, his focus perhaps directed inward toward calming thoughts of his family.
“I used to think that the most important thing was having the greatest positive impact in the world that I can,” he said. “Now, I really just care about building something that my girls, when they grow up, can be proud of me for. I work on a lot of things, but when I go home, I ask myself, ‘Will my girls be proud of what I did today?'”
If that’s the test Zuckerberg sincerely applies from now on toward decisions that affect the safety and security of the 2 billion individuals who use his website monthly, we may have a lot less to worry about.