Who could have imagined that a creepy little app that scoured Facebook for pictures of women in bikinis might be the instrument that skewers the behemoth social network? Who, that is, besides Facebook executives and their lawyers? Until a little over a week ago, the company had successfully sequestered internal e-mails, which were obtained by the legal team of Ted Kramer, the founder of the app company Six4Three, during the discovery process in a 2015 lawsuit. At issue was Facebook’s policy of allowing third-party app developers to access the data of Facebook users’ friends—the very policy that enabled Cambridge Analytica to buy the data of eighty-seven million unwitting users on behalf of the Trump campaign. In Kramer’s case, his Pikinis app relied on that access; once Facebook changed its policy, in 2014, the app no longer worked. Kramer cried foul and sued Facebook for breach of contract. At the company’s request, the judge in the case ordered the records sealed to keep them, ironically, private.
That changed on November 20th, when a parliamentary sergeant-at-arms showed up at Kramer’s London hotel room and escorted him to the halls of Westminster, where the Tory M.P. Damian Collins, the chairman of the Digital, Culture, Media and Sport Committee, demanded that Kramer turn over all his files in the case under threat of arrest. Kramer, who was in London on unrelated business, said that he panicked and moved a number of files from a Dropbox account to a USB drive, which he turned over to the M.P. How Collins knew that Kramer was in London has been traced to the tireless Guardian reporter Carole Cadwalladr, who has spent more than two years unravelling the connections between Cambridge Analytica, Facebook, Donald Trump, Steve Bannon, Robert Mercer, and Brexit. After arranging to meet with Kramer, she appears to have tipped off Collins to Kramer’s whereabouts. But how Collins knew that Kramer was in possession of the documents—which were not his to have—remains a mystery. There are reports that the two men had been in communication for the past few months, at Cadwalladr’s prompting, which suggests that the frog march to Parliament was just for show. (When Kramer’s lawyers found out that he was in possession of the discovery documents and that he had given them to Collins, they attempted to drop him, but, over the weekend, a California judge demanded that they continue to represent Kramer until the matter of the purloined files is adjudicated.)
Collins, for his part, claimed that whatever seal was in force in California was irrelevant in the U.K. On November 27th, he used the snatched e-mails to grill Richard Allan, the Facebook executive who was sent to speak for the company at an unprecedented international hearing on fake news and disinformation that Collins convened in Parliament. Mark Zuckerberg, who said he was too busy to appear—making this the fourth time that he has refused a request from Parliament—was represented by a nameplate positioned in front of an empty chair; every time the television cameras panned to Allan, they also showed his absent boss. Allan admitted that the optics were “not great,” and then went on to do his best Zuckerberg imitation, dodging and feinting as the lawmakers sought to put him on the spot. The most damning claim to emerge was that Facebook either gave, or considered giving, favored access to its users’ data to companies that spent at least two hundred and fifty thousand dollars in advertising on the platform. There also seemed to be evidence that Facebook had been informed, in 2014, that computers with Russian I.P. addresses were siphoning three billion data points a day from Facebook accounts, a claim that Allan refused to address in the hearing.
On Wednesday, Collins published the full cache that he seized from Kramer. The two hundred and fifty pages of internal Facebook documents show, irrefutably, that the company did indeed whitelist a number of lucrative business partners, including Netflix, Lyft, and Airbnb, allowing them continued and unfettered access to the accounts of Facebook users and their friends after the company claimed that it had stopped the practice. The documents also reveal that, in 2015, a permissions update for Android devices, which users were required to accept, included a feature that continuously uploaded text messages and call logs to Facebook.
The fallout from Ted Kramer’s London misadventure was just one debacle of many during Facebook’s terrible, horrible, no good, very bad month. Back at Facebook headquarters, in Menlo Park, Zuckerberg was quite busy after all, doing damage control after a scathing New York Times exposé. “Delay, Deny and Deflect: How Facebook’s Leaders Fought Through Crisis,” which was published on November 14th, created one of the biggest crises in the company’s history. As if to prove the reporters’ claims, Zuckerberg publicly denied their veracity. But the Times piece was deeply sourced and contained the disturbing revelation that the company had enlisted a right-wing opposition-research group, Definers Public Affairs, to dig up dirt on George Soros after Soros gave a blistering speech at the World Economic Forum, in Davos, decrying the power of social media, especially Facebook and Google, for their “far-reaching adverse consequences on the functioning of democracy, particularly on the integrity of elections.”
Definers Public Affairs had previously pushed out misinformation about the Apple C.E.O., Tim Cook, on behalf of the tech company Qualcomm, which wanted to undermine Cook’s relationship with the Trump Administration. For Facebook, Definers launched a P.R. campaign—propagated through the conservative and alt-right media ecospheres—that traded on anti-Semitic innuendo to suggest that Soros was the deep pockets behind a grassroots anti-Facebook group. News of Facebook’s campaign against Soros came out three weeks after Cesar Sayoc, a fervent Trump supporter, sent a pipe bomb to Soros—and also to the Clintons, the billionaire Tom Steyer, and other prominent Democrats.
And then it got worse. On November 15th, shortly after the Times exposé was published, Facebook claimed, “The New York Times is wrong to suggest that we ever asked Definers to pay for or write articles on Facebook’s behalf or to spread misinformation.” The next day, Facebook’s chief operating officer, Sheryl Sandberg, who is in charge of the company’s communications team, said, somewhat confusingly, “I did not know about or hire Definers or any firm. . . . Definers was hired, we have lots of firms. They were hired not to smear anyone.” But then, the night before Thanksgiving, in a Facebook blog post that the company perhaps thought would be lost in the holiday shuffle, Sandberg allowed that “I did receive a small number of e-mails where Definers was referenced.” A week later, after another damning piece in the Times, a company spokesperson admitted that it was Sandberg herself who had initiated an investigation into whether Soros had a financial interest in criticizing Facebook—that is, whether the veteran financier had engaged in what is called “short and distort,” trash-talking a company after betting that its stock will lose value. Not surprisingly, he hadn’t.
Soros’s complaints about Facebook were hardly original: former Facebook employees such as Chamath Palihapitiya, who served as its vice-president for growth, and investors including Roger McNamee have made similar criticisms. Nor was Soros being speculative. At the time of his Davos talk, the company’s role in enabling Russian operatives to exploit social divisions in the United States through Facebook ads and user pages, and in helping the Trump campaign suppress the African-American vote, was well known. Indeed, in a two-part “Frontline” documentary, “The Facebook Dilemma,” which was released at the end of October, a collection of former Facebook employees expressed concerns about the unwillingness of Zuckerberg and Sandberg to take meaningful responsibility for Facebook’s immense power and capacity to do harm. “My concerns, at the time, were that I knew that there were all these malicious actors who would do a wide range of bad things given the opportunity, given the ability to target people based on this information that Facebook had,” Sandy Parakilas, who had been the company’s platform-operations manager in 2011, told the filmmakers. “So I started thinking through what are the worst-case scenarios of what people could do with this data . . . and I shared that with a number of people, both people in privacy and some senior executives. And the response was muted, I would say. . . . I got the sense that this just wasn’t their priority. They weren’t that concerned about the vulnerabilities that the company was creating.”
Those vulnerabilities were on full display on November 5th, the day before the midterm elections. In an effort to prevent the kind of malign interference that had dogged the Presidential election two years earlier, Facebook had set up what it was calling a “war room” to monitor its network. The company had already removed thirty Facebook pages, which it believed were designed by Russians to interfere with the elections. The war room was both a public declaration that Facebook was taking fake news and propaganda seriously and an admission that the platform remained susceptible to manipulation. “This is going to be a constant arms race,” Katie Harbath, Facebook’s global-politics and government-outreach director, told the Verge. “This is our new normal. Bad actors are going to get more sophisticated in what they’re doing, and we’re going to have to get more sophisticated in trying to catch them.” Since elections happen around the world throughout the year, the company planned to keep the war room open “for the foreseeable future.”
That same day, Business for Social Responsibility, a nonprofit organization, issued a report that Facebook had commissioned to address concerns about the platform’s role in the Rohingya genocide, in Myanmar. While the Facebook platform has been used in other countries to cultivate ethnic and sectarian violence—Nigeria, Germany, Egypt, India, and Sri Lanka among them—its co-optation by the Myanmar Army was especially heinous. Adopting the playbook of Russian operatives, the Army created hundreds of seemingly innocuous Facebook accounts about celebrities, entertainment, and beauty, which it used to plant incendiary narratives about the country’s Muslim minority. In its response to the B.S.R. report, Alex Warofka, Facebook’s product-policy manager, said that the company had already implemented a number of its recommendations but that “there is more to do.”
Shortly after the Myanmar report was issued, Facebook was compelled to take down an inflammatory, racist video ad, paid for by the Trump campaign, on the grounds that the ad “violates Facebook’s advertising policy against sensational content.” The spot, which targeted voters in Florida and Arizona, blamed Democrats for the crimes of an undocumented immigrant who had twice been deported before killing two sheriff’s deputies, in Sacramento, in 2014. Facebook’s policy disallows ads that are “dehumanizing or denigrating [to an] entire group of people, and using frightening and exaggerated rumors of danger.” The company removed the ad only after it had already been seen on Facebook by as many as five million people. And here’s the rub: while the ad “cannot receive paid distribution,” the company declared, “the video is allowed to be posted on Facebook.” So there you have it, the fecklessness of Facebook in a single sentence.
November has passed, but Facebook’s troubles have not. A departing African-American Facebook employee charged, in a Facebook post, that the company was “failing its black employees and its black users”; Facebook took down his post. Wired reported that the company was abandoning charities—which it had lured to the platform—once they’d been hacked. The election war room has been dismantled, leading some to suggest that it was a publicity stunt. Definers Public Affairs is no longer working for Facebook, but the damage to Sheryl Sandberg’s credibility and carefully coiffed public persona may prove irreversible—as Michelle Obama said the other day, “it’s not always enough to lean in, because that shit doesn’t work all the time.” And that creepy Pikinis app, courtesy of Damian Collins, has now supplied a portrait of naked greed and corporate dissembling. That empty seat in Parliament, where Mark Zuckerberg chose not to sit to answer the concerns of the global community, has suddenly gotten much hotter.
A previous version of this article misstated when Facebook first hired Definers Public Affairs.
Sue Halpern is the author, most recently, of the novel “Summer Hours at the Robbers Library.” She has written for The New Yorker since 2005.
source: https://www.newyorker.com/tech/annals-of-technology/facebooks-very-bad-month-just-got-worse