Facebook loses control of the narrative
Facebook finds itself at the crux of three major societal issues at once. It was already fighting a battle against fake news on a scale we've never seen before, and now it faces serious data privacy concerns from its users.
Both of these factors are feeding a third: social unrest, particularly as Facebook has grown beyond two billion people using its tools and grapples with how to enforce its rules across vastly differing cultures, languages and backgrounds.
There's much discussion about what happens when the world is networked the way it now is, and the culmination of these three problems has presented Facebook with a new challenge: will it finally draw its own lines in the sand? Or will Facebook, and Zuckerberg, continue to try to defer to the 'community' to resolve its own problems?
For those in the industry, Facebook has long been an enigma. It's a company that has an iron grip over its own narrative, and has been successful at both containing information within its own walls and deflecting any sort of bad press — until now.
Over the past two weeks we've seen the rapid decompression of the Facebook narrative and the company publicly scrambling to respond to a trifecta of issues that have changed public opinion.
This started with allegations that the company allowed Cambridge Analytica to siphon off data on millions of users through its API, with few controls in place to stop it, but it has grown in recent weeks into a larger, more confusing story.
I wanted to try and piece together a timeline, because this is getting confusing as hell. Here's a detailed look at key events we've seen unfold so far, covering the pieces I personally find most important:
January 2017: Vice reports on Cambridge Analytica's involvement in the Trump campaign and the data that turned the world upside down.
May 2017: The Guardian reports on how Cambridge Analytica, a data company, helped 'hijack' UK democracy.
March 17, 2018: The Guardian reports that Cambridge Analytica harvested 50 million Facebook profiles via the API in 2015. Using an app masquerading as a questionnaire, the firm harvested data with users' authorization, but the data was then sold on without their knowledge, and Facebook imposed no restrictions or warnings about misuse at the time.
The report claims that the data was used to create psychometric profiles of people in order to influence their voting habits, and that the app was able to access vast amounts of data, including friends' data, without detection.
Hours before publishing, Facebook threatens The Guardian with a lawsuit, but the story is published anyway.
Facebook immediately suspends Cambridge Analytica from the platform, but blocks the whistleblower as well for... no clear reason.
March 19: U.S. senators begin demanding that Zuckerberg testify before Congress.
Facebook announces it is immediately conducting an on-site audit of Cambridge Analytica, which willingly complies, but the UK government demands that Facebook pull its auditors because it plans to raid the office and doesn't yet have a warrant.
March 20: Channel 4 releases its first undercover video of Cambridge Analytica executives talking about their involvement in the Trump campaign, how they use ephemeral email services and shell companies to cover their tracks, and how they leverage misleading videos and stories to gain attention.
Under pressure over Zuckerberg's silence, Facebook issues an awkwardly worded memo stating that he and Sheryl Sandberg are "working around the clock" on the problem.
Cambridge Analytica suspends its CEO over the undercover videos and other allegations.
March 21: Zuckerberg, who has been silent for nearly five days, finally gets his voice back. In a well-crafted post, Zuck doesn't apologize, but he does say that Facebook will address the issues with new controls. He then goes on a media tour, in which he cries when asked about the legacy he will leave, but says that he isn't sure the company should be regulated.
Over on Recode, however, he yet again says that he would rather not be the one to draw lines on the platform about what's allowed: "I mean, who chose me to be the person that did that?" I think you did, Zuck, when you made it.
March 24: UK officials, after finally obtaining their warrant, spend seven hours at Cambridge Analytica's offices gathering data.
March 25: Zuckerberg actually apologizes, in full-page newspaper advertisements across almost all major publications.
March 26: Users exporting their data from Facebook while trying to delete their accounts discover that it has logged years of their SMS and call history.
March 27: Zuckerberg refuses to testify in the UK after being summoned, but agrees to testify before Congress.
March 28: Facebook announces its first changes to improve its privacy tools, based on feature development that was already in progress before the news emerged.
Channel 4 reveals that the Cambridge Analytica data was never deleted, and releases a video in which it tracks down people from the data set in Colorado.
Facebook completely disables its app review process for new apps until "new policy changes" are brought into effect, and kills its third-party data import tools, crippling many large advertisers.
On top of this, news emerges that Facebook had planned a home speaker featuring cloud-backed facial recognition for May, but has delayed its release over privacy concerns.
March 29: Apple's CEO, Tim Cook, takes Facebook to task for its lack of privacy protections.
BuzzFeed publishes an internal memo from 2016 discussing growth tactics and arguing that Facebook's core function is to connect people regardless of the consequences.
March 30: Facebook employees are furious about the leaked memo, and can't understand why someone would leak it, with some declaring war on leakers. Vanity Fair has a great piece on why this memo is dangerous for Facebook.