How Facebook failed to moderate its own platform

An amazing investigation from Motherboard's Jason Koebler and Joseph Cox goes inside the company's walls to show how it's trying to grapple with the newfound moderation responsibilities pushed upon it.

It's a long read that details what happened, what's changing internally, and what isn't:

Size is the one thing Facebook isn’t willing to give up. And so Facebook’s content moderation team has been given a Sisyphean task: Fix the mess Facebook’s worldview and business model has created, without changing the worldview or business model itself.

The company has made deliberate changes to how it decides what stays and what goes, including weekly meetings for "content escalations" and dedicated teams that deal with breaking events as they unfold.

This look at how Facebook wound up here can be traced back to its earliest days. According to the piece, less than a decade ago, in 2009, just 12 people moderated the entire platform:

“Originally, it was this guy named Karthi who looked at all the porn, but even from the beginning the moderation teams were fairly serious,” Willner told Motherboard. “It was a college site, and the concerns were largely taking down the dick pics, plus some spam and phishing. When I was joining the moderation team, there was a single page of prohibitions and it said take down anything that makes you feel uncomfortable. There was basically this Word document with ‘Hitler is bad and so is not wearing pants.’”

Zuckerberg floated the idea of a 'Supreme Court' for content earlier this year. It seemed ludicrous at the time, but as time goes on it actually makes more sense: Zuckerberg suggested an actual, transparent process in which Facebook rules on a decision, and the community can then appeal that ruling to a trusted, independent group within the community.

If anything, what's missing from this investigation is simple: Facebook is trying to solve this problem on its own, and I don't know if it can. It's one thing to try to own the problem, but I'm of the belief that Facebook simply can't be the sole arbiter of what stays on the platform when more than 2 billion people, with differing world views, use it every day.

What that looks like, whether it's a content court or governments stepping in to regulate, is anyone's guess, but it's clear we're going to be hearing about this for a long time to come.


Tab Dump

Technology companies plan a secret summit to deal with 2018 election
The US midterms are only a few months away, and they're creeping up on us. Technology companies including Google, Microsoft, Snapchat, and Twitter are secretly meeting at Twitter's San Francisco HQ to discuss how to address the problem. The secrecy seems a little strange to me, but it's good that they're tackling the issues together, rather than apart.

Magic Leap One teardown is delightful gadget gore

It's rare to see such a tiny, well-crafted device, and this teardown is just fantastic. The sheer amount of work that's gone into it is really impressive, and seeing it pulled apart like this gets me excited for a future iteration.

Windows 95 as an Electron app
Some things are so wrong they should happen anyway. Behold.

JSON has taken over the world

Life as a bug bounty hunter: a struggle to get paid