Australia passes law cracking down on platforms that spread violent content
The technology industry is on notice: a world-first law in Australia places the blame for violent content that spreads online squarely on the platforms themselves, allowing the government to penalize executives at social media companies for content that isn't taken down.
The new law, called the "Sharing of Abhorrent Violent Material" bill, is a watershed moment for legislation that shifts responsibility back onto the platforms themselves to better manage the content on their services.
Designed partly in response to the attacks in Christchurch, New Zealand, the law creates new criminal offences for platforms that don't quickly remove material depicting 'violent conduct,' such as terrorist acts, murders, and a number of other categories. Other countries are likely to follow with similar laws; Germany is already considering its own version.
It's a broad piece of legislation with concerning implementation problems: the term "expeditious" is never defined, and the law threatens to implicate anyone at a technology company who could feasibly be blamed, such as the developers who implement upload functionality.
Christian Porter, Australia's Attorney General, implied in an interview with The Guardian that 'expeditious' is intentionally vague, saying that "every Australian would agree it was totally unreasonable that it should exist on their site for well over an hour without them taking any action whatsoever."
In the case of the attack in New Zealand, the shooter's video remained on Facebook for hours, until NZ police contacted the company directly. A delay like that would put Facebook in jeopardy of prosecution under the new law, but what's not clear is who at Facebook would be punished.
Much of the industry is misreading this law, however: its intention is to make broad, unfettered upload access to platforms like YouTube or Facebook no longer feasible. It's that simple, and the mere existence of the law is a statement: perhaps platforms shouldn't blindly allow users to upload anything at all without being reasonably confident it's safe.
The argument here is rather simple: if you're a company building tools for user-generated content, you had better consider from the very beginning how your users might abuse them, and either build in deep moderation controls from the outset or not launch at all. Social platforms have historically shipped first and moderated later, and governments won't tolerate cleaning up their mess anymore.
The problem with rules like these is that the lawmakers writing them don't seem to understand that shutting down broad upload access entirely is really the only option here: artificial intelligence and machine learning won't fix this, because the technology simply isn't there yet to police video automatically at scale. We saw the same misunderstanding of technology's capabilities in Europe's new copyright directive, but tech companies have only themselves to blame: they made it seem like AI could solve everything, when the reality is that it isn't reliable enough at scale.
If anything, introducing 'pain' for social media companies that don't respond fast enough is designed to coerce them into treating moderation as a core concern for their business. If the fines are large enough (they're currently set at up to $10.5M or 10% of annual turnover), the way these features fundamentally work might begin to change.
Adobe adds 'content aware fill' for videos to latest updates
This feature allows video editors to just select an object and have the app magically remove it from the video. The future is wild.
Verizon launched a 5G network early in Chicago and Minneapolis
Speeds of up to 700 Mbps down, with the right handset. Now let's see how long it takes to scale this demo!
Amazon is quietly removing highlighted spots for its own products
With the technology industry under more antitrust scrutiny than ever, Amazon has started lowering the prominence of its own products, which previously appeared above third-party listings in search results.