After over a decade of telling us that tech companies are doing a great job of self-regulating how they publish and protect our content, Mark Zuckerberg has changed his tune. In an op-ed published in The Washington Post, the Facebook founder has softened his view, saying that the responsibility for monitoring harmful content is more than tech companies can handle.
Following last month's livestream of the murder of 50 people in Christchurch, which ran for almost half an hour before being stopped, and accusations that Facebook applies its "community standards" inconsistently, the social media company now accepts that it's time for regulators to step in.
The full article by Zuckerberg discusses four areas where regulation is needed:
From what I’ve learned, I believe we need new regulation in four areas: harmful content, election integrity, privacy and data portability.
He also notes that Facebook has too much power over free speech.
We live in a very complex world where the difference between terrorist and revolutionary, or hate speech and dissent can be a question of context. And the cultural differences in the meaning of words are also challenging to understand.
The full article details some of the things Facebook is doing, such as working with French officials on processes for content review, setting tighter rules on election advertising, producing "transparency reports" about the content it removes, and adopting tighter privacy standards (too bad these weren't in place when the company was found to be storing hundreds of millions of user passwords in plain text).
The cynic in me says this was a smart move by Zuckerberg. On one hand, what he's suggesting is very reasonable - a global framework of laws and standards for how privacy, security and integrity are maintained on the internet. But it's also a near impossibility to achieve, as it would require the world's 195 countries, or a significant majority of them, to agree on things that cross not only legal but also cultural barriers.
In other words, for the moment these are nice sentiments, but Facebook really doesn't have to do anything.