We ask a lot of Facebook. On one hand we want the social media giant to make it easier for us to communicate with our friends and to better engage with our community. We also see it as an entertainment platform, even as a proxy for trashy magazines, a place to read weird stories and complete goofy quizzes. But many of us also expect it to be a reliable news source that doesn't spread disinformation. Now that the furore over the Cambridge Analytica scandal has receded, it's time to look at whether things are any better and what Facebook is doing.
Facebook continues to remove companies that it finds are accessing user data outside the terms and conditions it sets out. For example, it has suspended British firm Crimson Hexagon over concerns about how the firm used data. And it's interesting to note that Facebook isn't yet convinced Crimson Hexagon (which, let's face it, sounds like the name of a crime syndicate from a bad movie) has actually done anything wrong. But they are suspicious and have suspended the firm pending an investigation.
The company has also announced its participation in the Data Transfer Project, which will make it easier for us to transfer data out of Facebook to other places. That project has a much broader scope but is indicative of Facebook's understanding that we may want to take our data out of the social network.
The Facebook News Problem
The majority of people accept that the news we see is subject to censorship and manipulation. A mountain of evidence clearly illustrates that fake social media accounts have been used to peddle particular political and social views in order to sway public opinion.
Whether this was behind the narrow election win of President Trump is a matter of conjecture - there's no control experiment we can fall back on to do a 'what if' analysis - but with a victory that was only possible on the back of narrow wins in a number of key states, we can see how manipulation of the right group of voters in the right place can change the outcome of an election.
Facebook does have a problem. That problem is also faced by Twitter, Instagram and other popular social platforms. How do platforms manage the publication of content that may be deliberately released in order to incite discord or sway public opinion?
The problem is that without becoming an unregulated censorship service, social media platforms run the risk of enabling exactly what they are seeking to prevent. If they block content deemed inappropriate because a government considers it treasonous, they could end up supporting a tyrannical regime.
The company has already taken some steps to remove dodgy pages from the service.
Controls On Developers
When Facebook moved from being a social network to a software platform it all looked like a bunch of fun. We had a new place to play some games and we discovered that filling in a goofy quiz was a good way to kill a few minutes and have a laugh.
Facebook launched the first iteration of their developer APIs back in 2006. That allowed developers access to Facebook friends, photos, events, and profile information. Since then, the capabilities of the APIs have expanded, and businesses realised that there was huge value in all that data we were freely handing over. Each time we gave an application or service access to a piece of our Facebook puzzle, they used it to assemble a picture of us that had value to someone somewhere.
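To make that "piece of the puzzle" idea concrete, here's a minimal sketch of the kind of request a third-party app makes against Facebook's public Graph API once a user grants it a token. The endpoint and `fields` parameter follow the documented Graph API pattern; the token value and the helper function name are placeholders of my own, not real credentials or an official SDK call.

```python
from urllib.parse import urlencode

# Public Graph API endpoint for the authenticated user's profile.
GRAPH_URL = "https://graph.facebook.com/me"

def build_profile_request(access_token, fields):
    """Return the URL an app would call to read the given profile fields."""
    query = urlencode({"fields": ",".join(fields), "access_token": access_token})
    return f"{GRAPH_URL}?{query}"

# Every field the user consented to is one more piece of their puzzle.
url = build_profile_request("USER_TOKEN", ["name", "email", "photos", "events"])
print(url)
```

The point isn't the mechanics; it's that each extra field in that list is data leaving your control, which is why the scope of those permissions matters so much.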
The problem is that Facebook didn't put strong controls in place. Consequently, the number of developers, apps and services using the APIs grew so quickly that Facebook simply couldn't control what was going on. That's a big part of what has brought us to where we are today.
What Can We Do?
There was a big push earlier this year for people to dump Facebook. A look at the figures for the last year shows that all the negative publicity, and Mark Zuckerberg facing hearings in the US and EU, did nothing to stop the user numbers growing (Source: Statista).
Here's what I've done.
It's important to also take a look at what apps have access to your Facebook data and remove them if you're not sure how they got there. Facebook has added a bulk removal tool for this.
I'm sure some of the parties who create fun quizzes are acting ethically and within the bounds of Facebook's terms and conditions. But sorting the wheat from the chaff is hard, so I suggest you simply abstain from doing any of those Facebook quizzes that tell you what your spirit animal is, or who you were in a past life, or whatever else. These work by looking at your data, and there's no way of knowing what happens to that data once they have it - that's the lesson we've learned from Cambridge Analytica.
If you have kids who want to use Facebook, go through the security settings with them and educate them, and perhaps yourself, on what they mean and what access you're potentially giving to complete strangers.
Finally, review your security settings and ensure that you're only sharing information you're comfortable sharing. The default options are still a little more open than I'd like.