Facebook is facing a £500,000 fine and the parent company of Cambridge Analytica has been put on notice that it faces a criminal prosecution. Can we still trust the social giant?
Update 18/02/19: Facebook Regulation
MPs are calling for the urgent regulation of Facebook following the House of Commons’ report into fake news.
The Cambridge Analytica scandal, which we explained below back in July, formed a key part of the evidence in the Digital, Culture, Media and Sport Committee’s report.
Original Convo 11/07/18
Facebook has been put on notice by the Information Commissioner’s Office that it could face a fine of £500,000 after its investigation concluded that Facebook broke the law by failing to look after the information of its users.
This is one of the results of the investigation by the ICO, the UK's data protection regulator, into the use of data analytics in political campaigns.
It’s a measure of how seriously the ICO takes its findings that this is the first time it has said it intends to impose the maximum fine available. The only other firm to have been fined that much is TalkTalk, which the ICO has fined twice for data breaches, with the two fines together totalling £500,000.
The information commissioner, Elizabeth Denham, said on Wednesday that her office was also going to bring a criminal action against SCL Elections, the now-defunct parent company of Cambridge Analytica.
How all this plays out is anyone’s guess – experts have raised some reservations about what the eventual outcomes might actually be – but there’s no doubt that the ICO takes what went on with Facebook and Cambridge Analytica very seriously.
It’s worth pointing out that Facebook has taken steps to make sure what happened with Cambridge Analytica can’t happen again: in 2014 it shut down third-party apps’ ability to access the data of the friends of people who had installed an app, and more recently Mark Zuckerberg said that an app’s developers wouldn’t be able to access your information if you hadn’t used the app for three months or more.
Access to data
As well as the news from the data protection regulator today, news also broke that some apps had more access to the data of users’ friends than had been previously thought – and, according to an allegation in Wired, one of those apps was Mail.ru, a Russian internet giant.
Mark Zuckerberg had flagged up in evidence to US lawmakers that Facebook had extended some apps’ access to third-party data to give them time to comply with the new, tighter rules. According to Wired, the Russian group was one of those given access beyond the official cut-off point.
Given the concern about potential Russian government hacking, the news that Russian organisations were given access to third-party data until May 2015 – and beyond the official cut-off point – is, to say the least, worrying.
Facebook told Wired that Mail.ru’s apps haven’t had access to other people’s data since May 2015.
Stories like this remind us that it’s very difficult to know not only how Facebook uses and looks after your data, but also how the third parties with access to that data safeguard it. So it wasn’t a surprise when the research for our recent report, Control, Alt or Delete?, found that most of us don’t have a clear or detailed understanding of how our data is used, and that many people were actively shocked when they found out about the extensive ecosystem built on the data we provide in return for services such as Facebook.
How do you feel about Facebook now that we know the ICO has concluded it broke the law? Do you think the safeguards Facebook has put in place since it first found out about what had happened with Cambridge Analytica are good enough? How do you think Facebook could do better?