Facebook faces record £500,000 fine over failure to safeguard user information during Brexit vote

Facebook broke the law during the Brexit referendum by failing to safeguard people’s information after the data of 87 million users was harvested, and could be fined £500,000 for two breaches of data protection legislation, the Information Commissioner’s Office (ICO) has announced. The regulator also called for a statutory code regulating the use of personal data in political campaigns.

Information Commissioner Elizabeth Denham has this week published an update on her office’s investigation into the use of data analytics in political campaigns.

In March 2017, the ICO began looking into whether personal data had been misused by campaigns on both sides of the referendum on membership of the EU.

In May it launched an investigation that included political parties, data analytics companies and major social media platforms.

Today’s progress report gives details of some of the organisations and individuals under investigation, as well as enforcement actions so far.

This includes the ICO’s intention to fine Facebook a maximum £500,000 for two breaches of the Data Protection Act 1998.

Facebook, together with Cambridge Analytica, has been the focus of the investigation since February, when evidence emerged that an app had been used to harvest the data of 50 million Facebook users worldwide. That figure is now estimated at 87 million.

The ICO’s investigation concluded that Facebook contravened the law by failing to safeguard people’s information. It also found that the company failed to be transparent about how people’s data was harvested by others.

Facebook has a chance to respond to the Commissioner’s Notice of Intent, after which a final decision will be made.

Ms Denham said: “We are at a crossroads. Trust and confidence in the integrity of our democratic processes risk being disrupted because the average voter has little idea of what is going on behind the scenes.

“New technologies that use data analytics to micro-target people give campaign groups the ability to connect with individual voters. But this cannot be at the expense of transparency, fairness and compliance with the law.”

She added: “Fines and prosecutions punish the bad actors, but my real goal is to effect change and restore trust and confidence in our democratic system.”

A second, companion report, titled Democracy Disrupted? Personal information and political influence, sets out the findings and recommendations arising from the 14-month investigation.

Among the ten recommendations is a call for the UK government to introduce a statutory Code of Practice for the use of personal data in political campaigns.

Ms Denham has also called for an ethical pause to allow government, Parliament, regulators, political parties, online platforms and the public to reflect on their responsibilities in the era of big data before there is a greater expansion in the use of new technologies.

She said: “People cannot have control over their own data if they don’t know or understand how it is being used. That’s why greater and genuine transparency about the use of data analytics is vital.”

In addition, the ICO commissioned research from the Centre for the Analysis of Social Media at the independent think tank DEMOS. Its report, also published today, examines current and emerging trends in how data is used in political campaigns, how the use of technology is changing and how it may evolve over the next two to five years.

The investigation, one of the largest of its kind by a data protection authority, remains ongoing. The 40-strong investigation team is pursuing active lines of enquiry and reviewing a considerable amount of material retrieved from servers and equipment.

The interim progress report has been produced to inform the work of the DCMS Select Committee’s inquiry into fake news.

The next phase of the ICO’s work is expected to be concluded by the end of October 2018.
