The Information Commissioner has told MPs that the Conservative Party illegally collected the ethnicity data of 10 million voters.
Elizabeth Denham said the Conservatives had deleted the data following a recommendation by the Information Commissioner’s Office (ICO) in a report last year. Speaking to MPs on the Digital, Culture, Media and Sport (DCMS) sub-committee on online harms and disinformation, Denham said it was unacceptable that the party had used people’s names to attempt to derive their ethnicity and religion.
In our audit work, where we looked at the practices of all political parties, our recommendation was for any kind of ethnicity data to be deleted, and the Conservative Party – I’m told, and we have evidence – has destroyed or deleted that information.
Denham said the party had done this voluntarily, but it would have ordered it to destroy the data if it had not agreed to do so.
Pressed on the issue by SNP MP John Nicolson, Denham said:
Religion and ethnicity are both – like health information – special category data that requires a higher standard for a legal basis to collect. So again, ethnicity is not an acceptable collection of data, there isn’t a legal basis that allows for the collection of that data.
Asked to confirm if it was illegal, the Information Commissioner said:
It was illegal to collect the ethnicity data and that has been destroyed.
Privacy campaigners responding to Denham’s evidence said the ICO needed to do more to enforce rules around how political parties collect data on voters.
Jim Killock, executive director of the Open Rights Group, said:
The Conservative Party’s racial profiling of voters was illegal. Elizabeth Denham finally confirmed the unlawful nature of this profiling by the Conservative Party under pressure from MPs on the DCMS committee. Yet the ICO still has not explained what parties can and cannot do. Mass profiling of voters continues, even if this data has been removed. The ICO needs to act to stop unlawful profiling practices. That’s their job.
Denham was also asked about WhatsApp’s recent privacy policy announcement, which concerned the sharing of user data with its parent company Facebook. WhatsApp has since clarified details of the update and pushed back the deadline for users to agree to the policy.
Denham said she did not use Facebook “by choice” and used Signal – one of the apps which has seen a spike in new users since WhatsApp’s privacy announcement – for her “personal communications”. She added:
What’s really interesting about the WhatsApp announcement on ongoing sharing with Facebook is how many users voted with their virtual feet and left the platform to take up membership with Telegram or Signal, which are end-to-end encrypted.
I think it’s a bigger issue of trust. Users expect companies to maintain their trust and not to suddenly change the contract that they have with the users and I think it’s an example of users being concerned about the trustworthiness and the sustainability of the promises that are made to users.