ICO fines topped £14 million in 2023 amid regulator crackdown on data protection standards
ICO fines across 2023 exceeded £14 million, with TikTok among the worst-hit for data protection violations
The Information Commissioner’s Office (ICO) fined businesses more than £14.3 million for misusing data last year, according to an analysis by cyber security and data protection consultancy CSS Assure.
Fines were imposed on 18 businesses, while the regulator also reprimanded 36 companies, issued enforcement notices against a further 19, and prosecuted four businesses for failing to meet their information rights obligations.
The year's largest fine, £12.7 million, was imposed on social media platform TikTok for breaching data protection law around the use of children’s personal data, with the ICO estimating that up to 1.4 million under-13s in the UK were able to use the video-sharing app in 2020.
Charlotte Riley, director of information security at CSS Assure, said the ICO fines underline the serious repercussions faced by businesses for failing to adhere to robust data protection standards.
"The fines imposed by the ICO in 2023 highlight the serious consequences of misusing data," she said. "Mishandling personal information not only violates data protection laws but also erodes trust among consumers."
"TikTok’s £12.7 million penalty underscores the importance of lawful use of personal data and implementing appropriate safeguards, especially when it involves children," Riley added. "TikTok is a large, well-known brand, and its fine was substantial due to the sheer amount of data involved."
ICO fines issued for marketing violations
Three marketing firms were fined a combined £310,000 after being found to have made a total of 483,051 unsolicited marketing calls to businesses and sent 107 million spam emails to jobseekers, CSS Assure's analysis revealed.
Similarly, two energy firms were fined a combined £250,000 for making marketing calls to people on the UK’s ‘do not call’ register.
A business support consultancy was also fined £30,000 for sending 558,354 direct marketing SMS messages without valid consent, while an appliance service and repair company was fined £200,000 for making more than 1.7 million unsolicited direct marketing calls.
During the second half of the year, 10 companies were collectively fined more than £800,000 for sending a total of 4,698,841 unwanted text messages, 39,906,342 emails, and making 1,937,028 nuisance phone calls.
Riley said the sharpened focus on cracking down on nuisance calls and spam marketing tactics sends a clear message to organizations across the country for the year ahead.
"The fines imposed on businesses for unsolicited calls and text messages, and spam emails, as well as firms for disregarding the 'do not call' register, demonstrate the significant impact of invasive marketing practices," she said.
"These penalties send a clear message that companies must respect individuals’ privacy preferences and refrain from bombarding them with unwanted communications."
The ICO had a “busy year” in 2023
The ICO described 2023 as 'a busy year', noting that it handled 116,000 business service calls and 70,000 public advice calls.
The watchdog also received more than 33,000 data protection complaints and over 7,000 FOI complaints, with 288 investigations opened and 346 closed.
Its priorities for this year, it said, are to support the new Data Protection and Digital Information (DPDI) bill as it makes its way through Parliament, and to set out clear expectations that privacy and artificial intelligence (AI) go hand in hand.
Speaking at TechUK’s Digital Ethics Summit last month, information commissioner John Edwards said the regulator will be taking a measured approach to AI in 2024, but warned businesses that data privacy standards must be a key consideration.
"I want to make it clear from the very start that we are not against organizations using AI," he said.
"We just want to ensure that they are using AI in sensible, privacy-respectful ways, ensuring that people’s personal information and privacy rights remain protected throughout."
Emma Woollacott is a freelance journalist writing for publications including the BBC, Private Eye, Forbes, Raconteur and specialist technology titles.