A review of the last 12 months of enforcement action by the ICO shows a focus on stopping spam direct marketing communications, a slowdown in penalties for organisations that suffer personal data breaches, and enforcement activity reaching into a few new areas of privacy law.

Unlawful direct marketing remains a hot topic

Over the last 12 months, the ICO has issued 19 penalties against companies for unlawful direct marketing to customers. This accounted for over 90% of the penalties issued by the ICO in that period, showing the ICO's heavy focus on stopping unlawful direct marketing practices and its real commitment to penalising spammers.

The penalties involved unlawful direct marketing to consumers by text message, email and telesales. The breaches ranged from tens of thousands of unsolicited calls to millions of emails sent in breach of the requirements, and the penalties ranged from £30,000 to £250,000.

One such penalty was issued to HelloFresh, an online meal and grocery delivery service. The trigger for the ICO investigation was the receipt of 17 complaints made directly by consumers, together with a further 15,221 complaints logged with the 7726 service (the UK's centralised reporting service for spam text messages). Although the thrust of the complaints was text message marketing, the ICO looked at email marketing practices as well. The ICO concluded that HelloFresh had sent 80 million direct marketing messages without valid customer consent and imposed a fine of £112,000. This fact pattern is common to many of the penalties, with the ICO investigation often originating from a small number of individual complaints.

Across the penalties, two themes emerge. First, a number of the penalties were issued against intermediary sales companies that generate leads on behalf of other businesses. Organisations therefore need to be careful when engaging such intermediaries and ensure that those intermediaries are operating lawfully.

Second, organisations often fell down because their consent language was not specific or clear enough. In several penalties, consent had been collected but it was insufficient. Frequent problems were that consent was obtained for one form of direct marketing (e.g. email) but communications were then sent through another channel (e.g. text message), or the communications were sent by an associated company that was not named within the scope of the original consent. The ICO's approach has been to read consent wording narrowly and strictly, reflecting the GDPR requirement that consent be clear, specific and unambiguous.

Cyber security has dropped down the penalty list

Cyber security remains a major issue, but the ICO has taken a light touch when using its enforcement powers in response to data breaches over the last year, with only two financial penalties being issued. The YMCA accidentally added recipients of an email to the CC line rather than the BCC line, thereby exposing the recipients' names and email addresses. A modest penalty of £7,500 followed. The Ministry of Defence did the same thing but with much more sensitive data – the email addresses of 265 people being evacuated from Afghanistan. It received a £350,000 penalty, although this would have been £1,000,000 were it not for the mitigating factors of the urgent and pressurised circumstances of the evacuation and the MOD being a public body.

The ICO has generally chosen to issue public reprimands for data breaches rather than impose financial penalties. This follows a change of approach at the ICO that started a couple of years ago, when it proposed greater use of reprimands and began publishing them, especially for the public sector, where imposing a financial penalty merely circulates money around public accounts. However, not all reprimands are published and so the complete picture is not available.

The published reprimands generally fall into one of three camps. There are several inadvertent disclosure cases like the YMCA and MOD, where data has been emailed or posted to the wrong recipient. This is an old problem but one that now has several modern solutions. Email monitoring tools can be installed that warn when an email appears to be going to the wrong recipient or attaches large amounts of data. Mail merge and bulk mailing systems are also readily available; these avoid the need to use ordinary email accounts for mass mailings and come with additional safeguards.
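
By way of illustration, the sketch below shows, in minimal Python, the kind of pre-send check such email monitoring tools apply: it flags a draft whose recipient list sits in the visible To/CC fields rather than BCC. The threshold and rules are illustrative assumptions, not any specific product's logic.

```python
# Minimal illustrative pre-send check: flag drafts that put a large or mixed
# recipient list in the visible To/CC fields instead of BCC. The threshold
# and rules are assumptions, not any specific product's logic.

def check_outbound_email(to: list[str], cc: list[str],
                         bulk_threshold: int = 10) -> list[str]:
    """Return warnings for a draft email before it is sent."""
    warnings = []
    visible = to + cc  # addresses every recipient can see
    if len(visible) >= bulk_threshold:
        warnings.append(
            f"{len(visible)} recipients are in To/CC and will see each "
            "other's addresses; move them to BCC or use a bulk mail system."
        )
    domains = {addr.split("@")[-1].lower() for addr in visible}
    if len(domains) > 1 and len(visible) > 1:
        warnings.append(
            "Visible recipients span multiple domains; check whether this "
            "is a mass mailing to unrelated external parties."
        )
    return warnings

# Example usage: a draft CC'ing twelve external addresses triggers warnings.
draft_cc = [f"person{i}@org{i}.example" for i in range(12)]
for w in check_outbound_email(to=["newsletter@ourco.example"], cc=draft_cc):
    print("WARNING:", w)
```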

The second most common cause was a cyber-attack by a hacker. Being the victim of a cyber-attack is not, by itself, a sufficient basis for enforcement action by the ICO; generally speaking, it needs to be paired with some technical insufficiency in information security or a failure to put in place organisational measures that could readily have been implemented. Phishing was a major root cause of cyber-attacks, with the ICO especially taking action where multi-factor authentication was not deployed (or was inadequately deployed) and would have prevented an attacker exploiting any phished credentials. Other themes were poor staff training (again, primarily about phishing), no penetration testing, a lack of scrutiny over third party vendors and inadequate backup systems resulting in data being irretrievably lost.
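
The point about multi-factor authentication is mechanical: a phished password alone should not be enough to log in. The sketch below illustrates this with time-based one-time passwords (TOTP) as a second factor; it assumes the third-party pyotp library and is a minimal illustration rather than a production design.

```python
# Minimal sketch of TOTP-based MFA (assumes the third-party pyotp library):
# a stolen password alone does not satisfy the login check.
import pyotp

# Enrolment: generate and store a per-user secret. In practice this is held
# server-side and loaded into the user's authenticator app via a QR code.
user_secret = pyotp.random_base32()
totp = pyotp.TOTP(user_secret)

def login(password_ok: bool, otp_code: str) -> bool:
    """Both factors must pass: the password check and a fresh TOTP code."""
    return password_ok and totp.verify(otp_code)

# An attacker holding only the phished password fails the second factor
# (with overwhelming probability, since a guessed code will not match)...
assert not login(password_ok=True, otp_code="000000")
# ...while the legitimate user, whose app derives the current code from the
# shared secret, succeeds.
assert login(password_ok=True, otp_code=totp.now())
```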

The misconfiguration of online platforms was also a cause of data being accidentally shown to other platform users. This happened when a new customer portal was set up incorrectly so that it showed the names and addresses of other residents in a housing association; when a webform on London.gov.uk was accidentally made publicly available so that any person clicking on the form could see all the queries entered by other people; and when a network storage container at a recruitment company holding 12,000 records relating to 3,000 workers was left open to the internet and accessed by a security researcher. These reprimands show the importance of regularly testing internet-facing systems.
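
One simple form of such testing is an external probe confirming that endpoints which should demand authentication do not answer anonymously. The sketch below uses Python's requests library; the URLs are hypothetical placeholders, and a real programme would draw its target list from an asset inventory covering portals, webforms and storage containers.

```python
# Illustrative external check for internet-facing systems: probe endpoints
# that should require authentication and flag any that answer anonymously.
# The URLs below are hypothetical placeholders.
import requests

ENDPOINTS_REQUIRING_AUTH = [
    "https://portal.example.com/residents/list",    # hypothetical
    "https://storage.example.com/worker-records/",  # hypothetical
]

def probe(url: str) -> None:
    try:
        resp = requests.get(url, timeout=10, allow_redirects=False)
    except requests.RequestException as exc:
        print(f"CHECK: {url} unreachable ({exc.__class__.__name__})")
        return
    if resp.status_code == 200:
        print(f"ALERT: {url} returned content without authentication")
    elif resp.status_code in (301, 302, 401, 403):
        print(f"OK: {url} challenged or redirected (HTTP {resp.status_code})")
    else:
        print(f"CHECK: {url} returned HTTP {resp.status_code}")

for url in ENDPOINTS_REQUIRING_AUTH:
    probe(url)
```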

Use of social media

The ICO issued three reprimands for improper processing of personal data over social media, namely the WhatsApp and Telegram messaging apps.

Officers at the Dover Harbour Board (effectively the local police force of the port of Dover) and Kent Police were sharing information about suspected vehicle crime via Telegram. The ICO found this to be unlawful because there were inadequate security controls in place – the officers were using their personal devices, the encryption settings within the Telegram app were not always activated, and the Harbour Board's data protection training was inadequate. Although this was found to be a breach of Section 40 of the DPA 2018 (the law enforcement equivalent of the GDPR obligation to have appropriate security measures), there was no evidence that the messages were accessed by anyone outside the controller, and so there had not been an actual data breach. However, the lack of controls meant that Telegram could have accessed the messages and sent them to other Telegram companies, including ones outside the EU. This reprimand is a rare example of the ICO finding a breach of security obligations even where no actual data breach has occurred. Notably, the ICO indicated that it would have imposed a penalty of £500,000 had this been a private company and not a public body.

NHS Lanarkshire were reprimanded for using a WhatsApp group to share patient information between medical staff; there were 533 instances of personal data being shared over a two-year period. In both the Lanarkshire and Dover Harbour Board cases, a common problem was a lack of privacy and risk assessments covering the use of WhatsApp and Telegram, together with inadequate training and guidance to staff on their use. The ICO did not conclude that use of these apps is always inappropriate – even when dealing with highly sensitive information such as medical details and suspected crime – but the lack of sound data protection governance at both organisations appears to have been a key driver for the ICO taking enforcement action.

Inaccurate data

Several reprimands were issued in relation to inaccurate data being processed and causing harm to individuals. Data protection law has always included an obligation to take proportionate steps to ensure personal data is accurate, but this area is now coming under real scrutiny from the regulator.

The Bank of Ireland received a reprimand when it sent inaccurate reports to credit reference agencies about 3,284 individuals. West Midlands Police were reprimanded for repeatedly linking and merging the data of two individuals who shared the same name.

The reprimands also revealed an overlap between lost and inaccurate data. Derby and Burton NHS Trust received a reprimand when a limitation in its patient appointment system caused referrals to consultants to drop off the system if not processed within 180 days, and to become permanently lost after 550 days. In one sense, this is a form of inaccurate data processing, in that the system was not showing the referrals for some patients, but the ICO proceeded under Article 5(1)(f) GDPR, finding that this was a form of inadequate data security because the referral data had been accidentally lost. Similarly, in the West Mercia Police case an IT server was erroneously decommissioned, leading to the loss of all the data on the server. The ICO found that there was a decommissioning procedure in place, but that the process had insufficient checks on whether data needed to be kept. Both of these reprimands apply a wide meaning to data "security", and probably one that is broader than most organisations have in mind when considering data security issues. The integrity and completeness of data on IT systems, particularly ones undergoing upgrades or migrations, is a central part of IT practice, but it is perhaps not often seen as a data security issue, nor one that obviously falls under the ambit of GDPR security obligations.
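
As a practical illustration of the kind of check the West Mercia reprimand points towards, the sketch below blocks a decommissioning step unless every record on the old server can be verified on its replacement. The hash-based comparison is an assumed design for illustration, not a description of any organisation's actual procedure.

```python
# Illustrative pre-decommissioning check: verify that every record on a
# server marked for disposal exists on the destination before wiping it.
# The hash-based comparison is an assumption for illustration.
import hashlib

def fingerprint(records: list[bytes]) -> set[str]:
    """Order-independent content hashes for comparing two copies of data."""
    return {hashlib.sha256(r).hexdigest() for r in records}

def safe_to_decommission(source: list[bytes], destination: list[bytes]) -> bool:
    missing = fingerprint(source) - fingerprint(destination)
    if missing:
        print(f"BLOCKED: {len(missing)} record(s) exist only on the source")
        return False
    print("OK: all source records verified on the destination")
    return True

# Example: one un-migrated record is enough to block the wipe.
old_server = [b"referral-001", b"referral-002", b"referral-003"]
new_server = [b"referral-001", b"referral-002"]
assert not safe_to_decommission(old_server, new_server)
```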

It is not clear from the reprimands whether the ICO considered these situations to be "personal data breach" incidents requiring breach notification. If the ICO does consider such instances to be personal data breaches, then this could create a burdensome obligation on organisations to report IT project problems to the ICO should they have foreseeable impacts on individuals, especially given that the definition of a "personal data breach" extends to the temporary unavailability of data. 

Data Subject Access Requests

In 2022, the ICO began a crackdown on DSAR non-compliance, issuing seven reprimands to organisations with backlogs of hundreds or thousands of unanswered data subject access requests. This pattern has continued in the last 12 months, with reprimands issued to organisations such as Southampton NHS Trust, which answered only 59% of DSARs within the statutory timeframe; Lewisham Council was in a similar position, responding to only 35% of requests on time. Devon and Cornwall Police had a backlog of 287 requests and Dorset Police a backlog of 120 requests; in both cases the backlogs had persisted for several years.

There was no published formal enforcement action taken in relation to complaints about the handling of individual DSARs; in our experience the ICO typically takes informal action or offers guidance on what should be done in these cases. The ICO's formal action has focused on large-scale systemic DSAR failings, and it appears to see the resolution of individual DSAR complaints as largely a matter for private action between data subject and data controller through the courts.

Biometrics coming to the fore

The ICO took another step towards controlling the use of biometric data when it issued an Enforcement Notice to Serco for using biometrics to track employees. Serco used facial recognition and fingerprint scanning to record the attendance of employees at sites under its management, saying that the use of biometrics was needed because manual timesheets were unreliable and there had been instances of "buddy punching", where one employee signs in for another. The ICO held that the use of biometrics – being special category data – needed to be necessary (not just proportionate). It found that Serco's use of biometrics was not necessary, as alternative and less intrusive methods could be used to verify employee attendance.

The ICO reprimanded Chelmer Valley High School for using facial recognition to identify students as part of a cashless payment system for school catering. The use of facial recognition was not inappropriate per se, but the school had not completed an adequate data protection impact assessment and had relied on assumed consent from parents (which was not valid) rather than obtaining explicit consent from the students. This decision differed from the Serco decision, where Serco was relying not on consent but on a combination of different grounds such as legitimate interests (Article 6(1)(f)) and the furtherance of employment rights (Article 9(2)(b)).

The Serco decision came shortly after the ICO lost the Clearview AI case in the Information Rights Tribunal. The ICO had issued a £7.5m penalty against Clearview for scraping data for its facial recognition technology, but that penalty was overturned by the Tribunal. The Tribunal's decision turned on the narrow facts of the case and Clearview's use of its technology in a law enforcement context, which makes the decision unlikely to be analogous to the operations of most organisations. The ICO has, however, indicated an intention to appeal the decision, which shows its ongoing willingness to place strict limits on the use of biometric data.

Areas not covered

Other than a highly fact-specific decision relating to the sharing of data between the Police Service of Northern Ireland and the US Department of Homeland Security, there was no published enforcement action taken by the ICO in relation to the transfer of personal data outside the UK.

There was also no published enforcement action taken in relation to the use of cookies or other online tracking technologies. The ICO did publish an update on its review of tracking technologies on the top 100 UK websites, saying that most had now become compliant and indicating that enforcement action may be taken against websites that do not proactively become compliant.

Both of these have been hot topics in the EU, following the Schrems II decision that invalidated Privacy Shield as a lawful method of transferring data to the US, and the NOYB campaign of complaints against organisations with non-compliant cookie usage. The ICO, however, seems to be taking a lighter-touch approach to these issues, one that focuses on education and guidance rather than formal enforcement action.

There was no enforcement action taken in relation to the use of AI. In late 2023 the ICO did issue a preliminary enforcement notice against Snap Inc in relation to an AI chatbot deployed to Snapchat users, but on further investigation it decided that no action was required. In a helpful yet unusual step, the ICO published a detailed decision document setting out its reasoning. In very brief summary, Snap was given the opportunity to refine and improve its original data protection impact assessment and to document additional safeguards, which the ICO then accepted met the requirements of the GDPR. This picks up the theme discussed above, one that permeates much of the ICO's enforcement action over the last year: good data protection governance is key, and risk assessments need to be detailed and meaningful rather than mere administrative exercises.

Key takeaways

  • You must have valid and lawful consents to send direct marketing communications to individuals, or a penalty is likely to follow
  • Conducting regular staff training around phishing and implementing multi-factor authentication are baseline requirements of cyber security
  • Use of messaging apps needs to be done within a risk-assessed framework and with clear guidance and training
  • Any project involving biometric data comes with a high compliance risk and will need to be the only viable solution or based on explicit consent from data subjects
  • A lack of strong data protection governance and comprehensive risk assessments is a material cause of the ICO taking enforcement action

This article is for general information only and reflects the position at the date of publication. It does not constitute legal advice.