HMRC’s worrying GDPR breach
As a chartered accountant, chartered tax adviser and chartered IT professional, Donald Drysdale wonders what further action may follow as HMRC get a rap on the knuckles from the ICO for breaching GDPR.
Initial concerns
Three months ago I wrote an article describing Voice ID, a voice authentication technology which HMRC adopted in January 2017, asking callers to some of its telephone helplines to record their voice as their password.
The aim of my article was to echo concerns, expressed by civil liberties campaign group Big Brother Watch, that HMRC were holding taxpayers’ biometric voiceprints illegally in contravention of the General Data Protection Regulation (GDPR).
Amid the cut and thrust of the adversarial world of tax compliance, such a suggestion might easily have been dismissed as mere posturing to embarrass the tax authority. However, subsequent events have shown that the concerns raised were fully justified.
What is GDPR?
GDPR lays down rules relating to the protection of natural persons with regard to the processing of their personal data, and rules relating to the free movement of personal data within the EU.
As enshrined in the Data Protection Act 2018, GDPR seeks to protect fundamental rights and freedoms of individuals, and in particular their right to the protection of personal data, while enabling the free movement of such data among member states.
GDPR sets out principles relating to the processing of personal data, the rights of ‘data subjects’, and the consequential obligations of supervisory authorities, data controllers and data processors.
It would be wrong to suggest that complying with GDPR is easy, but there was massive publicity before it took effect almost exactly a year ago, on 25 May 2018. It seems scarcely possible that any organisation would have remained ignorant of it – certainly not HMRC!
Enforcement notice
On 9 May 2019 the Information Commissioner’s Office (ICO) issued HMRC with an enforcement notice under the Data Protection Act 2018 s 149 in relation to contraventions of the data protection principles set out in Article 5 GDPR.
Helpfully, the ICO also published an article the following day, reminding organisations of the potential challenges when choosing and using any systems involving biometric data.
This explained that HMRC had neither given callers enough information, nor advised them that they didn’t have to sign up to Voice ID. There had been no clear option for callers who didn’t want to register. HMRC had not obtained adequate consent from callers, and accordingly, the ICO ordered the tax authority to delete any data it was holding without consent.
The ICO referred to the significant imbalance of power between HMRC and taxpayers, and criticised HMRC for giving “little or no consideration to the data protection principles [of GDPR] when rolling out the Voice ID service”.
Until HMRC eventually published a privacy notice in July 2018 and changed the automated recording in October, they had not told callers how they could opt out of the Voice ID system and had not explained to them that they would not suffer a detrimental impact if they declined to participate.
Because the case has raised significant data governance and accountability issues that require monitoring, the ICO is going to follow up the enforcement notice with an audit to assess HMRC’s compliance with good practice in the processing of personal data. Hopefully, this will also confirm that the illegally gathered voiceprints have indeed been deleted.
Data protection impact assessment
In a related failure by HMRC, the ‘data protection impact assessment’ (DPIA) required by GDPR was not in place before the system was launched. A DPIA was necessary to give proper consideration to the compliance risks associated with processing biometric data.
Data controllers are required to complete a DPIA where their processing is ‘likely to result in a high risk to the rights and freedoms of natural persons’, such as the large-scale use of biometric data. A DPIA should help ensure ‘data protection by design and by default’ – a key concept at the heart of GDPR compliance.
Under GDPR, biometric data is specifically identified as ‘special category data’ that requires greater protection. Any consent obtained to hold such data must be explicit consent, and the benefits from the technology cannot be used as an excuse to override the need to meet this legal obligation.
ICO penalties
The ICO has power to levy fines for breaches of data protection law. For small organisations, the potential penalties may seem draconian, but they have been widely criticised as likely to be ineffective when applied to large organisations.
Fines levied by the ICO before GDPR came into effect were limited to a maximum of £500,000. This was the paltry sum which Facebook was fined in 2018 for improperly sharing personal data on 87m customers with Cambridge Analytica. Earlier that year, Facebook was reportedly making £500,000 every five and a half minutes.
Under GDPR, the ICO can now fine organisations up to €20 million or, if they are classed as ‘undertakings’, up to the higher of €20 million and 4% of the undertaking’s total annual worldwide turnover.
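The ‘higher of’ rule can be sketched in a few lines. The function below is a purely illustrative calculation based on the description above (it is not legal advice, and the turnover figure is a made-up example):

```python
def gdpr_fine_cap(annual_worldwide_turnover_eur=None):
    """Illustrative upper limit of a GDPR fine, per the rule described
    in the article: EUR 20m, or for an 'undertaking' the higher of
    EUR 20m and 4% of total annual worldwide turnover."""
    base_cap = 20_000_000
    if annual_worldwide_turnover_eur is None:
        # Not classed as an 'undertaking': flat cap applies.
        return base_cap
    # Undertaking: whichever is higher of the flat cap and 4% of turnover.
    return max(base_cap, 0.04 * annual_worldwide_turnover_eur)

print(gdpr_fine_cap())               # → 20000000
print(gdpr_fine_cap(2_000_000_000))  # hypothetical €2bn turnover → 80000000.0
```

Note that the 4% limb only bites once annual worldwide turnover exceeds €500 million; below that, the €20 million cap governs.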
ICO fines charged to date, listed on the ICO’s website among other enforcement action, have fallen woefully below the permitted maxima, and are unlikely to prove a deterrent to large organisations at fault.
So, what punishment should the ICO mete out to HMRC? In 2017/18 HMRC collected over £600bn in revenues – including penalties of £1.7bn levied on taxpayers. Would such an organisation really feel the pinch if faced with a €20 million penalty? I think not.
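The scale of the mismatch is easy to quantify. A rough calculation, using the article’s £600bn revenue figure and an assumed exchange rate of €1 ≈ £0.88 purely for illustration, puts the maximum fine at a vanishingly small fraction of HMRC’s annual collections:

```python
# Rough illustration only: the exchange rate is an assumption,
# not a figure from the article.
max_fine_eur = 20_000_000
gbp_per_eur = 0.88                      # assumed rate for illustration
max_fine_gbp = max_fine_eur * gbp_per_eur
hmrc_revenue_gbp = 600_000_000_000      # 2017/18 collections, per the article

share = max_fine_gbp / hmrc_revenue_gbp
print(f"Maximum fine ≈ £{max_fine_gbp:,.0f}")
print(f"As a share of revenue collected ≈ {share:.5%}")
```

On these assumptions the maximum fine amounts to roughly 0.003% of a single year’s collections – about a quarter of an hour’s revenue.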
The need for trust
Members of the public should be able to trust that their privacy is at the forefront of the decisions made about their personal data – whether at HMRC or any other public or private body.
Time and again – for example, with iXBRL for corporation tax, data-bridging techniques for Making Tax Digital, and Voice ID – HMRC have sought to become the most digitally advanced tax authority in the world by operating at the ‘bleeding edge’ of new technologies, with seeming disregard for unwelcome risks this might attract for both HMRC and taxpayers.
On introducing Voice ID, HMRC said it was “well-proven” and “cutting edge” – expressions which are not necessarily synonymous. Although simple recording of phone calls has become commonplace, I sympathise with taxpayers who may be reluctant to have their biometric voiceprints recorded until the security aspects have been more rigorously tried and tested.
HMRC should observe the law. While an ICO fine might not guarantee HMRC’s future compliance with GDPR, the attendant reputational impact might persuade them to adopt a less negligent attitude to safeguarding the interests of taxpayers. The ICO should, therefore, charge them a robust penalty.
Article supplied by Taxing Words Ltd