Reducing The Risk Of ICO Enforcement Notices And Penalties Under The GDPR


Since the GDPR came into force on 25 May 2018, there have been hundreds of thousands of reported GDPR breaches, resulting in enforcement action for non-compliance and/or penalties for data protection breaches. With regard to the latter, regulators across Europe have imposed much more severe penalties than previously seen under the Data Protection Act 1998 (DPA 1998), the record being £183m against British Airways for a breach involving over 500,000 customers’ personal data.

It is not only large, global corporates that are penalised under the GDPR. Smaller organisations and even individuals have been penalised for GDPR breaches, for example: estate agents failing to keep tenants’ personal data safe; social workers emailing individuals’ sensitive data to their personal email addresses without authorisation; company directors selling personal data (which can lead to director disqualification for a period of 5 years); and organisations sending marketing material electronically to customers without their consent. Lack of transparency, the absence of a lawful basis for processing, and failure to obtain valid consent all feature in the growing number of complaints to the Information Commissioner’s Office (ICO).

Complaints relating to electronic communication

The DPA 1998 remains in force for the purposes of the Privacy and Electronic Communications (EC Directive) Regulations 2003 (“the PECR”), notwithstanding the introduction of the DPA 2018. The PECR give effect to the ePrivacy Directive (2002/58/EC). Section 55 of the DPA 1998 prohibits obtaining personal data without the consent of the data controller, and selling or sharing that data. Many complaints to the ICO relate to unsolicited direct marketing messages. Individuals have stated, via the ICO’s online reporting tool, that where they have not given their consent to receive marketing messages, they find it “concerning and worrying” how a company has managed to get hold of their personal information, e.g. a mobile phone number, and that they do not know what other information the company may hold about them.

Minimising consent breaches

Organisations can minimise consent breaches by taking the following steps:

  1. Do not have consent ‘yes’ boxes ticked by default – under the GDPR, the default must always be ‘no’ – explicitly ask the customer whether they would like to receive marketing information electronically;
  2. Review any data sharing agreements with third parties and ensure that, where you do share data with a third party (with the customer’s consent), the third party obtains the customer’s consent separately before sending them any marketing material;
  3. Where you receive personal data from a third party, always verify the customer’s consent before sending any marketing material, even if the third party informs you that consent has been obtained; and
  4. Transparency: where a customer’s data has been shared with you by a third party, always tell the customer where you obtained their data and what you intend to do with it.
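The consent checks above can be sketched as a simple guard in code. This is an illustrative sketch only, not a reference to any specific library or the ICO’s requirements: the names (`ConsentRecord`, `can_send_marketing`) and fields are hypothetical, but the logic mirrors the list – no pre-ticked defaults, and third-party assurances of consent are not relied upon without direct verification.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ConsentRecord:
    """Illustrative consent record; all field names are hypothetical."""
    customer_id: str
    opted_in: bool = False           # default must be 'no' – never pre-ticked
    source: str = "direct"           # "direct" or "third_party"
    verified_directly: bool = False  # re-confirmed with the customer ourselves

def can_send_marketing(record: Optional[ConsentRecord]) -> bool:
    """Return True only where valid, verified opt-in consent is held."""
    if record is None or not record.opted_in:
        return False  # no record at all, or the customer never opted in
    if record.source == "third_party" and not record.verified_directly:
        return False  # a third party's assurance alone is not enough
    return True

# A pre-ticked default would fail here: consent must be an explicit 'yes'.
assert not can_send_marketing(ConsentRecord("c1"))
# Third-party-sourced consent must be verified directly before use.
assert not can_send_marketing(ConsentRecord("c2", opted_in=True, source="third_party"))
assert can_send_marketing(
    ConsentRecord("c3", opted_in=True, source="third_party", verified_directly=True)
)
```

In practice such a check would sit in front of any marketing dispatch, so that the default path – absent an explicit, verified ‘yes’ – is always to send nothing.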


Automated decision-making and profiling

Article 22 of the GDPR provides even stricter conditions to protect individuals from automated decision-making that has legal or similarly significant effects on them. More specifically, solely automated decisions, i.e. decisions made without human input or where the human review of such decisions is not meaningful, could amount to a breach of Article 22. Examples include an online decision to award a loan, or a recruitment aptitude test that uses pre-programmed algorithms and criteria. Profiling goes further: it collates data on an individual to ascertain their buying habits or lifestyle, with the aim of predicting their behaviour or making decisions about them.

Organisations using or intending to use artificial intelligence (AI) applications and/or profiling should ensure that human reviewers can override automated decisions, and are not penalised for doing so. Where the level of human decision-making is low or not meaningful, the risk of a breach of Article 22 is higher. The ICO has published guidance on what level of human involvement in automated decision-making is meaningful (see https://ico.org.uk/for-organisations/guide-to-data-protection/guide-to-the-general-data-protection-regulation-gdpr/automated-decision-making-and-profiling/what-does-the-gdpr-say-about-automated-decision-making-and-profiling/ ). It remains to be seen whether the increased use of AI will result in an increase in fines under the GDPR.
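The human-override point can be illustrated with a minimal sketch. Again, this is hypothetical code rather than anything mandated by the GDPR or the ICO guidance: the names (`Decision`, `review`) are invented, but the shape shows what a meaningful review could look like – the reviewer considers the case afresh, their judgment can displace the automated outcome, and the override is recorded rather than penalised.

```python
from dataclasses import dataclass

@dataclass
class Decision:
    """Illustrative record of an automated decision; names are hypothetical."""
    subject_id: str
    automated_outcome: str   # e.g. "decline" produced by a scoring algorithm
    final_outcome: str
    human_reviewed: bool = False

def review(decision: Decision, reviewer_outcome: str) -> Decision:
    """A human reviewer considers the case afresh and may override the
    automated outcome; both outcomes are retained for the audit trail."""
    return Decision(
        subject_id=decision.subject_id,
        automated_outcome=decision.automated_outcome,
        final_outcome=reviewer_outcome,  # the reviewer's judgment prevails
        human_reviewed=True,
    )

auto = Decision("applicant-42", automated_outcome="decline", final_outcome="decline")
final = review(auto, reviewer_outcome="approve")
assert final.human_reviewed and final.final_outcome == "approve"
```

The key design point is that the reviewer’s outcome is written over the automated one while the original is preserved – a rubber-stamping step that could never change the result would not amount to meaningful human involvement.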

Amanda Lathia
Amanda joined Hunters in the Business Services team in October 2018.
Before pursuing a legal career, Amanda graduated with a M.Eng in Aeronautical Engineering with French from Bristol University and worked as a computer programmer/business analyst for British Airways and an e-commerce software company.


