Are companies violating individuals' human rights by relying on consent and legitimate interest to process personal data? On 3rd November 2019, the Joint Committee on Human Rights ("JCHR") published a report on the impact that companies' personal data processing practices have on human rights and on whether the current regulatory regime is sufficiently robust.

The Regulations

The Data Protection Act 2018 and General Data Protection Regulation ("GDPR") were introduced eighteen months ago with a key aim of giving individuals greater control over their personal data. The Human Rights Act 1998 protects, among other things, individuals' right to respect for private and family life and freedom from discrimination. The JCHR considered whether the data protection legislation has fallen short of this goal and whether its implementation by private (primarily online) companies has failed to prevent them from increasingly encroaching on these fundamental human rights.

Key observations

The use of the internet and online digital platforms has become central to people's home and work lives. The rapid development of technology and the growth of companies offering free online services in exchange for data have created new business models in which millions of individuals'¹ personal data is collected and sold online. To do this, companies most commonly rely on 'consent' or 'legitimate interest' as a legal basis for processing the personal data. However, the JCHR concludes that the 'consent model is broken' and that legitimate interest is not sufficiently understood, fostering an online industry that encroaches on individuals' human rights.

Consent

Consent to process and share personal data must be 'freely given, specific and informed' under the GDPR. Consent is generally collected online via a 'tick box' stating that the individual consents to their personal data being used subject to a privacy policy. The tick box will link to a privacy policy, many of which, according to a BBC study in June 2018, are written at 'a university reading level' and are more complicated to read than Charles Dickens' A Tale of Two Cities.

The JCHR report concludes that the consent model unreasonably places the onus on individuals to educate themselves in order to understand the risks associated with sharing their personal data online. The complexity of privacy policies makes it almost impossible for individuals to understand what consent they are giving. Further, many businesses make use of their online services conditional on agreeing to their non-negotiable terms. Therefore, consent is often not 'informed' or 'freely given' under the GDPR.

Legitimate Interest

Personal data can be processed where 'it is necessary for the purpose of the legitimate interests' pursued by a company. The data protection legislation describes broad areas where legitimate interest could be relied upon. The JCHR report observes that there is a general lack of understanding of what constitutes a legitimate interest and calls for clearer guidance and a rigorous process to test whether companies are relying on legitimate interests appropriately.

How does this affect human rights?

Despite data protection regulations, companies are routinely buying, selling and sharing people's data without the individual's true consent or knowledge, clearly infringing on their right to privacy.

The JCHR report details how data is now being shared and at times aggregated to create online profiles of individuals without their knowledge, a concern shared by a number of European privacy regulators including the UK Information Commissioner's Office. These online profiles are often used for online targeted advertising. The algorithms used for targeted advertising draw inferences from a person's online profile and decide whether to show that person particular ads, for example for a job or a service. As a result, the individual, who has no way of knowing about their online profile or correcting any of its inaccuracies, may be discriminated against and denied access to certain services and opportunities because of their online demographic.
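By way of illustration only, the minimal sketch below (in Python, using entirely hypothetical profile fields, inferences and exclusion rules rather than any company's actual targeting logic) shows how a decision based on inferred traits can silently withhold an opportunity from someone who never sees the profile, the inferences or the decision itself.

```python
# Illustrative sketch only: hypothetical profile fields, inferences and
# thresholds; not a real advertising platform's targeting system.

from dataclasses import dataclass


@dataclass
class OnlineProfile:
    """Profile aggregated from browsing history and third-party data sources."""
    user_id: str
    inferred_age_band: str        # e.g. "18-24", "55+" (an inference, possibly wrong)
    inferred_income_band: str     # e.g. "low", "high"
    inferred_interests: list[str]


def should_show_job_ad(profile: OnlineProfile) -> bool:
    """Decide whether a job advert is shown, based purely on inferred traits.

    The individual never sees the profile, cannot correct an inaccurate
    inference, and never learns that the advert (and the opportunity behind
    it) was withheld.
    """
    if profile.inferred_age_band == "55+":      # inferred, not declared
        return False                            # silently excluded
    if profile.inferred_income_band == "low":
        return False
    return "career change" in profile.inferred_interests


# A mis-inferred age band is enough to exclude this person from ever seeing the ad.
profile = OnlineProfile("user-123", "55+", "high", ["career change"])
print(should_show_job_ad(profile))  # False
```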

The JCHR conclusions

The JCHR has recommended (among other things) the following:

  • Introduction of human rights impact assessments and due diligence to review how a company's gathering and sharing of customer data may adversely affect the human rights of its users;
  • Stronger enforcement of the data protection regulations, and a review as to whether further legislation is required to make companies take more responsibility for the safety of their users; and
  • The introduction of a mechanism, similar to a subject access request, giving individuals the right to request the data that companies have generated about them, so that they can understand any inferences drawn, for example whether they have been refused access to a service based on inferences taken from online aggregated data.

Why is this important?

The implementation of the data protection legislation is a changing landscape. On 8th April 2019, the Government published its Online Harms White Paper, which concluded that the legal framework provides adequate protection against the misuse of people's data by internet companies. However, the JCHR is urging the Government to reconsider this position and to include violations of people's right to privacy and freedom from discrimination in its list of 'online harmful activity'. If you are a private company that utilises machine learning for targeted advertising, or that relies on legitimate interest or consent as a legal basis for processing customer personal data, it may be worth taking the time to consider whether your practices could impact on those individuals' human rights and whether any changes are needed.

Footnotes

1. The Office for National Statistics in the UK stated that 87% of all adults used the internet daily or almost every day in 2019. 

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.