On Jan. 21, 2019, the French Data Protection Authority (CNIL) levied a 50 million euro sanction against Google LLC1 for violating the EU General Data Protection Regulation2 (GDPR), in the first enforcement action in Europe under the GDPR. Google has already lodged an appeal against this decision with the French Supreme Administrative Court (Conseil d'Etat).3

The CNIL's decision touches upon four main aspects. First, it concerns the right of associations to bring actions on behalf of data subjects (I). Second, it sets limits on the "One-Stop Shop" mechanism introduced by the GDPR (II). Third, it sets high standards for the intelligibility of privacy policies directed at internet users (III). Finally, the CNIL describes the factors to be taken into account when determining the amount of a fine (IV).

I. Admissibility of complaints filed by data protection associations

The CNIL's investigation was triggered by the complaints of two associations: None of Your Business (NOYB), a not-for-profit organization founded by Max Schrems (a privacy activist known for having successfully denounced Facebook's mass data collection, which led the Court of Justice of the European Union (CJEU) to overturn the Safe Harbor agreement between the European Union (EU) and the United States),4 and La Quadrature du Net, a French advocacy group promoting the digital rights of citizens. These associations claimed Google did not have a valid legal basis to process its users' personal data, particularly for ad personalization purposes. They focused mainly on Android's setup process, where users had to create an account in order to use their device.

These complaints were filed in accordance with Article 80 of the GDPR, which provides that "the data subject shall have the right to mandate a not-for-profit body, organization or association which has been properly constituted in accordance with the law of a Member State, has statutory objectives which are in the public interest, and is active in the field of the protection of data subjects' rights and freedoms with regard to the protection of their personal data to lodge the complaint on his or her behalf." The French Data Protection Act5 is more restrictive in this regard, as it provides that associations need to have existed for at least five years to be able to represent data subjects.6

Because NOYB was established on June 12, 2017, Google challenged the admissibility of its complaint. The CNIL's response to this claim was twofold. First, the CNIL highlighted that, even if this complaint was not admissible, it could carry out an investigation on its own initiative. Second, the CNIL relied on Article 80 of the GDPR to establish that the two associations at issue met the conditions for filing a collective complaint. Surprisingly, the CNIL disregarded the restriction set forth in the French Data Protection Act. Nevertheless, it is unlikely that the French legislature will lift this restriction because it is aimed at preventing the creation of an association for the sole purpose of filing a complaint.

II. Limits to the new "One-Stop Shop" mechanism

The previous Directive 95/46/EC7 dealt very briefly with the issue of cooperation between various Data Protection Authorities (DPAs). It did not address the issue of which authority would be competent in cases of multinational companies established in several Member States.8 Therefore, it was not uncommon for these companies to be subject to inconsistent decisions issued by multiple DPAs across different Member States. In 2018, the CJEU held in the Facebook case that "a supervisory authority which is competent under its national law is not (...) obliged to adopt conclusions reached by another supervisory authority in analogous situation"9 and "is competent to assess, independently of the supervisory authority of the other Member State, the lawfulness of data processing."10 While this decision is based on the 1995 Directive, it continues to be of value in the framework of the GDPR, as is shown by the CNIL's Google order.

Notwithstanding this decision, and in order to address once and for all the lack of coordination between multiple DPAs and to reduce any administrative burden, uncertainty and inconsistency for data controllers and processors resulting therefrom, the GDPR introduced the "One-Stop Shop" mechanism. According to this mechanism, organizations carrying out cross-border personal data processing activities only have to deal with one DPA, the so-called lead supervisory authority.11

A. Strict interpretation of the "lead supervisory authority" concept

The identification of the lead supervisory authority depends on the location of the controller's "main establishment" in the EU. In the case at issue, the CNIL's competence depended on the determination of Google's main establishment. According to Google, its main establishment was in Ireland, where its European headquarters are located. Google therefore argued that the CNIL did not have jurisdiction over the claims, which should have been transmitted to the Irish Data Protection Commission (DPC).

The CNIL did not follow that line of reasoning. When determining Google's main establishment, it first recalled Article 4 (16) of the GDPR, which provides that a main establishment is, "as regards a controller with establishments in more than one Member State, the place of its central administration in the Union, unless the decisions on the purposes and means of the processing of personal data are taken in another establishment of the controller in the Union and the latter establishment has the power to have such decisions implemented, in which case the establishment having taken such decisions is to be considered to be the main establishment." The CNIL then concluded that the main establishment "should imply the effective and real exercise of management activities determining the main decisions as to the purposes and means of processing," and should be assessed in concreto, according to objective criteria.

In order to determine whether the main establishment should correspond to the controller's headquarters in the EU, the CNIL took into account several elements:

First, while recognizing that Google Ireland Limited had significant financial and human resources supporting Google's activity in Europe, the CNIL concluded that this fact alone was not sufficient to qualify Google's European headquarters in Ireland as its "main establishment."

Second, the CNIL underlined that Google Ireland Limited did not have any decision-making powers on the processing operations carried out in the context of the Android operating system and the services provided by Google LLC in relation to the creation of an account during the configuration of a mobile phone.

Third, Google Ireland Limited was not mentioned within Google's privacy policy as making decisions in this respect.

Fourth, the CNIL underlined that Google Ireland Limited was not appointed as the data protection officer in charge of data processing in the EU.

Based on the above, the CNIL concluded that the "One-Stop Shop" mechanism was not applicable.

B. Consequences of non-application of the "One-Stop Shop" mechanism

This conclusion allowed the CNIL to retain jurisdiction in light of Google's establishment in France.

The CNIL also held that it was not obliged to ask the European Data Protection Board (EDPB), which is composed of the national DPAs, to interpret the provision. In this respect, the CNIL underlined that, in accordance with the cooperation procedure, it had shared the complaints with the other DPAs in order to identify their respective competences and verify whether a lead supervisory authority had been designated. No DPA, including the Irish DPC, claimed to be the lead supervisory authority. The Irish DPC even stated so publicly: "Google LLC, an American company, is the data controller and so Google cannot avail of the [one-stop shop] mechanism at all."12

This decision makes apparent the limits of the newly introduced "One-Stop Shop" mechanism. A company's mere presence in a Member State does not trigger this system. The main establishment designated as such by the controller must have the necessary characteristics, pursuant to the criteria set forth in the GDPR and in the Guidelines of the Article 29 Data Protection Working Party (the predecessor of the EDPB, G29).13 Otherwise, companies must deal with each national supervisory authority. Google currently risks facing this problem with respect to other data processing operations, in particular its practice of providing users of its Android platform with location-specific advertising by tracking their movements through the "location history" and "web & app activity" features. The Irish DPC and the Swedish DPA seem willing to take the lead in these investigations.14

Undoubtedly, the GDPR does not provide an adequate solution to the situation in which a controller has two main establishments in different Member States and may thus be confronted with divergent individual measures from the respective competent DPAs. Article 56.2 of the GDPR even confirms that each supervisory authority is competent to handle a possible infringement if the subject matter substantially affects data subjects only in its own Member State. Nevertheless, the dispute resolution mechanism introduced by Article 65 of the GDPR may prove useful, since it allows the EDPB to arbitrate disputes arising between DPAs and to adopt harmonized decisions in individual cases.15 Hopefully, this mechanism will be applied in practice.

III. Far-reaching control of GDPR compliance measures

The CNIL's decision also shows its willingness to exercise far-reaching control over the measures implemented by companies to ensure compliance with the GDPR. Indeed, while recognizing Google's efforts toward greater transparency and better information for users, the CNIL concluded that the company did not comply with the GDPR's requirements. It held, first, that Google violated the transparency principle and, second, that it did not satisfy the requirement of a "lawful basis for processing" when serving personalized ads.

A. Lack of transparency and information

The GDPR places great importance on the companies' obligation to inform their users about the way in which their personal data are used and what rights they have to intervene in the processing. Article 13 of the GDPR lists the information that must be disclosed to data subjects before any processing takes place, such as the nature and purpose of the processing, the period for which the personal data is stored, the existence of the right to request from the controller access to and rectification or erasure of personal data, etc. Article 12 of the GDPR requires that this information be conveyed in a "concise, transparent, intelligible and easily accessible" form. In other words, controllers should ensure that information provided to data subjects allows those subjects to "determine in advance what the scope and consequences of the processing entails and that they should not be taken by surprise at a later point about the ways in which their personal data has been used."16

In this case, the CNIL concluded that Google failed to provide data subjects with sufficient transparency and information, for two main reasons.

First, the CNIL found that the information provided by Google was not intelligible because of its complex architecture. The information was spread across several documents provided at different times, making it difficult for users to access it in its entirety. In addition, users had to navigate a large amount of information, working through complex policies and clicking through several links, in order to understand how their personal data were used. For instance, the relevant information on the geo-tracking service was only accessible after several steps involving up to six positive actions by the user.

Second, the CNIL also took into account the nature of the processing and its concrete impact on data subjects. It explained that Google's data processing was particularly "massive and intrusive" because of the significant number of services offered by the company, the wide variety of sources from which the data originated (messaging services, YouTube, activities generated by users' web browsing, geolocation, etc.) and the nature of some of the data taken individually (in some cases the processing concerned sensitive data, such as interests, tastes or opinions). Taking into account the particular nature of Google's processing, the CNIL further concluded that users were not able to fully understand what they were agreeing to. Indeed, the purposes of the processing were described in a very generic way and the description of the data collected was "particularly incomplete and inaccurate." The CNIL referred to a balancing test: the more invasive the processing, the clearer the information that must be provided to users in order to satisfy the transparency requirement.

Interestingly, data protection authorities have encouraged a layered approach to transparency. For example, the recent Guidelines on consent jointly issued by the Privacy Commissioners of Canada, Alberta and British Columbia provide that "presenting information in a layered-format (...) helps make better sense of lengthy, complex information by offering a summary of the key highlights up front."17 In this decision, the CNIL nevertheless concluded that this approach is not enough on its own and that organizations must ensure that data subjects can easily understand the core processing activities. The CNIL seems to accept that Google could not provide complete information on data processing within the first layer of its policy, as doing so would itself run counter to the transparency requirement. It therefore distinguishes between the first layer of information, where users should be able to appreciate the number and scope of data processing operations, and further layers, where complementary information should be provided. By doing so, the CNIL obliges companies to determine for themselves, at their own risk, just the right amount of information to provide at each level.

B. Lack of effective consent

Article 4 (11) of the GDPR defines consent as "any freely given, specific, informed and unambiguous indication of the data subject's wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her."

Before assessing whether the consent was specific and unambiguous, the CNIL examined whether it was informed. For consent to be informed, the G29's Guidelines provide that "a controller must ensure that consent is provided on the basis of information that allows the data subject to easily identify who the controller is and to understand what they are agreeing to. The controller must clearly describe the purpose for data processing for which consent is requested."18 In this case, due to the dilution of the information across several documents, users could not form an informed view of what they were consenting to. Their consent was therefore not sufficiently informed.

The CNIL thus followed its case law under the former Data Protection Act. In its formal notice issued in November 2018 against Vectaury (a small ad-tech company focusing on providing geolocation-based data to retailers), the CNIL concluded that the consent was not informed, as the explanatory text displayed to the user upon launching the app was too complex and imprecise.19 Moreover, users were not informed of the identity of the companies with which their data was shared, as this information was only available after several clicks and scroll-downs.

With respect to the unambiguity of the consent, the CNIL recalled recital 32 of the GDPR, which requires a clear affirmative act given through an active motion or declaration. While recognizing that users could theoretically modify some options associated with their account by clicking on the "More options" link, the CNIL underlined that the account personalization settings were pre-checked by default. Moreover, if users did not click on the "More options" link and simply continued creating the account, their consent was deemed automatically given to Google. The CNIL therefore concluded that this consent was not unambiguous, as there was no affirmative action by the user to consent to ad personalization. In its decision, the CNIL thus followed the above-mentioned Guidelines, which state that "the GDPR does not allow controllers to offer pre-ticked boxes or opt-out constructions that require an intervention from the data subject to prevent agreement."20

Finally, the CNIL observed that the consent was not sufficiently specific. Indeed, in order to create an account, users had to agree to Google's terms of service and to the processing of their personal data as described in them, thereby accepting all data processing operations as a whole. However, the G29's Guidelines provide that consent must be specific to the purpose, as "a safeguard against gradual widening or blurring of purposes for which data is processed, after a data subject has agreed to the initial collection of the data."21 Consent should therefore be given distinctly for each purpose.

IV. Significance of the fine

The fine imposed by the CNIL, a first under the GDPR, shows its willingness to enforce the regulation. In October 2018, the UK Information Commissioner's Office (ICO) imposed a fine of 500,000 pounds on Facebook, under the former Data Protection Act, for breaches uncovered as part of the Cambridge Analytica investigations.22 Even though 50 million euros23 is manageable for a company with a turnover of over 96 billion euros, this fine is a warning sign.

Interestingly, when determining the amount of the fine, the CNIL took into account not only the importance of the principles at stake, the continuance of the violations and the massive and intrusive nature of Google's processing, but also the fact that Google's business model is based on the exploitation of personal data and that it has a dominant position in the operating system market.

This decision could therefore be seen as a warning to the largest tech companies. In this respect, for example, the Bavarian DPA recently announced that it was considering fining a number of companies under the GDPR for their cookie and user-tracking practices, as their websites lacked the transparency needed for informed consent.24

Companies in all sectors of business should therefore be mindful that they risk sanctions should they fail to comply with the GDPR, especially with regard to transparency and the validity of data subjects' consent. Particular attention should be given to privacy policies, which should provide easily accessible, concise information enabling users to fully understand the extent of the processing of their data. While a layered approach to transparency is encouraged by various DPAs, organizations must ensure that users can easily understand the core processing activities. This can be a very difficult task, especially when the processing concerns complex activities. As this decision shows, the threshold for what constitutes valid consent is extremely high.

Footnotes

1 Decision No. SAN – 2019-001 of Jan. 21, 2019, imposing financial penalty against Google LLC.

2 Regulation (EU) 2016/679 of April 27, 2016, on the protection of natural persons with regard to the processing of personal data and on the free movement of such data.

3 "Données personnelles: Google va faire appel de l'amende record infligée par la CNIL" (Le Monde, Jan. 24, 2019): https://www.lemonde.fr/pixels/article/2019/01/24/donnees-personnelles-google-va-faire-appel-de-l-amende-record-infligee-par-la-cnil_5413581_4408996.html.

4 CJEU, Oct. 6, 2015, Maximilian Schrems v Data Protection Commissioner, Case C-362/14.

5 Act No. 78-17 of Jan. 6, 1978, on information technology, data files and civil liberties, as amended by Act No. 2018-493 of June 20, 2018, on personal data protection.

6 Article 43 ter of the French Data Protection Act.

7 Directive 95/46/EC of Oct. 24, 1995, on the protection of individuals with regard to the processing of personal data and on the free movement of such data.

8 Article 29 of the Directive.

9 CJEU, June 5, 2018, Unabhängiges Landeszentrum für Datenschutz Schleswig-Holstein v. Wirtschaftsakademie Schleswig-Holstein GmbH, Case C-210/16, para 70.

10 ibid para 74.

11 Article 56 of the GDPR: "Without prejudice to Article 55, the supervisory authority of the main establishment or of the single establishment of the controller or processor shall be competent to act as lead supervisory authority for the cross-border processing carried out by that controller or processor in accordance with the procedure provided in Article 60."

12 G. Doyle, "Who regulates Google" (The Irish Times, Feb. 15, 2019):
https://www.irishtimes.com/opinion/letters/who-regulates-google-1.3608052.

13 G29, Guidelines for identifying a controller or processor's lead supervisory authority, April 5, 2017.

14 Irish DPC, "Statement from the Data Protection Commission on Google and the use of location data," Nov. 26, 2018: https://www.dataprotection.ie/ga/news-media/press-releases/statement-data-protection-commission-google-and-use-location-data.
Swedish DPA, "Request for Reply and Further Clarification," Jan. 19, 2019:
https://www.datainspektionen.se/globalassets/dokument/ovrigt/google---request-for-reply-and-further-clarification---skrivelse-till-tillsynsobjekt.pdf.

15 M. Kaiser, "The diversified structure of European data protection enforcement – national authorities between autonomy, hierarchy and cooperation," BRJ, 2017, p. 41.

16 G29, Guidelines on transparency under Regulation 2016/679, April 11, 2018, WP260 rev.01, p. 10.

17 Privacy Commissioners of Canada, Alberta and British Columbia, Guidelines for obtaining meaningful consent, May 2018.

18 G29, Guidelines on consent under Regulation 2016/679, April 10, 2018, WP259 rev.01, p. 14.

19 Decision No. MED 2018-042 of Oct. 30, 2018, issuing notice to Vectaury.

20 G29, Guidelines on consent (n. 19) p. 16.

21 ibid p. 12.

22 ICO, Facebook Ireland Ltd monetary penalty notice, Oct. 24, 2018.

23 According to Articles 47 and 65 of the Act No. 2016-1321 of Oct. 7, 2016, the CNIL could not impose penalties exceeding 3 million euros.

24 Bavarian DPA, "Sicher im Internet – Digitale Dienste im Datenschutzcheck am Safer Internet Day 2019," Feb. 1, 2019: https://www.lda.bayern.de/media/pm2019_2.pdf; https://www.lda.bayern.de/media/sid_ergebnis_2019.pdf.