By Christina Etteldorf
The Irish Data Protection Commissioner (DPC) announced in mid-September that it had opened two own-volition inquiries against the popular social networking and video service TikTok.1 On the one hand, these proceedings are intended to assess TikTok’s compliance with the GDPR’s data protection by design and by default requirements in the context of platform settings, and with transparency obligations concerning minors. On the other hand, the DPC plans to take a closer look at possible data transfers to countries outside the EEA. “It’s about time”, one might say: the data protection policy of TikTok, whose social networking features make it especially popular with children, has been under critical scrutiny for quite some time and has therefore long been on the radar of many data protection authorities (DPAs) in the EU Member States. However, very little is known about the proceedings at present, and more is unlikely to become known any time soon – at least if the latest DPC proceedings with cross-border relevance, against WhatsApp and Twitter, are anything to go by. Yet this is not surprising given the complexity of the procedural and substantive questions arising here.
The tricky matter of jurisdiction
According to the company, TikTok’s headquarters are located in California, USA. Its owner ByteDance, however, is based in Beijing, China, from where it also operates the Chinese version of TikTok (‘Douyin’). With the platform’s growing popularity in Europe, branches in several EU Member States have also been established and significantly expanded. For the application of the GDPR, this is irrelevant at first sight, as its market location principle is linked to the processing of EU users’ data – regardless of where this takes place. This also entails jurisdiction of (all) national DPAs. However, it gets tricky when it comes to enforcement measures: put simply, only the lead supervisory authority can take final measures, which, in cases of cross-border relevance, are regularly tied to conducting a cooperation procedure involving the European Data Protection Board (EDPB). Lead jurisdiction is determined by the location of the data controller’s main establishment in the EU. The latter was (and perhaps still is) unclear for TikTok because of the platform’s rapid rise and the changes to it and to the subsidiaries established.
This is documented by a large number of proceedings by EU DPAs against TikTok, two examples being Denmark2 and France3. However, these proceedings regularly did not lead to action against TikTok; rather, the DPAs shifted their efforts to supranational cooperation processes. In June 2020, the EDPB even established a taskforce to coordinate potential actions and to acquire a more comprehensive overview of TikTok’s practices across the EU.4 Only two DPAs took the final step of sanctioning the social platform: in January 2021, the Italian DPA imposed an immediate ban on the processing of data of those individuals whose age cannot be determined with certainty by TikTok, reasoning that the network’s information for users was opaque and unclear and that its default settings violated data protection law.5 The Dutch DPA imposed a fine of EUR 750,000 on TikTok in April 2021, mainly on the grounds that the privacy policy was only provided in English and that Dutch users, especially minors, were therefore not informed in a sufficiently clear and comprehensible manner about the handling of their personal data.6 In these proceedings, TikTok regularly contested the DPAs’ authority to act, referring to the DPC’s lead jurisdiction on the basis that its Dublin branch was to be considered its EU headquarters since July 2020.7 Meanwhile, TikTok’s legal notice also points out that the platform is operated for European users by TikTok Technology Limited, seated in Ireland.
Although the DPC proceedings now opened do not definitively establish jurisdiction, which depends on objective (partly evaluative) criteria of a main or single establishment (Art. 56(1) GDPR), they are nevertheless an indicator that Ireland wants to take up this role.
The opaque matter of data transfers overseas
The territorial spread of TikTok is also related to another question relevant under data protection law: where does the processing of EU citizens’ data by TikTok actually take place? Storing user data on Californian servers, for example, or merging it in Beijing with data from the Chinese parent company, is only permitted under the GDPR if it is ensured that the EU level of protection is not undermined. The Commission can establish the existence of such an adequate level of protection in a specific country by means of an adequacy decision (Art. 45 GDPR); it has, however, not done so for China or the US. This means that data controllers such as TikTok, if they do not keep processing operations exclusively within the EU, must themselves ensure a level of protection comparable to the GDPR. As the CJEU emphasised in its Schrems II ruling8 (which brought down the EU-US Privacy Shield in the first place), this may require the adoption of additional safeguards, which in many regards (such as the possibilities of data access by police and security authorities in other countries) cannot even be realised by private companies. The DPC will now have to critically examine whether such data transfers abroad are taking place. That this is difficult to accomplish, especially in the case of large foreign tech companies, owing to the opacity of their processing procedures, is shown, for example, by the recent emergency decision of the EDPB on the merging of data between Facebook and WhatsApp. In this case – which, by the way, is also under the lead responsibility of the DPC – the EDPB had to limit itself to the statement that there is indeed “a high likelihood that Facebook IE already processes WhatsApp’s user data […], however, the EDPB is not in a position to determine whether such processing takes place in practice”9.
The urgent matter of protecting minors’ data
The most pressing issue in the DPC proceedings, however, and the one whose assessment will be most eagerly awaited, is the protection of minors’ data. The GDPR emphasises in several places that the data of minors require special protection. For example, a particularly high level of protection is to apply in the area of advertising, profiling and the comprehensible provision of information (recitals 38 and 58). Consent for the use of information society services is also subject to special conditions: according to Art. 8(1) GDPR, for children under the age of 16, consent must be given or authorised by the holder of parental responsibility, although Member States may provide for a different age threshold (not below 13 years). Data controllers should ensure this by making “reasonable efforts” taking “into consideration available technology” (Art. 8(2)). Beyond that, the GDPR does not specify what this means in concrete terms, for example whether tools for age verification or parental control are mandatory on platforms.
It does, however, provide that this issue can be dealt with by industry codes of conduct (Art. 40(2) GDPR), and it falls within the DPAs’ duty to provide guidance (Art. 57(1)(b) GDPR). Unfortunately, such a uniform standard for the protection of minors’ data has not yet been developed in the EU – although the drafting of guidelines in this area was on the EDPB’s to-do list for 2019/2020.10 The DPC itself has also published draft guidelines and had them evaluated in a public consultation, without, however, having published them as final guidance yet.11 Here, the DPC lays down 14 fundamental principles that are formulated in a rather “catchy” way (e.g. “consent doesn’t change childhood” or “your platform, your responsibility”) but nevertheless attempt to concretise the broad wording of the GDPR in some areas. Rule 4, for example, under the heading “know your audience”, provides that online service providers should take steps to identify their users and, if necessary, ensure a child-specific level of data protection. Rule 11 states: “You can’t absolve yourself of your controller responsibilities to child users by simply stating that children below a certain age aren’t welcome on your platform/service […]”.
Looking at TikTok’s registration process and privacy policy against this background, a partial prediction for the upcoming DPC proceedings is already possible: although TikTok asks for the date of birth when registering, no secure age verification system is implemented. The privacy policy for the EEA, UK and Switzerland notes that the service is not available to persons under 13 years of age and that certain functions are restricted for persons under 18 years of age.12 At the time of writing this post, a link at that point to a safety centre for parental controls led only to an “Oops! 404” page. In addition, the policy describes the use of user data for advertising purposes, for profiling and for location tracking.
The DPC will therefore have plenty to consider. It will be particularly interesting to see how it assesses the age threshold of 13 years set by TikTok (which is most likely owed to the GDPR’s minimum age threshold). In Ireland, the threshold for consent on the basis of the GDPR’s opening clause is 16, as it is in other countries such as Germany, while Belgium and Sweden, for example, set the age at 13, and Italy and Spain at 14.13 In this light, it is to be welcomed that, due to the cross-border significance of the proceedings, the GDPR’s cooperation mechanisms will most likely be triggered, in the context of which the DPAs of other Member States can also raise their concerns. Not so welcome, however, are the associated delays already experienced in the proceedings against Twitter and WhatsApp, which were decided more than two years after their initiation and were accompanied by conflicts over the assessment conducted by the DPC.14
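To illustrate how this patchwork of national thresholds plays out in practice, the following is a minimal, purely hypothetical Python sketch of a consent-age check along the lines of Art. 8(1) GDPR. The country list contains only the examples mentioned above (it is not a complete EU-wide mapping), the function name and structure are invented for illustration, and nothing here reflects TikTok’s actual implementation.

    # Purely illustrative: a consent-age check in the spirit of Art. 8(1) GDPR.
    # The thresholds below are only the examples cited in this article.

    GDPR_DEFAULT_AGE = 16  # Art. 8(1) GDPR default where no national rule is listed

    # National ages of digital consent set via the Art. 8(1) opening clause
    # (hypothetical, incomplete mapping based on the examples in the text)
    NATIONAL_AGE_OF_CONSENT = {
        "IE": 16,  # Ireland
        "DE": 16,  # Germany
        "BE": 13,  # Belgium
        "SE": 13,  # Sweden
        "IT": 14,  # Italy
        "ES": 14,  # Spain
    }

    def parental_consent_required(age: int, country: str) -> bool:
        """True if consent must be given or authorised by the holder
        of parental responsibility (Art. 8(1) GDPR)."""
        threshold = NATIONAL_AGE_OF_CONSENT.get(country, GDPR_DEFAULT_AGE)
        return age < threshold

    # A 13-year-old may consent alone in Belgium, but not in Ireland:
    print(parental_consent_required(13, "BE"))  # False
    print(parental_consent_required(13, "IE"))  # True

Trivial as such a check is, it presupposes the very thing the DPC will have to scrutinise: that the platform actually knows, with sufficient certainty, both the user’s age and the applicable Member State.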
This is all the more worrying as the protection of children in the digital environment is an issue that is currently gaining momentum, especially at the European level.15 Online interaction has become a matter of course for the generation growing up now and in the future; an appropriate level of protection for children has not. It should be emphasised that data protection law concerns the protection of children’s privacy, i.e. initially “only” questions of the undisturbed exercise of private life and thus also of “growing up”. It does not directly concern issues of youth media protection per se, i.e. the extent to which children must be protected from content that impairs their development, such as pornography or violence, or whether and how measures must be taken against dangerous phenomena such as cybergrooming or cyberbullying. Online, however, the two areas are strongly interwoven. On the one hand, data protection law directly addresses issues such as how children may be targeted by advertising or to what extent personality profiles may be created, which has direct consequences for the selection of content displayed to them on social networks and other platforms. On the other hand, it is indirectly linked to the online protection of minors, because only those who know the age of their users (meaning those who collect these data) can take the necessary protective measures in other areas (age recommendation systems, closed user groups, advertising restrictions, restriction of direct messages, etc.). Data protection law is thus an important interface and connecting point for the protection of minors online as a whole.
Christina Etteldorf is a Research Associate at the Institute of European Media Law, Saarbrücken (Germany).
1 Press release of 14.09.2021, https://www.dataprotection.ie/en/news-media/latest-news/dpc-launches-two-inquiries-tiktok-concerning-compliance-gdpr-requirements-relating-processing.
2 Datatilsynet, press release of 30.06.2020, https://www.datatilsynet.dk/presse-og-nyheder/nyhedsarkiv/2020/jun/datatilsynet-undersoeger-tiktok.
3 Le Monde, 11.08.2020, La CNIL lance une enquête sur TikTok, https://www.lemonde.fr/tiktok/article/2020/08/11/la-cnil-lance-une-enquete-sur-tiktok_6048695_6013190.html.
4 EDPB, thirty-first Plenary session, https://edpb.europa.eu/news/news/2020/thirty-first-plenary-session-establishment-taskforce-tiktok-response-meps-use_en.
5 GPDP, press release of 03.02.2021, https://www.gpdp.it/web/guest/home/docweb/-/docweb-display/docweb/9524224#english_version.
6 Autoriteit Persoonsgegevens, decision of 09.04.2021, https://autoriteitpersoonsgegevens.nl/sites/default/files/atoms/files/decision_to_impose_a_fine_on_tiktok.pdf.
7 Cf. on this Lievens, E., ‘Dutch DPA Fines TikTok for not Offering Understandable Information to Children’, European Data Protection Law Review 7(2021)3, pp. 423–428.
8 CJEU, judgment of 16.07.2020, C-311/18 (Schrems II).
9 EDPB, Urgent Binding Decision 01/2021 of 12.07.2021, https://edpb.europa.eu/our-work-tools/our-documents/urgent-binding-decision-board-art-66/urgent-binding-decision-012021_en, p. 48.
10 EDPB, Work Program 2019/2020, https://edpb.europa.eu/sites/default/files/files/file1/edpb-2019-02-12plen-2.1edpb_work_program_en.pdf.
11 DPC, The “Children’s Fundamentals” – A guide to protecting children’s personal data, https://www.dataprotection.ie/en/dpc-guidance/blogs/the-children-fundamentals.
12 https://www.tiktok.com/legal/privacy-policy-eea.
13 Bird&Bird, Children online, https://www.twobirds.com/en/in-focus/general-data-protection-regulation/gdpr-tracker/children.
14 Cf. EDPB, Decision 01/2020 on the dispute arisen on the draft decision of the Irish Supervisory Authority regarding Twitter International Company under Article 65(1)(a) GDPR, https://edpb.europa.eu/our-work-tools/our-documents/binding-decision-board-art-65/decision-012020-dispute-arisen-draft_en; Binding decision 1/2021 on the dispute arisen on the draft decision of the Irish Supervisory Authority regarding WhatsApp Ireland under Article 65(1)(a) GDPR, https://edpb.europa.eu/our-work-tools/our-documents/binding-decision-board-art-65/binding-decision-12021-dispute-arisen_en.
15 Council of Europe, Recommendation CM/Rec(2018)7 of the Committee of Ministers to Member States on guidelines to respect, protect and fulfil the rights of the child in the digital environment, https://search.coe.int/cm/Pages/result_details.aspx?ObjectId=09000016808b79f7; European Commission, Communication on an EU strategy on the rights of the child, COM(2021) 142 final, https://eur-lex.europa.eu/legal-content/en/TXT/?uri=CELEX%3A52021DC0142; cf. on this Lievens, E., ‘Dutch DPA Fines TikTok for not Offering Understandable Information to Children’, European Data Protection Law Review 7(2021)3, pp. 426 et seq. with further references.