Children’s Privacy – lost on TikTok? DPC fines €345 million

TikTok is the most popular social media service among children. The Irish DPC issued a decision on the platform’s processing of children’s data, imposing a fine of €345 million on TikTok together with an order to bring the processing into compliance with the GDPR. However, the decision has a very limited scope. The DPC held that the public-by-default setting applied when opening an account was not compliant with the GDPR. In contrast, the lack of an effective age verification procedure for children under 13 was not regarded as an infringement of the GDPR.


The DPC acted as the lead supervisory authority (LSA) in the case since TikTok Limited (TTL) has its main establishment in Ireland.

The personal data concerned was the data of children. TTL’s terms open the platform to minors aged 13 and older. In Apple’s App Store, however, the TikTok app was available from the age of 12. Under the GDPR, minors are children until the age of 16, whereas Member States are granted an opening clause to deviate from this age. In practice, children under 13 can access the platform by faking their age, since TTL does not verify the age of users in advance.

The data in question included sensitive data within the meaning of Art. 9 GDPR, such as health data or information on political opinions or religious beliefs.

The decision addressed the period from 29 July 2020 to 31 December 2020.

From TTL’s perspective, the purpose of TikTok’s data processing is to provide a “global entertainment platform“. However, users can communicate with each other, and the EDPB regarded TikTok as a social media service in its binding decision.

The proceeding addressed only two types of data processing: firstly, the public-by-default setting of the account, and secondly, the age verification process.

The default setting of the account was public. As a consequence, any TikTok user could see a child’s identity and account data. Similarly, posted videos were published to TikTok users and were additionally visible to non-members. Children had to actively opt out in order to choose the private setting. In addition, TikTok did not provide children with understandable information about the consequences of the public account setting. The wording was ambiguous and did not clearly address, for example, who the recipients of the videos were.

TTL provided several options for pairing with a child’s account, e.g. the “Family pairing“ option. This setting allowed a “Parent user“ to access and control the Child user’s platform settings. In addition, the “Parent user“ could enable direct messages for Child users above the age of 16. Since TikTok did not provide a verification process to control access to the “Parent user“ role, any user could register as the “Parent user“ of any child.

In its binding decision, the EDPB broadly analysed the technical and organisational measures (TOMs) under Art. 25 GDPR intended to protect children from the risks arising from the processing of their data. In particular, the EDPB examined whether TTL’s age verification process met the state-of-the-art requirement for TOMs under Art. 25 GDPR. The binding decision distinguished between ex ante and ex post measures restricting access by minors under 13. While TTL provided some checks after an account was opened, there was no process in place to prevent children under 13 from accessing the platform in the first place. However, the EDPB found that there was not enough information to determine the state-of-the-art standard for the relevant period. Therefore, the EDPB held that an infringement by TTL could not be established.


After the LSA provided a draft decision, several concerned supervisory authorities (CSAs) objected to the findings of the LSA. As a consequence, the EDPB adopted a binding decision under the dispute resolution mechanism of Art. 65 GDPR. Hence, the LSA was bound to draft a final decision taking into account the conclusions of the EDPB.

The binding decision of the EDPB particularly addressed the objections of the Italian and German SAs. The German SA raised the aspect of TikTok nudging users towards settings with a negative effect on their privacy. This was regarded as an infringement of the principle of fairness (Art. 5 (1) GDPR). In addition, the wording of the information and the missing links to this information were not in line with Art. 12 and 13 GDPR.

After the period in question, in January 2021, TikTok changed the default setting for children’s accounts to private by default.

The DPC imposed the following fines:

  • €100 million for the public-by-default setting,

  • €65 million for the “Family pairing“ option and

  • €180 million for the nudging as well as the missing information about the public-by-default setting and about the recipients.

In addition, the DPC ordered TTL to bring these issues into compliance with the GDPR.

The Irish DPC regarded the infringements of the GDPR as wilful only in part.

Assessment: Children’s Privacy on TikTok

According to the DPC, the risks to children include physical harm, online grooming or other sexual exploitation, normalisation of sexual comments, social anxiety, self-esteem issues, bullying or peer pressure, access to harmful or inappropriate content, and the loss of anonymity.1 The DPC regards these risks as deriving from ‘bad actors’ and not directly from TikTok.

If we widen the limited scope of the proceeding to the broader question of children’s privacy on TikTok, additional issues have to be taken into account.

What about the tracking of children’s online behaviour? What are the effects of the TikTok algorithm and of the extensive consumption of short videos on mental health? Does the Chinese regime have access to the children’s data tracked by TikTok?

Children are one of the most vulnerable groups in our society, and TikTok is the most popular social media service among them. Within the EU, several million users are children.

This is not just an implementation deficit after five years of the GDPR. Children are ‘let alone’ on TikTok – in the worst sense of the term.

1 DPC, Final decision, p. 100.