Children’s Rights in the Digital Space

by Sarah Kunz von Hoyningen-Huene and Jutta Oberlin

The digital age has fundamentally changed not only our professional environment but also our private social interactions. Digital communication channels make it easy to reach many people and to let them participate in one’s own life or family life via social platforms.

The profiles of so-called “family bloggers” have thousands to millions of followers who watch the lives of underage social media stars on public profiles every day and are informed about even the smallest details of the family’s life. No area of life is left out, whether it is the bedtime ritual, tooth brushing, a vacation or even the last doctor’s visit, including the diagnosis.

In some cases the parents themselves are successful social media stars and share information about intimate topics such as pregnancy or fertility. The children of the family may be shown with little to no clothing and in embarrassing moments, which the parents find funny or worth showing in order to connect with their audience, appear relatable and foster a closeness that translates into monetary benefits. As if following a script, the children talk about adult topics such as coffee, clothes, trends and makeup. How self-determined the children’s social media presence is remains unknown. What is certain, however, is that families make fortunes through clicks, paid partnerships and traffic on social media.1

1. Fundamental Rights of Children

The growing awareness of children as independent members of society over the past few decades led to a fundamental reevaluation of the treatment of children and their legal status. This rethinking was ultimately reflected in the United Nations Convention on the Rights of the Child (CRC), adopted in 1989 as the first comprehensive, legally binding international instrument on children’s rights. Switzerland ratified the Convention in 1997.2 Art. 14 of the Swiss Civil Code3 defines a “child” in the same way as the CRC: every person under the age of 18 is considered a minor and therefore a child. The best interests of the child have priority; they should determine the legal status of children within the legal system of every member state. There are still areas in which Switzerland has room for improvement4: the legal framework for enforcing children’s right to a non-violent upbringing remains inadequate; the ideal of equal opportunities, especially in education, needs to be implemented more fully; and federalism leads to differing treatment of children across cantons, particularly with regard to protection from neglect and violence. In sum, the legal situation of children in Switzerland is generally satisfactory compared to other countries, and progress is undeniable.5

We must never lose sight of the fact that children’s rights have improved tremendously compared to the last century. Most people in Switzerland nowadays know that it is forbidden to sexually exploit children, that it is at least problematic to hit children, and that children can and must, as a matter of principle, go to school instead of going to work. However, many areas of children’s fundamental rights remain hardly known. Although the broad public is aware that the most elementary human rights also apply to children, children’s rights to freedom of opinion and religion and to information about life situations that affect them6 are largely unknown. As a result, it is often assumed that deficits in the implementation and enforcement of children’s rights are primarily an issue of developing countries.7

2. Sharenting

“Sharenting” (a blend of “to share” and “to parent”) is a phenomenon that has gained considerable traction in recent years.8 Parents share content on social media displaying their children, often in every area of their lives. With user and audience numbers skyrocketing, crucial moral and ethical questions remain widely unanswered. As if our media literacy were overwhelmed, we seem to remain in a state of shock – or is it thoughtlessness? – regarding the regulation of the phenomenon. Society’s interest in young children appears to be high, as content displaying them is widely popular – and therefore monetarily profitable.9 Authorities, as well as the providers themselves, tolerate the publication of children’s images as long as it does not violate the relevant provisions of the Swiss Criminal Code (StGB)10, the requirements for lawful data processing under the EU General Data Protection Regulation (GDPR)11 or the guidelines of the platform operators.

But what about children’s rights? May parents act as they wish within legal gray areas without giving the best interests of the child top priority?12

3. Children’s rights & Data Protection

3.1 Civil Law and Data Protection – hand in hand

To situate the legal problem at hand, two further legal fields must be considered: data protection law and civil law, which intertwine here. For example, according to the prevailing doctrine, a violation of personality rights under Art. 28 of the Swiss Civil Code13 can only exist if personal data is involved – and wherever personal data is involved, data protection law comes into play.14

According to Art. 4 (1) GDPR, personal data is any information relating to an identified or identifiable natural person. Identifiability depends on various factors that enable identification. Photographs and videos in which people can be identified are therefore personal data within the meaning of the relevant data protection laws. Even a pixelated person can be identified, for example, if the family environment, place of residence, etc. are known.

Art. 4 (14) GDPR names facial images as a possible source of biometric personal data. However, not every facial image contains biometric data, since certain requirements must be met, such as technical processing of the facial geometry. Consequently, not all facial images belong to the special categories of data within the meaning of Art. 9 GDPR. When a facial image is published, other data such as place of residence, friends, school, events attended, medical history, etc. are usually publicly displayed along with it. With this information, an entire personality profile within the meaning of Art. 3 lit. d DSG15 can easily be created, which is highly problematic, not least from a crime prevention perspective.16

3.2 Consent

Processing activities involving special categories of data require explicit consent which, according to Art. 7 GDPR, must be given voluntarily, for the specific case and in an informed manner, and can be revoked at any time without the data subject suffering disadvantages. According to Art. 9 GDPR, the consent must be explicit. The term “explicit” is not defined in the GDPR itself, but according to the former Article 29 Working Party it means that the disclosure of information requires an active statement or action. Especially with these special categories of data, individuals should consider carefully whether the data might reveal too much and should therefore not be published. Although the data subject can, under current legislation, have the controller delete the data, relying on deletion is risky, since the information may already be known to third parties or may have been duplicated or shared further.

The law itself contains no legal definition of publication. According to the prevailing doctrine, however, publication means making personal data accessible to an uncontrollable circle of viewers. Content published on social platforms can be accessed by such a circle not only when the user has left the default privacy settings unchanged, but also when users grant an uncontrollable group of people access to uploaded content by adding unknown persons to their “friends list”. In addition, photos can be shared onward with an indefinite group of people via screenshots, targeted downloading of content, etc.17

3.3 Measures under data protection law

Against this complex background, the platform operators have drawn up their own guidelines, based on the US Children’s Online Privacy Protection Act (COPPA)18 and the GDPR, within whose scope they fall. COPPA stipulates that no data may be collected from children under the age of 13. Parents who nevertheless want to run a profile with pictures of their children often do so as so-called family bloggers: even if the children are usually the focus, it is a family profile that is managed not by the child but by the adults.

If a profile is managed by a child under the age of 13, the parents must add an “Account managed by parents” note to the profile, as Instagram requires at registration. From the age of 16, children may legally consent themselves under the GDPR. However, due to the opening clause in Art. 8 GDPR, whose scope is specified in Recital 38, Member States may lower this threshold to a minimum age of 13 years.19

Whenever personal data is processed, this must be done in a manner transparent to the data subject. The principle of transparency follows from Art. 5 (1) (a) GDPR. Recital 39 sentence 2 also makes clear that this principle is prospective: the aim is for the data subject to be fully informed before the actual processing activity begins. The principle is directly linked to the obligations to inform the data subject and the other data subject rights (Art. 13, Art. 14 and Art. 15 GDPR). In principle, the privacy notice must explain to the data subject in an understandable way how their personal data is processed, especially where automated individual decision-making, including profiling within the meaning of Art. 22 GDPR, is involved.

In this context, more emphasis could in future be placed on transparent communication of the dangers of sharing children’s pictures. Not only the data protection risks are relevant here, but also the risks in the criminal sense. Transparent communication on this topic, for example enforced by the authorities or anchored in law, could also lead to inappropriate images being discovered and reported earlier. Appropriateness must also be interpreted more broadly in future: so far, it has been limited to criminally relevant content such as child pornography, hate speech, insults, defamation, etc. Above all, it should be extended to images that have the potential to expose people or, to a certain extent, to affect them negatively in their social environment.20

To detect such inappropriate images, an advanced artificial intelligence could be used that learns over time which images represent a “new” spectrum of embarrassment or inappropriateness. Here, data protection officers, engineers and lawyers have to be creative and “feed” the artificial intelligence with exemplary cases.
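By way of illustration only – no such system is known to exist on the platforms today – the review-and-relearn loop described above might be sketched as follows; the class, the feature names, the averaging stub and the threshold are all hypothetical assumptions:

```python
# Hypothetical sketch: a moderation filter that flags potentially
# inappropriate children's images and collects reviewer feedback for
# retraining. The scoring function is a stub; a real system would use
# a trained image classifier built by engineers and reviewed by data
# protection officers and lawyers.
from dataclasses import dataclass, field

@dataclass
class ModerationFilter:
    threshold: float = 0.5                        # flag images scoring above this
    feedback: list = field(default_factory=list)  # (features, label) pairs

    def score(self, features: dict) -> float:
        # Stub: average of feature values in [0, 1]. A production model
        # would be a classifier trained on labelled example images.
        return sum(features.values()) / max(len(features), 1)

    def review(self, features: dict) -> bool:
        """Return True if the image should be held for human review."""
        return self.score(features) >= self.threshold

    def learn(self, features: dict, inappropriate: bool) -> None:
        # Reviewer decisions are collected so the model can be retrained
        # on a "new" spectrum of embarrassing or exposing content.
        self.feedback.append((features, inappropriate))

f = ModerationFilter(threshold=0.6)
flagged = f.review({"exposure": 0.9, "context_private": 0.7})  # held for review
passed = f.review({"exposure": 0.1, "context_private": 0.2})   # allowed through
```

The point of the sketch is not the scoring stub but the loop: human decisions fed back via `learn` continuously widen the model’s notion of inappropriateness, as the paragraph above envisages.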

It is also of great practical importance to ensure that, for a child under the age of 16 (or 13, as the case may be), the legal guardians have given their consent, and not the child itself.21 By the nature of things, the hurdles for the child to consent “in the name of” the parents are low: a simple click is usually enough. The verification obligations under Art. 8 (2) GDPR apply to the controller, but also to other data processors. They must make reasonable efforts, using available technology, to verify that the parents have actually given their consent. The click mentioned above is therefore not an adequate safeguard.

One way to ensure this is, for example, a double opt-in procedure in which the parents’ e-mail address must be provided; the parents must then confirm the incoming e-mail. Even with this method, however, the holder of parental responsibility can only be identified to a limited extent, since e-mail addresses can be created quickly. If the controller wants to be certain that the legal guardians – and no one else – have agreed, consent can also be given by way of a document signed by the parents. Furthermore, identification via the legal guardian’s credit card is possible. When processing special categories of data within the meaning of Art. 9 GDPR, identification by means of an identity document can also be considered; it is important, however, that the copy of the ID is deleted again after identification.
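As a minimal sketch of the double opt-in procedure just described (the class, the 48-hour token lifetime and the in-memory storage are assumptions for illustration, not any operator’s actual implementation):

```python
# Illustrative double opt-in flow: the guardian's e-mail address
# receives a one-time token which must be confirmed before the child's
# account becomes active. Token lifetime and storage are assumed.
import secrets
import time

class ParentalConsent:
    TOKEN_TTL = 48 * 3600  # token valid for 48 hours (assumption)

    def __init__(self):
        self.pending = {}      # token -> (guardian_email, issued_at)
        self.consented = set() # e-mail addresses with confirmed consent

    def request(self, guardian_email: str) -> str:
        """Step 1: issue a confirmation token for the guardian's address."""
        token = secrets.token_urlsafe(16)
        self.pending[token] = (guardian_email, time.time())
        # In a real system the token would now be e-mailed out.
        return token

    def confirm(self, token: str) -> bool:
        """Step 2: the guardian clicks the link in the e-mail."""
        entry = self.pending.pop(token, None)
        if entry is None:
            return False  # unknown or already-used token
        email, issued = entry
        if time.time() - issued > self.TOKEN_TTL:
            return False  # expired token: consent not given
        self.consented.add(email)
        return True

flow = ParentalConsent()
t = flow.request("guardian@example.com")
ok = flow.confirm(t)          # double opt-in completed
bad = flow.confirm("forged")  # unknown token rejected
```

The one-time token makes forging confirmation harder than a simple click, but, as noted above, it still cannot prove that the address really belongs to the holder of parental responsibility.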

It would be of great practical relevance for the platform operators to check, for each uploaded image, whether any rights of the persons depicted have been violated and whether the requirements for a child’s consent under Art. 8 (1) GDPR have been complied with. One idea would be a pop-up window displayed to the user during the upload process with the following message:

Dear user, please make sure that everyone shown in the picture has consented to the publication of the image/video. If you’re not sure, don’t post the picture.
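Such a prompt could gate the upload roughly as follows (a minimal sketch; the function and the yes/no confirmation mechanism are illustrative assumptions, not any platform’s actual API):

```python
# Minimal sketch of the proposed pop-up confirmation gate: the upload
# is only accepted after the user affirms that everyone depicted has
# consented. The wording mirrors the prompt proposed above.
PROMPT = ("Dear user, please make sure that everyone shown in the "
          "picture has consented to the publication of the image/video. "
          "If you're not sure, don't post the picture.")

def handle_upload(image_id: str, ask_user) -> bool:
    """Show the prompt via `ask_user` and accept the upload only on an
    explicit 'yes'. Returns True if the image may be published."""
    answer = ask_user(PROMPT)
    return answer.strip().lower() == "yes"

accepted = handle_upload("img-1", lambda prompt: "yes")  # published
rejected = handle_upload("img-2", lambda prompt: "no")   # blocked
```

Requiring an explicit affirmative answer, rather than a pre-ticked box, mirrors the GDPR’s notion of consent as an active statement or action.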

In addition, the platform operator is free to implement various provisions on this subject in the form of soft law in its own data protection notices or in the terms of use.22

3.4 Legitimate Interest

It is repeatedly claimed that the posting of children’s pictures by the family constitutes a legitimate interest of the publisher. However, processing can only be based on Art. 6 (1) lit. f GDPR if the legitimate interest outweighs the interests or fundamental rights and freedoms of the data subject – especially when the data subject is a child. In contrast to Art. 8 GDPR, no age is specified in Art. 6 (1) lit. f GDPR. The draft GDPR provided in Art. 4 No. 18 E-GDPR that every person up to the age of 18 was to be classified as a child, but this legal definition was removed entirely; reference can still be made to the definition in the CRC. Given children’s special need for protection, it is difficult or impossible to argue for a legitimate interest in posting children’s pictures – even when non-material interests are taken into account.

According to Art. 2 (2) lit. c GDPR, the Regulation does not apply to the processing of personal data by natural persons in the course of purely personal or household activities. It is questionable whether this household privilege applies to a user account for which no appropriate privacy settings have been made. In that case, it must be assumed that not only personal contacts but an indefinite group of recipients can access the content, and the household privilege is excluded. Even with non-public user profiles, it can be assumed for many users that not only personal acquaintances have access to the account, since so-called friend requests can also be generated by fictitious profiles that send such requests to an unspecified group of people for advertising or private reasons.23

3.5 The legality of the publication

Irrespective of the civil law perspective, minors who have reached the age of 16 can independently consent to data processing in accordance with Art. 8 GDPR. For minors under 16, Art. 8 GDPR stipulates that consent is only effective if it is given by the legal guardian. The conditions of consent under Art. 8 GDPR refer only to information society services, the legal definition of which is found in Art. 4 (25) GDPR. In this sense, the offer must have the following characteristics:

  • normally provided for remuneration;

  • by electronic means;

  • at a distance;

  • at the individual request of a recipient of services.

Many social media platforms are provided free of charge by their operators. This considerably limits the scope of application: on this reading, free social platforms cannot be subsumed under Art. 8 GDPR. The paid premium versions offered by various social platforms, however, do open up the scope of Art. 8 GDPR, and the platform operators are then obliged to implement the conditions for a child’s consent. The data subject’s capacity of judgment plays a major role in both Art. 7 and Art. 8 GDPR, as it is a prerequisite for the effectiveness of consent under both provisions. This capacity includes the ability to recognize risks and to assess their possible consequences. Art. 8 (1) GDPR rests on the irrebuttable presumption that this capacity exists from the age of 16, whereas under civil law, depending on the case or situation, the capacity of judgment is assumed between the ages of 12 and 14. This also makes sense, since digital data processing activities are very complex and may involve abstract risks. As under civil law, consent is not lawfully obtained against the will of a child who is capable of judgment.24
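The consent logic discussed in this section can be condensed into a small decision rule (a sketch under this article’s reading that free platforms fall outside Art. 8 GDPR; the function and its parameters are illustrative assumptions, not statutory text):

```python
# Sketch of the Art. 8 GDPR consent logic: whether a child's own
# consent suffices depends on (a) whether the offer is an information
# society service caught by Art. 8 at all and (b) the member-state age
# threshold, which Art. 8 lets states lower from 16 to no less than 13.
def consent_valid(age: int, threshold: int, is_paid_service: bool,
                  guardian_consented: bool) -> bool:
    if not 13 <= threshold <= 16:
        raise ValueError("Art. 8 GDPR permits thresholds of 13-16 only")
    if not is_paid_service:
        # On the reading above, free platforms fall outside Art. 8, so
        # the provision imposes no additional requirement here.
        return True
    if age >= threshold:
        return True            # the child may consent independently
    return guardian_consented  # below the threshold: guardian must consent

assert consent_valid(17, 16, True, False)      # old enough to consent alone
assert not consent_valid(14, 16, True, False)  # needs guardian consent
assert consent_valid(14, 13, True, False)      # lower national threshold
```

The rule makes visible how much turns on the national threshold and on whether the service is classified as “normally provided for remuneration” in the first place.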

4. GDPR requirements

Some requirements of the GDPR are pushed into problematic territory by the data subjects themselves. For example, it is hardly compatible with the principle of data minimization when parents share vast amounts of children’s pictures on social media platforms.

According to Art. 5 (1) lit. c GDPR, personal data must be adequate, relevant and limited to what is necessary in relation to the purposes of the processing. Data processing beyond the purpose, or the collection of “too much” data, is therefore not lawful and lacks a legal basis.

Uploading pictures of children to social media means partly ceding the rights to the pictures to the platform operator, which further illustrates the far-reaching consequences of the legal representation of children. It must be borne in mind that this is sensitive data of minors. The publisher grants access to others and partially loses control over the data and its use; regaining control of the pictures proves difficult or impossible, as the internet never forgets. The concept of data minimization should therefore be measured against the category of the data itself and may also be restricted in this sense in the future. For example, the rule could be specified so that data uploaded by minors themselves or by third parties may not be used for processing purposes that are not in the child’s interests. This could take the form of a restriction of processing activities for minors’ data, e.g. in accordance with Art. 18 (1) GDPR.25

5. A short glimpse into the perspective of a public prosecutor

Law enforcement authorities and data protection experts repeatedly point out the inherent risks of supposedly unproblematic children’s photos. Such photos, taken from social media, often reappear on problematic internet forums and image sharing sites, where they are clearly sexualized, for example through the comments added to them. An Australian study found that around half of the photos shared on an Australian photo sharing site with a pedophile background came directly from social networks.26 These are often supposedly harmless photos showing children engaged in everyday activities such as playing or doing sports.27 Drastically sexual comments can often be found in the comment sections of such images; for videos, so-called “time stamps” directly point other users to the moments most interesting from a pedophile point of view. Some users exchange e-mail addresses in order to share images with one another outside the platforms and thus spread them further.28

Again and again there are reports of photos of children posted by so-called family bloggers that suddenly appear on the dark web amid (criminal) child pornographic material. It is not surprising that data protection is therefore one of the central issues of the moment. Legislation reflects this development – but the awareness of social network users often lags behind. We could accept this with a shrug: data protection, like health protection, is a personal decision; deliberate personal data exhibitionism is then as much a matter of choice as smoking, for example. Every adult decides for themselves which data and pictures they want to publish about themselves. But precisely because creating a digital footprint has such far-reaching consequences, it must remain a matter of individual autonomy, beginning at the point when we have the capacity of judgment – that is, when we are able to assess the consequences of our disclosure and make a reflective decision. Children do not yet have this foresight and must not be burdened with the irresponsible decisions of their legal guardians.29

6. Conclusion

In summary, it can be stated that children have comprehensive rights in the digital space, which must not be neglected. Especially when parents have control over their children’s presence on social media, it is of crucial importance that children’s fundamental rights in the digital space are respected. The purpose of this article was to show that these rights go far beyond mere protection under criminal law. Rather, children should be perceived as independent personalities with inherent personality rights, who are also entitled to comprehensive data protection rights. Respecting this is not only part of modern parenting, but also part of media competence, which must be exemplified and taught.

Sarah Kunz von Hoyningen-Huene, Public Prosecutor, Frauenfeld (Switzerland)

Jutta Oberlin, Lawyer and Data Privacy Specialist (Switzerland)

 

1 Sarah Kunz von Hoyningen-Huene/Jutta Sonja Oberlin, Innocence in Danger, Schweizerische Juristen-Zeitung 118/2022.

2 Übereinkommen über die Rechte des Kindes abgeschlossen am 20. November 1989 (SR 0.107) (zit. UN- Kinderrechtskonvention).

3 Schweizerisches Zivilgesetzbuch (ZGB) vom 10. Dezember 1907 (SR 210).

4 Humanrights.ch, 10 Jahre Kinderrechtskonvention in der Schweiz, <https://www.humanrights.ch/de/ipf/menschenrechte/kinder/kinderrechte-10-jahren>, insbesondere «Kinderrechte wenig bekannt».

5 Sarah Kunz von Hoyningen-Huene/Jutta Sonja Oberlin, Innocence in Danger, Schweizerische Juristen-Zeitung 118/2022.

6 Michelle Cottier, Subjekt oder Objekt? Die Partizipation von Kindern in Jugendstraf- und zivilrechtlichen Kindesschutzverfahren, Eine rechtssoziologische Untersuchung aus der Geschlechterperspektive, Bern 2006, 66 f.

7 Humanrights.ch, Informationelle Selbstbestimmung – (noch) kein neues Grundrecht, <https://www.humanrights.ch/de/ipf/menschenrechte/privatsphaere/informationelle-selbstbestimmung> (zuletzt besucht am 13.10.2022).

8 Richard Godwin, The rise of the nano-influencer: How brands are turning to common people, The Guardian vom 14.11.2018.

9 Catalina Goanta/Isabelle Wildhaber, In the Business of Influence: Contractual Practices and Social Media Content Monetisation, SZW 2019 346 ff.

10 Schweizerisches Strafgesetzbuch (StGB) vom 21. Dezember 1937 (SR 311.0).

11 Verordnung (EU) 2016/679 des Europäischen Parlaments und des Rates vom 27. April 2016 zum Schutz natürlicher Personen bei der Verarbeitung personenbezogener Daten, zum freien Datenverkehr und zur Aufhebung der Richtlinie 95/46/EG (Datenschutz-Grundverordnung), ABl. L 119 vom 4.5.2016, 1 (zit. DSGVO).

12 Roland Fankhauser/Nadja Fischer, Kinderfotos auf Facebook oder wenn Eltern die Persönlichkeitsrechte ihrer Kinder verletzen, in: Roland Fankhauser/Ruth E. Reusser/Ivo Schwander (Hrsg.), Brennpunkt Familienrecht, Festschrift für Thomas Geiser zum 65. Geburtstag, Zürich/St. Gallen 2017, 193 ff., 198.

13 Schweizerisches Zivilgesetzbuch (ZGB) vom 10. Dezember 1907 (SR 210).

14 BfDI, Datenschutz in sozialen Netzwerken, <https://www.bfdi.bund.de/DE/Buerger/Inhalte/Telefon- Internet/TelekommunikationAllg/DatenschutzInSozialenNetzwerken.html>.

15 Bundesgesetz über den Datenschutz (DSG) vom 19. Juni 1992 (SR 235.1).

16 Sarah Kunz von Hoyningen-Huene/Jutta Sonja Oberlin, Innocence in Danger, Schweizerische Juristen-Zeitung 118/2022.

17 Sarah Kunz von Hoyningen-Huene/Jutta Sonja Oberlin, Innocence in Danger, Schweizerische Juristen-Zeitung 118/2022.

18 US-Children’s Online Privacy Protection Act of 1998, 15 U.S.C. 6501–6505.

19 Benedikt Buchner/Jürgen Kühling, Einwilligung des Kindes in Bezug auf Dienste der Informationsgesellschaft, in: Jürgen Kühling/Benedikt Buchner (Hrsg.), DS-GVO, Datenschutz-Grundverordnung, Kommentar, München 2017, 281–291 Rn. 24.

20 Sarah Kunz von Hoyningen-Huene/Jutta Sonja Oberlin, Innocence in Danger, Schweizerische Juristen-Zeitung 118/2022.

21 Noémie Helle, Publication de l’image de l’enfant sur les réseaux sociaux: de quel(s) droit(s)?, ZKE 2019 500 ff., 504.

22 Sarah Kunz von Hoyningen-Huene/Jutta Sonja Oberlin, Innocence in Danger, Schweizerische Juristen-Zeitung 118/2022.

23 Sarah Kunz von Hoyningen-Huene/Jutta Sonja Oberlin, Innocence in Danger, Schweizerische Juristen-Zeitung 118/2022.

24 Sarah Kunz von Hoyningen-Huene/Jutta Sonja Oberlin, Innocence in Danger, Schweizerische Juristen-Zeitung 118/2022.

25 Sarah Kunz von Hoyningen-Huene/Jutta Sonja Oberlin, Innocence in Danger, Schweizerische Juristen-Zeitung 118/2022.

26 Victoria Richards, Paedophile websites steal half their photos from social media sites like Facebook, Independent online vom 30.9.2015, <https://www.independent.co.uk/news/world/australasia/paedophile-websites-steal-half-their-photos-social-media-sites-facebook-a6673191.html>.

27 Gabriela Dettwiler, Vom Familienalbum in die Timelines von Social-Media-Plattformen: Wie soll man mit Kinderbildern im Internet umgehen?, NZZ vom 19.7.2019.

28 Lucy Battersby, Millions of social media photos found on child exploitation sharing sites, The Sydney Morning Herald vom 29.5.2015, <https://www.smh.com.au/national/millions-of-social-media-photos-found-on-child-exploitation-sharing-sites-20150929-gjxe55.html>.

29 Sarah Kunz von Hoyningen-Huene/Jutta Sonja Oberlin, Innocence in Danger, Schweizerische Juristen-Zeitung 118/2022.