
09.03.2026
5424-1/2025/7
Modern technologies, Processing of children's and minors' personal data, Profiling and automated decision-making, World Wide Web, Data protection by design and by default
The Information Commissioner of the Republic of Slovenia (hereinafter: the IC) received your research questions concerning children's privacy and sharenting for your doctoral research. Our replies are provided below.
1. Special legal provisions for children online
Does your country's national legislation contain specific provisions aimed at protecting the personal data of children (under 18) in the digital environment? If so, please specify the relevant legislation.
Under Article 8 of the Slovenian Personal Data Protection Act (Zakon o varstvu osebnih podatkov, hereinafter: the ZVOP-2), a child’s consent for the use of information society services offered directly to children, or which can reasonably be expected to be used by children, shall be valid if the child is 15 years of age or older. Where the child is under 15 years of age, the consent shall be valid only if it is given or authorised by one of the child’s parents, the child’s guardian, or a person entrusted with parental responsibility. Where an information society service is offered free of charge, the consent may also be authorised by a foster parent or by a representative of the institution in which the child is placed. Where the terms and conditions of the information society service provider prescribe a higher minimum age for the use of such services, the age specified in those terms and conditions shall apply. A child’s consent shall not be made conditional upon excessive requirements imposed by the controller, such that the child would be required to provide more personal data than is necessary for the purpose of the activity in question.
In the Slovenian legal system, children acquire limited legal capacity at the age of 15 and full legal capacity at the age of 18. This means that until reaching the age of 15, children do not have legal capacity and are represented in such matters by their parents or legal guardians. A child who has reached the age of 15 may independently conclude legal transactions. However, for transactions that significantly affect the child’s life before or after reaching the age of 18, the consent of the parents is still required. This rule also applies in the context of personal data protection, for example to the exercise of rights under the General Data Protection Regulation and to the giving of consent to the processing of personal data by the child or by the parents or guardians (in cases not already covered by Article 8 of the ZVOP-2).
2. Age of consent and age verification (Article 8 of the GDPR)
What is the age limit for valid consent of a child to the processing of personal data in online services under the legal system of your country?
The reply to this question is the same as to the previous one. Pursuant to Article 8 of the ZVOP-2, a child’s consent for the use of online services, as information society services, offered directly to children or which can reasonably be expected to be used by children, shall be valid if the child is 15 years of age or older. Where the child is under 15 years of age, the consent shall be valid only if it is given or authorised by one of the child’s parents, the child’s guardian, or a person entrusted with parental responsibility. Where the terms and conditions of the information society service provider prescribe a higher minimum age for the use of such services, the age specified in those terms and conditions shall apply.
What problems or challenges have you encountered in applying this provision (age verification, demonstrable consent of a parent/legal guardian)?
The IC has not conducted any proceedings in which the issue of a child’s consent, or its authorisation by a parent, in relation to information society services was examined. We can therefore offer only general observations on the challenges involved.
In our view, age assurance mechanisms are an important element in the protection of children in the digital environment; however, such mechanisms must be effective and fully compliant with the GDPR. In practice, we have observed a number of challenges related to the implementation of such mechanisms. For example, controllers often rely on self-declaration mechanisms, which are not effective in practice and do not reliably prevent younger children from accessing services without parental consent. On the other hand, more intrusive age assurance methods are increasingly being used, such as social media algorithms that determine a user’s estimated age based on their activities and the content they produce or engage with. Such practices raise additional data protection concerns, particularly in light of the principle of data minimisation.
In the context of ongoing discussions on age assurance mechanisms and the protection of minors online, in February 2026 the Slovenian Government instructed the Ministry of Digital Transformation, in cooperation with other competent ministries, to prepare a proposal for a legislative framework that could restrict the use of social media and similar online services by children. As a result, the Ministry has initiated consultations and preparatory activities to examine possible measures in this area.
3. Complaints – social networks and children's privacy
a) How often do you receive complaints concerning the protection of children's privacy on online platforms?
We have received very few complaints of this nature in the past years.
b) How many such cases have you dealt with, what were the most common reasons, who filed them, and what measures or decisions did the authority take (guidance, initiation of proceedings, decision, sanction, other)?
We do not have an exact number, but the number of such cases is very low. For example, we handled a complaint concerning a parent’s request for the erasure of his children’s personal data. The complainant asked the provider of an AI chatbot service to erase the personal data of his children that appeared in the chatbot’s responses. As it was a cross-border case, we transmitted the complaint to the lead supervisory authority.
c) Do you have procedures or tools in place for the child-friendly enforcement of children's rights (clear communication, anonymization, special forms)?
We are currently preparing a dedicated webpage for children. The webpage will be dedicated to raising children’s awareness of personal data protection. Among other things, it will introduce them to the fundamental concepts in this field, explain why it is important to protect their personal data and privacy, and provide guidance on how they can protect themselves online. It will also explain children’s rights under the GDPR and how they can exercise those rights in practice. In addition, we will outline how to act in the event of an infringement connected to their personal data and in which cases they may file a complaint with the IC. The text will be written in child-friendly language.
We do not have any special forms for children. However, no mandatory form is required for filing a complaint with the IC.
4. Complaints related to sharenting
a) Have you encountered complaints directly related to the phenomenon of sharenting (the publication of data/photos/videos of children by parents or relatives without the knowledge/consent of the children)? How often, how did you handle them, and how many of them led to DPA intervention?
No, we have not encountered such complaints.
b) If a complaint is initiated by a child (or on their behalf) against content published by a parent, what approach does the DPA take and are there internal guidelines or established practices (e.g., assessing the best interests of the child, age, communication with legal guardians, recommendation for judicial resolution, etc.)?
We do not have experience with such cases. However, we would always consider the child’s best interests as the primary guiding principle in such situations. This means, for example, that we would not reject a complaint solely on the basis that the child is younger than 15 years of age, particularly in cases where the child is exercising their rights in relation to their parents.
c) Have you dealt with the exercise of the right to erasure or restriction of processing in relation to content about a child published by another person (in particular a parent)? If so, what criteria are decisive in the assessment (age, nature of the content, consent, public interest, risks)?
No, we have not.
5. Cross-border cases and cooperation with platforms/other authorities
To what extent do you address the protection of children's privacy online in cooperation with other authorities (other DPAs in the EU, cross-border mechanisms, platforms)? What are the typical obstacles (jurisdiction, speed of content removal, evidence, identification of the operator)?
We conduct cross-border proceedings in accordance with the cooperation and consistency mechanisms set out in Chapter VII of the GDPR. Where processing is cross-border and affects individuals in Slovenia, the IC acts as a concerned supervisory authority. This means, inter alia, that the IC may raise objections to the draft decision, which the lead supervisory authority must take into account. The supervisory authorities cooperate through the Internal Market Information (IMI) system.
The IC is also the competent supervisory authority for Articles 26(1), 26(3) and 28 of the Digital Services Act. This means that the IC is active within the European Board for Digital Services, more specifically within Working Group 6, which deals with the protection of minors. As the Agency for Communication Networks and Services acts as the Digital Services Coordinator in Slovenia, the IC maintains close cooperation with the Agency.
6. New trends and threats
What new trends or threats to children's online privacy have you noticed in recent years (e.g., data collection by apps, targeted advertising/profiling of minors, AI in toys/educational tools)? How does the Office respond to them, and do you consider the current framework (GDPR and national law) sufficient to protect personal data in the digital environment?
In recent years, several trends have emerged that may pose increased risks to children’s online privacy. We will point out just a few of them.
We observe the increasing use of artificial intelligence in digital services used by children, ranging from educational tools, chatbots, and smart toys to recommender systems. AI systems may process large volumes of interaction data and infer behavioural patterns. This can create risks related to excessive data processing, lack of transparency, automated decision-making, and the potential manipulation of user behaviour.
We would particularly like to highlight AI chatbots, which are increasingly used also by minors. Concerns arise in relation to the collection and further processing of personal data entered by children during interactions with such systems. Children may disclose personal and even sensitive data without fully understanding the consequences, including how their personal data are stored and processed. Such systems are often designed in a way that encourages users to provide additional personal data and may create the impression that they are interacting with a human being who can be trusted. The use of such chatbots raises questions regarding transparency, purpose limitation, and the lawful basis for processing. There is also a risk of profiling where user behaviour and interaction patterns are analysed to personalise responses or services.
In the context of online video games, particular challenges include extensive data collection, in-game profiling, and targeted advertising. Many games incorporate social features, chat functions, and user-generated content, which may expose children to additional privacy and safety risks. Furthermore, multiplayer games can expose children to cyberbullying, unwanted contact with strangers (including grooming), and inappropriate or illegal content. Such games often include monetisation techniques, such as in-game purchases, which are problematic, as children may not fully understand the real value of money in a virtual environment.
We also observe the increasing use of deceptive design practices (so-called “dark patterns”) across digital services frequently used by children, including video games, social media platforms, AI chatbots, and various mobile applications. Such design choices may nudge or pressure children into sharing more personal data than necessary, disabling privacy-protective settings, or accepting terms without genuine understanding. Certain engagement-driven design features (such as endless scrolling, autoplay, reward mechanisms, default public settings, hidden privacy controls, and personalised notifications) may contribute to excessive use and potentially addictive behaviour. From a data protection perspective, these practices raise concerns regarding fairness, transparency, data minimisation, and the validity of consent, particularly where minors are involved.
In our view, awareness-raising plays an important role in addressing such issues. For this reason, we have launched the project Become a PrivacyPRO(tector) - https://www.ip-rs.si/varstvo-osebnih-podatkov/projekti/privacypro/. Within this project, we raise awareness among children, teachers, and parents about personal data protection, the rights provided under the GDPR, and how these rights can be exercised in practice. A particular focus of the project is the protection of personal data in the digital environment, especially in the context of using mobile applications, video games, artificial intelligence tools, and social media platforms.
We have developed an educational board game for children, through which they learn what personal data are, how to protect them, and with whom it is appropriate to share such data. The game is designed in an engaging and interactive way, allowing children to assume fictional identities and learn through play. We have also prepared a comic book series that educates readers on the same topics and is published monthly in one of the most popular children’s magazines, Pil. Furthermore, we have produced engaging awareness-raising videos featuring a well-known Slovenian actor, which are broadcast on one of the most widely viewed television channels in Slovenia.
In addition, we have prepared ready-to-use PowerPoint presentations for teachers. These materials enable teachers, even without prior specialised knowledge of data protection, to deliver the essential information on these topics to children within two school lessons. We will also organise dedicated workshops for teachers across Slovenia. We carry out various other supporting activities as well, such as a telephone helpline, an e-mail helpline, a creative contest for children, hosting an open house at the IC, and other activities.
We believe that through these activities we significantly contribute to empowering children, parents, and teachers to better understand what happens in the digital environment and how they can effectively protect themselves.
Kind regards,
dr. Jelena Virant Burnik
Information Commissioner of the Republic of Slovenia
Funded by the European Union. Views and opinions expressed are however those of the author(s) only and do not necessarily reflect those of the European Union or European Commission. Neither the European Union nor the granting authority can be held responsible for them.