Progress and developments in policies on children’s rights online.
1. Regulating product and service providers
- The General Data Protection Regulation (GDPR) stipulates that the personal data of children under 16 may only be processed with parental consent. Furthermore, it must be clear to children what the processing of their data involves. The GDPR also prohibits the creation of personality and user profiles of children when this is not in their best interest.
- Under the Digital Services Act (DSA), very large online platforms (VLOPs) are required to provide at least one alternative to personalised recommendations. This requirement is intended to reduce the risk that users fall into information biases. Providers must make significant efforts to make their terms and conditions understandable to minors. VLOPs are also prohibited from displaying advertisements based on profiles of minors’ personal data.
- These provisions in the GDPR and DSA protect children from data profiling and safeguard their right not to be tracked online.
- To ensure that online platforms comply with their obligations under the DSA, a roundtable with the European Commission, scientists, regulators, online platforms, and societal organisations will be organised this autumn. The focus will lie on effective and adequate protection of children under the DSA.
- In the spring of 2023, several expert panels were convened to determine the requirements for age verification systems and how such systems can comply with them. It was concluded that the market, with its technical knowledge and capacity for innovation, is uniquely capable of developing adequate age verification systems. The Digital Services Act compels the market to do this, as businesses must be able to determine a user’s age, for instance in cases where advertising based on profiling is prohibited for minors. However, this should not be left entirely to the market; the government must also be involved.
- The criteria for such systems include robustness (the degree to which age can be determined accurately), privacy-friendliness, security, and inclusivity. Based on a detailed list of requirements, the feasibility of a certification system for age verification is being explored. The Cabinet is in ongoing conversations with entities such as the Royal Netherlands Standardization Institute (NEN) and the European Commission, and international developments are also being monitored. The list of requirements by risk category will be made public after the summer. The possibilities regarding certification or other potential implementations of age verification requirements will be shared by the end of the year.
- To safeguard children’s rights at every stage of the lifecycle of online products and services, we are implementing the following new measures:
1. Development of a Children’s Rights Impact Assessment (CRIA). This tool is designed to identify risks to children’s rights. It will be ready in the second half of 2023.
2. Updating and transforming the existing Code for Children’s Rights into a more practical tool for designers of digital services and products. This ensures that children’s rights are safeguarded in online services. It will also be completed in the second half of 2023.
3. Establishing a Children’s Rights Seal that certifies that an online product or service upholds children’s rights. More details on its implementation will be available at the beginning of 2024.
Approach to persuasive techniques and dark patterns
In addition to the GDPR, DSA, and the Audiovisual Media Services Directive (AVMSD), we plan to regulate providers more extensively, including through consumer legislation, particularly targeting online persuasive techniques and so-called dark patterns. These techniques may encourage users to spend prolonged periods online or engage in activities not always in their best interests, such as making purchases.
The European Commission is assessing whether current regulations sufficiently protect consumers online (‘fitness check’). The Minister of Economic Affairs and Climate Policy (EZK) will request the Commission to enhance protection for vulnerable consumers, such as children, against harmful online commercial practices related to in-app and in-game purchases. In this context, the minister proposes to amend European legislation to classify loot boxes as an unfair commercial practice under all circumstances. The fitness check will also review whether consumer legislation needs updating to address dark patterns. Research by the Commission shows that almost all of the 75 websites and apps most used by European consumers contain at least one dark pattern. Therefore, it is crucial that consumer legislation is flexible enough to respond quickly to these issues.
Additionally, a specific new policy is the development of a guide on persuasive techniques in games, which will clearly explain to parents, children, and caregivers which persuasive techniques a given game contains.
2. Media education and literacy, and building resilience in young people
- Research shows that non-educational use of mobile phones negatively impacts students’ concentration and engagement. The Ministry of Education, Culture and Science (OCW) has therefore explored ways to keep mobile phones out of classrooms unless they are used for educational purposes during lessons. This policy will be implemented in secondary education from 1 January 2024, and primary education will also adopt it.
- The Netherlands Youth Institute (NJI) has developed the Media Education Toolbox ‘Media? Gewoon opvoeden!’ (in English: ‘Media? Just Parenting!’). It includes fact sheets for professionals and teachers, as well as tip sheets for parents.
- Together with young people themselves and organisations such as MIND Us and the Dutch Media Literacy Network, efforts are focused on promoting media literacy, digital skills, and digital balance.
- Media literacy helps protect children from the risks of ‘unhealthy’ media usage: not by shielding them from harmful media, but by teaching them how to handle it effectively. The Ministry of OCW supports the Media Literacy Network, an extensive programme aimed at enhancing media literacy among all Dutch citizens, with special attention to young people and vulnerable groups. The Network consists of over 1,000 partners.
- The Media Literacy Network has created the MediaDiamant, a tool that helps parents have the right conversations with their children about media use. Key topics include enjoying the possibilities, preventing risks, guiding your child, recognising suitable content, and maintaining a healthy balance. The MediaDiamant is intended for parents of children aged 0 to 18 years and was developed by experts and scientists.
- To better inform parents and children about handling digital products and services, a multi-year public communication initiative will begin this autumn. It focuses on the effects of being online for extended periods of time. Future phases of this campaign will also address disinformation and awareness of data processing by apps.
- Should something untoward happen online to a minor, it can be reported on Meldknop.nl. This includes online bullying, discrimination, disturbing videos, grooming, sexting, fraud, and stalking. Meldknop.nl is an initiative by Veilig Internetten, a collaboration between the Ministry of Economic Affairs and Climate Policy (EZK), the Ministry of Justice and Security (JenV), the National Cyber Security Centre (NCSC), and ECP (Platform for the Information Society). It is supported by the police. Additionally, children can seek help from the Children’s Ombudsman if they face issues with a government agency, whether online or offline.
- Everyone should have the opportunity to develop their digital literacy. This is facilitated through the provision of digital knowledge and skills in education and through retraining and additional training programmes.
- In partnership with UNICEF Netherlands, a youth panel has been established. This panel discusses the opportunities and risks of the digital world with young people four times a year. Themed sessions will address topics such as harmful content and online advertising, and explore how children can be creative in the digital world and how it can enhance their learning. The first session took place in September 2023, and the outcomes from these sessions are used to further shape the initiative.
3. Mental health approach
In June 2022, the Ministry of Health, Welfare and Sport (VWS), in collaboration with the Ministry of Education, Culture and Science (OCW) and the Ministry of Social Affairs and Employment (SZW), launched the ‘Mental Health Strategy for Us All’. This strategy aims to improve the mental health of Dutch residents and specifically focuses on young people and their online environments. The ‘Mentally Healthy Online’ action line of this strategy sets out goals and actions to ensure that young people remain media-savvy, including promoting awareness of digital balance and how to maintain it. Specific objectives include helping young people and parents better recognise the opportunities and dangers of the online world and improving mental support so that it aligns more closely with the online environments of young people.
4. Media Act and Dutch Advertising Code
- Video influencers must comply with rules under the Media Act that protect consumers against online advertising. It must be immediately clear that content is advertising: merely tagging the sponsoring company is insufficient.
- The Dutch Advertising Code has been updated, in line with the coalition agreement, to provide extra protection for children against online advertising. The Code is revised as current events or regulatory changes require. Initially, the Media Authority supervises influencers with more than 500,000 followers.
- The Secretary of State for Culture and Media is in discussion with the Media Authority about the impact of influencers and about strengthening supervision and enforcement of compliance.
- The use of children as a revenue model by parents or companies needs to be better controlled. The Minister of Social Affairs and Employment will therefore develop and codify clear, guiding standards for child labour involving young influencers, based on existing regulations for artistic and cultural work, which include working and rest hours for children up to 16 years old.
5. (Inter)National cooperation
- The Netherlands is examining how other countries address these issues. For instance, France has introduced a regulatory proposal setting a minimum age of 15 years for social media use. The proposal requires online platforms to implement age verification systems approved by the regulator. France also has the ‘Je protège mon enfant’ (in English: ‘I protect my child’) initiative, a platform for digital parenting.
- The European Consumer Organisation (BEUC) recommends further defining, standardising, and harmonising transparency and disclosure requirements at the EU level. It also calls for easier supervision and enforcement of compliance. Additionally, BEUC suggests classifying the promotion of illegal products and services by influencers as an unfair commercial practice by adding it to the blacklist of the Unfair Commercial Practices Directive. At the European level, the Netherlands is actively advocating the adoption of these BEUC recommendations.