Key Developments in Online Age Verification Laws
The legislative approach to online age verification at both an Irish and EU level has been subject to significant recent developments, moving from a lack of any clear or strict obligations to a position where certain entities are now legally required, in certain circumstances, to put in place effective age verification mechanisms that go beyond mere self-declaration of age. Coimisiún na Meán’s binding codes, published on 21 October and 5 December 2024, are the most recent developments in this area and mark significant steps towards more stringent online age verification obligations.
Current Legal Framework
GDPR – Proportionate verification
The first piece of EU legislation to require age verification measures in relation to information society services is the EU General Data Protection Regulation (Regulation (EU) 2016/679) (“GDPR”), which has applied since 25 May 2018. Article 8 GDPR provides that children can only consent to the processing of their personal data for information society services if they have reached the age of digital consent, which is 16 years by default but may be lowered by each Member State to no less than 13 years. The GDPR is silent on whether controllers are required to process personal data to verify the age of a child seeking to consent to the processing of their data. However, European Data Protection Board Guidelines 05/2020 (the “Guidelines”) state that an obligation on controllers to make “reasonable efforts” to verify that a user is over the age of digital consent is implicit in the GDPR. The Guidelines state that appropriate checks should be carried out for this purpose and that the method of verification should be “proportionate to the nature and risks of the processing activities”. The Guidelines do not set out what measures might be appropriate in different circumstances but suggest that a risk-based approach should be taken. In low-risk situations, self-declaration of age by a user might be sufficient. However, controllers should not engage in age verification methods that involve excessive collection of personal data.
The Guidelines demonstrate a hesitancy to impose strict age verification obligations, and there is a clear tension between collecting personal data to verify age and the principle of data minimisation under the GDPR.
Digital Services Act – Mitigation of risks
The Digital Services Act (Regulation (EU) 2022/2065) (“DSA”), which has applied in full since 17 February 2024, expressly refers to age verification measures to protect children online. Article 35 DSA requires providers of very large online platforms (“VLOPs”) and very large online search engines (“VLOSEs”) to put in place reasonable, proportionate and effective measures to mitigate risks to children, and specifies that such measures may include age verification. However, it does not specify the kinds of circumstances in which age verification would be required.
Notably, the DSA does not extend the same obligations to online platforms that do not meet the criteria for designation as a VLOP or a VLOSE (i.e. those with fewer than 45 million average monthly active users in the EU). Article 28(1) DSA requires providers of online platforms that are accessible to minors to put in place appropriate and proportionate measures to ensure a high level of privacy, safety and security of minors on their service. However, Article 28(3) clarifies that such online platforms are not obliged to process additional personal data in order to assess whether the recipient of the service is a minor.
On 31 July 2024, the European Commission issued a Call for Evidence on its proposed guidelines for online platforms on protecting minors online, which are intended to be adopted in 2025. Despite the wording of Article 28(3), the Call for Evidence suggests that these guidelines, once adopted, will factor in the European Commission’s work on a harmonised approach to age verification. If this occurs, given that the guidelines will apply to all online platforms (except small and micro platforms) and not just VLOPs, the scope of entities subject to a legal obligation to implement effective age verification mechanisms would be significantly widened.
Since the DSA came into force, the enforcement of age verification obligations has become a priority area for the European Commission. Its investigations, initiated under Articles 34 and 35 DSA, have focused on holding VLOPs accountable for failure to put effective age assurance measures in place. We expect more clarity as to what VLOPs must do to implement effective age verification measures as the Commission conducts more investigations and issues its findings.
Audiovisual Media Services Directive and Online Safety and Media Regulation Act 2022 – Appropriate measures
The Audiovisual Media Services Directive (Directive 2010/13/EU, as amended by Directive (EU) 2018/1808) (“AVMSD”) requires Member States to ensure that media service providers and video-sharing platform services (“VSPSs”) have age verification systems in place where appropriate. Article 6a(1) AVMSD seeks to ensure that audiovisual media services which may impair the physical, mental or moral development of minors are only made available in such a way as to ensure that minors will not normally see or hear them; the measures used for this purpose may include age verification. Further, Article 28b(1)(a) seeks to ensure that VSPSs take appropriate measures to protect minors from programmes, user-generated videos and audiovisual commercial communications which may impair their physical, mental or moral development. Notably, the AVMSD expressly lists age verification systems among the appropriate measures for this purpose.
Both provisions of the AVMSD further demonstrate an intention at EU level to impose age verification obligations in circumstances where the risks to children are greater. However, the Directive leaves it to Member States to create these obligations in national law.
The Online Safety and Media Regulation Act 2022 (“OSMRA”) empowers Coimisiún na Meán to make binding codes imposing these obligations. On 21 October 2024, Coimisiún na Meán published its final and binding Online Safety Code for VSPSs (the “Code”), which imposes the above AVMSD requirements on VSPSs. In addition, on 5 December 2024, Coimisiún na Meán published a Media Service Code and Rules, which imposes the above AVMSD requirements on audiovisual on-demand media service providers.
Online Safety Code and Media Service Code and Rules (Audiovisual On-demand Media Service Providers) – Focus on harms and risk
The Online Safety Code is designed to protect users of VSPSs from harmful content and represents a significant development in the law on age verification, as it mandates age verification in two broad circumstances.
Firstly, in accordance with the relevant provisions of the OSMRA and AVMSD, the Code provides that appropriate measures to protect children must include age verification systems for users with respect to content which may impair the physical, mental or moral development of children. The Code goes on to state that, for this purpose, “age verification” includes effective age assurance measures (defined in the Code as measures that involve estimating or verifying a user’s age). Effective age assurance measures may include age estimation, but measures based solely on self-declaration of age by users shall not be considered effective. The Code also provides guidance on what will be considered ‘appropriate measures’ for protecting children online: those which Coimisiún na Meán is satisfied are “practicable and proportionate, taking into account the size of the video-sharing platform service and the nature of the service that is provided”. It further states that, in relation to protecting minors, the most harmful content will be subject to the strictest measures.
Secondly, the Code requires VSPSs to implement age verification in respect of adult-only content. VSPSs whose terms and conditions do not prohibit users from uploading or sharing adult-only content must implement effective age assurance measures to ensure that such content cannot normally be seen by children. Adult-only content is defined as pornography or content involving realistic representations of, or of the effects of, gross or gratuitous violence or acts of cruelty. Again, self-declaration of age will not be an effective age verification measure for these purposes.
Practical Implications of Coimisiún na Meán’s Codes
While it is clear that self-declaration of age will not be a sufficient measure in the above cases, the Code does not otherwise prescribe, or give guidance on, the form of age verification VSPSs may be required to adopt. Rather, it suggests that what is required will vary from case to case and that a risk-based approach will be taken to assessing what constitutes appropriate age verification, factoring in the severity of the potential harm, the size of the VSPS and the nature of its services.
The implications of these provisions of the Code are likely to be very significant for VSPSs, for two reasons. First, self-declaration will no longer be an effective age verification measure where adult-only content, or content which may impair the physical, mental or moral development of minors, is concerned. Second, the circumstances in which VSPSs are required to impose a stronger form of age verification are potentially broader than previously required under Irish and EU law. Notably, the Code does not define, or provide guidance on, what kind of content would be considered content that may impair the physical, mental or moral development of minors. As a result, it is possible that a very broad interpretation could be taken of what might constitute harm to physical, mental or moral development. The Code also requires stronger forms of age verification in respect of content which merely has the potential to have such an effect, which further broadens the scope of what might fall within this term.
While this is an Irish Code, it is likely to have EU-wide impact. The ten VSPSs designated by Coimisiún na Meán to date are large platforms that operate across Europe, so the age verification measures they implement to comply with the Code will affect users across the EU.
In addition, Coimisiún na Meán’s Media Service Code and Rules, which applies to audiovisual on-demand media service providers, extends “age assurance” obligations to such providers (although it is not clear whether “age assurance” is to be distinguished from “age verification” in this context). It requires providers to take appropriate measures to ensure that programmes containing content which may impair the physical, mental or moral development of children are only made available in such a manner that children will not normally see or hear them. Appropriate measures for this purpose may include age assurance but, once again, measures that rely solely on self-declaration of age will not be considered effective. The Media Service Code and Rules thereby broadens the scope of entities that may be required to put more robust age verification measures in place.
Key Takeaways
It is clear from the legislative developments to date that the EU is adopting a risk-based approach, imposing stronger obligations where it is reasonable and proportionate to do so. However, EU law stops short of mandating specific age verification measures for specific circumstances. There can be no one-size-fits-all approach: what is reasonable and proportionate will differ depending on the specific risks that arise in the context of each entity’s activities. This leaves some uncertainty as to what is required to comply with the law.
- As a starting point, entities should be aware of the various online age verification obligations in Irish and EU law that apply to them. Those that are within scope will need to assess the various risks that their services pose to children and decide on age verification mechanisms that are a proportionate response to protect against those risks.
- For example, online platforms that already prohibit access by children will be expected to justify the effectiveness of the mechanisms they have adopted, while online platforms that are open to all-comers will need to assess carefully whether age assurance/age verification measures are necessary at all given the nature of their services and, if so, how they can be applied in a manner that is proportionate to the specific risks to children presented by those services.
- Online platforms designated as VSPSs and audiovisual on-demand media service providers should ensure that they comply with the higher standards required by the Online Safety Code and the Media Service Code and Rules respectively, paying particular attention to the circumstances in which self-declaration is no longer an effective measure.
- Entities should also be alert to future developments in the law that may extend age verification obligations to them, particularly online platforms that are not VLOPs.
The law in this area is still developing, with the European Commission and regulatory bodies such as Coimisiún na Meán consulting a wide variety of stakeholders on the considerations that need to be taken into account to strike a balance between protecting minors online and respecting privacy and other fundamental rights. It is anticipated that this area of law will develop further over the next few years.
We would like to thank Emma Haddigan for her contribution to this article.