Article 8 of the GDPR exists because children are not small adults. They interact with technology differently, they understand privacy risks differently, and European lawmakers decided they deserve a separate set of protections when their personal data is processed through online services. If your website or app collects data from users who might be under 16, this article affects you directly.

What Article 8 Actually Says

The provision is narrow but powerful. Where consent under Article 6(1)(a) is the legal basis for processing, and the processing relates to offering an information society service (ISS) directly to a child, the data subject must be at least 16 years old for their consent to be valid on its own. Below that age, consent must be given or authorised by the holder of parental responsibility.

EU member states may lower this threshold, but not below 13.

Article 8(2) adds a practical obligation: controllers must make reasonable efforts to verify that consent has genuinely been given or authorised by someone with parental responsibility, taking available technology into account. This is deliberately vague. The GDPR does not prescribe a specific verification method, which gives website operators flexibility but also creates uncertainty about what counts as "reasonable."

One often-missed carve-out appears in Recital 38: parental consent is not required for preventive or counselling services offered directly to a child. A mental health helpline chatbot aimed at teenagers, for instance, would not need to route through a parent first.

The Digital Age of Consent Varies by Country

The 16-year default is just a starting point. Most EU member states have exercised their right to set a lower threshold, creating a patchwork of ages across Europe. If your website serves visitors from multiple countries, you need to know which threshold applies where.

  • 13 years: Belgium, Denmark, Finland, Latvia, Portugal, Sweden, United Kingdom (under the UK GDPR)
  • 14 years: Austria, Bulgaria, Cyprus, Italy, Lithuania, Spain
  • 15 years: Czech Republic, France, Greece, Slovenia
  • 16 years (GDPR default): Germany, Hungary, Ireland, Luxembourg, Netherlands, Poland, Romania, Slovakia

This means a 14-year-old in Belgium can consent to data processing by an online service on their own, while a 14-year-old in Germany cannot. For a website with EU-wide traffic, the safest approach is either to apply the strictest threshold (16) universally or to use geo-detection to apply the correct national threshold based on the visitor's location.

Kukie.io's geo-detection feature can help identify where visitors are located, which feeds directly into determining which consent rules apply.
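
To make this concrete, here is a minimal TypeScript sketch of a country-to-threshold lookup of the kind a geo-detection feature might feed. The country assignments mirror the list above; the function names and the fall-back-to-16 behaviour are illustrative assumptions, not Kukie.io's actual API.

```typescript
// Digital age of consent by ISO 3166-1 alpha-2 country code.
// Mirrors the list above; check current national law before relying on it.
const CONSENT_AGE: Record<string, number> = {
  BE: 13, DK: 13, FI: 13, LV: 13, PT: 13, SE: 13, GB: 13,
  AT: 14, BG: 14, CY: 14, IT: 14, LT: 14, ES: 14,
  CZ: 15, FR: 15, GR: 15, SI: 15,
  DE: 16, HU: 16, IE: 16, LU: 16, NL: 16, PL: 16, RO: 16, SK: 16,
};

// Unknown or undetectable countries fall back to the strictest threshold.
function consentAgeFor(countryCode: string): number {
  return CONSENT_AGE[countryCode.toUpperCase()] ?? 16;
}

// Parental consent is needed when the declared age is below the local threshold.
function needsParentalConsent(declaredAge: number, countryCode: string): boolean {
  return declaredAge < consentAgeFor(countryCode);
}

console.log(needsParentalConsent(14, "BE")); // false: Belgium's threshold is 13
console.log(needsParentalConsent(14, "DE")); // true: Germany retains the 16 default
```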

What Counts as an Information Society Service?

Article 8 only applies to information society services offered directly to a child. The definition of an ISS comes from EU Directive 2015/1535: any service normally provided for remuneration, at a distance, by electronic means, and at the individual request of a recipient. That covers most commercial websites, apps, games, social media platforms, streaming services, and online marketplaces.

"Remuneration" does not mean the child has to pay. Ad-supported services qualify because the economic activity exists, even if the user pays nothing. Free mobile games, social media platforms, and educational apps funded by advertising are all information society services.

What falls outside? Public authority websites providing information (a school's homepage, a library catalogue) typically do not qualify. Offline services are excluded by definition. And services not directed at children - even if some children happen to use them - occupy a grey area that regulators are still working through.

"Directed to a Child" - A Contested Boundary

Article 8 applies when an ISS is offered "directly to a child." This phrase does more work than it might seem.

A children's game app with cartoon characters and bright colours is clearly directed at children. A professional accounting software tool is clearly not. Between those extremes sits a vast range of services - social media platforms, general-audience games, video sharing sites, messaging apps - where the answer is less obvious. The US approach under COPPA, the Children's Online Privacy Protection Act, uses specific criteria to determine whether a service is "directed to children": subject matter, visual content, use of animated characters, the age of models, music, and similar factors. The GDPR provides no equivalent list, leaving it to regulators and courts to fill the gap.

The UK's ICO took a broad stance with its Age Appropriate Design Code (also called the Children's Code), which applies to any service "likely to be accessed" by under-18s - a much wider net than "directed to." Under the UK GDPR framework, even services not aimed at children must comply if children are among their likely users. The Code's 15 standards include requirements for high privacy settings by default and restrictions on nudge techniques that encourage children to share more data.

Age Verification: What "Reasonable Efforts" Look Like

Article 8(2) requires controllers to make "reasonable efforts" to verify parental consent, but does not specify how. This has been one of the most debated aspects of the regulation since 2018.

The EDPB's 2020 Guidelines on Consent noted that verification measures should be "proportionate to the nature and risks of the processing activities." A low-risk newsletter sign-up might warrant less rigorous verification than a social media account that collects location data and enables public posting. In February 2025, the EDPB adopted Statement 1/2025 on Age Assurance, listing ten principles for the compliant processing of personal data when determining a user's age. The statement identifies three categories of age assurance: age estimation (inferring age from behavioural or physical signals), age verification (confirming age against an authoritative source), and self-declaration (the user states their own age).

Self-declaration alone - a simple date-of-birth field or a checkbox saying "I am over 16" - is widely considered insufficient for anything beyond the lowest-risk scenarios. The EDPB expressed serious doubts about TikTok's age gate during its 2023 investigation, noting that the platform's mechanism could be easily circumvented by anyone willing to enter a false birth date.

Practical verification methods range from light-touch to heavy:

  • Email-based confirmation - asking a child to provide a parent's email address, then sending a verification link. Simple, but easy to fake; a minimal sketch of this flow follows the list.
  • Credit card or payment verification - requiring a small transaction from a parent's payment method. Provides reasonable assurance but introduces friction.
  • Government ID checks - scanning a passport or driving licence. High assurance, but raises data minimisation concerns (you collect far more data than needed to confirm age).
  • Third-party age verification services - outsourcing to providers who confirm age without passing unnecessary personal data back to the service. The ICO's 2024 opinion on age assurance endorsed "waterfall techniques" that combine methods, such as a self-declaration step followed by AI-based behavioural analysis if the declared age appears inconsistent.
  • AI-based age estimation - using facial analysis or behavioural signals to estimate whether a user falls above or below a threshold. Increasingly used but raises accuracy and bias questions.
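
To make the email-based option concrete, below is a minimal sketch of a parental consent token flow, assuming an in-memory store and a stubbed email sender; the endpoint URL, the one-hour expiry, and helper names like sendEmail are illustrative assumptions, not a prescribed standard. Note that the pending record is deleted as soon as the check completes, anticipating the data minimisation point below.

```typescript
import { randomBytes } from "crypto";

// Pending consent requests, keyed by a single-use token.
// A real deployment would use a persistent store with a TTL.
interface PendingConsent {
  childAccountId: string;
  parentEmail: string;
  expiresAt: number;
}
const pending = new Map<string, PendingConsent>();

// Stub: replace with a call to your actual email provider.
function sendEmail(to: string, subject: string, body: string): void {
  console.log(`[email to ${to}] ${subject}\n${body}`);
}

// Step 1: the child supplies a parent's email; send a verification link.
function requestParentalConsent(childAccountId: string, parentEmail: string): void {
  const token = randomBytes(32).toString("hex"); // unguessable, single use
  pending.set(token, {
    childAccountId,
    parentEmail,
    expiresAt: Date.now() + 60 * 60 * 1000, // illustrative one-hour expiry
  });
  sendEmail(
    parentEmail,
    "Consent request for your child's account",
    `To authorise, open: https://example.com/consent/confirm?token=${token}`
  );
}

// Step 2: the parent opens the link; validate, then delete the record
// immediately so no verification data outlives the check (Article 5(1)(c)).
function confirmParentalConsent(token: string): boolean {
  const record = pending.get(token);
  pending.delete(token);
  if (!record || record.expiresAt < Date.now()) return false;
  // Record the authorisation against record.childAccountId in your consent log here.
  return true;
}
```

As the list notes, this flow alone is easy to fake; higher-risk processing should layer it with one of the stronger methods above.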

The EDPB's 2025 statement emphasised that age assurance must not create new data protection risks. Collecting a passport scan to verify a 14-year-old's age creates a honeypot of sensitive identity documents. Controllers must ensure that any data collected purely for verification is deleted immediately once the check is complete, in line with the data minimisation principle under Article 5(1)(c).

Enforcement: The Fines Are Real and Growing

Children's data has become one of the highest-priority enforcement areas for EU data protection authorities. The penalties handed down in recent years make the financial risk clear.

In September 2023, the Irish Data Protection Commission (DPC) fined TikTok EUR 345 million for GDPR violations related to its processing of children's data between July and December 2020. The investigation found that TikTok set accounts belonging to 13-17 year olds to public by default, enabled public commenting, and used design patterns that the EDPB described as unfair under Article 5(1)(a). The DPC fined TikTok a further EUR 530 million in 2025 in a separate case concerning transfers of European user data.

Meta's Instagram received a EUR 405 million fine from the DPC in September 2022 for similar issues - public-by-default account settings for children aged 13 to 17, and allowing children's contact information to be exposed through business account features.

These are not outliers. Several of the largest GDPR fines ever levied against social media platforms have concerned the mishandling of children's data. Regulators are signalling that child protection is not optional, and that claiming ignorance about child users is no defence.

Cookies, Tracking, and Children's Data

The intersection of cookie consent and children's data creates specific compliance challenges. If your website sets analytics cookies like _ga or advertising cookies like _fbp, and children visit your site, those cookies process children's personal data. The same consent requirements apply - and arguably more stringent ones, given the GDPR's expectation of heightened protection for minors.

Under the ePrivacy Directive (Article 5(3)), storing cookies on a user's device requires prior informed consent regardless of the user's age, except for strictly necessary cookies. But where the user is a child, the quality of that consent faces additional scrutiny. Can a 12-year-old meaningfully understand a cookie banner explaining that their browsing behaviour will be shared with 47 advertising partners? Almost certainly not.

The UK's Age Appropriate Design Code is explicit: analytics, personalisation, and advertising features that go beyond the core service should be switched off by default for users under 18. The ICO considers data collection to "improve" or "personalise" a user's experience as beyond core service provision, meaning it requires separate consent - and for under-13s in the UK, that consent must come from a parent.

For website operators, this means your cookie consent setup cannot be one-size-fits-all if children are among your users. You need to consider whether your cookie banner is comprehensible to younger audiences, whether non-essential cookies are genuinely off by default, and whether you have a mechanism to identify and handle child users differently.
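
One way to express that logic in a consent setup is sketched below in TypeScript, under the assumption that the site has some signal (a declared age plus the country lookup shown earlier, or an age-assurance check) that a visitor may be a child. The category names and function shapes are illustrative, not any specific consent tool's API.

```typescript
type CookieCategory = "necessary" | "analytics" | "personalisation" | "advertising";
type ConsentState = Record<CookieCategory, boolean>;

// Non-essential categories start off for every visitor (ePrivacy prior consent).
const DEFAULT_CONSENT: ConsentState = {
  necessary: true, // strictly necessary cookies are exempt from consent
  analytics: false,
  personalisation: false,
  advertising: false,
};

// A child below the local threshold can only switch a non-essential
// category on once the parental consent flow has completed.
function canGrantConsent(
  category: CookieCategory,
  declaredAge: number,
  thresholdAge: number, // e.g. from the country lookup shown earlier
  parentAuthorised: boolean
): boolean {
  if (category === "necessary") return true;
  if (declaredAge >= thresholdAge) return true;
  return parentAuthorised;
}

// Only inject a third-party script once its category has valid consent.
function loadScriptIfConsented(state: ConsentState, category: CookieCategory, src: string): void {
  if (!state[category]) return;
  const s = document.createElement("script");
  s.src = src;
  document.head.appendChild(s);
}
```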

Practical Steps for Website Operators

Compliance with Article 8 is not a single checkbox exercise. It requires a combination of technical, legal, and design measures working together.

Determine whether your service is likely accessed by children. If you operate a general-audience website and have no evidence that children use it, your obligations under Article 8 may be limited. But "no evidence" is not the same as "no children." If your analytics show users in age brackets that include minors, or if your service has features (games, social elements, educational content) that would appeal to younger users, assume Article 8 applies.

Conduct a Data Protection Impact Assessment. The EDPB's 2025 statement on age assurance specifically notes that age assurance processes are likely to present high risks to rights and freedoms, making a DPIA advisable in many cases. A DPIA should map what personal data you collect from children, through what mechanisms (including cookies), and what risks that processing poses.

Implement age-appropriate consent mechanisms. Your consent flow should use clear, plain language suitable for the youngest users likely to encounter it. Avoid legal jargon. The GDPR explicitly requires that where information is addressed to a child, it should be in language the child can easily understand (Recital 58). A cookie consent banner shown to a 13-year-old should not read the same as one shown to a 35-year-old IT professional.

Apply the correct age threshold based on the user's location. Use geo-detection to determine which national implementation of Article 8 applies. A visitor from France triggers the 15-year threshold; a visitor from Denmark triggers 13; a visitor from Ireland triggers the 16-year default.

Choose proportionate age verification. Match your verification method to the risk level of your processing. A basic content site might reasonably rely on a self-declaration age gate. A service that profiles users, enables social interaction, or serves behavioural advertising needs something more robust.
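
As an illustration of that proportionality principle, the sketch below maps a rough risk tier to a verification method. The tiers and the mapping are assumptions offered as a starting point for a DPIA discussion, not a regulator-approved scale.

```typescript
type RiskTier = "low" | "medium" | "high";
type VerificationMethod =
  | "self-declaration"
  | "email-confirmation"
  | "payment-card-check"
  | "third-party-age-assurance";

// Illustrative mapping, echoing the EDPB's "proportionate to the nature
// and risks of the processing activities" language.
function verificationFor(tier: RiskTier): VerificationMethod {
  switch (tier) {
    case "low":
      return "self-declaration"; // e.g. a basic content site
    case "medium":
      return "email-confirmation"; // e.g. accounts with limited profiling
    case "high":
      return "third-party-age-assurance"; // e.g. social features, behavioural ads
  }
}
```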

Default to privacy for child users. Non-essential cookies, tracking pixels, and third-party scripts should be off by default for anyone identified or reasonably suspected to be a child. This aligns with both Article 25 GDPR (data protection by design and by default) and the ICO's Children's Code.

The Regulatory Direction: Stricter Rules Ahead

If anything, Article 8's requirements are likely to intensify rather than soften. Several developments point toward stricter regulation of children's data across Europe.

The EU's Digital Services Act (DSA), fully applicable since February 2024, requires very large online platforms to assess and mitigate risks to minors, including through age verification. The EDPB's February 2025 statement on age assurance was explicitly positioned to support consistent enforcement across both the GDPR and the DSA.

France enacted its SREN law in May 2024, granting the ARCOM regulator authority to enforce age verification requirements for pornographic content and expanding enforcement powers to include administrative fines and site blocking. France's "digital majority" law of 2023 sought to restrict social media access for under-15s without parental approval, though implementation has stalled over conflicts with the EU's e-Commerce Directive. In June 2025, French President Macron pushed for EU-wide age checks on social media, gaining support from at least 13 member states.

Denmark announced in November 2025 a political agreement to set a national minimum age of 15 for social media access, with parental consent required for 13-14 year olds. The European Parliament adopted a resolution calling for a harmonised European digital age limit of 16 for social media, with a hard floor of 13 below which no minor could access such platforms at all.

For website operators, the message is clear: invest in proper age assurance and consent mechanisms now, because the regulatory bar is only going up.

Frequently Asked Questions

What age can a child consent to data processing under GDPR?

The GDPR default is 16, but each EU member state can set a lower threshold, down to a floor of 13. The UK, under the UK GDPR, also uses 13. France uses 15, while Germany and Ireland retain the default of 16. Below the applicable threshold, a parent or guardian must give or authorise consent.

Does GDPR Article 8 apply to all websites or only those aimed at children?

Article 8 applies specifically to information society services offered "directly to a child" where consent is the legal basis for processing. General-audience websites may still be caught if they are likely to be accessed by children, particularly under the UK's Age Appropriate Design Code which applies to any ISS likely to be used by under-18s.

How do I verify a parent has given consent for their child's data?

The GDPR requires "reasonable efforts" proportionate to the risk. Options include email confirmation loops, payment card verification, government ID checks, or third-party age verification services. Simple self-declaration checkboxes are generally not considered sufficient for higher-risk processing activities.

Can I use cookies on a website that children visit?

Strictly necessary cookies do not require consent regardless of the user's age. All other cookies - analytics, advertising, social media - require prior informed consent under the ePrivacy Directive. Where the user is a child below the applicable age threshold, that consent must come from a parent or guardian, and the consent interface should use language a child can understand.

What are the fines for mishandling children's data under GDPR?

The maximum fine under the GDPR is EUR 20 million or 4% of global annual turnover, whichever is higher. In practice, fines involving children's data have been substantial: TikTok was fined EUR 345 million in 2023 for children's data violations and a further EUR 530 million in 2025 over data transfers, while Instagram received a EUR 405 million penalty in 2022 for its handling of children's account settings.

Do preventive or counselling services need parental consent?

No. Recital 38 carves these out: where a preventive or counselling service is offered directly to a child, parental consent is not required. A mental health helpline or counselling chatbot aimed at teenagers can obtain consent from the child directly.