A Data Protection Impact Assessment - commonly shortened to DPIA - is a structured process for identifying and minimising the privacy risks of a specific data processing activity. Under Article 35 of the GDPR, carrying one out is not optional when the processing you plan is likely to result in a high risk to the rights and freedoms of individuals. The assessment must happen before you begin processing, not after.

That sounds straightforward enough. The difficulty lies in the phrase "likely to result in a high risk". The GDPR does not define it precisely, and regulators have spent years clarifying what it means in practice. This article breaks down the legal triggers, the practical criteria published by the European Data Protection Board (EDPB), and the situations where a website owner or marketer is most likely to need one.

What Exactly Is a DPIA?

A DPIA is a documented risk assessment. It describes the processing you intend to carry out, evaluates whether it is necessary and proportionate, identifies risks to individuals, and sets out the measures you will take to address those risks. Article 35(7) of the GDPR lists the minimum contents:

  • A systematic description of the processing operations and their purposes
  • An assessment of necessity and proportionality
  • An assessment of the risks to the rights and freedoms of data subjects
  • The safeguards and security measures planned to mitigate those risks

A DPIA is not a one-off checkbox. The ICO describes it as a "living process" - something you revisit whenever circumstances change, new risks emerge, or you alter the scope of your processing. If a new security flaw is discovered, or you expand the data you collect, you should review and update the assessment.

The Three Automatic Triggers in Article 35(3)

Article 35(3) of the GDPR lists three scenarios where a DPIA is always required. No judgement call needed - if your processing falls into one of these categories, you must carry out an assessment.

Systematic and extensive profiling with significant effects. If you evaluate personal aspects of individuals based on automated processing (including profiling), and those evaluations produce legal effects or similarly significant consequences, a DPIA is mandatory. Credit scoring, automated insurance underwriting, and algorithmic hiring decisions all fall squarely into this category.

Large-scale processing of special category data. Special category data includes information about racial or ethnic origin, political opinions, religious beliefs, trade union membership, genetic and biometric data, health data, and data about sex life or sexual orientation. Processing any of these at scale - or criminal conviction data - triggers the DPIA requirement automatically. A hospital implementing a new patient database, for instance, would need one.

Systematic monitoring of publicly accessible areas on a large scale. CCTV systems in shopping centres, public transport hubs, or city streets are the classic example. If you are recording individuals in public spaces in a systematic way, you need a DPIA before switching the cameras on.

These three scenarios are explicitly called out in the regulation. But they are not exhaustive. Article 35(1) makes clear that any processing likely to result in high risk requires a DPIA, whether or not it matches one of these three examples.

The EDPB's Nine Criteria for High-Risk Processing

To help controllers decide when processing crosses the "high risk" threshold, the EDPB endorsed guidelines (WP248 rev.01) setting out nine criteria. Meeting two or more of these criteria generally means a DPIA is required, though meeting even one criterion may be sufficient depending on the context.

1. Evaluation or scoring. Profiling or predicting behaviour, performance, preferences, or interests. Examples: credit reference checks, behavioural advertising.

2. Automated decision-making with legal or similar effects. Processing that leads to decisions affecting access to services, employment, or credit. Examples: automated loan approvals, algorithmic recruitment.

3. Systematic monitoring. Observing, monitoring, or controlling data subjects. Examples: employee location tracking, website behaviour monitoring at scale.

4. Sensitive or highly personal data. Special categories under Article 9, criminal data under Article 10, or data considered highly personal, such as financial records or location data. Examples: health apps, biometric access systems.

5. Large-scale processing. Processing that affects a significant number of data subjects. Examples: national loyalty programmes, large e-commerce platforms.

6. Matching or combining datasets. Combining data from multiple sources in ways that exceed individuals' expectations. Example: merging CRM data with third-party advertising profiles.

7. Vulnerable data subjects. Processing data of children, employees, patients, or others affected by a power imbalance. Examples: EdTech platforms, employee monitoring systems.

8. Innovative technological or organisational solutions. Using new or emerging technology where the privacy impact is not yet fully understood. Examples: fingerprint access combined with facial recognition, IoT devices.

9. Processing that prevents data subjects from exercising a right or using a service. Blocking or restricting individuals based on data processing. Example: a bank denying services based on automated screening.

The EDPB advises that the more criteria your processing meets, the more likely it is to present a high risk. Meeting two criteria is the general threshold, but one criterion alone - particularly where innovative technology or vulnerable data subjects are involved - can be enough.

National DPA Lists Add Extra Triggers

Article 35(4) of the GDPR requires each national supervisory authority to publish its own list of processing operations that mandate a DPIA. These lists supplement the EDPB criteria and vary between member states. The EDPB has issued opinions on 22 national lists to promote consistency, but differences remain.

The Irish Data Protection Commission (DPC), for example, specifically calls out processing that involves systematically monitoring or tracking individuals' location or behaviour on a large scale, and processing that combines or cross-references datasets to profile or analyse user behaviour. Both of these are directly relevant to websites using advertising cookies, remarketing pixels, or cross-device tracking.

The French CNIL has taken a particularly active stance on DPIAs in the context of artificial intelligence. In its 2024 guidance, the CNIL stated that the development of foundation models or general-purpose AI systems requires a DPIA when it involves processing personal data, even where those models are not classified as high-risk under the EU AI Act. The reasoning: the wide range of potential future uses makes it impossible to assess all risks upfront.

When Does a Website Owner Need a DPIA?

Most small business websites - a company blog, a brochure site, a simple online shop - do not process personal data at a scale or intensity that triggers the DPIA requirement. Setting a handful of functional cookies and running basic server-side analytics is unlikely to qualify as high-risk processing.

That changes quickly once you introduce more complex tracking. Consider these scenarios where a DPIA becomes necessary or strongly advisable:

Cross-site tracking and behavioural advertising. If you use third-party cookies or tracking pixels (such as _fbp from Meta or advertising cookies from programmatic networks) to build profiles of visitors across multiple websites, you are matching datasets and engaging in systematic monitoring. Two EDPB criteria met - a DPIA is warranted.

Combining analytics with registration data. The Irish DPC flagged this exact scenario in its cookie sweep report. If you link analytics cookie data with user account information to identify which pages specific individuals have read, you are profiling individuals. That processing requires both consent and a DPIA.

Large-scale e-commerce with personalisation. A site with hundreds of thousands of users that processes purchase history, browsing behaviour, and location data to serve personalised product recommendations is processing at scale, using evaluation/scoring techniques, and potentially combining datasets. Multiple criteria apply.

Health, financial, or children's data. Any website collecting special category data or data from minors triggers additional scrutiny. An online pharmacy, a mental health app, or a children's educational platform should treat a DPIA as a baseline requirement.

Not sure which cookies and trackers your site actually sets? That is the right starting point. A comprehensive cookie scan identifies every first-party and third-party cookie, local storage item, and pixel on your site. Without this audit, you cannot meaningfully assess whether your processing is high-risk. Kukie.io's cookie scanner detects and categorises these automatically, giving you the raw data a DPIA needs.
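As a rough first pass before a full scan, you can at least enumerate the cookies a page sets via HTTP response headers. A minimal sketch using only Python's standard library - note that JavaScript-set cookies (which include most analytics and advertising cookies such as _ga or _fbp) never appear in these headers, which is exactly why a browser-based scanner is still needed:

```python
from http.cookies import SimpleCookie
from urllib.request import urlopen


def parse_set_cookie_headers(headers: list[tuple[str, str]]) -> dict[str, str]:
    """Collect cookie names and values from Set-Cookie response headers."""
    cookies: dict[str, str] = {}
    for name, value in headers:
        if name.lower() == "set-cookie":
            jar = SimpleCookie()
            jar.load(value)
            for cookie_name, morsel in jar.items():
                cookies[cookie_name] = morsel.value
    return cookies


def server_set_cookies(url: str) -> dict[str, str]:
    """Fetch a page and report only the cookies set via HTTP headers.
    JavaScript-set cookies are invisible at this layer."""
    with urlopen(url) as resp:
        return parse_set_cookie_headers(resp.getheaders())
```

This catches server-set session cookies but will miss the client-side trackers that matter most for a DPIA, so treat it as a starting point, not an audit.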

The EU AI Act and DPIAs: A New Overlap

The EU AI Act entered into force in August 2024, with obligations being phased in through to 2027. While the AI Act introduces its own conformity assessment framework for high-risk AI systems, it does not replace the GDPR's DPIA requirement. If your AI system processes personal data - and most do - you may need to complete both a DPIA under the GDPR and a conformity assessment under the AI Act.

Article 26(9) of the AI Act explicitly requires deployers of high-risk AI systems to use the information provided by the AI system's provider to carry out a DPIA under Article 35 GDPR. The two frameworks are designed to coexist, not to duplicate effort. The CNIL has noted that the documentation prepared for AI Act compliance can feed into the DPIA, provided the GDPR's own requirements under Article 35(7) are all addressed.

For website owners, this matters if you deploy AI-powered tools - chatbots that process visitor queries, recommendation engines that profile users, or automated content moderation systems. Each of these could qualify as both a high-risk AI system and a high-risk data processing operation under the GDPR.

What Happens If You Skip the DPIA?

Failing to carry out a DPIA when one is required is itself a GDPR infringement, regardless of whether any data breach occurs. Under Article 83(4)(a), supervisory authorities can impose fines of up to 10 million euros or 2% of global annual turnover, whichever is higher, for DPIA-related failures.

The ICO puts the UK-specific figure at up to 8.7 million pounds or 2% of global turnover, whichever is higher. And since the UK's Data (Use and Access) Act 2025 retained the existing DPIA requirements - an earlier proposal under the DPDI Bill to introduce more flexible "high-risk processing assessments" was dropped - the obligation remains firmly in place for UK-based controllers.

Enforcement has been tangible. When the Irish DPC fined Meta's Instagram 405 million euros in 2022 for mishandling children's data, one of the findings was that Instagram had failed to conduct proper DPIAs for its business account feature, which made teenagers' contact information publicly visible. The absence of a DPIA was not the headline violation, but it formed part of the regulatory finding and contributed to the penalty.

Beyond fines, the practical risk is significant. Without a DPIA, you have no documented evidence that you considered the privacy impact of your processing. If a complaint is filed or a breach occurs, regulators will ask for your DPIA. If it does not exist, the conversation becomes substantially more difficult.

How to Decide Whether You Need a DPIA: A Practical Checklist

Run through these questions. If you answer "yes" to two or more, you should carry out a DPIA. If you answer "yes" to even one - particularly questions 4, 7, or 8 - you should seriously consider it.

1. Does your processing evaluate, score, or profile individuals? (Evaluation/scoring)

2. Do automated decisions affect individuals' access to services, contracts, or rights? (Automated decision-making)

3. Do you systematically monitor individuals, including online behaviour tracking at scale? (Systematic monitoring)

4. Do you process special category data, criminal data, or highly personal data such as financial or location records? (Sensitive data)

5. Is the processing carried out on a large scale, affecting many individuals or involving large volumes of data? (Large-scale processing)

6. Do you combine, match, or cross-reference datasets from different sources? (Matching/combining datasets)

7. Does the processing involve vulnerable individuals such as children, employees, or patients? (Vulnerable data subjects)

8. Are you using new or innovative technology where the privacy impact is not yet well understood? (Innovative technology)

9. Could the processing prevent someone from exercising a right or accessing a service? (Preventing rights/services)
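The rule of thumb behind this checklist is mechanical enough to sketch in code. A hypothetical Python helper - the criterion identifiers are our own shorthand, and the output is a prompt to carry out an assessment, not a legal determination:

```python
# The nine EDPB criteria from WP248 rev.01, as shorthand identifiers
# (the names are our own labels, not official terminology).
EDPB_CRITERIA = {
    "evaluation_or_scoring",
    "automated_decision_making",
    "systematic_monitoring",
    "sensitive_data",
    "large_scale",
    "matching_datasets",
    "vulnerable_subjects",
    "innovative_technology",
    "prevents_rights_or_services",
}


def dpia_recommended(met: set[str]) -> bool:
    """Rule of thumb: two or more criteria means a DPIA is required.
    A single high-weight criterion (sensitive data, vulnerable subjects,
    innovative technology) is also treated as enough here."""
    unknown = met - EDPB_CRITERIA
    if unknown:
        raise ValueError(f"unknown criteria: {unknown}")
    high_weight = {"sensitive_data", "vulnerable_subjects", "innovative_technology"}
    return len(met) >= 2 or bool(met & high_weight)


# Example: cross-site behavioural advertising meets two criteria.
print(dpia_recommended({"evaluation_or_scoring", "systematic_monitoring"}))  # True
```

Treating questions 4, 7, and 8 as individually sufficient mirrors the guidance above; a real decision should still be documented with reasoning, not just a boolean.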

Even if none of these apply strictly, conducting a voluntary DPIA is considered good practice. The European Commission notes that a DPIA should be treated as a "living tool, not merely a one-off exercise", and many organisations carry them out as standard whenever launching a new project that touches personal data.

What Should a DPIA Contain?

The GDPR sets minimum requirements, but most supervisory authorities recommend a more detailed approach. A well-structured DPIA typically includes these elements:

Scope and description. What data are you collecting? From whom? Through what means? Where is it stored and who has access? Include technical details - cookie names, tracking scripts, database structures, third-party processors. If your site sets _ga, _gid, and _fbp cookies, name them explicitly.
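One way to keep that description systematic is to record each cookie or tracker as a structured entry. A hypothetical sketch in Python - the field names are illustrative rather than a prescribed DPIA format, and the retention periods shown are typical defaults you should verify against your own configuration:

```python
from dataclasses import dataclass


@dataclass
class CookieRecord:
    """One entry in the DPIA's systematic description of processing.
    Fields are illustrative, not a mandated schema."""
    name: str             # cookie name exactly as set on the device
    provider: str         # first party, or the named third party
    purpose: str          # analytics, advertising, functional, ...
    data_collected: str   # what the cookie actually stores or enables
    retention: str        # lifetime; check your own configuration


inventory = [
    CookieRecord("_ga", "Google Analytics", "analytics",
                 "pseudonymous client identifier", "2 years (typical default)"),
    CookieRecord("_fbp", "Meta", "advertising",
                 "browser identifier for ad attribution", "90 days (typical default)"),
]
```

Keeping the inventory as data rather than prose also makes it easy to diff the DPIA against the output of a fresh cookie scan when you revisit the assessment.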

Necessity and proportionality. Why do you need this specific processing? Could you achieve the same purpose with less data or less intrusive methods? This is where data minimisation principles bite hardest. If you are collecting location data to serve regional content, do you actually need precise GPS coordinates, or would a country-level lookup from the IP address suffice?
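For the location example, coarsening the IP address before any lookup is a concrete data minimisation measure you can document in this section. A sketch using Python's standard ipaddress module - the /24 and /48 prefix lengths are common anonymisation choices, not a legal standard:

```python
import ipaddress


def coarsen_ip(ip_str: str) -> str:
    """Zero the host portion of an IP address so it identifies a
    network rather than an individual device. Country-level geolocation
    still works on the coarsened address."""
    ip = ipaddress.ip_address(ip_str)
    prefix = 24 if ip.version == 4 else 48
    net = ipaddress.ip_network(f"{ip}/{prefix}", strict=False)
    return str(net.network_address)


print(coarsen_ip("203.0.113.42"))  # -> "203.0.113.0"
```

Doing this before the address ever reaches storage or a geolocation service is a stronger proportionality argument than truncating it afterwards.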

Risk identification. What could go wrong for the individuals whose data you process? Risks include unauthorised access, data breaches, function creep (using data for purposes beyond what individuals expected), algorithmic bias, and loss of individual autonomy through invasive profiling.

Mitigation measures. For each identified risk, document the safeguard you will put in place. Encryption at rest and in transit, access controls, pseudonymisation, retention limits, consent mechanisms, and regular security audits all count. If your website uses a consent management platform to block non-essential cookies until the visitor actively opts in, that is a mitigation measure worth documenting.

Residual risk assessment. After applying your safeguards, is there still a high risk? If so, Article 36 of the GDPR requires you to consult your supervisory authority before proceeding. You cannot simply press ahead.

The ICO publishes a downloadable DPIA template in .docx format that many organisations use as a starting point. It is free and thorough.

DPIAs and Cookie Consent: Where They Overlap

Cookie consent and DPIAs serve different purposes but intersect in practice. The Kukie.io blog covers cookie categories and consent requirements in detail elsewhere, but the key distinction here is straightforward. Cookie consent - governed by Article 5(3) of the ePrivacy Directive - addresses the act of placing or reading information on a user's device. The DPIA addresses the wider processing that happens with the data those cookies collect.

A website that sets advertising and functional cookies is performing two legally distinct actions: storing data on the device (ePrivacy) and processing personal data derived from that storage (GDPR). The DPIA covers the second part. But the information you gather during a cookie audit - which cookies exist, what data they collect, which third parties receive that data, how l