Many IT managers and compliance officers ask the same question: "We already do DPIA under GDPR — do we now need another document?" The answer is yes and no. DPIA and FRIA overlap on important points, but they are not the same, and they do not fulfil each other's requirements.
This article explains the difference, when you need which, and how to build a coordinated workflow that avoids duplication of effort.
What Is a DPIA?
A Data Protection Impact Assessment (DPIA) is a requirement under GDPR (Article 35). You are obliged to conduct a DPIA before initiating processing activities that are likely to result in a high risk to the rights and freedoms of natural persons.
Data protection authorities have identified a list of processing activities requiring DPIA — including systematic profiling, large-scale processing of sensitive personal data, and surveillance of publicly accessible areas.
DPIA focuses on:
- Proportionality and necessity of data processing
- Risks to the rights and freedoms of data subjects
- Technical and organisational security measures
- Consultation with the DPO, with affected data subjects where appropriate, and with the supervisory authority where high residual risk remains (Article 36)
DPIA is a data hygiene document. It is primarily about personal data, processing grounds, and data protection by design and by default.
What Is a FRIA?
A Fundamental Rights Impact Assessment (FRIA) is a requirement under the EU AI Act (Article 27) for deployers of high-risk AI systems. FRIA is a newer requirement with no direct equivalent under GDPR, and it has a broader focus than DPIA.
FRIA asks questions like: Which fundamental rights can this AI system potentially affect? Which individuals and groups are affected? What is the probability and severity of negative effects? And what is your mitigation plan?
FRIA focuses on:
- Impact on fundamental rights — including the right to equal treatment, human dignity, the right to privacy, the right to fair treatment
- Broader than personal data: the right to employment, access to education, social services
- Vulnerable groups: children, elderly, people with disabilities, minorities
- The system's intended and unintended effects
FRIA is a rights document. It is about people, not just data.
Overlap and Differences — A Comparison
| Dimension | DPIA | FRIA |
|---|---|---|
| Legal basis | GDPR Article 35 | EU AI Act Article 27 |
| Purpose | Protect personal data | Protect fundamental rights |
| Trigger | High-risk data processing | High-risk AI system (Annex III) |
| Focus | Data and processing activity | AI system's effects on individuals |
| Scope | Personal data | Rights and freedoms (broader) |
| Responsible | DPO + data controller | AI responsible + system owners |
| Review requirement | When processing changes | At least every 3 years or when changes occur |
When do they overlap?
When an AI system processes personal data and is high-risk under the AI Act, it requires both assessments. This is the case for:
- AI-driven recruitment systems that process CV data
- AI systems for credit assessment that process personal financial data
- Biometric identification systems
- AI-based health assessment
In these cases, DPIA and FRIA should be coordinated — but they are still separate documents with separate purposes.
When Do You Need Which?
DPIA only (not FRIA):
- Large-scale processing of personal data without an AI component
- Profiling based on rule-based (not machine learning-based) systems
- Data studies and analyses without direct individual consequences
FRIA only (not DPIA):
- High-risk AI systems that do not process personal data
- AI management of critical physical infrastructure (Annex III category 2) without a data-intensive component
- AI systems for socially critical decisions that do not involve personal data
Both (DPIA + FRIA):
- High-risk AI systems that process personal data
- AI recruitment tools
- AI credit assessment systems
- AI-based health assessment
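The decision logic above can be sketched as two simple predicates. This is an illustrative sketch, not legal advice; the class and field names are assumptions for the example, not terms from either regulation.

```python
from dataclasses import dataclass

@dataclass
class AiSystem:
    """Simplified profile of a system under assessment (illustrative only)."""
    is_high_risk_annex_iii: bool   # high-risk under EU AI Act Annex III
    processes_personal_data: bool  # processes personal data in scope of GDPR
    high_risk_processing: bool     # GDPR Article 35 high-risk processing trigger

def needs_fria(s: AiSystem) -> bool:
    # FRIA is triggered by the AI Act's high-risk classification,
    # regardless of whether personal data is involved.
    return s.is_high_risk_annex_iii

def needs_dpia(s: AiSystem) -> bool:
    # DPIA is triggered by high-risk processing of personal data (GDPR Art. 35).
    return s.processes_personal_data and s.high_risk_processing

# Example: an AI recruitment tool processing CV data needs both.
recruiting = AiSystem(True, True, True)
print(needs_fria(recruiting), needs_dpia(recruiting))  # True True
```

The point of the sketch: the two triggers are independent, so a system can satisfy one, both, or neither.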
Who Does What?
DPIA is typically the DPO's (Data Protection Officer's) area of responsibility, coordinated with IT and the business. The DPO has the competence for GDPR analysis and knows the processing register.
FRIA is typically the AI responsible or IT manager's responsibility, coordinated with HR (for employment systems), finance (for credit assessment), and possibly legal.
In many midmarket organisations, the DPO role and IT management sit with the same two or three people. That makes coordination simpler, but both assessments still need to be conducted with the correct focus.
A Practical Workflow for Coordination
Here is a five-step process for organisations that need to conduct both assessments for a high-risk AI system:
Step 1: System mapping (joint)
Describe the system: what it does, who uses it, what data is processed, and which individuals are affected. This starting point is shared between both assessments.
Step 2: DPIA track (DPO-led)
Run the traditional DPIA analysis: lawful basis for processing, proportionality, security measures, and consultation with the DPO and, where residual risk remains, the supervisory authority. Output: a DPIA report.
Step 3: FRIA track (AI responsible-led)
Assess the five dimensions in the AI Act's FRIA requirements:
- Which categories of individuals are affected?
- Which fundamental rights can the system affect?
- What is the probability and severity of negative effects?
- What mitigation measures are in place?
- Is it possible for those affected to complain or seek redress?
Step 4: Coordination review (joint)
Review whether the DPIA and FRIA analyses are consistent. Identify overlaps and contradictory conclusions. Document the coordination points.
Step 5: Periodic revision (joint)
The DPIA is revised when the processing changes. The FRIA is revised at least every three years or when the system changes. Set a reminder.
Practical Mistakes That Cost Time
Mistake 1: Treating FRIA as a DPIA with a new name
FRIA is broader than DPIA. The right to equal treatment, the right to work, and the right to education are fundamental rights that are not personal data issues. A FRIA that only focuses on personal data is incomplete.
Mistake 2: Ignoring FRIA because you already have a DPIA
This happens regularly. The argument is "we did a thorough DPIA, so that must be enough." It is not. The AI Act explicitly requires deployers of high-risk AI systems to conduct a FRIA.
Mistake 3: Conducting both assessments in isolation
If the DPO conducts the DPIA without knowing the FRIA results, and vice versa, you risk inconsistent conclusions. The coordination review in Step 4 exists for exactly this reason.
Mistake 4: Conducting FRIA once and never updating
AI systems change. Vendors update models. Use scenarios expand. FRIA is not a "set and forget" document. Build a review reminder into your governance calendar from the start — at minimum once every three years, and after every significant system change. The three-year requirement is a floor, not a target: high-risk systems in active use warrant annual review.
Documentation Requirements and Retention Periods
Both DPIA and FRIA are documents that must be retained and kept up to date. Here are the relevant requirements:
DPIA:
- Must be retained for as long as the processing activity takes place
- Must be updated when there are significant changes to the processing
- Must be made available to the supervisory authority on request
- No specific retention period after processing ends — recommended minimum 3 years
FRIA:
- Must be retained for at least 10 years after deployment (AI Act Article 27)
- Must be updated when there are significant changes to the system's functionality or use
- Must be updated at least every three years
- Must be made available to supervisory authorities on request
It is strongly recommended that DPIA and FRIA for the same system are stored together in your governance documentation, with clear version history and date of last review.
Six Questions to Determine Whether You Need a FRIA
Use these questions as a first filter:
- Is the system classified as high-risk under Annex III (recruitment, credit, health, biometrics, etc.)? Yes → FRIA is required.
- Is the system used for decisions affecting individuals' access to resources, services, or employment? Yes → FRIA is likely required, even without Annex III classification.
- Does the system process personal data? Yes → Consider whether a DPIA is required in parallel.
- Are you a deployer (using someone else's system)? Yes → You are obligated to conduct a FRIA as deployer.
- Are you a provider (building the system)? Yes → Additional requirements beyond FRIA.
- Is the system new or significantly changed? Yes → New FRIA (or update of existing) before deployment.
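The six-question filter above can be expressed as a small mapping from answers to follow-up actions. A sketch only: the dictionary keys and action strings are assumptions for the example, not AI Act terminology, and none of this substitutes for legal review.

```python
def fria_filter(answers: dict[str, bool]) -> list[str]:
    """Map yes/no answers to the six screening questions onto follow-up actions.
    Keys are illustrative; adapt them to your own intake form."""
    actions: list[str] = []
    if answers.get("annex_iii_high_risk"):
        actions.append("FRIA required (Annex III classification)")
    elif answers.get("affects_access_to_resources"):
        actions.append("FRIA likely required (decisions on access to resources)")
    if answers.get("processes_personal_data"):
        actions.append("Assess whether a DPIA is required in parallel")
    if answers.get("is_deployer"):
        actions.append("The FRIA obligation sits with you as deployer")
    if answers.get("is_provider"):
        actions.append("Check provider obligations beyond FRIA")
    if answers.get("new_or_significantly_changed"):
        actions.append("Run or update the FRIA before deployment")
    return actions

# Example: a newly procured AI recruitment tool processing CV data.
for action in fria_filter({
    "annex_iii_high_risk": True,
    "processes_personal_data": True,
    "is_deployer": True,
    "new_or_significantly_changed": True,
}):
    print(action)
```

Used as a first filter in an intake form, this turns the screening into a repeatable step rather than an ad-hoc judgement per system.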
Summary
DPIA and FRIA are complementary, not redundant. DPIA protects personal data under GDPR. FRIA assesses broader effects on fundamental rights under the AI Act. Both are required for high-risk AI systems that process personal data.
The good news: much of the preparatory work is the same. System mapping, user analysis, and impact assessment are reused across both documents. If your organisation already has a mature DPIA process, the marginal effort to run a coordinated FRIA for the same high-risk system is lower than starting from scratch.
Use DPIA and FRIA as a systematisation exercise, not as a compliance burden. They force you to understand your own AI systems better — and that is worth every hour spent.