
EDPB Highlights Role of DPAs in the AI Framework

Following the publication of the EU Artificial Intelligence ("AI") Act (Regulation (EU) 2024/1689) in the Official Journal of the European Union ("OJEU") on 12 July 2024, the European Data Protection Board ("EDPB") has adopted a statement on the role of data protection authorities ("DPAs") in the AI framework.

The EDPB has recommended that DPAs should be designated under the AI Act as Market Surveillance Authorities ("MSAs") for high-risk AI systems used for law enforcement, border management, the administration of justice and democratic processes. The AI Act requires Member States to designate at least one notifying authority, and one or more MSAs, as national competent authorities by 2 August 2025, for the purpose of supervising and enforcing the AI Act.

In its statement, the EDPB points out that DPAs already have experience and expertise in carrying out enforcement actions on AI-related issues involving the processing of personal data at national and international level. The EDPB further states that the designation of DPAs as MSAs would benefit all stakeholders in the AI value chain by making available a single point of contact and facilitating interactions between the different regulatory bodies concerned by both the AI Act and EU data protection law.

In the view of the EDPB, such designation would ensure better coordination among different regulatory authorities, enhance legal certainty for all stakeholders and strengthen the supervision and enforcement of both the AI Act and EU data protection law. On this basis, the EDPB makes various recommendations on how such designation of DPAs as MSAs could be implemented.

It remains to be seen which national competent authority or authorities will be appointed by the Irish Government to supervise and enforce the AI Act. The Irish Department of Enterprise, Trade and Employment launched a public consultation on the implementation of the AI Act in May 2024, which closed on 16 July 2024.

On 18 July 2024, the Irish Data Protection Commission ("DPC") published its first guidance (in the form of a blog post) on the interplay of data protection laws and artificial intelligence. The guidance emphasises the importance of understanding how personal data is used in AI systems, ensuring transparency, and complying with the GDPR and data protection legislation. Given the DPC's role as lead supervisory authority for the many multinational technology companies whose EU headquarters are located in Ireland, the DPC's role in regulating AI providers' and deployers' compliance with data protection laws will be crucial. However, significant challenges would inevitably accompany any expanded role for the DPC in regulating compliance with the AI Act, in light of the considerable resources (including funding, personnel, knowledge and expertise) that would be required to supervise and enforce it.

EDPB Statement

The EDPB highlights that the purpose of the AI Act is to support innovation, promote the uptake of trustworthy AI, and ensure the protection of health, safety and the fundamental rights enshrined in the Charter of Fundamental Rights of the EU, including the rights to privacy and to the protection of personal data (Articles 7 and 8 of the Charter respectively). From this perspective, the EDPB notes that the AI Act and EU data protection legislation should be viewed and considered as complementary and mutually reinforcing instruments.

The EDPB emphasises that EU data protection law is fully applicable to the processing of personal data involved in the lifecycle of AI systems, as explicitly recognised in Article 2(7) and Recitals 9 and 10 of the AI Act. For this reason, national DPAs have been active with regard to these technological developments.

The EDPB considers the following points to be of particular importance:

  • DPAs, in addition to their expertise in AI technologies, are skilled in many of the areas referred to in Art. 70(3) AI Act, such as data computing and data security, and in assessing risks to fundamental rights posed by new technologies.
  • DPAs, due to their full independence, can provide effective independent supervision of AI systems (as required by Art. 70(1) AI Act).
  • DPAs, or other authorities subject to the same independence requirements, must be designated as MSAs for the high-risk AI systems listed in point 1 of Annex III of the AI Act, insofar as those systems are used for law enforcement purposes, border management, and justice and democracy, and for the high-risk AI systems listed in points 6, 7 and 8 of Annex III of the AI Act (as required by Art. 74(8) AI Act).
  • Where EU institutions, bodies, offices or agencies fall within the scope of the AI Act, the EDPS shall act as the competent authority for their supervision (as required by Art. 70(9) AI Act).
  • A close relationship is expected between the data protection impact assessment under the GDPR and the fundamental rights impact assessment under the AI Act.

Recommendations

The EDPB recommends that DPAs should be designated by Member States as MSAs for the high-risk AI systems referenced in Art. 74(8) AI Act, as well as for the remaining high-risk AI systems listed in Annex III, taking account of the views of the national DPAs (unless those high-risk AI systems are in sectors covered by a mandatory appointment required by the AI Act).

In addition, as the single point of contact under the AI Act should be an MSA (pursuant to Article 70(2) AI Act), DPAs (acting as MSAs) should be designated as the single points of contact for the public and for counterparts at Member State and EU levels.

From a broader perspective, the EDPB notes that there is a need for sound cooperation between MSAs and other entities tasked with the supervision of AI systems, including DPAs. Clear procedures are required in this regard pursuant to Article 74(10) AI Act. The EDPB recommends that such procedures be created and developed under the principle of sincere cooperation provided for by Article 4(3) of the Treaty on European Union, as highlighted by the Court of Justice of the EU in the Bundeskartellamt case (C-252/21). The EDPB states that, in this way, inconsistencies between decisions taken by different oversight authorities and bodies can be prevented, and synergies can be exploited in complementary enforcement actions, for the benefit of individuals and in the interests of legal certainty.

The EDPB further notes that where a general-purpose AI model entails the processing of personal data, it may, like any other AI system, also fall under the supervisory remit of the relevant national DPAs and of the EDPS. Accordingly, national DPAs and the EDPS must be involved whenever questions arise in relation to such processing.

Contact Us

Matheson's Technology and Innovation Group is available to guide you through the complexities of understanding your organisation's obligations under the AI Act. For more information, or if you would like assistance with putting in place an AI strategy, please contact any member of our Technology and Innovation Group or your usual Matheson contact.