The Data Protection Agency has identified three priority areas for guidance and supervision in 2026: the use of AI in the public sector, data protection for children and young people, and processing by law enforcement authorities. For organisations that handle personal data, this means heightened scrutiny of certain types of processing and stronger expectations for clear governance and documentation. These are areas where many organisations benefit from the support of an experienced data protection lawyer to ensure compliance in a practical and defensible way.

Background – why the Data Protection Agency’s priorities matter

The Agency selects areas each year where privacy risks are considered particularly significant or where compliance needs to improve. These priorities guide both supervision and guidance, which in practice means that organisations operating within or close to these areas can expect increased attention to how they process personal data.

The data protection framework is based on accountability: the controller must be able to demonstrate compliance with the GDPR. When the Agency places certain areas under the spotlight, expectations rise for robust documentation, risk analysis, contracts and operating procedures, including vendor management and records of processing activities.

AI in the public sector – transparency, risk and the data protection impact assessment

Public bodies are deploying AI-based systems across an increasing number of processes. For individual citizens there is often no realistic alternative to the digital service provided, which raises the bar for a well-structured approach to data protection when AI is used in the public sector.

Key questions for public-sector actors include:

  • Have we conducted an adequate data protection impact assessment (DPIA) for the AI solution, alongside an AI risk assessment for high-risk processing?
  • Do data subjects know when and how AI is used, and can we explain this clearly in a privacy notice?
  • How are special category data and other highly protected information handled within the AI systems, including controls on cloud service providers and other third-party vendors?
  • Are the roles and responsibilities of controller and processor clearly defined and governed by a data processing agreement?

Because both data protection rules and the EU's AI regulation, which is being phased in gradually, affect this area, the public sector needs to work systematically with governance documents, risk analysis and follow-up, supported by access controls, audit logs and data protection training for staff.

Children and young people – data requiring special protection

Children and young people are among the most active users of digital services, while finding it harder to grasp the consequences of how their personal data are used. The GDPR emphasises that children’s data require special protection, which tightens expectations on both public and private actors.

Typical organisations affected include schools, municipalities, app and platform providers targeting children, healthcare providers and associations. For these organisations, the following issues are central:

  • Is the information provided to children and guardians clear, age-appropriate and easy to understand, and reflected consistently in the privacy notice?
  • What legal basis for processing is used, and is it appropriate in the context of children?
  • Which third-party vendors are used (for example cloud service providers and analytics tools), and how do we control their processing through vendor management and suitable data processing agreements?
  • Do we have incident handling procedures to limit or stop inappropriate processing quickly, and do our access controls and audit logs support rapid response?

Law enforcement – powerful tools and significant privacy risks

Law enforcement authorities need effective tools to prevent and investigate crime. At the same time, tools such as covert surveillance, broad monitoring or the collection of biometric data entail major intrusions into privacy.

Processing of personal data for law enforcement purposes is partly governed by legislation other than the GDPR. When the Data Protection Agency focuses on law enforcement, the following aspects become important:

  • Purpose limitation – are the data used only for lawfully defined purposes?
  • Data minimisation – is more data collected than necessary for the purpose?
  • Retention and deletion – are data retained longer than permitted by the applicable framework?
  • Access controls – who has practical access to the data and how are access events captured in audit logs?
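The retention and deletion check above lends itself to simple automation. The sketch below flags records held past an assumed retention period so they can be routed into a documented deletion procedure; the categories and periods are illustrative placeholders, not legal requirements:

```python
from datetime import date, timedelta

# Hypothetical retention policy, in days, per data category.
# These categories and periods are illustrative only, not legal advice.
RETENTION_DAYS = {
    "case_file": 365 * 5,   # assumed 5-year retention
    "access_log": 365,      # assumed 1-year retention
}

def overdue_records(records, today):
    """Return records held longer than their category's retention period.

    Each record is a dict with 'id', 'category' and 'collected' (a date).
    Categories without a defined period are flagged for manual review.
    """
    flagged = []
    for rec in records:
        limit = RETENTION_DAYS.get(rec["category"])
        if limit is None:
            flagged.append((rec["id"], "no retention period defined"))
        elif today - rec["collected"] > timedelta(days=limit):
            flagged.append((rec["id"], "past retention period"))
    return flagged
```

Flagged records should then pass through the organisation's documented deletion or anonymisation procedure, with the outcome itself logged to support accountability.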

Private actors that disclose data to law enforcement also need to understand the applicable rules and their own responsibilities in such situations, including the controller/processor allocation and the legal basis for processing.

2026 actions – governance, records of processing activities and data protection impact assessments

For organisations directly or indirectly affected by the Data Protection Agency’s 2026 priorities, it is prudent to perform a targeted review of their data protection programme. Practical steps include:

  • Identify whether you use AI solutions, process children's data or interact with law enforcement authorities, and whether high-risk processing is involved.
  • Review records of processing activities, data protection impact assessments (DPIAs) and other core documentation linked to these processing operations.
  • Update processor agreements and other contracts with suppliers, particularly for AI services and digital platforms used by children, and reinforce vendor management of third-party vendors and cloud service providers.
  • Ensure privacy notices and policies are up to date, understandable and reflect actual processing and the legal basis for processing.
  • Deliver targeted data protection training for staff across IT, operations, legal and leadership, and test incident handling procedures.
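Several of the steps above centre on the record of processing activities. As a sketch of what a structured entry might look like, the following Python dataclass is loosely modelled on the fields in GDPR Article 30(1); the field names and gap checks are illustrative assumptions, not a complete register:

```python
from dataclasses import dataclass, field

@dataclass
class ProcessingActivity:
    """One entry in a record of processing activities.

    Fields loosely follow GDPR Article 30(1); names are illustrative
    and should be adapted to the organisation's own register format.
    """
    name: str
    purpose: str
    legal_basis: str                 # e.g. "consent", "legal obligation"
    data_subject_categories: list    # e.g. ["children", "guardians"]
    data_categories: list            # e.g. ["contact details"]
    recipients: list = field(default_factory=list)
    retention: str = ""              # documented retention period

    def gaps(self) -> list:
        """Flag obvious documentation gaps for review."""
        issues = []
        if not self.legal_basis:
            issues.append("missing legal basis")
        if not self.retention:
            issues.append("retention period not documented")
        return issues
```

Running `gaps()` across all entries gives a quick, repeatable way to spot undocumented legal bases or retention periods before a supervisory review.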

The Agency’s supervision and guidance aim to strengthen privacy while improving practical compliance. For organisations that act proactively, 2026 can be the year in which data protection is both assured and better integrated into day-to-day management.

At Morling Consulting, our data protection lawyers help companies and organisations assess how the Data Protection Agency’s priorities affect their processing of personal data and develop practical, business-oriented solutions to meet the requirements of the GDPR and adjacent frameworks.