Analysis of the Proposed Digital Omnibus Regulation

Executive Summary

This briefing document provides a synthesized analysis of the European Commission's proposed "Digital Omnibus" regulation, which seeks to amend the General Data Protection Regulation (GDPR) and ePrivacy rules. The analysis, conducted by the organization noyb, concludes that the proposal is deeply flawed, presenting a significant threat to fundamental data protection rights within the European Union.

The central thesis of the analysis is that the Digital Omnibus, far from achieving its stated goals of simplification and clarification, introduces profound legal uncertainty, reduces the rights of data subjects, and creates systemic conflicts with the EU’s Charter of Fundamental Rights and established case law from the Court of Justice of the European Union (CJEU). The proposal is heavily criticized for being developed without a proper impact assessment, stakeholder consultation, or evidence collection, with many problematic changes appearing at the last minute.

Key findings include:

  • Erosion of Core Definitions: The proposed change to the definition of "personal data" in Article 4(1) introduces a subjective standard that would make the application of the GDPR dependent on the internal capabilities and intentions of individual data controllers. This would cripple enforcement, render data subjects' rights practically meaningless, and create massive legal uncertainty for businesses.
  • Creation of Dangerous Loopholes: The introduction of an overly broad definition for "scientific research" in Article 4(38) and new exceptions for processing sensitive data for Artificial Intelligence (AI) in Article 9 threaten to create massive loopholes. These changes would allow purely commercial activities to bypass core GDPR principles like purpose limitation and data minimization, effectively nullifying fundamental rights under the guise of "innovation."
  • Conflict with Fundamental Rights: Multiple proposals are identified as being in direct conflict with Article 8 of the EU Charter of Fundamental Rights and the "necessary and proportionate" standard for limiting such rights. The analysis argues that several amendments reduce protections below the minimum standard set by previous EU law and would likely be annulled by the CJEU.
  • Contradiction of CJEU Case Law: The Commission's proposals are found to selectively interpret or directly contradict a significant body of CJEU rulings that have consistently favored a broad and protective interpretation of data protection principles.

Ultimately, noyb recommends the outright rejection of the most significant and damaging amendments, including those concerning the definitions of personal data and scientific research, and the new allowances for AI. It is argued that only a few minor changes offer potential benefits, and even these require improvement. The document urges the EU legislator to conduct a profound scrutiny of the proposal and to leverage the ongoing "Digital Fitness Check" as the appropriate venue for a thorough, evidence-based examination of any necessary reforms.

Overview of Recommendations

The analysis provides a clear set of recommendations for the EU legislator on the key articles of the Digital Omnibus proposal. The following table summarizes the suggested approach for each amendment.

| Article Amended | Subject Matter | Recommendation |
| --- | --- | --- |
| Article 4(1) & Article 41a | Definition of “Personal Data” | Reject |
| Article 4(38) | Definition of “Scientific Research” | Reject |
| Article 9(2)(k) & (5) | Processing via AI Systems | Reject |
| Article 12(5) | Limitation of the Right to Access | Reject |
| Article 13(4) | Exception from the Right to Information | Reject |
| Article 13(5) | Limitation of the Right to Information | Reject |
| Article 22(1) & (2) | Automated Decision-Making | Improve |
| Article 33 | Data Breach Notifications | Improve |
| Article 35 | DPIAs | Adopt |
| Article 88a | Access to Terminal Equipment | Improve |
| Article 88b | Automated Signals | Improve & Adopt |
| Article 88c | Processing via AI Systems | Reject |
| Article 5(3) ePrivacy | Access to Terminal Equipment | Improve |

--------------------------------------------------------------------------------

Detailed Analysis of Key Proposals

Article 4(1): Definition of “Personal Data”

Proposal Overview: The Commission proposes to amend the core definition of "personal data" by introducing a subjective standard. The new text suggests that information is not personal data for a given entity if that entity "cannot identify the natural person... taking into account the means reasonably likely to be used by that entity." This would mean data could be "personal" for one controller but not for another, even if it pertains to the same individual.

Core Criticisms & Analysis:

  • Conflict with the Charter of Fundamental Rights: The proposal risks reducing the scope of data protection below the minimum standard established by Directive 95/46, which is referenced in Article 8 of the Charter. The Directive explicitly included "identification numbers" (a common form of pseudonym), which the proposed amendment seems designed to exclude from GDPR protections in certain contexts. This creates a high risk of the amendment being found to violate the Charter.
  • Contradiction of CJEU Case Law: The proposal is based on a selective interpretation of the EDPS v SRB (C-413/23P) ruling, which had a very specific fact pattern. It ignores a wealth of conflicting CJEU case law that supports a broad, objective interpretation of personal data, including:
    • C-582/14 Breyer: Dynamic IP addresses can be personal data if a legal possibility to identify exists, regardless of likelihood.
    • C-604/22 IAB Europe: Data can be personal even if the entity cannot combine it with other identifiers itself.
    • C-479/22 P OC: Information is personal data for the first controller if a subsequent recipient has the means to identify the individual, a direct contradiction to the proposal's wording.
  • Creation of Legal Uncertainty: By making the definition of personal data dependent on the subjective abilities and intentions of a single party, the proposal creates massive legal uncertainty for data subjects, business partners, and supervisory authorities. This "subjective" approach undermines the legal principle of public disclosure ("Publizitätsprinzip") and opens the door for manipulation.
  • Systemic Conflicts within the GDPR: This change to a core definition would have cascading, negative effects throughout the GDPR, creating logical inconsistencies regarding:
    • Security (Article 32): The rules governing the proper separation of data and identifying keys would not apply to an entity that claims the data is not "personal."
    • Controller/Processor Roles: It would become unclear how joint controllerships or processor relationships would function if one entity falls under the GDPR and another does not.
    • International Data Transfers: The system under Chapter V could be bypassed by routing data through an exporter that claims it cannot identify the individuals.
  • Impact on Stakeholders:
    • Data Subjects: Would be at the mercy of controllers' claims and face a "chicken and egg" problem where the right to access (Article 15) is denied because the data is allegedly not "personal." This would make the enforcement of rights practically impossible and is described as the "final nail in the coffin of GDPR rights."
    • Controllers: Most SMEs would face increased complexity and risk, while only a small number of sophisticated or aggressive controllers (e.g., big tech, data brokers) would benefit by finding ways to escape the GDPR entirely.
    • Supervisory Authorities (SAs): Would be forced into lengthy and complex technical investigations just to establish their jurisdiction, effectively derailing enforcement actions.

Recommendation: Reject. This amendment is fundamentally flawed. It is unsuitable for an Omnibus law, creates more legal uncertainty than it resolves, conflicts with the EU Charter and CJEU case law, and primarily benefits entities engaged in large-scale data processing. The EDPB is already developing non-legislative guidelines, which is a more appropriate path.

Article 4(38): Definition of “Scientific Research”

Proposal Overview: The Commission proposes to introduce a new, legally binding, and extremely broad definition of "scientific research" into Article 4. The proposed text defines it as "any research which can also support innovation," including "technological development and demonstration" and "apply[ing] existing knowledge in novel ways." Activities falling under this definition would benefit from significant exemptions from core GDPR obligations, such as purpose limitation and data subject rights.

Core Criticisms & Analysis:

  • Creation of a Massive Loophole: The definition is so vague that it could be abused to shield purely commercial processing activities from GDPR scrutiny. The connection to "innovation" is tenuous—a possibility ("can") of a byproduct ("also") that "supports" innovation is sufficient. This could allow big tech companies to claim that their product development or marketing analysis constitutes "scientific research."
  • Conflict with the Charter of Fundamental Rights: The broad exemptions granted for "research" would effectively waive fundamental rights guaranteed under Article 8 of the Charter, including purpose limitation and the right to erasure. Such a blanket allowance for a poorly defined purpose is unlikely to meet the "necessary and proportionate" test required by Article 52(1) of the Charter for limiting fundamental rights. Furthermore, outsourcing the limitation of rights to private "ethical standards" violates the principle that such limitations must be provided for by law.
  • Poor Legal Quality: The definition contains over 20 vague and partly contradictory criteria. It conflates research with commercial application and tilts the definition heavily toward "technical development," potentially excluding legitimate academic research in humanities or natural sciences that is not aimed at "innovation." This devalues the work of the scientific community.
  • Systemic Conflicts: The new definition would likely conflict with various national laws that Member States have already passed to implement the research provisions in Article 89 of the GDPR.
  • Impact on Stakeholders:
    • Data Subjects: Would see their rights under numerous GDPR articles (e.g., Articles 15, 16, 17, 18, 21) massively limited or abolished whenever a controller claims a "research" purpose.
    • Controllers: While some may abuse the loophole, others face increased legal uncertainty. Actual academic researchers may find their work excluded from the privileges if it doesn't align with the "innovation" focus.
    • Supervisory Authorities (SAs): Would be tasked with evaluating matters far outside their competence, such as the methodological quality of research or adherence to industry-specific ethical codes, adding to their already strained resources.

Recommendation: Reject. This "last minute" addition to the Omnibus creates a foreseeable and massive loophole for big tech and other actors. It has a direct and severe impact on Charter rights, is poorly drafted, and may even harm legitimate scientific research. Any need for a harmonized definition should be examined thoroughly within the "Digital Fitness Check" with proper expert input.

Article 9(2)(k) & (5): AI and Sensitive Data

Proposal Overview: The proposal introduces a new legal basis in Article 9(2)(k) to permit the processing of special categories of personal data (sensitive data) for the "development and operation" of AI systems. A new Article 9(5) sets conditions, requiring controllers to implement measures to avoid collecting sensitive data. If such data is identified, it must be removed, unless doing so requires "disproportionate effort." In that case, the controller must simply protect the data from being used in outputs or disclosed.

Core Criticisms & Analysis:

  • Conflict with the Charter of Fundamental Rights: This creates a significant limitation on the protections for sensitive data under Article 8 of the Charter without the required proportionality assessment. The proposal's recital text appears to conduct a "reverse proportionality test," focusing only on not disproportionately hindering AI developers, which is described as "unheard of." It also abandons the GDPR's technology-neutral approach by creating a special privilege for AI.
  • Weak and Unenforceable Protections: The proposed safeguards are vague and riddled with loopholes.
    • The obligation to remove sensitive data is nullified by the "disproportionate effort" exception, a term that controllers have historically used to render similar obligations meaningless.
    • The terms "avoid," "appropriate measures," and "effectively protect" lack clear, objective standards, making them practically unenforceable for data subjects and SAs.
  • Poor Legal Quality & Systemic Conflicts: The proposal uses the extremely broad definition of an "AI system" from the AI Act. While intended to be protective in the AI Act, using this definition for a legal exemption in the GDPR creates an exceptionally broad privilege. The protection offered under Article 9(5) appears even weaker than the general data minimization principle in Article 5(1)(c), creating internal inconsistencies within the GDPR. Recitals justifying the processing of data that is "not necessary" show "structural intellectual and analytical errors" that conflict with core GDPR and Charter principles.
  • Impact on Stakeholders:
    • Data Subjects: Would have no realistic way to enforce these weak protections, given the technical complexity of AI, confidentiality claims by developers, and the vague legal language.
    • Controllers: While large or aggressive players may exploit the legal uncertainty, SMEs could face a more complex and risky legal situation.
    • Supervisory Authorities (SAs): Would face deep, resource-intensive technical investigations into AI data pipelines to assess vague standards like "disproportionate effort," making effective oversight highly unlikely.

Recommendation: Reject. The proposed changes fail to solve existing problems for controllers while creating massive legal uncertainty and undermining the high level of protection for sensitive data. The provisions are inconsistent with the GDPR's structure and the fundamental rights protected by the EU Charter.
