
The European Union’s Digital Services Act: An Act with the Potential to Reshape Online Privacy and Online Safety




Introduction: The digital landscape has transformed how we communicate, conduct business, and access information. However, this rapid evolution has brought growing concerns over data privacy, data protection, data security, and the unchecked power of online platforms. The European Union's Digital Services Act (EU DSA) is a landmark regulation that attempts to reshape online privacy and online safety by imposing stricter transparency, user control, and content moderation requirements on digital platforms. Enacted to create a safer and more transparent digital environment, the EU DSA places stringent obligations on online intermediaries and platforms while aiming to protect users' rights and foster fair competition. It also underscores the EU's commitment to upholding fundamental rights in the digital space and sets a precedent for global online governance.


Understanding the EU DSA:

Adopted in October 2022 and fully effective as of February 2024, the EU DSA establishes a comprehensive framework for regulating digital services within the EU. The Act is designed to modernize existing digital regulations by introducing clearer obligations for online platforms, ensuring stronger consumer protection, and improving the accountability of digital service providers. It provides a harmonized legal framework applicable across all EU member states, reducing regulatory fragmentation and ensuring consistent enforcement.


One of the primary goals of the EU DSA is to tackle the spread of illegal and harmful content online while balancing the protection of fundamental rights, such as freedom of expression. By imposing clear responsibilities on digital service providers, the Act seeks to prevent the dissemination of disinformation, online hate speech, and counterfeit goods while safeguarding user privacy and data security.


The EU DSA applies to a broad range of digital services, from small content-sharing websites to large multinational technology firms. Depending on their function and scale, online services are sorted into categories, each with its own set of obligations. For instance, intermediary services, hosting services, and online platforms are subject to progressively stricter requirements based on their impact on users and society. Additionally, the Act introduces a specific classification for Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs), which face the most stringent compliance requirements due to their extensive reach and influence.


The EU DSA as a Global Regulatory Model: Looking ahead, the EU DSA could serve as a benchmark for digital regulation worldwide. Policymakers in jurisdictions such as the United States, Canada, Australia, and the United Kingdom are closely monitoring its implementation and considering similar regulatory frameworks to enhance platform accountability, user protection, and online governance. However, the EU DSA’s far-reaching obligations have also sparked concerns among global corporations and non-EU governments. Key issues include:

  • Compliance Burdens: Companies fear rising operational costs due to mandatory audits, transparency reporting, and risk assessment requirements.

  • Regulatory Overreach: Some critics argue that the EU’s strict content moderation rules may conflict with free speech protections in other regions.

  • Potential for Fragmented Digital Markets: As countries adopt their own versions of platform regulation, businesses may face inconsistent legal obligations across jurisdictions, increasing compliance complexity.


Applicability: The EU DSA applies to the following categories of services (a simplified classification sketch follows this list):

  • Intermediary Services: Providers offering network infrastructure, such as Internet access and domain name services, must:

o    Ensure transparency in their terms of service, including content moderation policies.

o    Cooperate with national authorities to address illegal content promptly.

  • Hosting Services: Services that store information on behalf of users must:

o    Implement systems for users to report illegal content.

o    Act expeditiously to remove or disable access upon receiving valid notice.

  • Online Platforms: Platforms facilitating interactions between users, like social media and marketplaces, have additional duties, including:

o    Establishing internal complaint and redress mechanisms for users to challenge content removal or account suspension decisions.

o    Clearly informing users about sponsored content and the entities behind advertisements.

o    Designing interfaces that do not manipulate users into making unintended choices, thereby safeguarding informed consent.

  • VLOPs: Platforms with over forty-five million monthly active users in the EU bear the most extensive obligations, such as:

o    Conducting regular risk assessments to identify and address systemic risks, including the spread of illegal content and adverse effects on fundamental rights.

o    Undergoing annual independent audits to ensure compliance with the EU DSA’s mandates.

o    Providing vetted researchers with access to platform data to facilitate studies on online risks.

  • VLOSEs: Search engines with over forty-five million monthly active users in the EU bear the most extensive obligations, such as:

o    Conducting regular risk assessments to identify and mitigate systemic risks, including the spread of illegal content, disinformation, and adverse effects on fundamental rights.

o    Undergoing annual independent audits to assess compliance with the EU DSA’s mandates.

o    Providing vetted researchers with access to search engine data to facilitate studies on online risks, ranking systems, and information integrity.
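To illustrate how these tiers fit together, below is a minimal sketch in Python of the categorization logic under simplifying assumptions. The forty-five-million EU monthly-active-user threshold comes from the Act itself, but the function name, service-type labels, and one-line obligation summaries are hypothetical, and actual VLOP/VLOSE designation is made formally by the European Commission.

```python
# Illustrative sketch of the EU DSA's tiered service categories.
# The 45-million EU monthly-active-user threshold is from the Act;
# the names and the simplified decision logic are hypothetical.

VLOP_VLOSE_THRESHOLD = 45_000_000  # monthly active users in the EU

def classify_service(service_type: str, eu_monthly_active_users: int) -> str:
    """Return a (simplified) EU DSA obligation tier for a service.

    service_type is one of: 'intermediary', 'hosting', 'platform',
    'search_engine'. Obligations are cumulative: each tier also
    carries the duties of the tiers below it.
    """
    if service_type == "search_engine":
        if eu_monthly_active_users >= VLOP_VLOSE_THRESHOLD:
            return "VLOSE: risk assessments, audits, researcher data access"
        return "search engine: baseline intermediary obligations"
    if service_type == "platform":
        if eu_monthly_active_users >= VLOP_VLOSE_THRESHOLD:
            return "VLOP: risk assessments, audits, researcher data access"
        return "online platform: complaint handling, ad transparency"
    if service_type == "hosting":
        return "hosting service: notice-and-action for illegal content"
    return "intermediary service: transparency and cooperation duties"

# Example: a platform reporting 46.8 million EU monthly active users
# (the figure later cited for WhatsApp) crosses the threshold.
print(classify_service("platform", 46_800_000))  # -> VLOP tier
```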


VLOPs and VLOSEs – Key Similarities and Key Differences: The EU DSA imposes similar, but not identical, obligations on VLOPs and VLOSEs. While both face similarly strict transparency, accountability, and risk management rules, there are key differences in their obligations due to the nature of their services.

  • Key Similarities Between VLOPs and VLOSEs: The key similarities include:

o    Risk Management: Both must conduct annual risk assessments and take steps to mitigate systemic risks (e.g., disinformation, electoral manipulation, child safety).

o    Transparency Reports: Both are required to publish detailed transparency reports at least once a year.

o    Data Access for Researchers: Both must allow vetted researchers to access platform data for studying systemic risks.

o    Independent Audits: Both undergo annual independent audits to assess compliance.

o    Advertising & Algorithmic Transparency: Both ensure ads are clearly labeled and provide transparency on recommendation systems.

o    Crisis Response Mechanism: Both must comply with emergency obligations imposed by the European Commission in public crises.

o    Enforcement & Penalties: Both face fines of up to six percent of global annual turnover for violations.


  • Key Differences Between VLOPs and VLOSEs: The key differences include:

o    Content Moderation: VLOPs must actively moderate and remove illegal content, whereas VLOSEs carry less direct content moderation responsibility because search engines do not host content.

o    Appeals & Redress: VLOP users can appeal content moderation decisions, while VLOSE users can request corrections of search results but have more limited redress.

o    Recommendation Systems: VLOPs must allow users to opt out of personalized recommendations; VLOSEs must provide transparency on search ranking algorithms.

o    Targeted Advertising: VLOPs must prohibit ad targeting based on sensitive data; this is less directly relevant to VLOSEs, which do not display ads in the same way.

EU DSA Enforcement Authorities: Powers, Roles, and Responsibilities: National authorities and the European Commission oversee the EU DSA’s enforcement, paying particular attention to VLOPs and VLOSEs. This enforcement framework ensures compliance, accountability, and regulatory oversight across digital services operating within the EU. The EU DSA enforcement authorities include:

  • National Digital Services Coordinators (DSCs): Each EU member state designates a DSC responsible for monitoring compliance and investigating potential violations.

o    DSCs have the authority to impose fines, order content removals, and request transparency reports from digital platforms.

o    They coordinate with other EU regulators and the European Commission to ensure cohesive enforcement.

  • The European Commission:

o    Directly oversees the enforcement of the EU DSA’s provisions for VLOPs and VLOSEs (platforms and search engines with over forty-five million monthly active users in the EU).

o    Conducts risk assessments and ensures platforms adhere to strict content moderation and advertising transparency requirements.

o    Has the authority to impose fines of up to six percent of an organization’s global annual revenue and order service suspensions for repeat offenders.


Note: By establishing a multi-tiered enforcement framework, the EU DSA ensures a robust mechanism for accountability and compliance, reinforcing trust in the digital ecosystem.


EU DSA Enforcement and Penalties: The EU DSA establishes a rigorous enforcement framework that grants both national authorities and the European Commission the power to ensure compliance, investigate violations, and impose penalties on digital service providers. Organizations that fail to meet the EU DSA’s regulatory obligations may face significant financial penalties and legal consequences. Additionally, users who suffer harm due to non-compliance, such as exposure to illegal content, privacy violations, or deceptive platform practices, may have the right to seek compensation under EU law.


Fines and Penalties for Non-Compliance: The EU DSA outlines a tiered penalty system based on the severity and nature of the violation (a worked example follows this list):

  • Serious Non-Compliance Fines: Organizations found in breach of key obligations, such as failing to remove illegal content or not implementing required transparency measures, may be fined up to six percent of their global annual turnover from the preceding fiscal year.

  • Periodic Penalty Payments: If an entity fails to comply with regulatory orders or corrective measures, it may be subject to daily fines of up to five percent of its average daily worldwide turnover until compliance is achieved.

  • Fines for Providing Incorrect or Misleading Information: If an organization supplies false, incomplete, or misleading information to authorities, or fails to rectify such errors, it may face additional fines of up to one percent of its global annual turnover.
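To make the tiers concrete, below is a minimal sketch in Python of the maximum penalty caps described above, assuming a hypothetical platform and turnover figure. The six, five, and one percent caps come from the tiers listed; the function names and numbers are illustrative only, since actual fines are set by regulators case by case.

```python
# Illustrative sketch of the EU DSA's tiered penalty caps.
# Percentages reflect the tiers described above; the company
# turnover and function names are hypothetical, not real data.

def max_serious_noncompliance_fine(annual_turnover_eur: float) -> float:
    """Cap for serious breaches: up to 6% of global annual turnover."""
    return 0.06 * annual_turnover_eur

def max_periodic_penalty_per_day(annual_turnover_eur: float) -> float:
    """Daily cap for ignoring corrective orders: up to 5% of the
    average daily worldwide turnover."""
    return 0.05 * (annual_turnover_eur / 365)

def max_misleading_information_fine(annual_turnover_eur: float) -> float:
    """Cap for false, incomplete, or misleading information:
    up to 1% of global annual turnover."""
    return 0.01 * annual_turnover_eur

# Hypothetical platform with EUR 10 billion in global annual turnover.
turnover = 10_000_000_000.0
print(f"Serious non-compliance cap: EUR {max_serious_noncompliance_fine(turnover):,.0f}")
print(f"Periodic penalty cap/day:   EUR {max_periodic_penalty_per_day(turnover):,.0f}")
print(f"Misleading information cap: EUR {max_misleading_information_fine(turnover):,.0f}")
```

On these assumptions, the caps work out to EUR 600 million for serious non-compliance, roughly EUR 1.37 million per day in periodic penalties, and EUR 100 million for misleading information.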


Examples of Investigations and Fines for Non-Compliance: The European Commission and national regulators have actively enforced the EU DSA by monitoring platform compliance, investigating violations, and imposing regulatory measures. As of February 2025, enforcement efforts have led to several high-profile investigations and regulatory actions against major digital platforms. These cases highlight key areas of concern, including algorithmic transparency, electoral integrity, product compliance, and content moderation. Notable cases include:

  • X (formerly Twitter) – Regulatory Scrutiny and Algorithmic Manipulation:

o    Preliminary Findings: The European Commission notified X of potential EU DSA violations, citing concerns over dark patterns, lack of advertising transparency, and failure to provide researchers with adequate data access.

o    Algorithm Investigation: The Commission escalated its probe, formally requesting internal documentation on X’s algorithms over suspicions that they were manipulating content visibility to favor far-right narratives.

o    Political Content Scrutiny: Increased regulatory attention followed Elon Musk’s hosting of Alice Weidel, leader of Germany’s far-right AfD party, on X. Regulators began monitoring the platform for potential misinformation risks ahead of German elections.

  • TikTok – Electoral Interference in Romania: EU regulators launched an investigation into TikTok amid allegations of political interference in Romania’s presidential election. Concerns focused on algorithm-driven content promotion, political disinformation, and inadequate safeguards against manipulation.

  • Temu – Product Safety and Consumer Protection Violations: The European Commission opened a compliance inquiry into Chinese e-commerce platform Temu, citing concerns over its failure to prevent the sale of illegal and counterfeit products in the EU. Regulators examined whether Temu’s content moderation and seller verification systems met EU DSA standards.

  • WhatsApp – Designation as a VLOP: Following WhatsApp’s report of 46.8 million average monthly active users in the EU, the platform crossed the EU DSA’s 45-million-user threshold, triggering VLOP obligations. This classification subjects WhatsApp to stricter content moderation, transparency, and compliance measures under the EU DSA.


    Note: These cases highlight the EU’s active enforcement efforts to regulate digital platforms, uphold content moderation standards, enforce advertising transparency, and protect user rights under the EU DSA.


EU DSA Enforcement Challenges: While the EU DSA establishes a robust regulatory framework, its effective enforcement presents several key challenges that impact both regulators and digital service providers. Ensuring consistent compliance across a vast and evolving digital ecosystem remains a complex undertaking, particularly when addressing jurisdictional, operational, and legal barriers. The enforcement challenges include:

  • Ensuring Compliance by Non-EU Platforms:

o  One of the most pressing enforcement challenges is ensuring compliance by platforms operating outside the EU but offering services to EU-based users. Many large multinational technology firms may seek to circumvent the EU DSA’s jurisdiction by modifying their corporate structures, legal entities, or operational models to avoid direct applicability. This raises concerns about regulatory loopholes and the effectiveness of cross-border enforcement mechanisms.

o  To address this, the EU DSA applies extraterritorially, meaning any digital platform or intermediary that targets EU users must comply regardless of where it is headquartered. However, enforcing these obligations across international borders requires cooperation with foreign regulators and the development of new cross-jurisdictional enforcement strategies.

  • Resource Constraints and Regulatory Capacity: The enforcement process heavily depends on national DSCs within each EU member state. However, many national regulators face resource constraints, including:

o   Limited staffing and expertise to oversee compliance across a rapidly evolving digital landscape.

o   Technical challenges in effectively monitoring, detecting, and investigating violations at scale.

o   Coordination difficulties in aligning enforcement strategies across 27 EU member states, leading to potential inconsistencies in interpretation and application.

o   Without adequate funding, staffing, and cross-border cooperation, regulatory authorities may struggle to enforce the EU DSA uniformly, potentially allowing violations to persist due to delayed or inconsistent enforcement actions.

  • Balancing Enforcement with Free Speech Protections: The EU DSA aims to combat illegal content, including hate speech, misinformation, and counterfeit goods, but enforcement must be carefully balanced to protect freedom of expression and prevent unjustified censorship. Key concerns include:

o  Defining Illegal Content: Identifying what constitutes illegal content can be highly subjective and context-dependent, particularly across diverse cultural and legal traditions within the EU.

o  Overreach & Unintended Censorship: There is a risk that platforms, in an effort to comply with the EU DSA, may over-moderate content, leading to the removal of legitimate speech, news reporting, or political discourse.

o  Transparency & Accountability in Moderation: Ensuring fair, unbiased, and proportionate content moderation is crucial to maintaining public trust and safeguarding democratic values.


EU DSA Regulatory Implications: These enforcement actions reflect the EU’s commitment to ensuring compliance with the EU DSA, particularly in areas such as:

  • Algorithmic accountability: Ensuring transparency in content ranking, amplification, and targeting.

  • Election integrity: Preventing disinformation and foreign interference in EU democratic processes.

  • Consumer protection: Regulating e-commerce platforms to prevent the sale of illegal or unsafe products.

  • Platform accountability: Holding large platforms to higher content moderation and transparency standards.




A Catalyst for Future Digital Governance:

Despite these challenges, the EU DSA’s success in tackling systemic online risks and strengthening digital accountability could pave the way for global regulatory harmonization. Governments worldwide may seek to align their digital governance frameworks with the EU DSA’s standards on content moderation, data protection, and algorithmic transparency.


As the digital landscape continues to evolve, the EU DSA remains a pivotal experiment in regulating online platforms at scale. Its long-term impact will depend on consistent enforcement, ongoing regulatory refinement, and the ability to balance innovation with consumer protection. If successfully implemented and adapted, the EU DSA could inspire a new era of global digital governance, one that prioritizes user rights, platform accountability, and a safer online ecosystem.


Key Questions for Organizations Preparing for EU DSA Compliance: As the EU DSA reshapes digital governance, organizations must proactively evaluate their policies, technologies, and operational frameworks to ensure compliance. Beyond fulfilling legal requirements, organizations should explore how EU DSA-aligned practices can strengthen online privacy, enhance user safety, and promote responsible digital governance. To navigate this evolving regulatory landscape, organizations should reflect on the following critical questions:

  • Online Privacy & Data Protection:

o Do our data collection, storage, and processing practices provide users with full transparency and control over their personal information, as required by the EU DSA?

o Are we adequately safeguarding user data against breaches, unauthorized access, and unethical use, while meeting the compliance requirements of both the EU DSA and the EU General Data Protection Regulation (GDPR)?

o How can we build greater trust with users by incorporating data protection-enhancing technologies, stronger encryption practices, and robust consent mechanisms?

  • Online Safety & Content Moderation:

o  How effective are our content moderation policies in identifying and removing illegal or harmful content, while also safeguarding legitimate speech and avoiding unjustified censorship?

o What proactive measures are in place to prevent the spread of disinformation, hate speech, and harmful digital behaviors that could undermine user safety?

o  Are our automated moderation tools and human oversight mechanisms aligned with best practices to prevent both over-removal and under-enforcement of harmful content?

  • Algorithmic Transparency & Ethical AI Use:

o What steps are we taking to ensure that our algorithmic decision-making, particularly in content ranking, recommendation systems, and ad targeting, is transparent, explainable, and free from bias?

o How can we empower users with greater control over personalized content curation while maintaining compliance with the EU DSA’s algorithmic transparency mandates?

o Are we conducting regular audits of our AI-driven systems to assess their impact on user privacy, safety, and digital well-being?

  • Regulatory Compliance & Risk Mitigation:

o How can we strengthen our compliance mechanisms to seamlessly integrate EU DSA-mandated risk assessments, independent audits, and transparency reporting into our broader digital governance strategy?

o What safeguards do we have in place to prevent regulatory non-compliance, and how prepared are we to respond to enforcement actions, financial penalties, or operational restrictions under the EU DSA?

o Given the extraterritorial reach of the EU DSA, how should we adapt our global privacy and platform accountability strategies to align with evolving regulatory frameworks beyond the EU?


By addressing these strategic questions, organizations can move beyond mere regulatory compliance and position themselves as leaders in online privacy and safety, fostering greater user trust, accountability, and long-term resilience in the evolving digital landscape.

 
