DOMAIN:EU_REGULATION:DIGITAL_SERVICES_ACT

OWNER: eric
ALSO_USED_BY: julian, aimee, margot
UPDATED: 2026-03-26
SCOPE: DSA obligations for client platforms and GE-built services


OVERVIEW

The Digital Services Act (Regulation (EU) 2022/2065) establishes a comprehensive accountability
framework for digital intermediary services, covering content moderation and platform transparency.
It replaced the intermediary liability provisions (Arts. 12-15) of the E-Commerce Directive (2000/31/EC).

STATUS: fully in force since Feb 17, 2024 for all digital services
VLOP/VLOSE: obligations applied from Aug 25, 2023

WHY_GE_CARES: any client platform that hosts user content, operates a marketplace, or
provides search functionality has DSA obligations. Eric must assess DSA applicability at onboarding.


SERVICE CATEGORIES — TIERED OBLIGATIONS

The DSA uses a layered approach: each tier inherits all obligations of the tiers below it.

TIER 1 — INTERMEDIARY SERVICES (all digital services)

DEFINITION: services that transmit, cache, or host information provided by recipients
EXAMPLES: ISPs, CDNs, DNS providers, VPN services

OBLIGATIONS:
- designate single point of contact for authorities and users (Art. 11-12)
- include DSA compliance information in terms of service (Art. 14)
- publish annual transparency reports on content moderation (Art. 15)
- comply with orders from judicial/administrative authorities (Art. 9-10)
- designate a legal representative in the EU if established outside it (Art. 13)

TIER 2 — HOSTING SERVICES (including cloud, web hosting)

DEFINITION: services that store information at the request of a recipient
EXAMPLES: cloud hosting, web hosting, file sharing, social media (as hosts)

ADDITIONAL_OBLIGATIONS:
- notice-and-action mechanism for illegal content (Art. 16)
- statement of reasons when restricting content (Art. 17)
- notify law enforcement of suspected criminal offences involving a threat to life or safety (Art. 18)

TIER 3 — ONLINE PLATFORMS

DEFINITION: hosting services that disseminate information to the public at the request of recipients
EXAMPLES: social media, app stores, review platforms, content-sharing platforms
EXCLUDES: services where public dissemination is a minor, purely ancillary feature (e.g., comment sections); cloud/web hosting with no public dissemination

ADDITIONAL_OBLIGATIONS:
- internal complaint-handling system (Art. 20)
- out-of-court dispute settlement (Art. 21)
- trusted flaggers programme (Art. 22)
- measures against misuse (repeated infringers) (Art. 23)
- enhanced transparency reporting including automated tools detail (Art. 24)
- online advertising transparency — must label ads and disclose main targeting parameters (Art. 26)
- recommender system transparency (Art. 27)
- protection of minors — no profiling-based advertising targeting minors (Art. 28)
- dark pattern prohibition — no deceptive design in user interfaces (Art. 25)

MICRO/SMALL_ENTERPRISE_EXEMPTION:
Online platforms that qualify as micro or small enterprises are exempt from the additional
online-platform obligations (Art. 19), unless designated as VLOPs; the duty to publish average
monthly active user numbers (Art. 24(2)) still applies.

TIER 4 — ONLINE MARKETPLACES

DEFINITION: online platforms allowing consumers to conclude distance contracts with traders
EXAMPLES: Amazon Marketplace, Etsy, Bol.com, eBay

ADDITIONAL_OBLIGATIONS:
- traceability of traders: identity collection and verification before marketplace use (KYCT) (Art. 30)
- compliance-by-design of online interface (Art. 31)
- right to information for consumers who purchased illegal products (Art. 32)

TIER 5 — VLOP / VLOSE (Very Large Online Platforms / Search Engines)

THRESHOLD: 45+ million monthly active EU users
DESIGNATED_VLOPS (as of 2026): X, Facebook, Instagram, TikTok, YouTube, Amazon, AliExpress,
Temu, Booking.com, LinkedIn, Pinterest, Snapchat, Wikipedia, Google Search, Bing, others
NOTE: Commission ended Stripchat designation after user count dropped below threshold

ADDITIONAL_OBLIGATIONS:
- systemic risk assessment (Art. 34) — at least annually
- risk mitigation measures (Art. 35)
- independent audit (Art. 37) — at least annually
- data access for researchers (Art. 40) — delegated act in force Oct 29, 2025
- crisis response mechanism (Art. 36)
- additional advertising transparency (Art. 39) — searchable repository
- compliance officer (Art. 41)
- annual supervisory fee to Commission (Art. 43)


NOTICE-AND-ACTION (Art. 16)

APPLIES_TO: all hosting services (Tier 2+)

REQUIREMENTS

MECHANISM: electronic system for anyone to notify illegal content
NOTICE_MUST_CONTAIN:
- explanation why content is illegal
- URL or other identifier of specific content
- name and email of submitter (not required for notices concerning CSAM offences)
- statement of good faith belief

RESPONSE_OBLIGATIONS:
- process notices in timely, diligent, non-arbitrary manner
- use automated means where appropriate for high-volume
- give priority to notices from trusted flaggers
- inform notifier of decision and available remedies
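
For GE-built intake forms, the Art. 16 elements above can be modelled as a validation step plus
a priority flag. A minimal Python sketch; the type, field, and function names are our own, not
from the Regulation:

```python
from dataclasses import dataclass

@dataclass
class IllegalContentNotice:
    """Elements Art. 16(2) expects a valid notice to contain (field names are ours)."""
    explanation: str          # why the content is allegedly illegal
    content_url: str          # URL or other identifier of the specific content
    submitter_name: str       # may be left empty for CSAM-related notices
    submitter_email: str
    good_faith_statement: bool
    from_trusted_flagger: bool = False

def validate_notice(n: IllegalContentNotice) -> list[str]:
    """Return the missing Art. 16 elements; an empty list means the notice is actionable."""
    problems = []
    if not n.explanation.strip():
        problems.append("missing explanation of alleged illegality")
    if not n.content_url.strip():
        problems.append("missing URL/identifier of the content")
    if not n.good_faith_statement:
        problems.append("missing good-faith statement")
    return problems

def queue_priority(n: IllegalContentNotice) -> int:
    """Art. 22: trusted-flagger notices must be processed with priority (0 = first)."""
    return 0 if n.from_trusted_flagger else 1
```

The submitter name/email are deliberately not validated as mandatory, reflecting the Art. 16
carve-out noted above.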

STATEMENT OF REASONS (Art. 17)

WHEN content is removed or otherwise restricted, the provider MUST give the affected recipient:
- facts and circumstances relied upon
- reference to legal/contractual ground
- applicable territorial scope
- information on redress options
IF_AUTOMATED_DECISION: state that fact explicitly
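
The Art. 17 elements lend themselves to a structured record rendered into the user-facing notice,
which also keeps the automated-decision disclosure from being forgotten. A sketch under the same
caveat (field and function names are ours):

```python
from dataclasses import dataclass

@dataclass
class StatementOfReasons:
    """Elements Art. 17 requires in a restriction notice (field names are ours)."""
    facts_and_circumstances: str
    ground: str                 # legal provision or ToS clause relied upon
    territorial_scope: str      # e.g. "EU-wide" or "NL only"
    redress_options: list       # internal complaint, out-of-court body, courts
    automated_decision: bool    # must be stated explicitly when True

def render_statement(s: StatementOfReasons) -> str:
    """Produce the text sent to the affected recipient."""
    lines = [
        f"Facts and circumstances: {s.facts_and_circumstances}",
        f"Ground: {s.ground}",
        f"Territorial scope: {s.territorial_scope}",
        "Redress options: " + "; ".join(s.redress_options),
    ]
    if s.automated_decision:
        lines.append("This decision was taken by automated means.")
    return "\n".join(lines)
```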


ILLEGAL CONTENT HANDLING

DEFINITION

ILLEGAL_CONTENT (Art. 3): any information that is itself illegal or relates to illegal activities
INCLUDES: terrorist content, CSAM, hate speech, counterfeit goods, copyright infringement,
illegal product listings, consumer fraud

LIABILITY EXEMPTIONS (Art. 4-6)

MERE_CONDUIT: no liability if provider does not initiate, select receiver, or select/modify content
CACHING: no liability if not modified and removal upon actual knowledge
HOSTING: no liability if no actual knowledge + acts expeditiously upon obtaining knowledge

CRITICAL: no general monitoring obligation (Art. 8)
HOWEVER: specific monitoring orders from judicial/admin authorities ARE permitted

TRUSTED FLAGGERS (Art. 22)

DESIGNATED_BY: Digital Services Coordinator in their Member State
CRITERIA: particular expertise and competence, independence, diligence
EFFECT: notices from trusted flaggers get priority processing
ABUSE: designation can be suspended or revoked if a flagger submits a significant number of insufficiently precise or unsubstantiated notices


TRANSPARENCY REPORTING

ALL INTERMEDIARY SERVICES (Art. 15)

ANNUAL report including:
- number of content moderation orders from authorities
- number of notices received and action taken
- content moderation at own initiative
- number of complaints received via internal system
- use of automated means
- human resources devoted to content moderation

VLOP/VLOSE ADDITIONAL (Art. 42)

SIX-MONTHLY reports including everything above plus:
- average monthly active users (EU)
- systemic risk assessment outcomes
- risk mitigation measures taken
- audit results
- dedicated advertising repository data

FIRST HARMONISED REPORTS

Data collection per Implementing Regulation: from Jul 1, 2025
First harmonised reports: due beginning of 2026


ADVERTISING TRANSPARENCY

ALL ONLINE PLATFORMS (Art. 26)

REQUIREMENTS:
CHECK: clearly mark content as advertisement (real-time, per-ad labelling)
CHECK: identify natural/legal person on whose behalf ad is displayed
CHECK: identify natural/legal person paying for ad
CHECK: disclose main parameters used for targeting the recipient
CHECK: provide mechanism for recipients to report advertising content

VLOP/VLOSE ADDITIONAL (Art. 39)

MUST maintain publicly accessible, searchable advertisement repository including:
- content of the advertisement
- identity of advertiser and payer
- period during which displayed
- number of recipients reached (per Member State)
- targeting parameters used
- whether targeted at specific vulnerable groups

PROHIBITION: profiling-based advertising targeting minors (Art. 28)


DARK PATTERNS PROHIBITION (Art. 25)

SCOPE: all online platforms

PROHIBITED_PRACTICES:
- giving more visual prominence to one option to influence decisions
- requesting confirmation to end account but not for creating one
- making cancellation harder than subscription
- preselecting consent or agreement options
- making opt-out paths longer or more complex than opt-in
- using deceptive visual design (e.g., hidden "decline" buttons)

ENFORCEMENT_LINK: also prohibited under Unfair Commercial Practices Directive (2005/29/EC)


ENFORCEMENT

STRUCTURE

NATIONAL: Digital Services Coordinator (DSC) in each Member State — primary enforcer
COMMISSION: direct enforcement for VLOPs/VLOSEs
COOPERATION: Board for Digital Services coordinates national DSCs
DUTCH_DSC: ACM (Autoriteit Consument & Markt) designated as DSC for Netherlands

PENALTIES

INFRINGEMENT: up to 6% of global annual turnover
INCORRECT_INFORMATION: up to 1% of global annual turnover
PERIODIC_PENALTIES: up to 5% of average daily worldwide turnover per day
LAST_RESORT: temporary suspension of service in EU (Art. 51)
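
For client risk memos, the penalty caps above translate into rough exposure numbers. A sketch
(the helper name is ours; this is a planning aid under the stated caps, not legal advice):

```python
def dsa_penalty_exposure(annual_turnover_eur: float) -> dict:
    """Upper bounds implied by the DSA penalty caps for a given global annual turnover."""
    return {
        "infringement_max": 0.06 * annual_turnover_eur,          # up to 6% of global turnover
        "incorrect_info_max": 0.01 * annual_turnover_eur,        # up to 1% of global turnover
        "periodic_daily_max": 0.05 * annual_turnover_eur / 365,  # up to 5% of avg daily turnover, per day
    }
```

Example: a platform with EUR 1bn turnover faces a theoretical maximum of EUR 60m per infringement.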

ENFORCEMENT ACTIONS (2025-2026)

  • X (Twitter): EUR 120M fine — first major DSA enforcement decision (Dec 2025)
  • 14 investigations opened into VLOPs/VLOSEs by Nov 2025
  • Targets: AliExpress, Facebook, Instagram, Temu, TikTok, X, pornographic platforms
  • Focus areas: minor protection, illegal content, algorithmic transparency, advertising

GEOPOLITICAL CONTEXT

The US administration has accused the EU of targeting American companies, and visa bans have been
imposed on EU figures involved in DSA enforcement. This is the first time digital regulation has
triggered diplomatic retaliation.
IMPACT_ON_GE: enforcement resolve appears to be strengthening, not weakening.


DSA ASSESSMENT FOR GE CLIENTS

CLASSIFICATION FLOWCHART

1. Does client service transmit/cache/host third-party content?  
   NO → DSA does not apply  
   YES → Tier 1 (intermediary service)  

2. Does client service store information at recipient's request?  
   YES → Tier 2 (hosting service)  

3. Does client service disseminate stored info to the public?  
   YES → Tier 3 (online platform)  

4. Does client service enable distance contracts between consumers and traders?  
   YES → Tier 4 (online marketplace)  

5. Does client service have 45M+ monthly active EU users?  
   YES → Tier 5 (VLOP/VLOSE)  
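
The flowchart above can be encoded directly, so onboarding tooling asks the same questions in the
same order for every client. A minimal sketch (function and parameter names are ours; the return
value is the highest applicable tier, with 0 meaning out of scope):

```python
def classify_dsa_tier(
    handles_third_party_content: bool,       # transmits, caches, or hosts it
    stores_at_recipient_request: bool,
    disseminates_to_public: bool,
    enables_consumer_trader_contracts: bool,
    monthly_active_eu_users: int,
) -> int:
    """Walk the DSA classification flowchart; each answer can only raise the tier."""
    if not handles_third_party_content:
        return 0                              # DSA does not apply
    tier = 1                                  # intermediary service
    if stores_at_recipient_request:
        tier = 2                              # hosting service
        if disseminates_to_public:
            tier = 3                          # online platform
            if enables_consumer_trader_contracts:
                tier = 4                      # online marketplace
            if monthly_active_eu_users >= 45_000_000:
                tier = 5                      # VLOP/VLOSE
    return tier
```

Note that a Tier-5 marketplace still carries the Tier-4 trader obligations; the single integer is
a shorthand for the onboarding record, not a replacement for the cumulative obligation list.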

COMMON GE CLIENT SCENARIOS

SAAS_WITH_USER_CONTENT: likely Tier 3 (online platform) — full platform obligations
MARKETPLACE/E-COMMERCE: likely Tier 4 — trader verification required
BUSINESS_TOOL_NO_PUBLIC_CONTENT: likely Tier 2 (hosting) — notice-and-action only
API/BACKEND_SERVICE: likely Tier 1 — minimal obligations
STATIC_WEBSITE: likely not in scope (no hosting of third-party content)

ACTION: Eric must classify every client project by DSA tier during onboarding.
ACTION: Julian must ensure technical architecture satisfies applicable tier obligations.


IMPLEMENTATION CHECKLIST

FOR_TIER_2_AND_ABOVE:
CHECK: notice-and-action mechanism implemented
CHECK: statement of reasons system for content decisions
CHECK: law enforcement notification process
CHECK: terms of service include content moderation policy
CHECK: annual transparency report generated

FOR_TIER_3_AND_ABOVE (add):
CHECK: internal complaint-handling system
CHECK: trusted flagger integration
CHECK: ad labelling and transparency
CHECK: recommender system transparency
CHECK: minor protection (no profiling-based ad targeting)
CHECK: dark pattern audit of UI/UX

FOR_TIER_4 (add):
CHECK: trader identity verification (KYCT)
CHECK: product compliance-by-design checks
CHECK: consumer notification system for illegal products
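
The tiered checklist above can be kept as data, so each client's applicable checks are derived
mechanically from their DSA tier. A sketch (labels abbreviate the checklist items; the structure
and names are ours):

```python
# tier -> checks added at that tier; lower tiers are inherited (Tiers 2-4)
CHECKLIST = {
    2: ["notice-and-action mechanism", "statement-of-reasons system",
        "law-enforcement notification process", "moderation policy in ToS",
        "annual transparency report"],
    3: ["internal complaint handling", "trusted-flagger integration",
        "ad labelling and transparency", "recommender transparency",
        "minor protection (no profiled ads)", "dark-pattern UI audit"],
    4: ["trader identity verification (KYCT)", "compliance-by-design checks",
        "illegal-product consumer notification"],
}

def applicable_checks(tier: int) -> list:
    """Cumulative checklist for a client at the given tier."""
    return [item for t in sorted(CHECKLIST) if t <= tier for item in CHECKLIST[t]]
```

Keeping the mapping in one place means a reclassification during onboarding (e.g. Tier 3 to
Tier 4 after a marketplace feature launches) automatically extends the client's checklist.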


READ_ALSO: domains/eu-regulation/index.md, domains/eu-regulation/consumer-protection.md, domains/privacy/index.md