DOMAIN:PRIVACY:PITFALLS¶
OWNER: julian
ALSO_USED_BY: aimee, eric, victoria
UPDATED: 2026-03-26
SCOPE: common privacy mistakes and misunderstandings in digital services
PITFALL: CONSENT FATIGUE¶
THE PROBLEM¶
Users are overwhelmed by consent requests.
The average EU user encounters 3-5 cookie banners per day.
Result: users click "accept all" reflexively — consent becomes meaningless.
Regulators are aware and tightening requirements accordingly.
CONSEQUENCES¶
- "accept all" rates above 90% may indicate dark patterns (regulatory red flag)
- consent obtained under fatigue may not meet GDPR "freely given" standard
- repeated consent requests erode user trust and brand perception
- growing regulatory push toward consent-free alternatives (see cookie-consent.md)
MITIGATION¶
DO: minimise consent requests — only ask when genuinely needed
DO: use legitimate interest where legally defensible (with documented LIA)
DO: use contract performance basis where applicable (no consent needed)
DO: implement consent-free analytics (self-hosted, first-party)
DO: design consent UX to be quick and clear (one-click reject)
DO_NOT: bundle multiple purposes into single consent
DO_NOT: re-ask consent that was already given (unless policy changed)
DO_NOT: use consent for processing that has another valid lawful basis
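The DO/DO_NOT rules above can be sketched as a minimal consent record: one record per purpose (no bundling), nothing pre-selected by default, and re-asking only when the policy version changed. All names here are hypothetical, not an existing API.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical sketch: one consent record per specific purpose (never bundled),
# defaulting to "not given" so nothing is ever pre-ticked.
@dataclass
class ConsentRecord:
    user_id: str
    purpose: str              # one specific purpose, e.g. "analytics"
    given: bool = False       # default False: no pre-selected consent
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    policy_version: str = "1.0"

def needs_reconsent(record: ConsentRecord, current_policy: str) -> bool:
    """Only re-ask when the policy actually changed (avoids consent fatigue)."""
    return record.given and record.policy_version != current_policy

rec = ConsentRecord(user_id="u1", purpose="analytics", given=True, policy_version="1.0")
print(needs_reconsent(rec, "1.0"))  # False: consent still valid, do not re-ask
print(needs_reconsent(rec, "2.0"))  # True: policy changed, re-asking is justified
```

Keeping one record per purpose makes the "no bundling" rule structural rather than a UX afterthought.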
PITFALL: DARK PATTERNS (illegal under DSA + GDPR)¶
THE PROBLEM¶
Dark patterns in privacy interfaces are now explicitly illegal under multiple regulations.
DSA Art. 25 prohibits deceptive design.
UCPD Annex I blacklists specific manipulative practices.
GDPR requires consent to be "freely given" — dark patterns undermine this.
Studies reported that 97% of EU apps still used dark patterns as of 2024.
COMMON DARK PATTERNS IN PRIVACY¶
CONFIRMSHAMING: "No, I don't care about my privacy" as reject option
FORCED_ACTION: requiring account creation to access basic content
HIDDEN_OPTIONS: burying "reject all" in settings submenu
VISUAL_INTERFERENCE: bright "accept" button, grey "decline" text
OBSTRUCTION: 5 clicks to reject, 1 click to accept
TRICK_QUESTIONS: double negatives in consent checkboxes
PRESELECTION: consent checkboxes pre-ticked
SNEAKING: changing privacy settings during unrelated updates
NAGGING: repeated pop-ups pressuring consent
ENFORCEMENT¶
SHEIN: EUR 150M (cookies before consent — forced action + sneaking)
GOOGLE_FRANCE: EUR 100M (reject harder than accept — obstruction)
CNIL: systematic review of top websites for dark patterns
DUTCH_DPA: 50 organisations warned specifically about cookie dark patterns (Apr 2025)
AUDIT APPROACH¶
- time the "reject all" path vs "accept all" path — must be comparable
- count clicks to reject vs accept — must be equal
- verify button prominence — equal visual weight
- check for manipulative language
- verify no pre-selected options
- test on mobile — dark patterns often worse on small screens
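The audit checks above can be reduced to a small, mechanical comparison of the accept and reject paths. This is a hypothetical sketch (function and parameter names are invented); a real audit also needs manual review of language and mobile layouts.

```python
# Hypothetical audit sketch: flag consent UIs where rejecting takes more
# effort than accepting (obstruction), options are pre-selected, or the
# buttons have unequal visual weight.
def audit_consent_ui(accept_clicks: int, reject_clicks: int,
                     preselected: bool, equal_prominence: bool) -> list[str]:
    findings = []
    if reject_clicks > accept_clicks:
        findings.append("OBSTRUCTION: reject path longer than accept path")
    if preselected:
        findings.append("PRESELECTION: consent options pre-ticked")
    if not equal_prominence:
        findings.append("VISUAL_INTERFERENCE: unequal button prominence")
    return findings

# The classic "5 clicks to reject, 1 click to accept" banner fails all checks:
print(audit_consent_ui(accept_clicks=1, reject_clicks=5,
                       preselected=True, equal_prominence=False))
```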
PITFALL: LEGITIMATE INTEREST OVERREACH¶
THE PROBLEM¶
Legitimate interest (Art. 6(1)(f)) is the most flexible lawful basis.
This flexibility leads to overuse and weak justifications.
DPAs are increasingly scrutinising legitimate interest claims.
COMMON MISTAKES¶
TRAP: using legitimate interest without documenting LIA (Legitimate Interest Assessment).
TRAP: claiming legitimate interest for marketing to non-customers (weak basis).
TRAP: legitimate interest for cookie-based tracking — EDPB Guidelines 1/2024 explicitly say NO.
TRAP: legitimate interest for AI training without proper three-step assessment (EDPB Opinion 28/2024).
TRAP: claiming legitimate interest when consent was previously relied upon and withdrawn.
TRAP: asserting legitimate interest for processing that clearly overrides individual rights.
WHEN LEGITIMATE INTEREST WORKS¶
VALID: fraud prevention on own platform
VALID: network and information security
VALID: direct marketing to existing customers (with easy opt-out)
VALID: internal analytics on own services (limited scope)
VALID: enforcement of legal claims
VALID: intra-group data transfers for administrative purposes
WHEN LEGITIMATE INTEREST FAILS¶
INVALID: cookie-based advertising tracking
INVALID: selling data to third parties
INVALID: large-scale profiling of individuals
INVALID: processing special category data
INVALID: processing children's data for marketing
INVALID: systematic monitoring of public areas (usually requires DPIA + often consent)
REQUIRED DOCUMENTATION¶
EVERY use of legitimate interest MUST have documented LIA:
1. identify specific legitimate interest (not generic — "business improvement" is not enough)
2. demonstrate necessity (could you achieve the goal with less intrusive means?)
3. conduct balancing test (do individual rights override the interest?)
4. document reasoning and conclusion
5. review periodically (at least annually)
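The five documentation steps map directly onto a record structure. A minimal sketch, assuming hypothetical field names; the balancing test itself remains a legal judgment, the code only enforces that each step is recorded and that the annual review is tracked.

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Hypothetical sketch of an LIA record mirroring the five documentation steps.
@dataclass
class LIARecord:
    interest: str            # step 1: specific interest, not generic
    necessity: str           # step 2: why no less intrusive means suffices
    balancing_outcome: str   # step 3: "interest prevails" or "rights prevail"
    reasoning: str           # step 4: documented conclusion
    last_review: date        # step 5: reviewed at least annually

    def is_overdue(self, today: date) -> bool:
        return today - self.last_review > timedelta(days=365)

lia = LIARecord(
    interest="fraud prevention on own platform",
    necessity="cannot detect account takeover without login metadata",
    balancing_outcome="interest prevails",
    reasoning="limited data, strong security interest, low intrusion",
    last_review=date(2025, 3, 1),
)
print(lia.is_overdue(date(2026, 3, 26)))  # True: annual review is due
```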
PITFALL: PROCESSOR vs CONTROLLER CONFUSION¶
THE PROBLEM¶
Misclassifying the controller/processor relationship has serious consequences.
Each role has different obligations, liabilities, and enforcement exposure.
Advanced Computer Software: GBP 3.1M (ICO, 2025; the first fine imposed directly on a data processor).
DETERMINING ROLES¶
CONTROLLER: determines WHY and HOW personal data is processed
PROCESSOR: processes personal data on behalf of controller
JOINT_CONTROLLERS: two or more controllers jointly determine purposes and means
COMMON MISCLASSIFICATIONS¶
TRAP: SaaS provider assumes they are always a processor — may be controller for their own analytics on customer data.
TRAP: cloud provider claims "no personal data processing" — if data stored includes PII, they are a processor.
TRAP: "we just provide the software" — if GE decides what data to collect and how, GE may be controller.
TRAP: marketing agency claims processor status — if they decide targeting criteria, they are controller.
TRAP: two companies sharing data assume one is processor — if both determine purposes, they are joint controllers.
GE ROLE ANALYSIS¶
GE_AS_PROCESSOR: operating client platform where client decides what data to collect
GE_AS_CONTROLLER: processing employee data, vendor data, own marketing
GE_AS_JOINT_CONTROLLER: if GE and client jointly decide processing (rare, avoid if possible)
GE_AS_SUB-PROCESSOR: when GE uses cloud services holding client's customer data
CONSEQUENCES OF MISCLASSIFICATION¶
IF_CONTROLLER_CLAIMS_PROCESSOR:
- no lawful basis documented (a controller obligation)
- privacy notice obligation not met
- DPIA obligation not met
- direct DPA enforcement exposure goes unrecognised
IF_PROCESSOR_CLAIMS_CONTROLLER:
- acting outside controller's instructions (breach of DPA)
- potential for both parties to be fined
- data subjects confused about who to contact
DECISION FRAMEWORK¶
ASK: who decides WHAT data to collect? → likely controller
ASK: who decides WHY data is processed? → likely controller
ASK: who decides HOW data is processed? → may indicate controller (if fundamental)
ASK: does the entity have its own interest in the processing? → likely controller
ASK: does the entity only act on documented instructions? → likely processor
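The ASK questions above can be tallied as indicators. This is a hypothetical heuristic (invented names throughout) that only counts signals; actual role classification requires legal analysis of the contract and the facts.

```python
# Hypothetical heuristic mirroring the decision-framework questions.
# It only tallies indicators; it does not replace legal analysis.
def classify_role(decides_what: bool, decides_why: bool,
                  decides_how_fundamental: bool, own_interest: bool,
                  only_on_instructions: bool) -> str:
    controller_signals = sum([decides_what, decides_why,
                              decides_how_fundamental, own_interest])
    if only_on_instructions and controller_signals == 0:
        return "likely processor"
    if controller_signals >= 2:
        return "likely controller"
    return "unclear: seek legal analysis"

# SaaS host acting strictly on the client's documented instructions:
print(classify_role(False, False, False, False, only_on_instructions=True))
# Marketing agency choosing targeting criteria for its own benefit:
print(classify_role(True, True, False, True, only_on_instructions=False))
```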
PITFALL: CROSS-BORDER TRANSFERS¶
THE PROBLEM¶
Any transfer of personal data outside the EEA requires a legal mechanism.
"Transfer" includes remote access — EU-stored data accessed from US = transfer.
Schrems II made US transfers high-risk. DPF helps but is fragile.
HIDDEN TRANSFERS¶
TRAP: using US SaaS tool (Slack, GitHub, Jira) — data transferred to US.
TRAP: US-based support team accessing EU customer data — transfer by access.
TRAP: CDN caching personal data at edge locations outside EEA.
TRAP: backup replication to non-EEA data centres.
TRAP: AI/ML tools (Claude API, OpenAI) processing EU personal data in US.
TRAP: email service (SendGrid, Mailgun) processing through US infrastructure. USE: Brevo (FR) or Mailjet (FR) as EU-first alternatives.
TRAP: cloud provider's sub-processors in non-adequate countries.
TRANSFER MAPPING¶
ACTION: map ALL data flows including:
- primary storage location
- backup/replication locations
- access points (who accesses from where)
- sub-processor locations
- CDN/edge locations
- third-party integrations
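Once the flows above are mapped, flagging the ones that need a transfer mechanism is mechanical. A minimal sketch with illustrative (deliberately incomplete) country sets; note the US is excluded from ADEQUATE here because DPF adequacy covers only certified organisations, so US flows always warrant a check.

```python
# Hypothetical sketch: flag mapped data flows whose location falls outside
# the EEA or an adequacy-decision country. Both sets are illustrative
# subsets, NOT authoritative lists.
EEA = {"NL", "DE", "FR", "IE", "NO", "IS", "LI"}
ADEQUATE = {"GB", "CH", "JP", "KR", "CA", "NZ", "IL"}
# "US" is intentionally absent: DPF covers only certified organisations,
# so every US flow needs a mechanism check (DPF certification or SCCs + TIA).

def flag_transfers(flows: list[tuple[str, str]]) -> list[tuple[str, str]]:
    """flows: (flow_type, country_code) pairs; returns flows needing a mechanism."""
    return [(kind, cc) for kind, cc in flows
            if cc not in EEA and cc not in ADEQUATE]

flows = [
    ("primary storage", "NL"),
    ("backup replication", "US"),
    ("support access", "IN"),
    ("CDN edge", "DE"),
]
print(flag_transfers(flows))  # [('backup replication', 'US'), ('support access', 'IN')]
```

Remote support access shows up here like any other flow, which matches the "transfer by access" point above.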
DPF FRAGILITY¶
STATUS: valid since Jul 10, 2023
RISK: NOYB challenge ("Schrems III") announced
RISK: PCLOB quorum issues undermine oversight mechanism
RISK: US political environment creates uncertainty
CONTINGENCY:
- maintain SCCs as backup for all US transfers
- keep TIA documentation current
- monitor DPF adequacy decision status
- design data flows for quick switch from DPF to SCCs
PITFALL: DATA BREACH NOTIFICATION (72 HOURS)¶
THE PROBLEM¶
GDPR Art. 33 requires controller to notify DPA within 72 hours of becoming "aware" of breach.
Art. 34 requires notification to affected individuals if high risk.
Most organisations miss the 72-hour window or fail to assess properly.
WHAT COUNTS AS A BREACH¶
DEFINITION (Art. 4(12)): breach of security leading to accidental or unlawful destruction,
loss, alteration, unauthorised disclosure of, or access to personal data.
INCLUDES:
- ransomware encrypting personal data (availability breach)
- email sent to wrong recipient (confidentiality breach)
- database exposed to internet (confidentiality breach)
- employee accessing records without authorisation (confidentiality breach)
- data corruption without backup (integrity/availability breach)
- lost laptop with unencrypted personal data (confidentiality breach)
NOTIFICATION TIMELINE¶
HOUR_0: breach detected or reported
WITHIN_72H: notify supervisory authority (unless unlikely to result in risk to individuals)
WITHOUT_UNDUE_DELAY: notify affected individuals (if high risk to rights and freedoms)
NOTIFICATION TO DPA (Art. 33)¶
MUST INCLUDE:
- nature of breach (categories and approximate number of data subjects and records)
- DPO contact details
- likely consequences of breach
- measures taken or proposed to address breach
IF full information not available within 72h: may provide in phases.
DUTCH_DPA: notification via online form at autoriteitpersoonsgegevens.nl.
NOTIFICATION TO INDIVIDUALS (Art. 34)¶
REQUIRED WHEN: breach likely to result in high risk to rights and freedoms
MUST: communicate in clear, plain language
MUST: describe likely consequences and measures taken
EXEMPT IF: data was encrypted, risk mitigated, or individual notification requires disproportionate effort (public communication instead)
COMMON MISTAKES¶
TRAP: starting the 72h clock only when DPO/management is informed rather than when the IT team discovers the breach — the clock starts at organisational "awareness," which can be first detection.
TRAP: not notifying because "we're just a processor" — processor must notify controller without undue delay, controller notifies DPA.
TRAP: waiting for forensic report to complete before notifying — notify within 72h, supplement later.
TRAP: underestimating risk to decide notification is not needed — document reasoning.
TRAP: not logging all breaches (including minor ones that don't require notification) — Art. 33(5) requires internal breach register.
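The Art. 33(5) register and the 72-hour deadline can be sketched together. Names are hypothetical; the point is that every breach gets an entry with documented risk reasoning, and the DPA deadline is computed from awareness, not from the end of forensics.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

# Hypothetical sketch of the internal breach register Art. 33(5) requires:
# log every breach, including ones below the notification threshold.
@dataclass
class BreachEntry:
    description: str
    aware_at: datetime     # moment of organisational awareness, not forensics done
    notifiable: bool       # risk to individuals? document reasoning either way
    risk_reasoning: str

    def dpa_deadline(self) -> datetime:
        # Art. 33: notify the supervisory authority within 72h of awareness.
        return self.aware_at + timedelta(hours=72)

entry = BreachEntry(
    description="email with customer list sent to wrong recipient",
    aware_at=datetime(2026, 3, 20, 9, 0, tzinfo=timezone.utc),
    notifiable=True,
    risk_reasoning="confidentiality breach, unmitigated disclosure",
)
print(entry.dpa_deadline())  # 2026-03-23 09:00:00+00:00
```

Even when `notifiable` is False, the entry stays in the register with its reasoning, which is exactly what the last TRAP above warns about.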
PITFALL: CHILDREN'S DATA (AGE VERIFICATION)¶
THE PROBLEM¶
Children's data has enhanced protections under GDPR (Art. 8) and DSA (Art. 28).
Age verification is difficult to implement without creating privacy problems.
No reliable, privacy-friendly age verification exists at scale yet.
REQUIREMENTS¶
CONSENT_AGE (for information society services):
- Netherlands: 16 years
- Germany: 16 years
- France: 15 years
- Belgium: 13 years
- Spain: 14 years (national derogation from the GDPR default of 16)
BELOW_AGE: parental consent required — "reasonable efforts" to verify
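The member-state thresholds above make a simple lookup. A hedged sketch (hypothetical names): countries not in the table fall back to the GDPR ceiling of 16, the conservative default.

```python
# Hypothetical lookup of member-state consent ages from the table above.
# Below the threshold, parental consent with "reasonable efforts" to
# verify applies (GDPR Art. 8).
CONSENT_AGE = {"NL": 16, "DE": 16, "FR": 15, "BE": 13, "ES": 14}

def needs_parental_consent(country: str, age: int) -> bool:
    # Unknown countries fall back to 16, the GDPR default and ceiling.
    return age < CONSENT_AGE.get(country, 16)

print(needs_parental_consent("NL", 15))  # True: below the Dutch threshold of 16
print(needs_parental_consent("BE", 14))  # False: at or above the Belgian 13
```

This only answers "which threshold applies"; how to verify the stated age is the separate, harder problem discussed below.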
AGE VERIFICATION CHALLENGES¶
TRAP: asking for date of birth — trivially falsifiable, creates new PII.
TRAP: requiring ID upload — disproportionate for most services, creates data minimisation issue.
TRAP: credit card verification — excludes legitimate child users, not reliable.
TRAP: AI-based age estimation — accuracy issues, potential discrimination, DPIA required.
CURRENT APPROACHES¶
SELF-DECLARATION: lowest bar — user states age. Legally insufficient for high-risk services.
CREDIT_CARD: moderate assurance — but privacy concerns and exclusion.
ID_VERIFICATION: high assurance — but disproportionate data collection.
eIDAS_WALLET: future solution — selective disclosure of age attribute without full identity.
THIRD_PARTY: delegated age check (Yoti, AgeChecked) — reduces data held but adds sub-processor.
DSA REQUIREMENTS¶
PLATFORMS MUST NOT: use profiling-based advertising targeting minors (Art. 28).
PLATFORMS MUST: take appropriate measures when aware a user is a minor.
VLOPs MUST: assess systemic risks to minors (Art. 34).
READ_ALSO: domains/privacy/index.md, domains/privacy/gdpr-implementation.md, domains/privacy/cookie-consent.md