DPO Radio


Bridging Offline Values with Online Actions

Oct 01, 2025 - 13 minute read

If It’s Not Okay Offline, It’s Not Okay Online


TL;DR: Our scans show 82% of Danish and 91% of Swedish municipalities, and 73% of 36,500 Danish business websites, still fire trackers before consent. That isn’t just non-compliant - it’s unethical: treating people as targets, not participants. We’d never accept that offline.

Why customers would walk out of our “offline stores” if we behaved like most websites

Imagine this.

  • You step into a boutique. Before you even cross the doormat, an employee silently rifles through your pockets, takes an imprint of your key, notes where you came from, and whispers your details to a dozen vendors across the street - “just in case we need to optimize your journey.”
  • You sit down at a café. The waiter copies your entire address book “for service improvements,” then tells the bakery next door how often you meet your friends and what you usually order.
  • You check in at a hotel. The receptionist says, “We’ll follow you to every restaurant in town tonight - but only to improve the guest experience.”

In the physical world, this is absurd. Unethical. A sackable offense. Customers would leave… and tell everyone!

Online, this is routine.

Most data capture starts before consent, routes information through third parties, and buries the truth in dark-pattern banners that simulate choice. We’ve normalized behaviors that would be unthinkable face-to-face.

“This isn’t just a compliance flaw. It’s a trust failure - one that costs brands loyalty, lifetime value, and reputation.”

What we reward, we repeat

For a decade, digital teams were incentivized to collect as much data as possible, as early as possible. When growth is measured only by short-term conversion and ad efficiency, ethics looks like friction.

But customers aren’t abstract “users.” They are people with rights, context, and memory. When they sense manipulation, they churn. And increasingly, the law - and the market - punishes the shortcuts.

  • Consent must come before access to someone’s device, not after - that’s the entire point of the “pre-consent” rule in ePrivacy Directive Article 5(3).
  • Technically required ≠ everything we’d like to run. The exemption is narrow: only what’s strictly necessary for the service the person asked for. Analytics and CMP telemetry generally aren’t “strictly necessary” for the service requested.
  • First-party control is the ethical (and durable) default. Keeping collection within your own domain and infrastructure reduces risk, improves accuracy, and earns trust.

“Necessary is narrow: session continuity, load-balancing; not analytics, tag managers, invoicing telemetry, A/B, ads, social embeds.”

When we align to these fundamentals, something powerful happens: data quality improves, consent rates rise, and the relationship becomes reciprocal again.

“If it’s not okay offline, it’s not okay online.”

We don’t wake up in the morning and think, “I can’t wait to be profiled today.” We step into shops, cafés, hospitals, and town halls expecting something simple: to be seen as people first, not data points. In the physical world, the rules are understood - you don’t get followed without cause, you don’t get recorded without notice, and you don’t get upsold by a stranger who overheard your private conversation. Yet the moment we move online, too many brands swap those human rules for machine logic and call it “optimization.” To reset our instincts, let’s bring the web back into the real world. The following vignettes pair everyday, offline moments with their digital equivalents - not to shame, but to make the ethical line visible again. If it’s not okay across a counter, it isn’t okay across a cookie.

Nine vignettes you can use with your teams

  1. Shadow Filming vs. Store Cameras
    Security cameras with signage and purpose limits? Fine. Secret filming of every aisle and selling the footage to ad brokers? Not fine. That’s the difference between legitimate necessity and surveillance capitalism.
  2. Bag Check vs. Bag Copy
    A bag check at a venue is narrow, consensual, and time-bound. Mirroring your bag contents to a vendor’s warehouse “for analytics”? That’s what many SDKs and pixels do with your device. ePrivacy says: ask first.
  3. Receipts vs. Dossiers
    You expect a receipt (first-party record). You don’t expect the cashier to compile a cross-store dossier. Third-party trackers turn receipts into dossiers unless constrained by first-party design.
  4. Door Counter vs. Tail
    A simple footfall counter equals minimal, aggregated metrics. A stranger tailing you across town equals cross-site tracking. The former can often be consent-free; the latter requires explicit, informed consent.
  5. Coat Check Ticket vs. Biometric Stamp
    A claim ticket isn’t your identity. Fingerprints and “hashed identifiers” remain personal data when they single out, or can reasonably single out, a person. Treat them accordingly.
  6. Helpful Clerk vs. Pushy Middleman
    It’s ethical for your staff to assist. It’s not ethical for a third-party middleman to insert themselves into every interaction and extract value without the customer’s clear say-so. Legitimate interest doesn’t override ePrivacy.
  7. House Rules on the Wall
    In a physical space, rules are visible. Online, disclosures must be concise, specific, and accessible - not twelve clicks deep.
  8. Revoking Membership
    If I cancel a store card, the store stops using it. Likewise, digital consent must be revocable, logged, and honored across systems.
  9. Fire Exits and Floor Plans
    You’d never run a venue without safety maps. In data terms, that’s auditable records and technical enforcement - not banners that don’t block.

Private sector reality check

In July 2025, we ran privacy scans on 36,500 Danish business and e-commerce sites. 73% loaded third-party tags or wrote to the device before any consent. The most common patterns were early-loaded pixels, SDK beacons, tag managers set to fire pre-consent, analytics beacons, session-replay scripts, and fingerprinting helpers.

55% of the scanned sites loaded Google Tag Manager from US servers before any consent. That alone shares the visitor’s IP address and referrer with Google, and in many cases GTM then pulled third-party scripts including CMP telemetry prior to consent. A “container” still transmits personal data on the initial request. The referrer can expose full URLs and query strings. The IP is an identifier. That is processing.
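The pre-consent pattern described above can be made concrete. Here is a minimal sketch of the classification step a scanner performs: given a log of network requests with timestamps, flag every third-party call that fired before consent. The request shape, field names, and domains are illustrative assumptions, not the actual AesirX scanner API.

```javascript
// Sketch: flag third-party requests that fired before consent was given.
// Assumes each request is { url, at } where `at` is ms since page load,
// and `consentAt` is the moment of consent (Infinity if the visitor
// never touched the banner). Shapes and domains are illustrative.

function preConsentThirdPartyCalls(requests, firstPartyHost, consentAt) {
  const isThirdParty = (url) => {
    const host = new URL(url).hostname;
    // First-party: the site's own host or any of its subdomains.
    return host !== firstPartyHost && !host.endsWith("." + firstPartyHost);
  };
  return requests.filter((r) => r.at < consentAt && isThirdParty(r.url));
}

const calls = preConsentThirdPartyCalls(
  [
    { url: "https://www.example.dk/style.css", at: 10 },
    { url: "https://www.googletagmanager.com/gtm.js", at: 120 },
  ],
  "example.dk",
  Infinity // visitor never interacted with the banner
);
// `calls` contains only the GTM request: a pre-consent third-party call
```

Anything this function returns for an untouched banner is, per ePrivacy Directive Article 5(3), device access that happened without a legal basis.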

Under ePrivacy Directive Article 5(3), any storing or reading on the device for non-essential purposes requires prior consent. Pre-consent GTM and the scripts it triggers are not strictly necessary, so the device access is unlawful without consent.

Under GDPR, IP and referrer are personal data. Pre-consent transmission to a US provider requires a valid lawful basis and a compliant Chapter V transfer mechanism with a real transfer risk assessment. If the ePrivacy gate fails, there is no valid GDPR basis to process the resulting data.

If a store kept a dossier on you without asking, that would not be okay offline. The online equivalent is not okay either. Ethics and compliance align here: default-deny before choice, no reads or writes until a clear yes, and a short consent lifetime with renewal.

For those who want to verify or cite, the methodology and full numbers are listed in Source Notes and Further Reading.

From the private-sector data to the public-sector duty: citizens need public websites to show what good looks like, not mirror the same pre-consent patterns.

The public sector must lead - and stop normalizing hypocrisy

Citizens can’t “opt out” of government. That’s exactly why public institutions owe a higher duty of care than any brand. Yet our latest scans in August and September 2025 show the opposite: 82% of Danish and 91% of Swedish municipalities load beacons and trackers before consent, in direct conflict with ePrivacy Directive Article 5(3). When the referee breaks the rules, the game is rigged.

Look at the pattern:

  • Schools as testbeds. Classroom platforms and devices are rolled out first, risk-assessed later - exporting children’s telemetry by default and treating DPIAs as paperwork, not design.
  • Ministries negotiating telemetry, not trust. Productivity suites and copilots ship with “standard” settings that quietly phone home; only after pushback do configurations tighten, while residual risks remain.
  • Health data without a social license or consent. Large data platforms are procured for “efficiency” before earning legitimacy with clinicians and patients, fueling backlash about governance and oversight.
  • Analytics treated as harmless by default. Public sites still embed trackers as if they were essential utilities, normalizing pre-consent data grabs under the banner of “measurement.”

Leadership means changing the defaults. Here’s the minimum viable pledge every public body should make - and verify:

  1. First-party by design. No third-party trackers on citizen services unless strictly necessary and explicitly consented. Replace plugins and pixels with first-party equivalents you control, and publish a domain allowlist.
  2. Consent before collection - with proof. Nothing that can read or write to a device runs pre-consent; enforce at network/script level; reject must be as easy as accept. Keep auditable logs to demonstrate enforcement.
  3. Transparency people can understand. Plain-language notices that name recipients, purposes, and retention - on every service, not buried three links deep.
  4. Public DPIAs and annual red-teaming. Treat analytics, productivity suites, and AI copilots like critical infrastructure. Document residual risks, invite independent scrutiny, and act on findings.
  5. Measure what matters. Report a quarterly “pre-consent tracker rate” and drive it to zero. If a new campaign reintroduces risk, fix it within the sprint, not the fiscal year.
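Pledge item 1 calls for a published domain allowlist. One way to enforce such a list in code rather than in policy documents is a Content-Security-Policy `script-src` directive, which makes the browser itself refuse non-allowlisted scripts. A minimal sketch, with illustrative domains; real deployments would also cover `connect-src`, `img-src`, and nonces for inline scripts:

```javascript
// Sketch: turn a published domain allowlist into a Content-Security-Policy
// script-src directive. The browser then blocks any script host not on the
// list, independent of what a banner claims. Domains are illustrative.

function scriptSrcPolicy(allowlist) {
  // 'self' covers first-party scripts; Set() deduplicates the allowlist.
  const sources = ["'self'", ...new Set(allowlist)];
  return "script-src " + sources.join(" ");
}

const header = scriptSrcPolicy(["https://cdn.example-municipality.dk"]);
// header === "script-src 'self' https://cdn.example-municipality.dk"
```

Publishing the allowlist and shipping it as a CSP header are the same artifact: one is the promise, the other is the enforcement.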

If governments want legitimacy for digital transformation, they must earn it in code, not in press releases. Flip the default - first-party by default, consent before collection, continuous verification - and those 82%/91% numbers start to fall.

Dark-pattern consent is not consent

Consent should feel like a conversation. Dark patterns turn it into a trap. The page loads; a banner slides in like a friendly usher; the “Accept” button glows while “Reject” hides behind a fold or a euphemism. You haven’t chosen anything, but scripts are already whispering to third parties. That isn’t consent - it’s choreography designed to make “yes” the path of least resistance and “no” the path of attrition.

Watch how the misdirection works. The banner claims “necessary cookies only,” yet the “necessary” bucket smuggles in analytics, ad beacons, and even the consent banner’s own telemetry - sent so the vendor can invoice the website owner by usage. A single, oversized “OK” pretends to be neutral, while “reject” requires a scavenger hunt. “Legitimate interest” arrives pre-switched, as if rights were opt-outs from a loyalty program. Even the language is engineered: “improve your experience,” “partners,” “personalization” - soft words that conceal hard data flows.

The harm isn’t abstract. Dark patterns distort metrics, poison consent logs, and teach people that their choices don’t matter. They also erode the very thing marketers need most: trust. When someone clicks “reject” and still sees the network tab light up, they don’t feel “optimized”; they feel deceived. And once a person senses the game is rigged, every future prompt - for email, for payment, for permission - inherits that suspicion.

There’s a simple, human standard to hold onto: if a person says nothing, nothing should happen. If they say no, the site should keep working without punishment or nagging. If they change their mind, the change should take effect everywhere, immediately, with a record that proves it. Real consent is calm, symmetrical, and reversible. It does not shout “ACCEPT” while whispering “decline,” it does not bundle unrelated purposes, and it never treats silence as a signal.

Dark patterns are a growth hack with a half-life. You can squeeze a few extra percentage points today, but you pay for them in complaints, churn, and credibility tomorrow. The ethical - and effective - alternative is to design for agency: block by default, explain plainly, separate purposes, make “reject” as easy as “accept,” and let your behavior match your banner. When people feel respected, they don’t just comply; they commit.

“Open a private window, don’t touch the banner, reload with DevTools → Network. You should see zero third-party calls.”

Principles matter only when they survive a sprint - here’s how to make them do that.

A practical Data-Ethics Playbook 

We don’t fix dark patterns with better speeches; we fix them with better defaults. Ethics only matters if it survives contact with a sprint, a campaign launch, or a vendor handoff. So let’s turn principles into shippable practices - the kind that are boring in the best way because they work every day without heroics.

Think of this as moving from posture to proof. If a stranger opened your dev tools in a fresh session, would your site’s behavior match your banner’s promises? Could you explain every data flow in plain language and back it with logs? That’s the bar.

What follows isn’t theory. It’s a playbook you can implement in weeks, not quarters - a set of choices that makes “respect by default” the easiest path for your team and the clearest signal for your customers. Pick one step, ship it, measure the improvement, then take the next. Ethics scales the same way software does: incrementally, and in code.

  1. Block before you ask. Ensure any script or SDK capable of storing/reading identifiers does not execute until the person says yes. UI is not enough - use system-level enforcement.
  2. Collect only what you can defend. If you can’t explain it to a customer (or a regulator) in 20 seconds, don’t collect it. Data minimization isn’t a slogan; it’s a design constraint.
  3. Go first-party by design. Replace third-party plugins with first-party equivalents where possible. Your risk - and your reliance on surveillance incentives - drops immediately.
  4. Make consent real. Use explicit, informed, granular choices with easy revocation. Log every consent event with an audit trail.
  5. Be transparent in plain language. People should understand who gets data, what for, and for how long. No euphemisms.
  6. Continuously monitor. New campaigns and embeds re-introduce risk. Scan daily, enforce automatically, and keep verifiable logs.
  7. Prove the upside. Ethical data isn’t less valuable - it’s more valuable. First-party data is cleaner, more durable, and more actionable.

“CMP interfaces don’t block; script/network controls do. Treat UI as a signal, not enforcement.”
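What “script-level enforcement” means in practice: scripts are shipped inert and tagged with a purpose, and a gate decides what may execute given the recorded consent state. The purpose names and the shape of the `consent` object below are assumptions for illustration, not a real CMP API:

```javascript
// Sketch: the decision layer behind script-level blocking. Nothing runs
// unless its purpose is strictly necessary or explicitly consented.
// Purpose names and data shapes are illustrative assumptions.

const STRICTLY_NECESSARY = new Set(["session", "load-balancing"]);

function scriptsAllowedToRun(scripts, consent) {
  return scripts.filter(
    (s) => STRICTLY_NECESSARY.has(s.purpose) || consent[s.purpose] === true
  );
}

// With no consent recorded, only strictly necessary scripts survive:
const allowed = scriptsAllowedToRun(
  [
    { src: "/session.js", purpose: "session" },
    { src: "https://cdn.tracker.example/px.js", purpose: "analytics" },
  ],
  {} // empty consent record: the visitor has said nothing
);
// allowed contains only "/session.js"
```

In the browser this commonly maps onto the `type="text/plain"` blocking pattern: third-party tags ship non-executable and are only flipped to real scripts after a logged “yes,” so the UI is a signal and the gate is the enforcement.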

Choose consent you can prove and data you can defend

If a single third-party script runs before a choice, that isn’t “optimization”; it’s privacy theater. People can feel it. The fix isn’t a better banner - it’s a better default.

Flip the default, keep the trust:

  • First-party by design. Own the data path. Replace plugins/pixels with first-party equivalents you control.
  • Consent before collection. Silence means silence. “Reject” works as easily as “Accept,” and the site still works.
  • Enforcement you can audit. Block at the network/script level and log every consent state and change.

Make it real in 30 days:

  • Week 1 - Discover: Inventory every third-party call; measure your pre-consent tracker rate; identify “shadow code.”
  • Week 2 - Gate: Implement hard blocking for non-essential scripts; make reject-as-easy-as-accept; ship a visible revoke control.
  • Week 3 - Replace: Swap high-risk embeds for first-party alternatives; remove anything you can’t justify in one sentence.
  • Week 4 - Prove: Turn on consent logging, publish your domain allowlist, and re-measure. Share the before/after with your team.
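The Week 1 metric above can be defined precisely so the Week 4 re-measurement is comparable. A minimal sketch, assuming each scan result already records how many third-party calls fired before consent; the field names are illustrative:

```javascript
// Sketch: the "pre-consent tracker rate" from Week 1, computed over scan
// results. A site counts as offending if at least one third-party call
// fired before any consent. Field names are illustrative assumptions.

function preConsentTrackerRate(scanResults) {
  if (scanResults.length === 0) return 0;
  const offending = scanResults.filter(
    (s) => s.preConsentThirdPartyCalls > 0
  );
  return offending.length / scanResults.length;
}

const rate = preConsentTrackerRate([
  { site: "a.example.dk", preConsentThirdPartyCalls: 2 },
  { site: "b.example.dk", preConsentThirdPartyCalls: 0 },
]);
// rate === 0.5, i.e. half the scanned sites fire trackers before consent
```

Report this number quarterly and drive it to zero; it is the same statistic behind the 73%, 82%, and 91% figures cited earlier.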

When you can open dev tools on a clean session and see nothing run before a choice, you don’t just comply - you signal respect. That signal compounds: higher trust, cleaner data, fewer surprises.

If you want a structured pressure test, book a Privacy Compliance Review. We’ll map the risk, flip the defaults, and leave you with data you can stand behind - to customers, to regulators, and to yourself.

“First-party by design beats surveillance by habit.”

The Line We Choose

Growth at any cost is just extraction with better branding. Real growth is the compounding effect of trust: people who return because they feel respected - not managed. That’s the choice in front of us.

Collect less. Explain more. Block by default. Let “no” be as easy as “yes.” Build first-party systems you can stand in front of. If a stranger opened your dev tools on a clean session, would your behavior match your banner? That’s the mirror that matters.

This isn’t about winning arguments. It’s about earning permission - every day, with code that honors the person on the other side of the screen. When we do, we don’t just comply with laws; we align with people.

Choose consent you can prove and data you can defend. Not because regulators demand it, but because your customers deserve it - and because your brand becomes the sum of these choices.

Draw the line. Hold it - in code.

Ronni K. Gothard Christiansen
Technical Privacy Engineer & CEO, AesirX.io

Source Notes & Further Reading

If you’re the kind of reader who wants more than opinions, this is your trail of receipts. Over the past months I’ve documented the incentives, the technical mechanics, and the human costs behind today’s data practices - from public-sector “sovereignty” that quietly exports telemetry, to growth playbooks that treat consent as a UX obstacle. These aren’t hot takes; they’re field notes from audits, scans, and real implementations.

Use the pieces below as a map. Start with the big picture of why most sites still break the rules, then dive into how dark patterns persist, and finally into the engineering choices that flip the defaults. Share them with a colleague who’s skeptical, a leader who needs the business case, or a developer who wants the how-to. Each link expands one argument in this essay and shows what “ethical by design” looks like in practice.
