Consent mode v2: the compliance tax on your analytics
What happens to your analytics when 40-60% of users decline cookies. How Google's behavioral modeling fills some of the gap, and whether you should trust it.
In March 2024, Google made consent mode v2 mandatory for anyone running ads in the EEA. If you hadn’t implemented it by then, your remarketing audiences stopped working and your conversion data went dark.
I spent that February on calls with panicking marketing teams. Most of them had never heard of consent mode until their Google Ads rep told them their campaigns would break in three weeks. Good times.
Now we’re two years in. Consent mode v2 is table stakes. But the thing nobody wants to talk about is what it actually did to your analytics quality. So let’s talk about it.
What consent mode v2 actually does
The short version: it lets Google tags adjust their behavior based on what a user consented to.
Before consent mode, the choice was binary. Either you fired all tags (ignoring consent, which is illegal under GDPR), or you blocked all tags until consent was given (losing data on everyone who clicked “reject” or just ignored the banner).
Consent mode sits in between. When a user declines cookies, Google tags still fire, but in a restricted way. They send “cookieless pings” to Google’s servers. These pings contain no personal identifiers, no cookies, no user IDs. They include basic information: a page was viewed, a conversion happened, the general geographic region, the timestamp.
Version 2 added two new consent signals: ad_user_data and ad_personalization. These give more granular control over whether user data can be used for ads. In practice, they mean Google needs explicit consent for remarketing lists and personalized ads, separate from analytics consent.
The technical implementation involves setting default consent states before any tags fire, then updating those states when a user interacts with your consent banner. If you use a CMP (consent management platform) like Cookiebot, OneTrust, or Usercentrics, they handle the integration. If you have a custom banner, you need to wire up the gtag('consent', 'update', {...}) calls yourself.
The data gap is real
Here’s the number everyone wants to know: how much data are you losing?
It depends on your audience, your consent banner design, and your geographic mix. But across my clients, I see consistent patterns.
German sites: 55-65% opt-out rates. Austrian sites: 50-60%. Netherlands: 45-55%. France: 40-50%. Southern Europe tends to be lower, around 30-40%. UK sites post-Brexit sit around 25-35% because the regulatory pressure is softer.
If you’re running a European e-commerce site, you’re realistically losing direct measurement on about half your traffic.
That’s not a rounding error. That’s half your data.
For a client doing €200K/month in revenue, consent mode means GA4 only directly measures about €100K-120K of that. The rest? You’re trusting Google’s model.
How Google fills the gap (behavioral modeling)
This is where it gets interesting. Google doesn’t just show you the data from consented users. It uses machine learning to model the behavior of non-consented users based on the cookieless pings and the patterns it observes from consented users.
The logic goes: if 3% of consented users who visited the pricing page converted, and we see 1,000 cookieless pings on the pricing page, we’ll model approximately 30 conversions from that group.
Google calls this “behavioral modeling” and it kicks in automatically once you have consent mode running and enough data. The threshold is roughly 1,000 events per day for at least 7 days, though Google is vague about exact requirements.
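The proportional logic above can be sketched in a few lines. This is my simplification for intuition only; Google's actual model conditions on far more signals (device type, geography, time of day) than a single observed rate:

```javascript
// Naive proportional estimate: apply the conversion rate observed
// among consented users to the volume of cookieless pings.
// An illustrative simplification, NOT Google's real model.
function estimateModeledConversions(consentedVisitors, consentedConversions, cookielessPings) {
  const observedRate = consentedConversions / consentedVisitors;
  return Math.round(observedRate * cookielessPings);
}

// 3% of consented pricing-page visitors converted; 1,000 cookieless
// pings on the same page imply roughly 30 modeled conversions.
const modeled = estimateModeledConversions(1000, 30, 1000);
console.log(modeled); // 30
```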
In GA4, you see modeled data mixed in with observed data. There’s a small icon in reports indicating when modeling is applied, but it’s easy to miss. In most reports, you can’t separate modeled from observed. The numbers just blend together.
For Google Ads, the impact is more direct. Modeled conversions appear in your conversion columns. Your automated bidding strategies use these modeled numbers. Your ROAS calculations include them. Your budget allocation depends on them.
My honest take: directionally useful, not precise
I’ve been watching consent mode data across about 40 properties for two years now. Here’s what I’ve concluded.
For traffic volume and page-level metrics, the modeling is pretty good. Total sessions, top pages, general traffic trends. I’ve compared modeled GA4 numbers against server logs, and they’re usually within 10-15% of actual. Good enough for trend analysis. Not good enough for A/B test results.
For conversion data, it’s more variable. I’ve seen months where modeled conversions tracked within 5% of actual (verified against payment processor data), and months where they were off by 30% or more. The modeling seems to struggle with low-volume conversion events, seasonal shifts, and anything that disproportionately affects consented vs. non-consented users.
Here’s a specific example. One of my e-commerce clients runs a lot of display advertising in Germany. Display traffic has a higher opt-out rate than organic or direct traffic, about 70% vs. 50%. Google’s model doesn’t fully account for this difference because it can’t see which channel the non-consented user came from (that information requires cookies to attribute). So the model tends to over-attribute conversions to organic and under-attribute to display. By about 20% in this case.
For campaign optimization, this matters. If Google Ads thinks your display campaigns convert 20% less than they actually do, automated bidding will underspend on display. Which might be fine if display genuinely performs worse. But the error isn’t in the campaign. It’s in the measurement.
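You can reproduce this bias with toy numbers. The sketch below allocates modeled conversions by the channel mix the model can see (consented traffic), while the non-consented pool actually skews toward the high-opt-out channel. All figures are hypothetical, not client data:

```javascript
// Illustrative only: why channel-blind modeling under-credits channels
// with high opt-out rates.
function allocationBias(channels, trueRate) {
  let totalConsented = 0;
  let totalNonConsented = 0;
  const split = {};
  for (const [name, ch] of Object.entries(channels)) {
    const nonConsented = ch.visitors * ch.optOut;
    split[name] = { nonConsented, consented: ch.visitors - nonConsented };
    totalNonConsented += nonConsented;
    totalConsented += split[name].consented;
  }
  // Conversions that actually happened in the non-consented pool
  const totalModeled = totalNonConsented * trueRate;
  const result = {};
  for (const name of Object.keys(channels)) {
    result[name] = {
      trueConv: split[name].nonConsented * trueRate,
      // The model can't see the channel behind a cookieless ping, so it
      // allocates by the mix it CAN see: the consented traffic.
      modeledConv: totalModeled * (split[name].consented / totalConsented),
    };
  }
  return result;
}

const out = allocationBias(
  {
    display: { visitors: 10000, optOut: 0.7 }, // 70% decline
    organic: { visitors: 10000, optOut: 0.5 }, // 50% decline
  },
  0.02 // assume both channels truly convert at 2% among non-consented users
);
// display truly produced ~140 conversions but is credited with ~90;
// organic truly produced ~100 but is credited with ~150.
```

Even with identical true conversion rates, display ends up under-credited by about a third in this toy setup, purely because its users decline more often.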
I don’t think this makes consent mode useless. I think it makes it a directional tool. It tells you things are going up or down. It tells you which pages get traffic and roughly how much. It does not give you numbers you can put in a financial report with confidence.
Not sure your consent mode is set up right? I'll review your implementation, check your data gaps, and make sure you're not losing conversions to misconfiguration.
Book a Free Audit →
Setting it up: basic vs. advanced
There are two implementation modes, and picking the wrong one limits what Google can model.
Basic consent mode blocks all Google tags until the user makes a choice. If they decline, no data is collected at all. This is simpler to implement but gives Google nothing to model from. You get a clean conscience and a big hole in your data.
Advanced consent mode sends cookieless pings even when users decline. Tags fire in a restricted mode: no cookies set, no personal data collected, but basic signals still reach Google’s servers. This is what enables behavioral modeling.
Most businesses should use advanced consent mode. It’s compliant (the French CNIL and German DSK have reviewed the approach, and while opinions vary, the mainstream legal interpretation supports it). And it gives you at least some signal from your non-consented traffic.
The implementation looks like this:
// Set defaults BEFORE any tags load
gtag('consent', 'default', {
  'ad_storage': 'denied',
  'ad_user_data': 'denied',
  'ad_personalization': 'denied',
  'analytics_storage': 'denied',
  'wait_for_update': 500
});

// Your CMP triggers this when the user makes a choice
function updateConsent(consentChoices) {
  gtag('consent', 'update', {
    'ad_storage': consentChoices.marketing ? 'granted' : 'denied',
    'ad_user_data': consentChoices.marketing ? 'granted' : 'denied',
    'ad_personalization': consentChoices.marketing ? 'granted' : 'denied',
    'analytics_storage': consentChoices.analytics ? 'granted' : 'denied'
  });
}
If you’re using Google Tag Manager, the consent overview section shows you which tags respect consent signals and which don’t. Third-party tags (Meta Pixel, TikTok, LinkedIn) each have their own consent requirements. Don’t assume they work the same way as Google tags. They don’t.
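A minimal sketch of gating non-Google tags yourself, assuming your CMP hands you the user's choices. The category names and vendor list here are hypothetical, and each vendor's real consent API differs:

```javascript
// Third-party pixels don't read Google's consent signals, so you gate
// them in your own code. Category names are illustrative.
function thirdPartyTagsToFire(choices) {
  const fired = [];
  if (choices.marketing) {
    // advertising pixels typically require marketing consent
    fired.push('meta_pixel', 'tiktok_pixel', 'linkedin_insight');
  }
  if (choices.analytics) {
    // behavior-analytics tools usually map to the analytics category
    fired.push('hotjar');
  }
  return fired; // in a real setup, each entry would inject that vendor's script
}

thirdPartyTagsToFire({ marketing: false, analytics: true }); // → ['hotjar']
```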
One gotcha: wait_for_update is set to 500 milliseconds above. That’s how long Google tags will wait for the consent signal before proceeding with the default (denied) state. If your consent banner takes longer than 500ms to load and render, increase this value. I’ve seen banners on slow mobile connections take 2-3 seconds. Set it to 2000 to be safe, but know that it delays your tag firing.
Server-side tagging changes the math
If you’re running server-side Google Tag Manager (and you probably should be for other reasons), consent mode works differently. The browser sends a request to your server-side container, and the server decides what to forward to Google, Meta, and other vendors.
This matters because server-side tagging can model data itself before sending it. Some setups use first-party cookies set by the server (which aren’t blocked by browser privacy features) combined with consent-granted signals to maintain better user identity across sessions.
This doesn’t bypass consent requirements. You still need user consent to set identifying cookies. But it does mean your consented users get tracked more accurately, which improves the quality of the baseline that Google uses for modeling the non-consented group.
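Conceptually, the server-side container makes a routing decision per event. This sketch is my own simplification of that decision, not the actual server-side GTM API; the payload shape and vendor names are assumptions:

```javascript
// Conceptual sketch: the browser sends one request carrying the event
// plus the consent state, and the server decides what each vendor gets.
function routeEvent(event, consent) {
  const forwards = [];
  if (consent.analytics_storage === 'granted') {
    // full event, identifiers intact
    forwards.push({ vendor: 'GA4', payload: event });
  } else {
    // cookieless ping: strip identifiers, keep aggregate signals
    const { client_id, user_id, ...anonymous } = event;
    forwards.push({ vendor: 'GA4', payload: anonymous });
  }
  if (consent.ad_storage === 'granted' && consent.ad_user_data === 'granted') {
    forwards.push({ vendor: 'GoogleAds', payload: event });
  }
  return forwards;
}

const denied = routeEvent(
  { name: 'purchase', value: 89.9, client_id: 'abc123' },
  { analytics_storage: 'denied', ad_storage: 'denied', ad_user_data: 'denied' }
);
// → one GA4 ping with no client_id, nothing forwarded to Google Ads
```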
In my experience, properties with server-side tagging have 15-25% more accurate modeled data. It’s not a silver bullet, but if you’re already investing in measurement quality, it’s the logical next step.
Frequently asked questions
Q: What is Google Consent Mode v2 and is it required?
Consent Mode v2 lets Google tags adjust their behavior based on user consent choices. It became mandatory in March 2024 for anyone running Google ads in the EEA. Without it, remarketing audiences stop working and conversion data goes dark. It adds two new consent signals: ad_user_data and ad_personalization.
Q: What is the difference between basic and advanced consent mode?
Basic consent mode blocks all Google tags until the user consents, giving Google nothing to model from. Advanced consent mode sends cookieless pings even when users decline, with no personal identifiers or cookies. Advanced mode enables Google’s behavioral modeling, which fills data gaps using patterns from consented users.
Q: How accurate is Google’s behavioral modeling in consent mode?
For traffic volume and page-level metrics, modeling is typically within 10-15% of actual numbers verified against server logs. For conversion data, accuracy varies more widely and can be off by 5-30% depending on the month. The modeling struggles with low-volume events, seasonal shifts, and channels with disproportionately high opt-out rates.
Q: How much data do you lose with consent banners?
Opt-out rates vary by country: 55-65% in Germany, 50-60% in Austria, 45-55% in the Netherlands, 40-50% in France, and 25-35% in the UK. For a typical European e-commerce site, you lose direct measurement on roughly half your traffic. Advanced consent mode with behavioral modeling partially fills this gap.
Rethinking what “analytics” means now
Here’s where I get philosophical, so bear with me.
For twenty years, web analytics operated on the assumption that you could measure everything. Every visit, every click, every conversion. Google Analytics was basically a census of your website traffic.
That era is over. Consent mode, iOS privacy changes, ad blockers, Firefox and Safari blocking third-party cookies by default. We’ve moved from census to sampling. And not clean, statistical sampling. Messy, biased sampling where the people you can’t measure are systematically different from the people you can.
Privacy-conscious users tend to be higher income, more tech-savvy, and (in some verticals) higher converting. The data you’re missing isn’t random noise. It’s a specific demographic you’re undercounting.
I think the industry needs to stop pretending we can get back to census-level measurement. We can’t. The regulatory direction is clear, browser vendors are adding more restrictions every year, and users are increasingly aware of tracking.
Instead, we need to get comfortable with a different kind of analytics. One that blends observed data with statistical models. One where the absolute numbers matter less than the trends and comparisons. One where we triangulate between multiple data sources instead of trusting any single one.
Compare your GA4 revenue to your payment processor. Compare your GA4 sessions to your server logs or CDN analytics. Use Google Search Console for organic traffic counts since those don’t depend on cookies. Build a picture from multiple imperfect sources instead of depending on one “complete” source that isn’t actually complete anymore.
That’s harder than reading a single dashboard. It requires more statistical thinking and more skepticism. But it’s more honest. And honestly, the old way was never as accurate as we pretended it was. We just didn’t have a reason to question it.
Consent mode v2 is the tax you pay for doing business in a world that decided privacy matters. The tax is real. It costs you data quality, it costs you implementation effort, and it costs you certainty. But the answer isn’t to fight it. It’s to build measurement systems that work within the constraints.
Because those constraints aren’t going away.
Artem Reiter
Web Analytics Consultant