GA4 · Last updated: April 28, 2026

The analytics audit checklist I use on every new client

The exact 30-point checklist I run through before touching anything. Takes 2 hours. Saves weeks of guessing.

Every new client engagement starts the same way. Before I write a single line of GTM code, before I open Looker Studio, before I make any recommendations, I run an audit. Every single time.

I’ve been doing this long enough that I can finish it in about two hours. Early on, it took a full day. The checklist has evolved over maybe 80 or 90 client engagements. Things get added when I find a new way tracking can be broken. Things get removed when they stop being relevant (goodbye, Universal Analytics checks).

Here’s the whole thing. I’m giving it away because honestly, the value isn’t in the checklist. It’s in knowing what to do with the findings.

Why I audit before doing anything else

I learned this the hard way. Years ago, a client hired me to set up conversion tracking for their lead gen forms. Straightforward job, right? I set everything up, launched it, and the numbers looked wrong immediately. Turns out their GA4 property was misconfigured from the migration, collecting data from three different domains with no cross-domain tracking, their GTM container had duplicate tags firing, and they had two GA4 measurement IDs sending data to the same property.

I spent two weeks untangling that mess. If I’d spent two hours auditing first, I’d have known about the problems before building on top of them.

Now I audit first. Always. Even when the client says “our tracking is fine, we just need X.” Especially when they say that.

The checklist

GA4 Configuration (8 items)

1. Property settings review. Check the data stream configuration. Is the measurement ID correct? Is enhanced measurement enabled? Which enhanced measurement events are active? I’ve seen properties where enhanced measurement was toggled off because someone was “testing something” six months ago and forgot to turn it back on.

2. Data retention settings. GA4 defaults to 2 months for event data. Two months. That means your Explorations reports only go back 60 days. Most clients don’t know this. Change it to 14 months immediately. There’s no reason not to.

3. Google Signals. Is it enabled? Should it be? Google Signals enables cross-device reporting and remarketing audiences, but it also applies data thresholding, which means GA4 will hide data from reports when the user count is too low. For small sites, this can make reports nearly useless. I’ve turned off Google Signals for clients who couldn’t figure out why their data kept showing “(not set).”

4. Internal traffic filters. Is your office IP filtered out? What about remote workers? VPN traffic? I check the filter definitions and verify they’re actually active (not just defined but sitting in “Testing” status forever). On at least a third of audits, I find internal traffic filters that were created but never activated.

5. Unwanted referrals. Check the referral exclusion list. Payment gateways like Stripe, PayPal, and Klarna should be excluded. So should any third-party authentication providers. If these aren’t excluded, you’ll see inflated session counts and broken attribution every time a user gets redirected through a payment flow.

6. Cross-domain tracking. If the client has multiple domains (main site, shop subdomain, booking system on a different domain), is cross-domain tracking configured? I test this by actually clicking through the user journey and checking if the _gl parameter passes correctly. Automated checks miss this more often than you’d expect.
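The manual click-through above can be backed up with a quick console check. This is a minimal sketch (the helper name `extractGlParam` is my own, not a Google API): it just reads the `_gl` linker parameter from a decorated destination URL.

```javascript
// Console sketch: check whether a cross-domain link was decorated with
// the GA4 linker parameter. Returns null if _gl is missing.
function extractGlParam(url) {
  return new URL(url).searchParams.get('_gl');
}
```

Paste it into the console on the destination page and pass in `location.href`; `null` means the linker never decorated the link and cross-domain tracking is broken for that journey.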

7. User-ID implementation. If the site has logged-in users, is User-ID set up? Is it sending a hashed or anonymized identifier (not a raw email address, which I’ve seen more than once)? Is the reporting identity set to “Blended” or “Observed”?

8. BigQuery linking. Is the GA4 property linked to BigQuery? If the client has any plans for advanced analysis, this needs to be set up now. The export isn’t retroactive. Every day without it is a day of raw data you’ll never get back.

GTM Health (6 items)

9. Container organization. How many tags are in the container? Are they organized with naming conventions? I use a simple format: Platform - Type - Detail (like “GA4 - Event - Form Submit” or “Meta - Pixel - PageView”). When I open a container with 85 tags named “Tag 1”, “test”, “new tag copy”, and “DO NOT DELETE”, I know we have problems.

10. Tag firing frequency. Open GTM preview mode and navigate through the site. Are tags firing on every page where they should? Is any tag firing more than once per page? Duplicate firing is one of the most common GTM mistakes I find. It inflates your data and often points to trigger configuration problems.
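To spot duplicate firing without clicking through preview mode event by event, you can tally the event pushes already sitting in the dataLayer. A minimal sketch (`countEvents` is my own helper name; it assumes plain object pushes with an `event` key):

```javascript
// Console sketch: count how many times each event name was pushed to
// dataLayer on the current page. A count above 1 for a one-off event
// like "purchase" usually means a trigger is firing twice.
function countEvents(dataLayer) {
  const counts = {};
  for (const push of dataLayer) {
    if (push && push.event) counts[push.event] = (counts[push.event] || 0) + 1;
  }
  return counts;
}
```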

11. Trigger accuracy. Are triggers using the right conditions? I’ve found click triggers using “Click URL contains /contact” that also fire on “/contact-us”, “/contact-success”, and every other URL with “contact” in it. Check the specificity of your trigger conditions.
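The difference between a "contains" condition and an exact-path check is easy to demonstrate. A sketch (the two helpers are illustrations of the trigger logic, not GTM APIs):

```javascript
// "Click URL contains /contact" — too broad:
const containsContact = (url) => url.includes('/contact');

// Exact pathname match — only fires on the page you meant:
const exactContact = (url) => new URL(url).pathname === '/contact';
```

The "contains" version matches `/contact-us` and `/contact-success` as well; in GTM, the equivalent fix is switching the trigger condition to "equals" on Page Path, or tightening the pattern.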

12. Variable configuration. Are data layer variables actually reading from the data layer? Is the data layer populated correctly? I test this in the browser console: dataLayer shows you everything that’s been pushed. Compare what’s there with what GTM is trying to read. Mismatches between variable names in GTM and data layer key names are incredibly common.
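Comparing GTM variable names with what was actually pushed can be scripted. A minimal console sketch, assuming plain object pushes (gtag-style `arguments` pushes need separate handling) and your own `expectedKeys` list:

```javascript
// Flatten every dataLayer push into one object, then report which keys
// GTM expects but the data layer never provided.
function findMissingKeys(dataLayer, expectedKeys) {
  const seen = Object.assign({}, ...dataLayer);
  return expectedKeys.filter((key) => !(key in seen));
}
```

Any key this returns corresponds to a GTM data layer variable that will resolve to `undefined`, which is exactly the mismatch described above.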

13. Tag sequencing and dependencies. Are tags that depend on other tags using tag sequencing? The classic case: a conversion tag that needs the GA4 config tag to fire first. Without sequencing, race conditions mean the conversion tag sometimes fires before GA4 is ready, silently dropping events.

14. Consent mode integration. Is GTM integrated with a consent management platform? Are tags configured to respect consent signals? I check that tags are set to fire based on consent status and that the CMP is actually sending the right consent signals to GTM’s consent API.

Tracking Accuracy (6 items)

15. Pageview completeness. Compare GA4 sessions with server logs or another analytics tool for the same period. A 10-15% discrepancy is normal (ad blockers, consent refusals). More than 25% means something is broken. Either GA4 isn’t loading on some pages, or consent mode is blocking more traffic than expected.

16. Event validation. Go through each custom event and verify it fires correctly. Submit the actual form. Complete the actual purchase flow. Click the actual button. Don’t just check the GTM preview. Check that the event shows up in GA4 DebugView with the correct parameters.

17. Conversion tracking accuracy. Compare GA4 conversion counts with source-of-truth data. Form submissions in GA4 vs. form submissions in your CRM. Purchases in GA4 vs. purchases in Shopify. Revenue in GA4 vs. revenue in your payment processor. Note the discrepancy percentage. Under 5% is excellent. Under 15% is acceptable. Over 15% needs investigation.
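Those thresholds can be captured in a tiny helper so the same rule gets applied to every conversion type in the audit spreadsheet. A sketch (`conversionDiscrepancy` is my own name, and the cutoffs simply mirror the rule of thumb above):

```javascript
// Discrepancy between GA4 and the source-of-truth count, as a
// percentage of the source of truth, plus a verdict tier.
function conversionDiscrepancy(ga4Count, sourceCount) {
  const pct = Math.abs(sourceCount - ga4Count) / sourceCount * 100;
  if (pct < 5) return { pct, verdict: 'excellent' };
  if (pct < 15) return { pct, verdict: 'acceptable' };
  return { pct, verdict: 'investigate' };
}
```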

18. Parameter quality. Check event parameters for data quality issues. Are there null values where there shouldn’t be? Are currency values in the right format? Are product names consistent or do you have “Blue Widget”, “blue widget”, and “Blue Widget” (extra space) as three different products?
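A normalization pass makes the duplicate-product problem obvious. A sketch (`normalizeName` is a hypothetical helper; trimming, collapsing whitespace, and lowercasing is one reasonable convention, not the only one):

```javascript
// Collapse casing and whitespace so "Blue Widget", "blue widget" and
// "Blue  Widget" all map to one key when checking for duplicates.
function normalizeName(name) {
  return name.trim().replace(/\s+/g, ' ').toLowerCase();
}
```

Run every `item_name` through this and count the distinct values before and after; the gap between the two counts is your duplicate-product problem in one number.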

19. Session and user counting. Check if session counts make sense. A sudden spike in sessions with zero engagement usually means bot traffic. A suspiciously round number of users might indicate a caching issue where everyone gets the same client ID. Look at the user-by-browser report and check if one browser type dominates unrealistically.

20. Cross-device and cross-platform consistency. If the client has a mobile app and website, check that events are named consistently. An “add_to_cart” on web should match “add_to_cart” in the app, not “addToCart” or “cart_add.” Inconsistent naming makes cross-platform reporting impossible.
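Checking naming consistency across platforms boils down to diffing each platform's event list against a canonical measurement plan. A sketch, where the canonical list is an assumption you replace with your own plan:

```javascript
// Report event names that drift from the canonical taxonomy shared by
// web and app. Anything returned here breaks cross-platform reporting.
function findDrift(eventNames, canonical) {
  return eventNames.filter((name) => !canonical.includes(name));
}
```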

Reporting Setup (5 items)

21. Custom dimensions and metrics. Are custom dimensions configured for the business-specific data that matters? Things like customer type (new vs. returning), membership tier, content category, or experiment variant. GA4 gives you 50 custom event-scoped dimensions. Most clients use two or three. That’s leaving insights on the table.

22. Audience definitions. Check what audiences are configured. Are they being used for Google Ads remarketing? Are there audiences for key segments (high-value customers, cart abandoners, engaged users)? I often find audiences that were created for a specific campaign months ago and are now irrelevant, cluttering the interface.

23. Dashboard functionality. Open every Looker Studio report connected to this GA4 property. Do the reports load? Are the date ranges sensible? Do filters work? Are there broken charts showing errors? I’ve found “active” dashboards where half the charts showed errors because someone renamed a custom dimension in GA4 without updating the reports.

24. Exploration reports. Check if the team is using GA4 Explorations or just the standard reports. Standard reports answer generic questions. Explorations answer specific business questions. If nobody’s building Explorations, they’re probably not getting enough value from GA4. This is a training issue, not a technical one.

25. Alerts and anomaly detection. Are custom alerts set up for critical metrics? At minimum, I want alerts for: sessions dropping more than 30% day-over-day (tracking might be broken), conversion rate dropping below a threshold (site issue or tracking issue), and revenue anomalies. GA4’s built-in insights catch some of this, but custom alerts are more reliable.
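The day-over-day session check is trivial to express, which is part of why there's no excuse to skip it. A sketch of the alert condition (the 30% default matches the threshold above; GA4's own custom insights can encode the same rule in the UI):

```javascript
// True when sessions dropped more than thresholdPct versus yesterday —
// the "tracking might be broken" alert condition.
function sessionsDropAlert(today, yesterday, thresholdPct = 30) {
  if (yesterday === 0) return false; // no baseline to compare against
  return (yesterday - today) / yesterday * 100 > thresholdPct;
}
```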

Privacy and Compliance (5 items)

26. Consent banner implementation. Does the consent banner actually work? I click “reject all” and then check if GA4 tags still fire with full measurement capabilities. You’d be surprised how often “reject” doesn’t actually stop tracking. This is a legal liability.

27. Consent mode v2. Is advanced consent mode implemented? Are the ad_storage, analytics_storage, ad_user_data, and ad_personalization parameters being set correctly based on user consent? Check the network requests in the browser developer tools. The gcs parameter in GA4 requests tells you what consent state was active.
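For reference, the four parameters map directly onto the standard consent mode calls. A minimal sketch of the expected setup (the real snippet lives in the page head or the CMP template; `globalThis` is used here only so the sketch also runs outside a browser, where it would be `window`):

```javascript
// gtag is the standard wrapper that pushes its arguments onto dataLayer.
globalThis.dataLayer = globalThis.dataLayer || [];
function gtag() { dataLayer.push(arguments); }

// Default state: everything denied until the CMP reports a choice.
gtag('consent', 'default', {
  ad_storage: 'denied',
  analytics_storage: 'denied',
  ad_user_data: 'denied',
  ad_personalization: 'denied',
});

// Later, after the user grants analytics consent in the CMP:
gtag('consent', 'update', { analytics_storage: 'granted' });
```

If the `default` call is missing or fires after the tags load, the gcs parameter in the network requests will show it; that ordering problem is one of the most common consent mode v2 findings.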

28. Data Processing Agreement. Is there a DPA in place with Google? For EU-based businesses, this is required under GDPR. It takes five minutes to accept in the GA4 admin panel. I find it missing about 40% of the time.

29. Data deletion capabilities. Can the client fulfill a GDPR deletion request? GA4 has a user deletion API and a deletion request feature in the admin panel. Does the team know it exists? Have they ever used it? Can they match a customer complaint to a GA4 user ID?

30. Cookie policy accuracy. Does the privacy policy accurately describe what’s being tracked? If the policy says “we use Google Analytics” but the site also has Meta Pixel, TikTok Pixel, LinkedIn Insight Tag, and Hotjar, the policy is incomplete. I compare the tag list in GTM with what’s disclosed in the cookie policy.

Common findings and what they mean

After doing this enough times, patterns emerge.

Data retention at 2 months shows up in about 60% of audits. It’s the easiest fix and usually the first thing I change.

Duplicate tag firing appears in roughly a third of audits. Usually caused by GTM triggers that are too broad or by multiple analytics implementations (someone added GA4 directly to the site AND through GTM).

Missing internal traffic filters appear in about a third of audits. The impact depends on how much internal traffic there is relative to real traffic. For a small B2B site where the team visits 50 times a day, this can skew data significantly.

Broken consent implementation is the finding that scares clients the most, and rightly so. It appears in about 25% of audits. The fix is usually straightforward but the legal exposure from months of non-compliant tracking is real.

No BigQuery export is almost universal among clients who haven’t worked with an analytics consultant before. Setting it up takes ten minutes and costs pennies for most sites. I set it up immediately.

How to prioritize fixes

Not all findings are equal. I categorize everything into three tiers.

Critical (fix this week). Anything that means your data is wrong right now. Duplicate tags, broken consent, missing conversions, cross-domain tracking failures. These undermine every decision you make using the data.

Important (fix this month). Things that limit your capabilities but don’t make current data wrong. Missing BigQuery export, insufficient custom dimensions, no alerts, data retention at 2 months.

Nice-to-have (fix this quarter). Optimization and organization items. Tag naming conventions, audience cleanup, exploration templates, documentation.

I deliver the audit as a spreadsheet with these three tiers, plus an estimated time for each fix. Clients can see exactly what’s broken, how bad it is, and how long it takes to fix. No ambiguity, no padding, no mystery.

The audit isn’t glamorous work. Nobody gets excited about checking data retention settings. But it’s the single most valuable thing I do for clients, because everything built on bad data is wrong, and you can’t fix what you haven’t diagnosed.

Artem Reiter

Web Analytics Consultant

Need help with your analytics?

Free 30-minute discovery call. I'll look at your setup, tell you what's broken, and whether I can help. No commitment.

Or email directly: artem@reiterweb.com