Marketing Audit: The 7-Step Framework I Run for Clients
A marketing audit is a structured review of every channel, asset, and dollar a business spends on growth. The goal isn’t to find a silver bullet. It’s to find the leaks, rank them by how much money they’re costing you right now, and plug the top three before you touch anything else.
Most audits fail because they try to grade everything at once. This one doesn’t. It runs in a fixed order across seven areas, each with a time budget and a decision at the end. You can run the whole thing in a week for a small business or stretch it to a month for a midmarket company with five channels and a sales team.
Step 1: Audit the goals, not just the metrics
Every audit starts here because every other step depends on it. If you don’t know what the business is trying to do, you can’t tell whether the marketing is working.
The first question I ask a founder is always the same. “What does a great quarter look like in dollars?” Not traffic, not MQLs, not engagement. Revenue. Then I back into the inputs.
The second question is what the goal system looks like. Most teams say they use OKRs but actually run on vibes and quarterly planning docs nobody opens. Ask to see the current OKRs, the last two quarters of results, and the scoring.
Checklist:
- Current revenue target for the year and the quarter
- CAC target (cost to acquire a customer) and whether the business actually tracks it
- Target LTV (lifetime value) and how it’s calculated
- Payback period expectation (most SaaS targets 12 months, ecomm targets 3-6)
- Marketing-sourced pipeline target versus sales-sourced
- Last quarter’s actual-vs-target deltas
Typical findings: No LTV model. CAC calculated without including salaries. Revenue targets set by the CEO without marketing input. Sales and marketing using different definitions of “lead.”
Red flags: Different numbers from different team members for the same metric. Targets that haven’t changed in three quarters. No one owns forecasting.
Tools: HubSpot revenue analytics, Salesforce reports, a shared Google Sheet that actually gets updated.
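The unit-economics checks in this step reduce to three formulas. A minimal sketch, with hypothetical figures standing in for the business's actuals, of fully loaded CAC, a simple churn-based LTV, and payback:

```python
# Hypothetical quarterly figures -- swap in the business's actuals.
ad_spend = 120_000          # paid media
salaries = 90_000           # fully loaded marketing salaries (often omitted; shouldn't be)
tools_and_agencies = 30_000
new_customers = 400

# Fully loaded CAC includes people and tools, not just media.
cac = (ad_spend + salaries + tools_and_agencies) / new_customers

# Simple LTV: monthly gross margin per customer divided by monthly churn rate.
monthly_margin = 85.0       # gross margin dollars per customer per month
monthly_churn = 0.03
ltv = monthly_margin / monthly_churn

# Payback: months of gross margin needed to recoup CAC.
payback_months = cac / monthly_margin

print(f"CAC ${cac:,.0f}  LTV ${ltv:,.0f}  LTV:CAC {ltv / cac:.1f}x  payback {payback_months:.1f} mo")
```

With these example numbers, CAC comes out to $600 and payback to about seven months, inside the 12-month SaaS target from the checklist. The point of the salary line is the typical finding above: CAC calculated on media spend alone understates the real number badly.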
Step 2: Audit the channels and channel mix
Now you know what the business is trying to do. Next question: where is the money going, and is any of it working?
Pull the last 12 months of marketing spend broken down by channel. Not the plan, the actuals. Then pull the attributed revenue or pipeline per channel. Calculate ROAS or CAC by channel. Most companies have never actually done this exercise in one place.
You’re looking for the 80/20. Almost always, one or two channels drive most of the wins and the rest are either noise or drag. If everything shows roughly equal output, the attribution model is broken.
Checklist:
- Spend by channel over the last 4 quarters
- Attributed pipeline or revenue by channel
- CAC by channel (or CPL if B2B)
- Conversion rate by channel from first touch to customer
- Channel-level seasonality (does December crater? Does January spike?)
- Opportunity channels nobody is running yet but competitors are
Typical findings: Meta Ads still running at a 2014 ROAS target and losing money every month. LinkedIn spend creeping up without tracking. SEO delivering 35% of pipeline while getting 5% of the budget. A newsletter with 12,000 subscribers getting one send a quarter.
Red flags: Any channel that hasn’t been reviewed in more than two quarters. Channels the CMO is emotionally attached to. Agency relationships older than 18 months that have never been rebid.
Tools: GA4 for channel-level attribution, SEMrush or Ahrefs for organic, Meta Ads Manager and Google Ads for paid, a spreadsheet pulling it all together.
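Pulling spend and attributed revenue into one place is spreadsheet work, but the same math in a few lines makes the 80/20 obvious. All figures below are hypothetical:

```python
# Hypothetical last-12-months actuals by channel -- replace with the real export.
channels = {
    # channel: (spend, attributed_revenue, new_customers)
    "google_ads": (180_000, 520_000, 310),
    "meta_ads":   (140_000, 150_000, 120),
    "seo":        (40_000,  600_000, 380),
    "linkedin":   (60_000,  70_000,  25),
}

rows = []
for name, (spend, revenue, customers) in channels.items():
    rows.append((name, spend, revenue / spend, spend / customers))  # ROAS, CAC

# Rank by ROAS to surface underfunded winners and overfunded losers.
for name, spend, roas, cac in sorted(rows, key=lambda r: -r[2]):
    print(f"{name:12s} spend ${spend:>8,}  ROAS {roas:4.1f}x  CAC ${cac:,.0f}")
```

In this made-up example, SEO returns 15x on 9% of the budget while Meta barely breaks even on a third of it, exactly the underfunded-winner pattern the step is hunting for.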
Step 3: Audit the content engine
Content is where audits go to die because there’s always too much of it and no one wants to be the person who says the blog is underperforming.
Start with the simple numbers. How many pages on the site? How many get traffic? How many get enough traffic to matter? In most content operations I’ve audited, fewer than 10% of published pages drive 80% of the traffic. The rest are either decaying or never worked.
Then look at depth. Is the content written by someone who knows the topic, or by someone filling a brief? Is there first-party data, original examples, specific numbers? Or is it a summary of what already ranks?
Checklist:
- Total indexed pages versus pages getting any organic traffic
- Top 20 pages by traffic and top 20 by conversion (rarely the same list)
- Average article age and publishing cadence
- Keyword cannibalization audit (multiple pages targeting the same query)
- Content decay list (pages losing traffic quarter over quarter)
- Editorial standards document, if one exists
- Content-to-revenue connection (which articles drive signups?)
Typical findings: 600 published articles, 47 driving traffic. No one on the team has read the old posts. The highest-converting article was written three years ago and hasn’t been updated. The newest content sounds like every other piece on the SERP.
Red flags: Publishing cadence prioritized over quality. Freelance briefs that don’t specify first-party angles. AI-generated drafts shipped without substantial human editing. Zero content-led signups.
Tools: Search Console for traffic data, Ahrefs for content gap analysis, a spreadsheet audit of the top 50 pages, GA4 for the conversion overlay.
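The decay list in the checklist is mechanical once you have two quarters of per-page data. A sketch, assuming a hypothetical export with previous-quarter and current-quarter clicks per page (adapt the keys to your actual Search Console export):

```python
def decaying_pages(rows, drop_threshold=0.25):
    """Flag pages whose organic clicks fell more than drop_threshold quarter over quarter.

    `rows` is an iterable of dicts with keys page, clicks_prev_q, clicks_this_q,
    e.g. csv.DictReader over a (hypothetical) per-page Search Console export.
    """
    flagged = []
    for row in rows:
        prev, curr = int(row["clicks_prev_q"]), int(row["clicks_this_q"])
        if prev > 0 and (prev - curr) / prev > drop_threshold:
            flagged.append((row["page"], prev, curr))
    # Biggest absolute losses first: those are the fastest refresh wins.
    return sorted(flagged, key=lambda r: r[1] - r[2], reverse=True)

sample = [
    {"page": "/guide-a", "clicks_prev_q": "1000", "clicks_this_q": "400"},
    {"page": "/guide-b", "clicks_prev_q": "200",  "clicks_this_q": "190"},
]
print(decaying_pages(sample))
```

Sorting by absolute loss rather than percentage keeps a page that fell from 1,000 to 400 clicks above one that fell from 10 to 2, which matches where refresh effort actually pays off.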
Step 4: Audit paid acquisition
Paid is where waste is easiest to find and hardest to cut, because someone is usually emotionally attached to the campaigns.
Run the audit at the account level first, then the campaign level. Check account settings before anything else. I’ve seen accounts where enhanced conversions weren’t enabled, offline conversions weren’t being uploaded, or the pixel was firing on the wrong URL.
Then look at campaign structure. Google Ads accounts should pair Performance Max with tightly themed search campaigns. Meta campaigns should be consolidated under Advantage+. LinkedIn campaigns need audience sizes above 50,000 or they starve.
Checklist:
- Account-level conversion tracking sanity check
- Campaign structure (too many campaigns is usually the problem)
- Negative keyword lists on Google Ads (most are embarrassingly short)
- Placement exclusions on Meta and Google Display
- Creative refresh cadence (weekly for Meta, monthly for LinkedIn)
- Landing page match (is the ad sending to the right page?)
- Bid strategy alignment with actual goal (tCPA, tROAS, max conversions)
Typical findings: 40% of Google Ads spend on one phrase match keyword that’s actually triggering on competitor queries. Meta creative hasn’t rotated in 90 days and frequency is at 6.2. LinkedIn targeting includes “business owner” as a job title (matches 200M people, none of them ICP).
Red flags: Anyone on the team says “the agency handles that.” Campaigns older than 12 months without a major refresh. Brand search spend that isn’t broken out separately.
Tools: Google Ads Editor for bulk audits, Meta Ads Manager breakdowns, LinkedIn Campaign Manager, Triple Whale or Northbeam for post-iOS 14 attribution on DTC.
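The creative refresh cadences above (weekly for Meta, monthly for LinkedIn) are easy to check against an export of ad names and days since last refresh. A sketch with hypothetical ad data:

```python
# Target refresh cadence in days, per the checklist above.
CADENCE_DAYS = {"meta": 7, "linkedin": 30}

def stale_creatives(creatives):
    """creatives: list of (platform, ad_name, days_since_refresh) tuples.

    Returns the ads running past their platform's refresh cadence.
    Unknown platforms default to a 30-day cadence.
    """
    return [
        (platform, name, days)
        for platform, name, days in creatives
        if days > CADENCE_DAYS.get(platform, 30)
    ]

# Hypothetical export -- ad names and figures are illustrative only.
ads = [
    ("meta", "spring-promo-v3", 92),   # the 90-day stale creative from the findings
    ("meta", "ugc-test-1", 5),
    ("linkedin", "abm-tier1", 18),
]
print(stale_creatives(ads))
```

Run against a real export, the output is the refresh worklist for the next creative sprint, and an empty list is the pass condition for this checklist item.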
Step 5: Audit conversion and the funnel
Traffic and leads don’t matter if they don’t convert. This step is where most teams find their fastest wins.
Start with the homepage and the top three landing pages by traffic. Run a 5-second test (show the page for 5 seconds, ask what the business does). Then check mobile versus desktop conversion rates. If mobile is more than 30% lower than desktop, the mobile experience is broken.
Then look at the signup or checkout flow. How many steps? How many fields? How many chances to abandon? Where do people actually drop off?
Checklist:
- Homepage conversion rate (baseline for everything else)
- Top landing pages by traffic and their conversion rates
- Mobile-vs-desktop conversion gap
- Form completion rate and field-level drop-off
- Checkout abandonment rate (ecomm) or signup abandonment (SaaS)
- Session recordings of the top three conversion paths
- Heatmaps of the primary CTA areas
- A/B test history (has anything shipped in the last 6 months?)
Typical findings: Mobile conversion at 0.8% while desktop is 4.2%. Form has 11 fields and 7 of them are optional but nobody marked them optional. Checkout loads a third-party script that adds 1.4 seconds to LCP. Zero active A/B tests.
Red flags: No experimentation program. No one owns CRO. The design team ships changes without telling marketing. Hotjar installed but nobody watches the recordings.
Tools: Hotjar or Microsoft Clarity for heatmaps and recordings, GA4 funnel exploration, PageSpeed Insights for speed, VWO or Optimizely for tests, PostHog if the team is technical.
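The 30% mobile rule from this step is a one-line calculation. A sketch using the example rates from the findings above:

```python
def conversion_gap(desktop_rate, mobile_rate, threshold=0.30):
    """Return the relative mobile shortfall and whether it crosses the
    30% broken-experience threshold described above."""
    gap = (desktop_rate - mobile_rate) / desktop_rate
    return gap, gap > threshold

# The 4.2% desktop vs 0.8% mobile example from the typical findings.
gap, broken = conversion_gap(0.042, 0.008)
print(f"mobile trails desktop by {gap:.0%}, broken: {broken}")
```

A mobile rate of 0.8% against 4.2% on desktop is an 81% relative shortfall, far past the 30% line, which is why that finding jumps straight to the top of the action list.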
Step 6: Audit analytics and measurement
If the analytics are broken, every other audit finding is suspect.
The first check is sanity. Does GA4 match Search Console? Do the ad platforms match GA4? Does the CRM match the billing system? These sources almost never agree. The question is whether the gap is 3% (normal) or 40% (broken).
Then check events. Most GA4 implementations I audit have 40 events firing and only 6 of them are actually useful. The rest are noise.
Checklist:
- GA4 versus Search Console traffic reconciliation
- Ad platform conversions versus GA4 conversions
- CRM contact counts versus GA4 user counts
- Key events configured and marked as conversions
- UTM parameter hygiene (any direct/none traffic that should be tagged?)
- Consent mode v2 configured for EU traffic
- Server-side tagging in place if post-iOS 14 is material
- Data retention settings (GA4 defaults to 2 months of event data; standard properties cap out at 14)
Typical findings: GA4 shows 28% fewer organic sessions than Search Console shows clicks, because the consent banner blocks some tracking. Meta reports 3x more conversions than GA4 because of attribution windows. No server-side tagging, so ad platforms are guessing.
Red flags: Different teams reporting different numbers in the same meeting. No written measurement plan. Events that fire but nobody uses in a report.
Tools: Google Tag Assistant for debugging, GA4 DebugView, Search Console for the source-of-truth organic number, Segment or RudderStack for CDP if the stack is complex.
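The 3%-normal versus 40%-broken rule of thumb can be applied mechanically to any pair of sources. A sketch with hypothetical monthly organic counts:

```python
def reconcile(count_a, count_b, normal=0.03, broken=0.40):
    """Classify the disagreement between two measurement sources.

    Thresholds mirror the rule of thumb above: ~3% is normal noise,
    40%+ means something is structurally broken; in between, investigate.
    """
    gap = abs(count_a - count_b) / max(count_a, count_b)
    if gap <= normal:
        return gap, "normal"
    if gap >= broken:
        return gap, "broken"
    return gap, "investigate"

# Hypothetical monthly numbers: GA4 organic sessions vs Search Console clicks.
gap, status = reconcile(41_200, 52_900)
print(f"gap {gap:.0%} -> {status}")
```

Running every pairing in the checklist through the same classifier turns the reconciliation step into a short table of "normal / investigate / broken" verdicts instead of an argument about whose dashboard is right.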
Step 7: Audit the competitive landscape
This one comes last because it only matters in context. Competitive analysis without a baseline is just envy.
Pick the three competitors the sales team mentions most. Not the ones that show up in Gartner reports, the ones that actually come up on sales calls. Then pull their organic keyword overlap, their paid ad history, their content publishing cadence, and their pricing changes.
You’re looking for two things. One, where they’re outspending or outproducing you. Two, where you have an angle they don’t.
Checklist:
- Organic keyword overlap with top 3 competitors (Ahrefs Content Gap works well)
- Estimated paid spend by competitor (SimilarWeb, SEMrush)
- Content publishing cadence comparison
- Share of voice for the top 20 category keywords
- Competitor pricing and packaging changes in the last 12 months
- Review sentiment analysis (G2, Capterra for SaaS; Trustpilot, Reddit for DTC)
- Feature parity or gap sheet (for product-led categories)
Typical findings: Competitor A publishes three times a week to your once. Competitor B has quietly built 400 comparison pages while you have none. Competitor C raised prices 20% six months ago and nobody noticed.
Red flags: You haven’t looked at competitor pricing pages in six months. The sales team can’t name the top objection for each competitor. No alerts on competitor brand mentions.
Tools: Ahrefs or Semrush for organic and backlinks, SimilarWeb for traffic estimates, G2 and Capterra for reviews, Visualping for pricing page change detection, Reddit search for unfiltered sentiment.
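Share of voice for the top category keywords reduces to counting top-3 appearances. A sketch over hypothetical rankings data (in practice, export positions from Ahrefs or Semrush):

```python
from collections import Counter

def share_of_voice(rankings, top_n=3):
    """Crude share of voice: the fraction of category keywords where each
    domain holds a top-`top_n` organic position.

    `rankings` maps keyword -> ordered list of ranking domains.
    """
    counts = Counter()
    for keyword, domains in rankings.items():
        for domain in set(domains[:top_n]):
            counts[domain] += 1
    total = len(rankings)
    return {domain: n / total for domain, n in counts.most_common()}

# Hypothetical category keywords and top-3 domains -- illustrative only.
rankings = {
    "crm software":     ["competitor-a.com", "you.com", "competitor-b.com"],
    "best crm for smb": ["competitor-a.com", "competitor-b.com", "competitor-c.com"],
    "crm pricing":      ["competitor-b.com", "competitor-a.com", "you.com"],
    "crm comparison":   ["competitor-a.com", "competitor-c.com", "competitor-b.com"],
}
print(share_of_voice(rankings))
```

Run over the real top 20 category keywords, the output is the share-of-voice line in the checklist, and the keywords where you are absent from the top 3 are the content gap list for step 3.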
Summary: the 7 steps and their priority
| Step | Area | Typical time | Highest-leverage findings |
|---|---|---|---|
| 1 | Goals and metrics | 2-4 hours | Broken LTV math, wrong CAC definition |
| 2 | Channel mix | 1-2 days | Underfunded winners, overfunded losers |
| 3 | Content engine | 2-3 days | Decay, cannibalization, thin libraries |
| 4 | Paid acquisition | 1-2 days | Tracking gaps, stale creative, bid strategy errors |
| 5 | Funnel and CRO | 1-2 days | Mobile gap, form friction, dead testing |
| 6 | Analytics | 1 day | Source-of-truth disagreements, event chaos |
| 7 | Competition | 1 day | Share-of-voice gaps, missed pricing moves |
Who should run the audit: internal or external?
Internal audits are cheaper but rarely honest. The person running the audit probably built the thing being audited. That’s a conflict even with the best intentions. Internal audits work when you have a senior marketer joining who wasn’t involved in the last two years of decisions.
External audits cost more, usually $5,000 to $25,000 for a small-to-midmarket company, but they surface the politically awkward findings an internal reviewer won’t name. The right external auditor is one who will refuse the engagement if you won’t act on findings. The wrong one sells you a 60-slide PDF that sits in Drive forever.
My rule: run a light internal audit quarterly against this same checklist. Bring in an external every 18-24 months or after any major leadership change. Not both at the same time.
The deliverable that actually gets acted on
A marketing audit is only useful if it changes what the team does next Monday. The deliverable isn’t a deck. It’s a prioritized list of 5-7 initiatives, each with an owner, a budget, a deadline, and an expected outcome.
Anything more than 7 initiatives won't get done. Anything fewer than 5 and you're playing it safe.
The format I use: one page, seven rows, four columns (finding, action, owner, outcome). Not 47 recommendations across six workstreams. The goal is behavior change, not thoroughness theater.
Most audits fail not because the findings are wrong, but because the findings are too many. Ruthless prioritization is the actual skill. If you can do that part, even a mediocre audit produces real revenue. If you can’t, even a perfect audit produces a binder.
FAQs
How long does a marketing audit take?
A full audit across all seven steps takes 1-4 weeks depending on company size. Small businesses with one or two channels can be audited in a week. Midmarket companies with paid, content, email, and partnerships usually need three to four weeks, spread across the audit team and stakeholder interviews.
How much does a marketing audit cost?
External audits for small-to-midmarket companies range from $5,000 to $25,000. Enterprise audits from the larger consultancies run $50,000+. Internal audits cost staff time, usually 40-80 hours across the marketing team. The real cost is the actions the audit surfaces, which can run five to seven figures to implement.
How often should I run a marketing audit?
Run a light internal audit every quarter using the same seven-step framework. Run a deeper external audit every 18-24 months or after major events like a leadership change, a pivot, a funding round, or a bad quarter. More frequent than quarterly and you’re auditing instead of executing.
Should the CMO run the audit themselves?
Not if the CMO has been in the role for more than six months. A sitting CMO auditing their own function has too many conflicts. Either bring in an external auditor, or have a senior marketer from another function run the review. The one exception is a new CMO in their first 90 days, where the audit doubles as an onboarding exercise.
What tools do I need for a marketing audit?
Minimum stack: GA4 or similar analytics, Search Console, Ahrefs or Semrush, access to every ad platform, the CRM, and a spreadsheet tool. Optional but useful: Hotjar or Microsoft Clarity for CRO, SimilarWeb for competitive estimates, and a CDP like Segment if the data stack is complex.
What’s the difference between a marketing audit and a content audit?
A marketing audit covers all seven areas of the function. A content audit is one slice of that audit, focused only on published articles, pages, and assets. If you only need a content audit, run step 3 of this framework and skip the rest. If you need everything, run all seven.
What’s the single most valuable finding in most audits?
The channel mix step almost always surfaces the biggest dollars. Companies chronically underfund the channel that’s working and overfund the one that used to work. Reallocating spend based on actual CAC per channel, not historical habit, is the single fastest ROI improvement in most audits.
How do I get executive buy-in to act on the audit findings?
Lead with revenue impact, not marketing metrics. Every finding should translate to projected pipeline, CAC, or LTV change. Then prioritize to 5-7 actions with owners and deadlines, not 30 recommendations. Executives act on focused lists with clear accountability, not 60-slide decks.
The last thing to remember: an audit isn’t a verdict on the team’s work. It’s a snapshot of a moving system. Markets shift, channels saturate, and what worked 18 months ago stops working without anyone noticing. Running this framework on a cadence is how you catch that before the board meeting does.