Why Marketing Data Should Work Like Banking Records, Not Polaroids
Your CMO walks into your office: "Podcast ads were killing it last quarter. Same budget, same shows this quarter—conversions down 40%. What changed?"
You open your attribution dashboard. It shows current performance. A number. Just a number.
The context that would actually answer the question—which episodes worked, what messaging resonated, how listener behavior shifted—was overwritten weeks ago. You're about to spend $150,000 experimenting to rediscover what you already knew three months ago.
This is the $260,000 problem most mid-market companies don't realize they have.
What Banks Figured Out Centuries Ago
Your bank shows a balance of $4,237. But that number is meaningless alone. What matters is the complete history: the $3,000 paycheck, the $892 rent payment, the $65 dinner charge. Without that transaction log, you can't dispute fraud, prove payment, or understand spending patterns.
Banks learned this lesson the expensive way.
When Wells Fargo employees opened 1.5 million fraudulent accounts between 2011 and 2015, investigators were able to impose $3 billion in penalties and identify every victim only by analyzing complete transaction histories. They found suspicious patterns—account openings clustered around employee terminations, unauthorized fees, forged signatures—that were invisible in aggregate account balances.
Federal regulators now require 5-7 year retention of all transaction records, with criminal penalties of up to 20 years' imprisonment for destroying them. This isn't bureaucracy. It's because complete transaction history enables the fraud detection, dispute resolution, and pattern analysis that snapshots cannot provide.
Banks use double-entry ledgers where transactions never get deleted. Corrections happen through reversals that preserve complete audit trails—showing the original transaction, the mistake, and the fix. As Modern Treasury explains, "It's like removing the backspace key: mistakes must be corrected with compensating entries that preserve the complete story."
Marketing should work the same way: Store every touchpoint as an immutable event. Calculate attribution by querying that complete history, not by overwriting snapshots.
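Here's a minimal sketch of that pattern in Python, with a plain list standing in for the event store and invented field names (an illustration, not a production design): touches are appended, mistakes are fixed with compensating reversal events, and attribution is a query over the log.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass(frozen=True)             # frozen: an event can never be edited after the fact
class TouchpointEvent:
    user_id: str
    channel: str                    # e.g. "podcast", "paid_search", "email"
    event_type: str                 # "touch" or "reversal"
    occurred_at: datetime
    reverses: Optional[int] = None  # index of the event this correction voids

event_log: list[TouchpointEvent] = []   # append-only: no updates, no deletes

def record_touch(user_id: str, channel: str) -> None:
    event_log.append(TouchpointEvent(user_id, channel, "touch",
                                     datetime.now(timezone.utc)))

def record_reversal(bad_event_index: int) -> None:
    """Mistakes get compensating entries, ledger-style, never silent edits."""
    bad = event_log[bad_event_index]
    event_log.append(TouchpointEvent(bad.user_id, bad.channel, "reversal",
                                     datetime.now(timezone.utc),
                                     reverses=bad_event_index))

def last_touch_attribution(user_id: str) -> Optional[str]:
    """Attribution is a query over the full history, not a stored field."""
    voided = {e.reverses for e in event_log if e.event_type == "reversal"}
    touches = [(i, e) for i, e in enumerate(event_log)
               if e.user_id == user_id and e.event_type == "touch" and i not in voided]
    return touches[-1][1].channel if touches else None

record_touch("u42", "podcast")
record_touch("u42", "paid_search")
print(last_touch_attribution("u42"))   # "paid_search" -- and the podcast touch still exists
```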
Most companies do exactly the opposite.
Why Your SaaS Tools Are Built This Way (Hint: It's Not an Accident)
Only 18% of marketers are very confident in their attribution data. The problem isn't the people—it's that Salesforce and HubSpot were architecturally designed to store results, not the events that created them.
This isn't incompetence. It's the business model.
Salesforce's default attribution assigns 100% to a single "primary campaign source"—problematic for any business with multi-touch journeys or offline touchpoints. The system can't track cross-device behavior, can't do multi-channel attribution natively, and, critically, it cannot retroactively apply different attribution models because it doesn't store the underlying events.
When Google Analytics documentation states "changes you make in GA4 will not retroactively adjust your past data," that's not a limitation—it's the architecture. You can't re-calculate what you didn't record.
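The difference fits in a few lines. A snapshot field overwrites itself; an event log keeps everything, so last quarter's question is still answerable. This is a toy illustration with invented field names, not any vendor's actual schema:

```python
# Snapshot-style: each new touch overwrites the last one.
lead_snapshot = {"lead_id": "L-001", "source": "Paid Search"}
lead_snapshot["source"] = "Podcast"        # the Paid Search touch is gone for good

# Event-style: each touch is appended, nothing is lost.
lead_events = [
    {"lead_id": "L-001", "channel": "Paid Search", "ts": "2025-01-03T10:15:00Z"},
    {"lead_id": "L-001", "channel": "Podcast",     "ts": "2025-02-11T08:40:00Z"},
]

# Months later, you can still answer either question:
first_touch = lead_events[0]["channel"]    # "Paid Search"
last_touch = lead_events[-1]["channel"]    # "Podcast"
```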
HubSpot users face similar constraints: 90% of API-created contacts default to "offline sources" despite proper tracking, lists larger than 10,000 contacts aren't available for attribution, and attribution stops at contact creation rather than continuing through the full customer journey. One B2B marketer described being "crippled" by this limitation.
The vendor lock-in is the point. Attribution tools become more valuable as historical data accumulates. Switching costs increase over time. Export your data? Sure—but you'll lose the relationships, the journey sequences, the patterns that make it valuable.
When you change tools, you face stark choices: pay significant migration costs, tolerate 24 hours of downtime, or accept that "all historical engagement data in data views and tracking reports will be lost."
Companies spending $1 million annually on marketing with typical attribution problems waste approximately $260,000—26% of budget—on channels they can't properly measure.
The $10M Data Blackout
At a previous enterprise B2B company, I managed a custom demand funnel tracking system on Salesforce - thousands of leads daily, thousands of campaigns quarterly, nine figures in annual marketing spend.
May 2024: IT made a routine operational change to a Salesforce filter. Nothing to do with attribution. Within hours, attribution went dark.
The damage: 1,200 leads, 1,700 contacts, 1,300 opportunities, 4,000 funnel records, $10M in early pipeline - all missing attribution. Board reporting stopped. Budget decisions stopped.
Recovery took three weeks of full-team firefighting because we stored snapshots, not events. We couldn't replay history - we had to manually reconstruct it.
The Real Cost: What You Never Build
The May crisis was dramatic. The daily reality was worse: constant vigilance over systems you don't control. Sales ops introduces a new opportunity type - your hardcoded logic breaks. IT modifies field dependencies - attribution fails. You maintain a list of "third rails" other teams can't touch.
Want to test a new attribution model? Need new fields. Fields need sprint planning, dev cycles, release coordination. Best case: two weeks. Typical case: months. You optimize for "what ships" over "what's right." Temporary fixes become permanent.
I never built account-based attribution. Not because it wasn't valuable - it was critical for enterprise sales. But the dev cycles and cross-team dependencies made it impossible to prioritize against keeping the current system alive.
At $100M+ spend, this isn't just technical debt. It's strategic paralysis.
Event-sourced architecture changes the question from "can we ship this in Q3?" to "can we write this query?" Operational systems evolve freely. Attribution reads immutable events - infrastructure changes can't break what they can't touch. New models query existing events. No fields. No sprints. No release risk.
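To make that concrete: a position-based (40/40/20) model is just a short function over the touch sequences you already store. The weights and channel names below are illustrative, but notice what's absent: no new fields, no migration, no release coordination.

```python
def position_based_credit(touches: list[str]) -> dict[str, float]:
    """40% to the first touch, 40% to the last, 20% split across the middle.

    `touches` is an ordered list of channel names for one converted customer,
    pulled straight from the immutable event log.
    """
    credit: dict[str, float] = {}
    if not touches:
        return credit
    if len(touches) == 1:
        return {touches[0]: 1.0}
    if len(touches) == 2:
        for channel in touches:
            credit[channel] = credit.get(channel, 0.0) + 0.5
        return credit
    middle_share = 0.2 / (len(touches) - 2)
    for i, channel in enumerate(touches):
        share = 0.4 if i in (0, len(touches) - 1) else middle_share
        credit[channel] = credit.get(channel, 0.0) + share
    return credit

# Same events, new question: run it against last quarter, or last year.
print(position_based_credit(["podcast", "email", "paid_search", "webinar"]))
# {'podcast': 0.4, 'email': 0.1, 'paid_search': 0.1, 'webinar': 0.4}
```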
Why Event-Sourcing Is Suddenly Practical
Event-sourced attribution was always the right architecture. Until recently, it was too expensive and complex for most companies.
Here's what changed: The infrastructure that used to require a dedicated engineering team now deploys in an afternoon. Cloud databases (Supabase, Postgres) auto-scale and self-maintain. Event tracking platforms (Segment, RudderStack) handle data quality automatically. Transformation tools (dbt) let marketers write queries without engineers.
The economics flipped. Custom attribution infrastructure that cost $5-9 million over three years now costs $2,000-$10,000 monthly with 2-3 month implementation. At $100,000+ annual marketing spend, owned infrastructure becomes cheaper than enterprise SaaS within 12-18 months - while delivering flexibility those tools architecturally cannot provide.
Modern platforms handle 80% of operational work automatically - data quality, monitoring, updates, scaling. Humans focus on strategy: model selection, metric definitions, interpreting insights. Le Figaro's marketing team, previously dependent on engineers for manual data work, reported that the "simplicity has been a life-changer," with marketers gaining complete data ownership.
But Won't This Conflict With Our Salesforce Investment?
No. Salesforce stays for operations - lead routing, territory management, sales workflows. Event infrastructure sits alongside it, reading from Salesforce but not depending on it.
When IT changes opportunity types in Salesforce (they will), your operational workflows update immediately. Your attribution keeps working because it queries immutable events, not live Salesforce objects. The systems are decoupled by design.
Think of it like your data warehouse - it doesn't replace Salesforce, it enables analysis Salesforce wasn't built for. Event infrastructure is the same pattern, optimized specifically for attribution and journey analysis.
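A rough sketch of that decoupling in Python, using SQLite to stand in for Postgres/Supabase and a placeholder fetch_salesforce_opportunities() for whatever sync job you already run (both are assumptions for illustration): the sync appends today's view of each record as a new event, so operational changes in Salesforce can alter tomorrow's events but can never rewrite yesterday's.

```python
import json
import sqlite3
from datetime import datetime, timezone

# SQLite stands in for Postgres/Supabase; the append-only table is the point.
conn = sqlite3.connect("attribution_events.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS opportunity_events (
        event_id       INTEGER PRIMARY KEY AUTOINCREMENT,
        opportunity_id TEXT NOT NULL,
        payload        TEXT NOT NULL,   -- the full record as it looked when captured
        captured_at    TEXT NOT NULL
    )
""")

def capture(opportunities: list[dict]) -> None:
    """Append today's view of each opportunity; never update or delete old rows.

    If IT renames an opportunity type tomorrow, tomorrow's events record the
    new shape; yesterday's history is untouched and still queryable.
    """
    now = datetime.now(timezone.utc).isoformat()
    conn.executemany(
        "INSERT INTO opportunity_events (opportunity_id, payload, captured_at) "
        "VALUES (?, ?, ?)",
        [(opp["Id"], json.dumps(opp), now) for opp in opportunities],
    )
    conn.commit()

# fetch_salesforce_opportunities() is a placeholder for your existing sync job.
# capture(fetch_salesforce_opportunities())
```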
The AI Unlock: Why This Matters More In 2025 Than 2020
For 10 years, event-sourcing was technically superior but operationally complex. That complexity justified paying SaaS premiums for simplicity.
AI agents just flipped that equation. They make event infrastructure both easier to use AND dramatically more valuable.
BCG's case study: a consumer goods company that previously required 6 analysts per week now needs 1 employee with an AI agent, delivering results in under 1 hour instead of days or weeks. But here's the catch: AI agents need complete event streams, not pre-calculated summaries.
Questions like "What's the time interval between add-to-cart and purchase for different user segments?" require individual event timestamps that aggregation destroys. "Show me complete user journeys for customers who converted within 5 minutes versus 7+ days" needs event sequences, not averages.
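Once the raw events exist, that first question is a few lines of code rather than a feature request. The event shape and segment labels below are invented for illustration; the point is that the answer comes from individual timestamps, which is exactly what pre-aggregated reports discard.

```python
from datetime import datetime
from statistics import median

# Raw events, one row per action with a timestamp -- the detail aggregation destroys.
events = [
    {"user_id": "u1", "segment": "new",       "type": "add_to_cart", "ts": "2025-03-01T10:00:00"},
    {"user_id": "u1", "segment": "new",       "type": "purchase",    "ts": "2025-03-01T10:04:00"},
    {"user_id": "u2", "segment": "returning", "type": "add_to_cart", "ts": "2025-03-02T09:00:00"},
    {"user_id": "u2", "segment": "returning", "type": "purchase",    "ts": "2025-03-09T11:30:00"},
]

def cart_to_purchase_minutes(events: list[dict]) -> dict[str, float]:
    """Median minutes from first add_to_cart to first purchase, per segment."""
    firsts: dict[tuple[str, str], datetime] = {}
    for e in sorted(events, key=lambda e: e["ts"]):
        firsts.setdefault((e["user_id"], e["type"]), datetime.fromisoformat(e["ts"]))

    by_segment: dict[str, list[float]] = {}
    for user_id, segment in {(e["user_id"], e["segment"]) for e in events}:
        cart = firsts.get((user_id, "add_to_cart"))
        buy = firsts.get((user_id, "purchase"))
        if cart and buy and buy >= cart:
            by_segment.setdefault(segment, []).append((buy - cart).total_seconds() / 60)

    return {segment: median(deltas) for segment, deltas in by_segment.items()}

print(cart_to_purchase_minutes(events))   # {'new': 4.0, 'returning': 10230.0} (key order may vary)
```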
With snapshot data, you ask: "Which reports do we have?" With event data and AI, you ask: "What patterns exist?" The AI queries raw events, finds correlations invisible in pre-aggregated dashboards, and surfaces insights humans would never think to look for.
Snapshot attribution: AI analyzes what you already calculated. Event attribution: AI discovers what you didn't know to calculate.
The marketing teams achieving 6-to-1 analyst efficiency gains aren't using AI to read dashboards faster. They're using AI to query complete behavioral data in ways dashboards cannot support.
Real Companies, Real Numbers
LinkedIn built custom attribution models and discovered that only approximately 50% of app installs reported through last-click were actually incremental—meaning the other half would have happened without the ads. This finding enabled massive budget reallocation that snapshot attribution could never identify.
Gousto, a UK meal delivery service, built event-based attribution and can now measure campaign performance by platform, subscriber lifetime value by acquisition source, and customer retention patterns—enabling prediction of which customers will churn before they do.
The pattern: Companies building on owned event infrastructure consistently find 20-50% of their attribution was wrong, leading to five-figure monthly budget optimizations. They're not smarter. They're architecturally unconstrained. They can ask new questions of old data, test models in days instead of quarters, and evolve attribution logic without coordinating release schedules across six teams.
Is This Right For You?
Event-sourced attribution makes sense when:
You're spending $100,000+ annually on marketing across multiple channels
Attribution questions are blocking meaningful budget decisions
You need to test new models or answer questions your current tools can't support
You have (or want) technical resources who can write SQL queries
It's probably premature if:
Marketing spend is under $75,000 annually
You're still figuring out basic tracking and campaign tagging
Single-channel attribution (e.g., Google Ads only) gives you what you need
Your team has no technical capability and no budget to add it
The break-even point isn't just economic - it's strategic. When the cost of being wrong about attribution exceeds the cost of better infrastructure, you've crossed the threshold.
What This Means for Your Business
If you're spending $100,000+ annually on marketing and can't answer:
"What was our attribution three months ago?" "Why did this channel's performance change?" "Which specific touchpoints drive our best customers?"
You have an architecture problem worth solving.
The technology became dramatically simpler and more affordable in the last five years. Modern cloud databases, event tracking platforms, and transformation tools enable marketing teams to own their attribution infrastructure at costs competitive with—and often lower than—enterprise SaaS tools.
More importantly: owned infrastructure gives you the complete event history AI agents need to deliver the 6-analysts-to-1-person efficiency gains companies are already achieving.
The choice isn't "build complex infrastructure vs. use simple SaaS." The choice is: "Own flexible event streams AI can query vs. rent snapshot dashboards that can't answer new questions."
Next time your CMO asks what changed, you won't guess. You'll replay every touchpoint, every journey, every conversion path. You'll know exactly what happened. That's not magic—that's just treating your marketing data with the same respect your bank treats your $4,237.
Your bank has maintained perfect transaction records since before computers existed. Your marketing data deserves the same standard.
Sources
U.S. Department of Justice - "Wells Fargo Agrees to Pay $3 Billion to Resolve Criminal and Civil Investigations"
Modern Treasury - "Enforcing Immutability in your Double-Entry Ledger"
Revsure - "The State of Marketing Attribution in 2024"
Electrik - "Why You Need Google Analytics Raw Data?"
Lifesight - "Marketing Attribution Problems"
Supermetrics - "How to make your marketing data AI-ready"
Dreamdata - "B2B Revenue Attribution: Build vs Buy"
Measured - "Marketing Mix Modeling Software: Build vs. Buy"
HubSpot - "2025 AI Trends for Marketers"
LinkedIn Engineering - "Measuring marketing incremental impacts beyond last click attribution"
Snowplow - "How Gousto is growing its subscriber base using behavioral data"