Data Clean Rooms: What They Are and Why They Matter
January 22, 2026
16 minute read
Data clean rooms help advertisers and partners collaborate on measurement and insights without exchanging raw customer data. In this article, we’ll explain what they are, why they exist, how they work, and when they actually make sense.
If you’ve been around programmatic and measurement long enough, you’ve seen the same pattern repeat: marketers want more certainty (better targeting, clearer attribution, tighter optimization), while platforms and regulators keep shrinking the set of “easy” signals.
Data clean rooms sit right in the middle of that tension. They’re not magic, and they’re not a replacement for good data strategy. What they are is a structured way for two (or more) parties to answer specific questions together—using sensitive data—while limiting what either side can see or take away.
This article breaks down what a data clean room is, why they exist, how they work, and the major clean room models you’ll run into.
💡 If you want broader context on how privacy and identity shifts are changing digital advertising, see our cookieless overview.
⚡ Clean rooms are a response to constraints, not a new form of marketing freedom. If you treat them like a shortcut, you’ll spend months building something that answers no useful question.
What is a data clean room?
A data clean room is a controlled environment that allows companies to match and analyze data together under rules (“constraints”) that restrict how the data can be used and what can leave the environment.
The FTC’s plain-English framing is helpful: clean rooms enable data exchange and analysis restrained by rules that limit data use, typically when two companies want limited information about shared customers (for example, connecting ad exposure to purchases).
⚡ Think of the clean room as the rules, not the room. The value comes from the constraints: what’s allowed in, what’s allowed out, and what’s provably prevented.
What a clean room is doing (in practice)
Most advertising use cases boil down to four things:
Bring data to a governed environment: Each party contributes a dataset (often via a secure share or controlled view rather than raw table exports).
Match records using a join key: That might be hashed email, a platform-specific identifier, or another agreed identity mechanism. (We’ll talk more about identity matching mechanics later in the article.)
Run approved analysis: Queries or templates calculate overlaps, reach/frequency, conversion linkage, incrementality proxies, and other analytics.
Export only permitted outputs: Instead of exporting user-level rows, the clean room typically allows aggregated results (counts, rates, indexed metrics) and blocks anything that risks exposing individuals.
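The four steps above can be sketched with toy data. This is a minimal illustration, not a real product: the party names, fields, and the `MIN_USERS` threshold are all invented for the example, and real clean rooms enforce these rules inside the platform rather than in client code.

```python
import hashlib
from collections import defaultdict

MIN_USERS = 3  # example aggregation threshold: suppress rows representing fewer users

def match_key(email: str) -> str:
    """Normalize, then hash, so raw identifiers never need to be shared."""
    return hashlib.sha256(email.strip().lower().encode()).hexdigest()

# Step 1: each party contributes a governed dataset (toy examples).
advertiser_crm = {match_key(e): seg for e, seg in [
    ("a@x.com", "loyal"), ("b@x.com", "new"), ("c@x.com", "loyal"),
    ("d@x.com", "loyal"), ("e@x.com", "new"),
]}
publisher_exposed = {match_key(e) for e in
                     ["a@x.com", "c@x.com", "d@x.com", "f@x.com"]}

# Steps 2 and 3: match on the hashed key, then run an approved aggregation.
counts = defaultdict(int)
for key, segment in advertiser_crm.items():
    if key in publisher_exposed:
        counts[segment] += 1

# Step 4: export only aggregates that clear the threshold.
report = {seg: n for seg, n in counts.items() if n >= MIN_USERS}
print(report)  # {'loyal': 3} -- the "new" segment is suppressed (too few users)
```

Note what never leaves the environment in this sketch: neither the row-level match results nor segments too small to aggregate safely.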
What it is not
A quick reset that prevents a lot of confusion later:
It’s not a CDP or data warehouse replacement.
It doesn’t “clean” data quality issues by default (duplicates, missing fields, bad taxonomy).
It’s not automatically privacy-safe just because it has “clean room” in the name.
That last point matters. The FTC explicitly notes that protections are not typically automatic—they depend on how constraints are configured, monitored, and enforced.
Clean rooms didn’t become popular because the industry suddenly fell in love with governance. They became popular because they’re one of the few ways to keep doing cross-party marketing work when the old mechanisms—third-party cookies, unrestricted device IDs, easy cross-site tracking—stop behaving like dependable plumbing.
The end of third-party cookies (and, more importantly, cookie instability)
The story here isn’t just “cookies are going away.” It’s that third-party cookie availability is no longer something advertisers can assume, and the future has been shaped by delays, regulatory oversight, and shifting implementation plans.
For example, Google’s Privacy Sandbox update in April 2024 stated it would not complete third-party cookie deprecation in the second half of Q4 2024, and it pointed to ongoing review with regulators and industry testing as reasons for timeline changes.
By April 2025, Reuters reported Google would retain third-party cookies in Chrome and not roll out a new standalone prompt, reflecting how contested and changeable this terrain has been.
Even if your strategy isn’t “cookie-based,” this uncertainty creates second-order effects:
Measurement becomes more fragmented (especially across browsers and environments).
Identity resolution gets harder at scale.
Data-sharing norms tighten because everyone is more cautious about what they disclose.
A useful adjacent signal: in the mobile world, consent frameworks have already normalized opt-in boundaries. AppsFlyer reported that U.S. opt-in to in-app tracking was 44% as of Q1 2024—a reminder that deterministic visibility is no longer a default assumption.
⚡ The biggest shift isn’t “cookies disappear.” It’s that marketers can’t plan around a single, stable tracking default anymore, so they need collaboration models that don’t depend on it.
Privacy regulation and consent boundaries
In the U.S., one of the most practical drivers of clean room growth is the operational reality of compliance: the rules are expanding, and teams need ways to collaborate without crossing consent and purpose limitations.
By late 2025, BSA noted that 20 states have enacted comprehensive consumer privacy laws, creating new rights for consumers and obligations for businesses handling personal data.
At the same time, NCSL documented the pace of legislative activity: in 2024, at least 40 states (plus D.C. and Puerto Rico) introduced or considered 350+ consumer privacy bills, with many enacted.
This matters to clean rooms because many common marketing workflows inherently involve “sharing”:
A brand wants to reconcile customer data with a retailer or publisher.
An agency wants to compare overlap across partners.
A measurement provider wants exposure + conversion linkage.
Clean rooms are attractive here because they can be designed to enforce “you can learn this, but you can’t take that.” The FTC is also clear that clean rooms don’t eliminate legal obligations—companies can’t treat them as a loophole around promises or the law.
⚡ Privacy compliance is a business constraint that shows up in workflows. Clean rooms help teams collaborate without constantly renegotiating what’s permissible.
The rise of walled gardens (and concentrated power)
The third driver is less technical and more structural: a growing share of advertising activity happens inside ecosystems where the platform owns the data, the measurement, and the rules.
EMARKETER’s forecast for 2024 put it bluntly: Amazon, Apple, Meta, Microsoft, and Google will attract almost two-thirds of U.S. digital ad dollars in 2024.
As that concentration increases, advertisers and agencies run into a hard truth:
You can’t “just” extract platform-level user data.
You often can’t connect platform exposure data to off-platform outcomes without platform-approved workflows.
Collaboration becomes permissioned, not assumed.
That’s one reason walled-garden clean rooms became mainstream early—and why independent and cloud-based clean rooms are now being adopted to regain flexibility outside any single ecosystem.
⚡ Walled gardens don’t just restrict data access; they shape what “truth” looks like. Clean rooms can restore some comparability, but only if you define consistent questions across partners.
💡 If you want a deeper read on the tradeoffs advertisers face inside platform-controlled environments, our walled gardens breakdown is a useful companion.
How data clean rooms work
At a high level, every clean room (regardless of vendor) is trying to solve the same problem:
Let multiple parties compute joint insights without exposing raw, person-level data to each other.
The differences are in how that is enforced, what identity methods are allowed, and how strict the outputs are.
Step 1: Decide the collaboration question first
Clean rooms work best when the question is explicit, like:
“How many of your subscribers did we reach?”
“What’s the overlap between our CRM list and your shoppers?”
“Among exposed audiences, what conversion rate do we see by segment?”
If the goal is vague (“we want to explore data together”), you can end up paying for infrastructure that produces ambiguous output.
Step 2: Prepare and scope the data
Each party typically contributes:
A match key (hashed email, platform ID, etc.)
A limited set of attributes allowed for analysis (timestamps, product categories, geography at an approved granularity, campaign IDs, etc.)
The FTC notes a key differentiator vs. “normal” data sharing is the constraints—rules limiting both the analysis and what can be exported.
Step 3: Matching happens under privacy controls
There are multiple matching approaches (hashed PII, privacy-safe identity protocols, publisher/advertiser reconciliation), but the principle stays consistent:
Matching should occur without either side receiving the other party’s raw identifiers.
The output should avoid creating a new transferable identity graph.
On the standards side, IAB Tech Lab has been pushing interoperability and agreed mechanics. Their data clean room work includes protocols like PAIR (publisher-advertiser identity reconciliation) and ADMaP (attribution data matching).
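To make the matching principle concrete, here is a toy private-set-intersection sketch using commutative "encryption" (x^k mod p). It is in the spirit of protocols like PAIR, but it is not the actual PAIR protocol and is not cryptographically safe; the keys, emails, and message-passing shortcuts are all invented for illustration.

```python
import hashlib

# Toy commutative "encryption": x^k mod P. Real protocols use vetted
# cryptography; this only demonstrates why double encryption lets two
# parties find matches without exchanging raw identifiers.
P = 2**127 - 1  # a Mersenne prime, fine for a toy example

def to_int(email: str) -> int:
    return int.from_bytes(
        hashlib.sha256(email.strip().lower().encode()).digest(), "big") % P

def enc(x: int, key: int) -> int:
    return pow(x, key, P)

adv_key, pub_key = 65537, 92821  # each party's private exponent (made up)

advertiser = ["a@x.com", "b@x.com", "c@x.com"]
publisher  = ["b@x.com", "c@x.com", "d@x.com"]

# Each side encrypts its own list, then the *other* side encrypts again
# (in reality these are messages between parties; collapsed here for brevity).
adv_double = {enc(enc(to_int(e), adv_key), pub_key) for e in advertiser}
pub_double = {enc(enc(to_int(e), pub_key), adv_key) for e in publisher}

# Because encryption commutes, matches collide -- but only as opaque values.
overlap = len(adv_double & pub_double)
print(overlap)  # 2: b@ and c@ match without either side seeing raw IDs
```

The key property is the output: an overlap count, not a reusable list of identifiers, which is exactly the "no new transferable identity graph" principle above.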
Step 4: Queries run under strict rules
This is where “clean room” becomes real:
Some clean rooms allow only pre-built templates or parameterized queries.
Others allow SQL but enforce strict output thresholds and logging.
Many block joins that could enable identity discovery.
For example, AWS Clean Rooms supports controls like minimum aggregation thresholds—a common pattern where an output row must represent at least X distinct users (or it gets suppressed).
Google’s BigQuery data clean rooms documentation describes analysis rules that prevent raw access and enforce restrictions such as aggregation thresholds.
Google Cloud BigQuery data clean rooms diagram (clean room concept: join without moving data) (Source)
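The minimum-aggregation pattern is easy to picture in code. This sketch mimics, in spirit, the server-side controls that products like AWS Clean Rooms enforce; the row format, metric names, and the threshold of 50 are invented for the example.

```python
def apply_min_threshold(rows, min_distinct_users=50):
    """Release only output rows representing enough distinct users.

    Illustrative only: real clean rooms apply this inside the query
    engine, so under-threshold rows are never visible to begin with.
    """
    released, suppressed = [], 0
    for row in rows:
        if row["distinct_users"] >= min_distinct_users:
            released.append(row)
        else:
            suppressed += 1
    return released, suppressed

query_output = [
    {"segment": "loyal",  "distinct_users": 1200, "conv_rate": 0.041},
    {"segment": "new",    "distinct_users": 37,   "conv_rate": 0.090},  # too small
    {"segment": "lapsed", "distinct_users": 560,  "conv_rate": 0.012},
]
released, suppressed = apply_min_threshold(query_output)
print(len(released), suppressed)  # 2 rows released, 1 suppressed
```

Suppression is what prevents "aggregates" from quietly becoming person-level data: a row of one is just a profile with extra steps.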
⚡ If the only reason you want a clean room is “more data,” pause. The best clean room projects start with a single measurement or planning decision that needs a defensible answer.
Step 5: Outputs are limited to what’s safe and agreed
Typical outputs include:
Aggregated metrics (counts, rates, overlap sizes, indexed comparisons)
Segment definitions (audience groups that can be activated inside an approved system)
Model-ready features (in more advanced setups, privacy-safe features for measurement or forecasting)
Types of data clean rooms
A “data clean room” is really a pattern. The way it’s implemented varies based on who controls the environment, where the data lives, and what the business objective is.
In practice, most clean rooms fall into three categories.
Walled-garden clean rooms
These are clean rooms provided by large platforms where the collaboration and measurement happen inside the platform’s environment, using platform data and rules.
Typical strengths:
Native exposure data (impressions, clicks, video views) is already there.
Measurement often has platform-specific depth (reach/frequency, conversion APIs, modeled outcomes).
Activation is usually straightforward within that ecosystem.
Typical constraints:
The platform decides what identity methods are allowed.
Data export is heavily restricted (often aggregated-only).
Cross-platform comparisons can be awkward because each “room” speaks its own measurement language.
This is why many teams treat walled-garden clean rooms as necessary—but incomplete. They’re powerful for platform-specific truth, but not automatically a holistic measurement solution.
💡 If you’re tracking how platform shifts reshape the ad tech surface area, Microsoft’s Xandr sunset is a good example of the broader consolidation trend.
Independent and neutral clean rooms
Independent clean rooms are designed to be a neutral meeting place between parties: advertiser + publisher, brand + retailer, agency + multiple media partners.
Typical strengths:
More flexibility to collaborate across multiple partners.
Better fit for publisher networks, retail media partnerships, and multi-party measurement.
Can help reduce dependency on any single platform’s measurement worldview.
Typical constraints:
Identity strategy becomes your problem (and it’s rarely “one-size-fits-all”).
Integrations and governance take real work—especially with agencies coordinating multiple parties.
The clean room doesn’t magically fix poor data hygiene or inconsistent taxonomies.
This model tends to make the most sense when the business value is clearly tied to cross-partner collaboration (not just internal analytics).
Cloud-based clean room frameworks
Cloud frameworks are clean rooms built on (or embedded within) cloud data platforms. Instead of buying a “clean room product” as a black box, you use cloud-native features to create a governed collaboration environment.
What this enables:
Scale and control for enterprises already centralizing data in cloud warehouses.
Custom workflows for specific measurement needs, rather than forcing everything into templates.
Examples of how this looks in practice:
BigQuery data clean rooms: Google describes a model where contributors set analysis rules and export restrictions by default, supporting privacy-centric collaboration without copying or moving underlying data.
Snowflake Data Clean Rooms: Snowflake describes configurable, isolated environments where collaborators specify allowed queries and configure protections such as differential privacy and join/project controls.
Where cloud frameworks shine is also where they cost you: you typically need more data engineering, more governance design, and tighter stakeholder coordination. They’re a strong fit for organizations that want clean rooms to become infrastructure—not a one-off project.
Key use cases for advertisers and agencies
If you strip away the hype, most data clean rooms are used for one thing: decision support under privacy constraints. They’re strongest when two parties already have meaningful data, but can’t legally—or practically—swap row-level records.
⚡ Attribution improves when you control definitions. Clean rooms help you connect exposure and outcome, but you still need to decide what counts as success, and what doesn’t.
Below are the use cases that show up repeatedly in real-world deployments (and in the research). I’m keeping these grounded on what clean rooms reliably do today, not what vendors promise they’ll do “soon.”
Measurement and attribution
The most common reason teams pursue a data clean room is the same reason measurement conversations feel harder every year: signal loss reduces confidence, while leadership still expects proof.
IAB’s State of Data 2024 captures that tension clearly. Among companies asked how cookie deprecation plus proliferating legislation would impact them, 73% expected their ability to attribute performance, measure ROI, track conversions, and optimize campaigns to be reduced, and 57% expected it to be harder to capture reach and frequency.
A clean room helps because it lets two parties connect exposure and outcome inside a governed environment. The output is usually aggregated reporting, not person-level logs. That changes what you can do, but it’s still valuable.
Here are the measurement tasks clean rooms tend to support well:
Exposure-to-outcome linkage (in aggregate): You can answer questions like “Did exposed households buy more than unexposed households?” using advertiser conversion data plus publisher/platform exposure data—without either side exporting user-level records.
Reach and frequency analysis across constrained environments: Instead of guessing with disconnected reports, clean rooms can estimate overlap and deduplicated reach using allowed join keys and strict thresholds.
Incrementality-oriented designs (where feasible): Clean rooms won’t magically run a perfect experiment, but they can support clean-room-based comparisons (e.g., holdout logic or cohort-based lift) when the media owner can segment exposure in a controlled way.
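For the incrementality-oriented case, the clean room's job is to release a handful of safe aggregates; the lift arithmetic itself is ordinary. A minimal sketch, with all numbers invented:

```python
def lift(exposed_conv, exposed_n, holdout_conv, holdout_n):
    """Relative lift of the exposed cohort's conversion rate vs a holdout.

    A clean room would release only these four aggregates (assuming they
    clear suppression thresholds); the calculation then happens outside.
    """
    exposed_rate = exposed_conv / exposed_n
    holdout_rate = holdout_conv / holdout_n
    return (exposed_rate - holdout_rate) / holdout_rate

# e.g. 840 converters among 40,000 exposed vs 300 among 20,000 held out
print(round(lift(840, 40_000, 300, 20_000), 3))  # 0.4 -> +40% relative lift
```

Whether that 40% means anything depends on the design (how the holdout was built, contamination, seasonality), which is the methodology point in the takeaway below.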
📍 Key takeaway: clean rooms are a measurement workspace, not a measurement methodology. If your methodology is weak, a clean room will produce precise-looking results that still don’t answer the business question.
⚡ A clean room can reduce data risk. It can’t reduce thinking risk.
💡 For a deeper dive into CTV measurement, see our dedicated guide: CTV measurement: The key metrics that will define successful campaigns
Share of companies expecting to increase their first-party datasets (Source)
Audience overlap and insights
Audience work is where clean rooms often surprise teams—in a good way. You stop obsessing over “who” (identity-level targeting) and start learning “what” and “how much” (planning-level insight).
This use case usually looks like:
Overlap sizing: “How many of our customers are reachable in this publisher’s authenticated audience?”
Profile comparison: “Do our high-value customers index more heavily against certain content categories or dayparts?”
Suppression and efficiency: “Are we paying to reach audiences we already own, or are we expanding net-new reach?”
What’s important is the boundary: most clean rooms don’t let you walk away with the other party’s audience list. You’re learning patterns and sizing, not building a portable identity graph.
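The three audience questions above reduce to a few planning-level ratios derived from a single matched count. A sketch, with invented metric names and numbers; in practice the matched count is the aggregate the clean room releases:

```python
def overlap_report(crm_size, publisher_size, matched):
    """Planning-level overlap metrics from one clean-room aggregate."""
    return {
        "pct_of_crm_reachable": matched / crm_size,          # overlap sizing
        "pct_of_pub_already_owned": matched / publisher_size,  # suppression candidate
        "net_new_reach": publisher_size - matched,            # expansion potential
    }

r = overlap_report(crm_size=500_000, publisher_size=2_000_000, matched=150_000)
print(r)  # 30% of the CRM is reachable; 1.85M net-new users on the publisher side
```

Note that nothing here requires knowing *who* matched, which is exactly why this kind of output clears clean room constraints while a list export would not.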
This is also where clean rooms often get paired with other measurement approaches. For example, some organizations use clean rooms for overlap/duplication and then use MMM to understand channel contribution at a higher level.
📍 Key takeaway: audience insights in a clean room are usually about planning and strategy, not “new targeting superpowers.”
Publisher collaboration and data partnerships
Clean rooms exist because collaboration is back, but it needs guardrails.
The publisher side is especially telling. A Digiday survey (sponsored research) found 64% of publishers and media businesses were collaborating with advertisers via clean rooms to share audience data, plan campaigns, and measure performance.
This use case typically lands in three buckets:
Joint planning: Brands and publishers use overlap analysis to decide which inventory actually adds reach, rather than duplicating audiences already hit elsewhere.
Closed-loop reporting: Publishers can support advertiser outcome measurement without exporting raw subscriber data. Advertisers get more accountability; publishers protect their relationship with their audience.
Second-party data partnerships: Retailers, publishers, and data owners can collaborate with agencies and brands inside a permissioned environment—especially useful when both sides have authenticated users or strong first-party datasets.
📍 Key takeaway: clean rooms don’t “open up” publisher data. They create a way to collaborate without giving away the asset.
Limitations and common misconceptions
This section matters because data clean rooms are easy to oversell. Even the FTC has warned that the name can be misleading: they’re not rooms, they don’t “clean” data, and the protections depend on how the system is actually configured and used.
Here are the misconceptions that cause the most wasted time and disappointment:
“A clean room makes us privacy-compliant”
A clean room can support privacy-friendly collaboration, but it’s not a legal shield. The FTC’s position is straightforward: companies still need to honor the law, their privacy promises, and meaningful governance.
What to do instead: treat the clean room as one component in a compliance posture (consent strategy, retention policies, partner contracts, and access controls still matter).
“A clean room is a CDP replacement”
A CDP is designed to unify and activate your customer data across your channels. A clean room is designed to collaborate on analysis with someone else under restrictions.
They can complement each other, but they solve different problems.
Rule of thumb: if the work is internal (segmentation, lifecycle messaging, personalization in owned channels), start with your CDP/warehouse. If the work requires a partner’s data (publisher exposure, retailer purchase signals, platform measurement), that’s where clean rooms become relevant.
⚡ A clean room can reduce risk, but it rarely reduces work. Identity, governance, and partner coordination are the real cost centers, not the software.
“A clean room gives us an identity graph”
Some clean room setups rely on identity signals to match records, but they do not automatically give you a durable, reusable identity graph.
IAB Tech Lab’s work on clean room interoperability (including PAIR) is partly about standardizing how two parties reconcile identity for specific collaboration use cases, not about creating an industry-wide identity backdoor.
“Clean rooms are activation engines”
Some clean rooms support limited activation (often within the same ecosystem where the data lives), but many are designed primarily for analytics outputs. The more “portable” the activation becomes, the more privacy risk and governance complexity you inherit.
Practical expectation: clean rooms are usually best for measurement and insight. If activation is the primary goal, clarify up front where activation will happen and what leaves the environment.
Percentage of publishers using tracked targeting methods to address their digital ad inventory (Source)
“If we build it, the insights will come”
Clean rooms don’t fix:
poor event instrumentation
inconsistent taxonomy across partners
missing consent signals
unclear success metrics
weak experiment design
They make collaboration possible. They don’t make it meaningful by default.
⚡ Clean rooms don’t replace strategy. They expose whether you have one.
Data clean rooms in the CTV and programmatic ecosystem
CTV and programmatic are where clean rooms become especially relevant—because that’s where measurement fragmentation and platform constraints collide.
Why CTV puts pressure on collaboration
CTV is growing, and it’s increasingly treated as a core line item rather than an experiment.
Nielsen reported that 56% of marketers globally planned to increase OTT/CTV spend in 2025 (up from 53% in 2024), with growth concentrated in the Americas.
IAB’s September 2025 outlook update projected CTV growth of +11.4% in 2025 (alongside double-digit growth in retail media and social).
More CTV spend means more executive attention on questions like: Did this drive incremental reach? Did it move sales? Are we hitting the same households everywhere?
Clean rooms can help because many CTV environments are “closed” by design. You often need the platform or publisher’s cooperation to connect exposure to outcomes in any controlled way.
💡 If you want a broader view of how CTV buying and measurement work (and where the common gaps show up), AI Digital’s CTV guide is a solid companion resource.
Where clean rooms sit in programmatic workflows
Programmatic runs on distributed systems: DSPs, SSPs, measurement vendors, data providers, and publishers all touch the transaction. That makes data collaboration both valuable and risky.
In practice, clean rooms show up in programmatic ecosystems in a few ways:
Supply-side collaboration and premium inventory analysis: Advertisers can work with premium publishers to understand overlap, reach contribution, and outcome signals—without demanding raw audience files.
Retail media and commerce signal partnerships: Retail media is one of the most natural clean room environments because retailers control deterministic purchase data. EMARKETER forecast U.S. retail media ad spend at $69.33B in 2026, up from $58.79B in 2025. As those budgets grow, so does the need to validate performance with privacy-safe measurement—especially when pairing retail media signals with offsite programmatic and CTV.
Cross-platform measurement stitching (within limits): Clean rooms don’t automatically unify measurement across every platform, but they can support consistent analysis patterns across multiple partners, especially when agencies manage a portfolio of publishers and retailers.
Agency operations and DSP flexibility: When buyers are spread across multiple DSPs and supply paths, maintaining consistency matters. AI Digital’s DSP-agnostic approach is one example of how teams try to keep media buying flexible in fragmented environments.
💡 For the programmatic basics (and how the ecosystem functions end-to-end), see AI Digital’s programmatic advertising guide.
A quick note on platform shifts
Platform changes can also reshuffle how data collaboration happens. Microsoft’s decision to sunset Xandr is a good reminder that supply and platform strategies evolve, and measurement workflows need to adapt with them.
When do data clean rooms make sense?
This is the point where a lot of teams want a simple answer: Do we need one or not? The real answer is a framework.
A data clean room makes sense when three things are true:
You have (or can access) meaningful first-party data
You have partners with meaningful data
There is a high-value question you cannot answer any other way
If any of those are missing, you can usually get farther with simpler approaches.
A decision framework you can actually use
Before you commit budget and engineering time, pressure-test these areas.
1) Is the question worth the operational work?
A clean room is rarely justified for “nice-to-know” analysis. It’s justified when the decision changes spend, strategy, or partner allocation.
Examples of “worth it” questions:
“Which publisher partnerships deliver net-new customers, not just repeat exposure?”
“Is our CTV spend incremental, or cannibalizing other channels?”
“Which retailer partnerships drive in-store outcomes we can validate?”
2) Do you have the data maturity to avoid garbage-in/garbage-out?
IAB’s State of Data 2024 shows why this matters: companies are already anticipating weaker attribution and harder reach/frequency measurement under signal loss.
If your conversion events are inconsistent, your customer records are fragmented, or your consent signals aren’t trustworthy, a clean room won’t fix that. It will simply produce cleaner-looking confusion.
Minimum viable readiness:
stable conversion definitions
consistent campaign naming/taxonomy
clear consent posture and data retention rules
a plan for how insights will change decisions
3) Do your partners actually support collaboration?
Clean rooms are a two-party (or multi-party) sport. If your key publishers, retailers, or platforms aren’t willing or able to participate, a clean room becomes shelfware.
This is why many organizations start with the partners where collaboration is already natural—retail media, major publishers with authenticated audiences, and platforms that offer defined clean room workflows.
4) Can you staff governance, not just implementation?
This is the unsexy part that determines success:
who can run queries?
what outputs are permitted?
what thresholds prevent re-identification risk?
how are results documented so they’re comparable over time?
The FTC’s warning is relevant here: clean rooms don’t automatically protect privacy. The rules and monitoring do.
When a data clean room is usually the wrong move
You can often say “no” with confidence if:
Your main goal is activation, and you don’t have a clear path for where activation will happen.
You don’t have strong first-party data, or your match rates will be too low to generate stable insights.
You need speed over precision, and a clean room workflow will slow decisions to a crawl.
📍 Key takeaway: clean rooms are powerful when they answer high-stakes questions under strict constraints. They’re frustrating when they’re treated as a general-purpose analytics toy.
⚡ A confident “no” is a good outcome. If you can’t name the decision the clean room will change, you’re not evaluating a tool, you’re collecting complexity.
Conclusion: The strategic role of data clean rooms going forward
Data clean rooms matter because advertising is being redesigned around privacy boundaries, platform control, and a shrinking set of “free” signals. They are one of the few tools that let marketers collaborate with partners without reverting to risky data sharing.
But here’s the honest conclusion:
Essential for advertisers who rely on partner data to measure outcomes (retail media, large publishers, major platforms, complex CTV plans).
Optional for organizations with simpler channel mixes or lower dependence on cross-party measurement.
Ineffective when they’re used to compensate for weak measurement discipline, unclear objectives, or poor data quality.
If you want help evaluating whether a clean room approach makes sense for your data, partners, and measurement goals, AI Digital’s team can help you sort the “worth it” from the “noise.” Why not get in touch?
Blind spot
Key issues
Business impact
AI Digital solution
Lack of transparency in AI models
• Platforms own AI models and train on proprietary data • Brands have little visibility into decision-making • "Walled gardens" restrict data access
• Inefficient ad spend • Limited strategic control • Eroded consumer trust • Potential budget mismanagement
Open Garden framework providing: • Complete transparency • DSP-agnostic execution • Cross-platform data & insights
Optimizing ads vs. optimizing impact
• AI excels at short-term metrics but may struggle with brand building • Consumers can detect AI-generated content • Efficiency might come at cost of authenticity
• Short-term gains at expense of brand health • Potential loss of authentic connection • Reduced effectiveness in storytelling
Smart Supply offering: • Human oversight of AI recommendations • Custom KPI alignment beyond clicks • Brand-safe inventory verification
The illusion of personalization
• Segment optimization rebranded as personalization • First-party data infrastructure challenges • Personalization vs. surveillance concerns
• Potential mismatch between promise and reality • Privacy concerns affecting consumer trust • Cost barriers for smaller businesses
Elevate platform features: • Real-time AI + human intelligence • First-party data activation • Ethical personalization strategies
AI-Driven efficiency vs. decision-making
• AI shifting from tool to decision-maker • Black box optimization like Google Performance Max • Human oversight limitations
• Strategic control loss • Difficulty questioning AI outputs • Inability to measure granular impact • Potential brand damage from mistakes
Managed Service with: • Human strategists overseeing AI • Custom KPI optimization • Complete campaign transparency
Fig. 1. Summary of AI blind spots in advertising
Dimension
Walled garden advantage
Walled garden limitation
Strategic impact
Audience access
Massive, engaged user bases
Limited visibility beyond platform
Reach without understanding
Data control
Sophisticated targeting tools
Data remains siloed within platform
Fragmented customer view
Measurement
Detailed in-platform metrics
Inconsistent cross-platform standards
Difficult performance comparison
Intelligence
Platform-specific insights
Limited data portability
Restricted strategic learning
Optimization
Powerful automated tools
Black-box algorithms
Reduced marketer control
Fig. 2. Strategic trade-offs in walled garden advertising.
Questions? We have answers
How do data clean rooms ensure user privacy?
Data clean rooms protect privacy by keeping collaboration inside a controlled environment where access and outputs are tightly restricted. Instead of sharing raw, person-level datasets, parties bring data into a governed space, match it using approved identifiers, and run analysis under rules that prevent anyone from extracting user-level records. Those rules typically include restrictions on what queries can be run, minimum aggregation thresholds so results can’t be used to isolate individuals, and auditing so activity is traceable. The key idea is that the insight leaves, not the underlying data, and the “safety” comes from how the clean room is configured and enforced, not from the label itself.
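To make the aggregation-threshold idea concrete, here is a minimal sketch of a clean-room-style output control, written in plain Python. All of the names here (function, field names, the threshold value) are illustrative assumptions, not any vendor's actual API: the point is simply that a query's results only leave the environment as aggregated rows, and any group smaller than the minimum size is suppressed so it can't be used to isolate individuals.

```python
from collections import defaultdict

MIN_GROUP_SIZE = 50  # hypothetical threshold; real clean rooms set their own policy


def aggregate_with_threshold(records, group_key, min_size=MIN_GROUP_SIZE):
    """Count matched users and conversions per group; suppress small groups.

    Only the aggregated rows are returned (i.e., allowed to "leave the room").
    Groups with fewer than `min_size` distinct users are dropped entirely.
    """
    groups = defaultdict(lambda: {"users": set(), "conversions": 0})
    for r in records:
        g = groups[r[group_key]]
        g["users"].add(r["user_id"])
        g["conversions"] += r["converted"]
    return {
        key: {"matched_users": len(v["users"]), "conversions": v["conversions"]}
        for key, v in groups.items()
        if len(v["users"]) >= min_size
    }
```

In a real clean room this enforcement lives in the platform's query layer rather than in your code, but the behavior is the same: a campaign matched to 60 users produces a reportable row, while one matched to only 3 users returns nothing at all.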
What problems do data clean rooms not solve?
Clean rooms do not fix weak measurement foundations. If conversion events are inconsistent, identity signals are unreliable, or taxonomy is messy, a clean room won’t turn that into trustworthy insight. They also don’t provide a universal view across every platform by default; each collaboration depends on available match keys, partner participation, and the constraints imposed by the environment. Finally, clean rooms are not a shortcut to portability: they are designed to prevent raw data extraction, so they won’t hand you a reusable identity graph or a fully exportable audience you can move anywhere without restrictions.
Do small or mid-sized advertisers need a data clean room?
Some do, but many don’t need one immediately. If you have limited first-party data scale, minimal partner data access, or your biggest performance wins are still available through better instrumentation and testing, a clean room may be premature. It starts to make sense for smaller or mid-sized advertisers when their strategy depends on partner datasets they can’t otherwise use safely—common examples are retail media measurement, premium publisher collaborations with authenticated audiences, or CTV plans where you need privacy-safe outcome validation. The deciding factor isn’t company size; it’s whether you have a high-value question that requires two-party data collaboration and will change spend or strategy.
What is the difference between a data clean room and a CDP?
A CDP is built to unify and activate your own first-party customer data across your marketing and owned channels, usually with an emphasis on segmentation, personalization, and orchestration. A data clean room is built for collaboration between parties who cannot share raw data, enabling joint analysis and limited activation under strict constraints. In practice, the two can complement each other: your CDP or warehouse often prepares the data you bring into a clean room, and the clean room produces aggregated insights you use for planning, measurement, and partner decisions rather than for building a fully portable customer profile.
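The "CDP prepares, clean room matches" handoff usually comes down to producing a pseudonymous match key before anything is uploaded. A common convention is normalizing an email address and hashing it with SHA-256, so the collaboration happens on keys rather than raw identifiers. This is a hedged sketch of that step (the function name and normalization rules are our own illustration; check your clean room's actual key specification, since normalization requirements vary by provider):

```python
import hashlib


def to_match_key(email: str) -> str:
    """Normalize an email and hash it so only a pseudonymous key is shared.

    Normalization matters: "Jane@Example.com " and "jane@example.com" must
    produce the same key, or the match rate in the clean room suffers.
    """
    normalized = email.strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()
```

Both parties apply the same normalization and hashing before upload; the clean room then joins on the resulting keys without either side ever seeing the other's raw email list.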
Are data clean rooms required for privacy compliance?
No. Privacy compliance is driven by lawful collection, consent and purpose boundaries, security controls, data minimization, retention practices, and honoring consumer rights. A clean room can support a privacy-forward approach to collaboration by limiting what’s shared and what can be exported, but it does not replace compliance work and it doesn’t automatically make a questionable data practice acceptable. Think of clean rooms as a way to reduce exposure when collaboration is necessary, not as a requirement or a loophole.
Can data clean rooms be used for activation, or only measurement?
They can be used for activation in some cases, but measurement and insight are usually the most reliable core value. In walled-garden environments, activation often works best because the clean room lives inside the same ecosystem where media is bought, so segments can be created and applied without exporting user-level data. In independent or cloud-based frameworks, activation is more variable and often depends on integrations, governance rules, and the level of portability the parties are comfortable allowing. If activation is a primary goal, it’s important to clarify early where activation will happen, what exactly will leave the environment, and how privacy constraints will be enforced end-to-end.
What should we look for when choosing a clean room solution for data collaboration?
Start with whether the clean room solution fits your specific data collaboration needs: the partners you actually work with, the match keys you can legally use, and the measurement questions you need to answer. The best choice is usually the one that enforces clear constraints, supports repeatable workflows, and integrates cleanly with how you already run reporting and activation.
Have other questions?
If you have more questions, contact us so we can help.