"Taste Graph"—Visual and Intent-Led Discovery in 2026

Shane Miller

May 1, 2026

8 minutes read

Nobody has ever described the jacket they want by typing "asymmetric cotton-blend overshirt with dropped shoulder seam in desaturated sage." They point at it. They screenshot it. They save it to a board and forget about it until something similar appears three weeks later in a shop window. Commerce has always been visual; the infrastructure just wasn't. Until now.


Pinterest's "taste graph"—the company's term for the system that maps billions of saves, searches, and clicks into a rolling portrait of each user's aesthetic sensibility—has grown more than 70% in two years. Gen Z, who now make up the majority of the platform's users, are 68% more likely than older cohorts to start a shopping journey with an image or a video rather than a typed query. Google Lens alone now processes over 20 billion visual searches a month—a 43% jump from its 2024 average of 14 billion—and image-based queries account for more than a quarter of all Google searches. The numbers tell a story that most brand strategies have not yet caught up with: the keyword, that blunt little workhorse of digital commerce, is being quietly pensioned off.

Pic. The discovery-to-purchase journey, then and now.

Looking, not typing

What the taste graph does, in practice, is read rooms. Not literally—but close enough to make the distinction academic. It picks apart visual signals: textures, silhouettes, colour temperatures, spatial arrangements, the particular quality of light in a photograph of someone's kitchen. It cross-references those signals against a user's history of looking and saving, builds a running model of what that person finds appealing, and serves up products to match. Pinterest's proprietary multimodal model now outperforms leading off-the-shelf visual search tools by more than 30 percentage points on recommendation relevance.
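The core mechanic described above — embed what a user is looking at, compare it against a catalogue, return the closest matches — can be sketched in a few lines. This is a minimal illustration of embedding-based visual recommendation, not Pinterest's implementation; the product names and toy four-dimensional "embeddings" are invented for the example, where a real system would use vectors from a multimodal encoder.

```python
import numpy as np

def cosine_sim(query, catalogue):
    # Cosine similarity between one query vector and a matrix of product vectors.
    return (catalogue @ query) / (
        np.linalg.norm(catalogue, axis=1) * np.linalg.norm(query) + 1e-9
    )

def recommend(query_emb, product_embs, product_ids, top_k=3):
    """Rank products by visual similarity to the query embedding."""
    scores = cosine_sim(query_emb, product_embs)
    order = np.argsort(scores)[::-1][:top_k]
    return [(product_ids[i], float(scores[i])) for i in order]

# Toy embeddings standing in for the output of a visual encoder.
rng = np.random.default_rng(0)
products = rng.normal(size=(5, 4))
ids = ["sage-overshirt", "oak-shelf", "linen-sofa", "brass-lamp", "wool-rug"]

# A screenshot visually close to product 0: its embedding lands nearby.
query = products[0] + rng.normal(scale=0.05, size=4)

print(recommend(query, products, ids))
```

Because the query embedding sits almost on top of the overshirt's, that product ranks first — the same logic, at billions-of-images scale, is what lets a photographed bookshelf return shoppable furniture.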

The implications land differently depending on where you sit. 

  • For a consumer, it means photographing a friend's bookshelf and getting a shoppable list of similar furniture within seconds. 
  • For a brand, it means that product imagery—the thing most companies still treat as a box-ticking exercise in the e-commerce workflow—has become the front door. Or, more accurately, the only door. 

If your SKU photos are generic white-background affairs, shot without thought for aesthetic context, the machines that now mediate discovery will simply look past you. You will not be rejected; you will be invisible, which is worse.

This amounts to a new kind of brand asset, one you might call visual equity. In a world where AI indexes look as readily as it indexes text, every product image functions as a searchable signal. Get it right and you show up when someone's taste aligns with what you sell. Get it wrong—or, more commonly, do not think about it at all—and you have opted out of a discovery channel that is growing at roughly 17% a year.

The machines that shop

If the taste graph were only about inspiration, it would be interesting but manageable. What makes it genuinely disruptive is that inspiration and transaction are merging—collapsing into a single moment mediated not by a human browsing a checkout page but by an AI agent executing a purchase.

The infrastructure materialised with startling speed. In January 2026, Google's Sundar Pichai stood up at the National Retail Federation conference and announced the Universal Commerce Protocol (UCP), an open standard co-developed with Shopify, Walmart, Target, and more than 20 other partners. OpenAI and Stripe responded with their own competing protocol. By March, Shopify had rolled out "agentic storefronts" to millions of merchants, making their products automatically discoverable—and purchasable—inside ChatGPT.

Let that land for a moment. A consumer can now describe what they want to an AI assistant, or simply show it an image, and the agent will identify the aesthetic intent, query product catalogues, evaluate options against constraints like price and delivery speed, and complete the purchase—all without the consumer ever visiting a website. According to IBM's Institute for Business Value, 45% of consumers already use AI for at least part of their buying journey. 

Pic. How consumers use AI in the shopping journey.

McKinsey projects the global agentic commerce opportunity at $3 to $5 trillion by 2030. Morgan Stanley's research suggests that nearly half of online shoppers will delegate purchases to AI agents by the end of the decade.

Pic. Share of ecommerce spend in a bull case scenario (Source).

The checkout button, in other words, is becoming optional. What replaces it is something the industry has started calling "agent legibility"—the degree to which your product data is structured so that a machine can interpret, compare, and act on it in milliseconds. Clean metadata, consistent attributes, real-time inventory feeds, unambiguous fulfilment terms. This is not glamorous work. It is also, increasingly, the work that determines whether your product exists in the eyes of the systems doing the buying.
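In practice, "agent legibility" starts with exposing each product as structured, machine-readable data. The sketch below uses schema.org's public Product and Offer vocabulary as a stand-in; the actual field requirements of UCP or any given agent platform may differ, and the product, URL, and `REQUIRED` list here are purely illustrative.

```python
import json

# A product described the way an agent can parse it: explicit attributes,
# explicit price and availability, nothing buried in free text.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Asymmetric cotton-blend overshirt",
    "color": "desaturated sage",
    "material": "cotton blend",
    "image": "https://example.com/img/overshirt-front.jpg",  # hypothetical URL
    "offers": {
        "@type": "Offer",
        "price": "120.00",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock",
    },
}

# Illustrative minimum an agent would need to compare and act on an item.
REQUIRED = ["name", "image", "offers"]

def legibility_gaps(p):
    """Return the required attributes an agent would find missing."""
    return [k for k in REQUIRED if not p.get(k)]

assert legibility_gaps(product) == []
print(json.dumps(product, indent=2))
```

A feed-level check like `legibility_gaps` is the unglamorous plumbing the section describes: run it across the catalogue, and every non-empty result is a product that may simply not exist to the systems doing the buying.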

The search engine is not the search engine anymore

Here is where things get properly uncomfortable for anyone who has spent the last decade perfecting their Google rankings. Gartner predicts that 25% of organic search traffic will migrate to AI chatbots and virtual agents by 2026. Forbes reports that brands optimizing for answer engines—the emerging discipline known as Answer Engine Optimization, or AEO—are seeing conversion rates up to nine times higher than those relying on traditional search alone.

But the truly sobering number is this: the overlap between sources cited in AI-generated answers and Google's top 10 organic results is just 12%. For ChatGPT specifically, it drops to 8%. Roughly 80% of the sources that large language models choose to cite do not rank in Google's top 100 for the original query. Ranking on page one of Google, the thing brands have spent fortunes chasing for the better part of two decades, does not predict whether you will be visible in the environments where a growing share of purchase decisions now happen.

Pic. Citations overlap between AI & the top 10 search results (Source).

AEO has so far focused largely on text—structuring written content so AI systems can extract and cite it. The next frontier is visual. Product catalogues need to be legible to machines across voice, text, and image simultaneously. Alt text, structured metadata, visual schema, rich product attributes—these become the signals that determine whether the AI eye can parse your brand's aesthetic identity and surface it at the point of intent. 
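To make the contrast concrete, here is the same product image described two ways. The attribute-rich version gives a multimodal indexer far more to match against; the field names and the crude signal count are illustrative assumptions, not a published visual-schema standard.

```python
# The bare minimum most e-commerce workflows ship today.
generic = {"src": "sku-1042.jpg", "alt": "product photo"}

# The same image, annotated for machine eyes as well as human ones.
descriptive = {
    "src": "sku-1042.jpg",
    "alt": "Desaturated sage overshirt with dropped shoulder seam, "
           "styled against a linen backdrop in warm afternoon light",
    "attributes": {
        "silhouette": "boxy, dropped shoulder",
        "texture": "brushed cotton blend",
        "colorTemperature": "warm, desaturated",
        "context": "lifestyle, interior",
    },
}

def signal_count(img):
    """Rough proxy: distinct textual signals an indexer can extract."""
    return len(img.get("alt", "").split()) + len(img.get("attributes", {}))

print(signal_count(generic), signal_count(descriptive))
```

The gap between those two counts is the gap between being parsed and being passed over at the point of intent.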

It is a strange new world in which the first audience for your product photography is not a human being but a model trained on several billion images. And yet here we are.

Measuring what you cannot click

None of this sits easily with legacy measurement. Click-through rate, the metric that has underwritten digital marketing budgets for a generation, measures human interaction with a page. When discovery is visual, purchase is agentic, and the consumer never touches a website, CTR measures nothing at all.

What matters instead is something closer to "visual inclusion"—how frequently your products surface as recommendations within visual search results, AI-generated shopping responses, and agent-mediated purchase flows. Tracking that requires a different kind of intelligence layer, one built to measure business outcomes rather than media activity.
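A "visual inclusion" metric can be defined quite simply once the underlying data exists: of the sessions where your brand was eligible to appear, in what share did it actually surface? The session records and field names below are invented for illustration; real measurement would draw on platform-level feeds.

```python
def visual_inclusion_rate(sessions, brand):
    """Share of eligible AI/visual-search sessions where the brand surfaced."""
    relevant = [s for s in sessions if brand in s.get("eligible_brands", [])]
    if not relevant:
        return 0.0
    included = sum(1 for s in relevant if brand in s["surfaced_brands"])
    return included / len(relevant)

# Hypothetical session log: which brands matched the query's aesthetic,
# and which ones the recommender or agent actually surfaced.
sessions = [
    {"eligible_brands": ["acme", "norr"], "surfaced_brands": ["acme"]},
    {"eligible_brands": ["acme"], "surfaced_brands": []},
    {"eligible_brands": ["norr"], "surfaced_brands": ["norr"]},
    {"eligible_brands": ["acme", "norr"], "surfaced_brands": ["acme", "norr"]},
]

print(visual_inclusion_rate(sessions, "acme"))  # surfaced in 2 of 3 eligible sessions
```

Unlike CTR, the metric says nothing about clicks; it measures presence in the moments where the machine, not the human, is doing the browsing.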

AI Digital's ELEVATE—a vendor-agnostic marketing intelligence platform processing 150 billion data points monthly across 12+ DSPs—is designed for exactly this shift:

  • Its custom KPI optimization aligns campaign measurement with business objectives rather than defaulting to impressions or CPM. 
  • Its impact scoring identifies which campaign dimensions actually move the needle. 
  • Marketing Mix Modeling provides a statistically grounded view of how upper-funnel and awareness channels—the kind of channels where visual discovery lives—influence overall performance. 
  • Path to Conversion mapping traces the thread from a taste graph recommendation through to a completed transaction. 
  • And the AI Digital Brand Study quantifies the longer-term effect: how visual discovery shapes brand preference and purchase intent over time, the "visual halo" that sits well beyond any single conversion event.

The deeper point is not about any one platform. It is that the relationship between discovery and measurement has broken and needs to be rebuilt. When a machine selects your product based on visual match and structured data—bypassing the click, the browse, the comparison page—traditional attribution has nothing useful to say. Outcome-first measurement, rooted in what the business actually gained, is the only thing left that tells the truth.

What this means, plainly

A brand's viability in 2026 hinges on a dual capability that would have sounded absurd five years ago: being visually attractive to AI systems that curate taste, and being structurally legible to AI agents that complete purchases. These are not two separate projects for two separate teams. They are the same strategic problem, viewed from different angles.

The brands that come through this will be the ones that combine a distinctive visual identity—the kind of coherent aesthetic sensibility that makes a taste graph sit up and pay attention—with the unsexy infrastructure work of clean data, structured feeds, and machine-readable product attributes. Art and plumbing, in other words. Which, come to think of it, has always been the story of good commerce.

To explore how visual and intent-led discovery can shape your media strategy, connect with the AI Digital team. We’d welcome a productive discussion!

Use case: Audience segmentation and insights
Description: Identify and categorize audience groups based on behaviors, preferences, and characteristics.
Examples of companies using AI:
  • Michaels Stores: Implemented a genAI platform that increased email personalization from 20% to 95%, leading to a 41% boost in SMS click-through rates and a 25% increase in engagement.
  • Estée Lauder: Partnered with Google Cloud to leverage genAI technologies for real-time consumer feedback monitoring and analyzing consumer sentiment across various channels.
Ease of implementation: High. Impact: Medium.

Use case: Automated ad campaigns
Description: Automate ad creation, placement, and optimization across various platforms.
Examples of companies using AI:
  • Showmax: Partnered with AI firms to automate ad creation and testing, reducing production time by 70% while streamlining their quality assurance process.
  • Headway: Employed AI tools for ad creation and optimization, boosting performance by 40% and reaching 3.3 billion impressions while incorporating AI-generated content in 20% of their paid campaigns.
Ease of implementation: High. Impact: High.

Use case: Brand sentiment tracking
Description: Monitor and analyze public opinion about a brand across multiple channels in real time.
Examples of companies using AI:
  • L’Oréal: Analyzed millions of online comments, images, and videos to identify potential product innovation opportunities, effectively tracking brand sentiment and consumer trends.
  • Kellogg Company: Used AI to scan trending recipes featuring cereal, leveraging this data to launch targeted social campaigns that capitalize on positive brand sentiment and culinary trends.
Ease of implementation: High. Impact: Low.

Use case: Campaign strategy optimization
Description: Analyze data to predict optimal campaign approaches, channels, and timing.
Examples of companies using AI:
  • DoorDash: Leveraged Google’s AI-powered Demand Gen tool, which boosted its conversion rate by 15 times and improved cost per action efficiency by 50% compared with previous campaigns.
  • Kitsch: Employed Meta’s Advantage+ shopping campaigns with AI-powered tools to optimize campaigns, identifying and delivering top-performing ads to high-value consumers.
Ease of implementation: High. Impact: High.

Use case: Content strategy
Description: Generate content ideas, predict performance, and optimize distribution strategies.
Examples of companies using AI:
  • JPMorgan Chase: Collaborated with Persado to develop LLMs for marketing copy, achieving up to 450% higher click-through rates compared with human-written ads in pilot tests.
  • Hotel Chocolat: Employed genAI for concept development and production of its Velvetiser TV ad, which earned the highest-ever System1 score for a domestic appliance commercial.
Ease of implementation: High. Impact: High.

Use case: Personalization strategy development
Description: Create tailored messaging and experiences for consumers at scale.
Examples of companies using AI:
  • Stitch Fix: Uses genAI to help stylists interpret customer feedback and provide product recommendations, effectively personalizing shopping experiences.
  • Instacart: Uses genAI to offer customers personalized recipes, meal-planning ideas, and shopping lists based on individual preferences and habits.
Ease of implementation: Medium. Impact: Medium.

Questions? We have answers

If you have more questions, contact us so we can help.