
There's a seismic shift happening in how your customers discover products online, and most D2C marketing teams aren't paying attention. I'm talking about the rise of AI crawlers—bots from OpenAI, Anthropic, Perplexity, and others that are fundamentally reshaping how content gets discovered, surfaced, and recommended to consumers.
And here's the uncomfortable truth: if your website isn't optimized for these new crawlers, you might as well not exist for a growing segment of your potential customers.
Let's start with some data that grabbed our attention.
According to N7’s server log data, GPTBot (OpenAI's crawler) saw its share of AI crawler traffic surge from roughly 5% to 28% between May 2024 and May 2025. Anthropic's ClaudeBot and Meta's crawler are growing rapidly too. Collectively, these AI crawlers now represent about 20% of Googlebot's total crawl volume.
That’s a fundamental shift in how the web is being indexed.
The Web Almanac's 2025 SEO chapter documented that websites are scrambling to respond. GPTBot now appears in 4.5% of all robots.txt files, with ClaudeBot at 3.6%, representing one of the fastest expansions of robots.txt directives for named user agents in recent memory.
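For context, opting a site in or out of these crawlers is done with ordinary robots.txt directives against the named user agents. A sketch of what many of these files now contain (the paths here are illustrative, not a recommendation):

```text
# Let OpenAI's crawler see the catalog, but keep it out of account areas
User-agent: GPTBot
Disallow: /account/

# Block Anthropic's crawler entirely
User-agent: ClaudeBot
Disallow: /
```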
Some sites are now seeing over 1% of their total traffic coming from platforms like ChatGPT, Perplexity, and Copilot.
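You can get a rough read on these numbers from your own server logs. A minimal sketch, assuming common-log-format access logs (the sample lines and user-agent tokens here are illustrative; real logs vary by server configuration):

```python
from collections import Counter

# Illustrative user-agent substrings for the crawlers discussed here
AI_CRAWLERS = ["GPTBot", "ClaudeBot", "PerplexityBot"]

def crawler_counts(log_lines):
    """Tally requests per AI crawler from raw access-log lines."""
    counts = Counter()
    for line in log_lines:
        for bot in AI_CRAWLERS:
            if bot in line:
                counts[bot] += 1
    return counts

# Illustrative access-log lines
sample = [
    '1.2.3.4 - - [01/May/2025] "GET /products/shoe HTTP/1.1" 200 "-" "Mozilla/5.0 ... GPTBot/1.0"',
    '5.6.7.8 - - [01/May/2025] "GET / HTTP/1.1" 200 "-" "Mozilla/5.0 ... ClaudeBot/1.0"',
    '9.9.9.9 - - [01/May/2025] "GET / HTTP/1.1" 200 "-" "Mozilla/5.0 (regular browser)"',
]
print(crawler_counts(sample))
```

Comparing these tallies against your Googlebot counts over the same window gives you your own version of the ratio quoted above.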
E-commerce sites are seeing significant traffic impacts from AI Overviews and AI-driven search. Studies indicate that AI Overviews reduce clicks to websites by about 34.5% on average. For informational queries, the kind that drive top-of-funnel discovery for D2C brands, the click impact can be even more dramatic.
Think about how your customers shop. They're increasingly asking questions like "best sustainable sneakers for running" or "organic skincare routine for sensitive skin which costs not more than Rs 2000" to ChatGPT or Perplexity before they even open Google. If your brand isn't being surfaced in those AI-generated responses, you're losing mindshare before the customer journey even begins.
And here's the kicker: research shows that AI-referred users actually bounce 23% less, stay 41% longer on site, and view approximately 12% more pages compared to traditional organic users. These are high-intent visitors. You want them.
Vercel's analysis revealed a critical finding: none of the major AI crawlers currently render JavaScript. This includes OpenAI's GPTBot, Anthropic's ClaudeBot, Meta's crawler, and PerplexityBot. Because they don't execute JavaScript, they simply can't see content that loads dynamically.
Let that sink in. If your product pages, descriptions, or key content rely on client-side rendering, AI crawlers are essentially looking at a blank page.
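A quick way to gauge your exposure: fetch a product page's raw HTML, without executing any JavaScript, and check whether your key content is present, since that raw response is all these crawlers see. A minimal sketch, with a hypothetical client-rendered page:

```python
def visible_without_js(raw_html: str, key_phrases: list[str]) -> dict[str, bool]:
    """Report which key phrases appear in the server-delivered HTML,
    i.e. what a non-JS-rendering crawler can actually read."""
    lowered = raw_html.lower()
    return {phrase: phrase.lower() in lowered for phrase in key_phrases}

# Hypothetical server response: product data only arrives via client-side JS
raw_html = '<html><body><div id="root"></div><script src="/app.js"></script></body></html>'
report = visible_without_js(raw_html, ["Organic Cotton Tee", "Rs 1,499"])
print(report)  # every phrase missing: the page is effectively blank to these bots
```

If every phrase comes back missing, as it does here, your product content is invisible to GPTBot, ClaudeBot, and PerplexityBot regardless of how it looks in a browser.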
Google's Gemini is the exception here because it leverages Googlebot's infrastructure, which does render JavaScript. But that's just one player in an increasingly crowded field.
For D2C brands that have invested heavily in modern JavaScript frameworks, and let's be honest, most of us have, this creates a significant visibility gap. Your beautiful React-powered product pages might be completely invisible to ChatGPT, Claude, and Perplexity.
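One common mitigation, short of migrating fully to server-side rendering, is to detect these crawlers by user agent and serve them pre-rendered HTML. A sketch of the detection half (the token list is illustrative and not exhaustive, and user agents can be spoofed):

```python
# Illustrative tokens for non-JS-rendering AI crawlers; extend as new bots appear
NON_RENDERING_BOTS = ("GPTBot", "ClaudeBot", "PerplexityBot")

def wants_prerendered_html(user_agent: str) -> bool:
    """True if the request appears to come from an AI crawler
    that won't run our JavaScript."""
    ua = user_agent.lower()
    return any(token.lower() in ua for token in NON_RENDERING_BOTS)

ua = "Mozilla/5.0 AppleWebKit/537.36; compatible; GPTBot/1.2"
print(wants_prerendered_html(ua))  # True
```

In practice this check sits at the edge (CDN or middleware) and routes matching requests to a pre-rendered snapshot of the page.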
The old playbook was simple: rank on Google, drive traffic, convert visitors. That playbook is rapidly becoming incomplete.
The new reality requires what some are calling Generative Engine Optimization (GEO): ensuring your brand and products are not just indexed, but cited by AI systems when they generate responses.
Here's what the data tells us works: content these systems can actually read, meaning server-rendered HTML rather than client-side JavaScript, and content structured clearly enough to be quoted and cited in generated answers.
There's another dimension to this that D2C marketers need to understand: the asymmetry between what AI crawlers take and what they give back.
Research found that approximately 80% of AI crawling is for training purposes, while only about 18% is for search and just 2% for user-action queries. The ratio of pages crawled to visitors referred back varies wildly by platform, and in many cases AI systems are consuming far more content than they're sending traffic back for.
This creates a strategic question: How do you balance being discoverable by AI systems (which requires letting them crawl your content) with the reality that they might not always send traffic back?
In the short term, the visibility benefits outweigh the risks for most D2C brands. Being cited in AI responses builds brand awareness and credibility, even when it doesn't result in immediate clicks. Think of it like being mentioned in a trusted publication. The brand value compounds over time.
Alright, let's get actionable. Here's what you need to prioritize:

1. Audit what AI crawlers can actually see: fetch your key pages with JavaScript disabled and confirm that product names, descriptions, and prices appear in the raw HTML.
2. Move critical content to server-side rendering or pre-rendering if it doesn't.
3. Make a deliberate robots.txt decision for GPTBot, ClaudeBot, and the other named user agents, rather than blocking or allowing them by default.
4. Track AI-referred traffic from ChatGPT, Perplexity, and Copilot separately in your analytics, so you can see whether these high-intent visitors are actually converting.
The transition from traditional search to AI-mediated discovery isn't coming, it's here. The brands that adapt their technical infrastructure and content strategy now will have a significant advantage as this shift accelerates.
It’s tempting to wait and see. But the data is clear: AI crawlers are growing exponentially, AI-referred traffic is increasingly valuable, and the technical requirements for visibility are different from what we've been optimizing for.
The good news? The fundamentals still matter. Quality content, technical excellence, and brand authority haven't become less important. But the way we deliver them needs to evolve.
Your customers are already asking AI for recommendations. The only question is whether your brand will be part of the answer.