Generative AI is rapidly changing how people search for and discover products online. As an e-commerce veteran, I’ve noticed a curious gap that’s about to become impossible to ignore: our sleek, modern websites might look great to human shoppers, but to AI-based visitors they can be practically invisible. We’ve optimized for Google and for mobile devices, yet we now have to consider a new audience: AI crawlers and chatbots. It’s time to take a hard look at how our content appears to these bots and explore “adaptive content for bots” as the next evolution in digital strategy.
Current State: AI Crawler Limitations and Evidence
AI-driven crawlers (like OpenAI’s GPTBot or Anthropic’s ClaudeBot) are already hitting websites in huge numbers – one report noted that GPTBot and ClaudeBot made roughly 939 million requests in a single month, about 28% of Googlebot’s volume. In other words, these AI bots are a significant (and growing) share of web traffic. The catch? They’re often crawling with blinders on. Consider the evidence:
- No JavaScript Rendering: Unlike Google, most AI crawlers do not execute JavaScript when they crawl a page. GPTBot, ChatGPT’s browser plugin, Claude’s bot – none of them run your scripts. They fetch the raw HTML and that’s it. If your site relies on client-side JS to load core content, the bot simply won’t see it. (OpenAI’s crawler will grab your .js files, but it won’t actually run them.)
- Invisible Dynamic Content: Any content injected after the initial page load – via AJAX calls, SPAs, React components, etc. – might as well not exist from an AI’s perspective. Plenty of people have tested this: feed ChatGPT a React-based product page and you often get a blank or boilerplate answer. In one test it responded that “the content may be JavaScript-based and couldn’t be retrieved”. The AI essentially shrugged because it saw almost nothing. I recently asked Claude for a business’s phone number, displayed in giant type in the site’s footer, and it couldn’t see it.
- Limited Interaction: These crawlers don’t click buttons, scroll, or wait for asynchronous data. They operate like a very primitive browser. That means features like infinite scroll product catalogs, content behind login or pop-ups, or user-triggered filters will not be navigated or understood by AI agents in their current form.
- Blocking and Access Issues: On top of technical limits, many websites are outright blocking AI crawlers via robots.txt due to data usage concerns. According to PPC Land, over 35% of the top 1000 websites now block OpenAI’s GPTBot (up from just 5% a year prior). Major companies like Amazon quickly implemented such blocks when GPTBot first appeared. This trend reflects mistrust of AI scraping, but it has a side effect: if you block these bots, your content won’t show up in their models or answers. As one analysis put it, sites face a dilemma between protecting content and maintaining visibility in AI-driven search results.
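For what it’s worth, the blocking itself is trivial. Both OpenAI and Anthropic document their crawlers’ user-agent tokens, so opting an entire site out is a couple of lines in robots.txt. It also means it’s worth double-checking your own file to make sure a rule like this isn’t silently shutting you out:

```
# Opt the entire site out of OpenAI's and Anthropic's crawlers
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /
```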
In short, today’s AI crawlers have significant tunnel vision. They’re powerful, but they primarily read static HTML and often only what’s immediately available. Anything not in that initial payload might as well be invisible. That’s a stark contrast to Google’s crawler, which will actually render JavaScript in a second-wave indexing process. We’ve spent years building rich, interactive sites under the assumption that “Google will handle it” – but now along comes a new class of crawlers that don’t handle it.
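Before investing in any fix, it’s worth seeing your site the way these crawlers do. Below is a minimal audit sketch in TypeScript (Node 18+, which ships a global fetch): it pulls the raw HTML exactly as a non-JS-executing bot would and checks whether your critical content is in that first payload. The URL, user-agent string, and phrases are placeholders to swap for your own.

```typescript
// audit-bot-view.ts
// Approximates what a JavaScript-blind AI crawler sees: one raw HTML fetch,
// no script execution, no clicks, no scrolling.
// Runs on Node 18+ (global fetch). URL and phrases below are placeholders.

const url = "https://www.example-store.com/products/widget"; // your page
const mustAppear = ["Acme Widget", "$49.99", "Add to cart"]; // your key content

async function main(): Promise<void> {
  const res = await fetch(url, {
    // Some sites vary responses by user agent; identify roughly as GPTBot.
    headers: { "User-Agent": "GPTBot/1.1 (+https://openai.com/gptbot)" },
  });
  const html = await res.text();
  console.log(`Status ${res.status}, ${html.length} bytes of raw HTML`);

  for (const phrase of mustAppear) {
    const found = html.includes(phrase);
    console.log(`${found ? "FOUND  " : "MISSING"} "${phrase}"`);
  }
  // Anything reported MISSING only appears after client-side rendering,
  // which is precisely the step today's AI crawlers skip.
}

main().catch(console.error);
```

If a phrase shows up in your browser’s inspector but comes back MISSING here, you’ve found the gap.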
Adaptive Content for Bots – The “M.Brand.com” Parallel
Historical Precedent: The mobile web revolution saw companies create separate “m.brand.com” sites optimized for mobile devices. Today’s challenge calls for a similar move: AI-optimized content delivery that automatically generates bot-readable versions of JavaScript-heavy sites.
So how do we bridge this gap? This is where adaptive content for bots comes in. Just as we once embraced responsive design for different screen sizes, we may need to adopt a kind of “responsive content” strategy for different visitor types (human vs. AI). The goal isn’t to cheat or cloak content, but to ensure the AI sees the same key information that a human would – delivered in a bot-friendly way.
Think of it as creating a simplified, pre-rendered storefront for AI visitors. When an AI crawler (or a chatbot’s browsing tool) comes knocking, the site could detect that user agent and serve up content that doesn’t require a human browser to make sense. This might involve techniques like server-side rendering (SSR) or generating a static snapshot of your page content on the fly. In fact, SEO experts are already recommending server-side rendering of important content so AI bots can understand and index it. It’s the same idea as making a website accessible – but here our “accessibility” target is an algorithm rather than a screen reader.
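To make that concrete, here’s a minimal sketch of what the detect-and-serve step could look like as an edge function, written in Cloudflare Workers style. Everything here is illustrative: the user-agent token list is partial, and PRERENDER_ORIGIN stands in for whatever SSR endpoint or snapshot store you actually use.

```typescript
// Minimal sketch of adaptive content for bots at the edge (Cloudflare
// Workers style). Known AI-crawler user agents get a pre-rendered, fully
// static snapshot of the same page; everyone else gets the normal site.

const AI_BOT_TOKENS = ["GPTBot", "ChatGPT-User", "ClaudeBot", "PerplexityBot"];

// Hypothetical service that returns server-rendered HTML for a given path.
const PRERENDER_ORIGIN = "https://prerender.example-store.com";

export default {
  async fetch(request: Request): Promise<Response> {
    const ua = request.headers.get("User-Agent") ?? "";
    const isAiBot = AI_BOT_TOKENS.some((token) => ua.includes(token));

    if (isAiBot) {
      // Same information a human would see, flattened into plain HTML.
      const { pathname, search } = new URL(request.url);
      return fetch(`${PRERENDER_ORIGIN}${pathname}${search}`);
    }

    // Humans (and JS-capable crawlers like Googlebot) get the regular site.
    return fetch(request);
  },
};
```

The crucial constraint, echoed in the consistency principle below, is that the snapshot carries the same information as the client-rendered page. This is adaptation, not cloaking.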
A few guiding principles are emerging for adaptive content targeting bots:
- Provide Full HTML Content: Ensure that all critical text (product names, descriptions, prices, reviews) is present in the initial HTML response. If that means rendering it on the server or at the edge, it’s worth it. The bot should not need to run any scripts to get the gist of your page.
- Preserve Consistency: The content you serve to bots should match what users see. This isn’t about showing something different (which could be seen as deceptive); it’s about delivering the same information in a more digestible format. For example, a dropdown menu of specs could be expanded into a simple list in the bot-facing HTML.
- Use Standard Signals: Leverage structured data (schema markup) and emerging conventions like the proposed llms.txt file to guide bots. While still experimental, these can help ensure the AI understands the context of your content. At minimum, keep using schema markup for products, reviews, and the like, since AI agents may pay attention to it much as Google does. A sketch of what that markup can look like follows this list.
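On that last point, structured data is the one piece here with a mature, well-documented vocabulary. As a small illustration (the product and helper function are made up; the schema.org terms are standard), here’s how server-rendered markup might emit Product JSON-LD:

```typescript
// Emit schema.org Product markup as a JSON-LD <script> tag, so that both
// Google and AI crawlers get machine-readable product facts in the raw HTML.

interface Product {
  name: string;
  description: string;
  sku: string;
  price: number;
  currency: string; // ISO 4217 code, e.g. "USD"
  inStock: boolean;
}

function productJsonLd(p: Product): string {
  const data = {
    "@context": "https://schema.org",
    "@type": "Product",
    name: p.name,
    description: p.description,
    sku: p.sku,
    offers: {
      "@type": "Offer",
      price: p.price.toFixed(2),
      priceCurrency: p.currency,
      availability: p.inStock
        ? "https://schema.org/InStock"
        : "https://schema.org/OutOfStock",
    },
  };
  // Embed this in the page <head> during server-side rendering.
  return `<script type="application/ld+json">${JSON.stringify(data)}</script>`;
}

// Example with a fictional product:
console.log(
  productJsonLd({
    name: "Acme Noise-Cancelling Headphones",
    description: "Over-ear wireless headphones with 30-hour battery life.",
    sku: "ACME-NC-100",
    price: 199.0,
    currency: "USD",
    inStock: true,
  })
);
```

Because the tag is generated server-side, it lands in the initial HTML payload, exactly where a non-JS crawler looks.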
In practice, adaptive content for bots might mean your e-commerce platform, CDN, or middleware detects AI crawlers and serves them an unobfuscated version of the page. No fancy client-side tricks—just the goods, plain and simple. This way, when Perplexity or another AI reads your site, it actually finds what it needs to confidently include your products or information in its answers.
We have optimized for desktop, then mobile; we optimized for web accessibility; now we have to optimize for AI. It’s a natural next step in the evolution of content delivery. And given how fast AI-driven search is growing, it’s one we can’t afford to ignore.
Why It Matters for Online Retailers
You might be thinking: this sounds like a lot of technical fuss for bots—why should retailers prioritize this? The short answer is that AI-driven search and shopping are quickly becoming mainstream, and they have real revenue implications. Here’s why this trend should be on every online retailer’s radar:
- Shifting Consumer Behavior: A significant chunk of consumers are already using generative AI platforms as a starting point for search and shopping queries. About 10% of U.S. consumers now prefer AI chat platforms for search, a group projected to swell to over 90 million people by 2027. If even a fraction of those users are asking AI for product recommendations or answers, you want your brand to be in the response. Ignoring this channel could mean missing out on a growing audience.
- Higher-Intent Traffic: Early data suggests that AI-referred traffic is especially valuable. Users who arrive via an AI assistant often spend more time on site and convert at higher rates than typical search visitors. One study by Zen Agency noted that AI-referred visitors stayed about 2.3 minutes longer and had ~1.5× higher conversion rates compared to regular search traffic. It makes sense – if someone asks an AI for “the best noise-cancelling headphones” and your product is recommended, that visitor is likely far down the purchase funnel and ready to act. Being invisible to these AI assistants means losing out on some of the most qualified leads out there.
- New AI Shopping Experiences: Major AI players are actively integrating shopping features. OpenAI recently announced a shopping mode in ChatGPT that can surface product picks with buy buttons embedded. Essentially, ChatGPT might become a concierge that points consumers directly to purchase links on retailer sites. But here’s the rub: if ChatGPT can’t properly read your site or doesn’t know about your products (perhaps because your content was hidden or you blocked the crawler), your items won’t be among those recommendations. Instead, it will favor products it can read about – possibly your competitors’. Retailers need to ensure they’re not inadvertently closed off from these emerging AI-driven storefronts.
- SEO Isn’t Enough Anymore: For years, we focused on climbing Google’s rankings. Now there’s a parallel challenge of getting noticed by AI algorithms (some call it Generative Engine Optimization or GEO). It’s not just about traditional SEO signals; it’s about feeding the AI the right data. If your beautifully designed site is essentially a blank page to an AI crawler, all the classic SEO optimizations won’t help in this new context. Being proactive about adaptive content ensures you don’t lose ground as search paradigms shift.
In essence, the rise of AI search means online retailers must double-check their digital storefront’s visibility. It’s no longer enough that a human with a web browser can navigate your site – now you have to consider the AI “visitor” who is blind to anything beyond raw text. The payoff for doing so is not just maintaining traffic, but tapping into a channel where recommendations carry extra weight and user engagement is high.
Looking Ahead
All of this paints a picture of a web in flux. We have a bit of a Wild West situation where AI models are reshaping how consumers find information, yet the infrastructure of the web hasn’t fully caught up. The good news is we’ve faced similar challenges before (remember the early mobile web, or the move to responsive design?) and we have the tools to adapt again.
Personally, I find this challenge exciting. Right now the work is diagnostic and exploratory – we’re identifying the cracks in the system. As someone who’s spent a career at the intersection of commerce and technology, I know the solution is around the corner. I won’t delve into specific ideas here, but let’s just say there is a way to make adaptive content for bots dead simple for any e-commerce site to implement. My goal is to ensure retailers don’t need to overhaul their tech stack just to be seen by an AI.
For now, the key takeaway is awareness. Know that these limitations exist, audit your own site, and start thinking about how to serve AI crawlers the content they need. It might be as straightforward as flipping on server-side rendering for key pages, or as involved as using an edge worker to generate static snapshots. Whatever the approach, the first step is recognizing the gap.
The storefront of the future isn’t just what humans see – it’s what algorithms see as well. Adaptive content for bots could well become standard practice in the years ahead. Or perhaps none of this will matter, because AI bots will have learned to execute JavaScript by the time you read this old blog entry.