AI Shopping Bots Are Ghosts

Digital analytics platforms like Google Analytics 4 and Adobe Analytics, along with real user monitoring (RUM) tools, aren’t capturing the surge of new AI-driven shopping traffic. This isn’t because the bots are trying to sneak around these systems. It’s simply that these tools were built to track human journeys inside a browser, not bots that fetch web content in entirely different ways.

Most reputable AI agents actually identify themselves in HTTP headers with obvious user-agent strings, often ending in “-User” (think Claude-User, Perplexity-User, ChatGPT-User). They’re doing their part to be transparent, but your analytics dashboards don’t show them.

Why not? Because analytics beacons and RUM libraries rely on JavaScript running in the browser. Bots like these typically just request the HTML and stop there. No scripts are executed, no beacons fire. Perplexity explicitly documents its Perplexity-User agent and publishes IP lists for it. Anthropic describes Claude-User as user-initiated access. OpenAI documents ChatGPT-User, but there’s no indication it executes site JavaScript. Google, by contrast, is very clear about how Googlebot works, and in some cases it even renders pages to understand them fully.
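Spotting these agents on the server side is straightforward precisely because they announce themselves. Here’s a minimal sketch in Python, assuming only that the User-Agent header carries one of the names above; the helper function and the sample header value are illustrative, not any vendor’s exact format:

```python
# Known self-identifying AI fetcher tokens from the vendors' documentation.
AI_AGENT_TOKENS = ("Claude-User", "Perplexity-User", "ChatGPT-User")

def is_ai_shopping_agent(user_agent: str) -> bool:
    """Return True when the User-Agent names a known self-identifying AI fetcher."""
    return any(token in user_agent for token in AI_AGENT_TOKENS)

# Hypothetical, abbreviated header value for illustration:
ua = "Mozilla/5.0 (compatible; Perplexity-User/1.0; +https://perplexity.ai/perplexity-user)"
print(is_ai_shopping_agent(ua))  # True
```

Because the check runs where the request lands, it counts these visits even though no analytics beacon will ever fire for them.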

And just to be clear, I’m not talking about malicious bots trying to hoard product drops or flood your site with junk traffic. This is about something much more everyday: you ask Perplexity, “Show me the warmest full-length women’s parkas,” and on your behalf the system sends out a server-side fetcher, a program running on Perplexity’s backend that grabs your product page directly from your servers. It doesn’t spin up a browser. It doesn’t load your CSS, images, or scripts. Some companies are experimenting with headless browsers or full browser rendering, but today the overwhelming majority of consumer AI shopping activity is still simple server-side fetches.
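To make that concrete, here is roughly what such a fetch looks like, a sketch using only the Python standard library, with a placeholder URL and a hypothetical agent name:

```python
# One HTTP GET for the HTML, nothing more: no browser, no CSS, no images,
# and crucially no JavaScript, so no analytics beacon ever runs.
import urllib.request

req = urllib.request.Request(
    "https://example-retailer.com/womens-parkas",     # placeholder URL
    headers={"User-Agent": "ExampleAgent-User/1.0"},  # hypothetical agent
)
with urllib.request.urlopen(req) as resp:
    html = resp.read().decode("utf-8", errors="replace")

# The fetcher parses this HTML for product details and stops here; any
# <script> tags in `html` are never executed.
print(f"{len(html)} bytes of HTML fetched; zero scripts run")
```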

This creates a huge blind spot. Only companies with the ability to dig into server logs or CDN data can piece together what’s happening. And even then, it takes effort. Meanwhile, studies show very few consumers actually click through the citations in AI answers. David Bell at Previsible reported in August 2025 that AI-driven traffic is up 527%. Impressive on the surface, but that figure counts only the subset of people who actually click a citation link. Since GA4 filters out known bots, it excludes all the direct AI fetches that never appear in browser-based analytics. The real shopping activity triggered by AI remains invisible.
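That digging can start small. Here’s a sketch of a first pass over an access log, assuming the common combined log format where the User-Agent is the last quoted field; the file name is a placeholder, and your CDN’s export format may differ:

```python
# Count requests whose User-Agent contains "-User", grouped by agent.
import re
from collections import Counter

UA_PATTERN = re.compile(r'"([^"]*-User[^"]*)"\s*$')  # last quoted field

counts: Counter[str] = Counter()
with open("access.log", encoding="utf-8", errors="replace") as log:
    for line in log:
        match = UA_PATTERN.search(line)
        if match:
            counts[match.group(1)] += 1

for agent, hits in counts.most_common():
    print(f"{hits:>8}  {agent}")
```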

That’s the real challenge in this new agentic shopping world. All of our analytics tools were built around JavaScript libraries that run in the browser. That model worked perfectly when every shopper was a person loading a page. It’s how we’ve always known what percentage of our traffic was on Firefox or how many people shopped from their phones. But our new shopping bot “friends” don’t load JavaScript. They’re ghosts to every mainstream marketing and analytics platform.

To ground this in data, we looked at our own network at Yottaa. Across 700+ retail sites from August 7 to September 12, 2025, roughly 525 million sessions, we saw just 295 RUM sessions with a “-User” user-agent string. That works out to about 0.00006%, essentially zero, and exactly what you’d expect. Bots that don’t run JavaScript don’t show up in RUM.

So where do we go from here? The answer isn’t client-side or server-side alone, but a hybrid lens. Edge and server data tell you who’s knocking. RUM shows you what human shoppers experience once inside. Brought together, you get the full picture: the fidelity of performance analytics for people, and visibility into the invisible AI agents that are already shaping how shopping journeys begin.
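As a toy illustration of that join, here’s a sketch that combines the two sources; every number below is hypothetical, standing in for real CDN-log and RUM exports:

```python
# Edge/server logs tell you who's knocking; RUM tells you what humans
# experience. Joining them yields the AI share of total visits.
edge_fetches = {            # per-agent counts from server/CDN logs (hypothetical)
    "Perplexity-User": 41_200,
    "ChatGPT-User": 28_750,
    "Claude-User": 9_310,
}
rum_sessions = 1_250_000    # human sessions from browser-based RUM (hypothetical)

ai_fetches = sum(edge_fetches.values())
total_visits = rum_sessions + ai_fetches

print(f"AI agent fetches:   {ai_fetches:,}")
print(f"Human sessions:     {rum_sessions:,}")
print(f"AI share of visits: {ai_fetches / total_visits:.2%}")
```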