A controlled experiment from Heiko Hotz at O’Reilly showed something that should make ecommerce teams uncomfortable: an AI shopping agent chose a more expensive product from a merchant with clean structured data over a cheaper competitor whose product page was written in marketing language. The agent did not weigh the two prices and pick the worse deal; it rejected the cheaper option because the marketing copy failed validation in the deterministic layer of the pipeline. The cheaper merchant was effectively invisible.
The architecture Hotz describes as dominant in agent shopping pipelines is a three-layer stack he calls the Sandwich Architecture: a translation LLM converts a vague user request into a structured query, a deterministic validation engine applies strict business logic, and a second LLM selects from the pre-validated options. Marketing language — “best-in-class”, “premium quality”, “ideal for any occasion” — is useful at the first layer for disambiguation, but actively harmful at the second. Deterministic validation engines are not reading vibes. They are matching structured fields: compatible_with, not_suitable_for, specifications, availability. A product description that answers those fields in natural language rather than structured data fails the match, regardless of how compelling the copy is.
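To make the middle layer concrete, here is a minimal sketch of deterministic validation over structured fields. The field names (compatible_with, not_suitable_for, specifications, availability) come from the article; the pipeline shape, product records, and query format are illustrative assumptions, not Hotz's actual implementation.

```python
from dataclasses import dataclass, field

@dataclass
class Product:
    """Illustrative product record; field names follow the article's examples."""
    name: str
    price: float
    compatible_with: set = field(default_factory=set)
    not_suitable_for: set = field(default_factory=set)
    specifications: dict = field(default_factory=dict)
    in_stock: bool = True

def validate(product: Product, query: dict) -> bool:
    """Deterministic middle layer: strict field matching, no text interpretation."""
    if not product.in_stock:
        return False
    if query["use_case"] in product.not_suitable_for:
        return False
    if query["use_case"] not in product.compatible_with:
        # A use case asserted only in marketing copy is invisible here.
        return False
    for key, required in query.get("specs", {}).items():
        if product.specifications.get(key) != required:
            return False
    return True

# Two hypothetical merchants: one with structured data, one whose
# claims live only in prose.
structured = Product("Case A", 39.99, compatible_with={"iphone-15"},
                     specifications={"material": "silicone"})
marketing = Product("Case B", 24.99)  # "fits all popular phones!" -- prose only

query = {"use_case": "iphone-15", "specs": {"material": "silicone"}}
candidates = [p for p in (structured, marketing) if validate(p, query)]
# Only "Case A" survives; the cheaper product never reaches the selection
# LLM at all, regardless of how compelling its copy is.
```

The point of the sketch is the failure mode: the cheaper product is not out-scored, it is filtered before scoring, which is why the copy's quality is irrelevant.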
The implication for merchants is that a decade of SEO and conversion rate optimisation is largely irrelevant to agentic buyers. What replaces it is a different kind of infrastructure investment: exposing raw product data from PIM and ERP systems in machine-readable formats, maintaining complete and accurate structured specifications, and practising what Hotz calls “negative optimisation” — explicitly flagging what a product is not for. That last point is counterintuitive. Merchants have historically avoided negative framing because it reduces conversion with human buyers. With agent buyers, failing to specify exclusions damages your trust score when an agent recommends your product for an incompatible use case and the purchase is returned.
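Negative optimisation can be sketched as a machine-readable product record that states exclusions explicitly. The record below is a hypothetical export from a PIM system; the SKU, values, and exact schema are invented for illustration, with field names taken from the article.

```python
import json

# Hypothetical PIM export. "not_suitable_for" is the negative-optimisation
# field: exclusions a merchant would never put in human-facing copy, but
# which keep an agent from recommending the product for a use case that
# would end in a return.
record = {
    "sku": "TENT-2P-UL",
    "name": "Ultralight 2-Person Tent",
    "specifications": {"weight_g": 950, "capacity": 2, "season_rating": 3},
    "compatible_with": ["backpacking", "bikepacking"],
    "not_suitable_for": ["winter-expedition", "family-camping"],
    "availability": {"in_stock": True, "lead_time_days": 2},
}

feed = json.dumps(record, indent=2)  # what an agent would actually consume
```

With human buyers, listing "not suitable for winter expeditions" costs conversions; with agent buyers, omitting it costs trust after the mismatched purchase comes back.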
The Universal Commerce Protocol is emerging as a standard for agent-to-merchant discovery and query handling. Think of it as robots.txt plus a schema for product capability, extended to cover conditional pricing, inventory signals, and compatibility constraints. Whether UCP specifically becomes the dominant standard is unclear, but the underlying pattern is not going away: agents need machine-readable interfaces to merchant data, and merchants who invest in that infrastructure first gain a structural advantage that is difficult to reverse-engineer from marketing alone.
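The article does not specify UCP's wire format, so the following is purely illustrative of the robots.txt-plus-schema pattern it describes: a discovery document that tells agents where the machine-readable data lives and what the merchant's interface can answer. Every path, version string, and capability name here is a hypothetical placeholder.

```python
# Hypothetical discovery document a merchant might publish at a well-known
# path, by analogy with robots.txt. Not the actual UCP format.
discovery = {
    "version": "0.1",                        # placeholder
    "product_feed": "/agent/products.json",  # bulk structured specifications
    "query_endpoint": "/agent/query",        # conditional pricing, inventory
    "capabilities": [
        "compatibility-constraints",
        "conditional-pricing",
        "inventory-signals",
    ],
}

def agent_can_query(doc: dict, needed: list) -> bool:
    """An agent checks declared capabilities before issuing queries,
    the way a crawler checks robots.txt before fetching."""
    return set(needed) <= set(doc["capabilities"])
```

Whatever standard wins, the pattern is the same: a merchant who publishes this kind of interface is queryable, and one who does not is, to an agent, simply absent.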
The limitation here is that the agentic shopping stack is still early. Most consumer purchases still go through human-initiated browser sessions, and the data from agent transactions is thin. But the direction is clear enough that waiting for market proof before investing in structured data infrastructure is a losing bet. The merchants who will be visible to agent buyers in three years are the ones building clean data pipelines now, not the ones with the best copy.