The future of Shopify SEO/GEO: Our strategy for AI search
It might not look like it, but AI discovery is still in an experimental phase. In the tiny bubble of technical marketing we live in, it seems like we’ve been thinking and talking about this forever. With all of this going on, it’s hard not to let the mountain of "thought leadership" distract from the fact that things simply aren’t as developed as they seem.
That’s not to say we’re not excited about the future; we genuinely believe this has the potential to be the biggest change in the world of product discovery since the arrival of Facebook ads. But whenever a shift of this magnitude happens, it brings out all the usual "hustle culture" nonsense. It’s all pretty jarring, and in moments like this, we find it best to keep a specific statement from one of our brilliant partners, Timo Dechau, front of mind:
"Nothing is as simple and powerful as it gets described by vendors, agencies and consultants."
The reality is that we are all still in uncharted waters. Because we are dealing with systems at Shopify and within LLMs that we can't fully see yet, our approach has to be adaptable and flexible. But while the "how" of discovery is changing, the core objective remains the same: deeply understanding the user. As AI-driven personalisation kicks in, the brands that will win will be those that understand their customers best of all.
Now that we’ve got all of that out of the way, the purpose of this article is to explain how we approach AI discovery. There is plenty of big-picture thinking around AI that is interesting and relevant, but we need to stay focused on our specific thing: large catalogue Shopify brands. To us, anything else is just a distraction.
The current state of play
Things are moving fast, but we are starting to see the shape of things to come in a way that is actually executable. For the past few years, Shopify has been laying the plumbing for this new era of commerce.
This infrastructure is designed to move your store from a destination for human browsers to a source of truth for AI agents.
- Universal Commerce Protocol (UCP). This is an open-standard data layer co-developed with Google that translates your entire catalogue into a machine-readable format.
- Shopify's agentic storefront. This represents the shift toward a site that can interact natively with AI shopping agents, allowing them to compare products and complete checkouts. This is built on Shopify Catalog.
- Shopify Knowledge Base. This is a structured repository that holds your team's expertise - such as FAQ pairs and technical specifications - and turns it into data that AI can use to reason through customer questions.
The playbook has been written
In February 2026, Shopify released its official playbook, The GEO Playbook: How (& Why) to Optimize for AI Discovery.
As well as settling the debate (for us, at least) over the best acronym for AI discovery, it validated so much of what we have been talking about for years regarding product data and taxonomy - a subject we've written about at length here.
Is AI search/GEO different from SEO?
This brings us to the core of the current debate: is GEO actually anything different from standard SEO?
If, like me, you've been unlucky enough to spend any significant time on LinkedIn lately, you've probably seen the exhausting debate over this. The arguments usually fall into two camps: either it's exactly the same and everyone is getting excited about nothing, or it's completely different and anyone saying otherwise is an idiot.
The reality, however, is that this isn't a binary choice - which, thankfully, means there’s a space for over-excited idiots like us.
Joking aside, there is some real nuance here. Many of the core principles of search remain the same, but the way you execute - and more importantly, where you optimise - is shifting fundamentally.
Traditional SEO is often a top-down exercise: looking at pages and sticking keywords on them. However, AI discovery requires a bottom-up approach. Instead of fixing the page at the end of the line, we are optimising the product data at the source.
This is a bit of an oversimplification - organising your site better, getting more granular, creating collections that match how people search, and making sure your products are optimised has been table stakes for a long time. But to do any of that effectively, your product data needs to be in good enough shape to support it. This is what we've done for years.
Now, the difference with AI is that it needs that product data even more. It is reading it directly. It’s looking for slightly different things and it needs richer information, but this is a refinement of our existing work, not a total departure.
The era of technical merchandising
We call this approach technical merchandising. It is the vital philosophy that underpins everything we do: Shopify acts as the point of truth across all of your marketing channels.
Technical merchandising is the bridge between raw product data and how that data is surfaced to both humans and machines. It is effectively treating Shopify as a data warehouse for your products. When we optimise that warehouse, the benefits flow across every channel - whether that’s Google Ads, Meta, AI search, or on-site UX.
Within this realm, we focus on four distinct pillars:
- Product data as technical DNA
This is the engine room of your store, where we define the specific attributes that AI discovery engines use to build a case for your products. In the era of GEO, success is no longer about keywords; it is about winning the "inference handshake": how easily an AI model can reason through your data to justify recommending your brand. By moving beyond standard presets and providing objective proof points - such as specific micron counts or durability ratings - you give the machine the facts it needs to choose you over a competitor.
- Taxonomy as a store map
Your taxonomy is the hierarchical map that shows where products fit into parent and child categories. Shopify is designed to be agent-ready by default, using the Universal Commerce Protocol (UCP) to translate your store into a format that AI can discover.
A strong taxonomy ensures that both humans and AI crawlers understand the context of every item. If your structure is a black box of loose tags and uncategorised products, you are suffering from a data leak that prevents AI agents from accurately grounding your brand authority.
- Granularity for user intent
Granularity is how deep your store map goes to match specific search intents and personas. When your product data is rich, we can build a taxonomy that provides an inference advantage for highly specific prompts. Instead of just "Treadmills", we create specific shelves for "Folding treadmills for small apartments". This reduces machine uncertainty by giving the AI the exact context it needs to match a user's reasoning with your product.
- Depth as an evidence base
Depth is about providing the authoritative content AI needs to verify its recommendations. AI discovery engines do not just read one description; they scan multiple sources - including FAQ content, collection descriptions, and customer reviews - to build a consensus. By providing specific evidence across these touchpoints, you give the AI the data it needs to reason that your brand is the right recommendation. This transforms your store from a simple storefront into a structured data warehouse.
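To make the "objective proof points" idea concrete: one common way attribute data reaches machines is schema.org structured data. The sketch below (product name and attribute values are hypothetical, and this is one illustrative route, not Shopify's specific implementation) turns attribute data into a Product JSON-LD block with `additionalProperty` entries:

```python
import json

def product_jsonld(name, attributes):
    """Build a schema.org Product JSON-LD block, exposing objective
    proof points as machine-readable PropertyValue entries."""
    return {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": name,
        "additionalProperty": [
            {
                "@type": "PropertyValue",
                "name": attr_name,
                "value": attr["value"],
                "unitText": attr.get("unit", ""),
            }
            for attr_name, attr in attributes.items()
        ],
    }

# Hypothetical attribute data for a merino base layer
data = product_jsonld(
    "Merino Base Layer",
    {
        "Fibre diameter": {"value": 17.5, "unit": "micron"},
        "Fabric weight": {"value": 190, "unit": "gsm"},
    },
)
print(json.dumps(data, indent=2))
```

The point is less the format and more the habit: every attribute carries a name, a value, and a unit, so a machine never has to guess what "17.5" means.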
Putting this into practice
Shopify is betting heavily on being the data layer that fuels the entire AI commerce ecosystem. With the introduction of UCP, co-developed by Shopify and Google, the platform is effectively translating your store into a format that AI agents can use to discover and compare products, and even complete checkouts natively.
Our guiding principle is to align with Shopify’s native functionality and structure wherever possible.
We have learned that the more you bend or break this native architecture with custom hacks, the more fragile your data becomes. Shopify is increasingly designing its metaobjects, attribute handling, and Knowledge Base to push data directly into this AI layer by default.
While we don't have a perfect window into every "black box" of the platform yet, we believe that staying close to the native "point of truth" is the best way to ensure your store remains machine-readable as the technology evolves.
Our role is to work within this native framework to improve your inference - making it as easy as possible for an AI model to "reason" its way to your products.
Theory into execution
This is where the theory begins to meet the practicalities of the platform. Because the landscape is shifting, we don't treat this as a "one and done" setup, but as an ongoing process of data enrichment and infrastructure refinement. To be successful in the era of AI-commerce, we focus on a few key areas:
- Building the technical DNA
Our goal is to make your product data as comprehensive as possible. We audit your existing attributes and look for opportunities to go deeper, adding technical metadata points where they add genuine value. This isn't just filling in boxes for the sake of it; it’s about building a rich taxonomy that helps AI, organic search, and Google Ads understand exactly what you sell. By aligning with Shopify's native standards, we make sure your data is in the best position to be interpreted by whatever new discovery tools emerge.
- Engineering the discoverable storefront
This often involves making that data more accessible to both users and machines. We work on features that allow your taxonomy to surface naturally: secondary navigations and breadcrumbs that help AI understand relationships, and modular theme components that handle complex data without cluttering the user experience. We also look at knowledge harvesting - taking the expertise from your team and structuring it as FAQ pairs within the Shopify Knowledge Base to help fuel the next generation of storefront agents.
- Scaling for the answering engine
We take that depth of information - like the FAQs - and look for ways to display it across products and collections. This allows the information to be found by traditional organic search while providing the "evidence" AI needs to reference you in its answers. Rather than manual, page-by-page updates, we focus on building systems that allow this data to flow across your catalogue as efficiently as the current Shopify infrastructure allows.
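As a rough illustration of "systems rather than page-by-page updates", the sketch below (all names hypothetical, not a Shopify API) keeps FAQ pairs in one shared library keyed by product type, so a single edit propagates to every matching product in the catalogue:

```python
# Shared FAQ pairs keyed by product type: edit a pair once and it
# flows to every matching product at render time.
FAQ_LIBRARY = {
    "treadmill": [
        ("Will this fit in a small apartment?",
         "Folding models pack down to under 30 cm deep."),
    ],
}

def faqs_for(product):
    """Merge type-level FAQ pairs with any product-specific ones."""
    shared = FAQ_LIBRARY.get(product.get("type"), [])
    return shared + product.get("faqs", [])

compact = {
    "type": "treadmill",
    "title": "FoldRun 200",  # hypothetical product
    "faqs": [("What is the max speed?", "16 km/h.")],
}
print(faqs_for(compact))
```

In practice the "library" would live in something like metaobjects rather than code, but the principle is the same: the answer is authored once and surfaced everywhere it applies.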
Off-site: the validation layer
We should probably treat this as a whole separate subject that we’ll be covering shortly, but the core principle is that AI models look off-site to verify what we say. This doesn't apply to every brand, but for those where it’s a fit, it’s about building entity and brand trust.
Because AI tools often look at authority across third-party sources to confirm a brand's claims, we focus on creating consistent "echoes" of your data across the web. This is an area where we are constantly testing, as the move toward AI-driven personalisation makes these external signals increasingly complex to measure. Our current focus includes:
- Consistency across touchpoints
If your brand DNA - from product attributes to core service descriptions - is inconsistent across the web, AI engines may struggle to match your business to its claims with high confidence.
- Authority through corroboration
LLMs often treat external mentions in credible formats as confidence indicators. We look for ways to ensure your expertise is referenced in the places where AI "learns," like industry roundups and authoritative niche publications.
- Community sentiment
Models look to platforms like Reddit to understand unfiltered human experiences. We monitor these community signals to see how they align with your on-site claims, using them as a feedback loop to refine how we present your brand to both humans and machines.
Reporting is the wild west
If you are looking for a clean report that shows exactly how many sales ChatGPT sent your way last Tuesday, I have bad news - reporting is the wild west right now.
We are currently in the "infrastructure phase" of the AI revolution. The plumbing is still being laid, and the data pipeline simply isn't there yet. It is unrealistic to expect the same level of granular, "click-by-click" tracking that we have become accustomed to over the last decade.
Traditional tracking has a significant blind spot
Traditional analytics tools like GA4 were built to track human browser sessions, not machine visitors. This creates two distinct challenges that happen at different stages of the customer journey:
- The blind spot in the "learning phase" - Before an AI can recommend you, it has to "learn" about your products by sending a crawler (like GPTBot) to scrape your site. Most AI crawlers do not execute JavaScript when they scrape. Because the GA4 code never runs, the visit is invisible. You are essentially blind to the moment the AI is "doing its research" and deciding whether you are a relevant recommendation.
- The referral mask in the "buying phase" - When a human user eventually clicks a link in an AI response, they do trigger GA4. However, because AI platforms often open links in internal "sandboxes" or strip out the source data, these visitors usually appear as "direct" traffic. You see the sale, but you lose the context of how they found you.
This gap is important because user behaviour has shifted. People are increasingly doing their comparison shopping inside the AI interface. By the time they actually click a link to your store, they have often already made their decision. They aren't "browsing" anymore; they are arriving to buy.
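The "learning phase" isn't entirely invisible, though: server access logs record crawler requests even when no JavaScript runs. A minimal sketch (the log lines are illustrative) that counts hits from known AI crawler user agents:

```python
from collections import Counter

# User-agent substrings of common AI crawlers (GPTBot and OAI-SearchBot
# are OpenAI's, ClaudeBot is Anthropic's, PerplexityBot is Perplexity's).
AI_CRAWLERS = ("GPTBot", "OAI-SearchBot", "ClaudeBot", "PerplexityBot")

def ai_crawler_hits(log_lines):
    """Count access-log lines per AI crawler by user-agent substring.
    A rough sketch: real logs deserve proper parsing and bot verification."""
    hits = Counter()
    for line in log_lines:
        for bot in AI_CRAWLERS:
            if bot in line:
                hits[bot] += 1
    return hits

sample = [
    '203.0.113.7 - - "GET /products/treadmill HTTP/1.1" 200 "Mozilla/5.0 ... GPTBot/1.0"',
    '198.51.100.2 - - "GET /collections/folding HTTP/1.1" 200 "Mozilla/5.0 ... PerplexityBot/1.0"',
]
print(ai_crawler_hits(sample))
```

It won't tell you what the model concluded, but it does tell you whether the research visit happened at all - which is more than GA4 can see.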
The "dark funnel" and the myth of 1:1 attribution
We have to consider the possibility that the attribution gap we’re seeing now isn't just a temporary technical glitch. If we look at the history of digital marketing, the trend has always been toward less transparency, not more.
Accurate 1:1 attribution has always been something of a myth, but we are moving into an era where even the "best-guess" data we used to rely on is being restricted. Google and Shopify have a strategic incentive to keep this a "black box" to steer brands toward paid AI placements. Just as the arrival of "not provided" keywords changed SEO forever in 2011, we are most likely entering a phase where we have to rely on indirect signals rather than perfect tracking.
Our "wait and see" approach is about letting the platforms finish the plumbing. Shopify and Google are already building the systems that will eventually close the loop. Until those native tools arrive, we focus on the only thing that actually lasts - data integrity.
- Visibility monitoring - We don't look for a specific "rank." Instead, we check for presence: is your brand being surfaced consistently across a variety of prompts?
- Correlative tracking - We watch for patterns where improvements in data depth (like technical specs) correlate with spikes in branded search and direct traffic.
- Point of truth - Ensuring your data in Shopify is so clean that you are ready for whatever reporting eventually becomes the standard.
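Presence checking can be as simple as recording whether the brand appears at all in a sample of AI answers. A minimal sketch, assuming you have already collected response texts for a fixed prompt set (how you collect them is out of scope here; the brand and answers are hypothetical):

```python
def presence_rate(answers, brand):
    """Fraction of collected AI answers that mention the brand at all -
    presence, not rank."""
    if not answers:
        return 0.0
    return sum(brand.lower() in a.lower() for a in answers) / len(answers)

# Hypothetical answers gathered from the same prompt set over time
answers = [
    "For small apartments, the FoldRun 200 is a solid pick.",
    "Popular options include NordicTrack and ProForm models.",
]
print(presence_rate(answers, "FoldRun"))  # → 0.5
```

Tracked over weeks against a stable prompt set, a number like this is noisy but directional - which, right now, is about as good as AI visibility measurement gets.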
Final thoughts
The world of AI search is moving incredibly fast. However, the foundational work of technical merchandising - organising your store and enriching your data - is what stands you in good stead. If your data is modelled correctly at the root, you are in a position to adapt as fast as the technology does.