If your Shopify store is getting traffic to some pages but not others, or if you have solid content but rankings are not moving, there is a good chance technical SEO is the problem. Technical issues do not always show up as dramatic drops in traffic. Sometimes they quietly limit your ceiling, page by page, until you audit the site and realize Google has not been indexing half your catalog.

This guide covers every major technical SEO issue we see on Shopify stores in 2026, and more importantly, how to fix each one. Whether you are running a jewelry brand, a fashion store, or a general ecommerce shop, these problems are remarkably consistent across the platform.

  • 41% of Shopify pages have at least one indexing issue
  • 3.2x more organic traffic after resolving crawl budget waste
  • 6 weeks average time to see results after technical fixes

Why technical SEO matters more on Shopify than you think

Shopify is a well-engineered platform and it handles a lot of the basics for you. Your pages load fast, your checkout is secure, and your URLs are reasonably clean. But the platform also comes with a set of structural decisions that were made for developers, not for search engines. Those decisions have real consequences for your organic rankings.

The most important thing to understand about Shopify and technical SEO is that many of the problems are hidden. They do not break your store. Customers can still find products and make purchases. But Google might quietly exclude a third of your pages from its index without you ever seeing an error message in your admin dashboard.

What Shopify gets right

Before getting into the problems, it is worth acknowledging where Shopify genuinely helps with SEO. The platform generates a sitemap automatically, applies canonical tags to handle its own URL duplication, serves pages over HTTPS, and includes structured data on product pages out of the box. These are meaningful baseline advantages compared to a custom-built site where you would need to configure all of this manually.

Where Shopify creates structural SEO problems

The platform’s default behavior creates several recurring issues that you will need to address manually on almost every store.

  • Duplicate product URLs from the /products/ and /collections/collection-name/products/ URL patterns cause Googlebot to encounter the same page at two different addresses, splitting link equity and creating confusion about which version to index.
  • Auto-generated collection filter URLs from faceted navigation create thousands of near-duplicate pages that consume your crawl budget without adding any ranking value.
  • Rigid robots.txt on lower Shopify plans makes it difficult to control what gets crawled without workarounds, and some apps add their own disallow rules that cause unintended damage.
  • Pagination structures that use query parameters rather than clean URLs can confuse crawlers on large collection pages and prevent proper link equity consolidation.
  • Thin content on tag and vendor pages that Shopify generates automatically can dilute your site’s overall quality signals and cause Google to treat the entire domain as less authoritative.
Pro tip: Open Google Search Console right now and click on the Pages report under Indexing. If you see a large number of pages marked as “Discovered but not indexed” or “Crawled but not indexed,” that is a direct signal your store has technical SEO issues that are actively limiting your organic reach.

Indexing problems and how to fix them

Indexing is the foundation of SEO. If Google cannot or will not index your pages, nothing else matters. Not your keywords, not your backlinks, not your product descriptions. Shopify stores have more indexing problems than most owners realize, and the causes vary quite a bit depending on the specific situation.

Pages not being indexed by Google

When a page on your Shopify store is not being indexed, it usually falls into one of a few categories. Either Google has never crawled it, Google crawled it but chose not to index it, or something is actively blocking Google from indexing it. Each situation needs a different fix.

The first thing to do is open the URL Inspection tool inside Google Search Console and enter the page URL. It will tell you whether the page has been crawled, when it was last crawled, and whether it is currently indexed. If the page is not indexed, the tool will usually give you a reason in plain language.

  • Step 1: Check for noindex tags. Open the page in your browser, right-click and view the page source, then search for “noindex.” If you find a meta robots tag with noindex, that is your problem. In Shopify, this can be accidentally enabled through theme settings, a third-party SEO app, or even a password-protection setting that was not fully removed.
  • Step 2: Inspect canonical tags. Look for a canonical tag in the page source. It should point to the page itself. If it points to a different URL, Google will index that other URL instead of the one you want. Shopify’s canonical tags sometimes point to the wrong version on product pages that exist inside multiple collections.
  • Step 3: Review robots.txt. Visit yourstore.com/robots.txt and look for any Disallow rules that could be blocking the page you are trying to index. Some Shopify apps write their own disallow rules to this file without clearly notifying the store owner.
  • Step 4: Fix the issue and request indexing. Once you have resolved the technical blocker, go back to the URL Inspection tool in Search Console and click “Request Indexing.” This prompts Google to crawl and evaluate the page sooner rather than waiting for its next regular crawl cycle.
Common mistake: Many Shopify store owners assume that if a page is live and accessible, Google will eventually index it. That is not always true. Pages with thin content, no internal links pointing to them, or low perceived quality may be crawled and then actively excluded by Google. Simply requesting indexing will not help if the underlying content quality problem is still there.
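The manual “view source and search for noindex” check in Step 1 can be scripted. This is a minimal sketch using only the Python standard library; the two HTML snippets are hypothetical examples, not real store pages.

```python
# Detect noindex directives in raw page HTML, mirroring the manual
# "view source and search for noindex" check described above.
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tags."""
    def __init__(self):
        super().__init__()
        self.robots_directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.robots_directives.append((attrs.get("content") or "").lower())

def has_noindex(html: str) -> bool:
    """True if any meta robots tag on the page contains a noindex directive."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in d for d in parser.robots_directives)

blocked = '<html><head><meta name="robots" content="noindex, nofollow"></head></html>'
ok = '<html><head><meta name="robots" content="index, follow"></head></html>'
print(has_noindex(blocked))  # True
print(has_noindex(ok))       # False
```

Run a function like this over a list of your most important URLs (fetched however you prefer) to get the same noindex report a crawler export would give you.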

The “Discovered but not indexed” problem

This is one of the most frustrating statuses you will see in Google Search Console. It means Google knows your page exists but has chosen not to prioritize crawling and indexing it yet. This status is extremely common on Shopify stores with large catalogs, especially stores that added many products quickly without building proper internal linking structures at the same time.

The most effective fix is to improve the internal linking structure of your store. Pages that have no links pointing to them from other indexed pages are treated as low priority by Google’s crawl scheduler. Make sure every product page is reachable from at least one collection page, and that your collection pages are linked from your navigation or homepage. The deeper a page sits in your architecture with no clear pathway to it, the less often Google will visit.
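The “clear pathway” idea above is just shortest-path depth over your internal-link graph. This sketch measures click depth from the homepage with a breadth-first search; the link graph is a hypothetical example standing in for a real crawl export.

```python
# Breadth-first search over an internal-link graph to measure how many
# clicks each page sits from the homepage.
from collections import deque

def click_depths(link_graph: dict, start: str) -> dict:
    """Return the minimum number of clicks from `start` to each reachable page."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in link_graph.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Hypothetical internal-link graph: page -> list of pages it links to.
graph = {
    "/": ["/collections/rings", "/collections/necklaces"],
    "/collections/rings": ["/products/gold-ring"],
    "/collections/necklaces": [],
}
depths = click_depths(graph, "/")
print(depths["/products/gold-ring"])  # 2
# Pages missing from `depths` entirely are unreachable from the homepage.
```

Pages with a large depth value, or missing from the result entirely, are the ones Google’s crawl scheduler will deprioritize.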

Accidental noindex tags applied by mistake

This happens more often than you would expect on actively managed Shopify stores. A theme update, an app installation, or a well-intentioned change in settings can accidentally apply a noindex directive to pages you need indexed. The problem is easy to miss because the pages still load normally for visitors. Everything looks fine on the surface.

The fix is to audit your meta robots tags across your most important pages. You can use a crawler like Screaming Frog to crawl your entire site and export a list of all pages with noindex tags. Review that list carefully and remove any noindex directives from pages that should appear in search results. Pay special attention to collection pages, product pages, and blog posts, which are the three page types most likely to be accidentally affected.


Crawlability issues that limit your visibility

Even when your pages are technically indexable, crawlability problems can prevent Google from ever finding them or from crawling them as often as you need. Shopify has several structural tendencies that make crawlability problems especially common on stores that have grown quickly or added many apps over time.

Crawl budget waste from faceted navigation

Faceted navigation is the filtering system that lets customers sort products by size, color, price, or any other attribute on a collection page. It is genuinely useful for shoppers. But it is a significant problem for crawl budget if it is not handled properly.

Every time a customer applies a filter combination, Shopify generates a new URL. A collection with 10 color options, 5 size options, and 3 price ranges can generate 150 single-selection filter combinations (10 × 5 × 3), and far more once filters can be combined. Google will attempt to crawl all of them, burning through your allocated crawl budget on pages that have virtually identical content and serve no standalone SEO purpose.

Pro tip: Use canonical tags to tell Google that filtered collection URLs are variants of the base collection page. The canonical tag on every filtered URL should point back to the unfiltered collection URL. This consolidates link equity, prevents crawl budget waste, and does not break the filtering functionality for actual shoppers on your site.
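The canonical target for any filtered URL is simply the same URL with the filter query string removed. A sketch of that transform, assuming filter parameters live in the query string (the `filter.v.option.color` parameter name follows Shopify’s Search & Discovery convention, but treat the exact names as an assumption for your own theme):

```python
# Derive the canonical (unfiltered) URL for a filtered collection URL
# by dropping the query string and fragment.
from urllib.parse import urlsplit, urlunsplit

def canonical_for_filtered_url(url: str) -> str:
    """Strip query and fragment so filtered variants canonicalize to the base collection."""
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))

filtered = "https://example.com/collections/rings?filter.v.option.color=gold"
print(canonical_for_filtered_url(filtered))
# https://example.com/collections/rings
```

Comparing this expected canonical against the tag actually rendered on each filtered URL tells you whether your theme is handling faceted navigation correctly.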

Orphan pages that never get crawled

An orphan page is any page on your Shopify store that has no internal links pointing to it from other pages. Google discovers pages by following links, so orphan pages either never get crawled at all or get crawled very infrequently. This is a particularly common problem in stores that have removed products from collections without redirecting or relinking the standalone product pages.

To find orphan pages, crawl your site with Screaming Frog and then compare the list of URLs it discovers through crawling against the list of URLs in your sitemap. Pages that appear in the sitemap but were never found during the crawl are your orphans. Once you identify them, either add internal links pointing to those pages from relevant collection or blog pages, or set up redirects if the content is no longer active or relevant.
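The sitemap-versus-crawl comparison is a set difference. A minimal sketch, where both sets are hypothetical stand-ins for your sitemap URL list and a Screaming Frog crawl export:

```python
# URLs present in the sitemap but never reached during a link-following
# crawl are orphan candidates.
def find_orphans(sitemap_urls: set, crawled_urls: set) -> set:
    """Pages listed in the sitemap that the crawler never discovered via links."""
    return sitemap_urls - crawled_urls

sitemap = {"/products/a", "/products/b", "/products/c"}
crawled = {"/", "/products/a", "/products/b"}
print(sorted(find_orphans(sitemap, crawled)))  # ['/products/c']
```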

Pagination errors on large collection pages

Large collections on Shopify are often split across multiple pages. The way Shopify handles this pagination uses query parameters like ?page=2, which can cause problems. Google sometimes treats paginated pages as lower quality content or fails to properly associate them with the primary collection page, which means the full collection does not consolidate its ranking signals the way it should.

Make sure your paginated collection pages include a canonical tag pointing back to the first page of the collection. Also confirm that your internal navigation clearly links between all pagination pages so Google can follow the chain from beginning to end without hitting dead ends or gaps in the sequence.
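Both checks in that paragraph can be automated: that each paginated URL carries a canonical pointing to the first page, and that the page numbers form an unbroken sequence. This sketch assumes a simple list of (url, canonical) pairs exported from a crawl; the data shape is an assumption, not a Shopify API.

```python
# Validate a collection's pagination: canonicals point to page one, and
# the ?page=N sequence has no gaps.
from urllib.parse import urlsplit, parse_qs

def pagination_issues(paginated):
    """`paginated` is a list of (url, canonical) pairs for one collection."""
    issues = []
    pages = []
    for url, canonical in paginated:
        parts = urlsplit(url)
        base = f"{parts.scheme}://{parts.netloc}{parts.path}"
        page = int(parse_qs(parts.query).get("page", ["1"])[0])
        pages.append(page)
        if canonical != base:
            issues.append(f"page {page} canonical does not point to {base}")
    if pages:
        missing = set(range(1, max(pages) + 1)) - set(pages)
        if missing:
            issues.append(f"gap in pagination sequence: missing pages {sorted(missing)}")
    return issues

urls = [
    ("https://example.com/collections/rings", "https://example.com/collections/rings"),
    ("https://example.com/collections/rings?page=2", "https://example.com/collections/rings"),
    ("https://example.com/collections/rings?page=4", "https://example.com/collections/rings"),
]
print(pagination_issues(urls))  # reports the missing page 3
```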


Sitemap and robots.txt problems

Your sitemap and robots.txt file are the two documents that most directly shape how Google crawls your store. Getting them right is not complicated, but the default Shopify setup leaves room for problems that are worth understanding and actively managing.

Shopify sitemap.xml issues and fixes

Shopify automatically generates a sitemap at yourstore.com/sitemap.xml. The good news is that this sitemap exists and updates automatically as you add and remove products. The bad news is that you have limited control over what goes into it, and several common issues can make it less useful than it should be as a crawl guide for Google.

| Sitemap Issue | SEO Impact | Difficulty |
| --- | --- | --- |
| Sitemap not submitted to Search Console | Google crawls less frequently and misses new pages | Easy |
| Noindexed pages included in sitemap | Wastes crawl budget and sends conflicting signals | Medium |
| Broken sitemap after domain migration | Sitemap becomes unreachable by Google | Easy |
| Missing blog post URLs | Blog content is crawled infrequently and indexed slowly | Medium |
| Sitemap not updating after product changes | Deleted pages remain in sitemap, causing soft 404 signals | Easy |

The single most impactful thing you can do with your sitemap is submit it to Google Search Console if you have not already. Go to your Search Console property, click Sitemaps in the left sidebar, and enter sitemap.xml in the URL field. Google will begin using it as a direct guide for crawling your store. If you have never done this before, stop what you are doing and do it right now before reading any further.
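You can also audit the sitemap yourself. This sketch parses a sitemap.xml document with the standard library and flags URLs that are also in your noindex list, which is the conflicting-signals issue from the table above. The XML string and noindex set are hypothetical examples.

```python
# Parse a sitemap.xml document and flag URLs that conflict with a
# noindex list.
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text: str) -> list:
    """Extract every <loc> URL from a sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall(".//sm:loc", NS)]

sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/products/gold-ring</loc></url>
  <url><loc>https://example.com/pages/internal-only</loc></url>
</urlset>"""

noindexed = {"https://example.com/pages/internal-only"}
conflicts = [u for u in sitemap_urls(sitemap_xml) if u in noindexed]
print(conflicts)  # ['https://example.com/pages/internal-only']
```

Any URL in `conflicts` is being advertised to Google in the sitemap while simultaneously telling Google not to index it.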

Robots.txt blocking important pages

Your robots.txt file tells search engines which parts of your site they should and should not crawl. Shopify’s default robots.txt blocks several sections of the store that make sense to keep private, such as the checkout flow, account pages, and admin areas. The problem arises when rules in this file accidentally block pages you actually need crawled and indexed.

“Some Shopify apps add their own Disallow rules to robots.txt without clearly communicating this to store owners, and those rules can block pages that should be driving organic traffic.” (LeanScaleMedia Shopify SEO Audit Team)

Visit yourstore.com/robots.txt directly in your browser and read every Disallow rule carefully. Then check those paths against your most important pages to confirm there is no overlap. If you need to change the rules, Shopify lets you customize robots.txt by adding a robots.txt.liquid template in the theme code editor, which gives you precise control over crawl permissions. For app-generated rules, identify the app responsible and remove or reconfigure it so the unintended crawl blocks go away.
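The overlap check can be automated with the standard library’s robots.txt parser. The robots.txt content below is a hypothetical example, not Shopify’s actual default file; note that this parser matches path prefixes and does not support the `*` wildcards Google understands, so keep test rules simple.

```python
# Test whether specific URLs are crawlable under a given robots.txt,
# using the standard library's parser (prefix matching only).
import urllib.robotparser

robots_txt = """User-agent: *
Disallow: /checkout
Disallow: /cart
Disallow: /account
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(robots_txt.splitlines())

print(parser.can_fetch("Googlebot", "https://example.com/products/gold-ring"))  # True
print(parser.can_fetch("Googlebot", "https://example.com/checkout"))            # False
```

Feed it your live robots.txt and a list of your money pages; any page that comes back False is being blocked from crawling.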


Content quality and structural SEO errors

Technical SEO is not only about crawlability and indexing signals. Google also evaluates the quality and structure of your content as part of its decision about which pages to rank and how prominently to feature them in search results. Several patterns that are common on Shopify stores create structural content problems that hurt rankings even on pages that are fully indexed and regularly crawled.

Soft 404 errors hurting your ranking signals

A soft 404 is a page that returns a 200 OK status code, meaning it technically loads without error, but the content is so thin or unhelpful that Google treats it essentially the same as a page that does not exist. This is a very common problem on Shopify stores that have been running for any length of time.

The most frequent causes of soft 404 errors on Shopify are sold-out product pages that display nothing but an out-of-stock notice with no other content, filtered collection pages that return zero matching products, and auto-generated tag pages that Shopify creates for product tags that only have one or two associated items.

  • Add meaningful content to sold-out product pages rather than showing a bare out-of-stock message. Describe the product, link to similar alternatives, or capture email addresses for restock notifications. The page should still be worth visiting.
  • Apply canonical tags on filtered collection URLs that return zero results, pointing them back to the parent collection. This consolidates the authority from any links those filtered URLs may have accumulated.
  • Evaluate your auto-generated tag and vendor pages on a case by case basis. Add a noindex tag to thin tag pages that cannot be meaningfully improved with additional content.
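A soft 404 is fundamentally a 200-status page with near-empty content, so a crawl export can be screened with a simple heuristic. The thresholds and the page-record shape below are assumptions to tune against your own data, not a standard.

```python
# Heuristic screen for soft-404 candidates: 200-status pages whose body
# text or product count is near zero.
def soft_404_candidates(pages, min_words=50):
    """Flag 200-status pages that look empty enough to read as soft 404s."""
    flagged = []
    for page in pages:
        if page["status"] != 200:
            continue
        if page.get("product_count") == 0 or page.get("word_count", 0) < min_words:
            flagged.append(page["url"])
    return flagged

# Hypothetical crawl export rows.
crawl = [
    {"url": "/collections/rings", "status": 200, "product_count": 24, "word_count": 180},
    {"url": "/collections/summer?filter=xl", "status": 200, "product_count": 0, "word_count": 12},
    {"url": "/products/retired-item", "status": 404, "word_count": 0},
]
print(soft_404_candidates(crawl))  # ['/collections/summer?filter=xl']
```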

Canonical tag errors on product pages

Shopify’s canonical tag implementation is generally reliable, but it is not perfect in every scenario. The platform is designed to designate the /products/product-handle URL as the canonical version of each product, which is the correct approach. However, when products live inside multiple collections, the /collections/collection-name/products/product-handle URL is also accessible, and if something interferes with the canonical tag logic, Google may index the wrong version of the page or split authority between both.

You can check your canonical tags by viewing the source of your product pages and searching for the rel="canonical" link element in the head section. It should always point to yourstore.com/products/product-handle and never to the collection-path version of the same product. If you find any mismatches, this is a priority fix.

Thin collection pages excluded from the index

Collection pages with very few products or no unique written content are often treated as thin pages by Google and may be excluded from the index or ranked very poorly. If you have collections with just one or two products and no introductory paragraph or descriptive text, these pages have very little reason to rank for anything, and Google may choose to ignore them entirely.

The fix depends on your overall catalog strategy. You can consolidate small collections into larger, more thematically robust ones. You can add meaningful written content to small collection pages to give them more substance and relevance signals. Or you can intentionally add a noindex tag to the thin collections, preventing them from diluting the quality signals of the rest of your site while you work on building them out properly.

Bottom line: A Shopify store that fixes its indexing blockers, crawlability problems, and content quality issues in a systematic order will see measurable ranking improvements within six to twelve weeks of completing the work.

A complete Shopify technical SEO audit checklist

Use this checklist to work through every area of technical SEO on your Shopify store methodically. Start at the top and work your way down, because indexing blockers are more urgent than content quality improvements and need to be resolved first before anything else can improve.

Indexing and visibility

  • Check Google Search Console’s Pages report for any excluded or unindexed pages and document every status reason you find.
  • Review all “Discovered but not indexed” and “Crawled but not indexed” URLs individually and investigate the specific cause for each one.
  • Audit meta robots tags across your most important product pages, collection pages, and blog posts.
  • Confirm canonical tags on product pages always point to the /products/ URL rather than any collection path version.
  • Request indexing for your highest-priority pages after resolving their specific issues.

Crawlability and internal linking

  • Identify orphan pages that have no internal links and either add links pointing to them from relevant pages or set up redirects if the pages are no longer active.
  • Audit faceted navigation URLs and apply canonical tags or robots.txt disallow rules to consolidate crawl budget on the pages that actually matter.
  • Verify that paginated collection pages are linked properly in sequence and include correct canonical tags pointing to the first page of the collection.
  • Confirm that your most important pages are reachable within three clicks from the homepage through natural navigation paths.

Sitemap and robots.txt

  • Confirm your sitemap is submitted in Google Search Console and showing no errors or warnings in the Sitemaps report.
  • Review your sitemap for any noindexed pages that should be removed to prevent sending conflicting signals to Google.
  • Read your entire robots.txt file and confirm that no important pages are being blocked from crawling.
  • Check for any app-generated disallow rules in robots.txt that may be causing unintended crawl blocks on pages you need indexed.

Content quality and structural issues

  • Identify soft 404 errors in Search Console and add content to thin pages, redirect empty filtered URLs, or set them to noindex where appropriate.
  • Audit collection pages with fewer than five products and decide whether to consolidate, improve, or noindex them based on your catalog strategy.
  • Confirm that blog posts are included in your sitemap and are properly indexed in Search Console.
  • Review auto-generated tag and vendor pages and apply noindex to any that do not have enough content to provide genuine value to searchers.
Pro tip: Run this audit quarterly rather than treating it as a one-time project. Shopify stores evolve constantly with new products, new apps, and new collections, and new technical SEO problems appear regularly as that growth happens. A quarterly audit habit is far more effective than a single deep fix with no follow-through.

Free Consultation

Want us to audit your Shopify store’s technical SEO?

We help Shopify brands find and fix the technical issues holding back their organic traffic, and build a clear roadmap to ranking in Google and AI search in 2026.

Book a free strategy call →

Frequently asked questions

Why are my Shopify pages not appearing in Google?

Your Shopify pages may not appear in Google because of indexing blocks such as noindex tags, robots.txt restrictions, or canonical tag errors pointing to the wrong URL. Other common causes include orphan pages with no internal links, crawl budget being consumed by faceted navigation filter URLs, and pages stuck in the “Discovered but not indexed” state. Use the URL Inspection tool in Google Search Console to check any specific page and identify the exact reason it is not appearing in search results.

Does Shopify have built-in technical SEO problems?

Yes, Shopify has several built-in technical SEO issues that affect most stores by default. These include duplicate content from the two different product URL structures that exist on every Shopify store, auto-generated canonical tags that can point to the wrong page in some scenarios, a robots.txt file that is difficult to customize on standard plans, and pagination structures that can create crawlability problems on large collections. The good news is that most of these can be fixed or worked around with the right approach, and Shopify Plus gives you more direct control than lower-tier plans.

How do I submit my Shopify sitemap to Google?

To submit your Shopify sitemap, go to Google Search Console, select your property, and click Sitemaps in the left menu. Type sitemap.xml in the sitemap URL field and click Submit. Your Shopify sitemap is always located at yourstore.com/sitemap.xml and you do not need to create it manually. If Search Console shows an error after submission, confirm that your domain is fully verified in the property settings and that the sitemap URL loads correctly when you open it directly in a browser tab.

What is a soft 404 on Shopify and how do I fix it?

A soft 404 in Shopify happens when a page returns a 200 OK status to Google but the content on that page is essentially empty or unhelpful. Common examples include sold-out product pages that show nothing but an out-of-stock message, collection filter pages that return zero matching products, and auto-generated tag pages with only one or two items. To fix them, add real content to thin product pages, apply canonical tags on empty filtered collection URLs pointing to the parent collection, and noindex tag pages that cannot be meaningfully improved.

How long do technical SEO fixes take to show results?

Most technical SEO fixes on Shopify start producing visible results within four to twelve weeks, depending on how frequently Google crawls your store. Pages that were previously blocked or excluded from the index can reappear in search results within days of the fix if you use the Request Indexing feature in Google Search Console after resolving the issue. Larger improvements, such as fixing crawl budget waste from faceted navigation, may take one to three months to fully reflect in your rankings and organic traffic data.