How to Add or Edit robots.txt in Shopify: A Comprehensive Guide

What is robots.txt?

The robots.txt file tells web crawlers (like Googlebot) which pages or directories they may or may not fetch on your site. Configuring it properly helps search engines crawl your store efficiently while keeping them out of duplicate or low-value pages such as cart and checkout.

How to Edit robots.txt in Shopify

Shopify generates the robots.txt file automatically and doesn't expose it for direct editing; instead, you customize it by adding a robots.txt.liquid template to your published theme. Here's how to do it:

  1. Log in to Your Shopify Admin: Navigate to your Shopify dashboard.
  2. Go to Online Store > Themes: Locate your current (published) theme.
  3. Open the Code Editor: Click ⋯ (Actions) > Edit code on that theme.
  4. Add the Template: Under Templates, click Add a new template, choose robots.txt, and create it. Shopify generates robots.txt.liquid pre-filled with the default rules (see the sketch after these steps).
  5. Edit and Save: Add your custom Liquid directives, then click Save to apply your edits.
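
Below is a minimal robots.txt.liquid sketch based on Shopify's documented Liquid robots object. It simply re-renders the default rules unchanged, so it's a safe starting point: the output is identical to the auto-generated file until you add your own directives.

{% comment %} robots.txt.liquid: re-render Shopify's default rules as-is {% endcomment %}
{% for group in robots.default.groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules %}
    {{ rule }}
  {%- endfor %}

  {%- if group.sitemap != blank %}
    {{ group.sitemap }}
  {%- endif %}
{% endfor %}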

Default Shopify robots.txt Content

The generated file includes rules along these lines (an abbreviated excerpt; view the complete live file at your-store-url.com/robots.txt):

User-agent: *
Disallow: /admin
Disallow: /cart
Disallow: /orders
Disallow: /checkout
Disallow: /account
Disallow: /collections/*+*
Disallow: /collections/*%2B*
Disallow: /collections/*%2b*
Disallow: /blogs/*+*
Disallow: /blogs/*%2B*
Disallow: /blogs/*%2b*
Allow: /collections/*.json
Allow: /blogs/*.json
Sitemap: https://your-store-url.com/sitemap.xml

Example: Customizing robots.txt

To block crawling of a test collection at /collections/test-products, append a Disallow rule to the User-agent: * group in robots.txt.liquid.
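
The sketch below follows the same pattern as the baseline template shown earlier: it re-renders each default group and appends one extra rule to the catch-all (*) group. The /collections/test-products path is just this article's example; substitute your own.

{% for group in robots.default.groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules %}
    {{ rule }}
  {%- endfor %}

  {% comment %} Append the custom rule to the catch-all group only {% endcomment %}
  {%- if group.user_agent.value == '*' %}
    {{ 'Disallow: /collections/test-products' }}
  {%- endif %}

  {%- if group.sitemap != blank %}
    {{ group.sitemap }}
  {%- endif %}
{% endfor %}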

The published file at your-store-url.com/robots.txt then includes:

User-agent: *
Disallow: /collections/test-products

FAQs About robots.txt in Shopify

1. Can I fully edit the robots.txt file in Shopify?

Yes, with caveats. Through robots.txt.liquid you can add, remove, or replace rules, but Shopify strongly recommends keeping the default rules and only appending to them, since a mistake here can hide your entire store from search engines. A sketch of removing a single default rule follows.
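
As a sketch (not a recommendation), here is how one default rule could be skipped while everything else renders normally. It assumes the default file contains a Disallow: /policies/ rule; adjust the directive and value to whichever rule you actually want to drop.

{% for group in robots.default.groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules %}
    {% comment %} Skip one specific default rule; render the rest {% endcomment %}
    {%- unless rule.directive == 'Disallow' and rule.value == '/policies/' %}
      {{ rule }}
    {%- endunless %}
  {%- endfor %}

  {%- if group.sitemap != blank %}
    {{ group.sitemap }}
  {%- endif %}
{% endfor %}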

2. Where is the robots.txt file located?

Access it at your-store-url.com/robots.txt. To edit it, open (or create) the robots.txt.liquid template under Online Store > Themes > Edit code in your Shopify admin.

3. How do I test my robots.txt changes?

Use the robots.txt report in Google Search Console (Settings > robots.txt), which replaced the retired robots.txt Tester. Note that Google typically re-fetches the file within about 24 hours, so changes may take a while to register.

4. Why aren’t my updates showing immediately?

Shopify serves your store through a CDN, so changes may take 5–10 minutes to appear. Clear your browser cache or wait a little longer if the old version persists.

5. How do I block specific pages from indexing?

Add a Disallow: /page-url rule in robots.txt.liquid. Keep in mind that Disallow prevents crawling, not indexing; for stricter control, output <meta name="robots" content="noindex"> in the page's HTML, as in the sketch below.
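
A minimal sketch for the noindex approach, placed inside the <head> of theme.liquid; /pages/hidden-page is a hypothetical URL, so replace it with the page you want excluded.

{% comment %} /pages/hidden-page is a placeholder; match the page you want de-indexed {% endcomment %}
{% if request.path == '/pages/hidden-page' %}
  <meta name="robots" content="noindex">
{% endif %}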

6. Can I add a sitemap to robots.txt?

Yes, with Sitemap: https://your-store-url.com/sitemap.xml on its own line, but you rarely need to: Shopify's default rules already end with the sitemap URL, and the {{ group.sitemap }} line in the default template renders it automatically.

7. What if I make a mistake?

There's a built-in way back: delete the robots.txt.liquid template (or just your custom rules) and Shopify reverts to the auto-generated default. Even so, keep a backup of your customizations before experimenting.

8. Are there editing restrictions?

The template must be valid Liquid, and crawlers impose their own limits (Google, for example, reads only the first 500 KiB of a robots.txt file). Avoid removing default rules unless you have a specific reason, and double-check directive syntax (e.g., Disallow: /path).

9. How does Shopify’s default robots.txt affect SEO?

It keeps crawlers out of cart, checkout, and account pages, which cuts crawl waste and duplicate content, while leaving products, collections, and blogs crawlable. Note that robots.txt is a request to well-behaved bots, not a security control; it doesn't hide sensitive pages from determined visitors. Customize it further to refine crawl behavior.

Best Practices

  • Test Thoroughly: Check the robots.txt report in Google Search Console to confirm you haven't blocked critical pages.
  • Combine with Meta Tags: Use noindex for pages that must stay out of search results; robots.txt alone controls crawling, not indexing.
  • Avoid Wildcard Overuse: Disallow: /collections/* blocks every collection, while Disallow: /collections/test-products blocks just one; be as specific as possible.

Conclusion

Editing robots.txt in Shopify is straightforward once you've added a robots.txt.liquid template to your theme. By understanding the default rules and layering custom directives on top of them carefully, you can fine-tune crawler access and strengthen your store's SEO.