Robots.txt File for Shopify
What Is a Robots.txt File?
A robots.txt file tells search engines which parts of your WordPress, WooCommerce, Shopify, Wix, or Blogger website they can crawl and index. The file must sit in your site's root directory.
It lets you control which pages and folders search engine bots can crawl and index. For example, you can block sensitive directories such as wp-admin or wp-includes, as shown in the sketch below.
This keeps backend pages out of search results and prevents duplicate content from appearing there. A well-structured robots.txt also improves crawl efficiency.
If search engines spend their crawl budget on unnecessary files, they may miss your important pages, so make sure you don't accidentally block content you want to rank.
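To make this concrete, here is a minimal robots.txt sketch for a WordPress site. The sitemap URL is a placeholder, and the admin-ajax.php exception is a common convention because many plugins call that file from the front end:

# Block the WordPress admin area from crawlers
User-agent: *
Disallow: /wp-admin/
# Keep admin-ajax.php reachable, since front-end features often depend on it
Allow: /wp-admin/admin-ajax.php

# Tell crawlers where the sitemap lives (replace with your own URL)
Sitemap: https://yourwebsite.com/sitemap.xml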
Shopify automatically generates a robots.txt file for your store, but you can customize it using the robots.txt.liquid template. Here is an example of a typical robots.txt file for Shopify:
User-agent: *
Disallow: /admin/
Disallow: /cart
Disallow: /checkout
Disallow: /orders
Disallow: /account
Disallow: /collections/*sort_by*
Disallow: /collections/*filter*
Allow: /sitemap.xml
Sitemap: https://yourshopifystore.com/sitemap.xml
Remember to replace the sitemap URL with your own store's URL.
(Note that Shopify does not let you upload a robots.txt file directly; apply any custom rules through the robots.txt.liquid template in your theme's code editor.)
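If you want to keep Shopify's defaults and only append your own rules, a minimal robots.txt.liquid sketch could look like the following. It is based on the group/rule structure Shopify exposes to that template, and the extra Disallow: /search line is purely an illustration:

{% comment %} Loop over Shopify's auto-generated rule groups {% endcomment %}
{% for group in robots.default_groups %}
  {{- group.user_agent }}

  {%- comment %} Output the default rules for this user agent {%- endcomment %}
  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}

  {%- comment %} Append a custom rule to the catch-all group (illustrative only) {%- endcomment %}
  {%- if group.user_agent.value == '*' -%}
    {{ 'Disallow: /search' }}
  {%- endif -%}

  {%- comment %} Keep the default sitemap line if one exists {%- endcomment %}
  {%- if group.sitemap != blank -%}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}

Because the loop reproduces the defaults first, you only add to Shopify's generated rules instead of replacing them.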
It’s a good practice to review your robots.txt regularly. You can easily do that by visiting https://yourwebsite.com/robots.txt