In June 2021 Shopify announced a new feature allowing Shopify store owners to edit and customise their own robots.txt file.
This was a significant change for Shopify websites, as the inability to edit this file was previously a well-known drawback of the eCommerce platform among the SEO community.
What is Robots.txt?
A robots.txt is a file containing a set of rules, specific to each website, used to tell search engine robots what they can and cannot crawl. This is done using ‘allow’ and ‘disallow’ directives, which can be created for particular search engine crawler user agents such as Googlebot, Bingbot or Yandexbot.
Although these rules act as instructions for search engine crawlers, crawlers won’t necessarily follow or adhere to them completely, which is why it can sometimes be beneficial to carry out log file analysis to gain a better insight into a search engine’s actual crawling habits.
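As a simple illustration, a robots.txt file might look like the sketch below. The paths and domain are hypothetical examples, not rules you should copy as-is:

```
# Rules for Google's crawler
User-agent: Googlebot
Disallow: /checkout

# Rules for all other crawlers
User-agent: *
Disallow: /search

Sitemap: https://www.example.com/sitemap.xml
```

Each `User-agent` line starts a group of rules, and the `Disallow` directives beneath it tell that crawler which paths to avoid.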
It’s super easy to find your website’s robots.txt file: just add ‘/robots.txt’ to your website’s root domain. For example, https://www.keiradavidson.com/robots.txt.
To learn more about robots.txt SEO best practice, read our comprehensive guide.
How to Edit Your Robots.txt File on Shopify
If you want to create or edit the robots.txt file for your Shopify store, follow these steps to access the default robots.txt file and then customise it:
- Open your Shopify Dashboard
- Go to Online Store > Themes
- In the Live theme section, click Actions > Edit code
- Under the templates section, click ‘Add a new template’
- Change ‘Create a new template for’ to Robots.txt
- Click ‘Create template’
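Once created, the template (robots.txt.liquid) renders Shopify’s default rules via Liquid, and customisations are made by editing that template rather than a static file. As a hedged sketch based on Shopify’s documented `robots.default_groups` Liquid object, the example below outputs the default rules and appends one extra disallow rule for the catch-all user agent; the `/*?sort_by=*` path is a hypothetical example, not a recommended rule:

```liquid
{% for group in robots.default_groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}

  {%- comment -%} Example custom rule for the catch-all user agent {%- endcomment -%}
  {%- if group.user_agent.value == '*' -%}
    {{ 'Disallow: /*?sort_by=*' }}
  {%- endif -%}

  {%- if group.sitemap != blank -%}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}
```

Keeping the default loop intact and only appending rules means your file still inherits Shopify’s default protections alongside your custom directives.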
It’s worth noting that, should you wish to, you can revert any robots.txt customisation: go through the steps above to create a fresh default robots.txt template and remove the previous bespoke version.
Use Cases For Customising Your Robots.txt
For eCommerce websites, there are a handful of go-to disallow directives you’d typically want to implement to prevent search engines from crawling certain URLs. These usually cover:
- Checkout pages
- Login/my account URLs
- Internal site search URLs
- URLs generated from faceted navigation
- URLs created from sorting navigation
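Expressed as robots.txt directives, those use cases might look like the sketch below. The exact paths and parameters are illustrative assumptions; the right patterns depend on how your store’s URLs are actually structured:

```
User-agent: *
# Checkout and account pages
Disallow: /checkout
Disallow: /account
# Internal site search results
Disallow: /search
# Faceted navigation and sorting parameters
Disallow: /*?*filter=
Disallow: /*?*sort_by=
```

Always verify patterns like these against real URLs from your store before deploying them.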
We normally find that the default robots.txt file on Shopify online stores contains disallow rules for each of the use cases listed above.
However, it’s always worth double-checking this for your own website to ensure the rules in your robots.txt file capture all the URLs you don’t want search engine bots crawling.
When it comes to robots.txt files you need to be extremely careful, as it’s possible to accidentally disallow your whole website, which would significantly impact the revenue generated through your eCommerce site. You’d be surprised how often this occurs.
With this in mind, if you want to customise your Shopify store’s robots.txt file, I’d recommend working with an experienced SEO or an SEO-conscious developer to avoid any costly mistakes.
Finally, if you have any questions on this or need any help, feel free to get in touch with us.