Customize robots.txt

The robots.txt file tells search engines which pages can, or can't, be crawled on a site. It consists of groups of rules, and each group has three main components:

  • The user agent, which identifies the crawler that the group of rules applies to. For example, adsbot-google.
  • The rules themselves, which list the specific URLs that crawlers can, or can't, access.
  • An optional sitemap URL.
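
Put together, a single group in the rendered robots.txt file might look like the following sketch. The user agent, paths, and sitemap URL here are illustrative values, not Shopify's actual defaults:

```text
# Group of rules for Google's ads crawler (illustrative values)
User-agent: adsbot-google
Disallow: /checkouts/
Allow: /

# Optional sitemap reference
Sitemap: https://example.com/sitemap.xml
```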

Shopify generates a default robots.txt file that works for most stores. However, you can add the robots.txt.liquid template to your theme to make customizations.
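
For instance, the template typically loops over the robots Liquid object to reproduce the default rules, then layers changes on top. The sketch below assumes the attributes documented for that object (default_groups, user_agent, rules, sitemap); the extra Disallow path is a hypothetical customization, not part of Shopify's defaults:

```liquid
{%- comment -%}
  templates/robots.txt.liquid
  Renders Shopify's default rule groups, then appends one custom rule.
{%- endcomment -%}
{% for group in robots.default_groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}

  {%- comment -%} Hypothetical customization: hide one extra path from all crawlers {%- endcomment -%}
  {%- if group.user_agent.value == '*' -%}
    {{ 'Disallow: /example-hidden-page' }}
  {%- endif -%}

  {%- if group.sitemap != blank -%}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}
```

Because the default groups are still rendered, this approach extends Shopify's defaults rather than replacing them.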

In this tutorial, you'll learn how to customize the robots.txt.liquid template.