How to set up robots.txt in Magento 2?

To enable the Submission to Robots.txt option in Magento 2, follow these steps:

1. Log in to your Admin Panel.

2. Go to Stores > Settings > Configuration and select XML Sitemap under the Catalog section.

3. Open the Search Engine Submission Settings tab and set Enable Submission to Robots.txt to Yes.

4. Click on the Save Config button.
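With this option enabled, Magento adds a Sitemap directive pointing to your generated sitemap into robots.txt. The exact URL depends on your domain and the sitemap filename and path you configured, but the added line looks roughly like this:

```
Sitemap: https://www.example.com/sitemap.xml
```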

Robots.txt configuration in Magento 2

Step 1

Go to Content > Design > Configuration.

Step 2

In the grid that opens, find the Global row, click the Edit link, and open the Search Engine Robots tab.

Step 3

Now choose the Default Robots value you need from the following:

  • INDEX, FOLLOW: Search engines will index your pages and follow the links on them, re-crawling regularly to pick up changes.
  • NOINDEX, FOLLOW: Search engines won’t index your pages, but their bots will still follow the links on them.
  • INDEX, NOFOLLOW: Search engines will index your pages but won’t follow the links on them.
  • NOINDEX, NOFOLLOW: Search engines will neither index your pages nor follow their links, effectively hiding your store from search results.

Step 4

In the Edit custom instruction of robots.txt File field, you can write the custom instructions you need. Read on to find out more about this option.

Step 5

If you want to restore the default settings, click the Reset To Default button. Note that this removes all your custom instructions.

Step 6

Don’t forget to Save Configuration to apply the changes.
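Note that the Default Robots setting controls the robots meta tag rendered in your pages, not the robots.txt file itself. For example, choosing NOINDEX, NOFOLLOW outputs roughly the following tag in each page's head (the exact markup may vary by theme and version):

```
<meta name="robots" content="NOINDEX,NOFOLLOW"/>
```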

Custom instructions for Magento 2 robots.txt

As we already mentioned, you can add custom instructions to the robots.txt configuration. Below are examples you can use in your Magento robots.txt. Keep in mind that Disallow rules apply to the crawler group they follow, so each fragment below is assumed to sit under a User-agent line (for example, User-agent: *).

Allow Full Access

User-agent: *
Disallow:

Disallow Access to All Folders

User-agent: *
Disallow: /
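The difference between these two forms is easy to misread: an empty Disallow permits everything, while a bare "/" blocks every path. A quick way to confirm this is Python's standard-library robots.txt parser (the example.com URL is just a placeholder):

```python
from urllib.robotparser import RobotFileParser

# "Allow full access": an empty Disallow value permits every path.
allow_all = RobotFileParser()
allow_all.parse(["User-agent: *", "Disallow:"])

# "Disallow access to all folders": "/" matches every path as a prefix.
deny_all = RobotFileParser()
deny_all.parse(["User-agent: *", "Disallow: /"])

print(allow_all.can_fetch("*", "https://example.com/any/page"))  # True
print(deny_all.can_fetch("*", "https://example.com/any/page"))   # False
```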

Default Instructions

Disallow: /lib/
Disallow: /*.php$
Disallow: /pkginfo/
Disallow: /report/
Disallow: /var/
Disallow: /catalog/
Disallow: /customer/
Disallow: /sendfriend/
Disallow: /review/
Disallow: /*SID=

Restrict User Account & Checkout Pages

Disallow: /checkout/
Disallow: /onestepcheckout/
Disallow: /customer/
Disallow: /customer/account/
Disallow: /customer/account/login/
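Before deploying rules like these, you can sanity-check them locally with Python's stdlib urllib.robotparser. Note that this parser handles plain path-prefix rules like the ones above, but it does not implement the * and $ wildcard extensions used in some later examples; the domain below is a placeholder:

```python
from urllib.robotparser import RobotFileParser

# A subset of the checkout/account rules from the snippet above,
# placed under a User-agent group as robots.txt requires.
rules = [
    "User-agent: *",
    "Disallow: /checkout/",
    "Disallow: /customer/",
]

rp = RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("*", "https://example.com/checkout/cart/"))      # False
print(rp.can_fetch("*", "https://example.com/customer/account/"))   # False
print(rp.can_fetch("*", "https://example.com/some-category.html"))  # True
```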

Disallow Catalog Search Pages

Disallow: /catalogsearch/
Disallow: /catalog/product_compare/
Disallow: /catalog/category/view/
Disallow: /catalog/product/view/

Disallow URL Filter Searches

Disallow: /*?dir*
Disallow: /*?dir=desc
Disallow: /*?dir=asc
Disallow: /*?limit=all
Disallow: /*?mode*

Restrict CMS Directories

Disallow: /app/
Disallow: /bin/
Disallow: /dev/
Disallow: /lib/
Disallow: /phpserver/
Disallow: /pub/

Disallow Duplicate Content

Disallow: /tag/
Disallow: /review/
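The fragments above are meant to be combined under a User-agent group in the final file. A minimal assembled robots.txt, mixing a few of the rules above with a Sitemap directive (the domain and the selection of rules are illustrative, not a recommendation), might look like this:

```
User-agent: *
Disallow: /checkout/
Disallow: /customer/
Disallow: /catalogsearch/
Disallow: /*SID=

Sitemap: https://www.example.com/sitemap.xml
```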



To avoid glitches in Google's indexing, generate your XML sitemaps automatically. As a result, you'll achieve faster content indexing, exclude irrelevant pages from indexing, define the update frequency for your pages, and gain higher positions in SERPs.
