One of the essential parts of a site’s SEO is configuring the robots.txt file in a web application. It tells web bots and crawlers which pages of the website they may index. Likewise, when you allow search engines to index your website, it is usually also necessary to instruct crawlers, via the robots.txt file, to avoid indexing the pages you have disallowed.
By default, Magento 2 includes settings to generate and manage instructions for the web crawlers and bots that index your store. These instructions are saved in the robots.txt file, which resides at the root of the Magento 2 installation directory. Most search engines recognize these directives and avoid indexing pages of your Magento 2 store that are used internally by the system. You can use the default Magento 2 settings or define custom instructions for all search engines, or for specific ones.
In this post, I will show you how to configure the robots.txt file in Magento 2. I am assuming that you already have Magento 2 installed on your hosting server. If not, sign up for a Cloudways account and launch a Magento 2 store with the convenience of 1-click installation.
Configure the Robots.txt File in Magento 2
1. First of all, log in to your Magento 2 Admin Panel.
2. Click CONTENT and, under Design, choose Configuration.
3. From the list, edit the Global design configuration.
4. Expand the Search Engine Robots section.
5. Now you need to do the following:
a) Set the Default Robots option to one of the following:
INDEX, FOLLOW: Instructs web crawlers to index the store and check back later for changes.
NOINDEX, FOLLOW: Instructs web crawlers to avoid indexing the store but check back later for changes.
INDEX, NOFOLLOW: Instructs web crawlers to index the store once but do not check back for changes.
NOINDEX, NOFOLLOW: Instructs web crawlers to avoid indexing the store and do not check back for changes.
b) In the Edit custom instruction of robots.txt File field, you can enter custom instructions if needed. For example, while your Magento 2 store is still in development, you can disallow access to all folders by adding a custom instruction in this section.
c) The Reset to Default button removes your custom instructions and resets the robots.txt file to the system default.
6. Once you’re done, hit the Save Configuration button.
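As a point of reference, the Default Robots setting in step 5a does not write to robots.txt itself; it controls the robots meta tag that Magento renders in the head of each page. A typical rendering (the exact attribute order may vary by theme) looks like this:

```html
<!-- Emitted in the <head> of each storefront page when
     Default Robots is set to INDEX, FOLLOW -->
<meta name="robots" content="INDEX,FOLLOW"/>
```

Crawlers read this tag on a per-page basis, while the robots.txt directives below apply site-wide before a page is even fetched.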
Custom Instructions for the Magento 2 Robots.txt
Like any other web application or ecommerce platform, Magento 2 lets you add custom instructions to the robots.txt file. Here are some example “Disallow” rules to consider for the Magento 2 robots.txt file.
Allow Full Access
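The snippet for this section appears to have been lost; the standard robots.txt pattern for granting all crawlers full access is an empty Disallow rule:

```text
User-agent: *
Disallow:
```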
Disallow Access to All Folders
User-agent: *
Disallow: /
Disallow: /lib/
Disallow: /*.php$
Disallow: /pkginfo/
Disallow: /report/
Disallow: /var/
Disallow: /catalog/
Disallow: /customer/
Disallow: /sendfriend/
Disallow: /review/
Disallow: /*SID=
Restrict User Account & Checkout Pages
Disallow: /checkout/
Disallow: /onestepcheckout/
Disallow: /customer/
Disallow: /customer/account/
Disallow: /customer/account/login/
Disallow Catalog Search Pages
Disallow: /catalogsearch/
Disallow: /catalog/product_compare/
Disallow: /catalog/category/view/
Disallow: /catalog/product/view/
Disallow URL Filter Searches
Disallow: /*?dir*
Disallow: /*?dir=desc
Disallow: /*?dir=asc
Disallow: /*?limit=all
Disallow: /*?mode*
Restrict CMS Directories
Disallow: /app/
Disallow: /bin/
Disallow: /dev/
Disallow: /lib/
Disallow: /phpserver/
Disallow: /pub/
Disallow Duplicate Content
Disallow: /tag/
Disallow: /review/
I hope you found this quick guide on configuring the robots.txt file in Magento 2 useful and that you can now successfully add custom instructions to it. If you have any questions, feel free to share your thoughts in the comments section below.