How To Configure Robots.txt File In Magento 2

by Fayyaz Khattak  February 19, 2018

Configuring robots.txt is one of the essential parts of a site's SEO. The file instructs web bots and crawlers which pages of the website they may index. Even when you allow search engines to index your website, it is usually also necessary to tell crawlers, via the robots.txt file, which pages they must not index.

Magento 2 Robots.txt

By default, the Magento 2 configuration includes settings to generate and manage instructions for the web crawlers and bots that index your store. These instructions are saved in the robots.txt file, which resides at the root of the Magento 2 installation directory. Most search engines honor these directives and avoid indexing pages of your Magento 2 store that are used internally by the system. You can use the default Magento 2 settings or define custom instructions for all search engines, or for specific ones.

In this post, I will show you how to configure the robots.txt file in Magento 2. I am assuming that you already have Magento 2 installed on your hosting server. If not, sign up for a Cloudways account and launch a Magento 2 store with the convenience of 1-Click installation.

Configure Robots.txt File In Magento 2

1. First of all, log in to your Magento 2 Admin Panel.


2. Click Content and, under Design, choose Configuration.

Magento Configuration

3. In the grid, click Edit in the row of the Global design configuration.

Magento Global Design

4. Expand the Search Engine Robots section.

Magento Search Engine Robots

5. Now you need to do the following:

a) Set Default Robots option to one of the following:

INDEX, FOLLOW: Instructs web crawlers to index the store and check back later for changes.

NOINDEX, FOLLOW: Instructs web crawlers to avoid indexing the store but check back later for changes.

INDEX, NOFOLLOW: Instructs web crawlers to index the store once but do not check back for changes.

NOINDEX, NOFOLLOW: Instructs web crawlers to avoid indexing the store and do not check back for changes.

b) In the Edit custom instruction of robots.txt File field, you can enter custom instructions if needed. For example, while your Magento 2 store is still in development, you can disallow access to all folders by adding a custom instruction in this section.

c) The Reset To Default button removes your custom instructions and resets the robots.txt file to the system default.

6. Once you’re done, hit the Save Configuration button.

Custom Instructions For Magento 2 Robots.txt

Like any other web application or ecommerce platform, Magento 2 lets you add custom instructions to the robots.txt file. Here are some example “Disallow” rules to consider for the Magento 2 robots.txt file.

Allow Full Access
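To give all crawlers unrestricted access to the whole store, leave the Disallow directive empty:

```text
User-agent: *
Disallow:
```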

Disallow Access to All Folders
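To block all crawlers from the entire site (useful while the store is still in development, as mentioned above), disallow the root path:

```text
User-agent: *
Disallow: /
```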

Default Instructions
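A commonly used baseline blocks internal system folders, PHP files, and session-ID URLs. The paths below are a typical example set, assuming the standard Magento directory layout; adjust them to match your installation:

```text
User-agent: *
Disallow: /lib/
Disallow: /*.php$
Disallow: /pkginfo/
Disallow: /report/
Disallow: /var/
Disallow: /catalog/
Disallow: /customer/
Disallow: /sendfriend/
Disallow: /review/
Disallow: /*SID=
```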

Restrict User Account & Checkout Pages
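Account and checkout pages are user-specific and carry no SEO value, so they are usually excluded. The paths below assume the default Magento 2 URL structure:

```text
User-agent: *
Disallow: /checkout/
Disallow: /onestepcheckout/
Disallow: /customer/
Disallow: /customer/account/
Disallow: /customer/account/login/
```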

Disallow Catalog Search Pages
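Catalog search and comparison pages generate near-infinite URL variations that waste crawl budget. A typical exclusion set (paths assume default Magento 2 routes):

```text
User-agent: *
Disallow: /catalogsearch/
Disallow: /catalog/product_compare/
Disallow: /catalog/category/view/
Disallow: /catalog/product/view/
```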

Disallow URL Filter Searches
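Layered-navigation filters append query parameters such as sort direction, mode, and limit, producing many URLs for the same content. Example patterns to block them (the parameter names are the Magento 2 defaults):

```text
User-agent: *
Disallow: /*?dir*
Disallow: /*?dir=desc
Disallow: /*?dir=asc
Disallow: /*?limit=all
Disallow: /*?mode*
```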

Restrict CMS Directories
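Internal application directories should never be crawled. The folders below are the standard Magento 2 installation directories; adjust if your layout differs:

```text
User-agent: *
Disallow: /app/
Disallow: /bin/
Disallow: /dev/
Disallow: /lib/
Disallow: /phpserver/
Disallow: /pub/
```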

Disallow Duplicate Content
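Tag, review, and product-list sorting URLs typically serve the same content under many addresses. An example set of rules to avoid duplicate-content indexing (parameter names assume default Magento 2 product listing toolbar):

```text
User-agent: *
Disallow: /tag/
Disallow: /review/
Disallow: /*?*product_list_mode=
Disallow: /*?*product_list_order=
Disallow: /*?*product_list_limit=
Disallow: /*?*product_list_dir=
```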


I hope you found this quick guide on configuring the robots.txt file in Magento 2 useful and that you can now successfully add custom instructions to it. If you have any questions, feel free to share your thoughts in the comments section below.

