
How To Configure Robots.txt File In Magento 2

Updated on March 16, 2023

4 Min Read

One of the most essential parts of a site's SEO is a properly configured robots.txt file. While you want search engines to index your website, you also need to instruct web crawlers to skip the pages that should not appear in search results, and the robots.txt file is where you give those instructions.

Magento 2 Robots.txt

By default, the Magento 2 configuration includes settings to generate and manage instructions for the web crawlers and bots that index your store. These instructions are saved in the Magento 2 robots.txt file, which resides at the root of the Magento 2 installation directory. Most search engines honor these directives and avoid indexing pages of your Magento 2 ecommerce store that are used internally by the system. You can use the default Magento 2 settings or define custom instructions for search engine bots.

In this post, I will show you how to configure the robots.txt file in Magento 2. I am assuming that you already have Magento 2 installed on your web hosting server. If not, sign up for a Cloudways account right now and launch a Magento 2 store with the convenience of the one-click application installation feature.

Configure Robots.txt File In Magento 2

1. First, log in to your Magento 2 admin panel.

2. Click Content, and under Design, choose Configuration.

Magento Configuration

3. In the grid, click Edit in the Global design configuration row.

Magento Global Design

4. Expand the Search Engine Robots section.

Magento Search Engine Robots

5. Now you need to do the following:

a) Set the Default Robots option to one of the following:

INDEX, FOLLOW: Instructs search engine crawlers to index the store and check back later for changes.

NOINDEX, FOLLOW: Instructs search engine crawlers to avoid indexing the store but check back later for changes.

INDEX, NOFOLLOW: Instructs search engine crawlers to index the store once but do not check back for changes.

NOINDEX, NOFOLLOW: Instructs search engine crawlers to avoid indexing the store and do not check back for changes.

b) In the Edit Custom Instruction of Robots.txt File field, enter custom instructions if needed. For example, while developing your Magento 2 ecommerce store, you can disallow access to all folders by entering custom instructions (see the examples further down).

c) The Reset To Default button removes your custom instructions and resets the Magento 2 robots.txt file to the system default.

6. Once you’re done with the configuration, click the Save Configuration button to apply the changes.
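After saving, Magento serves the combined default and custom directives at /robots.txt. As a quick sanity check (a minimal sketch, not part of Magento itself; example.com is a placeholder domain), you can feed sample directives to Python's built-in urllib.robotparser and confirm they behave as intended, entirely offline:

```python
from urllib.robotparser import RobotFileParser

# Sample directives mirroring a typical Magento 2 robots.txt.
rules = [
    "User-agent: *",
    "Disallow: /checkout/",
    "Disallow: /customer/",
]

rp = RobotFileParser()
rp.parse(rules)  # parse() works on a list of lines, no network needed

# Internal pages are blocked, storefront pages stay crawlable.
print(rp.can_fetch("*", "https://example.com/checkout/cart/"))   # False
print(rp.can_fetch("*", "https://example.com/women/tops.html"))  # True
```

The same check works against your live file once it is deployed, which makes it a handy step in a post-release smoke test.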

Custom Instructions For Magento 2 Robots.txt

Like any other web application or ecommerce platform, Magento 2 lets you add custom instructions to the robots.txt file. Here are some examples of Disallow rules to consider for the Magento 2 robots.txt file.

Allow Full Access

User-agent: *
Disallow:

Disallow Access to All Folders

User-agent: *
Disallow: /

Default Instructions

Disallow: /lib/
Disallow: /*.php$
Disallow: /pkginfo/
Disallow: /report/
Disallow: /var/
Disallow: /catalog/
Disallow: /customer/
Disallow: /sendfriend/
Disallow: /review/
Disallow: /*SID=

Restrict User Account & Checkout Pages

Disallow: /checkout/
Disallow: /onestepcheckout/
Disallow: /customer/
Disallow: /customer/account/
Disallow: /customer/account/login/
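Disallow rules match by prefix, so the /customer/ line above already covers the nested /customer/account/ and /customer/account/login/ paths; the longer lines are harmless but redundant. A hedged offline check of this block (example.com is a placeholder) using Python's urllib.robotparser:

```python
from urllib.robotparser import RobotFileParser

# The account/checkout rules above, checked offline.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /checkout/",
    "Disallow: /onestepcheckout/",
    "Disallow: /customer/",
])

# Prefix matching blocks the nested account pages too.
for path in ("/checkout/", "/onestepcheckout/", "/customer/account/login/"):
    assert not rp.can_fetch("*", "https://example.com" + path)

# Regular catalog pages remain open to crawlers.
assert rp.can_fetch("*", "https://example.com/gear/bags.html")
print("checkout and account pages are blocked; catalog stays crawlable")
```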

Disallow Catalog Search Pages

Disallow: /catalogsearch/
Disallow: /catalog/product_compare/
Disallow: /catalog/category/view/
Disallow: /catalog/product/view/

Disallow URL Filter Searches

Disallow: /*?dir*
Disallow: /*?dir=desc
Disallow: /*?dir=asc
Disallow: /*?limit=all
Disallow: /*?mode*

Restrict CMS Directories

Disallow: /app/
Disallow: /bin/
Disallow: /dev/
Disallow: /lib/
Disallow: /phpserver/
Disallow: /pub/

Disallow Duplicate Content

Disallow: /tag/
Disallow: /review/

Conclusion

The robots exclusion standard, implemented through the robots.txt file, is essential for your Magento 2 ecommerce store when communicating with web crawlers. It defines how to tell search engine bots which pages of your Magento 2 store should be excluded from or opened for crawling. That is why the robots.txt file is significant for the correct indexation of your Magento 2 store and its overall search visibility.

As Magento 2 provides a mechanism for generating the robots.txt file, there is no need to create one manually. All you need to do is add the configuration in the Magento 2 admin panel, and the robots.txt file is generated for you.

I hope you found this quick guide on configuring the robots.txt file in Magento 2 useful. You can now add custom instructions to the robots.txt file to improve your rankings in the SERPs.

If you have any questions, feel free to share them in the comments below.

FAQs

What is robots txt in Magento 2?

In Magento 2, the robots.txt file tells search engines which URLs crawlers can access in your Magento store.

Where is robots txt file in Magento?

The robots.txt file is located at the root of the Magento 2 installation directory. By default, the Magento 2 configuration includes settings to manage the crawlers and bots that index the store.


Abdur Rahman

Abdur Rahman is the Magento whizz at Cloudways. He is growth ambitious, and aims to learn & share information about Ecommerce & Magento Development through practice and experimentation. He loves to travel and explore new ideas whenever he finds time. Get in touch with him at [email protected]
