Search Engine Robots

The Magento 2 configuration includes settings to generate and manage instructions for the web crawlers and bots that index your site. These instructions are saved in a file called robots.txt that resides at the root of your Magento installation.

With the ScandiPWA theme, you can use the default Magento functionality to upload and configure robots.txt:


  1. On the Admin sidebar, go to Content → Design → Configuration.

  2. Open the Global configuration in edit mode.

  3. Expand Search Engine Robots.

  4. Insert the instructions in the Edit Custom instruction of robots.txt file field and set the Default Robots option.

  5. When complete, click Save Configuration.
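The same settings can also be applied from the command line with `bin/magento config:set`. The sketch below assumes the standard Design Configuration paths for these two fields (`design/search_engine_robots/default_robots` and `design/search_engine_robots/custom_instructions`); verify them against your Magento version before relying on them:

```shell
# Set Default Robots (e.g., INDEX,FOLLOW) in the default scope.
# Config path is an assumption based on standard Magento 2 Design Configuration.
bin/magento config:set design/search_engine_robots/default_robots "INDEX,FOLLOW"

# Write custom instructions into the robots.txt field.
bin/magento config:set design/search_engine_robots/custom_instructions "User-agent: *
Disallow: /checkout/"

# Flush the config cache so the change takes effect.
bin/magento cache:clean config
```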

Robots.txt template

Even in regular Magento projects, there are cases where robots.txt is uploaded directly to the server root directory and the field in the backend is left unused. Typically, the default robots.txt is taken and adjusted to the needs of the project. As a starting point, you can use the following generic robots.txt for your store, inserting it in the Edit Custom instruction of robots.txt file field in the Search Engine Robots configuration:

This file contains:

  • Standard M2 directories

  • URLs that appear on the frontend but should be excluded from the index, e.g., My Account

  • SID (session ID) URL parameters

  • /catalog/view/
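A generic template covering the items above might look like the following. This is a sketch based on common community robots.txt templates for Magento 2, not an official file; review each rule against your project's actual URL structure before using it:

```
User-agent: *

# Standard Magento 2 directories
Disallow: /app/
Disallow: /bin/
Disallow: /lib/
Disallow: /pkginfo/
Disallow: /var/

# Frontend URLs that should not be indexed, e.g., My Account
Disallow: /customer/
Disallow: /checkout/
Disallow: /wishlist/
Disallow: /sendfriend/

# URLs carrying a session ID (SID) parameter
Disallow: /*?SID=

# Direct catalog view paths (canonical category/product URLs should be used instead)
Disallow: /catalog/product/view/
Disallow: /catalog/category/view/
```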
