Search Engine Robots
The Magento 2 configuration includes settings to generate and manage instructions for web crawlers and bots that index your site. The instructions are saved in a file called “robots.txt” that resides at the root of your Magento installation.
With the ScandiPWA theme, you can use the default Magento functionality to configure and upload robots.txt:
1. On the Admin sidebar, go to Content → Design → Configuration.
2. Open the Global configuration in edit mode.
3. Expand the Search Engine Robots section.
4. Enter the instructions in the Edit Custom instruction of robots.txt file field and set the Default Robots option.
5. When complete, click Save Configuration.
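If you prefer to script this step rather than use the Admin, the same values can usually be set with `bin/magento config:set`. The config paths below are assumptions based on the standard Magento 2 design configuration for the Search Engine Robots fields; verify them against your installation before relying on this sketch.

```bash
# Assumed config paths for Design Configuration > Search Engine Robots;
# verify them in your installation before scripting this.
bin/magento config:set design/search_engine_robots/default_robots "INDEX,FOLLOW"
bin/magento config:set design/search_engine_robots/custom_instructions "User-agent: *
Disallow: /checkout/"
bin/magento cache:clean config
```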
Even in regular Magento projects, there are cases where robots.txt is uploaded directly to the server root directory and the backend field is ignored. The default robots.txt is usually taken as a base and adjusted for the project. As a template, you can use a generic robots.txt for your store and insert it into the Edit Custom instruction of robots.txt file field in the Search Engine Robots configuration:
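The block below is a minimal sketch of such a file, assembled from the points listed after it and assuming a standard Magento 2 directory layout and default frontend routes (My Account, checkout, catalog view, SID parameter). The exact paths are illustrative and should be reviewed and adjusted for your store before use.

```
# Illustrative sketch only; review and adjust for your store.
User-agent: *

# Standard Magento 2 directories
Disallow: /app/
Disallow: /bin/
Disallow: /dev/
Disallow: /lib/
Disallow: /phpserver/
Disallow: /setup/
Disallow: /update/
Disallow: /var/
Disallow: /vendor/

# Frontend URLs that should not be indexed (e.g., My Account, checkout)
Disallow: /customer/
Disallow: /checkout/
Disallow: /wishlist/

# Catalog view and compare URLs
Disallow: /catalog/category/view/
Disallow: /catalog/product/view/
Disallow: /catalog/product_compare/

# URLs containing a session ID (SID)
Disallow: /*?SID=
```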
This file contains:

* Standard Magento 2 directories
* URLs that appear on the frontend but should be excluded from the index, e.g., My Account
* The SID (session ID) query parameter
* /catalog/view/ URLs