Search Engine Robots
The Magento configuration includes settings to generate and manage instructions for web crawlers and bots that index your site. The instructions are saved in a file called “robots.txt” that resides in the root of your Magento installation.
With the ScandiPWA theme, you can use the default Magento functionality to upload and configure robots.txt.
WHERE TO CONFIGURE?
On the Admin sidebar, go to Content → Design → Configuration, open the Global configuration, and expand the Search Engine Robots section. Insert the instructions in the Edit custom instruction of robots.txt file field and set Default Robots. When complete, click Save Configuration.
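If you prefer to script this, the same values can also be written from the command line with bin/magento config:set. The sketch below is a hedged example: the configuration paths design/search_engine_robots/default_robots and design/search_engine_robots/custom_instructions are assumptions based on the standard Magento 2 design configuration, and robots-custom.txt is a hypothetical file holding your custom instructions, so verify both against your Magento version before using this in deployments.

# Assumed design-configuration paths; verify against your Magento version
bin/magento config:set design/search_engine_robots/default_robots "INDEX,FOLLOW"
# Multi-line custom instructions can be passed as a quoted string (bash);
# robots-custom.txt is a hypothetical file containing your directives
bin/magento config:set design/search_engine_robots/custom_instructions "$(cat robots-custom.txt)"
bin/magento cache:flush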
Even in regular Magento projects, there are cases where robots.txt is uploaded directly to the server root directory and the field in the backend is ignored. The default robots.txt is usually taken as a base and adjusted for the project. As a template, you are welcome to use the following generic robots.txt for your store, which you can insert in the Edit custom instruction of robots.txt file field of the Search Engine Robots configuration:
It contains:
Standard Magento 2 (M2) directories
URLs which appear on the frontend (FE) but should be excluded from the index, e.g., My Account
SID (session ID) URL parameters
/catalog/view/
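As an illustration, below is a minimal sketch of such a generic robots.txt, assuming a standard Magento 2 URL structure. Treat every path and parameter as an assumption to be reviewed and adjusted for your project before publishing.

User-agent: *

# Standard Magento 2 directories
Disallow: /app/
Disallow: /bin/
Disallow: /dev/
Disallow: /lib/
Disallow: /phpserver/
Disallow: /setup/
Disallow: /update/
Disallow: /var/
Disallow: /vendor/

# Frontend URLs that should be excluded from the index, e.g. My Account and checkout
Disallow: /customer/
Disallow: /checkout/
Disallow: /wishlist/

# Catalog view pages excluded from the index
Disallow: /catalog/view/

# Session ID (SID) query parameters
Disallow: /*?SID=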