@FOTISPAN

How to override and protect the robots.txt file of a Drupal installation

You may want to add or remove lines from the default robots.txt file of a Drupal installation. The "problem" is that when Drupal is installed with Composer, the next time you run a `composer update`, the robots.txt file will be overwritten by Drupal's default file.

To avoid this, add the following lines to your composer.json file.

Find the "extra" section and add the "file-mapping" key shown below inside "drupal-scaffold", next to "locations".

**Make sure to check the Drupal core release notes for updates to the default robots.txt file and apply them manually, since scaffolding will no longer do it for you.**

        "drupal-scaffold": {
            "locations": {
                "web-root": "web/"
            },
            "file-mapping": {
                "[web-root]/robots.txt": false
            }
        }
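The same technique works for any other scaffold file you have customized. As a sketch (assuming you have also customized .htaccess, which is another file the scaffold plugin manages), you can exclude several files in one mapping:

```json
"file-mapping": {
    "[web-root]/robots.txt": false,
    "[web-root]/.htaccess": false
}
```

Mapping a file to `false` tells the drupal/core-composer-scaffold plugin to skip it entirely, so your local copy is never touched.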


Or, if you only want to append some lines to the default file, map it to an "append" entry instead:

        "drupal-scaffold": {
            "locations": {
                "web-root": "web/"
            },
            "file-mapping": {
                "[web-root]/robots.txt": {
                    "append": "robots.txt.append"
                }
            }
        }
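The "append" value is a path relative to your project root, and the file name robots.txt.append used here is just a convention — any file will do. On each install or update, its contents are appended to the scaffolded robots.txt, so Drupal's default rules stay current while your additions survive. A minimal sketch of what such a file might contain (the paths below are hypothetical examples):

```
# Custom rules appended via drupal-scaffold
Disallow: /my-private-section/
Disallow: /internal-search
```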