Robots.txt is blocking my labels

In my AdSense account, under “Revenue optimization”, I have crawl errors. When I click “Fix crawl errors”, I see this:

Blocked URLs                                   Error
http://www.rechargeoverload.in/search/label    Robot Denied

My robots.txt:

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: http://www.rechargeoverload.in/atom.xml?redirect=false&start-index=1&max-results=500
Sitemap: http://www.rechargeoverload.in/atom.xml?redirect=false&start-index=501&max-results=500
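
Note that Disallow: /search is a prefix rule, so it also covers the /search/label URLs, which is why the label pages are reported as blocked. Googlebot resolves Allow/Disallow conflicts in favor of the most specific (longest) matching rule, so a possible sketch, assuming the goal is to unblock only the label pages while keeping the rest of /search blocked, would be:

User-agent: *
Disallow: /search
Allow: /search/label
Allow: /

Other crawlers may evaluate the rules differently (some use first match), and whether label archive pages should be crawlable at all is a separate judgment call.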


selenium – why should we follow the robots.txt file

Can anyone tell me exactly what kind of consequences our company could face if we do not follow the robots.txt file? We are crawling the following social media platforms:

  • Facebook
  • LinkedIn
  • Instagram
  • Twitter
  • Reddit
  • Tumblr
  • YouTube

Our scenario: currently we do not follow any social media site's robots.txt file. We fetch data from all the major social media platforms using Selenium, Scrapy, and other technologies, dump it into our database, run some analysis on it, and show the results on our dashboard for our clients.

Note: Our company is registered in the Netherlands.
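
For reference, if the decision is to honor robots.txt, the check is cheap to do before each fetch. A minimal sketch using Python's standard urllib.robotparser (the user agent string and URLs are placeholders, not a real crawler identity):

from urllib import robotparser

# Placeholder crawler identity and target URL; substitute real values.
USER_AGENT = "ExampleCrawlerBot"
TARGET_URL = "https://www.reddit.com/r/example/"

# Fetch and parse the site's robots.txt once per host.
rp = robotparser.RobotFileParser()
rp.set_url("https://www.reddit.com/robots.txt")
rp.read()

# Ask whether this user agent may fetch the target URL.
if rp.can_fetch(USER_AGENT, TARGET_URL):
    print("Allowed by robots.txt")
else:
    print("Disallowed by robots.txt; skip this URL")

Keep in mind that robots.txt compliance is separate from each platform's terms of service, which typically also restrict automated scraping.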

8 – Robots.txt changes are not reflected on the AWS server

We have a deployment process: we initiate a deployment from a branch to an environment through Jenkins. The process then runs in a Docker environment that pulls all the files from my branch, builds an image, and deploys it to the target environment. All other files are reflected correctly, but the robots.txt changes are not.

The changes are reflected in the S3 bucket.

Please suggest how to change the file on the server, or any other way to resolve this issue.
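
One way to narrow this down is to compare what the environment actually serves with the copy in the branch: if they differ, the stale file is most likely baked into a cached Docker image layer or held in a CDN/proxy cache in front of the server. A small sketch (the URL and local path are placeholders for this deployment):

import urllib.request

# Placeholder: the robots.txt as committed in the branch.
with open("robots.txt", encoding="utf-8") as f:
    expected = f.read()

# Placeholder: the environment's public URL.
served = urllib.request.urlopen("https://example.com/robots.txt").read().decode("utf-8")

if served == expected:
    print("Server matches the branch copy")
else:
    print("Stale robots.txt is being served; check Docker layer caching and any CDN cache")

If the served copy is stale, rebuilding the image without the layer cache and invalidating any CDN in front of the environment are the usual first steps.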

8 – How can I prevent my robots.txt file from being overwritten?

I customized the robots.txt file on my Drupal 8 installation.

Every time I run Composer, it overwrites my changes:

Scaffolding files for drupal/core:
  - Copy [web-root]/robots.txt from assets/scaffold/files/robots.txt

Here is my composer.json

I read the documentation, but did not quite understand what to add.

How can I prevent my robots.txt file from being overwritten?

...
    "minimum-stability": "dev",
    "prefer-stable": true,
    "config": {
        "sort-packages": true
    },
    "extra": {
        "drupal-scaffold": {
            "locations": {
                "web-root": "web/"
            }
        },
        "installer-paths": {
            "web/core": [
                "type:drupal-core"
            ],
            "web/libraries/{$name}": [
                "type:drupal-library"
            ],
...
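
Per the drupal/core-composer-scaffold documentation, an individual scaffold file can be excluded by mapping it to false in a file-mapping section. A minimal sketch of the extra section above with that addition (everything else unchanged):

    "extra": {
        "drupal-scaffold": {
            "locations": {
                "web-root": "web/"
            },
            "file-mapping": {
                "[web-root]/robots.txt": false
            }
        },
...

With that mapping in place, Composer should skip scaffolding robots.txt on future drupal/core updates and leave the customized file untouched. (file-mapping can also point the entry at a custom source file instead of false, if the customizations should live in the repository.)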