How to update the robots.txt file

As a site owner, you can update the robots.txt file from the Site Settings area.

You can use any of the following examples as content for your robots.txt file:

To allow all robots complete access

User-agent: *
Disallow:

(or just create an empty “/robots.txt” file, or don’t use one at all)
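
If you want to confirm this behavior locally, Python's standard-library urllib.robotparser can evaluate a robots.txt body without deploying it. This is a minimal sketch; the bot name "SomeBot" and the example.com URL are placeholders, not names from this article:

```python
import urllib.robotparser

# An empty Disallow line means nothing is blocked, so every URL is allowed.
rules = ["User-agent: *", "Disallow:"]

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules)

# Any robot may fetch any page under these rules.
print(rp.can_fetch("SomeBot", "https://example.com/any/page.html"))  # True
```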

 

To exclude all robots from the entire server

User-agent: *
Disallow: /

 

To allow a single robot (and exclude all others)

User-agent: Google
Disallow:

User-agent: *
Disallow: /

 

Alternatively, you can explicitly list the individual pages you want to exclude:

User-agent: *
Disallow: /~folder/junk.html
Disallow: /~folder/foo.html
Disallow: /~folder/bar.html
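
You can verify per-page rules like the ones above with Python's standard-library urllib.robotparser before publishing them. This is a minimal sketch; example.com is a placeholder domain:

```python
import urllib.robotparser

# The same per-page rules as in the example above.
rules = """\
User-agent: *
Disallow: /~folder/junk.html
Disallow: /~folder/foo.html
Disallow: /~folder/bar.html
""".splitlines()

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules)

# Listed pages are blocked; everything else stays crawlable.
print(rp.can_fetch("*", "https://example.com/~folder/junk.html"))  # False
print(rp.can_fetch("*", "https://example.com/index.html"))         # True
```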

 

You can also add a Crawl-delay directive, which sets the number of seconds a crawler should wait between successive requests to the same server:

User-agent: *
Crawl-delay: 5
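
Python's standard-library urllib.robotparser (3.6 and later) can read this value back, which is a quick way to check that the directive parses as intended. A minimal sketch:

```python
import urllib.robotparser

# Parse a robots.txt body containing only the Crawl-delay example above.
rp = urllib.robotparser.RobotFileParser()
rp.parse(["User-agent: *", "Crawl-delay: 5"])

# Returns the delay in seconds, or None if no delay is set.
print(rp.crawl_delay("*"))  # 5
```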

 

Note: The Crawl-delay directive is honored by certain bots such as MSN (Bing) and Yandex, but Googlebot does not support it.

 

For advanced usage of the robots.txt file, please see the following page: http://www.robotstxt.org/robotstxt.html

You can also validate your robots.txt file using the following analyzer tool: http://tools.seobook.com/robots-txt/analyzer/