Robots.txt Generator

The generator offers the following settings:

  • Standard - All robots are: sets the default rule applied to every robot
  • Crawl-Delay: how long a robot should wait between successive requests
  • Sitemap: the URL of your sitemap (leave blank if you don't have one)

Each of the following search robots can also be configured individually:

  • Google
  • Google Image
  • Google Mobile
  • MSN Search
  • Yahoo
  • Yahoo MM
  • Yahoo Blogs
  • Ask/Teoma
  • GigaBlast
  • DMOZ Checker
  • Nutch
  • Alexa/Wayback
  • Baidu
  • Naver
  • MSN PicSearch
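For illustration (the path, delay value, and sitemap URL below are placeholders, not fixed output of the tool), combining these options could produce a file like:

User-agent: Googlebot
Disallow: /private/

User-agent: *
Crawl-delay: 10
Disallow:

Sitemap: https://example.com/sitemap.xml

Note that Crawl-delay is honored by some crawlers, such as Bing's, but is ignored by Google.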

Restricted directories:

The path is relative to the root and must end with a trailing slash "/".
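For example, restricting a hypothetical admin directory would be entered as /admin/ (not /admin) and adds a rule such as:

Disallow: /admin/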

Now create the 'robots.txt' file in the root directory. Copy the text below and paste it into the file.

# robots.txt generator by devtoolspro.com.br

User-agent: *
Disallow: /cgi-bin/
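After uploading, you can confirm the file is being served from the root of your domain (example.com is a placeholder):

curl https://example.com/robots.txt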

About the Robots.txt Generator

The robots.txt file is a key tool for managing how crawlers behave on your site.

You can choose which pages may be crawled, how long a robot should wait between requests, define a sitemap, and even configure rules for each robot individually; the main user-agent tokens are listed below, followed by a short sketch for testing the resulting rules:

  • Googlebot
  • googlebot-image
  • googlebot-mobile
  • MSNBot
  • Slurp
  • Teoma
  • Gigabot
  • Robozilla
  • Nutch
  • ia_archiver
  • baiduspider
  • naverbot
  • yahoo-mmcrawler
  • psbot
  • yahoo-blogs/v3.9
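As a minimal sketch (not part of the generator), Python's standard urllib.robotparser module can load the generated rules and check whether a given robot may fetch a URL; example.com and the test paths below are placeholders:

from urllib.robotparser import RobotFileParser

# The rules generated above; example.com and the paths are placeholders.
robots_txt = """\
User-agent: *
Disallow: /cgi-bin/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("Googlebot", "https://example.com/cgi-bin/script"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/index.html"))      # True

The same module also exposes crawl_delay(useragent) for reading a Crawl-delay directive.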