Robots.txt Generator


Default - All Robots are:
Crawl-Delay:
Sitemap: (leave blank if you don't have one)
Search Robots:
  Google
  Google Image
  Google Mobile
  MSN Search
  Yahoo
  Yahoo MM
  Yahoo Blogs
  Ask/Teoma
  GigaBlast
  DMOZ Checker
  Nutch
  Alexa/Wayback
  Baidu
  Naver
  MSN PicSearch
Restricted Directories: The path is relative to root and must contain a trailing slash "/"



Now create a 'robots.txt' file in your site's root directory, copy the generated text above, and paste it into that file.
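For reference, here is a minimal sketch of the kind of file such a generator produces; the sitemap URL and the restricted directory are placeholder examples, not output from the form above.

    User-agent: *
    Crawl-delay: 10
    Disallow: /cgi-bin/

    User-agent: Baiduspider
    Disallow: /

    Sitemap: https://www.example.com/sitemap.xml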



About Robots.txt Generator

This online robots.txt generator creates a robots.txt file for free. With a robots.txt file added to your website, you control which parts of your site search engine crawlers may crawl and index. Save yourself the stress and use our free tool to generate a correct robots.txt file for your website.

What Is a Robots.txt File?

A robots.txt file is a simple, plain-text file. Its core function is to tell certain search engine crawlers, such as Google's, not to crawl and index particular content on a website, which matters for SEO. If you're not certain whether your website or your client's website has a robots.txt file, it's easy to check:

Simply type yourdomain.com/robots.txt into your browser. You'll either see an error page or a plain-text page of directives. If you are using WordPress with the Yoast plugin installed, Yoast can also generate the file for you. You can also check programmatically, as in the sketch below.
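Here is a minimal sketch of that check using Python's standard library; the domain is a placeholder, not a real site.

    import urllib.error
    import urllib.request

    # Placeholder domain; replace with the site you want to check.
    url = "https://www.example.com/robots.txt"

    try:
        with urllib.request.urlopen(url) as response:
            # A successful response means the site publishes a robots.txt file.
            print(response.read().decode("utf-8", errors="replace"))
    except urllib.error.HTTPError as error:
        # A 404 (or other HTTP error) usually means no robots.txt is available.
        print(f"No robots.txt found: HTTP {error.code}")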

Robots.txt is also known as the robots exclusion protocol, a standard that websites use to tell bots which parts of the site may be crawled and indexed. A complete robots.txt file starts with a "User-agent" line, and below it you can add directives such as "Allow," "Disallow," and "Crawl-Delay." Each rule blocks or allows access for a given crawler to a specified file path on that website. Unless you specify otherwise in your robots.txt file, all files are implicitly allowed for crawling. The sketch below shows how a crawler library applies these rules.
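The following small sketch uses Python's standard urllib.robotparser module; the rules and URLs are illustrative placeholders, not output from this generator.

    from urllib.robotparser import RobotFileParser

    # Illustrative rules: block one directory for every crawler and set a crawl delay.
    rules = [
        "User-agent: *",
        "Disallow: /private/",
        "Crawl-delay: 5",
    ]

    parser = RobotFileParser()
    parser.parse(rules)

    # Paths not matched by a Disallow rule are implicitly allowed.
    print(parser.can_fetch("Googlebot", "https://www.example.com/blog/post"))  # True
    print(parser.can_fetch("Googlebot", "https://www.example.com/private/x"))  # False
    print(parser.crawl_delay("Googlebot"))                                     # 5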
