This isn’t the first time the major search engines have come together for an announcement regarding how they support publishers. In late 2006, all three joined together to support XML Sitemaps and launched sitemaps.org, followed in April 2007 by support for Sitemaps autodiscovery in robots.txt and in February 2008 by support for more flexible storage locations of Sitemap files. In early 2005, the engines declared support for the nofollow attribute on links (in an effort to combat comment spam).
They are simply taking a joint stand on messaging: robots.txt is the standard way to block search engine robots from accessing web sites. They have identified a core set of robots.txt and robots meta tag directives that all three engines support:
For robots.txt, they all support:
- Use of wildcards
- Sitemap location
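As an illustration, a robots.txt file using both of these features might look like the following (the paths, parameter name, and sitemap URL are hypothetical):

```
# Apply these rules to all robots
User-agent: *

# Block access to the /private/ directory
Disallow: /private/

# Wildcard: block any URL containing a session ID parameter
Disallow: /*?sessionid=

# Tell the engines where the XML Sitemap lives
Sitemap: http://www.example.com/sitemap.xml
```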
For robots meta tags, they all support:

- noindex
- nofollow
- noarchive
- nosnippet
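For example, a page that should be kept out of the index while its links are still followed would carry a tag like this in its `<head>` section:

```html
<!-- Keep this page out of the search index, but follow its links -->
<meta name="robots" content="noindex, follow">
```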
For more details on how to create a robots.txt file and how to use it, see Google Webmaster Tools and http://www.robotstxt.org/.
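One way to test robots.txt rules programmatically is Python's standard-library `urllib.robotparser` (a sketch; note that this module implements the original robots.txt draft and may not honor the engines' wildcard extensions, and the bot name and URLs below are made up):

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt that blocks /private/ for all robots
rules = """
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A regular page is allowed; anything under /private/ is not
print(parser.can_fetch("MyBot", "http://www.example.com/index.html"))  # True
print(parser.can_fetch("MyBot", "http://www.example.com/private/x"))   # False
```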