Breaking down the components of SEO, the robots.txt file is the gateway through which web robots, in this case search engine crawlers, access a website. It is vital that your robots.txt file be error-free. To verify this, try the Robots.txt Checker below.
http://www.frobee.com/robots-txt-check
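Beyond an online checker, you can also validate robots.txt rules programmatically. Here is a minimal sketch using Python's standard-library `urllib.robotparser`; the rules and URLs are hypothetical, for illustration only:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: block /private/ for all crawlers,
# but explicitly allow one page inside it.
rules = """
User-agent: *
Allow: /private/public-page.html
Disallow: /private/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)  # parse the rules from a list of lines

# Ask whether a given user agent may fetch each URL
print(parser.can_fetch("*", "https://example.com/private/secret.html"))  # False
print(parser.can_fetch("*", "https://example.com/index.html"))           # True
```

Because rules are matched in order, the more specific `Allow` line takes effect before the broader `Disallow`, which is how you can open up a single page inside an otherwise blocked directory.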