Robots.txt


#1

Recently I saw someone mention a file called robots.txt in connection with the Googlebot web crawler. Apparently the crawler needs to find this file to work correctly? Can anyone explain more about this? Where should the file be, and what should be in it?
Thanks, Garry


#2

In short: robots.txt is a simple text file, and a convention, for telling friendly search engines what to index and what not to index on a site.
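
To make that concrete, here is a minimal sketch of what such a file might look like; the example.com domain and the /private/ path are placeholders, not anything from the original post. The file has to sit at the root of the site so crawlers can find it.

```
# Hypothetical robots.txt for an imaginary site.
# The file must live at the site root, e.g. https://example.com/robots.txt

# Rules for all crawlers:
User-agent: *
# Ask crawlers not to index anything under /private/
Disallow: /private/

# Optionally point crawlers at the sitemap:
Sitemap: https://example.com/sitemap.xml
```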

There is a lot of information available on the topic; you might also look for online checkers or validators that help you format the file correctly.
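
If you would rather check a file programmatically than with an online tool, Python's standard library ships a parser for this format. The snippet below is just an illustrative sketch; the example.com URLs and the paths tested are assumptions.

```python
# Minimal sketch: use Python's built-in robots.txt parser to test
# whether a given crawler is allowed to fetch a given URL.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("https://example.com/robots.txt")  # placeholder site
rp.read()                                     # download and parse the file

# Ask whether a specific crawler may fetch a specific path.
print(rp.can_fetch("Googlebot", "https://example.com/private/page.html"))
print(rp.can_fetch("*", "https://example.com/"))
```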


#3

Ok, thanks