
Robots -
Also known as a Spider, Crawler, or Web Crawler.
"Robots is an automated program used to feed the database of their search engines with new information"
By crawling from link to link in search of new information.
By updating the database entries for sites that are already indexed.
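
The crawling loop itself is simple to picture. Below is a minimal sketch in Python, not any search engine's actual code; the start URL and page limit are placeholder assumptions. It fetches a page, records it, extracts its links, and queues them for the next fetch.

    # Toy crawler sketch: fetch a page, collect its links, repeat.
    from html.parser import HTMLParser
    from urllib.parse import urljoin
    from urllib.request import urlopen

    class LinkCollector(HTMLParser):
        """Collects the href of every <a> tag on a page."""
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    def crawl(start_url, max_pages=10):
        queue, seen = [start_url], set()
        while queue and len(seen) < max_pages:
            url = queue.pop(0)
            if url in seen:
                continue
            seen.add(url)
            try:
                html = urlopen(url).read().decode("utf-8", errors="replace")
            except OSError:
                continue  # skip pages that fail to load
            parser = LinkCollector()
            parser.feed(html)
            # A real robot would index the page's content here,
            # then follow its links to discover new pages.
            for link in parser.links:
                queue.append(urljoin(url, link))
        return seen

    print(crawl("https://example.com"))

A real search engine robot does the same thing at enormous scale, and it also revisits pages it has seen before so the index stays up to date.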
Robots.txt -
This is a file that gives instructions to all search engine robots. Webmasters use it to control which parts of a site robots are allowed to crawl, for example to keep unwanted pages out of the index.
To allow all robots to crawl the entire site, the robots.txt file (placed at the root of the site, e.g. example.com/robots.txt) contains:
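
    User-agent: *
    Disallow:

The asterisk matches every robot, and an empty Disallow value blocks nothing, so the whole site stays open to crawling. Putting / after Disallow: would instead block the entire site.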