A robots.txt file gives your instructions to a search engine robot; what goes in it depends on what you want the robot to do. If you don't have a robots.txt file, then when a robot visits your site and doesn't find one, it will simply start visiting all the pages and content of your site. The file acts like a set of walls that asks crawlers to stay away from certain files, such as JavaScript files, some images, or any other files you don't want crawled. It is also important to note that robots.txt is only a request to well-behaved robots, not an enforced block: it cannot guarantee that a page stays out of a search engine's index, and badly behaved robots can ignore it entirely. Even so, it is important to create a robots.txt file. If you need this file to be created you can get it from XnYnZ.com.
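To see how a well-behaved crawler interprets these rules, you can use Python's standard `urllib.robotparser` module. This is just a small sketch; the rules and the example.com URLs in it are made up for illustration:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules for illustration
rules = [
    "User-agent: *",
    "Disallow: /private/",
]

rp = RobotFileParser()
rp.parse(rules)

# A polite crawler would skip anything under /private/ ...
print(rp.can_fetch("*", "https://example.com/private/page.html"))  # False
# ... but is free to fetch everything else on the site.
print(rp.can_fetch("*", "https://example.com/index.html"))  # True
```

Note that this only tells you what a rule-following robot would do; nothing in robots.txt actually stops a robot that chooses to ignore it.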
A robots.txt is a simple text file that can be created with Notepad. Each robots.txt file should contain two parts:
1. User-agent: names the robots the instructions apply to. `User-agent: *` means all search bots should use the instructions to crawl the website. Unless your site is a complex one, you need not write separate instructions for separate spiders.
2. Disallow: lists the files and directories that should not be crawled.
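Putting the two parts together, a minimal robots.txt might look like this (the paths here are only examples; you would list your own):

```
User-agent: *
Disallow: /scripts/
Disallow: /private-images/
```

This tells every robot to stay out of the `/scripts/` and `/private-images/` directories while leaving the rest of the site open to crawling.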