What is Robots.txt?
A plain-text file placed in the root of a website that tells search engine crawlers which pages or directories they should not crawl. Compliant crawlers read it before fetching pages; it is advisory only and does not enforce access control.
The following rules disallow all crawlers (User-agent: *) from crawling the /cgi-bin and /images directories:
User-agent: *
Disallow: /cgi-bin/
Disallow: /images/
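
For illustration, here is a minimal sketch in Python using the standard library's urllib.robotparser, showing how a well-behaved crawler would honor rules like the ones above. The example.com URL is hypothetical.

from urllib.robotparser import RobotFileParser

# Hypothetical site used for illustration only.
robots_url = "https://www.example.com/robots.txt"

parser = RobotFileParser()
parser.set_url(robots_url)
parser.read()  # download and parse the robots.txt file

# Assuming the site serves the rules shown above,
# /cgi-bin/ is disallowed for all user agents, while other pages are allowed.
print(parser.can_fetch("*", "https://www.example.com/cgi-bin/script.pl"))  # False
print(parser.can_fetch("*", "https://www.example.com/index.html"))         # True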
See KFM