robots.txt
robots.txt is the filename used for implementing the Robots Exclusion Protocol, a standard used by websites to indicate to visiting web crawlers and other web robots which portions of the website they are allowed to visit.
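For illustration, a minimal robots.txt file (with hypothetical paths) asks every crawler, matched by the wildcard User-agent, to avoid one directory while explicitly permitting a single page within it:

    User-agent: *
    Disallow: /private/
    Allow: /private/public-page.html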
The standard, developed in 1994, relies on voluntary compliance. Malicious bots can use the file as a directory of which pages to visit, though standards bodies discourage countering this with security through obscurity. Some archival sites ignore robots.txt. The standard was used in the 1990s to mitigate server overload; in the 2020s many websites began denying bots that collect information for generative artificial intelligence.
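Compliance is implemented entirely on the crawler's side: a well-behaved bot fetches robots.txt and checks each URL against it before requesting the page. A minimal sketch of such a check using Python's standard urllib.robotparser module (the site and user-agent name are placeholders):

    from urllib import robotparser

    # Fetch and parse the site's robots.txt (example.com is a placeholder).
    rp = robotparser.RobotFileParser()
    rp.set_url("https://example.com/robots.txt")
    rp.read()

    # A compliant crawler consults can_fetch() before each request;
    # nothing in the protocol forces a malicious bot to do the same.
    if rp.can_fetch("ExampleBot", "https://example.com/private/page.html"):
        pass  # proceed with the request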
The "robots.txt" file can be used in conjunction with sitemaps, another robot inclusion standard for websites.
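In practice, a site can advertise the location of its sitemap from within robots.txt itself using the Sitemap directive (the URL here is a placeholder):

    Sitemap: https://example.com/sitemap.xml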