Google is releasing robots.txt to the open-source community in the hope that the system will, one day, become a stable internet standard.
On Monday, the tech giant outlined the move to make the Robots Exclusion Protocol (REP) -- better known as robots.txt -- open-source, alongside its matching C++ library.
REP is a way for webmasters to establish the behavior of code attempting to visit a website. The original creator, Martijn Koster, found that his website was being overwhelmed by crawlers, and so, in a bid to reduce server strain, he developed the initial standard in 1994.
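To illustrate how REP rules govern crawler behavior, here is a minimal sketch using Python's standard-library `urllib.robotparser`; the `MyCrawler` user agent and `example.com` URLs are illustrative, not drawn from the article.

```python
from urllib.robotparser import RobotFileParser

# A minimal robots.txt policy: block all crawlers from /private/,
# allow everything else. (Illustrative rules, not from the article.)
rules = """
User-agent: *
Disallow: /private/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# A well-behaved crawler checks can_fetch() before requesting a URL.
print(parser.can_fetch("MyCrawler", "https://example.com/public/page"))   # True
print(parser.can_fetch("MyCrawler", "https://example.com/private/data"))  # False
```

Compliance with robots.txt is voluntary on the crawler's part, which is one reason a formal standard has been sought.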
The link for this article, located at ZDNet, is no longer available.