Web Crawler as Public Enemy
Alexey Andreyev, Webplanet.ru
At the end of the summer, a representative of the hosting provider Peterhost proposed banning the web crawler of the Webalta search engine from the content of sites on its hosting: instead of serving information to real visitors, the servers were spending more and more of their time dealing with the robot.
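In practice such a ban is usually declared in each site's robots.txt file, which well-behaved crawlers read before fetching any pages. A minimal sketch of such a rule, assuming the Webalta crawler identifies itself with the token "WebAlta" (the exact user-agent string is an assumption here, not something stated in this article):

    # robots.txt, placed at the site root (e.g. http://example.com/robots.txt)
    # "WebAlta" is an assumed user-agent token; the crawler's real
    # identification string should be checked against the server logs.
    User-agent: WebAlta
    Disallow: /

    # Everyone else keeps full access.
    User-agent: *
    Disallow:

A crawler that ignores robots.txt can only be stopped at the server itself, for example by rejecting requests that carry its user-agent string.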
In response, representatives of the search engine accused the webmasters of serving their content in the wrong format. It turns out that the people who produce content must not only release it for public access, but also feed it to the crawler in bite-size pieces.

But do we need search engines so badly that we must learn the language of these robots and play by their rules? The Internet without a global search engine is as natural a thing as life without a television. It is this Internet, with its mailing lists, newsgroups, forums, blogs and social networks, where people now settle far more serious matters than anything the universal web crawler makes possible.