Spider trap

A spider trap (or crawler trap) is a set of web pages that, intentionally or unintentionally, causes a web crawler to make an unbounded number of requests or causes a poorly constructed crawler to crash. Spider traps may be created deliberately to "catch" spambots or other crawlers that waste a website's bandwidth.

They may also be created unintentionally by calendars that use dynamic pages with links that continually point to the next day or year.
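For illustration, here is a minimal sketch of how such a trap arises, using Python's standard http.server; the /calendar/ URL scheme and handler are hypothetical, but any page that always links to a "next day" page behaves the same way: every fetched page introduces one more URL the crawler has not yet seen.

```python
# Minimal sketch (hypothetical /calendar/ URL scheme) of a dynamic
# calendar that unintentionally forms a spider trap: every page links
# to the next day, so the crawl frontier never empties.
import datetime
from http.server import BaseHTTPRequestHandler, HTTPServer

class CalendarHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Parse /calendar/YYYY-MM-DD; fall back to today on bad input.
        try:
            day = datetime.date.fromisoformat(self.path.rsplit("/", 1)[-1])
        except ValueError:
            day = datetime.date.today()
        nxt = day + datetime.timedelta(days=1)
        body = f'<h1>{day}</h1><a href="/calendar/{nxt}">next day</a>'.encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(body)  # each response mints one more unseen URL

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), CalendarHandler).serve_forever()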

Common techniques used to create spider traps include indefinitely deep directory structures (such as http://example.com/bar/foo/bar/foo/bar/foo/...), dynamic pages that produce an unbounded number of documents for a crawler to follow, documents filled with enough characters to crash the lexical analyzer parsing them, and documents with session identifiers based on required cookies.

There is no algorithm that can detect all spider traps. Some classes of traps can be detected automatically, but new, unrecognized traps arise quickly.
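As a sketch of what automatic detection can look like, the heuristics below flag two of the trap classes above: overly deep paths and repeating path segments. The thresholds are arbitrary assumptions, and a trap that mints varied, shallow URLs would slip past both checks, which is the sense in which no algorithm catches everything.

```python
# Sketch of two crawler-side heuristics; the thresholds are arbitrary
# assumptions, and neither check catches every class of trap.
from urllib.parse import urlparse

MAX_DEPTH = 8     # assumed cap on path segments
MAX_REPEATS = 2   # assumed cap on occurrences of any single segment

def looks_like_trap(url: str) -> bool:
    segments = [s for s in urlparse(url).path.split("/") if s]
    if len(segments) > MAX_DEPTH:
        return True  # suspiciously deep directory structure
    # Repeating segments suggest a /bar/foo/bar/foo/... style trap.
    return any(segments.count(s) > MAX_REPEATS for s in set(segments))

assert looks_like_trap("http://example.com/bar/foo/bar/foo/bar/foo/bar")
assert not looks_like_trap("http://example.com/2024/05/17/article-title")
```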

A spider trap causes a web crawler to enter something like an infinite loop,[3] which wastes the spider's resources,[4] lowers its productivity, and, in the case of a poorly written crawler, can crash the program.
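One common defense, sketched below with hypothetical helper functions fetch(), extract_links(), and host_of() standing in for a real crawler's HTTP, parsing, and URL-handling layers, is to bound the damage rather than detect the trap: a visited set stops exact-URL cycles, and a per-host page budget caps how many resources any single trap can consume.

```python
# Sketch of bounding a crawl instead of detecting traps. fetch(),
# extract_links(), and host_of() are hypothetical stand-ins.
from collections import Counter, deque

PER_HOST_BUDGET = 1000  # assumed cap on pages fetched from any one host

def crawl(seed, fetch, extract_links, host_of):
    seen, per_host = set(), Counter()
    frontier = deque([seed])
    while frontier:
        url = frontier.popleft()
        if url in seen:
            continue  # visited set breaks exact-URL loops
        host = host_of(url)
        if per_host[host] >= PER_HOST_BUDGET:
            continue  # budget exhausted: a trap on this host stops paying off
        seen.add(url)
        per_host[host] += 1
        frontier.extend(extract_links(fetch(url)))
```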

Polite spiders alternate requests between different hosts and do not request documents from the same server more than once every several seconds,[5] so a "polite" web crawler is affected by spider traps to a much lesser degree than an "impolite" one.
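A minimal sketch of that politeness rule follows, assuming a fixed five-second per-host delay as a stand-in for "every several seconds": URLs are queued per host and dequeued round-robin, skipping any host still inside its cool-down window.

```python
# Sketch of the per-host delay rule; MIN_DELAY = 5.0 is an assumption
# standing in for "once every several seconds".
import time
from collections import defaultdict, deque
from urllib.parse import urlparse

MIN_DELAY = 5.0  # assumed minimum seconds between requests to one host

def polite_order(urls):
    # One FIFO queue per host so requests can alternate between hosts.
    queues = defaultdict(deque)
    for url in urls:
        queues[urlparse(url).netloc].append(url)
    last_hit = {}
    while queues:
        progressed = False
        for host in list(queues):
            # Skip hosts still inside their cool-down window.
            if time.monotonic() - last_hit.get(host, float("-inf")) < MIN_DELAY:
                continue
            last_hit[host] = time.monotonic()
            yield queues[host].popleft()
            progressed = True
            if not queues[host]:
                del queues[host]
        if not progressed:
            time.sleep(0.1)  # every host is cooling down; wait briefly
```

Because a trap is typically confined to a single host, a crawler scheduled this way fetches from it at most once per delay window, which is why the trap costs a polite crawler far less than an impolite one.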