A Web crawler is a computer program that browses the World Wide Web in a methodical, automated manner. This process is called Web crawling or spidering.

A Web crawler is one type of bot, or software agent. It starts with a list of URLs to visit, called the seeds. As the crawler visits these URLs, it identifies all the hyperlinks in each page and adds them to the list of URLs to visit, called the crawl frontier. URLs from the frontier are recursively visited according to a set of policies.
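
The loop described above can be sketched in a few lines of Python. This is a minimal illustration of the seed/frontier idea, not a production crawler: the seed URL, page limit, and helper class names below are assumptions made for the example, and it uses only the Python standard library.

```python
# Minimal crawl loop: start from seed URLs, visit each page, extract
# hyperlinks, and push unseen ones onto the crawl frontier.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collects the href targets of <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(seeds, max_pages=50):
    frontier = deque(seeds)   # URLs still to visit (the crawl frontier)
    visited = set()           # URLs already fetched

    while frontier and len(visited) < max_pages:
        url = frontier.popleft()
        if url in visited:
            continue
        try:
            with urlopen(url, timeout=5) as response:
                html = response.read().decode("utf-8", errors="replace")
        except OSError:
            continue          # skip pages that fail to load
        visited.add(url)

        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            absolute = urljoin(url, link)      # resolve relative links
            if absolute.startswith("http") and absolute not in visited:
                frontier.append(absolute)      # grow the frontier
    return visited


if __name__ == "__main__":
    # Hypothetical seed list, for illustration only.
    print(crawl(["https://example.com/"], max_pages=10))
```

A real crawler would add the "set of policies" mentioned above, such as respecting robots.txt, limiting request rate per host, and deciding which links are worth following.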
Other terms for Web crawlers are ants, automatic indexers, bots, Web spiders, Web robots, and Web scutters.