Tech journalism time capsule: the wonderful world of 1996 computing

If you maintain a site on the World Wide Web, have you ever been "hit" repeatedly by a user who downloads tens, perhaps hundreds, of documents in rapid-fire succession? If so, your site may have been visited by a program that savvy surfers refer to as a robot, a spider, a wanderer, a Web walker, or a Web agent.

Robots are automated programs, and spiders are a type of robot that continually crawls the Web, jumping from one page to another to gather statistics about the Web itself or to build centralized databases indexing its content. Alta Vista, Lycos, OpenText, WebCrawler, and other popular Internet search sites use spiders to build their Web indexes, which let users easily locate every Web page containing information about, say, the MD5 Message Digest Algorithm or tourism in the Canary Islands.
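That crawling loop is simple to picture in code. Below is a minimal sketch of a spider in modern Python, assuming nothing beyond the standard library; the seed URL, the `max_pages` cap, and the `LinkParser` helper are illustrative choices for the example, not how any of the engines above actually worked.

```python
import urllib.request
from urllib.parse import urljoin, urlparse
from html.parser import HTMLParser
from collections import deque

class LinkParser(HTMLParser):
    """Collects the href target of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seed, max_pages=50):
    """Breadth-first crawl from `seed`, yielding each page visited."""
    seen = {seed}          # every URL ever discovered, to avoid revisits
    queue = deque([seed])  # URLs waiting to be fetched
    visited = 0
    while queue and visited < max_pages:
        url = queue.popleft()
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                if "text/html" not in resp.headers.get("Content-Type", ""):
                    continue  # spiders only parse HTML pages
                html = resp.read().decode("utf-8", errors="replace")
        except OSError:
            continue  # unreachable or malformed page; skip it
        parser = LinkParser()
        parser.feed(html)
        for link in parser.links:
            absolute = urljoin(url, link)  # resolve relative links
            if urlparse(absolute).scheme in ("http", "https") and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
        visited += 1
        yield url  # hand each fetched page to the indexer

if __name__ == "__main__":
    for page in crawl("http://example.com/"):
        print("indexed:", page)
```

The `seen` set and the queue are the essential design choices here: the set keeps the spider from fetching the same page twice, and the queue makes the crawl breadth-first, so pages near the seed are indexed before distant ones.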
