Spidering or web crawling:
A spider, or web crawler, is a computer program that browses the pages of the World Wide Web in a systematic, automated manner. Search engines use spiders to keep their data about web sites up to date: the crawler saves a copy of each page it visits, and these copies are later processed to build the search index. Crawlers are also useful for validating HTML against a particular standard such as XHTML, and for checking hyperlinks for broken targets.
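The systematic, automated browsing described above can be sketched as a breadth-first traversal: visit a page, extract its links, and queue any links not yet seen. The sketch below is illustrative only; the `fetch` callable and the in-memory `site` dictionary are assumptions standing in for real HTTP requests, and a production crawler would also honor robots.txt, rate limits, and politeness policies.

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects the href targets of <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, fetch, max_pages=100):
    """Breadth-first crawl from start_url.

    `fetch` is any callable returning the HTML of a URL (in practice
    an HTTP GET); it is injected here so the sketch stays offline.
    Returns {url: html} for every page visited, i.e. the "copy of the
    pages" that a search engine would later index.
    """
    seen = {start_url}
    queue = deque([start_url])
    pages = {}
    while queue and len(pages) < max_pages:
        url = queue.popleft()
        try:
            html = fetch(url)
        except Exception:
            continue  # skip unreachable or non-HTML pages
        pages[url] = html
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            absolute = urljoin(url, link)  # resolve relative links
            if absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
    return pages

# A tiny in-memory "web site" (hypothetical URLs) in place of real HTTP.
site = {
    "http://example.com/":  '<a href="/a">A</a> <a href="/b">B</a>',
    "http://example.com/a": '<a href="/">home</a>',
    "http://example.com/b": '<a href="/a">A again</a>',
}
visited = crawl("http://example.com/", site.__getitem__)
print(sorted(visited))
```

Injecting `fetch` also makes the traversal easy to test; swapping in a real downloader (e.g. one built on `urllib.request.urlopen`) turns the same loop into a working crawler.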