Spiders & web crawlers
A web crawler (also known as a web spider or web robot) is a program or automated script that browses the World Wide Web in a methodical, automated manner. Other, less frequently used names for web crawlers are ants, automatic indexers, bots, and worms (Kobayashi and Takeda, 2000).
With a web crawler you can:
- Extract data from a site whose content is constantly updated;
- Download large numbers of files from web pages, such as MP3s or zipped source code;
- Build a monitoring bot that alerts you when specific conditions are met, for instance when the Nasdaq index drops below a given value;
- Test web pages and links for valid syntax and structure (a basic sketch follows this list);
- Monitor sites to detect when their structure or content changes;
- Search for copyright infringements;
- Build a special-purpose index.
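To illustrate the mechanics behind several of these use cases, here is a minimal sketch of a breadth-first crawler using only Python's standard library. The start URL and page limit are placeholders, and the sketch deliberately omits things a production spider needs, such as robots.txt handling and rate limiting:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collects the href attribute of every <a> tag on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(start_url, max_pages=10):
    """Breadth-first crawl: fetch a page, report its status, queue its links."""
    seen = set()
    queue = [start_url]
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            with urlopen(url, timeout=10) as response:
                html = response.read().decode("utf-8", errors="replace")
        except OSError as err:
            # Covers URLError and HTTPError: a broken link or unreachable host.
            print(f"FAILED {url}: {err}")
            continue
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            queue.append(urljoin(url, link))  # resolve relative links
        print(f"OK     {url} ({len(parser.links)} links)")


if __name__ == "__main__":
    crawl("https://example.com")  # placeholder start URL
```

The seen set is what keeps the crawler from revisiting pages and looping forever; reporting FAILED for unreachable URLs is the core of a simple link checker, and the same loop underpins change monitoring and index building.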
InDis has many years of experience building web spiders. Please contact us if you would like us to build a specific spider for you.