Computer robots are simply programs that automate repetitive tasks at speeds impossible for humans to match. On the internet, the term "bot" usually describes any program that interfaces with users or collects data.
Search engines use "spiders," software programs that crawl (or "spider") the web for information, requesting pages much as regular browsers do. In addition to reading the contents of pages for indexing, spiders also record links:
- Link citations can be used as a proxy for editorial trust.
- Link anchor text may help describe what a page is about.
- Link co-citation data may be used to help determine which topical communities a page or website exists in.
- Additionally, links are stored to help search engines discover new documents to crawl later.
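The link-recording step above can be sketched in a few lines. This is a minimal, hypothetical example using Python's standard-library HTML parser: it collects each link's destination and anchor text from a page a spider might have fetched (the URL and markup here are made up for illustration).

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects (href, anchor text) pairs, much like a spider recording links."""
    def __init__(self):
        super().__init__()
        self.links = []    # finished (href, anchor_text) pairs
        self._href = None  # href of the <a> tag currently open, if any
        self._text = []    # text fragments seen inside that tag

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append((self._href, "".join(self._text).strip()))
            self._href = None

# A page the spider might have downloaded (hypothetical markup).
page = '<p>See <a href="https://example.com/seo">SEO basics</a> for more.</p>'
parser = LinkExtractor()
parser.feed(page)
print(parser.links)  # [('https://example.com/seo', 'SEO basics')]
```

A real spider would feed the extracted URLs back into its crawl queue and store the anchor text alongside the target page for ranking purposes.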
Another example is the chatterbot, a bot built as a resource on a specific topic. These bots attempt to act like a human and converse with humans on that topic.