Glossary
Spider (Crawler/Bot)
A spider (also called a crawler or bot) is a search engine component that indexes the web by crawling pages and following the links on them, both to other pages on your own site and to external sites. This is why backlinks are important for SEO.
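For illustration only, here is a minimal sketch in Python of how a spider follows links: it fetches a page, saves a copy, extracts the links, and queues them for crawling. The start URL, page limit, and helper names are invented for the example; a real search-engine spider works at a far larger scale and also respects robots.txt.

from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkParser(HTMLParser):
    # Collects the href values of all <a> tags on a page.
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, max_pages=10):
    queue = deque([start_url])
    seen = set()
    index = {}                      # url -> saved copy of the page
    while queue and len(index) < max_pages:
        url = queue.popleft()
        if url in seen:
            continue
        seen.add(url)
        html = urlopen(url).read().decode("utf-8", errors="ignore")
        index[url] = html           # save a copy, like a search engine index
        parser = LinkParser()
        parser.feed(html)
        for link in parser.links:
            queue.append(urljoin(url, link))   # follow links to further pages
    return index

Calling crawl("https://example.com") (a placeholder address) would return a small dictionary of pages the sketch managed to reach by following links, which is the same basic loop a spider runs.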
How do spiders work?
The spider makes a copy of each page it visits and stores it in the search engine's index. Only pages that spiders have visited and indexed can appear in search results. You can influence how spiders crawl and index your site with a robots.txt file and a sitemap.
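As an example, a robots.txt file is placed at the root of the site (for instance https://www.example.com/robots.txt, a placeholder address) and uses standard directives to tell spiders what they may crawl and where the sitemap is:

User-agent: *
Disallow: /admin/
Sitemap: https://www.example.com/sitemap.xml

The sitemap it points to is an XML file listing the URLs you want indexed; the URL and date below are placeholders:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-01</lastmod>
  </url>
</urlset>

Note that robots.txt is a request, not an access control: well-behaved spiders follow it, but it does not hide pages from visitors.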
