Before we dive into SEO proper, let's take a look at how search engines work from the perspective of crawling and indexing. I'll give you a quick walkthrough of what a search engine crawler, or spider, does. The first question that comes to mind is: what is a spider or crawler? Basically, it is a software program used by search engines to find out what's out there on the ever-changing web.
Search engines may run thousands of spider programs across multiple machines. The process works like this: when a crawler visits your site, it loads the page content into a database. From there, the text of your page is loaded into the search engine index, which is a big database of words and where they occur on different web pages. The whole process is completed in three steps:
1. Crawling (fetching the pages).
2. Indexing (breaking them down into words for the index).
3. Feeding the links (web page addresses/URLs) found on those pages back to the crawling program to be retrieved.
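The three steps above can be sketched as a simple loop. This is just an illustration, not how any real search engine is built: the `PAGES` dictionary is a hypothetical stand-in for the live web, and the word index is a plain Python dict.

```python
from collections import defaultdict, deque
from urllib.parse import urljoin

# Hypothetical stand-in for the live web: URL -> (page text, outgoing links)
PAGES = {
    "http://example.com/": ("welcome to example", ["/about"]),
    "http://example.com/about": ("about example pages", []),
}

def crawl(start_url):
    """Sketch of the three-step loop: fetch, index words, queue new links."""
    index = defaultdict(set)    # word -> URLs: the "big database of words"
    queue = deque([start_url])  # links fed back to the crawling program
    seen = set()
    while queue:
        url = queue.popleft()
        if url in seen or url not in PAGES:
            continue
        seen.add(url)
        text, links = PAGES[url]       # step 1: crawling (fetch the page)
        for word in text.split():      # step 2: indexing (break into words)
            index[word].add(url)
        for link in links:             # step 3: feed found links back
            queue.append(urljoin(url, link))
    return index
```

Calling `crawl("http://example.com/")` fetches both pages and ends up with every word mapped to the URLs it appears on, which is exactly what lets a search engine answer a keyword query quickly.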
Robots.txt & Sitemaps.org:
The first thing a spider visits on your site is "robots.txt". This file contains instructions telling the spider which parts of the website to index and which parts to ignore. All spiders follow a certain set of rules, and these rules are respected by all the major search engines.
How Spiders Find Us:
The most common way a search engine finds us is through hyperlinks from other sites. Some search engines also offer an "add URL" option, so you can submit your website and it gets added to their list of links to crawl. There are also paid and free services, and even software, that submit your site automatically, but it's actually not a good deal: using these tools tends to generate a flood of junk email more than anything else.
But if you take my personal advice, I recommend you let the search engines find you rather than submitting your site yourself.