How Easily Can a "Bot" Crawl Your Site?
Many webmasters believe that great content is the key to Search Engine Optimization, but they overlook an equally important factor: how easily can the bots crawl their sites? Making a fully crawlable site should be a top priority for every webmaster. There is no point in having unique, fresh content if it never makes it into the search engine index.
To ensure that bots can crawl a site successfully, make sure every page can be reached by following links from page to page. Plain text links are the recommended way to connect all the internal pages. Webmasters can also provide an HTML sitemap, a single page that links to every page on the site, to make this indexing easier.
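As a rough sketch, such an HTML sitemap is nothing more than an ordinary page of plain text links. The file names and page titles below are placeholders:

    <!-- sitemap.html: one plain text link per page on the site -->
    <ul>
      <li><a href="/index.html">Home</a></li>
      <li><a href="/about.html">About Us</a></li>
      <li><a href="/articles/seo-basics.html">SEO Basics</a></li>
      <li><a href="/contact.html">Contact</a></li>
    </ul>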
One point to note: if the site has more than 100 links, it is advisable to split the sitemap into several pages, each containing no more than 100 links. A page with more than 100 links may be classified as a 'link farm' by the search engine. This limit was stated in the Google Webmaster Guidelines.
A Google Sitemap is another great tool for getting a new site indexed. Creating a Google sitemap is encouraged because it tells the search engine which pages exist on the site and how often their content is updated. This is particularly useful when some pages are not linked to from within the site.
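For illustration, here is a minimal sitemap.xml in the standard Sitemaps format that Google accepts. The URLs and dates are placeholders; the changefreq field is what tells the bot how often each page's content is updated:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2007-06-01</lastmod>
        <changefreq>daily</changefreq>
        <priority>1.0</priority>
      </url>
      <url>
        <loc>http://www.example.com/articles/seo-basics.html</loc>
        <changefreq>monthly</changefreq>
      </url>
    </urlset>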
Many sites face crawling problems because of the way their internal pages are linked. Search engine bots have difficulty following JavaScript and Flash navigation menus. If this kind of navigation cannot be removed from a site, it is advisable to implement a text-link navigation system in the site's footer, as sketched below. This gives the bots a plain path for indexing the site.
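A minimal sketch of such a fallback, with placeholder page names, placed just before the closing </body> tag so it sits in the footer of every page:

    <!-- Plain text links the bots can follow even when the main
         JavaScript or Flash menu is invisible to them -->
    <div id="footer-nav">
      <a href="/index.html">Home</a> |
      <a href="/products.html">Products</a> |
      <a href="/articles.html">Articles</a> |
      <a href="/contact.html">Contact</a>
    </div>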
According to the Google Webmaster Guidelines, it is advisable to use short, static HTML URLs rather than dynamic ones. Dynamic URLs, especially those carrying session identifiers, may not be indexed because bots tend to ignore such pages. It is also encouraged to avoid the '&id=' parameter when passing variables between pages, as Google may not include those pages in the index. If parameters must be passed between pages, a more meaningful parameter such as '&count=' can be used instead.
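One common way to present short, static-looking URLs while keeping a dynamic site behind them is URL rewriting on the server. The sketch below assumes an Apache server with mod_rewrite enabled and a hypothetical article.php script; the rule lets bots request a clean address while the server quietly serves the dynamic page:

    # .htaccess (hypothetical example, Apache mod_rewrite)
    RewriteEngine On
    # A bot requests /articles/42.html; the server serves /article.php?count=42
    RewriteRule ^articles/([0-9]+)\.html$ /article.php?count=$1 [L]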
When the bots visit, they crawl a page much as an ordinary user would see it in a browser without logging in. A bot therefore can never index a password-protected page. If the objective is to get that content indexed by the search engine, it is advisable to remove the password protection so the bots can reach the pages.
A site should always be search engine friendly so that users can locate its content easily. To conclude, search engines like to index simple, content-rich pages. A simple, crawlable site with great content is the essential foundation of Search Engine Optimization.
Until next time
Jasmine
TAN KC is an SEO consultant with several years of related experience. His advice has helped many webmasters improve their SERP rankings. KC is also the founder of www.USESEO.COM , a site that offers free SEO techniques. http://foxiedesign.blogspot.com