Developing the design, layout, content and ordering of web pages within a website is a very important step in search engine optimization.
A website needs to be user-friendly and, at the same time, search-spider-friendly.
Though each search engine spider uses its own method of weighing your web pages, there are a few commonly accepted practices that can improve the visibility of your web pages and content to search spiders.
The following techniques are adopted by webmasters to make their websites search-spider-friendly:
Cascading Style Sheets
Cascading Style Sheets (CSS) are used to manage a site's layout, colors and design. CSS gives you the ability to abstract the design out of a web page and move it into a separate file, which does not come under the purview of search spiders.
CSS files also make a web developer's life easier, since the entire design of the website can be managed from a single file.
Search Engine Friendly URL
Search Engine Friendly URLs ensure complete spidering of all the pages within your website that you expect search engines to index.
Due to the high volume of products handled on commercial websites, many have switched to dynamic web pages, where the content for each product is loaded into a single common template based on the user's request. This requires passing a product-specific name or identification code via the URL.
Technically, this helps a webmaster manage the web pages easily; for search engine spiders, however, it has proved to be a tedious task, since they are unsure how to navigate such dynamic URLs and therefore skip most of those pages.
Ideally, the entire URL should be readable and meaningful to human eyes. That by itself ensures readability to search engines, so they know what the URL has in store for the user.
I would suggest keeping URLs short and descriptive, for example (an illustrative address) https://example.com/cars/honda-civic rather than https://example.com/product.php?id=4729&cat=12.
Any page that sits close to the domain name or the root folder enjoys better weightage than pages farther away from it, because search engine spiders assume that pages closer to the root folder are more relevant to the website than the rest.
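One common way to produce readable URLs from dynamic product names is to generate a "slug" and serve it as the page's path. A minimal sketch in Python follows; the function name `slugify` and the example product name are illustrative, not part of any particular platform:

```python
import re
import unicodedata

def slugify(name: str) -> str:
    """Turn a product name into a lowercase, hyphen-separated URL slug."""
    # Normalize accented characters to their closest ASCII equivalents.
    name = unicodedata.normalize("NFKD", name).encode("ascii", "ignore").decode("ascii")
    # Lowercase, then replace each run of non-alphanumeric characters with a hyphen.
    return re.sub(r"[^a-z0-9]+", "-", name.lower()).strip("-")

# Instead of /product.php?id=4729, serve a human-readable path:
print("/cars/" + slugify("Honda Civic 1.8 VTEC"))  # -> /cars/honda-civic-1-8-vtec
```

Most web frameworks and CMSes offer a built-in equivalent of this; the point is that both the visitor and the spider can tell what the page is about from the URL alone.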
Website Content

Plenty of content on your web pages will attract search spiders, and fresh content added to the website will bring the spiders back to your pages often and get you re-ranked in their index.
However, the content on your web pages needs to be unique. Search spiders are clever enough to detect duplicate content, be it on different pages of the same website or the same content showing up on multiple websites.
Suppose you, as a dealer for Honda cars, receive the description and features of a new model from the manufacturer. The manufacturer shares the same information with all its dealers across the country. If all of the dealers put the same content on their websites as part of marketing the car, will that be seen as duplicate content?
Definitely YES as far as search engines are concerned!
The only workarounds here are:
– Avoid using such content on your website.
– Use the content, but disallow those pages to search engine robots.
– Get a content writer to rewrite the article in their own words, so you have fresh content ready for SEO.
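Search engines' exact duplicate-detection methods are proprietary, but the idea can be sketched with word "shingles" (overlapping word sequences) and Jaccard similarity. The dealer texts below are made up for illustration, and no real search engine's scoring is implied:

```python
def shingles(text: str, k: int = 3) -> set:
    """Return the set of k-word sequences (shingles) in the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a: str, b: str) -> float:
    """Jaccard similarity between the shingle sets of two texts (0.0 to 1.0)."""
    sa, sb = shingles(a), shingles(b)
    if not sa and not sb:
        return 1.0
    return len(sa & sb) / len(sa | sb)

dealer_a = "The new Honda model features a 1.8 litre engine and improved mileage."
dealer_b = "The new Honda model features a 1.8 litre engine and improved mileage."
rewritten = "Honda's latest car pairs an efficient 1.8 litre engine with better fuel economy."

print(jaccard(dealer_a, dealer_b))   # identical copy -> 1.0
print(jaccard(dealer_a, rewritten))  # rewritten text -> a much lower score
```

A verbatim copy scores a perfect match, while a rewritten article overlaps only where it genuinely must (model names, specifications), which is why the third workaround above is usually the safest.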
Internal Site Navigation
Proper navigation to each and every page of your website from within its pages is important, both to ensure complete spidering of all pages and to give users access to every page. If you do not tell users and search spiders about the pages your website holds, there is a high chance they will never see all of them.
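Whether every page is actually reachable can be checked with a simple breadth-first walk over the site's internal link graph, the same way a spider discovers pages. The page names and link structure below are hypothetical, made up for illustration:

```python
from collections import deque

# Hypothetical internal link graph: page -> pages it links to.
site_links = {
    "index.html":       ["about.html", "cars.html"],
    "about.html":       ["index.html"],
    "cars.html":        ["honda-civic.html"],
    "honda-civic.html": ["cars.html"],
    "old-promo.html":   [],  # no page links here: spiders will never find it
}

def reachable(start: str, links: dict) -> set:
    """Return all pages reachable from `start` by following internal links."""
    seen, queue = {start}, deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in seen:
                seen.add(target)
                queue.append(target)
    return seen

orphans = set(site_links) - reachable("index.html", site_links)
print(orphans)  # pages a spider starting at the home page would miss
```

Any "orphan" pages found this way should either be linked from the navigation or listed in a sitemap so that spiders and users alike can reach them.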
Maintaining the balance between optimizing your website for human users and for search spiders is an art that can be mastered by a search engine optimizer.