Welcome to part three in this ten-part SEO series.
Cascading Style Sheets (CSS) give us the ability to abstract the design of a webpage, or an entire site, into a secondary document. This gives us a lot of advantages and very few disadvantages. By removing redundant design code from your website you place the content closer to the start of the document, while improving your content-to-markup ratio. It also makes your website easier and more cost-effective to maintain, as you can implement simple design changes by editing only one file.
When converting a website from table-based design to pure CSS-based design there is generally around a 40% decrease in code. The reason is that when most people use tables they end up nesting tables within tables within tables, each with its own attributes (height, width, border, etc). Now multiply all that redundant, unneeded markup by the number of pages on your site and you'll quickly see how Google (or any other search engine) will be able to index your website more efficiently.
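To illustrate the difference, here is a minimal sketch of the same two-column layout done both ways (the ids and widths are hypothetical, not from any particular site). Notice how every table cell repeats presentational attributes, while the CSS version keeps only structure in the markup:

```html
<!-- Table-based: presentational attributes repeated on every page -->
<table width="760" border="0" cellpadding="0" cellspacing="0">
  <tr>
    <td width="200" valign="top">...menu...</td>
    <td width="560" valign="top">...content...</td>
  </tr>
</table>

<!-- CSS-based: the markup carries only structure -->
<div id="menu">...menu...</div>
<div id="content">...content...</div>

<!-- The design lives once, in the stylesheet -->
<style>
  #menu    { float: left; width: 200px; }
  #content { margin-left: 200px; }
</style>
```

Multiply the table version's extra attributes across every page of a site and the 40% figure becomes easy to believe.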
In my research and experience, I have concluded that using these two technologies in conjunction with each other is part of guaranteeing your website's success, especially its compatibility with Google. If you do any research on this topic you will also find a recurring mantra among CSS advocates: tables are for tabular data, not design.
You'll find that most of the highly ranked organic SEO companies implement CSS-based design on their own websites. For examples of CSS-based design check out
A better approach is to rewrite your URLs. On the Linux side Apache has mod_rewrite, and on Windows you can use ISAPI_Rewrite. When you implement a URL-rewriting system you are essentially creating a lookup table of URLs for your site: when a request comes in, the server checks the table for a match and serves the corresponding entry.
To put it in simple terms, what we strive to accomplish with URL rewrites is to mask our dynamic content by having it appear as a static URL. A URL like Article?Id=52&Page=5 could be rewritten to /Article/ID/52/Page/5/, which to a search engine appears to be a directory with an index.htm (or whatever default/index page your particular web server uses). To see an implementation of mod_rewrite check out
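As a sketch, the Article example above could be handled in Apache with a rule like the following in a .htaccess file (the path segments mirror the hypothetical URL from the text; adapt the pattern to your own script names):

```apache
RewriteEngine On

# Map the static-looking URL to the real dynamic script internally:
#   /Article/ID/52/Page/5/  ->  Article?Id=52&Page=5
# The browser and search engines only ever see the friendly URL.
RewriteRule ^Article/ID/([0-9]+)/Page/([0-9]+)/?$ Article?Id=$1&Page=$2 [L,QSA]
```

Because this is an internal rewrite (no redirect flag), the spider indexes the clean URL while your application still receives its usual query string.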
Another goal is to avoid having any extra URLs on your site, such as links for changing currency with a redirect script, links to "Email to a friend" pages, or anything similar. Always use forms to POST data like this back to the same page, or to a single static page, to reduce page count. This issue seems to plague a lot of custom-developed ecommerce systems and CMSes. I've actually seen CMSes that present up to five URLs/links for each page; in the long run the spiders got so confused indexing the catalog that some of the main content pages were not cached.
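As an example of the form-based approach, a currency switcher might look like this hedged sketch (field names are hypothetical). Because it POSTs back to the current page, there is no extra redirect-script URL for spiders to waste their crawl on:

```html
<!-- POSTs to the same page: no new URL is created for the spider -->
<form method="post" action="">
  <select name="currency">
    <option value="USD">USD</option>
    <option value="EUR">EUR</option>
    <option value="GBP">GBP</option>
  </select>
  <input type="submit" value="Change Currency" />
</form>
```

A plain link like `change-currency.php?to=EUR` would, by contrast, add one crawlable URL per currency per page.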
Internal Site Navigation

If built properly, most websites will never need an XML sitemap, other than to get new pages indexed that much quicker (ecommerce and enterprise sites being exceptions). I do, however, recommend that every website have a user-accessible sitemap linked from every page, both to aid your users and for internal linking.
Most sites with indexing problems have issues with their internal page linking structure. The biggest of these are websites that implement a pure JavaScript-based navigation system, where the menus depend on JavaScript to insert HTML into pages as they're rendered. Google can parse JavaScript menus to find URLs, but all of those pages will only be linked from the JS, not from the pages they're located on (expect no internal PageRank passing). The best JavaScript menus are ones that manipulate the code already on your page, using CSS to change which sections are displayed. An example of a hybrid CSS/JavaScript menu that I like is QuickMenu by OpenCube (these guys have a great support department).
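The hybrid pattern can be sketched like this (not QuickMenu's actual markup; the class names and URLs are illustrative). The links exist as plain HTML that spiders can follow and pass PageRank through, and script or CSS merely toggles their visibility:

```html
<!-- Links live in the HTML, so spiders see them even with JS disabled -->
<ul id="nav">
  <li>
    <a href="/products/">Products</a>
    <ul class="submenu">
      <li><a href="/products/widgets/">Widgets</a></li>
      <li><a href="/products/gadgets/">Gadgets</a></li>
    </ul>
  </li>
</ul>

<style>
  /* The submenu is hidden by default and shown on hover;      */
  /* JavaScript could toggle an "open" class the same way.     */
  #nav .submenu { display: none; }
  #nav li:hover .submenu,
  #nav li.open .submenu { display: block; }
</style>
```

Contrast this with a menu that document.writes its links into the page: the URLs still exist, but only inside script, where internal linking value is lost.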
Keep in mind: the more internal links you have to a page, the more internal strength that page will be given. So when in doubt, link it up.
Testing Your Site Structure

When it comes to reliable website deployment, all I can say is "Test It, Test It, and then Test It Some More". When testing structure I rely on three different programs / Firefox extensions. The first is Xenu Link Sleuth, a great tool to run against your website to figure out how many pages can be spidered and to find dead links. The second is the Web Developer extension for Firefox; make sure you always validate your code when you make changes. And the last is to consult Google and Yahoo to see how many of your pages are in their index compared to how many pages Xenu found: on Yahoo or Google, type site:www.yourdomain.com (don't use Live's site: function, it is useless).
After you've finished testing, if you need to debug your code I strongly recommend the Firebug Firefox extension and the IE7 Developer Toolbar.
Conclusion

When trying to maximize your organic rankings, your internal structure is paramount; consider your site structure to be the equivalent of the foundation of your house. If your foundation is not built adequately your house may be livable, but it may have long-term issues. With websites, the long-term issue will be a failure to maximize your website's ROI, so practice safe and smart structure.