Statistics suggest that as much as 80% of website traffic is generated by search engines. A high search engine ranking for your most relevant keywords and phrases can bring floods of highly targeted traffic to your site. Likewise, ranking poorly in search engines can cost you a great deal of traffic.
Each search engine has its own algorithm for ranking websites in response to searches. Many factors go into the ranking process; however, before a search engine can read your website accurately and rank it highly, its spiders must first be able to navigate the site easily.
Spiders are what the search engines use to “crawl” through web pages to provide the information that determines rankings. If there is content on a page that a spider is unable to access, that content will not be able to help your search engine ranking.
Generally, websites are designed for appearance, with the user in mind. Of course this is essential to the success of a website, but proper coding must not be sacrificed for the sake of appearance. Spiders view the code behind the page, which determines how the page appears in the browser. To see what the code looks like, in Internet Explorer click “View” in the toolbar and select “Source”; in Firefox, click “View” and select “Page Source.”
Spiders easily read the text on a page, but they do not execute JavaScript. Websites that rely on large amounts of JavaScript for their look and features may suffer from poor search engine rankings.
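As an illustration, compare a heading written directly in the markup with the same heading generated by a script. The spider sees the script itself, not the text the script produces (the heading text here is made up for the example):

```html
<!-- Plain HTML: this text appears in the source and spiders can read it -->
<h1>Hand-Crafted Oak Furniture</h1>

<!-- The same text written by JavaScript: a spider sees only the script code -->
<script type="text/javascript">
  document.write("<h1>Hand-Crafted Oak Furniture<\/h1>");
</script>
```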
Spiders also cannot read text that is contained within an image. Every image should have an alt attribute (which spiders can read) describing the image and including keywords where appropriate.
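For example (the file name and wording below are illustrative):

```html
<!-- Without alt text, any words drawn inside the image are invisible to spiders -->
<img src="logo.gif" />

<!-- With a descriptive, keyword-bearing alt attribute -->
<img src="logo.gif" alt="Smith's hand-crafted oak furniture logo" />
```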
Valid XHTML and CSS (Cascading Style Sheets) should be used in order to make the spiders’ job as easy as possible. Valid coding will also help to ensure that your pages are rendered the same in different browsers (Internet Explorer, Mozilla Firefox, Netscape, Safari, etc.). Using word processors or other programs that are not designed primarily for website design can add a great deal of invalid code. Hand coding or using a product such as Macromedia’s Dreamweaver will produce the best results.
The W3C, the organization that develops web standards, provides an online service that allows you to check your XHTML and CSS for valid coding. To validate XHTML, visit http://validator.w3.org. To validate CSS, visit http://jigsaw.w3.org/css-validator.
Using valid XHTML and CSS will:
- Make your website easier for search engine spiders to crawl
- Allow you to achieve an attractive appearance
- Help you to keep that appearance on multiple browsers
- Keep loading times to a minimum
- Allow for easier updates and maintenance
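A minimal page that should pass the W3C validators might look like the sketch below (the title, headings, and stylesheet name are placeholders). Note the XHTML 1.0 Strict doctype and the external stylesheet, which keeps presentation rules out of the markup:

```html
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
  "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en">
<head>
  <title>Example Page</title>
  <meta http-equiv="Content-Type" content="text/html; charset=UTF-8" />
  <!-- Styles live in a separate CSS file, keeping the markup lean -->
  <link rel="stylesheet" type="text/css" href="styles.css" />
</head>
<body>
  <h1>Example Page</h1>
  <p>Plain text content that spiders can read easily.</p>
</body>
</html>
```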