Understanding Search Engine Optimization

Search engines are a primary way that many users will find your site, so ensuring that your pages will appear as high as possible in the results is a key factor in increasing the number of site visitors.

Early Search Techniques

In the mid-1990s, as the Web began to rise in popularity, the first search engines appeared to help users find information in the exponentially increasing number of pages on the Web.

These early search engines relied on Web masters to self-describe their pages by using the meta element in HTML to add keywords and descriptions of the page content. Unfortunately, it became too easy to manipulate this system by inserting keywords that contained popular search terms but had no relation to the actual content of the page.
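
For example, an early page might have described itself with markup like the following (the keywords and description here are invented for illustration):

    <head>
      <title>Acme Widgets</title>
      <!-- Keywords the Web master wants the page found under -->
      <meta name="keywords" content="widgets, gadgets, fasteners" />
      <!-- A short summary some engines displayed in their results -->
      <meta name="description" content="Acme Widgets sells widgets, gadgets, and fasteners." />
    </head>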

PageRank and Google

While Larry Page and Sergey Brin were graduate students at Stanford University, they developed a new ranking algorithm that relied primarily on incoming links to a page. Their basic theory was that while a Web master might use misleading keywords on a page, other Web masters would not perpetuate the deception, because they would not provide links to the page. In other words, while a site might claim in its keywords to be about a currently popular celebrity, other sites are unlikely to link that celebrity’s name to a site that is not actually about her.
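
In the form published in Page and Brin’s original paper, the idea reduces to a simple formula: a page’s rank is built from the ranks of the pages that link to it, with each contribution diluted by how many outgoing links that page has:

    PR(A) = (1 - d) + d * ( PR(T1)/C(T1) + ... + PR(Tn)/C(Tn) )

Here T1 through Tn are the pages that link to page A, C(T) is the number of links leaving page T, and d is a damping factor (the paper suggests 0.85). A link from a highly ranked page that links out sparingly is thus worth far more than a link from a page that links to everything.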

This system, dubbed PageRank, became the basis for the company Google, which Page and Brin founded in 1998. While Google’s system was much more difficult to manipulate than older systems, some Web sites developed systems to artificially increase their ranking. Because of this, modern search companies, including Google, Yahoo!, and Microsoft, maintain strict levels of secrecy around the precise details of their search algorithms. An unfortunate reality of being a modern Web designer is that those few unscrupulous Web masters who have found ways to manipulate their search rankings have made it much more difficult for the honest majority to ensure that they get good rankings for their pages.

Content Is King

By far the most important key to getting good search engine rankings is to have good, meaningful content and to be sure to use the proper XHTML elements to code that content.

Search engines read the code in your page and give more weight to text enclosed in heading tags than to text in, say, paragraphs, the logic being that the text in headings is what the page is “about.” Another consideration is that pages with good content are more likely to keep their visitors once they have been found. Far too many designers become so focused on getting a good search ranking that they sacrifice the usability of their pages. What is the use of a good search ranking if your users immediately leave your site because they cannot find the information they want once they get there?
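
For example, of the two ways of displaying the same page title below, only the first tells a search engine what the page is about (the text and class name are invented for illustration):

    <!-- A true heading: search engines treat this as the page's subject -->
    <h1>Caring for Tropical Fish</h1>

    <!-- A styled paragraph may look identical to visitors,
         but carries no extra weight with search engines -->
    <p class="bigtext">Caring for Tropical Fish</p>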

Consider Accessibility for Search Optimization

Because search engines read the code of the page, they approach your page in many of the same ways that screen readers for blind users do. In general, then, pages that are accessible to disabled users will get higher search engine rankings than those that are not.
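
For example, the alternate text that lets a screen reader describe an image is the same text a search engine indexes (the file name and text are invented for illustration):

    <!-- Useful to both a screen reader and a search engine -->
    <img src="storefront.jpg" alt="The Acme Widgets storefront on Main Street" />

    <!-- Useless to both -->
    <img src="storefront.jpg" alt="photo1" />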

Avoid All-Image Pages

Pages made up of nothing but images will rank much lower because they lack meaningful text for search engines to read. A common design approach is to begin in a program such as Adobe Fireworks or Photoshop and create a mock-up of the finished page, and this mock-up can be a useful design tool. However, many newer designers want to take the mock-up, which is by definition an image, and simply place that image on the Web as the finished page. Even with appropriate alternate text, such a page will rank very low in search engine results; it will also load slowly and be difficult to edit later. A better approach is to use the mock-up as a guide for the layout: take the portions that genuinely should be images and place them in the final page, either as CSS backgrounds or through the XHTML img element, but enter the bulk of the page as text directly in the XHTML.
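
A sketch of that approach, with invented file names and text: decorative pieces of the mock-up become CSS backgrounds, genuine content images keep the img element with alternate text, and the bulk of the page is entered as real text:

    <style type="text/css">
      /* Decorative banner sliced from the mock-up */
      #header { background-image: url(banner.jpg); height: 120px; }
    </style>

    <div id="header"></div>
    <h1>Acme Widgets</h1>
    <p>Welcome to Acme Widgets, the Web's source for quality widgets.</p>
    <!-- A genuine content image, also sliced from the mock-up -->
    <img src="spring-line.jpg" alt="Three widgets from the spring product line" />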

Do Not Rely on Search Engine Optimization Alone

Not every site will necessarily benefit from high search engine rankings. Sites that are part of a larger marketing strategy may not rely heavily on search engine results at all. Placing the site’s address on billboards or in television advertisements may be as effective as, if not more effective than, worrying about one’s Google PageRank. Word of mouth can be an effective strategy, as can social networking sites such as MySpace or Facebook, particularly for blogs or small local sites. Search engine optimization should fit within the site’s overall marketing strategy.

Manual Page Submission

While most search engines attempt to catalog pages automatically, they also allow Web masters to submit pages manually. Yahoo!, for example, offers a paid manual submission service whereby it guarantees that your page will appear under certain keywords, although it will not guarantee exact placement on the results page. Other search engines allow manual submission but may not guarantee placement, and none makes any guarantees as to how long placement will take, as most manual submissions are verified by hand by staff at the search company.

Follow Search Engine Guidelines

Every one of the major search engines publishes a set of guidelines to assist Web masters in building pages that will earn higher rankings. These rules are not set in stone, and in fact the search engines change them frequently, but you should observe them as much as possible. The techniques used by unscrupulous designers who attempt to raise their rankings by fooling the search engines are referred to as black hat techniques.

These techniques include placing repeated keywords in text that is the same color as the background, or in blocks hidden through CSS, either with the display or visibility properties or by positioning the blocks off the page. Search engines have both automated and manual processes for finding sites that engage in these practices, and will either lower the site’s ranking or remove the site from their databases altogether if it is discovered. The risks of using these techniques far outweigh the rewards.
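
To make the warning concrete, markup like the following is exactly what those processes look for (the class names and text are invented for illustration):

    <style type="text/css">
      /* Each rule hides a block of keywords from visitors but not from the code */
      .stuffing1 { display: none; }
      .stuffing2 { visibility: hidden; }
      .stuffing3 { position: absolute; left: -9999px; }
    </style>

    <!-- Text the same color as its background is equally invisible to visitors -->
    <p style="color: #fff; background-color: #fff;">
      cheap widgets cheap widgets cheap widgets
    </p>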
