SEO: 4 Ways to Optimize Your Website
This section describes the key processes involved in optimizing your website to obtain a higher organic ranking with the major search engines. How search engines work is proprietary knowledge; the exact workings of their algorithms are closely guarded commercial secrets. However, guidance on how these algorithms (or "algos") work can be found in, or deduced from, various sources.
Some general guidance is available free of charge, directly from the search engines' own websites. More can be gleaned by examining the various Google and related patents, or from authoritative articles on SEO forums. However, real-world application of this knowledge only comes through experimentation and trial and error.
There are some general rules that everyone needs to follow to optimize a website for better ranking in search results. Applying them provides a route to improved search engine visibility. The guidance in this section applies broadly to the three main engines – Google, Yahoo and MSN. However, given Google's dominance, much of the advice is derived from my interpretation of the Google "Hilltop" patent of 2001, which many SEOs believe was the basis of the so-called Google "Florida" update of November 2003.
Optimize Your Website – The Four Phases of an SEO Project
Alongside the reliable information about the workings of search engines, there is much speculation, myth and rumour. Many spurious ideas are in circulation, and applying them may do more harm than good. In this section, I will stick to tried and trusted conventions for optimizing your website for higher ranking.
- SEO 1 – The Pre-Site Phase
- SEO 2 – The On-Site Phase
- SEO 3 – The Off-Site Phase
- SEO 4 – The Post-Site Phase
How Search Engines Gather Information
Search engines gather information by crawling websites. They move from page to page, visiting sites they already know about and following the links that they find. While crawling, the robots, or spiders, read the source code of each page and send that information back for indexing.
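To make the crawling process more concrete, here is a minimal sketch of a crawler written in Python. The seed URL, the page limit and the use of the standard-library urllib and html.parser modules are my own illustrative choices; real search-engine spiders are vastly more sophisticated (robots.txt handling, politeness rules, distributed queues and so on).

```python
# Minimal illustration of how a spider follows links from page to page.
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class LinkParser(HTMLParser):
    """Collects href values from <a> tags in a page's HTML source."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(seed_url, max_pages=10):
    """Visit pages breadth-first, following the links found on each page."""
    queue = [seed_url]
    visited = set()
    while queue and len(visited) < max_pages:
        url = queue.pop(0)
        if url in visited:
            continue
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", "ignore")
        except Exception:
            continue  # skip pages that cannot be fetched or decoded
        visited.add(url)
        # A real engine would send the page source back for indexing here.
        parser = LinkParser()
        parser.feed(html)
        for link in parser.links:
            queue.append(urljoin(url, link))  # resolve relative links
    return visited


if __name__ == "__main__":
    print(crawl("https://example.com"))
```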
Spiders were designed to read HTML, or related code such as XHTML and the HTML output of server-side languages like PHP. They find it difficult to read pages written in Flash and some other popular web technologies. Spiders cannot directly read JavaScript or images; they can, however, read the alt text that may be provided with GIF, JPEG or PNG images. These are all major areas to address when you optimize your website for Google and the other search engines.
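As a simple illustration of why alt text matters, the sketch below (using the same Python html.parser module as above; the sample HTML is invented) extracts the alt attribute from image tags, which is essentially all a text-only spider can "see" of an image.

```python
# Illustration: a text-only parser can read an image's alt text,
# but not the image content itself. The sample HTML is invented.
from html.parser import HTMLParser


class AltTextParser(HTMLParser):
    """Collects the alt text of <img> tags, much as a crawler might."""

    def __init__(self):
        super().__init__()
        self.alt_texts = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            alt = dict(attrs).get("alt")
            # An image without alt text contributes nothing a spider can index.
            self.alt_texts.append(alt or "(no alt text)")


sample_html = """
<img src="logo.png" alt="Acme Widgets company logo">
<img src="banner.jpg">
"""

parser = AltTextParser()
parser.feed(sample_html)
print(parser.alt_texts)  # ['Acme Widgets company logo', '(no alt text)']
```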