Search Engine Guide


Article provided by:
Top Site Listings
© 2002 Orbidex.


Analyzing SERPs and Sites To Your SEO Advantage
By Andrew Gerhart - January 22, 2002

Before beginning any optimization campaign, and most times before taking on an optimization project, it is good practice to do some research and analysis of the website's market and competitors. There is a lot to be learned from this research and analysis, and it will help set the direction of the search engine optimization campaign.

Analyzing SERPs
The SERPs, or search engine results pages, are a good place to start this evaluation of the market and competition. Using Google as an example, go to Google and perform a search for the main keyword of the proposed optimization campaign. (There is always one main keyword for a campaign.) There are many things you should notice about these listings, without even clicking on a link to enter a site.

First, what is the size, in kilobytes, of the pages? Collect the page sizes of the ten listings that Google returns and try to determine whether there is a relationship between them. Are all of the pages under 10K? Are they all under 20K?

The next thing to notice about the competing sites is the domain names being used. Are the results flooded with domain names that are stuffed with keywords? How many characters are in the URLs of the top-ranking sites? Most search engines have a limit, or preference, on the number of characters they will allow in a domain name. Do the URLs of the top-ranking sites contain query strings? Google will index and list pages that have query strings in their URLs, but these pages will have a tough time competing in a highly competitive market; if one or more of the top-ranking pages for a keyword search have query strings in the URL, there is a good chance that the market is not heavily flooded. Another thing to notice about the domain name is its extension: is it a .com, .edu, .org, .gov, etc.? If the listings are populated with sites that have a .edu or .gov extension, then the competition may be tough.
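
These URL checks are easy to automate. The sketch below assumes you have already copied the top listings from a results page into a list; the URLs shown are made up for illustration.

```python
from urllib.parse import urlparse

# Hypothetical URLs copied from the top of a results page.
results = [
    "http://widget-warehouse.example.com/",
    "http://cheap-widgets-online.example.org/catalog?item=3&color=blue",
    "http://engineering.example.edu/widgets/history.html",
]

for url in results:
    parts = urlparse(url)
    print("%-40s len=%2d query=%-5s tld=%s" % (
        parts.netloc,                      # the domain name itself
        len(parts.netloc),                 # character count of the domain
        bool(parts.query),                 # True if the URL carries a query string
        parts.netloc.rsplit(".", 1)[-1]))  # extension: com, org, edu, ...
```

Run against a real set of top-ten listings, a quick scan of this output answers the questions above: keyword-stuffed domains, domain lengths, query strings, and extensions.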

The next thing to examine is how many pages are returned for that particular search term. Was the number of pages below 100,000? If so, the market is fairly unsaturated and there are not a great deal of competing sites. Is the number of pages 1,000,000 or more? If so, the market may be fairly to highly competitive, depending on how high the number of returned pages is. Another important aspect to evaluate is the relevancy of the results returned for the search. Are the search results full of highly relevant pages, or do the listings include a lot of irrelevant pages? If the latter is true, then having a themed site with relevant content may be a strong point for your campaign.

As link popularity can play a large role in the success of a site, you should determine what the competition's link popularity is. On certain search engines, you can find out the link popularity of a site by searching for link: followed by the site's URL. There are also websites offering tools that will query the search engines automatically and return the results; Search Engine World offers a link popularity tool, as does Market Leap.

Analyzing The Source Code
Once you are on one of the pages, view its source code. This will allow you to determine a number of things. First, are they using any scripting, such as JavaScript? Are they using CSS style sheets? If they are, are they using hidden layers? You run a risk when using hidden layers, although users of hidden layers are mainly caught when a page is reviewed by a human. Using hidden layers through CSS allows them to include links and keyword-rich content that is not viewable to the visitor but is food for the search engine robots.

At the top of the page's source code should be the title tag. Are their target keywords in the title, and if so, how many times is each keyword repeated? Is the title stuffed, that is, repeated over and over, with the main keyword? Collecting and analyzing the titles of numerous high-ranking pages will allow you to construct your title so that it is click-enticing and, at the same time, has the correct number and placement of keywords to help your optimization campaign. The title tag, and the number and placement of keywords within it, is important in a good number of search engines and directories.

Also at the top of the source code will be the META tags for the page. The META tags are not of dire importance anymore, but they should be evaluated, because some search engines still use one or more of them. Are there META tags in place? If so, is the META Description relevant to the actual content? Take note of how many times the target keyword was used and where it was used within the description tag. The META Keywords tag is of little importance for almost all search engines, but nonetheless, you can still gather information about your competitors from their tags. What other keywords are they targeting? Are these keywords relevant to each other and to the site? We will discuss more about keywords later in the article.
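
Pulling the title and META description out of a page's source can be scripted with Python's standard-library HTML parser. This is a minimal sketch; the sample markup and the keyword are invented for illustration.

```python
from html.parser import HTMLParser

class HeadParser(HTMLParser):
    """Collects the <title> text and the META description of a page."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = ""
    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
        elif tag == "meta":
            d = dict(attrs)
            if d.get("name", "").lower() == "description":
                self.description = d.get("content", "")
    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False
    def handle_data(self, data):
        if self.in_title:
            self.title += data

# Made-up competitor markup; in practice, feed it a saved page source.
page = ('<html><head><title>Cheap Widgets - Widgets Online</title>'
        '<meta name="description" content="Buy widgets cheap."></head></html>')
p = HeadParser()
p.feed(page)
keyword = "widgets"
print(p.title, "| keyword appears", p.title.lower().count(keyword), "times")
print(p.description)
```

Running this over the saved sources of several high-ranking pages gives a quick table of how often, and where, each competitor repeats the keyword in the title and description.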

There are other tags throughout the HTML that are used for search engine optimization purposes, and you should note some things about them. Some of the commonly used ones are Alt, Heading, NoFrames, Link Title, and Comment tags. Are these tags being used on the page? Are the target keywords being used within them, and if so, how many times?

Analyzing The Site
When analyzing a page and website that you will be competing with, there is a great deal of information to extract. Start by analyzing the page that is returned for the particular search term, and then move on to the rest of the site, as there are off-page criteria that are important to evaluate.

Starting with the individual page, what does the page consist of? Is the page all images, all text, or a mixture of the two? Are the keywords mixed into the content? (We will discuss keyword density later in the article.) Where is the keyword-rich text placed on the page? You should make note of whether it is near the top of the page, what size the font is, and what tags the text is encapsulated in. Does the page use old spam tricks like invisible text or keyword stuffing? It is rare to see this now, but occasionally pages slip past the search engines and are indexed without penalties or being thrown out.

In Google, there are two things to evaluate that play a substantial role in ranking: PageRank (PR) and link popularity. The two are related: a page's PR is a number and rank that Google gives to a site and to a page. This number is the output of a calculation that analyzes the incoming links to a page and site, the authority and weight of the pages and sites linking to it, and the number of these authoritative links. This is an extremely basic explanation of the PageRank algorithm and calculation. To check the PageRank of a page or site, you will need to download and install the Google Toolbar.
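
To make that basic explanation concrete, here is a toy power-iteration version of the idea behind PageRank, run over a made-up three-page link graph. Google's real algorithm and scale are far more involved; this only illustrates how incoming links feed a page's score.

```python
# Hypothetical link graph: page -> pages it links to.
links = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
pages = list(links)
d = 0.85  # damping factor from the original PageRank paper
pr = {p: 1.0 / len(pages) for p in pages}  # start with equal rank

for _ in range(50):  # iterate until the scores settle
    new = {}
    for p in pages:
        # Each linking page passes its rank on, split among its outgoing links.
        incoming = sum(pr[q] / len(links[q]) for q in pages if p in links[q])
        new[p] = (1 - d) / len(pages) + d * incoming
    pr = new

print({p: round(v, 3) for p, v in pr.items()})
```

Page "c" ends up strongest because it collects links from both "a" and "b", which is the intuition behind checking a competitor's incoming links.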

Examining the entire site is next, and there are a number of components involved in this as well. The first thing to note is what the site is composed of. Does the site use only HTML, or does it use ASP, PHP, etc.? While Google will index URLs that contain query strings, not all of the search engines do. A dynamically driven site can only be optimized by section, so there may not be as much room for keyword targeting and variation. You should also examine and evaluate the internal linking of the site. Is each page linked to from the homepage? Does the site employ a strict vertical linking structure, or is there horizontal linking between sections? The internal linking of the site will have a bearing on the strength of its pages in some search engines. Another aspect to take note of is the size of the site. Does the site have over 100 pages, over 1,000 pages, or fewer than 10 pages? While surfing through the site, did you notice that all of the pages seem to be related to each other in some way? Do they all fall into a general category or theme? Theming is an optimization strategy that organizes related content on a website and follows a linking strategy and a keyword strategy. This is a very basic explanation, and website themes are worth reading about in more depth.
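
The vertical-versus-horizontal question can be checked mechanically once you have mapped which pages link where. The sketch below assumes a hypothetical page-to-links map and flags links that cross between top-level sections.

```python
# Hypothetical map of internal pages to the internal pages they link to.
site = {
    "/": ["/products/", "/articles/"],
    "/products/": ["/products/widgets/"],
    "/products/widgets/": ["/articles/seo/"],  # crosses into another section
    "/articles/": ["/articles/seo/"],
    "/articles/seo/": [],
}

def section(path):
    # Top-level directory, e.g. "/products/widgets/" -> "products".
    parts = [p for p in path.split("/") if p]
    return parts[0] if parts else ""

# Links whose source and destination sit in different sections are horizontal.
horizontal = [(src, dst) for src, out in site.items()
              for dst in out
              if section(src) and section(dst) and section(src) != section(dst)]
print(horizontal)
```

An empty result would indicate a strictly vertical structure; entries in the list show where the competitor links sideways between sections.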

Analyzing Keywords and Keyword Density
We have all heard it a million times, but it is true: targeting the wrong keywords is the first step on the path to failure. Analyzing the keywords used by competitors and throughout the market can give you new avenues of traffic to pursue, or additional keyword variations. The first places to check for keywords are the META Keywords, Title, and Description tags. You should also look within certain HTML tags on the page, such as the Alt, Heading, Link Title, and Comment tags. You should also check the actual text of the page, as keyword-rich text plays a large role in a page's ranking. Start a list of the keywords you have thought of and add the additional keywords and keyword variations that you can target on your site. To see which keywords and keyword variations are generating traffic, you can use Word Tracker, which is a great tool for keyword research. You can then add the number of searches performed for each keyword next to the keyword on the list. This should allow you to organize your keyword targeting. Remember, it is not always best to target the keyword that is searched for the most!
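
Keeping that list organized is simple once each keyword is paired with its search count. The counts below are invented; in practice you would take the figures from a tool like Word Tracker.

```python
# Hypothetical keyword list with made-up monthly search counts.
keywords = {
    "widgets": 90000,
    "cheap widgets": 4200,
    "blue widget kits": 350,
}

# Rank by demand; the top term is not automatically the best target,
# since it is usually also the most competitive.
ranked = sorted(keywords.items(), key=lambda kv: -kv[1])
for kw, searches in ranked:
    print(f"{kw}: {searches}")
```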

Another important part of an optimization campaign and ranking is the keyword density of a page. Perform a search for the target keyword and collect the URLs of the top ranking sites. There are a number of tools that will automatically analyze the keyword density of a page for you. Search Engine World has a good keyword density tool. Enter the URL into the tool, and it will output the keyword density for that keyword, as well as others, on the page. Each search engine has a preference for keyword density, so it is best to analyze the results before optimizing.
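The calculation behind such tools is straightforward: keyword density is the percentage of the words on a page that are the target keyword. A minimal sketch, using made-up page text:

```python
import re

def keyword_density(text, keyword):
    """Percentage of the words in `text` that match `keyword`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

page_text = "Widgets for sale. Our widgets are the best widgets around."
print(keyword_density(page_text, "widgets"))  # 3 of 10 words -> 30.0
```

Feeding the visible text of each top-ranking page through a function like this lets you compare their densities before deciding on your own.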

The positioning of keywords throughout the page is a factor in ranking as well. Are the target keywords positioned at the top of the page or at the bottom? Are there keywords near the links? Other places to look for keywords are in some of the HTML tags that were previously mentioned, like ALT, Heading, Title, and Image tags.

As you can see, there is quite a bit of information that can be obtained from analyzing the SERPs, your competitors' pages, and other high-ranking pages in the search engines. All of this information will assist you in expanding your knowledge base and your optimization campaign. Everything counts, and not one aspect should be ignored.