by JP Sherman - MarketSmart Interactive
In my white paper, Competitive Search Intelligence, I discussed using keywords to define the left and right limits of your online strategy, and how to use those keywords to identify your competitors and opportunities.
In this article, I'm taking that one step further and applying it to your competition.
Reverse engineering your competitors' keyword strategy requires nothing more than a basic idea of what your defining keywords are. As part of the discovery process, you might find that adjustments to your own keyword strategy are in order.
For this experiment, I will be using the following keywords along with their search frequency from WordTracker:
Running these keywords through some of the proprietary software we use at MSI yields the following top ten players in this market. SE Presence is the number of times a competitor appears in the top 15 results in Google and the top 10 in MSN and Yahoo; SE Saturation is that competitor's search engine market share for the given keywords. In this case, the total number of appearances adds up to 451.
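The presence-and-saturation calculation described above can be sketched in a few lines. This is a minimal illustration, not the proprietary MSI tooling: the domains and counts below are invented, and only the idea (share of all top-ranking appearances) comes from the article.

```python
def se_saturation(presence_counts):
    """Given {domain: number of appearances in the top search results},
    return each domain's share of all appearances as a percentage
    (the article's "SE Saturation")."""
    total = sum(presence_counts.values())
    return {domain: round(100.0 * count / total, 1)
            for domain, count in presence_counts.items()}

# Illustrative data only -- not the article's actual competitor counts.
counts = {"example-books.com": 90, "example-campus.com": 60, "example-texts.com": 30}
shares = se_saturation(counts)
```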
The following analysis is fairly work-intensive: developing a reverse-engineered keyword strategy usually takes the competitive intelligence team between four and five hours of data mining, statistical analysis and interpretation. However, it's worth the effort to determine the strategy of your competitors. The idea is to scrape the competition's title tags while keeping note of the total number of pages each site has. As most SEM practitioners know, unique content is valuable, and repetition reduces that value (from a keyword ranking perspective). Using that principle, the next step is to find out how many unique title tags each competitor has compared to the number of text/html pages it has. The result is a percentage that gives you an idea of how well they are applying SEM best practices to their site.
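The uniqueness check above can be sketched as follows. Rather than crawling a live site, this assumed helper takes a list of (url, title) pairs that a crawler would have produced for the text/html pages; everything here is an illustration of the principle, not the author's actual tooling.

```python
from collections import Counter

def title_uniqueness(pages):
    """pages: list of (url, title) pairs for a site's text/html pages.
    Returns the percentage of pages whose title tag is unique
    (case-insensitive, ignoring surrounding whitespace)."""
    if not pages:
        return 0.0
    titles = Counter(title.strip().lower() for _, title in pages)
    unique = sum(1 for _, title in pages if titles[title.strip().lower()] == 1)
    return 100.0 * unique / len(pages)
```

A site where every page has a distinct title scores 100%; heavy title duplication drags the score toward zero, which is the signal the analysis looks for.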
In this case, you get the following information. The percentage of unique tags is found by comparing the number of unique title tags to the total number of text/html pages on the site. Here, I used a sample of no fewer than 1,000 pages per site, then filtered out images, applications, dead pages and pages with duplicate title tags. Next, I searched for the keyword "college textbooks" (including all of its stems) in the title tags, and compared the number of matching unique title tags to the total number of unique pages (first percentage) and to the total number of text/html pages (second percentage).
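The two percentages above can be computed as sketched below. The stem list is a simplified stand-in for whatever stemming the author's tooling applied, and the function names are assumptions for illustration.

```python
import re

def keyword_title_coverage(titles, total_html_pages, stems):
    """titles: title tags from a site's unique pages.
    total_html_pages: count of all text/html pages on the site.
    stems: keyword stems to look for, e.g. ["college textbook"]
    (a prefix match so "college textbooks" also counts).
    Returns (percent of unique titles containing a stem,
             percent of all text/html pages containing a stem)."""
    unique_titles = {t.lower() for t in titles}
    hits = sum(1 for t in unique_titles
               if any(re.search(r"\b" + re.escape(s), t) for s in stems))
    pct_of_unique = 100.0 * hits / len(unique_titles)
    pct_of_all = 100.0 * hits / total_html_pages
    return pct_of_unique, pct_of_all
```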
While this process is highly labor-intensive, with more and more data sets we can identify whether a trend emerges. Then, looking at your own site: how well does your keyword strategy compare to that of the sites that dominate the search engines? In this case, there was a surprise. Ecampus does very well in the search engines for college-textbook-related terms; however, of the more than 240,000 pages examined, only one had the term "college textbooks", giving it 0.041% for that particular keyword. This statistical outlier would be very interesting to the competitive intelligence department, which would flag the site for research into links, copy and other ranking factors: it's obvious that they are doing something well to dominate the marketspace, and there are valuable lessons and strategies to be identified once those additional factors are analyzed.
The trick here is to have a good set of keywords that define your marketspace and a good selection of who your true competitors are. Using some simple, yet work-intensive, tactics, you can learn which keywords your competition is most concerned about, which keywords they are targeting that you are not, which keywords you're focusing on that your competition is ignoring, and gain a very specific sense of your competitors' keyword strategy.
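The keyword-overlap comparison described above amounts to simple set arithmetic once the keyword lists exist. The keyword sets below are invented examples standing in for the lists the analysis would actually produce.

```python
# Hypothetical keyword sets -- illustrative only.
your_keywords = {"college textbooks", "used textbooks", "textbook rental"}
their_keywords = {"college textbooks", "cheap textbooks", "buy textbooks online"}

gaps_in_your_strategy = their_keywords - your_keywords  # they target, you don't
keywords_they_ignore = your_keywords - their_keywords   # you target, they don't
contested_keywords = your_keywords & their_keywords     # both sides target these
```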
Once the initial work is completed, it's possible to pivot the data to show several different aspects of your competitors' strategies. In future articles, I will explore the different ways the raw data can be displayed to show different aspects of keyword strategies as they're compared to a baseline.
Discuss this article in the Small Business Ideas forum.
JP Sherman is the head of the competitive intelligence section at MarketSmart Interactive. Using data-driven and analytical methodologies, he harnesses the predictive power of data by converting it into actionable intelligence. Read his white paper on competitive search intelligence or contact him at email@example.com.
Copyright © 1998 - 2013 K. Clough, Inc. All Rights Reserved.