A few years ago, in the days before the introduction of Google's Florida Update, SEO was a quasi-Masonic vocation practiced by an expanding order of techno-monks who acted openly but held secret the minute details of their trade. As search engines and SEO techniques evolved, that old order was already dying, long before the arrival of Florida in November 2003.
I was reminded of the olden days yesterday when fielding a question over at one of the SEO forums I spend time in. One of the new members wrote in asking about the correct range of keyword densities for the various search engines. That got me thinking about the lesser-known tricks of the trade that search engine optimizers have used over the years, and about how some of those "tricks" have been incorporated into our SEO practice while others have been roundly rejected by practitioners.
To clearly examine SEO techniques over the years, we need to separate search into two unique time periods, pre-Florida and post-Florida. (We are clearly moving into a third unique period in the history of search, which we will cover in the StepForth Weekly Newsletter.)
On November 16, 2003, Google introduced the most comprehensive algorithm shift in its history. Before this date, Google's algorithm acted primarily on site content and link popularity. After that date, however, Google's formula was much smarter, almost intuitive. For the past two years, Google has used an algorithm capable of understanding content and contextual relationships. Florida made many old-school tricks useless or radically less powerful, and it wasn't long before the other major search engines followed suit.
The first obvious SEO trick was the practice of keyword stuffing. Used in the earliest days of the industry, SEOs would simply place as many instances of their keywords (both visible and invisible) on a page as possible, in the hope a spider would find them. When older SEOs get that far-off glassy look remembering how simple it was ten years ago, this technique is likely what they are thinking of. To this day, we come across meta tags stuffed with nonsensical strings of keywords and phrases. We also continue to see invisible, keyword-stuffed text used to promote a site or page. While we remember the history of these techniques, StepForth optimizers have not used them for at least seven years.
The practice of keyword stuffing led to a technique that measured keyword densities. Keyword densities were one of our favourite pre-Florida secrets. At one time, before the rise of the great Google, search engines operated strictly on the words found on the page. In order to prevent keyword stuffing, search engines measured the ratio of keywords against all words found in the text of a document or page. If that ratio fell into the correct range (generally 3 – 7% of the words found in the document), that document would likely fare better than its competitors.
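The ratio itself is trivial to compute. The sketch below is purely illustrative (the function name and sample copy are my own, and no engine ever published its exact formula), but it shows the kind of measurement density-checking tools performed:

```python
def keyword_density(text: str, keyword: str) -> float:
    """Return the percentage of words in `text` that exactly match `keyword`."""
    words = text.lower().split()
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return 100.0 * hits / len(words)

# Sample copy: 3 occurrences of "widgets" out of 10 words = 30% density,
# far above the 3 - 7% range that once kept a page out of trouble.
copy = "widgets for sale: our widgets are the best widgets around"
density = keyword_density(copy, "widgets")
```

A real tool would also strip punctuation and count multi-word phrases, but the principle is the same: too low and the page looked off-topic, too high and it looked stuffed.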
While writing text to apply to client pages, we are still affected by our use of keyword densities though we rarely actually measure them. It comes out in the writing where force of habit drives us to create copy that would almost certainly fall within those formerly magic keyword densities if actually measured.
Measuring keyword densities is a fairly benign technique. However, since each search engine had a slightly different target density, a much less benign technique evolved from it: the use of doorway pages. Please note that this technique has been banned by Google and has led to the de-listing of several sites that used it, including those of a few SEO shops.
Operating under several different names, such as gateway pages, traffic pages and (a term shared with the pages AdWords ads point to) landing pages, the doorway page technique is still in use today.
The basic premise is to design a series of pages uniquely suited to the various ranking algorithms used by different search engines. MSN and Yahoo have slightly different ranking criteria, but both drive a great deal of traffic. Wouldn't it make sense to create pages specifically designed to rank well at each engine? It might, except that search engine spiders will find every doorway page, regardless of which engine the pages were designed for, clogging search results with duplicate content and degrading the user experience.
Doorway pages often look like carbon copies of each other, with minor variances included to please the different engines. As Google rose to prominence around 2000, the use of doorway pages led to yet another "SEO trick": the interlinking of those doorway pages.
When Google was first understood by SEOs, the most obvious exploit was found in how it measured incoming links to determine the relevance of a document in its index. Links between a series of doorway pages were designed to fool Google into thinking a large base of incoming links had been established for a site. If the SEO could artificially inflate the number of incoming links (often by linking doorway pages together), Google would generally rank the target document higher. Artificial link networks became a mainstay of several SEO shops, some of which used the technique until the summer of 2004, when Google slammed and banned a couple of well-known SEO shops (along with their client lists) for using it.
As Google became better able to determine the context of a document and to detect multiple instances of duplicate content, the doorway page technique shifted towards the creation of several smaller, search-engine-specific sites, each designed to rank well on a unique search engine but ultimately built to work best at Google. The logic and process behind mini-sites was the same as for doorway pages, but at a more robust level due to the increased sophistication of Google and the other search engines. The technique is still in use today; however, Google has penalized sites for its use and is likely to do so again in the future.
After the Florida Update, each of the techniques described above was largely neutralized. They stopped working for the good of a site and, with the exception of keyword density analysis, can actually work against strong rankings for a site or document.
Florida was a rapid departure from the PageRank-dependent search algorithm that many still associate with Google. PageRank was the original name of the Google algorithm; however, that algorithm has changed so many times over the years that it is now remarkably different from what it once was. The term PageRank also carries its own meaning in the SEO community: it is a score Google affixes to a webpage or document, assumed to be a number between 0 and 10 noting the relevance or authority of that document. In previous years, SEOs took the concept of PageRank very seriously, as links from documents with high PageRank tended to perform better in Google's index.
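For readers curious what sat underneath that score, the original PageRank idea can be sketched in a few lines. This is a simplified, illustrative power-iteration version (the graph and parameter values are invented for the example, and Google's production system was far more elaborate even pre-Florida):

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank over a dict mapping each page to its outbound links.

    Each page's rank is a base share plus a damped portion of the rank
    of every page linking to it -- the idea the toolbar score summarized.
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with equal rank everywhere
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for page, outs in links.items():
            if outs:
                share = rank[page] / len(outs)  # split rank across outlinks
                for target in outs:
                    new[target] += damping * share
            else:
                # dangling page: spread its rank evenly across all pages
                for target in pages:
                    new[target] += damping * rank[page] / n
        rank = new
    return rank

# A tiny three-page web: "c" is linked by both "a" and "b",
# so it accumulates more rank than "b", which only "a" links to.
graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(graph)
```

The intuition the article relies on falls straight out of the code: a link passes along a share of the linking page's own rank, which is exactly why inflating the number (and rank) of incoming links was so attractive to pre-Florida SEOs.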
Links have always been an important factor at Google but, as the legions of link-farms that continue to exist have proven, links have always been remarkably easy to manipulate. The algorithm Google used in its pre-Florida days was fairly link agnostic: it didn't care as much where a link came from as it did that the link existed. Knowing Google would reward sites with many incoming links, SEOs started signing their clients up with link-schemes of varying sophistication. Google's post-Florida algorithm was remarkable for its ability to find the contextual connection between linked documents, measure the relevance of those relationships, and determine whether such relationships existed at all. When it discovered that no true relationship existed, it either dumped or devalued the link. It also stopped displaying an accurate metric of the PageRank it assigned to documents in its index, now showing a measure known to be for "entertainment value only". Needless to say, link-farming is a dying profession.
There have been several changes to Google's algorithms since November 2003, many of which have forced SEOs to make subtle adjustments to their tactics and techniques. The biggest accomplishment of Florida, from an SEO standpoint, is that it stands as the first major step Google took to distance its rankings from SEO spam. Florida set a new standard, a cornerstone on which the optimization sector could build new structures for the user-centric services practiced by ethical SEOs.
Jim Hedger is the Executive Editor of the daily webmaster information site SiteProNews.com. He is also a consultant to Metamend Search Engine Marketing and Enquisite Search Metrics. He spends most of his time in Victoria, BC, recovering from traveling to the Internet marketing events and conventions where he spends the rest of his time.