It's been almost one month since Google started rolling out the Florida update
and millions of listings were dropped from the results. In that time, hundreds of search engine marketers and thousands of website owners have dealt with the loss with all the classic signs of bereavement: first denial, then anger, gradually giving way to acceptance and finally healing. We're moving on, understanding that we're just part of the never-ending circle of Google.
I too moved through the process, starting by wondering what the heck Google was doing, then trying to guess, often shaking my head in bewilderment, and finally trying to look at it from Google's perspective. It seems the rest of the SEM industry is doing the same. With a few exceptions, there were no outright attacks on Google at any of the sessions during the recently concluded Search Engine Strategies show in Chicago. With the benefit of hindsight, I can see why Google handled it the way they did. The jury is still out on whether it was the right way.
First, an update...
As we mentioned in the last column, we expected this latest update to be a work in progress. Google confirmed this at the SES show. In fact, during one of the sessions, a site owner whose site had been dropped was very pleasantly surprised to see it back in when the results were put up on the big screen. I pointed out a few instances where the relevancy of results was suffering badly from the broad exclusion of commercial sites, and I expected Google to tweak the algorithm to allow some of these sites back in. That seems to be exactly what is happening. For many of these searches, we're seeing previously excluded commercial sites beginning to come back, making the results more relevant to the searcher. Changes seem to be happening almost daily, and on the more competitive searches, the new rankings do seem to be catching a lot of the previous spam.
Yet Another Hypothesis About Why...
A few weeks ago, I said this was a filter aimed at aggressively optimized and affiliate sites. After several hours of team research and speaking to others in the industry, I'm beginning to think this is part one of a major change in how Google will rank sites. Danny Sullivan put forward his theory that Google is now using two algorithms: a new, more sophisticated one on the more competitive searches and the old one on the less competitive ones.
There has been some discussion about the possible role Google's recent purchase of Applied Semantics may be playing here. At Enquiro, we had a few people point to this as a place to start looking. Rob Sullivan dug a little deeper and came up with an interesting theory, and I've taken it and run a little further. We have no proof that this is happening, but certain things do start to make sense when you look at them from this perspective. Besides, this industry thrives on speculation, so why not throw a little more into the mix?
Applied Semantics' Concept Server used language patterns, including semantics and ontology, both to determine the real meaning of the words on a website page and to anticipate what people are looking for. It tries to interpret concepts based on the words used, their proximity and the patterns they occur in. What if Florida was Google's first attempt to introduce this concept to their search engine?
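To make the idea concrete, here is a toy sketch of how a concept matcher along those lines might score a page. To be clear, this is entirely our own invention for illustration, not anything from Applied Semantics or Google: a "concept" is just a set of related terms, and a page scores well when it covers many of them and uses them close together.

```python
# Toy concept matcher (purely illustrative -- not Applied Semantics' code).
# A "concept" is modeled as a set of related terms; a page scores well when
# it covers many of those terms and uses them close together.

def concept_score(page_words, concept_terms, window=10):
    """page_words: list of words on the page, in order.
    concept_terms: set of words associated with the concept."""
    hits = [i for i, word in enumerate(page_words) if word in concept_terms]
    if not hits:
        return 0.0
    # coverage: what fraction of the concept's vocabulary appears at all
    coverage = len({page_words[i] for i in hits}) / len(concept_terms)
    # proximity: consecutive hits that fall within a small window of words
    near_pairs = sum(1 for a, b in zip(hits, hits[1:]) if b - a <= window)
    return coverage + 0.1 * near_pairs
```

Under a scheme like this, a page about "airline tickets" could rank for a "cheap flights" concept without repeating the exact phrase, which is the kind of behavior people have been reporting since Florida.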
The other unique aspect of Concept Server is that it can refine results on an ongoing basis as it becomes "smarter". It starts by feeding back concepts or results that it feels match the searcher's intentions. If the response isn't positive, it will try to do a better job next time.
Search engines already monitor the relevancy of their search results by looking at the click rate on each results page. If the search engine is doing its job well, there should be a heavy click-through rate on the first page, and the clicks should be spread fairly consistently across the results shown. This indicates that all the results were relevant and the searcher didn't have to go any further.
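As a rough illustration (again, our own construction, not anything any engine has published), a relevance monitor along these lines could reduce a page-one click log to a single score: heavy first-page click-through, spread across many positions, means a good results page.

```python
# Hypothetical page-quality signal from a click log (for illustration only).
# "clicks" is a list of result positions (0-based) that searchers clicked.
from collections import Counter

def relevance_signal(clicks, num_positions=10):
    """High share of clicks on page one, spread across many positions,
    suggests every result shown was relevant."""
    total = len(clicks)
    if total == 0:
        return 0.0
    counts = Counter(clicks)
    first_page = sum(c for pos, c in counts.items() if pos < num_positions)
    ctr = first_page / total  # share of all clicks landing on page one
    # spread: fraction of first-page positions that drew at least one click
    spread = sum(1 for pos in counts if pos < num_positions) / num_positions
    return ctr * spread
```

A page where every click piles onto the top result would score far lower than one where clicks are distributed down the whole page, even at the same click-through rate.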
What if Google is combining the artificial intelligence of Applied Semantics' Concept Server with this monitoring of click-throughs? In this case, Google's algorithm isn't applied universally to every set of search results. Rather, the various factors that make up the algorithm can be adjusted on the fly, delivering results that improve with each search. The more popular the search term, the more searches are conducted and the faster the results improve. As Google monitors more searches, the Concept Server will start to notice patterns between similar concepts and the type of results chosen by the searcher. With every search, Google will be better able to anticipate what the searcher is looking for, even if their query isn't right on target.
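If that's what is happening, the feedback loop might look something like this hypothetical sketch, in which each query carries its own blend weight between the old keyword score and the new concept score. Nothing here comes from Google; the scores and the step size are made up for illustration.

```python
# Speculative sketch of a per-query feedback loop (not Google's algorithm).

def rerank(results, weight):
    """results: list of (url, keyword_score, concept_score) tuples.
    weight 0.0 = pure old keyword ranking, 1.0 = pure concept ranking."""
    return sorted(results,
                  key=lambda r: (1 - weight) * r[1] + weight * r[2],
                  reverse=True)

def adjust_weight(weight, searchers_satisfied, step=0.05):
    """After each batch of searches, lean further on the concept score when
    the click pattern looked satisfied, and back off when it didn't."""
    weight += step if searchers_satisfied else -step
    return min(1.0, max(0.0, weight))
```

Popular queries would accumulate feedback faster, so their weights would converge sooner — which would match the observation that the competitive searches looked relevant first.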
Presenting Our Case...
With that in mind, let's look back at what's happened in the past 4 weeks.
Danny Sullivan theorized that the new algorithm wasn't being applied in every case because of the processor horsepower required. This fits our theory, as these "smarter" queries would put a significantly higher workload on a server than the old searches did.
It would also explain why the most popular searches looked much more relevant right at the beginning, with the less popular searches taking a week or two to improve in relevance, and why search results appear to be changing almost daily.
It also makes sense that the new algorithm wouldn't be applied to most single words, as the Applied Semantics process would have little context to work with on single-word searches.
Finally, there's the question of why commercial sites seemed to be hardest hit. I believe Google knew it had to move quickly to clean up spam, so it started with a crackdown on the most likely culprits, knowing it would be throwing out some of the good with the bad. It also knew that the algorithms would gradually adjust the thresholds, letting the borderline sites back in as the monitoring showed that relevancy had to be improved.
If We're Right, What Does It Mean?
Well, for one thing, it means the Google Dance is a thing of the past. Changes in results will happen fluidly and consistently, driven by ongoing relevancy monitoring of click-throughs. It's almost as if Google has taken a page from Direct Hit's book and given it a Google twist. Direct Hit was the one-time search engine wunderkind that used searcher click-throughs to determine relevance. Apparently its back-end technology still plays a part in determining results on Ask Jeeves and Teoma.
Secondly, it would mean that individual rankings will move much more frequently and reliance on specific keyphrases will become less important.
Thirdly, a change like this will take a while to fully roll out, so Google will continue to take us on a roller coaster ride for the foreseeable future.
Lastly, this would be the first major step forward in search engine technology in quite some time, and that's probably the biggest reason why we think we might be on to something.
If Google Did It, Why?
Consider Google's position. They're still moving towards an IPO. They knew there was a significant problem with the relevance and integrity of their search results. And they know that the 800 pound gorilla, Microsoft, is rattling the cage, waiting to go head to head with them. It's a battle they had no hope of winning as long as their results were filled with spam. But if they could unveil a major technological improvement that put Google back far ahead of the crowd in terms of the quality of their search results, they might have a fighting chance.
Now to the question of timing. Why now? It's pretty obvious. Google is still in an untouchable position when it comes to search engines; they enjoy an 80%-plus market share. They could afford a little short-term turmoil if they knew it would settle down in a couple of weeks. And as the battle with Microsoft looms larger, every week becomes vital. For Google advertisers, the timing might have been disastrous, but the damage to Google would have been dramatically more significant had they waited until the new year.
But Why The Big Secret?
If there was anger towards Google at the Search Engine Strategies conference, it was mainly due to the lack of communication about the Florida update. Many felt that Google owed it to their advertisers, many of whom saw their organic results wiped out overnight, to communicate exactly what it was they were doing.
I have to admit, I was squarely in this camp, until I started thinking about this theoretical new roll out of a substantially different ranking mechanism.
If Google had warned us prior to the update, would it have accomplished anything? More likely, it would have just caused a furor of changes on websites as optimizers, affiliates and site owners tried to avoid being dropped from the results. In the end, it's likely the same sites would have been dropped, only to see some of them come back a few weeks later as the algorithms adjusted. Perhaps Google was doing the SEM industry a favor, saving us from hours of futile work.
"But what about after the update?" respond the nay sayers, "Why did Google not just come out and tell us what they were doing, rather than force us to guess?" This point is a little more valid. Officially, Google's line was that is was just another algorithm change. If our theory is correct, this statement is true in substance, but grossly understated in scope. The Google Guy gave a few more hints on Webmaster World, but remained pretty tight lipped and virtually disappeared from the forums during the worst of the turmoil.
From a strict customer relations perspective, this could have been handled much better on Google's part. But we can't forget the Microsoft factor here. You have to know the gang from Redmond has Google under a high-powered microscope right now. With the battle looming, every day and week Google can keep Microsoft in the dark about their intentions could make the crucial difference between surviving and suffering the same fate as Netscape.
Crunch Time for Google
I have no idea if we're right. All I know is that the pieces seemed to fall into place neatly around this theory. And I do know that small moves on Google's part are not nearly enough to give them a hope of surviving Microsoft's onslaught. They have to make big moves, and make them soon. This could be the first of them.
PS... Other Cool Things from Google
Although not nearly of the same potential scope as an artificially intelligent, self-adapting algorithm, Google did unveil some new features in Chicago that are pretty high on the "neat" scale. First of all, type a UPS waybill number into Google and you'll be taken to the UPS tracking site. You can do the same to track down patent numbers and other highly referenced items. By the way, if you want to see what other ideas Google is playing with, try Google Labs.
Of much more importance from a marketing perspective is Google's recent featuring of Froogle results within the Google results. If you have a US-based e-commerce website that allows consumers to purchase online, you can have your site included in Froogle for free! This is a tremendous opportunity, and we'll be working on it for our clients.
December 15, 2003