Earlier this year, I wrote an article that talked about the problems that search engine marketers had created by spending years focusing on rankings as a sign of successful optimization work. While the article made some important points about the need for the industry to reeducate our clients based on the more realistic and accurate measurement of ROI, I've realized in the last few weeks that we've got our work cut out for us in another area...algorithms.

During the New York Search Engine Strategies event last week, I had the pleasure of moderating several different panels, including a few targeted toward beginners. Time and time again, when we got to the Q&A segment I'd notice a pattern in the types of questions that the audience would ask.

People would ask what keyword density they needed to focus on, or how many words they should use in their Title tag. Well-meaning attendees would ask for the magic number of links to get per week or how to know exactly when to request an incoming link instead of a reciprocal link. In other words, everyone wanted to know the magic formula that would guarantee them great results.

The problem with this line of questioning is that there is no magic formula. Each and every time these questions were asked, the panelists would carefully try to explain that while there may have been magic numbers in the past, those days are behind us now. That's not really the answer that attendees want to hear, but the reality is that search marketers need to find a new way to explain the concepts of algorithms to their customers, one that takes the focus away from math.

In the seminars that I put on with Matt Bailey, I spend a lot of time trying to explain to people that the best way to understand a search engine is to think back to the story of Pinocchio.

You see, deep down, search engines want nothing more than to be real boys (or girls). That's right, it's that simple. As search engine engineers gain more and more ability to tailor the algorithms, their ultimate goal is to help the search engines make choices the way that people do.

Let's think about it this way...

Originally, search engines focused on mathematical formulas. What percentage of a page was a keyword or phrase? How many links did a site have? Where on the page did a keyword appear? What showed up in the Meta Tags? The problem with mathematical formulas is that they can be deciphered and they can be cheated. The proper keyword density could be reached even if every other word on the page was garbage. Links could be purchased, or traded with sites that had nothing to do with each other. In other words, you could meet all the criteria without having a very "good" site.
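To see why a pure formula is so easy to cheat, here's a minimal sketch of the keyword-density idea (the function and the sample pages are my own illustration, not any engine's actual calculation). A genuinely useful page and a page of pure garbage can hit exactly the same "magic" number:

```python
def keyword_density(text: str, keyword: str) -> float:
    """Share of the words on a page that match the keyword (case-insensitive)."""
    words = text.lower().split()
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

# A real sentence and a nonsense one, each with 2 "volvo" out of 9 words.
real = "our volvo dealership services every volvo model in town"
spam = "volvo pencil cloud kitten volvo fence blue sky rain"

print(round(keyword_density(real, "volvo"), 3),
      round(keyword_density(spam, "volvo"), 3))  # 0.222 0.222
```

Both pages score identically, which is exactly the problem: the formula measures the math, not the quality.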

Search engines realized this and they've been working ever since to find better ways to make these judgement calls.

It started with linking. Search engine engineers discovered that if they looked not just at how many links pointed to a site, but also at where those links came from, what the content of those sites was, and even how many links pointed to those sites in turn, they could start to make a better judgement about the quality of those links (or votes). At the same time, search engines realized that they could tell whether a link was a one-way link or a reciprocal link, and made the obvious judgement that a one-way link probably carried a bigger vote of confidence than a traded link did. In other words, search engines started to "learn" how to make judgements more like a human does, which made for better search results.
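The reasoning above can be sketched as a toy scoring heuristic. To be clear, the weights and function names here are illustrative assumptions of mine, not any engine's real formula; the point is only that weighting a link by its source's own popularity, and discounting traded links, produces more human-like judgements than raw counting:

```python
def link_score(source_inbound_links: int, is_reciprocal: bool) -> float:
    """Value of one incoming link under this toy heuristic."""
    weight = 1 + source_inbound_links        # better-linked sources count for more
    return weight * (0.5 if is_reciprocal else 1.0)  # traded links count for less

def site_score(links: list[tuple[int, bool]]) -> float:
    """Sum the votes: each link is (source's inbound links, reciprocal?)."""
    return sum(link_score(n, recip) for n, recip in links)

# Ten traded links from obscure pages vs. two one-way links from popular ones.
traded = [(0, True)] * 10
earned = [(50, False), (30, False)]

print(site_score(traded), site_score(earned))  # 5.0 82.0
```

A raw link count would put the first site ahead ten to two; judging the links the way a human might reverses the result.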

Moving forward, the same holds, and will continue to hold, true for things like keyword placement and copywriting. With more focus on the long tail effect, the shift toward search engine copywriting with a more natural language style is really coming into play. Add in latent semantic indexing, and search engines will continue to improve in their ability to read and judge copy like a human. Ultimately, search engines want to be able to read the content on your page in a way that lets them understand that an "automobile," a "car" and a "Volvo" are all related.
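The core intuition behind that kind of semantic relatedness can be shown in a few lines. This is not real latent semantic indexing (which factors a term-document matrix with singular value decomposition); it's a simplified sketch of the underlying idea, with a made-up four-document corpus, showing that words appearing in the same documents end up with similar vectors even when they never sit next to each other:

```python
docs = [
    "the car dealership sells every automobile model",
    "a used car is still an automobile",
    "the volvo is a safe car",
    "recipes for apple pie and pumpkin pie",
]

def doc_vector(word: str) -> list[int]:
    """One row of a binary term-document matrix: which documents contain the word?"""
    return [1 if word in d.split() else 0 for d in docs]

def cosine(a: list[int], b: list[int]) -> float:
    """Cosine similarity between two term vectors."""
    norm = lambda v: sum(x * x for x in v) ** 0.5
    if not norm(a) or not norm(b):
        return 0.0
    return sum(x * y for x, y in zip(a, b)) / (norm(a) * norm(b))

print(cosine(doc_vector("car"), doc_vector("automobile")))  # high (~0.82)
print(cosine(doc_vector("car"), doc_vector("pie")))         # 0.0
```

"car" and "automobile" come out strongly related because they keep showing up in the same documents, while "pie" doesn't overlap with "car" at all; real LSI refines this with dimensionality reduction, but the relatedness judgement works the same way.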

This is why it's becoming important for search engine marketers to understand concepts like usability, accessibility and even user intent when they are developing pages. While many within the industry have long talked about the need to create pages that are 'search engine friendly' instead of 'search engine optimized,' the day really is coming when web site owners will find themselves shifting their focus from algorithmic math to creating a strong user experience on a search engine friendly web site.

Based on the feedback and questions I hear when I'm speaking, it's clear that that day is still a ways off, but if we can start teaching web site owners that search engines are working to be able to understand content the way a human does, the day will come a bit sooner.


March 7, 2006

Jennifer Laycock is the Editor of Search Engine Guide, the Social Media Faculty Chair for MarketMotive and offers small business social media strategy & consulting. Jennifer enjoys the challenge of finding unique and creative ways to connect with consumers without spending a fortune in marketing dollars. Though she now prefers to work with small businesses, Jennifer’s clients have included companies like Verizon, American Greetings and Highlights for Children.
