It's been more than two and a half years since I first wrote about the "Pinocchio Effect" as a way to explain the ever-changing nature of search engine algorithms. Earlier this month, while preparing for the new small business panel at SES Chicago, I read through that old article and realized just how many areas of the Pinocchio story can be applied to online marketing.

In this six-part series, I'll be exploring six valuable lessons you can learn from the classic story of Pinocchio and offering up some input on how to apply them to your own marketing plans.

Today's post sets the stage with an updated take on my original article: "Search Engine Algorithms: Understanding the Pinocchio Effect." Here's how I described it back then:

You see, deep down, search engines want nothing more than to be real boys (or girls). That's right, it's that simple. As search engine engineers gain more and more ability to tailor the algorithms, their ultimate goal is to help the search engines make choices the way that people do.

Succinctly put: search engines seek to replicate human judgement with their algorithms. Every change they make aims to help them judge a site the way a human would.

The original post was inspired by questions that kept popping up during the Q&A of some sessions at a past SES New York show. At the time I wrote:

People would ask what keyword density they needed to focus on, or how many words they should use in their Title tag. Well-meaning attendees would ask for the magic number of links to get per week or how to know exactly when to request an incoming link instead of a reciprocal link. In other words, everyone wanted to know the magic formula that would guarantee them great results.

The problem with this line of questioning is that there is no magic formula. Each and every time these questions were asked, the panelists would carefully try to explain that while there may have been magic numbers in the past, those days are behind us now. That's not really the answer that attendees want to hear, but the reality is that search marketers need to find a new way to explain the concepts of algorithms to their customers, one that takes the focus away from math.

For the most part, I'm seeing these types of questions fade away at search conferences. That said, they're now being replaced with new questions that follow the same patterns. People want to know how many links they need to get a good ranking. They want to know which social media sites carry extra "weight" to help them rank better. They want to know if blogs are the magic tool to rocket their rankings. They're still ultimately looking for a magic formula, even if they know the ingredients in the formula have changed.

Why Chasing Down the Algorithm Simply Leaves You Exhausted

Of course, that leads to another problem: the problem with SEO formulas. While it's true search engine algorithms are essentially complex "formulas," it's not generally feasible to focus on reverse-engineering them so you can determine exactly what changes to make to your site. Engines like Google rely on literally hundreds of factors in determining ranking, and no one but their engineering team truly knows how each of those factors is weighted. While it's true there are some individuals and companies out there who are fairly well known for their ability to test and determine new algorithm factors, this method of optimization simply isn't practical for 95% of the companies looking to increase their rankings.

For the rest of us, this type of optimization is known as "algo-chasing." You'll often see it on discussion forums as one person announces the results of a "test" they've run and legions of others rush off to make edits to their web sites to reflect this new information. Unfortunately, algo-chasing generally results in a lot of work with very little payoff. Stoney deGeyter wrote a great advice post on this several years ago called "Common Sense Algorithm Chasing."

Applying Common Sense

Let's go back to our simple definition of the Pinocchio Effect and see it in action. If search engines are looking to replicate human judgement, it means we can match up the changes in the algorithm with a better understanding of how humans value a web site. This is probably most clear through the progression of how engines like Google have valued links.

Back when Google first burst onto the scene with some of the best search results any engine had delivered, it was their reliance on links that made them special. Google had figured out that linking was the online equivalent of a vote of confidence. With that in mind, the algorithmic adjustments went a little something like this...

1.) Link Quantity - Originally, search engines were most concerned with the number of links pointing to a site. They viewed each link as a vote of confidence and made the natural assumption sites with more links were of higher quality. (Unfortunately, it didn't take long for site owners to figure this out and to start finding ways to build new links on their own.)

2.) Link Text - As site owners began actively seeking links, search engines realized they needed to improve this area of the algorithm to give them a better idea of just how valuable a link was. A natural progression was to read and consider the anchor text (the blue, underlined text a user clicks on to link to a new page) and to factor that text into the algorithms. It made sense that if a site had a million incoming links using the word "pizza" the site those links were pointing to was probably about pizza. (Once again, it didn't take long for site owners to figure this out and to begin seeking specific link text.)

3.) Link Quality - As site owners once again began to catch up with the algorithm, the search engines moved on to the next stage. This time around they not only looked at the number of links and the text describing those links, they looked at the quality of the site the link was coming from. By using their first two link judgements, they could easily tell if the site giving the link was popular (lots of links) and related (topical words and anchor text). It was natural to assign more weight to the links coming from respected, related sites. (Any surprises here? Site owners catch up and start seeking these types of links.)

4.) Link Age - As site owners began creating better link building campaigns, the engines needed to create better ways of judging those links. The next step for the engines was to put value on the age of a link. After all, a site that has had quality links pointing to it for years is a sure sign of an established and trusted site. At the same time, a very recent link could be a great way to tap a site as having good coverage on breaking news or a hot new topic. As such, the engines began adding the age of a link to their equations. (And once again, site owners took notice and started working on this strategy, often by buying established domains with incoming links to build new businesses on.)

5.) Link Buys - Eventually, seeking out quality links from quality sites in a world where everyone else is doing the same became fairly difficult. While still doable, many businesses turned to purchasing links as a faster way to control and build the links coming into their sites. The engines, always seeking to replicate human judgement, decided a purchased link was not worth as much as an "earned" or freely given link. As such, they've spent the last year or two working on ways to combat paid links and threatening to harm the rankings of sites that either buy or sell links.
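To make the five-step progression above concrete, here's a toy scoring sketch. To be clear, this is purely illustrative: nobody outside Google's engineering team knows the real factors or weights, so every field name, multiplier, and cap below is an invented assumption, not the actual algorithm.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Link:
    anchor_text: str        # the clickable text of the link
    source_quality: float   # 0.0-1.0 stand-in for how respected the linking site is
    first_seen: date        # when the engine first discovered the link
    paid: bool              # whether the link was purchased

def toy_link_score(links, target_keyword, today=date(2008, 12, 29)):
    """Illustrative only: combines the five factors from the article
    with made-up weights."""
    score = 0.0
    for link in links:
        value = 1.0                                # 1. quantity: each link is one vote
        if target_keyword in link.anchor_text.lower():
            value *= 2.0                           # 2. anchor text matches the topic
        value *= 0.5 + link.source_quality         # 3. respected sources count for more
        age_years = (today - link.first_seen).days / 365
        value *= min(1.0 + 0.1 * age_years, 2.0)   # 4. older links earn trust, capped
        if link.paid:
            value *= 0.1                           # 5. paid links are heavily discounted
        score += value
    return score
```

The point of the sketch isn't the numbers, which are fiction; it's that each new factor multiplies against the earlier ones, so an aged, earned link from a quality site with relevant anchor text is worth far more than a pile of bought links, which matches the judgement a human would make.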

Applying the Pinocchio Effect

What's next in this progression? Any number of possibilities exist. Overall though, the path becomes clear. Each and every adjustment made to the algorithms is designed to better judge a site the way a human being does. Ultimately, the sites that are built in a search engine friendly manner and designed to benefit users tend to come out ahead. It's essential to understand search engine friendly design techniques and to learn how to find out which keywords to target. Once you get the basics down, it's really about focusing on your customer and giving them the best experience possible.

In the next article in this series, I'll take a look at how Pinocchio's rapidly growing nose teaches us a valuable lesson about online reputation management.

December 29, 2008

Jennifer Laycock is the Editor of Search Engine Guide, the Social Media Faculty Chair for MarketMotive and offers small business social media strategy & consulting. Jennifer enjoys the challenge of finding unique and creative ways to connect with consumers without spending a fortune in marketing dollars. Though she now prefers to work with small businesses, Jennifer’s clients have included companies like Verizon, American Greetings and Highlights for Children.


Great post! This really breaks down how SEs have evolved over time. The formula for good rankings remains the same, though: offer your readers / customers / clients good content / services - that's what really matters, after all.

Great post!

That's good information to know...

There is so much information that goes into SEO...

That's why I think it's important to have a good base and understand what's good and what's not!

Fresh unique content and links to your site!

That's what I'm after!

David King,

I think it is interesting that a lot of the social sites that are now nofollow may benefit us as searchers if they were dofollow. I know it begs for spam. But not everyone owns a site where they can link to other sites. They vote on social networks. That is their platform and when their vote is not followed, their vote is not counted.

Great stuff Jennifer! Now imagine you have been link building for 14 years, and have had to hear these exact same questions thousands of times, year after year, and no matter how often you plead, preach, or scream from the mountaintop that the formula approach is not strategically viable, nobody wants to hear it. That's my life :)

When I examine why one site ranks above another site, I find a remarkable variety of factors in play. Depending on subject matter, I've seen sites rank #1 with minimal links, while sites in that same subject rank on page 20. An inbound link profile is a beautiful thing to study. A painting emerges, and it is usually obvious which painting is a Renoir and which is a paint by numbers.


Interesting post - thank you. Have you had a chance to look at the website I work for -
It's a very cool search engine aggregator bringing Google, Yahoo, MSN, plus all the major online marketplaces to one location.

Thanks for the great article, Jen.

We certainly cannot say these things enough times. No magic formula for SEO, kids, that's the truth!

Maybe 2009 will be the year people actually get it? :D

I recently came across your blog and have been reading along. I thought I would leave my first comment. I don't know what to say except that I have enjoyed reading. Nice blog. I will keep visiting this blog very often.

This in-depth analysis really makes you understand how links work.
Thanks a lot :)


Great post that helped refresh my mind on key tactics from Google's algorithm. Looking forward to the other segments of the series!

Hi Jennifer,
I like your topic here: the Big G's algorithm and links! Have you run into cases like mine? I often find domains listed on the first page of the SERPs with fewer than 10 backlinks. How do they do that? The recipe is that their domains contain a long-tail keyword, so the keyword becomes the domain name, usually with a .com or .org extension.

Run the right searches and you will find them. But you won't be able to do that in cases where you have to buy an expired domain for a particular business.


Comments closed after 30 days to combat spam.

Search Engine Guide > Jennifer Laycock > Six Lessons from a Wooden Boy: Part One: Search Engines Want to be Real Boys