There is a big difference between optimizing your site for keywords and making it search engine friendly. While one of the first steps in the optimization process should be ensuring the site is search engine friendly, we often want to jump right into keyword optimization, because that's how we "see results" in the search engines. The site is left incompatible with the engines while we wonder why the results we wanted never arrive.

Even though an SEO may do all they can to make a site search engine friendly early in the optimization process, some issues will only be uncovered over time as the engines begin spidering and indexing the website. By keeping an eye on performance we can often find indicators that something may be wrong. With that hypothesis, the SEO must delve into research mode to uncover what, if anything, is creating problems for the site.

There are a number of individual issues that you can keep an eye on. Some can easily be fixed at the beginning of the SEO process, others are fixed as pages and keywords get optimized, and still others can only be uncovered as time passes. Each, however, is important to ensuring your site remains as search engine friendly as possible.

Here are four issues that can prevent your site from being search engine friendly:

Duplicate Titles and Descriptions

As pages are optimized for specific keywords, going in and changing title tags and description meta tags is all part of the process. Each page that gets optimized becomes unique as unique keywords are targeted for each page. But pages that are left unoptimized, or still waiting to be optimized, can often be found to use the same title and description tags repeatedly.

This kind of title and description tag duplication makes your pages, and your site overall, far less valuable and less likely to be fully indexed. Each indexed page may be given less weight and value because the duplication across the site drags down the value of those pages. If you have a large site, correcting duplicate titles and descriptions can be a challenge, but it is even more important than on a small site because the duplication is often magnified significantly.
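If you'd rather not wait for the engines to flag the problem, a short script can group your pages by title and description and surface the repeats. Below is a minimal sketch in Python using only the standard library; the `pages` dict of URL-to-HTML and the `find_duplicates` helper are illustrative assumptions, not part of any particular SEO toolset:

```python
from html.parser import HTMLParser
from collections import defaultdict

class TitleMetaParser(HTMLParser):
    """Extracts the <title> text and the description meta tag from one page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.description = attrs.get("content") or ""

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def find_duplicates(pages):
    """pages: hypothetical dict of URL -> raw HTML.
    Returns {(kind, text): [urls]} for any title or description used twice."""
    seen = defaultdict(list)
    for url, html in pages.items():
        parser = TitleMetaParser()
        parser.feed(html)
        seen[("title", parser.title.strip())].append(url)
        seen[("description", parser.description.strip())].append(url)
    return {key: urls for key, urls in seen.items() if len(urls) > 1}
```

Point it at a dump of your site's HTML, and any title or description that maps to more than one URL is a duplication candidate worth rewriting.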

Duplicate Content and Pages

There are a few types of duplicate content: 1) content on your site that can also be found on other sites, 2) content on your site that is duplicated on several pages, such as product descriptions or different pages that say mostly the same thing, and 3) single "pages" that can be accessed via multiple URLs.

The second issue above is the least significant, as most sites, especially those with products, have some form of duplication: similar product descriptions may be displayed in different product categories. An easy solution is to make sure each category page contains at least a paragraph of unique text at the top. This gives the search engines something on each page that can't be found elsewhere on the site.

The first issue above can be a problem, especially if that content is stolen from you. You want to do as much as you can to prevent your content from being swiped. But this can also be an issue if you write valuable content that you then republish on other websites. If you do this too much your content will become devalued and, worse, the duplicated content may outrank your own.

The third duplication issue is probably the most common, especially for e-commerce sites. While there is no one way to fix these kinds of problems, the basic solution is that any "page" should be accessible via only a single URL. Whether someone navigates by brand name or by product type, once they reach the product, the URL should be exactly the same.
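On the implementation side, the usual fix is a server-side 301 redirect from every alternate URL to the one canonical URL. As a hedged sketch of the mapping logic only, here is a small Python function; the `canonical_paths` lookup (product id to canonical path) is a hypothetical stand-in for whatever your product database provides, and the actual redirect would be issued by the web server:

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url, canonical_paths):
    """Map any alternate path for a product to its single canonical URL.

    canonical_paths is a hypothetical lookup of product id -> canonical path.
    Returns the URL the server should 301-redirect to, or the URL unchanged
    if it is already canonical. Query strings and fragments are dropped here
    purely for illustration.
    """
    parts = urlsplit(url)
    # Treat the last path segment as the product id (an assumption for this sketch).
    product_id = parts.path.rstrip("/").rsplit("/", 1)[-1]
    target = canonical_paths.get(product_id, parts.path)
    return urlunsplit((parts.scheme, parts.netloc, target, "", ""))
```

Whenever the requested path differs from the canonical one, the server responds with a 301 to the canonical URL, so every navigation route ends at the same address.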

Internal Linking

Internal linking is accomplished in two distinct ways: via the site's primary and subsidiary navigation, and by adding links within content from page to relevant page.

The most important for search engines is your site's navigation. The navigation must be inclusive but not overbearing. Link to your main sections and even sub-sections, but only to a point. Once your navigation becomes overly burdened with links, you're likely no longer giving the search engines the information they need to determine which pages are the most important.

Linking within content can be accomplished easily enough by linking relevant portions of text to additional relevant pages (as I've done above). This is great for usability: visitors can go straight to pages they may be interested in without having to dig for them in the navigation. If they had to dig, most likely they would never even realize those pages exist. Similar content linking can be accomplished on product pages by adding features that show the visitor related products or items they may also be interested in.
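A quick way to see whether a page's navigation is getting overburdened is to count its internal links. This is a minimal sketch using Python's standard-library HTML parser; the `site_host` argument and the internal/external split are illustrative assumptions for the audit:

```python
from html.parser import HTMLParser
from urllib.parse import urlsplit

class LinkCounter(HTMLParser):
    """Counts internal vs. external links on a single page of HTML."""
    def __init__(self, site_host):
        super().__init__()
        self.site_host = site_host  # e.g. "example.com" (assumed known)
        self.internal = []
        self.external = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href") or ""
        if not href or href.startswith("#"):
            return  # skip empty anchors and same-page fragments
        host = urlsplit(href).netloc
        # Relative links and links to our own host count as internal.
        if host in ("", self.site_host):
            self.internal.append(href)
        else:
            self.external.append(href)
```

If the internal count runs into the hundreds on every page, the navigation is probably diluting the signal about which pages matter most.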

Bloated Code & Proper HTML Markup

Designers, overall, are getting better and better at reducing the amount of code bloat on the sites they develop. There are a lot of web development programs out there that create garbage code, weighing down the page and slowing download times, but developers are getting wise to this and creating great designs with highly streamlined code. This is good for everyone. The less code bloat on an individual page, the faster each page downloads and the more frequently the search engines are likely to return.

Where many developers may still be lacking is in using proper markup to create the HTML. While validating the HTML has little direct relevance to SEO, invalid HTML can often be problematic if the search spiders cannot decipher it properly. Such errors can cause the spiders to misread content, attribute elements incorrectly, or even prevent them from navigating the site altogether.

What is important, however, is that proper markup be used for text and headings. Using Hx tags improperly can assign the wrong weight to areas of the page, giving the engines a false understanding of it. This can affect rankings for your keywords by giving exaggerated importance to unimportant content.
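Heading usage can also be audited automatically. Here's a hedged Python sketch that records the Hx levels in document order and flags a missing or duplicated H1 and skipped levels; the specific rules checked (exactly one H1, no skipped levels) are common markup conventions, not stated engine requirements:

```python
from html.parser import HTMLParser

class HeadingAudit(HTMLParser):
    """Records h1-h6 heading levels in document order."""
    def __init__(self):
        super().__init__()
        self.headings = []

    def handle_starttag(self, tag, attrs):
        if len(tag) == 2 and tag[0] == "h" and tag[1].isdigit():
            self.headings.append(int(tag[1]))

def heading_problems(levels):
    """Return a list of human-readable problems with the heading hierarchy."""
    problems = []
    if levels.count(1) != 1:
        problems.append("expected exactly one h1")
    for prev, cur in zip(levels, levels[1:]):
        if cur > prev + 1:  # e.g. h2 jumping straight to h4
            problems.append(f"h{prev} followed by h{cur} skips a level")
    return problems
```

Run it over a page's HTML and an empty problem list means the headings at least form a sane outline; anything flagged is worth a look before the engines misread the page's structure.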

By keeping an eye on the issues above you can hope to stay ahead of the search engines, or at the very least, not too far behind, allowing important issues to be addressed as quickly as they are uncovered. There is, however, a flip side to all of this. While sites can be keyword optimized without being search engine friendly, the opposite holds true as well. Just because a site is search engine friendly does not necessarily mean it is search engine optimized. There are two sides to every coin and it's important to understand both sides.


January 8, 2009





Stoney deGeyter is the President of Pole Position Marketing, a leading search engine optimization and marketing firm helping businesses grow since 1998. Stoney is a frequent speaker at website marketing conferences and has published hundreds of helpful SEO, SEM and small business articles.

If you'd like Stoney deGeyter to speak at your conference, seminar, workshop or provide in-house training to your team, contact him via his site or by phone at 866-685-3374.

Stoney pioneered the concept of Destination Search Engine Marketing which is the driving philosophy of how Pole Position Marketing helps clients expand their online presence and grow their businesses. Stoney is Associate Editor at Search Engine Guide and has written several SEO and SEM e-books including E-Marketing Performance; The Best Damn Web Marketing Checklist, Period!; Keyword Research and Selection, Destination Search Engine Marketing, and more.

Stoney has five wonderful children and spends his free time reviewing restaurants and other things to do in Canton, Ohio.





Comments (28)

This is good information. I think the days of benefiting from keyword stuffing for SEO are gone. Content is king imho for today's search engine friendly sites. Well placed keywords in title and meta tags, combined with original content, is a great combination.

Refinance Rates

In other words: subscribe to Google Webmaster Tools and check that data!

Wow! This article is really great. Those points are great. I will use it as a guide to check all my sites. Thanks Stoney.

Simon
Malaysia Web Hosting

Even if your content was stolen (and it's duplicate content), if the page your content is on has more links pointing to it or the domain is stronger, that page will rank higher despite being duplicate content... I think so from my experience.

thanks for great article Stoney
vigrx

@ yuri - you're right. the problem is, if the other site gains more links then you can be out of luck.

Your article validated my hard work with SEO on each page of my site.
thehuntingdog

I've been really negligent with duplicated heading and description tags for a long time; then someone with pretty good SEO experience pointed out the mistake in my code just a couple of days ago. Your article validates what he told me, so thanks for posting it. I'm glad I read it.

It seems like links are more important than contents nowadays. Though it still debatable, I did see a 1-page site ranked No.1, with the competition 500, 000+. Google always try to fool we webmaster so that all of us will go for adword? :)

Vigrx Plus

Aapus Ismic, it is pretty amazing that a one-page site can be ranked No. 1 in Google. And you are saying that it was the number of sites linking in that enabled the No. 1 position?

For SEO, relevant content is king and will always be king.

It's all a game of keywords and link building... millions of websites are running in this web world, but if we want to survive in it, we have to do something different.

"It seems like links are more important than contents nowadays. Though it still debatable, I did see a 1-page site ranked No.1, with the competition 500, 000+. Google always try to fool we webmaster so that all of us will go for adword? :)"

Can you show us the site's URL? Maybe it's some sort of blackhat? Like one of the poker sites I've seen before, though that was not a 1-page site...

Duplicate content is a myth.

Penalties are applied to duplicate content within the same site. Duplicate content shared between different sites is fine.


Website Marketing Strategy

I loaded up one of my blogs with old content, lots of pages, and only a few pages of new. I spent very little time setting it up, just an afternoon, and that site brings in quite a lot of traffic now.

Website Marketing Strategy

Interesting. I picked up a few new ideas. Thanks

TL

Informative article. I guess original content holds the key, and of course on-page optimization, plus keep building quality links to get higher rankings, more traffic and more money.

This is a really terrific useful article. Most bloggers / web writers do realize that they need to be doing something to get the attention of search engines but it's really unclear to most of us what the tricks are that work and which ones don't. This article helps to clear up some of that confusion.

One of the biggest parts of SEO is website usability. Making a website something that a user actually wants to be on is vital. Google is getting better at detecting human behavior and how humans interact with websites. It does not matter how well your site ranks if when a user goes there they hate it and leave.

Thank you for the info, I found it helpful. I have just started a new business and need all the help I can get on promoting the site.

I learned the hard way about duplicate titles and descriptions. I took the easy way out at first, by just replicating my meta descriptions and then I learned that is a big NO NO and spent the time and rewrote them all. Another tip is to remember to write those descriptions as a mini-ad.

Van
Van from Voice T1 in St. Louis

yeah, indeed very good advice, I'm going to send this link to a bunch of my clients :)

Nice article. Helpful information about SEO.

Great Comments

So often we get a very "pretty" web site without consideration for the finer points required to get good results, not just from search engines but from humans too.

Let's get the public interested in SEO early in the development stage; then we will see high-performing sites straight out of the box.

Hi Stoney..

Duplicate content and duplicate meta titles/descriptions are a big problem with large e-commerce sites and B2B portals. For example, a page may be dynamically assigned several URLs while the content stays the same. A company listed in a B2B site will have URLs like
http://site/state/company101
http://site/carhire/company101

The two URLs (or more) will point to the same company, which is shown under different categories. Is there any solution for this? I often see this under supplemental results.

And are you sleeping on a snooker table? :-)

@ Guna - yes there is a solution, but it has to be done on the programming end. How the programmers do this is up to them so long as each product has only one URL.

Yeah, I agree with some other posters here who mention Google messing with our rankings so we just go with AdWords! It certainly seems that way.

vigrx plus

I try to keep the important content and H1 right at the top of my code. I think this helps.

I think the new Google Chrome will make it harder for SEO-built sites that have done a ton of spam linking. Google is making it harder to optimize, but content should go a long way, especially if it is unique and well organized.

Comments closed after 30 days to combat spam.


Search Engine Guide > Stoney deGeyter > Your Site is Keyword Optimized, But is it Search Engine Friendly? You Might Be Surprised!