In a previous article I wrote that a search engine friendly website is not the same as a search engine optimized website. I'm using that as a jumping-off point for this article on building a search engine friendly website. The purpose is simply to highlight a few of the aspects that make up a search engine friendly (not necessarily optimized) website.

The Domain Name

For this series I figured we'd start at the top: the domain name. Many argue about the value of using keywords in the domain name and whether that will make any difference at all in the ranking algorithms. My opinion is if it does make a difference, it's not much. But I also believe that the full process of optimization is largely about doing a whole lot of "not much".

The smartest thing to do is make sure your business name uses your keywords. And no, I don't mean you should name your business No Doc Home Mortgage Loans Company. But if you sell mortgages it makes sense to use mortgages in your name, being sure you can secure the domain name as well.

There is really not a lot you can do to "optimize" your domain or even make it search engine friendly, so I'll leave you with links to a series of articles I've written previously on Securing a Marketing Rich Domain Name. That'll give you some food for thought.

Title and Meta Tags

To reiterate: I'm making a distinction here between "search engine optimized" and "search engine friendly." They are two very different things. Making your website search engine friendly is largely a one-time task, while optimization of that website is an ongoing process. But in order to be effective with the optimization, your site must first be search engine friendly.

In this installment I'll focus on the Title tag and the meta tags, most specifically the meta description. This can also include the meta keywords tag, though that one is largely irrelevant.

I can pretty much sum up the search friendliness of the Title tag and meta description as two things:

  1. They are present
  2. They are unique

You might be surprised to find how often number one is not done, even by experienced web developers. About a year ago we signed a client for SEO only to find that their programmers had not created a way to generate a unique title for each page. But it gets worse. They hadn't even programmed a way to add any text whatsoever into the title tag. Their code looked like this:

<title></title>

One of the first changes we requested was to add a title to their pages, to which they told us that such functionality was not available and it would take a few months before the programmers could add it. Can we say, "You're fired"?

They chose not to fire their programmers so we fired them!

Many e-commerce systems we see use a global title tag across all pages. Well, step one is complete: the title is present. But each of those titles also needs to be unique, accurately representing the content of the individual page it's on.

When working with a database system, the smartest (read: most search engine friendly) thing to do is not to just make the title and meta tags editable for each page, but to allow for unique default verbiage to be automatically generated for the pages until keyword optimized text is created.

When looking at a potential client's website the other day, we were concerned to see that the title tags of each page looked like typical default text. We contacted the developers to find out if these were editable and, sure enough, they gave us the answer we were looking for: they are editable for each page, but default text is in place until those fields are edited by the client. Perfect!

If you don't use an e-commerce system for your website then you simply want to add unique title tags, description tags and possibly keyword tags to each page. Don't worry so much about using keywords; that'll be your SEO's job. For now, just make these elements search engine friendly by getting them in place.
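As a minimal sketch, a search engine friendly head section might look like this. The business name, wording and phrases here are invented purely for illustration; your SEO will refine the actual keywords later:

```html
<head>
  <!-- Unique per page: describes this page, not the whole site -->
  <title>No-Doc Home Mortgage Loans | Example Mortgage Co.</title>
  <!-- Unique per page: a one- or two-sentence summary of this page's content -->
  <meta name="description" content="Compare no-doc home mortgage loan options and rates from Example Mortgage Co.">
  <!-- Optional: the keywords meta is largely ignored by the engines -->
  <meta name="keywords" content="no-doc mortgage, home loans">
</head>
```

The point at this stage is simply that the tags exist and differ from page to page.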

Code Bloat

Code bloat is one of my minor amusements when I'm evaluating websites. I enjoy looking at the source code of a web page and then scrolling down to see how long the code for the page is and mentally compare it to how long it should be. What I really enjoy most is when I see a really well designed page with very little code. That makes me happy, but then I'm pretty easily amused anyway!

I'm not a coder myself, so if you ask me to develop a web page I'm going to use a program such as Dreamweaver to build it. But just because I can't develop code on my own doesn't mean I don't know how to strip the unnecessary junk from HTML in order to produce a cleaner coded page.

The problem with programs such as Dreamweaver is that they don't always create the best or most streamlined coding structure. Unfortunately, many professional designers don't even know enough about code to go in and fix what these development programs create. To be fair to Dreamweaver, it isn't even close to the biggest offender of code bloat. That title, from my experience, goes to any product from Microsoft. Especially those programs with a "turn this into a web page" feature. (If you ever want to see some of the worst code imaginable, create a "web page" using Microsoft Word.)

What does code bloat have to do with search engine friendliness?

Reducing code bloat not only cuts down on page download time, it also makes things easier on the search engines. When spidering a page, the search engines pull the entire code (or the first 100kb) of the page. Only later is that information parsed. By reducing download time, the spiders, which are already fast, can burn through many more pages more quickly, quite possibly indexing more pages than they would otherwise.

Once the pages have been spidered, the reduced code then makes it easier for the engine to parse the data. While engines have gotten much better about getting through junk code, reducing the amount of code they have to sort through will only streamline their processes and potentially give you an additional, albeit insignificant, advantage. Of course, we can argue about this all day, but the powers behind the search engines have stated numerous times over the years that anything site owners can do to make the spider's job easier, the better. Take that however you want.

There are a number of things you can do to reduce the code bloat of your website.

CSS

CSS has many benefits for developers. For this article the most relevant one is that CSS greatly reduces the amount of code on your page. This is especially true if you use external CSS files, but we'll get to that in a bit. First and foremost, however, CSS can be used to eliminate duplicate on-page styles from the code. I'll provide links to some how-to CSS references, but for now just know that the amount of code that can be replaced by using CSS instead of <font> tags is pretty significant.
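A quick before-and-after sketch of that replacement (the sample text and style values are made up for illustration):

```html
<!-- Bloated: the same font tag repeated around every block of text -->
<p><font face="Arial" size="2" color="#333333">Some paragraph text...</font></p>
<p><font face="Arial" size="2" color="#333333">More paragraph text...</font></p>

<!-- Leaner: one CSS rule covers every paragraph on the page -->
<style>
  p { font-family: Arial, sans-serif; font-size: 0.85em; color: #333; }
</style>
<p>Some paragraph text...</p>
<p>More paragraph text...</p>
```

Multiply that savings by every styled element on every page and the reduction adds up fast.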

One of the other great things about CSS is that none of it actually has to be in the page code itself; it can be called from an external CSS file. The web browser simply has to download the file once, and that CSS document will then apply to every page on the website (assuming only one CSS document is used).
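Pulling the styles out to an external file takes a single line in the head of each page (the file path here is a hypothetical example):

```html
<head>
  <!-- One external stylesheet, downloaded once and cached for the whole site -->
  <link rel="stylesheet" type="text/css" href="/styles/site.css">
</head>
```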

JavaScript

JavaScript itself is not a code saver, but, as with your CSS file, you can move the JavaScript into an external file. Again, just like CSS, that single JavaScript file is then used on every page that requires it, without any additional download. Moving the JavaScript off the page reduces code length and page download time significantly.
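The mechanics are the same as with the external stylesheet; a single line references the script file (again, the path is a hypothetical example):

```html
<!-- Instead of dozens of inline script lines repeated on every page, -->
<!-- one cached external file referenced with a single line: -->
<script type="text/javascript" src="/scripts/site.js"></script>
```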

There are a lot of other ways to reduce code bloat and decrease page download time. Some of these include compressing images, using text instead of images, removing nested tables, etc. I don't need to go into all of these, just so long as you understand that bloated code simply isn't necessary and the benefits of reducing and eliminating garbage code are worthwhile.

Headings

Some SEOs will argue whether using keywords in Hx tags actually helps your search engine rankings. I'm going to bypass that argument because there is an altogether different reason for using proper heading tags. Simply put, it helps the search engines understand the relative importance of different textual areas of the page.

Many sites are content with just using bold or a slightly larger font for their paragraph headings. But that tells the search engine very little other than that a particular line is of slightly more relevance. After all, any text can be bolded, colored differently, or made bigger in order to create various inflections of tone, improve page scannability, or call out certain helpful points. But ultimately, these things carry little weight in the overall scheme of things.

Remember back to high school or college when you had to create an outline of a paper before you even began to write it? Your paper outlines would look something like this:

Title

introduction

I. Point #1
... A. Sub Point #1
... B. Sub Point #2
...... 1. Sub-Sub point #1
...... 2. Sub-Sub point #2
... C. Sub Point #3

II. Point #2
... A. Sub Point #1
... B. Sub Point #2
...... 1. Sub-Sub point #1
...... 2. Sub-Sub point #2
... C. Sub Point #3

III. Point #3
... A. Sub Point #1
... B. Sub Point #2
...... 1. Sub-Sub point #1
...... 2. Sub-Sub point #2
... C. Sub Point #3

Conclusion

Now a web page is different from a paper, but not by a whole lot. Using Hx tags properly on each page can tell the engine quite a bit about the page before it even breaks it down. That's where your headings come in.

When you do no more than bold your headings, you tell the engine something about that line of text only. When you use Hx tags, you tell the engines about the overall topic of the text directly beneath them. Using keywords in your headings is an added bonus: the search engines should then confirm that the headline accurately represents the following text, and together both should be given a bit of extra weight.
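A minimal sketch of that difference (the heading and body text are invented for illustration):

```html
<!-- Tells the engine only that this one line is slightly emphasized -->
<b>Fixed Rate Mortgages</b>

<!-- Tells the engine this line describes the topic of the text beneath it -->
<h2>Fixed Rate Mortgages</h2>
<p>A fixed rate mortgage locks in your interest rate for the life of the loan...</p>
```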

Here is how I outline a lot of the sites we work on:

Page Title

<h1>Main Headline</h1>
<h2>Sub-headline</h2>

introduction

<h3>First main point </h3>
... <h4>First Sub point </h4>
... <h4>Second Sub point </h4>
......... <h5> First sub-sub point </h5>
......... <h5> Second sub-sub point </h5>
... <h4>Third Sub point </h4>

<h3>Second main point </h3>
... <h4>First Sub point </h4>
... <h4>Second Sub point </h4>
......... <h5> First sub-sub point </h5>
......... <h5> Second sub-sub point </h5>
... <h4>Third Sub point </h4>

<h3>Third main point </h3>
... <h4>First Sub point </h4>
... <h4>Second Sub point </h4>
......... <h5> First sub-sub point </h5>
......... <h5> Second sub-sub point </h5>
... <h4>Third Sub point </h4>

conclusion

Or alternately:

Document Title

<h1>Main Headline</h1>

introduction

<h2>First main point </h2>
... <h3>First Sub point </h3>
... <h3>Second Sub point </h3>
......... <h4> First sub-sub point </h4>
......... <h4> Second sub-sub point </h4>
... <h3>Third Sub point </h3>

<h2>Second main point </h2>
... <h3>First Sub point </h3>
... <h3>Second Sub point </h3>
......... <h4> First sub-sub point </h4>
......... <h4> Second sub-sub point </h4>
... <h3>Third Sub point </h3>

<h2>Third main point </h2>
... <h3>First Sub point </h3>
... <h3>Second Sub point </h3>
......... <h4> First sub-sub point </h4>
......... <h4> Second sub-sub point </h4>
... <h3>Third Sub point </h3>

conclusion

Should your document have that many headings and sub-headings? Probably not, but that's all determined by length. If it's short, then an <h1> will likely do the trick. As the document gets longer, you need to break it up accordingly. I have to admit that I routinely fail to do this on this blog, but definitely not on pages we optimize for our clients.

In fact, this should be done before a page even gets optimized. When developing the site, the outline format should already be prepped for page content. This is easily done using CSS: simply create your style for each heading tag and they are ready to be employed on the site as the content developers and/or SEOs get to them.
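Prepping those defaults is one small stylesheet block; the sizes and colors below are placeholder values to adjust to your design:

```html
<style>
  /* Sitewide defaults: style each heading level once, ready for content */
  h1 { font-size: 1.6em; color: #223344; margin: 0 0 0.3em; }
  h2 { font-size: 1.3em; color: #334455; }
  h3 { font-size: 1.1em; }
  h4 { font-size: 1em; font-style: italic; }
</style>
```

With these in place, writers and SEOs can mark up the page outline without touching the design.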

But going back to the engines: by using this structure, the search engines can read the text hierarchy. SEOs talk about this when it comes to site architecture, but often fail when it comes to page architecture. By providing this outline structure, we let the search engines know the layers of importance on each page, which helps make the page much more friendly to the search engines and to the SEOs who have a job to do.

Navigation & Links

We had recently been working with a client to get them to remove their block of footer links. All the links were relevant, driving people to their brand pages, but the block was simply ugly. We recommended a few options, one of which was a "shop by brand" drop-down menu. They were able to implement a drop-down that linked to all of their brand pages, but as they implemented it, none of the links could be followed by search engine spiders.

Sorry. Try again.

Search engine friendly navigation and linking structures are the cornerstone of a search engine friendly website. This is the one area where, if everything else is done right but you create links the search engines cannot follow, you simply won't get anywhere... or very much beyond the home page, at least.

Web developers often design navigation links using Flash or JavaScript. This can create a little extra flair for the website, which can be nice, but unfortunately it is problematic for search engines.

The bottom line: if search engines cannot follow your links, they won't be able to crawl through your site in order to rank your pages. While this won't completely prevent your pages from ranking well, you'll be severely handicapped and completely at the mercy of external links pointing to individual pages. Definitely not a place you want to be.
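A quick sketch of the difference (the `goTo` function and brand URLs are hypothetical examples):

```html
<!-- Spiders generally cannot follow this: the destination only exists inside script -->
<a href="javascript:void(0)" onclick="goTo('brands/acme')">Acme</a>

<!-- Spiderable: a plain href the engine can crawl; script can still enhance the behavior -->
<a href="/brands/acme/">Acme</a>
```

The second form degrades gracefully: visitors with script enabled can get whatever extra behavior you attach, while spiders simply follow the href.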

There are a few ways to check whether your links are spiderable. One of the easiest, if you use Firefox, is the YellowPipe Lynx Viewer Tool extension.

If you like the extra bells and whistles in your navigation, you will either have to find ways to get what you want while creating search friendly links, which might mean expensive workarounds, or be willing to compromise a bit. Cool nav features are cool, but nothing stunts business growth like limiting search engine spiderability on your site.

On Page Content

About a year or two ago a potential client contacted me to inquire about our services. They had heard a lot about us and were really interested in using us for their optimization. But we hit a snag before we could get a contract signed. In fact, I basically told them that we were not the ones to help them.

What was the snag? They were unwilling to make any changes to the visible look of their website. They said they had poured thousands of dollars into the design and they liked it exactly the way it was.

That's fair and understandable. When you pour so much money into developing the perfect website visually, it's hard to come to terms with the fact that you may have to make changes to it in order to make it search engine friendly. That's the danger of designing without your SEO in place providing feedback along the way.

The site in question contained about fifty words of text on the home page all embedded within an image. While most people would be OK with converting that to standard text, in this case the text "layout" was perfect. Perfect font, perfect line spacing, perfect character spacing, etc. It would have been impossible to duplicate it perfectly outside of an image. Unfortunately, that was a deal breaker for them.

And for us.

Making sure your text is indexable by the search engines is paramount. Search engines cannot read text embedded in images. Sure, you can load your alt attribute with a paragraph of text, but that just won't give you the same effect. If the search engines cannot "read" your page, they have no way of knowing what's on it. That makes it (near) impossible to rank for any given keyword phrase. In most cases the Title tag, meta description and alt attributes won't be enough to compete for first page placement.
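The contrast looks like this (the file name, class name and text are invented for illustration):

```html
<!-- Invisible to the engines apart from the alt text -->
<img src="/images/welcome-blurb.gif" alt="Welcome to Example Mortgage Co.">

<!-- Indexable: real text, with CSS used to approximate the designed look -->
<style>
  .welcome { font-family: Georgia, serif; font-size: 1.2em; letter-spacing: 0.05em; }
</style>
<p class="welcome">Welcome to Example Mortgage Co. We help homeowners find the right loan...</p>
```

CSS rarely reproduces an image's typography pixel for pixel, but it usually gets close enough, and the text becomes something the engines can actually rank.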

A great-looking design is paramount for usability, but don't be so married to a "perfect" design that you have to throw out search engine marketing as a viable option. Work with both your designer and your SEO to create a search engine friendly website that is appealing to both search engines and visitors.



July 2, 2007





Stoney deGeyter is the President of Pole Position Marketing, a leading search engine optimization and marketing firm helping businesses grow since 1998. Stoney is a frequent speaker at website marketing conferences and has published hundreds of helpful SEO, SEM and small business articles.

If you'd like Stoney deGeyter to speak at your conference, seminar, workshop or provide in-house training to your team, contact him via his site or by phone at 866-685-3374.

Stoney pioneered the concept of Destination Search Engine Marketing which is the driving philosophy of how Pole Position Marketing helps clients expand their online presence and grow their businesses. Stoney is Associate Editor at Search Engine Guide and has written several SEO and SEM e-books including E-Marketing Performance; The Best Damn Web Marketing Checklist, Period!; Keyword Research and Selection, Destination Search Engine Marketing, and more.

Stoney has five wonderful children and spends his free time reviewing restaurants and other things to do in Canton, Ohio.






