This is a continuation of a series of website marketing checklists. Check out all Web Marketing Checklists in this series.

What this is about: This list covers the architectural aspects of a website that contribute to building a more search engine friendly site overall.

Why this is important: Website architecture can make or break the performance of a website in the search engines. Poor architectural implementation can create numerous stumbling blocks, if not outright roadblocks, for the search engines as they attempt to crawl your website. On the other hand, a well-implemented foundation assists both visitors and search engines as they navigate your website, thereby increasing your site's overall performance.


What to look for:

  • Correct robots.txt file: Make sure the robots.txt file is free from errors that could otherwise block search engines from indexing important pages (see the robots.txt sketch after this list).
  • Declare doctype in HTML: Implement a proper doctype declaration across all site pages and code accordingly (a sample document head follows this list).
  • Validate HTML: You don't have to have 100% compliant code, but eliminate as many errors as possible throughout the site.
  • Don't use frames: Find alternate ways of displaying framed content.
  • Alt tag usage on images: Every image should include descriptive alternate text.
  • Custom 404 error page: Make sure broken links lead to a custom 404 page to keep visitors on the site.
  • Printer friendly: Print a few pages to ensure that the result is readable. Create alternate CSS if necessary.
  • Underlined links: Hyperlinks in body copy should always be underlined.
  • Differing link text color: Linked text in body copy should be a different color than standard body text (a CSS sketch for both link items follows this list).
  • Breadcrumb usage: Be sure breadcrumbs are used and are effective at letting visitors know where they are in the site.
  • Nofollow cart links: Any links pointing to the shopping cart or adding products should not be followable by the search engines. Add the nofollow attribute if necessary (see the markup sketches after this list).
  • Robots.txt non-user pages: Any pages that are not intended to be listed in search results should be disallowed in robots.txt.
  • Nofollow non-important links: Don't send link juice to pages that you don't want to appear in the search results.
  • Review noindex usage: Review where the robots meta tag is needed for pages that should not be indexed.
  • Validate CSS: Use valid CSS to ensure proper rendering.
  • Check broken links: Perform a broken link check and fix all broken links.
  • No graphics for ON/YES, etc.: When using yes/no, on/off comparisons, don't rely solely on images to make the point.
  • Page size less than 50K: Keep pages small for fast loading.
  • Flat directory structure: Keep the page/URL directory structure as flat as possible while still logically organized.
  • Proper site hierarchy: Ensure navigation and directory structure adhere to a sensible hierarchy structure.
  • Unique titles on all pages: Each page should have its own distinct title in the <title> tag.
  • Title reflects page info and heading: Title tag should reflect page content and uppermost page heading.
  • Unique descriptions on pages: Each page should have its own distinct meta description.
  • No long-tail page descriptions: Pages capturing long-tail keywords may not need a description at all.
  • Proper bulleted list formats: Be sure bulleted lists use proper markup (i.e., <ol>/<ul> and <li>).
  • Branded titles: Use branded title tags when it makes sense to do so.
  • No code bloat: Check for excessive code bloat and make pages as lean as possible.
  • Minimal use of tables: Keep table usage to a minimum. Remove whenever possible.
  • Nav uses absolute links: All global navigation should use absolute links at all times.
  • Good anchor text: Use keyword rich text in hyperlinks, both in navigation and in body copy.
  • Text can be resized: Make sure content can be resized by the visitor as necessary.
  • Key concepts are emphasized: Make sure each page places appropriate emphasis on its key information.
  • CSS-less browsing: View pages with CSS turned off and make sure the site can still be properly browsed.
  • Image-less browsing: Turn off images and browse site, making sure it can be properly navigated and understood.
  • Summarize all tables: When using tables, be sure to summarize their contents.
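
For the robots.txt items above, a minimal sketch might look like the following (the cart, checkout, and account paths are placeholders; substitute whatever non-user-facing URLs your own site uses):

    User-agent: *
    Disallow: /cart/
    Disallow: /checkout/
    Disallow: /account/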
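
For the doctype, title, meta description, noindex, and print stylesheet items, a sample document head might look like this. The titles, descriptions, and file paths are illustrative only; the HTML5 doctype is shown, but any properly declared doctype works as long as the page is coded to it; and the noindex tag belongs only on pages you want kept out of the search results:

    <!DOCTYPE html>
    <html>
    <head>
      <title>Blue Widgets | Acme Widget Co.</title>
      <meta name="description" content="Browse Acme's selection of blue widgets, from economy to industrial grade.">
      <!-- only on pages that should stay out of the search results -->
      <meta name="robots" content="noindex">
      <link rel="stylesheet" href="/css/screen.css" media="screen">
      <link rel="stylesheet" href="/css/print.css" media="print">
    </head>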
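
Likewise, the alt text, nofollow cart link, and breadcrumb/list items translate into markup along these lines (URLs, file names, and link text are placeholders):

    <!-- descriptive alternate text on every image -->
    <img src="/images/blue-widget.jpg" alt="Blue widget, front view">

    <!-- keep link value away from add-to-cart links -->
    <a href="/cart/add?sku=123" rel="nofollow">Add to cart</a>

    <!-- breadcrumb trail marked up as a proper list -->
    <ul class="breadcrumbs">
      <li><a href="/">Home</a></li>
      <li><a href="/widgets/">Widgets</a></li>
      <li>Blue Widget</li>
    </ul>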
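
And for the underlined, differently-colored link items, a minimal CSS sketch (the colors are arbitrary examples):

    /* body copy in one color, linked text underlined and in another */
    body { color: #333; }
    a    { color: #0645ad; text-decoration: underline; }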







Stoney deGeyter is the President of Pole Position Marketing, a leading search engine optimization and marketing firm helping businesses grow since 1998. Stoney is a frequent speaker at website marketing conferences and has published hundreds of helpful SEO, SEM and small business articles.

If you'd like Stoney deGeyter to speak at your conference, seminar, workshop or provide in-house training to your team, contact him via his site or by phone at 866-685-3374.

Stoney pioneered the concept of Destination Search Engine Marketing, which is the driving philosophy of how Pole Position Marketing helps clients expand their online presence and grow their businesses. Stoney is Associate Editor at Search Engine Guide and has written several SEO and SEM e-books, including E-Marketing Performance; The Best Damn Web Marketing Checklist, Period!; Keyword Research and Selection; Destination Search Engine Marketing; and more.

Stoney has five wonderful children and spends his free time reviewing restaurants and other things to do in Canton, Ohio.





Comments (7)

This question is regarding link text color. Is there anything that states the standard blue link text is better than other colors?

@SEMaven It's becoming less and less important that link color be blue. Blue links are a carryover from the days when, well, all links were blue. As site designs improved, links gradually started changing color to match the design. It's becoming rare now for links to be blue.

On the other hand, search engines and other popular sites such as Wikipedia still use blue text for links, and there are still "old school" web users out there. For absolute usability, keep your links blue. But this is one convention that is less important than it used to be.

Underlining your links, however, should be non-debatable. :)

Just to add that a neater solution is to underline links using the border-bottom property instead of text-decoration: underline.

This makes the underline sit neatly beneath the text without cutting through descenders, the parts of letters that extend below the baseline.
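
For example, a rule along these lines (the color is illustrative) swaps the default underline for a bottom border that is drawn below the descenders:

    a {
      text-decoration: none;               /* remove the default underline */
      border-bottom: 1px solid #0645ad;    /* draw the line below the descenders instead */
    }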

Sasa, thanks for that. That's something I had never thought of.

Great post! Thank you.

Great post and series!

I have a question about: "All global navigation should use absolute links at all times."

I've heard conflicting opinions about this. I think it is logical that links should be absolute as well, but I have been hesitant to make the designers go through the trouble of changing them without more solid evidence. Can you elaborate or point me to other resources that recommend absolute links?

Also, why add nofollow to the shopping cart? What if the shopping cart has static, optimized product pages that you want crawled?

Thanks again for the great info in this series and I look forward to your feedback :)

Pam

@ Pam - Relative links in nav work just as well, but we find that by using absolute links you fix a number of potential problems. 1) It ensures that visitors go to the proper "version" of each page with regard to having the www or not. This can be fixed in other ways as well, but there's no sense forcing the link value to be "redirected" instead of pointing to the proper location in the first place. 2) It also helps when using include files for your navigation on pages that are more than one level deep. Again, relative links will work, but they're not as, well, absolute. :)
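
To illustrate the difference (example.com is a placeholder), an absolute navigation link resolves to the same canonical URL from any page on the site, while a relative one depends on where the current page sits:

    <!-- absolute: always points at the www version of the page -->
    <a href="http://www.example.com/widgets/">Widgets</a>

    <!-- relative: resolves differently depending on the current page's location -->
    <a href="../widgets/">Widgets</a>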


