Years ago dynamic websites posed significant problems for search engines. While the engines have come a long way since the early days of dynamic website development, there are still some key problems that arise. Google has gone public stating that you don't need to fix your dynamic site problems, but in reality that's poor advice for the website owner.

Google says this in its own self-interest, using your problems as test cases for fixing its own. But when engaging in the battle for online visibility, you don't want to sit around while Google figures out how to plug the holes in its indexing spiders. You need to be proactive and fix the issues so you can be competitive today, not tomorrow.

Poor categorization that creates duplicate pages

One of the key problems with websites that serve dynamic content is producing "pages" with the same content under different URLs. This happens when there are multiple ways to navigate to specific products. For example, blogs often allow you to find articles by category, date and author. Navigating by any of these methods can get you to the same page of content, but on a different URL for each path (a category-based URL, a date-based URL or an author-based URL).

The best solution for this is to give each product a static URL, or to create a "master" category from which the URL is formed. For any product that can fit into multiple categories, allow only one possible URL. All other links to that page or product, regardless of which category the visitor navigates through, will point to that same URL.
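The master-category idea can be sketched in a few lines. This is illustrative only; the `Product` class, category names and URL pattern are assumptions, not code from any particular ecommerce platform:

```python
# Sketch of the "master category" approach: every product stores one
# canonical category, and all links are built from it, no matter which
# category path the visitor browsed through.

class Product:
    def __init__(self, slug, categories):
        self.categories = categories          # all categories it appears in
        self.master_category = categories[0]  # the only one used for URLs
        self.slug = slug

    def canonical_url(self):
        # Always the same URL, regardless of navigation path.
        return f"/{self.master_category}/{self.slug}/"

widget = Product("blue-widget", ["gadgets", "sale-items", "new-arrivals"])
print(widget.canonical_url())  # /gadgets/blue-widget/
```

However a visitor reaches the widget (through "sale-items" or "new-arrivals"), the link they click and the URL the search engines index is always the same one.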

Secure / Non-Secure Page Duplication

When dealing with ecommerce websites, visitors often find themselves moving into the secure portion of the site once they enter "checkout" mode. This security is essential for keeping their information private, but sites will often link out from their secure areas back to their products or other main pages. If those links are created incorrectly, the visitor stays in the secure portion of the site even while viewing pages that are also available in the non-secure area.

There is nothing inherently wrong with navigating a site entirely in secure mode. But it means both secure and non-secure versions of the same pages can exist in the search engine indexes, creating duplicate content issues that throw a wrench into your SEO efforts. Ensure that links leading out of the secure checkout area point back to the non-secure versions of those pages, so that these duplicates cannot exist.
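One way to enforce this is to generate every internal link through a single helper that chooses the scheme from the path, so pages can never accidentally stay secure. A minimal sketch, assuming a hypothetical site where only checkout and account pages are meant to be secure (the paths and hostname are illustrative):

```python
from urllib.parse import urlunsplit

# Illustrative list of path prefixes that should be served securely.
SECURE_PATHS = ("/checkout", "/account")

def site_link(path, host="www.example.com"):
    # Links to checkout/account pages use https; everything else is
    # forced back to http, so a visitor leaving the secure area lands
    # on the one canonical, non-secure version of each content page.
    scheme = "https" if path.startswith(SECURE_PATHS) else "http"
    return urlunsplit((scheme, host, path, "", ""))

print(site_link("/products/blue-widget/"))  # http://www.example.com/products/blue-widget/
print(site_link("/checkout/step-1"))        # https://www.example.com/checkout/step-1
```

Because the scheme is decided in one place, a fix there takes effect across every template that builds links with it.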

Session IDs

Session IDs are often used to track visitors as they navigate from page to page and to ensure that the products they add to the shopping cart remain "attached" to that specific user. Problems arise when these session IDs are appended to the end of each URL. The first visitor sees the page's URL with one session ID tacked on, the next visitor gets the same URL with a different session ID, and the next gets yet another. Three users, three URLs, same page.

Every visitor creates a duplicate page with a completely unique URL. The search engines index these URLs and suddenly you have thousands of pages in the index, all for the same basic page of content. The best solution is to use other methods of tracking rather than session IDs. Yes, session IDs can be suppressed so the search engines don't get them, but this still leaves holes when other pages link using URLs with the session IDs in them. While the engines may be able to figure it out, you don't always want to leave things to chance.
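If session IDs must live in the URL, at minimum they should be stripped before URLs are written into links or shared. A small sketch using the standard library; the parameter name `sid` is an assumption, since platforms vary (`PHPSESSID`, `jsessionid` and so on):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def strip_session_id(url, param="sid"):
    # Remove the session-ID query parameter so every visitor links to
    # the same address; other parameters (product ID, etc.) survive.
    parts = urlsplit(url)
    query = [(k, v) for k, v in parse_qsl(parts.query) if k != param]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(query), parts.fragment))

print(strip_session_id("http://example.com/product.php?id=42&sid=a1b2c3"))
# http://example.com/product.php?id=42
```

Cookie-based session tracking avoids the problem entirely, which is why it remains the preferred fix; this helper only patches the symptom where legacy URLs already carry the ID.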

Broken links

Dynamic sites make it easy to add and remove products or pages with little or no effort. For each new page added a new URL is created and links to the new content are automatically integrated. But what if you add links to other pages manually in the content you write? This happens frequently with blogs and ecommerce sites. You add a link to a product or article that you think the visitor viewing that page will be interested in. But down the road that page or product gets removed from the database and now you have a hard-coded link that goes nowhere.

The solution here is quite simple. Check for broken links regularly. This will allow you to go in and change any links going to content or products that no longer exist, ensuring that you're not sending your visitors (or the search engines) off to nowhere.
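Dedicated link-checking tools exist, but the core of a regular check is simple enough to sketch with nothing but the standard library. The URL list here is illustrative; a real crawler would also extract the links from your own pages:

```python
import urllib.request
import urllib.error

def check_links(urls, timeout=10):
    # Request each URL (HEAD keeps it lightweight) and collect any that
    # fail, e.g. pages removed from the database that links still point to.
    broken = []
    for url in urls:
        try:
            req = urllib.request.Request(url, method="HEAD")
            urllib.request.urlopen(req, timeout=timeout)
        except (urllib.error.URLError, OSError):
            broken.append(url)
    return broken
```

Run on a schedule, anything the function returns is a link to go in and update before visitors or spiders hit the dead end.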

Complicated URLs

Many dynamic systems create very complicated URLs. While the search engines can spider them, the longer and more complicated they are, the more it slows the search engine spiders down. Instead of continuing to index your pages they may leave sooner rather than later.

You also have the usability factor. URLs are often displayed in the search results and can provide key clues to the page's value for the searcher's intent. If the URL is convoluted you lose a prime opportunity. On the other hand, if you ensure your dynamic system produces clean, user-friendly URLs, those URLs, displayed in the search results, can tell the searcher quite a bit about what's on that page. If the words in the URL match or closely fit the search query, the searcher can be fairly confident that clicking on your page will deliver the content they want.
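Most dynamic systems can build these clean URLs from a "slugified" version of the title. A minimal sketch of the idea (the `/articles/` prefix is illustrative):

```python
import re

def slugify(title):
    # Lowercase the title, collapse every run of non-alphanumeric
    # characters into a hyphen, and trim stray hyphens at the ends,
    # so the words of the title survive into the URL.
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)
    return slug.strip("-")

print("/articles/" + slugify("Dy-No-Mite Solutions to Dynamic Content Problems"))
# /articles/dy-no-mite-solutions-to-dynamic-content-problems
```

A searcher who queried "dynamic content problems" can read their own words right in that URL, which is exactly the confidence signal described above.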

Hidden data behind forms, dropdowns and passwords

Many dynamic systems use drop-down menu systems to help users navigate to the areas of the site they want. Most of these drop-down menus are created using form fields rather than CSS. While users have no problem using this type of navigation, search engines do. The same is true of content that requires a password to access.

Content and pages hidden behind this type of navigation become virtually unfindable by the search engines unless there is an alternate means of navigation. The same drop-down effect can often be created using CSS, but the key is to make sure the links are generated using proper HTML rather than forms or JavaScript.

Bloated Code

Dynamic websites run off templates. The developers create one or more templates based on your needs, and then various pieces of content are automatically inserted into the proper place on each template "page". If the template contains bloated code, then your site as a whole can be too cumbersome for the search engines. It's not that they won't spider your pages; it's that they'll tire of it more quickly and move on to other sites.

There are many ways that code bloat is created. The primary culprits are poor HTML markup, excessive tables, and on-page JavaScript and CSS. The good news is that if you fix your bloated templates to be lean and clean, changing one template will change dozens, if not hundreds or even thousands, of pages on your site. A simple solution that takes effect site-wide.

Dynamic websites are becoming more and more popular because they allow significant ease in updating product information, changing templates or adding new content. Fortunately, the problems typically associated with dynamic websites can be eradicated. While other business owners may sit and wait for the search engines to get around to finding a solution on their end, you can be proactive and fix the problems that may be hindering your performance today.

February 24, 2009

Stoney deGeyter is the President of Pole Position Marketing, a leading search engine optimization and marketing firm helping businesses grow since 1998. Stoney is a frequent speaker at website marketing conferences and has published hundreds of helpful SEO, SEM and small business articles.

If you'd like Stoney deGeyter to speak at your conference, seminar, workshop or provide in-house training to your team, contact him via his site or by phone at 866-685-3374.

Stoney pioneered the concept of Destination Search Engine Marketing which is the driving philosophy of how Pole Position Marketing helps clients expand their online presence and grow their businesses. Stoney is Associate Editor at Search Engine Guide and has written several SEO and SEM e-books including E-Marketing Performance; The Best Damn Web Marketing Checklist, Period!; Keyword Research and Selection, Destination Search Engine Marketing, and more.

Stoney has five wonderful children and spends his free time reviewing restaurants and other things to do in Canton, Ohio.


Boy! What a lot of great info. Thanks for a good overview of how content affects our sites and SEO standings.

Wonderful information. I have wondered how to get rid of a bunch of PageRank-stealing useless URLs that maps and Amazon cause on our sites.

Thanks for the info, you really had great insights. By the way, do you think the best solution for broken links is deleting them? That's what I usually do.

Most search engines don't pay attention to dynamic URLs, meaning they don't index them, because a search engine crawling a dynamic URL sees a huge sequence of web pages whose contents keep changing, making each look like a different URL. This is called a spider trap. More sophisticated search engines, such as Google, will index dynamic URLs as long as the information is market-specific and content-rich. But the best way to ensure that a website is indexed by the largest possible number of search engines is to include at least one significant page with a static URL, whose contents do not change unless the HTML code is rewritten.

@Miami - You can delete the broken link, or point it to a different relevant page. But either way, the URL to the non-existent page should be gone.

Yeah, I live at a drug rehab and I'm trying to learn this stuff right now. Blogs like this are really useful... thanks.


Search Engine Guide > Stoney deGeyter > Dy-No-Mite Solutions to Dynamic Content Problems