Dynamic sites require specialized search engine marketing strategies that differ from those used for static sites. It's still hard to get dynamic sites indexed unless they're properly optimized. Search engines say they now index dynamic sites, and they do, but it often doesn't happen without a little help. And positioning those pages well is another issue entirely.

There are a number of strategies that can be used to convert your dynamic URLs into search-engine friendly URLs. Before we get into that, let's look at how the dynamic databases used by e-commerce sites and other large sites are created and why they are hard to index.

What Keeps Dynamic Sites Hidden?

Dynamic pages are created on the fly with technologies such as ASP, ColdFusion, Perl, and the like. These pages work well for users visiting the site, but they don't work well for search engine crawlers.

That's because dynamically generated pages don't actually exist until a user selects the variable(s) that generate them. A search engine spider can't select such variables, so the pages don't get generated or indexed.

The big problem is that crawlers such as Google's can't read an entire database of dynamic URLs, which contain a query string ("?") or other database characters (#&*!%) known to act as spider traps. Because search crawlers have trouble reading deep into a dynamic database, they have been programmed to detect and ignore many dynamic URLs.

We recently increased a client's indexable pages from six (6) to six hundred fifty-nine (659). Considering that Google originally saw only half a dozen pages, we expect those hundreds of optimized pages to significantly increase the client's search engine visibility.

Making Dynamic Sites Visible

A few dynamic-page optimization techniques can facilitate the indexing of dynamic sites. The first that comes to mind is to make use of static pages. There are also a number of fixes for converting dynamic URLs into search-engine friendly URLs. Another good way to gain visibility is through paid inclusion and trusted feed programs, which guarantee either the indexing of dynamic pages or a certain number of click-throughs.

Static Pages. Place links to your dynamic pages on your static pages, then submit the static pages to the search engines manually according to each engine's guidelines. This is easily done with a Table of Contents displaying links to the dynamic pages, as in the sketch below. While the crawlers may not index each dynamic page in its entirety, they will index most of the content.
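For illustration, here is a minimal sketch of such a Table of Contents page. The URLs and product categories are hypothetical; any static page of plain HTML links that a crawler can follow will do:

   <html>
   <head><title>Product Catalog - Table of Contents</title></head>
   <body>
   <h1>Product Catalog</h1>
   <!-- Plain hyperlinks give the crawler a path to each dynamic page -->
   <ul>
     <li><a href="http://www.example.com/catalog.asp?category=widgets">Widgets</a></li>
     <li><a href="http://www.example.com/catalog.asp?category=gadgets">Gadgets</a></li>
   </ul>
   </body>
   </html>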

Active Server Pages (ASP). XQASP from Exception Digital Enterprise Solutions is an excellent tool for converting dynamic ASP pages into search-engine compatible formats.

For example, the following URL contains both "?" and "&," making it non-indexable:

   http://www.planet-source-code.com/vb/scripts/ShowCode.asp?lngWId=3&txtCodeId=769

Below, it has been made search-engine friendly (all "?", "&" and "=" characters replaced with alternate characters):

   http://www.planet-source-code.com/xq/ASP/txtCodeId.769/lngWId.3/qx/vb/scripts/ShowCode.htm

Once you've converted the URL, don't forget to use search engine optimization techniques to modify the HTML tags and content within the tags before submitting all pages in accordance with each search engine's submission guidelines.

ColdFusion. This might be an easy fix. Reconfigure ColdFusion on your server so that the "?" in a query string is replaced with a "/", passing the parameter values as part of the URL path. You will still have to deal with optimizing your pages and making your site respond quickly when a crawler does come by for a visit.
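For example, with this kind of reconfiguration, a hypothetical ColdFusion URL such as:

   http://www.example.com/catalog.cfm?category=widgets&item=42

could instead be served in a path-style form, with no "?" to trip up the crawler:

   http://www.example.com/catalog.cfm/category/widgets/item/42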

CGI/Perl. PATH_INFO and SCRIPT_NAME are environment variables available to a CGI application: SCRIPT_NAME holds the address of the script itself, while PATH_INFO holds any extra path information appended after it. The solution is to carry your parameters in that extra path instead of in a query string, and have your script parse them back out of PATH_INFO, so the URL the crawler sees contains no "?" at all. Again, optimization is required to show up in the top editorial listings.
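Below is a minimal sketch of the idea, written in Python rather than Perl (the same CGI environment variables apply in either language); the script name, parameter names, and URL layout are assumptions for illustration only:

   #!/usr/bin/env python3
   # Handles a path-style URL such as:
   #    /cgi-bin/catalog.py/category/widgets/item/42
   # in place of the query-string form:
   #    /cgi-bin/catalog.py?category=widgets&item=42
   import os

   def params_from_path_info():
       # PATH_INFO holds the extra path after the script name,
       # e.g. "/category/widgets/item/42"
       extra = os.environ.get("PATH_INFO", "")
       parts = [p for p in extra.split("/") if p]
       # Pair successive path segments as name/value parameters.
       return dict(zip(parts[0::2], parts[1::2]))

   params = params_from_path_info()
   print("Content-Type: text/html\n")   # header plus the required blank line
   print("<html><body>")
   print("<p>Category: %s</p>" % params.get("category", "all"))
   print("<p>Item: %s</p>" % params.get("item", "none"))
   print("</body></html>")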

Apache Software. The Apache server has a rewrite module (mod_rewrite), available for Apache 1.2 and beyond, that converts requested URLs on the fly. You can publish clean, query-string-free URLs that search engines will index, and have the module rewrite them behind the scenes into the dynamic URLs your application expects. The module isn't enabled on every Apache installation by default, so find out from your Web hosting firm whether it's available for your server.
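Where mod_rewrite is available, a single rule can do the conversion. The directory names and parameters below are hypothetical; this is a sketch of the technique, not a drop-in configuration:

   RewriteEngine On
   # Serve /products/widgets/42.htm by invisibly rewriting it to the
   # underlying dynamic URL; the crawler only ever sees the clean address.
   RewriteRule ^/products/([^/]+)/([0-9]+)\.htm$ /catalog.cgi?category=$1&item=$2 [L]

You would then link to the clean /products/... form of each URL throughout the site, so that's what the crawlers find.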

Indexing Dynamic Sites With Paid Inclusion Programs

Most major search engines now offer paid inclusion and trusted feed programs based on refresh indexing or cost-per-click. Engines offering such programs include AltaVista, AskJeeves, FAST AllTheWeb, Inktomi, LookSmart, Lycos, and Teoma.

These programs alone are not good enough for search engine positioning. When indexing dynamic sites through an XML feed, you must first ensure the site is properly optimized using professional search engine optimization (SEO) techniques.

Good SEO contractors have access to Web-based feed creation and management applications that generate optimized XML feeds for multiple search engine inclusion programs. Such contractors can map the entire catalog of even a large e-commerce site, generating an automated, optimized XML feed.

The key to this XML procedure is keyword matching between the dynamic site content and various search engine databases. Using special filters and parameters, this process then generates thousands of keywords with page-specific meta information. The result is a distinctive representation of each product page on the target search engine, and a more accurate representation of your dynamic site's services, products, etc.
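As a rough illustration of the mechanics, the sketch below builds a feed file from a toy catalog in Python. The element names and catalog fields are assumptions made for this example; each engine's trusted feed program defines its own required schema:

   import xml.etree.ElementTree as ET

   # A toy catalog; in practice this would be mapped from the site's database.
   catalog = [
       {"url": "http://www.example.com/catalog.asp?item=42",
        "title": "Blue Widget",
        "description": "A durable blue widget for home and office.",
        "keywords": "blue widget, buy widgets, widget store"},
   ]

   feed = ET.Element("feed")
   for product in catalog:
       page = ET.SubElement(feed, "page")
       for field in ("url", "title", "description", "keywords"):
           ET.SubElement(page, field).text = product[field]

   # One optimized entry per product page, ready for submission.
   ET.ElementTree(feed).write("trusted_feed.xml", encoding="utf-8",
                              xml_declaration=True)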

Internet users search with mind-boggling combinations of your strategic keywords. Users at every level of search sophistication must find you before they find your competition. That's why keyword analysis and research are so important to the success of your SEO campaign. Professional optimization techniques covering site architecture, copywriting, editorial linking, etc., are also important for top positioning on major search portals.
November 12, 2002





Paul J. Bruemmer has provided search engine marketing expertise and consulting services to prominent American businesses since 1995. As Director of Search Marketing at Red Door Interactive, he is responsible for strategizing and implementing search engine marketing activities within Red Door's Internet Presence Management (IPM) services.


