As a web developer and programmer I have been building and optimizing dynamic sites for the past four years. I knew going in that database-driven sites posed some optimization problems, and the first step in programming a solution is identifying the problems you'll encounter along the way. For dynamic sites, a major one is search engine visibility.

Search engine visibility, or the degree and depth to which a spider crawls a site, is crucial to any organic SEO campaign. On static websites it can be controlled through document structure/design, link hierarchy/structure and site architecture. Dynamic sites complicate link structure because some search engines' crawlers do not follow dynamic URLs (www.site.com/followme.asp?dynamic=maybe).

The querystring (?dynamic=maybe) is the cause of the indexing problem. The engines affected are primarily AltaVista (Scooter) and Inktomi (Slurp). The others have fewer problems, but can still be troublesome.

Other querystring problems:

  1. GoogleGuy, a representative of Google Inc., has stated that session IDs in querystrings cause problems. Avoid using ID as the parameter name and use alphanumeric values if at all possible, particularly if the platform is NT. Strictly numeric values can share characteristics with session IDs: NT servers use a numeric value (usually 6+ digits) for theirs, so combining the name ID with a numeric value is riskier than something like ?NUM=1234. PHP session IDs are alphanumeric, so on that platform a numeric value may be safer. At the very least, avoid using "ID" as a parameter name. Since sessions are often used to provide dynamic content, search engines besides Google may also be wary of URLs containing querystrings.
  2. Avoid using more than three parameters.
  3. Querystrings are often used to maintain state, so for a shopping cart the best solution includes a way to maintain state without the querystring.
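To make the guidelines above concrete, here is a quick sketch (in Python, purely for illustration; the function and parameter names are mine, not from any engine's documentation) of a check that flags URLs likely to trip a crawler: a parameter named ID, more than three parameters, or a long all-numeric value that could be mistaken for a session ID.

```python
from urllib.parse import urlsplit, parse_qsl

def crawl_friendly(url, max_params=3):
    """Illustrative check against the guidelines above: no parameter
    named 'ID', at most `max_params` parameters, and no long
    all-numeric values that resemble NT session IDs."""
    params = parse_qsl(urlsplit(url).query)
    if len(params) > max_params:
        return False
    for name, value in params:
        if name.lower() == "id":
            return False
        if value.isdigit() and len(value) >= 6:  # looks like a session ID
            return False
    return True

print(crawl_friendly("http://www.site.com/followme.asp?cat=widgets"))  # True
print(crawl_friendly("http://www.site.com/followme.asp?ID=123456"))    # False
```

The thresholds (three parameters, six digits) simply mirror the rules of thumb above; they are not published engine limits.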

Maintaining State Without Querystrings:
At first this seemed like a daunting task; the answer was staring me in the face for some time before it came to me. Cookies are part of the solution, but because they depend on user settings it is advisable to include an alternate tracking method. The method I developed does use parameters; however, a search engine or user has to submit a form before any parameters appear in the URL. If cookies are enabled (search engine crawlers aren't cookie-enabled), the session is tracked in NT sessions (phantom cookies), so no parameters are used to maintain state.
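The decision logic described above can be sketched as follows (Python for illustration only; the original implementation used NT sessions, and the cookie name and return labels here are hypothetical):

```python
def choose_state_mechanism(request_cookies, form_submitted=False):
    """Sketch of the fallback described above: prefer a cookie-backed
    server session; fall back to URL parameters only when the visitor
    has actively submitted a form, which crawlers will not do."""
    if "session_token" in request_cookies:
        return "server-session"   # cookies enabled: no querystring needed
    if form_submitted:
        return "url-parameters"   # human without cookies, arriving via a form
    return "stateless"            # crawlers land here and see clean URLs

print(choose_state_mechanism({"session_token": "abc"}))  # server-session
print(choose_state_mechanism({}))                        # stateless
```

The point of the ordering is that a spider, which neither accepts cookies nor submits forms, always falls through to the stateless branch and never sees a parameterized URL.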

Parameters in Querystring:
I do use parameters, but I also use them to steer engines, since static .asp pages serve the sub-categories. Because this is "redundant" rather than "duplicate" content, engines that don't like parameters won't follow them; and since a static page presents the same information formatted differently, the parameters aren't crucial to visibility.

Removing the Querystring:
The best answer to the querystring challenge is to remove it altogether, using either a server-based or a programmed solution to pass the parameters (?name=value) to the program and ultimately the database. The usability gained by removing querystrings is a benefit that should be counted in your solution's value: a URL without a querystring is easier for a user to remember.
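One programmed way to pass parameters without a querystring is to carry them in the path itself. Here's a minimal sketch (Python, for illustration; the script name matches the example URL used earlier in this article, but the parameter names are my own) of turning path segments back into the name=value pairs the program expects:

```python
def params_from_path(path, names=("category", "item")):
    """Map a static-looking path such as
    /followme.asp/widgets/blue-widget onto named parameters.
    Everything after the script name is treated as a parameter value,
    so no querystring ever appears in the URL."""
    segments = [s for s in path.split("/") if s]
    # Drop everything up to and including the script name.
    if "followme.asp" in segments:
        segments = segments[segments.index("followme.asp") + 1:]
    return dict(zip(names, segments))

print(params_from_path("/followme.asp/widgets/blue-widget"))
# {'category': 'widgets', 'item': 'blue-widget'}
```

To a spider the URL looks like an ordinary static path, while the program still receives the same parameters it would have read from a querystring.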

One workaround I read about uses a sitemap to link to static pages built from the dynamic pages. Though this is easy to do for a small site, I immediately dismissed it as a workaround rather than a solution. There is no point in making portions of a dynamic site static; it defeats the purpose of making the site data-driven in the first place. In my opinion it also has huge scalability issues. After all, total visibility is the goal, and a sitemap (or group of sitemaps) with 3,500 links isn't going to be very effective in inducing full indexing. Moreover, I'm not a big believer in the sitemap strategy even for static sites; it seems like another workaround rather than a structured solution.

I didn't go for a server-based solution because I was concerned about the associated server load. Pathinfo (www.site.com/followme.asp/dynamic/tough) was very interesting, more so as a Perl solution; VBScript is pretty verbose and its pathinfo methods looked kludgy to me, but then again, most MS methods look kludgy to me ;). Another notable server solution is mod_rewrite on UNIX, with either a component or a comparable NT service standing in for mod_rewrite on Windows.
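For readers on UNIX, a mod_rewrite rule of the kind referred to above looks something like this (an illustrative .htaccess-style fragment with hypothetical paths, not taken from any site in this article):

```apache
# Serve a static-looking URL from the real dynamic script, so links
# never expose a querystring.  /products/widgets/42 is handled
# internally by /followme.asp?cat=widgets&item=42
RewriteEngine On
RewriteRule ^products/([^/]+)/([0-9]+)$ /followme.asp?cat=$1&item=$2 [L]
```

The rewrite happens server-side, so both visitors and spiders only ever see the clean URL.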

The first solution I wrote was in Perl, using SSI to embed the database parameters. The pages had static names (page.shtml), which was the ultimate goal for this solution. This worked well, but was eventually re-programmed as an .asp solution with the database calls embedded in the page; to spiders the site appears static. I advised a client to implement the embedded SQL queries on a ColdFusion site with instant success. The site went from 100 pages in Google to over 5,000 in less than six weeks! The success is also partially due to the entry page having a PR5.
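The embedded-query idea is simple enough to sketch in a few lines. The example below uses Python and SQLite purely for illustration (the original implementations were Perl/SSI, ASP and ColdFusion, and the table schema here is hypothetical): the page has a static name, and the database call happens inside it, so nothing dynamic shows in the URL.

```python
import sqlite3

def render_category_page(db_path, category):
    """Sketch of a statically named page whose content is pulled from
    the database at render time (hypothetical schema: a 'products'
    table with name, blurb and category columns)."""
    conn = sqlite3.connect(db_path)
    try:
        rows = conn.execute(
            "SELECT name, blurb FROM products WHERE category = ?",
            (category,),
        ).fetchall()
    finally:
        conn.close()
    items = "\n".join(f"<li>{name}: {blurb}</li>" for name, blurb in rows)
    return f"<h1>{category}</h1>\n<ul>\n{items}\n</ul>"
```

A spider requesting the page sees ordinary HTML at a static URL; only the server knows the content came from a database.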

Whenever I'm providing a solution, I weigh benefits against cost. Sometimes it is more cost-effective to look at a non-programmatic solution, and it often pays to look beyond what you can "personally" provide.

PositionTech provides a one-stop shopping site for inclusion in all the major programs. This may be a more cost-effective workaround than implementing any other solution. The only real drawback to inclusion is that internal links aren't followed, so it provides only a partial solution to the indexing problem.

Implementing the inclusion program should include careful targeting of keyword phrases; optimize as many pages as you can afford to. XML feeds can also be used to leverage the database for greater visibility, almost like putting the programming on steroids, with the big caveat that they can be expensive, since payment is based on clicks. However, for anyone thinking of cloaking, this is far less risky and likely about the same cost.

See this post I made to I-Search for an example of embedded SQL code and optimization of a page using includes and database programming. Bear in mind that although I have used .asp in the example code, this embedded SQL solution can also be easily implemented in JSP (Java Server Pages), PHP and ColdFusion.

............... see ya at the top!
Da' Tmeister


September 8, 2003





In 2000 Terry founded the SeoPros Organization as a means of improving consumer perceptions of the search engine optimization industry. From 2001 to the present he has been the lead developer of its reviewed directory of SEO consultants and RFPgenerator. SeoPros has become an NFP as a vehicle for development of the industry and trade. His new venture, Canadian Search Engine Marketing (CaSEM), presently administers and continues to develop SeoPros' directory and RFPgenerator. CaSEM was formed as the provider of SeoPros Private Label Recommendation and Campaign Management Services.





Search Engine Guide > Terry Van Horne > A Dynamic Solution for Dynamic URLs