Search Engine Optimization

Article provided by Orbidex. © 2001 Orbidex.






Surviving the Google Update
By Andrew Gerhart - November 29, 2001

Google is still the king of the search engines, hands-down. If you want to have a successful website through the search engines, you need to keep this in mind. Over the last week or so Google has been updating its database and its search engine results pages (SERPs). Search engines, including Google, have a tendency to shuffle their SERPs, add sites, and drop sites. To keep your head above the Google waters, there are certain things that you need to do and certain things that you should not do. We will clarify what these things are so that your website will survive the next update.

GOOGLE DON'TS
Redirects
Webmasters often employ redirects, or refreshes, on websites for any number of reasons. This should be avoided. Redirects have been the cause of penalization in Google as well as other search engines. A redirect can be done with JavaScript or HTML, and simply takes the user from one URL to another without any notice.
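
To make the warning concrete, here is a rough, illustrative Python sketch (my own example, not a Google or Orbidex tool) that scans one of your own pages for the meta-refresh style of redirect described above. The filename index.html is only a placeholder.

```
# Rough check for meta-refresh redirects of the kind the article warns about.
# "index.html" is a placeholder for one of your own pages.
import re

def find_meta_refresh(html):
    """Return the target URLs of any <meta http-equiv="refresh"> tags found."""
    pattern = re.compile(
        r'<meta[^>]+http-equiv=["\']?refresh["\']?[^>]*'
        r'content=["\'][^"\']*url=([^"\'>\s]+)',
        re.IGNORECASE,
    )
    return pattern.findall(html)

with open("index.html", encoding="utf-8") as f:
    targets = find_meta_refresh(f.read())

if targets:
    print("Meta-refresh redirects found, pointing to:", targets)
else:
    print("No meta-refresh redirects detected.")
```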

Frames
The issue of frames, and whether or not they will cause a website to be penalized or banned, is quite controversial. Googlebot, Google's indexing robot, will not enter or read any content that is inside frames. Google tells webmasters that using the NoFrames tag on framed pages will allow Google to index them, but this is not always the case. Using frames on your website is a gamble.
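
If you do use frames, a quick sanity check like the sketch below (again, just an illustration; frames_page.html is a placeholder filename) can at least confirm that a framed page offers a NoFrames section for a crawler to read.

```
# Quick, rough check: does a framed page include a <noframes> section at all?
with open("frames_page.html", encoding="utf-8") as f:
    html = f.read().lower()

if "<frameset" in html and "<noframes" not in html:
    print("Framed page with no <noframes> content - a crawler may see nothing.")
elif "<frameset" in html:
    print("Framed page includes a <noframes> section.")
else:
    print("No frameset found on this page.")
```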

Dynamic Pages
This is another gamble. Google will index file types including html, pdf, asp, jsp, hdml, shtml, xml, cfml, doc, xls, ppt, rtf, wks, lwp, and wri. The problem is that certain file types and extensions tend to rank better and are easier to optimize. Google has officially stated that, because of server issues, it will only index a certain number of dynamically generated pages, meaning URLs that contain a query string. If a search term brings back 3,000 results, there is a good chance that your dynamically generated page will show up in the SERPs. On the other hand, if another search term brings back 1,000,000 results, the chances are that your dynamically generated page will not show up. One reason for this is that with so many competing sites you can bet there are a great many optimized websites, and dynamic pages do not fall into that category.
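
To see which of your own URLs fall into the "dynamic" category, a simple check like the following separates query-string URLs from static ones. The example links are made up.

```
# Separate "dynamic" URLs (those carrying a query string) from static ones.
from urllib.parse import urlparse

internal_links = [
    "http://www.example.com/products.asp?id=42&cat=7",
    "http://www.example.com/about.html",
    "http://www.example.com/articles/seo-tips.html",
]

dynamic = [u for u in internal_links if urlparse(u).query]
static = [u for u in internal_links if not urlparse(u).query]

print("Dynamic (query-string) URLs:", dynamic)
print("Static URLs:", static)
```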

Orphan Pages
The way that Google finds a website and the web pages within it is by sending out Googlebot, Google's indexing robot. Googlebot will crawl each page and then follow the links on the page to the next page. If your website has orphan pages, which are pages with no links pointing to them, Googlebot will not be able to find them. And if a page has no links pointing to it, then improving that page's link popularity and Page Rank, which we discuss below, is virtually impossible.

Orphan pages are also bad from a usability standpoint. How is the user going to find the page if they don't know the URL or did not find the page in a search engine?
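
One way to spot orphan pages is to do roughly what Googlebot does: start at the homepage, follow the links, and see which of your known pages are never reached. The sketch below is a simplified illustration that assumes a small site with same-host links; the URLs are placeholders for your own pages.

```
# Simplified link-following crawl: report known pages it never reaches (orphans).
from html.parser import HTMLParser
from urllib.request import urlopen
from urllib.parse import urljoin, urlparse

class LinkParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def find_orphans(start_url, known_pages):
    host = urlparse(start_url).netloc
    seen, queue = set(), [start_url]
    while queue:
        url = queue.pop()
        if url in seen or urlparse(url).netloc != host:
            continue
        seen.add(url)
        parser = LinkParser()
        try:
            parser.feed(urlopen(url).read().decode("utf-8", "ignore"))
        except OSError:
            continue                     # unreachable page; skip it
        queue.extend(urljoin(url, link) for link in parser.links)
    return [page for page in known_pages if page not in seen]

# known_pages would come from your own site inventory; these are placeholders.
orphans = find_orphans("http://www.example.com/",
                       ["http://www.example.com/orphan.html"])
print("Pages a link-following crawler never reached:", orphans)
```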

"Doorway Pages"
These pages, which are created to target a specific keyword, have been criticized and deemed spam by many webmasters and search engines. Almost every search engine states in its terms of service, or on its submission pages, not to submit doorway pages. Doorway pages earned their bad reputation because they were abused and because they are, in essence, orphan pages. Some people in the SEO industry, and some webmasters, would create a page with very little actual content, a very high keyword density, and links into the actual website. These pages would then be submitted to the search engines to rank well for a specific keyword. Because they were easy to create, they could be produced on a large scale, which flooded search results with irrelevant pages and created more spam within the search engines.

Targeting a keyword is fine, and is what we do as optimizers. There are some things to think about when creating a page to target a keyword so that the page is not classified as a "doorway page." Does the page have any real content on it? Is the page actually connected to the rest of the site? What is the keyword density of the page? Most importantly, would a human reviewer let this page pass?

In essence, every page on your site should be a doorway page, in that each should target a specific keyword; the trick is to make these pages an integral part of the site. Don't isolate the page, leave it without valuable content, or make it a mere entrance point into the site.

Get Caught Cloaking
Notice that I didn't say, "Don't Cloak." The reason is that some people are very successful with cloaking. Cloaking is a search engine optimization method in which, through scripting, the search engine is fed a highly optimized version of a page while the user is served a different version of that page. There are many forms and uses of cloaking, and some of the search engines even use it themselves.
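
Purely to make the definition concrete, the sketch below shows what cloaking amounts to at a technical level: choosing which page to serve based on the requesting User-Agent header. The crawler names and filenames are illustrative, and this is the behavior the guidelines discussed next prohibit, not a recommendation.

```
# Illustration only: serving a crawler a different page than a human visitor.
def choose_page(user_agent):
    crawler_signatures = ("googlebot", "slurp", "scooter")   # illustrative list
    if any(sig in user_agent.lower() for sig in crawler_signatures):
        return "optimized_for_spiders.html"   # heavily optimized version
    return "regular_page.html"                # version human visitors see

print(choose_page("Googlebot/2.1 (+http://www.googlebot.com/bot.html)"))
print(choose_page("Mozilla/4.0 (compatible; MSIE 5.5; Windows 98)"))
```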

Google has officially stated in its "Do's and Don'ts" that it will not allow cloaked pages in its database, and other search engines have the same policy. One way Google finds websites that use cloaking is through its cache feature. So, if you are going to cloak pages that you would like to rank in Google, you must be careful and pay attention to them. Again, this technique is a gamble.

Hidden Text
This is an old trick that used to work, but it has since been thrown out the window. If you use this method, it is safe to say you are going to be classified as a spammer.

Abusive Submission (automated and manual)
While all search engines warn against abusive submission, the methods of submission and of getting your website listed in the Google SERPs are a bit different. Google states that submission is not necessary for inclusion in its database. Googlebot spiders its way through the websites that are already in the Google database, and the best way to get Googlebot to visit your website is to get a link on one of those sites. This also ties into your website's link popularity and Page Rank. To increase the chance that your website will be found by Googlebot, find a well-ranking site for one of your target keywords and request that it post a link to your site. Another way to be found without submitting is to get a listing in the Open Directory Project and Yahoo!.

Those are the issues with manual submission. As for automated tools: several months ago Google was found to be penalizing websites associated with certain IPs. It was penalizing websites whose owners were using programs that automatically report the website's rankings. Google noticed a huge number of requests on its servers from webmasters and optimizers abusing these rank-checking programs; it will detect the IP the requests come from and either penalize or block that IP.

Keyword Density Too Low
Keyword density is the ratio of keywords to total text on a web page. Keyword density is a factor in a web page's ranking for each target keyword. If the keyword density is too low, then this may be part of the reason your site is not doing well in the SERPs. Search Engine World has a good tool for checking your keyword density.
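
As a rough, do-it-yourself version of such a tool, the sketch below computes a single-word keyword density as occurrences of the keyword divided by total words on the page. The sample text is made up, and dedicated checkers are more thorough.

```
# Back-of-the-envelope keyword density: keyword occurrences / total words.
import re

def keyword_density(text, keyword):
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = words.count(keyword.lower())
    return 100.0 * hits / len(words)

page_text = "Search engine optimization tips: optimization for Google and more."
print(f"Density of 'optimization': {keyword_density(page_text, 'optimization'):.1f}%")
```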

Keyword Density Too High
Your keyword density shouldn't be too low, but it also shouldn't be too high. If it is too high, then Google may see the page as spam or a doorway page. The keyword density you should target is relative to your market. If the web pages in the top 10 of the SERPs all have an average of 50% keyword density, then you will want to target roughly the same number to be able to compete.

Bad Navigation
The navigational structure relates to orphan pages and to Googlebot being able to find its way through your entire site. Your website should have a clean navigational structure that makes it easy for the user, as well as Googlebot, to get through the site. Themes are a good way to lay out your navigational and website structure.

The best form of navigation is text, as it will help with keyword density and word count, as well as defining the link for the robot. Flash was previously a form of navigation to stay away from, but recently Google began indexing Flash links.

GOOGLE DO'S
Site Accessibility
First and foremost, you need to make sure that your website is always accessible. Downtime happens often, and if your server goes down temporarily for whatever reason and Googlebot comes to your site at that same time, Google will leave you out of the update and you will have to wait until the next month's update to get back in. InternetSeer is a web-monitoring tool that will check your site throughout the day and alert you if it goes down.
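
A bare-bones version of that kind of monitoring is easy to script yourself. The sketch below simply requests the homepage and reports whether it answered; the URL is a placeholder, and you would run it on a schedule (cron, Task Scheduler) rather than once.

```
# Minimal availability check in the spirit of a monitoring service.
from urllib.request import urlopen
from urllib.error import URLError

SITE = "http://www.example.com/"   # placeholder for your own homepage

try:
    code = urlopen(SITE, timeout=10).getcode()
    print(f"{SITE} answered with HTTP {code}")
except (URLError, OSError) as err:
    print(f"{SITE} appears to be down: {err}")
```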

Good Internal Linking
The internal linking of your site plays a role in the indexing and ranking of your website and web pages. If you decide to theme the website, then it should be linked vertically, then horizontally within categories, but not horizontally across categories. The linking within the site should not be so heavy that it dilutes the strength of each page.

Another important aspect of the internal linking of your site is the sitemap. A sitemap is a page that contains links to every page on the site. There should be a link to the sitemap on every page, or at least on the homepage, and the link should be easily reached by Googlebot. By having the link to the sitemap, you can be sure that Googlebot will be able to reach each one of your web pages.
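
If your site lives in a folder of static HTML files, a sitemap page can even be generated automatically. The sketch below is one illustrative way to do it; the folder name and domain are placeholders, and it assumes your file paths map directly to URLs.

```
# Generate a plain HTML sitemap linking to every .html/.htm page in a folder.
import os

SITE_ROOT = "public_html"               # placeholder local folder
BASE_URL = "http://www.example.com/"    # placeholder domain

links = []
for folder, _dirs, files in os.walk(SITE_ROOT):
    for name in sorted(files):
        if name.endswith((".html", ".htm")):
            rel = os.path.relpath(os.path.join(folder, name), SITE_ROOT)
            url = BASE_URL + rel.replace(os.sep, "/")
            links.append(f'<li><a href="{url}">{rel}</a></li>')

with open(os.path.join(SITE_ROOT, "sitemap.html"), "w", encoding="utf-8") as f:
    f.write("<html><body><h1>Site Map</h1><ul>\n")
    f.write("\n".join(links))
    f.write("\n</ul></body></html>\n")

print(f"Wrote sitemap with {len(links)} links.")
```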

Good External Linking (link popularity)
External linking, also called link popularity, is the number of links on other websites that point to your web pages. This is a big factor in Google's ranking algorithm and, as mentioned before, it can speed up the process of getting your site into the database. Previously, all that was needed to climb the SERPs was a large number of links. Google is now examining the quality of the links as well as the quantity. For example, it is sometimes better to have 20 links to your site from sites within the top 40 for a target keyword search than to have 100 links that are irrelevant to the website's content. This ties into Page Rank, which we will discuss below.

Have Good Content
Forget the tricks and forget the smoke and mirrors. The way to rank well in Google is to have good, unique content that is optimized. A search engine is a business, with the main goal being to provide the most relevant results to the user based on their search. If your website has unique, good content that is relevant to a search, the search engines will want to serve up your site to the user. (That is where optimizing comes in)

The content should be in the form of HTML. Google uses its indexing robot, Googlebot, to crawl through your website and go through your content. While the robot can read the image tags, Flash tags, etc., it cannot read the actual files. It is a robot, not a human. For example, Googlebot cannot see that your image contains 20 lines of keyword-rich text; it just sees <img src>.

This is why static HTML content is vital to your Google listing.
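
A quick way to appreciate the point is to look at a page the way the robot does: markup stripped away, text only. The sketch below does exactly that, and anything that lives only inside an image or a Flash file simply never appears in the output.

```
# Extract only the visible text from a page - roughly what an indexer "reads".
from html.parser import HTMLParser

class TextOnly(HTMLParser):
    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip and data.strip():
            self.chunks.append(data.strip())

parser = TextOnly()
parser.feed("<html><body><h1>Widgets</h1><img src='banner.gif'>"
            "<p>Plain text survives.</p></body></html>")
print(" ".join(parser.chunks))   # -> "Widgets Plain text survives."
```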

Update The Content
A good way to stay on top of your competitors within Google, and to feed Googlebot, is to keep your content updated. Content becomes outdated and stale and in turn becomes undesirable to the user. A website should be updated at least once a month, if not more often.

Google's normal updating cycle is once every four weeks. Recently, Google has been seen updating certain sites on cycles as short as 24 hours. The sites being updated on this cycle are the ones that have updated content and a high user base.

Listing In Yahoo and ODP
If your site is not in the Google database or if your site is in jeopardy of being dropped from the database, a good way for Googlebot to find your site is to have a good listing in Yahoo! and the Open Directory Project.

Have a Robots.txt
A robots.txt is a file that resides with your web pages and is the first file that Googlebot, and most other robots and spiders, will attempt to access. This file tells Googlebot which files it can access and which files to stay away from. Using the robots.txt can be beneficial in some cases. For example, if you have a page that is in the current index but you need to take it down, you can deny Googlebot access to that page. The same goes for a link that no longer exists.
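
The sketch below shows how a well-behaved robot interprets a robots.txt file, using Python's standard robot parser. The rules and paths are an example only: they block a page you have taken down while leaving the rest of the site open.

```
# How a robot reads robots.txt: which paths may it fetch?
from urllib.robotparser import RobotFileParser

sample_robots_txt = """\
User-agent: *
Disallow: /old-page.html
Disallow: /tmp/
"""

parser = RobotFileParser()
parser.parse(sample_robots_txt.splitlines())

for path in ("/index.html", "/old-page.html", "/tmp/draft.html"):
    allowed = parser.can_fetch("Googlebot", "http://www.example.com" + path)
    print(f"{path}: {'allowed' if allowed else 'disallowed'}")
```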

Good Page Rank
Google has added a variable to its ranking algorithm called Page Rank (PR). Page Rank is basically a calculation of how much authority your website has within a market. PR is a complex mathematical calculation whose end result is a number from 0 to 10. A very basic explanation of Page Rank is that a link from Site A to Site B is a vote for Site B from Site A. The twist is that not every vote carries the same weight. When Google calculates the weight of these votes, it looks at the PR and the content of the site casting the vote. If the PR of the voting site is high, the link counts for more; if the PR is low and the content is not relevant, the link carries less weight.
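
To make the "voting" idea concrete, here is a toy, simplified Page Rank-style calculation on a made-up three-page link graph. It illustrates the principle only; Google's real algorithm and its 0-10 scale are far more involved.

```
# Toy PageRank-style iteration: each page's score depends on the scores of the
# pages that link to it, divided by how many votes those pages cast.
links = {                     # page -> pages it links to (the votes it casts)
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
}

damping = 0.85
rank = {page: 1.0 for page in links}

for _ in range(50):           # iterate until the scores settle
    new_rank = {}
    for page in links:
        incoming = sum(
            rank[src] / len(outs) for src, outs in links.items() if page in outs
        )
        new_rank[page] = (1 - damping) + damping * incoming
    rank = new_rank

for page, score in sorted(rank.items(), key=lambda item: -item[1]):
    print(f"Page {page}: {score:.3f}")
```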

Earlier we discussed link popularity and how 20 links are sometimes better than 100 links; this is where that example applies. Page Rank is the foundation of Google's ranking algorithm, and its complexity makes it hard to tamper with or manipulate.

Although there is no quick solution to a low Page Rank, there are ways to improve your PR. Perform a search within Google for your target keywords and make a list of sites that have a high Page Rank. The websites that you choose have to be within your market, or have relevant content, because Google examines this. A high PR can be deemed anything over 7; websites with a PR of only 6 should not make the list. Contact the webmasters of these sites and ask them to post a link to your site. Some of them will decline, and this is where having that unique, good content comes into play!

The title of this article is "Surviving the Google Update," but if you adhere to these guidelines and exceed them, your site will be on the list of sites that are updated every 48 hours. Good luck!