Although it is not a particularly well-known concept among some webmasters, a clean, crawlable website structure is essential. Without it, from an SEO point of view, you are heading into dangerous territory.
The reason is simple: if your site cannot be crawled and indexed, there is little point in considering other aspects of SEO, such as content optimisation and link building.
During most initial search engine optimisation audits, canonicalisation quickly becomes a hot topic of discussion.
Canonicalisation is the process of ensuring that a site does not serve the same content at more than one URL.
It is in your interest to minimise, or preferably eliminate, the number of URLs that return identical content; otherwise you will face duplicate content issues. A site can exhibit canonical problems in several different ways, and as an SEO practitioner you need a plan to tackle each of them.
For example, does your site resolve at both the non-www and the www hostnames, or over both http and https? Do some folders resolve both with and without a trailing slash?
The usual solution for these canonical issues is a 301 permanent redirect, which sends requests for the potential duplicates to the single 'canonical' URL. In the example above, the non-www version of each page would be redirected to the www version. For instance, in ISAPI Rewrite 2 syntax (covered later in this article):
# Redirect the non-www hostname to the www 'canonical' hostname
# [I] = case-insensitive match, [RP] = permanent (301) redirect
RewriteCond Host: ^yourdomain\.com
RewriteRule (.*) http\://www\.yourdomain\.com$1 [I,RP]
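The same approach handles the trailing slash case mentioned above. Here is a minimal sketch, assuming ISAPI Rewrite 2 syntax and a hypothetical /products folder that should always resolve with the trailing slash:

# Hypothetical example: permanently redirect /products to /products/
# so that only one version of the URL is ever served
RewriteRule ^/products$ /products/ [I,RP]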
It is important to have a decent redirect system in place for more than just canonical issues. I was recently involved in the redesign of a site for a client, which meant moving core pieces of content, including product landing pages, to new URLs. It was vital that every page was mapped, so that a permanent 301 redirect was in place to carry visitors from each old URL to its new one.
# Map a retired page to its replacement with a permanent (301) redirect
RewriteRule /old-page.htm http://www.example.com/new-page.htm [I,O,RP,L]
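When an entire folder moves, a pattern with a capture group can map every page in a single rule. A hedged sketch, again in ISAPI Rewrite 2 syntax, using hypothetical /old-products/ and /products/ folders:

# Hypothetical example: send everything under /old-products/ to /products/,
# preserving the rest of the path via the $1 capture
RewriteRule /old-products/(.*) http\://www\.example\.com/products/$1 [I,RP,L]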
And when I say visitors, I am not referring only to the human kind. Search spiders follow these redirects too, and if a redirect is written incorrectly (or not written at all) the server will return an error instead: a poor user experience, and one that can be costly in organic search.
ISAPI Rewrite: .htaccess for the Windows environment
So how do you address the canonical issues described above? Searching Google for these problems overwhelmingly returns references to .htaccess and mod_rewrite. These are great solutions, but they rely on the site being hosted on an Apache server. What if the site is hosted in a Windows environment?
There is now a solution that is relatively easy to install and implement: ISAPI Rewrite. Version 2 uses an httpd.ini file and allows you to add rewriting rules in much the same way as an .htaccess file. Version 3 goes further and mimics Apache's .htaccess syntax directly.
The developers realised how popular .htaccess had become on Apache and built a product that accepts exactly the same code on both platforms. This was a smart move on their part, as it lets developers port the same rules from one platform to the other with minimal fuss.
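To illustrate, here is a sketch of the same non-www to www redirect written in standard mod_rewrite syntax, which ISAPI Rewrite 3 accepts; the same lines would work unchanged in an Apache .htaccess file:

# Standard mod_rewrite syntax, accepted by both Apache and ISAPI Rewrite 3
# [NC] = case-insensitive match, [R=301] = permanent redirect, [L] = last rule
RewriteEngine on
RewriteCond %{HTTP_HOST} ^yourdomain\.com$ [NC]
RewriteRule (.*) http://www.yourdomain.com/$1 [R=301,L]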
ISAPI Rewrite is available at a reasonable cost: a single license can be purchased for $99, with better pricing available on shared hosting plans.
You may syndicate this article content only if you do not modify the article in any way, leave the links intact, and utilize the canonical link relationship or Robots meta tag directives in the head section of your blog:
- <link rel="canonical" href="http://www.searcheditors.com/isapi-rewrite-canonical-windows/">
- <meta name="robots" content="noindex,noarchive,nosnippet,follow">