A year ago (well, more than a year ago now), I blogged about doing SE Friendly URLs in ASP on IIS without any COM or ISAPI filter, just pure ASP. I tried a few things, got it working & then other things came up and it was left then & there. Even the project I had in mind, for which I was doing all this, was abandoned, lack of time & other priorities being the reason.
But I received a request from a guy called Chris Martz, whom I don’t know, & he seemed interested in this & wanted me to post some code. 😀 So, here is what I came up with a year back. But mind you, don’t criticise the code, it was just test code & I generally don’t write test code beautifully!!! My test code is generally quite amateurish & is there just to get the job done, to get the concept right; then I work on refining & optimising it. I’ve tried to clean it up as much as I can, but even as I’m posting it, I don’t have much time to refine it, so it’s given to you on the understanding that you’ll use it at your own risk. 😉
Ok, so let’s get going. There are quite a few articles around on having Search Engine Friendly URLs in ASP without any COM or ISAPI. Almost all of them (as far as I’ve read) are about using custom 404 error pages & putting your code in them. Some of them use redirection to the dynamic URL after putting in the correct parameters & others use the Server.Transfer feature of IIS5+. There are pros & cons to each method, which I’ll put forth here.
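A quick note before the comparison: when IIS invokes a custom 404 error page, it passes the originally requested URL in the query string, in the form 404;http://host/path. That’s what both methods below parse. A minimal sketch (the host name & path are just examples):

[asp]
' A visitor asks for /products/20/abc/ which doesn't physically exist,
' so IIS runs the custom 404 page with:
'   QUERY_STRING = "404;http://www.example.com/products/20/abc/"
strQS = Request.ServerVariables("QUERY_STRING")
If Left(strQS, 3) = "404" Then
    strRequestedURL = Mid(strQS, 5)    ' everything after "404;" - the URL the visitor wanted
End If
[/asp]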
Redirect Method
The only benefit of this is that you can pass any parameters you want to the resultant page. So if your resultant page is http://www.example.com/dynamic.asp?id=20&pid=abc, you can simply redirect to it as it is (there’s a rough sketch of this method right after the list). The disadvantages of this method are:-
- the new dynamic URL with the querystring appears in the browser window, so your visitor might get confused
- the purpose you are doing all this for is defeated: the search engine’s bot will also be redirected to that dynamic URL & might not index it, which is why you are using SE Friendly URLs in the first place
- the browser is redirected & thus makes a new request to the server, so for serving 1 document, the server has to receive 2 HTTP requests
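Just to make the comparison concrete, here’s a rough sketch of what a Redirect Method 404 page might look like. The URL scheme (/products/20/abc/ mapping to /dynamic.asp?id=20&pid=abc) is only an assumption for illustration; you’d adapt the parsing to your own scheme:

[asp]
' Redirect Method sketch: parse the SE Friendly URL & bounce the browser
' to the real dynamic URL. Assumes /products/20/abc/ maps to /dynamic.asp?id=20&pid=abc.
str404 = Request.ServerVariables("QUERY_STRING")    ' "404;http://www.example.com/products/20/abc/"
strPath = Mid(str404, InStr(str404, "//") + 2)      ' "www.example.com/products/20/abc/"
strPath = Mid(strPath, InStr(strPath, "/") + 1)     ' "products/20/abc/"
arrParts = Split(strPath, "/")                      ' "products", "20", "abc", ""
Response.Redirect "/dynamic.asp?id=" & arrParts(1) & "&pid=" & arrParts(2)
[/asp]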
Server.Transfer Method
This method has just 1 disadvantage & all the others are advantages. If you are not aware of Server.Transfer(), which was introduced in IIS5, then you should look it up in your IIS documentation or your reference book(s); I’m not going to explain it here. The interesting thing is that the advantages/disadvantages of the Redirect Method apply to this method, but in reverse. So, this method’s advantages are:-
- the SE Friendly URL remains in the browser window, your visitor doesn’t notice a thing
- the purpose you are doing all this for is preserved: the search engine’s bot will also see the same URL, so it won’t notice a thing either
- the browser is not redirected & thus no new request to the server is made, so for serving 1 document, the server has to receive just 1 HTTP request
But there is one disadvantage, & it’s the same as the advantage of the Redirect Method: you can’t pass parameters to the target page via the querystring. So if your resultant page is http://www.example.com/dynamic.asp?id=20&pid=abc, you’ll have to transfer control to just http://www.example.com/dynamic.asp. On its own that would be of practically no use; this URL is SE Friendly, it hasn’t got any querystring in it, so why the hell would you resort to such a way?? But this is the way we’ll use, as it’s better than the Redirect Method & has just 1 disadvantage, which we can eliminate. 😉
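For completeness, here’s roughly what the Server.Transfer call itself looks like in a 404 handler; the page name /dynamic.asp is just an assumed example:

[asp]
' Server.Transfer sketch: the browser keeps showing the SE Friendly URL,
' but you can only transfer to a page, not to a page plus querystring.
Server.Transfer "/dynamic.asp"
' Server.Transfer "/dynamic.asp?id=20&pid=abc"  <-- can't do this, which is
' exactly the disadvantage we'll work around with a Session variable below.
[/asp]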
The way to get around this one disadvantage is quite easy, you just have to think!!!! 😀
We maintain state: we use Sessions to pass the variables to the target page, which will output the content to the browser. Now you may ask why Sessions, why not Cookies?? The answer is simple: you shouldn’t send any content to the browser from the 404 page, it’s better to send anything only from the target page. Also, you shouldn’t take the chance of Cookies being disabled on the client browser. So using Sessions is safe & sound. 😉
So I expect you are hoping for some code as well, right? 😀 Ok, here it goes.
[asp]
Private Function stripEndSlashes(strURL, strEnd)
    ' Strips a leading ("L") or trailing ("R") slash or backslash from strURL
    strEnd = UCase(Trim(strEnd))
    stripEndSlashes = strURL    ' default: return the string as-is
    Select Case strEnd
        Case "L"
            If (Left(strURL, 1) = "/") Or (Left(strURL, 1) = "\") Then
                stripEndSlashes = Mid(strURL, 2)
            End If
        Case "R"
            If (Right(strURL, 1) = "/") Or (Right(strURL, 1) = "\") Then
                stripEndSlashes = Left(strURL, Len(strURL) - 1)
            End If
    End Select
End Function

strSiteURL = "http://example.com/"    ' Full URL of the website (with trailing slash)
strSiteURLR = "/"                     ' Relative path for use in Server.Transfer

str404 = Trim(Request.ServerVariables("QUERY_STRING"))

' Check if it's a 404 error or a direct request to this page
If Left(str404, 3) = "404" Then
    ' It's a 404 error, so parse it baby!!
    ' The query string looks like: 404;http://example.com/pagename/param/
    strURL = stripEndSlashes(Mid(str404, 5), "R")                    ' requested URL minus the "404;" prefix & trailing slash
    strFileUrl = stripEndSlashes(Mid(strURL, Len(strSiteURL)), "L")  ' path after the site root, e.g. "pagename/param"
    If InStr(strFileUrl, "/") > 0 Then
        strFile = Left(strFileUrl, InStr(strFileUrl, "/") - 1)                 ' first segment = target page name
        strParam = stripEndSlashes(Mid(strFileUrl, Len(strFile) + 1), "L")     ' the rest = the parameter(s)
    Else
        strFile = strFileUrl    ' no parameters in the URL
        strParam = ""
    End If
    strRedirect = strSiteURLR & strFile & ".asp"
    Session("pageParam") = strParam
    Server.Transfer(strRedirect)
Else
    Response.Write("404 - Page Not Found")
    Response.End
End If
[/asp]
Now in the pages that will be handling requests, the top should look something like this:
[asp]
If Not IsNull(Session("pageParam")) And Trim(Session("pageParam") & "") <> "" Then
    ' We got here via Server.Transfer from the 404 page
    pageParam = Trim(Session("pageParam"))
    Session("pageParam") = Null    ' clear it so it isn't reused on the next request
ElseIf Trim(Request.QueryString("pageParam")) <> "" Then
    ' Normal direct request with a querystring
    pageParam = Request.QueryString("pageParam")
End If
' Go about using pageParam for outputting your page
[/asp]
This basically checks whether the session variable for the parameter is empty or not. If it’s not, it takes the value from there; otherwise it checks whether the querystring is set & takes the value from that if it’s not empty. You can then use that variable to retrieve whatever product you want, etc.
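To tie it all together, here’s a rough trace of one request through this setup; the page name products.asp & the URL /products/20/ are just illustrative assumptions:

[asp]
' A visitor (or a spider) requests http://example.com/products/20/
' IIS can't find it, so the custom 404 page runs with
'   QUERY_STRING = "404;http://example.com/products/20/"
' The code above works out strFile = "products" & strParam = "20",
' stores the param in Session("pageParam") & does Server.Transfer("/products.asp").
' products.asp picks it up with the snippet above & can then render the page:
Response.Write("Product requested: " & Server.HTMLEncode("" & pageParam))
[/asp]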
That’s all there is to it, Chris. 😉 It sure was a bit tough to get on with this as I cleaned up some of the code (removing useless commented lines) & combined two functions to create the stripEndSlashes function above. It’s been more than a year since I coded any ASP, and it looked a bit strange to me when I started coding & cleaning it up, but it all came back nicely, no problem. 😉
This is just the test code I worked with & is provided on an as-is basis. I don’t guarantee that it will work on your setup. You need to create an ASP page on your website & set it to handle 404 errors (in IIS 5/6 that’s typically done from the website’s Properties, under the Custom Errors tab, by pointing the 404 error to this page as a URL). Put this code in that ASP page.
This code is just for learning purposes; it’s by no means a final solution that can be plugged into a website for use straightaway!! 😀
Thanks a bunch!
Thanks, man! Nice and useful blog post. Keep doing a great job, best wishes!
Hi, I did something similar for a website and it works great, but now they want to run Webtrends to make traffic reports, and IIS logs those requests as 404 errors, so I can’t get real stats. Do you know of any workaround for this issue?
Hmm, well no, I don’t know of any workaround, haven’t come across this. The only option for you would be to use custom software for traffic reports, the kind that includes an image or some JavaScript in every page to analyse traffic.
That’s an impressive workaround =).
I won’t be using it though. I’ve just been doing some research on what I thought would be fairly easy (after spending most of my time developing using LAMP).
It’s unbelievable that there is no simple way of doing this under IIS!
There is a much simpler way if you use one of the ISAPI filters available for this; you can even use the same rewrite rules that work on Apache, hence making it easy to port the website from an Apache server to IIS.
Or if you use ASP.NET, then you also have built-in URL rewriting on IIS6, but not for ASP (unless you use ASP.NET for the URL rewriting & then call the appropriate ASP scripts).
Is using a session variable to pass page params safe for SE spiders? I read somewhere that this isn’t a good method b/c spiders can’t use the session variable.
@PAD
Yeah, it’s perfectly ok, since what’s being passed in the session is none of anyone else’s business. What helps with the search engines is what’s in the URL; that info is passed in the session variable only to get around the lack of this feature & make the server load the desired page the way we want.