I think the key question here is "Why do you want to compress URLs?"
Trying to shorten long URLs for the address bar?
You're better off storing the original URL somewhere (a database, a text file, ...) alongside a hash of the non-domain part (MD5 is fine). You can then have a simple page (or an HttpModule if you're feeling flashy) that reads the hash from the request and looks up the real URL. This is how TinyURL and similar services work.
For example:
http://mydomain.com/folder1/folder2/page1.aspx
Could be shortened to:
http://mydomain.com/2d4f1c8a
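Here is a minimal sketch of that scheme; `UrlShortener`, `Shorten`, and `Resolve` are hypothetical names, and the dictionary stands in for whatever database or file you actually use:

```csharp
using System;
using System.Collections.Generic;
using System.Security.Cryptography;
using System.Text;

class UrlShortener
{
    // In-memory stand-in for a real database table mapping code -> URL.
    private readonly Dictionary<string, string> _store =
        new Dictionary<string, string>();

    // Hash the non-domain part and keep the first 8 hex digits as the short code.
    public string Shorten(string url)
    {
        string path = new Uri(url).PathAndQuery;
        using (MD5 md5 = MD5.Create())
        {
            byte[] hash = md5.ComputeHash(Encoding.UTF8.GetBytes(path));
            string code = BitConverter.ToString(hash)
                .Replace("-", "")
                .ToLowerInvariant()
                .Substring(0, 8);
            _store[code] = url; // persist the mapping
            return code;
        }
    }

    // The lookup page reverses the mapping: code in, original URL out.
    public string Resolve(string code)
    {
        string url;
        return _store.TryGetValue(code, out url) ? url : null;
    }
}
```

Truncating the hash keeps the codes short but makes collisions possible, so a real implementation should check for an existing entry before handing out a code.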
Using a compression library for this will not work. The string will be compressed into a shorter binary representation, but converting that binary back into text that is valid in a URL (e.g. via Base64) will negate any benefit you gained from the compression.
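A quick demonstration of why (the path and class name below are illustrative): on an input this short, gzip's fixed header overhead plus Base64's roughly 4/3 expansion make the result longer than what you started with:

```csharp
using System;
using System.IO;
using System.IO.Compression;
using System.Text;

class CompressionDemo
{
    static void Main()
    {
        string path = "/folder1/folder2/page1.aspx"; // 27 characters
        byte[] raw = Encoding.UTF8.GetBytes(path);

        byte[] compressed;
        using (var ms = new MemoryStream())
        {
            using (var gz = new GZipStream(ms, CompressionMode.Compress))
            {
                gz.Write(raw, 0, raw.Length);
            }
            compressed = ms.ToArray(); // GZipStream flushes on Dispose
        }

        // Base64 expands the binary by ~4/3 and gzip adds fixed header
        // overhead, so the "compressed" URL text ends up longer.
        string asUrlText = Convert.ToBase64String(compressed);
        Console.WriteLine("original:   {0} chars", path.Length);
        Console.WriteLine("compressed: {0} bytes", compressed.Length);
        Console.WriteLine("as Base64:  {0} chars", asUrlText.Length);
    }
}
```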
Storing lots of URLs in memory or on disk?
Use the built-in compression support in System.IO.Compression, or the ZLib library, which is simple and incredibly good. Since you will be storing binary data, the compressed output will be fine as-is. You'll need to decompress it before using it as a URL.
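A minimal round-trip sketch, assuming System.IO.Compression's DeflateStream (`UrlStorage` is a hypothetical helper name):

```csharp
using System.IO;
using System.IO.Compression;
using System.Text;

static class UrlStorage
{
    // Compress a URL to bytes for storage; DeflateStream skips gzip's
    // file-format header, which matters when the inputs are short.
    public static byte[] Compress(string url)
    {
        byte[] raw = Encoding.UTF8.GetBytes(url);
        using (var ms = new MemoryStream())
        {
            using (var deflate = new DeflateStream(ms, CompressionMode.Compress))
            {
                deflate.Write(raw, 0, raw.Length);
            }
            return ms.ToArray();
        }
    }

    // Decompress the stored bytes back to text before using it as a URL.
    public static string Decompress(byte[] stored)
    {
        using (var input = new MemoryStream(stored))
        using (var deflate = new DeflateStream(input, CompressionMode.Decompress))
        using (var output = new MemoryStream())
        {
            deflate.CopyTo(output);
            return Encoding.UTF8.GetString(output.ToArray());
        }
    }
}
```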