We've got a classic ASP application that is putting out some very large reports, where the resulting HTML is several MBs. We've made a lot of progress in trimming this down by reducing extraneous HTML, but I'd like to know if there's any way to enable GZIP compression on these dynamic .asp pages. I'm sure compressing them would be an enormous benefit to the file size.
All of the GZIP compression information I've seen only talks about supporting files or .aspx pages.
Thanks.
Sure, that's just a matter of turning on compression in IIS. See this MSDN page for example.
I recommend using HttpZip from Port 80 Software. It basically just enables compression in IIS but from a GUI instead of getting into the metabase. I used it in a web-farm for a big enterprise ASP application.
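If you'd rather not buy a tool, dynamic compression on IIS 6 can be switched on directly in the metabase with adsutil.vbs. This is a sketch assuming the default AdminScripts location; double-check the property names against your IIS version before running it on production:

```shell
cd C:\Inetpub\AdminScripts

rem Turn on dynamic (and static) compression globally
cscript.exe adsutil.vbs set w3svc/filters/compression/parameters/HcDoDynamicCompression true
cscript.exe adsutil.vbs set w3svc/filters/compression/gzip/HcDoDynamicCompression true
cscript.exe adsutil.vbs set w3svc/filters/compression/deflate/HcDoDynamicCompression true

rem Make sure .asp is in the list of extensions compressed dynamically
cscript.exe adsutil.vbs set w3svc/filters/compression/gzip/HcScriptFileExtensions "asp" "dll" "exe"
cscript.exe adsutil.vbs set w3svc/filters/compression/deflate/HcScriptFileExtensions "asp" "dll" "exe"

iisreset
```

After the reset, check a response in Fiddler or similar for a Content-Encoding: gzip header to confirm it took effect.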
On our web server we want to provide URLs that can be used in HTML elements for displaying user profile pictures.
I could do this in two ways:
put the images on the web server as static resources: mysite.com/user1.jpg
implement an IHttpHandler: mysite.com/images?userid=1
are there any benefits of one method over the other?
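For reference, the handler option in the question would look roughly like this. This is an illustrative sketch only: the class name, folder path, and query parameter are assumptions, not anything from your app.

```csharp
using System.Web;

// Hypothetical IHttpHandler serving mysite.com/images?userid=1 from a folder on disk.
public class ProfileImageHandler : IHttpHandler
{
    public bool IsReusable
    {
        get { return true; }
    }

    public void ProcessRequest(HttpContext context)
    {
        int userId;
        // Parse the id rather than concatenating raw input, to avoid path traversal.
        if (!int.TryParse(context.Request.QueryString["userid"], out userId))
        {
            context.Response.StatusCode = 400;
            return;
        }

        string path = context.Server.MapPath("~/App_Data/profiles/" + userId + ".jpg");
        context.Response.ContentType = "image/jpeg";
        // Allow downstream caching so repeat views don't re-enter the ASP.NET pipeline.
        context.Response.Cache.SetCacheability(HttpCacheability.Public);
        context.Response.TransmitFile(path);
    }
}
```

Every request for a picture then runs through ASP.NET, which is exactly the overhead the answers below recommend avoiding when the content is static.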
Not handling images via ASP.NET but letting IIS deal with them as static content and circumventing the ASP.NET pipeline altogether would be fastest.
Implementing an HttpHandler, apart from requiring code, would require the ASP.NET pipeline to be involved - this takes more resources from the server.
Basically, if the content is static, let it be static and let IIS handle it.
Whilst allowing IIS to handle the images would be easier, it depends on whether you want to compress the files as they are sent. Recent versions of IIS support GZIP compression, so IIS can compress the files as it serves them; older versions may not, in which case you would have to use an HttpHandler or similar to compress and serve them yourself.
Let IIS deal with them. Serving images from a handler can take up a lot of resources.
Don't do that unless absolutely necessary.
If your site has a lot of images, that means many extra requests for ASP.NET to handle.
ASP.NET needs considerably more resources per request than IIS needs to push out a static file.
I want to be able to track the amount of data that is transferred from my web site to each user that accesses the site. I can do this for file downloads and such, but what about the pure HTML content itself?
How can I track the output size of a page (or the data transferred via an AJAX call) to the client and log it against a particular user's session?
Also how would this differ when GZip is used in IIS 6.0?
You could develop an HttpModule to do this, as in this question. If you used an HttpModule, I would imagine compression is applied after it in IIS 6.0.
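The usual shape of such a module is to wrap Response.Filter in a stream that counts bytes as they are written. A sketch, with illustrative names (real code would also need to register the module in web.config and decide where to log):

```csharp
using System;
using System.IO;
using System.Web;

// Counts every byte written to the response; note this measures the
// pre-compression size when IIS applies gzip afterwards.
public class BandwidthModule : IHttpModule
{
    public void Init(HttpApplication app)
    {
        app.BeginRequest += (s, e) =>
            app.Context.Response.Filter = new CountingStream(app.Context.Response.Filter);

        app.EndRequest += (s, e) =>
        {
            var counter = app.Context.Response.Filter as CountingStream;
            if (counter != null)
            {
                // Log counter.BytesWritten against the current session/user here.
            }
        };
    }

    public void Dispose() { }
}

class CountingStream : Stream
{
    private readonly Stream _inner;
    public long BytesWritten { get; private set; }

    public CountingStream(Stream inner) { _inner = inner; }

    public override void Write(byte[] buffer, int offset, int count)
    {
        BytesWritten += count;
        _inner.Write(buffer, offset, count);
    }

    // Remaining members delegate straight to the wrapped stream.
    public override bool CanRead { get { return _inner.CanRead; } }
    public override bool CanSeek { get { return _inner.CanSeek; } }
    public override bool CanWrite { get { return _inner.CanWrite; } }
    public override long Length { get { return _inner.Length; } }
    public override long Position { get { return _inner.Position; } set { _inner.Position = value; } }
    public override void Flush() { _inner.Flush(); }
    public override int Read(byte[] buffer, int offset, int count) { return _inner.Read(buffer, offset, count); }
    public override long Seek(long offset, SeekOrigin origin) { return _inner.Seek(offset, origin); }
    public override void SetLength(long value) { _inner.SetLength(value); }
}
```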
One more way, without writing any special code: just read this property, Response.Filter.Length.
Ref: Bandwidth Monitoring in asp.net
I've tried to set up compression (both dynamic and static) in IIS7 for my local system, but when I start my ASP.NET site using the debugger, YSlow tells me that all of the files (aspx, js, css, etc.) are not compressed. Any ideas? I really want to test this before I make changes to the production server.
Are you using Cassini as your server? If so it does not support compression that I know of.
You might try using Fiddler to see what headers are being sent back and forth (Accept-Encoding on the request, Content-Encoding on the response).
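If you are running on full IIS7 rather than Cassini, compression is controlled by the urlCompression element. A minimal web.config sketch, assuming the Dynamic Content Compression module is installed on the server:

```xml
<!-- Goes in the site's web.config. doDynamicCompression has no effect
     unless the Dynamic Content Compression feature is installed in IIS7. -->
<configuration>
  <system.webServer>
    <urlCompression doStaticCompression="true" doDynamicCompression="true" />
  </system.webServer>
</configuration>
```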
The only thing I was able to find on the subject was a posting from 1997
(http://insecure.org/sploits/microsoft.asp.iis.html), so I was hoping someone on here might have more recent knowledge on this topic:
Does anyone know if there are any known vulnerabilities in IIS6 that would allow a user to view an unprocessed ASP or ASPX page, outside of gaining control of the server?
IIS will serve raw .asp or .aspx only if those extensions are removed from the application mappings for the site, or if you've done some other dumb thing to configure it that way.
Why would you want unprocessed ASP pages? You could just have a link that escapes the page and puts it into a web page for the user.
To me it would be a potential security risk: if you forgot about it and left a security vulnerability in the code, it would be visible to anyone.
If you didn't have your script mappings set up properly, this could be an issue, but that's more of a deploy-time concern, not a run-time concern.
I think any other vulnerabilities in this area would be app-related (picking a file to download server side...), not so much platform related.
Are you concerned about people being able to see your source code? If so, I wouldn't worry too much about it, especially with .NET, code-behind files, and a properly architected n-tier site.
Really, the only time this is a concern is if you have an error on your page and spit out debugging information, even with classic ASP.
I am working on a web application with many images, using ASP.NET MVC. I want to be able to cache the images in memory to improve the performance, but I would like to hear what is the best way to do this.
1) The images are accessible from a URL, like http://www.site.com/album/1.jpg. How are the images stored in memory? Will they be in the form of a memory stream?
2) How do I access the image from memory and send it to the web page? Right now the pages use the image URL to embed the image directly in an img tag.
Thanks!
Won't the web server and downstream caches be handling this for static resources anyway? I'm not sure there are many performance gains to be had, but knowing nothing of the app or setup, I could be wrong.
To implement it, I'd set up a page that takes an image filename and serves it either from disk or from the ASP.NET in-memory cache.
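In ASP.NET MVC that page would be a controller action; output caching then keeps hot images in memory for you. A sketch with assumed names and paths (the controller, route, and folder are illustrative):

```csharp
using System.Web.Mvc;
using System.Web.UI;

// Hypothetical controller: serves /album/image/1 from disk, with the
// output cache holding recently requested images in server memory.
public class AlbumController : Controller
{
    [OutputCache(Duration = 3600, VaryByParam = "id",
                 Location = OutputCacheLocation.ServerAndClient)]
    public ActionResult Image(int id)
    {
        string path = Server.MapPath("~/App_Data/album/" + id + ".jpg");
        if (!System.IO.File.Exists(path))
            return HttpNotFound();

        // FileResult streams the file; the cache attribute above decides
        // whether this body executes at all on repeat requests.
        return File(path, "image/jpeg");
    }
}
```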
If the images are just static files on disk, then Beepcake is right that IIS will already be caching frequently used images and serving them from memory. Using separate caching servers shouldn't be any quicker than IIS serving an image from memory; it's more about scalability. Once you have a large server farm, it means you have a group of servers just dealing with your code and a group of servers just dealing with static images. Also, if you have too much content for one server to cache it all, you can route requests so that each of your ten servers caches a different 10% of your content. This should be much better than every server caching the same most-used 10% of the content.
Thanks for the response. I think I was thinking the wrong direction. I just found out Flickr is using Squid to cache images.
If you want really good performance, I'd suggest Amazon CloudFront. Edge caching will give you better performance than memory caching, and CloudFront runs nginx, which is significantly better than IIS at static files (among other things).
Setting up edge caching is very easy: you log in and get a domain to use instead of your own for image URLs.