Enable dynamic compression in app within Gbps LAN? - asp.net

I have a LAN with 1,000 clients and 1 Gbps links.
One application is hosted in IIS 7.5.
Fact: a one-megabyte response is transferred between the server and a client in no more than 30 milliseconds. The connection is very fast.
Fact: some clients have older PCs (Windows XP, IE7, Pentium 4).
I think that dynamic compression is not needed in this case, because the problem is not the bandwidth but the clients' computer performance.
Do you recommend disabling compression?
My pages have a lot of JavaScript. On every postback I refresh the page with JavaScript, Ajax and JSON. In some cases, when the HTML is very big, the browser gets a little unresponsive. I think that compression is causing this problem.
Any comments?

A useful scenario for compression is when you have to pay for bandwidth and want to speed up the download of large pages, but it creates a bit of work for the client, which has to decompress the data before rendering it.
Turn it off.
You don't need it for serving pages over a high-speed LAN.
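If you do disable it, a minimal sketch of the relevant web.config section for IIS 7.5 would be something like the following (standard <system.webServer> elements; depending on how your server is locked down, the same setting may need to go in applicationHost.config instead):

<configuration>
  <system.webServer>
    <!-- Turn off on-the-fly compression of dynamic (ASP.NET) responses. -->
    <!-- Static compression is left on: IIS caches compressed static files on disk, so it costs little. -->
    <urlCompression doStaticCompression="true" doDynamicCompression="false" />
  </system.webServer>
</configuration>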

I definitely don't think you need compression. But you are shooting in the dark here -- get yourself an HTTP debugger, such as the one built into Google Chrome, and see which parts of the pages are slow.

Related

Is CDN helping server in terms of performance and RAM?

I'm planning to move my website files to a CDN; I'm running 4 Drupal websites and 1 WordPress site, and I was thinking of using Amazon CloudFront.
I have some questions:
Will the CDN help my server in terms of performance and RAM?
I'm using http://www.webpagetest.org to check the performance of the website; 83% of the requests come from images, and the rest is split between HTML, CSS, JS and other. These are the other results:
F = First Byte Time
A = Keep-alive Enabled
F = Compress Text
C = Compress Images
A = Cache static content
X = CDN detected
Is it possible, using Amazon CloudFront, to serve only a website that lives inside a sub-folder?
Basically I want to test it in a non-production site.
My server is an R310 quad-core Xeon 2.66 GHz with 4 GB of RAM.
Thanks in advance
A full answer would be much longer, I think, but in simple terms a well-managed CDN can help make your site faster.
4 GB of RAM is not bad for a normal web site.
There are 3 main reasons to use a CDN that I can think of:
1. To deliver static content faster using nearby servers.
2. To stop the browser from sending cookies with each GET request.
3. To take some of the load off Apache.
1 - I haven't used CloudFront, but I have used some Akamai servers and they do make a difference. They simply serve content from a different, nearby server, so file loading is relatively fast. But don't forget that this adds extra DNS lookups if the user is loading the site for the first time after a DNS cache clean-up.
2 - I think you know about the cookie-less domain problem. If you host your site at example.com and images live at URLs like example.com/image.png, the browser has to send the cookie data with every request. Cookies are usually ~100 bytes, but with many assets this is worth considering. If you move the assets to a domain like example-data.com, the browser will not send cookies with requests to those assets. Faster pages.
3 - Reduced web server load is the other benefit. Your server will get fewer requests (mainly HTML requests), while images and other assets are served from the other server.

Does an increase in the number of requests to the server make the website slow?

On my office website, a webpage has 3 CSS files, 2 JavaScript files, 11 images and 1 page request, for a total of 17 requests to the server. If 10,000 people visit my office site ...
Could this slow the website down because of the extra requests?
And will the server have any issues because of the heavy traffic?
My tiny office server has:
Intel i3 processor
Nvidia 2 GB graphics card
Windows Server 2008
8 GB DDR3 RAM and
500 GB hard disk.
The website is developed in ASP.NET.
The net connection is 10 Mbps download and 2 Mbps upload, using a static IP address.
There are many reasons a website may be slow.
A huge spike in traffic.
Extremely large or non-optimized graphics.
A large number of external calls.
Server issues.
All websites should have optimized images, Flash files, and videos. Large media files slow down the overall loading of each page. Optimize each image; PNG images can offer better-looking results at a smaller file size. You could also run a traceroute to your site.
Hope this helps.
This question is impossible to answer because there are so many variables. It sounds like you're hypothesising that you will have 10,000 simultaneous users; do you really expect there to be that many?
The only way to find out if your server and site hold up under that kind of load is to profile it.
There is a tool called Apache Bench (http://httpd.apache.org/docs/2.0/programs/ab.html) which you can run from the command line to simulate a number of requests to your server and benchmark it. The tool comes with an install of Apache; you can then simulate 10,000 requests to your server and see how the response time holds up. At the same time you can run Performance Monitor in Windows to diagnose any bottlenecks.
Example usage, taken from Wikipedia:
ab -n 100 -c 10 http://www.yahoo.com/
This will execute 100 HTTP GET requests, processing up to 10 requests
concurrently, to the specified URL, in this example,
"http://www.yahoo.com".
I don't think it downloads your page dependencies (JS, CSS, images), but there are probably other tools you can use to simulate that.
I'd also recommend that you enable compression on your site and set up caching, as this will significantly reduce the load and the number of requests for very little effort.
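As a rough sketch of both of those on IIS 7 / ASP.NET (assuming the defaults haven't been locked down, and picking an arbitrary 7-day lifetime for static files), the web.config entries would look something like:

<configuration>
  <system.webServer>
    <!-- Compress static files and dynamic (ASP.NET) responses. -->
    <urlCompression doStaticCompression="true" doDynamicCompression="true" />
    <!-- Let browsers cache static content (CSS/JS/images) for 7 days. -->
    <staticContent>
      <clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="7.00:00:00" />
    </staticContent>
  </system.webServer>
</configuration>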
Rather than hardware, you should think about your server's upload capacity. If your upload bandwidth is low, of course it would be a problem.
The most likely reason is that one session locks all the other requests.
If you don't use session state, turn it off and check again.
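If you do need session state on most pages but only read from it, a sketch of the usual ASP.NET workaround is to mark pages read-only so they take a shared lock instead of the exclusive one (standard values for the <pages> element; you can also override this per page with the EnableSessionState page directive):

<configuration>
  <system.web>
    <!-- "ReadOnly" avoids the exclusive per-session lock, so concurrent requests -->
    <!-- from the same session are no longer serialized. Use "false" to switch it off entirely. -->
    <pages enableSessionState="ReadOnly" />
  </system.web>
</configuration>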
Related:
Replacing ASP.Net's session entirely
jQuery Ajax calls to web service seem to be synchronous

HTTP requests / concurrency?

Say a website on my localhost takes about 3 seconds to do each request. This is fine, and as expected (as it is doing some fancy networking behind the scenes).
However, if I open the same URL in several tabs (in Firefox) and then reload them all at the same time, each page appears to load sequentially rather than in parallel. What is this all about?
I have tried it on Windows Server 2008 IIS and Windows 7 IIS.
It really depends on the web browser you are using and how tab support in it has been programmed.
It is probably using a single thread to load each tab in turn, which would explain your observation.
Edit:
As others have mentioned, it is also a very real possibility that the web server running on your localhost is single-threaded.
If I remember correctly, the HTTP/1.1 spec originally suggested limiting the number of concurrent connections to the same host to 2, although modern browsers allow more. This is one reason high-traffic websites use CDNs (content delivery networks).
network.http.max-connections 60
network.http.max-connections-per-server 30
The above two values determine how many connections Firefox will make to a server. If the threshold is reached, further requests are queued until a connection frees up.
Each browser implements this in its own way, and requests are made in a way that tries to maximize performance. It also depends on the server (your localhost, which is slower).
Your local web server configuration might have only one worker thread, so each new request has to wait for the previous one to finish.

Will HTTP Compression (GZip or deflate) on a low traffic site actually be beneficial?

I have a web application where the client will be running off a local server (i.e. requests will not be going out over the net). The site will have quite low traffic, so I am trying to figure out whether the actual decompression is expensive in this kind of system. Performance is an issue, so I will have caching set up, but I was considering compression as well. I will not have bandwidth issues as the site is very low traffic. So, I am just trying to figure out if compression will do more harm than good in this type of system.
Here's a good article on the subject.
On pretty much any modern system with a solid web stack, compression will not be expensive, but it seems to me that you won't be gaining any positive effects from it whatsoever, no matter how minor the overhead. I wouldn't bother.
When you measured the performance, how did the numbers compare? Was it faster when you had compression enabled, or not?
I have used compression where users were running over a wireless 3G network at various remote locations. Compression made a significant difference to the bandwidth usage in that case.
For users running locally, and with bandwidth not an issue, I don't think it is worth it.
For cacheable resources (.js, .html, .css files), I don't think it makes much difference once the browser has cached them.
But for non-cacheable resources (e.g. JSON responses) I think it makes sense.
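If the server happens to be IIS 7.x, the way to get that is to keep dynamic compression enabled and make sure the JSON content type is in the dynamic compression list. A hedged sketch of the applicationHost.config entries (on IIS 7.x the <httpCompression> section normally can't be overridden in a site's web.config, and new entries must come before the catch-all */* entry):

<httpCompression>
  <dynamicTypes>
    <!-- Compress JSON/Ajax responses in addition to the default text types. -->
    <add mimeType="application/json" enabled="true" />
    <add mimeType="application/json; charset=utf-8" enabled="true" />
  </dynamicTypes>
</httpCompression>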

Issues with HTTP Compression?

We are investigating the use of HTTP compression on an application being served up by JBoss. After making the setting change in the Tomcat SAR, we are seeing compression of about 80%. This is obviously great; however, I want to be cautious... before implementing this system-wide, has anyone out there encountered issues using HTTP compression?
A couple of points to note for my situation:
We have full control over the browser, so the whole company uses IE6/7
The app is internal only
During load testing, our app server was under relatively small load - the DB was our bottleneck
We have control over client machines and they all get a spec check (decent processor/2GB RAM)
Any experiences with this would be much appreciated!
Compression is not considered exotic or bleeding edge and (fwiw) I haven't heard of or run into any issues with it.
Compression on the fly can increase CPU load on the server. If at all possible, pre-compressing static resources and caching compressed dynamic responses can combat that.
It's just a really good idea all the way around. It will add a slight CPU load to your server, but that's usually not your bottleneck. It will make your pages load faster, and you'll use less bandwidth.
As long as you respect the client's Accept-Encoding header properly (i.e. don't serve compressed files to clients that can't decompress them), you shouldn't have a problem.
Oh, and remember that deflate is faster than gzip.
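For what it's worth, in a Tomcat/JBoss setup the HTTP connector does that Accept-Encoding negotiation for you once compression is switched on. A hedged server.xml sketch (attribute names as used by Tomcat 5.5/6-era connectors; the port and MIME list are just examples):

<!-- Responses are only compressed when the client's Accept-Encoding allows it, -->
<!-- and only above compressionMinSize bytes, which keeps the CPU cost down for tiny responses. -->
<Connector port="8080" protocol="HTTP/1.1"
           compression="on"
           compressionMinSize="2048"
           compressableMimeType="text/html,text/xml,text/css,text/javascript" />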
