.Net Web Application Performance - asp.net

I have a .NET web application that uses Bootstrap. At our office we have very high-end machines with great wireless network cards, but for people on slower PCs with cheaper network cards the site is incredibly slow, even on a great internet connection. For instance, I am writing this on my home machine: an HP with an i7-3630QM CPU at 2.4 GHz, 8 GB of RAM, and a Ralink RT5390R 802.11 Wi-Fi card, on a 30 Mbps internet connection. Now, I realize the machine and the network card are cheap, but this is the type of system most people buy. Long story short, the site is at least twice as slow on these machines, most of the time four times slower; elements sometimes never load at all, and my users have to refresh the page a few times to get things to work.
We also have New Relic to measure server speed, and it has never maxed out on the Microsoft Azure platform. In fact, our slowest page has only about a 3.5-second load time.
We are working on the system to implement more AJAX, DOM caching, and so on, but nothing so far has made a big impact.
So I guess my questions are: why would things be so dramatically slower, and sometimes not work at all, on machines with cheap network cards? Is there any software that could help us track down speed issues on the client side? And can anyone recommend a service that specializes in optimizing .NET code for speed?
Thank you in advance.

If you have confirmed it's fine on the server end, then you should examine the client side.
You can use the built-in developer tools (Ctrl+Shift+I in Chrome) to see what is taking the most time to load. From there you can view the page-load timeline to determine whether the slowness is in the network or in rendering.

Related

Does an increase in the number of requests to the server make a website slow?

On my office website, a page pulls in 3 CSS files, 2 JavaScript files, and 11 images on top of the page request itself: 17 requests to the server in total. If 10,000 people visit the site, could this many requests slow the website down? And would the heavy traffic cause any issues for the server?
My small office server has:
Intel i3 processor
NVIDIA 2 GB graphics card
Windows Server 2008
8 GB DDR3 RAM
500 GB hard disk
The website is developed in ASP.NET. The connection is 10 Mbps download and 2 Mbps upload, using a static IP address.
There are many reasons a website may be slow:
A huge spike in additional traffic.
Extremely large or non-optimized graphics.
A large number of external calls.
Server issues.
All websites should have optimized images, Flash files, and videos; large media files slow down the overall loading of each page. Optimize each image: PNG images offer weighted optimization that can give better-looking images at a smaller file size. You could also run a traceroute to your site.
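If you want to automate that step, here is a minimal C# sketch (file names are placeholders) that re-encodes an image at a lower JPEG quality with System.Drawing; it is an illustration of the idea, not a full optimization pipeline:
using System.Drawing;
using System.Drawing.Imaging;
using System.Linq;

// Minimal sketch: re-save an image at a lower JPEG quality to cut its size.
// The file names are placeholders.
class ImageShrink
{
    static void Main()
    {
        using (var image = Image.FromFile("photo-original.jpg"))
        {
            // Find the JPEG encoder and ask it for ~60% quality.
            ImageCodecInfo jpegCodec = ImageCodecInfo.GetImageEncoders()
                .First(c => c.MimeType == "image/jpeg");
            var options = new EncoderParameters(1);
            options.Param[0] = new EncoderParameter(Encoder.Quality, 60L);
            image.Save("photo-optimized.jpg", jpegCodec, options);
        }
    }
}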
Hope this helps.
This question is impossible to answer because there are so many variables. It sounds like you're hypothesising that you will have 10,000 simultaneous users; do you really expect that many?
The only way to find out if your server and site hold up under that kind of load is to profile it.
There is a tool called Apache Bench (http://httpd.apache.org/docs/2.0/programs/ab.html) which you can run from the command line to simulate a number of requests to your server and benchmark it. The tool comes with an install of Apache. With it you can simulate 10,000 requests to your server and see how the response time holds up. At the same time, you can run Performance Monitor in Windows to diagnose any bottlenecks.
Example usage, taken from Wikipedia:
ab -n 100 -c 10 http://www.yahoo.com/
This will execute 100 HTTP GET requests, processing up to 10 requests concurrently, to the specified URL, in this example "http://www.yahoo.com".
I don't think it downloads your page dependencies (JS, CSS, images), but there are probably other tools you can use to simulate that.
I'd also recommend enabling compression on your site and setting up caching, as this will significantly reduce the load and the number of requests for very little effort.
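On IIS 7 you can usually turn this on declaratively (urlCompression and clientCache in web.config), but as an illustration of what dynamic compression actually does, here is a minimal classic ASP.NET HttpModule sketch:
using System;
using System.IO.Compression;
using System.Web;

// Minimal sketch: gzip dynamic responses when the client advertises support.
public class GzipCompressionModule : IHttpModule
{
    public void Init(HttpApplication app)
    {
        app.BeginRequest += (sender, e) =>
        {
            HttpContext context = ((HttpApplication)sender).Context;
            string acceptEncoding = context.Request.Headers["Accept-Encoding"] ?? "";
            if (acceptEncoding.Contains("gzip"))
            {
                // Wrap the output stream so everything written out is compressed.
                context.Response.Filter =
                    new GZipStream(context.Response.Filter, CompressionMode.Compress);
                context.Response.AppendHeader("Content-Encoding", "gzip");
                // Tell proxies to cache compressed and uncompressed copies separately.
                context.Response.AppendHeader("Vary", "Accept-Encoding");
            }
        };
    }

    public void Dispose() { }
}
Register the module in web.config and pair it with cache headers on your static content.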
Rather than hardware, you should think about your server's upload capacity. If your upload bandwidth is low, of course it would be a problem.
The most likely reason is that one request's session lock is blocking all the rest: ASP.NET serializes concurrent requests from the same user while one of them holds a writable session. If you are not using session state, turn it off and check again; see the sketch after the links below.
Related:
Replacing ASP.Net's session entirely
jQuery Ajax calls to web service seem to be synchronous
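As an illustration, here is a minimal ASP.NET MVC 3+ sketch (the controller name is made up); a WebForms page gets the same effect with EnableSessionState="ReadOnly" in its @ Page directive:
using System.Web.Mvc;
using System.Web.SessionState;

// ASP.NET serializes requests from the same user while one of them holds a
// writable session lock. Marking the session read-only (or disabling it)
// lets those requests run in parallel.
[SessionState(SessionStateBehavior.ReadOnly)]
public class ReportsController : Controller // hypothetical controller
{
    public ActionResult Index()
    {
        // Reading the session is still allowed here; writing is not.
        string userName = Session["UserName"] as string;
        return View((object)userName);
    }
}

// WebForms equivalent, in the page directive:
// <%@ Page EnableSessionState="ReadOnly" %>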

How to avoid crashing my user's router?

It appears that cheap consumer routers are fairly easy to crash: hanging around in various backup/sync software forums, I see this mentioned from time to time. Developers seem to be putting a fair amount of effort into making sure they don't crash the routers.
What are the "do"s and "don't"s for my network-heavy application to ensure that it doesn't cause issues with badly designed routers? Especially one that intends to connect to a number of peers?
IMO, trying to work around bad hardware is a road to nowhere, because every router fails in its own remarkable way :).
What you can do in a network-heavy application is assume that the network is not a stable medium (routers can crash, etc.) and design the application's network operations accordingly.
For instance, provide reconnect logic, connection timeouts, and some sort of state caching so users can keep working with the app even when network connectivity is gone.
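A minimal retry-with-timeout sketch of that idea (the attempt count, timeout, and back-off values are illustrative, not a prescription):
using System;
using System.IO;
using System.Net;
using System.Threading;

// Minimal sketch: treat the network as unreliable by failing fast on each
// attempt and reconnecting with a short back-off.
public static class ResilientHttp
{
    public static string GetWithRetry(string url, int maxAttempts)
    {
        for (int attempt = 1; ; attempt++)
        {
            try
            {
                var request = (HttpWebRequest)WebRequest.Create(url);
                request.Timeout = 5000; // fail fast instead of hanging on a dead router

                using (var response = request.GetResponse())
                using (var reader = new StreamReader(response.GetResponseStream()))
                {
                    return reader.ReadToEnd();
                }
            }
            catch (WebException)
            {
                if (attempt >= maxAttempts)
                    throw;
                // Give a crashed router a moment to come back before reconnecting.
                Thread.Sleep(TimeSpan.FromSeconds(2 * attempt));
            }
        }
    }
}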
Concerning faulty routers: they usually crash under a large number of simultaneous connections (e.g. downloading via BitTorrent or another P2P protocol), so keeping the number of open connections to a minimum can help.
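A minimal sketch of capping simultaneous connections with a semaphore (the limit of 8 is an illustrative guess; for plain HTTP, ServicePointManager.DefaultConnectionLimit serves a similar purpose):
using System;
using System.Threading;

// Minimal sketch: bound the number of simultaneous connections so a cheap
// router's NAT table is not flooded.
public class ConnectionThrottle
{
    private readonly SemaphoreSlim _slots = new SemaphoreSlim(8); // illustrative cap

    public void Run(Action connectAndTransfer)
    {
        _slots.Wait();                  // block until a connection slot frees up
        try { connectAndTransfer(); }
        finally { _slots.Release(); }
    }
}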

IIS 7 - Does the number of HTTP connections matter?

I'm optimizing a very popular website and since the user base is constantly growing I'm interested in what matters when it comes to scaling.
Currently I am scaling by adding more CPU power / RAM to the server. This works nicely: even though the site is quite popular, CPU usage is currently only at 10%.
So, if possible, I'd keep doing that. What I am worried about is whether I could get to the point where CPU usage is low but users have problems connecting because of the number of HTTP connections. Is it better to scale horizontally, by adding more servers to the cluster?
Thanks!
Eventually, just adding more memory won't be enough. There are concurrent connection limits at the TCP level as well as in IIS (both factors come into play; IIS can handle about 3,000 connections without strain).
You probably won't hit the situation you describe, where CPU usage is low but the number of HTTP connections is high, unless the site is largely static; in general, the more connections are open, the higher the CPU usage.
But regardless of this, what a popular site needs is redundancy, which is essential for a large user base. Nothing annoys users more than the site being down because your sole server went offline for some reason. With two servers behind a load balancer, you can grow the site, and even take a server offline, with much less fear of the whole site going down.
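If you want to watch for that crossover yourself, here is a minimal sketch that samples the IIS connection count alongside CPU usage via performance counters (the "Web Service" category only exists on machines where IIS is installed):
using System;
using System.Diagnostics;
using System.Threading;

// Minimal sketch: print the live HTTP connection count and CPU usage once a
// second so you can see which one climbs first under load.
class ConnectionWatch
{
    static void Main()
    {
        var connections = new PerformanceCounter("Web Service", "Current Connections", "_Total");
        var cpu = new PerformanceCounter("Processor", "% Processor Time", "_Total");

        while (true)
        {
            Console.WriteLine("Connections: {0,6:F0}   CPU: {1,5:F1}%",
                connections.NextValue(), cpu.NextValue());
            Thread.Sleep(1000);
        }
    }
}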

What tools does your company use to manage application performance of asp.net applications?

I am not talking about application profilers or debuggers, but more specifically about managing applications in a production environment: essentially monitoring, identifying bottlenecks, and deploying fixes.
For monitoring that the application is up and running, we use Nagios.
We also use good old Performance Monitor to watch database connections, memory consumption, and CPU usage.
We use IPMonitor to verify uptime; it has a lot of options for pinging the site for keyword validation, HTTP response validation, and response time. You can also use SNMP to check the responsiveness of the processor and RAM, and the remaining space on hard disks, among many other options. It supports multiple servers and types of servers, not just web or database.
Additionally, we test basic uptime and response speed with AlertSite.
A 3rd party, Keynote, tests our sites to verify that they are navigable like a human would browse. They have scripts to mimic clicks and interactions.
We use Spotlight for SQL Server management, and also good old perfmon for granular troubleshooting.
We recently purchased WildMetrix to monitor and troubleshoot performance issues in our ASP.NET applications. It's nice because you can easily aggregate IIS, ASP.NET, and SQL Server information into a single graph or dashboard that lets you pinpoint possible trouble spots. We currently use it as our primary performance reporting and tracking tool, along with ELMAH for exception tracking.

ASP.NET AJAX application extremely slow over VPN

I have an ASP.NET AJAX intranet application that has been running for a few months. It runs reasonably fast on the LAN.
However, over a VPN it slows down dramatically; even taking line speed into account, it takes around 60 seconds to change a page. I eventually got a VMware VM up and running to test the VPN speed, and although the connection is super fast, it still takes the same amount of time. I can even use Remote Desktop over the VPN to the VM and it's perfect.
This makes me think it has nothing to do with line speed. FYI, I am using Hamachi, but I have tried other VPN software and it gives the same results.
I am really stuck... any help would be much appreciated!
I found the problem was due to a DNS lookup in the code. I changed it to use just the IP address and it's now working!
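For anyone hitting the same thing, a minimal sketch for confirming that DNS is the culprit (the host name is a placeholder):
using System;
using System.Diagnostics;
using System.Net;

// Minimal sketch: time the lookup the application performs. If this takes
// seconds over the VPN, reference the server by IP address (or a hosts-file
// entry) instead of relying on DNS.
class DnsCheck
{
    static void Main()
    {
        const string host = "intranet-app"; // placeholder host name
        var timer = Stopwatch.StartNew();
        IPHostEntry entry = Dns.GetHostEntry(host);
        timer.Stop();
        Console.WriteLine("Resolved {0} to {1} in {2} ms",
            host, entry.AddressList[0], timer.ElapsedMilliseconds);
    }
}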
Try adding system counters for "Bytes per second" and other similar parameters of your network. This will help you figure out where your system's bottleneck is.
