Building an ASP.NET web page to test the user's connection (bandwidth)

We need to add a feature to our website that allows the user to test their connection speed (upload/download). After some research I found that the usual way to do this is to download/upload a file and divide the file size by the time the transfer took, as here: http://www.codeproject.com/KB/IP/speedtest.aspx
However, this is not possible in my case. This has to be a web application, so I can't download/upload files to/from the user's machine for obvious security reasons. From what I have read, the download/upload tasks should run on the server, but how?
This website has exactly what I have in mind, but in PHP: http://www.brandonchecketts.com/open-source-speedtest. Unfortunately, I don't have any PHP background, and this task is really urgent.
I really don't need to use something like Speedtest.
Thanks

The formula for the download speed in bit/s:
payload size in bytes * 8 / (download time in seconds - latency to the server in seconds).
For example, a 1,000,000-byte payload that arrives 2.1 seconds after the request, over a link with 0.1 seconds of latency, gives 1,000,000 * 8 / 2 = 4 Mbit/s.
Download:
Host a blob of data (an HTML document, an image, whatever) of known size on your server, make an AJAX GET request to fetch it, and measure the time between the start of the download and its completion.
This can be done with JavaScript alone, but I would recommend including a timestamp of when the payload was generated so that you can estimate the latency.
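A minimal sketch of the server side, assuming an ASP.NET MVC controller (the action name and payload size are illustrative); the timing itself would be a few lines of JavaScript around the AJAX GET:

    using System;
    using System.Web;
    using System.Web.Mvc;

    public class SpeedTestController : Controller
    {
        private const int PayloadBytes = 1024 * 1024; // 1 MB of junk; the client knows this size

        [HttpGet]
        public ActionResult Payload()
        {
            // Generation timestamp so the client can estimate latency, per the note above
            Response.AddHeader("X-Generated-At", DateTime.UtcNow.ToString("o"));
            Response.Cache.SetCacheability(HttpCacheability.NoCache); // caches would fake the result

            var junk = new byte[PayloadBytes];
            new Random().NextBytes(junk); // random bytes, so transparent gzip can't shrink the payload
            return File(junk, "application/octet-stream");
        }
    }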
Upload:
Make an AJAX POST request to the server, filled with junk data of a known size, and measure how long the POST takes.
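A matching sketch for the upload direction, again with illustrative names: an action that swallows the posted junk and reports how many bytes arrived, while the client measures the elapsed time.

    using System.Web.Mvc;

    public class SpeedTestUploadController : Controller
    {
        [HttpPost]
        public ActionResult Sink()
        {
            var buffer = new byte[64 * 1024];
            long total = 0;
            int read;
            // Read and discard the junk payload; the client does the timing
            while ((read = Request.InputStream.Read(buffer, 0, buffer.Length)) > 0)
                total += read;

            return Json(new { received = total });
        }
    }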
List of things to keep in mind:
You will not be able to measure a higher bandwidth than your server's.
Concurrent users will drain the bandwidth; build in a limit if this becomes a problem.
It is hard to build a service that measures bandwidth with precision; the one you linked to reports 5 Mbit/s for me, while my actual speed is around 30 Mbit/s according to Bredbandskollen.
Other links on the subject
What's a good way to profile the connection speed of Web users?
Simple bandwidth / latency test to estimate a users experience
Note: I have not built any service like this, but it should do the trick.

Related

Peaks in Load of Firebase RT Database cause my applications to slow down

I have a dashboard in which I use the Firebase Realtime Database. I also have a backend that publishes output to the front end, and it runs in batches. Every time there is a batch, I notice that the load peaks to almost 100% (most of it is writing, but there is also some considerable loading). This is causing my front-end dashboard to slow down.
My question is what I could do to avoid this issue. Is there a way to scale the load up, so that it's less likely to approach 100%? What is the Firebase-recommended way to handle this?
This type of spiky load is indeed commonly caused by backend processes that run in batches.
The Firebase backend storage layer runs pretty much as a single-threaded process for each database, handling each (read or write) request from clients in turn. So while it is processing one request, all other requests await their turn.
This means that a particularly large read or write request keeps other clients from getting their requests served. For this reason you'll want to take care to divide the interactions with the database (especially from the backend process) into small operations, so as not to interfere with the other clients.
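A minimal sketch of what that chunking could look like, using the Realtime Database REST API over plain HTTP (the URL, the lack of auth, and the data shape are all hypothetical, and a real implementation would use a proper JSON serializer):

    using System.Collections.Generic;
    using System.Linq;
    using System.Net.Http;
    using System.Text;
    using System.Threading.Tasks;

    public static class BatchPublisher
    {
        private const int ChunkSize = 50; // keep each write small

        public static async Task PublishAsync(Dictionary<string, string> results)
        {
            using (var client = new HttpClient())
            {
                foreach (var chunk in results.Select((kv, i) => new { kv, i })
                                             .GroupBy(x => x.i / ChunkSize))
                {
                    // Build a small multi-path update, e.g. {"item1":"a","item2":"b"}
                    // (no escaping here; use a JSON library for real data)
                    var body = "{" + string.Join(",", chunk.Select(
                        x => "\"" + x.kv.Key + "\":\"" + x.kv.Value + "\"")) + "}";

                    // PATCH merges children under /output instead of replacing the node
                    var request = new HttpRequestMessage(new HttpMethod("PATCH"),
                        "https://your-db.firebaseio.com/output.json")
                    {
                        Content = new StringContent(body, Encoding.UTF8, "application/json")
                    };
                    await client.SendAsync(request);

                    await Task.Delay(100); // breathing room so client reads can interleave
                }
            }
        }
    }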
If the backend process also needs to read a considerable part of the database for its work, consider whether you can have it read from a backup of the database instead of from the live database. Backups are made out of band, so they don't interfere with clients; if you can use a backup as the source of the data, it will significantly reduce the read load the backend process causes.

Handle ActionResults as cacheable, "static content" in ASP.NET MVC (4)

I have a couple of action methods that return content from the database that does not change very often (e.g. a polygon list of available ZIP areas, returned as JSON; it changes twice per year).
I know there is the [OutputCache(...)] attribute, but it has some disadvantages (long client-side caching is not good; if the server/IIS/process is restarted, the server-side cache is lost as well).
What I want is for MVC to store the result in the file system, calculate a hash, and return HTTP status code 304 if the hash hasn't changed, just like it does for images by default.
Does anybody know a solution for that?
I think it's a bad idea to try to cache data on the file system, because:
Reading your data from the file system is not going to be much faster than getting it from the database, even if you already have it in JSON format.
You are going to add a lot of logic to calculate and compare the hash, and to read the data from a file. That means new bugs and more complexity.
If I were you, I would keep it as simple as possible: store your data in the Application container. Yes, you will have to reload it every time the application starts, but that should not be a problem at all, since an application is not supposed to restart often. Also consider a distributed cache such as AppFabric if you have a web farm, so that the Application containers on different servers don't end up holding different data.
And one more important note: caching means really fast access, and you can't achieve that with file-system or database storage; it is memory storage you should consider.
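A minimal sketch of that in-memory approach, combined with the 304 handling from the question (the controller name and the LoadZipAreasFromDb() helper are hypothetical):

    using System;
    using System.Security.Cryptography;
    using System.Text;
    using System.Web;
    using System.Web.Mvc;

    public class ZipAreasController : Controller
    {
        public ActionResult Index()
        {
            // Keep the JSON in memory; it is simply rebuilt after an application restart
            var json = (string)HttpRuntime.Cache["zipAreas"];
            if (json == null)
            {
                json = LoadZipAreasFromDb(); // hypothetical: your DB query
                HttpRuntime.Cache.Insert("zipAreas", json);
            }

            // Derive an ETag from the cached content
            string etag;
            using (var md5 = MD5.Create())
                etag = "\"" + Convert.ToBase64String(
                    md5.ComputeHash(Encoding.UTF8.GetBytes(json))) + "\"";

            // Unchanged content: answer with 304 and no body
            if (Request.Headers["If-None-Match"] == etag)
                return new HttpStatusCodeResult(304);

            Response.Cache.SetCacheability(HttpCacheability.Private);
            Response.Cache.SetETag(etag);
            return Content(json, "application/json");
        }

        private string LoadZipAreasFromDb() { return "[]"; } // placeholder
    }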

ASP.NET website runs slow when the number of users increases

I need some information from you. I have set session.Timeout = 540 in the application. Does that affect my application's performance after some time? When the number of users increases, it gets very slow; the response time is nearly more than 2 minutes, even for a button click. This is hosted on a server in an application pool, and I don't know much about application pools. If the session timeout is the problem, I will remove it. Please suggest a way to support more users.
Job numbers, customer IDs, and tasks come from one database. When the user clicks the Start button, the data is saved in another database. I need this to be faster for more users.
I think you have one or more pages that do work that takes time, or that for some reason (or a bug) stay open longer than usual.
Such a page keeps the session locked and holds all other pages back from responding, because the session lock covers all of a user's pages.
Now, together with the increased timeout, this page locks everything, and that is where your response time of nearly 2 minutes comes from.
The solution is to locate the page with the long-running problem and fix it, or make it faster by optimizing the process; if the page really must run for a long time, disable the session for that one page.
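A minimal sketch of switching the session off for such a page; in Web Forms the page directive does it, and in MVC 3+ a controller-level attribute is the equivalent (the names here are illustrative):

    // Web Forms: mark the slow page read-only, or disable its session entirely:
    //   <%@ Page ... EnableSessionState="ReadOnly" %>

    // MVC 3+: the equivalent at controller level
    using System.Web.Mvc;
    using System.Web.SessionState;

    [SessionState(SessionStateBehavior.Disabled)] // requests here no longer queue on the session lock
    public class LongRunningController : Controller
    {
        public ActionResult DoWork()
        {
            // ... the slow operation; the user's other pages are no longer blocked
            return Content("done");
        }
    }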
Related:
Web app blocked while processing another web app on sharing same session
What perfmon counters are useful for identifying ASP.NET bottlenecks?
Replacing ASP.Net's session entirely
Trying to make Web Method Asynchronous
Does ASP.NET Web Forms prevent a double click submission?
About the server
On the other hand, if your server suffers from weak hardware or a bad setup, here is another answer with points to check to make it faster.
Find out where the time is spent
Add a Stopwatch to the method behind the button click that takes "more than 2 minutes", so you can find which statement spends the most time; a sketch follows below.
If it is a DB query that costs the time, check your SQL statement.
Are you running "SELECT * FROM ..." when you only need a few columns? Retrieving columns you don't need is always slower. (The difference between "SELECT Count(*)" and "SELECT Count(Id)" matters far less; most engines optimize Count(*).)
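A minimal sketch of the Stopwatch idea, assuming a Web Forms code-behind (the class and helper names are hypothetical):

    using System;
    using System.Diagnostics;
    using System.Web.UI;

    public partial class JobsPage : Page
    {
        protected void StartButton_Click(object sender, EventArgs e)
        {
            // Wrap each suspect step in a Stopwatch and trace the timing
            var sw = Stopwatch.StartNew();
            LoadJobsFromDatabase();                                       // hypothetical step 1
            Trace.Write("Timing", "LoadJobs: " + sw.ElapsedMilliseconds + " ms");

            sw.Restart();
            SaveResultsToOtherDatabase();                                 // hypothetical step 2
            Trace.Write("Timing", "SaveResults: " + sw.ElapsedMilliseconds + " ms");
        }

        private void LoadJobsFromDatabase() { /* ... your existing query ... */ }
        private void SaveResultsToOtherDatabase() { /* ... your existing save ... */ }
    }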
Use cache.
There are many ways to cache, both in ASPX pages and in your business layer.
OutputCache is the easiest way; a sketch follows below.
Also, cache a page (for example a blog post) the first time a user visits it.
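A minimal sketch of the OutputCache attribute on an MVC action (the duration and controller are illustrative; in Web Forms the <%@ OutputCache %> page directive is the equivalent):

    using System.Web.Mvc;

    public class PostsController : Controller
    {
        [OutputCache(Duration = 600, VaryByParam = "id")] // cache each rendered post for 10 minutes
        public ActionResult Details(int id)
        {
            var post = LoadPost(id); // hypothetical DB lookup
            return View(post);
        }

        private object LoadPost(int id) { return new { Id = id }; }
    }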
Did you use memory paging?
Be careful when paging a GridView or another list. If you just set DataSource = xxx and call DataBind(), even with a PagedDataSource, you are most likely paging in memory: the whole result set is fetched and the visible page is cut out afterwards, which costs a lot of performance. Do the paging in the database instead, e.g. with a stored procedure; a sketch follows below.
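A minimal sketch of database-side paging (the table and column names are hypothetical; OFFSET/FETCH needs SQL Server 2012+, and older versions can use ROW_NUMBER() inside a stored procedure instead):

    using System.Data;
    using System.Data.SqlClient;

    public static class JobRepository
    {
        public static DataTable GetJobsPage(string connectionString, int pageIndex, int pageSize)
        {
            const string sql = @"
                SELECT JobNumber, CustomerID, Task
                FROM   Jobs
                ORDER  BY JobNumber
                OFFSET @Offset ROWS FETCH NEXT @PageSize ROWS ONLY;"; // only one page leaves the DB

            using (var conn = new SqlConnection(connectionString))
            using (var cmd = new SqlCommand(sql, conn))
            {
                cmd.Parameters.AddWithValue("@Offset", pageIndex * pageSize);
                cmd.Parameters.AddWithValue("@PageSize", pageSize);

                var table = new DataTable();
                new SqlDataAdapter(cmd).Fill(table); // Fill opens and closes the connection itself
                return table;
            }
        }
    }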
Check your server environment
Where did you deploy the website? Many ISPs limit bandwidth, IIS connection count, and CPU time per account.
If you have Remote Desktop access to your server, you can watch CPU and memory usage to see whether they are high when many users come to your site. If the site is slow and neither CPU nor memory usage is high, it may be a network bandwidth problem.
Here are some simple steps to narrow down the issue:
1) Get HttpWatch (there's a free Basic edition) and check what is really taking time from an end-user perspective. Look at the number of requests, the number of resources downloaded, and the payload sizes. If there is nothing to worry about, move on.
2) If it's not the client, then it's usually processing time on the server. Jump onto the DB first, since it is the quickest to eliminate. Look at how many DB calls are made (run the profiler in staging or dev) and check for long-running queries, missing indexes or statistics, and note the IO. If all is well, move on.
3) Check your app code. You could use the profiler built into VS.NET or professional tools such as ANTS. If the code is fine, then it's your network or the external calls you make; check your network bandwidth. If you still cannot narrow it down, check your environment/hardware.
The best way to get to it is to apply load. You can use a simple tool such as ab.exe (it ships with the Apache web server) to fire concurrent hits at your server, e.g. ab -n 1000 -c 50 http://yourserver/page, while running the app and DB profilers in the background to get to the issue.
Hope this helps!

Guidelines for providing large downloads in IIS + ASP.NET (MVC)

We want to allow users to download large files from our ASP.NET MVC2 system.
We are providing the files through the Controller.File method, which streams from FileStream to Response.OutputStream.
The reason we use Controller.File instead of providing a direct link is that we need to verify security rules on the logged in (Forms authentication) user.
What would be the largest areas of concern when doing this?
Security: we'll probably need to increase executionTimeout. Does this expose security issues?
Memory: I assume that, since Controller.File is streaming the contents directly from disk, there are little memory implications.
CPU: I read on various blogs that serving large downloads is heavy on the CPU, but those were unconfirmed statements, and I did not find any recommendations from MS.
Network: how many concurrent downloads are possible? Can we throttle, so that other traffic is not hindered by this?
Other?
What would be your recommendations?
What would be other options besides going through the ASP.NET pipeline that still provide us with the data we need to validate the logged-in user? ISAPI is said to reduce CPU and memory usage; maybe there are other advantages there?
Are there any (official) guidelines or best practices available concerning this?
I would look at doing it asynchronously. Make sure buffering is switched off; that way your data is sent to the client as you write it, rather than ASP.NET waiting for you to finish. If you're already streaming, that's a good thing. I assume you mean you read x bytes from the FileStream and write those bytes to the output stream, repeating until EOF; a sketch follows below.
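A minimal sketch of that loop with buffering switched off, for an MVC 2 controller (the security check and path lookup are hypothetical placeholders):

    using System.IO;
    using System.Web.Mvc;

    public class DownloadController : Controller
    {
        public ActionResult GetFile(int id)
        {
            if (!UserMayDownload(id))              // hypothetical Forms-auth rule check
                return new HttpUnauthorizedResult();

            var path = ResolvePath(id);            // hypothetical path lookup
            Response.BufferOutput = false;         // stream to the client as we go
            Response.ContentType = "application/octet-stream";
            Response.AddHeader("Content-Disposition",
                "attachment; filename=\"" + Path.GetFileName(path) + "\"");

            var buffer = new byte[64 * 1024];      // 64 KB chunks keep memory flat
            using (var fs = System.IO.File.OpenRead(path))
            {
                int read;
                while ((read = fs.Read(buffer, 0, buffer.Length)) > 0 && Response.IsClientConnected)
                    Response.OutputStream.Write(buffer, 0, read);
            }
            return new EmptyResult();
        }

        private bool UserMayDownload(int id) { return true; }                      // placeholder
        private string ResolvePath(int id) { return @"C:\files\" + id + ".bin"; }  // placeholder
    }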
I'm not aware of any guidelines I can point you at. The above come from my own experience.

Multiple requests to server question

I have a DB with user account information.
I've scheduled a cron job which updates the DB with any new user data it fetches from their accounts.
I was thinking that this may cause a problem, since all requests come from the same IP address and the server may block requests from that IP.
Is this the case?
If so, how do I avoid being banned? Should I be using a proxy?
Thanks
You get banned for suspicious (or malicious) activity.
If you are running a normal business application inside a normal company intranet you are unlikely to get banned.
Since you have access to user accounts information, you already have a lot of access to the system. The best thing to do is to ask your systems administrator, since he/she defines what constitutes suspicious/malicious activity. The systems administrator might also want to help you ensure that your database is at least as secure as the original information.
should I be using a proxy?
A proxy might disguise what you are doing - but you are still doing it. So this isn't the most ethical way of solving the problem.
Is the cron job that fetches data from this "database" on the same server? Are you fetching data for a user from a remote server using screen scraping or something?
If this is the case, you may want to set up a few different cron jobs and do the work in batches. That way you reduce the load on the remote server and lower the chance that wherever you are getting this data from blocks your access.
Edit
Okay, so if you have not got permission to do the scraping, obviously you are going to want to do it responsibly (no matter the site). Try to gather as much data as you can in as few requests as possible, and spread them out over the course of the whole day, or even at times that are likely to be low-load. I wouldn't try to use a proxy; that wouldn't really help the remote server, and it would be a pain in the ass for you.
I'm no iPhone programmer, and this might not be possible, but you could try having the individual iPhones grab the data, so that all the source traffic isn't from the same IP. Just an idea; otherwise, just try to be a bit discreet.
Here are some tips from Jeff regarding the scraping of Stack Overflow, but I'd imagine that the rules are similar for any site.
Use GZIP requests. This is important! For example, one scraper used 120 megabytes of bandwidth in only 3,310 hits which is substantial. With basic gzip support (baked into HTTP since the 90s, and universally supported) it would have been 20 megabytes or less.
Identify yourself. Add something useful to the user-agent (ideally, a link to a URL, or something informational) so we can see your bot as something other than "generic unknown anonymous scraper."
Use the right formats. Don't scrape HTML when there is a JSON or RSS feed you could use instead. Heck, why scrape at all when you can download our cc-wiki data dump??
Be considerate. Pulling data more than every 15 minutes is questionable. If you need something more timely than that ... why not ask permission first, and make your case as to why this is a benefit to the SO community and should be allowed? Our email is linked at the bottom of every single page on every SO family site. We don't bite... hard.
Yes, you want an API. We get it. Don't rage against the machine by doing naughty things until we build it. It's in the queue.
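A minimal sketch of a fetch that follows the first two tips, gzip plus an identifying user-agent (the URL and the agent string are hypothetical):

    using System.Net;
    using System.Net.Http;
    using System.Threading.Tasks;

    public static class PoliteFetcher
    {
        public static async Task<string> FetchAsync(string url)
        {
            var handler = new HttpClientHandler
            {
                // Sends Accept-Encoding: gzip, deflate and decompresses transparently
                AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate
            };

            using (var client = new HttpClient(handler))
            {
                // Identify the bot and give the site operator a way to reach you
                client.DefaultRequestHeaders.UserAgent.ParseAdd(
                    "AccountSyncBot/1.0 (+http://example.com/bot-info)");
                return await client.GetStringAsync(url);
            }
        }
    }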
