Single Page Application with SignalR: performance testing

I need to evaluate the number of concurrent users our website can handle. The website is a Single Page Application built on the .NET Framework with Durandal.js on the frontend. We use SignalR (hubs) for real-time communication between server and client.
The only option I see is "browser testing", where each test runs a browser instance (or uses PhantomJS, etc.) to keep a real-time connection with the server, as in real usage. Are there any options other than tests that spin up a browser instance to emulate user behaviour? What is the best way to emulate a load of, e.g., 1000 concurrent users?
I've found several cloud services that support this kind of load testing, e.g. LoadImpact and BlazeMeter. It would be great if someone could share their experience with such tools.

SignalR provides a tool called Crank, which can be used to test how many connections a given machine can handle.
More info: http://www.asp.net/signalr/overview/performance/signalr-connection-density-testing-with-crank
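A typical invocation looks something like this (switch names vary by version, so check the linked article for the exact options; the URL here is a placeholder test endpoint):

    crank /Url:http://myserver/echo /Connections:1000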

Make your own script to create virtual users; that is the most effective way to recreate real-world load/stress. Use the Akka actor model (for creating the virtual users) with the Java SignalR client. (If you want, you can use the Gatling tool as a framework and attach your script, written in Java or Scala, to Gatling's virtual users.)
Make the script dynamic by storing user info (authentication tokens or user credentials) in an XML document.
Please comment with questions; I can guide you end to end, as I have built and deployed such a tool.
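A rough sketch of the idea, translated to .NET rather than Java: Akka.NET in place of Akka, and the Microsoft.AspNet.SignalR.Client package as the client. The hub name, URL, and user count are placeholders:

    using System;
    using Akka.Actor;
    using Microsoft.AspNet.SignalR.Client;

    // One actor per virtual user; each holds its own hub connection.
    class VirtualUser : ReceiveActor
    {
        private HubConnection _connection;

        public VirtualUser(string url)
        {
            Receive<string>(msg => Connect(url));
        }

        private void Connect(string url)
        {
            _connection = new HubConnection(url);
            var hub = _connection.CreateHubProxy("ChatHub");   // placeholder hub name
            hub.On<string>("broadcast", m => { /* count/validate messages here */ });
            _connection.Start().Wait();   // opens the persistent connection (blocking is fine for a sketch)
        }
    }

    class Program
    {
        static void Main()
        {
            var system = ActorSystem.Create("load-test");
            for (int i = 0; i < 1000; i++)   // 1000 concurrent virtual users
            {
                var user = system.ActorOf(Props.Create(() => new VirtualUser("http://localhost/signalr")));
                user.Tell("connect");
            }
            Console.ReadLine();   // keep the connections open
        }
    }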

Related

Processing Requests in the Background

I'm writing a REST API using ASP.NET Core and Microsoft SQL Server. One of my requirements is that clients will POST certain data to this API, and the API will have to transform/process the data in some way before it is used or read. It turns out this processing is costly, so I'm thinking of doing it asynchronously in the background without blocking the POST request. I'm considering doing the processing:
In a scheduled SQL job
Using a separate Windows Service running in the background that reads from the DB, does the processing and writes back to it. It'll be slower than the SQL job I presume, but the code will be more readable.
Using Hangfire. Never used it. Not sure how well it works.
What are the best options for this? Are there any best practices around this kind of thing?
Boilerplate:
Store that data somewhere (RDBMS, NoSQL, etc.)
Respond to the user that the data has been scheduled for processing
Run a worker or a pool of workers for job processing
Store the result somewhere
Notify the client that the background job is complete (this could be just a GET /jobs/{id} endpoint the client can poll; see the sketch after this list)
Show that result
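A minimal ASP.NET Core sketch of that flow; all names are placeholders, and the in-memory dictionary stands in for a real job table:

    using System;
    using System.Collections.Concurrent;
    using Microsoft.AspNetCore.Mvc;

    [ApiController]
    [Route("jobs")]
    public class JobsController : ControllerBase
    {
        // Placeholder store; use a database table in practice.
        private static readonly ConcurrentDictionary<Guid, string> Jobs = new();

        [HttpPost]
        public IActionResult Submit([FromBody] string payload)
        {
            var id = Guid.NewGuid();
            Jobs[id] = "queued";                        // store + mark as scheduled
            // A worker elsewhere picks the job up, processes it, and sets Jobs[id] = "done".
            return Accepted($"/jobs/{id}", new { id }); // 202 + where to poll
        }

        [HttpGet("{id}")]
        public IActionResult Status(Guid id) =>         // client polls for completion
            Jobs.TryGetValue(id, out var state) ? Ok(new { id, state }) : NotFound();
    }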
You can use your own daemon, process, or script. If that's not enough and you need more features, use Hangfire, which looks solid.
I have been using Hangfire in production for almost 3 years, and yes, it is a great option: retry policies out of the box, a UI dashboard, and so on (a minimal enqueue example follows the list below). Some extra options:
Serverless (Azure function, AWS Lambda)
AWS SQS or Azure Queue + hosted services (see the docs)
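If you go the Hangfire route, the happy path is short. A sketch, assuming SQL Server storage and a hypothetical IDataProcessor service; payloadId stands in for whatever identifies the POSTed data:

    // Startup: services.AddHangfire(cfg => cfg.UseSqlServerStorage(connectionString));
    //          services.AddHangfireServer();

    // In the POST handler: enqueue and return immediately.
    // Hangfire persists the job and retries it on failure out of the box.
    var jobId = BackgroundJob.Enqueue<IDataProcessor>(p => p.Process(payloadId));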
Another option I've found is to implement IHostedService, a built-in interface in ASP.NET Core. See this page for details.
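A minimal sketch of that option; BackgroundService implements IHostedService for you, and the processing body here is a placeholder:

    using System;
    using System.Threading;
    using System.Threading.Tasks;
    using Microsoft.Extensions.Hosting;

    public class ProcessingWorker : BackgroundService
    {
        protected override async Task ExecuteAsync(CancellationToken stoppingToken)
        {
            while (!stoppingToken.IsCancellationRequested)
            {
                // Placeholder: read pending rows, transform them, write results back.
                await ProcessPendingItemsAsync(stoppingToken);
                await Task.Delay(TimeSpan.FromSeconds(5), stoppingToken);
            }
        }

        private Task ProcessPendingItemsAsync(CancellationToken ct) => Task.CompletedTask;
    }

    // Registration in Startup/Program: services.AddHostedService<ProcessingWorker>();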

How to share information between .NET programs

I'm working on a .NET web application that also runs various jobs in the background. These background jobs update the database, create report files, delete old files, etc.
The jobs are triggered on a schedule, and the scheduler itself is triggered every hour via Windows' Task Scheduler.
So far I've been monitoring the status of the jobs by logging messages to the Windows Event Viewer; however, this method is very cumbersome.
I'd like to implement a new program with a GUI that will communicate with the jobs and monitor their activities. To start, the GUI should display the following information:
Status (run vs idle)
Progress bar ("24% done", "327 of 23423 items are finished")
The gui.exe should run independently of the jobs.exe (starting or stopping the GUI should not affect the jobs, and vice versa).
I'm looking for advice: what is the best way to share information between jobs.exe and gui.exe?
General answer - all forms of IPC involve using some shared resource accessible by both processes. Examples:
network sockets (with "network" as shared resource)
named pipes
shared memory
plain old disk files (not the best way to do IPC but still a valid one)
a shared database
Specific .NET options:
Remoting
WCF
Web API
named pipes
to name a few.
Research some of them and see if they fit your use case.
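To give a taste of the named-pipes option, here is a minimal one-way status channel. The pipe name and message format are invented for the example, and the two halves live in different processes:

    using System.IO;
    using System.IO.Pipes;

    // In jobs.exe: publish a status line whenever a GUI connects.
    using (var server = new NamedPipeServerStream("jobs-status", PipeDirection.Out))
    {
        server.WaitForConnection();
        using (var writer = new StreamWriter(server) { AutoFlush = true })
            writer.WriteLine("running;327/23423");   // status + progress, ad hoc format
    }

    // In gui.exe: connect and read the current status.
    using (var client = new NamedPipeClientStream(".", "jobs-status", PipeDirection.In))
    {
        client.Connect(timeout: 1000);               // ms; fails fast if jobs.exe is down
        using (var reader = new StreamReader(client))
            System.Console.WriteLine(reader.ReadLine());
    }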
It depends on whether gui.exe will be running on the same system as your job services. If so, it's better not to use a web API, network sockets, or WCF for communication, but something simpler as the IPC method. I personally like message queues.

Types of services for long-running ASP.NET processes

Our website has a long-running calculation process which keeps the client waiting for a few minutes until it's finished. We've decided we need a design change: farm the processing out to a Windows service or a WCF service, and present the client with another page while we're doing the calculations.
What's the best way of implementing the service though?
We've looked at background worker processes, but these look problematic because IIS can periodically shut down threads.
It seems the best thing to use is either a Windows service or a WCF service. Does anyone have a view on which is better for this purpose?
If we host the service on another machine, would it have to be a WCF service?
It looks like it's difficult to have the service (whatever type it is) communicate back to the website; maybe instead the service can write its results to a database, and the website can poll for the required results later on.
It's an open ended question I know, but does anyone have any ideas?
thanks
I don't think that the true gain in terms of performance will come from the design change.
If I were to choose between a Windows service and WCF, I would go with the Windows service, because I would be able to set processor affinity and prioritize as I want. However, I would have to implement the logic for serving multiple clients at the same time (which, in the WCF approach, is handled by IIS).
So, in terms of performance, if you use the .NET Framework for both the WCF service and the Windows service, the difference will not be major. A Windows service would be more controllable; WCF would be more straightforward, with no big performance penalties.
For these types of tasks I would focus on highly optimizing the single-threaded calculation. If you have a complex calculation, can it be written in native code (C or C++)? You could build a highly optimized DLL that is used by either the Windows service or the WCF service. This approach lets you select the best compiler options and make the best use of your machine's resources. Also, nothing stops you from creating multiple threads inside the DLL function.
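Calling such a native DLL from the managed service is a one-liner with P/Invoke; the DLL and function names below are invented for illustration:

    using System.Runtime.InteropServices;

    static class NativeCalc
    {
        // Hypothetical export from a highly optimized C/C++ calc.dll.
        [DllImport("calc.dll", CallingConvention = CallingConvention.Cdecl)]
        public static extern double RunCalculation(double[] inputs, int length);
    }

    // In the service: double result = NativeCalc.RunCalculation(data, data.Length);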
The link between the website and the service can be ensured in both cases: through sockets for the Windows service (extra code for creating the protocol) or directly through SOAP for WCF. If you push the results into a database, the difficulty is letting the website know that the data is there (and knowing which particular user session it belongs to).
So that's what I would do.
Hope it helps.
Cheers!
One way to do this (sketched below) is:
The client submits the calculation request via a call to a WCF service (which can be hosted in IIS)
The calculation request is stored in a database with a unique ID
The ID is returned to the client
A Windows service (or several, on several different machines) polls the database for new requests
The Windows service performs the calculation and stores the result in a result table with the ID
The client polls the result table (through a WCF service) with the ID
When the calculation is finished, the result is returned to the client
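The polling side of those steps, inside the Windows service, might look roughly like this; table and column names are placeholders:

    using System;
    using System.Data.SqlClient;
    using System.Threading;

    class RequestPoller
    {
        private const string ConnectionString = "...";   // your DB connection string

        public void Run()   // called from the service's worker thread
        {
            while (true)
            {
                using (var conn = new SqlConnection(ConnectionString))
                {
                    conn.Open();
                    var cmd = new SqlCommand(
                        "SELECT TOP 1 Id, Payload FROM CalcRequests WHERE Status = 'New'", conn);
                    using (var reader = cmd.ExecuteReader())
                    {
                        if (reader.Read())
                        {
                            Guid id = reader.GetGuid(0);
                            string payload = reader.GetString(1);
                            // Mark the row 'Processing', run the calculation,
                            // then write the result to CalcResults under the same Id.
                        }
                    }
                }
                Thread.Sleep(5000);   // poll interval
            }
        }
    }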

How to fire off and poll a Windows service from an ASP.NET page

A client wants an ASP.NET page with a button that fires off a database update from an external source with hundreds of records. This process takes a long time. He also wants status updates as the process runs, like "processing 10 out of 1000 records". From reading various articles, I'm thinking of putting the database update code in a Windows service. I've never worked with Windows services before, and I can't find many tutorials on how to fire off a Windows service and poll it from an ASP.NET page. My questions are: is this the best way to handle this process? And does anyone have any examples of how they've accomplished this?
There are a few ways to approach this.
You're right in that executing a long-running task within the Web's worker process doesn't usually end well: it ties up resources, the app pool can get recycled, etc. In most of my projects of any complexity, I usually end up with 4 pieces: the database, a DLL with my model, a "Worker" that is a Windows service, and an ASP.NET Web site.
The "Worker" is a Windows service that is always running and uses Quartz.net to execute scheduled tasks using the same model that the Web site uses. These can be all sorts of periodic tasks that seem to crop up when maintaining a Web site of any complexity: VacuumExpiredPickTicketsJob, BackupAndFtpDatabaseJob, SendBackorderReminderEmailsJob, etc.
Writing a Windows service is not difficult in C# (there is a built-in template in Visual Studio, but you pretty much inherit from ServiceBase and you're off to the races), and libraries like TopShelf make it even easier to deploy them.
What is left is triggering the update from the Web site and communicating the results back to the user. This can be as simple or as complicated as you want it to be. If this is something that has to scale up to lots of users, you might use something like MSMQ to queue up update commands to the Windows service, and the Windows service would respond to that queue. I get the impression that that is probably overkill here.
For a handful of users, you could override your service's OnCustomCommand(int command) method to be the trigger. Your Web site would then use the ExecuteCommand() method of the ServiceController class to get the process started. Your Web site and service would agree on the parameter value that means "do that update thing," let's say 142 (it has to be a number between 128 and 255 for reasons of history).
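Concretely, with both sides agreeing on 142; the service and method names below are placeholders:

    // In the Windows service:
    using System.ServiceProcess;

    public class UpdaterService : ServiceBase
    {
        private const int StartUpdateCommand = 142;   // agreed value, must be 128-255

        protected override void OnCustomCommand(int command)
        {
            if (command == StartUpdateCommand)
                BeginDatabaseUpdate();   // kick off the long-running update
        }

        private void BeginDatabaseUpdate() { /* ... */ }
    }

    // In the Web site, behind the button click:
    new ServiceController("UpdaterService").ExecuteCommand(142);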
As for communicating progress back to the client, it's probably easiest to just have the Web page use a timer and an AJAX call to poll for updated progress data. You can get fancy with new stuff like WebSockets (bleeding edge stuff as I write this) and long polling, but regular polling will simply work for something that doesn't need to scale.
Hope this helps!
In addition to Nicholas' thorough answer, another option is to deploy your back-end processes as command-line scripts and schedule them through Windows' built-in Task Scheduler, which has improved quite a bit in Windows Server 2008+. Or you can use any of a host of other task-scheduler applications.
I find the command line approach to be easier for MIS staff to understand and configure, and to migrate to new servers, versus standard Windows services.

A Way to Run a Long Process From ASP.NET page

What are your most successful ways of running a long process, like 2 hours, in ASP.NET, and returning progress information to the client?
I've heard that creating a Windows service, an HttpHandler, or Remoting can be successful.
Just a suggestion...
If you have logic in ASP.NET that you are trying to reuse, you could make an external app (Windows service, console app, etc.) that calls a web service exposed by your ASP.NET application.
For example, I had a similar problem where the code I needed lived in ASP.NET and I needed to update about 3000 clients using it. It started timing out, so I exposed the code through a web service. Then, instead of trying to run all 3000 clients through ASP.NET at once, I used a console app, run by a nightly SQL Server job, that called the web service once for each client. This way all the time-consuming processing was handled by the console app, which doesn't have the timeout issue, but the code we had already written in ASP.NET did not have to be recreated. In the end, slightly modifying the design of my existing architecture let me easily get around this problem.
It really depends on the environment and constraints you have to deal with...Hope this helps.
There are two ways I have handled this. First, you can simply run the process and let the client time out. This has two drawbacks: the UI isn't in sync, and you are tying up an IIS thread for non-HTML purposes (I did this for a process that used to return quickly enough but grew beyond time-out limits).
The better way to handle this is to write a "service" application that handles requests passed through a database table (put the details of the request there). Then you can create a page that gives the user a window into the ongoing progress of the task (e.g. how many records have been processed or emails sent). This status window can either have a link that lets the user refresh, or you can automate the refresh using Ajax callbacks on a timer.
This isn't directly applicable, but I wrote code that will let you run processes similar to "scheduled tasks" inside ASP.NET without needing Windows services or any kind of cron job.
Scheduled Tasks in ASP.NET!
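I can't vouch for exactly how that article does it, but the classic version of this trick leans on the ASP.NET cache: insert an item with an absolute expiration and a removal callback, do the work in the callback, and re-insert to reschedule. A sketch, with the interval and cache key made up:

    using System;
    using System.Web;
    using System.Web.Caching;

    static class PseudoScheduler
    {
        private const string Key = "task-heartbeat";   // placeholder cache key

        public static void Start()
        {
            HttpRuntime.Cache.Insert(
                Key, DateTime.UtcNow, null,
                DateTime.UtcNow.AddMinutes(5),         // fire roughly every 5 minutes
                Cache.NoSlidingExpiration,
                CacheItemPriority.NotRemovable,
                OnExpired);
        }

        private static void OnExpired(string key, object value, CacheItemRemovedReason reason)
        {
            DoScheduledWork();   // your background task
            Start();             // re-insert to schedule the next run
        }

        private static void DoScheduledWork() { /* ... */ }
    }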
I very much prefer a WCF service to scheduled tasks. You might (off the top of my head) pass an address to the WCF service as a sort of "callback" that the service can call with progress reports as it works.
I'd shy away from scheduled tasks... too coarse-grained.
