I've got this situation.
I need to test a web application that was created with asp.net and c#.
This application has one special process that does the heaviest and most important work in the whole application. This process is part of the web service that I need to test.
Now I've been asked to come up with some kind of program to simulate a specific number of simultaneous requests to the process, to see how it reacts to a certain number of users trying to call and use the same process.
I've never done anything like this before, but some ideas have come to mind, like creating a little program in Visual Studio that uses the BackgroundWorker class to call the web service, then launching that BackgroundWorker the number of times specified by the user.
As I mentioned, I've never done anything similar, so I'm open to suggestions.
What would you do if you had to do something similar?
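A minimal sketch of the idea I have in mind; HeavyProcessService and RunHeavyProcess are hypothetical stand-ins for the real generated web service proxy and its method:

```csharp
using System.Threading;

static class LoadSimulator
{
    public static void Simulate(int userCount)
    {
        var allDone = new ManualResetEvent(false);
        int pending = userCount;
        for (int i = 0; i < userCount; i++)
        {
            // queue one work item per simulated user so the calls overlap
            ThreadPool.QueueUserWorkItem(delegate
            {
                // new HeavyProcessService().RunHeavyProcess(); // hypothetical proxy call
                Thread.Sleep(1000); // stand-in for the real web service call
                if (Interlocked.Decrement(ref pending) == 0)
                    allDone.Set();
            });
        }
        allDone.WaitOne(); // wait until every simulated user has finished
    }
}
```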
Don't write this yourself, go straight to a web load testing tool.
See also: Performing a Stress Test on Web Application?
We are referencing a 3rd party proprietary CLI DLL in our .net project. This DLL is only an interface to their proprietary C++ library. Our project is an asp.net (MVC4/Web API) web application.
The C++ unmanaged library is rather unstable. Sometimes it crashes with e.g. dangling pointers. We have no way of solving it, and using this library is a first-class customer requirement.
When the application crashes, the application pool in IIS doesn't respond anymore. We have to restart it, and doing so takes a couple minutes (yes, that long!).
We would like to keep this unstable DLL from crashing our application. What's the best way of doing it? Can we keep the CLI DLL in a separate AppDomain? How?
Thanks in advance.
I think every answer to this question will be some kind of work around.
My workaround would be to not interact directly with the DLL from your web application.
Instead, write the requests from the web application to either a message queue or a SQL table. Another application, such as a Windows Service, can then read the requests, interact with the DLL, and write the results back for your web application to read.
I'm not saying that SQL / Message Queues are the right way, I'm more thinking of the general process flow.
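As a rough illustration of that flow using MSMQ (the queue path and string payload are just for the example):

```csharp
using System.Messaging;

static class DllRequestQueue
{
    // illustrative queue name; the queue must exist beforehand
    // (e.g. created with MessageQueue.Create)
    const string Path = @".\private$\DllRequests";

    // Called from the web application: enqueue instead of touching the DLL.
    public static void Enqueue(string payload)
    {
        using (var queue = new MessageQueue(Path))
            queue.Send(payload);
    }

    // Called from the Windows Service: block until work arrives, then
    // call the third-party library safely outside the IIS process.
    public static string DequeueOne()
    {
        using (var queue = new MessageQueue(Path))
        {
            queue.Formatter = new XmlMessageFormatter(new[] { typeof(string) });
            Message msg = queue.Receive();
            return (string)msg.Body;
        }
    }
}
```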
I had this exact problem with a third party library that accessed protected memory for purposes of interacting with a hardware copy protection dongle. It worked fine in a console or winforms app, but crashed like crazy when called from an IIS application.
We tried several different things, some of which are mentioned in other answers on this page. But ultimately, the best solution for us was to use a very old technology: .NET Remoting. I know it's somewhat frowned on these days, but it fit this particular need quite well.
The unstable code was placed in a Windows Service application. The web application made remoting calls to this service, which relayed the commands to the third-party library.
Now I'm sure you could do the same thing with WCF, sockets, etc. But Remoting was quick and easy to set up, and since we only talk to the same server, it works without opening any ports. It just talks over a named pipe.
It does mean a second service to install besides the web application, but that was acceptable in my particular use case.
If you did something similar, and the third-party code actually crashed the service, you could probably write some code in your main application to bring it back up.
So perhaps a process boundary is more useful than an App Domain when you have unstable code to wrangle.
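A minimal sketch of that Remoting setup, with made-up names throughout (DongleWrapper, the pipe name, the object URI):

```csharp
using System;
using System.Runtime.Remoting;
using System.Runtime.Remoting.Channels;
using System.Runtime.Remoting.Channels.Ipc;

// Lives in the Windows Service alongside the unstable library.
public class DongleWrapper : MarshalByRefObject
{
    public string ReadSerial()
    {
        // the call into the unstable third-party library goes here (omitted)
        return "serial";
    }
}

static class RemotingHost
{
    static void Main()
    {
        // named-pipe channel; no network ports are opened
        ChannelServices.RegisterChannel(new IpcChannel("DonglePipe"), false);
        RemotingConfiguration.RegisterWellKnownServiceType(
            typeof(DongleWrapper), "dongle", WellKnownObjectMode.Singleton);
        Console.ReadLine(); // a real Windows Service would block differently
    }
}

// Web application side:
// var dongle = (DongleWrapper)Activator.GetObject(
//     typeof(DongleWrapper), "ipc://DonglePipe/dongle");
// string serial = dongle.ReadSerial();
```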
I would first increase the IIS process recycling rate; maybe the DLL code fails after a certain number of calls, or after the process reaches a certain amount of memory usage.
You can find information on the configuration of IIS 7.0 recycling options here: http://technet.microsoft.com/en-us/library/cc753179(v=ws.10).aspx
In your case I would recycle the process at a specific time, when you know there is less load on the application, and also after a certain number of requests (lower than the default) to have a "fresh" process most of the time.
The recycling process is graceful in the sense that the old process is not terminated until the one that will replace it is ready, so there should be no noticeable downtime.
More information about the recycling mechanism here: http://technet.microsoft.com/en-us/library/cc745955.aspx
If the above does not solve the problem I would wrap the calls in my own code that manages the unstable DLL execution.
This code should recover from failures, for example by repeating the failing calls until a result is obtained, and failing with a graceful error if that is not possible after a number of attempts.
Internally, the calls to the unstable DLL could be made in a spawned thread, or the code could even live in a new external executable that you launch with Process.Start.
This last option has more overhead but it might be your only option. See this SO question for more information on this: How do you handle a thread that has a hung call?
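A minimal retry wrapper along those lines might look like this; the Func<T> parameter stands in for whatever invokes the DLL:

```csharp
using System;

static class UnstableCalls
{
    public static T CallWithRetry<T>(Func<T> unstableCall, int maxAttempts)
    {
        Exception last = null;
        for (int attempt = 1; attempt <= maxAttempts; attempt++)
        {
            try
            {
                return unstableCall();
            }
            catch (Exception ex)
            {
                last = ex; // log and retry; add a back-off delay if appropriate
            }
        }
        throw new ApplicationException(
            "Call failed after " + maxAttempts + " attempts.", last);
    }
}
```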
I suggest the following solution:
Wrap this DLL with another web application, which can be one of the following. Since you already use Web API, the last option is the most suitable for you:
Simple ASMX Web Service
WCF Service
Asp.Net MVC - WEB Api Service
Make sure your P/Invoke code is free of bugs. See the following articles:
The Black Art of P/Invoke and Marshaling in .NET
P/Invoke Revisited
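For illustration, here is the shape of a typical P/Invoke declaration of the kind those articles dissect; the DLL name, entry point, and signature are invented:

```csharp
using System.Runtime.InteropServices;

static class NativeMethods
{
    // Hypothetical import; the calling convention, character set, and
    // marshaling attributes must match the vendor's C++ header exactly.
    [DllImport("thirdparty.dll", CallingConvention = CallingConvention.Cdecl,
               CharSet = CharSet.Ansi)]
    public static extern int Compute(
        [MarshalAs(UnmanagedType.LPStr)] string input, out int result);
}
```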
Publish this wrapper application to IIS under a different application pool.
Use the standard techniques suggested before; I suggest configuring IIS recycling for both memory limits and scheduled times:
IIS process recycling rate
How to limit the memory used by an application in IIS?
I have downloaded TheWorldsWorstStackOverflowClone. One of the projects is called TheWorldWorsts.ApiWrapper, which is basically the core of accessing the API. There is a class called ApiProxy.cs which has all the methods for the API calls. This is good.
Now what I want to do is collect data from this API interface and store it in a database. I know the limit on the API is 10k calls per day; that is, I want to be able to call the methods in the ApiProxy class up to 10k times per day, automatically. How can I do this?
The non-automatic way would be to create a dummy site where, every time I access it, the whole process runs, but that is not efficient. It seems I would have to write some kind of scheduler by deploying a web service, but that is too complicated, as explained here. Are there any simpler methods?
A Windows Service or desktop app might be a better solution than a web application. You are not deploying a web service, you are consuming one through a proxy class, and that does not require you to have a web server or a web site.
You could use a web application to control and monitor progress as your service downloads data, but the actual work is long-running and needs to be offloaded to another process or thread so you can tell the user what's going on.
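As a rough sketch of that, a Windows Service could drive the proxy from a timer. The ApiProxy calls below are commented out because I'm guessing at its method names; check the real class:

```csharp
using System.ServiceProcess; // reference System.ServiceProcess.dll
using System.Timers;

public class ApiPollerService : ServiceBase
{
    private Timer _timer;

    protected override void OnStart(string[] args)
    {
        // 10,000 calls per day is roughly one call every 8.64 seconds;
        // poll a little slower to stay safely under the limit.
        _timer = new Timer(9000);
        _timer.Elapsed += OnTick;
        _timer.Start();
    }

    private void OnTick(object sender, ElapsedEventArgs e)
    {
        // var proxy = new ApiProxy();
        // var data = proxy.GetQuestions(...); // hypothetical call
        // SaveToDatabase(data);               // hypothetical helper
    }

    protected override void OnStop()
    {
        _timer.Stop();
        _timer.Dispose();
    }
}
```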
Check out this one:
http://stacky.codeplex.com/
This looks like what you need. I am facing some debugging issues with it, but I hope you can figure them out.
We need the ability to send out automatic emails when certain dates occur or when some business conditions are met. We are setting up this system to work with an existing ASP.NET website. I've had a chat with one of the other devs here and had a discussion of some of the issues.
Things to note:
All the information we need is already modelled in the ASP.NET website
There is some business-logic that is required for the email generation which is also in the website already
We decided that the ideal solution was to have a separate executable that is scheduled to run overnight and do the processing and emailing. This solution has 2 main problems:
If the website was updated (business logic or model) but the executable was accidentally missed then the executable could stop sending emails, or worse, be sending them based on outdated logic.
We are hoping to use something like this approach to template the emails with UserControls, which I don't believe is possible outside of an ASP.NET website
The first problem could have been avoided with build and deployment scripts (which we're looking into at the moment anyway), but I don't think we can get around the second problem.
So the solution we decided on is to have an ASP.NET page that is called regularly by SSIS, and to have it do a set amount of processing (say 30 seconds) and then return.

I know an ASP.NET page is not the ideal place to be doing this kind of processing, but this seems to best meet our requirements. We considered spawning a new thread (not from the worker pool) to do the processing, but decided that if we did that we couldn't use the returned page to signify success or failure. By processing within the page's life-cycle we can use the page content to give an indication of how the processing went.
So the question is:
Are there any technical problems we might have with this set-up?
Obviously, if you have tried something like this, any reports of success or failure will be appreciated, as will suggestions of alternative set-ups.
Cheers,
Don't use the ASP.NET thread to do this. If the site generates information that you need in order to create or trigger the email send, have the site write that information to a file or database.
Create a Windows service or scheduled process that collects the information it needs from that file or DB, and run the email-sending process on a completely separate process/thread.
What you want to avoid is crashing your site or crashing your emailer due to limitations within the process handler. Based on your use of the word "bulk" in the question title, the two need to be independent of each other.
I think you should be fine. We have used a similar approach in our company for several years and haven't had many problems. Sometimes it takes over an hour to finish the process. Recently we moved the second thread (as you said) to a separate server.
Having the emailer and the website coupled together can work, but it isn't really a good design and will be more maintenance for you in the long run. You can get around the problems you state by doing a few things.
Move the common business logic to a web service or common library. Both your website and your executable/WCF service can consume it, and it centralizes the logic. If you're copying and pasting code, you know there's something wrong ;)
If you need a template mailer, it is possible to invoke ASP.NET classes to create pages for you dynamically (see the BuildManager class, and blog posts like this one). If the mailer doesn't rely on Page events (which it doesn't seem to), there shouldn't be any problem for your executable to load a Page class from your website assembly, build it dynamically, and fill in the content.
This obviously represents a significant amount of work, but would lead to a more scalable solution for you.
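A rough sketch of the BuildManager idea; the virtual path is hypothetical, and note that this particular Server.Execute overload needs a current HttpContext, i.e. it must run inside a web request:

```csharp
using System.IO;
using System.Web;
using System.Web.Compilation;
using System.Web.UI;

public static class TemplateMailer
{
    // e.g. RenderTemplate("~/EmailTemplates/Reminder.aspx") -- made-up path
    public static string RenderTemplate(string virtualPath)
    {
        // compile and instantiate the template page from the site's sources
        var page = (Page)BuildManager.CreateInstanceFromVirtualPath(
            virtualPath, typeof(Page));

        // run the page and capture its rendered HTML for the email body
        var writer = new StringWriter();
        HttpContext.Current.Server.Execute(page, writer, false);
        return writer.ToString();
    }
}
```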
Sounds like you should be creating a worker thread to do that job.
Maybe you should look at something like https://blog.stackoverflow.com/2008/07/easy-background-tasks-in-aspnet/
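The linked post's trick, very roughly, is a self-re-registering cache item whose removal callback does the work; the key name and interval below are arbitrary:

```csharp
using System;
using System.Web;
using System.Web.Caching;

public static class BackgroundTask
{
    // call Register() once from Application_Start
    public static void Register()
    {
        HttpRuntime.Cache.Insert("task", new object(), null,
            DateTime.Now.AddMinutes(2), Cache.NoSlidingExpiration,
            CacheItemPriority.NotRemovable, OnRemove);
    }

    static void OnRemove(string key, object value, CacheItemRemovedReason reason)
    {
        // do the periodic work here, then re-register for the next interval
        Register();
    }
}
```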
You can and should build your (templated) message body within your domain logic (that is, your ASP.NET application) when the business conditions are met, and then hand it to an external service whose only job is to send the messages. That way every message carries the proper information.
For "when certain dates occur" scenario you can use simple solution for background tasks (look at Craig answer) and do the same as above: parse template, build message and fast send to specified service.
Of course you should do this safely, so that app pool restarts do not break your tasks.
I have a web service running and I consume it from my desk application that is written on Compact Framework.
It takes 13 seconds to retrieve 8 results which is kinda slow. I also expect to be retrieving more results in the future. The database query runs fast.
Two questions: how do I detect where the slowdown occurs? Do I put timers in the web service's code?
I would like to detect whether it is the network or the application code.
This is my first exposure to web services in a real environment so please bear with me.
I used ASP.NET 2.0 and C# to write a simple web service.
Another good profiler is the EQATEC Profiler. I did a write-up on it here: http://elegantcode.com/2009/07/02/eqatec-profiler-and-net-cf-profiling-and-regular-net/
And it works fine for .NET CF projects. It will let you see if there are performance issues in unexpected places.
You're already on the right track with adding event logging, and including timers in it. Note that doing so will add to the overall time the calls take, so you'll want to remove the timers after you track down the culprit. Also look into running the same web service call multiple times without re-initiating the connection; that may be a cause as well.
-Jay
A starting point is to profile your web service to see where the delay is coming from.
Have you tried the CLR Profiler? There are some tools you can use to see what is happening:
http://msdn.microsoft.com/en-us/library/ms998579.aspx
The database connectivity from your service to the DB could be a possible cause of the slowdown. Adding timers should do the trick. If the code isn't too huge, you can look at the coding constructs to come up with an informed guess of where exactly things could be slow, and then add the timers there. You would get a fair idea of where things are slowing down.
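For the timers, a Stopwatch around each suspect section is usually enough; a rough sketch (the proxy call in the usage comment is a placeholder):

```csharp
using System;
using System.Diagnostics;

static class Timing
{
    // Wrap each suspect section like this and log the results to see
    // where the 13 seconds actually go.
    public static long Time(Action call)
    {
        Stopwatch sw = Stopwatch.StartNew();
        call();
        sw.Stop();
        return sw.ElapsedMilliseconds;
    }
}

// usage:
// long ms = Timing.Time(delegate { service.GetResults(); });
```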
The two biggest pain points are going to be instantiating the web service reference and transferring all the data over the network. Unless some obvious blunder turns up, I would look at ways of reducing the size of your XML and ways of better handling your web service reference.
All I know about the Compact Framework is that it is a pain to work in. I've worked on a number of web projects, though, and profiling your server and putting in logging to record the time taken will be helpful. If all the time is being taken after the server responds, however, it won't do much more than prove your server is working quickly.
SoapUI is a fantastic Java application for consuming web services. It has a lot of functionality, including time metrics. I would start with that and see how long it takes to consume the same thing your client would. If nothing turns up there, start with what I recommended above.
I am writing a web application in ASP.NET 3.5 that takes care of some basic data entry scenarios. There is also a component to the application that needs to continuously poll some data and perform actions based on business logic.
What is the best way to implement the "polling" component? It needs to run and check the data every couple of minutes or so.
I have seen a couple of different options in the past:
The web application starts a background thread that will always run while the web application does. (The implementation I saw started the thread in the Application_Start event.)
Create a windows service that is always running
What are the benefits to either of these options? Are there additional options?
I am leaning toward a Windows service because it is separate and can run on a different server (more scalable), and there is more control over when it is started/stopped, etc. However, I feel like the compactness of having the "background" logic running in the process of the web application might make the entire solution more understandable.
I'd go for the separate Windows service primarily for the reasons you give:
You can run it on a different server if necessary.
You can start and stop it independently of the web site.
I'd also add that it could well have some impact on the performance of the web site itself - something you want to avoid.
The buzz-word here is "separation of concerns". The web site is concerned with presenting the data to the user, the service with checking the integrity of the data.
You can also update the web site and service independently of each other should you need to.
I was going to suggest that you look at a scheduled task and let Windows control when the process runs, but I re-read your question and noted that you wanted the checks to run every couple of minutes. The overhead of starting the process might be too great in this case - though some experimentation would probably prove this one way or the other.
If you use a scheduled task there's also the possibility that you could start the next check before the current one has finished - something you can code for if you're in complete control.
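If you do go the scheduled-task route, a named mutex is a simple guard against overlapping runs; a sketch (the mutex name is made up):

```csharp
using System.Threading;

static class Program
{
    static void Main()
    {
        bool createdNew;
        using (var mutex = new Mutex(true, @"Global\DataCheckTask", out createdNew))
        {
            if (!createdNew)
                return; // previous run still in progress; skip this one
            RunChecks();
            mutex.ReleaseMutex();
        }
    }

    static void RunChecks()
    {
        // poll the data and apply the business logic here
    }
}
```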
Why not just use a console app that has no UI? It can do everything the Windows service can and is much easier to debug and maintain. I would not create a Windows service unless you absolutely have to.
You might find that the SQL Server job scheduler is sufficient for what you want.
A console application does not do well in this case. I wrote a TAPI application which had to stay in the background and intercept incoming calls, but it only worked once, because the TAPI manager got GCed and was never available for the second incoming call.