Small performance test on a web service - ASP.NET

I'm trying to develop a small application that tests how many requests per second my service can support, but I think I'm doing something wrong. The service is at an early stage of development, but I'd like to have this test handy so I can check from time to time that I'm not doing something that decreases performance. The problem is that I cannot get the web server or the database server to reach 100% CPU.
I'm using three different computers: one runs the web server (WinSrv Standard 2008 x64, IIS7), another the database (Win 2K, SQL Server 2005), and the last is my computer (Win7 x64 Ultimate), where I run the test. The computers are connected through a 100 Mbit Ethernet switch. The POST request body is 9 bytes and the response is 842 bytes.
The test launches several threads, and each thread runs a while loop; in each iteration it creates a WebRequest object, performs a call, increments a shared counter, waits between 1 and 5 milliseconds, and then does it again:
static int counter = 0;

static void Main(string[] args)
{
    ServicePointManager.DefaultConnectionLimit = 250;
    Console.WriteLine("Ready. Press any key...");
    Console.ReadKey();
    Console.WriteLine("Running...");

    string localhost = "localhost";
    string linuxmono = "192.168.1.74";
    string server = "192.168.1.5:8080";

    DateTime start = DateTime.Now;
    Random r = new Random(DateTime.Now.Millisecond);
    for (int i = 0; i < 50; i++)
    {
        new Thread(new ParameterizedThreadStart(Test)).Start(server);
        Thread.Sleep(r.Next(1, 3));
    }

    Thread.Sleep(2000);
    while (true)
    {
        Console.WriteLine("Request per second :"
            + counter / DateTime.Now.Subtract(start).TotalSeconds);
        Thread.Sleep(3000);
    }
}

public static void Test(object ip)
{
    Guid guid = Guid.NewGuid();
    Random r = new Random(DateTime.Now.Millisecond);
    while (true)
    {
        String test = "<lalala/>";
        WebRequest req = WebRequest.Create("http://"
            + (string) ip + "/WebApp/" + guid.ToString()
            + "/Data/Tables=whatever");
        req.Method = "POST";
        req.ContentType = "application/xml";
        req.Credentials = new NetworkCredential("aaa", "aaa", "domain");

        byte[] array = Encoding.UTF8.GetBytes(test);
        req.ContentLength = array.Length;
        using (Stream reqStream = req.GetRequestStream())
        {
            reqStream.Write(array, 0, array.Length);
            reqStream.Close();
        }
        using (Stream responseStream = req.GetResponse().GetResponseStream())
        {
            String response = new StreamReader(responseStream).ReadToEnd();
            if (response.Length != 842) Console.Write(" EEEE ");
        }

        Interlocked.Increment(ref counter);
        Thread.Sleep(r.Next(1, 5));
    }
}
If I run the test, none of the computers shows excessive CPU usage. Let's say I get X requests per second; if I run the console application twice at the same time, I get X/2 requests per second in each one... but the web server is still at 30% CPU and the database server at 25%...
I've tried removing the Thread.Sleep calls in the loop, but it doesn't make a big difference.
I'd like to push the machines to their maximum, to check how many requests per second they can handle. I guessed I could do it this way... but apparently I'm missing something here... What is the problem?
Kind regards.

IMO, you're better off using SoapUI for the test. You can easily adjust the test case for the number of threads, number of iterations, etc., and it will graph the results. When you hit the plateau where you overwhelm the server, you'll see it on the graph. If one PC isn't enough, just run more instances on other PCs. You can do all of this with the free version.

There are a lot of limiting factors besides the CPU on a web server. There are a lot of IIS settings that throttle the number of connections that can be served.
I would read this:
http://www.eggheadcafe.com/articles/20050613.asp
I know it is for IIS 6, but there are things that will still apply.
If you have access to MSDN and have VS 2010 Ultimate, I would check out its load testing tools. Purchasing the load testing program can be expensive, but if you need to test something specific, you can use the trial version to accomplish what you need. You can use it to monitor response time, server utilization, etc. Well worth looking into.

I agree with Chris, and would go a step further to recommend JMeter, as it can also test the database and webapp, all within the same script.


IIS worker process using 6 GB RAM on web server with ASP.NET MVC web site

I have a web site running in its own Application Pool (IIS 8). Settings for the pool are default, i.e. recycle every 29 hours.
Our web server only has 8 GB RAM and I have noticed that the worker process for this web site regularly climbs to 6 GB RAM and slows the server to a crawl. This is the only site currently on the web server.
I also have SQL Express 2016 installed. The site is using EF version 6.1.3.
The MVC site is very straightforward. It has a GETPDF controller which finds a row in a table, gets the PDF data stored in a field, then serves it back to the browser as follows:
using (eBillingEntities db = new eBillingEntities())
{
    try
    {
        string id = model.id;
        string emailaddress = Server.HtmlEncode(model.EmailAddress).ToLower().Trim();
        eBillData ebill = db.eBillDatas.ToList<eBillData>().Where(e => e.PURL == id && e.EmailAddress.ToLower().Trim() == emailaddress).FirstOrDefault<eBillData>();
        if (ebill != null)
        {
            // update the 'Lastdownloaded' field.
            ebill.LastDownloaded = DateTime.Now;
            db.eBillDatas.Attach(ebill);
            var entry = db.Entry(ebill);
            entry.Property(en => en.LastDownloaded).IsModified = true;
            db.SaveChanges();

            // Find out from the config record whether the bill is stored in the table or in the local pdf folder.
            Config cfg = db.Configs.ToList<Config>().Where(c => c.Account == ebill.Account).FirstOrDefault<Config>();
            bool storePDFDataInEBillTable = true;
            if (cfg != null)
            {
                storePDFDataInEBillTable = cfg.StorePDFDataInEBillDataTable;
            }
            // End of Modification

            byte[] file;
            if (storePDFDataInEBillTable)
            {
                file = ebill.PDFData;
            }
            else
            {
                string pathToFile = "";
                if (string.IsNullOrEmpty(cfg.LocalPDFDataFolder))
                    pathToFile = cfg.LocalBackupFolder;
                else
                    pathToFile = cfg.LocalPDFDataFolder;

                if (!pathToFile.EndsWith(@"\"))
                    pathToFile += @"\";

                pathToFile += ebill.PDFFileName;
                file = System.IO.File.ReadAllBytes(pathToFile);
            }
            MemoryStream output = new MemoryStream();
            output.Write(file, 0, file.Length);
            output.Position = 0;
            HttpContext.Response.AddHeader("content-disposition", "attachment; filename=ebill.pdf");
            return new FileStreamResult(output, "application/pdf");
        }
        else
            return View("PDFNotFound");
    }
    catch
    {
        return View("PDFNotFound");
    }
}
Are there any memory leaks here?
Will the file byte array and the memory stream get freed up?
Also, is there anything else I need to do concerning clearing up the entity framework references?
If the code looks OK, where would be a good place to start looking?
Regards
Are there any memory leaks here?
No.
Will the file byte array and the memory stream get freed up?
Eventually, yes. But that may be the cause of your excessive memory use.
Also, is there anything else I need to do concerning clearing up the entity framework references?
No.
If the code looks OK, where would be a good place to start looking?
If this code is the cause of your high memory use, it's because you are loading files into memory. And you're loading two copies of each file into memory: once into a byte[], and again by copying it into a MemoryStream.
There's no need to do that.
To eliminate the second copy of the file, use the MemoryStream(byte[]) constructor instead of copying the bytes from the byte[] into an empty MemoryStream.
To eliminate the first copy in memory, you can stream the data into a temporary file that will be the target of your FileStreamResult, or initialize the FileStreamResult with an ADO.NET stream.
See https://learn.microsoft.com/en-us/dotnet/framework/data/adonet/sqlclient-streaming-support
If you go with ADO.NET streaming, your DbContext will need to be scoped to your Controller instead of a local variable, which is good practice in any case.
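As an illustration of the second point, here is a minimal sketch of the return path without the extra copy, assuming the same file byte[] from the question's code:

// Sketch only: reuses the byte[] already loaded, no second in-memory copy.
// 'file' is the byte[] from the question's code.
Response.AddHeader("content-disposition", "attachment; filename=ebill.pdf");

// MemoryStream(byte[]) wraps the existing buffer rather than writing a copy into a new stream.
return new FileStreamResult(new MemoryStream(file), "application/pdf");

// Alternatively, MVC's File() helper does the same in one call:
// return File(file, "application/pdf", "ebill.pdf");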
In addition to David's advice, I noticed that I was doing the following:
db.eBillDatas.ToList<eBillData>()
Therefore I was fetching all the data from the database and then filtering it in memory with the Where clause.
I didn't notice the problem until the database started to fill up.
I removed that part and now the IIS worker process uses about 100 MB.
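For reference, a sketch of what that change looks like (identifiers taken from the question's code): the filter is applied before materializing, so Entity Framework translates it to SQL and only the matching row is loaded:

// Before (loads the whole table into memory, then filters it client-side):
// eBillData ebill = db.eBillDatas.ToList<eBillData>()
//     .Where(e => e.PURL == id && e.EmailAddress.ToLower().Trim() == emailaddress)
//     .FirstOrDefault<eBillData>();

// After (the Where clause becomes part of the SQL query, only one row comes back):
eBillData ebill = db.eBillDatas
    .Where(e => e.PURL == id && e.EmailAddress.ToLower().Trim() == emailaddress)
    .FirstOrDefault();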

How to recognize client's computer?

I have a problem recognizing the client's computer. I'm developing an application that requires some kind of recognition of the machine the client is currently using. I tried
System.Environment.MachineName;
and also
string hostName = Dns.GetHostName();
var add = Dns.GetHostAddresses(hostName);
, but it always returns data about the server computer IIS is running on. I also tried to get the processor and baseboard IDs, but with the same result as the previous examples.
ManagementObjectSearcher mos = new ManagementObjectSearcher("SELECT SerialNumber FROM Win32_BaseBoard");
ManagementObjectCollection moc = mos.Get();
foreach (ManagementObject mo in moc)
{
    serial = mo["SerialNumber"].ToString();
}

ManagementObjectSearcher mbs = new ManagementObjectSearcher("Select * From Win32_processor");
ManagementObjectCollection mbsList = mbs.Get();
string id = "";
foreach (ManagementObject mo in mbsList)
{
    id = mo["ProcessorID"].ToString();
}
The system will be used inside a private network (it is not intended for public use).
Is there any way to recognize any unique data about the client's computer in an ASP.NET web application, or does .NET simply deny any recognition of the client's computer data for safety reasons?
Thanks
Since the code runs on the server, "System" is the server. In ASP.NET you can get information about who is requesting (the client) from the Request object. Try this:
Request.UserHostAddress
Or this:
Request.UserHostName
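For example, a minimal sketch of how these properties might be used inside an MVC controller action (the action name and return value are just illustrative):

// Sketch: identify the requesting client, not the IIS server the code runs on.
public ActionResult WhoAmI()
{
    string clientIp = Request.UserHostAddress;   // client IP address as seen by IIS
    string clientName = Request.UserHostName;    // DNS name, if reverse lookup resolves it

    // Placeholder: do whatever "recognition" you need with these values.
    return Content(string.Format("IP: {0}, Host: {1}", clientIp, clientName));
}

Note that behind a proxy or load balancer, UserHostAddress may report the proxy's address rather than the client's.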

ASP.NET membership: notify administrators when an account is about to expire

I have a requirement that a certain email distribution list should be notified every so often (still to be determined) about user accounts that are nearing expiration.
I'm wondering the best way to achieve this. I know it's generally a bad idea to spawn another thread within ASP.NET to handle this type of thing, so I'm thinking maybe a simple service is the way to go, but for something so small that seems like it might be slightly overkill.
Ideally I'd like something that doesn't require much babysitting (e.g. checking that the service is running).
I have also suggested having a page in the site with this type of information, but it is likely that it could be a few days before it is checked. We also cannot let users extend their own expiration date.
Are there any other viable options?
The most suitable approach here is:
Create an application which selects all users whose account expiry date is near (e.g. within 10 days from today), as per your requirement.
Schedule this application to run daily (build an exe with a log file that records any errors raised and the total number of emails sent in each run).
The application fetches all the matching records and sends the emails using a basic HTML template. Once an email has been sent, update a column (notificationFlag) in your database to 1 to record that a notification went out in the last 10 days; by default it stays 0.
You can schedule the exe to run at the end of the day, e.g. at 12:10 am (just in case your database server and web server clocks don't match exactly), every day.
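As a rough sketch of that kind of console application, assuming a hypothetical Users table with ExpiryDate and NotificationFlag columns and an SMTP server configured in app.config (all names here are illustrative, not from the original post):

using System;
using System.Configuration;
using System.Data.SqlClient;
using System.Net.Mail;

class NotifyExpiringAccounts
{
    static void Main()
    {
        string connStr = ConfigurationManager.ConnectionStrings["Membership"].ConnectionString;
        using (var conn = new SqlConnection(connStr))
        {
            conn.Open();
            // Hypothetical schema: users expiring within 10 days that haven't been notified yet.
            var cmd = new SqlCommand(
                @"SELECT UserName, Email FROM Users
                  WHERE ExpiryDate <= DATEADD(day, 10, GETDATE()) AND NotificationFlag = 0", conn);
            using (var reader = cmd.ExecuteReader())
            using (var smtp = new SmtpClient())   // host/credentials taken from config
            {
                while (reader.Read())
                {
                    var mail = new MailMessage("noreply@example.com", "admins@example.com",
                        "Account expiring soon",
                        string.Format("Account '{0}' ({1}) is due to expire within 10 days.",
                                      reader["UserName"], reader["Email"]));
                    smtp.Send(mail);
                }
            }
            // A follow-up UPDATE to set NotificationFlag = 1 for the notified rows would go here.
        }
    }
}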
This is something I've done which is similar to Prescott's comment on your answer.
I have a website with an administrative page that reports on a bunch of expiration dates.
This page also accepts a QueryString parameter SEND_EMAILS, so anytime an administrative user of the site passes the QueryString parameter SEND_EMAILS=true a bunch of emails go out to all the users that are expiring.
Then I just added a windows scheduled task to run daily and load the page with the SEND_EMAILS=true parameter.
This was the simple code I used to issue the webrequest from the console in the scheduled task:
using System;
using System.IO;
using System.Net;
using System.Text;

namespace CmdLoadWebsite
{
    class Program
    {
        static void Main(string[] args)
        {
            string url = "http://default/site/";
            if (args.Length > 0)
            {
                url = args[0];
            }
            Console.WriteLine(GetWebResult(url));
        }

        public static string GetWebResult(string url)
        {
            byte[] buff = new byte[8192];
            StringBuilder sb = new StringBuilder();

            HttpWebRequest request = (HttpWebRequest) WebRequest.Create(url);
            HttpWebResponse response = (HttpWebResponse) request.GetResponse();
            Stream webStream = response.GetResponseStream();

            int count = 0;
            string webString;
            do
            {
                count = webStream.Read(buff, 0, buff.Length);
                if (count != 0)
                {
                    webString = Encoding.ASCII.GetString(buff, 0, count);
                    sb.Append(webString);
                }
            }
            while (count > 0);

            return sb.ToString();
        }
    }
}

ASP.NET session times out and all data is lost, any idea?

I created a web application where users have to enter a lot of text, which takes a lot of time and thinking to write.
Let's suppose the session times out after 30 minutes. I start writing a lot of text, and while I'm thinking and writing the session times out, the application redirects to the login page, and all the written data is lost.
Any ideas for this problem, other than extending the session timeout period?
Currently your session is created and managed in In-Process mode, and in this mode you cannot recover session state once it reaches its timeout. You can configure your application for SQL Server mode instead, so your session data is persisted to a SQL Server database.
Profile Properties are an alternative way to save the state.
You can use some Ajax function that regularly "calls home" (executes some dummy code on the server). This will keep the session alive as long as the user has the page open.
You might need to explicitly use the Session in that callback, such as
Session["LastAccess"] = DateTime.Now;
just to keep it alive.
If you execute this call every 15 minutes, the session will not time out and the load on the server is minimal.
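As a minimal sketch, assuming an ASP.NET MVC app, the server side of such a keep-alive call could be a tiny action whose only job is to touch the session (the controller and action names are illustrative):

// Sketch: an endpoint that does nothing except reset the session timeout.
public class KeepAliveController : Controller
{
    [HttpPost]
    public ActionResult Ping()
    {
        // Touch the session explicitly so it is kept alive.
        Session["LastAccess"] = DateTime.Now;
        return new HttpStatusCodeResult(204); // no content needed
    }
}

On the client, a JavaScript timer (e.g. setInterval) posting to this URL every 15 minutes keeps the session alive without any user interaction.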
Use the asynchronous programming model (APM), which allows some portion of code to be executed on separate threads.
There are three styles of programming with APM:
Wait Until Done Model
Polling Model
Callback Model
Based on your requirements, you can choose the model which is most appropriate.
For instance, let's say you want to read a file and wait until it is done; sample code:
byte[] buffer = new byte[100];
string filename = string.Concat(Environment.SystemDirectory, "\\mfc71.pdb");
FileStream strm = new FileStream(filename,
    FileMode.Open, FileAccess.Read, FileShare.Read, 1024,
    FileOptions.Asynchronous);

// Make the asynchronous call
IAsyncResult result = strm.BeginRead(buffer, 0, buffer.Length, null, null);

// Do some work here while you wait

// Calling EndRead will block until the async work is complete
int numBytes = strm.EndRead(result);

// Don't forget to close the stream
strm.Close();

Console.WriteLine("Read {0} Bytes", numBytes);
Console.WriteLine(BitConverter.ToString(buffer));
But creating your own threads is not necessary or recommended; .NET has a built-in thread pool that can be used in many situations where you might be thinking of creating your own threads. Sample code:
static void WorkWithParameter(object o)
{
    string info = (string) o;
    for (int x = 0; x < 10; ++x)
    {
        Console.WriteLine("{0}: {1}", info,
            Thread.CurrentThread.ManagedThreadId);
        // Slow down thread and let other threads work
        Thread.Sleep(10);
    }
}
Instead of creating a new thread and controlling it ourselves, we let the ThreadPool do this work by using its QueueUserWorkItem method:
WaitCallback workItem = new WaitCallback(WorkWithParameter);
if (!ThreadPool.QueueUserWorkItem(workItem, "ThreadPooled"))
{
    Console.WriteLine("Could not queue item");
}

ASP.NET: HttpModule performance

I've implemented an HttpModule that intercepts the Response stream of every request and runs a half dozen to a dozen Regex.Replace()s on each text/html-typed response. I'm concerned about how much of a performance hit I'm incurring here. What's a good way to find out? I want to compare speed with and without this HttpModule running.
I've a few of these that hook into the Response.Filter stream pipeline to provide resource file integration, JS/CSS packing and rewriting of static files to absolute paths.
As long as you test your regexes in RegexBuddy for speed over a few million iterations, ensure you use RegexOptions.Compiled, and remember that often the quickest and most efficient technique is to use a regex to broadly identify matches and then use C# to hone that to exactly what you need.
Make sure you're also caching any configuration that you rely upon.
We've had a lot of success with this.
An HttpModule is just an ordinary piece of code, so you can measure the execution time of this particular regex-replace logic directly; that is enough. Use a set of typical response streams as input to your stress test and measure the execution of the replace with the Stopwatch class. Also consider the RegexOptions.Compiled switch.
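A rough sketch of that kind of micro-benchmark (the sample regex, replacement, and input file are placeholders; substitute the module's real patterns and captured response bodies):

using System;
using System.Diagnostics;
using System.IO;
using System.Text.RegularExpressions;

class FilterBenchmark
{
    static void Main()
    {
        // Placeholder input: use real captured text/html responses from your site.
        string html = File.ReadAllText("sample-response.html");
        var regex = new Regex(@"href=""(/[^""]*)""", RegexOptions.Compiled);

        const int iterations = 10000;
        var watch = Stopwatch.StartNew();
        for (int i = 0; i < iterations; i++)
        {
            // Same kind of work the HttpModule does per response.
            regex.Replace(html, @"href=""http://cdn.example.com$1""");
        }
        watch.Stop();

        Console.WriteLine("Average per response: {0:F3} ms",
            watch.Elapsed.TotalMilliseconds / iterations);
    }
}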
Here are a few ideas:
Add some Windows performance counters, and use them to measure and report average timing data. You might also increment a counter only if the time measurement exceeds a certain threshold.
Use tracing combined with Failed Request Tracing to collect and report timing data. You can also trigger FRT reports only if page execution time exceeds a threshold.
Write a unit test that uses the Windows OS clock to measure how long your code takes to execute.
Add a flag to your code that you can turn on or off with a test page to enable or disable your regex code, to allow easy A/B testing (see the sketch after this list).
Use a load test tool like WCAT to see how many page requests per second you can process with and without the code enabled.
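One possible sketch of that on/off flag, assuming the toggle lives in appSettings and the module checks it before installing its response filter (the setting name and module skeleton are illustrative, not the poster's actual module):

using System;
using System.Configuration;
using System.Web;

// Sketch of an HttpModule that can be switched off via config for A/B measurements.
public class RegexFilterModule : IHttpModule
{
    public void Init(HttpApplication context)
    {
        context.BeginRequest += (sender, e) =>
        {
            // Hypothetical appSettings key; flip it between test runs.
            bool enabled = string.Equals(
                ConfigurationManager.AppSettings["EnableRegexFilter"], "true",
                StringComparison.OrdinalIgnoreCase);

            if (enabled)
            {
                var app = (HttpApplication)sender;
                // Attach the response filter that performs the Regex.Replace work here, e.g.:
                // app.Response.Filter = new RegexReplaceFilter(app.Response.Filter);
            }
        };
    }

    public void Dispose() { }
}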
I recently had to do some perf tests on an HttpModule that I wrote and decided to run a couple of load tests to simulate web traffic and capture the timings with and without the module configured. It was the only way I could figure out the real effect of having the module installed.
I would usually do something with Apache Bench (see the following for how to install it: How to install apache bench on windows 7?), but I also had to use Windows authentication. As ab only supports basic authentication, it wasn't a fit for me. ab is slick and allows for different request scenarios, so that would be the first place to look. One other thought: you can get a lot of visibility by using Glimpse as well.
Since I couldn't use ab, I wrote something custom that allows for concurrent requests and can time different URLs.
Below is what I came up with to test the module, hope it helps!
// https://www.nuget.org/packages/RestSharp
using RestSharp;
using RestSharp.Authenticators;
using RestSharp.Authenticators.OAuth;
using RestSharp.Contrib;
using RestSharp.Deserializers;
using RestSharp.Extensions;
using RestSharp.Serializers;
using RestSharp.Validation;
string baseUrl = "http://localhost/";

void Main()
{
    for (var i = 0; i < 10; i++)
    {
        RunTests();
    }
}

private void RunTests()
{
    var sites = new string[] {
        "/resource/location",
    };
    RunFor(sites);
}

private void RunFor(string[] sites)
{
    RunTest(sites, 1);
    RunTest(sites, 5);
    RunTest(sites, 25);
    RunTest(sites, 50);
    RunTest(sites, 100);
    RunTest(sites, 500);
    RunTest(sites, 1000);
}

private void RunTest(string[] sites, int iterations, string description = "")
{
    var action = GetAction();
    var watch = new Stopwatch();

    // One task per site per iteration, so every started request is waited on.
    Task<bool>[] tasks = new Task<bool>[sites.Count() * iterations];

    watch.Start();
    for (int j = 0; j < iterations; j++)
    {
        for (int i = 0; i < sites.Count(); i++)
        {
            tasks[j * sites.Count() + i] = Task<bool>.Factory.StartNew(action, sites[i]);
        }
    }
    try
    {
        Task.WaitAll(tasks);
    }
    catch (AggregateException e)
    {
        Console.WriteLine("\nThe following exceptions have been thrown by WaitAll()");
        for (int j = 0; j < e.InnerExceptions.Count; j++)
        {
            Console.WriteLine("\n-------------------------------------------------\n{0}", e.InnerExceptions[j].ToString());
        }
    }
    finally
    {
        watch.Stop();
        Console.WriteLine("\"{0}|{1}|{2}\", ", sites.Count(), iterations, watch.Elapsed.TotalSeconds);
    }
}

private Func<object, bool> GetAction()
{
    baseUrl = baseUrl.Trim('/');
    return (object obj) =>
    {
        var str = (string)obj;
        var client = new RestClient(baseUrl);
        client.Authenticator = new NtlmAuthenticator();
        var request = new RestRequest(str, Method.GET);
        request.AddHeader("Accept", "text/html");
        var response = client.Execute(request);
        return (response != null);
    };
}
