ASP.NET: HttpModule performance

I've implemented an HttpModule that intercepts the Response stream of every request and runs a half dozen to a dozen Regex.Replace()s on each text/html-typed response. I'm concerned about how much of a performance hit I'm incurring here. What's a good way to find out? I want to compare speed with and without this HttpModule running.

I have a few of these that hook into the Response.Filter stream pipeline to provide resource-file integration, JS/CSS packing and rewriting of static files to absolute paths.
You should be fine as long as you test your regexes for speed (e.g. in RegexBuddy) over a few million iterations, ensure you use RegexOptions.Compiled, and remember that often the quickest and most efficient technique is to use a regex to broadly identify matches and then use C# to hone that down to exactly what you need.
Make sure you're also caching any configuration that you rely upon, rather than reloading it per request.
We've had a lot of success with this.
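To illustrate the compile-and-cache point, here's a minimal sketch; the pattern, class and method names are made up for the example, not taken from the module above:
using System.Text.RegularExpressions;
public static class HtmlRewrite
{
    // Built once with RegexOptions.Compiled and reused for the life of the
    // AppDomain; constructing a new Regex per request wastes the compile cost.
    private static readonly Regex StaticPath = new Regex(
        @"(src|href)=""/static/", RegexOptions.Compiled | RegexOptions.IgnoreCase);
    public static string ToAbsolute(string html, string root)
    {
        // Broad regex match first, then cheap string work to finish the job.
        return StaticPath.Replace(html, m => m.Groups[1].Value + @"=""" + root + "/static/");
    }
}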

An HTTP module is just an ordinary piece of code, so you can measure the execution time of this particular regex-replace logic on its own; that is enough. Take a set of typical response streams as the input to your stress test and time the replaces using the Stopwatch class. Consider also the RegexOptions.Compiled switch.
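For example, a throwaway harness along these lines (the pattern and body below are placeholders; substitute your real regexes and a few captured production responses) gives you a per-pass figure:
using System;
using System.Diagnostics;
using System.Text.RegularExpressions;
class RegexBench
{
    static void Main()
    {
        // Placeholder pattern and sample body, standing in for the real ones.
        var regex = new Regex(@"href=""([^""]+)""", RegexOptions.Compiled);
        string body = "<html><a href=\"/a\">x</a><a href=\"/b\">y</a></html>";
        const int iterations = 100000;
        var watch = Stopwatch.StartNew();
        for (int i = 0; i < iterations; i++)
        {
            regex.Replace(body, @"href=""/rewritten$1""");
        }
        watch.Stop();
        Console.WriteLine("{0:F6} ms per replace",
            watch.Elapsed.TotalMilliseconds / iterations);
    }
}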

Here are a few ideas:
Add some Windows performance counters, and use them to measure and report average timing data. You might also increment a counter only if the time measurement exceeds a certain threshold.
Use tracing combined with Failed Request Tracing to collect and report timing data. You can also trigger FRT reports only if page execution time exceeds a threshold.
Write a unit test that uses the Windows OS clock to measure how long your code takes to execute.
Add a flag to your code that you can turn on or off with a test page to enable or disable your regex code, to allow easy A/B testing (see the sketch after this list).
Use a load test tool like WCAT to see how many page requests per second you can process with and without the code enabled.
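As a minimal sketch of that on/off flag idea: install the response filter only when a config flag is set, so you can toggle the regex work without redeploying. The "EnableHtmlRewrite" appSettings key and the RewriteFilterStream type are invented for this example.
using System;
using System.Configuration;
using System.Web;
public class RewriteModule : IHttpModule
{
    public void Init(HttpApplication context)
    {
        context.ReleaseRequestState += (sender, e) =>
        {
            bool enabled = string.Equals(
                ConfigurationManager.AppSettings["EnableHtmlRewrite"],
                "true", StringComparison.OrdinalIgnoreCase);
            HttpResponse response = ((HttpApplication)sender).Response;
            if (enabled && response.ContentType == "text/html")
            {
                // Hypothetical filter stream that applies the Regex.Replace() calls.
                response.Filter = new RewriteFilterStream(response.Filter);
            }
        };
    }
    public void Dispose() { }
}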

I recently had to do some perf tests on an HttpModule that I wrote, and decided to run a couple of load tests to simulate web traffic and capture the timings with and without the module configured. It was the only way I could figure to really know the effect of having the module installed.
I would usually do something with Apache Bench (see How to install apache bench on windows 7? for how to install it), but I also had to use Windows authentication, and as ab only supports basic authentication it wasn't a fit for me. ab is slick and allows for different request scenarios, so that would be the first place to look. One other thought: you can get a lot of visibility by using Glimpse as well.
Since I couldn't use ab, I wrote something custom that allows for concurrent requests and timing different URLs.
Below is what I came up with to test the module, hope it helps!
// LINQPad-style script; RestSharp from https://www.nuget.org/packages/RestSharp
using System;
using System.Diagnostics;
using System.Threading.Tasks;
using RestSharp;
using RestSharp.Authenticators;

string baseUrl = "http://localhost/";

void Main()
{
    for (var i = 0; i < 10; i++)
    {
        RunTests();
    }
}

private void RunTests()
{
    var sites = new string[] {
        "/resource/location",
    };
    RunFor(sites);
}

private void RunFor(string[] sites)
{
    RunTest(sites, 1);
    RunTest(sites, 5);
    RunTest(sites, 25);
    RunTest(sites, 50);
    RunTest(sites, 100);
    RunTest(sites, 500);
    RunTest(sites, 1000);
}

private void RunTest(string[] sites, int iterations, string description = "")
{
    var action = GetAction();
    var watch = new Stopwatch();
    // One task per site per iteration, sized up front so that WaitAll()
    // really waits on every request, not just the last batch started.
    Task<bool>[] tasks = new Task<bool>[sites.Length * iterations];
    watch.Start();
    for (int j = 0; j < iterations; j++)
    {
        for (int i = 0; i < sites.Length; i++)
        {
            tasks[j * sites.Length + i] = Task<bool>.Factory.StartNew(action, sites[i]);
        }
    }
    try
    {
        Task.WaitAll(tasks);
    }
    catch (AggregateException e)
    {
        Console.WriteLine("\nThe following exceptions have been thrown by WaitAll()");
        for (int j = 0; j < e.InnerExceptions.Count; j++)
        {
            Console.WriteLine("\n-------------------------------------------------\n{0}", e.InnerExceptions[j].ToString());
        }
    }
    finally
    {
        watch.Stop();
        // Output: site count | iterations | total seconds
        Console.WriteLine("\"{0}|{1}|{2}\", ", sites.Length, iterations, watch.Elapsed.TotalSeconds);
    }
}

private Func<object, bool> GetAction()
{
    baseUrl = baseUrl.Trim('/');
    return (object obj) =>
    {
        var str = (string)obj;
        var client = new RestClient(baseUrl);
        client.Authenticator = new NtlmAuthenticator(); // Windows auth, which ab can't do
        var request = new RestRequest(str, Method.GET);
        request.AddHeader("Accept", "text/html");
        var response = client.Execute(request);
        return (response != null);
    };
}

Related

Confluent batch consumer: consumer not working if a timeout is specified

I am trying to consume a maximum of 1000 messages from Kafka at a time (I am doing this because I need to batch-insert into MSSQL). I was under the impression that Kafka keeps an internal queue which fetches messages from the brokers, and that when I use the consumer.Consume() method it just checks whether there are any messages in the internal queue and returns if it finds something; otherwise it blocks until the internal queue is updated or until the timeout.
I tried to use the solution suggested here: https://github.com/confluentinc/confluent-kafka-dotnet/issues/1164#issuecomment-610308425
but when I specify TimeSpan.Zero (or any other timespan up to 1000 ms) the consumer never consumes any messages. If I remove the timeout it does consume messages, but then I am unable to exit the loop if there are no more messages left to be read.
I also saw another question on Stack Overflow which suggested reading the offset of the last message sent to Kafka and then reading messages until that offset is reached, then breaking from the loop. But currently I only have one consumer and 6 partitions for the topic; I haven't tried it yet, but I think managing offsets for each of the partitions might make the code messy.
Can someone please tell me what to do?
// Requires: using System; using System.Collections.Generic;
// using System.Configuration; using System.Text.Json; using Confluent.Kafka;
static List<RealTime> getBatch()
{
    var config = new ConsumerConfig
    {
        BootstrapServers = ConfigurationManager.AppSettings["BootstrapServers"],
        GroupId = ConfigurationManager.AppSettings["ConsumerGroupID"],
        AutoOffsetReset = AutoOffsetReset.Earliest,
    };
    List<RealTime> results = new List<RealTime>();
    List<string> malformedJson = new List<string>();
    using (var consumer = new ConsumerBuilder<Ignore, string>(config).Build())
    {
        consumer.Subscribe("RealTimeTopic");
        int count = 0;
        while (count < batchSize) // batchSize is a field set elsewhere (1000 here)
        {
            var consumerResult = consumer.Consume(1000); // 1000 ms timeout
            if (consumerResult?.Message is null)
            {
                break;
            }
            Console.WriteLine("read");
            try
            {
                RealTime item = JsonSerializer.Deserialize<RealTime>(consumerResult.Message.Value);
                results.Add(item);
                count += 1;
            }
            catch (Exception)
            {
                Console.WriteLine("malformed");
                malformedJson.Add(consumerResult.Message.Value);
            }
        }
        consumer.Close();
    }
    Console.WriteLine(malformedJson.Count);
    return results;
}
I found a workaround.
For some reason the consumer first needs to be called without a timeout; that call waits until it gets at least one message. After that, calling Consume with a timeout of zero fetches the rest of the messages one by one from the internal queue. This seems to work out for the best.
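In code, the pattern looks roughly like this (using the same consumer and batchSize as the snippet above; handleMessage is a hypothetical stand-in for the deserialize-and-add logic):
// First call without a timeout: blocks until at least one message arrives.
var first = consumer.Consume();
handleMessage(first);

// Then drain the client's internal queue without blocking.
int count = 1;
while (count < batchSize)
{
    var next = consumer.Consume(TimeSpan.Zero);
    if (next?.Message is null)
        break; // internal queue is empty, the batch is done
    handleMessage(next);
    count++;
}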
I had a similar problem; updating the Confluent.Kafka and librdkafka libraries from version 1.8.2 to 2.0.2 helped.

Using a background worker in ASP.NET with AJAX

I have the need to perform a background task that has a progress bar that shows percentage done and a cancel button. Task specifics aside, for now, I just want to get an example working, so I just have the three main event handlers (DoWork, ProgressChanged, and RunWorkerCompleted) and a loop that just increments a counter and sleeps for 50ms in DoWork. However, it doesn't update except for once at the end.
In Windows Forms I use a Background worker and it functions correctly without any issues. I'd like to just use this same code. However, I have been seeing stuff that says ASP.NET must use AJAX to get the same functionality. So my questions are:
1) Do I really need AJAX to use the BackgroundWorker?
2) If yes, I do need AJAX: what is the simplest way for a person that doesn't know a darn thing about AJAX to get the BackgroundWorker up and running on an ASP.NET web page?
3) If no, I don't need AJAX: can anyone point me to a working sample that doesn't use it? I am interested even if it uses some other threading method than BackgroundWorker.
Sorry for the multi-part question! If you can answer one or the other, it would be much appreciated. I don't really mind which method I end up using as long as it works.
Code for reference from the .cs page:
protected void bwProcess_RunWorkerCompleted(object sender, RunWorkerCompletedEventArgs e)
{
    lblProgress.Text = "Task Complete: " + e.Result;
}

protected void bwProcess_ProgressChanged(object sender, ProgressChangedEventArgs e)
{
    lblProgress.Text = e.ProgressPercentage.ToString();
}

protected void bwProcess_DoWork(object sender, DoWorkEventArgs e)
{
    for (int i = 0; i <= 100; i++)
    {
        if (bwProcess.CancellationPending)
        {
            lblProgress.Text = "Task Cancelled.";
            e.Cancel = true;
            return;
        }
        bwProcess.ReportProgress(i);
        Thread.Sleep(50);
    }
    e.Result = "100%";
}

protected void BWClick(object sender, EventArgs e)
{
    lblProgress.Text = "Firing Process...";
    bwProcess = new BackgroundWorker();
    bwProcess.WorkerReportsProgress = true;
    bwProcess.WorkerSupportsCancellation = true;
    bwProcess.DoWork += new DoWorkEventHandler(bwProcess_DoWork);
    bwProcess.ProgressChanged += new ProgressChangedEventHandler(bwProcess_ProgressChanged);
    bwProcess.RunWorkerCompleted += new RunWorkerCompletedEventHandler(bwProcess_RunWorkerCompleted);
    if (bwProcess != null)
    {
        bwProcess.RunWorkerAsync("StartAsynchronousProcess");
    }
}
Other notes: I have entered Async="True" and EnableSessionState="ReadOnly" into the #Page.
Thanks in advance!
Web programming offers many challenges that are easy to take for granted in desktop programming, and moving from one to the other can require a lot of changes. Spawning long-running threads is one of those things that require more care to avoid pitfalls: the application pool does not know about threads that you spawn, so when it recycles it will kill those threads, causing unexpected behavior in your application. See this post for more about that: Can I use threads to carry out long-running jobs on IIS?
This means you need a more persistent means of keeping track of progress. A database would probably be best, but even a file would persist after the pool is recycled.
AJAX is a good fit here because it lets you pull the progress from the database asynchronously in the background and update the web page. Here is a brief example of how you might achieve this; all the percentage calculations are done on the server side.
function getProgress() {
    $.ajax({
        url: '/progress',            // send HTTP GET to the /progress endpoint
        dataType: 'json',            // response format will be JSON
        success: function (result) { // runs once the result is pulled
            $("#p1").html('Percentage: %' + result);
            if (result == "100")
                clearInterval(window.progressID);
        }
    });
}

function startGetProgress() {
    window.progressID = setInterval(getProgress, 10000); // update progress every 10 seconds
}
The above example uses the jQuery ajax method (http://api.jquery.com/jQuery.ajax/), so you will need to reference the jQuery library.
Whether you are using WebForms or MVC, you can add a Web API controller to handle the AJAX request. Let me know if you need me to clarify anything.
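For completeness, the server side could be as small as this sketch; ProgressStore is a made-up helper standing in for the database read, and you'd need a route mapping /progress to the controller:
using System.Web.Http;

public class ProgressController : ApiController
{
    // Handles the GET that the jQuery code above sends to /progress.
    public int Get()
    {
        // Read the percentage the background job persists to the database,
        // so it survives app-pool recycles.
        return ProgressStore.GetCurrentPercentage();
    }
}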

Loading any persistent workflow containing delay activity when it is a runnable instance in the store

We are trying to load and resume workflows which contain a delay. I have seen the Microsoft Absolute Delay sample for this, which uses store.WaitForEvents and LoadRunnableInstance to load the workflow; however, there the workflow is already known.
In our case we want to wait on store.WaitForEvents, say every 5 seconds, to check whether there is a runnable instance, and if so load and run only that/those particular instances. Is there a way to know which workflow instance is ready?
We are maintaining the workflow ID and the XAML associated with it in our database, so if we could learn the workflow instance ID we could fetch the XAML mapped to it, create the workflow, and then call LoadRunnableInstance on it.
Any help would be greatly appreciated.
Microsoft sample (Absolute Delay)
public void Run()
{
    wfHostTypeName = XName.Get("Version" + Guid.NewGuid().ToString(),
        typeof(WorkflowWithDelay).FullName);
    this.instanceStore = SetupSqlpersistenceStore();
    this.instanceHandle =
        CreateInstanceStoreOwnerHandle(instanceStore, wfHostTypeName);
    WorkflowApplication wfApp = CreateWorkflowApp();
    wfApp.Run();
    while (true)
    {
        this.waitHandler.WaitOne();
        if (completed)
        {
            break;
        }
        WaitForRunnableInstance(this.instanceHandle);
        wfApp = CreateWorkflowApp();
        try
        {
            wfApp.LoadRunnableInstance();
            waitHandler.Reset();
            wfApp.Run();
        }
        catch (InstanceNotReadyException)
        {
            Console.WriteLine("Handled expected InstanceNotReadyException, retrying...");
        }
    }
    Console.WriteLine("workflow completed.");
}

public void WaitForRunnableInstance(InstanceHandle handle)
{
    var events = instanceStore.WaitForEvents(handle, TimeSpan.MaxValue);
    bool foundRunnable = false;
    foreach (var persistenceEvent in events)
    {
        if (persistenceEvent.Equals(HasRunnableWorkflowEvent.Value))
        {
            foundRunnable = true;
            break;
        }
    }
    if (!foundRunnable)
    {
        Console.WriteLine("no runnable instance");
    }
}
Thanks
Anamika
I had a similar problem with durable Delay activities and WorkflowApplicationHost. I ended up creating my own Delay activity that worked essentially the same way as the one out of the box (it takes an argument describing when to resume the workflow, and then bookmarks itself). Instead of saving the delay info in the SqlInstanceStore, though, my Delay activity created a record in a separate database (similar to the one you are using to track the workflow IDs and XAML). I then wrote a simple service that polled that database for expired delays and initiated a resume of the necessary workflow.
Oh, and the Delay activity deleted its record from that database on bookmark resume.
HTH
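A rough sketch of that kind of activity, with the DelayStore helper and its schema invented purely for illustration:
using System;
using System.Activities;

public sealed class DbDelay : NativeActivity
{
    public InArgument<TimeSpan> Duration { get; set; }

    // Lets the host persist and unload while the workflow waits on the bookmark.
    protected override bool CanInduceIdle
    {
        get { return true; }
    }

    protected override void Execute(NativeActivityContext context)
    {
        DateTime resumeAt = DateTime.UtcNow + Duration.Get(context);
        // Record (instance id, bookmark name, resume time) in the external table
        // that the polling service scans for expired delays.
        DelayStore.Insert(context.WorkflowInstanceId, "DbDelayBookmark", resumeAt);
        context.CreateBookmark("DbDelayBookmark", OnResumed);
    }

    private void OnResumed(NativeActivityContext context, Bookmark bookmark, object value)
    {
        // Clean up the tracking record once the polling service resumes us.
        DelayStore.Delete(context.WorkflowInstanceId);
    }
}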
I'd suggest having a separate SqlPersistenceStore for each workflow definition you're hosting.

ASP.NET Web application prevent denial of service attacks

What tools or techniques can I use to protect my ASP.NET web application from Denial Of Service attacks
A hardware solution is for sure the best option to prevent DoS attacks, but in a situation where you have no access to the hardware config or IIS settings, a developer must have something handy to block, or at least reduce the effect of, a DoS attack.
The core concept of the logic relies on a FIFO (First In, First Out) collection such as Queue, but as that has some limitations I decided to create my own collection.
Without going into more detail, this is the complete code I use:
public class AntiDosAttack
{
    // Requests from different sessions run concurrently, so access to this
    // shared list is synchronized below.
    readonly static List<IpObject> items = new List<IpObject>();
    readonly static object sync = new object();

    public static void Monitor(int Capacity, int Seconds2Keep, int AllowedCount)
    {
        string ip = HttpContext.Current.Request.UserHostAddress;
        if (ip == "")
            return;
        // This part excludes some useful requesters
        if (HttpContext.Current.Request.UserAgent != null && HttpContext.Current.Request.UserAgent == "Some good bots")
            return;
        int count;
        lock (sync)
        {
            // Remove old requests from the collection
            int index = -1;
            for (int i = 0; i < items.Count; i++)
            {
                if ((DateTime.Now - items[i].Date).TotalSeconds > Seconds2Keep)
                {
                    index = i;
                    break;
                }
            }
            if (index > -1)
            {
                items.RemoveRange(index, items.Count - index);
            }
            // Add the new IP
            items.Insert(0, new IpObject(ip));
            // Trim the collection back to its original capacity;
            // I could not find a more reliable way
            if (items.Count > Capacity)
            {
                items.RemoveAt(items.Count - 1);
            }
            // Count of the current IP in the collection
            count = items.Count(t => t.IP == ip);
        }
        // Decide whether to block or let the request through
        if (count > AllowedCount)
        {
            // alert the webmaster by email (optional)
            ErrorReport.Report.ToWebmaster(new Exception("Blocked probable ongoing ddos attack"), "EvrinHost 24 / 7 Support - DDOS Block", "");
            // create a 429 response (or whatever is needed) and end the response
            HttpContext.Current.Response.StatusCode = 429;
            HttpContext.Current.Response.StatusDescription = "Too Many Requests, Slow down Cowboy!";
            HttpContext.Current.Response.Write("Too Many Requests");
            HttpContext.Current.Response.Flush(); // Sends all currently buffered output to the client.
            HttpContext.Current.Response.SuppressContent = true; // Stops further HTTP content going to the client.
            HttpContext.Current.ApplicationInstance.CompleteRequest(); // Bypasses all remaining events and filtering in the HTTP pipeline and directly executes EndRequest.
        }
    }

    internal class IpObject
    {
        public IpObject(string ip)
        {
            IP = ip;
            Date = DateTime.Now;
        }
        public string IP { get; set; }
        public DateTime Date { get; set; }
    }
}
The internal class is designed to keep the date of each request.
Naturally, DoS-attack requests create a new session on each request, while human visits to a website consist of multiple requests packed into one session, so the method can be called from Session_Start.
usage:
protected void Session_Start(object sender, EventArgs e)
{
    // The numbers can be tuned for different purposes; this is for a low-traffic site.
    // It means: block a request when one IP accounts for more than 10 of the
    // 30 most recent requests kept within a 2-second window.
    AntiDosAttack.Monitor(30, 2, 10);
}
For a website with heavy traffic you may change seconds to milliseconds, but consider the extra load this code causes.
I am not aware of a better way to block intentional attacks on a website programmatically, so I'd appreciate any comments or suggestions to improve the code. Until then, I consider this a reasonable practice for mitigating DoS attacks on ASP.NET websites.
Try the Dynamic IP Restriction extension http://www.iis.net/download/dynamiciprestrictions
Not a perfect solution, but helps raise the bar =)
It's a broad area, so if you can be more specific about your application, or the level of threat you're trying to protect against, I'm sure more people can help you.
However, off the bat, you can go for a combination of a caching solution such as Squid (http://www.blyon.com/using-squid-proxy-to-fight-ddos/), Dynamic IP Restriction (as explained by Jim) and, if you have the infrastructure, an active-passive failover setup, where your passive machine serves placeholder content which doesn't hit your database or any other machines. This is a last line of defence, to minimise the time a DDoS might bring your entire site offline.

Small performance test on a web service

I'm trying to develop a small application that tests how many requests per second my service can support, but I think I'm doing something wrong. The service is in an early development stage, but I'd like to have this test handy so I can check from time to time that I'm not doing something that decreases performance. The problem is that I cannot get the web server or the database server to reach 100% CPU.
I'm using three different computers: one runs the web server (WinSrv Standard 2008 x64, IIS7), another the database (Win 2K, SQL Server 2005), and the last is my computer (Win7 x64 Ultimate), where I run the test. The computers are connected through a 100 Mbit Ethernet switch. The POST request is 9 bytes and the response is 842 bytes.
The test launches several threads, each with a while loop; in each pass it creates a WebRequest object, performs a call, increments a shared counter, waits between 1 and 5 milliseconds, and then does it again:
// Requires: using System; using System.IO; using System.Net;
// using System.Text; using System.Threading;
static int counter = 0;

static void Main(string[] args)
{
    ServicePointManager.DefaultConnectionLimit = 250;
    Console.WriteLine("Ready. Press any key...");
    Console.ReadKey();
    Console.WriteLine("Running...");
    string localhost = "localhost";
    string linuxmono = "192.168.1.74";
    string server = "192.168.1.5:8080";
    DateTime start = DateTime.Now;
    Random r = new Random(DateTime.Now.Millisecond);
    for (int i = 0; i < 50; i++)
    {
        new Thread(new ParameterizedThreadStart(Test)).Start(server);
        Thread.Sleep(r.Next(1, 3));
    }
    Thread.Sleep(2000);
    while (true)
    {
        Console.WriteLine("Request per second :"
            + counter / DateTime.Now.Subtract(start).TotalSeconds);
        Thread.Sleep(3000);
    }
}

public static void Test(object ip)
{
    Guid guid = Guid.NewGuid();
    Random r = new Random(DateTime.Now.Millisecond);
    while (true)
    {
        String test = "<lalala/>";
        WebRequest req = WebRequest.Create("http://"
            + (string)ip + "/WebApp/" + guid.ToString()
            + "/Data/Tables=whatever");
        req.Method = "POST";
        req.ContentType = "application/xml";
        req.Credentials = new NetworkCredential("aaa", "aaa", "domain");
        byte[] array = Encoding.UTF8.GetBytes(test);
        req.ContentLength = array.Length;
        using (Stream reqStream = req.GetRequestStream())
        {
            reqStream.Write(array, 0, array.Length);
            reqStream.Close();
        }
        using (Stream responseStream = req.GetResponse().GetResponseStream())
        {
            String response = new StreamReader(responseStream).ReadToEnd();
            if (response.Length != 842) Console.Write(" EEEE ");
        }
        Interlocked.Increment(ref counter);
        Thread.Sleep(r.Next(1, 5));
    }
}
If I run the test, neither of the computers shows excessive CPU usage. Say I get X requests per second; if I run the console application twice at the same time, I get X/2 requests per second in each one... but still the web server sits at 30% CPU and the database server at 25%...
I've tried removing the Thread.Sleep in the loop, but it doesn't make a big difference.
I'd like to push the machines to the maximum, to check how many requests per second they can provide. I guessed that I could do it this way... but apparently I'm missing something here... What is the problem?
Kind regards.
IMO, you're better off using SoapUI for the test. You can easily adjust the test case for the number of threads, number of iterations, etc., and it'll graph the results. When you hit the plateau where you overwhelm the server, you'll see it on the graph. If one PC isn't enough, just run more of them on other PCs. You can do all of this with the free version.
There are a lot of limiting factors besides CPU on a web server. There are a lot of IIS settings that throttle the number of connections that can be served.
I would read this:
http://www.eggheadcafe.com/articles/20050613.asp
I know it is for IIS 6, but there are things that will still apply.
If you have access to MSDN and have VS 2010 Ultimate, I would check out its load testing tools. Purchasing the load testing program can be expensive, but if you need to test something specific, you can use the trial version to accomplish what you need. You can use it to monitor response time, server utilization, and so on. Well worth looking into.
I agree with Chris, and would go a step further to recommend JMeter, as it can also test the database and webapp, all within the same script.
