I am writing an HttpModule in VS2010/ASP.NET 4.0 for use with IIS 7. The module is going to enforce query string security by encrypting query strings.
I would like this module to be completely independent of, as well as transparent to, the website, so that the website has no knowledge of the fact that query string encryption is being employed. This ensures, firstly, that pages/controls don't have to care about this issue. Secondly, it allows query string encryption to be enabled for Production environments and disabled for non-Production ones (by adding or removing the HTTP module in the Web.config).
I have designed the HttpModule to be plugged into IIS via the Web.config:
<configuration>
  <system.web>
    <httpModules>
      <add name="QueryStringSecurityModule" type="MyHttpModules.QueryStringSecurityModule"/>
    </httpModules>
  </system.web>
</configuration>
The module itself looks like this:
public class QueryStringSecurityModule : IHttpModule
{
    public virtual void Init(HttpApplication application)
    {
        application.BeginRequest += HandleBeginRequest;
        application.EndRequest += HandleEndRequest;
        application.ReleaseRequestState += HandleReleaseRequestState;
    }

    public virtual void Dispose()
    {
    }

    private void HandleBeginRequest(object sender, EventArgs e)
    {
        // TODO : Decrypt the query string here and pass it on to the application
    }

    private void HandleEndRequest(object sender, EventArgs e)
    {
        // TODO : Twiddle thumbs
    }

    private void HandleReleaseRequestState(object sender, EventArgs e)
    {
        var response = HttpContext.Current.Response;
        if (response.ContentType == "text/html")
        {
            response.Filter = new QueryStringSecurityStream(response.Filter);
        }
    }
}
There is a class QueryStringSecurityStream which is used to fiddle with the HTML output in the Response, securing all anchor tags by replacing the query strings in their href attributes with encrypted ones.
public class QueryStringSecurityStream : Stream
{
    // The original response filter stream that we wrap and ultimately write to.
    private readonly Stream stream;

    public QueryStringSecurityStream(Stream stream)
    {
        this.stream = stream;
    }

    public override void Write(byte[] buffer, int offset, int count)
    {
        var html = Encoding.Default.GetString(buffer, offset, count).ReplaceHRefsWithSecureHRefs();
        var bytes = Encoding.Default.GetBytes(html);
        this.stream.Write(bytes, 0, bytes.Length);
    }

    // The remaining abstract Stream members (CanRead, CanSeek, CanWrite, Length,
    // Position, Flush, Read, Seek, SetLength) are omitted here for brevity.
}
The magic happens, or is supposed to happen, in the ReplaceHRefsWithSecureHRefs() extension method.
This method expects the entire HTML. It will go through it with a fine-toothed comb (i.e., using a Regex), find all the anchor tags, take out their href attributes, replace any query strings in the href value with encrypted versions and return the HTML. This HTML will then be written out to the Response stream.
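For illustration, here is a minimal sketch of what such an extension method might look like. This is not the poster's actual code: the regular expression is deliberately simplistic, the Encrypt helper is a placeholder, and it assumes using directives for System, System.Text, System.Text.RegularExpressions and System.Web.

public static class QueryStringSecurityExtensions
{
    // Matches href="..." inside anchor tags; deliberately simplistic.
    private static readonly Regex HrefRegex =
        new Regex(@"(<a\b[^>]*?\bhref\s*=\s*"")([^""]*)("")",
                  RegexOptions.IgnoreCase | RegexOptions.Compiled);

    public static string ReplaceHRefsWithSecureHRefs(this string html)
    {
        return HrefRegex.Replace(html, match =>
        {
            var url = match.Groups[2].Value;
            var queryStart = url.IndexOf('?');
            if (queryStart < 0)
            {
                return match.Value; // no query string, nothing to do
            }

            var path = url.Substring(0, queryStart);
            var query = url.Substring(queryStart + 1);

            // Encrypt is a placeholder for whatever encryption routine is in use.
            return match.Groups[1].Value + path + "?q=" + Encrypt(query) + match.Groups[3].Value;
        });
    }

    private static string Encrypt(string plainText)
    {
        // Placeholder only: a real implementation would use proper encryption
        // (e.g. AES) followed by a URL-safe encoding.
        return HttpUtility.UrlEncode(Convert.ToBase64String(Encoding.UTF8.GetBytes(plainText)));
    }
}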
So far so good. All of this falls over because, I suspect, ReleaseRequestState is raised multiple times for a single request. That is to say, there are multiple calls to ReleaseRequestState as a result of a single call to BeginRequest.
What I am looking for is:
Confirmation that my hunch is correct. I have asked Mr. Google and Mr. MSDN but haven't found anything definitive. I seem to remember hitting something similar while debugging WSDL from an ASMX web service running in IIS 6. In that case I solved the issue by caching the incoming byte stream till I had valid XML and then writing it all out after modifying it.
The right way to handle this sort of scenario. You can take this to either mean specifically the single BeginRequest/multiple ReleaseRequestState calls issue or query string encryption generally.
Ladies and Gentlemen. Start your engines. Let the answers roll in.
Update:
I read this article on MSDN on request life-cycle
I have solved this issue for myself by creating a buffer to store response content across multiple calls to ReleaseRequestState. On every call, I check for the existence of a </html> tag and write out the content buffered up to that point after modification (in my case encrypting the query strings in the <a> tags).
So:
Declare a StringBuilder as a private field member in the QueryStringSecurityModule class (in my case a StringBuilder to serve as a buffer for response content).
Initialize the field at BeginRequest (in my case, allocate a StringBuilder).
Finalize the field at EndRequest (in my case set it to null, though I have read that EndRequest doesn't always fire)
Buffer bytes being sent to Write in the custom filter till we find a closing html tag, at which point we modify the buffer contents and write them out to the output stream (see the sketch below).
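Here is a rough sketch of that buffering filter, assuming UTF-8 output and reusing the ReplaceHRefsWithSecureHRefs extension method described above. In this sketch the buffer lives inside the filter itself rather than in a module-level field, but the idea is the same. It assumes using directives for System, System.IO and System.Text.

public class BufferingQueryStringSecurityStream : Stream
{
    private readonly Stream inner;                          // the original response filter
    private readonly StringBuilder pageBuffer = new StringBuilder();

    public BufferingQueryStringSecurityStream(Stream inner)
    {
        this.inner = inner;
    }

    public override void Write(byte[] buffer, int offset, int count)
    {
        // Write is typically called more than once per response, so accumulate first.
        this.pageBuffer.Append(Encoding.UTF8.GetString(buffer, offset, count));

        // Once the closing </html> tag has arrived we have the whole document:
        // rewrite the anchors and push the result to the real output stream.
        if (this.pageBuffer.ToString().IndexOf("</html>", StringComparison.OrdinalIgnoreCase) >= 0)
        {
            var html = this.pageBuffer.ToString().ReplaceHRefsWithSecureHRefs();
            var bytes = Encoding.UTF8.GetBytes(html);
            this.inner.Write(bytes, 0, bytes.Length);
            this.pageBuffer.Clear();
        }
    }

    public override void Flush() { this.inner.Flush(); }

    // Write-only pass-through; the rest of Stream is not needed by a response filter.
    public override bool CanRead { get { return false; } }
    public override bool CanSeek { get { return false; } }
    public override bool CanWrite { get { return true; } }
    public override long Length { get { throw new NotSupportedException(); } }
    public override long Position
    {
        get { throw new NotSupportedException(); }
        set { throw new NotSupportedException(); }
    }
    public override int Read(byte[] buffer, int offset, int count) { throw new NotSupportedException(); }
    public override long Seek(long offset, SeekOrigin origin) { throw new NotSupportedException(); }
    public override void SetLength(long value) { throw new NotSupportedException(); }
}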
Would anybody like to comment on this approach?
Related
I'm writing a route that will allow the user to set a cookie with the version of some JSON object that the application will use for client-side configuration. It is a fairly large JSON object that we don't want to store in the cookie itself. We want to store ONLY the version, and look the object up from a map in the cloud on every request, since multiple versions of the client are in circulation and we want them separated on a per-request basis.
Currently, I know the problem is due to my lack of understanding of the request lifecycle of ASP.NET MVC, as I'm sure the following code proves. I do know that Application_BeginRequest probably happens BEFORE the route is handled (correct me if I'm wrong here), but I am not sure where it SHOULD happen so that the cookie is populated BEFORE it is retrieved. I also don't believe Application_EndRequest would be any better, for the same reason in reverse.
Any and all suggestions that lead to my understanding of the lifecycle and an appropriate Action to handle that kind of cookie value getting will be welcomed!
// Working controller (cookie does get set, this is confirmed)
using System;
using System.Web;
using System.Web.Mvc;
using SMM.Web.Infrastructure.Filters;

namespace SMM.Web.Controllers
{
    [NoCache]
    public class SetCookieController : ApplicationController
    {
        private HttpCookie CreateVersionCookie(int versionId)
        {
            HttpCookie versionCookie = new HttpCookie("version_id");
            versionCookie.Value = versionId.ToString();
            return versionCookie;
        }

        public ActionResult SetCookie(int versionId)
        {
            Response.Cookies.Add(CreateVersionCookie(versionId));
            return Redirect("/");
        }
    }
}
// In Global.asax.cs (this does not work to get the cookie)
private void LoadSomeJsonFromACookie()
{
    HttpCookie someJsonThingCookie = HttpContext.Current.Request.Cookies["version_id"];
    string jsonVersion = someJsonThingCookie.Value;
    string json = FunctionToGetSomeJsonThingByVersion(jsonVersion); // This returns a stringified JSON object based on the jsonVersion supplied
    dynamic someJsonThing = JsonConvert.DeserializeObject<dynamic>(json);
    HttpContext.Current.Items["someJsonThing"] = someJsonThing;
}

protected void Application_BeginRequest(object sender, EventArgs e)
{
    RedirectToHttps();
    // some other redirects happen here
    LoadSomeJsonFromACookie();
}
Application_BeginRequest is the right place. Since the controller redirects back to the root / after setting the cookie, the cookie is in place before it is ever needed.
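One detail worth guarding against (my own aside, not part of the original post): on the very first request, before SetCookie has ever run, the cookie will not exist, so the lookup should tolerate null. A minimal sketch:

private void LoadSomeJsonFromACookie()
{
    HttpCookie versionCookie = HttpContext.Current.Request.Cookies["version_id"];
    if (versionCookie == null || string.IsNullOrEmpty(versionCookie.Value))
    {
        // No cookie yet (first visit, or cookies cleared): skip, or fall back
        // to whatever default version the application uses.
        HttpContext.Current.Items["someJsonThing"] = null;
        return;
    }

    string json = FunctionToGetSomeJsonThingByVersion(versionCookie.Value);
    HttpContext.Current.Items["someJsonThing"] = JsonConvert.DeserializeObject<dynamic>(json);
}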
I have an ASP.Net website where I am downloading a large zip file to the server from a remote site. This file is not transferred to the client, but will remain on the server. I would like to provide progress updates to the user using SignalR. When I use the code below:
public class InstallController : Hub
{
    public void Send( string message )
    {
        Clients.All.AddMessage( message );
    }

    public void FileDownload()
    {
        WebClient client = new WebClient();
        client.DownloadProgressChanged += new DownloadProgressChangedEventHandler( client_DownloadProgressChanged );
        client.DownloadFileCompleted += new AsyncCompletedEventHandler( client_DownloadFileCompleted );
        client.DownloadFileAsync( new Uri( "http://someserver.com/install/file.zip" ), @"\file.zip" );
    }

    /* callbacks for download */
    void client_DownloadProgressChanged( object sender, DownloadProgressChangedEventArgs e )
    {
        double bytesIn = double.Parse( e.BytesReceived.ToString() );
        double totalBytes = double.Parse( e.TotalBytesToReceive.ToString() );
        double percentage = bytesIn / totalBytes * 100;
        this.Send( String.Format( "Download progress: {0}%", percentage.ToString() ) );
    }

    void client_DownloadFileCompleted( object sender, AsyncCompletedEventArgs e )
    {
        this.Send( "Finished downloading file..." );
    }
}
I get the exception:
An exception of type 'System.InvalidOperationException' occurred in
System.Web.dll but was not handled in user code
Additional information: An asynchronous operation cannot be started at
this time. Asynchronous operations may only be started within an
asynchronous handler or module or during certain events in the Page
lifecycle. If this exception occurred while executing a Page, ensure
that the Page is marked <%# Page Async="true" %>. This exception may
also indicate an attempt to call an "async void" method, which is
generally unsupported within ASP.NET request processing. Instead, the
asynchronous method should return a Task, and the caller should await
it.
I've seen several mentions to use the HttpClient instead of the WebClient, but I don't see how to get the progress from that.
"It's All About the SynchronizationContext"
http://msdn.microsoft.com/en-us/magazine/gg598924.aspx
This phrase is becoming quite common since the addition of new technology and features in .NET.
Briefly: several components, such as BackgroundWorker and WebClient, capture and use the SynchronizationContext behind the scenes, which means you need to respect the request life cycle of ASP.NET and its components.
Speaking specifically, the HTTP methods (GET and POST) always work the same way: the client submits an HTTP request to the server, then the server returns a response to the client, and the framework tries to ensure that this happens; the SynchronizationContext of ASP.NET was designed around this.
More information:
http://codewala.net/2014/03/28/writing-asynchronous-web-pages-with-asp-net-part-3/
Which ASP.NET lifecycle events can be async?
http://evolpin.wordpress.com/2011/05/02/c-5-await-and-async-in-asp-net/
Even requests made through SignalR carry the same ASP.NET SynchronizationContext; because of that you either need to work "outside" the current SynchronizationContext or use it in the right way.
SignalR was designed around asynchronous programming, using the TPL by default, so you can take advantage of that; see http://www.asp.net/signalr/overview/signalr-20/hubs-api/hubs-api-guide-server#asyncmethods and http://www.asp.net/signalr/overview/signalr-20/hubs-api/hubs-api-guide-server#asyncclient
You can solve your problem in many ways.
If you want to use SignalR to show the progress, I would do something like the code below (I'm still on .NET 4.0; it is easier with .NET 4.5 and the TaskAsync methods).
public Task<string> FileDownload()
{
    var client = new WebClient();
    var result = new TaskCompletionSource<string>();

    client.DownloadProgressChanged += (sender, args) => client_DownloadProgressChanged(sender, args, this.Context.ConnectionId);

    AsyncCompletedEventHandler clientOnDownloadFileCompleted = (sender, args) =>
    {
        client.Dispose();
        if (args.Error != null)
        {
            result.SetException(args.Error); // or result.SetResult(args.Error.Message);
            return;
        }
        result.SetResult("Downloaded");
    };
    client.DownloadFileCompleted += clientOnDownloadFileCompleted;

    // Hook up both handlers before starting the download so no events are missed.
    client.DownloadFileAsync(new Uri("https://epub-samples.googlecode.com/files/cc-shared-culture-20120130.epub"), @"C:\temp\file.zip");

    return result.Task;
}

private static void client_DownloadProgressChanged(object sender, DownloadProgressChangedEventArgs e,
    string connectionId)
{
    GlobalHost.ConnectionManager.GetHubContext<SomeHub>()
        .Clients.Client(connectionId)
        .NotifyProgress(e.ProgressPercentage);
}
Keep in mind that this is just an example; you could improve the way it handles disconnection, cancellation, and other things that can occur (that depends on your application logic).
Also it is possible to use a "workaround" (not recommended):
Fire and forget async method in asp.net mvc
How to execute async 'fire and forget' operation in ASP.NET Web API
The code would be very similar to the above.
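Regarding the asker's side note about HttpClient: it has no progress event, but progress can be derived by reading the response stream in chunks and comparing the bytes read against Content-Length. A rough sketch, assuming .NET 4.5 async/await; the reportProgress callback is a placeholder for whatever notification mechanism you use (for example the SignalR hub context above). It assumes using directives for System, System.IO, System.Net.Http and System.Threading.Tasks.

public static async Task DownloadWithProgressAsync(string url, string destinationPath,
    Action<int> reportProgress)
{
    using (var http = new HttpClient())
    using (var response = await http.GetAsync(url, HttpCompletionOption.ResponseHeadersRead))
    {
        response.EnsureSuccessStatusCode();
        long? totalBytes = response.Content.Headers.ContentLength;

        using (var source = await response.Content.ReadAsStreamAsync())
        using (var target = File.Create(destinationPath))
        {
            var buffer = new byte[81920];
            long totalRead = 0;
            int read;
            while ((read = await source.ReadAsync(buffer, 0, buffer.Length)) > 0)
            {
                await target.WriteAsync(buffer, 0, read);
                totalRead += read;
                if (totalBytes.HasValue)
                {
                    // Percentage only makes sense when the server sent a Content-Length.
                    reportProgress((int)(totalRead * 100 / totalBytes.Value));
                }
            }
        }
    }
}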
In my project I need to upload files, so I decided to use the AsyncFileUpload control provided by the ASP.NET AJAX Control Toolkit.
There are four blocks, and every block contains two such uploaders,
so I decided to use the power of ASP.NET web user controls.
I wrapped the required form fields in a user control called DesignUploader.ascx.
Now I have to put four instances of this control on my ASPX page.
Please refer to the screenshot below.
My problem starts here: I have to insert the file URL into the database, and each of the blocks generates a unique ID whose value changes after uploading the file to the server. I noticed that ViewState does not work for me with AsyncFileUpload, because it does its postback to the server behind the scenes, which clears the ViewState. The only option left is Session, but when the user uploads files in two blocks one after another, the file path from the second/third block overwrites my Session value. I don't know how many blocks the user will use to upload designs; he may use only one or he may use all four.
There is a final submit button at the bottom of the page, on click of which I have to insert the data into the database.
So when I try to save the data to the database, the Session holds only the most recently uploaded file path, and that value gets used for all the records. That is where my problem lies.
I don't know if I have described my problem clearly; please excuse me if it is not clear and comment if more detail is required.
Note: I cannot change the UI because the client insists on this layout :(
Any quick workaround would be much appreciated.
Thanks
Devjosh
I believe you are saving the file path to Session in the wrong way, and it's impossible to pinpoint the error without seeing your code.
Either way, in my opinion it is better not to persist the file path in Session, but to use the client side for that purpose instead. You can add two hidden fields to the DesignUploader.ascx control and set their values in the UploadedComplete event handler.
public partial class DesignUploader : System.Web.UI.UserControl
{
    private static readonly string AppDataPath = HttpContext.Current.Server.MapPath("~/App_Data/");

    public string FirstFilePath
    {
        get
        {
            return Server.UrlDecode(FirstFilePathHiddenField.Value);
        }
    }

    public string SecondFilePath
    {
        get
        {
            return Server.UrlDecode(SecondFilePathHiddenField.Value);
        }
    }

    protected override void OnInit(EventArgs e)
    {
        base.OnInit(e);
        FirstFileUpload.UploadedComplete += FirstFileUpload_UploadedComplete;
        SecondileUpload.UploadedComplete += SecondileUpload_UploadedComplete;
    }

    void FirstFileUpload_UploadedComplete(object sender, AjaxControlToolkit.AsyncFileUploadEventArgs e)
    {
        var fullPath = Path.Combine(AppDataPath, Path.GetFileName(e.FileName));
        FirstFileUpload.SaveAs(fullPath);
        SaveFilePathToHiddenField(FirstFilePathHiddenField.ClientID, fullPath);
    }

    void SecondileUpload_UploadedComplete(object sender, AjaxControlToolkit.AsyncFileUploadEventArgs e)
    {
        var fullPath = Path.Combine(AppDataPath, Path.GetFileName(e.FileName));
        SecondileUpload.SaveAs(fullPath);
        SaveFilePathToHiddenField(SecondFilePathHiddenField.ClientID, fullPath);
    }

    private void SaveFilePathToHiddenField(string fieldId, string pathValue)
    {
        var script = string.Format("top.$get('{0}').value = '{1}';", fieldId, Server.UrlEncode(pathValue));
        ScriptManager.RegisterStartupScript(this, this.GetType(), "setPath", script, true);
    }
}
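To tie this back to the final submit button: the page can read the decoded paths straight from each user control instance when it builds the database inserts. A minimal sketch (DesignUploader1 through DesignUploader4 and SaveDesignPathToDatabase are hypothetical names, not from the original post):

protected void FinalSubmitButton_Click(object sender, EventArgs e)
{
    // One entry per uploader block; skip blocks the user never used.
    var uploaders = new[] { DesignUploader1, DesignUploader2, DesignUploader3, DesignUploader4 };

    foreach (var uploader in uploaders)
    {
        if (!string.IsNullOrEmpty(uploader.FirstFilePath))
        {
            SaveDesignPathToDatabase(uploader.FirstFilePath);   // placeholder for the real insert
        }
        if (!string.IsNullOrEmpty(uploader.SecondFilePath))
        {
            SaveDesignPathToDatabase(uploader.SecondFilePath);
        }
    }
}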
The code is trivial so rather than explain, here's the relevant portion:
protected void Page_Load(object sender, EventArgs e)
{
    if ((Request.HttpMethod == "POST") && (Request.ContentType == "text/xml"))
    {
        string filename = string.Format("{0:yyyyMMddHHmmss}.xml", DateTime.UtcNow);
        var path = Path.Combine(Request.MapPath("Chapters"), Request.Headers["X-MAC"]);
        Directory.CreateDirectory(path);
        string xml;
        using (ChapterWriterClient client = new ChapterWriterClient())
        using (StreamReader sr = new StreamReader(Request.InputStream))
        {
            xml = sr.ReadToEnd();
            client.Write(xml);
            System.Diagnostics.Trace.TraceInformation(xml);
        }
        Request.SaveAs(Path.Combine(path, filename), false);
    }
There's more but the rest of it works as expected.
Here's the problem: the call client.Write(string xml) is erratic in whether it succeeds.
Some known facts:
No error is ever reported to the debugger or the event log
Sometimes the XML makes it into the message queue and sometimes it doesn't
Request.SaveAs(string filepath, bool includeHeaders) call always succeeds and the resultant file always contains the expected XML. This has several important implications:
The call to WCF is being executed.
Because the string xml had the expected value immediately after the failed WCF call, and assignment to this variable occurs only prior to the WCF call, it must have had the correct value at the time of the call.
There is some variation in message size so I'm checking the service config in that regard but past experience tells me to expect an exception thrown when the payload is too big.
Any clues would be appreciated.
Also, as you can probably see, the whole point of the exercise is a somewhat naive attempt to use an HTTP POST to put some XML into a message on MSMQ. I'd be perfectly happy with a "Don't do it like that, do it like this, you ignorant savage" type answer.
I'm running VS 2008 and .NET 3.5 SP1.
I want to implement hit tracking in an HttpModule in my ASP.NET app. Pretty simple, I thought. However, the BeginRequest event of my HttpModule is firing twice for each page hit. The site is very simple right now...no security, just a bit of database work. Should log one row per page hit. Why is this event firing twice?
Moreover, IHttpModule.BeginRequest actually fires a different number of times for the first page hit when running for the first time (from a closed web browser)...3 times when I'm hitting the DB to provide dynamic data for the page, and only 1 time for pages where the DB isn't hit. It fires 2 times for every page hit after the first one, regardless of whether or not I'm touching the DB.
It's interesting to note that Application_BeginRequest (in Global.asax) is always firing only once.
Here's the code:
using System;
using System.Data;
using System.Data.Common;
using System.Net;
using System.Web;
using BluHeron.BusinessLayer;
using Microsoft.Practices.EnterpriseLibrary.Data.Sql;

namespace BluHeron.HttpModules
{
    public class SiteUsageModule : IHttpModule
    {
        public void Init(HttpApplication httpApp)
        {
            httpApp.BeginRequest += OnBeginRequest;
        }

        static void OnBeginRequest(object sender, EventArgs a)
        {
            UsageLogger.LogSiteUsage(((HttpApplication)sender).Context.Request);
        }

        public void Dispose()
        { }
    }

    public static class UsageLogger
    {
        public static void LogSiteUsage(HttpRequest r)
        {
            string ipAddress = GetHostAddress(Dns.GetHostAddresses(Dns.GetHostName()));
            string browserVersion = r.Browser.Type;
            string[] urlChunks = r.RawUrl.Split('/');
            string page = urlChunks[urlChunks.GetLength(0) - 1];

            SqlDatabase db = new SqlDatabase(Common.GetConnectionString());
            DbCommand cmd = db.GetStoredProcCommand("LogUsage");
            db.AddInParameter(cmd, "IPAddress", SqlDbType.NVarChar, ipAddress);
            db.AddInParameter(cmd, "BrowserVersion", SqlDbType.NVarChar, browserVersion);
            db.AddInParameter(cmd, "PageName", SqlDbType.NVarChar, page);
            db.AddInParameter(cmd, "Notes", SqlDbType.NVarChar, "");
            db.ExecuteNonQuery(cmd);
        }

        private static string GetHostAddress(IPAddress[] addresses)
        {
            foreach (IPAddress ip in addresses)
            {
                if (ip.ToString().Length <= 15)
                {
                    return ip.ToString();
                }
            }
            return "";
        }
    }
}
This might be too late for an answer, but it may be useful for someone else. I faced the same problem: the BeginRequest event was triggered twice for each request. I debugged the code and realized that the first trigger was for the actual resource request, but the second was the result of the "favicon.ico" request. At the beginning of the BeginRequest handler, a simple check for the favicon.ico request eliminates the second execution of the method.
public void Application_BeginRequest(object sender, EventArgs e)
{
    HttpApplication app = (HttpApplication)sender;
    HttpContext ctx = app.Context;
    if (ctx.Request.Path == "/favicon.ico") { return; }

    // ... rest of the handler ...
}
Quite late on this, but I ran into the same issue. In our case it was due to the initial anonymous request, which returns a 401 per the RFC; the second request then authenticates.
The "Default Document" part of IIS seems to fire a second BeginRequest event.
If you have determined that the Request.Path is the same for the HttpApplication in both event handlers and your URL ends with a slash, try adding a URL Rewrite rule to shortcut the "Default Document" processing.
This is interesting. I removed the reference to the CSS file from the master page and I'm getting fewer repeat hits in the HttpModule for certain browsers (as was suggested), but I'm still getting repeats. I have 6 browsers installed, and I'm getting some variation between them.
For reference, this is the URL I'm plugging in to my browsers for this test:
http://localhost/BluHeron
default.aspx is set as the start page and is indeed returned for the aforementioned URL. I'm using HttpRequest.RawUrl for reporting which page the user hit. Specifically, I'm splitting the RawUrl string and just reporting the last item in the array of strings (see code).
Every single browser is reporting hitting default.aspx, as expected (RawUrl = /BluHeron/default.aspx).
4 of the 6 browsers are also reporting BluHeron (RawUrl = /BluHeron).
3 of the 6 browsers are also recording a blank in the database (RawUrl = /BluHeron/).
There are a couple ways I can get accurate reporting of how many people are hitting which pages.
Select from the database only rows that actually list one of my pages (ignore /BluHeron and blanks)
Just use Application_BeginRequest in the global.asax file, which seems to consistently get called only once per page hit.
Get this figured out.
So, I've got options for getting good reports even with crappy data in the database. I would prefer to understand what's going on here and not to have junk in the database.
Thanks for looking, everyone!
We solved this by using
HttpContext.Current.ApplicationInstance.CompleteRequest();
This should prevent the double firing you are seeing.
One possibility is that there are other requests going on that you might not be considering. For example, let's say your ASPX page references some images or CSS files. If those requests go through the ASP.NET pipeline then your module will be called and they'll register as hits.
Also, when you say IHttpModule.BeginRequest, do you mean that in IHttpModule.Init() you are hooking up HttpApplication.BeginRequest? If so then the reason I mention above might still apply.
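If that is the cause, one way to keep those extra requests out of the usage log is to skip anything that does not look like a page request, for example by checking the requested path's extension. A hedged sketch of a variant of the module's handler above (adjust the extension list to your site):

static void OnBeginRequest(object sender, EventArgs a)
{
    var request = ((HttpApplication)sender).Context.Request;

    // Ignore requests for static resources and the browser's automatic favicon lookup,
    // so only real page hits are logged.
    string extension = System.IO.Path.GetExtension(request.Path);
    if (extension == ".css" || extension == ".js" || extension == ".ico" ||
        extension == ".gif" || extension == ".png" || extension == ".jpg")
    {
        return;
    }

    UsageLogger.LogSiteUsage(request);
}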
Disable Browser Link in Visual Studio 2013 and up, which causes the second request.
This occurs when an Application is run from Visual Studio.