We'd like to restrict the maximum upload file size on our website. We've already set the appropriate limits in our web.config. The problem we're encountering is that if a really large file (1 GB, for example) is uploaded, the entire file is transferred before a server-side error is generated, and the type of error differs depending on how large the file is.
Is there a way to detect the size of a pending file upload before the actual upload takes place?
Here are the relevant web.config settings that restrict requests to 12 MB (note that maxRequestLength is in kilobytes, while maxAllowedContentLength is in bytes):
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <system.web>
    <httpRuntime maxRequestLength="12288"/>
  </system.web>
  <system.webServer>
    <security>
      <requestFiltering>
        <requestLimits maxAllowedContentLength="12582912"/>
      </requestFiltering>
    </security>
  </system.webServer>
</configuration>
I've tried creating an HTTP module so I could intercept a request early in the request lifecycle, but the uploads seem to take place even before the BeginRequest event of HttpApplication:
using System;
using System.Linq;
using System.Web;

public class UploadModule : IHttpModule
{
    private const int MaxUploadSize = 12582912; // 12 MB, matching web.config

    public void Init(HttpApplication context)
    {
        context.BeginRequest += handleBeginRequest;
    }

    public void Dispose()
    {
    }

    private void handleBeginRequest(object sender, EventArgs e)
    {
        // The upload takes place before this method gets called.
        var app = sender as HttpApplication;

        if (app.Request.Files.OfType<HttpPostedFile>()
               .Any(f => f.ContentLength > MaxUploadSize))
        {
            app.Response.StatusCode = 413;
            app.Response.StatusDescription = "Request Entity Too Large";
            app.Response.End();
            app.CompleteRequest();
        }
    }
}
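For what it's worth, checking the Content-Length request header instead of Request.Files avoids forcing ASP.NET to read the body inside the handler (touching Request.Files is itself what drains the request stream). It doesn't stop the client from pushing bytes until it notices the error response, but it is a cheap early check. A minimal sketch, reusing the constant above:

private void handleBeginRequest(object sender, EventArgs e)
{
    var app = (HttpApplication)sender;

    // ContentLength is parsed from the request headers, so this check
    // does not require ASP.NET to have received the body yet.
    if (app.Request.ContentLength > MaxUploadSize)
    {
        app.Response.StatusCode = 413;
        app.Response.StatusDescription = "Request Entity Too Large";
        app.Response.End();
        app.CompleteRequest();
    }
}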
Update:
I know that client-side technologies like Flash can detect file sizes before upload, but we need a server-side workaround because we want to target platforms that have no Flash/Java/ActiveX/Silverlight support. I believe that IIS or ASP.NET has a bug that allows large files to be uploaded despite the limits, so I've filed a bug here.
Would an ISAPI extension give me more control over request processing than HTTP modules and handlers, such as allowing me to abort an upload if the Content-Length header is seen to be larger than the allowed limit?
Update 2:
Sigh. Microsoft has closed the bug I filed as a duplicate but has provided no additional information. Hopefully they didn't just drop the ball on this.
Update 3:
Hooray! According to Microsoft:
This bug is being resolved as it has been ported over to the IIS product team. The IIS team has since fixed the bug, which will be included in future release of Windows.
The problem is that the upload happens all at once in a single HTTP POST request, so you can only detect its size after it's done.
If you want more control over this, you should try Flash-based upload widgets, which have this feature and more. Check out this link: http://www.ajaxline.com/10-most-interesting-upload-widgets
Microsoft has responded on their Microsoft Connect site with the following:
This bug is being resolved as it has been ported over to the IIS product team. The IIS team has since fixed the bug, which will be included in future release of Windows.
If you are requesting a fix for the current OS, a QFE request must be opened. Please let me know if this is the route that you want to take. Please note that opening a QFE request does not necessarily mean that it would be approved.
So I guess we have to wait for the next version of IIS for the fix (unless a QFE request is fulfilled; a QFE, or Quick Fix Engineering request, is essentially a request for a hotfix).
Is there a way to detect the size of a pending file upload before the actual upload takes place?
No. That would require access to the file size on the client. Allowing a web server direct access to files on the client would be a bit dangerous.
Your best bet is to place a line of text stating the maximum allowed file size.
Or you could create some sort of ActiveX control, Java applet, etc., so that you're not dependent on browser restrictions. But then you have to convince your users to install it. Probably not the best solution.
Well.... Depends how low-level you want to get.
Create a service app that acts as a proxy for IIS. (All incoming port-80 socket requests go to the service.) Have the service pass everything it receives on to IIS (with the website listening on a different port or IP), but monitor the total request size as it's received.
When the size from a given connection exceeds your desired limit, close the connection. Return a redirect to an error page first if you want to be polite.
Silly, but it'll let you monitor data in transit without waiting for IIS to hand over the request.
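A minimal sketch of the header-sniffing half of such a proxy, assuming the real site has been moved to another port; the actual relaying to IIS is omitted, and the limit mirrors the web.config value from the question:

using System;
using System.Net;
using System.Net.Sockets;
using System.Text;

class SizeLimitingProxy
{
    const long MaxRequestBytes = 12582912; // 12 MB

    static void Main()
    {
        var listener = new TcpListener(IPAddress.Any, 80);
        listener.Start();
        while (true)
        {
            using (var client = listener.AcceptTcpClient())
            using (var stream = client.GetStream())
            {
                // Read only up to the blank line that terminates the header block.
                string headers = ReadHeaderBlock(stream);
                if (ParseContentLength(headers) > MaxRequestBytes)
                {
                    byte[] reply = Encoding.ASCII.GetBytes(
                        "HTTP/1.1 413 Request Entity Too Large\r\nConnection: close\r\n\r\n");
                    stream.Write(reply, 0, reply.Length);
                    continue; // disposing the client closes the socket and aborts the upload
                }
                // Otherwise: forward the headers plus the (size-counted) body
                // to the real site on its alternate port -- relay code omitted.
            }
        }
    }

    static string ReadHeaderBlock(NetworkStream stream)
    {
        var sb = new StringBuilder();
        int b;
        while ((b = stream.ReadByte()) != -1)
        {
            sb.Append((char)b);
            if (sb.Length >= 4 && sb.ToString(sb.Length - 4, 4) == "\r\n\r\n")
                break;
        }
        return sb.ToString();
    }

    static long ParseContentLength(string headers)
    {
        foreach (string line in headers.Split(new[] { "\r\n" }, StringSplitOptions.RemoveEmptyEntries))
        {
            long value;
            if (line.StartsWith("Content-Length:", StringComparison.OrdinalIgnoreCase) &&
                long.TryParse(line.Substring("Content-Length:".Length).Trim(), out value))
            {
                return value;
            }
        }
        return 0;
    }
}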
I'm using Azure Web API 2. My clients are getting some 500 errors, and I'm trying to figure out why. I've turned on tracing in the Azure portal, and when the log file isn't truncated, I see some great info like:
235. - GENERAL_RESPONSE_ENTITY_BUFFER
{"Message":...,"ExceptionMessage"...,"StackTrace":".....
The problem is my log files are getting truncated at 1MB. (The amount of posted JSON data can be large, which eats up log space.)
I see some potentially nice .htm files in LogFiles/DetailedErrors, but they are generic pages without any details or trace info.
In Web.Config I set <customErrors mode="Off" />. This added detail to trace files, but not to the DetailedErrors htm files.
Questions:
1) Can I increase the max size of the trace file? (I tried unsuccessfully using maxLogFileSizeKB, but didn't know where to put it, presumably in Web.Config.)
2) Any other way to see stack trace information on server errors from the LogFiles directory on the server, or otherwise?
I think your problem might be that you're logging to the wrong place. There are three different places to store the logs, but the Preview Portal makes this less clear than the old Azure Portal. The documentation for logging still directs you to the old portal for setup, and you can log to Blob Storage or to Table Storage: https://azure.microsoft.com/en-us/documentation/articles/web-sites-enable-diagnostic-log/. Logging to tables might be less limiting.
While I was not able to increase the log size, I was able to get the trace information with IExceptionLogger. I don't need any special error handling, so just being notified is good enough for me. This is for Web API (2) controllers.
1) In App_Start/WebApiConfig.cs, I added the following line:
config.Services.Add(typeof(IExceptionLogger), new ApiErrorLogger());
2) Created my ApiErrorLogger class:
public class ApiErrorLogger : ExceptionLogger
{
    public override void Log(ExceptionLoggerContext context)
    {
        addLogError(context.Request.RequestUri.ToString(),
                    context.Exception.Message,
                    context.Exception.StackTrace);
    }

    public static void addLogError(string uri, string message, string stackTrace)
    {
        // Store data in Azure table
    }
}
I don't have to use <customErrors mode="Off" />, which is good, and I can turn tracing off (which is resource-expensive) in the Azure portal.
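For completeness, here is one possible shape for the table write inside addLogError, using the classic Microsoft.WindowsAzure.Storage SDK; the table name, connection-string name, and entity layout are assumptions for illustration, not part of the original code:

using System;
using System.Configuration;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;

public class ApiErrorEntity : TableEntity
{
    public ApiErrorEntity() { } // parameterless ctor required by the table client

    public ApiErrorEntity(string uri, string message, string stackTrace)
    {
        PartitionKey = "ApiError";
        RowKey = Guid.NewGuid().ToString();
        Uri = uri;
        Message = message;
        StackTrace = stackTrace;
    }

    public string Uri { get; set; }
    public string Message { get; set; }
    public string StackTrace { get; set; }
}

public static class ErrorTable
{
    public static void AddLogError(string uri, string message, string stackTrace)
    {
        // Connection string name "StorageConnection" and table "ApiErrors" are assumed.
        var account = CloudStorageAccount.Parse(
            ConfigurationManager.ConnectionStrings["StorageConnection"].ConnectionString);
        var table = account.CreateCloudTableClient().GetTableReference("ApiErrors");
        table.CreateIfNotExists();
        table.Execute(TableOperation.Insert(new ApiErrorEntity(uri, message, stackTrace)));
    }
}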
I have a website on a production server and it's supposed to be very secure, so I want to secure the HTTP headers so that no unwanted information is leaked.
I have searched the net about securing HTTP headers, and so far I have found that we can remove unwanted information like:
Server: Microsoft-IIS/7.5
X-AspNet-Version: 4.0.30319
X-Powered-By: ASP.NET
I have found solutions for X-AspNet-Version and X-Powered-By:
1. For X-AspNet-Version, I added the code below in the system.web section:
<httpRuntime enableVersionHeader="false"/>
2. For X-Powered-By, I added the code below in the system.webServer section.
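The standard removal, which is presumably what was used, looks like this:

<httpProtocol>
  <customHeaders>
    <!-- removes the X-Powered-By header that IIS adds by default -->
    <remove name="X-Powered-By" />
  </customHeaders>
</httpProtocol>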
But the code for removing the Server header is not working :(
The code I am using is as follows. I added a class named CustomHeaderModule, and inside that class the code is as below:
/// <summary>
/// Summary description for CustomHeaderModule
/// </summary>
public class CustomHeaderModule : IHttpModule
{
    public void Dispose()
    {
        // Nothing to clean up.
    }

    public void Init(HttpApplication context)
    {
        context.PostReleaseRequestState += PostReleaseRequestState;
    }

    void PostReleaseRequestState(object sender, EventArgs e)
    {
        //HttpContext.Current.Response.Headers.Remove("Server");
        // Or you can set something funny:
        HttpContext.Current.Response.Headers.Set("Server", "CERN httpd");
    }
}
and then registered it in web.config under the system.webServer section:
<modules runAllManagedModulesForAllRequests="true">
  <add name="CustomHeaderModule" type="CustomHeaderModule" />
</modules>
Now this code is not working: I am still seeing the Server header in the Chrome browser.
How can I fix this? And apart from these three settings, is there anything else I can do to secure the headers further?
Considering your problem, what I would suggest is to use ASafaWeb to test your website!
Second, read these articles from Troy Hunt and Paul Bouwer:
Shhh… don’t let your response headers talk too loudly
Clickjack attack – the hidden threat right in front of you
ASafaWeb, Excessive Headers and Windows Azure
Following these articles, you will finally have a look at NWebSec!
Sorry if this doesn't answer your question directly, but I wouldn't really bother removing those headers. Someone can easily find out what server you are using by looking at the HTML code on the browser side.
If I look at the source code and see things like __VIEWSTATE, I'll immediately know this is ASP.NET, and if I dig a little deeper I'll probably be able to figure out the version too.
What I'd suggest is that you focus on standard security and risk procedures, such as making sure you are not open to SQL injection, validating everything on the server side, making sure you have backups in place and ready to be restored within minutes, adding an additional layer of authentication if needed, and making sure you have all security updates installed on the server.
I have found one solution which works on IIS but not locally, but I am okay with that: Removing/Hiding/Disabling excessive HTTP response headers in Azure/IIS7 without UrlScan.
Anyway, apart from these three settings, is there any other way I can further secure the HTTP headers?
I have an ASP.NET IHttpModule implementation designed to rewrite paths for serving files. The module handles only one event, PostAuthenticateRequest, as follows:
void context_PostAuthenticateRequest(object sender, EventArgs e)
{
    if (HttpContext.Current.Request.Path.ToLower().Contains("foobar"))
    {
        HttpContext.Current.RewritePath("virtdir/image.png");
    }
}
The path "virtdir" is a virtual directory child of the application. The application itself runs in a typical location: C:\inetpub\wwwroot\IisModuleCacheTest\. The virtual directory "virtdir" is mapped to C:\TestVirtDir\.
A request to http://myserver/iismodulecachetest/foobar will, as expected, return image.png from the virtual directory. Equally, a request to http://myserver/iismodulecachetest/virtdir/image.png will return the same image file.
I then perform the following:
1. Request http://myserver/iismodulecachetest/foobar
2. Directly modify C:\testvirtdir\image.png (change its colour in Paint and re-save).
3. Repeat.
After anywhere between 1 and 20 repeats spaced a few seconds apart, the image returned will be an out of date copy.
Once upset, the server will only return the current version after an unknown amount of time elapses (from 10 seconds up to a few minutes). If I substitute the URL in step 1 with http://myserver/iismodulecachetest/virtdir/image.png, the problem doesn't appear to arise. But strangely enough, after the problem has arisen by using the "foobar" URL, the direct URL also starts returning an out of date copy of the image.
Pertinent Details:
A recycle of the app-pool resolves the issue.
Waiting a while resolves the issue.
Repeatedly re-saving the file doesn't appear to have an effect. I'd wondered if a "file modified" event was getting lost, but once stuck, I can save half a dozen modifications and IIS still doesn't return a new copy.
Disabling cache in web.config made no difference. <caching enabled="false" enableKernelCache="false" />
The fact that this is a virtual directory seems to matter, I could not replicate the issue with image.png being part of the content of the application itself.
This is not a client cache; it is definitely the server returning an out-of-date version. I have verified this by examining request headers, Ctrl+F5 refreshing, and even using separate browsers.
I've replicated the issue on two machines. Win7 Pro 6.1.7601 SP1 + IIS 7.5.7600.16385 and Server 2008 R2 6.1.7601 SP1 + IIS 7.5.7600.16385.
Edit - More Details:
Disabling cache and kernel cache at the server level makes no difference.
Adding an extension to the URL makes no difference: http://myserver/iismodulecachetest/foobar.png.
Attaching a debugger to IIS shows the context_PostAuthenticateRequest event handler is being triggered each time and behaving the same way whether or not the cache is stuck.
Edit2 - IIS Logs:
I enabled "Failed Request Tracing" in IIS (interestingly, this works for non-failed requests too, if configured appropriately). The pipeline is identical up until step 17, where the request returning the out-of-date version clearly shows a cache hit.
The first request looks just fine, with a cache miss; but once it gets stuck, the trace repeatedly shows a cache hit.
The events after the cache hit are, understandably, quite different from the cache-miss scenario. It really just looks like IIS is perfectly content to think its file cache is up to date, when it is definitely not! :(
The same difference shows up a little further down the stack, between the first request and the subsequent (faulty) cache-hit requests. Also note that the directory is apparently monitored, as per FileDirmoned="true".
You can do something like the code below: appending a random query string to the rewritten path means the URL differs on every request, so IIS never matches a stale cached entry.
void context_PostAuthenticateRequest(object sender, EventArgs e)
{
    if (HttpContext.Current.Request.Path.ToLower().Contains("foobar"))
    {
        // Cache-bust: a different query string on each request prevents
        // IIS from serving a stale cached copy of the rewritten file.
        Random rnd = new Random();
        int randomNumber = rnd.Next(int.MinValue, int.MaxValue);
        HttpContext.Current.RewritePath("virtdir/image.png?" + randomNumber);
    }
}
I had the same problem using the RewritePath method to address static resources in a virtual directory.
I do not have a solution for that method, but in the end I opted to use Server.TransferRequest instead, and it shows no caching problems:
HttpContext.Current.Server.TransferRequest(newUrl);
The transferred request is processed again by the IHttpModule, so you need to be careful not to produce loops.
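A minimal sketch of that loop guard, adapted to the "foobar" example from the question (the exclusion test is an assumption; any condition that distinguishes the alias from the rewritten target will do):

void context_PostAuthenticateRequest(object sender, EventArgs e)
{
    var path = HttpContext.Current.Request.Path.ToLowerInvariant();

    // Only transfer the alias, never the already-rewritten target. Because
    // TransferRequest re-runs the whole pipeline (including this module),
    // this guard is what prevents an infinite loop.
    if (path.Contains("foobar") && !path.Contains("virtdir"))
    {
        HttpContext.Current.Server.TransferRequest("~/virtdir/image.png");
    }
}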
I have an ASP.NET MVC application with a page that allows users to upload files. The files will be several hundred megabytes.
I am using FineUploader on the client side, which will use the File API/XHR if the browser supports it, and otherwise will fall back to an iframe/form post with enctype="multipart whatever".
So on the server side I need to evaluate Request.Files.Count > 0. If true, this is an old-school upload and I save the file with Request.Files[0].InputStream.CopyTo(myFileStream); otherwise I do Request.InputStream.CopyTo(myFileStream).
Here's some of the actual code I've written that does this stuff: https://github.com/ronnieoverby/file-uploader/blob/master/server/ASP.NET%20MVC%20C%23/FineUpload.cs
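Condensed, that branching looks something like this (a sketch, not the linked file verbatim; the action name, route, and destination path are illustrative):

using System.IO;
using System.Web.Mvc;

public class UploadController : Controller
{
    [HttpPost]
    public ActionResult Upload(string fileName)
    {
        // Illustrative only: always sanitize a client-supplied name.
        var path = Path.Combine(Server.MapPath("~/App_Data"),
                                Path.GetFileName(fileName));

        using (var destination = System.IO.File.Create(path))
        {
            if (Request.Files.Count > 0)
            {
                // Old-school form post: the file arrives in Request.Files.
                Request.Files[0].InputStream.CopyTo(destination);
            }
            else
            {
                // XHR/FileAPI upload: the request body is the raw file content.
                Request.InputStream.CopyTo(destination);
            }
        }

        return Json(new { success = true });
    }
}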
This all works fine, but in my testing I've noticed that neither an ASP.NET MVC controller action nor an HttpHandler will begin processing until the entire file is uploaded, which is bad if the file is very large, because the upload then occupies a lot of the web server's RAM.
I found this: Streaming large file uploads to ASP.NET MVC which sounds promising, but I really don't have an idea of where the code resides in his application.
So, the question is: how to stream uploaded files to disk while the upload is still taking place in ASP.NET?
Update
I just saw a key detail that didn't sink in before. From the HttpPostedFile documentation:
By default, all requests, including form fields and uploaded files, larger than 256 KB are buffered to disk, rather than held in server memory.
OK, that addresses the concern that the web server's RAM utilization could spike during a large upload. But there's still a problem: after the file is completely transferred to the web server, the server has to spend time moving it to its final destination. If the file system operation is a copy (guaranteed if the destination is on another physical disk), then the response is delayed unnecessarily.
Honestly, I could probably live with this by increasing the response timeout for the upload handler/action. But it would be nice to stream the bytes directly to their destination.
You can handle uploads in a completely customized way without buffering using the HttpRequest.GetBufferlessInputStream method. Basically, you get access to the raw incoming data and are free to do whatever you want with it.
I've just created a small sample which saves the raw request content to a file:
Create handler:
using System.IO;
using System.Web;

public class UploadHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // Read the request body without buffering and stream it straight to disk.
        using (var stream = context.Request.GetBufferlessInputStream())
        using (var fileStream = File.Create("c:\\tempfile.txt"))
        {
            stream.CopyTo(fileStream);
        }
    }

    public bool IsReusable { get { return true; } }
}
Register in Web.config:
<system.webServer>
  <modules runAllManagedModulesForAllRequests="true"/>
  <handlers>
    <add name="UploadHandler" verb="POST"
         path="/upload"
         type="UploadHandler"
         resourceType="Unspecified"/>
  </handlers>
</system.webServer>
Create a page with a form:
<form action="/upload" method="post" enctype="multipart/form-data">
  <input type="file" name="aa" id="aa"/>
  <input type="submit"/>
</form>
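One caveat worth noting: posted through the multipart form above, the stream saved by the handler will contain the multipart boundaries and part headers, not just the file bytes, so you would either parse those out or send the file as the bare request body (as the XHR/FileAPI path does). For a quick test that sends the body as-is (the file name and host are placeholders):

curl --data-binary @bigfile.bin http://localhost/upload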
If the uploading and streaming are using up valuable server resources, then you might want to take a look at hosting your media files in a cloud of some sort. With ASP.NET it's possible to use the Rackspace or Amazon cloud APIs to have your users upload files directly to a CDN network and then serve the content from there. I know this isn't answering your question, but many people will run into this (or already have), so I thought I'd get my 2 cents in. It amazes me that so many people still don't opt to use the cloud; once you go CDN, you never go back. Furthermore, with most CDNs you will also be given a streaming URL for your upload container that supports lots of different movie types, and it's lightning fast, not only for your users to upload to, but you'll also never have slow speeds on your website as a result.
I would like to encrypt the connection string in my web.config. I found a nice example of how to do this here. I implemented it, and on my development machine it runs fine.
However, if I upload it to the provider, it does not work, failing with the following error:
[SecurityException: Request failed.]
System.Configuration.DpapiProtectedConfigurationProvider.Encrypt(XmlNode node)
In this blog I read that this is because the web application probably runs in medium trust, and therefore WebConfigurationManager.OpenWebConfiguration cannot be used; WebConfigurationManager.GetSection should be used instead. However, if I get the section as proposed, the call to ProtectSection fails with the following error message:
System.InvalidOperationException: This operation does not apply at runtime
Can anyone lead me to a solution for how I can encrypt (and decrypt) the connection string in the web.config file at runtime?
Update
Not a real answer to the question, but the hoster gave full trust to the web application and now everything works fine. I'll leave the question open; maybe someone will post a solution to the original question and help people who have the same problem but can't get full trust.
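For reference, the no-code way to do this is the aspnet_regiis tool, run on the machine that hosts the site; with the default DPAPI provider, a section encrypted on one machine can only be decrypted on that same machine, which is exactly why shared hosting makes this awkward. The path below is a placeholder:

aspnet_regiis -pef "connectionStrings" "C:\inetpub\wwwroot\MySite"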
From http://msdn.microsoft.com/en-us/library/89211k9b%28v=vs.80%29.aspx
static void ToggleWebEncrypt()
{
    // Open the Web.config file.
    Configuration config = WebConfigurationManager.OpenWebConfiguration("~");

    // Get the connectionStrings section.
    ConnectionStringsSection section =
        config.GetSection("connectionStrings") as ConnectionStringsSection;

    // Toggle encryption.
    if (section.SectionInformation.IsProtected)
    {
        section.SectionInformation.UnprotectSection();
    }
    else
    {
        section.SectionInformation.ProtectSection("DataProtectionConfigurationProvider");
    }

    // Save changes to the Web.config file.
    config.Save();
}
UPDATE
Also, ensure that your service account has write permissions to the Web.config. Be aware that granting those write permissions somewhat increases the security footprint of your application; only do so if you understand and accept the risks.