I have static files in a website folder, but I need to check permissions for every file.
I decided to use an HttpModule for that purpose.
ASP.NET receives all the HTTP requests (I used wildcard mapping), and the algorithm is the following:
HttpModule receives the request
HttpModule checks permissions
If access is denied, the answer is "Forbidden". If all is OK, the HttpModule's method just returns.
DefaultHttpHandler is automatically used to process the request for the static file
The problem is that DefaultHttpHandler is not efficient enough (it doesn't use the file cache, etc.), whereas IIS (without ASP.NET) serves static files very well.
All I want is to let IIS serve static files after my checks.
Is there any way to implement it?
If you're using IIS7 then yes, it's quite easy. In integrated mode, all requests go through the managed pipeline, so let IIS serve the files but keep an HttpModule in the pipeline to do the checks. Or you can use one of the authorization mechanisms that ASP.NET already offers.
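A minimal sketch of that module approach (the permission check itself is a placeholder, and the module has to be registered under `<system.webServer>/<modules>` without the managedHandler precondition so it also runs for static files):

```csharp
using System;
using System.Security.Principal;
using System.Web;

public class StaticFileAuthorizationModule : IHttpModule
{
    public void Init(HttpApplication app)
    {
        // Runs before IIS's native static file handler produces the response.
        app.AuthorizeRequest += OnAuthorizeRequest;
    }

    private void OnAuthorizeRequest(object sender, EventArgs e)
    {
        HttpContext context = ((HttpApplication)sender).Context;

        if (!IsAllowed(context.User, context.Request.Path))
        {
            context.Response.StatusCode = 403; // Forbidden
            context.ApplicationInstance.CompleteRequest();
        }
        // If we just return, the request continues and IIS serves the file
        // with its normal static file handling (kernel cache, ranges, ETags).
    }

    // Placeholder: substitute your real permission check here.
    private static bool IsAllowed(IPrincipal user, string path)
    {
        return user != null && user.Identity.IsAuthenticated;
    }

    public void Dispose() { }
}
```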
I have a solution that can be used to stream files under IIS 6. It does all the good things like resumable downloads, client-side caching (ETag & Expires) and server-side caching.
http://code.google.com/p/talifun-web/wiki/StaticFileHandler
It should be easy enough to extend to include authorization before serving up the file.
We're developing a plugin for ASP.NET web apps that can, ideally, be dropped into the folder that makes up the public-facing part of our client's web site, i.e. a foo.aspx file that becomes publicly accessible at, say, /foo.aspx (and can thus be accessed by our browser component, written in JS and loaded into the client's website).
However, we need something we're not really sure how to do idiomatically: the ability to pass a few configuration parameters to foo.aspx in a way that, preferably, reuses whatever configuration mechanism our client is already using (we're assuming a sufficiently recent version of IIS), or at least something that's considered standard and can be applied to any IIS deployment.
Is the IIS metabase something to look at? We need a way to pass an SSL client certificate file (or its path) to foo.aspx so that it can talk back to our HTTP REST API over a secure channel. It's also not entirely obvious where a good, standard disk location to drop the certificate file itself would be.
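If plain `<appSettings>` in the host's Web.config is the idiomatic route, we imagine foo.aspx doing something like this (the key names and paths below are made up):

```csharp
using System.Configuration;
using System.Net;
using System.Security.Cryptography.X509Certificates;
using System.Web;

public static class FooApiClient
{
    public static HttpWebRequest CreateRequest(string resource)
    {
        // Hypothetical keys the host site would add to its <appSettings>.
        string certPath = HttpContext.Current.Server.MapPath(
            ConfigurationManager.AppSettings["FooPlugin.ClientCertPath"]);
        string certPassword = ConfigurationManager.AppSettings["FooPlugin.ClientCertPassword"];
        string apiBaseUrl = ConfigurationManager.AppSettings["FooPlugin.ApiBaseUrl"];

        // Load the client certificate and attach it to the outgoing REST call.
        var certificate = new X509Certificate2(certPath, certPassword);

        var request = (HttpWebRequest)WebRequest.Create(apiBaseUrl + resource);
        request.ClientCertificates.Add(certificate);
        return request;
    }
}
```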
I have an AngularJS WebAPI application that has a Javascript front-end. The front end makes calls to the back-end WebAPI for data. In the future there may well be more than one front-end making calls to the back-end.
I would like to change this application to use HTTPS and am looking into how best to architect this. The way I see it there are two ways (maybe more).
(1) Host the WebAPI C# application, index.html, Javascript, Javascript libraries and other HTML on the one (or more) web roles.
(2) Host the index.html, Javascript, Javascript libraries and other HTML on a CDN and put the WebAPI C# application on one (or more) web roles at one location.
From a performance point of view, are there likely to be any issues with the split solution (2) when I am using SSL? Is there anything that I should consider that might help improve the start-up time (my goal is for this to be as fast as possible)?
One more question: if I were to use the Azure CDN, would I still be able to address the index of my web site as www.mywebsite.com, and if using HTTPS would I need an SSL certificate?
Option 2 is preferable.
You have to think of your application as what sits in the back-end. The front end is just a suggested set of UI controls and interactions for consuming that application. If you can host them separately you get some benefits, starting with not creating a UI dependency.
The approach would be like creating a thin client.
Since the application is AngularJS based, probably all of the UI consists of static files with HTML, CSS, and JavaScript. You can host them in blob storage and scale them through the CDN. You can have a custom domain name pointing to Azure Blob Storage, like `www.yourdomain.com`. It has many benefits, including better price and scaling than web roles. Keep in mind, too, that you pay for web roles whether or not you are getting hits. The only downside is that, as far as I know, it is not possible to use HTTPS there, but that should not be a problem, since you are just hosting static content and templates containing placeholders, no actual data.
On blob storage, you can attach your own cache-control headers, allowing the browser to cache those files locally. A user would then download those files once and get them from the browser cache on subsequent visits. Also, you can store the content already compressed with GZIP and set the content-encoding property to let the browser know it is compressed, enabling a faster download. Don't forget to bundle your resources: for example, bundle all your JS code into one JS file, all your CSS into one CSS file, and all your AngularJS views into a template.js file (which is itself bundled into that single JS file).
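A sketch of setting those headers at upload time with the older Microsoft.WindowsAzure.Storage client library (the container name, connection string and content type are placeholders; adjust to whichever storage SDK version you actually use):

```csharp
using System.IO;
using System.IO.Compression;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public static class StaticAssetUploader
{
    public static void UploadGzipped(string connectionString, string localPath, string blobName)
    {
        var account = CloudStorageAccount.Parse(connectionString);
        var container = account.CreateCloudBlobClient().GetContainerReference("static");
        var blob = container.GetBlockBlobReference(blobName);

        // Headers the browser/CDN will see.
        blob.Properties.CacheControl = "public, max-age=31536000"; // cache for a year
        blob.Properties.ContentEncoding = "gzip";
        blob.Properties.ContentType = "application/javascript";

        // Compress the file into memory and upload the gzipped bytes.
        using (var compressed = new MemoryStream())
        {
            using (var gzip = new GZipStream(compressed, CompressionMode.Compress, leaveOpen: true))
            using (var source = File.OpenRead(localPath))
            {
                source.CopyTo(gzip);
            }
            compressed.Position = 0;
            blob.UploadFromStream(compressed); // properties are sent with the upload
        }
    }
}
```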
You need to host your back-end application in worker/web role instances, though. There you can use HTTPS, and AJAX calls over HTTPS are no problem even if the page was loaded over HTTP, as long as the SSL/TLS certificate is signed by a CA recognized by the browser (i.e. a valid certificate). If you use a self-signed certificate, there is no way for the browser to prompt the user to accept it for an AJAX call. Keep this in mind if you plan to start with a self-signed one.
So you would have everything that is not user/state dependent in blob storage, which is cheap, fast and highly scalable, and all your user data interaction would go through your worker/web roles as compact requests/responses, probably in JSON. As a result you need fewer web/worker roles to provide the same level of service.
Now, if the volume of queries is very asymmetrical compared to the volume of data-change requests, you should consider an approach like CQRS.
On our web server we want to provide URLs that can be used in HTML <img> elements, for displaying user profile pictures.
I could do this in two ways:
put the images on the web server as static resources: mysite.com/user1.jpg
implement an IHttpHandler: mysite.com/images?userid=1
are there any benefits of one method over the other?
Not handling images via ASP.NET but letting IIS deal with them as static content and circumventing the ASP.NET pipeline altogether would be fastest.
Implementing an HttpHandler, apart from requiring code, would require the ASP.NET pipeline to be involved - this takes more resources from the server.
Basically, if the content is static, let it be static and let IIS handle it.
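For comparison, a minimal version of the handler option from the question might look roughly like this (the paths and query parameter are illustrative, and a real handler should validate the id):

```csharp
using System.Web;

public class UserImageHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        // e.g. /images?userid=1 - validate/sanitize this in real code.
        string userId = context.Request.QueryString["userid"];
        string path = context.Server.MapPath("~/App_Data/profile-images/" + userId + ".jpg");

        context.Response.ContentType = "image/jpeg";
        context.Response.Cache.SetCacheability(HttpCacheability.Public); // allow client caching
        context.Response.TransmitFile(path); // streams the file without buffering it in memory
    }
}
```

Every request here still runs through the ASP.NET pipeline, which is exactly the overhead the static-file route avoids.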
Whilst allowing IIS to handle the images would be easier, it depends on whether or not you want to compress the files as they are sent.
Certain versions of IIS support GZIP compression, meaning IIS can compress the files as it serves them; however, older versions of IIS may not, so if you need to compress the files you would have to use an HttpHandler or similar to compress and serve them.
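A rough sketch of that handler approach (the file path and content type are placeholders; note that gzip only pays off for compressible formats such as SVG, not for already-compressed JPEG/PNG):

```csharp
using System.IO;
using System.IO.Compression;
using System.Web;

public class CompressedImageHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        string path = context.Server.MapPath("~/images/logo.svg");
        string acceptEncoding = context.Request.Headers["Accept-Encoding"] ?? "";

        context.Response.ContentType = "image/svg+xml";

        if (acceptEncoding.Contains("gzip"))
        {
            // Compress on the fly and tell the browser how the body is encoded.
            context.Response.AppendHeader("Content-Encoding", "gzip");
            using (var gzip = new GZipStream(context.Response.OutputStream, CompressionMode.Compress))
            using (var file = File.OpenRead(path))
            {
                file.CopyTo(gzip);
            }
        }
        else
        {
            // Client doesn't accept gzip: send the file as-is.
            context.Response.TransmitFile(path);
        }
    }
}
```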
Let IIS deal with them. Serving images from a handler can take up a lot of resources.
Don't do that unless absolutely necessary.
If your site has a lot of images, that means many extra requests for ASP.NET to handle.
ASP.NET needs considerably more resources per request than IIS needs to push out a static file.
Does IIS handle requests for static files, e.g.:
http://localhost:9000/Content/ABC.pdf
If it doesn't, can we add some setting so that the .pdf request is also handled by IIS and passes through the URL Rewrite module?
ASP.NET only receives requests for .aspx, .asmx and .ashx files by default.
If a file name extension has not been mapped to ASP.NET, ASP.NET will not receive the request.
If you create a custom handler to service a particular file name extension, you must map the extension to ASP.NET in IIS and also register the handler in your application's Web.config file. For more information, see HTTP Handlers and HTTP Modules Overview.
If possible, change your URL to an .ashx file. If not, you can map .pdf so that it is recognized by ASP.NET.
Yes, IIS handles static content just fine (it does serve images up, right?).
By default it will bypass any dynamic processing and return the content directly.
If your setup does not automatically handle PDF files correctly, you may simply need to add the correct mime type to the configuration.
What's the best way to implement a download system?
It needs to be integrated with an asp.net application.
I need the following features:
Deliver files larger than 50 MB
Only users authorized by an asp.net login page can download
Need to know if the user downloaded the whole file, or part of it
Once the file is downloaded or canceled, the same url will not be available again
It's something similar to RapidShare, I believe, but integrated with an ASP.NET application.
What would you guys suggest?
thanks!
What if you hosted the files on a lighttpd server running mod_secdownload, and used your ASP.NET app to generate the secure URLs to the files on that server?
That approach should handle items 1, 2 and 4.
Not sure how you could tell from the server side that the download completed successfully; maybe add some logic that parses the server logs?
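Generating the secure URL from the ASP.NET side might look roughly like this; it's a sketch based on the classic MD5 token format documented for mod_secdownload (the secret, prefix and path are placeholders), so double-check it against the lighttpd version you deploy:

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

public static class SecDownloadUrl
{
    // relativePath is relative to secdownload.document-root and must start with "/".
    public static string Create(string secret, string uriPrefix, string relativePath)
    {
        // Unix timestamp as lowercase hex, as mod_secdownload expects.
        long unixTime = (long)(DateTime.UtcNow -
            new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc)).TotalSeconds;
        string timeHex = unixTime.ToString("x8");

        // Token = MD5(secret + path + hex-timestamp), lowercase hex.
        string token;
        using (var md5 = MD5.Create())
        {
            byte[] hash = md5.ComputeHash(Encoding.UTF8.GetBytes(secret + relativePath + timeHex));
            token = BitConverter.ToString(hash).Replace("-", "").ToLowerInvariant();
        }

        // e.g. /dl/<md5>/<hex-timestamp>/protected/bigfile.zip
        return uriPrefix + token + "/" + timeHex + relativePath;
    }
}
```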
You could also use nginx and its X-Accel-Redirect feature. It's simple, and nginx is even more ridiculously fast and lightweight than lighttpd. A common setup, at least in the Ruby and Python web world, is to run nginx in front of lighttpd or Apache: nginx serves all static media and proxies dynamic requests to the web server behind it.
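The hand-off from ASP.NET is just a header set after your authorization check; a rough sketch (the internal location and query parameter are assumptions that must match a location marked `internal` in your nginx config):

```csharp
using System.Web;

public class ProtectedDownloadHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        if (!context.Request.IsAuthenticated)
        {
            context.Response.StatusCode = 403;
            return;
        }

        // Validate/whitelist this value in real code.
        string file = context.Request.QueryString["file"];

        // nginx intercepts X-Accel-Redirect and serves the file itself from the
        // internal location, so ASP.NET never streams the bytes.
        context.Response.ContentType = "application/octet-stream";
        context.Response.AppendHeader("Content-Disposition", "attachment; filename=" + file);
        context.Response.AppendHeader("X-Accel-Redirect", "/protected-files/" + file);
    }
}
```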