Loading a .browser file dynamically in a non-ASP.NET website application - asp.net

I have an offline process that needs to do some analysis on User-Agent strings logged from requests on our production machines. The problem is that I need to use the .browser file containing the filters required to parse and recognize browsers from the user-agent, and the only way I know to use that file is to have an ASP.NET website and place the .browser file under the App_Browsers folder. Given the nature of the offline process I can't host it in a website, and I find having to construct an HttpWebRequest for each record just to read the HttpBrowserCapabilities overkill.
So is there any other way to consume the .browser file and read the HttpBrowserCapabilities matching a user-agent?
Note: I found a similar question that is about two years old and didn't get enough traction, so I thought maybe things have changed since then.
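One possible direction (a sketch under assumptions, not a confirmed answer): System.Web ships a BrowserCapabilitiesFactory generated from the machine-level browser definitions, and it can be driven with a raw user-agent string outside any website or HttpContext. Note that this uses the definitions compiled into System.Web, not the files in App_Browsers; to have a custom .browser file taken into account you would presumably need to register it machine-wide first (e.g. with aspnet_regbrowsers.exe). The types and members below exist in System.Web, but treat the approach as an experiment rather than a guaranteed match for the App_Browsers behavior:

// Sketch: resolve browser capabilities for a logged User-Agent string
// outside of a website. Requires a reference to System.Web.
using System;
using System.Collections;
using System.Collections.Specialized;
using System.Web;
using System.Web.Configuration;

static class UserAgentAnalyzer
{
    public static HttpBrowserCapabilities GetCapabilities(string userAgent)
    {
        var browserCaps = new HttpBrowserCapabilities
        {
            // The factory reads the user agent from the empty-string key.
            Capabilities = new Hashtable { { string.Empty, userAgent } }
        };
        new BrowserCapabilitiesFactory()
            .ConfigureBrowserCapabilities(new NameValueCollection(), browserCaps);
        return browserCaps;
    }

    static void Main()
    {
        var caps = GetCapabilities("Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)");
        Console.WriteLine("{0} {1}, crawler: {2}", caps.Browser, caps.Version, caps.Crawler);
    }
}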

Related

Symfony: static files not found (images, js, txt)

How can Symfony deliver static files without bootstrapping/executing the framework?
For example, if the webserver fails to serve some requests (images or js files are not found, or something like that), then the framework tries to resolve the route. Of course that route does not exist.
Is there a way to avoid this, or to blacklist these extensions?
It could be a cache problem.
If it is:
Try clearing the cache from the Symfony console with cache:clear. If that doesn't work, try removing the resources in the general folder, leaving the original ones in your bundle, and running assetic:dump and assets:install.
If it isn't:
Regarding the "remove-symfony-routing" thing, I don't know if it's possible, but it should not be done anyway.
What you're asking is to be able to access, from the client side, any file on the server, which constitutes a major security breach.
This could allow the client to get any file on the server, meaning he could get his hands on your javascript or php files, which most of the time contain valuable information (such as how your app works or, even worse, global passwords and config values).
What you could do to expose resources to the client is a route that points to a controller function that outputs the requested file to the browser, provided it has an extension you'd be OK to share. For example, you could allow any image file but forbid code files such as php or javascript.
EDIT: Or yeah, configure your webserver correctly. 2 simple answers while I was typing :D

Asp.net <clientCache>

As part of static file caching in my application, I am using the <clientCache> feature supported by IIS 7.5. But I would like to invalidate the client-side files when performing new deployments, to ensure that stale files are removed.
cacheControlMaxAge seems to be absolute. I want to invalidate the files once (during deployment) and have them cached from then on. Is there any recommended way to do this?
That depends on which files you are trying to invalidate. One solution could be to generate some sort of fingerprint and append it to your file URLs as a querystring. For example:
<link rel="stylesheet" href="mystyle.css?fp=hash_generated_from_file" >
When you change the file you should recreate the hash and append it to the URL. Mads Kristensen wrote an article, Cache busting in ASP.NET, which answers your question. Instead of copy-pasting his article, you might want to see how he does it there.
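For illustration, here is a rough sketch of such a helper written from scratch (this is not Mads Kristensen's actual code; the class and method names are made up). It hashes the file on disk, caches the tagged URL with a dependency on the file, and returns the URL with the fingerprint querystring appended:

using System;
using System.IO;
using System.Security.Cryptography;
using System.Web;
using System.Web.Caching;
using System.Web.Hosting;

public static class Fingerprint
{
    // Returns e.g. "/content/mystyle.css?fp=3f7a9c..." so the URL changes
    // whenever the file's contents change.
    public static string Tag(string rootRelativePath)
    {
        string cacheKey = "fp:" + rootRelativePath;
        var cached = HttpRuntime.Cache[cacheKey] as string;
        if (cached != null)
            return cached;

        string physicalPath = HostingEnvironment.MapPath(rootRelativePath);
        string hash;
        using (var md5 = MD5.Create())
        using (var stream = File.OpenRead(physicalPath))
        {
            hash = BitConverter.ToString(md5.ComputeHash(stream)).Replace("-", "").ToLowerInvariant();
        }

        string tagged = VirtualPathUtility.ToAbsolute(rootRelativePath) + "?fp=" + hash;

        // Dependency on the file: if a deployment replaces it, the cache entry
        // is evicted and the hash is recomputed on the next request.
        HttpRuntime.Cache.Insert(cacheKey, tagged, new CacheDependency(physicalPath));
        return tagged;
    }
}

Usage in a page would then look something like <link rel="stylesheet" href="<%= Fingerprint.Tag("~/mystyle.css") %>" />, so cacheControlMaxAge can stay long while new deployments still produce new URLs.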

Web-enabled file storage and security implications of giving delete permission to IIS_IUSRS

I've had this question for many years, and did research every time that this issue arose, but could never find a definite answer. Somehow the mighty Internet, MSDN, community forums, are either silent or vague on this. Out of thousands of development-related uncertainties, this is the only one that remained elusive.
To the point: in order to enable users to upload and manage images (and other files) used in their blog posts, in a shared hosting environment, I can either consider SQL Server binary data types (performance implications), or the file system. To use the latter, the necessary permissions need to be set for the IIS_IUSRS role on the storage directory : create/write, read and delete. My question - if I do this, what are the security implications? Could someone somehow take advantage of this, bypass the ASP.NET request pipeline and manipulate the files inside the folder without making a request to the corresponding ASP.NET handler (which checks rights, validates uploads, etc.)?
I've developed several systems that allowed file uploads and this has always bothered me. Now, hopefully, someone will be able to put my mind at ease and, ideally, explain the mechanics behind the process.
UPDATE
After viewing the latest answers (many thanks), another formulation of the question:
Is it in any way possible for a client to somehow bypass the request pipeline and create/delete files inside a directory that allows it (assuming the person knows the directory structure)? Or only the code that handles the request can do it? Any potential exploits?
The main problem is someone being able to upload a script, e.g. an aspx page, into this directory alongside the photo files, and then run it.
Here is one such case: I've been hacked. Evil aspx file uploaded called AspxSpy. They're still trying. Help me trap them‼
The solution to that is to add an extra web.config file, like the one below, to the directories that accept uploads, so that no aspx page is permitted to run there. Also double-check that you accept only the extensions you permit, and that the extension cannot be smuggled in through the file name if users get the opportunity to rename files (a rough sketch of such a check follows below).
<configuration>
  <system.web>
    <authorization>
      <deny users="*" />
    </authorization>
  </system.web>
</configuration>
Also, in the directories that accept uploads, do not permit any other kind of script to run either, such as classic asp, php, exe, or anything else.
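As a rough sketch of the extension check mentioned above (the names here are hypothetical, adjust them to your own upload handler): keep a whitelist of the extensions you accept, and build the stored file name on the server instead of trusting the client-supplied name.

using System;
using System.Collections.Generic;
using System.IO;
using System.Web;

public static class UploadValidator
{
    // Only extensions you explicitly permit; everything else is rejected.
    private static readonly HashSet<string> Allowed =
        new HashSet<string>(StringComparer.OrdinalIgnoreCase) { ".jpg", ".jpeg", ".png", ".gif" };

    public static string SaveToFolder(HttpPostedFile upload, string folderPhysicalPath)
    {
        string extension = Path.GetExtension(upload.FileName);
        if (!Allowed.Contains(extension))
            throw new InvalidOperationException("File type not allowed.");

        // Never reuse the client-supplied name; build the stored name server-side
        // so an attacker cannot sneak in something like "shell.aspx".
        string storedName = Guid.NewGuid().ToString("N") + extension.ToLowerInvariant();
        upload.SaveAs(Path.Combine(folderPhysicalPath, storedName));
        return storedName;
    }
}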
Generally speaking
All your pages have permission to run and manipulate many things on the server. What you grant now is the ability to write to some directories from an aspx page: ASP.NET gets one extra permission, to write files to the photo folder. Note that it is your asp.net page that has this control, not the user. The code you write there can write to these directories, so you must be extra careful to double-check where you write, not allow any other directories, and not let the user manipulate which directory gets written to.
So this is the weak link: being able to upload another script that can take control of the server, or at least of the parts accessible to the asp.net user of this application pool.
Having done this before, I'd make two recommendations:
First, do not store the uploaded files in the same directory structure as your application code (if possible). Make it a well-defined external location, and locked down explicitly to only the user the application is running as. This makes it harder for a malicious upload to be injected into your application as nothing in the web server, or ASP.NET itself, knows how to access the file (only your application).
If that is absolutely not possible, make sure no external user can access the storage folder, using standard ASP.NET authorization, and allow writes to this folder only by your application user, nothing else.
Second, do not store the uploaded files with their original names and file extensions; keep that meta-data separate. Just consider the file a raw binary blob of data. This is good for a couple of reasons. First, it prevents inadvertent execution of the file on the server, be it by someone accessing the file system directly, the web server, or ASP.NET. Second, it makes it much more difficult for an attacker to exploit a malicious upload, as they should never be able to guess the name, or path, of the file on the server.
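A small sketch of that idea, with made-up names (the metadata store would be a database table or similar in a real system): the bytes land outside the web root under an opaque name with no extension, and the original name and content type live only in the metadata record.

using System;
using System.IO;

// Hypothetical metadata record kept in your database, separate from the bytes on disk.
public class StoredFile
{
    public Guid Id { get; set; }
    public string OriginalName { get; set; }
    public string ContentType { get; set; }
}

public static class BlobStore
{
    // Example location outside the IIS web root, writable only by the app pool user.
    private static readonly string Root = @"D:\AppData\uploads";

    public static StoredFile Save(Stream content, string originalName, string contentType)
    {
        var record = new StoredFile
        {
            Id = Guid.NewGuid(),
            OriginalName = originalName,
            ContentType = contentType
        };

        // Opaque name, no extension: nothing on the server will try to execute it,
        // and an attacker cannot guess its path from the original file name.
        string path = Path.Combine(Root, record.Id.ToString("N"));
        using (var target = File.Create(path))
        {
            var buffer = new byte[81920];
            int read;
            while ((read = content.Read(buffer, 0, buffer.Length)) > 0)
                target.Write(buffer, 0, read);
        }
        return record; // persist the record yourself; downloads go through your own handler
    }
}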

IIS Security: Why is it dangerous to make locally hosted xml files writable by the application pool account?

I have a website that reads an xml file, caches the file's object model, and I have web pages that read from the cache. I now want to make that xml file writable by the application pool account that runs the website, so the file can be managed by the website.
I've heard from peers that making the file writable is a security risk, because if a hacker were to hack the website, he could potentially use the app pool account to overwrite that xml file and put whatever he/she wants into it. However, since that file is read by directly hitting the web cache (not the xml file), and the application pool account has access to the cache, doesn't that mean a hacker can modify the object model that represents the xml file regardless of whether the xml file is writable? By modifying the web cache, the hacker could inflict the same damage as if he had write access to the xml file itself. I don't see how making the xml file read-only makes the website safer from hackers.
If I understand correctly, your xml file is read, turned into an object or a collection or some .NET data structure. And presumably only xml files of a certain schema can be successfully read this way.
I guess this depends on whether there is something interesting in the xml file. If the xml file is the list of administrators, then as a hypothetical hacker I'd like to modify that file and add my name to the administrators list, resulting in an xml file that still serializes and deserializes to the data structure previously defined in code.
Another way to use write access would be to update a price list so that everything is free or heavily discounted.
If the XML file is a list of US states, then even if I could modify the list, I'm not sure what I could do with it outside of mischief, which is a larger concern for internet apps than for intranet apps.
I would put the file in the App_Data folder so that it can't be downloaded directly, which will make it harder for a hacker to make targeted modifications to it... but security through obscurity is not really a good plan on its own.
If the hacker were to hack the website, then security is compromised anyway. Allowing write access to the XML on its own shouldn't be an issue, but I wouldn't give this access to any other files within your website.
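For what it's worth, here is a sketch of the read side under that layout (the path and type names are assumptions): the object model is cached with a dependency on the file in App_Data, so a legitimate write by the site simply causes the cache to reload on the next read, and cache and file cannot drift apart.

using System.Web;
using System.Web.Caching;
using System.Web.Hosting;
using System.Xml.Linq;

public static class SettingsStore
{
    private const string CacheKey = "settings-xml";

    public static XDocument Load()
    {
        var cached = HttpRuntime.Cache[CacheKey] as XDocument;
        if (cached != null)
            return cached;

        // App_Data is not directly downloadable, and the app pool account is the
        // only identity that needs write access to it.
        string path = HostingEnvironment.MapPath("~/App_Data/settings.xml");
        var doc = XDocument.Load(path);

        // If the file is rewritten, this entry is evicted and the next request
        // re-reads the file from disk.
        HttpRuntime.Cache.Insert(CacheKey, doc, new CacheDependency(path));
        return doc;
    }
}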

Custom VirtualPathProvider unable to serve URLs ending with a directory

As part of a CMS, I have created a custom VirtualPathProvider which is designed to serve a single file in place of an actual file structure. I have it set up such that if a file actually exists on the server, that file will be served. If the file does not exist, the virtual content stored for that address will be served instead. This is similar to the concept of serving a website from files stored in a database, though in this case the content is stored in XML files on the server.
This setup works perfectly when a request is made to a specific page. For example, if I ask for "www.mysite.com/foobar.aspx", the content that is stored for "foobar.aspx" will be served. Further, if I ask for "www.mysite.com/subdir/foobar.aspx", the appropriate content will also be served.
The problem is this: If I ask for something like "www.mysite.com/foobar", things begin to fall apart. If the directory exists on disk (and doesn't have a configured default page in IIS, such as index.aspx), I will get a "Directory Listing Denied" error. If the directory does not exist, I'll simply get a 404 - Resource Not Found.
I've tried several things, and so far nothing I've done has made a bit of difference. It seems as though IIS is simply noting the nonexistence of a directory (or default file in an existing directory) and serving up its own error code, without ever asking my application what to do with the request. If it ever did get to the application, I would be able to solve the problem, but as it stands, I'm quite lost. Does anyone know if there is some setting in IIS that is causing this?
I've looked for every resource I can find on the subject, and am coming up empty. I know this should be possible, because I have read tutorials on serving content from both databases and ZIP files. HELP!
p.s., I am running IIS6 and .NET 3.5
IIS will only pass a request to the ASP.NET process if it is configured to do so for that particular extension. The defaults are .aspx, .ascx, etc. In other words, if you request a .html file, ASP.NET will never see that HTTP request; likewise for an empty extension.
To change this behavior, add a wildcard mapping to the ASP.NET process. Load IIS Manager, go to the Properties of your web site and look at the Home Directory tab. Click "Configuration" and there you will see the extension-to-application mappings.
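For reference (assuming a default install; the exact path depends on your machine), on IIS 6 the wildcard mapping is added under the "Wildcard application maps" section via Insert..., pointing at the ASP.NET ISAPI extension for the 2.0 runtime (which .NET 3.5 runs on), and the "Verify that file exists" checkbox must be cleared so that requests for nonexistent directories still reach ASP.NET and hence your VirtualPathProvider:

C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\aspnet_isapi.dll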
