Config file Performance question - asp.net

I have around 60 web apps on a web server, and all of them share some of the same appSettings values in their web.config files. These settings are loaded into memory as soon as each application starts. I would like to centralise these values in one config file that all the apps load.
My question is: if I load all of the apps at the same time, would there be any performance issues from them all accessing this same config file at once?
Cheers

Read locks are generally not exclusive, so any number of applications can read from the same file at the same time. If you're not specifically requesting read exclusivity, you should be fine.
You should look at how you're loading the configuration file into each application.
See http://en.wikipedia.org/wiki/File_locking for more information.

Probably, because you need disk IO to access the file. But if the values stay in memory, the performance hit would be minimal after the initial read. Just be sure to read the file without locking it (I believe by opening it with the FileShare.ReadWrite flag).
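A minimal sketch of such a non-locking read, assuming the shared settings live in a plain key=value text file (the format and class name here are made up); the important part is the FileShare.ReadWrite argument, which tells the OS not to block writers or other readers:

    // Read a shared settings file without taking an exclusive lock.
    using System.Collections.Generic;
    using System.IO;

    static class SharedSettings
    {
        public static Dictionary<string, string> Load(string path)
        {
            var values = new Dictionary<string, string>();
            // FileShare.ReadWrite: other processes may read or write concurrently.
            using (var fs = new FileStream(path, FileMode.Open,
                                           FileAccess.Read, FileShare.ReadWrite))
            using (var reader = new StreamReader(fs))
            {
                string line;
                while ((line = reader.ReadLine()) != null)
                {
                    int eq = line.IndexOf('=');
                    if (eq > 0)
                        values[line.Substring(0, eq)] = line.Substring(eq + 1);
                }
            }
            return values;
        }
    }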

You might even see a small performance improvement, since the file will be cached by the operating system when the first application reads it; subsequent reads will be served directly from memory.
But the only way to know for sure is to measure and see.
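Coming back to the original question: if the shared values live in appSettings, the file attribute is worth a look. It merges keys from an external file into each app's own appSettings section, and I believe edits to the external file are picked up without restarting the app (worth verifying in your environment). A minimal sketch; SharedSettings.config and its keys are made-up names, and you'd want to confirm your IIS setup allows the relative path to reach the shared location:

    <!-- web.config in each of the 60 apps -->
    <configuration>
      <appSettings file="..\Shared\SharedSettings.config">
        <!-- app-specific keys can still live here; duplicate keys in the
             external file override these -->
        <add key="AppName" value="App42" />
      </appSettings>
    </configuration>

    <!-- ..\Shared\SharedSettings.config: the root element must be appSettings -->
    <appSettings>
      <add key="LogPath" value="D:\Logs" />
      <add key="SmtpServer" value="mail.example.com" />
    </appSettings>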

Related

How to be sure an uploaded file is not a virus in ASP.net?

I saw this question:
ASP.NET File Upload: how can I make sure that an uploaded file is really a JPEG?
and similar questions about making sure that a file uploaded through the asp:FileUpload control in ASP.net is really an image. But what if users upload virus-infected images? How can I be sure that image files uploaded via my ASP.net application won't affect the files in my web app folder and/or the images uploaded by other users?
As long as you don't serve it back to anyone as anything other than an image (content-type) and never try to execute (.exe) the file, you'll be fine.
Most anti-virus software runs what's known as an "on-access scan". That is, when a file is changed, it automatically scans that file.
So save that file to the file system and let your server's anti-virus software do the work for you.
I'll take what is likely a somewhat controversial position.
There is no way to know with 100% certainty what the intent of a file is, be it good or evil. It is impossible. AV scanners give you a slice of data but they can't give you 100% guarantees either. No one can.
Given this reality, you need to build your app assuming that all files uploaded are bad. Yes, scanning is still fine and will filter out a bunch of stuff. But it will never be 100%. Is it 99.999% or 20%? Who knows. Does it really matter?
I would build any app today assuming that all user supplied content is bad. Very bad. Hostile bad. Because eventually it will be, if your app takes off. And when it is, you'll be ready for them... rather than being one of the people who have to rearchitect their app because they made bad assumptions early on.
With a bit more data about your exact concerns, I'd be happy to comment on them more specifically...
As a side note: in older versions of IIS (6 and earlier), an attacker could craft the uploaded file's name so that, if you saved the file under its original name, the server would treat it as a script and execute it when requested.
E.g. a file name like file.asp;.jpg or file.asp%00.jpg etc.
File-name manipulation could also redirect the save into a different target directory, which is extremely dangerous.
E.g. newfolder.asp::$Index_Allocation etc.
There are also newer variations of these attacks.
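Whatever the IIS version, the standard defence against this whole class of file-name tricks is to never reuse any part of the client's file name beyond a whitelisted extension. A minimal sketch, assuming a JPEG-only upload; the SafeUpload class and the storage folder are made up, and the folder should sit outside the web root:

    // Save an upload under a server-generated name; the client's file name
    // is used only for an extension sanity check, never for the saved path.
    using System;
    using System.IO;
    using System.Web.UI.WebControls;

    public static class SafeUpload
    {
        public static string Save(FileUpload upload)
        {
            const string folder = @"D:\UploadStore";   // hypothetical, outside the web root

            // Whitelist the extension; don't trust it beyond that.
            string ext = Path.GetExtension(upload.FileName);
            if (!string.Equals(ext, ".jpg", StringComparison.OrdinalIgnoreCase) &&
                !string.Equals(ext, ".jpeg", StringComparison.OrdinalIgnoreCase))
                throw new InvalidOperationException("Only JPEG uploads are accepted.");

            // A GUID name contains no semicolons, null bytes, ADS colons,
            // or path segments, so the tricks above have nothing to grab onto.
            string path = Path.Combine(folder, Guid.NewGuid().ToString("N") + ".jpg");
            upload.SaveAs(path);
            return path;
        }
    }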

Is it better to execute a file over the network or copy it locally first?

My winforms app needs to run an executable that's sitting on a share. The exe is about 50MB (it's a setup.exe type of file). My app will run on many different machines/networks with varying speeds (some fast, but some awfully slow, like barely 10baseT speeds).
Is it better to execute the file straight from the share or is it more efficient to copy it locally and then execute it? I am talking in terms of annoying the user the least.
Locally is better. A copy will read each byte of the file a single time, no more, no less. As you execute, you may revisit code that has dropped out of cache, which then gets pulled across the network again.
As a setup program, I would assume that the engine will want to do some kind of CRC or other integrity check too, which means it's reading the entire file anyway.
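A minimal sketch of the copy-then-run approach; the share path is made up, and the WaitForExit call is optional depending on whether your app needs to block while setup runs:

    // Copy the installer from the share to local temp, then execute it locally.
    using System.Diagnostics;
    using System.IO;

    class RunInstaller
    {
        static void Main()
        {
            const string remote = @"\\server\share\setup.exe";  // hypothetical share
            string local = Path.Combine(Path.GetTempPath(), "setup.exe");

            // One sequential pass over the network; everything after is local.
            File.Copy(remote, local, overwrite: true);

            using (var p = Process.Start(local))
            {
                p.WaitForExit();        // optional: block until setup finishes
            }
            File.Delete(local);         // clean up the temp copy
        }
    }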
It is always better to execute it locally than to run it over the network.
If your application is small and does not need to load many different resources at runtime, then it is OK to run it over the network. It might even be preferable, because running it over the network reads the code once (download and load into memory), as opposed to manually downloading the file and then running it, which reads the code twice. For example, you could run a clock widget application over the network.
On the other hand, if your application does read a lot of resources at runtime, then it is a bad idea to run it over the network, because each read of a resource goes over the network, which is very slow. For example, you probably don't want to be running Eclipse over the network.
Another factor to take into consideration is how many concurrent users will be accessing the application at the same time. If there are many, you should copy the application locally and run it from there.
I believe the OS always copies the file to a local temp folder before it is actually executed. There are no round trips to the network after it gets a copy; it only happens once. This is sort of like how a browser works: it first retrieves the file, saves it locally, then runs it off the local temp location where it saved it. In other words, there is no need to copy it manually unless you want to keep a copy for yourself.

concurrent reading and writing of image files (asp.net, but applies to most web languages)

I have a .jpg file which represents the current image from a webcam. Users will be downloading this file at an interval of once a second. Because there could be dozens of users reading it, that could mean dozens of requests a second (which is normal for any web server).
The problem is, this image is also updated once a second by a 3rd party application that "spiders" our local network's webcam portal image. This is so we can build our webcams into our current administration panel.
The problem I am already seeing is that ASP.net sometimes gets an error saying it cannot access the file because the bot has it open for writing. Likewise, the bot sometimes cannot access it because IIS is feeding it to a user.
The bot uses an IO.StreamWriter to save the data to the file, and my script uses Response.WriteFile to send the file to the user. (I need to use an actual ASP.net page with a JPG content-type to serve the file, to make sure only users with an active session can view the JPG.)
My question is: what are the best practices for this? I know why it's happening, but what is the best resolution? Would storing the image as a BLOB in a database be smarter, since databases are built for concurrent reads/writes already? Is there an easier way of doing this with a file that I have not thought of yet?
Thanks in advance,
Anthony Greco
Using a BLOB will work if the readers use SNAPSHOT isolation model (SQL Server 2005 and up). See Download and Upload images from SQL Server via ASP.Net MVC for how to stream an image from a BLOB, and see Understanding Row Versioning-Based Isolation Levels for a lecture on SNAPSHOT.
But using a BLOB may be overkill; you could get away with something much simpler. For instance, if you only have one ASP.Net process, you could have a global volatile variable holding the current file name. The writer writes the JPG into a new file, and then updates the global 'current' file name with an Interlocked.CompareExchange operation (it has to be a compare-exchange because a newer writer might actually finish faster, outrunning a previous writer, and you want to preserve the latest update). There are still some issues left to solve (finding the file name at startup, cleaning up old files, etc.) but they are all fairly easy to solve.
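A minimal sketch of that scheme, assuming a single ASP.NET process; the class name and the cam_ file-name convention are made up, and cleanup of old files is left out:

    // Rotating file names: the writer publishes each frame under a new name,
    // readers always stream whichever file is 'current'.
    using System;
    using System.IO;
    using System.Threading;

    public static class WebcamFrameStore
    {
        private static string _currentFile;   // path of the newest complete frame

        // Writer side: write a brand-new file, then publish its name.
        public static void WriteFrame(byte[] jpegBytes, string folder)
        {
            // Zero-padded ticks make ordinal name comparison match time order.
            string newFile = Path.Combine(
                folder, "cam_" + DateTime.UtcNow.Ticks.ToString("D19") + ".jpg");
            File.WriteAllBytes(newFile, jpegBytes);

            // Publish, but only if no newer frame has beaten us to it.
            string seen = Volatile.Read(ref _currentFile);
            while (seen == null || string.CompareOrdinal(newFile, seen) > 0)
            {
                string prev = Interlocked.CompareExchange(ref _currentFile, newFile, seen);
                if (ReferenceEquals(prev, seen)) break;  // swap succeeded
                seen = prev;   // a racing writer published; re-check ordering
            }
        }

        // Reader side: the ASP.NET page streams this file to the client.
        public static string CurrentFile
        {
            get { return Volatile.Read(ref _currentFile); }
        }
    }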
If you have a farm of servers, or multiple ASP.Net processes serving the site, then things could get complicated. I would still do a rotating file name and do a try-and-error approach (try to respond with newest file, fall back to previous older one if conflict is detected).
You could get the bot to write the data to a different filename and then do a delete and rename to the filename being served by ASP.Net. This should reduce the file lock time down to the time taken by the delete and rename. To clarify (see the sketch after this list):
ASP.Net serving image from "webcam.jpg"
bot writes image data to "temp.jpg"
when last image byte written, bot deletes "webcam.jpg" and renames "temp.jpg" to "webcam.jpg"
ASP.Net should check that "webcam.jpg" exists; if not, wait 10ms (or a suitably small increment) and check again.
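A minimal sketch of both sides of that handoff; the paths are made up, and in production you'd also handle the case where the delete itself fails because a reader still holds the file open:

    // Delete-and-rename handoff between the bot and the ASP.NET page.
    using System;
    using System.IO;
    using System.Threading;

    static class WebcamSwap
    {
        const string Live = @"C:\cam\webcam.jpg";   // path served by ASP.NET
        const string Temp = @"C:\cam\temp.jpg";

        // Bot side: the slow write happens on temp.jpg, then a fast swap.
        public static void PublishFrame(byte[] jpegBytes)
        {
            File.WriteAllBytes(Temp, jpegBytes);
            File.Delete(Live);      // brief window where webcam.jpg is absent
            File.Move(Temp, Live);  // rename is fast and near-atomic
        }

        // Server side: retry briefly if we land inside the swap window.
        public static byte[] ReadFrame()
        {
            for (int attempt = 0; attempt < 10; attempt++)
            {
                try { return File.ReadAllBytes(Live); }
                catch (FileNotFoundException) { }   // caught mid-swap
                catch (IOException) { }             // momentarily locked
                Thread.Sleep(10);                   // the 10ms wait from the list above
            }
            throw new IOException("webcam.jpg unavailable after retries");
        }
    }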

Strategy for handling user input as files

I'm creating a script to process files provided to us by our users. Everything happens within the same UNIX system (running on Solaris 10).
Right now our design is this:
User places file into upload directory
Script placed on cron to run every 10 minutes.
Script looks for files in the upload directory, processes them, and deletes them immediately afterward.
For historical/legacy reasons, #1 can't change. Also, deleting the file after processing is a requirement.
My primary concern is concurrency. It is very likely that the analysis script will run while an input file is still being written. In this case, data will be lost, and this is (obviously) unacceptable.
Since we have no control over the users' chosen means of placing the input file, we cannot require them to obtain a file lock. As I understand it, file locks are advisory only on UNIX, so a user must choose to adhere to them.
I am looking for advice on best practices for handling this problem. Thanks
Obviously all the best solutions involve the client providing some kind of trigger indicating that it has finished uploading. That could be a second file, an atomic move of the file to a processing directory after writing it to a stage directory, or a REST web service. I will assume you have no control over your clients and are unable or unwilling to change anything about them.
In that case, you still have a few options:
You can use a pretty simple heuristic: check the file size, wait 5 seconds, then check the file size again. If it didn't change, it's probably safe to process (see the sketch after this list).
If you have super-user privileges, you can use lsof to determine if anyone has this file open for writing.
If you have access to the thing that handles upload (HTTP, FTP, a setuid script that copies files?) you can put triggers in there of course.
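A minimal sketch of the size-stability heuristic from the first bullet; it's written in C# for illustration, but the logic ports directly to whatever language the cron job uses, and the directory path and Process step are made up:

    // Sweep the upload directory, processing only files whose size has
    // stopped changing; growing files are picked up on the next cron run.
    using System;
    using System.IO;
    using System.Threading;

    class UploadSweeper
    {
        // An unchanged size across the wait suggests the writer is done.
        static bool LooksComplete(string path, TimeSpan wait)
        {
            long before = new FileInfo(path).Length;
            Thread.Sleep(wait);
            long after = new FileInfo(path).Length;  // fresh FileInfo avoids a cached value
            return before == after;
        }

        static void Main()
        {
            const string uploadDir = "/data/upload";   // hypothetical path
            foreach (string file in Directory.GetFiles(uploadDir))
            {
                if (!LooksComplete(file, TimeSpan.FromSeconds(5)))
                    continue;              // still being written; skip for now

                Process(file);             // hypothetical processing step
                File.Delete(file);         // delete-after-processing requirement
            }
        }

        static void Process(string file)
        {
            Console.WriteLine("processing " + file);
        }
    }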

web.config auto caching

I have a custom configuration section in my web.config file. I'm torn between:
Reading it into a static class every time I need a configuration value (because I guess the system already caches files when I open them; for instance, when I run Word it takes longer the first time and much less on consecutive opens)
Reading it into a static class and caching it using Application.Cache with a file dependency, then using the cached data. I suppose it would be a bit quicker this way, but is it worth the hassle?
What do you think about automatic (on-open) file caching?
Write a custom configuration section and use ConfigurationManager.GetSection
.NET takes care of caching this and invalidates the cached section whenever the web.config file is changed.
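A minimal sketch of that approach; the section name, property, and assembly name below are all made up:

    // Custom section: the runtime parses this once and caches it, and the
    // cached copy is invalidated when web.config changes.
    using System.Configuration;

    public class MyAppSection : ConfigurationSection
    {
        [ConfigurationProperty("pollInterval", DefaultValue = 30, IsRequired = false)]
        public int PollInterval
        {
            get { return (int)this["pollInterval"]; }
        }
    }

    // web.config:
    // <configuration>
    //   <configSections>
    //     <section name="myApp" type="MyNamespace.MyAppSection, MyAssembly" />
    //   </configSections>
    //   <myApp pollInterval="60" />
    // </configuration>

    // Usage (cheap after the first call, since .NET caches the parsed section):
    // var cfg = (MyAppSection)ConfigurationManager.GetSection("myApp");
    // int interval = cfg.PollInterval;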
Reading values from web.config is very, very fast. The ConfigurationManager is highly optimized for the purpose, so fast that there is almost no gain in storing the value in Session, Cache, etc. Note that changing a value in web.config restarts the app, but an old copy could still be served if you had stashed the value in the Cache ... so don't. Simply read the value from web.config when you need it; on a standard laptop I'm able to read a web.config setting over 600,000 times a second without issue.
AFAIK, config files are already cached in memory as long as System.Configuration.ConfigurationManager is used.
That's just one reason why changing a web.config/app.config requires an app restart to pick up the changes.
