I want to monitor the activity in all the folders under "C:\Inetpub\ftproot\san". Users can work with any type of file, not only text files. Since we have given each user (let's say) 1 GB of space, a user can do anything to make use of that space.
Now I want to monitor the actions a user performs in his folder, such as creating a new file, deleting an existing file, or editing a file. I want to monitor these activities because I have to keep track of the space given to each user, so that I can restrict the user to 1 GB and no more.
Is there any class I can use other than FileSystemWatcher, since that seems to work only in console applications and not in web applications?
Any help would be highly appreciated.
Many thanks
FileSystemWatcher should work just fine in a web application, but the problem is that the web application isn't always on. If nobody accesses it for a while, it can be shut down to free resources for other things and then started again when it is next accessed. It can also be restarted easily when things within its own structure change (such as its config file). It's very stateless and transient.
How do you plan to use this information within the web application? Does your web application really need to be constantly watching, or does it just need to generate a report of the current state of the filesystem when requested? If you really need the former, then the aforementioned nature of how web applications run on the server will make things a bit unreliable. Maybe a Windows service running on the web server would be more up to the task?
First, I would break this down into:
What are the possible ways users can modify the contents of the folder?
What are you trying to accomplish/present to the user via the Web interface?
One fairly simple way to do this (in a sense) would be to maintain a service on the machine that periodically checks the directory for the information you need (size, number of files, whatever) and connects to the actual web application via something like WCF. In effect you'd have a semi-soft limit, in that a user could exceed 1 GB for a short period, but there are obvious corrective measures you could take, and this way you don't have to monitor every action of every user in real time.
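As a minimal sketch of the periodic check, assuming .NET 4 and a one-folder-per-user layout (the path and the 1 GB figure come from the question; the class and method names are made up for the example):

```csharp
using System;
using System.IO;
using System.Linq;

class QuotaChecker
{
    const long QuotaBytes = 1L * 1024 * 1024 * 1024; // 1 GB, per the question

    // Add up the size of every file under a user's folder.
    public static long GetFolderSize(string userFolder)
    {
        return new DirectoryInfo(userFolder)
            .EnumerateFiles("*", SearchOption.AllDirectories)
            .Sum(f => f.Length);
    }

    public static bool IsOverQuota(string userFolder)
    {
        return GetFolderSize(userFolder) > QuotaBytes;
    }
}
```

The service would run something like this on a timer for each folder under C:\Inetpub\ftproot\san and report the results to the web application.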
Off the cuff, I would think that you need some sort of service to use the FileSystemWatcher class. The only way to really watch over the directory using ASP.NET is if you control how all files get added to and deleted from the directory. If that is the case, then you can add code to skim through the directory and add up the sizes of everything in there pretty easily.
If they can put files in these folders with other applications (such as an FTP client) then you are going to need a service to watch over the folders.
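For the service route, a rough sketch of wiring FileSystemWatcher into a Windows service might look like this (the path comes from the question; the class name and the recalculation logic are placeholders):

```csharp
using System.IO;
using System.ServiceProcess;

public class FolderWatcherService : ServiceBase
{
    private FileSystemWatcher _watcher;

    protected override void OnStart(string[] args)
    {
        _watcher = new FileSystemWatcher(@"C:\Inetpub\ftproot\san")
        {
            IncludeSubdirectories = true,
            NotifyFilter = NotifyFilters.FileName | NotifyFilters.Size
        };
        _watcher.Created += (s, e) => RecalculateUsage(e.FullPath);
        _watcher.Changed += (s, e) => RecalculateUsage(e.FullPath);
        _watcher.Deleted += (s, e) => RecalculateUsage(e.FullPath);
        _watcher.EnableRaisingEvents = true;
    }

    protected override void OnStop()
    {
        _watcher.Dispose();
    }

    private void RecalculateUsage(string changedPath)
    {
        // Work out which user's folder changed and re-check their quota here.
    }
}
```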
A better way of doing this is to let your web app act as a portal to what's happening, but you will need a Windows service running to ensure that nobody goes over the space allotment.
The service could also supply data to your portal.
Remember, a website only runs when someone calls it. So if nobody uses your website for 5 days, nothing will be monitoring the folders.
Sure, you could keep a web page open for X days, but that's overkill.
I have an application (written in Java) and I want to limit the use of one of its features (e.g. allow a certain feature to be started at most 1000 times). The application runs inside the company intranet and cannot use the public Internet. One trivial solution would be to save the number of launches in an encrypted file, but this file can be copied and overwritten by the system administrator of the company where the application runs. Another solution would be to use some lightweight database, but I don't want to bring in a database system just to store one decreasing number.
Do you have any idea how to store this number securely?
There is no secure way. Once the application is in full control of the user, it can be analyzed and every idea you have can be defeated. That's why there is software with the license restrictions removed, hacks to work around (offline) DRM, etc.
The usual way is to make hacking around your restriction too hard. This can be done by restricting ways to debug the application, obfuscating the checks as much as possible, and interweaving them with the rest of the application so that attempts to hack around your restriction cause the application to malfunction.
I came across a case study a few days ago. It is related to a web application architecture.
Here is the scenario:
There is a single web service used by, say, 1000 web applications. This web service is hosted on a particular server. If the web service's hosting location changes, how do the other applications find out about the change?
Keeping it in web.config doesn't seem to be a feasible solution, as we would need to modify the web.config files of all the applications.
Keeping these settings in a common repository and letting all the applications read the web service address from it came to mind, but then there is the question of where to store this common repository.
I am just curious how this could be achieved with good performance.
Thanks in advance for any kind of suggestions.
Do you have full access to, or control over, all those web applications consuming that web service? If so, you could have a script or some custom code that updates all their web.config files at once. It seems like a lot of work, but in fact this way you have more control, and you could eventually point only some applications to the new URL and leave the others on another one.
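A hedged sketch of that custom code, assuming the consuming applications are IIS sites whose virtual paths you know and that each web.config already has a "ServiceUrl" key in appSettings (both the paths and the key name are assumptions, not something from the question):

```csharp
using System.Web.Configuration;

class ConfigUpdater
{
    static void Main()
    {
        // Hypothetical virtual paths of the consuming applications.
        string[] sites = { "/App1", "/App2" };

        foreach (var site in sites)
        {
            var config = WebConfigurationManager.OpenWebConfiguration(site);
            // Assumes the "ServiceUrl" key already exists in each web.config.
            config.AppSettings.Settings["ServiceUrl"].Value =
                "http://newhost/Service.svc";
            config.Save();
        }
    }
}
```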
The idea of a setting in a centralized database gives you faster update propagation, which could also be bad in case of errors, and then you have all applications referring to the same place with no way to split them. You would also have to connect to that centralized database from all of them, probably by adding a key with its connection string to each web.config; then, if that database is unreachable or down, the web applications will not be able to consume the web service simply because they cannot get its URL.
I would go for web.config. Eventually you could have a settings helper class that abstracts the retrieval of that URL, so the UI or front end does not know where the URL comes from.
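A minimal sketch of such a helper, assuming the URL lives under an appSettings key (the key name is made up for the example):

```csharp
using System.Configuration;

public static class ServiceSettings
{
    // The rest of the application asks here for the URL instead of reading
    // web.config directly, so the storage location can change later.
    public static string WebServiceUrl
    {
        get { return ConfigurationManager.AppSettings["WebServiceUrl"]; }
    }
}
```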
Anyway, do you plan to change the URL of the web service often? Wouldn't it be better to copy it to a new URL but also keep it available at the current URL for a while?
Another advantage of the web.config approach is that every time you update and save it the application is restarted, while a change in a database might take a while to be detected if you have some caching mechanism.
Hope this helps.
Davide.
I'm creating a web application using ASP.NET and WCF in a 3-tier architecture; it mostly looks like a social website. Users can register with the system and upload their profile images, documents, video clips, etc. What I want to know is: what is the best way to store those files, on the WCF side or the web application side?
Also, if I choose to store those files on the web application side as a set of folders, how do I make those folders shared and allow access from a different project (such as a desktop client that needs to upload files into that shared folder)?
Thank you all in advance.
I think the question can better be put like this:
save in a folder in the web application or close by and have the metadata stored in a database
grab the saved images from a database via WCF
The second approach would likely be rather slow: grabbing the information over a service, converting it, and using an HttpHandler with the correct MIME type to spit the binary stream out to the browser...
Most architectures cut it down the middle: save the images close to, or in, the UI layer and store the metadata about them in the database. Retrieving that metadata is mostly just a bunch of strings, so it's easily done.
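For what it's worth, the "HttpHandler with the correct MIME type" piece of the second approach could look roughly like this. ImageServiceClient is a hypothetical generated WCF proxy, and the JPEG-only content type is an assumption:

```csharp
using System.Web;

public class ImageHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        string name = context.Request.QueryString["name"];

        // Hypothetical call into the WCF tier that returns the raw image bytes.
        byte[] imageBytes = new ImageServiceClient().GetImage(name);

        context.Response.ContentType = "image/jpeg"; // assumes JPEGs only
        context.Response.BinaryWrite(imageBytes);
    }
}
```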
Update for the new question:
Since WinForms applications/other projects were not in your original question, this deviates into something new. In that case you could go for one of the following scenarios:
Use the WCF tier as a common ground and store the images behind that service. As I said, it's going to be extra work to pull the byte arrays over.
Store the images in the web UI tier and have a service (ASMX or a WCF one) to expose the images to your WinForms client (see the sketch after this list).
Make a share for the WinForms client on the server where the web UI runs and where the images are. Of course, be respectful of security and possible hacks.
It depends on what the most-used scenario is. My assumption is that the web UI layer will be used the most and the WinForms client is going to be used for image manipulation? If so, there are third-party ASP.NET controls available for such manipulation as well, so the need for a WinForms client would decrease.
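A hedged sketch of the service mentioned in the second scenario, as a small WCF contract the WinForms client could call; the names and the storage folder are made up for the example:

```csharp
using System.IO;
using System.ServiceModel;

[ServiceContract]
public interface IImageService
{
    [OperationContract]
    byte[] GetImage(string fileName);
}

public class ImageService : IImageService
{
    private const string ImageRoot = @"C:\Sites\MyApp\Uploads"; // hypothetical

    public byte[] GetImage(string fileName)
    {
        // Strip any path information before touching the file system.
        string safeName = Path.GetFileName(fileName);
        return File.ReadAllBytes(Path.Combine(ImageRoot, safeName));
    }
}
```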
This depends on how big you expect this thing to get.
If this is for the wider internet and you expect it to get big, having the files on the web server will make it difficult to scale out your application by adding new web servers to your web farm.
One approach would be to have the physical files uploaded to the web server, to make uploads quick for users, and then have a coordinating background service, triggered by the upload (perhaps using a FileSystemWatcher), that propagates the file to all nodes in the web farm so that subsequent requests to other nodes will find the file.
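A very rough sketch of that coordinator idea, assuming the peer nodes expose a writable share; the paths are hypothetical, and a real service would need retries and handling for locked or partially written files:

```csharp
using System.IO;
using System.Threading;

class UploadReplicator
{
    // Hypothetical shares on the other web farm nodes.
    static readonly string[] PeerShares =
    {
        @"\\web02\uploads",
        @"\\web03\uploads"
    };

    static void Main()
    {
        var watcher = new FileSystemWatcher(@"C:\Sites\MyApp\Uploads");
        watcher.Created += (s, e) =>
        {
            foreach (var share in PeerShares)
            {
                // Copy the new upload to every peer so any node can serve it.
                File.Copy(e.FullPath, Path.Combine(share, e.Name), true);
            }
        };
        watcher.EnableRaisingEvents = true;

        // Keep the process alive; a real implementation would be a Windows service.
        Thread.Sleep(Timeout.Infinite);
    }
}
```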
If it is a small application intended only for within a company, on the web server is okay, with the following conditions:
You have full control over the hosting server so that you can set up the appropriate folder permissions.
You write your file saving and retrieving code in such a way that it can be moved onto the lower tiers without too much pain. Do it through an interface and inject the implementation (a rough sketch follows).
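A rough sketch of what that interface might look like, with a local-disk implementation that could later be swapped for one living on a lower tier; all the names here are made up:

```csharp
using System.IO;

public interface IFileStore
{
    void Save(string fileName, byte[] contents);
    byte[] Load(string fileName);
}

// Simple implementation that keeps files on the web server's disk.
public class LocalDiskFileStore : IFileStore
{
    private readonly string _root;

    public LocalDiskFileStore(string root)
    {
        _root = root;
    }

    public void Save(string fileName, byte[] contents)
    {
        File.WriteAllBytes(Path.Combine(_root, fileName), contents);
    }

    public byte[] Load(string fileName)
    {
        return File.ReadAllBytes(Path.Combine(_root, fileName));
    }
}
```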
I have a Flex application I'm writing (a learning exercise) that I'd like to run off a network drive for many users to access. I'd like users to be able to save high scores on the network.
Users have read/write access to the network location it's on.
I don't want to change anything on the computers that might use it (i.e. install AIR) or touch their IE/Firefox settings; they are just the defaults.
I don't want to run a server (e.g. PHP).
Is there any way to do it?
Cheers
Nope, not without AIR. And even then with difficulty. Flex runs within the context of the browser, and only has available to it the resources available to the browser (for obvious security reasons.)
Flash enjoys a unique position of corporate trust for reliability and safety, and they do everything possible to protect that position. So you're sandboxed.
The best I can think of is to put together something that serves a URL and a common or custom read/write protocol - probably not trivial.
You will have to use a backend to access any of those resources. E.g. if you're using BlazeDS, then you can just use Java to write to the network. You will have a server anyway to host your application.
You really want to use a backend technology for this. If you're dead set against it, Flash Player 10 can write files to the local filesystem. You could probably trick it into using a network resource by referencing it as a mapped drive or maybe even a named host.
http://livedocs.adobe.com/flex/3/langref/flash/net/FileReference.html#save()
You can also use the "load()" method of FileReference to read a local file into your Flex application.
I really don't recommend writing an application that relies on this, but it looks like it could be done. The caveat here is that these actions can happen only if the user specifically chooses a location for a file: they need to select the file you want to load or choose the location where a file is saved.
We have a situation where users are allowed to upload content, and then separately make some changes, then submit a form based on those changes.
This works fine in a single-server, non-failover environment, however we would like some sort of solution for sharing the files between servers that supports failover.
Has anyone run into this in the past? And what kind of solutions were you able to develop? Obviously persisting to the database is one option, but we'd prefer to avoid that.
At a former job we had a cluster of web servers with an F5 load balancer in front of them. We had a very similar problem in that our applications allowed users to upload content, which might include photos and such. These were legacy applications and we did not want to edit them to use a database, and a SAN solution was too expensive for our situation.
We ended up using a file replication service on the two clustered servers. This ran as a service on both machines, using an account that had network access to paths on the opposite server. When a file was uploaded, this backend service synced the data in the file system folders, making it available to be served from either web server.
Two of the products we reviewed were ViceVersa and PeerSync. I think we ended up using PeerSync.
In our scenario, we have a separate file server that both of our front-end app servers write to; that way, either server has access to the same set of files.
The best solution for this is usually to provide the shared area on some form of SAN, which will be accessible from all servers and provides failover.
This also has the benefit that you don't have to provide sticky load balancing; the upload can be handled by one server and the edit by another.
A shared SAN with failover is a great solution with a great (high) cost. Are there any similar solutions with failover at a reasonable cost? Perhaps something like DRBD for windows?
The problem with a simple shared filesystem is the lack of redundancy (what if the file server goes down?).