Limit the launches of an application feature - encryption

I have an application (written in Java) and I want to limit how many times one of its features can be launched (e.g. start a certain feature at most 1000 times). The application runs inside the company intranet and cannot use the public Internet. One trivial solution would be to save the launch count in an encrypted file, but that file can be copied and overwritten by the system administrator of the company where the application runs. Another solution would be to use a lightweight database, but I don't want to pull in a database system just to store one decreasing number.
Do you have any idea how to store this number securely?

There is no secure way. Once the user is in full control of the application, it can be analyzed, and every idea you have can be defeated. That's why there is software whose license restrictions have been removed, hacks that work around (offline) DRM, etc.
The usual way is to make hacking around your restriction too hard to be worth the effort. This can be done by restricting ways to debug the application, obfuscating the checks as much as possible, and interweaving them with the rest of the application so that attempts to hack around your restriction make the application malfunction.
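If you do store the counter in a file, you can at least make tampering detectable by signing the value. A minimal sketch, assuming an embedded key (which a determined attacker can of course extract), shown in C# to match the rest of this digest's examples; the same approach works with javax.crypto.Mac in Java. Note it does nothing against an administrator simply copying an old file back, which is exactly the limitation the question describes:

    using System;
    using System.IO;
    using System.Security.Cryptography;
    using System.Text;

    // Hypothetical tamper-evident counter file: "count:hex-mac".
    static class LaunchCounter
    {
        // Assumption: in practice you would obfuscate or derive this key.
        private static readonly byte[] Key = Encoding.UTF8.GetBytes("replace-with-embedded-secret");
        private const string CounterFile = "launches.dat";

        public static int Read()
        {
            if (!File.Exists(CounterFile)) return 0;
            var parts = File.ReadAllText(CounterFile).Split(':');
            if (parts.Length != 2 || Mac(parts[0]) != parts[1])
                throw new InvalidDataException("Counter file was tampered with.");
            return int.Parse(parts[0]);
        }

        public static void Write(int count) =>
            File.WriteAllText(CounterFile, count + ":" + Mac(count.ToString()));

        private static string Mac(string data)
        {
            using var h = new HMACSHA256(Key);   // keyed hash over the counter value
            return Convert.ToHexString(h.ComputeHash(Encoding.UTF8.GetBytes(data)));
        }
    }

This only raises the bar, consistent with the answer above: editing the file is detected, but restoring a copied older file is not.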

Related

ASP.NET: secure my application so no one can transfer it and reuse it

I'm making an ASP.NET web application which will run locally on IIS for a single user. I don't want this user to access my application files (in the wwwroot) or to bring in another programmer and steal my code. I just want the user to be able to access the website, and to stop any programmer from reading my source code.
I heard about a USB security device called a "dongle", but can it be used in a situation like this?
Any ideas? Thanks in advance.
The website is just running code, but like anything, once the user has it they can do what they like with it, whether you like it or not. That's why there is a multi-million {currency} industry around securing applications.
You could use dongles, but they're expensive and not trivial to implement. As @volleyball said, obfuscation would slow down most people from decompiling your app; without obfuscation, any licensing or dongle checks could just be patched out of your code.
Your most secure route would be to not give it to them: it's a web app, so host it yourself. This may, of course, not meet your requirements.
Simon
I have never heard of a web application that uses a dongle. This is normally reserved for regular Windows apps, and even then it's falling out of vogue. Generally speaking, some of the more expensive software packages still use them.
However, the cost of duplicating a dongle is pretty low. Combine that with the fact that getting around such security is relatively easy anyway, and you have a situation in which you really shouldn't bother.
As Simon said, if it's a web app, host it. Otherwise obfuscate it.
If neither of those are possible, then I'd recommend you change your licensing deal with your client to include the possibility of them going elsewhere. Perhaps for an additional charge you'll give them a non-exclusive site license permitting them to do whatever they want with the code short of selling it or giving it to another entity.
Did you look at obfuscators? They do a good job of scrambling code. 99% of the time your code cannot be reverse engineered, in the sense that ordinary people won't manage to undo the obfuscation. But someone determined sitting on your stolen code can still reverse engineer it. And if the person is very intelligent, he will not bother to reverse engineer it; he will write better code himself.

Checklist for ASP.NET / Database performance

Recently our customers started to complain about poor performance on one of our servers.
This server hosts multiple large CMS implementations and a lot of small websites using Sitefinity.
Our hosting team is now trying to find the bottlenecks in our environments, since there are some major issues with load times. I've been given the task of writing one big list of things to look out for, divided into parts (IIS, ASP.NET, web-specific).
I think it would be good to find out how many instances of the Sitecore CMS we can run on one server according to the Sitecore documentation and the like. We want to be able to monitor and find out where our bottleneck is at this point. Some of our websites load terribly slowly, others load very fast. Most of our Sitecore implementations on this server have poor back-end performance and terrible load times after a compilation.
Our Sitecore solutions run on a Windows Server 2008 x64 machine with Microsoft SQL Server 2008 for the databases.
I understand it might be helpful to give more detailed information about our setup, but I'm hoping we can get some useful basic information on how to monitor and find bottlenecks and the like.
What tools / hints / tips & tricks do you have?
Do NOT use too many different ASP.NET application pools (what Plesk calls a dedicated pool); place more sites in the same pool.
Add more memory, or stop unused programs/services on the server.
Check whether you have memory limits on the application pool that cause continuous auto-restarts of the pool.
On the database, set the recovery model to Simple.
Shrink the database files and reindex the database from within the management tools (a scripted version is sketched after these tips).
After all that, defragment your disks.
Check the memory with Process Explorer.
To check what starts with your server, use Autoruns, but be careful not to disable any critical service or the computer may never start again. Do not stop services from Autoruns; use the Services manager to change their startup type to Manual. Also, many SQL Server services do not need to run if you never use them.
Some other tips
Move the temporary files (and maybe the ASP.NET build directory) to a different disk.
Delete all files from the temporary directory (cd %temp%).
Make sure the free physical memory is not near zero, using Process Explorer. If it is, the server needs more memory, or you need to stop unused programs from running.
To place many sites in the same pool, you need to change the permissions of the sites under the new shared pool. It's not difficult; it just takes some time and organization to know which site runs under which pool. Now let's say you have 10 sites: it's better to use 2 different pools and spread the sites across them based on the load of each site.
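The database steps above can be scripted rather than clicked through. A rough sketch, assuming SqlClient, a placeholder connection string and database name ("MyCmsDb"), and the undocumented but widely used sp_MSforeachtable helper:

    using Microsoft.Data.SqlClient;   // or System.Data.SqlClient on older stacks

    // "MyCmsDb" and the connection string are placeholders.
    using var conn = new SqlConnection("Server=.;Database=MyCmsDb;Integrated Security=true");
    conn.Open();

    foreach (var sql in new[]
    {
        "ALTER DATABASE [MyCmsDb] SET RECOVERY SIMPLE",   // simple recovery model
        "DBCC SHRINKDATABASE (MyCmsDb)",                  // shrink the files
        // reindex every table; shrinking fragments indexes, so rebuild after
        "EXEC sp_MSforeachtable 'ALTER INDEX ALL ON ? REBUILD'"
    })
    {
        using var cmd = new SqlCommand(sql, conn) { CommandTimeout = 0 };
        cmd.ExecuteNonQuery();
    }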
There is no immediate answer to Sitecore performance tuning, but here are some vital tips:
1) CACHING
Caching is everything. The default Sitecore cache parameters are rarely correct for any application. If you have lots of memory, you should increase the cache sizes:
http://learnsitecore.cmsuniverse.net/en/Developers/Articles/2009/07/CachingOverview.aspx
http://sitecorebasics.wordpress.com/2011/03/05/sitecore-caching/
http://blog.wojciech.org/?p=9
Unfortunately this is something the developer should be aware of when deploying an installation, not something the system admin should care about...
2) DATABASE
The database is the last bottleneck to check. I rarely touch the database. However, the DB performance can be increased with the proper settings:
Database properties that improve performance:
http://www.theclientview.net/?p=162
This article on index fragmentation is very helpful:
http://www.theclientview.net/?p=40
I can't speak for Sitefinity, but here are some tips for Sitecore.
Use Sitecore's caching whenever possible, especially on XSLT renderings (they tend to be simpler than layouts & sublayouts, so Sitecore caching doesn't break them the way it breaks ASP.NET postbacks). Of course this only helps if the renderings & sublayouts etc. are accessed a lot. Use /sitecore/admin/stats.aspx?site=website to check what isn't cached.
Use Sitecore's profiler: open up an item in the profiler and see which sublayouts etc. are taking time.
Only use XSLTs for the simplest content; if it gets any more complicated than that, I'd go for sublayouts (ASP.NET controls). This is a bit biased as I'm not fond of XSLT, but experience indicates that .ascx files are faster.
Use IIS's content expiration on the static files (probably all of /sitecore, plus any image, JavaScript & CSS files). This is for IIS 6: msdn link
Check database access times with Sitecore's Databasetest.aspx (the one for Sitecore 6 is a lot better than the simple one that works on Sitecore 5 & 6): Sitecore SDN link
And that's all I can think of off the top of my head.
Sitecore has a major flaw: it uses GUIDs for primary keys (among other poorly chosen data types). This fragments the tables from the first insert, and if you have a heavily utilised Sitecore database, fragmentation can exceed 90% within an hour. This is not a well-designed database, and I recommend looking at other products until they fix it; it is causing us a major performance headache (in time and money).
We are at a standstill: we cannot add any more RAM, and we cannot rebuild the indexes any more often.
Also, set IIS to recycle the app pool only once a day, at a specific time; I usually set mine for 3 am. That way the application never goes to sleep or gets recycled during the day, which keeps spin-up times down.
Additionally, configure the application pool to 'always running' instead of starting on demand. That way, when the application restarts, it recompiles immediately and is ready to roar again. Both settings are sketched below.
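A possible way to script both settings, assuming the Microsoft.Web.Administration API (run elevated; the StartMode property needs IIS 8 or later) and a hypothetical pool name:

    using System;
    using Microsoft.Web.Administration;   // ships with IIS

    // "MySitePool" is a placeholder pool name.
    using (var iis = new ServerManager())
    {
        var pool = iis.ApplicationPools["MySitePool"];

        // Recycle once a day at 03:00 instead of the 1740-minute default.
        pool.Recycling.PeriodicRestart.Time = TimeSpan.Zero;   // disable interval-based recycling
        pool.Recycling.PeriodicRestart.Schedule.Clear();
        pool.Recycling.PeriodicRestart.Schedule.Add(new TimeSpan(3, 0, 0));

        // Keep the worker process warm so the first request doesn't pay spin-up cost.
        pool.StartMode = StartMode.AlwaysRunning;

        iis.CommitChanges();
    }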
Sitefinity is really a fantastic piece of software (hopefully my tips above get the thumbs up, and not my endorsement of the product). haha

keep track of folder's activities

I want to monitor the activities in all the folders present at "C:\Inetpub\ftproot\san". Users can work on any type of file, not only text files. Since we have given, let's say, 1 GB of space to each user, a user can do anything to utilize this space.
Now I want to monitor what the user does in his folder, like creating a new file, deleting an existing file, or editing a file. I want to monitor the user's activities because I have to keep track of the space given to the user, so that I can restrict the user to 1 GB of space and no more.
Is there any class I can use other than FileSystemWatcher, since it seems to work only in console applications and not in web applications?
Any help would be highly appreciated.
Many thanks
FileSystemWatcher should work just fine in a web application, but the problem is that the web application isn't always on. If nobody accesses it for a while, it can be shut down to free resources for other things and then started again on the next request. It can also be restarted easily when things within its own structure change (such as its config file). It's very stateless and transient.
How do you plan to use this information within the web application? Does your web application really need to be constantly watching, or does it just need to generate a report of the current state of the filesystem when requested? If you really need the former, then the aforementioned nature of how web applications run on the server will make things a bit unreliable. Maybe a Windows service running on the web server would be more up to the task?
First, I would break this down into:
What are the possible ways users can modify the contents of the folder?
What are you trying to accomplish/present to the user via the Web interface?
One somewhat simple way to do this would be to maintain a service on the machine that periodically monitors the directory for the information you need (size, number of files, whatever), and connect it (via something like WCF) to the actual web application. In effect you'd have a semi-soft limit, in that for a while users could store more than 1 GB, but there are obviously corrective measures you could take, and this way you don't have to monitor every action of every user in real time.
Off the cuff, I would think you need some sort of service that uses the FileSystemWatcher class. The only way to really watch over the directory from ASP.NET is if you are controlling how all files get added to and deleted from the directory. If that is the case, then you can add code to walk the directory and add up the sizes of everything in there pretty easily (sketched below).
If they can put files in these folders with other applications (such as an FTP client), then you are going to need a service to watch over the folders.
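A minimal sketch of that size check, with a hypothetical IsOverQuota helper and placeholder paths; you would call it before accepting each upload:

    using System;
    using System.IO;
    using System.Linq;

    Console.WriteLine(IsOverQuota(@"C:\Inetpub\ftproot\san\someUser"));

    // Sums the bytes under one user's folder and compares to the quota.
    static bool IsOverQuota(string userFolder, long quotaBytes = 1L << 30)   // 1 GiB
    {
        long used = new DirectoryInfo(userFolder)
            .EnumerateFiles("*", SearchOption.AllDirectories)
            .Sum(f => f.Length);
        return used > quotaBytes;
    }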
A better way of doing this is to let your web app run as a portal onto what's happening, with a Windows service running to ensure that nobody goes over the space allotment.
The service would also be able to feed data to your portal.
Remember, a website only runs when someone calls it. So if nobody uses your website for 5 days, nothing will be monitoring.
Sure, you could keep a web page open for X days, but that's overkill. A minimal version of such a service is sketched below.
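A minimal sketch of such a Windows service, assuming the folder from the question; the Recalculate hook is a placeholder for whatever bookkeeping the portal reads:

    using System.IO;
    using System.ServiceProcess;   // reference System.ServiceProcess

    public class FolderWatchService : ServiceBase
    {
        private FileSystemWatcher _watcher;

        protected override void OnStart(string[] args)
        {
            _watcher = new FileSystemWatcher(@"C:\Inetpub\ftproot\san")
            {
                IncludeSubdirectories = true,
                NotifyFilter = NotifyFilters.FileName | NotifyFilters.Size,
                EnableRaisingEvents = true
            };
            _watcher.Created += (s, e) => Recalculate(e.FullPath);
            _watcher.Changed += (s, e) => Recalculate(e.FullPath);
            _watcher.Deleted += (s, e) => Recalculate(e.FullPath);
        }

        protected override void OnStop() => _watcher?.Dispose();

        private void Recalculate(string path)
        {
            // Placeholder: update the per-user usage figure that the web
            // portal reads (e.g. via WCF or a shared table).
        }

        public static void Main() => Run(new FolderWatchService());
    }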

Best Practices for Self Updating Desktop Application in a network environment

I have searched through google and SO for possible answers to this question, but can only find small bits of information scattered around the place, most of which appear to be personal opinion.
I'm aware that this question could be considered subjective, but I'm not looking for personal opinion, rather facts with reasons (e.g. past experience) or even a single link to a blog/wiki which describes best practices for this (this is what I'd prefer to be honest). What I'm not looking for is how to make this work, I know how to create a self updating desktop application.
I want to know about the best practices for creating a self updating desktop application. The sort of best practices I'm especially curious about are:
Do you force an update if the client's software is out of date, but not going to break when trying to communicate with other versions of the software or the database itself? If so, how do you signify this breaking change?
How often should you check for updates? Weekly/daily/hourly and exactly why?
Should the update be visible to the user or run behind the scenes from a UI point of view?
Should you even notify the user that there is an update available if it is not a major update? (for instance fixing a single button in a remote part of the application which only one user actually requires)
Should you try to patch the application or do you re-download the entire application from scratch Macintosh style?
Should you allow users to update from a central location or only allow updating through the specified application? (for closed business applications).
Surely there are some written rules/suggestions about this stuff? One of the most annoying things about a lot of applications is the updating, as it's hard to find a good balance between "out of date" and "in the user's face".
If it helps, consider this to be written in C# (.NET) for a single client, running on machines with constant connectivity to the update server; all of these machines talk to each other through the application, and all also talk to a central database server.
One best practice that a lot of software overlooks: ask to update when the user is closing your application, NOT when they have just launched it.
It's incredible how many apps don't do that (Firefox, for example). You just launched the app because you want to use it now, and instead it prompts you to update, which of course is going to take 5 minutes and require restarting the app.
This is nonsense. Just do the update at the end.
It's hard to give a general answer; it depends on the context: the criticality of the update, what kind of app it is, user preferences, number of users, network bandwidth, etc. Here are some of the options/trade-offs.
Do you force an update if the client's software is out of date, but not going to break when trying to communicate with other versions of the software or the database itself? If so, how do you signify this breaking change?
As a developer, your best interest is to have all apps out there as up to date as possible. This reduces your maintenance effort. Thus, if the user does not mind, you should update.
How often should you check for updates? Weekly/daily/hourly and exactly why?
If the updates are transparent to the user and do not require an immediate restart of the app, then I'd suggest you check as often as the communication bandwidth allows (considering both the update check, frequent but small, and the download, infrequent but large).
Should the update be visible to the user or run behind the scenes from a UI point of view?
Depends on the user's preferences but also on the type of update: bug fixes vs. functionality/UI changes (the user will be puzzled to see the look and feel change with no prior alert).
Should you even notify the user that there is an update available if it is not a major update? (for instance fixing a single button in a remote part of the application which only one user actually requires)
Same arguments as for the previous question.
Should you try to patch the application or do you re-download the entire application from scratch Macintosh style?
If the app is small, re-download it from scratch. This prevents all sorts of weird bugs created by mismatches between different patches ("DLL hell"). However, it may mean long download times or a heavy toll on your network.
Should you allow users to update from a central location or only allow updating through the specified application? (for closed business applications).
I think both.
From practical experience: don't forget to add functionality for updating the update engine itself. This means performing an update is usually a two-step approach (a sketch follows this list):
Check if there are updates to the update engine
Check if there are updates to the actual application
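A rough sketch of that ordering; the endpoints and the DownloadAndApply helper are placeholders, not a real update API:

    using System;
    using System.Net.Http;
    using System.Threading.Tasks;

    // Example invocation; versions would come from the installed binaries.
    await CheckForUpdatesAsync(new HttpClient(), new Version("1.0.0"), new Version("2.3.0"));

    static async Task CheckForUpdatesAsync(HttpClient http, Version engineVersion, Version appVersion)
    {
        // Step 1: is there a newer update engine?
        var latestEngine = new Version(await http.GetStringAsync("https://updates.example.com/engine/latest"));
        if (latestEngine > engineVersion)
        {
            await DownloadAndApply("engine");   // then restart; the new engine repeats this check
            return;
        }

        // Step 2: only an up-to-date engine checks for application updates.
        var latestApp = new Version(await http.GetStringAsync("https://updates.example.com/app/latest"));
        if (latestApp > appVersion)
            await DownloadAndApply("application");
    }

    static Task DownloadAndApply(string component) => Task.CompletedTask;   // placeholder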
Do you force an update if the client's software is out of date, but not going to break when trying to communicate with other versions of the software or the database itself? If so, how do you signify this breaking change?
A common practice is to have a "ProtocolVersion" method which indicates the lowest/oldest version allowed.
The "ProtocolVersion" can either supplied by the client or the server depending on the trust level you have between the client and the server. In a low trust level it is probably better to have the client provide the "ProtocolVersion" and then deny access server side until the client is updated. In a "high trust level" scenario it will be easier to have the server supply the "ProtocolVersion" it accepts, and then all the logic for adapting to this - including updating the client application - implemented in the client only. Giving the benefit that the version check/handling code only needs to be in one place.
Do not ever try to force an update unless your lawyers demand it. Show the user an update notification she can either accept or ignore, and try not to nag her about the same version too much if she rejected it. To help her make the decision, include a link to the release notes or a short summary of the changes.
Weekly would be a good default update-check interval, but let the user choose it, up to and including disabling update checks completely. Do not check too often, because she might be on an expensive mobile data plan, or she may just not like the idea of an application phoning home.
The update-check part should be completely silent. If an update is found, display a notification for the user. During download and installation, show a progress bar.
To keep this simple, notify the user about any newer version. If you do not want to annoy them with frequent updates containing just a few minor bug fixes, do not publish every minor version to the download location watched by the update checker.
Maintaining patches for all previously released versions is too much work. If the download size becomes a problem, figure out some way other than patches to make it smaller (a 7-Zip-compressed self-extracting exe, splitting the application into multiple MSI packages with independent versions, etc.).
Two more things:
Do not implement the update engine as a process that is constantly running in the background even when I'm not using your application. My PC already has ~10 such processes hogging resources, which is very annoying.
When updating the update engine itself, on one hand you need the engine running to show the installation progress UI, but on the other hand the updater process must be closed, or the locked exe file will force a reboot. There are a number of tricks: running a helper copy from %TEMP% (sketched below), using the Windows Installer restart manager, renaming the updater exe before starting the installation package, etc. Keep this in mind when architecting the update engine.
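A sketch of the %TEMP%-helper trick; paths and command-line arguments are illustrative:

    using System.Diagnostics;
    using System.IO;

    static class UpdaterBootstrap
    {
        // Copies the updater out of the (soon to be replaced) install folder
        // and runs the copy, so the original exe is no longer locked.
        public static void RunViaTempCopy(string updaterExe, string installerPackage)
        {
            string tempCopy = Path.Combine(Path.GetTempPath(), Path.GetFileName(updaterExe));
            File.Copy(updaterExe, tempCopy, overwrite: true);

            Process.Start(tempCopy, "--install \"" + installerPackage + "\"");
            // The original process now exits so the install folder is unlocked.
        }
    }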
Do you force an update if the client's software is out of date, but not going to break when trying to communicate with other versions of the software or the database itself? If so, how do you signify this breaking change?
Ask the user.
How often should you check for updates? Weekly/daily/hourly and exactly why?
Ask the user.
Should the update be visible to the user or run behind the scenes from a UI point of view?
Ask the user.
Should you even notify the user that there is an update available if it is not a major update? (for instance fixing a single button in a remote part of the application which only one user actually requires)
Ask the user (notice a trend here?).
Should you try to patch the application or do you re-download the entire application from scratch Macintosh style?
Typically, patch, if the application is of any significant size.
As far as the "ask the user" responses go, that doesn't mean prompting them every single time. Instead, give them the option to set what they should be prompted for and what should just be done invisibly (and the first time a given thing occurs, ask them what should be done in the future, and remember it). This shouldn't be very difficult, and you gain a lot of goodwill from a larger portion of your user base, since it's very hard for fixed settings to suit everyone who uses your app. When in doubt, more options are better than fewer, especially when they're the kind of option that's fairly trivial to code.

Is it commonplace/appropriate for third party components to make undocumented use of the filesystem?

I have been using two third-party components for PDF document generation (in .NET, but I think this is a platform-independent topic). I will leave the companies' names out of it for now, but I will say they are not extremely well-known vendors.
I have found that both products make undocumented use of the filesystem (i.e. putting temp files on disk). This has created a problem for me in my ASP.NET web application, as I now have to identify the file locations and set permissions on them as appropriate. Since my web application is set up for impersonation using Windows authentication, this essentially means I have to assign write permissions to a few file locations on my web server.
Not that big a deal once I figured out why the components were failing, but I see this as a maintenance issue. What happens when we upgrade our servers to an OS that changes one of the temporary file locations? What happens if the vendor decides to change the temporary file location? Our application will "break" without a single line of our code changing. Relatedly, if we have to stand this application up on a fresh machine (regardless of environment), we have to know about this issue and set the permissions appropriately.
Unfortunately, the components do not provide a way to make this temporary file path configurable, which would at least make what is going on under the covers more explicit.
This isn't really a question that I need answered, but more of a kick off for conversation about whether what these component vendors are doing is appropriate, how this should be documented/communicated to users, etc.
Thoughts? Opinions? Comments?
First, I'd ask whether these PDF generation tools are designed to be run within ASP.NET apps. Do they make claims that this is something they support? If so, then they should provide documentation on how they use the file system and what permissions they need.
If not, then you're probably using an inappropriate tool set. I've been there and done that: I worked on a project where a "well known address lookup tool" was used, but the version we used was designed for desktop apps. As such, it wasn't written to cope with hundreds of requests, many of them simultaneous, and it caused all sorts of hard-to-reproduce errors.
Commonplace? Yes. Appropriate? Usually not.
Temp files are one of the appropriate uses IMHO, as long as they use the proper %TEMP% folder or, even better, the integrated Path.GetTempPath/Path.GetTempFileName functions (sketched below).
In an ideal world, every third-party component would ship with a Code Access Security description listing in detail what it needs (and for what purpose), but CAS is possibly one of the most ignored features of .NET...
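For comparison, a sketch of what the well-behaved version looks like from the component's side, with RenderPdf standing in for the vendor's real work:

    using System.IO;

    // Resolve the temp location through the framework instead of
    // hard-coding a path, and clean up afterwards.
    string tempFile = Path.GetTempFileName();   // creates a zero-byte file under %TEMP%
    try
    {
        File.WriteAllBytes(tempFile, RenderPdf());   // placeholder for the real work
        // ...hand the file to the caller or stream it out...
    }
    finally
    {
        File.Delete(tempFile);
    }

    static byte[] RenderPdf() => new byte[0];   // stand-in for the component's rendering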
Writing temporary files would not be considered outside the normal functioning of any piece of software. Unless it is writing temp files to a really bizarre place, this seems more likely to be something they never thought to document than something done to cause you trouble. I would simply contact the vendor, explain what you are doing, and ask if they can provide documentation.
Also, Martin makes a good point about whether it is an app that should run under ASP.NET or a desktop app.
