Make a .Net DLL Thread-safe for Web App Consumption? - asp.net

I've written a class in VB.NET that is consumed by an ASP.NET Web Application running on IIS 7. I use .NET Framework 4.0. The class performs a REST request to an online service and retrieves an XML response containing strongly typed data.
This class also performs caching using an SQL Server database. The class is compiled to a DLL and referenced by the Web Application. It works well, and now I need to know how to make the class thread-safe.
I have no experience with making code 'thread-safe', and I don't know where to begin in determining whether or not it is. I'm assuming, because I didn't pay attention to this during development, that it is NOT thread-safe, and that since the web application will be used by many users at the same time, this needs to be addressed.
Can anyone point me to how to test for thread-safety? Are there any resources online that will give me some ideas? Are there any rules of thumb that will tip me off as to where my main concerns are?

The easiest thing to look out for is the use of "static" (C#) or "Shared" (VB.NET) variables. If these variables can be modified during the lifetime of the application, you will likely run into threading issues, which very often show up as "random looking" problems.
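To make that concrete, here is a minimal C# sketch (the class and member names are invented, not taken from your code): a mutable static dictionary shared by every request, plus one common way to guard it.

    using System.Collections.Generic;

    // Hypothetical example of the kind of shared state that causes trouble:
    // a single static dictionary shared by every request in the application.
    public static class ResponseCache
    {
        // Dictionary<TKey, TValue> is not safe for concurrent writes, so unguarded
        // access from multiple request threads can corrupt it or fail sporadically.
        private static readonly Dictionary<string, string> _items = new Dictionary<string, string>();
        private static readonly object _sync = new object();

        public static void Set(string key, string value)
        {
            lock (_sync)   // serialize all readers and writers on one lock object
            {
                _items[key] = value;
            }
        }

        public static bool TryGet(string key, out string value)
        {
            lock (_sync)
            {
                return _items.TryGetValue(key, out value);
            }
        }
    }

Since you're on .NET 4.0, another option is to replace the dictionary with System.Collections.Concurrent.ConcurrentDictionary and drop the lock entirely.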
I would also be concerned about how you are doing the caching in your database as multiple .NET threads hitting SQL (for the cache) could cause issues as well depending on how its designed.
Bottom line: you are likely going to need to learn more about threading if you want to be sure this won't have issues. Probably the best book I have ever read, covering simple through very complex C# topics, is C# 4.0 in a Nutshell. I would take a look at that book, especially the threading chapters. (Seriously, read the whole thing though.) If you get through it and have a good understanding of the concepts covered, you should be fine.

Related

How secure are singletons in ASP.NET?

Are singletons in ASP.NET shared between users/sessions? And if they are, are there any safety considerations? Think serializing/deserializing vulnerabilities, thread safety, etc.
Is this the way to go for settings from the database that are the same for all users?
Hand crafting the anti-pattern called "Singleton" in C# code is a really bad idea in general, ASP.NET or not.
The singleton lifetime that is supported in the dependency injection framework is a good idea if it does what you need.
I would advise you to only use it for read-only data, like settings, though. This is not a desktop application of old: your application might be recycled on the fly, or stretched across multiple nodes in a server farm. So suddenly your "singleton" is actually only a singleton if you have a single instance of your program running. Building your application so this becomes an artificial problem (i.e. the framework would support it, but your own code is built to fail if you actually do so) is not a smart way to go about this.
So to summarize: singleton lifetime in your dependency injection container? Might be okay, depending on your use case. An actual "Singleton" pattern in your code? Bad. Very bad. It tells me you don't actually do any unit testing and that there are no plans to take this application beyond a few thousand hobbyist users who don't care if your app is down every time you deploy.
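As a rough illustration of the difference, here is a minimal sketch in the ASP.NET Core minimal-API style (ISettingsProvider and CachedSettingsProvider are invented names): the container owns the single instance, and consumers receive it through the constructor or a handler parameter instead of a static Instance property.

    // Minimal sketch, assuming a .NET 6+ minimal-API project; the names below are hypothetical.
    var builder = WebApplication.CreateBuilder(args);

    // Singleton *lifetime*: the container creates one instance per application process.
    builder.Services.AddSingleton<ISettingsProvider, CachedSettingsProvider>();

    var app = builder.Build();
    app.MapGet("/site-name", (ISettingsProvider settings) => settings.Get("SiteName"));
    app.Run();

    public interface ISettingsProvider
    {
        string Get(string key);
    }

    // Read-only after construction, so a single shared instance is safe across requests.
    public class CachedSettingsProvider : ISettingsProvider
    {
        private readonly Dictionary<string, string> _settings =
            new() { ["SiteName"] = "Example" };   // e.g. loaded once from the database

        public string Get(string key) => _settings[key];
    }

Because nothing references a hard-coded static instance, a unit test can hand any fake ISettingsProvider to the class under test, and running two nodes in a farm simply means two independent instances of the same read-only data.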

Most reliable method for ASP.NET to Classic ASP 3DES/AES Encryption

I was reviewing some other posts here and found some options that I have done some research on, but I haven't quite found the information I'm looking for, such as reliability, request capacity, speed, etc.
So far, the three possible methods of (3DES/AES) encryption I've found for ASP.NET/Classic ASP compatibility are:
1. CAPICOM.dll
But how much work is involved when using it from the .NET app? I've heard it places extra variables in the encrypted data, which makes the process a bit more troublesome in .NET.
It is distributed by Microsoft, but how well does it operate under heavy workloads?
2. Chilkat
A third-party component; I've never tried it and don't know how well it handles large workloads. I have used a third-party component before, and it simply crashed when the workload on the server got too heavy.
3. ASP.NET web service using the .NET library
Use HTTP requests from Classic ASP to get the data. This is a possibility, but I'm just thinking something internal, like a DLL, would be quicker and more efficient/reliable?
Any help with this would be appreciated. Thank you.
If you're using ASP.NET C#, there's an existing namespace you can call that does the security stuff for you:
System.Security.Cryptography Namespace
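For example, a bare-bones AES encrypt/decrypt pair using that namespace might look like the sketch below (for 3DES, substitute TripleDES.Create()). For Classic ASP interop, the mode, padding, key, IV and text encoding all have to match exactly on both sides; the values here are illustrative only.

    using System.Security.Cryptography;
    using System.Text;

    // Illustrative AES-CBC helper built on System.Security.Cryptography.
    // Key and IV handling is deliberately simplified for the example.
    public static class AesExample
    {
        public static byte[] Encrypt(string plainText, byte[] key, byte[] iv)
        {
            using (var aes = Aes.Create())          // defaults: CBC mode, PKCS7 padding
            {
                aes.Key = key;                      // 16, 24 or 32 bytes
                aes.IV = iv;                        // 16 bytes
                using (var encryptor = aes.CreateEncryptor())
                {
                    byte[] data = Encoding.UTF8.GetBytes(plainText);
                    return encryptor.TransformFinalBlock(data, 0, data.Length);
                }
            }
        }

        public static string Decrypt(byte[] cipherText, byte[] key, byte[] iv)
        {
            using (var aes = Aes.Create())
            {
                aes.Key = key;
                aes.IV = iv;
                using (var decryptor = aes.CreateDecryptor())
                {
                    byte[] data = decryptor.TransformFinalBlock(cipherText, 0, cipherText.Length);
                    return Encoding.UTF8.GetString(data);
                }
            }
        }
    }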

Putting a new web interface on an old fat-client database

My company has a fairly old fat client application written in Delphi. We are very interested in replacing it with a shiny new web application. This will make maintenance a breeze and many clients want a web application.
The application is extremely rich in domain knowledge, some of which is out of our control. Our clients use the program to manage their own clients and report them to the government. So an inaccurate program is a pretty big thing. The old program has no tests. We are not sure yet if we will implement automated testing with the new one.
We first planned to basically start from scratch. But we are short-handed and want to get everyone on the web as soon as possible. So instead of starting from scratch, we've decided to try to make use of the legacy fat-client database.
The database is SQL Server and can be used in SQL Server 2008 easily. It is very rich in stored procedures, functions, a few triggers, and lots of tables with over 80 columns... but it is decently normalized. We want both the web application and the fat client to be able to use the same database, so that if something breaks badly in the web application, our clients can still use the fat client and connect to our servers. After the web application is considered "stable", we'd deprecate the fat client.
Has anyone else done this? What tips can you give? After getting everyone on the website, we want to slowly change the database structure to take care of some design deficiencies. What is the best way to keep this behind a data access layer so that later changes are easy?
And what about actually making the screens? Is there any easier way than just rewriting an 80-field form in ASP.NET? Are there any tools that can make this easier?
The current plan is to use ASP.NET WebForms (.NET 3.5). I'd really like to use MVC, but no one on the team knows it, including me.
"We are not sure yet if we will implement automated testing with the new one."
Implement automated testing. What's the point in replacing one buggy program with another?
Good question, but "slowly change" the db structure after getting everyone on the website sounds like a joke...
I would rather take the opportunity to create a fresh db structure, write a bulletproof migration script for your db that you can try out and rewrite a zillion times without any side effects for your clients, and then write whatever you want (fat/web) on the new db, have it tested, and migrate everyone when it's ready.
I have a couple suggestions:
1) Create a service layer to abstract away the dependence on the DAL. In a situation like the one you describe, having a layer of indirection for the UI and BLL to rely on makes DB changes much safer.
2) Create automated tests (both unit and integration), especially if you plan on making fairly significant changes to the Domain or Persistence layers (BLL/DAL). To make this really easy you should always try to program to an interface. This makes your code more flexible and lets you use mocking frameworks (Moq is one I like) to ensure your tests truly are unit tests and not integration tests; see the sketch just after this list.
3) Take a look at DDD (http://domaindrivendesign.org/) as it seems to fit pretty well with the given scenario. At the very least there are some very useful patterns that can help make your application more flexible.
4) MVC isn't very hard to learn at all; it is, however, an easy way to get unit testing set up for the UI, since the MVC architecture lets you test the controller and not the view. That said, there is no reason you couldn't unit test Web Forms, it's just a bit more work. MVC really is just a UI framework/design pattern (more Model2, but we can ignore that for now). It gets you closer to the metal, so to speak, as you will be writing a lot more HTML and using a Model (the 'M') for passing data around.
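To illustrate points 1 and 2, here is a small sketch (all names are invented, not from your system): the service depends on an interface rather than on the concrete DAL, so the database can be reworked behind it, and Moq can stand in for it in a unit test.

    using Moq;
    using NUnit.Framework;

    // The UI/BLL only ever see this interface; the SQL-backed implementation can change freely.
    public interface IClientRepository
    {
        Client GetById(int id);
    }

    public class Client
    {
        public int Id { get; set; }
        public bool ReportedToGovernment { get; set; }
    }

    public class ReportingService
    {
        private readonly IClientRepository _repository;

        public ReportingService(IClientRepository repository)
        {
            _repository = repository;
        }

        public bool NeedsReport(int clientId)
        {
            return !_repository.GetById(clientId).ReportedToGovernment;
        }
    }

    [TestFixture]
    public class ReportingServiceTests
    {
        [Test]
        public void NeedsReport_ReturnsTrue_WhenClientNotYetReported()
        {
            // The mock stands in for the real DAL, so this stays a true unit test.
            var repository = new Mock<IClientRepository>();
            repository.Setup(r => r.GetById(1))
                      .Returns(new Client { Id = 1, ReportedToGovernment = false });

            var service = new ReportingService(repository.Object);

            Assert.IsTrue(service.NeedsReport(1));
        }
    }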
For DDD take a look at Eric Evans book: http://www.amazon.com/Domain-Driven-Design-Tackling-Complexity-Software/dp/0321125215/ref=sr_1_1?s=books&ie=UTF8&qid=1317333430&sr=1-1
Hope that helps
ASP.NET Web Forms is a non-starter; it is completely inappropriate for something like this. I recommend starting with something like Creating an OData API for StackOverflow including XML and JSON in 30 minutes, then building your web app on top of that (i.e. push it to the client, use jQuery/Silverlight).
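For a rough idea of what that approach sets up, here is a WCF Data Services sketch (the class and entity names are invented, and a real service would sit over an Entity Framework model of the legacy database rather than this in-memory stub):

    using System.Linq;
    using System.Data.Services;
    using System.Data.Services.Common;

    [DataServiceKey("Id")]
    public class ClientRecord
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    public class LegacyData
    {
        // Each IQueryable property is exposed as an OData entity set (e.g. /Clients).
        public IQueryable<ClientRecord> Clients
        {
            get { return new[] { new ClientRecord { Id = 1, Name = "Acme" } }.AsQueryable(); }
        }
    }

    // Hosted as a .svc endpoint; jQuery/Silverlight clients query it over HTTP as XML or JSON.
    public class LegacyDataService : DataService<LegacyData>
    {
        public static void InitializeService(DataServiceConfiguration config)
        {
            config.SetEntitySetAccessRule("*", EntitySetRights.AllRead);
            config.DataServiceBehavior.MaxProtocolVersion = DataServiceProtocolVersion.V2;
        }
    }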

Is caching using Application slower or more problematic than using Global.asax.cs static variable?

We have a Webforms application that stores a bunch of settings and terminology mappings (several hundred) that are used throughout the application.
http://www.dotnetperls.com/global-variables-aspnet makes these assertions:
The Application[] collection .... may be slower and harder to deal with.
the Application[] object ...is inefficient in ASP.NET.
Is this recommended? Yes, and not just by the writer of this article. It is noted in Professional ASP.NET by Apress and many sites on the Internet. It works well
So I am wondering if these statements are true. Can anyone elaborate on why using Application is slower, or what kinds of problems can crop up if you use it? I am assuming that any problems or slowdowns might only surface under production loads, which is why I am asking for real-world experience rather than just benchmarking myself.
I am aware that there are many alternatives for caching (HttpRuntime.Cache, memcached, etc.), but specifically I want to know whether I need to go back and rewrite my legacy code that uses Application[]. In particular, if I am incurring a performance penalty in any way, I would want to get rid of it.
How are you saving these settings? I would recommend the web.config.
If you're using the web.config to store these settings (if they're application-wide values, that's a solid place to start), then there's no need for Application variables.
I try to steer clear of the Application level variables because they are way more expensive than Session variables.
Also, variables in the web.config / app.config files can change without having to change code and/or recompile your project.
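For instance, an application-wide value can live in appSettings and be read through System.Configuration (the key name here is made up):

    using System.Configuration;   // requires a reference to System.Configuration.dll

    // Sketch: read an application-wide value from <appSettings> in web.config, e.g.
    //   <appSettings>
    //     <add key="TerminologyServiceUrl" value="https://example.com/terms" />
    //   </appSettings>
    // "TerminologyServiceUrl" is an illustrative key, not one from your application.
    public static class AppSettingsExample
    {
        public static string TerminologyServiceUrl
        {
            get { return ConfigurationManager.AppSettings["TerminologyServiceUrl"]; }
        }
    }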
The Application class (global variables) only exists in ASP.NET to help with backwards compatibility with Classic ASP; you could say it's deprecated.
Another thing you could look into would be caching your settings so you're not always reading from disk.
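A sketch of that caching idea using HttpRuntime.Cache, which the question already mentions (LoadMappingsFromDatabase is a placeholder for the real lookup):

    using System;
    using System.Collections.Generic;
    using System.Web;
    using System.Web.Caching;

    // Keep the terminology mappings in memory and re-read them only when the cache entry expires.
    public static class TerminologyCache
    {
        private const string CacheKey = "TerminologyMappings";

        public static IDictionary<string, string> Mappings
        {
            get
            {
                var cached = HttpRuntime.Cache[CacheKey] as IDictionary<string, string>;
                if (cached == null)
                {
                    cached = LoadMappingsFromDatabase();
                    // Re-read at most once per hour; tune this to how often the data changes.
                    HttpRuntime.Cache.Insert(CacheKey, cached, null,
                        DateTime.Now.AddHours(1), Cache.NoSlidingExpiration);
                }
                return cached;
            }
        }

        private static IDictionary<string, string> LoadMappingsFromDatabase()
        {
            // Placeholder for the real database/disk read.
            return new Dictionary<string, string>();
        }
    }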

Best Practices Server Side Scripting or Web Services

Let me start off by stating that I am a novice developer, so please excuse the elementary nature of my question(s).
I am currently working on a Flex application, and I am getting more and more confused about when to use server-side scripting and when to develop web services. For most of the functionality I am working on, I take various files from the user (client), upload them to the server for processing/conversion, then send them back to the client in the new format.
I am accomplishing most of this using ASP.NET generic handlers (ashx files), but I'm not very confident this is best practice. At the same time, does creating web services make any more sense? What would be considered best practice for this? Any suggestions would be greatly appreciated.
The way I look at it is as follows:
Web Services mean Established Best Practice.
For most of our development, we don't need to create "Web Services", or what I think of when I think REST, SOAP, and the Twitter API. You only need to start doing that once you've got something you're going to be using every day for years.
Clean and DRY code will Lead you to Creating a Web Service
If you spend the time to clearly define the parts of your upload-process-render architecture, and you find that it can be applied to almost everything you are doing, then all you need to do to make it a web service is define a clear, 1-2-3 set of rules for using the system (GET/POST data, etc.). As long as you are consciously building an architecture the whole way, you'll end up creating a web service if it's worthy. Otherwise there's no need.
It sounds like you have a clear workflow going; I don't know anything about ASP.NET, though.
As far as it being confusing sometimes, and best practices, I suggest the following:
Create a Flex Library Project for your "generic ashx file handling" Flex classes. Give it a cool, simple name.
Create a .NET Library Project that encapsulates all the logic for your server-side file processing. Host it online and make it open source. I recommend GitHub. Test it as you go, and document it, its purpose, and the theory behind it.
If you don't have to do any more work at this point, and it's just plug and chug, then you've probably arrived at something that might become a web service, though that's probably a few years down the road.
I don't think you should try to create a web service right off the bat. Just make some clean and reusable code, make a few examples, get it online and open source, have others contribute and give feedback, and if it solves a specific problem, then make it a web service. You can probably just use REST for now and build your system around that. RestfulX is a great library for that.
Best,
Lance
Making web services without any sense makes no sense ;)
Now, in the world of Flex (AS3 with Flash Player 10), you can easily read local files, modify them with whatever modification algorithm you like, and save them locally without pinging the server.
You only have to use web services if you want to get some data from the server or send some data to it. That's all.
RSTanvir
Flash / Flex uses a simple HTTP POST approach for file uploads, so trying to do that using SOAP web services will be problematic. Your approach of using ASHX here sounds reasonable to me.
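For reference, a minimal generic handler along those lines might look like this sketch ("Filedata" is Flash's default upload field name for FileReference.upload; ConvertFile stands in for the real processing step):

    using System.IO;
    using System.Web;

    // e.g. Upload.ashx: receives a file POSTed from Flex and returns the converted result.
    public class UploadHandler : IHttpHandler
    {
        public void ProcessRequest(HttpContext context)
        {
            HttpPostedFile file = context.Request.Files["Filedata"];
            if (file == null)
            {
                context.Response.StatusCode = 400;   // nothing was uploaded
                return;
            }

            byte[] converted = ConvertFile(file.InputStream);

            context.Response.ContentType = "application/octet-stream";
            context.Response.BinaryWrite(converted);
        }

        public bool IsReusable
        {
            // No per-request state is kept in fields, so the handler instance can be reused.
            get { return true; }
        }

        private static byte[] ConvertFile(Stream input)
        {
            // Placeholder for the real processing/conversion logic.
            using (var buffer = new MemoryStream())
            {
                input.CopyTo(buffer);
                return buffer.ToArray();
            }
        }
    }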
To send / receive data that isn't file based (e.g. a list of files the user has uploaded previously), I would recommend looking at the open source Fluorine FX library. Fluorine uses AMF which is a highly performant way of doing data transfer with Flash. It's also purely configuration-based, which means you don't need to code against any of its APIs, just configure Fluorine to expose your .NET service classes. You could easily add attributes to those same classes to expose them as SOAP web services via WCF if you need that in the future. I would not recommend using SOAP with Flex however, due to the performance losses and also because the Flex implementation of SOAP has a history of bugs and interoperability problems.
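As a quick illustration of that "same class, two front ends" idea (the names are invented): a plain .NET service class that Fluorine could expose over AMF through configuration, with WCF attributes added so the same operations could later be published over SOAP as well.

    using System.ServiceModel;

    // Fluorine can expose this class over AMF via configuration alone; the WCF attributes
    // below would let the same operations be hosted as a SOAP endpoint if ever needed.
    [ServiceContract]
    public class FileCatalogService
    {
        [OperationContract]
        public string[] GetUploadedFiles(string userId)
        {
            // Placeholder: look up the files this user has uploaded previously.
            return new[] { "report.pdf", "data.xml" };
        }
    }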
