Strategy for separating common logic across multiple websites - asp.net

I have a scenario where I have multiple websites using a common DLL for authentication and general user-detail fetching.
I now need to update the common DLL with slightly different login logic, which means pushing the new DLL into every website and running a release process for each one.
I'm wondering whether it would be better to host the common authentication methods in a web service of some sort and have the websites call that internally. Would it be an internal web service? Ajax callbacks from a server-side-only website? Or should I stick with the DLL method to ensure code changes don't break the sites?
Are there any security concerns when not using a DLL for this kind of task?

Using a web service seems a good way to do that. It will use less memory and can be updated independently of the websites (if ever needed).
You could maybe go for a WCF service (with duplex TCP?).

Both approaches work in my opinion, but there are significant differences between them that we should keep in mind.
First of all, all of this also depends on the language you are using, because the best theoretical answer is not always easy to implement in every language, which can make it practically unusable.
So, with this in mind, the best way for me is to have an internal web service that handles all requests regarding this "authentication and general user detail" module, assuming that all websites use the same DB (or data layer); otherwise, you would need to create a web service for each one, which is a completely different story. This approach gives you flexibility and maintainability. You could make direct Ajax requests to this web service, or make the calls from your website's server and then reply to the browser with that information. The second option is more time-consuming but much more secure, and if it is a truly internal web service (i.e. hosted on the same machine), the lag will not be noticeable.
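If you go the internal web service route, the contract can stay very small. Below is a minimal sketch of what such a WCF contract might look like; the operation names (ValidateUser, GetUserDetails) and the UserDetails type are hypothetical placeholders, not anything from the original question.

using System.Runtime.Serialization;
using System.ServiceModel;

// Hypothetical contract for the shared authentication/user-detail service.
[ServiceContract]
public interface IAuthenticationService
{
    // Returns true when the supplied credentials are valid.
    [OperationContract]
    bool ValidateUser(string userName, string password);

    // Fetches general user details once the user is authenticated.
    [OperationContract]
    UserDetails GetUserDetails(string userName);
}

[DataContract]
public class UserDetails
{
    [DataMember] public string UserName { get; set; }
    [DataMember] public string DisplayName { get; set; }
    [DataMember] public string Email { get; set; }
}

Each website would then reference only the contract and call the service, so changing the login logic means redeploying one service instead of every site.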
The DLL approach is best suited when you need to apply the same business logic to completely separate services. In practical terms: you have two completely separate websites that use the same kind of authentication logic. Keep in mind that using this approach with websites that share the same data layer will often force you to keep "deprecated" behaviour working alongside the new implementation while you update the DLL on all websites.
Regards,

Related

How can I programmatically load cookies inside a Visual Studio Web Performance Test?

I work for a Canadian government department, and our group primarily uses tools from Microsoft, including Visual Studio. We need to carry out load-testing on one of our department's web applications. I have no prior experience with load-testing, but from what I understand, this would entail creating web performance tests that record various testing scenarios, and then creating load tests that point to those web performance tests.
One complication is that our application relies on an external authentication service, a service used by other applications (and other departments). Our service agreement with this service provider explicitly stipulates that we not subject the service to load-testing.
So we'll need to find a way to bypass the authentication mechanism to carry out our load-testing. Here's the outline of one strategy a colleague and I came up with:
1. Log in normally to the web site, going through the authentication service as usual.
2. Use developer tools in the browser to capture the cookie(s) created when authenticating.
3. Create a web performance test, and add some code to it to use the cookie(s), thereby reusing the session I had established when logging in manually.
But I'm not entirely confident that this is the right approach. And even if it is, I have no prior experience with creating web performance tests or load tests, so I'm a bit lost as to how to go about programmatically loading a cookie inside a web performance test.
Does anyone have any suggestions?
I would break down the task into smaller pieces. If your main job is to load test the application, I would set it up on the internal network with Windows authentication or anonymous authentication, and modify the application to avoid having to deal with that part of the problem.
For the authentication piece of the problem, try to set it up so a single static cookie will work every time. (If you need thousands of distinct user cookies, this becomes a bigger job, of course.)
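If you do go down the Visual Studio route, a request plugin is one place to inject that captured cookie. The code below is only a sketch, assuming the Microsoft.VisualStudio.TestTools.WebTesting request-plugin model; the cookie name, value, and domain are placeholders you would replace with whatever you captured from the browser.

using Microsoft.VisualStudio.TestTools.WebTesting;

// Attach this plugin to the web performance test so every request
// carries the session cookie captured from the manual login.
public class AddSessionCookiePlugin : WebTestRequestPlugin
{
    public override void PreRequest(object sender, PreRequestEventArgs e)
    {
        // Placeholder values; substitute the real cookie name, value and domain.
        e.Request.Cookies.Add(
            new System.Net.Cookie(".ASPXAUTH", "captured-cookie-value", "/", "myapp.example"));
    }
}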
See here for a discussion of the Apache JMeter cookie manager.
I would ask if the authentication could be stubbed out. Instead of calling the 3rd party, call a stub application which will return the equivalent responses. That way, instead of stressing the 3rd party, it's only your (self-hosted) stub that is affected.
This is the mirror image of not having a front-end application, in which case a test harness would be required to emulate the front-end; a stub is the equivalent for emulating a back-end application.

Is it more secure to put data access in a web service rather than a class within the current project?

We have a few projects that we put all the data access in a separate web service project and the parent project will call the web service for everything data related. The web service will only accept connections from the web project server. My assumption is that the web service would be less susceptible to intrusion this way. I'm not really sure this is correct.
Is this more secure than just putting the data access in a class or dll within the parent project?
NOTE
Developers above me made this decision.
I don't see that as an effective way of securing your database. Of all the various ways that exist to protect your data layer, I don't think that moving calls from a class library to a web service is an effective way to protect yourself.
A better approach would be to make sure that you use parameterized queries or stored procedures to prevent SQL injection, and limit the privileges of your logins to only the operations that they need to perform.
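For example, a parameterized command keeps user input out of the SQL text entirely; the table, column, and connection string below are made up for illustration.

using System.Data.SqlClient;

// Parameterized query: the user-supplied value is sent as data,
// never concatenated into the SQL statement.
public static int GetUserId(string connectionString, string userName)
{
    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand(
        "SELECT UserId FROM Users WHERE UserName = @userName", connection))
    {
        command.Parameters.AddWithValue("@userName", userName);
        connection.Open();
        object result = command.ExecuteScalar();
        return result == null ? -1 : (int)result;
    }
}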
However, there would be other arguments for having data access in a separate web service... such as re-usability, or a service-oriented architecture. If the same data access layer is needed from a variety of projects on multiple servers, by having the web service you wouldn't need to have the same class library duplicated all over the place... which would cause you to worry about which project has which version of your data access layer.
So, more secure? I don't think so... Other benefits? Probably...
Short answer: Yes
Longer answer: My assumption is that the web server that is exposing the services is behind its own firewall. Doing it this way insulates the database from intrusion by forcing hackers to go through another layer if they were able to compromise your application servers. Since the database connection strings do not exist on the app server, and a firewall prevents direct connections from that server to the database, the hackers would need to somehow puncture that firewall and gain access to the server that is hosting your data services.
Now, I also assume that the web services are not simply exposing methods like
execute(string sqlCommand)
If that's the case, then this solution might actually be less secure than simply using the database without the web services. For this solution to truly be more secure, you would want to create operation-specific methods on the web service server.
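To illustrate the difference, here is a rough sketch of what operation-specific methods might look like; the contract and operation names are invented for the example.

using System.ServiceModel;

// Operation-specific contract: callers can only invoke these named
// operations; they can never send arbitrary SQL to the database.
[ServiceContract]
public interface ICustomerService
{
    [OperationContract]
    string GetCustomerEmail(int customerId);

    [OperationContract]
    void UpdateCustomerEmail(int customerId, string newEmail);
}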
A DLL can't be accessed and executed from the Web, so far as I know. A Web service can. If that's true, the class library referenced by a Web project (or even a Web Service) is more secure than a Web service encapsulating that logic directly.
Further, there's the whole notion of Separation of Concerns. In my mind, data access logic belongs on a separate tier, completely separate from business logic. In a well designed architecture, Web services expose discrete methods that represent business transactions--not necessarily data transactions. Business transactions encapsulate one or more data transactions, which are represented by separate classes that encapsulate the data access logic and provide the security to ensure that SQL injection never occurs.
Others, naturally, may disagree. We're developers. It's our nature to disagree. :)

Is Silverlight more friendly to load-balancing than ASP.NET?

I was discussing load-balancing with a colleague at lunch. I admit that I know very little about this topic. We were discussing the various ways of maintaining session state in an ASP.NET application -- none of which suited the high-performance load balancing he was looking for.
What about Silverlight? says I. As far as I know it is stateless, you've got the app running in the browser and you've got services on the server that feed/process data.
Does Silverlight totally negate the need for Session state management? Is it more friendly to load-balancing? Is it something in between?
I would say that Silverlight is likely to be a little more load-balancer friendly than ASP.NET. You have much more sophisticated mechanisms for maintaining state (such as isolated local storage), and pretty much, you only need to talk to the server when (a) you initially download the application, and then (b) when you make a web service call to retrieve or update data. It's analogous in this sense to an Ajax application written entirely in C#.
To put it another way, so long as either (a) your server-side persistence layer knows who your client is, or (b) you pass in all relevant data on each WCF call, it doesn't matter which web server instance the call goes to. You don't have to muck about with load-balancer-level persistence (sticky sessions) to make sure your HTTP call goes back to the right web server.
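As a rough illustration of keeping client state off the server, Silverlight's isolated storage can hold per-user state locally; this is only a sketch, and the setting key is an example, not anything required by the platform.

using System.IO.IsolatedStorage;

// Store a small piece of client-side state in Silverlight's isolated storage,
// so the server does not need to keep any session for it.
public static class ClientState
{
    public static void SaveLastSearch(string searchText)
    {
        var settings = IsolatedStorageSettings.ApplicationSettings;
        settings["lastSearch"] = searchText;   // example key
        settings.Save();
    }

    public static string LoadLastSearch()
    {
        string value;
        return IsolatedStorageSettings.ApplicationSettings
                   .TryGetValue("lastSearch", out value) ? value : string.Empty;
    }
}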
I'd say it depends on your application. If it's a banking application, then yes, I want something timing out after 5 minutes and asking for my password again. If it's Facebook, then not so much.
Silverlight depends on XMLHttpRequest like any other Ajax implementation and is therefore capable of maintaining a session, forms authentication, roles, profiles, etc.
The benefit you are getting is avoiding virtually all of the page-serving traffic. JSON requests are negligible compared to serving full pages, and even the .xap can be cached on the client.
I would say you are getting the best of both worlds in regards to your question.

Web Database or SOAP?

We've got a back-office CRM application that exposes some of its data in a public ASP.NET site. Currently the ASP.NET site sits on top of a separate, cut-down version of the back-office database (we call this the web database). Daily synchronisation routines, hosted in the back office, keep the databases up to date. The problem is that the synchronisation logic is very complex and time-consuming to change. I was wondering whether using a SOAP service could simplify things? The ASP.NET web pages would call the SOAP service, which in turn would do the database calls. There would be no need for a separate web database or synchronisation routines. My main concern with the SOAP approach is security, because the SOAP service would be exposed to the internet.
Should we stick with our current architecture? Or would the SOAP approach be an improvement?
The short answer is yes, web service calls would be better and would remove the need for synchronization.
The long answer is that you need to understand the technology available for you in terms of web services. I would highly recommend looking into WCF which will allow you to do exactly what you want to do and also you will be able to only expose your services to the ASP.NET web server and not to the entire internet.
There would be no security problem. Simply use one of the secure bindings, like wsHttpBinding.
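For instance, the service could be hosted on a secured binding at an internal-only address. The sketch below shows one possible configuration in code; the service name, contract, and address are all placeholders, and message security will still need appropriate service credentials configured.

using System;
using System.ServiceModel;

// Host the CRM data service over wsHttpBinding with message security,
// listening on an internal address rather than a public one.
class Program
{
    static void Main()
    {
        var binding = new WSHttpBinding(SecurityMode.Message);

        using (var host = new ServiceHost(typeof(CrmDataService),
                   new Uri("http://internal-crm-server:8080/CrmDataService")))
        {
            host.AddServiceEndpoint(typeof(ICrmDataService), binding, string.Empty);
            host.Open();

            Console.WriteLine("Service running. Press Enter to stop.");
            Console.ReadLine();
        }
    }
}

[ServiceContract]
public interface ICrmDataService
{
    [OperationContract]
    string GetCustomerSummary(int customerId);
}

public class CrmDataService : ICrmDataService
{
    public string GetCustomerSummary(int customerId)
    {
        // Placeholder implementation; real code would query the back-office database.
        return "Customer " + customerId;
    }
}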
I'd look at making the web database build process more maintainable instead.
Since security is obviously a concern, you will need logic to limit the types of data and requests that are allowed, and that logic has to live SOMEWHERE.

When should a web service not be used?

Using a web service is often an excellent architectural approach. And, with the advent of WCF in .Net, it's getting even better.
But, in my experience, some people seem to think that web services should always be used in the data access layer for calls to the database. I don't think that web services are the universal solution.
I am thinking of smaller intranet applications with a few dozen users. The web app and its web service are deployed to one web server, not a web farm. There isn't going to be another web app in the future that can use this particular web service. It seems to me that the cost of calling the web service unnecessarily increases the burden on the web server. There is a performance hit to inter-process calls. Maintaining and debugging the code for the web app and the web service is more complicated. So is deployment. I just don't see the advantages of using a web service here.
One could test this by creating two versions of the web app, with and without the web service, and do stress testing, but I haven't done it.
Do you have an opinion on using web services for small-scale web apps? Are there other occasions when web services are not a good architectural choice?
Web Services are an absolutely horrible choice for data access. It's a ton of overhead and complexity for almost zero benefit.
If your app is going to run on one machine, why deny it the ability to do in-process data access calls? I'm not talking about directly accessing the database from your UI code, I'm talking about abstracting your repositories away but still including their assemblies in your running web site.
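In other words, the abstraction can be an ordinary interface in a referenced assembly rather than a service boundary. A minimal sketch, with an invented repository and entity:

// Defined in a class library that the web site references directly:
// the call stays in-process, but the UI still depends only on an abstraction.
public interface IOrderRepository
{
    Order GetById(int orderId);
    void Save(Order order);
}

public class Order
{
    public int OrderId { get; set; }
    public decimal Total { get; set; }
}

// In the web application, the repository is injected or constructed directly;
// swapping implementations later does not require introducing a web service.
public class OrderController
{
    private readonly IOrderRepository _orders;

    public OrderController(IOrderRepository orders)
    {
        _orders = orders;
    }

    public Order ShowOrder(int orderId)
    {
        return _orders.GetById(orderId);
    }
}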
There are cases where I'd recommend web services (and I'm assuming you mean SOAP) but that's mostly for interoperability.
The granularity of the services is also in question here. A service in the SOA sense will encapsulate an operation or a business process. Data access methods are only part of that process.
In other words:
someService.SaveOrder(order);    // <-- bad
// some other code for shipping, charging, emailing, etc.

someService.FulfillOrder(order); // <-- better
// the service encapsulates the entire process
Web services for the sake of web services is irresponsible programming.
Nick Harrison, a brilliant developer in Charlotte, suggested these scenarios where using a web service makes sense:
On a Web farm, where there are multiple web servers hosting website(s), all pointing to web service(s) running on another web server. This allows for distributing the load over multiple servers.
Client/server, where Windows forms apps can call a web service.
Cross platform
Passing through a firewall
Just because the tool generates a bunch of stubs doesn't mean it's a good use. WS-* excels in scenarios where you expose services to external parties. This means that each operation should be on the granularity of business process as opposed to data access.
The multitude of standards can be used to describe different facets of your contract in great detail, and a (hypothetical) fully compliant WS stack can take away a lot of pain from third-party developers and even allow the fabled point-and-click integration à la Yahoo Pipes. With good governance controls you can evolve your public interface and manage the backward compatibility as needed.
All this is next to impossible to be generated automatically. The C# stub generator knows only the physical interface of your class, but doesn't have any idea about the semantics involved. See this paper for more detailed discussion.
If you are building a web site, then build a web site. If you want asynchronous messaging inside your application, use MSMQ. If you want to expose data to internal clients, use POX. If you need efficient binary message format, check Google's Protocol Buffers or if you need RPC check Hessian for C# or DCOM.
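For example, asynchronous messaging over MSMQ needs only a few lines with System.Messaging; the queue path below is a made-up example, and this assumes MSMQ is installed on the machine.

using System.Messaging;

// Send a message to a local private queue instead of calling a web service.
public static class OrderQueue
{
    private const string QueuePath = @".\Private$\orders"; // example queue path

    public static void Enqueue(string orderXml)
    {
        // Create the queue on first use if it does not already exist.
        if (!MessageQueue.Exists(QueuePath))
        {
            MessageQueue.Create(QueuePath);
        }

        using (var queue = new MessageQueue(QueuePath))
        {
            queue.Send(orderXml, "New order"); // label is optional
        }
    }
}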
Web services are a coarse grained integration solution. They are rigid, they are slower than alternatives, they take too much effort to do well (and when not done well are next to pointless).
To summarize: "When should a web service not be used?" - anytime you can get away without it
If you are just coding a tiny (less than 50 users) web application for your intranet, a web service seems overkill. Especially if its primary function (providing a single point of access to many services) won't be used.
I agree that the use of a web service in a small scale web app adds a layer of complexity that does not seem justified. Most of my solutions, internet and intranet, 10-50 users, do not employ web services. I am glad others feel the same...I thought I was the only one.
For a small-scale web app I think that using web services is often quite a good idea; you can use them to easily decouple the web server from the data tier. With straightforward development requirements and great tooling, I don't see the problem.
However don't use web services in the following scenarios:
When you must use HTTP as the transport and XML serialization of your data, and you need lots of different bits of data, synchronously and often. Whether REST, SOAP or WS-*, you're going to suffer performance issues: the more calls you make, the slower your system will be. If you want medium-sized chunks of data less frequently, asynchronously, and you can use straight TCP/IP (e.g. WCF's netTcpBinding; see the sketch after this list), you'd be better off.
When you need to query and join data from your web service with other data sources. Instead, argue for a data warehouse that can be populated with properly consolidated and rationalised data from across the enterprise.
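As referenced above, switching a WCF contract to a binary TCP binding is mostly a matter of binding choice on the client and host. A minimal client-side sketch follows; the contract, operation, and address are all placeholders.

using System;
using System.ServiceModel;

// Hypothetical contract; the net.tcp address points at an internal app server.
[ServiceContract]
public interface IReportService
{
    [OperationContract]
    byte[] GetMonthlyReport(int month, int year);
}

class Client
{
    static void Main()
    {
        // Binary encoding over TCP: noticeably cheaper per call than HTTP/XML.
        var factory = new ChannelFactory<IReportService>(
            new NetTcpBinding(),
            new EndpointAddress("net.tcp://appserver01:9000/ReportService"));

        IReportService service = factory.CreateChannel();
        byte[] report = service.GetMonthlyReport(6, 2010);
        Console.WriteLine("Received {0} bytes.", report.Length);

        ((ICommunicationObject)service).Close();
        factory.Close();
    }
}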
This is my experience, hope it helps.
For a small-scale web app (You have to ask the question, "Will it always remain small scale?" though) using web services, separate business layers, data layers, and so on and so forth can be overkill.
Before anyone shoots me, I do agree that separation of logic between layers along with unit tests, continuous integration, et al are bloody brilliant. In my current role I'd be utterly lost and rocking in the corner without them. However for a very small-scale web app being used to, for example, track contact numbers and addresses for a company of 36 employees, the cost/benefit analysis would suggest that all the "niceties" listed above would be overkill.
However... Remember to ask the question "Will it always remain small scale?" :-)
