WCF necessary for web app? - asp.net

I'm working on a web app that will connect to a database to store, retrieve, and manipulate data, and I was wondering whether WCF is necessary. I've already designed and set up the database, and I'm getting ready to start coding the app.
I read up on WCF, but what I'm confused about is why I would need this extra layer when I can do the database coding inside the app itself, passing variable values directly into the database with stored procedures. What am I missing?
BTW, I did search here first and didn't see a similar question. I went down two pages of results for "WCF necessary web app" with no relevant hits...

WCF in this context is necessary if, for some reason, you need an abstraction layer between the database and the web app and you need it to run on another machine (or out-of-process). If that is not the case (which is likely), there is no need.
You may have come across guidance recommending an abstraction layer between the database and the web app. There are many pros and cons; the biggest con is effort. If you cannot articulate clear benefits, you do not need to do this.
You do not "need" to do anything. Do what's right under your specific circumstances and requirements.

Related

Modular Application Design

We currently have an application that is used by several clients; it downloads and stores data from our application that they run in their environment.
We need to hand this application over to a developer, but at the same time we need to protect our code. The way I see it working is that we would somehow treat our current app as a framework, allowing another app to be built on top of it; that app may have its own screens but reuse some of the built-in ones.
Is it possible to protect our app in such a way without rewriting everything into protected DLLs? Or should we just suck it up and share our code with the consulting firms that want to build these types of apps for our clients?
If your proprietary code is entirely focused on downloading and storing data, you could create an online REST API that returns the data over the internet. The other developer could then simply request the data from your servers with an HTTP call.
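As a sketch of what the consuming side could look like (the URL and route are placeholders, not a real endpoint), the other developer would need nothing more than an HttpClient call:

    using System;
    using System.Net.Http;
    using System.Threading.Tasks;

    class Program
    {
        static async Task Main()
        {
            using (var client = new HttpClient())
            {
                // Hypothetical endpoint; your REST API defines the real routes.
                string json = await client.GetStringAsync("https://api.example.com/records/42");
                Console.WriteLine(json);
            }
        }
    }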
However, if your code needs to run client-side, the only real option is to compile it into a DLL, and even then it can be decompiled.

Sharing stored procedures across multiple apps

Team A has an enterprise app that uses ADO.NET for data access, executing stored procedures. The data access is encapsulated in its own project (let's call it DAL.dll).
Team B is creating another, unrelated app that reuses the stored procedures of the enterprise app. This app currently uses the Microsoft Data Access Application Block. The issue we run into is that whenever Team A changes the input/output parameters of a stored procedure, Team B's app fails at runtime and has to be updated to accommodate the added (or removed) parameters. Most of these changes go unnoticed until a user complains. At the very least, we would like the app to throw a compilation error so that the build process warns us of the changes.
One way to do this is to have Team B's project add a reference to DAL.dll.
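To illustrate why that helps: if DAL.dll wraps each stored procedure in a strongly typed method (the names below are hypothetical), then adding or removing a parameter changes a method signature, and Team B's build breaks instead of their app failing at runtime.

    // Hypothetical type for illustration.
    public class Order
    {
        public int Id { get; set; }
    }

    // In DAL.dll (Team A): the stored procedure's parameters are
    // mirrored in the method signature.
    public class OrderDal
    {
        public Order GetOrder(int orderId, bool includeDetails)
        {
            // The real implementation would execute the stored procedure via ADO.NET.
            return new Order { Id = orderId };
        }
    }

    // In Team B's project: if Team A adds or removes a parameter,
    // this call no longer compiles, so the change surfaces at build time.
    public class Consumer
    {
        public Order Load() { return new OrderDal().GetOrder(42, true); }
    }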
I'd like to know if there are any cleaner ways of solving the issue. We are ready to replace Team B's Data Access Application Block with a different technology (Entity Framework?) if necessary.
Among the other answers, I'd strongly suggest getting those stored procedures into source control, in a Database Project. You then may be able to use the features of your source control system to do several things:
Lock some of the code so that it cannot be changed
Give you notifications if the code is changed
Warn you if the stored procedures change in a way that would prevent them from being called
Branch the stored procedures so that each team can have its own version of changed code while keeping the unchanged stored procedures common. You will, of course, need to separate the different versions in the database.
I agree with the other posters on this thread that you should not share stored procedures across different .NET DLLs; that is just a recipe for disaster. I would also shy away from ORMs like Entity Framework if you are doing anything at all complicated with your database schema, because ORMs excel at translating a simple object model from your .NET application classes into SQL tables and stored procedures, but traditionally do poorly at optimizing them for performance on the database side. There will be people who claim otherwise, and they may have a valid point if they are experts at wrangling an ORM to do what they want, but chances are you are not, and it will cause you headaches in the long run.
A shared data access layer might work, but conceptually you are then just changing the implementation of the dependency from code a DBA wrote to code a .NET programmer wrote. Yes, you can use integration tests to achieve better verifiability, but the same case can be made for SQL with tools like Red Gate's SQL Test. I would shy away from this approach if the two applications are already experiencing pain from sharing stored procedures; that is an indication that the dependency should simply be done away with.
If it were up to me, I'd just create a new schema for Team B's app. You can read more about schemas in SQL Server here: MSDN Schema description for 2008 R2. You can think of them as namespaces for SQL Server, with some additional bells and whistles like permissions and access control. Separating your different applications into separate schemas on the same shared database will probably make for the most flexible implementation in the long run.
unrelated app that's reusing the stored procedures in the enterprise app
If these two applications are really unrelated, why are they sharing stored procedures, or even the same database? I know this is a long read, but I recommend you read this: A Better Path to Enterprise Architectures.
The partitioning concept in there relates to the bounded context in Domain-Driven Design:
Multiple models are in play on any large project. Yet when code based on distinct models is combined, software becomes buggy, unreliable, and difficult to understand. Communication among team members becomes confusing. It is often unclear in what context a model should not be applied.
Therefore: Explicitly define the context within which a model applies. Explicitly set boundaries in terms of team organization, usage within specific parts of the application, and physical manifestations such as code bases and database schemas. Keep the model strictly consistent within these bounds, but don’t be distracted or confused by issues outside.
You can expect problems when you don't deal with this explicitly. You're lucky you're seeing failures early; they can turn into problems that are much harder to find in the long run.
Analyze the problem again with the above in mind. Consider whether you're missing some explicit context where this common functionality should live.
My question is: which team owns the stored procedures and the shared database? As a matter of good architecture/design, two different apps should not share the same database or stored procedures.
A better way to share data/functionality between two different applications is through a service or API, so that the team that owns the functionality is responsible for maintaining it.
Also, good communication between the two teams is highly recommended.
Depending on the owner of the DAL project, you could host web services and share the API. That way, you separate the Data Access Layer from the business logic, which allows anyone to use the same DAL without having to publish it to each different location.
From my point of view, it looks like both Team A and Team B should share the same core model and look at Multitier architecture as a possible solution.
It sounds like it would make sense to create a shared DAL that both applications can share.
I would add unit tests (or, really, integration tests) to make sure the DAL stays compatible with the apps after changes. That way your tests fail when incompatible changes are made.
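A minimal sketch of such a test, here using NUnit and assuming the shared DAL exposes a hypothetical CustomerDal class with a GetCustomer(int) method; the test exercises the real stored procedure against a test database, so a changed parameter list surfaces as a failing build rather than a user complaint:

    using NUnit.Framework;

    [TestFixture]
    public class CustomerDalTests
    {
        [Test]
        public void GetCustomer_ReturnsRow_ForKnownId()
        {
            // Runs the actual stored procedure against a test database;
            // if its parameters change, this throws and the test fails.
            var dal = new CustomerDal("Server=testdb;Database=AppTest;Integrated Security=true");
            var customer = dal.GetCustomer(1);
            Assert.That(customer, Is.Not.Null);
            Assert.That(customer.Id, Is.EqualTo(1));
        }
    }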
"I'd like to know if there are any other cleaner ways of solving the issue."
The cleanest way is for Team B to sit down with Team A and encapsulate the relevant business logic into a shared API. It doesn't matter so much how you implement that API; what does matter is that the API's interface is documented and versioned so everyone knows what to expect.
One reasonable mechanism for this in a .NET environment is to use Microsoft's WebAPI.
In short, the question of "how do we share a stored procedure?" is most likely looking at the wrong level of abstraction.
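As one possible shape for that API (the controller and DAL names are illustrative, not from the original apps), an ASP.NET Web API 2 controller wrapping the shared logic could be as small as:

    using System.Web.Http;

    // Exposes the shared functionality over HTTP so both teams consume
    // a documented, versioned API instead of sharing stored procedures.
    public class OrdersController : ApiController
    {
        // GET api/orders/42
        public IHttpActionResult Get(int id)
        {
            var order = new OrderDal().GetOrder(id);   // hypothetical shared DAL
            if (order == null)
                return NotFound();
            return Ok(order);
        }
    }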

Steps involved in reading an XML file from a web service in .NET

I would like to know the basic steps involved in setting up my application to be able to read data from another application, then take that data, modify it, and send it back.
The data being read will have over 100 fields. What is the most efficient way to store them? Put them in a class object?
I know web services are involved...any other info would be great!
The application is in .NET, using VB.
Thanks
You may need to be more specific about your requirements to get a truly useful answer. That said, Windows Communication Foundation (WCF) is likely to make your life much easier. Google for tutorials -- I can't say I have a favorite. You can handle one- or two-way communication readily with WCF, and you can then focus more on making your application logic work.
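On the "put them in a class object" part of the question: yes, a class mirroring the XML is the usual approach, and XmlSerializer handles the mapping in both directions, which gives you a typed object to modify before sending it back. A minimal C# sketch (the element names are made up; a VB version would be structurally identical):

    using System.IO;
    using System.Xml.Serialization;

    // Mirrors the XML document; in practice it would carry the
    // full set of 100+ fields, each mapped to an element.
    [XmlRoot("Record")]
    public class Record
    {
        [XmlElement("CustomerName")] public string CustomerName { get; set; }
        [XmlElement("Amount")]       public decimal Amount { get; set; }
    }

    public static class RecordXml
    {
        // Parses the XML returned by the service into a typed object.
        public static Record Load(string xml)
        {
            var serializer = new XmlSerializer(typeof(Record));
            using (var reader = new StringReader(xml))
                return (Record)serializer.Deserialize(reader);
        }

        // Turns the (modified) object back into XML to send back.
        public static string Save(Record record)
        {
            var serializer = new XmlSerializer(typeof(Record));
            using (var writer = new StringWriter())
            {
                serializer.Serialize(writer, record);
                return writer.ToString();
            }
        }
    }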

Web application configuration settings - where is the best place to store them?

I came across a case study a few days ago. It relates to web application architecture.
Here is the scenario:
There is a single web service used by, say, 1000 web applications, hosted on a particular server. If the web service's hosting location changes, how do the other applications learn about the change?
Keeping the address in web.config doesn't seem feasible, as we would need to modify the web.config files of all the applications.
Keeping these settings in a common repository that all the applications consult for the web service address came to mind, but that raises the question of where to store the common repository itself.
I am just curious how this could be achieved with good performance.
Thanks in advance for any suggestions.
Do you have full access to, or control over, all the web applications consuming that web service? If so, you could have a script or some custom code that updates all their web.config files at once. It seems like a lot of work, but this way you keep more control, and you could, if needed, point only some applications at the new URL while leaving others on the old one.
The idea of keeping the setting in a centralized database gives you faster update propagation, which can also be bad in case of errors, and then all applications refer to the same place with no way to split them. You would also have to connect to that centralized database from all of them, probably by adding a key with its connection string to each web.config; and if that database is unreachable or down, the web applications cannot consume the web service simply because they cannot get its URL.
I would go with web.config. You could also have a settings helper class that abstracts the retrieval of the URL, so the UI or front end does not know where it comes from; a sketch follows.
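A minimal sketch of that helper, assuming the URL lives under an appSettings key named "ServiceUrl" in web.config:

    using System.Configuration;

    // Central place to read the service URL. Callers never touch
    // web.config directly, so where the value is stored can change
    // later without rippling through the UI code.
    public static class ServiceSettings
    {
        public static string ServiceUrl
        {
            get { return ConfigurationManager.AppSettings["ServiceUrl"]; }
        }
    }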
Anyway, do you plan to change the URL of the web service often? Wouldn't it be better to copy the service to a new URL but also keep it available at the current URL for a while?
Another advantage of the web.config approach is that every time you update and save the file the application restarts, while a change in a database might take a while to be detected if you have some caching mechanism in place.
Hope this helps.
Davide.

Is it worth using the ASP.NET built-in profile system?

I just discovered that ASP.NET has its own profile system for registering users, and there seem to be a lot of features available as a bonus with it (such as secure authentication). However, it seems rather specific for a general-purpose development framework, and things that work in the background the way the profile system does, without me really knowing how (like where the user data is stored), kind of scare me.
Is it worth developing a website that requires user authentication using the ASP.NET profile system, or would it be better to roll my own using SQL databases and such? I'm not going to avoid SQL anyway; even if I use profiles, I'll use the profile's unique ID to identify user data in the SQL tables, so in that sense I'm not avoiding SQL for user information at all.
My favorite thing about profiles is that you can define custom permissions in Web.config files using them, and avoid having to repeat the same authentication-check code at the top of all your .aspx source files.
The other thing I rather like about it is that security is built in, with secure authentication cookies, so I wouldn't have to deal with those myself.
But it doesn't seem like that big of a deal really. I'm just confused as to where profiles stand as far as ASP.Net development goes and what they're designed to accomplish.
The Profile/Membership and Role provider APIs are very intertwined and specify things very narrowly. The benefit is that you have to do very little to get a lot of functionality working. The disadvantage is when what you need doesn't match what is provided. Nevertheless, there are so many potential gotchas that the API takes care of for you that it really does make sense to use it, at least for authentication.
My needs did not match what the API provided, and I really only needed the Membership portion. The problem was that I needed the same authentication and authorization across a web application and a desktop application. My needs are pretty unique (the system is designed for a classroom setting).
Getting Membership to work for my needs wasn't that difficult: I just had to implement the Membership API, leaving out several features I didn't need, such as self-registration. Of course, this presented a challenge for role management. Typically, as long as your user object implements IPrincipal it can be used directly, but there are serialization issues with the development web server that Visual Studio packages if your user class is not defined in the same assembly. Those problems come down to cross-appdomain serialization, and your choices are to put your own object in the GAC or to use objects that are already in the GAC, like GenericPrincipal and GenericIdentity. The latter option is what I had to do.
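A sketch of that latter option: build the principal from the generic types that already live in mscorlib (and hence the GAC), so it crosses appdomain boundaries without custom serialization work. The user name and role names here are placeholders.

    using System.Security.Principal;
    using System.Threading;

    public static class AuthSetup
    {
        // GenericIdentity and GenericPrincipal are GAC-resident, which
        // avoids the cross-appdomain serialization problem described above.
        public static void SignIn(string userName, string[] roles)
        {
            var identity = new GenericIdentity(userName);
            Thread.CurrentPrincipal = new GenericPrincipal(identity, roles);
        }
    }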
Bottom line: if you don't mind letting the API do all the management for you, then it will work just fine. It is a smart piece of engineering work, and it attempts to steer you down a route with decent security practices. I've worked with a number of different authentication/authorization APIs (most were not CLR-based), and this one does feel a bit constraining. However, if you want to avoid pitfalls with session/state/cache management, you really should use the API and plug in your own providers as necessary.
With your database, if you need to link a user to any database element, you'll be storing the user's login id (Context.User.Identity.Name).
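For example (the table and column names are hypothetical), tagging a row with the current login id looks like:

    using System.Data.SqlClient;
    using System.Web;

    public static class NoteData
    {
        // Links the new row to the logged-in user by login name, the
        // same key the membership system authenticates against.
        public static void SaveNote(string text, string connectionString)
        {
            string login = HttpContext.Current.User.Identity.Name;
            using (var conn = new SqlConnection(connectionString))
            using (var cmd = new SqlCommand(
                "INSERT INTO Notes (OwnerLogin, Body) VALUES (@login, @body)", conn))
            {
                cmd.Parameters.AddWithValue("@login", login);
                cmd.Parameters.AddWithValue("@body", text);
                conn.Open();
                cmd.ExecuteNonQuery();
            }
        }
    }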
You seem to be mixing up the Profile, Membership, and Role provider APIs. But to answer your question: why not use it? I would use it unless there is a real constraint that makes it unusable...
