Multiple consumers, single data provider - Too Early For RIA? Options? - asp.net

I have a few projects coming up that have a number of endpoints or clients that can hit the same data. For instance a site might have...
An ASP.NET MVC-based, end-user-facing website
A web-based administration back end that allows specific, limited updates from some users in situations where a full client isn't useful (mobile web, etc.)
A full-on rich client for administration so we can use touch and other techniques to really make the user experience for content management shine - these may be Silverlight or full-on WPF apps, as needed
The question is... what's the best way to connect all that? Right now I use a multiple-project split for the MVC side (sketched below).
ProjectName.Core - this project has all the common models, all the repository classes and all the helper classes
ProjectName.Web - an MVC project that references the Core to pull in the repositories and models it needs
ProjectName.Admin.Web - another MVC project dedicated to the admin interface that references the Core to pull in the repositories and models it needs
(which will ultimately live on a separate subdomain from the end-user-facing site)
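For concreteness, the split looks roughly like this (the type and namespace names below are made up for illustration):
// ProjectName.Core - shared models and repository abstractions.
namespace ProjectName.Core.Models
{
    public class Article
    {
        public int Id { get; set; }
        public string Title { get; set; }
        public string Body { get; set; }
    }
}
namespace ProjectName.Core.Repositories
{
    using System.Collections.Generic;
    using ProjectName.Core.Models;
    public interface IArticleRepository
    {
        IEnumerable<Article> GetAll();
        Article GetById(int id);
    }
}
// ProjectName.Web (and ProjectName.Admin.Web) only reference Core:
namespace ProjectName.Web.Controllers
{
    using System.Web.Mvc;
    using ProjectName.Core.Repositories;
    public class ArticlesController : Controller
    {
        private readonly IArticleRepository _articles;
        public ArticlesController(IArticleRepository articles)
        {
            _articles = articles;
        }
        public ActionResult Index()
        {
            return View(_articles.GetAll());
        }
    }
}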
Then the story peters out in the sense of clear guidance. When the time comes to build a WPF / Silverlight project to hit the data, I can do one of the following, to the best of my understanding now...
Convert the "Core" to provide a RIA-style DomainService, and attempt to alter the MVC projects to make use of it. However, there is little guidance on using MVC with RIA, RIA is in its infancy, AND the WPF-to-RIA story is also still only thinly documented
Do the same as above, but using WCF. However, WCF brings with it async complexity that I really want RIA to hide for me.
Fall back 20 yards and just bolt plain old web services onto the Core on top of the Repositories and Models I already have. This seems... old school :)
Any thoughts and input are welcome, including pointers to examples and documentation. I want to make the best decisions I can now, and am coming up to speed fast on RIA and WCF so I can, but community input is always helpful.
Thanks!

Take a look at ADO.NET Data Services (formerly known as Project “Astoria”).
On my way to work today I was listening to a .NET Rocks! podcast, "Stephen Forte on Data Access Options", and they were very excited about this, especially for scenarios like the one you describe.
It's interesting stuff, and something I would check out sometime very soon.

I think there's something to be said for plain old WCF services; these can make use of your domain services and repositories and expose a model more appropriate for services. Too often I've found that simply exposing the domain model on the wire ends up with a duplication of logic on both the client and the server.
My advice would be to add some sort of service layer; it holds the logic for shaping your domain model into types appropriate for the wire.
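A minimal sketch of that idea, assuming a repository like the one in the question (the contract, DTO and member names are made up):
using System.Collections.Generic;
using System.Linq;
using System.Runtime.Serialization;
using System.ServiceModel;
// The contract exposes flat, wire-friendly DTOs instead of the domain entities.
[DataContract]
public class ArticleDto
{
    [DataMember] public int Id { get; set; }
    [DataMember] public string Title { get; set; }
}
[ServiceContract]
public interface IArticleService
{
    [OperationContract]
    IList<ArticleDto> GetPublishedArticles();
}
// The implementation does the shaping, so domain logic stays on the server.
public class ArticleService : IArticleService
{
    private readonly IArticleRepository _repository;
    public ArticleService(IArticleRepository repository)
    {
        _repository = repository;
    }
    public IList<ArticleDto> GetPublishedArticles()
    {
        return _repository.GetAll()
                          .Select(a => new ArticleDto { Id = a.Id, Title = a.Title })
                          .ToList();
    }
}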
Ideally I'd like to be able to share my Domain Services between the client (WPF / Silverlight) and the server (ASP.NET MVC) and have different underlying repositories (LINQ to NHibernate / Astoria). That's difficult with the asynchronous nature of Silverlight.

For the curious, it certainly looks like RIA Services is the win here. Build a single DomainService then consume it everywhere. Brad Abrams covers a lot of this ground at bit.ly/94fFx - it really helped.
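For anyone landing here later, a rough sketch of what such a DomainService looks like (namespaces moved around between the early CTPs and the final WCF RIA Services release, and the repository/entity names here are placeholders):
using System.Linq;
using System.ServiceModel.DomainServices.Hosting;
using System.ServiceModel.DomainServices.Server;
[EnableClientAccess]
public class ArticleDomainService : DomainService
{
    private readonly IArticleRepository _repository = new ArticleRepository();
    // Methods returning IQueryable<T> are exposed as queries on the
    // generated client-side DomainContext.
    public IQueryable<Article> GetArticles()
    {
        return _repository.GetAll().AsQueryable();
    }
    // Update/Insert/Delete methods named by convention join the change-set
    // pipeline. (For plain POCO entities, Article also needs a [Key] member.)
    public void UpdateArticle(Article article)
    {
        _repository.Update(article);
    }
}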

Related

Does my program architecture make sense? AngularJS + ASP.NET Web API + SQL Server

I have been getting a bit lost in the creation of my program architecture and I want to take a step back to see if I'm approaching it correctly.
I am wondering if my setup makes sense. I'm starting to think it doesn't.
I am creating intranet applications (We were creating Internet applications, but now the scope has changed). We use an onsite Active Directory (Windows Server 2012 R2). We have a SQL Server Database.
I have been building front-end Angular applications and ASP.NET Web APIs to push and pull data. I am now implementing authentication with Auth0 and it's been a nightmare.
What kind of program architecture would you setup in this scenario?
Much Appreciated.
SQL Server + ASP.NET Web API + AngularJS forms a solid architecture for building Single Page Applications (SPAs). This architecture is useful for building desktop-like web applications, i.e. apps that run over the web but work like desktop apps.
If you can be more specific about the problem you are facing, you will be able to get better recommendations from SO.
This architecture is widely adopted in many scenarios such as SPAs. With it, you will be able to keep your front end highly decoupled from your backend services, support multiple front ends on the same set of services, and run quite a few integration scenarios.
Some of the downsides of such an approach are the extra layer of complexity added to the application (which might force you to write more tests and handle failure scenarios that wouldn't happen otherwise, for example) and the authentication routines, since you will need to authenticate two heterogeneous environments (the .NET/IIS one and the JS/Angular one).
As for the authentication pain, token-based auth schemes seem to be the current way to go (such as Auth0), since they let you keep and send an environment-agnostic token that can be used by the different layers of your architecture.
In that sense, your architecture makes sense.
However, since you're feeling some pain in its implementation, you might want to ask yourself if you really needed all of this. When you choose an architecture, you do so trying to accomplish some specific goals (multiple front ends? specific performance requirements? maintainability? auditability?), and the more goals you try to accommodate, the more complex the architecture becomes, up to a point where the pains start outweighing the benefits.
So, what were you trying to achieve in the first place?
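To make the token part of this concrete, here is a minimal sketch of a Web API controller sitting behind bearer-token auth; the Angular side just sends the token in the Authorization header on every request. The controller, route and payload below are made up, and the JWT-validation middleware is assumed to be configured at startup:
using System.Web.Http;
[Authorize]                        // rejects requests without a valid token
[RoutePrefix("api/orders")]
public class OrdersController : ApiController
{
    [HttpGet]
    [Route("")]
    public IHttpActionResult GetOrders()
    {
        // User.Identity is populated from the validated token's claims,
        // so the same identity flows through the whole backend.
        var userName = User.Identity.Name;
        return Ok(new[] { "order for " + userName });   // placeholder payload
    }
}
// Note: attribute routing requires config.MapHttpAttributeRoutes() in WebApiConfig.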

JavaFX, DataFX and server-side code

I'm looking at enterprise JavaFX, and how to integrate JavaFX with server-side code. In the last few weeks I've done a certain amount of research into DataFX and OpenDolphin, downloaded some videos, and looked at a couple of other frameworks. For example, I've watched the video on DataFX at:
https://www.youtube.com/watch?v=EN4fo6x0DcQ
However, although this video and others I've looked at explain how to set up a client application that connects with a server, I've found very little information on how to put together some server-side code that the client can connect to. Sure, one could use JAX-RS, but why re-invent the wheel? In the first instance I would like to put together some really simple server-side code that some test client-side code can connect to using DataFX or one of the other frameworks. The aim eventually is to get a client using JavaFX to communicate with a server.
My second question is: of the various frameworks available, is DataFX the best one to use for a simple application?
I have experience with a GlassFish server hosting a JSF application, and it may be useful to have such a server communicate with a browser as well as with a JavaFX client, as that way I can test out the communication with the JavaFX application.
The latter is a bit of an aside; my main questions are where I can get information on server-side programming for this, and which frameworks are best to use.
Many thanks in advance.
If you are able to manage the client-server communication yourself, you can pick any JavaFX application framework listed here:
https://github.com/mhrimaz/AwesomeJavaFX
Any of them allows you to separate UI code from communication code.
As I'm the author of JRebirth, I can advise you to create a RemoteService (extending Service and providing JAX-RS facilities or whatever) to perform this job.
If you are looking for an all-in-one library managing both the client AND server side, DataFX + OpenDolphin is probably the most advanced one.
I'm the author of DataFX & Dolphin Platform (https://github.com/canoo/dolphin-platform). Both are valid frameworks that will fit your needs. Maybe a combination of both: Dolphin Platform as the remoting layer between client and server, and DataFX to define the routing and MVC-based views on the client.
Some days ago I copied all the DataFX sources to GitHub (https://github.com/guigarage/DataFX) and am currently trying to create a new version based on the modules that are maintained by me. Maybe I will extract the MVC-related stuff and create a new framework based on it, we will see. What I can currently say is that I plan to work on it next month alongside the Dolphin Platform, since I think a combination of both will be a good fit.
I would use this combination today to create applications but yeah, I'm the main developer of both frameworks so the choice is quite easy for me ;)
As you mentioned JSF, I think that Dolphin Platform is a perfect match for you, since one idea of the framework is to be a modern successor to JSF that can be used to create desktop & web based applications but provides managed controllers on the server. We provide a cool Maven-based jumpstart (Maven archetype) that will give you a quick introduction and a runnable client / server application with a desktop and web client in 2 minutes: https://canoo.github.io/dolphin-platform/#_dolphin_platform_jumpstart

Putting a new web interface on an old fat-client database

My company has a fairly old fat client application written in Delphi. We are very interested in replacing it with a shiny new web application. This will make maintenance a breeze and many clients want a web application.
The application is extremely rich in domain knowledge, some of which is out of our control. Our clients use the program to manage their own clients and report them to the government. So an inaccurate program is a pretty big thing. The old program has no tests. We are not sure yet if we will implement automated testing with the new one.
We first planned to basically start from scratch, but we are short-handed and want to get everyone on the web as soon as possible. So instead of starting from scratch we've decided to try to make use of the legacy fat-client database.
The database is SQL Server and can be used in SQL Server 2008 easily. It is very rich in stored procedures, functions, a few triggers, and lots of tables with over 80 columns... But it is decently normalized. We want for both the web application and fat client to be capable of using the same database. This is so that if something breaks badly in the web application, our clients can still use the fat client and connect to our servers. After the web application is considered "stable", we'd deprecate the fat client.
Has anyone else done this? What tips can you give? After getting everyone on the website, we want to slowly change the database structure to take care of some design deficiencies. What is the best way to structure a data access layer so that later changes are easy?
And what about actually building the screens? Is there any way easier than just rewriting an 80-field form in ASP.NET? Are there any tools that can make this easier?
The current plan is to use ASP.Net WebForms (.Net 3.5). I'd really like to use MVC, but no one on the team knows it including me.
"We are not sure yet if we will implement automated testing with the new one."
Implement automated testing. What's the point in replacing one buggy program with another?
Good question, but "slowly changing" the db structure after getting everyone on the website sounds like a joke...
I would rather take the opportunity to create a fresh db structure, write a bulletproof migration script for your db that you can try out and rewrite a zillion times without any side effects for your clients, and then write whatever you want (fat/web) on the new db, have it tested, and migrate everyone when it's ready.
I have a couple of suggestions:
1) Create a service layer to abstract away the dependence on the DAL. In a situation like the one you describe, having a layer of indirection for the UI and BLL to rely on makes DB changes much safer (see the sketch below).
2) Create automated tests (both unit and integration), especially if you plan on making fairly significant changes to the Domain or Persistence layers (BLL/DAL). To make this really easy you should always try to program to an interface. This makes your code more flexible as well as letting you use mocking frameworks (Moq is one I like) to ensure your tests truly are unit tests and not integration tests.
3) Take a look at DDD (http://domaindrivendesign.org/) as it seems to fit pretty well with the given scenario. At the very least there are some very useful patterns that can help make your application more flexible.
4) MVC isn't very hard to learn at all, and it is an easy way to get unit testing set up for the UI, as a result of the MVC architecture (testing the controller and not the view). That said, there is no reason you couldn't unit test Web Forms; it's just a bit more work. MVC really is just a UI framework/design pattern (more Model2, but we can ignore that for now). It gets you closer to the metal, so to speak, as you will be writing a lot more HTML and using a Model (the 'M') for passing data around.
For DDD take a look at Eric Evans book: http://www.amazon.com/Domain-Driven-Design-Tackling-Complexity-Software/dp/0321125215/ref=sr_1_1?s=books&ie=UTF8&qid=1317333430&sr=1-1
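A short sketch of points 1 and 2 together: the UI depends only on an interface, and Moq stands in for the real data access in a unit test (all names here are invented):
using System.Web.Mvc;
using Moq;
using NUnit.Framework;
public class ClientSummary
{
    public int ClientId { get; set; }
    public string Name { get; set; }
}
// The service layer interface the UI and BLL rely on.
public interface IClientReportingService
{
    ClientSummary GetClientSummary(int clientId);
}
public class ClientsController : Controller
{
    private readonly IClientReportingService _service;
    public ClientsController(IClientReportingService service)
    {
        _service = service;
    }
    public ActionResult Summary(int id)
    {
        return View(_service.GetClientSummary(id));
    }
}
// A unit test that never touches the database.
[TestFixture]
public class ClientsControllerTests
{
    [Test]
    public void Summary_returns_the_requested_client()
    {
        var mock = new Mock<IClientReportingService>();
        mock.Setup(s => s.GetClientSummary(42))
            .Returns(new ClientSummary { ClientId = 42, Name = "Acme" });
        var result = (ViewResult)new ClientsController(mock.Object).Summary(42);
        Assert.AreEqual(42, ((ClientSummary)result.ViewData.Model).ClientId);
    }
}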
Hope that helps
ASP.NET Web Forms is a non-starter; it's completely inappropriate for something like this. I recommend starting with something like "Creating an OData API for StackOverflow including XML and JSON in 30 minutes", then building your web app on top of that (i.e. push the data to the client and use jQuery/Silverlight).
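For reference, the OData route amounts to very little code with WCF Data Services (the stack that article uses): point a DataService at an Entity Framework context over the existing database and host it as a .svc endpoint. The context name here is hypothetical:
using System.Data.Services;
using System.Data.Services.Common;
// LegacyEntities is an Entity Framework context generated from the existing DB.
public class LegacyDataService : DataService<LegacyEntities>
{
    public static void InitializeService(DataServiceConfiguration config)
    {
        // Read-only access to every entity set; tighten this per set as needed.
        config.SetEntitySetAccessRule("*", EntitySetRights.AllRead);
        config.DataServiceBehavior.MaxProtocolVersion = DataServiceProtocolVersion.V2;
    }
}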

What is the best way to establish communication between silverlight controls and ASP.NET

I thought I'd ask the more experienced Silverlight users what they think is the best way to embed Silverlight user controls into an ASP.NET page - especially with regard to establishing an easy way of communication between the controls.
I have heard about services, query strings, etc. but I'd like to find out what has worked for you the best so far...
Take a look at WCF RIA Services. It provides a way for ASP.NET to expose methods and data entities to Silverlight automatically. RIA Services creates proxy objects on the Silverlight side and wires them up with WCF services without you having to do any work. It is pretty slick.
Here is a starting point. Beware: the learning curve is kind of steep. But it is rewarding once you get there.
http://www.silverlight.net/getstarted/riaservices/
Also, another thing to watch out for is that some of those articles refer to beta/CTP versions and some of the namespaces changed before it went to production.
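On the Silverlight side, usage of the generated proxy looks roughly like this; the DomainContext class and query method are code-generated from whatever DomainService you define on the server, so the names below are only illustrative:
using System.ServiceModel.DomainServices.Client;
public partial class ArticleListView   // code-behind of a Silverlight UserControl
{
    // ArticleDomainContext is generated from the server's ArticleDomainService.
    private readonly ArticleDomainContext _context = new ArticleDomainContext();
    private void LoadArticles()
    {
        // Load() is asynchronous; the callback fires when the data arrives.
        _context.Load(_context.GetArticlesQuery(), OnArticlesLoaded, null);
    }
    private void OnArticlesLoaded(LoadOperation<Article> operation)
    {
        if (!operation.HasError)
        {
            articlesGrid.ItemsSource = operation.Entities;   // articlesGrid: a DataGrid defined in the XAML
        }
    }
}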
Take a look at the article below, which uses event aggregation to communicate; I found it useful:
http://blog.weareon.net/how-to-communicate-between-two-user-controls-using-event-aggregator/
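If you want to see the pattern itself before digging into the article, a toy event aggregator is only a few lines; in practice you would use Prism's EventAggregator or MVVM Light's Messenger rather than this sketch:
using System;
using System.Collections.Generic;
public class ItemSelectedMessage
{
    public int ItemId { get; set; }
}
public class EventAggregator
{
    private readonly Dictionary<Type, List<Delegate>> _subscribers = new Dictionary<Type, List<Delegate>>();
    public void Subscribe<TMessage>(Action<TMessage> handler)
    {
        List<Delegate> handlers;
        if (!_subscribers.TryGetValue(typeof(TMessage), out handlers))
        {
            handlers = new List<Delegate>();
            _subscribers[typeof(TMessage)] = handlers;
        }
        handlers.Add(handler);
    }
    public void Publish<TMessage>(TMessage message)
    {
        List<Delegate> handlers;
        if (_subscribers.TryGetValue(typeof(TMessage), out handlers))
        {
            foreach (Action<TMessage> handler in handlers)
            {
                handler(message);
            }
        }
    }
}
// One control subscribes, the other publishes - neither references the other:
// aggregator.Subscribe<ItemSelectedMessage>(m => detailControl.Show(m.ItemId));
// aggregator.Publish(new ItemSelectedMessage { ItemId = 42 });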

Implementation of an ASP.NET-based portal-like application

There is a requirement to write a portal-like ASP.NET-based web application.
There should be a lightweight central application, which implements the primary navigation and the authentication. The design is achieved with master pages.
Then there are several more or less independent applications (old and new ones!), which should be easily and independently integrated into this central application (which should be the entry point for these applications).
Which approaches, architectures, patterns, and techniques can help achieve these aims? For example, does it make sense to run the (sub)applications in an iframe?
Are there (lightweight and easy-to-learn) portal frameworks that can be used (not big things like "DOTNETNUKE")?
Many thanks in advance for your hints, tips and help!
DON'T REINVENT THE WHEEL! The thing about DotNetNuke is that it can be as big or as small as you make it. If you use it properly, you will find that you can limit it to what you need. Don't put yourself through the same pain that others have already put themselves through. Unless of course you are only interested in learning from your pain.
I'm not saying that DNN is the right one for you. It may not be, but do spend the time to investigate a number of open source portals before you decide to write your own. The features that you describe will take 1000s of hours to develop and test if you write them all from scratch.
Michael Shimmins makes some good suggestions about what to use to implement a portal app with some of the newer technology and best-practice patterns. I would say, yes, these are very good recommendations, but I would encourage you to either find someone who has already done it this way or start a new open source project on CodePlex and get others to help you.
Daniel Dyson makes a fine point, but if you really want to implement it yourself (there may be a reason), I would consider the following components:
MVC 2.0
Inversion of Control/Dependency Injection (StructureMap for instance)
Managed Extensibility Framework
NHibernate (either directly or through a library such as Sh#rp or Spring.NET)
A service bus (NServiceBus for instance).
This combination gives you a flexible user interface through MVC, which can easily be extended via plugins (exposed and consumed via MEF), a standard data access library (NHibernate) which can be easily configured by the individual plugins to connect to specific databases, and the ability to publish events and have them 'picked up' by components composed at runtime (NServiceBus).
Using IoC and DI you can pass around interfaces which are resolved at runtime based on your required configuration. MEF gives you the flexibility of defining 'what' each plugin can do, and then leaving it up to the plugins to do so, whilst your central application controls cross-cutting concerns such as authentication, logging, etc.
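To illustrate the MEF piece of that stack: the shell defines a contract, each sub-application exports an implementation, and the shell composes whatever it finds at runtime (the interface and member names are invented for the example):
using System.Collections.Generic;
using System.ComponentModel.Composition;
using System.ComponentModel.Composition.Hosting;
public interface IPortalModule
{
    string Title { get; }
    string NavigationUrl { get; }
}
// Lives in a plugin assembly dropped into the portal's Modules folder.
[Export(typeof(IPortalModule))]
public class ReportingModule : IPortalModule
{
    public string Title { get { return "Reporting"; } }
    public string NavigationUrl { get { return "~/reporting"; } }
}
// The central application discovers and composes every exported module.
public class PortalShell
{
    [ImportMany]
    public IEnumerable<IPortalModule> Modules { get; set; }
    public void Compose(string modulesPath)
    {
        var catalog = new DirectoryCatalog(modulesPath);
        var container = new CompositionContainer(catalog);
        container.ComposeParts(this);   // fills Modules with all exported IPortalModule implementations
    }
}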
