I have gone through a few posts like THIS and THIS, etc., but I couldn't resolve a design confusion in my mind. What would be the best practice for adding CRUD for related entities when working with microservices? (My question is about microservices in general, but to be specific, I'm currently using aspboilerplate.)
For instance, I have an entity Product having around seven to ten properties in it. And then we may have another entity ProductContract. Each product can have multiple contracts added/updated.
Question: is it a good idea to create a separate Application Service for the ProductContract entity, since it is going to have its own input/output DTOs, logic, etc.?
OR
Is it better to create a ProductContractManager and keep the logic there, then inject this manager into the Product Application Service? This way, the Product Application Service would have methods like 'AddProductContract', 'UpdateProductContract', etc.
The main driver of whether ProductContract should be owned by a different service is the consistency requirement, because achieving stronger levels of consistency between microservices is difficult (especially without tying the services together at the DB level; if you go down that road, the decision to have separate microservices should probably also be revisited). So the main question you want to ask is: when updating or deleting a Product, can there be a window of time in which that update/delete isn't reflected in ProductContract (and vice versa)?
If the answer to that is that it can't, then you'll save yourself a lot of trouble by just putting responsibility for ProductContract in the same service that has responsibility for Product. If eventual consistency is allowed, then a separate service might be called for: the consideration there isn't really technical but more related to human organization (who will be developing and maintaining the functionality) and expectations around whether ProductContract and Product will evolve together (functionality changes in one being likely to also lead to functionality changes in the other).
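For illustration, if you do keep both in one service, the second option from the question might look roughly like this. This is only a minimal sketch assuming aspboilerplate-style base classes (DomainService, ApplicationService, IRepository); the entity and DTO names (ProductContract, AddProductContractInput) are made up.

using System.Threading.Tasks;
using Abp.Application.Services;
using Abp.Domain.Repositories;
using Abp.Domain.Services;

// Domain-level manager that owns ProductContract rules (hypothetical names).
public class ProductContractManager : DomainService
{
    private readonly IRepository<ProductContract> _contractRepository;

    public ProductContractManager(IRepository<ProductContract> contractRepository)
    {
        _contractRepository = contractRepository;
    }

    public async Task AddAsync(int productId, ProductContract contract)
    {
        contract.ProductId = productId;
        // Domain rules (overlapping contracts, validity dates, ...) would go here.
        await _contractRepository.InsertAsync(contract);
    }
}

// Application service for Product exposing the contract CRUD methods.
public class ProductAppService : ApplicationService
{
    private readonly ProductContractManager _contractManager;

    public ProductAppService(ProductContractManager contractManager)
    {
        _contractManager = contractManager;
    }

    public async Task AddProductContract(AddProductContractInput input)
    {
        var contract = ObjectMapper.Map<ProductContract>(input);
        await _contractManager.AddAsync(input.ProductId, contract);
    }
}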
As a novice to this realm, I am planning on building an MVC application. I had originally started a Web Forms application but decided that scalability and testability would benefit more from an MVC application. I chose to switch for the added benefit of it being easier to add more features later on (instead of having code baked into Web Forms pages).
Now a little about my application: it is an application to simulate an RPG class builder and move set. In all simplicity, users can register for a class and, depending on the other skills they register for, see a custom move set based on these categories. The way I am envisioning it, I will be able to go back later and add more classes and skills to the database, and users will be able to register for this new content as soon as it has been added to the project.
Everything lives in normalized tables, so many join tables exist. Each new skill or class I add will mean a handful of new tables in the database. This speaks to the way the data will be stored: everything, all information about classes, user data, skills, etc., will be stored in the database.
I have designed all the initial database tables I will need at the start, and the functionality I need (a home page, a view-skills page, a view-move-sets page, etc.). I am stuck at the next step: where do I go? Should I make my controllers first? Models? Views? Design my page layouts? I am asking for advice from people who have taken a similar organic approach to an MVC project. I am facing analysis paralysis on what to start on, knowing I have a lot of work ahead of me.
Thank you for taking your time to answer.
I've taken everyone's advice and am putting together a website to learn MVC: http://learnaspnetmvc.azurewebsites.net
The most important advice I can give you: just start. A big project can seem overwhelming, especially when you're looking at it like a big project. Instead, break it into small achievable tasks. Find something you can do right now, the ever-so-smallest subset of functionality, and do it. Then do the next one. And the next.
That said, I'll tell you my personal process. When I start on a new application or piece of an application, I first like to create my models. That way I can play with the interactions between them, flesh out the relationships, and think about the needs of my application in a somewhat low-pressure, easily disposable way. I also use code-first, whereas you've gone and created your database tables already. Some people prefer to do it that way. Personally, I find starting with my classes and letting those translate into an underlying data store much more organic. In a sense, it relegates the database to almost a non-existent layer. I don't have to think about what datatype things need to be, what should be indexed and what shouldn't, how querying will work, what kind of stored procedures I need, etc. Those questions have their time and place -- the nascent development stage is not that time and place. You want to give your brain a place to play with ideas, and classes are a cheap and low-friction medium. If an idea doesn't work out, throw the class away and create a new one.
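To make that concrete, a code-first starting point for a domain like yours might be nothing more than a few bare classes and a context. This is just an illustrative sketch assuming Entity Framework; all the names are made up.

using System.Collections.Generic;
using System.Data.Entity;

public class CharacterClass
{
    public int Id { get; set; }
    public string Name { get; set; }
    public virtual ICollection<Skill> Skills { get; set; }
}

public class Skill
{
    public int Id { get; set; }
    public string Name { get; set; }
    public virtual ICollection<Move> Moves { get; set; }
}

public class Move
{
    public int Id { get; set; }
    public string Name { get; set; }
    public int SkillId { get; set; }
    public virtual Skill Skill { get; set; }
}

// The DbContext is the only nod to persistence at this stage.
public class GameContext : DbContext
{
    public DbSet<CharacterClass> Classes { get; set; }
    public DbSet<Skill> Skills { get; set; }
    public DbSet<Move> Moves { get; set; }
}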
Once I have my models, I like to hit my controllers next. This lets me start to see my models in action. I can play around with the actual flow of my application and see how my classes actually work. I can then make changes to my models where necessary, add additional functionality, etc. I can also start playing around with view models, and figuring out what data should or should not be passed to the view, how it will need to be displayed (will I need a drop down list for that? etc.), and such. This, then, naturally leads me into my views. Again, I'm testing my thinking. With each new layer, I'm hardening the previous by getting a better and better look at how it's working.
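Again purely as an illustration (made-up names, building on the model sketch above), a first controller and view model might be no more than this -- just enough to see the models flowing toward a view:

using System.Collections.Generic;
using System.Linq;
using System.Web.Mvc;

public class SkillsViewModel
{
    public string ClassName { get; set; }
    public List<string> SkillNames { get; set; }
}

public class SkillsController : Controller
{
    private readonly GameContext _db = new GameContext();

    // Shows the skills available for a given class.
    public ActionResult Index(int classId)
    {
        var characterClass = _db.Classes.Find(classId);
        if (characterClass == null)
            return HttpNotFound();

        var viewModel = new SkillsViewModel
        {
            ClassName = characterClass.Name,
            SkillNames = characterClass.Skills.Select(s => s.Name).ToList()
        };
        return View(viewModel);
    }

    protected override void Dispose(bool disposing)
    {
        if (disposing) _db.Dispose();
        base.Dispose(disposing);
    }
}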
Each stage of this process is very liquid. Once I start working on my controllers, I will make changes to my models. Once I hit the views, the controllers will need to be adjusted and perhaps the models as well. You have to give yourself the freedom to screw up. Inevitably, you'll forget something, or design something in a bone-headed way, that you'll only see once you get deeper in. Again, that's the beauty of code-first. Up to this point, I don't even have a database, so any change I make is no big deal. I could completely destroy everything I have and go in a totally different way and I don't have to worry about altering tables, migrating data, etc.
Now, by this point my models are pretty static, and that's when I do my database creation and initial migration. Even then, really, it's only because a database is required before I can actually fire this up in a browser to see my views in action. You can always do a migration later, but once you're working with something concrete, the friction starts to increase.
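Assuming EF code-first migrations, that step is typically just a few Package Manager Console commands:

Enable-Migrations
Add-Migration InitialCreate
Update-Database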
I'll tend to do some tweaking to my controllers and obviously my views, now that I'm seeing them live. Once I'm happy with everything, then I start looking at optimization and refactoring -- How can I make the code more effective? More readable? More efficient? I'll use a tool like Glimpse to look at my queries, render time, etc., and then make decisions about things like stored procedures and such.
Then, it's just a lot of rinse and repeat. Notice that it's all very piecemeal. I'm not building an application; I'm building a class, and then another class, and then some HTML, etc. You focus on just that next piece, that small chunk you need to move on to the next thing, and it's much less overwhelming. So, just as I began, I'll close the same: just start. Writers have a saying that the hardest thing is the first sentence. It's not because the first sentence is really that difficult; it's because once you get that, then you write the second sentence, and the third, and before you know it, you've got pages of writing. The hardest part is in the starting. Everything flows from there.
The other answers here have great advice and important nuggets of information, but I think they do you a disservice at this stage. I'm the first to advocate best practice, proper layering of your applications, etc. But, ultimately, a complete app that follows none of this is more valuable than an incomplete app that incorporates it all. Thankfully, we're working with a malleable medium -- digital text -- and not stone. You can always change things, improve things later. You can go back and separate your app out into the proper layers, create the repositories and services and other abstractions, add in the inversion of control and dependency injection, etc. Those of us who have been doing this awhile do that stuff from the start, but that's because we've been doing this awhile. We know how to do that stuff -- a lot of times we already have classes and libraries we drop in for that stuff. For someone just starting off, or for an app in its earliest nascent stage, it can be crippling, though. Instead of just developing your app, you end up spending days or weeks poring over recommendations, practices, libraries, etc. trying to get a handle on it all, and by the end you have nothing really to show for it. Don't worry about doing things right -- just do something. Then, refactor until it's right.
As a first step in planning an MVC framework application, we should start with a strong Model (typical C# props). This process is going to take most of our time, because we need to understand the business first and then the relationships between the different workflows and entities. Business models also evolve as time passes, so spend quality time on building this layer, but not too much.
Once the domain (business) Models are ready, and before we actually start coding the Repository classes, we should define our Repository contracts, which are typically interfaces. Contracts help all parties (other components) interact with each other in exactly the same way. Then we implement the contracts in the Repository component, which just PUSHes and PULLs data to and from your persistence medium (say, a database). Remember, the Repository component is never going to have any idea about business logic.
Once the backend has been established, we can concentrate on the actual business process implementation. We can define one more level of contract that defines all the business operations to be done using the Model classes. This interface is implemented by the BusinessLogic component, which does the core business activity (specific methods for every business operation). This component uses the Repository component to delegate business data to the persistence medium.
With the above steps completed, we can easily go and build the Controllers. We should call the business logic component's methods in the controllers to get the work done. Once the controllers are done, we can define our views and other UI elements such as partial views.
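A compressed, purely illustrative sketch of that flow (repository contract, business contract, their implementations, and a controller; all names are made up, and Skill is the model from earlier):

using System.Web.Mvc;

// Repository contract: pure data in/out, no business logic.
public interface ISkillRepository
{
    Skill GetById(int id);
    void Add(Skill skill);
}

// Business contract: the operations the application exposes.
public interface ISkillService
{
    Skill GetSkillDetails(int id);
}

// Business logic component; delegates persistence to the repository.
public class SkillService : ISkillService
{
    private readonly ISkillRepository _repository;

    public SkillService(ISkillRepository repository)
    {
        _repository = repository;
    }

    public Skill GetSkillDetails(int id)
    {
        // Business rules would be applied here around the repository call.
        return _repository.GetById(id);
    }
}

// Controller only talks to the business contract.
public class SkillController : Controller
{
    private readonly ISkillService _service;

    public SkillController(ISkillService service)
    {
        _service = service;
    }

    public ActionResult Details(int id)
    {
        return View(_service.GetSkillDetails(id));
    }
}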
Pictorial representation of the flow is as follows -
A simple architecture (from high to low level)
Presentation Layer
Domain Logic Layer
Data Access Layer
Database
The Presentation Layer is the MVC project, containing Views, Controllers and optional View-Models.
The Domain Logic Layer is a class library project that the Presentation Layer will access (via a DLL or service reference). This layer contains the business logic and rules for the application.
The Data Access Layer may contain two sub-layers:
A Repository. Using a repository is best practice for any long-term application.
An Entity Framework Model.
This layer communicates with the database.
The Database is what you already have.
I would suggest reading through a book by Scott Millett called Professional ASP.NET Design Patterns.
ISBN: 0-470-29278-4
Scott walks through what a good ASP.NET site should look like from an architectural standpoint - e.g. data access layers, business logic layers, presentation layers, domain events, etc.
By following industry standards, you will gain a better understanding of how to correctly put together an MVC website.
Hope this helps.
I would suggest building your MVC application around an ASP.NET Web API, since it will help in case you go for a mobile application later.
Since you are an MVC newbie, you should download some open-source MVC projects shared by seniors in the community. Study two or three projects and work out which approach will be best for you.
A quick Google search will get you to some good projects.
e.g.
Making a simple application, Prodinner
After that, you should also go through the MSDN tutorial on an MVC 5 app with SSO, to enable social logins.
I'm working on a legacy web project, so there is no ORM (EF, NHibernate) available here.
The problem is that the structure feels tedious when implementing a new function.
Let's say I have a business object Team. Now if I want to implement GetTeamDetailsByOrganisation, following the current coding style in the project, I need to:
In Team's DAL, create a method GetTeamDetailsByOrganisation
Create a method GetTeamDetailsByOrganisation in the business object Team, and call the DAL method I just created
In Team's BAL, wrap the business object Team's method in another method, maybe with the same name, GetTeamDetailsByOrganisation
Have the page controller class call the BAL method.
It just doesn't feel right. Is there a good practice or pattern that can solve my problem here?
I know the tedium you speak of from similarly (probably worse) structured projects. Obviously there are multiple sensible answers to this problem, but it all depends upon your constraints and goals.
If the project is primarily in maintenance mode with very few new features being added, I might accept that this is the way things are. It sounds like you are adding at least some new features, though.
Is it possible to use a code generator? A project I worked on had a lot of tedium like this, apparently because they originally used a code generator for the code base and the generator was lost to the sands of time. I ended up recreating the template, which saved me a lot of time, sanity, and defects.
If the project is still under active development maybe it makes sense to perform some sort of large architectural change. My current project is currently in this category. We're decoupling code and adding repositories as we go. It's a slow process that takes diligence and discipline by the whole dev team. Each time a team takes on a story they tax that story with rewriting some of the legacy code in that area. To help facilitate this we gave a presentation to the rest of the team to get buy-in and understanding. We also created some documentation for our dev team that lists out the steps to take and the things to watch out for. In the past 6 months we've made a ton of progress. We don't have the duplication you speak of, but we have tight coupling issues which makes unit testing impossible without this refactor.
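As a rough illustration of what "adding repositories as we go" can look like without an ORM (all names here are made up, plain ADO.NET), the pass-through wrapper layers collapse into a single implementation behind an interface that the page/controller can depend on:

using System.Collections.Generic;
using System.Data.SqlClient;

public interface ITeamRepository
{
    IList<TeamDetails> GetTeamDetailsByOrganisation(int organisationId);
}

public class SqlTeamRepository : ITeamRepository
{
    private readonly string _connectionString;

    public SqlTeamRepository(string connectionString)
    {
        _connectionString = connectionString;
    }

    public IList<TeamDetails> GetTeamDetailsByOrganisation(int organisationId)
    {
        var results = new List<TeamDetails>();
        using (var connection = new SqlConnection(_connectionString))
        using (var command = new SqlCommand(
            "SELECT Id, Name FROM Team WHERE OrganisationId = @orgId", connection))
        {
            command.Parameters.AddWithValue("@orgId", organisationId);
            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    results.Add(new TeamDetails
                    {
                        Id = reader.GetInt32(0),
                        Name = reader.GetString(1)
                    });
                }
            }
        }
        return results;
    }
}

// TeamDetails is a plain DTO the page/controller can consume directly.
public class TeamDetails
{
    public int Id { get; set; }
    public string Name { get; set; }
}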
This is less likely to fit your scenario, but it may also be a possibility to take a certain subset of features and separate those out into separate services that can be rewritten using a better platform and patterns. The old codebase can interoperate at the service layer if needed. You likely make changes in certain areas more than others, so the areas of heavy change might be top priority to move to a dedicated service. This has the benefit of allowing you to create a modern code base without having to rewrite the entire application from scratch all at once. This strategy is what Netflix has done to rewrite their platform as they go and move it to the cloud.
The majority of application architecture advice seems to advise strongly that only the Presentation Layer should have access to HttpContext (to promote loose coupling, decrease dependencies, increase testability, etc.).
So, how do people deal with Caching and Session? Very specific Data Access and Business Logic knowledge is required to determine which items need caching to best benefit application performance. However, in an ASP.NET web application, access to these is provided via the HttpContext.
One option would be to create a CacheFactory and a SessionFactory, and an ICache and ISession interface - and then somehow use the DI principle to pass an ISession and ICache to each method in the BLL (and subsequently the DA Layer) that needs it.
Is this really what developers end up doing? Is there another, easier, way to deal with this?
Thanks for any advice.
Personally, if I'm developing for ASP.NET (web) only, and I know the class libraries are only going to target the web, I have no issues referencing System.Web or any other assembly as needed. Sometimes practicality should come first.
On the other hand, if you know (or are unsure whether) the BLL, DAL or other layers may be reused in a client-server or other environment, you may need to look at a more flexible approach such as a session handler interface, etc.
The main thing is you clearly understand what best practice recommends. At that point you can make rational decisions about what should apply to each project or situation. It won't always fit like a glove.
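If you do go the interface route the question mentions, a minimal sketch might look like the following; the type names are made up, and only the web project ever references System.Web:

using System;
using System.Web;
using System.Web.Caching;

public interface ICache
{
    object Get(string key);
    void Set(string key, object value, TimeSpan slidingExpiration);
}

// Lives in the web project; the BLL only ever sees ICache,
// and tests can supply an in-memory fake instead.
public class HttpContextCache : ICache
{
    public object Get(string key)
    {
        return HttpContext.Current.Cache[key];
    }

    public void Set(string key, object value, TimeSpan slidingExpiration)
    {
        HttpContext.Current.Cache.Insert(key, value, null,
            Cache.NoAbsoluteExpiration, slidingExpiration);
    }
}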
In my experience caching is done at a certain layer - and what you're caching applies within the scope of that layer, so:
You might cache data in the DAL so that the DAL returns it from memory and not by hitting the DB again; so here we're caching "raw" data at the data level (sketched below).
The BL might cache similar data (say, POCOs) - in other words "logical" data that has had BL applied to it; so here we're caching BL data at the BL level.
Your UI might cache (you guessed it) information that is built at the UI layer - like a presentation of data as a webpage, XML, etc.
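For example, DAL-level caching (the first bullet) might look roughly like this, using MemoryCache; the types and the data being cached are made up:

using System;
using System.Collections.Generic;
using System.Runtime.Caching;

public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class CachedProductDal
{
    private static readonly MemoryCache Cache = MemoryCache.Default;

    public IList<Product> GetProducts()
    {
        // Served from memory if we've loaded it recently - no DB round trip.
        var cached = Cache.Get("products") as IList<Product>;
        if (cached != null)
            return cached;

        IList<Product> fromDb = LoadProductsFromDatabase();
        Cache.Set("products", fromDb, DateTimeOffset.Now.AddMinutes(5));
        return fromDb;
    }

    private IList<Product> LoadProductsFromDatabase()
    {
        // The real ADO.NET / ORM call goes here.
        throw new NotImplementedException();
    }
}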
When you look at caching, as the architect / designer of the system you'll make decisions about which things are worth caching; generally this will be driven by a need to balance:
Performance needs to hit a specific target.
What time (and cost) options you have.
Assuming we're caching to increase performance, you'll want to analyse your system and identify the bottlenecks that need to be removed or avoided - the first pass of this analysis should at least identify which layer to focus on first.
Another thing to consider is that the more you can cache closer to the consumer, the less overall processing there is that needs to occur; in other words, if you can cache at the UI layer then the BL and DAL layers won't even get a look-in (you won't need to add caching there).
At this point I want to ask you what exactly it is that you're trying to achieve.
While everything I've said is true (as far as I am aware), it's all based on the assumption that we're delivering "vertical" slices of the system's functionality (like "creating an invoice", "adding a product", etc.); there's another model you can apply - the cross-cutting concerns model.
Think about something like logging - every part of your system (UI / BL / DAL) should be able to log system events; my favourite tool for this is the MS Enterprise Libraries (MSEntLibs). When I work with them I consider them to be a "black box" - a self-contained unit. I (by choice) have no idea how they are architected - the thing that is important is that they are isolated and have dependencies which are relatively easy to manage.
The MSEntLibs can log to a database, but if I call an MSEntLibs logging method (to log to the DB) direct from my UI (and skipping over my BL) I don't care.
So depending on what you're trying to do, the right answer will be either:
Simply identifying what needs to be cached and letting the appropriate layer handle it.
You don't need to try using DI to pass stuff to your BL at all - a cross-cutting black-box component might be appropriate?
I work for a firm that provides certain types of financial consulting services in most states in the US. We currently have a fairly straightforward CRUD application that manages clients and information about assets and services we perform for each. It only concerns itself with the fundamental data points and processes that are common to all locations--the least common denominator.
Now we want to implement support for tracking disparate data points and processes that vary from state to state while preserving the core nationally-oriented system. Like this:
[diagram: the national core application with state-specific satellite functionality] (source: flickr.com)
The stack I'm working with is ASP.Net and SQL Server 2008. The national application is a fairly straightforward web forms thing. Its data access layer is a repository wrapper around LINQ to SQL entities and datacontext. There is little business logic beyond CRUD operations currently, but there would be more as the complexities of each state were introduced.
So, how to implement the satellite pieces...
Just start glomming on the functionality and pursue a big ball of mud
Build a series of satellite apps that re-use the data-access layer but are otherwise stand-alone
Invest (money and/or time) in a rules engine (a la Windows Workflow) and isolate the unique bits for each state as separate rule-sets
Invest (time) in a plugin framework a la MEF and implement each state's functionality as a plugin
Something else
The ideal user experience would appear as a single application that seamlessly adapts its presentation and processes to whatever location the user is working with. This is particularly useful because some users work with assets in multiple states. So there is a strike against number two.
I have no experience with MEF or WF so my question in large part is whether or not mine is even the type of problem either is intended to address. They both kinda sound like it based on the hype, but could turn out to be a square peg for a round hole.
In all cases each state introduces new data points, not just new processes, so I would imagine the data access layer would grow to accommodate the addition of new tables and columns, but I'm all for alternatives to that as well.
Edit: I tried to think of some examples I could share. One might be that in one state we submit certain legal filings involving client assets. The filing has attributes and workflow that are different from other states that may require similar filings, and the assets involved may have quite different attributes. Other states may not have comparable filings at all, still others may have a series of escalating filings that require knowledge of additional related entities unique to that state.
Start with the Strategy design pattern, which basically allows you to outline a "placeholder" to be replaced by concrete classes at runtime.
You'll have to sketch out a clear interface between the core app and the "plugins", and have each strategy implement it. Then, at runtime, when you know which state the user is working on, you can instantiate the appropriate state strategy class (perhaps using a factory method) and call the generic methods on it, e.g. something like
IStateStrategy stateStrategy = StateSelector.GetStateStrategy("TX"); //State id from db, of course...
stateStrategy.Process(nationalData);
Of course, each of these strategies should use the existing data layer, etc.
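To fill in the shape of the pieces used above (a rough sketch only - NationalData, TexasStrategy and the rest are just placeholder names):

using System;

public class NationalData
{
    // The shared, nationally-oriented data points would live here.
}

public interface IStateStrategy
{
    void Process(NationalData nationalData);
}

// One concrete strategy per state that needs special handling.
public class TexasStrategy : IStateStrategy
{
    public void Process(NationalData nationalData)
    {
        // Texas-specific filings, attributes and workflow go here.
    }
}

public static class StateSelector
{
    public static IStateStrategy GetStateStrategy(string stateId)
    {
        switch (stateId)
        {
            case "TX":
                return new TexasStrategy();
            // case "CA": return new CaliforniaStrategy(); etc.
            default:
                throw new ArgumentException("No strategy defined for state " + stateId);
        }
    }
}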
The (apparent) downside with this solution is just that you'll be hardcoding the rules for each state, and you cannot transparently add new rules (or new states) without changing the code. Don't be fooled, that's not a bad thing - your business logic should be implemented in code, even if it's dependent on runtime data.
Just a thought: whatever you do, completely code 3 states first (with 2 you're still tempted to repeat identical code, with more it's too time-consuming if you decide to change the design).
I must admit I'm completely ignorant about rules or WF. But wouldn't it be possible to just have one big stupid ASP.Net include file with instructions for states separated from main logic without any additional language/program?
Edit: Or is it just the fact that each state has quite a lot of completely different functionality, not just some bits?