Modular Contracts with Spring Cloud Contract

We have the problem that we need to use parts of a contract in several projects. Is it possible to split contracts into several fragments and include those fragments in every contract that needs them?

You can extend the DSL (https://cloud.spring.io/spring-cloud-static/Edgware.SR2/multi/multi__customization.html#_common_jar). Create a separate project with the common piece of DSL and then reuse it in any contract you need.

Related

Clean Architecture (DDD) Why are domain objects (DB Entities) and DbContext in separate projects?

I understand the need for abstraction, separation of concerns, and unit tests; however, it seems to me that separating entities and context into two projects is slight over-engineering?
I could really be missing something, but is this because you want to stay open to different ORMs?
Many thanks for the clarification.
The main reason I prefer to have Infrastructure in a separate project, rather than just a separate folder, from the domain model (Core project) is simple: enforcing my design via the compiler.
I have a design rule, which is basically the Dependency Inversion Principle. Don't depend on low level implementations (such as those found in Infrastructure), instead depend on abstractions (interfaces). Also, don't have your abstractions depend on details; have details depend on abstractions. The details of how and which infrastructure is being used for a given abstraction are in the Infrastructure service implementations.
Abstractions say what; implementations say how.
What: I need to send an email.
ISendEmail interface
How: I want to do it using the SMTP protocol
SmtpEmailSender class (implements ISendEmail)
How: I want to do it using a SendGrid API
SendGridEmailSender class (implements ISendEmail)
So, in a single project, how would you ensure that the implementations depend on the interfaces, and not vice versa?
How would you ensure your domain classes didn't directly reference or use Infrastructure types?
I'm not aware of a way to do this.
But if you put them in separate projects, and you have the implementation details project depend on the abstractions-and-models project, you now have solved the problem. The compiler WILL NOT ALLOW the Core project to reference anything in the Infrastructure project, because it would create a circular dependency.
This constraint helps developers do the right thing and keeps them in the pit of success even if they don't completely grok how the dependency inversion principle works or why it's important.
And I've never found 3 projects (Core/Infra/UI) to be overengineering for any non-demo app I've built for real work. It's only 3 projects.
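A hedged sketch of what this looks like in practice (the project and namespace names here are invented for illustration): the abstraction sits in Core, and Infrastructure references Core, never the reverse.

// Core project: abstractions and domain model only - no infrastructure references.
namespace MyApp.Core
{
    public interface ISendEmail
    {
        void Send(string to, string subject, string body);
    }
}

// Infrastructure project: references MyApp.Core; the compiler forbids the reverse.
namespace MyApp.Infrastructure
{
    using System.Net.Mail;
    using MyApp.Core;

    public class SmtpEmailSender : ISendEmail
    {
        public void Send(string to, string subject, string body)
        {
            // How: deliver via SMTP (host name is a placeholder).
            using (var client = new SmtpClient("smtp.example.com"))
            {
                client.Send(new MailMessage("noreply@example.com", to, subject, body));
            }
        }
    }
}

Any attempt to use SmtpEmailSender from within Core now fails at compile time, which is exactly the enforcement described above.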

Cascading microservices using Meteor

I've been looking into scaling Meteor, and had an idea using the Meteor Cluster package:
Create a super-service*, which the user connects to, containing the general core packages used by every micro-service (api, app, salesSite, etc. would all make use of its packages),
The super-service then routes to the appropriate micro-service (e.g., the app), providing it with the functionality of its own packages.
(* - as in super- and sub-, not that it's awesome... I mean it is but...)
The idea being that I can cascade each service as a superset of the super-service. This would also allow me to cleverly inherit functionality for other services in a cascading service style. E.g.,
unauthedApp > guestApp > userApp > modApp > adminApp,
for the application, where the functionality of each service is inherited by the next (i.e., the further right along that chain, the more extra functionality is added and inherited).
Is this possible?
EDIT: If possible, is there a provided example of how to implement such a pattern using micro-services?
[[[[[ BIG EDIT #2: ]]]]]
Think I'm trying to make a solution fit the problem, so let me re-explain so this question can be answered based on the issue rather than the solution I'm trying to implement.
Basically, I want to "inherit" (for lack of a better word) packages depending on the needed functionality, so that no code is unnecessarily sent over the wire.
So, starting with the core packages, which contain libraries I want all of my services to have, I then want to "add" further functionality as needed. Then I want to add page packages if serving a page-based service (instead of, say, the API service, which doesn't render pages), then the appropriate role-based page packages, etc., until the most specific packages are added.
My thought was that I could chain the services in such a way that I could traverse from the most generic to the most specific service, finally ending with a composition of packages from multiple services. So, for example, the guestApp might be the core packages + generic page packages + generic app packages + unauthedApp packages + guestApp packages, so no unnecessary packages are added.
Also, with this imaginary pattern I'm describing, I don't need to add all my core packages to each microservice - I can deal with them all within the core package right at the top of the traversal discussed above, and not have to worry about forgetting to add packages to the "inheriting" services.
Hope my reasoning here makes sense, and I hope you guys know of a best practice for doing this. Thank you!
Short answer:
Yes! That's a good use of a microservice architecture.
Long answer:
Microservices don't necessarily provide an inheritance mechanism as in OOP. You should consider microservices as independent "functions" which take an input and respond with an output/action. Any microservice can depend on another to complete its own task.
And then, you "compose" necessary microservices in order to achieve the final output/action.
You can have one or few web facing "frontend" services that use a mix of few other backend microservices whose ports are not open to the public network.
The drawback with a microservice would be its "minimum footprint". The idea with microservices is around some main benefits:
Separate core services so that they can be "maintained" independently
Separate core services so that they can be "replaced" independently
Separate core services so that they can be "scaled" independently
But then, each microservice, being a node/meteor app, will have its minimum cpu/ram footprint even when they are just idle and waiting for a connection.
Furthermore, managing a single monolithic app, or just a few "largish" services is much easier, from a devops standpoint, than managing tens of individual deployments.
So with all engineering decisions, the right answer would imply some kind of "balance".
Edit: reference to inheritance
As per the OP's comment, the microservices can indeed be referenced from parent code as either functions or classes and be composed (functions) or inherited from (classes), because, after all, the underlying functionality is exposed as DDP endpoints.
If you are using the cluster package from meteorhacks:
// create a connection to your microservice
var someService = Cluster.discoverConnection("someService");
// call a normal meteor method from that service
var resultFromSomeService = someService.call("someMethodFromSomeService");
So, as with any piece of JavaScript code, you can wrap the above in a function, or in a class with its constructor and all, inherit from it, and expose its interfaces as you desire.

Multiple projects in one visual studio solution

I am an ASP.NET MVC beginner and I have just created a new solution. What I have noticed is that there is an option of adding multiple projects under the same solution, and that is something that is new to me.
What is the main purpose of adding multiple projects under the same solution?
A solution can hold multiple projects that are related and logically grouped together. For example, a solution may contain two web site projects (a user site and an administration site) and then also a class library project that they both share which contains common database access code or business logic.
Usually we separate our code into multiple projects for easier maintenance in the future. At a high level, we make separate class libraries for Data Access, Domain Models, and Business Logic, and a web project for the front-end UI. This way we physically separate the code, which increases re-usability. Say in the future you want to re-use your Data Access components: build that class library, take the DLL, and use it in other projects.
Also, if in the future you want to replace a certain layer, you can simply decouple it and change it without touching any other components of the code.
This physical segregation of code, combined with logical dependency injection, gives you cleaner, easier-to-maintain, re-usable, loosely coupled systems.
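As a small illustrative sketch (all names hypothetical): the shared class library holds a business rule that both web projects call, so fixing it in one place fixes it for both.

// Shop.Common class library, referenced by both the user site and the admin site.
namespace Shop.Common
{
    public static class PriceCalculator
    {
        // Shared business rule: gross price from net price and VAT rate.
        public static decimal WithVat(decimal net, decimal rate = 0.20m)
        {
            return net * (1 + rate);
        }
    }
}

// In either web project, after adding a project reference to Shop.Common:
// decimal gross = Shop.Common.PriceCalculator.WithVat(100m); // 120.00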

Modular Software Design

I am trying to implement modular design in an ASP.NET project, dividing the application into different modules like HR, Inventory Management, etc. Since I am trying to keep different modules independent of each other, I separated these modules in such a way that each module is a separate Visual Studio solution having its own UI, BLL, DAL, and even a separate database schema.
Until now I thought this was common practice for developing management systems and ERPs, but I have been searching the web for the last three days and have hardly found any helpful material on developing modular applications. Most of what I found is mere theory explaining the concepts of cohesion and coupling, not real-world scenarios. So I wonder:
Is this the right approach to separating modules?
How are real-world modular applications developed?
How should the different modules communicate with each other while still staying independent?
I think there should be a core application which makes use of these modules; how should the core application communicate with them?
There are some data, entities, and objects which are common to all modules. Should I put them in the core module so other modules can use them (which I think will couple the modules to the core), or should every module maintain its own copy of the data and define those objects itself (which I think violates DRY)?
Any thoughts or links are warmly welcome.
This is a personal opinion and is debatable.
I separated these modules in such a way that each module is a separate Visual Studio solution having its own UI, BLL, DAL, and even a separate database schema.
Sounds like total overkill. Abstraction over abstraction makes your application a pain in the neck to maintain, support, and enhance. Is it really so large that you need to separate modules into separate solutions?
Is this the right approach to separating modules?
No, I think it is total over-engineering. I would suggest using projects, not separate solutions, to separate modules. The problem with separate solutions is that they require an external dependency-management tool, which takes a lot of effort to bring in and later maintain.
How are real-world modular applications developed?
Using abstraction (interfaces and abstract classes) and separate projects.
How should the different modules communicate with each other while still staying independent?
By using interfaces, dependency injection, IoC, and TDD.
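A minimal sketch of what that means in practice (all types and names hypothetical): the consuming module depends only on an interface and receives the implementation through constructor injection, so neither module references the other's assembly.

// Shared abstraction, owned by a small contracts/core project.
public interface IEmployeeDirectory
{
    string GetEmployeeName(int employeeId);
}

// Inventory module: depends only on the abstraction, never on the HR module.
public class StockAuditReport
{
    private readonly IEmployeeDirectory _directory;

    public StockAuditReport(IEmployeeDirectory directory)
    {
        _directory = directory;
    }

    public string Header(int auditorId)
    {
        return "Audited by " + _directory.GetEmployeeName(auditorId);
    }
}

// The composition root (your IoC container configuration) binds the HR
// module's concrete implementation to IEmployeeDirectory at startup.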
I think there should be a core application which makes use of these modules; how should the core application communicate with them?
Core does not communicate with modules. In fact it should ideally not depend on any other project/library. This makes it simple to reference and use in large solutions.
There are some data, entities, and objects which are common to all modules. Should I put them in the core module so other modules can use them (which I think will couple the modules to the core), or should every module maintain its own copy of the data and define those objects itself (which I think violates DRY)?
I would highly recommend using a single copy from the Core project. See this question for the details of why.
This is one of those topics that is largely subjective, but you may wish to consider an SOA (Service-Oriented Architecture).
Using SOA, you can define a service (for this example, I'll stick to web services, though other service types exist depending on requirements) for each business area - an HR web service, a projects web service, a finance web service and so forth.
You can then bring all these together with a front-end system that communicates with and utilises these services; that would normally be your core application, though depending on your needs and requirements you may opt for multiple front-end systems.
For the front-end system I would recommend ASP.NET MVC, which has the concept of areas and will let you separate the front end into specific areas - an HR area, a projects area, a finance area and so forth - each containing the models and views for that specific area.
Doing this will let you build in a modular manner: you can build your first web service, say the HR web service, with methods for getting relevant HR data, and then build the HR area of your MVC application. Expanding then simply means building the next web service and creating its front end in the MVC application. There is nothing stopping, say, the HR area from accessing the finance web service if it needs finance information, but everything still stays in distinct, independent modules.
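As a hedged sketch of the front-end side (types and names hypothetical, and a DI-aware controller factory is assumed): a controller in the HR area talks to the HR web service through an injected client interface, so the area never needs to know how the service is hosted.

// Client abstraction over the HR web service.
public interface IHrServiceClient
{
    string GetEmployeeName(int employeeId);
}

// Controller in the HR area of the ASP.NET MVC front end.
public class EmployeesController : System.Web.Mvc.Controller
{
    private readonly IHrServiceClient _hrService;

    public EmployeesController(IHrServiceClient hrService)
    {
        _hrService = hrService;
    }

    public System.Web.Mvc.ActionResult Details(int id)
    {
        // The area only consumes the service contract, not its implementation.
        ViewBag.EmployeeName = _hrService.GetEmployeeName(id);
        return View();
    }
}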
Using this method can also aid future interoperability - other systems in the company may find it useful to interact with certain web services. For example, in a previous role it was useful for the company's engineering software to integrate with the projects team's web service, as it allowed engineering-related information to be linked to its related project.
If the system grows in terms of resource requirements it should also be fairly scalable, as it is trivial to, say, offload the projects web service to another server if it starts eating a lot of system resources. It also allows you to swap modules out if need be - if you ever decided to move to, say, a Linux/Java platform, you could move by porting module by module with no real interruption to the overall system.
But of course, as I say, this is simply one such option and much of it depends on the specifics of your circumstances.
It is rather late to answer, but this seems interesting.
I separated these modules in such a way that each module is a separate Visual Studio solution having its own UI, BLL, DAL, and even a separate database schema.
It depends on the scale of your application. If you are creating a very small, simple application with little functionality, it is safe to have a single combined assembly. Or, if you want, just separate the UI from the other modules; at least that helps you emphasize separation of concerns. Keep in mind that loading multiple assemblies can be slower than loading a single one.
Is this the right approach to separating modules?
Module separation always has a drawback: it requires mapping between the modules. That means slower performance in general (maybe negligible, but still there) and slower development time. If your application will be large and complex enough, it is worth it, since you can create modular unit tests for each module.
How are real-world modular applications developed?
There is no single exact practice; every problem needs its own solution. You won't need heavy multi-threading or a dependency-injection architecture for a simple calculator application.
How should the different modules communicate with each other while still staying independent?
Use interfaces. You can change the implementation later on. For example: you currently use C# WinForms for your application and communicate with the BLL through an interface. Later on, when you want to migrate to ASP.NET, you just change the implementation, keeping the interface used to communicate with the BLL the same.
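For example (the interface and member names are hypothetical), both front ends program against the same BLL interface:

// Interface exposed by the BLL; both UIs depend only on this.
public interface IOrderService
{
    decimal GetOrderTotal(int orderId);
}

// WinForms today:   labelTotal.Text = orderService.GetOrderTotal(id).ToString();
// ASP.NET later:    litTotal.Text = orderService.GetOrderTotal(id).ToString();
// Only the implementation behind IOrderService changes during the migration.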
I think there should be a core application which makes use of these modules; how should the core application communicate with them?
There are some data, entities, and objects which are common to all modules. Should I put them in the core module so other modules can use them (which I think will couple the modules to the core), or should every module maintain its own copy of the data and define those objects itself (which I think violates DRY)?
I assume it is an enterprise-level application whose modules share the same data, such as employees. If the data really needs to behave uniformly, then you should provide the very basic logic at the core level. At the application/implementation level, you may have different implementations to fulfill each requirement.
Do not force all of the business logic to be uniform in the core. If a specific application needs a different implementation, it is hard to make the core configurable enough.
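One hypothetical way to express that split: the core owns the shared entity and its very basic logic, and an individual application overrides only where its requirement differs.

// Core assembly: shared entity with the basic, uniform behaviour.
public class Employee
{
    public decimal AnnualSalary { get; set; }

    public virtual decimal MonthlyPay()
    {
        return AnnualSalary / 12m;
    }
}

// HR application: a different implementation for its own requirement,
// without forcing that rule into the core.
public class ContractorEmployee : Employee
{
    public decimal HourlyRate { get; set; }
    public int HoursWorked { get; set; }

    public override decimal MonthlyPay()
    {
        return HourlyRate * HoursWorked;
    }
}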

What is proper abstraction using Entity Framework in webforms

I am trying to figure out a good way to architect my solution. I know that I am going to be using the following technologies: ASP.NET WebForms and Entity Framework 4.1. My EF model is based on an existing database. I'm planning to use the EF DbContext generator to build my context and entities. And this is the point where things get a little tricky for me.
I want to have proper separation of concerns, providing for better testability and allowing me to separate my business logic from my DAL. I have three projects in my solution currently: Web, Core, and Data. I would like the dependencies to be Web -> Core <- Data, with no dependency between Web and Data at all. This requires my entities to actually exist in Core, rather than Data (where my edmx is). Currently, my thought is to move the Entities.tt file to Core and change the inputFile to point to my edmx in Data, so that my entities are generated in Core. But I'm unsure what to do with the context. It's heavily dependent on EF, and therefore I don't simply want to move it into Core. I thought about interfacing it, creating my own IEntities.Context.tt and dropping that in Core. My concern is the loss of functionality if my interface doesn't expose DbSets and DbContext.
Two thoughts I've been having on this are: 1) put a reference to System.Data.Entity in Core; 2) don't use DbSet, replace it with ICollection (or some such generic container), and wrap DbContext as just an Object in my interface.
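Roughly, what I'm picturing for option 1 is something like this (entity names are just placeholders), though it does drag a reference to System.Data.Entity into Core for IDbSet<T>:

using System.Data.Entity;

// Placeholder POCO entities (the .tt file would generate these into Core).
public class Customer { public int Id { get; set; } }
public class Order { public int Id { get; set; } }

// The context abstraction Core would own; the DbContext in Data implements it.
public interface IEntitiesContext
{
    IDbSet<Customer> Customers { get; }
    IDbSet<Order> Orders { get; }
    int SaveChanges();
}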
Any insight would be much appreciated. Thank you.
There are lots of different patterns you could use, but two come to mind immediately:
1) Add a business / service layer - this will abstract between your data layer and your presentation layer. This is the approach I take most often, using AutoMapper and dependency injection (I like Ninject) to make the monkey work easier. Your business layer would expose either its own version of your database objects (not recommended) or objects which relate to your business model (a more robust approach).
2) Use the Inversion of Control pattern - very popular at the moment, though I've yet to give it a bash in a real-life scenario. Apparently very good for TDD / mocking etc.; it basically means that your data layer has a dependency on your business layer instead of the other way around.
FYI - my "Core" or "Common" assemblies know nothing about my business or data layers; they merely provide platform-agnostic helpers and common classes. If I want to create common MVC functionality, for example, I'll create a Company.MVC.Core assembly instead.
If your solution is completely greenfield then I like to use a code-first approach in Entity Framework (forgive the shameless plug, but I've put a tutorial about this on my blog: http://www.terric.co.uk/code-first-entity-framework-and-sql-migrations/). I like the control it gives me that I can't seem to get when I generate a .edmx.
Moving on to structure, I usually separate the layers of my project into separate assemblies, Domain (and Data) and WebUI, structured with the following folders (namespaces):
Domain (business layer and data layer assembly)
- Data (contains my EF data context and the interface to the context)
- Entities (contains my POCO objects for the context)
WebUI (presentation layer assembly)
- Infrastructure (contains my dependency injection initialiser)
I never DI my entities; instead I use the concrete types in my presentation layer. However, the context I'll always DI, as I may want / have to use ADO.NET (especially for legacy apps), where my Domain layer will still use ADO.NET to read / write my POCO entities. This way, when I eventually get scope to implement an ORM in my legacy app, I can simply DI the ORM version of my Domain.
As a footnote to this, if you were following the repository pattern you could always interface your repositories and DI them. Either way, your POCOs should be specific to the solution, and the underlying data structure doesn't dramatically change often, hence I never DI them.
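As a hedged sketch of that footnote (names hypothetical): the repository interface sits in the Domain assembly, the EF-backed implementation lives alongside the context, and the DI initialiser binds one to the other.

using System.Data.Entity;
using System.Linq;

// Domain assembly: the abstraction the presentation layer depends on.
public interface IRepository<T> where T : class
{
    IQueryable<T> All();
    void Add(T entity);
}

// Data folder: EF-backed implementation, injected via the DI initialiser.
public class EfRepository<T> : IRepository<T> where T : class
{
    private readonly DbContext _context;

    public EfRepository(DbContext context)
    {
        _context = context;
    }

    public IQueryable<T> All()
    {
        return _context.Set<T>();
    }

    public void Add(T entity)
    {
        _context.Set<T>().Add(entity);
    }
}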
