Business model with API Platform - Symfony

We use Symfony 4 + Doctrine + API Platform to deliver an HTTP API.
API Platform exposes Doctrine entities as REST resources (via annotations inside the entity classes), which I find is not a good practice, since it forces the business model (exposed by the API) to be the same as the Symfony Doctrine model.
As a result, it looks more like basic CRUD than a real application.
Am I wrong about that, or is it possible to create some kind of virtual Doctrine entities in order to keep using tools like API Platform (or even Symfony Forms)?

It's true that API Platform seems strongly opinionated towards very simple CRUD cases.
The Symfony Form component, however, will accept any object as its data. We normally use classes like MyThingType (extending AbstractType) and MyThingDTO (a data transfer object) to separate the UI from the business model and the persistence layer. It works just fine.
API Platform has some support for DTOs, but in my opinion it isn't that usable for complex scenarios. For a complex application with a good separation of concerns and a rich business model, using Symfony controllers/forms directly, or perhaps FOSRestBundle, would seem the more direct way to achieve it.
https://api-platform.com/docs/core/dto/
You can hide some of the internals of your Doctrine entities with serialization groups, but it's still not the same as real separation.

Related

Spring MVC and Web Application Architecture

My understanding of Spring MVC is that the View layer is responsible for the user interface that is populated with data, the Model represents a Map of values made available to the View Layer, and the Controller controls how and what data is passed to the Model as well as what business logic is carried out. The "business logic" can be divided into one or more other layers - generally a Service layer and/or Data Access layer. Is this correct?
I've seen other explanations for the MVC pattern where the Model is considered a layer with Entities, Data Access Objects, and Services. The View is responsible for the user interface. And the Controller is the gateway between the two. I don't think this applies to Spring MVC and other implementations of MVC for web frameworks.
Your understanding as outlined in the first paragraph is mostly correct. Where I differ slightly is in treating Model, View and Controller as separate layers of an application (you refer to a "View layer"). MVC is a pattern for implementing user interfaces, which would typically be part of the Presentation layer of an application. There are other patterns for implementing the presentation layer besides MVC, such as MVVM, MVP and PAC.
Spring MVC is built on top of the Spring framework. If you are familiar with the Spring framework you would know that it is one of many available Dependency Injection frameworks for Java. Spring MVC Controllers are regular Spring managed beans that can be discovered by the Spring DI container and can have other Spring beans injected into them.
Model objects in a Spring MVC application can be instances of any Java class, whether built-in types such as String, Long, BigInteger, etc. or user-defined classes and enumerations.
Views can again be anything meaningful to an end user: an HTML page, an XML document, a JSON document, a PDF document, an Excel spreadsheet and so on. Spring MVC does not provide any view-generating mechanism out of the box. However, it provides integration with several existing view technologies, such as regular JSPs and JSTL, templating engines such as FreeMarker, Thymeleaf and StringTemplate, reporting frameworks such as JasperReports, XML binding frameworks such as JAXB and Castor, JSON binding frameworks such as Jackson and Gson, and so on. The Spring MVC API is fairly easy to integrate with view generation technologies, so the framework can accommodate new technologies relatively easily.
Since Spring MVC is a presentation-layer framework, it does not specify, recommend or enforce how business logic should be implemented. However, it is generally a good idea to keep business logic out of the presentation layer (see the SOLID principles for details). For example, if you wish to give certain users or business partners programmatic access to your business logic, you would be better off having the business logic in a separate layer of its own which the web presentation layer invokes. You could then create another thin layer that invokes the same business logic layer and gives external users programmatic access through data-interchange mechanisms such as SOAP, REST, EDI, etc.
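To make that concrete, here is a rough sketch of an annotated Spring MVC controller that delegates to a business-layer service and hands the result to a view. AccountService, AccountSummary and the view name are made-up names, and @GetMapping assumes a reasonably recent Spring version; only the annotations and the Model/return-value contract are standard Spring MVC.

    import org.springframework.stereotype.Controller;
    import org.springframework.ui.Model;
    import org.springframework.web.bind.annotation.GetMapping;
    import org.springframework.web.bind.annotation.PathVariable;

    // Hypothetical business-layer types, assumed to live in a separate, non-web module.
    interface AccountService {
        AccountSummary summarize(long accountId);
    }

    class AccountSummary {
        final long id;
        final String owner;
        AccountSummary(long id, String owner) { this.id = id; this.owner = owner; }
    }

    @Controller
    public class AccountController {

        private final AccountService accountService; // injected by the Spring container

        public AccountController(AccountService accountService) {
            this.accountService = accountService;
        }

        @GetMapping("/accounts/{id}")
        public String summary(@PathVariable long id, Model model) {
            // The controller only orchestrates: call the business layer, expose the
            // result to the view as a model attribute, and choose the view to render.
            model.addAttribute("account", accountService.summarize(id));
            return "account/summary"; // resolved by the configured ViewResolver (JSP, Thymeleaf, ...)
        }
    }

The controller stays thin; anything beyond wiring request data to the service call belongs in the layers below it.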
MVC is the UI layer.
A model is a map of objects, representing data for your views. These objects often are JPA entities, but don't have to be. It could be a simple class representing a username and password in a login form.
Keep some logic in model classes. For example, if you want to calculate the interest rate on a loan, you could do this in a model class. For complicated logic, especially when multiple model classes are involved, use a service.
Model classes must be completely independent of views and controllers, in that they can exist without them.
A controller responds to HTTP requests. Generally it is responsible for loading the correct models and choosing the correct view, and returning this info. Controllers should be pretty dumb.
You want "fat models and skinny controllers". Keep as much logic in the model as you can.
A view is a JSP or template (like Thymeleaf or Freemarker) which can consume models. The trick is to have as little logic in a view as possible.
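A rough sketch of that split, using the loan-interest example from above; Loan, LoanController, the request mapping and the simple (non-compounding) interest formula are all made up for illustration:

    import java.math.BigDecimal;
    import java.math.RoundingMode;

    import org.springframework.stereotype.Controller;
    import org.springframework.ui.Model;
    import org.springframework.web.bind.annotation.GetMapping;
    import org.springframework.web.bind.annotation.RequestParam;

    // "Fat" model class: independent of views and controllers, carries its own logic.
    class Loan {
        private final BigDecimal principal;
        private final BigDecimal annualRate; // e.g. 0.05 for 5%

        Loan(BigDecimal principal, BigDecimal annualRate) {
            this.principal = principal;
            this.annualRate = annualRate;
        }

        // A single-entity calculation like this belongs in the model.
        BigDecimal simpleInterestFor(int years) {
            return principal.multiply(annualRate)
                            .multiply(BigDecimal.valueOf(years))
                            .setScale(2, RoundingMode.HALF_UP);
        }
    }

    // "Skinny" controller: builds the model object, asks it for the answer, picks a view.
    @Controller
    public class LoanController {

        @GetMapping("/loans/quote")
        public String quote(@RequestParam BigDecimal principal,
                            @RequestParam BigDecimal rate,
                            @RequestParam int years,
                            Model model) {
            Loan loan = new Loan(principal, rate);
            model.addAttribute("interest", loan.simpleInterestFor(years));
            return "loans/quote"; // template (JSP, Thymeleaf, ...) with as little logic as possible
        }
    }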

Does AOP violate layered architecture for enterprise apps?

The question (as stated in the title) came to me recently while I was looking at Spring MVC 3.1 with annotation support and also considering DDD for an upcoming project. In the new Spring, any POJO with its business methods can be annotated to act as a controller; all the concerns that I would have addressed within a Controller class can be expressed exclusively through annotations.
So, technically, I can take any class and wire it up to act as a controller. The Java code is free of any controller-specific code, so what remains deals with things like checking security, starting transactions, etc. Will such a class belong to the Presentation layer or the Application layer?
Taking that argument even further, we can pull things like security and transaction management out as well and express them through annotations, so the Java code that remains is that of the domain object. Does that mean we have fused the two layers together? Please clarify.
You can't take just any POJO and make it a controller. The controller's job is to get input from the browser, call services, prepare the model for the view, and return the view to dispatch to. It's still a controller. Instead of configuring it through XML and method overrides, you configure it through annotations; that's all.
The code is far from being free of controller-specific code. It still uses ModelAndView, BindingResult, etc.
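For illustration, a minimal annotated controller of that kind; OrderForm, the mapping and the view names are made up, but BindingResult and the view-name return convention are the controller-specific pieces being referred to:

    import org.springframework.stereotype.Controller;
    import org.springframework.ui.Model;
    import org.springframework.validation.BindingResult;
    import org.springframework.web.bind.annotation.ModelAttribute;
    import org.springframework.web.bind.annotation.PostMapping;

    // Simple form-backing object populated by Spring's data binding.
    class OrderForm {
        private String productCode;
        private int quantity;
        public String getProductCode() { return productCode; }
        public void setProductCode(String productCode) { this.productCode = productCode; }
        public int getQuantity() { return quantity; }
        public void setQuantity(int quantity) { this.quantity = quantity; }
    }

    @Controller
    public class OrderController {

        @PostMapping("/orders")
        public String create(@ModelAttribute OrderForm form,
                             BindingResult bindingResult, // web/controller-specific plumbing
                             Model model) {
            if (bindingResult.hasErrors()) {
                return "orders/new"; // re-render the form view on binding errors
            }
            // ... hand the bound form data to a service / application layer here ...
            model.addAttribute("confirmation", form.getProductCode());
            return "orders/confirmation";
        }
    }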
I'll approach the question's title, regarding AOP:
AOP does not violate "layered architecture", specifically because by definition it adds application-wide functionality regardless of the layer that functionality is used in. The canonical AOP example is logging: not a layer, but a piece of functionality; all layers do logging.
To sort-of tie in AOP to your question, consider transaction management, which may be handled via Spring's AOP mechanism. "Transactions" themselves are not specific to any layer, although any given app may only require transactions in only a single layer. In that case, AOP doesn't violate layered architecture because it's only being applied to a single layer.
In an application where transactions may cross layers, IMO it still doesn't violate any layering principles, because where the transactions live isn't really relevant: all that matters is that "this chunk of functionality must be transactional", even if that transaction spans several application boundaries.
In fact, I'd say that using AOP in such a case specifically preserves layers, because the TX code isn't mechanically reproduced across all those layers, and no single layer needs to wonder (a) if it's being called in a transactional context, or (b) which transactional context it's in.
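To tie the transaction example to code: with Spring, transaction management is typically applied declaratively through its AOP machinery, so no layer contains begin/commit/rollback plumbing. TransferService and AccountRepository below are made-up names; only @Service and @Transactional are standard Spring:

    import java.math.BigDecimal;

    import org.springframework.stereotype.Service;
    import org.springframework.transaction.annotation.Transactional;

    // Hypothetical DAL contract; its implementation is irrelevant to the TX concern.
    interface AccountRepository {
        void debit(long accountId, BigDecimal amount);
        void credit(long accountId, BigDecimal amount);
    }

    @Service
    public class TransferService {

        private final AccountRepository accounts;

        public TransferService(AccountRepository accounts) {
            this.accounts = accounts;
        }

        @Transactional // the cross-cutting concern, applied by an AOP proxy rather than hand-written per layer
        public void transfer(long fromId, long toId, BigDecimal amount) {
            accounts.debit(fromId, amount);
            accounts.credit(toId, amount);
            // A runtime exception anywhere in this unit of work rolls the whole thing back,
            // without this class knowing where the transaction comes from.
        }
    }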

SOA architecture for ASP.NET with Entity Framework

I am redesigning a solution's architecture to implement SOA.
After doing the design changes I have come up with:
MySolution.Client.MyProject.Web ...... //ASP.NET WebSite
MySolution.Client.MyProject.Proxy .... //Service Proxy C# Library Project *[1]
MySolution.Service ................... //Service.cs for Service.svc is here
MySolution.Service.DataContract ...... //IService.cs for Service.cs is here *[2]
MySolution.Service.HttpHost .......... //Service.svc is here
MySolution.Model ..................... //All custom data classes and EDMX model is here *[3]
MySolution.Repository ................ //Repository queries the DB using LINQ and ADO.NET queries
*[1] MySolution.Client.MyProject.Proxy:
This project contains the service proxy and the presentation classes
*[2]MySolution.Service.DataContract:
This project contains IService and Request/Response Classes
All methods in Service.cs take Request classes as input and return Response classes as output
Therefore this project is referenced by the two client projects (Web and Proxy), as it contains IService and all of the Request/Response classes required to communicate with Service.cs
*[3]MySolution.Model:
This project contains the .edmx that is the Data Model of Entity Framework and some custom classes that are used in the project.
PROBLEM:
Because I only use request/response classes to communicate between service and client, the MySolution.Service.DataContract project is only used by Service.cs and Repository.cs
Because of that, every result the Repository generates has to be mapped onto the properties of its respective Response class (which makes the original returned entity and the Response class almost identical). But I am OKAY with that...
For example:
The GetCustomer() method in Repository.cs is called by Service.cs
The GetCustomer() method in repository performs the LINQ query and returns a "Customer" object
Service.cs then maps all the properties of "Customer" object to the "CustomerResponse" object
Service.cs then returns the "CustomerResponse" to the caller.
In this case, most of the properties are repeated in both classes. If there is a solution to this, great; otherwise, I am fine with it.
However, when Repository.cs's GetCustomers() method (notice it's not GetCustomer()) is called, it returns a list of Customer objects, and mapping this for return purposes would mean a "for loop" that iterates over the collection and does the mapping... This is NOT OKAY...
Is there a better way of doing this, considering I do not want to return the "Customer" object instead of a "CustomerResponse"? First of all it violates the SOA architecture, and secondly I don't want my client projects to have any reference to the Model or Repository projects.
So is it just the mapping that you're having trouble with? If so, you could look at some open source mapping libraries like Mapper Extensions or AutoMapper that will automate the task.
If you don't like the separate mapping between entities and DTOs, expose IQueryable from your repository and use direct projections to DTOs. The disadvantage is that such a solution can't be effectively unit tested: mocking the repository in that scenario doesn't make sense, because a query against the mock is LINQ to Objects whereas a query against the real repository is LINQ to Entities (different feature sets, and the difference only shows up at runtime).
By the way, I don't see much SOA in your application; I see just a multi-tier application. It is like planting a tree in a garden and saying that you have a forest. Moreover, it sounds like you are building a CRUD interface (entities mapped almost 1:1 to DTOs). I have a bad feeling that you are investing too much effort in an architecture you don't need. If your main intention is to expose CRUD operations as services on top of a database, you can expose the entities directly, and you can use tools like WCF Data Services.
It sounds like your main point of grief is the tedious mapping of data objects to data transfer objects (DTOs). I haven't used this myself, but it seems like AutoMapper is made for doing automatic object-to-object mappings declaratively.
I would definitely stick to having your data objects separate from the data contracts in your services.
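The question and answers above are about C# and WCF, but the mapping pattern itself is language-agnostic. As a rough sketch, shown in Java purely for illustration (Customer, CustomerResponse and CustomerMapper are made-up names): once the single-object mapping exists in one place, the GetCustomers() case is just that mapping applied per element, which is the part a library like AutoMapper automates.

    import java.util.List;
    import java.util.stream.Collectors;

    class Customer {             // persistence-side entity
        long id;
        String name;
        String email;
    }

    class CustomerResponse {     // DTO returned across the service boundary
        long id;
        String name;
        String email;
    }

    final class CustomerMapper {

        // The field-by-field mapping lives in exactly one place; a mapping library
        // would generate the equivalent of this method by naming convention.
        static CustomerResponse toResponse(Customer c) {
            CustomerResponse r = new CustomerResponse();
            r.id = c.id;
            r.name = c.name;
            r.email = c.email;
            return r;
        }

        // The list case reuses the single-object mapping, so the dreaded loop
        // collapses to one line instead of being repeated per service method.
        static List<CustomerResponse> toResponses(List<Customer> customers) {
            return customers.stream()
                            .map(CustomerMapper::toResponse)
                            .collect(Collectors.toList());
        }
    }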

Can ADO.NET Data Services replace the web services I use for AJAX calls in my websites?

I used to create normal web services in my websites and call them from JavaScript to make AJAX calls.
Now I am learning about ADO.NET Data Services.
My question is:
Can ADO.NET Data Services replace my normal web services in the new sites I create?
And if yes:
Can I put these ADO.NET Data Services in a separate project (local on the same server) and just reference it from my website? I'd like to use the same services for my website's internal needs and also offer them to other websites or services, the way Twitter does, for example.
It depends on what you want to do. I suggest you read my conversation with Pablo Castro, the architect of ADO.NET Data Services:
Data Services - Lacking
Here are Pablo's words, basically:
I agree that some of these things are quite inconvenient and we're looking at fixing them (e.g. use of custom types in addition to types defined in the input model in order to produce custom result-sets). However, some others are just intrinsic to the nature of Data Services.
The Data Services framework is not a gateway to a database, and in general if you need something like that then Data Services will just get in the way. The goal of Data Services is to create a resource model out of an input data model and expose it with a RESTful interface based on the uniform interface, such that every unit of data in the underlying model ("entities") becomes an addressable resource that can be manipulated with the standard verbs.
Often the actual implementation of a RESTful interface includes more sophisticated behaviors than just doing CRUD over the data under the covers, which need to be defined in a way that doesn't break the uniform interface. That's why the Data Services server runtime has hooks for business logic and validation in the form of query/change interceptors and others. We also acknowledge that it's not always possible or practical to model absolutely everything as resources operated on with standard verbs, so we included service operations as an escape hatch.
Things like joins dilute the abstraction we're trying to create. I'm not saying that they are bad or anything (relational databases without them wouldn't be all that useful); it's just that if a given application scenario requires the full query expressiveness of a relational database at the service boundary, then you can simply exchange queries over the wire (and manage the security implications of that). For joins that can be modeled as association traversals, Data Services already has support for them.
I guess this is a long way to say that Data Services is not a solution for every problem that involves exposing data to the web. If you want a RESTful interface over a resource model that matches your underlying data model, then it usually works out well and it will save you a lot of work. If you need a custom interface or direct access to a database, then Data Services is typically not the right tool, and other framework components such as WCF's SOAP and REST support do a great job at that.

Building a Decoupled N-Tier App With Entity Framework and VB.NET

So we are building an application with a
UI Layer (web, mobile, Ajax client, etc)
Service/API Layer
Business Logic Layer
Data Access Layer
Our goal is to confine the Entity Framework dependency to the Service layer on down to the DAL. That means the Service layer will only accept and return POCOs (plain old CLR objects).
What we're currently doing is manually coding a mapping layer between the service layer and business logic layer that will convert POCOs to EF Entities and vice versa.
So in short, page has form, form has codebehind that takes form contents, stuffs them into a POCO, sends it to service layer. Service layer converts to EF Entity, sends it to Business Logic Layer which performs certain transformations to the entity, and then interacts with the DAL to get it persisted.
Yes, it's a bit tedious, but we were wondering if there's a better way.
Yes, I know someone published an EF Poco Adapter but it's all in C# and we would prefer a VB.NET solution.
Yes, switching to NHibernate is an option of last resort, as we are already deep into our development cycle.
You should be performing business logic on POCOs. The entire purpose of an ORM is that it is an implementation detail. The business logic should be implemented in the domain model and in the domain services; in a business application, "domain" means "business". The DAL should be there to take a POCO and persist it; in your case, that means mapping it to an EF entity and persisting that.
That's the theoretical / NHibernate model, at any rate.
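The question is about VB.NET and Entity Framework, but the dependency direction described in this answer is language-agnostic. A rough sketch, in Java purely for illustration (Order, OrderRepository, OrderEntity and OrmOrderRepository are made-up names): business code sees only the POCO and a repository contract, and the mapping to the ORM entity is hidden inside the DAL.

    import java.math.BigDecimal;

    class Order {                       // POCO / domain object: no ORM dependency
        long id;
        BigDecimal total;
    }

    interface OrderRepository {         // contract the business layer depends on
        void save(Order order);
    }

    class OrderEntity {                 // ORM-mapped persistence class (the EF-entity analogue)
        long id;
        BigDecimal total;
    }

    class OrmOrderRepository implements OrderRepository {

        @Override
        public void save(Order order) {
            // The POCO-to-entity mapping is an implementation detail of the DAL,
            // which is exactly where the answer above says it belongs.
            OrderEntity entity = new OrderEntity();
            entity.id = order.id;
            entity.total = order.total;
            persist(entity);
        }

        private void persist(OrderEntity entity) {
            // ... hand the entity to the ORM / persistence context here ...
        }
    }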
Can you stand having a dependency in the service layer on the EF interfaces? You could implement IPOCO.
There's even a way to do it automatically.

Resources