How to do updates with GraphQL mutations (Hot Chocolate) - .net-core

We recently introduced GraphQL to our project. I've never used it before, but I like it a lot.
We decided to go with the HotChocolate library for .NET Core and Apollo on the client side.
One thing I am not quite sure about is mutations, specifically performing updates and partial updates.
I read somewhere that the best practice is to stick with creating a specific mutation for each update, for instance updateUsername(), updateAddress(), updateCity() should each have their own mutation.
The issue with that is that my codebase will grow enormously if I go in that direction, as we are very much data driven, with a lot of tables and a lot of columns per table.
Another question is how to handle nullable properties: I can create a mutation which accepts some input object, but I'll end up with my entity being overwritten, and all nullable properties not provided on the calling end being set to null.
Is there a way to handle partial updates, or should I go with a specific update mutation for each property I want updated?

I think you understood the best practice around specific mutations wrong. It's less "have one mutation to update one field" and more "have specific mutations that encapsulate actions in your domain". A concrete example would be creating an "addItemToBasket" mutation, instead of having 3 mutations that update the individual tables related to your shopping basket, etc.
GraphQL is very much centered around front-end development, so your mutations should, in general, closely resemble actions a user can perform in your front-end. E.g. your front-end has an "Add to basket" button and your backend has an "addItemToBasket" mutation that resembles the action of placing an item in the user's basket.
If you design your mutations with this in mind, most of the time you shouldn't have an issue with partial updates, since the mutation knows exactly what to do and you are not just letting the user of your schema update fields at will.
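For illustration, a domain-centric mutation in Hot Chocolate could look roughly like the sketch below; the Basket type, IBasketService and the exact names are made up for the example, not taken from your project.

using System;
using System.Threading.Tasks;
using HotChocolate;

// Hypothetical domain types for the sketch.
public record Basket(Guid Id);

public interface IBasketService
{
    Task<Basket> AddItemAsync(Guid basketId, Guid productId, int quantity);
}

// Hot Chocolate turns this into an input type on the schema side.
public record AddItemToBasketInput(Guid BasketId, Guid ProductId, int Quantity);

public class Mutation
{
    // Exposed roughly as: addItemToBasket(input: AddItemToBasketInput!): Basket!
    // One mutation per domain action, not one mutation per column.
    public Task<Basket> AddItemToBasketAsync(
        AddItemToBasketInput input,
        [Service] IBasketService basketService)
        => basketService.AddItemAsync(input.BasketId, input.ProductId, input.Quantity);
}

Registered via something like services.AddGraphQLServer().AddMutationType<Mutation>(), the schema then offers a single addItemToBasket field instead of a pile of per-column update mutations.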
If for whatever reason you need these partial updates, you likely won't get around implementing the patching yourself, unless your data provider supports it. Meaning you will have to have an input object type with nullable properties, and a mutation that decides which fields have been changed and updates them using your data provider.
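If you do end up needing a generic partial update, a minimal sketch of that manual patching could look like this, assuming an input type with nullable properties and a hypothetical repository (Customer, ICustomerRepository and the field names are invented for the example):

using System.Threading.Tasks;
using HotChocolate;

// Hypothetical entity and repository for the sketch.
public class Customer
{
    public int Id { get; set; }
    public string? Username { get; set; }
    public string? Address { get; set; }
    public string? City { get; set; }
}

public interface ICustomerRepository
{
    Task<Customer> GetByIdAsync(int id);
    Task SaveAsync(Customer customer);
}

// Every property except the key is nullable, i.e. optional on the schema side.
public class UpdateCustomerInput
{
    public int Id { get; set; }
    public string? Username { get; set; }
    public string? Address { get; set; }
    public string? City { get; set; }
}

public class Mutation
{
    public async Task<Customer> UpdateCustomerAsync(
        UpdateCustomerInput input,
        [Service] ICustomerRepository customers)
    {
        var customer = await customers.GetByIdAsync(input.Id);

        // Only overwrite the fields the caller actually supplied. Note that with
        // plain nullable properties you cannot distinguish "omitted" from
        // "explicitly set to null"; that is exactly what the patching proposal
        // linked below is meant to improve.
        if (input.Username is not null) customer.Username = input.Username;
        if (input.Address is not null) customer.Address = input.Address;
        if (input.City is not null) customer.City = input.City;

        await customers.SaveAsync(customer);
        return customer;
    }
}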
That being said, there's also a proposal for patching types in the Hot Chocolate repository that should simplify the patching part: https://github.com/ChilliCream/hotchocolate/issues/1326

for instance updateUsername(), updateAddress(), updateCity() should each have their own mutation.
The issue with that is that my codebase will grow enormously if I go in that direction, as we are very much data driven, with a lot of tables and a lot of columns per table.
Correct. That approach is practically impossible to follow for any reasonably large data-driven application.
Consider how we implement patching in our API here. Also consider following the discussion about the patching feature in the Hot Chocolate GitHub thread. Hope that helps!

Related

Manipulating the JSON response of an API call, composed of various entities

I am creating an API that will provide data to a CMS, which will handle managing all orders for my e-commerce in Symfony 6.
I have primarily 7 entities: Address, Carrier, Customer, OrderDetail, OrderProducts, Orders, Product.
Each entity has its own endpoint, but I would like to create an endpoint that provides data from all the entities in a single API call (in some cases, I will need all properties of the entity, and in others, only some properties).
The goal of this single API call is to provide data to the CMS dashboard.
As shown in the picture (a dashboard screenshot in the original post), various pieces of data are provided by the different entities.
What is the best way to manipulate the JSON response inside the controller? Should I do everything through the Orders repository (with all the joins between the entities' tables)? Use DTOs? What is the best approach?
There is a lot of data, as the default call will provide data for the orders of the last 6 months.
It really depends on the modification you're doing. But I'd say this:
If there are filters applied to the resulting values after modification: try to do it in the query itself; it will make the filtering easier.
If the modifications are minor, doing them in the query might save you a tiny bit of performance (basically nothing) while keeping the query readable (or not much worse).
If the modifications are not minor (even just "medium"), do it in PHP. It will be MUCH easier to update and to understand. Where exactly is kind of a matter of preference. I think some people would say "if in doubt, do it in a service". But not in the controller.
Some people think that it's a bad thing to operate on entities, and that DTOs are always the way to go, because you control more precisely what's in them and nothing is unpredictable. However, keeping both entities and DTOs makes the code more complex. Personally, I don't do it. But you can.
Understand that what you're asking is pretty subjective.

Best practice for managing / controlling object state with 2 way databinding using Polymer

Let's try this explanation again...
I'm new to Polymer (and getting back into web dev after a relatively long absence), and I'm wondering what the recommended approach might be to more closely manage object state while employing two-way databinding. I am currently consuming REST API (JSON) objects. My question is whether Polymer keeps a copy of the original object before initiating updates to the bound object's properties/attributes, so one might be able to easily undo the changes. While allowing two-way databinding to work its magic is often desired, there are cases where I'd like to prevent/delay changes to the object/DOM until the user approves the changes (say via the paper-dialog component, for instance). I suppose one could make a temporary copy of the object and bind fields to that version, and then only persist the changes back to the source object upon user approval. In any case, I'd be interested to hear thoughts and see an example or two of recommended approaches (especially if I am off-track with my ideas!)
I suppose one could make a temporary copy of the object and bind
fields to that version, and then only persist the changes back to the
source object upon user approval
This.
Consider that view-models are essentially different from pure data-models (sometimes called business-data). Frequently, the differences are irrelevant and one can use them interchangeably. However, be aware of scenarios where the view-model is distinct (uncommitted user edits are a good example).
The notion of a field editor that requires approval from the user is purely UI/View oriented. Whatever data is managed in that modality is purely in the domain of the view, and fetches/commits to the business-data should be discrete.

How can I implement additional entity properties for Entity Framework?

We have a requirement to allow customising our core product and adding additional fields on a per-client basis, e.g. on the People entity some client wants to record their favourite colour, etc. As far as I know we can't add properties to EF at runtime, as it needs classes defined at startup. Each customer has their own database, but we are deploying the same solution to all customers with all additional code. We then detect which customer it is and run customer-specific services, etc.
Now the last thing I want is to be forking my project, or alternatively adding all fields for all clients. That seems likely to become a nightmare. Also, more often than not the extra fields would only be required in a very limited number of places. Maybe some reports, a couple of screens, etc.
I found this article from Jeremy Miller http://codebetter.com/jeremymiller/2010/02/16/our-extension-properties-story/ describing how they are adding extension properties and having them go from the domain to the web front end.
Has anyone else implemented anything similar using EF? How did it work out? Are there any blogs/samples that anyone has seen? I am not sure if I am searching for the right thing; even if someone could tell me the generic name for what we want to do, that would help. I'm guessing it is a problem that comes up for other people.
The linked question still requires some forking, or implementing all possible extensions in a single solution, because you are still creating strongly typed extensions upfront (= you know upfront what extensions the customer wants). It is not a generally extensible solution. If you want a generic, extensible solution you must leave the strongly typed world and describe extensions as data.
You will need to use some metamodel. Your entity classes will contain only the properties used by all customers, plus a navigation property to a special extension entity (an additional table for every extensible entity) where you will be able to put additional properties as name/value pairs (you can add other columns like type, validation, etc. if needed).
This will in general move part of your model from a hardcoded scenario to a configuration-based scenario, and your customers will even be allowed to define extensions at runtime (if you implement such a feature).
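A rough sketch of that metamodel in EF terms, with all names invented for the illustration:

using System.Collections.Generic;

// Core entity: only the columns every customer shares, plus a navigation
// property to a generic name/value extension table.
public class Person
{
    public int Id { get; set; }
    public string Name { get; set; }

    public virtual ICollection<PersonExtension> Extensions { get; set; }
        = new List<PersonExtension>();
}

// One row per custom field value, e.g. Name = "FavouriteColour", Value = "Green".
// Add Type/Validation columns here if you need richer metadata.
public class PersonExtension
{
    public int Id { get; set; }
    public int PersonId { get; set; }
    public string Name { get; set; }
    public string Value { get; set; }
}

Reading a custom field then becomes a lookup like person.Extensions.FirstOrDefault(e => e.Name == "FavouriteColour")?.Value, and the set of allowed extension names per customer can live in configuration rather than in code.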

Dynamic form creation in asp.net c#

So, I need some input on refactoring an ASP.NET (C#) application that is basically a framework for creating dynamic forms (any forms). From a high-level point of view, there is a table that has the forms, and then there is a table that has all the form fields, where it is one-to-many between the two. There is a validation table, where each field can have multiple types of validation, and it is one-to-many from the form fields table to the validation table.
So the issue is that this application has been sold as the be-all-end-all customizable solution to all the clients. The idea is that whatever form they want, we can build it just using DB configuration. The thing is, that is not always possible, because there are complex relationships between the fields, and complex relationships between the forms themselves. Also, there is only one codebase, and this is for multiple clients - all of whom host it on their own. There is very specific logic for each of the clients, and it is ALL in the same codebase, with no real separation. Sometimes it was too difficult to make it generic, so there are instances where it has hard-coded logic (as in if formID = XXX then do _). You can also have nested forms, as in, one set of fields on its own within each form.
So usually, when one client requests a change, we make the change and deploy it to that client - but then another client requests a different change, and we make the change and deploy it for THAT client, but the change from the earlier client breaks it, and it's a headache trying to debug, because EVERYTHING is dynamic. There is no way we can roll back the earlier change, because then the other client would be screwed.
It's not done in a real 3-tier architecture - it's a web site with references to a DB class and a class library. There is business logic in the web site itself, in the class library, and in the database stored procs (validation is done in the stored procs).
I've been put in charge of re-organizing the whole thing, and these are my thoughts/questions:
I think this is a bad model in general, because one of the things I heard one of the developers say is that anytime any client makes a change, we should deploy to everybody - but that is not realistic. If we have, say, 20 clients, there will need to be regression testing on EVERYTHING, since we don't know the impact...
There are about 100 forms in total, and there is some similarity between them (not much). But I think the idea that a dynamic engine can solve ALL form requests was not realistic either. Clients come up with the weirdest requests. For example, they have this engine doing a regular data entry form AND a search form.
There is a lot of preserving state between pages, and it is all done using session variables, which is OK, except that it is not really tracked, and so sessions from the same user keep getting overwritten; I think the session state should be gotten rid of.
Should I really just rewrite the whole thing? This app is about 3 years old, there has been lots of testing done, and serious business logic is implemented, so I hate to throw all that away (Joel's advice). But it's really a mess of spaghetti code, everything takes forever to do, and things break all the time because of minor changes.
I've been reading Martin Fowler's "Refactoring" and Michael Feathers' "Working Effectively with Legacy Code" - and they are good, but I feel they were written for an application that was 'slightly' better architected, where it is still a 3-tiered architecture and there is 'some' resemblance of logic.
Thoughts/input anyone?
Oh, and "Help!"
My current project sounds like almost exactly the same product you're describing. Fortunately, I learned most of my hardest lessons on a former product, and so I was able to start my current project with a clean slate. You should probably read through my answer to this question, which describes my experiences, and the lessons I learned.
The main thing to focus on is the idea that you are building a product. If you can't find a way to implement a particular feature using your current product feature set, you need to spend some additional time thinking about how you could turn this custom one-off feature into a configurable feature that can benefit all (or at least many) of your clients.
So:
If you're referring to the model of being able to create a fully customizable form that makes client-specific code almost unnecessary, that model is perfectly valid and I have a maintainable working product with real, paying clients that can prove it. Regression testing is performed on specific features and configuration combinations, rather than a specific client implementation. The key pieces that make this possible are:
An administrative interface that is effective at disallowing problematic combinations of configuration options.
A rules engine that allows certain actions in the system to invoke customizable triggers and cause other actions to happen.
An integration framework that allows data to be pulled from a variety of sources and pushed to a variety of targets in a configurable manner.
The option to inject custom code as a plugin when absolutely necessary.
Yes, clients come up with weird requests. It's usually worthwhile to suggest alternative solutions that will still solve the client's problem while still allowing your product to be robust and configurable for other clients. Sometimes you just have to push back. Other times you'll have to do what they say, but use wise architectural practices to minimize the impact this could have on other client code.
Minimize use of the session to track state. Each page should have enough information on it to track the current page's state. Information that needs to persist even if the user clicks "Back" and starts doing something else should be stored in a database. I have found it useful, however, to keep a sort of breadcrumb tree on the session, to track how users got to a specific place and where to take them back to when they finish. But the ID of the node they're actually on currently needs to be persisted on a page-by-page basis, and sent back with each request, so weird things don't happen when the user is browsing to different pages in different tabs.
Use incremental refactoring. You may end up re-writing the whole thing twice by the time you're done, or you may never really "finish" the refactoring. But in the meantime, everything will still work, and you'll have new features every so often. As a rule, rewriting the whole thing will take you several times as long as you think it will, so don't try to take the whole thing in a single bite.
I have a number of similar apps for building dynamic forms that I support.
There's a whole lot of things you could/could not do & you're right to think hard before throwing away 3 years of testing/development.
My input for you to consider is to implement a plug-in architecture on top of what you've got. Any custom code for a form goes in the plug-in, and the name of this plug-in is stored with the form. When you generate a form, the correct plug-in is called to enhance the base functionality. That way you get to move all the custom code out of the existing library. It should also mean fewer breaking changes, since each plug-in only affects the form it's attached to.
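A skeleton of that plug-in contract might look like the following; IFormPlugin, FormContext and the loader are just a sketch of the idea, not code from the existing app:

using System;

// Placeholder for whatever context object the engine already passes around.
public class FormContext { }

// Custom per-client behaviour lives behind this interface instead of inside
// the core engine.
public interface IFormPlugin
{
    // Called after the base engine has generated the form, so the plug-in can
    // tweak fields, add validation, wire up events, etc.
    void EnhanceForm(FormContext context);
}

// Default used by forms that have no plug-in configured.
public class NoOpFormPlugin : IFormPlugin
{
    public void EnhanceForm(FormContext context) { }
}

public static class FormPluginLoader
{
    // The plug-in type name is stored with the form definition in the DB and
    // resolved via reflection at runtime.
    public static IFormPlugin Load(string pluginTypeName)
    {
        if (string.IsNullOrEmpty(pluginTypeName))
            return new NoOpFormPlugin();

        var type = Type.GetType(pluginTypeName, throwOnError: true);
        return (IFormPlugin)Activator.CreateInstance(type);
    }
}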
From that point it'll be easy to refactor the core engine as it's common functionality across all clients & forms.
Since your application seems to have become a big ball of mud, a complete (or almost complete) rewrite might make sense.
You should also take into account newer technologies like document-oriented databases (CouchDB, MongoDB).
Most of the form definitions could probably fit pretty well in a document-oriented database. For example:
To define a customer form, you could use a document that looks like:
{Type:"FormDefinition",
EntityType: "Customer",
Fields: [
{FieldName:"CustomerName",
FieldType:"String",
Validations:[
{ValidationType:"Required"},
{ValidationType:"StringLength", Minimum:15, Maximum:50},
]},
...
{FieldName:"CustomerType",
FieldType:"Dropdown",
PossibleValues: ["Standard", "Valued", "Gold"],
DefaultValue: ["Standard"]
Validations:[
{ValidationType:"Required"},
{
ValidationType:"Custom",
ValidationClass:"MySystem.CustomerName.CustomValidations.CustomerStatus"
}
]},
...
]
};
With this kind of document to define your forms, you could easily add forms and validations which are customer specific.
You could easily add subforms using a FieldType of SubForm or whatever.
You could define FieldTypes for all common types of fields like e-mail, phone numbers, address, etc.
using System.Collections.Generic;

// Custom validator referenced by name from the form definition above.
// IValidator, FormContext and ValidationError are assumed to be provided by the form engine.
namespace MySystem.CustomerName.CustomValidations {
    public class CustomerStatus : IValidator {
        private readonly FormContext form;
        private readonly List<ValidationError> validationErrors;

        public CustomerStatus(FormContext fc) {
            this.validationErrors = new List<ValidationError>();
            this.form = fc;
        }

        public List<ValidationError> Validate() {
            if (this.form.Fields["CustomerType"] == "Gold" && int.Parse(this.form.Fields["OrderCount"]) < 10) {
                this.validationErrors.Add(new ValidationError("A gold customer must have at least 10 orders"));
            }
            if (this.form.Fields["CustomerType"] == "Valued" && int.Parse(this.form.Fields["OrderCount"]) < 5) {
                this.validationErrors.Add(new ValidationError("A valued customer must have at least 5 orders"));
            }
            return this.validationErrors;
        }
    }
}
A record of a document with that definition could look like this:
{Type:"Record",
EntityType: "Customer",
Fields: [
{FieldName:"CustomerName", Value:"ABC Corp.",
{FieldName:"CustomerType", Value:"Gold",
...
]
};
Sure, this solution is a lot of work, but if/when realized it could be really easy to create/update/customize forms.
This is a common but (IMO) somewhat naive design approach: "Instead of solving the customer's problem, let's build a tool to let them solve their own problems!" But the reality is that, generally, customers want YOU to solve their ACTUAL problems. So build things that solve their problems.
If you can architect it in a way that allows you to reuse some parts for different customers, fine. But that is generally what the frameworks have done for you already - work out the common features that applications need and make them available in neat packages.

EF4 ASP.NET - Managing Entity Edits between HTTP Posts and Rollback

I am struggling with the following use-case:
User amends an existing order. The order is complex - lots of related 'entities' (addresses, post options, suppliers, makes, models, various items, etc.) - edited across multiple HTTP posts.
User wants to discard the changes.
--
I have an order entity, and as the user is editing it I am making various changes to the entity associations, e.g. changing order.address, order.items.add(item)...
In a single post this is fine, but across posts I don't know how best to store state. If I store the entities then I cannot save the changes, as they are across different data contexts. I have read that it is bad practice to store the data context in session state, i.e. a long-lived context. I can't save changes after each edit/post because I cannot roll back (?). I really would like to work with the entities during the editing process rather than do one big save at the end (taking UI settings and applying these in one chunk).
This must be a pretty common problem - it's driving me mad. Any help really appreciated.
Cheers!
We have a similar problem where we are building a complex business object through a multi-page wizard.
Instead of creating a partially complete business object at each step of the wizard, we create a dedicated wizard object that looks pretty similar to the business object and populate that through the wizard. At each step in the wizard, the wizard object is saved into the database. At the end the user can accept it and it is converted to a real business object, which then becomes visible to everyone else; or they can bin it and no one else ever knows it existed.
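As a rough sketch of that idea (the entity names are invented for the example):

using System;

// "Draft" entity: mirrors the real order closely enough to be filled in step
// by step, but lives in its own table and is invisible to the rest of the
// system until the user accepts it.
public class OrderDraft
{
    public Guid Id { get; set; }
    public int WizardStep { get; set; }           // how far the user has got
    public string ShippingAddress { get; set; }
    public string ItemsJson { get; set; }         // or child draft rows, if preferred

    // Called when the user clicks "Accept" on the final step.
    public Order ToOrder()
    {
        return new Order
        {
            Id = Guid.NewGuid(),
            ShippingAddress = this.ShippingAddress
            // ...map the rest of the draft onto the real business object
        };
    }
}

public class Order
{
    public Guid Id { get; set; }
    public string ShippingAddress { get; set; }
}

If the user discards the wizard, you simply delete the OrderDraft row; the real Order is never touched and nothing needs rolling back.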
If this kind of approach was not suitable, I suspect you're looking at some kind of difference tracking, either at the entity or database levels. Neither are simple to implement, work with or manage in a system. The former would be some kind of calculation and storage of n changes to the entities and developing an algorithm to undo them, the latter depends on your RDBMS, but might include versioned rows or similar.
Yes, it's pretty common for us. In most scenarios we use the MVC approach. Even without actual ASP.NET MVC projects, we use a similar ViewModel with our Views/Pages/Scenarios, etc., where there is no direct/single entity mapping to the Business Layer (in other words, Business.Entities). This is pretty much similar to DTOs.
It is always easy to use disconnected EF. We retrieve data and discard the context, then transform the entities into ViewModels/DTOs if necessary. When you need to persist changes, all you have to do is create a new context, find the latest entity instance, and apply the changes.
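A minimal sketch of that disconnected pattern, with the context, entity and ViewModel names assumed for the example:

using System;
using System.Data.Entity;   // DbContext API
using System.Linq;

// Hypothetical entity, ViewModel and context.
public class Order
{
    public Guid Id { get; set; }
    public string ShippingAddress { get; set; }
}

public class OrderViewModel
{
    public Guid Id { get; set; }
    public string ShippingAddress { get; set; }
}

public class ShopContext : DbContext
{
    public DbSet<Order> Orders { get; set; }
}

public static class OrderEditor
{
    // Open a fresh, short-lived context, load the current row, copy over only
    // the values the user actually edited, and save in one unit of work.
    public static void ApplyEdits(OrderViewModel vm)
    {
        using (var db = new ShopContext())
        {
            var order = db.Orders.Single(o => o.Id == vm.Id);
            order.ShippingAddress = vm.ShippingAddress;
            // ...copy the remaining edited fields
            db.SaveChanges();   // nothing is persisted until this point
        }
    }
}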
The Views/Pages/Controllers will manage these ViewModels/DTOs. Tracking changed and deleted content can be done by introducing a HistoryList<T> (you can extend a List<T> to implement this), as sketched below.
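A HistoryList<T> along those lines can be very small; this is just one way to sketch it:

using System.Collections.Generic;

// A List<T> that remembers what was removed, so deletions can later be
// replayed against the database from a new context.
public class HistoryList<T> : List<T>
{
    private readonly List<T> _removed = new List<T>();

    public IEnumerable<T> RemovedItems
    {
        get { return _removed; }
    }

    // Note: List<T>.Remove is not virtual, so this only takes effect when the
    // collection is used through a HistoryList<T> reference.
    public new bool Remove(T item)
    {
        if (base.Remove(item))
        {
            _removed.Add(item);
            return true;
        }
        return false;
    }
}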
Once done, using a Controller/Workflow/Component you can observe the ViewModel/DTO and do the necessary changes to your Entities using a new Context to retrieve and persist.
It involves a bit of coding, and I would say it's not a perfect solution since it has its own pros and cons.
/KP

Resources