Calling FluentMigrator's builder methods from inside the action I pass to Execute.WithConnection causes a null reference exception.
What I am trying to do is select some data so that I can manipulate it in C#, which is easier than manipulating it in T-SQL, and then use the result of my C# operations to update the data or insert new data (to be more specific, I need to pick one query string parameter out of a stored URL string and insert it somewhere else).
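For context, the manipulation is roughly this kind of thing (a minimal sketch; the parameter name is illustrative):

// Pull one query string parameter out of a stored URL string.
var uri = new Uri(storedUrl);
var query = System.Web.HttpUtility.ParseQueryString(uri.Query);
var campaign = query["campaign"]; // the one parameter I need to copy elsewhere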
The only way I see to select data within a migration is to use Execute.WithConnection and retrieve the data myself (FluentMigrator provides no helpers for selecting data), but if I use any FluentMigrator expression in the action I pass to Execute.WithConnection, a null reference exception is thrown.
Here is a boiled down version of my code:
[Migration(1)]
public class MyMigration : Migration
{
    public override void Up()
    {
        Execute.WithConnection(CustomDml);
    }

    public void CustomDml(IDbConnection conn, IDbTransaction tran)
    {
        var db = new NPoco.Database(conn).SetTransaction(tran); // NPoco is a micro-ORM, a fork of PetaPoco
        var records = db.Fetch<Record>("-- some sql"); // this is immediately evaluated; no reader is left open
        foreach (var r in records)
        {
            var newValue = Manipulate(r.OriginalValue);
            Insert.IntoTable("NewRecords").Row(new { OriginalValueId = r.Id, NewValue = newValue }); // <-- this line causes the exception
        }
    }

    public override void Down() {}
}
The line that calls Insert.IntoTable causes a null reference exception to be thrown from line 36 of FluentMigrator\Builders\Insert\InsertExpressionRoot.cs - it appears that the _context variable is null at that point, but I do not understand why. (When testing Create.Table, for example, it occurs on line 49 of FluentMigrator\Builders\Create\CreateExpressionRoot.cs.)
Any help would be appreciated. Perhaps there is disagreement on whether DML is appropriate in a migration, and I am open to suggestions, but this scenario has come up twice this week alone. For now I am simply performing the insert within the action using my micro-ORM rather than FluentMigrator, and that does work, but it seems like what I am trying to do should work too.
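For reference, the workaround that does work looks roughly like this (a sketch; the SQL and names are illustrative):

public void CustomDml(IDbConnection conn, IDbTransaction tran)
{
    // Perform the insert through the micro-ORM on the same connection and
    // transaction, bypassing the FluentMigrator builders entirely.
    var db = new NPoco.Database(conn).SetTransaction(tran);
    var records = db.Fetch<Record>("-- some sql");
    foreach (var r in records)
    {
        var newValue = Manipulate(r.OriginalValue);
        db.Execute("INSERT INTO NewRecords (OriginalValueId, NewValue) VALUES (@0, @1)",
                   r.Id, newValue);
    }
}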
When using the Execute.WithConnection expression, all you get is the database connection and the transaction.
Execute.WithConnection creates a PerformDBOperationExpression. When processing the expression, a processor calls its Operation property (see the SqlServerProcessor for an example), and the processor does not have a reference to the MigrationContext. But even if it did have access to the MigrationContext, by the time FluentMigrator reaches the processing stage it is already too late: you would be trying to generate expressions from inside another expression, and at the moment FluentMigrator is not built to handle that kind of nesting.
An alternative would be to make the connection string available in the migration context; see this issue: https://github.com/schambers/fluentmigrator/issues/240
Would that be a better approach?
Related
I've migrated concepts from a couple of CQRS frameworks I've seen, and I've just started facing some issues.
I have a common EntityDbContext subclass which I use in any consuming project without further extension to suit the domain of the application; instead I provide interfaces, IReadEntities and IWriteEntities, which have methods like Query() and Get() that behind the scenes call Set(), returning the DbSet and allowing the standard LINQ expressions to be chained on as with any EF query. I'm facing issues around using Include() on my IQueryables because I'm using LinqKit with AsExpandable() at the end of all my calls. This is what my context Query methods look like:
public new IQueryable<TEntity> Query<TEntity>() where TEntity : class, IEntity
{
    // Applies the record-authority filter on top of the unfiltered query
    return QueryUnfiltered<TEntity>().Where(_recordAuthority.Clause<TEntity>());
}

public IQueryable<TEntity> QueryUnfiltered<TEntity>() where TEntity : class, IEntity
{
    // AsNoTracking returns entities that are not attached to the DbContext
    return Set<TEntity>().AsNoTracking().AsExpandable();
}
A typical query handler looks like this:
public async Task<IEnumerable<GetCustomerView>> Handle(CustomersBy query, CancellationToken cancellationToken)
{
    var customers = _db.Query<Customer>();

    // Apply filters
    if (!string.IsNullOrEmpty(query.FirstName))
        customers = customers.Where(x => x.FirstName.Contains(query.FirstName));
    if (!string.IsNullOrEmpty(query.LastName))
        customers = customers.Where(x => x.LastName.Contains(query.LastName));

    // Execute the query and return the results
    var view = await customers.Select(x => new GetCustomerView
    {
        Id = x.Id,
        FirstName = x.FirstName,
        LastName = x.LastName,
        EmailAddress = x.EmailAddress
    }).ToListAsync(cancellationToken).ConfigureAwait(false) as IEnumerable<GetCustomerView>;

    return view;
}
This scenario works fine if I want to pull address details from a related table, because the Select projection happens on the database server given I'm using it prior to execution. There are scenarios, though, where it makes sense to pull an object graph back and specify Include(...) statements. As it stands, specifying _db.Query<Customer>().Include(c => c.Address) doesn't hydrate the Address navigation property. If I leave off AsExpandable(), the results come back populated.
The question is: does anyone see a way to allow the Include statements to be provided, maybe as a parameter to the method, so I can loop through them and tack them on before calling AsExpandable()? I can't quite get my head around how to do it, or whether it's possible.
Maybe there's another approach?
Interestingly, this apparently works fine in a version of this pattern a colleague uses with EF 6. He says they specify Include after AsExpandable without a problem.
This is a known issue with EF Core and LinqKit's AsExpandable (and in general with any extension library that uses a custom IQueryProvider to preprocess the query expression tree, as LinqKit does): EF Core ignores all of its own IQueryable extensions (Include / ThenInclude, AsNoTracking, etc.) if the query provider is different from (or does not inherit from) the EF Core one. EF6 has no such requirement.
That being said, there is currently no solution other than applying all the EF Core specific extensions before calling AsExpandable.
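In other words, keep the ordering below (an illustrative sketch assuming a DbContext with a Customer entity):

// EF Core-specific operators first, LinqKit last.
var customers = context.Set<Customer>()
    .AsNoTracking()              // EF Core extension
    .Include(c => c.Address)     // EF Core extension: must precede AsExpandable()
    .AsExpandable();             // LinqKit substitutes its own query provider here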
Ok this works. I created an overload:
public IQueryable<TEntity> Query<TEntity, TProperty>(IEnumerable<Expression<Func<TEntity, TProperty>>> includes) where TEntity : class, IEntity
{
    var query = Set<TEntity>().AsNoTracking();
    foreach (var expression in includes)
    {
        query = query.Include(expression);
    }
    return query.AsExpandable();
}
From my handler I create a list of include expressions and pass it to Query:
var includes = new List<Expression<Func<Customer, object>>>
{
    c => c.Address
};

var customers = _db.Query(includes);
var result = await customers.ToListAsync(cancellationToken).ConfigureAwait(false);
Executing the query now returns results with the navigation property populated.
It means I'm not fluently chaining the includes from the client code's perspective, but I don't think it's terrible.
Thoughts?
To update a record in SQL Server using Entity Framework Core, I query the record I need to update, make my changes to the object, and then call .SaveChanges(). This works nicely and cleanly.
For example:
var emp = _context.Employee.FirstOrDefault(item => item.IdEmployee == Data.IdEmployee);
emp.IdPosition = Data.IdPosition;
await _context.SaveChangesAsync();
But is there a standard method if I want to update multiple records?
My first approach was to pass a list to the controller, but then I would need to go through that list and save changes for each item; I never really finished this option as I regarded it as suboptimal.
For now, instead of passing a list to the controller, I pass each object to the controller in a loop (which amounts to much the same thing):
for (int i = 0; i < ObjectList.Count; i++)
{
    /* Some code */
    var httpResponseObject = await MyRepositories.Post<Object>(url + "/Controller", ObjectList[i]);
}
And then the controller does the same thing as before, when updating only one record, once for each record...
I don't feel this is the best possible approach, but I haven't found another way yet.
What would be the optimal way of doing this?
Your question has nothing to do with Blazor... However, I'm not sure I understand what the issue is. When you call the SaveChangesAsync method, all pending changes in your context are committed to the database. You don't have to pass one object at a time... you can pass a list of objects.
Hope this helps...
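To illustrate the point, a minimal sketch (employeeUpdates is an assumed list of id/position pairs received in a single call):

foreach (var update in employeeUpdates)
{
    var emp = await _context.Employee
        .FirstOrDefaultAsync(e => e.IdEmployee == update.IdEmployee);
    if (emp != null)
        emp.IdPosition = update.IdPosition;
}
await _context.SaveChangesAsync(); // a single call commits every tracked change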
Updating records in bulk with Entity Framework or other object-relational mapping (ORM) libraries is a common challenge, because they run an UPDATE command for every record. You could try Entity Framework Plus, an extension that does bulk updates.
If updating multiple records with a single call is critical for you, I would recommend writing a stored procedure and calling it from your service. Entity Framework can also run direct queries and stored procedures.
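For example, a hedged sketch of calling a stored procedure from EF Core (the procedure name and parameters are assumptions):

// Assumes a procedure dbo.UpdateEmployeePositions exists on the database side.
await _context.Database.ExecuteSqlInterpolatedAsync(
    $"EXEC dbo.UpdateEmployeePositions @IdEmployee = {employeeId}, @IdPosition = {positionId}");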
It looks like the user makes some changes and then a save action needs to persist multiple records at the same time. You could trigger multiple AJAX calls, or just one if that suits you.
What I would do is create an endpoint (an API controller and an action) specific to your needs. For example, to update the position of records in a table:
Controller:
/DataOrder
Action:
[HttpPut]
public async Task Update([FromBody] DataChanges changes)
{
    foreach (var change in changes.Items)
    {
        var dbRecord = await _context.Employees.FindAsync(change.RecordId);
        if (dbRecord != null)
            dbRecord.IdPosition = change.Position;
    }
    await _context.SaveChangesAsync();
}

public class DataChanges
{
    public List<DataChange> Items { get; set; }

    public DataChanges()
    {
        Items = new List<DataChange>();
    }
}

public class DataChange
{
    public int RecordId { get; set; }
    public int Position { get; set; }
}
The foreach statement will still execute an UPDATE for every record. If you want a single database call, you can write the SQL yourself or have a stored procedure in the database and pass the data as a table-valued parameter (a DataTable) instead.
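For completeness, a sketch of the single-call variant with a table-valued parameter (it assumes a user-defined table type dbo.PositionChangeType and a stored procedure dbo.UpdatePositions exist in the database):

// Build a DataTable matching the assumed table type dbo.PositionChangeType.
var table = new DataTable();
table.Columns.Add("RecordId", typeof(int));
table.Columns.Add("Position", typeof(int));
foreach (var change in changes.Items)
    table.Rows.Add(change.RecordId, change.Position);

using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("dbo.UpdatePositions", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;
    var p = cmd.Parameters.AddWithValue("@Changes", table);
    p.SqlDbType = SqlDbType.Structured;
    p.TypeName = "dbo.PositionChangeType";
    await conn.OpenAsync();
    await cmd.ExecuteNonQueryAsync(); // one round trip for all the updates
}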
I've read multiple questions similar to this one, but none exactly matches my situation.
Using LINQ to SQL, I insert a new record and submit changes. Then, in the same web request, I pull that same record, update it, and submit changes again. The changes are not saved. The DataContext is the same across both of these operations.
Insert:
var transaction = _factory.CreateTransaction(siteId, userId, questionId, type, amount, transactionId, processor);

using (IUnitOfWork unitOfWork = UnitOfWork.Begin())
{
    transaction.Amount = amount;
    _transactionRepository.Add(transaction);
    unitOfWork.Commit();
}
Select and Update:
ITransaction transaction = _transactionRepository.FindById(transactionId);
if (transaction == null) throw new Exception(Constants.ErrorCannotFindTransactionWithId.FormatWith(transactionId));

using (IUnitOfWork unitOfWork = UnitOfWork.Begin())
{
    transaction.CrmId = crmId;
    transaction.UpdatedAt = SystemTime.Now();
    unitOfWork.Commit();
}
Here's the unit of work code:
public virtual void Commit()
{
    if (_isDisposed)
    {
        throw new ObjectDisposedException(GetType().Name);
    }
    _database.SubmitChanges();
}
I even went into the designer.cs file and put a breakpoint on the setter of the field that is being set but not updated. I stepped through, and it entered and executed the setter code, so the entity should be getting "notified" of the change to this field:
public string CrmId
{
    get
    {
        return this._CrmId;
    }
    set
    {
        if ((this._CrmId != value))
        {
            this.OnCrmIdChanging(value);
            this.SendPropertyChanging();
            this._CrmId = value;
            this.SendPropertyChanged("CrmId");
            this.OnCrmIdChanged();
        }
    }
}
Other useful information:
ObjectTracking is enabled
No errors or exceptions when the second SubmitChanges is called (the update just silently fails)
SQL profiler shows the insert and the select but not the subsequent update statement; LINQ to SQL is not generating the update statement.
There is only one database and one connection string, so the update is not going to another database.
The table has a primary key.
I don't know what would cause LINQ to SQL to not issue the update command and not raise some kind of error. Perhaps the problem stems from using the same DataContext instance? I've even refreshed the object from the database using the DataContext.Refresh method before it is pulled for the update, but that didn't help.
I have found what is likely to be the root cause. I am using Unity. The initial insert is being performed in a service class with a PerWebRequest lifetime. The select and update happen in a class with a Singleton lifetime. So my assumption that the DataContext instances were the same was incorrect: the record was inserted and tracked by one DataContext, while SubmitChanges was being called on a different one that knew nothing about the change.
So, in my class with the Singleton lifetime, I now get a fresh instance of the database repository, perform the update there, and have no problem.
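Roughly, the working version looks like this (a sketch; the container field and repository interface are illustrative):

// Resolve a fresh repository (and with it a fresh DataContext) for the update,
// instead of reusing one captured long ago by the singleton-lifetime class.
var repository = _container.Resolve<ITransactionRepository>();
ITransaction transaction = repository.FindById(transactionId);
using (IUnitOfWork unitOfWork = UnitOfWork.Begin())
{
    transaction.CrmId = crmId;
    transaction.UpdatedAt = SystemTime.Now();
    unitOfWork.Commit();
}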
I still don't know why the original code didn't work, and my approach could be considered more a workaround than a solution, but it did solve my problem and hopefully will be useful to others.
We are evaluating GridGain 6.5.5 at the moment as a potential solution for distributing compute jobs over a grid.
The problem we are facing is the lack of a suitable asynchronous notification mechanism that will notify the sender upon job completion (or future completion).
The prototype architecture is relatively simple, and the core issue is presented in the pseudo code below (the full code cannot be published due to an NDA). Important: the code represents only the "problem"; the possible solution in question is described in the text at the bottom, together with the question.
// will be used as an entry point to the grid for each client that will submit jobs to the grid
public class GridClient {
    // client node for submission that will be reused
    private static Grid gNode = GridGain.start("config xml file goes here");

    // provides the functionality of submitting multiple jobs to the grid for calculation
    public int sendJobs2Grid(GridJob[] jobs) {
        Collection<GridCallable<GridJobOutput>> calls = new ArrayList<>();
        for (final GridJob job : jobs) {
            calls.add(new GridCallable<GridJobOutput>() {
                @Override public GridJobOutput call() throws Exception {
                    GridJobOutput result = job.process();
                    return result;
                }
            });
        }

        GridFuture<Collection<GridJobOutput>> fut = this.gNode.compute().call(calls);
        fut.listenAsync(new GridInClosure<GridFuture<Collection<GridJobOutput>>>() {
            @Override public void apply(GridFuture<Collection<GridJobOutput>> jobsOutputCollection) {
                Collection<GridJobOutput> jobsOutput;
                try {
                    jobsOutput = jobsOutputCollection.get();
                    for (GridJobOutput currResult : jobsOutput) {
                        // do something with the current job output BUT CANNOT call the
                        // jobFinished(GridJobOutput out) method of the GridClient class here
                    }
                } catch (GridException e) {
                    e.printStackTrace();
                }
            }
        });
        return calls.size();
    }

    // This function should be invoked asynchronously when the GridFuture completes;
    // it will invoke some processing/aggregation of the result for each submitted job.
    public void jobFinished(GridJobOutput out) {}
}

// represents a job type that is to be submitted to the grid
public class GridJob {
    public GridJobOutput process() {}
}
Description:
The idea is that a GridClient instance will be used to submit a list/array of jobs to the grid, tell the sender how many jobs were submitted, and, when the jobs finish (asynchronously), perform some processing of the results. For the results-processing part, the GridClient.jobFinished(GridJobOutput out) method should be invoked.
Now, getting to the question at hand: we are aware of the GridInClosure interface that can be used with GridFuture.listenAsync(GridInClosure<GridFuture<R>> lsnr) in order to register a future listener.
The problem (if my understanding is correct) is that this is a good and pretty straightforward solution when the result of the future is to be processed by code within the scope of the given GridInClosure. In our case we need to use GridClient.jobFinished(GridJobOutput out), which is out of that scope.
Because GridInClosure has a single argument R, which has to be of the same type as the GridFuture result, it seems impossible to use this approach in a straightforward manner.
If I've got it right so far, then in order to use the GridFuture.listenAsync(..) approach the following has to be done:
GridClient will have to implement an interface granting access to the jobFinished(..) method; let's name it GridJobFinishedListener.
GridJob will have to be wrapped in a new class in order to have an additional property of type GridJobFinishedListener.
GridJobOutput will have to be wrapped in a new class in order to have an additional property of type GridJobFinishedListener.
When the GridJob is done, in addition to the "standard" result, GridJobOutput will contain the corresponding GridJobFinishedListener reference.
Given the above modifications, GridInClosure can now be used, and in the apply(GridJobOutput) method it will be possible to call the GridClient.jobFinished(GridJobOutput out) method through the GridJobFinishedListener interface.
So if I've got it all right, this seems a rather clumsy workaround, so I hope I have missed something and there is a much better way to handle this relatively simple case of an asynchronous callback.
Looking forward to any helpful feedback. Thanks a lot in advance.
Your code looks correct, and I don't see any problem with calling the jobFinished method from the future listener closure. You declared the listener as an anonymous class, which always has a reference to the enclosing class (GridClient in your case), so you have access to all variables and methods of the GridClient instance.
So, there is a wealth of Flex articles online about how to handle a .NET WebMethod that returns a DataSet or DataTable. Here is an example:
Handling web service results that contain .NET DataSets or DataTables
So, I know how to use result.Tables.<tablename>.Rows and the like. But what I cannot seem to figure out or find online is how to go the other direction: a method to pass objects or tables back to the .NET web service from Flex, without stooping to passing XML as a string, or making huge web service methods that have one parameter for each property/column of the object being stored. Surely others, smarter than I, have tackled this issue.
I am using ASP.NET 2.0 Typed DataSets, and it would be really nice if I could just pass one object or array of objects from Flex to the web service, populate my Typed DataTable, and do an Update() through the corresponding typed TableAdapter. My dream would be a [WebMethod] something like one of these:
public void SaveObject(TypedDataTable objToSave) { ... }
public void SaveObject(TypedDataSet objToSave) { ... }
I've had the typed DataTables saving to the database; I know how to do that part and even a few tricks. But we had XML being passed back and forth as a string - eww. I'm trying to get to a more object-based approach.
The best object-based approach is AMF. I assume it's probably a bit late in your development cycle to change your integration layer, but otherwise I don't know of a way to get around marshalling your object(s) back into XML or separating them out into their primitive components.
For .NET implementations of AMF check out:
FluorineFx (FOSS)
WebORB for .NET
It's amazing how easy things become once AMF is used. For example, using the Mate MVC framework, an AMF call passing a complex object to the server looks something like this:
<mate:RemoteObjectInvoker instance="yourWebservice" method="saveComplexObject" showBusyCursor="true">
    <mate:resultHandlers>
        <mate:CallBack method="saveComplexObjectSuccess" arguments="{[resultObject]}" />
    </mate:resultHandlers>
    <mate:faultHandlers>
        <mate:MethodInvoker generator="{DataManager}" method="presentFault" arguments="{fault}" />
    </mate:faultHandlers>
</mate:RemoteObjectInvoker>
With result and fault handlers being optional.
The direction I ended up going was close to what I hoped was possible, but is "hack-ish" enough that I would consider SuperSaiyen's suggestion to use AMF/ORM a better solution for new/greenfield projects.
For the sake of example/discussion, let's say I am working with a Person table in a database and have a typed DataSet called PeopleDataSet, which comes with a PersonTableAdapter and a PersonDataTable.
READ looks like this in the .NET web service:
[WebMethod]
public PeopleDataSet.PersonDataTable GetAllPeople() {
    var adapter = new PersonTableAdapter();
    return adapter.GetData();
}
... which in Flex would give you a result Object that you can use like this:
// FLEX (AS3)
something.dataProvider = result.Tables.Person.Rows;
Check out the link I put in the question for more details on how Flex handles that.
CREATE/UPDATE - this is the part I had to figure out, and why I asked this question. The Flex side first this time:
// FLEX (AS3)
var person:Object = {
    PersonID: -1, // -1 for CREATE, actual ID for UPDATE
    FirstName: "John",
    LastName: "Smith",
    BirthDate: "07/19/1983",
    CreationDate: "1997-07-16T19:20+01:00" // need W3C DTF for a Date WITH Time
};
_pplWebSvcInstance.SavePerson(person); // do the web method call
(For handling those W3C datetimes, see How to parse an ISO formatted date in Flex (AS3)?)
On the .NET web service side, the trick was figuring out the correct type for the web method's parameter. If you go with just Object and then step into a call with a debugger, you'll see .NET decides it is an XmlNode[]. Here is what I figured out to do:
[WebMethod]
public int SavePerson(PeopleDataSet p) {
    // Now 'p' will be a PeopleDataSet with a table called 'p' that has our data
    // row(s) (just one row, in this case) as string columns in random order.
    // It WILL NOT WORK to use PeopleDataSet.PersonDataTable as the type for the
    // parameter; that always results in an empty table. That is why the
    // LoadFlexDataTable utility method below is necessary.
    var adapter = new PersonTableAdapter();
    var tbl = new PeopleDataSet.PersonDataTable();
    tbl.LoadFlexDataTable(p.Tables[0]); // see below

    // the rest of this might be familiar territory for working with DataSets
    PeopleDataSet.PersonRow row = tbl.FirstOrDefault();
    if (row != null) {
        if (row.PersonID > 0) { // doing UPDATE
            row.AcceptChanges();
            row.SetModified();
        }
        else { // doing CREATE
            row.CreationDate = DateTime.UtcNow; // set defaults here
            row.IsDeleted = false;
        }
        adapter.Update(row); // database call
        return row.PersonID;
    }
    return -1;
}
Now, the kluge utility method you saw called above. I did it as an extension method, but that part is optional:
// for getting the un-typed DataTable Flex gives us into our typed DataTable
public static void LoadFlexDataTable(this DataTable tbl, DataTable flexDataTable)
{
    tbl.BeginLoadData();
    tbl.Load(flexDataTable.CreateDataReader(), LoadOption.OverwriteChanges);
    tbl.EndLoadData();

    // Probably a bug, but all of the ampersands (&) in string columns will be
    // unnecessarily escaped (&amp;) - kluge to fix it.
    foreach (DataRow row in tbl.Rows)
    {
        row.SetAdded(); // default to "Added" state for each row
        foreach (DataColumn col in tbl.Columns) // fix &amp; to & on string columns
        {
            if (col.DataType == typeof(String) && !row.IsNull(col))
                row[col] = (row[col] as string).Replace("&amp;", "&");
        }
    }
}