WriteXml - Collection was modified; enumeration operation might not execute - collections

I'm coding in Visual Studio 2019 and I get the following error message: "Collection was modified; enumeration operation might not execute".
It happens when I try to save a dataset with the method WriteXml:
myDataset.WriteXml(myPath)
Since I modify the dataset in other methods connected to events, the modification might happen during the WriteXml. To avoid this situation, I first copy the dataset and then use the second dataset for the WriteXml:
Dim myDs As DataSet = myDataset
myDs.WriteXml(myPath)
Could somebody help me? Thank you
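Note that Dim myDs As DataSet = myDataset only copies the reference, so WriteXml still enumerates the very same tables and rows the event handlers are modifying. As a hedged sketch, an actual copy would look like the lines below, though even Copy() can hit the same exception if another thread mutates the dataset while it runs, so synchronising the event handlers with the save is still the more robust fix:
' Copy() clones schema and data into an independent DataSet,
' so WriteXml no longer walks the collections the event handlers touch.
Dim myDs As DataSet = myDataset.Copy()
myDs.WriteXml(myPath)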

Related

Parallel For Each get in a DeadLock

I'm trying to loop over a DataTable with more than 100,000 rows using Parallel.ForEach. Everything works fine up to around 25,000 iterations. I don't get any error and I can see the app is still working, but it kind of blocks and nothing happens. I tried to encapsulate the loop in a Task.Factory.StartNew and I get a random abort exception at around 5,000 iterations for no apparent reason.
Dim lstExceptions As New ConcurrentQueue(Of Exception)
Dim options As New ParallelOptions
options.MaxDegreeOfParallelism = 3

Try
    Parallel.ForEach(ReservationReportDS.Tables(0).AsEnumerable(), options,
        Sub(row)
            Try
                Dim tmpRow As DataRow = CType(row, DataRow)
                Dim ReservationID As Integer = tmpRow.Field(Of Integer?)("autNoReservation")
                Dim customerID As Integer = tmpRow.Field(Of Integer?)("CustomerID")
                Dim VehiculeID As Integer = tmpRow.Field(Of Integer?)("autNoVehicule")

                Dim bill As New BillingPath()
                bill.Calculate_Billing(ReservationID, customerID, VehiculeID)
            Catch err As Exception
                lstExceptions.Enqueue(err)
            End Try
        End Sub)

    If lstExceptions.Count > 0 Then
        Throw New AggregateException(lstExceptions)
    End If
Catch errAgg As AggregateException
    For Each ex As Exception In errAgg.InnerExceptions
        Log(Log_Billing_UI, "", System.Reflection.MethodBase.GetCurrentMethod().Name & GetExceptionInfo(ex))
    Next
Catch ex As Exception
    Log(Log_Billing_UI, "", System.Reflection.MethodBase.GetCurrentMethod().Name & GetExceptionInfo(ex))
End Try
Since you have that many records, I would recommend thinking about the following approach:
Read all records into a ConcurrentQueue(Of SomeBillingInfoClass) collection first. This lets you avoid keeping the DB connection open and makes the remaining operations on the data read from the DB thread-safe.
Create a list of Tasks with the billing calculation code inside. This lets you run the tasks in parallel and pass in the ConcurrentQueue variable from #1 easily.
Keep the tasks running in a loop while at least one element remains in the ConcurrentQueue.
If you can aggregate the billing calculation results into some other class, you may do so using an additional thread-safe ConcurrentQueue(Of BillingCalcResultInfoClass) collection.
After all billings are calculated, write to the DB in a single thread and a single long transaction; this may be faster than granular writing to the DB.
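A rough sketch of steps 1 to 3, assuming a simple placeholder class for the per-row data (BillingWorkItem and its fields are illustrative names, not the asker's real types; BillingPath.Calculate_Billing is the method from the question):
Imports System.Collections.Concurrent
Imports System.Threading.Tasks

Module BillingQueueSketch

    ' Placeholder for whatever data one billing calculation needs.
    Public Class BillingWorkItem
        Public Property ReservationID As Integer
        Public Property CustomerID As Integer
        Public Property VehiculeID As Integer
    End Class

    Public Sub ProcessBillings(items As IEnumerable(Of BillingWorkItem))
        ' 1. Everything is read into a thread-safe queue up front,
        '    so the DB connection can be closed before any work starts.
        Dim queue As New ConcurrentQueue(Of BillingWorkItem)(items)
        Dim errors As New ConcurrentQueue(Of Exception)

        ' 2. A small, fixed number of worker tasks running in parallel.
        Dim workers As New List(Of Task)
        For i As Integer = 1 To 3
            workers.Add(Task.Run(
                Sub()
                    Dim item As BillingWorkItem = Nothing
                    ' 3. Keep working while at least one element remains in the queue.
                    While queue.TryDequeue(item)
                        Try
                            Dim bill As New BillingPath()
                            bill.Calculate_Billing(item.ReservationID, item.CustomerID, item.VehiculeID)
                        Catch ex As Exception
                            errors.Enqueue(ex)
                        End Try
                    End While
                End Sub))
        Next

        Task.WaitAll(workers.ToArray())
        If errors.Count > 0 Then
            Throw New AggregateException(errors)
        End If
    End Sub

End Module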
A few notes about your code: I think you may not need to throw the AggregateException manually; the .NET environment will do it for you automatically. You will only need to catch it in the task's .ContinueWith() method (sorry, I'm mostly a C# developer and use C# notation).
I used a similar approach to process millions of records and it works fine. Typically I use 3-5 tasks, but you can always experiment with how many tasks you should use.
Using ConcurrentQueue or a similar thread-safe collection makes it easier to keep your code thread-safe.
Please let me know if you have any questions.
Thank you all for your answers, and especially Anton Norko. I finally found the problem, and it was on my side: under certain conditions, Calculate_Billing was stuck in an infinite loop. Since I used 3 threads at the same time, they were getting stuck one by one.

How to optimally handle sql connections in vb.net?

ASP.NET WEB PROGRAMMING
How to optimally handle Open, Close, Dispose, and Exception when retrieving data from a database.
LOOKING TO OPTIMISE
I have always used the following to make a connection, catch any errors, and then correctly dispose of the connection in either event.
VB.NET
Try
    con.Open()
    '...
    con.Close()
Catch ex As Exception
    lbl.Text = ex.Message
Finally
    If con IsNot Nothing Then
        con.Dispose()
    End If
End Try
After reading many articles I find people practically throwing up at this code; however, I do not see any other way to accommodate the four steps required efficiently.
The alternative, and I believe more CPU-friendly, Using statement seems to be the tool of choice for the correct disposal of a SQL connection. But what happens when bad data is retrieved from the database and there is nothing in place to indicate to an end user what went wrong?
QUESTION
Between Using, Try...Catch, and other approaches: which is the fastest, cleanest, and/or most efficient way to handle a data-retrieval statement?
I am happy to hear people's opinions but I am looking for facts.
You can also use the following block of code as a template. It combines the Using...End Using and Try...Catch blocks.
Using conn As New SqlConnection(My.Settings.SQLConn)
    Try
        conn.Open()
        '... run your commands here
    Catch ex As SqlException
        '... handle or report the error here
    End Try
End Using
There is no need to call conn.Dispose() because the Using block does that automatically.
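As a slightly fuller, hedged sketch of the same idea (the query is a placeholder, not from the original post; lbl and My.Settings.SQLConn are taken from the snippets above), the Using blocks guarantee disposal while the Try...Catch still surfaces the error to the user:
' Requires Imports System.Data.SqlClient
Try
    Using conn As New SqlConnection(My.Settings.SQLConn)
        Using cmd As New SqlCommand("SELECT Id, Name FROM SomeTable", conn)
            conn.Open()
            Using reader As SqlDataReader = cmd.ExecuteReader()
                While reader.Read()
                    ' consume each row here
                End While
            End Using
        End Using
    End Using ' connection is closed and disposed here, even if an exception was thrown
Catch ex As SqlException
    lbl.Text = ex.Message ' surface the database error to the end user
End Try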
Use Entity Framework; it implements the Unit of Work pattern for you efficiently, and perform your operations within a transaction scope.
Always use a Using block, as it will automatically acquire and then free up system resources. You can still perform a Try...Catch within it to present errors to the user.

QtSQL: "prepared statement "qpsqlpstmt_1" does not exist" on clear() of model

I am getting the following error message printed in the console:
Unable to free statement: ERROR: prepared statement "qpsqlpstmt_1" does not exist
It is printed when the following function is called in the application (or when the object is deleted, if clear() is not called before the delete):
sqlQueryModel->clear();
The sqlQueryModel object is of type QSqlQueryModel and is used throughout a derived class to communicate with a PostgreSQL database. It also serves as the model for a QCompleter. I have never declared or used the name "qpsqlpstmt_1".
Could someone help me interpret the error message please, and explain what might be causing it? Is this indicative of a problem in my code or a Qt bug? (likely the former :))
On reviewing the PostgreSQL log file on the server, the exact same statement appears plus an additional line:
STATEMENT: DEALLOCATE qpsqlpstmt_1
See these Qt issue tracker entries:
https://bugreports.qt.io/browse/QTBUG-8860
https://bugreports.qt.io/browse/QTBUG-16007
https://bugreports.qt.io/browse/QTBUG-15979
... all of which mention your prepared statement name and relate to deletion.
After a considerable amount of time, I realized that I was simply closing the connection to the database before calling clear... not a good strategy.

moq returning dataReader

I'm having a strange experience with moq/mocking.
I'm trying to mock the data going into a method so that I don't have to have a database available at test time.
So I'm loading in some data I've previously serialised, loading it into a DataTable, and then creating a data reader from it, because my business layer method expects a data reader.
Then I create a mock for my data layer and set the return value of a particular method to my new data reader.
I then set (inject) my mock data layer into my business layer so it can do the work of returning the data when the time comes.
var dataTable = DataSerialisation.GetDataTable("C:\\data.xml");
IDataReader reader = dataTable.CreateDataReader();
var mock = new Mock<IRetailerDal>();
mock.Setup(x => x.ReadRetailerDetails("00")).Returns(reader);
retailersBusinessLayer.RetailerDal = mock.Object;
var r = retailersBusinessLayer.GetRetailerDetail("00");
Now, when GetRetailerDetail is called, it basically gets to while(data.Read()) and crashes out, but only sometimes.
I get the exception:
System.InvalidOperationException : DataTableReader is invalid for current DataTable 'Table1'.
Other times it moves past that and can read some columns' data, but other columns don't exist (which must be down to my serialisation method).
Well, this isn't exactly a satisfactory answer, but the code works now.
It's similar to this question, in that no reason was found: here
Anyway, as stated above, the issue was occurring inside my GetRetailerDetail method: where the code hits while(data.Read()), it throws the error.
The fix: change the name of the data reader variable. It was "data" and it's now "data2". That's all I changed.

InsertOnSubmit not triggering a database insert on SubmitChanges

I'm experiencing an odd scenario and I'm looking for ways to figure out what's going wrong. I've got a piece of code that inserts a row into a table - the kind of thing I've done in dozens of other apps - but the end result is nothing happens on the database end, and no errors are generated. How do I find out what's going wrong?
Here's my code:
Partial Class MyDatabaseDataContext
    Public Sub CreateEnrollee(subId, depId)
        Dim newEnrollee = New Enrollee With {.subId = subId, .depId = depId}
        Me.Enrollees.InsertOnSubmit(newEnrollee)
        Me.SubmitChanges()
        Dim test = newEnrollee.id ' <-- auto-incrementing key
    End Sub
End Class
After SubmitChanges is called, no new row is created, and "test" is zero. No errors are generated. I have no idea why it's not trying to insert the row. Any ideas on how to debug this?
You could enable logging:
Me.Log = Console.Out
You could check the ChangeSet for your object.
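For example, a hedged sketch of checking the change set just before SubmitChanges (GetChangeSet() is the standard LINQ to SQL DataContext call; the Console output is only illustrative):
Dim cs = Me.GetChangeSet()
' If Inserts is 0 here, the new enrollee was never registered with the DataContext.
Console.WriteLine("Inserts: {0}, Updates: {1}, Deletes: {2}", cs.Inserts.Count, cs.Updates.Count, cs.Deletes.Count)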
FOUND IT! Part of the debugging I did for some other issues included adding some logging to some of the extensibility methods:
Partial Private Sub InsertEnrollee(instance As Enrollee)
End Sub
I thought "InsertEnrollee" existed so I could perform actions after the Enrollee was inserted, so I added logging code here and that's when the trouble started. Now I'm guessing this is how you would override the Enrollee insert and do it yourself if you so desired. Since I was essentially overriding with logging code, that's why nothing was happening (from a database perspective).
