Why is doctrine updating every single object of my form? - symfony

I've got a big Symfony 2 form on a huge collection (over 10k objects). For obvious reasons I cannot display a form for thousands of objects at once, so I am displaying a form of about 300 objects.
I have found no way to filter a collection into a form, and thus do the following:
$bigSetOfObjects = array(
    'myObject' => $this
        ->getDoctrine()
        ->getRepository('MyObject')
        ->findBy(...)
);
$form = $this->createForm(new MyObjectForm(), $bigSetOfObjects);

// And a little further
if ($this->getRequest()->getMethod() == 'POST') {
    $form->bindRequest($this->getRequest());
    $this->getDoctrine()->getEntityManager()->flush();
}
Everything works great. The form is displayed with the correct values, the update works fine, and the data is correctly saved to the database. The problem is that Doctrine executes a separate UPDATE statement per object, meaning a single page submission runs about 300 SQL statements, causing performance issues.
What I do not understand is that I'm updating only a couple of values in the form, not all of them. So why is Doctrine not able to detect which objects were actually modified and update only those in the database?
Is there anything I'm doing wrong or might have forgotten?

By default Doctrine will detect changes to your managed objects on a property-by-property basis. If no properties have changed then it should not be executing an update query for it. You may want to check that the form isn't inadvertently changing anything.
You can, however, change how Doctrine determines that an object has changed by modifying the tracking policy. Since you are working with a lot of objects, you may want to switch to the DEFERRED_EXPLICIT tracking policy. With this policy you specifically call:
$em->persist($object);
on the entities that you want to be updated. You would have to implement your own logic for determining if an object needs to be persisted.
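A minimal sketch of what that could look like, assuming plain Doctrine annotation mapping (the dirty-check helper is a placeholder for your own logic):

/**
 * @Entity
 * @ChangeTrackingPolicy("DEFERRED_EXPLICIT")
 */
class MyObject
{
    // ... fields as before ...
}

and on the controller side, only persist what actually changed:

if ($this->getRequest()->getMethod() == 'POST') {
    $form->bindRequest($this->getRequest());
    $em = $this->getDoctrine()->getEntityManager();
    foreach ($bigSetOfObjects['myObject'] as $object) {
        // myObjectHasChanged() is hypothetical: under DEFERRED_EXPLICIT,
        // flush() only considers entities you persist explicitly
        if (myObjectHasChanged($object)) {
            $em->persist($object);
        }
    }
    $em->flush();
}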

Related

Woocommerce Subscriptions - Set owner programmatically

I'm using WooCommerce Subscriptions on a site to provide team-based memberships. I'd like to ensure that the owner of the Subscription matches the owner of the team (one user to rule them all...!)
It's possible to do this via admin by using the customer dropdown fields.
So, I have been trying to set this programmatically. As I understand it, there are getter and setter methods for all the Subscription data (and as a Subscription is extended from WC_Order, those methods should work too). However, I can't figure out what method to use to make this change.
I've tried creating both a subscription and an order instance from a subscription ID, but neither of the methods I've tried below works:
set_user_id(456)
set_customer_id(456)
When I print_r() the Subscription instance, the original customer_id is still there under the data array:
WC_Subscription Object
(
    [data:protected] => Array
        (
            ...
            [customer_id] => 123
        )
    ...
)
Given that the array is protected, I'm guessing there's a setter method I haven't tried yet. Can someone please help me with what type of instance and setter method I need for this?
Cheers!
I'm pleased to say I've solved this one myself - posting here to hopefully save someone else from banging their head against the wall!
Turns out I was doing everything correctly, I just wasn't calling the save() method after making my changes...! D'oh!
I'm quite used to functions in WordPress having immediate effect - a valid call to update_post_meta, for example, will take effect straight away.
Instead, WooCommerce stores changes made via setters within the local instance created through WC_Order (or other abstractions). These are only saved to the database* when you call the save() method. I believe this is to help prevent unnecessary database calls.
*or data store if you're doing something very fancy.
Code example for those who need it, for an order ID '123' and a new user ID '456':
// Create order instance
$order_instance = wc_get_order( 123 );

// Set new customer id
$order_instance->set_customer_id( 456 );

// Save changes
$order_instance->save();

// To echo data back, use the get_data() method to create an array
// of data, which you can assign however needed. For example:
$order_data  = $order_instance->get_data();
$customer_id = $order_data['customer_id'];
echo 'customer number = ' . $customer_id;
I found the information about why the data requires manually saving (it's only stored in the local instance) from the very helpful doc at Advanced Woo:
"Setter methods update information in the WC_Data object held in working memory. However, one of the Database Operations Methods must be called to make the change in the database."
https://advancedwoo.com/topic/wc_data-and-data-storage-manipulate/#/setters
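For the Subscription itself, the same setter/save pattern should work on a WC_Subscription instance, since it extends WC_Order. A small sketch, assuming the WooCommerce Subscriptions helper wcs_get_subscription() and a hypothetical subscription ID '789':

// Create subscription instance (the ID is illustrative)
$subscription = wcs_get_subscription( 789 );

// Set the new owner and persist the change
$subscription->set_customer_id( 456 );
$subscription->save();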

Symfony2 - Doctrine2 store changeset for later (or alternative solution to approve changes)

I have several entities, each with its own form type. Instead of saving an entity straight away on submit, I want to save a copy of the proposed changes and store it in the DB.
We'd then send a message to the user who can approve the change; they will review the original and the changed field(s) and approve or reject. If approved, the entity would be properly flushed.
To solve the issue I was thinking about:
1) doing a persist;
2) getting the changesets (both the one related to "normal" fields, and the one related to collections);
3) storing them in the DB;
4) performing $em->refresh() to discard the changes.
Later, what I need is to get the changeset(s) back, ask the (other) user to approve them, and flush.
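In UnitOfWork terms, steps 1, 2 and 4 could look something like this (a sketch only; serializing and storing the changeset is left out):

$em->persist($entity);

// Ask the UnitOfWork to compute what would be written on flush
$uow = $em->getUnitOfWork();
$uow->computeChangeSets();

// Array of fieldName => array(oldValue, newValue) for "normal" fields
$changeset = $uow->getEntityChangeSet($entity);

// ... store $changeset somewhere for the reviewer ...

// Discard the in-memory changes again
$em->refresh($entity);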
Is this doable? What I'm especially concerned about is that the entity manager that generated the first changeset is not the same one we are going to use to perform the flush; I basically need to "load" a changeset.
Any idea on how to solve the issue (this way, or another way ;))?
Another solution (working only for "normal" fields, not for reference ones that point from other entities to the current one, like a many-to-many) would be to clone the current entity, store it, and then once approved copy the field(s) from the clone back to the original. But it does not work for all fields (if the previous solution does not work, we'd limit the feature to "normal" fields only).
Thank you!
SN
Well, you could just treat the modifications as entities themselves, so that every change is stored in the database, and then all the changes that were approved are executed against the entity.
So, for example, if you have some Books stored in the database, and you want to make sure that all the modifications made to these are approved, just add a model that would contain the changeset that has to be processed, and a handler that would apply these changes:
<?php

class UpdateBookCommand
{
    // If you'll store these commands in a database, perhaps this field
    // would be a relation, or you could just store the ID
    public $bookId;
    public $newTitle;
    public $newAuthor;

    // Perhaps this field should be somehow protected from unauthorized changes
    public $isApproved;
}

class UpdateBookHandler
{
    private $bookRepository;
    private $em;

    public function handle(UpdateBookCommand $command)
    {
        if (!$command->isApproved) {
            throw new NotAuthorizedException();
        }

        $book = $this->bookRepository->find($command->bookId);
        $book->setTitle($command->newTitle);
        $book->setAuthor($command->newAuthor);

        $this->em->persist($book);
        $this->em->flush();
    }
}
Next, in your controller you would just have to make sure that the commands are somehow stored (in a database or maybe even in a message queue), and that the handler gets called once a changeset is approved and can be applied.
P.S. Perhaps I could have explained this a bit better, but mostly the inspiration for this solution comes from the CQRS pattern that's explained quite well by Martin Fowler. However, I guess in your case a full-blown CQRS implementation is unnecessary and a simpler solution should work.
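A rough controller-side sketch of that flow, assuming UpdateBookCommand is itself mapped as an entity and the handler is registered as a service (the service id, action names and request fields are illustrative):

// On form submission: store the proposed change instead of applying it
public function proposeChangeAction(Request $request, $bookId)
{
    $command = new UpdateBookCommand();
    $command->bookId     = $bookId;
    $command->newTitle   = $request->request->get('title');
    $command->newAuthor  = $request->request->get('author');
    $command->isApproved = false;

    $em = $this->getDoctrine()->getEntityManager();
    $em->persist($command);
    $em->flush();
    // ... notify the reviewer ...
}

// Later, once the reviewer approves, load the command and hand it over
public function approveChangeAction($commandId)
{
    $em = $this->getDoctrine()->getEntityManager();
    $command = $em->find('UpdateBookCommand', $commandId);
    $command->isApproved = true;

    $this->get('update_book_handler')->handle($command);
}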

What's the proper use of $unitOfWork->getScheduledCollectionDeletions() in Doctrine 2 (and Symfony)?

I'm trying to detect changes in a many-to-many relation in an onFlush event.
If new entities are added to the relation or the relation is updated (always keeping an element), I can detect changes using $unitOfWork->getScheduledCollectionUpdates() and then check for getInsertDiff() or getDeleteDiff(). So far so good.
The problem comes when I take all the entities out of the relation: "There were two related entities before, but there are NO related entities now."
When the relation is left empty I can access $unitOfWork->getScheduledCollectionDeletions(), but there is no way of knowing which entities were deleted:
getDeleteDiff() for these collections doesn't tell me anything;
getSnapshot() doesn't tell me which entities were there before.
How should I know which entities were taken out of the many-to-many relation?
I've added a Gist with the full implementation: everything works ok (it may need some optimization) except $uow->getScheduledCollectionDeletions() (line 101)
https://gist.github.com/eillarra/5127606
The cause of this problem is twofold:
1) When the method clear() is called on a Doctrine\ORM\PersistentCollection, it will:
  1. clear its internal collection of entities;
  2. call scheduleCollectionDeletion() on the Doctrine\ORM\UnitOfWork;
  3. take a new snapshot of itself.
Number 2 is the reason your collection shows up in $uow->getScheduledCollectionDeletions() (and not in $uow->getScheduledCollectionUpdates()). Number 3 is the reason why you cannot determine what was in the collection before it was cleared.
2) When using the Symfony2 Form component, specifically the ChoiceType or CollectionType types in combination with the multiple option, that clear() method will get called when all entities should be removed from the collection.
This is due to the MergeDoctrineCollectionListener which is added here:
https://github.com/symfony/symfony/blob/master/src/Symfony/Bridge/Doctrine/Form/Type/DoctrineType.php#L55
This is done as an optimization: it's faster to clear a collection this way instead of checking which entities should be removed from it.
I can think of two possible solutions:
1) Fork symfony/symfony and implement an option not to add the MergeDoctrineCollectionListener - maybe something like no_clear to prevent the listener from being added. This won't introduce a BC break and would solve your problem, because the clear() method of a collection won't get called when all entities should be removed.
2) Redesign your counter: maybe also listen to the postLoad event, where you can count the number of entities in the collection at the time it's fetched from the DB. That way your onFlush listener can use that number to know how many entities were removed from the collection when it was cleared (see the sketch below).
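A rough sketch of that counter idea (the Post entity, getTags() accessor and originalTagCount property are purely illustrative; note that count() initializes the lazy collection):

use Doctrine\ORM\Event\LifecycleEventArgs;

class CollectionCountListener
{
    public function postLoad(LifecycleEventArgs $args)
    {
        $entity = $args->getEntity();
        if ($entity instanceof Post) {
            // Unmapped property, read later by the onFlush listener
            $entity->originalTagCount = count($entity->getTags());
        }
    }
}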
I found that if you set the 'by_reference' => false option on the EntityType field, the UnitOfWork does detect changes to the collection. Compare the UnitOfWork state at the onFlush event with 'by_reference' => false versus 'by_reference' => true.
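A sketch of where that option goes, assuming a many-to-many "tags" field on a Symfony2-era form type (the class and field names are illustrative):

// In your form type's buildForm()
$builder->add('tags', 'entity', array(
    'class'        => 'AppBundle:Tag',
    'multiple'     => true,
    // Forces addTag()/removeTag() to be called on the owning entity,
    // so the UnitOfWork sees the collection change
    'by_reference' => false,
));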
In case the last item gets removed (like on a form submission), getDeleteDiff() sometimes turns out to be empty even though items were there. The solution is to fetch the original data from the database. In this example we use a clone of the collection to achieve that, so the original collection stays untouched and everything still works.
public function onFlush(OnFlushEventArgs $args)
{
    $uow = $args->getEntityManager()->getUnitOfWork();

    foreach ($uow->getScheduledCollectionDeletions() as $collection) {
        /**
         * "getDeleteDiff" is not reliable: collection->clear() on a PersistentCollection
         * also clears the original snapshot.
         * A reliable way to get the removed items is: clone the collection, fetch the original data.
         */
        $removedData = $collection->getDeleteDiff();
        if (!$removedData) {
            $clone = clone $collection;
            $clone->setOwner($collection->getOwner(), $collection->getMapping());

            // This gets the real data from the database into the clone
            $uow->loadCollection($clone);

            // The actual removed items!
            $removedData = $clone->toArray();
        }
    }
}
The reason ->getDeleteDiff() is sometimes empty is that a form's "onSubmit" handler calls ->clear() on the PersistentCollection. By clearing it, the original "snapshot" gets cleared too (for performance reasons, I guess), and getDeleteDiff() actually relies on that snapshot, which is now empty.
There are multiple issues on Github about this problem:
https://github.com/doctrine/orm/issues/2272
https://github.com/doctrine/orm/issues/4173

Problem persisting collection of interfaces in JDO/Datanucleus. "unable to assign an object of type.."

I am getting the error below whilst trying to persist an object that has a collection of interface-typed elements, which I want to hold a couple of different types of objects. It seems to happen almost randomly; sometimes it works OK after restarting (I might be doing something wrong though).
class CommentList {
    @Persistent
    @Join
    ArrayList<IComment> comments = new ArrayList<IComment>();
}
somewhere else...
CommentList cl = new CommentList();
cl.addComment( new SimpleComment() );
cl.addComment( new SpecialComment() );
repo.persist( cl );
I can see the join table has been created in my DB along with ID fields for each of the Implementation classes of IComment.
SimpleComment and SpecialComment implement IComment. If I just add a SimpleComment it works fine. As soon as I start trying to add other types of objects I start to get the errors.
The error I'm getting:
java.lang.ClassCastException: Field "com.myapp.model.CommentList.comments" is a reference field (interface/Object) of type com.myapp.behaviours.IComment but DataNucleus is unable to assign an object of type "com.myapp.model.ShortComment" to this field. You can only assign this field to a type specified by the "implementation-classes" extension attribute.
    at org.datanucleus.store.mapped.mapping.MultiMapping.setObject(MultiMapping.java:220)
    at org.datanucleus.store.mapped.mapping.ReferenceMapping.setObject(ReferenceMapping.java:526)
    at org.datanucleus.store.mapped.mapping.MultiMapping.setObject(MultiMapping.java:200)
    at org.datanucleus.store.rdbms.scostore.BackingStoreHelper.populateElementInStatement(BackingStoreHelper.java:135)
    at org.datanucleus.store.rdbms.scostore.RDBMSJoinListStoreSpecialization.internalAdd(RDBMSJoinListStoreSpecialization.java:443)
    at org.datanucleus.store.mapped.scostore.JoinListStore.internalAdd(JoinListStore.java:233)
When it does save, if I restart the server and query for a list of the comments, I get null values returned.
I'm using a MySQL backend - if I switch to db4o it works fine.
Please let me know if any info would be useful.
If you have any idea where I might be going wrong or can provide some sample code for persisting collection of different objects implementing the same interface that would be appreciated.
Thanks for any help.
Tom
When I used interfaces I just enabled dynamicSchemaUpdates (a persistence property with a name like that) and FKs are added when needed. The log shows all the SQL, I think.
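The DataNucleus property this presumably refers to would be set in your persistence configuration, along these lines:

# In datanucleus.properties (or set on the PersistenceManagerFactory)
datanucleus.rdbms.dynamicSchemaUpdates=true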
I fixed this by specifying
<extension vendor-name="datanucleus" key="implementation-classes" value="SimpleComment,SpecialComment"/>
for the comments field in my package.jdo.

How do I execute an action in drupal after each time a node is saved?

I'm developing an action in Drupal which is supposed to fire after saving a node, exporting content to XML (which includes data from the node that was just saved), using the "Trigger: After saving an updated post" trigger.
Unfortunately this action actually runs right before the information from the recently saved post is written to the database, i.e. when looking at the XML later, I find that the most recent change I made was not included. Saving a different node afterwards restores the previously missing data.
How can I get my action to fire after the saving process is complete?
There is a common pitfall in this context, regardless of whether you use a trigger or Mike Munroe's suggestion via hook_nodeapi() (+1):
As long as your export logic runs in the same page cycle that processed the update, and it uses node_load() to get the node's data, node_load() might return a statically cached version of the node from before the update that does not contain the changes yet. If this is the problem in your case, you can work around it in two ways (see the sketch after this list):
Force a reset of the static node cache by passing TRUE as the third parameter to node_load(). This ensures that the node gets populated freshly from the database (at the price of some additional DB queries, so be aware of a potential performance impact).
If you are going the hook_nodeapi() route, you can avoid calling node_load() altogether by passing the $node object available there directly to your export function, as it will be a representation of the updated state.
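A sketch of the first workaround, using the Drupal 6 node_load() signature (export_to_xml() stands in for whatever your export function is):

// Bypass the static node cache so the export sees the freshly saved state
$fresh_node = node_load($node->nid, NULL, TRUE);
export_to_xml($fresh_node);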
You should use hook_nodeapi() and invoke your action on insert and update. Look over the documentation for hook_nodeapi() for other instances where you could call your export logic.
example where module name = 'export_to_xml':
/**
 * Implementation of hook_nodeapi().
 */
function export_to_xml_nodeapi(&$node, $op, $a3, $a4) {
  if ($op == 'update' || $op == 'insert') {
    export_logic_function();
  }
}
