Short Description of the problem:
I generate a file inside the entity class and would like to save the filename to the database. The controller doesn't know about this (whether or not the filename has changed), so it's not practical to persist from the controller.
Is there a way for an Entity to persist itself?
Example of my use:
The entity class is for an image in a gallery. I always keep the original file and work with a cached version of it. When the image is changed (rotated, for example), the cached version is deleted. The cached version may also be deleted in other cases. When the file is needed, I check whether the cached file exists; if not, it is regenerated with a new filename from the archived image. I need a new filename because that resets the cache for the various thumbnail sizes.
When I generate that new file, I have to save its filename to the database somehow. Because the decision to regenerate the image is made only inside the entity, it would be practical if the entity could persist itself to the database, but I haven't found a solution for that.
Is there a way to do this or is there a whole different concept I should be using to regenerate the image file?
Entities in Doctrine are not active records - they cannot perform persistence actions by themselves, so they rely on a Big Brother [the entity manager].
Even if the controller doesn't know whether any filename has changed, you do - just persist your picture every time: if nothing changed, Doctrine won't touch the entity.
Have a look at lifecycle events too; maybe you'll find it useful to fire a @PreUpdate method before persistence [e.g. generating thumbnails].
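For illustration, here is a minimal sketch of that idea (the entity and field names are hypothetical). One caveat: field values assigned inside a @PreUpdate callback are not written, because the changeset has already been computed at that point, so this sketch uses a @PreFlush callback, which runs before changeset computation:

<?php
use Doctrine\ORM\Mapping as ORM;

/**
 * @ORM\Entity
 * @ORM\HasLifecycleCallbacks
 */
class GalleryImage
{
    /** @ORM\Column(type="string") */
    private $cachedFilename;

    /** @ORM\PreFlush */
    public function regenerateCacheIfMissing()
    {
        // Hypothetical check: if the cached file was deleted, assign a new
        // name; the flush that is starting will pick up the change.
        if (!file_exists($this->cachedFilename)) {
            $this->cachedFilename = uniqid('img_', true) . '.jpg';
        }
    }
}

With that in place, the controller can persist and flush the image on every request; if nothing actually changed, Doctrine's change tracking makes the flush a no-op.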
Related
I have this unique requirement where each time a particular Content Fragment is updated in AEM, all the Experience Fragments referencing that particular Content Fragment need to be automatically exported to Adobe Target.
I'm thinking about using an SQL2 query to retrieve the XFs referencing a particular CF and then incorporating this into a workflow process. I'm also wondering if I can leverage the AEM OOTB workflow process called "Export to Target" for this.
I'm not really sure how to call this "Export to Target" process on each Experience Fragment that we need to export to Target, or whether this is possible at all.
Wondering if anyone has ever come across this requirement and succeeded.
I'd highly appreciate any tips or suggestions in this regard. Many thanks in advance.
Whenever a Content Fragment is created or updated, an OSGi event is triggered. All events are logged under http://localhost:4502/system/console/events. You could write an EventListener or EventHandler, get the path from the event, get the Resource and adapt it to com.adobe.cq.dam.cfm.ContentFragment. The topic for these events is "com/day/cq/dam", which is also available as the constant DamEvent.EVENT_TOPIC.
From the adapted class or Resource you can get information about the model and check whether it's the model you want to process.
To find all references I would create an Oak index and use an SQL2 query.
The query would be something like this:
select [jcr:path], [jcr:score], * from [nt:base] as a where contains(*, '"/content/dam/myReferencedModel"')
If you have all the referencing XFs, you can kick off any workflow via the WorkflowService:
@Reference
private WorkflowService workflowService;

// Open a workflow session from the JCR session, load the model and start it
WorkflowSession wfSession = workflowService.getWorkflowSession(session);
WorkflowModel wfModel = wfSession.getModel("/var/workflow/models/mymodel");
WorkflowData wfData = wfSession.newWorkflowData("JCR_PATH", "/payload");
wfSession.startWorkflow(wfModel, wfData);
I have a form with a file upload field (limited to PDF format only). After the form has been submitted and a valid uploaded file is present, I rename and move the file. Then I try to display the page with the form again - at which point the Symfony\Component\HttpFoundation\File\File class constructor throws a FileNotFoundException because, for some reason, it's been handed the path to the no-longer-existing temp file.
Relevant facts:
The form fields (including the file upload field) are not mapped to an entity because I'm using a WordPress-style "meta" table for the additional data associated with the entity; the controller handles creating any new objects to be persisted.
The new filename is successfully saved to the database; I've further verified that the error occurs with the $form->createView() call.
Before anyone suggests that this was due to my PHP settings, note that I purposefully tried uploading a PDF file with a size below the limit.
My temporary solution is to redirect to another page (there is no error when I do this), but long-term it would be much better to show the form page again after submitting the form.
I tried overwriting the UploadedFile object I got from the form with the File object returned by the ->move() method, but this didn't help.
I also tried creating a child class of the built-in FileType class and changing the data_class to SplFileInfo (the parent class of Symfony's File class), because SplFileInfo doesn't throw an exception over an invalid path, but this had no effect either.
My interpretation of what's happening is that, for some reason, when you create a form view, Symfony instantiates a new File object for the file upload field using the old, temporary file path - resulting in a fatal error that absolutely shouldn't be happening (because when would you ever not be moving the uploaded file?).
Any suggestions for things to try that I haven't thought of yet would be much appreciated!
Solved
I figured out the problem by looking at the stack trace - I have a Twig extension that was calling Request::createFromGlobals(), and THAT resulted in trying to create a new UploadedFile object with the no-longer-existing temp file path. Having the extension get the already-existing Request object in its constructor should prevent this.
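For anyone hitting the same thing, a minimal sketch of that fix (the extension class and method names are hypothetical): inject Symfony's RequestStack instead of calling Request::createFromGlobals(), so the extension reuses the Request object already built for the current cycle:

<?php
use Symfony\Component\HttpFoundation\RequestStack;

class MyTwigExtension extends \Twig_Extension
{
    private $requestStack;

    public function __construct(RequestStack $requestStack)
    {
        $this->requestStack = $requestStack;
    }

    public function somethingNeedingTheRequest()
    {
        // Reuses the already-populated Request object, so no new
        // UploadedFile is constructed from a stale temp-file path.
        $request = $this->requestStack->getCurrentRequest();

        return $request ? $request->getUri() : '';
    }

    public function getName()
    {
        return 'my_twig_extension';
    }
}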
I have several entities, each with its form type. I want to be able, instead of saving the entity straight away on save, to save a copy of the changes we want to perform and store it in DB.
We'd send a message to a user who can approve the change; they will review the original and the changed field(s) and approve or reject. If approved, the entity would be properly flushed.
To solve the issue I was thinking about the following (a rough sketch follows the list):
1) doing a persist
2) getting the changesets (both the one related to "normal" fields and the one related to collections)
3) storing it in DB
4) performing $em->refresh() to discard the changes.
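A rough sketch of those four steps (PendingChange is a hypothetical entity used to store the changeset; Doctrine's UnitOfWork exposes the computed changes):

<?php
// Hypothetical modification coming from the form
$entity->setTitle('New title');

$uow = $em->getUnitOfWork();
$uow->computeChangeSets();

// Maps field names to array(oldValue, newValue) pairs for "normal" fields
$changeSet = $uow->getEntityChangeSet($entity);

// Discard the in-memory changes again
$em->refresh($entity);

// Store the changeset for later approval
$pending = new PendingChange(get_class($entity), $entity->getId(), serialize($changeSet));
$em->persist($pending);
$em->flush();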
Later what I need is to get the changeset(s) back, ask the (other) user to approve it, and flush it.
Is this doable? What I'm especially concerned about is that the entity manager that generated the first changeset is not the same we are going to use to perform the flush, I basically need to "load" a changeset.
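Loading it back might then look like this sketch (again with the hypothetical PendingChange; collection-valued fields would need separate handling):

<?php
$changeSet = unserialize($pending->getChangeSet());
$entity = $em->find($pending->getEntityClass(), $pending->getEntityId());

foreach ($changeSet as $field => $change) {
    // $change is array(oldValue, newValue); apply the new value through a
    // naively derived setter (works only for "normal" fields)
    $setter = 'set' . ucfirst($field);
    $entity->$setter($change[1]);
}

$em->flush();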
Any idea on how to solve the issue (this way, or another way ;) )?
Another solution (working only for "normal" fields, not reference fields that point from other entities to the current one, like a many-to-many) would be to clone the current entity, store it, and then, once approved, copy the field(s) from the clone to the original. But it does not work for all fields (if the previous solution does not work, we'd limit the feature to just "normal" fields).
Thank you!
SN
Well, you could just treat the modifications as entities themselves, so that every change is stored in the database, and then all the changes that were approved are executed against the entity.
So, for example, if you have some Books stored in the database, and you want to make sure that all the modifications made to these are approved, just add a model that would contain the changeset that has to be processed, and a handler that would apply these changes:
<?php

class UpdateBookCommand
{
    // If you'll store these commands in a database, perhaps this field
    // would be a relation, or you could just store the ID
    public $bookId;

    public $newTitle;
    public $newAuthor;

    // Perhaps this field should be somehow protected from unauthorized changes
    public $isApproved;
}

class UpdateBookHandler
{
    private $bookRepository;
    private $em;

    public function __construct($bookRepository, $em)
    {
        $this->bookRepository = $bookRepository;
        $this->em = $em;
    }

    public function handle(UpdateBookCommand $command)
    {
        if (!$command->isApproved) {
            throw new NotAuthorizedException();
        }

        $book = $this->bookRepository->find($command->bookId);
        $book->setTitle($command->newTitle);
        $book->setAuthor($command->newAuthor);

        $this->em->persist($book);
        $this->em->flush();
    }
}
Next, in your controller you would just have to make sure that the commands are somehow stored (in a database or maybe even in a message queue), and the handler gets called when the changesets could possibly get applied.
P.S. Perhaps I could have explained this a bit better, but most of the inspiration for this solution comes from the CQRS pattern, which is explained quite well by Martin Fowler. However, I guess in your case a full-blown CQRS implementation is unnecessary and a simpler solution should work.
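A hypothetical usage sketch of that flow ($commandStore and the form field names are assumptions): the controller records the requested change on submit, and the handler runs only once a reviewer approves it:

<?php
// On form submit: record the requested change instead of flushing the entity
$command = new UpdateBookCommand();
$command->bookId = $book->getId();
$command->newTitle = $form->get('title')->getData();
$command->newAuthor = $form->get('author')->getData();
$command->isApproved = false;
$commandStore->save($command); // hypothetical persistence for commands

// Later, once the reviewer has approved the change:
$command->isApproved = true;
$handler = new UpdateBookHandler($bookRepository, $em);
$handler->handle($command);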
I'm developing an Action in Drupal which is supposed to activate after saving a node, exporting content to XML (which includes data from the node that was just saved), using the "Trigger: After saving an updated post" trigger.
Unfortunately this action actually happens right before the information from the recently saved post is saved to the database, i.e. when looking at the XML later, I find that the most recent change I made was not included. Saving after editing a different node will restore the previously missing data.
How can I get my action to fire after the saving process is complete?
There is a common pitfall in this context, regardless of whether you use a trigger or Mike Munroe's suggestion via hook_nodeapi() (+1):
As long as your export logic runs on the same page cycle that processed the update, and it uses node_load() to get the node's data, node_load() might return a statically cached version of the node from before the update that does not contain the changes yet. If this is the problem in your case, you can work around it in two ways (a sketch of the first follows the list):
Force a reset of the static node cache by passing TRUE as the third parameter to node_load(). This would ensure that the node gets populated freshly from the database (at the price of some additional db queries, so be aware of a potential performance impact).
If you are going the hook_nodeapi() route, you could avoid the need to call node_load() altogether if you pass the $node object available there directly to your export function, as it will be a representation of the updated state.
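A minimal Drupal 6 sketch of the first workaround (my_export_function is a hypothetical stand-in for the actual export logic):

<?php
// Passing TRUE as the third parameter resets the static node cache, so the
// node is re-read from the database and contains the just-saved values.
$node = node_load($nid, NULL, TRUE);
my_export_function($node);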
You should use hook_nodeapi() and invoke your action on insert and update. Look over the documentation for hook_nodeapi() for other instances where you could call your export logic.
An example where the module name is 'export_to_xml':
/**
 * Implementation of hook_nodeapi().
 */
function export_to_xml_nodeapi(&$node, $op, $a3, $a4) {
  if ($op == 'update' || $op == 'insert') {
    // Passing the updated $node avoids relying on a possibly stale,
    // statically cached node_load() result.
    export_logic_function($node);
  }
}
I'm building a Drupal-based site that requires the communication of a node ID to a separate web service. This web service handles the uploading of files to a separate server (from the one Drupal is on).
This creates a problem: if I create a new node, the node ID is not generated until the form is submitted - meaning I can't attach the files until I save the node and open it back up to edit it. I'd like to remove that step.
Is it possible to create a two step node creation process where the basics of the node are submitted and saved, and then the form re-directs to step two where I can attach the files?
I'd also consider an AJAX enabled node submission form - but that seems to add even more complexity to the situation.
Any advice, examples will be appreciated!
You could do this with a multi-step form. See http://pingv.com/blog/ben-jeavons/2009/multi-step-forms-drupal-6-using-variable-functions for the canonical way to do this (besides the code, also check the comments).
You could also do it by adding a second submit handler to the form. The first, default one (node_form_submit) saves your node (including the attached file) the standard Drupal way. The second handler could upload the file to the separate server, do upload error checking, delete the file from the Drupal DB, etc. You can add an additional submit handler to a Drupal 6 form by adding it to the form's #submit property, either in the form definition or via hook_form_alter() / hook_form_FORM_ID_alter(), as in the sketch below.
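A minimal Drupal 6 sketch of that approach (mymodule, the node type, and the push function are hypothetical names):

<?php
function mymodule_form_alter(&$form, &$form_state, $form_id) {
  if ($form_id == 'mynodetype_node_form') {
    // The node form's Save button carries its own #submit chain, so the
    // extra handler is appended there; it runs after node_form_submit().
    $form['buttons']['submit']['#submit'][] = 'mymodule_push_file_submit';
  }
}

function mymodule_push_file_submit($form, &$form_state) {
  // node_form_submit() has already saved the node, so the new node ID is
  // available here for the call to the external web service.
  $nid = $form_state['nid'];
  // ... send $nid and the uploaded file to the separate server ...
}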
Depending on what exactly you want to do, you might use hook_nodeapi() on its 'insert' operation. It is fired after successful node creation, so the node object will already contain the newly assigned nid there.
NOTE: The wording of the API documentation is a bit ambiguous concerning the 'insert' and 'update' operations:
"insert": The node is being created
(inserted in the database).
This sounds like it is right in the middle of the process, whereas the node has already been created at this point.
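A minimal sketch of that (the module and helper names are hypothetical):

<?php
function mymodule_nodeapi(&$node, $op, $a3 = NULL, $a4 = NULL) {
  if ($op == 'insert') {
    // The node has already been saved, so $node->nid is populated and can
    // be handed to the external upload service.
    mymodule_notify_upload_service($node->nid);
  }
}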
I guess the node_save() function can help you.
I ran into exactly this same issue and did it the wrong way. I added the hook myself.
http://drupal.org/node/313389