I'm wondering what the best practice is for caching entity values in Symfony2.
For example, if I have an entity $entity with a method ->getLongCalculation(), I want to store that result rather than have to work it out every time it's called.
Obviously it's very easy just to store the result in APC using the entity ID, but the problem I have is that this has to be done outside of the entity, as it doesn't have access to the container.
So, for example, I end up with code repeated everywhere in my controllers where I call the getter, checking whether we already know the value, roughly like this:
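(This is only an illustration of the repetition; the key format and the one-hour TTL are arbitrary, and apc_fetch/apc_store are the plain APC functions.)

    // Repeated in every controller action that needs the value.
    $cacheKey = 'long_calc_' . $entity->getId();
    $value = apc_fetch($cacheKey, $success);
    if (!$success) {
        // Not cached yet: do the expensive work and remember it.
        $value = $entity->getLongCalculation();
        apc_store($cacheKey, $value, 3600);
    }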
Is there any clean way for me to just call $entity->getLongCalculation() and have the method itself handle any caching? Obviously I could just pass the container into the entity, but I understand that's not best practice.
What's the best way to go about this?
Thank you.
Is it acceptable for you to store the result of longCalculation in the database?
If so (I used this for my own needs), you can use lifecycle events (prePersist and preUpdate) to populate the result in your entity, for example:
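(A minimal sketch of the idea; the class name, the calculationResult field and doLongCalculation() are placeholders, and the mapping assumes annotation-based lifecycle callbacks. The documentation links below cover the details.)

    use Doctrine\ORM\Mapping as ORM;

    /**
     * @ORM\Entity
     * @ORM\HasLifecycleCallbacks
     */
    class MyEntity
    {
        /** @ORM\Column(type="float") */
        private $calculationResult;

        /**
         * Recompute and store the value whenever the entity is persisted or updated.
         *
         * @ORM\PrePersist
         * @ORM\PreUpdate
         */
        public function refreshCalculationResult()
        {
            $this->calculationResult = $this->doLongCalculation();
        }

        public function getLongCalculation()
        {
            // Return the stored value instead of recalculating on every call.
            return $this->calculationResult;
        }

        private function doLongCalculation()
        {
            // ... the expensive work; placeholder return value ...
            return 0.0;
        }
    }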
http://symfony.com/fr/doc/current/book/doctrine.html
http://docs.doctrine-project.org/projects/doctrine-orm/en/latest/reference/events.html#lifecycle-events
In my opinion,
Yes, save the results of the calculations in the database; whether in the same table or as a relation doesn't matter. It's better than a cache, because it changes only when the entity changes. Then you can use the Doctrine cache system. It's quite simple and only needs the result cache to be enabled on the repository or query (useResultCache() or something similar).
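(Roughly like this with the Doctrine 2 query API; the exact method names depend on your Doctrine version, a result cache driver has to be configured in the ORM config, and the DQL, cache id and lifetime below are just placeholders.)

    $query = $em->createQuery('SELECT e FROM AppBundle\Entity\MyEntity e');
    // Cache the hydrated result for an hour under an explicit cache id.
    $query->useResultCache(true, 3600, 'my_entity_list');
    $entities = $query->getResult();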
You can achieve the best performance with the Doctrine cache (SQL and results), Redis for sessions, a proper database architecture that avoids long calculations, and Varnish with ESI (reverse proxy cache).
At the end of the work you can finish with an application at least 1000% faster than it was in the beginning.
Regards,
OK, one may argue that there shouldn't be business logic in an entity, but sometimes there is a good reason for it. For example, when getting roles for a user, one may force a default role to be returned from within the getter method, as explained here.
Anyway, to my question: in many documentation pages, encoding the password is either done in the function which creates/updates the user or with a Doctrine listener, for example as mentioned here.
Doing it manually every time is extra work, while using a Doctrine listener seems inefficient for something which will be used rarely. The manual variant means repeating something like this wherever a user is saved:
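(Illustration only, assuming the standard security.password_encoder service is available; RegistrationHandler, register() and setPassword() are guesses at the surrounding code.)

    use Symfony\Component\Security\Core\Encoder\UserPasswordEncoderInterface;

    class RegistrationHandler
    {
        private $encoder;

        public function __construct(UserPasswordEncoderInterface $encoder)
        {
            $this->encoder = $encoder;
        }

        public function register($plainPassword)
        {
            $user = new User();
            $user->setPlainPassword($plainPassword);

            // The extra step that has to be remembered every single time:
            $user->setPassword($this->encoder->encodePassword($user, $plainPassword));

            // ... persist and flush ...
        }
    }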
So I was wondering: why not encode the password within the setPlainPassword() method of the entity? One always needs to encode the password after calling this method anyway.
The next part of the question is: how would one access the encoder from inside the User entity?
Thanks!
What is the best practice for using the Cache Component in Symfony 3?
Simple example:
If I call getCategoryById (in a repository) from different places (controller, form type, Twig function, listener, ...), how can I check whether the data is cached or not?
Problem:
I can't call the Cache Component in the repository, and I don't want to write and duplicate the same code (isHit(), ...) in every place.
Question:
So what is the best practice? Creating an intermediate cache service between all components and the repository, something like this?
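(A rough sketch of such an intermediate service using the PSR-6 API that the Cache Component implements; the class names, the key format and the injected pool are placeholders.)

    use Psr\Cache\CacheItemPoolInterface;

    class CachedCategoryProvider
    {
        private $cache;
        private $repository;

        public function __construct(CacheItemPoolInterface $cache, CategoryRepository $repository)
        {
            $this->cache = $cache;
            $this->repository = $repository;
        }

        // Every caller (controller, form type, Twig function, listener, ...) goes through this.
        public function getCategoryById($id)
        {
            $item = $this->cache->getItem('category_' . $id);
            if (!$item->isHit()) {
                // Only hit the repository on a cache miss.
                $item->set($this->repository->getCategoryById($id));
                $this->cache->save($item);
            }

            return $item->get();
        }
    }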
Thank you very much :)
I have an Article and a Comment entity (OneToMany).
On the Comment prePersist lifecycle event I would like to count how many comments there are for the article and update the Article's comment_count field.
If I understand the Symfony2 approach correctly, I need to write a service for this. Let's call it CommentCountManager.
My question is: how exactly do I make the container available in the entity so that I can get the CommentCountManager and trigger the function that counts the comments for a given article, and how do I access Doctrine's entity manager in my CommentCountManager so that I can actually run queries there?
Am I on the right path?
Your help is greatly appreciated.
You don't need to store the comments count in a separate column — you can count them on output. What you are trying to do is denormalization and I recommend avoiding it unless you absolutely need it for performance reasons — and only when you are sure that that part is causing the problems. But even then, query optimization and caching are much better alternatives to denormalization.
Making entities aware of the container is a bad idea too. If you need this, then you are doing something wrong.
To access an entity manager in a service, you need to inject it.
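(For example, a minimal sketch of such a service with the entity manager injected; the class, bundle and method names are placeholders, while doctrine.orm.entity_manager is the standard service id.)

    use Doctrine\ORM\EntityManagerInterface;

    class CommentCountManager
    {
        private $em;

        public function __construct(EntityManagerInterface $em)
        {
            // The entity manager is handed in by the container, not fetched from it.
            $this->em = $em;
        }

        public function countForArticle(Article $article)
        {
            return (int) $this->em
                ->createQuery('SELECT COUNT(c.id) FROM AppBundle\Entity\Comment c WHERE c.article = :article')
                ->setParameter('article', $article)
                ->getSingleScalarResult();
        }
    }

    # app/config/services.yml
    services:
        app.comment_count_manager:
            class: AppBundle\Service\CommentCountManager
            arguments: ['@doctrine.orm.entity_manager']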
I am trying to find a specific key within the current Cache.
The problem is that my keys in the cache are composite, and I would like to run something like a LINQ Where expression over them.
Is this possible? If so, how? Does it reduce performance on the server?
Thanks
The whole idea behind a key is that it enables direct lookup of the item. If you have to scan all the items in the cache to find what you're looking for, that's not going to perform very well at all. If you're using AppFabric Caching, you can "tag" similar items with the same tag and then pull back all the items with that tag from the cache in a single call, but there is no such concept in the built-in standard ASP.NET caching classes.
I am struggling with the following use-case:
A user amends an existing order across multiple HTTP posts. The order is complex: lots of related 'entities' (addresses, post options, suppliers, makes, models, various items etc.).
The user then wants to discard the changes.
--
I have an order entity, and as the user edits it I am making various changes to the entity associations, e.g. changing order.address, order.items.add(item)...
In a single post this is fine, but across posts I don't know how best to store state. If I store the entities, then I cannot save the changes, as they are spread across different data contexts. I have read that it is bad practice to store the data context in session state, i.e. a long-lived context. I can't save changes after each edit/post because then I cannot roll back (?). I really would like to work with the entities during the editing process rather than do one big save at the end (taking the UI settings and applying them in one chunk).
This must be a pretty common problem - it's driving me mad. Any help really appreciated.
Cheers!
We have a similar problem where we are building a complex business object through a multi-page wizard.
Instead of creating a partially complete business object at each step of the wizard, we create a dedicated wizard object that looks pretty similar to the business object and populate that through the wizard. At each step, the wizard object is saved into the database. At the end, the user can either accept it, in which case it is converted to a real business object and becomes visible to everyone else, or bin it, and no one else ever knows it existed.
If this kind of approach is not suitable, I suspect you're looking at some kind of difference tracking, either at the entity or the database level. Neither is simple to implement, work with or manage in a system. The former would mean calculating and storing n changes to the entities and developing an algorithm to undo them; the latter depends on your RDBMS, but might involve versioned rows or similar.
Yes, it's a pretty common situation for us. In most scenarios we use the MVC approach. Even without actual ASP.NET MVC projects, we use a similar ViewModel with our Views/Pages/Scenarios etc., where there is no direct/single entity mapping to the business layer (in other words, Business.Entities). This is very similar to DTOs.
It is always easy to use disconnected EF. We retrieve data and discard the context, then transform the entities into ViewModels/DTOs if necessary. When you need to persist changes, all you have to do is create a new context, find the latest entity instance and apply the changes.
The Views/Pages/Controllers will manage these ViewModels/DTOs. Tracking changed and deleted content can be done by introducing a HistoryList<T> (you can extend List<T> to implement this).
Once done, a Controller/Workflow/Component can observe the ViewModel/DTO and apply the necessary changes to your entities, using a new context to retrieve and persist.
It involves a bit of coding, and I would say it's not a perfect solution, since it has its own pros and cons.
/KP