Sonata Media: Change context programmatically

I'm writing a little blog app where the user can publish public and private news. Users can attach files to these news items. I have two contexts for this app: public_news, with files that can be accessed by everyone; and private_news, with files that can only be accessed if the user has logged in.
I want to be able to move files from the public_news context to the private_news context when the user changes a news from public to private, and vice versa.
I was hoping to do something as simple as $media->setContext('private_news');, but this won't move the physical file from one directory to the other.

What do you think about recreating this media?
$oldMedia = getYourOldMedia();
// $media = clone $oldMedia; // didn't work as expected for me;
// YMMV - I didn't spend much time investigating why
$media = new Media();
// This works fine with the image and file providers,
// but it was not tested with other providers.
$pool = $container->get('sonata.media.pool');
$provider = $pool->getProvider($oldMedia->getProviderName());
$media->setBinaryContent($provider->getReferenceFile($oldMedia));
$media->setProviderName($oldMedia->getProviderName());
$media->setContext('private_news');
// Copy any other data you're interested in.
$mediaManager->save($media);
$mediaManager->delete($oldMedia);
$mediaManager->delete might not delete your physical files, depending on the provider; you might want to create your own provider if you need that behavior.
Edit:
On further research I found out that you can manually delete your files before deleting the old media:
if ($pool->getFilesystem()->has($path)) {
    $pool->getFilesystem()->delete($path);
}
But don't do that before saving your new media entity.
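The snippet above assumes you already have $path. A minimal sketch of how it could be resolved for the file/image providers, assuming the stored path is the provider's generatePath() plus the media's provider reference (verify against your provider's layout before relying on this):

$provider = $pool->getProvider($oldMedia->getProviderName());
// Hypothetical path resolution - adjust to your provider's storage layout.
$path = sprintf('%s/%s', $provider->generatePath($oldMedia), $oldMedia->getProviderReference());
if ($pool->getFilesystem()->has($path)) {
    $pool->getFilesystem()->delete($path);
}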


List of permissions for Drupal8 routing file

I'm working on custom Drupal8 module. My module uses this routing file:
kalvis.routing.yml
kalvis.content:
  path: '/kalvis/{from}/{to}'
  defaults:
    _controller: '\Drupal\kalvis\Controller\kalvisController::content'
    _title: ''
  requirements:
    _permission: 'access content'
What does the _permission part stand for, and where can I find a list of all possible values for this parameter? (The tutorials I've watched only used access content and access administrative content, but I suppose there are a lot more of them.)
PS: I'm using Drupal 8 beta 10 installed on WAMP
If you want to see a list of all permissions, the code below should work. If you are coding your own module, you can define your own permissions and test whether a user has a role with that permission.
function my_module_page_attachments_alter(array &$attachments) {
  // The keys of getPermissions() are the permission system names.
  $perms = array_keys(\Drupal::service('user.permissions')->getPermissions());
}
To answer the question of what the _permission part of the routing structure is, here is a quote from the Drupal docs about what it does:
_permission: A permission string (e.g., _permission: 'access content'). You can specify multiple permissions by separating them with ',' (comma) (e.g., _permission: 'access content,access user profiles') for AND logic or '+' (plus) for OR logic (e.g., _permission: 'access content+access user profiles' means a visitor needs either the access content permission or the access user profiles permission to view the page. Having both is fine, too.). Module-specific permission strings can be defined in my_module_name.permissions.yml. See hook_permission() replaced with permissions defined in a my_module_name.permissions.yml file for details.
source: https://www.drupal.org/docs/drupal-apis/routing-system/structure-of-routes
To put it simply, this restricts access to the route by only allowing users with the specified permission(s) to access it. To use it, you need to know the system name of the permission(s) you want to restrict access with; you then place them as a string behind this parameter, as in the quote above. You can use multiple permissions by separating them with , for AND logic or + for OR logic. Permission system names are allowed to have spaces in them and frequently do.
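For example, a routing entry that requires either of two core permissions could look like this (reusing the route from the question; the permission names are the core ones quoted above):

kalvis.content:
  path: '/kalvis/{from}/{to}'
  defaults:
    _controller: '\Drupal\kalvis\Controller\kalvisController::content'
  requirements:
    _permission: 'access content+access user profiles'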
I don't think there is any way to see the system names of the permissions directly in the UI. You can of course see all permissions on www.site.com/admin/people/permissions. If you are in a hurry and/or looking for a specific permission, you can always look through the module.permissions.yml file of the module the permission is defined in.
If you do want to see all permissions you can make your own list of all the system names.
You can use the PermissionHandler service from core's user module.
It gathers all the *.permissions.yml files and builds a list of every permission.
You call it with \Drupal::service('user.permissions')->getPermissions() (https://api.drupal.org/api/drupal/core%21modules%21user%21src%21PermissionHandler.php/function/PermissionHandler%3A%3AgetPermissions/8.2.x).
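As a rough sketch, dumping each permission's system name and label could look like this (assuming getPermissions() returns an array keyed by machine name with a 'title' entry per permission, which is what the API docs describe):

$permissions = \Drupal::service('user.permissions')->getPermissions();
foreach ($permissions as $machine_name => $permission) {
  // e.g. 'access content' => 'View published content'
  printf("%s => %s\n", $machine_name, $permission['title']);
}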
You can also use, or write code similar to, the user_role_permissions() function from the user.module file in Drupal core. It looks like this:
function user_role_permissions(array $roles) {
  if (defined('MAINTENANCE_MODE') && MAINTENANCE_MODE == 'update') {
    return _user_role_permissions_update($roles);
  }
  $entities = Role::loadMultiple($roles);
  $role_permissions = array();
  foreach ($roles as $rid) {
    $role_permissions[$rid] = isset($entities[$rid]) ? $entities[$rid]->getPermissions() : array();
  }
  return $role_permissions;
}
As you can see, this code just loads all the role entities with loadMultiple() and collects their permissions. (Technically you should use the entity type manager to load entities whenever possible, e.g. $entities = \Drupal::entityTypeManager()->getStorage($entity_type)->loadMultiple([1, 2, 3]); for more information see the Drupal Entity API docs: https://www.drupal.org/docs/drupal-apis/entity-api/working-with-the-entity-api.)
After loading all the roles it makes a list of all permissions.
Source information is below. It should stay up to date because Drupal keeps its documentation versioned, but since the comments suggested it, I figured I might as well write it out above to save you some clicks.
Original Drupal documentation:
https://api.drupal.org/api/drupal/core!modules!user!user.module/function/user_role_permissions/8.2.x
Hope this helps! :)
You can confirm on the page '/admin/people/permissions'.
A quick and dirty way to see them is to create a View with a Page display. Then in the 'Access' section, ensure 'Permission' is selected and open up the options as if you were going to choose a different permission.
You can now inspect the HTML of the <select> element; the id of each option is the correct system name for that permission.

Laravel/blade caching css files

I am working on an Nginx server with PHP-FPM. I installed Laravel 4.1 and Bootstrap v3.1.1, and here is the problem: for the last 30 minutes I have been trying to change a CSS rule that I first declared to check Bootstrap.
.jumbotron {
    background: red;
}
The first time it worked: the jumbotron container was red. So I removed that CSS value and kept working, but no matter which browser I use, the container is still red. I even checked the CSS file through Google Chrome's inspection tool, and it shows me the first value, from when jumbotron had background: red. I deleted the CSS file, renamed it, added new styles, and configured Chrome not to cache pages. Still the same value. I'm convinced now that Laravel has kept a cache of the first style declaration.
Is there any way to disable this at all?
General explanation
When you access a Laravel Blade view, it is compiled to a temporary file so that the Blade syntax doesn't have to be processed on every access. These files are stored in app/storage/views with a filename that is the MD5 hash of the view's path.
Usually when you change a view, Laravel regenerates these files automatically on the next view access and everything goes on. This is done by comparing the modification times of the generated file and the view's source file through the filemtime() function. Probably in your case there was a problem and the temporary file wasn't regenerated. In that case you have to delete these files so they can be regenerated. It doesn't harm anything, because they are autogenerated from your views and can be recreated anytime; they exist only for caching purposes.
Normally they are refreshed automatically, but you can delete these files anytime if they get stuck and you have problems like these. As I said, though, that should be a rare exception.
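If you need to clear them by hand, a minimal sketch could be (assuming Laravel 4's default storage path; adjust the path if you moved it):

// Delete all compiled Blade views so they are regenerated on the next access.
foreach (glob(storage_path('views/*')) as $compiled) {
    if (is_file($compiled)) {
        unlink($compiled);
    }
}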
Code break down
All the following code is from laravel/framework/src/Illuminate/View/. I added some extra comments to the originals.
Get view
Starting from Engines/CompilerEngine.php we have the main code we need to understand the mechanics.
public function get($path, array $data = array())
{
    // Push the path to the stack of the last compiled templates.
    $this->lastCompiled[] = $path;

    // If this given view has expired, which means it has simply been edited since
    // it was last compiled, we will re-compile the views so we can evaluate a
    // fresh copy of the view. We'll pass the compiler the path of the view.
    if ($this->compiler->isExpired($path))
    {
        $this->compiler->compile($path);
    }

    // Return the MD5 hash of the path concatenated
    // to the app's view storage folder path.
    $compiled = $this->compiler->getCompiledPath($path);

    // Once we have the path to the compiled file, we will evaluate the paths with
    // typical PHP just like any other templates. We also keep a stack of views
    // which have been rendered for right exception messages to be generated.
    $results = $this->evaluatePath($compiled, $data);

    // Remove last compiled path.
    array_pop($this->lastCompiled);

    return $results;
}
Check if regeneration required
This is done in Compilers/Compiler.php. It is an important function: its result decides whether the view will be recompiled. If it returns false when it should return true, that can be the reason views are not being regenerated.
public function isExpired($path)
{
    $compiled = $this->getCompiledPath($path);

    // If the compiled file doesn't exist we will indicate that the view is expired
    // so that it can be re-compiled. Else, we will verify the last modification
    // of the views is less than the modification times of the compiled views.
    if ( ! $this->cachePath || ! $this->files->exists($compiled))
    {
        return true;
    }

    $lastModified = $this->files->lastModified($path);

    return $lastModified >= $this->files->lastModified($compiled);
}
Regenerate view
If the view is expired it will be regenerated. In Compilers\BladeCompiler.php we see that the compiler will loop through all Blade keywords and finally give back a string that contains the compiled PHP code. Then it will check if the view storage path is set and save the file there with a filename that is the MD5 hash of the view's filename.
public function compile($path)
{
    $contents = $this->compileString($this->files->get($path));

    if ( ! is_null($this->cachePath))
    {
        $this->files->put($this->getCompiledPath($path), $contents);
    }
}
Evaluate
Finally, in Engines/PhpEngine.php the view is evaluated. It imports the data passed to the view with extract() and includes the file at the given path inside a try block, catching all exceptions with handleViewException(), which rethrows the exception. There is some output buffering too.
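As a simplified sketch of that evaluation step (not the exact framework source, just its shape):

protected function evaluatePath($path, array $data)
{
    ob_start();

    // Make each data item available as a local variable inside the view.
    extract($data);

    // Include the compiled view; any exception is passed to
    // handleViewException(), which rethrows it.
    try
    {
        include $path;
    }
    catch (\Exception $e)
    {
        $this->handleViewException($e);
    }

    return ltrim(ob_get_clean());
}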
Same issue here. I am using VirtualBox with Shared Folders pointing to my document root.
This pointed me in the right direction:
https://stackoverflow.com/a/26583609/1036602
Which led me to this:
http://www.danhart.co.uk/blog/vagrant-virtualbox-modified-files-not-updating-via-nginx-apache
and this:
https://forums.virtualbox.org/viewtopic.php?f=1&t=24905
If you're mounting your local dev root via vboxsf Shared Folders, set EnableSendFile Off in your apache2.conf (or sendfile off if using Nginx).
For what it's worth and because this answer came up first in my google search...
I had the same problem. The CSS and JS files wouldn't update. Deleting the cache files didn't work. The timestamps were not the problem. The only way I could update them was to change the filename, load it directly to get the 404 error, and then change the name back to the original name.
In the end the problem was not related to Laravel or the browser cache at all. The problem was due to NginX using sendfile which doesn't work with remote file systems. In my case, I was using VirtualBox for the OS and the remote file system was vboxsf through Guest Additions.
I hope this saves someone else some time.
In Laravel 5.8+ you can do it like this:
The version method will automatically append a unique hash to the filenames of all compiled files, allowing for more convenient cache busting:
mix.js('resources/js/app.js', 'public/js').version();
After generating the versioned file, you won't know the exact file name. So, you should use Laravel's global mix function within your views to load the appropriately hashed asset. The mix function will automatically determine the current name of the hashed file:
<script src="{{ mix('/js/app.js') }}"></script>
Full documentation: https://laravel.com/docs/5.8/mix

Prevent dropping of users when publishing a DACPAC using SqlPackage.exe

Is there any way of preventing users from being dropped when publishing a DACPAC using SqlPackage.exe, other than changing the setting below, which prevents all objects that aren't in the DACPAC from being dropped?
<DropObjectsNotInSource>True</DropObjectsNotInSource>
We deploy to a number of environments, each with different users. Current workarounds are to either:
Script the users for each environment to recreate them after deploying
Use /Action:Script and manually change the deployment script.
Neither of these is ideal, though...
Use SqlPackage.exe parameters (since February 2015 release: New Advanced Publish Options to Specify Object Types to Exclude or Not Drop):
Here are the actual parameters we use in our deployment:
/p:DropObjectsNotInSource=True
/p:ExcludeObjectTypes=Users;Logins;RoleMembership;Permissions
The first option drops everything not in the source; the second then refines what not to drop. This combination proved the most effective for us: it drops all unnecessary objects yet retains the login mappings as they were.
Detailed documentation of all the parameters and their possible values can be found on MSDN - SqlPackage.exe
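Put together, a publish invocation might look like this (the source file and connection string are placeholders):

SqlPackage.exe /Action:Publish /SourceFile:MyDb.dacpac /TargetConnectionString:"Data Source=.;Initial Catalog=MyDb;Integrated Security=True" /p:DropObjectsNotInSource=True /p:ExcludeObjectTypes=Users;Logins;RoleMembership;Permissions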
I ran into the same issue and used pre/post-deployment scripts to reinsert users, permissions, roles, etc., like the suggested blog post. However, this became unmaintainable in the long run: users were unable to authenticate during deployment, permissions were not restored if the deployment failed, and security changes required going through source control and re-deployment.
Recently, I reevaluated the problem as we were migrating our deployment platform. With the DacFx API (and bug fixes) released, I was able to extend the deployment process in SSDT by creating a DeploymentPlanModifier. They provide an example for filtering objects on creation; with simple modifications, I filter any drops for permission-based object types (registered via the /p:AdditionalDeploymentContributors argument).
[ExportDeploymentPlanModifier( UserMappingFilter.PlanFiltererContributorId, "1.0.0.0" )]
public class UserMappingFilter : DeploymentPlanModifier
{
    public const string PlanFiltererContributorId = "Dac.UserMappingFilter";

    protected override void OnExecute( DeploymentPlanContributorContext context )
    {
        // Walk the deployment plan and remove any drop step we want to keep.
        DeploymentStep next = context.PlanHandle.Head;
        while( next != null )
        {
            DeploymentStep current = next;
            next = current.Next;

            DropElementStep dropStep = current as DropElementStep;
            if( dropStep != null && ShouldFilter( dropStep ) )
            {
                base.Remove( context.PlanHandle, dropStep );
            }
        }
    }

    private bool ShouldFilter( DropElementStep dropStep )
    {
        TSqlObject target = dropStep.TargetElement;

        // Keep permission-related objects out of the drop list.
        if( target.ObjectType.Name == "RoleMembership" || target.ObjectType.Name == "User" || target.ObjectType.Name == "Role" )
        {
            return true;
        }

        return false;
    }
}
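To wire the contributor into a deployment, its assembly has to be discoverable by DacFx, and the contributor is then named on the command line using the id declared in the code above (the remaining arguments are elided here):

SqlPackage.exe /Action:Publish ... /p:AdditionalDeploymentContributors=Dac.UserMappingFilter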
We handle this in post-deploy scripts. It's a bit harder to set up, but once set up it allows you to configure a slightly different script for each environment. We use this in conjunction with Publish Profiles, with a different profile per environment. Basically, you use PowerShell to generate a bunch of scripts for users and permissions, add those scripts to your project(s), and then Include the files in the project. Add what is referred to in the blog post as "SecurityAdditionsWrapper.sql" to your post-deploy script, and you should be good. Just remove the other security from your project to ensure that it's set correctly.
http://schottsql.blogspot.com/2013/05/ssdt-setting-different-permissions-per.html
There are also options in SSDT for:
"Drop Permissions not in source" - False
"Drop role members not defined in source" - False
"Ignore permissions" - True
"Ignore role membership" - True
We use those, but if you need better control over your users/permissions by environment, I'd strongly recommend checking out that blog post. (With thanks to Jamie Thomson for the original idea.)

ASP.NET creating resources at runtime

I'm developing an ASP.NET webapp that has a multilanguage feature allowing the webmaster to create new languages at runtime.
The approach that I was thinking is the following:
The user selects one available (not created) language.
When the user confirms, the application automatically copies a set of existing resources, replacing the filename with the new culture. For example: default.aspx.en-us.resx to default.aspx.es-ar.resx.
The user edits the recently created resources.
Currently I'm having trouble with step 2. I've managed to copy the resources, but the new resources are then ignored. I think this happens because the new resources are not included in the running assembly and are therefore ignored.
When I tested the following code in my local project, I had to manually add the new resources to the solution and recompile to make it work.
Does anyone know how to make this work?
This is the code for the copy mentioned in step 2.
string _dir = path_ + "App_LocalResources\\";
DirectoryInfo _dirInfo = new DirectoryInfo(_dir);
foreach (FileInfo _file in _dirInfo.GetFiles("*en-us.resx")) {
    _file.CopyTo(_dir + _file.Name.Replace("en-us", idioma_.Cultura));
}

string _dir2 = path_ + "App_GlobalResources\\";
_dirInfo = new DirectoryInfo(_dir2);
foreach (FileInfo _file in _dirInfo.GetFiles("*en-us.resx")) {
    _file.CopyTo(_dir2 + _file.Name.Replace("en-us", idioma_.Cultura));
}
Thank you very much.
Creating or editing resource files is not possible in the same way as reading data.
In order to create or edit a resource file, you should handle it the same way you create or edit XML files, because resource files are XML files with a specific structure.
Maybe this article will help you...
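For illustration, here's a minimal sketch of writing a .resx file with System.Resources.ResXResourceWriter (the type lives in the System.Windows.Forms assembly, so that reference is assumed; the path and keys are illustrative):

// Create (or overwrite) a culture-specific resource file and add entries to it.
using (var writer = new ResXResourceWriter(Server.MapPath("~/App_LocalResources/default.aspx.es-ar.resx")))
{
    writer.AddResource("Title", "Hola");
    writer.AddResource("Welcome", "Bienvenido");
    writer.Generate();
}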

ASP.Net How to access images from different applications

I have two different projects. One is supposed to upload images (admin) and the other is supposed to show them.
I was writing something like "/Contents/images/image path"... but wait! How will I upload the images from the application to that address?
Any help and suggestions please.
If you have two applications that will interact with the same files, it's probably better to have an ImageController with actions that allow you to upload/download the image, rather than storing them directly as content. That way both applications can reference the same file location (or images stored in a database) and manipulate them. Your download action would simply use a FileContentResult to deliver the bytes from the file; you can derive the content type from the file extension.
Example using a database. Note that I assume that the database table contains the content type as determined at upload time. You could also use a hybrid approach that stores the image metadata in a database and loads the actual file from a file store.
public class ImageController : Controller
{
    public ActionResult Get( int id )
    {
        using (var context = new MyDataContext())
        {
            var image = context.Images.SingleOrDefault( i => i.ID == id );
            if (image != null)
            {
                // Serve the stored bytes with the content type captured at upload time.
                return File( image.Content, image.ContentType );
            }
        }

        // Or you could return a placeholder image here if appropriate.
        throw new HttpException( 404, "The image does not exist" );
    }
}
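A hypothetical companion upload action, to sketch the other half (it assumes the same MyDataContext plus an Image entity with Content and ContentType members, none of which are spelled out in the original answer):

[HttpPost]
public ActionResult Upload( HttpPostedFileBase file )
{
    if (file != null && file.ContentLength > 0)
    {
        var bytes = new byte[file.ContentLength];
        file.InputStream.Read( bytes, 0, file.ContentLength );

        using (var context = new MyDataContext())
        {
            // Persist the bytes and the content type so Get() can serve them later.
            context.Images.InsertOnSubmit( new Image { Content = bytes, ContentType = file.ContentType } );
            context.SubmitChanges();
        }
    }

    // Redirect wherever makes sense for your admin UI.
    return RedirectToAction( "Index" );
}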
An alternative would be to incorporate your administrative interface in an area of the same application rather than in a separate project. This way you could reuse the content/images directory if you wanted. I find that when you have dynamic images the database or a hybrid approach works better from a programming perspective since it's more consistent with the rest of your data model.
You could try it like this:
Let's assume that all of your images are in Project A and you want to use the same images in Project B.
Open Project B with Visual Studio. In the Solution Explorer, right click your Project Name and select "Add Existing Item...".
Browse to the physical location on disc where your images in Project A are stored and select the files that you want to import.
You'll then be able to access those images from project A in Project B.
