Drupal Rules PHP action

I am experimenting with the Rules module for the first time and am attempting to redirect my users with some simple PHP code as below:
drupal_set_message('testing');
drupal_goto('node/3');
The first line of code executes but the second, which should direct my users to node/3, does not have the desired effect.
How can I get this redirecting feature working?

It's most likely because you have ?destination=some/path in the page URL; these lines in drupal_goto() cause any path you pass to the function to be overwritten by whatever is in the URL:
if (isset($_GET['destination']) && !url_is_external($_GET['destination'])) {
  $destination = drupal_parse_url($_GET['destination']);
  $path = $destination['path'];
  // ...
You can probably get around it simply by changing your code to this:
if (isset($_GET['destination'])) {
  unset($_GET['destination']);
}
drupal_goto('node/3');
If that doesn't work try adding this line before drupal_goto:
drupal_static_reset('drupal_get_destination');
which will reset the static cache for the drupal_get_destination() function, which also gets involved in this process at some point (I think).
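Putting the first two suggestions together, the PHP action could look something like this (just a sketch combining the snippets above, untested inside Rules):
// Drop any ?destination override from the URL, clear the cached destination,
// then redirect.
if (isset($_GET['destination'])) {
  unset($_GET['destination']);
}
drupal_static_reset('drupal_get_destination');
drupal_goto('node/3');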
If all else fails, go old school:
$path = 'node/3';
$options = array('absolute' => TRUE);
$url = url($path, $options);
$http_response_code = 302;
header('Location: ' . $url, TRUE, $http_response_code);
drupal_exit($url);
That's pretty much nicked straight from the drupal_goto() function itself and will definitely work.

Related

WordPress Cron - Call External API - Save JSON File

I thought I would reach out to get some guidance on a little thing I am working on.
What I would like to do within WordPress:
Call an external API (with a token header)
Get the results of the API and save them to a file in wpallimport's upload folder
I would assume I can just make a simple WP plugin and, within the 'activate' hook for the plugin:
create a WP-Cron event (as I would like it to run every day) for the following:
$url = 'the-api-url';
$data = wp_remote_get( $url,
    array( 'headers' => array( 'Token' => 'tokenkey' ) )
);
$jsonfile = $data['body'];

global $wp_filesystem;
if ( empty( $wp_filesystem ) ) {
    require_once( ABSPATH . '/wp-admin/includes/file.php' );
    WP_Filesystem();
}

$file = '/wp-content/uploads/wpallimport/files/JSONFILE.JSON';
$wp_filesystem->put_contents( $file, $jsonfile );
However, I am not having success with the above (with the correct API URL and token etc., obviously).
Thanks in advance!
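For reference, a minimal sketch of how the activation hook, the daily schedule, and the fetch-and-save step are usually wired together. The 'my_daily_api_fetch' hook name and the absolute target path built from wp_upload_dir() are illustrative assumptions; the URL, token, and file name come from the snippet above:
// Schedule the daily event when the plugin is activated (illustrative hook name).
register_activation_hook( __FILE__, function () {
    if ( ! wp_next_scheduled( 'my_daily_api_fetch' ) ) {
        wp_schedule_event( time(), 'daily', 'my_daily_api_fetch' );
    }
} );

// Clean up the schedule when the plugin is deactivated.
register_deactivation_hook( __FILE__, function () {
    wp_clear_scheduled_hook( 'my_daily_api_fetch' );
} );

// The callback WP-Cron runs once a day.
add_action( 'my_daily_api_fetch', function () {
    $response = wp_remote_get( 'the-api-url', array(
        'headers' => array( 'Token' => 'tokenkey' ),
    ) );

    if ( is_wp_error( $response ) ) {
        return; // bail out on request errors instead of writing a broken file
    }

    $json = wp_remote_retrieve_body( $response );

    // Build an absolute path to wpallimport's upload folder (assumption: uploads basedir + subfolder).
    $upload_dir = wp_upload_dir();
    $target     = trailingslashit( $upload_dir['basedir'] ) . 'wpallimport/files/JSONFILE.JSON';

    global $wp_filesystem;
    if ( empty( $wp_filesystem ) ) {
        require_once ABSPATH . 'wp-admin/includes/file.php';
        WP_Filesystem();
    }
    $wp_filesystem->put_contents( $target, $json, FS_CHMOD_FILE );
} );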

Fetching the content of a URL gets stuck when using Cro or HTTP::UserAgent

I want to get the content of https://translate.google.cn; however, Cro::HTTP::Client and HTTP::UserAgent just get stuck, while WWW gets the content. I don't know why.
If I change the $url to https://perl6.org, all three modules work fine:
my $url = "https://translate.google.cn";

use Cro::HTTP::Client;
my $resp = await Cro::HTTP::Client.new(
    headers => [
        User-agent => 'Cro'
    ]
).get($url);
say await $resp.body-text();

use HTTP::UserAgent;
my $ua = HTTP::UserAgent.new;
$ua.timeout = 30;
my $response = $ua.get($url);
if $response.is-success {
    say $response.content;
} else {
    die $response.status-line;
}

use WWW;
say get($url);
Did I miss something? Thanks for any suggestions.
For me HTTP::UserAgent works and Cro::HTTP::Client gets stuck. If you wish to debug things further, both modules have a debug option:
perl6 -MHTTP::UserAgent -e 'my $ua = HTTP::UserAgent.new(:debug); say $ua.get("https://translate.google.cn").content'
CRO_TRACE=1 perl6 -MCro::HTTP::Client -e 'my $ua = Cro::HTTP::Client.new(); say $ua.get("https://translate.google.cn").result.body-text.result'
WWW also works for me. It is surprising that it works for you, since it is backed by HTTP::UserAgent (which does not work for you). Here is its get method, to show you how it uses HTTP::UserAgent:
sub get ($url, *%headers) is export(:DEFAULT, :extras) {
    CATCH { .fail }
    %headers<User-Agent> //= 'Rakudo WWW';
    with HTTP::UserAgent.new.get: $url, |%headers {
        .is-success or fail .&err;
        .decoded-content
    }
}
This could be down to HTTP/2 on the problematic HTTPS sites. In fact, what you are describing is pretty much what I raised in https://github.com/croservices/cro-http/issues/45.
A workaround until a fix is in is to try making requests using HTTP/1.1:
Cro::HTTP::Client.get('https://translate.google.cn', :http<1.1>);

Virtual Filesystem for PHPUnit tests in Laravel 5.4

I'm having a bit of a problem with my PHPUnit integration tests. I have a method which handles a form upload for a video file as well as a preview image for that video.
public function store($request)
{
    /** @var Video $resource */
    $resource = new $this->model;

    // Create a new Content before creating the related Photo
    $contentRepo = new ContentRepository();
    $content = $contentRepo->store($request);

    if ($content->isValid()) {
        $resource->content_id = $content->id;

        $directory = 'frontend/videos/assets/'.date("Y").'/'.date('m').'/'.time();
        \File::makeDirectory($directory, 0755, true);

        $request->video->move($directory.'/', $request->video->getClientOriginalName());
        $resource->video = '/'.$directory.'/'.$request->video->getClientOriginalName();

        $request->preview_image->move($directory.'/', $request->preview_image->getClientOriginalName());
        $resource->preview_image = '/'.$directory.'/'.$request->preview_image->getClientOriginalName();

        $resource->highlighted = intval($request->input('highlighted') == 'on');
        $resource->save();

        return $resource;
    }
    else {
        return $content;
    }
}
The important part is the $request->video->move() call, which I probably need to replace in order to use a virtual filesystem.
And then the test:
public function testVideoUpload()
{
    File::put(__DIR__.'/frontend/videos/assets/image.mp4', 'test');
    $file = new UploadedFile(__DIR__.'/frontend/videos/assets/image.mp4', 'foofile.mp4', 'video/mp4', 100023, null, $test = true);

    File::put(__DIR__.'/frontend/images/assets/image.jpg', 'test');
    $preview = new UploadedFile(__DIR__.'/frontend/images/assets/image.jpg', 'foofile.jpg', 'image/jpeg', 100023, null, $test = true);

    $this->post('/admin/videos', [
        'title' => 'My Video #12',
        'description' => 'This is a description',
        'actors' => [$this->actor->id, $this->actor2->id],
        'scenes' => [$this->scene->id, $this->scene2->id],
        'payment_methods' => [$this->paymentMethod->id],
        'video' => $file,
        'preview_image' => $preview
    ])->seeInDatabase('contents', [
        'title' => 'My Video #12',
        'description' => 'This is a description'
    ]);
}
As you can see, I need to create a dummy file in a local directory and then use that in the HTTP request to the form's endpoint; after that, the file gets moved and I need to delete the created folder and the newly moved file... it's a genuine mess.
As such, I want to use a virtual filesystem instead, but I have no idea how to set it up in this particular case. I've already downloaded a package and set it up, but the questions are: first, which package have you used/would you recommend, and how would you tweak the class and the test to support the virtual filesystem? Would I need to switch over to using the Storage facade instead of the $request->video->move() call? If so, how would that be done exactly?
Thank you in advance for your help.
I couldn't figure out the VFS approach; however, I do have an alternative that's still somewhat messy but gets the job done.
Basically, I set up two methods on my PHPUnit base class to set up and tear down the temp folders I need in any test that requires them. Because I'm using database transactions, the files get deleted on every test run and I need to create new dummy files every time I run the test.
So I have two methods, setupTempDirectories and teardownTempDirectories, which I call at the beginning and at the end of each test that requires those temporary directories (a sketch of both follows below).
I put my temp files in the storage directory because sometimes I run my tests individually through PhpStorm and __DIR__ gets messed up and points to different directories when I do that. I also tried __FILE__ with the same result, so I just resorted to using Laravel's storage_path() instead, and that works fine.
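A sketch of those two helpers, assuming the same asset paths used in the test above (the exact directory names are illustrative):
// Create the temp asset folders under storage/ before a test that needs them.
protected function setupTempDirectories()
{
    File::makeDirectory(storage_path('frontend/videos/assets'), 0755, true, true);
    File::makeDirectory(storage_path('frontend/images/assets'), 0755, true, true);
}

// Remove everything that was created once the test is done.
protected function teardownTempDirectories()
{
    File::deleteDirectory(storage_path('frontend'));
}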
Then that leaves the problem of my concrete class, which tries to move files around and create directories in the public folder for them... so in order to fix that I changed the code to use the Storage facade, and then I mock the Storage facade in my tests.
So in my concrete class:
$directory = 'frontend/videos/assets/'.date("Y").'/'.date('m').'/'.time();
Storage::makeDirectory($directory, 0755, true);
Storage::move($request->video, $directory . '/' . $request->video->getClientOriginalName());
$resource->video = '/'.$directory.'/'.$request->video->getClientOriginalName();
Storage::move($request->preview_image, $directory . '/' . $request->preview_image->getClientOriginalName());
$resource->preview_image = '/'.$directory.'/'.$request->preview_image->getClientOriginalName();
And then in my test I mock both the makeDirectory and move methods like so:
// Override the Storage facade with a Mock version so that we don't actually try to move files around...
Storage::shouldReceive('makeDirectory')->once()->andReturn(true);
Storage::shouldReceive('move')->twice()->andReturn(true);
That makes my tests work and doesn't actually leave files behind after they're done... I hope someone has a better solution, but for the time being this is what I came up with.
I was actually trying to use VFS but it never worked out... I kept getting errors that the original file in the storage directory was not found even though it's right there...
I'm not even sure the Storage facade was using VFS in the background to begin with, even though it should...
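If your 5.4 install ships Storage::fake() and UploadedFile::fake() (both landed during the 5.4 release cycle), a mock-free sketch of the test could look roughly like this. It assumes store() has been reworked to write the uploaded files through the Storage facade's 'local' disk (for example with Storage::putFileAs() rather than move()), and that 'local' is the default disk name:
public function testVideoUploadOnFakeDisk()
{
    // Swap the 'local' disk for a throwaway one for the duration of this test.
    Storage::fake('local');

    $video   = UploadedFile::fake()->create('foofile.mp4', 100);
    $preview = UploadedFile::fake()->image('foofile.jpg');

    $this->post('/admin/videos', [
        'title'           => 'My Video #12',
        'description'     => 'This is a description',
        'actors'          => [$this->actor->id, $this->actor2->id],
        'scenes'          => [$this->scene->id, $this->scene2->id],
        'payment_methods' => [$this->paymentMethod->id],
        'video'           => $video,
        'preview_image'   => $preview,
    ]);

    // store() writes under frontend/videos/assets/<Y>/<m>/<time>, so just
    // assert that something ended up beneath that prefix on the fake disk.
    $this->assertNotEmpty(Storage::disk('local')->allFiles('frontend/videos/assets'));
}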

How to integrate Dropzonejs with the WordPress media handler on the front end?

How can I integrate the Dropzonejs file uploader library in the WordPress front end, just like the built-in one, and have the uploads available in my media library?
Dropzonejs is a very extensive JavaScript library that provides a lot of options to handle media uploading.
To integrate Dropzonejs with WordPress, the process is pretty straightforward. Assume the following piece of code is where you want your uploader to appear.
<div id="media-uploader" class="dropzone"></div>
<input type="hidden" name="media-ids" id="media-ids" value="">
Having the class dropzone will automatically attach Dropzone to the element, which would stop us from overriding the default parameters. So we would like to disable the library's auto-discover feature.
// Disabling autoDiscover, otherwise Dropzone will try to attach twice.
Dropzone.autoDiscover = false;
Now we will use jQuery to bind our configuration with the element.
jQuery("#media-uploader").dropzone({
url: dropParam.upload,
acceptedFiles: 'image/*',
success: function (file, response) {
file.previewElement.classList.add("dz-success");
file['attachment_id'] = response; // push the id for future reference
var ids = jQuery('#media-ids').val() + ',' + response;
jQuery('#media-ids').val(ids);
},
error: function (file, response) {
file.previewElement.classList.add("dz-error");
},
// update the following section is for removing image from library
addRemoveLinks: true,
removedfile: function(file) {
var attachment_id = file.attachment_id;
jQuery.ajax({
type: 'POST',
url: dropParam.delete,
data: {
media_id : attachment_id
}
});
var _ref;
return (_ref = file.previewElement) != null ? _ref.parentNode.removeChild(file.previewElement) : void 0;
}
});
In the code above, we attached Dropzone to our element with some parameters-
url - the location where we want to send our files for upload. I'll initialize the dropParam variable later.
acceptedFiles - since we are only interested in uploading images, we limit the attached files to images. You can find out more on the library's website.
success - a callback that is fired when the file/image is uploaded successfully. It accepts two parameters: the reference to the uploaded file itself and the response from the server. This is very important; here we store the attachment id in our form. You can perform validation here prior to storing the id.
error - if the file fails to upload, you can perform any task here.
addRemoveLinks - adds the remove-file link below the preview panel; you can style it with your CSS.
removedfile - handles the operation when you click the remove-file link for an image in the preview panel. In this function we send an AJAX call to our server to remove the image from the library.
Of course there are a lot of options available, but I found these are the most basic parameters required to set up my drag-and-drop media uploader.
Now the most important thing is to decide on the file uploader URL. You could have a custom file where you process the operation, but I found another way.
From this question and its answer I found that using the admin-post.php file is pretty amazing.
Many people complained about admin-post.php, so I think sticking to admin-ajax.php is the best option.
So I initialized the dropParam variable prior to my Dropzone initialization as follows-
wp_enqueue_script('dropzone', 'path/to/dropzone', array('jquery'));
wp_enqueue_script('my-script', 'path/to/script', array('jquery', 'dropzone'));

$drop_param = array(
    'upload' => admin_url( 'admin-ajax.php?action=handle_dropped_media' ),
    'delete' => admin_url( 'admin-ajax.php?action=handle_deleted_media' ),
);

wp_localize_script('my-script', 'dropParam', $drop_param);
Now we are ready to send our images to the server. Here we will add some PHP code, either in the theme's functions.php file or in our plugin file, but we need to be sure it is loaded.
The following function will take care of uploading the image and saving it as an attachment in the library.
add_action( 'wp_ajax_handle_dropped_media', 'handle_dropped_media' );
// if you want to allow your visitors of your website to upload files, be cautious.
add_action( 'wp_ajax_nopriv_handle_dropped_media', 'handle_dropped_media' );
function handle_dropped_media() {
    status_header(200);

    $upload_dir = wp_upload_dir();
    $upload_path = $upload_dir['path'] . DIRECTORY_SEPARATOR;
    $num_files = count($_FILES['file']['tmp_name']);

    $newupload = 0;

    if ( !empty($_FILES) ) {
        $files = $_FILES;
        foreach ($files as $file) {
            $newfile = array(
                'name'     => $file['name'],
                'type'     => $file['type'],
                'tmp_name' => $file['tmp_name'],
                'error'    => $file['error'],
                'size'     => $file['size']
            );

            $_FILES = array('upload' => $newfile);
            foreach ($_FILES as $file => $array) {
                $newupload = media_handle_upload( $file, 0 );
            }
        }
    }

    echo $newupload;
    die();
}
The following action takes care of deleting the media element. The second parameter of the wp_delete_attachment() function lets us decide whether we want to trash the image or delete it completely. I wanted to delete it completely, so I passed true.
add_action( 'wp_ajax_handle_deleted_media', 'handle_deleted_media' );
function handle_deleted_media() {
    if ( isset($_REQUEST['media_id']) ) {
        $post_id = absint( $_REQUEST['media_id'] );
        $status = wp_delete_attachment($post_id, true);

        if ( $status ) {
            echo json_encode(array('status' => 'OK'));
        } else {
            echo json_encode(array('status' => 'FAILED'));
        }
    }
    die();
}
This will return the attachment_id in the response, and we'll get it in the success callback. In media_handle_upload( $file, 0 ); I passed the reference to the file and a 0 because I didn't want to attach the media to any post yet (0 for no post; if you want to attach it, pass the post ID here. More reference in the Codex.)
This is all for uploading media in WordPress.
Note: I haven't completed the removing uploaded file part. I'll complete this in a moment.
UPDATE
The post is updated. Now we can remove uploaded media elements from the uploader container. Thanks to this question and its answer, I could figure out the actual process.
For those who are having problems getting this to work for non-admin users: please use admin-ajax.php instead of admin-post.php.
I had faced a strange issue where admin-post.php would work for non-admin users on my local server, but my live server refused to let non-admins upload files; PHP would echo the entire page instead of the echoed value.
I replaced admin-post.php with admin-ajax.php and uploads work perfectly.
I hope this helps.
The solution added to this post is incorrect unless I've misunderstood the question. Basically, the solution won't work for anyone who isn't logged in as an admin. It took me 30 minutes to work that out; plus, the solution for removing images doesn't delete them from the media library.

Must manually load PHP ActiveRecord models

I'm drawing a blank. I have code working locally (on MAMP). When moving to an nginx Ubuntu box (running PHP-FPM), for some reason PHP ActiveRecord is acting up.
I finally traced it down to this: I have to load all of my model classes manually. If I add a require_once() underneath my code, then it works fine. If I don't, then I get errors like:
PHP Fatal Error: Class not found ... on the models I've created.
Does anyone have ANY idea what direction I could troubleshoot this in? I checked permissions on the models folder (which is not in the public root), echoed out the path that is sent to cfg->set_model_directory to confirm it is correct, etc.
Does this sound like an nginx or PHP thing? I'm guessing nginx, since this works on my MAMP setup.
Doesn't work:
ActiveRecord\Config::initialize(
    function ($cfg) {
        $cfg->set_model_directory(BASE_PATH . '/models');
        $cfg->set_connections(
            array(
                'development' => 'mysql://blah:removed@localhost/com_dbname'
            )
        );
    }
);
Works:
ActiveRecord\Config::initialize(
    function ($cfg) {
        $cfg->set_model_directory(BASE_PATH . '/models');
        $cfg->set_connections(
            array(
                'development' => 'mysql://blah:removed@localhost/com_dbname'
            )
        );
    }
);
require_once(BASE_PATH . '/models/model1.php');
require_once(BASE_PATH . '/models/model2.php');
Update
Adding in actual code to help identify issue:
require_once('../lib/php-activerecord/ActiveRecord.php');

ActiveRecord\Config::initialize(
    function ($cfg) {
        $cfg->set_model_directory('/var/www/uc1/models');
        $cfg->set_connections(
            array(
                'development' => 'mysql://test_usr:test_pwd@localhost/test_db'
            )
        );
    }
);

// Name of model file. Must manually include this to get it to work on my nginx server.
require_once('/var/www/uc1/models/ucurls.php');

$_record = UCUrls::find_by_urlkey('example.com/123');
echo "urlkey=" . $_record->urlkey;
I solved this issue on Windows by adding a line in the file ActiveRecord.php, in the function activerecord_autoload($class_name), at around line 39 or 40:
$file = "$root/$class_name.php";
//add this line
$file = strtolower($file);
Set a trace in ActiveRecord.php to see where ActiveRecord is searching for models.
But I think your issue is with the filesystem - Mac OS X by default uses a case-insensitive filesystem, while Ubuntu's is case-sensitive.
So your model UCUrls should be in the file /var/www/uc1/models/UCUrls.php, not in /var/www/uc1/models/ucurls.php.
