Must manually load PHP ActiveRecord models - nginx

I'm drawing a blank. I have code working locally (under MAMP). When moving to an nginx Ubuntu box running php-fpm, php-activerecord is acting up for some reason.
I finally traced it down to this: I have to load all of my model classes manually. If I add a require_once() for each model underneath my configuration code, everything works fine. If I don't, I get errors like:
PHP Fatal Error: Class not found ... on the models I've created.
Does anyone have ANY idea what direction I could troubleshoot this in? I checked permissions on the models folder (which is not in the public root), echoed out the path passed to $cfg->set_model_directory() to confirm it is correct, etc.
Does this sound like an nginx or a PHP thing? I'm guessing nginx, since this works under MAMP.
Doesn't work:
ActiveRecord\Config::initialize(
    function ($cfg) {
        $cfg->set_model_directory(BASE_PATH . '/models');
        $cfg->set_connections(
            array(
                'development' => 'mysql://blah:removed@localhost/com_dbname'
            )
        );
    }
);
Works:
ActiveRecord\Config::initialize(
    function ($cfg) {
        $cfg->set_model_directory(BASE_PATH . '/models');
        $cfg->set_connections(
            array(
                'development' => 'mysql://blah:removed@localhost/com_dbname'
            )
        );
    }
);
require_once(BASE_PATH . '/models/model1.php');
require_once(BASE_PATH . '/models/model2.php');
Update
Adding the actual code to help identify the issue:
require_once('../lib/php-activerecord/ActiveRecord.php');

ActiveRecord\Config::initialize(
    function ($cfg) {
        $cfg->set_model_directory('/var/www/uc1/models');
        $cfg->set_connections(
            array(
                'development' => 'mysql://test_usr:test_pwd@localhost/test_db'
            )
        );
    }
);

require_once('/var/www/uc1/models/ucurls.php'); // Name of the model file. I must manually include it to get this to work on my nginx server.

$_record = UCUrls::find_by_urlkey('example.com/123');
echo "urlkey=" . $_record->urlkey;

I solved this issue on Windows by adding a line in the file ActiveRecord.php, in the function activerecord_autoload($class_name), around line 39 or 40:
$file = "$root/$class_name.php";
// add this line:
$file = strtolower($file);

Set a trace in ActiveRecord.php to see where ActiveRecord is searching for models.
But I think your issue is the filesystem: Mac OS X uses a case-insensitive filesystem by default, while Ubuntu's is case-sensitive.
So your model UCUrls should be in the file /var/www/uc1/models/UCUrls.php, not in /var/www/uc1/models/ucurls.php.
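You can confirm the mismatch with a quick check on the Ubuntu box (a sketch; the paths are the ones from the question, and the first path is what the autoloader builds from the class name per the $file = "$root/$class_name.php" line above):
<?php
// On a case-sensitive filesystem only the exact casing matches.
var_dump(file_exists('/var/www/uc1/models/UCUrls.php')); // false if the file on disk is lowercase
var_dump(file_exists('/var/www/uc1/models/ucurls.php')); // true: what is actually on disk
Renaming the model file so its case matches the class name (UCUrls.php) should let the autoloader find it without the manual require_once().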

Related

Wordpress Cron - Call External API - Save JSON File

I thought I would reach out to get some guidance on a little thing I am working on.
What I would like to do within WordPress:
Call an external API (with a token header)
Get the results of the API and save them into a file in wpallimport's upload folder
I would assume I can just make a simple WP plugin and, within the plugin's 'activate' hook,
create a wp-cron event (as I would like it to run every day) for the following:
$url = 'the-api-url';
$data = wp_remote_get( $url,
    array( 'headers' => array( 'Token' => 'tokenkey' ) )
);
$jsonfile = $data['body'];

global $wp_filesystem;
if ( empty( $wp_filesystem ) ) {
    require_once( ABSPATH . '/wp-admin/includes/file.php' );
    WP_Filesystem();
}

$file = '/wp-content/uploads/wpallimport/files/JSONFILE.JSON';
$wp_filesystem->put_contents( $file, $jsonfile );
However, I am not having success with the above (with the correct API URL and token etc., obviously).
Thanks in advance!
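For what it's worth, here is a minimal sketch of how the scheduling piece is usually wired up in a plugin file. The hook name my_api_fetch_daily is a placeholder, and the fetch body just reuses the snippet from the question, with the uploads path resolved via wp_upload_dir() instead of being hard-coded:
// Hypothetical plugin wiring; the hook name and callback are placeholders.
register_activation_hook( __FILE__, function () {
    if ( ! wp_next_scheduled( 'my_api_fetch_daily' ) ) {
        wp_schedule_event( time(), 'daily', 'my_api_fetch_daily' );
    }
} );

register_deactivation_hook( __FILE__, function () {
    wp_clear_scheduled_hook( 'my_api_fetch_daily' );
} );

add_action( 'my_api_fetch_daily', function () {
    $data = wp_remote_get( 'the-api-url', array( 'headers' => array( 'Token' => 'tokenkey' ) ) );
    if ( is_wp_error( $data ) ) {
        return; // bail out quietly if the request failed
    }

    global $wp_filesystem;
    if ( empty( $wp_filesystem ) ) {
        require_once ABSPATH . 'wp-admin/includes/file.php';
        WP_Filesystem();
    }

    $upload = wp_upload_dir();
    $file   = $upload['basedir'] . '/wpallimport/files/JSONFILE.JSON';
    $wp_filesystem->put_contents( $file, wp_remote_retrieve_body( $data ) );
} );
Note that WP-Cron only fires when the site receives traffic, so a 'daily' event may run late on a quiet site.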

wp_enqueue_style seems to fail for local resources

Or it behaves in a different way, but I don't get it :(.
Let's see, I have this:
$stylesheets_path = get_template_directory_uri() . '/core/css/';
$test_url = @fopen( 'https://maxcdn.bootstrapcdn.com/font-awesome/4.7.0/css/font-awesome.min.css', 'r' );

if ( $test_url !== false ) {
    echo "REMOTE:: $stylesheets_path";
    wp_enqueue_style( 'font-awesome', '//maxcdn.bootstrapcdn.com/font-awesome/4.7.0/css/font-awesome.min.css', false );
} else {
    echo "LOCAL:: {$stylesheets_path}font-awesome.min.css";
    wp_enqueue_style( 'font-awesome', $stylesheets_path . 'font-awesome.min.css', false );
}
The point is, no matter how hard I try to make the remote option fail, or simply use the local one directly, the local one never loads at all :(.
Whenever I try to load only the local option, removing all the code above and leaving just
$stylesheets_path = get_template_directory_uri().'/core/css/';
wp_enqueue_style('font-awesome', $stylesheets_path . 'font-awesome.min.css', false);
it is useless and the CSS never loads. However, when I echo $stylesheets_path . 'font-awesome.min.css' it shows a valid URL where the resource is indeed located, and it loads properly if I browse to it.
The remote option loads seamlessly, but the local one never does!
Going nuts :(
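For reference, a minimal sketch of the usual setup, assuming the snippet above is not already running inside the wp_enqueue_scripts hook from functions.php (a common reason a locally enqueued stylesheet never reaches the page):
// Hypothetical functions.php wiring; the enqueue call is the one from the question.
add_action( 'wp_enqueue_scripts', function () {
    $stylesheets_path = get_template_directory_uri() . '/core/css/';
    wp_enqueue_style( 'font-awesome', $stylesheets_path . 'font-awesome.min.css', false );
} );
Another thing worth checking is whether some other plugin or the parent theme has already registered a stylesheet under the same 'font-awesome' handle; WordPress keeps the first registration for a given handle and ignores the src of later calls.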

Virtual Filesystem for PHPUnit tests in Laravel 5.4

I'm having a bit of a problem with my PHPUnit integration tests. I have a method which handles a form upload for a video file as well as a preview image for that video.
public function store($request)
{
    /** @var Video $resource */
    $resource = new $this->model;

    // Create a new Content before creating the related Photo
    $contentRepo = new ContentRepository();
    $content = $contentRepo->store($request);

    if ($content->isValid()) {
        $resource->content_id = $content->id;

        $directory = 'frontend/videos/assets/' . date("Y") . '/' . date('m') . '/' . time();
        \File::makeDirectory($directory, 0755, true);

        $request->video->move($directory . '/', $request->video->getClientOriginalName());
        $resource->video = '/' . $directory . '/' . $request->video->getClientOriginalName();

        $request->preview_image->move($directory . '/', $request->preview_image->getClientOriginalName());
        $resource->preview_image = '/' . $directory . '/' . $request->preview_image->getClientOriginalName();

        $resource->highlighted = intval($request->input('highlighted') == 'on');
        $resource->save();

        return $resource;
    } else {
        return $content;
    }
}
The important part is the $request->video->move() call, which I probably need to replace in order to use a virtual filesystem.
And then the test:
public function testVideoUpload()
{
    File::put(__DIR__ . '/frontend/videos/assets/image.mp4', 'test');
    $file = new UploadedFile(__DIR__ . '/frontend/videos/assets/image.mp4', 'foofile.mp4', 'video/mp4', 100023, null, $test = true);

    File::put(__DIR__ . '/frontend/images/assets/image.jpg', 'test');
    $preview = new UploadedFile(__DIR__ . '/frontend/images/assets/image.jpg', 'foofile.jpg', 'image/jpeg', 100023, null, $test = true);

    $this->post('/admin/videos', [
        'title' => 'My Video #12',
        'description' => 'This is a description',
        'actors' => [$this->actor->id, $this->actor2->id],
        'scenes' => [$this->scene->id, $this->scene2->id],
        'payment_methods' => [$this->paymentMethod->id],
        'video' => $file,
        'preview_image' => $preview
    ])->seeInDatabase('contents', [
        'title' => 'My Video #12',
        'description' => 'This is a description'
    ]);
}
As you can see, I need to create a dummy file in a local directory and then use it in the HTTP request to the form's endpoint; after that, the file gets moved and I need to delete the created folder and the newly moved file... it's an authentic mess.
As such, I want to use a virtual filesystem instead, but I have no idea how to set it up in this particular case. I've already downloaded a package and set it up, but the questions are: which package have you used or would recommend, and how would you tweak the class and the test to support the virtual filesystem? Would I need to switch over to using the Storage facade instead of the $request->video->move() call? If so, how would that be done exactly?
Thank you in advance for your help
I couldn't figure out the VFS approach; however, I do have an alternative that's still somewhat messy but gets the job done.
Basically, I set up two methods on my PHPUnit base class to set up and tear down the temp folders I need for any test that requires them. Because I'm using database transactions, the files get deleted on every test run, so I need to create new dummy files every time I run the test.
So I have two methods, setupTempDirectories and teardownTempDirectories, which I call at the beginning and at the end of each test that requires those temporary directories.
I put my temp files in the storage directory because sometimes I run my tests individually through PhpStorm, and __DIR__ then gets messed up and points to different directories; I also tried __FILE__ with the same result, so I just resorted to using Laravel's storage_path() instead, and that works fine.
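The two helpers are not shown in the original answer; a minimal sketch of what they might look like on the base test class, assuming the File facade alias and a throwaway storage/testing folder (both the folder and the exact subpaths are placeholders):
// Hypothetical helpers on the base TestCase class.
protected function setupTempDirectories()
{
    // Create the temp folders the upload test expects before the test runs.
    \File::makeDirectory(storage_path('testing/frontend/videos/assets'), 0755, true, true);
    \File::makeDirectory(storage_path('testing/frontend/images/assets'), 0755, true, true);
}

protected function teardownTempDirectories()
{
    // Throw the whole temp tree away once the test is done.
    \File::deleteDirectory(storage_path('testing'));
}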
That leaves the problem of my concrete class, which tries to move files around and create directories for them in the public folder... so in order to fix that I changed the code to use the Storage facade, and then I mock the Storage facade in my tests.
So in my concrete class:
$directory = 'frontend/videos/assets/' . date("Y") . '/' . date('m') . '/' . time();
Storage::makeDirectory($directory, 0755, true);

Storage::move($request->video, $directory . '/' . $request->video->getClientOriginalName());
$resource->video = '/' . $directory . '/' . $request->video->getClientOriginalName();

Storage::move($request->preview_image, $directory . '/' . $request->preview_image->getClientOriginalName());
$resource->preview_image = '/' . $directory . '/' . $request->preview_image->getClientOriginalName();
And then in my test I mock both the makeDirectory and move methods, like so:
// Override the Storage facade with a Mock version so that we don't actually try to move files around...
Storage::shouldReceive('makeDirectory')->once()->andReturn(true);
Storage::shouldReceive('move')->twice()->andReturn(true);
That makes my tests pass and does not actually leave files behind after they're done... I hope someone has a better solution, but for the time being this is what I came up with.
I was actually trying to use VFS, but it never worked out... I kept getting errors that the original file in the storage directory could not be found even though it was right there...
I'm not even sure the Storage facade was using VFS in the background to begin with, even though it should...

Running more than one console command in a controller in Symfony 2

First of all, I would like to thank you all for looking at my question. Here it is.
I want to run three existing console commands in Symfony 2 from a controller, so I wrote three separate functions for that (see below). I managed to run 'doctrine:mapping:import' (code below) inside a controller without any issue. The next thing I wanted to do was generate entities based on the imported mapping files. I could not run the 'doctrine:generate:entities' command without shutting down and booting the kernel (which I think is a bad idea; see the code below). Without shutting down and booting the kernel it won't generate the entities for me, but after doing so it creates them (I am somewhat happy now). The next problem comes when I run the 'doctrine:generate:form' command (code below). When I run it right after generating the entities, it says 'Class 'THE NAME OF MY CLASS' does not exist'. That shouldn't happen, because I am running the form-generation command after generating the entities. I even checked whether the class is actually there by opening the file directly, and it is. So I am totally stuck.
Well, I know it's a lengthy question. If someone can tell me what's causing 'doctrine:generate:entities' not to create entities without shutting down and booting the kernel, and the form-builder command not to work even though the entity files are there, that would be really appreciated. One thing I noticed: these commands (the three functions) work fine when I run them one at a time, but I want to call the three functions one after another, and the problem mainly occurs when I call them sequentially.
Code for 'doctrine:mapping:import':
public function executeImportCommandAction($filter)
{
    $kernel = $this->container->get('kernel');
    $app = new Application($kernel);
    $app->setAutoExit(false);
    $input = new \Symfony\Component\Console\Input\ArrayInput(
        array('command' => 'doctrine:mapping:import', 'bundle' => 'TESTClientBundle',
              '--filter' => $filter, 'mapping-type' => 'yml'));
    $app->doRun($input, new \Symfony\Component\Console\Output\ConsoleOutput());
}
Code for 'doctrine:generate:entities':
public function executeGenerateEntitiesCommandAction($entity)
{
    $kernel = $this->container->get('kernel');
    $kernel->shutdown();
    $kernel->boot();
    $app = new Application($kernel);
    $app->setAutoExit(false);
    $input = new \Symfony\Component\Console\Input\ArrayInput(
        array('command' => 'doctrine:generate:entities', 'name' => 'TESTClientBundle',
              '--no-backup' => 'true'));
    $app->doRun($input, new \Symfony\Component\Console\Output\ConsoleOutput());
}
Code for 'doctrine:generate:form':
public function executeBuildFormCommandActions($entity)
{
    $kernel = $this->container->get('kernel');
    $app = new Application($kernel);
    $app->setAutoExit(false);
    $input = new \Symfony\Component\Console\Input\ArrayInput(
        array('command' => 'doctrine:generate:form', 'entity' => 'TESTVClientBundle:' . $entity));
    $app->doRun($input, new \Symfony\Component\Console\Output\ConsoleOutput());
}
Thanks a lot in advance.
Cheers!
Do you know there is a Process component? http://symfony.com/doc/current/components/process.html
You can easily run a Symfony command with it.
First of all, I need to thank @vincecore for giving me the heads-up regarding the Symfony Process component. I managed to work around the problem with the Process component and achieve what I wanted. I assume shutting down and booting the kernel is not a proper approach even though it works well, and in any case generating forms did not work even after dealing with the kernel. This is the code I found to work when running all of the 'doctrine:mapping:import', 'doctrine:generate:entities' and 'doctrine:generate:form' console commands inside the controller. For the sake of clarity, I'll illustrate the code related to generating the form.
public function executeBuildFormCommandActions($form_file)
{
    $move_to_project = 'C:/xampp5.5.11/htdocs/proj_test/';
    $commandline = "php app/console doctrine:generate:form TESTClientBundle:$form_file";

    $form_type_file = $this->get('kernel')->getRootDir() . DIRECTORY_SEPARATOR . '..' . DIRECTORY_SEPARATOR . 'src'
        . DIRECTORY_SEPARATOR . 'TEST' . DIRECTORY_SEPARATOR . 'ClientBundle' . DIRECTORY_SEPARATOR
        . 'Form' . DIRECTORY_SEPARATOR . $form_file . 'Type.php';

    if (is_file($form_type_file)) {
        unlink($form_type_file);
    }

    $process = new \Symfony\Component\Process\Process($commandline);
    $process->setWorkingDirectory($move_to_project);
    $process->run();

    try {
        if (!$process->isSuccessful()) {
            throw new \RuntimeException($process->getErrorOutput());
        }
        echo $process->getOutput() . '<hr/>';
    } catch (\RuntimeException $r) {
        echo $r->getMessage();
    }
}
The good thing about this approach (the Process component) is that you can execute the command exactly as you would from the command console. However, without setting the working directory to the project folder, it did not work at first; that's obvious in hindsight, because the command can't find 'app/console' anywhere outside the project folder. So I had to run the console commands from inside the project folder via $process->setWorkingDirectory($move_to_project). The other two functions (import and generate entities) follow the same pattern; only the commands and arguments change, as sketched below.
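For instance, a sketch of what the import step might look like with the same Process approach (the command string simply mirrors the ArrayInput arguments used earlier in the question, and the project path is the same placeholder as above):
// Hypothetical Process-based version of the mapping import.
public function executeImportCommandActions($filter)
{
    $move_to_project = 'C:/xampp5.5.11/htdocs/proj_test/';
    $commandline = "php app/console doctrine:mapping:import TESTClientBundle yml --filter=$filter";

    $process = new \Symfony\Component\Process\Process($commandline);
    $process->setWorkingDirectory($move_to_project);
    $process->run();

    if (!$process->isSuccessful()) {
        throw new \RuntimeException($process->getErrorOutput());
    }

    echo $process->getOutput();
}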
Hope this helps someone who has had no luck running more than one console command inside a Symfony 2 controller.
Cheers!

webdriver-test is unusable

On a virtual machine (a clean, fresh Ubuntu Server 11.04) I created a test website as described in Creating Your First Yii Application, and now I want to create a simple test using webdriver-test.
I set the proper TEST_BASE_URL in protected/tests/WebTestCase.php and created protected/tests/functional/MySimpleTest.php:
<?php
Yii::import( 'ext.webdriver-bindings.CWebDriverTestCase' );

class MySimpleTest extends CWebDriverTestCase {
    protected function setUp() {
        parent::setUp( '192.168.57.1', 4444, 'firefox' );
    }

    public function testMySite() {
        $this->get( TEST_BASE_URL );
        $qElem = $this->findElementBy( LocatorStrategy::linkText, 'Users' );
        $this->assertNotNull( $qElem, 'There is no "Users" link!' );
        $qElem->clickAndWait();
        $this->assertTrue( $this->isTextPresent( 'test1@example.com' ), 'There is no "test1@example.com" text on the result page!' );
    }
}
Running it looks like this:
etam@ubuntu:/var/www/test/protected/tests$ phpunit functional/MySimpleTest.php
PHPUnit 3.5.15 by Sebastian Bergmann.

E

Time: 5 seconds, Memory: 5.25Mb

There was 1 error:

1) MySimpleTest::testMySite
PHPUnit_Framework_Exception: setBrowserUrl() needs to be called before start().
/opt/yii-1.1.8.r3324/framework/test/CWebTestCase.php:61
/var/www/test/protected/extensions/webdriver-bindings/CWebDriverTestCase.php:156

FAILURES!
Tests: 1, Assertions: 0, Errors: 1.
Notice that it's complaining about setBrowserUrl() from PHPUnit_Extensions_SeleniumTestCase_Driver, which is not the same as the one from CWebDriverTestCase.
I tried to find out what's going on, but it's too complicated for me. It looks like a problem with the old and new Selenium APIs existing together, but I'm not sure about that.
I'm using:
Ubuntu server 11.04
yii 1.1.8.r3324
webdriver-test 1.1b
phpunit 3.5.15 (repaired as described in bugs.launchpad.net/ubuntu/+source/phpunit/+bug/701544)
Please help!
You need to call the setBrowserUrl() method right after the parent::setUp() call, because Selenium requires this URL to resolve relative paths in your test cases. That way you could call open('full.url.com/someAction') or just open('/someAction') and both would go to the same page.
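A minimal sketch of how that might look in the setUp() from the question, assuming TEST_BASE_URL defined in WebTestCase.php is the full base URL of the site:
protected function setUp() {
    parent::setUp( '192.168.57.1', 4444, 'firefox' );
    // Assumption: TEST_BASE_URL is defined in WebTestCase.php and points at the site root.
    $this->setBrowserUrl( TEST_BASE_URL );
}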
