Problems using cron to run an encrypted PHP file

I'm having problems using cron to run an encrypted PHP file. The file is encrypted using Source Guardian v8.0, and I can run it successfully when I call the script directly from the browser. The job is meant to run every 24 hours, and I'm testing by setting it to every 5 minutes so I can check it quickly. The schedule works fine, and I'm getting the emails at the address I specify in the job.
My emails contain this:
X-Powered-By: PHP/5.3.26
Content-type: text/html
but the script itself (which updates a DB table) is not running.
My web host (shared hosting) advised I use:
php -c /home/myUserName/public_html/path/to/my/daily/repeated/script.php
Source Guardian support suggests full paths and specifying the php.ini path, thus:
/usr/local/bin/php -c /usr/lib/php.ini -f /home/myUserName/public_html/path/to/my/daily/repeated/script.php
Previously I was using:
php /home/myUserName/public_html/path/to/my/daily/repeated/script.php
Nothing seems to run this script, with or without switches, with full paths or not...
Any suggestions appreciated.
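For reference, the pieces above combine into a crontab entry roughly like this (a sketch; the log redirection and its path are my additions, to capture whatever error cron itself sees):
# Sketch: every 5 minutes while testing, daily once it works;
# full paths throughout, and stdout/stderr captured to an assumed log path
*/5 * * * * /usr/local/bin/php -c /usr/lib/php.ini -f /home/myUserName/public_html/path/to/my/daily/repeated/script.php >> /home/myUserName/cron.log 2>&1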

Related

Specific folders of a Postman collection run successfully from the tool but throw a 401 error when run from Newman with the --folder option

The Postman collection file has the environment settings, globals, and data files configured, and it runs successfully from Postman.
From Newman:
newman run -e ..\..\Source\ABCDEF\test\Automation\API_Webservices\Environment\GHIJ_QA.postman_environment.json -g ..\..\Source\LMNOPQRSTUV\test\Automation\API_Webservices\Globals\A12345_4568.postman_globals.json --folder Folder1 -d ..\..\Source\LMNOPQRSTUV\test\Automation\API_Webservices\TestData\test.json ..\..\Source\LMNOPQRSTUV\test\Automation\API_Webservices\Collections\MicroServices\A12345_4568\MicroServices.postman_collection.json -r htmlextra --reporter-htmlextra-export ..\..\Source\LMNOPQRSTUV\test\Automation\API_Webservices\Output\Output_HTMLReport.html
expected response to have status code 200 but got 401
RESPONSE INFORMATION
Response Code: 401 - Unauthorized
The folder-level services need to be run either as a single folder or as multiple folders in sequence to check both simple and advanced flows.
Currently this works from the Postman tool only, not from the Newman CLI.
Should the collection variables used inside the scripts be made global to work in Newman?
Is that a possible issue? Please shed some light.
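One thing worth checking: exported environment and globals files contain only Postman's "initial values", not the "current values" set in the UI session, which is a common cause of auth failures that appear only under Newman. As for running several folders, recent Newman versions (5.x and later, I believe) accept --folder more than once; a sketch, with the long paths from the command above shortened and Folder2 as a hypothetical second folder:
# Sketch: Folder2 is hypothetical; paths shortened from the full command above
newman run MicroServices.postman_collection.json \
  -e GHIJ_QA.postman_environment.json \
  -g A12345_4568.postman_globals.json \
  -d test.json \
  --folder Folder1 --folder Folder2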

Enabling gzip for external HTTP requests

I have a local environment that is not automatically decompressing deflated responses. The staging server and the production server seem to be doing it automatically (the code we have up there works fine), but on my local machine the request body is still compressed. I had some success using:
gzinflate()
but I would rather find a solution where my local setup is simply closer to the production setup (so I don't have to change the code much).
P.S. This is a WordPress setup.
OK, I figured it out. It turns out the issue was that I had just updated my local environment to PHP 7 and enabled Xdebug. When doing this I also needed to install the packages for SOAP and cURL support (if someone would like to explain further, I would appreciate it). Here is the command I had to run:
sudo apt-get install php-soap php-curl
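As a quick sanity check afterwards (a sketch, assuming a Debian/Ubuntu-style PHP install), you can confirm the CLI now loads both extensions and restart the web server so the web SAPI picks them up too:
php -m | grep -Ei 'soap|curl'   # should now list both curl and soap
sudo service apache2 restart    # assumption: Apache; restart PHP-FPM instead if that is your stack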

Windows installation through an HTTP URL

We have created an HTTP URL and stored the Windows setup files there. Our requirement is to execute these setup files through HTTP.
We tried the command curl http://192.168.2.20/win2k12/setup.exe, but it only dumps the binary contents to the terminal as garbled characters.
Also, is it possible to execute the .exe files through wget?
Can anyone explain how to execute the setup.exe file from the command prompt?
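Note that HTTP on its own only serves the bytes; neither curl nor wget can execute an .exe in place. The client has to save the file to disk and then run it locally. A minimal sketch (the same curl/wget invocations work from a Unix shell or from a Windows command prompt with curl/wget installed):
# Save the binary to a file instead of dumping it to the terminal:
curl -o setup.exe http://192.168.2.20/win2k12/setup.exe
# wget can fetch it the same way (but cannot execute it either):
wget -O setup.exe http://192.168.2.20/win2k12/setup.exe
# Then launch the downloaded installer from the command prompt:
setup.exe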

Meteor application: how to get an access log / error log like the Apache server?

Is it possible to see an access log and an error log, as we do for an Apache server, to check what is going on with the Meteor server? Can I view them per domain or for the whole server?
If you are using meteor-up to deploy your applications onto your own server (which I recommend, and which should save you a lot of time in the long run), you can access your logs at /var/log/upstart/app.log.
This will even allow you to tail logs from the server in your local application directory by running:
mup logs -f # it supports all the options of tail
Also, if you want to reset your logs:
cd /var/log/upstart/
sudo sh -c 'cat /dev/null > app.log'  # the redirection must run under sudo, not in your own shell
Well, it depends on how fancy you want to get; it could be as simple as:
meteor > /var/log/meteor.log
which redirects the output to /var/log/meteor.log.
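To also capture error output and keep appending across restarts, a slightly fuller variant of the same idea (my sketch, not from the answer) is:
meteor >> /var/log/meteor.log 2>&1  # append both stdout and stderr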
If that doesn't float your boat, there are a bunch of pre-made packages for this; of those, this one looks like the most flexible:
https://atmospherejs.com/package/trail

Wget doesn't exit after copying files

On my Unix server, I execute this command in the shell to copy all the content from folderc.
wget -r -nH --accept=ismv,ismc,ism,jpg --cut-dirs=5 --level=0 --directory-prefix="/root/sstest" -o /root/sstest2.log http://site.com/foldera/folderb/folderc/
All the content from folderc is actually copied to /root/sstest.
However, wget does not exit after copying and return me to the command prompt.
What could be causing this behaviour?
I had the same problem, and I just added single quotes to the front and end of the URL.
This step resolved the issue for me.
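That is, the command from the question with the URL single-quoted:
# the command from the question, with the URL quoted
wget -r -nH --accept=ismv,ismc,ism,jpg --cut-dirs=5 --level=0 --directory-prefix="/root/sstest" -o /root/sstest2.log 'http://site.com/foldera/folderb/folderc/'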
It's possible that the HTTP server miscommunicates the length of a response, so that Wget keeps waiting for more data. It could be due to a bug in Wget or in the server (or a software component running on the server) which you don't notice in an interactive web browser.
To debug this, make sure you are running the latest version of Wget. If the problem persists, use the -d flag to collect the debug output, and send a report about the misbehavior to the Wget developers at bug-wget@gnu.org. Be sure to strip sensitive data, such as passwords or internal host names, from the report before sending it.
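For example (a sketch reusing the URL from the question; the log file path is an assumption), combining -d with -o writes the full debug trace to a file you can attach to the report:
# sketch: capture wget's debug trace to an assumed log path
wget -d -o /root/wget-debug.log -r -nH http://site.com/foldera/folderb/folderc/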
I observe a similar problem when downloading files from Dropbox with wget:
- the download finishes (the file is complete)
- wget (or curl, depending on what I use for the download) no longer shows up in the running processes once the file is complete
- wget (or curl) does not return to the command prompt
Returning to the command prompt can be "forced" by simply hitting Enter; I do not have to actually kill any process, it is just kind of stuck until I press Enter one more time.
The problem is not wget-specific; it also occurs when I try to download the same file from the same location with curl. The problem does not occur at all when I download the same file from several Unix web servers, neither with wget nor with curl.
I have tried using timeout (with a sufficiently long duration) to force wget/curl to return to the command prompt, but they do not return to the command prompt even after timeout kills them.
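For reference, the kind of invocation described would look roughly like this (an assumed form; the Dropbox URL is a placeholder): timeout's -k option sends a follow-up SIGKILL if the initial TERM signal is ignored, yet even that reportedly did not free the prompt here.
# give wget 10 minutes, then TERM it, with a hard KILL 30 s later if needed
timeout -k 30 600 wget 'https://www.dropbox.com/s/PLACEHOLDER/file?dl=1'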
