PHPExcel throwing error, locale? Not using locale - phpexcel

I'm trying to use PHPExcel, and it's throwing an error even for the most basic things, even for a script copied from elsewhere (http://blog.clock.co.uk/phpexcel-example/).
<br />
<b>Warning</b>: Invalid argument supplied for foreach() in <b>/home/.../public_html/pear/PEAR/PHPExcel/PHPExcel/Calculation.php</b> on line <b>1685</b><br />
That output appears at the very top of the generated file, which Excel (or OpenOffice) then says is not a valid file. If I remove those two lines, everything is fine: Excel (or OpenOffice) opens it with no problems and everything the script does is there.
Calculation.php line 1685:
foreach (glob($localeFileDirectory.'/*',GLOB_ONLYDIR) as $filename) {
And the function it is in:
private function __construct() {
    $localeFileDirectory = PHPEXCEL_ROOT.'PHPExcel/locale/';
    foreach (glob($localeFileDirectory.'/*',GLOB_ONLYDIR) as $filename) {
        $filename = substr($filename,strlen($localeFileDirectory)+1);
        if ($filename != 'en') {
            self::$_validLocaleLanguages[] = $filename;
        }
    }
    $setPrecision = (PHP_INT_SIZE == 4) ? 12 : 16;
    $this->_savedPrecision = ini_get('precision');
    if ($this->_savedPrecision < $setPrecision) {
        ini_set('precision',$setPrecision);
    }
} // function __construct()
I installed PHPExcel via PEAR.
I didn't see a "locale" directory anywhere in the PHPExcel setup, so I tried creating it but still have the same problem.
I'm not setting or using a locale feature.

It would appear then that there is a problem in the PEAR installation of PHPExcel, which I'll need to investigate.
You can find the locale directory and files in the source repository on GitHub (https://github.com/PHPOffice/PHPExcel/tree/master/Classes) or in the standard zip distributions. When that directory is missing, glob() can return false, and foreach over false raises exactly the warning you are seeing. It would probably be better to use the full zip installation in case there are any other problems with the PEAR install.
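As a quick sanity check (a sketch of my own, assuming a standard layout where PHPExcel.php defines PHPEXCEL_ROOT), you can confirm whether the directory the constructor scans actually exists in your install:
<?php
// Sketch: verify the locale directory that Calculation's constructor expects.
require_once 'PHPExcel.php';   // adjust the path to wherever PEAR installed it
$localeFileDirectory = PHPEXCEL_ROOT . 'PHPExcel/locale/';
var_dump(is_dir($localeFileDirectory));                     // false => the locale folder is missing
var_dump(glob($localeFileDirectory . '/*', GLOB_ONLYDIR));  // false => the source of the foreach() warning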

I ran into this problem with PHP 5.3 + PHPExcel v1.7.6 (2011-02-27).
I solved it by updating to PHPExcel v1.8.0 (2014-03-02).

Related

Download all files in all folders from URL

I'd like to recursively download all files from nested folders from this URL to my computer in the same nested structure:
https://hazardsdata.geoplatform.gov/?prefix=Region8/R8_MIT/Risk_MAP/Data/BLE/South_Dakota/60601300_BrookingsCO/Brookings%20HYDA/
I've tried several different approaches using curl and RCurl, including this and some others. There are multiple file types within this folder, but I keep running into cryptic error messages such as Error in function (type, msg, asError = TRUE) : error:1407742E:SSL routines:SSL23_GET_SERVER_HELLO:tlsv1 alert protocol version
I'm not even sure how to begin.
In their JavaScript you'll find the URL https://hazards-geoplatform.s3.amazonaws.com/, and there you'll find an XML file containing the paths to (seemingly?) all their files. From there it shouldn't be hard, so:
1: download the XML list of files from https://hazards-geoplatform.s3.amazonaws.com
2: each of the XML's <Contents> tags describes a file or a folder. Filter out all the tags that are not relevant to you; that means if the Contents->Key tag does not contain the text Brookings HYDA, filter it out.
3: the remaining Contents tags contain your download path and save path. For every Key tag that ends with /: this is a "folder", and you can't download a folder, so just create the path. For example, if the key is
<Contents>
<Key>Region8/R8_MIT/Risk_MAP/Data/BLE/South_Dakota/60601300_BrookingsCO/Brookings HYDA/Hydraulics_DataCapture/Correspondence/</Key>
this means you should create the folders Region8/R8_MIT/Risk_MAP/Data/BLE/South_Dakota/60601300_BrookingsCO/Brookings HYDA/Hydraulics_DataCapture/Correspondence and move on. However, if the key's value does not end with /, it means you should download it. For example, if you find
<Contents>
<Key>Region8/R8_MIT/Risk_MAP/Data/BLE/South_Dakota/60601300_BrookingsCO/Brookings HYDA/Hydraulics_DataCapture/Correspondence/200724-CityBrookings-AirportInfo_Email.pdf</Key>
<LastModified>2022-03-04T17:54:48.000Z</LastModified>
<ETag>"9fe9af393f043faaa8e368f324c8404a"</ETag>
<Size>303737</Size>
<StorageClass>STANDARD</StorageClass>
</Contents>
it means the save filepath is Region8/R8_MIT/Risk_MAP/Data/BLE/South_Dakota/60601300_BrookingsCO/Brookings HYDA/Hydraulics_DataCapture/Correspondence/200724-CityBrookings-AirportInfo_Email.pdf
and the url to download the file is https://hazards-geoplatform.s3.amazonaws.com/ + urlencode(key), in this case:
https://hazards-geoplatform.s3.amazonaws.com/Region8%2FR8_MIT%2FRisk_MAP%2FData%2FBLE%2FSouth_Dakota%2F60601300_BrookingsCO%2FBrookings%20HYDA%2FHydraulics_DataCapture%2FCorrespondence%2F200724-CityBrookings-AirportInfo_Email.pdf
I don't know how to do it with curl/R, but here's how to do it in PHP; happy porting:
<?php
declare(strict_types=1);
function curl_get(string $url): string
{
    echo "fetching {$url}\n";
    static $ch = null;
    if ($ch === null) {
        $ch = curl_init();
        curl_setopt_array($ch, array(
            CURLOPT_RETURNTRANSFER => 1,
            CURLOPT_ENCODING => '',
            CURLOPT_FOLLOWLOCATION => 1,
            CURLOPT_VERBOSE => 0
        ));
    }
    curl_setopt($ch, CURLOPT_URL, $url);
    $ret = curl_exec($ch);
    if (curl_errno($ch)) {
        throw new Exception("curl error " . curl_errno($ch) . ": " . curl_error($ch));
    }
    return $ret;
}
$base_url = 'https://hazards-geoplatform.s3.amazonaws.com/';
$xml = curl_get($base_url);
$domd = new DOMDocument();
// @ suppresses libxml warnings from parsing XML as HTML; loadHTML() lowercases tag names,
// which is why the XPath query below looks for "key" instead of "Key"
@($domd->loadHTML($xml));
$xp = new DOMXPath($domd);
foreach ($xp->query("//key[contains(text(),'Brookings HYDA')]") as $node) {
    $relative = $node->nodeValue;
    if ($relative[-1] === '/') {
        // it's a folder, ignore
        continue;
    }
    $dir = dirname($relative);
    if (!is_dir($dir)) {
        mkdir($dir, 0777, true);
    }
    $url = $base_url . urlencode($node->nodeValue);
    file_put_contents($relative, curl_get($url));
}
After running that for a few seconds I have
$ find
.
./fuk.php
./Region8
./Region8/R8_MIT
./Region8/R8_MIT/Risk_MAP
./Region8/R8_MIT/Risk_MAP/Data
./Region8/R8_MIT/Risk_MAP/Data/BLE
./Region8/R8_MIT/Risk_MAP/Data/BLE/South_Dakota
./Region8/R8_MIT/Risk_MAP/Data/BLE/South_Dakota/60601300_BrookingsCO
./Region8/R8_MIT/Risk_MAP/Data/BLE/South_Dakota/60601300_BrookingsCO/Brookings HYDA
./Region8/R8_MIT/Risk_MAP/Data/BLE/South_Dakota/60601300_BrookingsCO/Brookings HYDA/Hydraulics_DataCapture
./Region8/R8_MIT/Risk_MAP/Data/BLE/South_Dakota/60601300_BrookingsCO/Brookings HYDA/Hydraulics_DataCapture/Correspondence
./Region8/R8_MIT/Risk_MAP/Data/BLE/South_Dakota/60601300_BrookingsCO/Brookings HYDA/Hydraulics_DataCapture/Correspondence/200724-CityBrookings-AirportInfo_Email.pdf
./Region8/R8_MIT/Risk_MAP/Data/BLE/South_Dakota/60601300_BrookingsCO/Brookings HYDA/Hydraulics_DataCapture/Correspondence/2D_Exceptions_2021Update.pdf
./Region8/R8_MIT/Risk_MAP/Data/BLE/South_Dakota/60601300_BrookingsCO/Brookings HYDA/Hydraulics_DataCapture/DCS_Checklist_Hydraulics_BrookingsCoSD.xlsx
./Region8/R8_MIT/Risk_MAP/Data/BLE/South_Dakota/60601300_BrookingsCO/Brookings HYDA/Hydraulics_DataCapture/Simulations
./Region8/R8_MIT/Risk_MAP/Data/BLE/South_Dakota/60601300_BrookingsCO/Brookings HYDA/Hydraulics_DataCapture/Simulations/RAS
./Region8/R8_MIT/Risk_MAP/Data/BLE/South_Dakota/60601300_BrookingsCO/Brookings HYDA/Hydraulics_DataCapture/Simulations/RAS/0.2PAC
So it seems to be working.
The last output from the command is
fetching https://hazards-geoplatform.s3.amazonaws.com/Region8%2FR8_MIT%2FRisk_MAP%2FData%2FBLE%2FSouth_Dakota%2F60601300_BrookingsCO%2FBrookings+HYDA%2FHydraulics_DataCapture%2FSimulations%2FRAS%2F0.2PAC%2FPostProcessing.hdf
PHP Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 65019904 bytes) in /home/hans/test/fuk.php on line 17
meaning some of their files are too large to fit within PHP's 128 MB (134217728-byte) memory limit. It's easy to optimize the curl code to write directly to disk instead of storing the entire file in RAM before writing it out, but since you want to do this in R anyway, I won't bother optimizing the sample PHP script.
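For what it's worth, here is a rough sketch of that optimization (my addition, not part of the script above): hand curl an open file handle via CURLOPT_FILE so the body is streamed to disk instead of being returned as a string.
<?php
// Sketch only: stream $url straight to $savePath so the whole body never sits in RAM.
function curl_download(string $url, string $savePath): void
{
    echo "fetching {$url}\n";
    $fp = fopen($savePath, 'wb');
    $ch = curl_init($url);
    curl_setopt_array($ch, array(
        CURLOPT_FILE => $fp,            // write the response body to this stream as it arrives
        CURLOPT_FOLLOWLOCATION => 1,
        CURLOPT_ENCODING => '',
    ));
    curl_exec($ch);
    if (curl_errno($ch)) {
        throw new Exception("curl error " . curl_errno($ch) . ": " . curl_error($ch));
    }
    curl_close($ch);
    fclose($fp);
}
// In the loop above, replace file_put_contents($relative, curl_get($url)); with:
// curl_download($url, $relative);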

Laravel 5.7: openssl_cipher_iv_length(): Unknown cipher algorithm

I am developing an app in Laravel Framework 5.7.13.
I have a class called Crypto:
<?php
namespace App\Library;

class Crypto{
    private $cipher;
    private $cstrong;
    private $keylen;
    private $key;

    public function __Crypto(){
        $this->cipher = Config::get('cipher');
        $this->cstrong = true;
        $this->keylen = 5;
        $this->key = bin2hex(openssl_random_pseudo_bytes($keylen, $cstrong));
    }

    public function opensslEncrypt($value){
        $ivlen = openssl_cipher_iv_length($this->cipher);
        $iv = openssl_random_pseudo_bytes($ivlen);
        $ciphertext_raw = openssl_encrypt($value, $this->cipher, $this->key, $options=OPENSSL_RAW_DATA, $iv);
        $hmac = hash_hmac('sha256', $ciphertext_raw, $this->key, $as_binary=true);
        $ciphertext = base64_encode( $iv.$hmac.$ciphertext_raw );
        return $ciphertext;
    }
}
Now in my controller I did:
$crypto = new Crypto();
$encryptedValue = $crypto->opensslEncrypt($orderId);
In my Config\app.php
'cipher' => 'AES-256-CBC'
But when I run my app, I get
ErrorException (E_WARNING)
openssl_cipher_iv_length(): Unknown cipher algorithm
How to resolve this?
I tried to comment the cipher line in the Config\app.php, but then it gave some other errors.
Please help...
I ran into a similar problem with Laravel 5.7.13.
I encountered the error with Laravel and the openssl_cipher_iv_length() function when I updated my WampServer installation to PHP v7.2.x (from v7.1.10). Yes, I am running on Windows.
Switching back to php v7.1.10 would clear the error.
To solve my error with openssl_cipher_iv_length(), I compared the php.ini files from the two PHP versions. When comparing the files I noticed that I did not have extension_dir set properly. This was my main issue, but there were other edits I had made in the past that I also incorporated into the new PHP environment (e.g. extensions that were enabled and XDEBUG settings).
Also... I did notice that the extension names were previously defined as:
extension=php_<ext>.dll
or
extension=<ext>.so
and are now using:
extension=<ext>
So my issue with openssl_cipher_iv_length() was a result of the PHP version and not Laravel.
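If it helps, here is a quick check (my own sketch, not taken from the original setup) that you can run with the same PHP binary the web server uses, to confirm the openssl extension is loaded and knows the cipher Laravel expects:
<?php
// Sketch: confirm the openssl extension and the AES-256-CBC cipher are available.
var_dump(extension_loaded('openssl'));                                                    // should be bool(true)
var_dump(in_array('aes-256-cbc', array_map('strtolower', openssl_get_cipher_methods()))); // should be bool(true)
var_dump(openssl_cipher_iv_length('AES-256-CBC'));                                        // should be int(16), not a warning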
I hope this information helps.

Unable to create folder with RCurl

I'm having trouble using the ftpUpload() function of RCurl to upload a file to a non-existent folder on an SFTP server. I want the folder to be created if it's not there, using the ftp.create.missing.dirs option. Here's my code currently:
.opts <- list(ftp.create.missing.dirs = TRUE)
ftpUpload(what = "test.txt",
          to = "sftp://ftp.testserver.com:22/newFolder/existingfile.txt",
          userpwd = paste(user, pwd, sep = ":"), .opts = .opts)
It doesn't seem to be working as I get the following error:
* Initialized password authentication
* Authentication complete
* Failed to close libssh2 file
I can upload a file to an existing folder with success; it's just when the folder isn't there that I get the error.
The problem seems to be due to the fact that you are trying to create a new folder, as seen in this question: Create an remote directory using SFTP / RCurl
The source of the error can be found in the Microsoft R Open git page:
case SSH_SFTP_CLOSE:
    if(sshc->sftp_handle) {
        rc = libssh2_sftp_close(sshc->sftp_handle);
        if(rc == LIBSSH2_ERROR_EAGAIN) {
            break;
        }
        else if(rc < 0) {
            infof(data, "Failed to close libssh2 file\n");
        }
        sshc->sftp_handle = NULL;
    }
    if(sftp_scp)
        Curl_safefree(sftp_scp->path);
In this code the return value rc comes from the libssh2_sftp_close function (more info here: https://www.libssh2.org/libssh2_sftp_close_handle.html), which tries to close the handle for the nonexistent directory, resulting in the error.
Try using curlPerform as:
curlPerform(url="ftp.xxx.xxx.xxx.xxx/", postquote="MkDir /newFolder/", userpwd="user:pass")

Download Multiple Files from http using Powershell with proper names

I have searched for something similar and I keep running across the FTP download answers. That is helpful information, but it is ultimately proving difficult to translate. I have found a PowerShell script and it works, but I am wondering if it can be tweaked for my needs. I don't have much experience with PowerShell scripting, but I'm trying to learn.
The need is this: I need to download and install a series of files to a remote machine, unattended. The files are distributed via email as tinyurls. I currently throw those into a .txt file, then have a PowerShell script read the list and download each file.
The requirement of the project, and why I have turned to PowerShell (and not other utilities), is that these are very specialized machines. The only tools available are the ones baked into Windows 7 Embedded.
The difficulties I run into are:
The files download one at a time. I would like to grab as many downloads at the same time as the web server will allow (usually 6).
The current script creates file names based on the tinyurl. I need the actual file name from the web server.
Thanks in advance for any suggestions.
Below is the script I’m currently using.
# Copyright (C) 2011 by David Wright (davidwright#digitalwindfire.com)
# All Rights Reserved.
# Redistribution and use in source and binary forms, with or without
# modification or permission, are permitted.
# Additional information available at http://www.digitalwindfire.com.
$folder = "d:\downloads\"
$userAgent = "Mozilla/5.0 (Windows NT 6.1; WOW64; rv:7.0.1) Gecko/20100101 Firefox/7.0.1"
$web = New-Object System.Net.WebClient
$web.Headers.Add("user-agent", $userAgent)
Get-Content "d:\downloads\files.txt" |
Foreach-Object {
"Downloading " + $_
try {
$target = join-path $folder ([io.path]::getfilename($_))
$web.DownloadFile($_, $target)
} catch {
$_.Exception.Message
}
}
If you do the web request before you decide on the file name, you should be able to get the expanded path (otherwise you would have to make two web requests: one to get the extended path and one to download the file).
When I tried this, I found that the BaseResponse property of the Microsoft.PowerShell.Commands.HtmlWebResponseObject returned by the Invoke-WebRequest cmdlet had a ResponseUri property which was the extended path we are looking for.
If you get the correct response, just save the file using the name from the extended path, something like the following (this sample code does not look at HTTP response codes or similar, but expects everything to go well):
function Save-TinyUrlFile
{
    PARAM (
        $TinyUrl,
        $DestinationFolder
    )
    $response = Invoke-WebRequest -Uri $TinyUrl
    $filename = [System.IO.Path]::GetFileName($response.BaseResponse.ResponseUri.OriginalString)
    $filepath = [System.IO.Path]::Combine($DestinationFolder, $filename)
    try
    {
        $filestream = [System.IO.File]::Create($filepath)
        $response.RawContentStream.WriteTo($filestream)
        $filestream.Close()
    }
    finally
    {
        if ($filestream)
        {
            $filestream.Dispose();
        }
    }
}
This method could be called using something like the following, given that the $HOME\Documents\Temp folder exists:
Save-TinyUrlFile -TinyUrl http://tinyurl.com/ojt3lgz -DestinationFolder $HOME\Documents\Temp
On my computer, that saves a file called robots.txt, taken from a github repository, to my computer.
If you want to download many files at the same time, you could let PowerShell make this happen for you. Either use PowerShell workflows' parallel functionality or simply start a Job for each URL. Here's a sample of how you could do it using PowerShell Jobs:
Get-Content files.txt | Foreach {
    Start-Job {
        function Save-TinyUrlFile
        {
            PARAM (
                $TinyUrl,
                $DestinationFolder
            )
            $response = Invoke-WebRequest -Uri $TinyUrl
            $filename = [System.IO.Path]::GetFileName($response.BaseResponse.ResponseUri.OriginalString)
            $filepath = [System.IO.Path]::Combine($DestinationFolder, $filename)
            try
            {
                $filestream = [System.IO.File]::Create($filepath)
                $response.RawContentStream.WriteTo($filestream)
                $filestream.Close()
            }
            finally
            {
                if ($filestream)
                {
                    $filestream.Dispose();
                }
            }
        }
        Save-TinyUrlFile -TinyUrl $args[0] -DestinationFolder $args[1]
    } -ArgumentList $_, "$HOME\documents\temp"
}

openWithDefaultApplication fails on files in application folder

I ONLY receive an "Error #3000: Illegal path name" if I try to open a file that is placed inside the app folder of the AIR application. If the file is somewhere outside of the app folder, it works.
private var file:File = File.documentsDirectory;

public function download():void {
    var pdfFilter:FileFilter = new FileFilter("PDF Files", "*.pdf");
    file.browseForOpen("Open", [pdfFilter]);
    file.addEventListener(Event.SELECT, fileSelected);
}

private function fileSelected(e:Event):void
{
    var destination:File = File.applicationDirectory;
    destination = destination.resolvePath("test.pdf");

    /*
    // This works, even when the file to copy is placed inside the app folder
    file.copyTo(destination, true);
    */

    /* This throws Error #3000, but ONLY if the file is located in
       the app folder */
    file.openWithDefaultApplication();
}
When I try to take the same file and copy it to another place, it works fine.
Why is that? Is there something special to do if I want to open files that are inside the app folder?
It also doesn't work in debug mode (bin-debug).
Regards, Temo
After reading the documentation a few times I saw that this is not possible (it's not a bug, it's a feature!?!):
Opening files with the default system application
You cannot use the openWithDefaultApplication() method with files located in the application directory.
So I copy the file somewhere outside the application directory first and open it from there (tempFile is not defined in the snippet; resolving it in a temporary directory is one way to do it):
var tempFile:File = File.createTempDirectory().resolvePath(file.name); // one possible way to obtain tempFile
file.copyTo(tempFile);
tempFile.openWithDefaultApplication();
Not so nice, but it works.

Resources