I have an .xls file (340 KB), but when I try to load it, the process takes about 15 minutes.
Can you help me figure out what the problem with the file is?
My code:
$file_type = PHPExcel_IOFactory::identify( $file_name );
$OR = PHPExcel_IOFactory::createReader($file_type);
$E = $OR->load( $file_name );
Thanks!
=============================================================
Update:
I found that a large number of notices appears during loading:
Notice: Uninitialized string offset: -1120109318 in /PHPExcel/Reader/Excel5.php

in function _GetInt2d, at the line:
    return ord($data[$pos]) | (ord($data[$pos+1]) << 8);
in function _GetInt4d, at the line:
    $_or_24 = ord($data[$pos + 3]);
in function _GetInt4d, at the line:
    return ord($data[$pos]) | (ord($data[$pos+1]) << 8) | (ord($data[$pos+2]) << 16) | $_ord_24;
Maybe this is the problem? Is it a problem with the file or with PHPExcel?
With $OR->setReadDataOnly(true); everything works fine.
Related
I'm reading a Unix book, specifically the part about the execve() system call. The book says that file descriptors for opened files are passed to child processes and also (by default) remain open after a process calls execve().
However, when I tried the code below to read from an open file descriptor passed to a program run with execve(), it doesn't seem to work. What's the problem?
Program that calls execve():

#include <fcntl.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <unistd.h>

int main(int argc, char *argv[], char **env){
    int fd;
    /* ALL_OWNER is a permission-mode macro defined elsewhere in my code */
    if ( (fd = open("text.txt", O_RDWR | O_CREAT, ALL_OWNER)) == -1 ){
        printf("Open failed\n");
        exit(1);
    }
    printf("%d\n", fd); // 3
    char buff[] = "Hello World\n";
    write(fd, buff, strlen(buff));
    int res;
    if ( (res = execl("./demo", (char *)0)) == -1 ){
        exit(1);
    }
}
Program demo invoked by execve():

#include <stdio.h>
#include <unistd.h>

int main(void){
    setbuf(stdout, NULL);
    printf("Demo executing...\n");
    ssize_t r;
    char buff[1024];
    /* fd 3 is the descriptor inherited from the parent */
    while ( (r = read(3, buff, sizeof(buff))) > 0 ){
        write(STDOUT_FILENO, buff, r);
    }
}
I'm using macOS.
The "demo" process inherit file descriptor and can read the file, but the file offset is at the end of the file. Use lseek(fd, 0, SEEK_SET) before calling execl(), or do it in "demo" before reading the file.
I have a problem copying files with scp. I use Qt and copy my files with scp via QProcess, and when something bad happens I always get exitCode = 1. It always returns 1. I tried copying files from a terminal: the first time I got the error "Permission denied" and the exit code was 1; then I unplugged my Ethernet cable and got the error "Network is unreachable", and the return code was still 1. This confuses me very much, because in my application I have to distinguish between these types of errors.
Any help is appreciated. Thank you so much!
See this code as a working example:
bool Utility::untarScript(QString filename, QString& statusMessages)
{
    // Untar tar-bzip2 file, only extract script to temp folder
    QProcess tar;
    QStringList arguments;
    arguments << "-xvjf";
    arguments << filename;
    arguments << "-C";
    arguments << QDir::tempPath();
    arguments << "--strip-components=1";
    arguments << "--wildcards";
    arguments << "*/folder.*";

    // tar -xjf $file -C $tmpDir --strip-components=1 --wildcards
    tar.start("tar", arguments);

    // Wait for tar to finish
    if (tar.waitForFinished(10000))
    {
        if (tar.exitCode() == 0)
        {
            statusMessages.append(tar.readAllStandardError());
            return true;
        }
    }
    statusMessages.append(tar.readAllStandardError());
    statusMessages.append(tar.readAllStandardOutput());
    statusMessages.append(QString("Exitcode = %1\n").arg(tar.exitCode()));
    return false;
}
It gathers all available process output for you to analyse. Especially look at readAllStandardError().
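To tell a "Permission denied" failure apart from a "Network is unreachable" one, a rough sketch along the same lines could inspect the captured stderr text, since scp returns exit code 1 in both cases. The classifyScpFailure helper and the ScpError enum below are made up for illustration, not part of the example above:

#include <QProcess>
#include <QString>
#include <QStringList>

enum class ScpError { None, PermissionDenied, NetworkUnreachable, Other };

ScpError classifyScpFailure(const QString &program, const QStringList &arguments)
{
    QProcess scp;
    scp.start(program, arguments);
    if (!scp.waitForFinished(30000))
        return ScpError::Other;                        // failed to start or timed out

    if (scp.exitStatus() == QProcess::NormalExit && scp.exitCode() == 0)
        return ScpError::None;                         // copy succeeded

    // The exit code alone is not enough, so look at what scp printed to stderr.
    const QString err = QString::fromLocal8Bit(scp.readAllStandardError());
    if (err.contains("Permission denied", Qt::CaseInsensitive))
        return ScpError::PermissionDenied;
    if (err.contains("Network is unreachable", Qt::CaseInsensitive))
        return ScpError::NetworkUnreachable;
    return ScpError::Other;
}

Matching on scp's message text is a heuristic (the wording can vary between SSH builds and locales), so treat this as a starting point rather than a definitive classification.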
I have a script which I use to load a webpage, retrieve information from it, and then output that information to a file. It had been working perfectly until today, when I started getting an error which reads:
invoke-webrequest : Response object error 'ASP 0251 : 80004005'
Response Buffer Limit Exceeded
/foo/Reports/SearchLocation.asp, line 0
Execution of the ASP page caused the Response Buffer to exceed its configured limit.
At C:\path.ps1:7 char:12
+ $url = invoke-webrequest "http://url/ ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidOperation: (System.Net.HttpWebRequest:HttpWebRequest) [Invoke-WebRequest], WebException
+ FullyQualifiedErrorId : WebCmdletWebResponseException,Microsoft.PowerShell.Commands.InvokeWebRequestCommand
I do not believe there have been any changes to the site the data is pulled from, and the file the input information comes from has no formatting errors.
Some googling around leads me to believe that the issue is that the page has more than 4 MB of data to load while the default buffer size is 4 MB, but I can't find any instructions on how to change the buffer size from PowerShell.
I came across the Clear-WebConfiguration cmdlet, but I'm not certain whether or not that is what I need, or how exactly to implement it within my script. Here is the main portion of my code:
foreach($c in $csv){
    [array]$tags = $null
    $url = invoke-webrequest "http://url.com" -UseDefaultCredentials
    $table = $url.ParsedHTML.getElementsByTagName('table')[7]
    $rows = $table.getElementsByTagName('tr')
    if($c.'Was this user on the old file?' -eq 'Yes'){
        if($table.innerText -like "*N/A*"){
            $man = $c.'Manager Name' -replace ',',';'
            $badusers += $c.'User ID' + "," + $c.Name + "," + $man + "," + $c.'CC'
        }
        else{
            foreach($row in $rows){
                $bcol = $row.getElementsByTagName('td') | Where-Object{$_.cellIndex -eq 1} | select -First 1
                $ccol = $row.getElementsByTagName('td') | Where-Object{$_.cellIndex -eq 7} | select -First 1
                $bcol = $bcol.innerText
                $ccol = $ccol.innerText
                if($ccol -ne $c.'CC'){
                    $tags += $bcol + ",," + $c.'CC' + "," + $c.'User ID'
                }
            }
            if($tags -ne $null){
                $results += $tags
            }
        }
    }
}
Any help on solving this issue is much appreciated.
I modified nacl_sdk/pepper_49/examples/demo/flock/flock.cc with the following change:
...
// Mount the images directory as an HTTP resource.
mount("images", "/images", "httpfs", 0, "");

// NEW CODE HERE
DIR *dir = opendir("/images");
struct dirent *entry;
while((entry = readdir(dir)) != NULL)
{
    fprintf(stdout, "Found '%s'\n", entry->d_name);
}
closedir(dir);

FILE* fp = fopen("/images/flock_green.raw", "rb");
...
but I am only getting this:
Found '.'
Found '..'
Opening the file /images/flock_green.raw with fopen works fine; I just cannot see it with readdir. Is there a way to get readdir to work in NaCl, or is it some kind of sandbox restriction?
I have a piece of code, shown below, which ran fine with C++Builder 6.
Now I have moved the program to C++Builder XE, and the call "RiconfiguraNodo << nomeNodo ..." gives me the ambiguity error reported below.
I tried several ways to rewrite the call to the OLE procedure "RiconfiguraNodo", but I didn't find a working solution.
How can I rewrite this snippet of code in a way suitable for C++Builder XE?
Error reported:
[BCC32 Error] UnitMain.cpp(262): E2015 Ambiguity between
'operator System::AutoCmd::<<(const System::Currency) at c:\program files (x86)\embarcadero\rad studio\8.0\include\windows\rtl\sysvari.h:3561'
and
'operator System::AutoCmd::<<(const System::TDateTime) at c:\program files (x86)\embarcadero\rad studio\8.0\include\windows\rtl\sysvari.h:3562'
Full parser context
UnitMain.cpp(245): parsing: void _fastcall TFormMain::RiconfiguraNodo(System::UnicodeString,System::UnicodeString,System::UnicodeString,System::UnicodeString)
Sample code:
Procedure RiconfiguraNodo( L"RiconfiguraNodo" );

if (VarServerPmvManager.IsEmpty() || VarServerPmvManager.IsNull())
{
    VarServerPmvManager = VarServerPmvManager.CreateObject(ProgId_ServerPmvmanager);
}

try
{
    VarServerPmvManager.Exec( RiconfiguraNodo << nomeNodo << ipAddress << tipoPmv << cmdType );
}
catch (Exception & ex)
{
    Mylog(Fun + Sysutils::Format("ERROR=[%s] ", ARRAYOFCONST((ex.Message))));
}
I found the solution.
Exec() simply requires Variant arguments instead of plain strings:
Variant vNomeNodo, vIpAddress, vTipoPmv, vCmdType;
vNomeNodo = nomeNodo;
vIpAddress = ipAddress;
vTipoPmv = tipoPmv;
vCmdType = cmdType;
VarServerPmvManager.Exec( RiconfiguraNodo << vNomeNodo << vIpAddress << vTipoPmv << vCmdType );