Can't import Divi layouts in WordPress

Trying to import Divi layouts (.json files)
Getting this error:
This file cannot be imported. It may be caused by file_uploads being
disabled in your php.ini. It may also be caused by post_max_size or/and
upload_max_filesize being smaller than file selected. Please increase
it or transfer more substantial data at the time.
This is not the case, however, as I don't have a limit on either, and I can upload files anywhere else within my WP installation.
Does anyone have any idea what else would cause this error?

My response pertains to Divi Theme Version 3.0.46
I had the same problem, and what follows is how I fixed it.
In my case the error is being generated from the Divi Builder portability.js file, line 464:
var fileSize = Math.ceil( ( data.file.size / ( 1024 * 1024 ) ).toFixed( 2 ) ),
    formData = new FormData();

// Max size set on server is exceeded.
if ( fileSize >= $this.postMaxSize || fileSize >= $this.uploadMaxSize ) {
    etCore.modalContent( '<p>' + $this.text.maxSizeExceeded + '</p>', false, true, '#' + $this.instance( '.ui-tabs-panel:visible' ).attr( 'id' ) );

    $this.enableActions();

    return;
}
The thing to note here is that this script rounds the file size up to a whole number of megabytes before comparing it against the server limits.
So my max upload size for this site was 2MB, and my file was 1,495,679 bytes (about 1.43MB), which Math.ceil rounds up to 2, turning the check into:
if ( 2 >= 2 ) {
    // throw an error
}
So it seems the solution is to make both your PHP upload max size and max post size at least 1MB greater than the file you are trying to upload.
The Elegant Themes people have a lengthy post on this:
https://www.elegantthemes.com/blog/tips-tricks/is-the-wordpress-upload-limit-giving-you-trouble-heres-how-to-change-it
Fixing this is as simple as setting the following in my php.ini:
; To have a 31.4MB file upload into Divi, these must be greater than 32MB
; (the script rounds the file size up to 32, and the check fails on equality).
post_max_size = 33M
upload_max_filesize = 33M
The final thing I want to say about this: since this error is generated by JavaScript in the browser, you must:
change your php.ini
restart your webserver [at least I needed to w/ Apache]
refresh the page, as the limits are cached in the browser, not the result of any sort of ajax call, etc.
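To confirm what limits PHP is actually applying after the restart, here is a quick sanity check you can drop into any page on the same install (a minimal sketch; remove it when done):
<?php
// Print the effective upload limits as PHP sees them.
// If these don't reflect your php.ini edit, either the wrong ini file
// was changed or the webserver wasn't restarted.
echo 'file_uploads: '        . ini_get('file_uploads') . "\n";
echo 'post_max_size: '       . ini_get('post_max_size') . "\n";
echo 'upload_max_filesize: ' . ini_get('upload_max_filesize') . "\n";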

I had the same issue. I googled and found the solution below. (Apologies for not having a proper explanation!)
Solution:
Create a new file named php.ini with the following contents, save it in your website's root directory, and you're done.
file_uploads = On
upload_max_filesize = 100M
post_max_size = 100M

Related

ASP File Object Issue

I am working on a classic ASP website. The client reported that a previously working function, which listed only "image" file types, suddenly exhibited an issue. Reading the code, I found that the loop that lists the files in a folder uses the InStr() function to identify files by their Type property, which should be "image". However, something must have changed in the OS, as the type is no longer "image" but "JPG File", "PNG File", etc. This dramatically changes the way the code works. Following is a snippet of the code:
Set oFSO = Server.CreateObject("Scripting.FileSystemObject")
Set oFolder = oFSO.GetFolder(Server.MapPath(sCurrentDirectoryPath))
Set oSubFolder = oFolder.Files
iFileCount = 0
For Each oFileName in oSubFolder
    If InStr(1, LCase(oFileName.Type), "image") > 0 Then
        iFileCount = iFileCount + 1
    End If
Next
Because the InStr() function is looking for a file type containing "image", no files are counted and the function returns zero files found. While debugging, I found that the value returned by oFileName.Type was as follows:
This is the file type:JPG File
This is the file type:JPG File
This is the file type:Text Document
This is the file type:Data Base File
The files in the folder were two "whatever.jpg" files, a "whatever.txt" file, and a "thumbs.db" file. So it appears that the OS (Windows Server 2019) may have become less generic about reporting an "image" file, and now reports "JPG File", "PNG File", etc. This, of course, breaks the code! Are there any suggestions from y'all on how I could go about modifying this code to report exactly how many image files are present?
On Windows 10, the Type values for .jpg and .png files are JPEG image and PNG image respectively. What OS are you using?
Also, Type doesn't actually analyze the file: you could have a virus.exe file in the folder that has been renamed to virus.jpg, and the Type value will still show it as a JPEG. So if the function is intended to check user-uploaded content to ensure images are actually images, the Type value will be of no use. If you have root access you could install a COM DLL that uses a program such as ExifTool to properly analyze files (https://github.com/as08/ClassicASP.ExifTool), but that would be a complete rewrite.
But assuming you're not looking to check whether an image file is actually an image file, you could just split off the filename extension and use Select Case to count the image files, if your OS now returns just "XXX File" rather than "XXX image" in the Type value (alternatively you could split the Type value, but you'd still need to check for valid image file extensions):
Dim oFSO, oFolder, oSubFolder, oFileName, iFileCount, oFileNameSplit

Set oFSO = Server.CreateObject("Scripting.FileSystemObject")
Set oFolder = oFSO.GetFolder(Server.MapPath(sCurrentDirectoryPath))
Set oSubFolder = oFolder.Files
iFileCount = 0

For Each oFileName In oSubFolder
    ' Split the filename on "." and inspect the last segment (the extension).
    oFileNameSplit = Split(oFileName.Name, ".")
    If uBound(oFileNameSplit) > 0 Then
        Select Case Trim(lCase(oFileNameSplit(uBound(oFileNameSplit))))
            Case "jpg", "jpeg", "png", "gif" ' Maybe add some more extensions...
                iFileCount = iFileCount + 1
        End Select
    End If
Next

Set oFSO = Nothing
Set oFolder = Nothing
Set oSubFolder = Nothing

Response.Write(iFileCount)

How to increase memory_limit in wordpress?

Recently I set up my site on my server (http://abc.hostname.com/sitename/) and I get the following error:
It seems like the memory_limit on the server is too low for the fade on load feature.
I added the following code in wp-config.php to solve this error:
define( 'WP_MEMORY_LIMIT', '96M' );
and also:
ini_set("memory_limit", "60M");
But the error is still there.
To increase the memory limit, all you have to do is add the following line to your wp-config.php file, around line 36 or 37 in the default file:
define('WP_MEMORY_LIMIT', '96M');
The problem will be solved. I think you are placing the code somewhere else in wp-config.php. The important thing is that it comes before the /* That's all, stop editing! */ line, so it is defined before wp-settings.php is loaded.
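For illustration, the tail of a stock wp-config.php, showing where the define belongs (exact line numbers vary between WordPress versions; the position relative to wp-settings.php is what matters):
// ... database credentials, authentication keys, table prefix ...

define( 'WP_MEMORY_LIMIT', '96M' ); // must appear before wp-settings.php is loaded

/* That's all, stop editing! Happy publishing. */
if ( ! defined( 'ABSPATH' ) ) {
    define( 'ABSPATH', __DIR__ . '/' );
}
require_once ABSPATH . 'wp-settings.php';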

PHP read large excel file error

I'm trying to create an application to load and read a large Excel file (more than 60,000 rows) using the PHPExcel library. I am getting an internal server error. It works fine up to 600 rows, but it's not working for large files. Please help.
Or is there any other PHP library available to load large files?
set_time_limit(36000);
ini_set('max_execution_time', 36000);

## $top_records is a boolean set to get just the header and the first data row (for field mapping)

if (PHP_SAPI == 'cli')
    die('This example should only be run from a Web Browser');

set_include_path(get_include_path() . PATH_SEPARATOR . SITE_PATH . '/core/extLib/Excel/Classes/');

require_once (SITE_PATH . '/core/extLib/Excel/Classes/PHPExcel.php');
include (SITE_PATH . '/core/extLib/Excel/Classes/PHPExcel/IOFactory.php');

PHPExcel_Cell::setValueBinder( new PHPExcel_Cell_AdvancedValueBinder() );

$inputFileName = $file;
$inputFileType = PHPExcel_IOFactory::identify($inputFileName);
$objReader = PHPExcel_IOFactory::createReader($inputFileType);
$objPHPExcel = $objReader->load($inputFileName);
Thanks
I had the same problem. It occurs when execution reaches this line:
$objPHPExcel = $objReader->load($inputFileName);
Unfortunately PHPExcel is not meant to handle very large files. Changing the memory limit or time limit is at best a temporary fix but you will have the same problem with bigger files.
I suggest taking a look at Spout instead. It works great for this use case!
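For reference, a minimal read loop (a sketch assuming Spout 3's ReaderEntityFactory API and a Composer install; earlier Spout versions use a ReaderFactory instead):
<?php
require_once 'vendor/autoload.php';

use Box\Spout\Reader\Common\Creator\ReaderEntityFactory;

// Spout streams the file row by row, so memory use stays flat
// even for spreadsheets with tens of thousands of rows.
$reader = ReaderEntityFactory::createXLSXReader();
$reader->open($inputFileName);

foreach ($reader->getSheetIterator() as $sheet) {
    foreach ($sheet->getRowIterator() as $row) {
        $cells = $row->toArray(); // process one row at a time, e.g. insert into MySQL
    }
}

$reader->close();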
Increase the memory limit, e.g. ini_set('memory_limit', '512M') (note the M suffix; '512' alone would mean 512 bytes).
Or read the file in chunks instead; see the chunked-reading question below.

phpExcel Read in chunks so slow and memory errors

I'm trying to read a big Excel file of about 20MB to import into MySQL.
I've searched across the internet and found the "chunked reading" solution, however it is not working... or is extremely slow for me, and I'm not sure why.
This is what I'm doing:
// .....
// into MyReadFilter class.. this is the most important function:
public function readCell($column, $row, $worksheetName = '') {
    // Only read the rows and columns that were configured
    if (($row == 1) || ($row >= $this->_startRow && $row < $this->_endRow)) {
        if (in_array($column, $this->_columns)) {
            return true;
        }
    }
    return false;
}
// .....
$filter = new MyReadFilter(1, 22000);
$chunkSize = 10;

$objReader = PHPExcel_IOFactory::createReader($inputFileType);
$objReader->setReadFilter($filter);
$objReader->setReadDataOnly(false); // not sure if this should be true

for ($startRow = 2; $startRow <= 65536; $startRow += $chunkSize) {
    echo "Reading";
    $filter->setRows($startRow, $chunkSize);
    $objPHPExcel = $objReader->load($inputFileName); // this line takes like 40 seconds... for 10 rows?
    echo "chunk done! ";
}
However, inside the for loop, $objReader->load() takes about 40 seconds per chunk, and in fact, after 2 loops I get a memory error.
If I unset the $objReader inside the loop I can make it run about 20 times (although it takes like 10 minutes) and... memory error.
I'm wondering why the load function seems to read the whole file when I'm using a filter. Also, the filter strategy seems to parse all rows and return false for every row that is not required... is it not possible to abort reading, or to really read just the required rows?
I've tried a couple of filter classes and code snippets but got the same results...
If you're using a filter, then the Reader is still reading the whole file, but only populating the PHPExcel object with the cells defined by the filter; and the Reader still needs to read the whole file on each pass of the filtering process, which is what makes it slow.
The Reader needs to read the whole file because of the structure of the raw spreadsheet files. Cell data is not stored with cell formatting, and cell content may also be stored separately. The Reader needs to pull all this together. You can't simply abort the reader when the filter condition is met, because the reader has no way of knowing that it is finished: if you have a filter that limits the load to cells A1:C3, you can't abort after B3 has been read, because you don't know whether cell B2 comes after it in the file, or whether there are comments associated with cell A1 further on. Until the whole file has been loaded and parsed, you can't start to filter.
The main memory usage in PHPExcel is the PHPExcel object itself, specifically the cells (typically about 1k/cell on 32-bit PHP). The main feature provided to reduce this is cell caching, which (using SQLite caching) can reduce cell memory usage to effectively 0k/cell, though at a cost in speed; a minimal example follows below.
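A minimal sketch of enabling SQLite-backed cell caching (it must run before the workbook is loaded, and cache_to_sqlite3 requires PHP's sqlite3 extension):
<?php
require_once 'Classes/PHPExcel.php';

// Swap the in-memory cell store for SQLite before loading anything.
// This dramatically lowers memory per cell, at a cost in speed.
$cacheMethod = PHPExcel_CachedObjectStorageFactory::cache_to_sqlite3;
if (!PHPExcel_Settings::setCacheStorageMethod($cacheMethod)) {
    die($cacheMethod . ' caching is not available on this server');
}

$objReader = PHPExcel_IOFactory::createReader('Excel2007');
$objPHPExcel = $objReader->load($inputFileName);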
The Reader doesn't use much more memory than the size of the (decompressed) Excel file itself, so it is normally far less of a memory problem; this is being addressed (for XML-based spreadsheet formats) by switching from SimpleXML to XMLReader. But it depends on the format of the file being loaded: xls format files are very different from xlsx files (xlsx will benefit from this, xls won't). It also depends on the developers finding the time to do the work, but it is on the roadmap for the coming year, and work has already started.

Solution for "Fatal error: Maximum function nesting level of '100' reached, aborting!" in PHP

I have made a function that finds all the URLs within an HTML file and repeats the same process for each HTML document linked from the discovered URLs. The function is recursive and can go on endlessly, so I put a limit on the recursion by setting a global variable that stops it after 100 recursions.
However, PHP returns this error:
Fatal error: Maximum function nesting level of '100' reached,
aborting! in
D:\wamp\www\crawler1\simplehtmldom_1_5\simple_html_dom.php on line
1355
I found a solution here: Increasing nesting function calls limit, but it is not working in my case.
I am quoting one of the answers from the link mentioned above; please do consider it:
"Do you have Zend, IonCube, or xDebug installed? If so, that is probably where you are getting this error from.
I ran into this a few years ago, and it ended up being Zend putting that limit there, not PHP. Of course removing it will let >you go past the 100 iterations, but you will eventually hit the memory limits."
Is there a way to increase the maximum function nesting level in PHP
Increase the value of xdebug.max_nesting_level in your php.ini
A simple solution solved my problem. I just commented out this line:
zend_extension = "d:/wamp/bin/php/php5.3.8/zend_ext/php_xdebug-2.1.2-5.3-vc9.dll"
in my php.ini file. This extension was limiting the stack to 100, so I disabled it. The recursive function now works as anticipated.
Another solution is to add xdebug.max_nesting_level = 200 to your php.ini.
Rather than making recursive function calls, work with a queue model to flatten the structure.
$queue = array('http://example.com/first/url');
while (count($queue)) {
    $url = array_shift($queue);
    $queue = array_merge($queue, find_urls($url));
}

function find_urls($url)
{
    $urls = array();
    // Some logic filling the variable
    return $urls;
}
There are different ways to handle it. You can keep track of more information if you need some insight about the origin or paths traversed. There are also distributed queues that can work off a similar model.
Rather than disabling Xdebug, you can set a higher limit, like:
xdebug.max_nesting_level=500
It's also possible to set this directly from PHP, for example in your project's config file:
ini_set('xdebug.max_nesting_level', 200);
Go into your php.ini configuration file and change the following line:
xdebug.max_nesting_level=100
to something like:
xdebug.max_nesting_level=200
On Ubuntu using PHP 5.5.9:
Go to:
/etc/php5/cli/conf.d
and find the xdebug.ini in that dir; in my case it is 20-xdebug.ini.
Add this line:
xdebug.max_nesting_level = 200
or this:
xdebug.max_nesting_level = -1
Set it to -1 and you don't have to worry about changing the value of the nesting level again.
This probably happened because of Xdebug.
Try commenting out the xdebug.max_nesting_level line in your php.ini (prefix it with a semicolon, as below) and restart your server to reload PHP:
;xdebug.max_nesting_level
Try looking in /etc/php5/conf.d/ to see if there is a file called xdebug.ini
max_nesting_level is 100 by default
If it is not set in that file add:
xdebug.max_nesting_level=300
to the end of the list so it looks like this
xdebug.remote_enable=on
xdebug.remote_handler=dbgp
xdebug.remote_host=localhost
xdebug.remote_port=9000
xdebug.profiler_enable=0
xdebug.profiler_enable_trigger=1
xdebug.profiler_output_dir=/home/drupalpro/websites/logs/profiler
xdebug.max_nesting_level=300
You can then use @Andrey's test before and after making this change to see if it worked:
php -r 'function foo() { static $x = 1; echo "foo ", $x++, "\n"; foo(); } foo();'
php.ini:
xdebug.max_nesting_level = -1
I'm not entirely sure if the value will ever overflow and reach -1, but it'll either never reach -1, or it'll set the max_nesting_level pretty high.
You could convert your recursive code into iterative code that simulates the recursion. This means pushing the current status (URL, document, position in document, etc.) onto an array when you reach a link, and popping it off again when that link has been finished; a sketch follows below.
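A minimal sketch of that idea, reusing the hypothetical find_urls() helper from the queue answer above and tracking depth explicitly:
<?php
// Each stack entry carries the state the recursive version kept on the call stack.
$stack = array(array('url' => 'http://example.com/first/url', 'depth' => 0));
$seen  = array();

while (!empty($stack)) {
    $state = array_pop($stack); // depth-first, mirroring the recursion order

    if ($state['depth'] >= 100 || isset($seen[$state['url']])) {
        continue; // enforces the old 100-level limit without nested calls
    }
    $seen[$state['url']] = true;

    foreach (find_urls($state['url']) as $childUrl) {
        $stack[] = array('url' => $childUrl, 'depth' => $state['depth'] + 1);
    }
}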
Check the recursion from the command line:
php -r 'function foo() { static $x = 1; echo "foo ", $x++, "\n"; foo(); } foo();'
If the counter goes past 100, Xdebug is not imposing the limit, so check the memory limit instead.
You could try to reduce the nesting by implementing parallel workers (as in cluster computing) instead of increasing the number of nested function calls.
For example: you define a limited number of slots (e.g. 100) and monitor the number of "workers" assigned to each of them. When slots become free, you put the waiting workers in them.
<?php
ini_set('xdebug.max_nesting_level', 9999);
... your code ...
P.S. Change 9999 to any number you want.
I stumbled upon this bug as well during development.
In my case it was caused by an underlying loop of functions calling each other, a result of continuous iterations during development.
For future reference by search engines, the exact error my logs provided me with was:
Exception: Maximum function nesting level of '256' reached, aborting!
If, like in my case, the given answers do not solve your problem, make sure you're not accidentally doing something along the lines of the following simplified situation:
function foo(){
    // Do something
    bar();
}

function bar(){
    // Do something else
    foo();
}
In this case, even if you set ini_set('xdebug.max_nesting_level', 9999); it will still print out the same error message in your logs.
If you're using Laravel, do:
composer update
This should work.
In your case it is most likely the crawler instance exceeding Xdebug's nesting limit while it traces errors and debug info.
But in other cases, errors in your own PHP code or in core files such as CodeIgniter libraries can create the same situation, and even if you increase the Xdebug nesting-level setting the error will not vanish.
So look into your code carefully :)
Here was the issue in my case.
I had a service class, which is a library in CodeIgniter, with a function inside like this:
class PaymentService {

    private $CI;

    public function __construct() {
        $this->CI =& get_instance();
    }

    public function process() {
        // lots of CI referencing here...
    }

}
My controller was as follows:
$this->load->library('PaymentService');
$this->process_(); // see, I got this wrong
The function call on the last line was wrong because of a typo; instead it should have been:
$this->Payment_service->process(); // the library class name
I kept getting the nesting-level exceeded message. I even disabled Xdebug, but nothing helped. So check your class names and your code for proper function calls.
I got this error when I was installing many plugins. The error 100 appeared, including the location of the last plugin I had installed, C:\wamp\www\mysite\wp-content\plugins\"...", so I deleted that plugin folder on the C: drive and everything was back to normal. I think I have to limit the number of plugins I install or keep activated. Good luck, I hope this helps.
I had this issue with WordPress on Cloud9. It turned out to be the W3 caching plugin. I disabled the plugin and it worked fine.
Another solution if you are running a PHP script in the CLI (cmd):
The php.ini file that needs editing is different in this case. In my WAMP installation, the php.ini file loaded on the command line is:
\wamp\bin\php\php5.5.12\php.ini
instead of \wamp\bin\apache\apache2.4.9\bin\php.ini, which is loaded when PHP is run from the browser.
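If you're not sure which php.ini your CLI actually loads, a quick check (php_ini_loaded_file() is a stock PHP function):
php -r 'echo php_ini_loaded_file(), PHP_EOL;'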
You can also modify the {debug} function in modifier.debug_print_var.php in order to limit its recursion into objects.
Around line 45, before:
$results .= '<br>' . str_repeat(' ', $depth * 2)
    . '<b> ->' . strtr($curr_key, $_replace) . '</b> = '
    . smarty_modifier_debug_print_var($curr_val, ++$depth, $length);
After:
$max_depth = 10;
$results .= '<br>' . str_repeat(' ', $depth * 2)
    . '<b> ->' . strtr($curr_key, $_replace) . '</b> = '
    . ($depth > $max_depth ? 'Max recursion depth:' . (++$depth) : smarty_modifier_debug_print_var($curr_val, ++$depth, $length));
This way, Xdebug will still behave normally, limiting recursion depth in var_dump and so on, as this is a Smarty problem, not an Xdebug one!
I had the same problem and I resolved it like this:
Open the MySQL my.ini file
In the [mysqld] section, add the following line: innodb_force_recovery = 1
Save the file and try starting MySQL
Remove the line which you just added and save again
