XLConnect loadWorkbook error - POIXMLException (Java)

I'm trying to load a hefty Excel workbook (.xlsm format, ~30 MB) that contains a large number of array calculations.
> wb1 <- loadWorkbook("Mar_SP_20130227_V6.1.xlsm")
Error: POIXMLException (Java): java.lang.reflect.InvocationTargetException
But I am able to successfully load a values-only/no-macro version of the workbook.
> wb2 <- loadWorkbook("Mar_SP_20130227_V6.1_VALUES_ONLY.xlsx")
> wb2
[1] "Mar_SP_20130227_V6.1_VALUES_ONLY.xlsx"
What could be causing the error?
From the maintainer's website I can see that there can be issues with workbooks containing array calculations or unsupported formula functions, but this doesn't look like the same error.
Java Info:
C:\> java -version
java version "1.6.0_21"
Java(TM) SE Runtime Environment (build 1.6.0_21-b07)
Java HotSpot(TM) Client VM (build 17.0-b17, mixed mode)

It turns out that the root cause of this error was the JVM running out of memory (even with options(java.parameters = "-Xmx1024m")).
I tried to increase the memory, but couldn't get the JVM to take more than -Xmx2048m, which still wasn't enough to load the workbook.
So I upgraded the JRE from 32-bit to 64-bit and ran 64-bit R.
I was then able to set -Xmx4096m and successfully load my 30 MB workbook.
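For anyone hitting the same error, here is a minimal sketch of the working setup, assuming 64-bit Java and 64-bit R and reusing the filename from the question. The key detail is that the heap option must be set before any rJava-based package starts the JVM:
# Set the JVM heap before loading XLConnect; the JVM is initialised
# once per session and ignores later changes to this option.
options(java.parameters = "-Xmx4096m")

library(XLConnect)
wb1 <- loadWorkbook("Mar_SP_20130227_V6.1.xlsm")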

Related

Cholmod error 'out of memory' : Merging Seurat Objects

I am trying to merge Seurat class objects that contain transcriptome count data (sparse matrices). I am relatively new to R, so any help or solutions are appreciated. I have added a screenshot of the data I'm working with.
General Info:
> memory.size(max = TRUE)
[1] 2533.94
R version 4.0.3 (2020-10-10)
Platform: i386-w64-mingw32/i386 (32-bit)
Running under: Windows 10 x64 (build 19041)
attached base packages:
[1] stats graphics grDevices utils
[5] datasets methods base
other attached packages:
[1] RSQLite_2.2.3 Seurat_3.2.3
I am not sure if my storage is the issue or if I should split the function in two.
options(stringsAsFactors = F)
setwd("C:/Users/Amara/OneDrive - Virginia Tech/XieLab/ZebraFish_Project/zf_brain-master/data")
folders <- list.files("C:/Users/Amara/OneDrive - Virginia Tech/XieLab/ZebraFish_Project/zf_brain-master/data")
library(Seurat)
library(dplyr)
zfbrainList = lapply(folders, function(folder) {
  CreateSeuratObject(counts = Read10X(folder),
                     project = folder)
})
zfbrain.combined <- merge(zfbrainList[[1]],
                          y = c(zfbrainList[[2]], zfbrainList[[3]], zfbrainList[[4]], zfbrainList[[5]],
                                zfbrainList[[6]], zfbrainList[[7]], zfbrainList[[8]], zfbrainList[[9]],
                                zfbrainList[[10]], zfbrainList[[11]], zfbrainList[[12]], zfbrainList[[13]],
                                zfbrainList[[14]], zfbrainList[[15]]),
                          add.cell.ids = folders,
                          project = "zebrafish")
Error in .cbind2Csp(x, y) :
Cholmod error 'out of memory' at file ../Core/cholmod_memory.c, line 147
Data folder
The machine used to process the data in the original question has a 64-bit Windows operating system running a 32-bit version of R. The result from memory.size() shows that approximately 2.4Gb of RAM is available to the malloc() function used by R. The 32-bit version of R on Windows can access a maximum of slightly less than 4Gb of RAM when running on 64-bit Windows, per the help for memory.size().
The Memory Limits in R help page (help("Memory-limits")) tells us that in 32-bit R on Windows it is usually not possible to allocate a single vector 2Gb in size, because Windows consumes some memory in the middle of the 2Gb address space.
Once we load the data from the question, the zfbrainList object consumes about 1.2Gb of RAM.
options(stringsAsFactors = F)
folders <- list.files("./data/zebraFishData",full.names = TRUE)
library(Seurat)
library(dplyr)
zfbrainList = lapply(folders, function(folder) {
  CreateSeuratObject(counts = Read10X(folder),
                     project = folder)
})
format(object.size(zfbrainList),units = "Gb")
...and the result:
> format(object.size(zfbrainList),units = "Gb")
[1] "1.2 Gb"
At this point, the code attempts to merge the objects from the list into a single object.
zfbrain.combined <- merge(zfbrainList[[1]],
                          y = c(zfbrainList[[2]], zfbrainList[[3]], zfbrainList[[4]], zfbrainList[[5]],
                                zfbrainList[[6]], zfbrainList[[7]], zfbrainList[[8]], zfbrainList[[9]],
                                zfbrainList[[10]], zfbrainList[[11]], zfbrainList[[12]], zfbrainList[[13]],
                                zfbrainList[[14]], zfbrainList[[15]]),
                          add.cell.ids = folders,
                          project = "zebrafish")
When we calculate the size of the resulting zfbrain.combined object, we find that it is also about 1.2Gb in size, which means the list and the merged object together consume nearly all of the RAM available to R on the original poster's machine.
format(object.size(zfbrain.combined),units = "Gb")
> format(object.size(zfbrain.combined),units = "Gb")
[1] "1.2 Gb"
Since zfbrainList must remain in RAM while zfbrain.combined is being created, the merge as coded above cannot run in an instance of R with only 2.4Gb of RAM accessible: together, zfbrainList and zfbrain.combined consume 2.4 - 2.5Gb, exclusive of the other RAM R needs to run.
Solution: use the 64-bit version of R
Since most Windows-based machines have at least 4Gb of RAM, and the amount of RAM reported by memory.size() was 2.4Gb, it's likely there is at least 4Gb of RAM on the machine. The machine used in the original post already had 64-bit Windows installed, so we can enable R to access more memory by installing and running the 64-bit version of R.
On a Windows-based machine with 8Gb RAM, 32-bit R reports the following for memory.size() and memory.limit().
Interestingly, R reports 25.25 for memory.size() because results are rounded to 0.01Mb per the help documentation, while memory.limit() returns a number between 0 and 4095 (also per the documentation). On our test machine it reports 3583, about 3.5Gb of RAM.
When we run these functions in 64-bit R on the same machine, memory.size() reports 34.25, and memory.limit() reports that R can access the full 8Gb of RAM installed on this particular machine, so allocations are no longer capped by the 32-bit address-space limits.
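To run the same checks on your own machine (a Windows-only sketch; both functions were made defunct in R 4.2, so this assumes R 4.1 or earlier):
memory.size(max = TRUE)  # Mb of RAM obtained from the OS so far this session
memory.limit()           # total Mb of RAM this build of R may use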
Testing the solution
When I run the code in a 32-bit R 4.0.3 session on 64-bit Windows, I am able to replicate the out of memory error.
When I run the code in the 64-bit version of R, it runs to completion, and I am able to calculate the size of the resulting zfbrain.combined object.
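Before re-running the merge, a quick way to confirm which build of R a session is using:
.Machine$sizeof.pointer  # 8 in 64-bit R, 4 in 32-bit R
R.version$arch           # reports the architecture, e.g. "x86_64"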

Error with H2O in R - can't connect to local host

I can't get h2o to work in my R. It shows the following error, and I have no clue what it means. Previously it gave me an error because I didn't have the 64-bit version of Java. I downloaded the 64-bit version, restarted my PC, and started the process again, and now it gives me this error.
Any suggestions?
library(h2o)
----------------------------------------------------------------------
Your next step is to start H2O:
> h2o.init()
For H2O package documentation, ask for help:
> ??h2o
After starting H2O, you can use the Web UI at http://localhost:54321
For more information visit http://docs.h2o.ai
----------------------------------------------------------------------
Attaching package: ‘h2o’
The following objects are masked from ‘package:stats’:
cor, sd, var
The following objects are masked from ‘package:base’:
%*%, %in%, &&, ||, apply, as.factor, as.numeric, colnames, colnames<-, ifelse,
is.character, is.factor, is.numeric, log, log10, log1p, log2, round, signif, trunc
> h2o.init(nthreads = -1)
H2O is not running yet, starting it now...
Note: In case of errors look at the following log files:
C:\Users\ADM_MA~1\AppData\Local\Temp\RtmpygK1EJ/h2o_Adm_Mayur_started_from_r.out
C:\Users\ADM_MA~1\AppData\Local\Temp\RtmpygK1EJ/h2o_Adm_Mayur_started_from_r.err
java version "9"
Java(TM) SE Runtime Environment (build 9+181)
Java HotSpot(TM) 64-Bit Server VM (build 9+181, mixed mode)
Starting H2O JVM and connecting: ............................................................
[1] "localhost"
[1] 54321
[1] TRUE
[1] -1
[1] "Failed to connect to localhost port 54321: Connection refused"
[1] 127
Error in h2o.init(nthreads = -1) :
H2O failed to start, stopping execution.
In addition: Warning message:
running command 'curl 'http://localhost:54321'' had status 127
Screenshot for h2o error in R
Based on the error message and the troubleshooting we carried out in the comments, it seems that you are using a version of Java (Java 1.9) which is too new for your version of H2O.
Your two options seem to be:
Verify that your version of H2O is up to date. If not, update it.
Download a compatible version of Java, i.e. Java 1.8 (you can use it just for this one task rather than for everything, if you prefer; see the sketch below).
Note that on the main documentation page of H2O v3 it says:
Java 7 or later. Note: Java 9 is not yet released and is not currently
supported.
But at the same time they usually have several Beta and Alpha development branches going, so you might find one of those that works with Java 9.
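If you do keep Java 9 as the system default, one possible approach (untested here; the install path is hypothetical) is to point just this R session at a 64-bit Java 8 install before starting H2O:
# Tell this session where a 64-bit Java 8 lives; adjust the path to
# match your actual install.
Sys.setenv(JAVA_HOME = "C:/Program Files/Java/jre1.8.0_144")
library(h2o)
h2o.init(nthreads = -1)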
In case anyone else is facing the same issue:
My recommendation, after spending over 10 hours trying to figure this out (worth mentioning), is to check your version of Java.
If it's newer than Java 8, you can either keep it or remove it.
I removed it because I didn't want to deal with setting JAVA_HOME in R (as sketched in the previous answer) and to reduce work.
Make sure you install Java 7 or 8, but a 64-bit version; h2o doesn't work if you have 32-bit Java.
Then voila! Just go ahead and type install.packages("h2o") in your RStudio.
I wanted to be extra careful on my final attempt, so I unloaded and uninstalled the package (because I had installed it before), installed it again, loaded it with library(h2o), and then h2o.init() worked just fine.

Deepwater with TensorFlow: a fatal Java error

I get an error executing the following R code:
hsc = h2o.init(ip = "127.0.0.1", port = 54321, nthreads = -1, max_mem_size = "8G")
model_tf <- h2o.deepwater(
  x = col_start:col_end,
  y = col_class,
  backend = "tensorflow",
  training_frame = train)
Error from the H2O console:
A fatal error has been detected by the Java Runtime Environment:
SIGILL (0x4) at pc=0x00007f49f117892d, pid=4616, tid=0x00007f4a7d88a700
JRE version: Java(TM) SE Runtime Environment (8.0_144-b01) (build 1.8.0_144-b01)
Java VM: Java HotSpot(TM) 64-Bit Server VM (25.144-b01 mixed mode linux-amd64 compressed oops)
Problematic frame:
C [libtensorflow_jni.so00358a4a-1301-4222-a4f6-273b7a1baf4c+0x211992d]
Are you running this on an Ubuntu 16.04 machine with an Nvidia GPU and all the requirements from this page met: https://github.com/h2oai/deepwater?
The reason I'm asking is that this is the error you get when you try to run the GPU version on a machine that does not have a GPU.
Deepwater won't work unless the requirements are met. A simple way to satisfy them is to use one of the Docker images:
https://github.com/h2oai/deepwater#pre-release-docker-image

(R error) Error: cons memory exhausted (limit reached?)

I am working with big data and I have a 70GB JSON file.
I am using jsonlite library to load in the file into memory.
I have tried an AWS EC2 x1.16xlarge machine (976 GB RAM) to perform this load, but R breaks with the error:
Error: cons memory exhausted (limit reached?)
after loading in 1,116,500 records.
Thinking that I do not have enough RAM, I tried to load in the same JSON on a bigger EC2 machine with 1.95TB of RAM.
The process still broke after loading 1,116,500 records. I am using R version 3.1.1 and executing it with the --vanilla option. All other settings are default.
here is the code:
library(jsonlite)
data <- jsonlite::stream_in(file('one.json'))
Any ideas?
stream_in has a handler argument that lets you process large data in batches: instead of accumulating everything in memory, you can write each batch of parsed records to a file or filter out the data you don't need, as in the sketch below.
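A minimal sketch of that approach, assuming hypothetical field names "id" and "value": stream_in parses the file page by page, the handler sees each page as a data frame, and only one page is ever held in memory at a time.
library(jsonlite)

out <- file("subset.csv", open = "w")

# The handler runs once per page of parsed records; keep only the
# needed columns and append them to the CSV rather than storing them.
stream_in(file("one.json"), handler = function(df) {
  slim <- df[, c("id", "value")]
  write.table(slim, out, sep = ",", row.names = FALSE, col.names = FALSE)
}, pagesize = 10000)

close(out)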

How to initialise JVM with larger maximum heap size using rJava

I am attempting to make use of the Stanford NLP tools (Java) in R, using the rJava package. When attempting to create a StanfordCoreNLP object I get the following error:
Error in .jnew("edu/stanford/nlp/pipeline/StanfordCoreNLP", props) : java.lang.OutOfMemoryError: Java heap space
To resolve this, I have attempted to initialise the JVM with a larger maximum heap size, using variations of the following code:
.jinit(parameters=c("-Xms1g","-Xmx4g"))
When the maximum heap is set to 1GB using -Xmx1g the JVM loads but I continue to get the OutOfMemoryError. When the maximum heap size is set to 2 or 3 GB (-Xmx2g or -Xmx3g), R will stop responding. When set to 4GB or higher -Xmx4g I will get the following message:
Error in .jinit(parameters = c("-Xms1g", "-Xmx4g"), force.init = TRUE) : Cannot create Java virtual machine (-6)
How do you successfully initialise the JVM to values larger than 1GB using rJava? I am using 32-bit versions of Java (v8 u51) and R (v3.2.0).
I am using 32-bit versions of Java (v8 u51) and R (v3.2.0)
That's your problem right there. Switch to 64-bit versions.
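Once on 64-bit builds, a sketch of the initialisation, using the same heap parameters as the question plus an optional sanity check:
# Heap options must be set before the JVM starts; rJava initialises it
# once per session and ignores later parameter changes.
options(java.parameters = c("-Xms1g", "-Xmx4g"))
library(rJava)
.jinit()

# Verify the maximum heap the JVM actually granted, in GB:
rt <- .jcall("java/lang/Runtime", "Ljava/lang/Runtime;", "getRuntime")
.jcall(rt, "J", "maxMemory") / 1024^3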
