I have an h5 model that is already trained. I want to load that model into my R application.
The code used is:
library(keras)
m <- keras::load_model_hdf5("www/ae_model.h5")
I even tried:
m <- load_model_hdf5("www/ae_model.h5")
My R session asks for a restart, and the screen shows: "R Session Aborted. R encountered a fatal error. The session was terminated."
I was able to solve this issue myself: I made sure the file is stored in my current working directory and added another "/" to the path in my existing code, i.e.
m <- load_model_hdf5("www//ae_model.h5")
Thank you for the response.
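For anyone hitting the same crash, a quick sanity check before loading (a sketch; the path matches the question):
library(keras)
# Confirm the file is actually visible from the current working directory;
# a missing or wrong path is a common cause of the load failing.
file.exists("www/ae_model.h5")   # should print TRUE
m <- load_model_hdf5("www/ae_model.h5")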
Some context on my environment:
I am running RStudio in a Docker container called rocker/verse.
I downloaded this dataset from Kaggle; it is about 470 MB.
When working with it, at some point RStudio restarts. It doesn't happen after a specific call, and I've seen the same problem when working on other projects. Though I don't think it is related to my code, I am posting it below.
library(data.table)
fraud <- fread("path.csv")                              # load the full dataset
fraud1 <- sort(sample(nrow(fraud), nrow(fraud) * 0.7))  # row indices for a 70% sample
train <- fraud[fraud1, ]                                # 70% training set
test <- fraud[-fraud1, ]                                # remaining 30% test set
Usually this message is printed on the console:
Error: Error occurred during transmission
And this pop-up is also shown.
I have no idea what is causing it. I would appreciate any help.
Delete the .Rhistory files associated with the installation and any open project.
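A sketch of that cleanup from within R itself (paths assumed; adjust to your setup):
# Remove .Rhistory from the home directory and the current project directory.
unlink("~/.Rhistory")
unlink(file.path(getwd(), ".Rhistory"))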
You have a problem with your user data files for RStudio. Follow the hints given here: https://community.rstudio.com/t/rstudio-server-error-occurred-during-transmission/84258 and here: https://support.rstudio.com/hc/en-us/articles/218730228-Resetting-a-user-s-state-on-RStudio-Server.
I'm building an R package for binary classification and I'm using OpenCPU to host it. Currently I've saved the h5 file as an .RData file (serialized), which is then loaded into the environment using the .onLoad() function in R. This enables the R script to use the environment variable to load the Keras model with keras::unserialize_model().
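For reference, that workaround looks roughly like this (a sketch; file names, object names, and the package layout are illustrative, not from the post):
# One-off, at development time: serialize the trained model to a raw vector
# and save it as internal package data.
library(keras)
m <- load_model_hdf5("modelfile_26feb.h5")
model_raw <- serialize_model(m)
save(model_raw, file = "R/sysdata.rda")   # internal data, visible to package functions

# Inside the package: rebuild the live model when the package loads.
.model_env <- new.env(parent = emptyenv())
.onLoad <- function(libname, pkgname) {
  .model_env$model <- keras::unserialize_model(model_raw)
}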
I've tried directly using keras::load_model_hdf5() in the code, but after building and deploying on OpenCPU, when I try to hit the prediction API I get this error:
ioerror: unable to open file (unable to open file: name = '/home/modelfile_26feb.h5', errno = 13, error message = 'permission denied', flags = 0, o_flags = 0)
I have changed the permissions on the file (777) and even the groups, but I am still getting the error.
I even tried putting the file in the inst/extdata folder so that it gets included in the package, but I still get the same error.
Can anyone help with this, or suggest an alternative way to load the h5 model directly?
Which OS does OpenCPU run on? And why does it try to write in /home/? This is very unusual. The best solution is to adapt your code to write to getwd() or tempdir(). Even better is to store the data in a local database or a Redis server and let R read it from there, so you don't need disk access at all.
If you run on Ubuntu Server, reading from /home/ is not permitted by default. If you want to allow this, you need to add AppArmor rules; see section 3.5 of the server manual.
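Along those lines, a sketch (the package name "yourpkg" is a placeholder): resolve the bundled file through system.file(), which points into the installed package rather than /home/, and use tempdir() for anything the code needs to write:
# Locate the h5 file inside the installed package (shipped via inst/extdata).
model_path <- system.file("extdata", "modelfile_26feb.h5", package = "yourpkg")
m <- keras::load_model_hdf5(model_path)
# Any files the package needs to write should go to tempdir(), e.g.:
out <- file.path(tempdir(), "predictions.csv")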
Some relevant topics from the OpenCPU mailing list:
write in home dir: https://groups.google.com/d/msg/opencpu/5vRvgSKY-qE/4xMzZCGJBAAJ
keras in opencpu: https://groups.google.com/d/msg/opencpu/HhRzFVVFdaA/n5Nu1sxyFgAJ
write tmp folder: https://groups.google.com/d/msg/opencpu/Y1tYhaQUzwU/ubSEd_CDCgAJ
My situation: I wrote an R script on a university Windows computer and emailed the file to myself. The first and second times I ran the script it worked perfectly; however, on the third run I started getting the following error.
I figured out the line causing the problem and the error message:
mapdt <- geojson_read("http://maperic.clst.org/wupl/Stuff/gz_2010_us_040_00_500k.json",
                      what = "sp")
Error in curl::curl_fetch_disk(url, x$path, handle = handle) :
Timeout was reached
I have tried the http_proxy method in this link, but it did not work properly. I have the same problem in two different network environments.
Thank you.
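One workaround (a sketch, not from the original thread; the 300-second timeout is an arbitrary example value) is to download the file with an explicit, longer timeout and then read it locally:
library(curl)
library(geojsonio)
# Fetch the GeoJSON with a generous timeout, then read it from disk.
h <- new_handle(timeout = 300)
tmp <- tempfile(fileext = ".json")
curl_download("http://maperic.clst.org/wupl/Stuff/gz_2010_us_040_00_500k.json",
              tmp, handle = h)
mapdt <- geojson_read(tmp, what = "sp")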
I would like to build a simple webserver using Rook; however, I am getting strange errors when trying it in RStudio:
The code
library(Rook)
s <- Rhttpd$new()
s$start()
print(s)
returns the rather useless error
"Error in listenPort > 0 :
comparison (6) is possible only for atomic and list types".
When trying the same code in a plain R console, everything works, so I would like to understand why this happens and how I can fix it.
RStudio is version 0.99.484 and R is 3.2.2.
I've experienced the same thing.
TL;DR: this pull request solves the problem: https://github.com/jeffreyhorner/Rook/pull/31
RStudio is treated differently: Rook's port is the same as the tools:::httpdPort value. The problem is that in the current Rook master, tools:::httpdPort is assigned directly. It's a function, which is why we need to evaluate it first.
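A toy illustration of that distinction (not Rook's actual source):
port <- tools:::httpdPort    # assigns the function object itself
port <- tools:::httpdPort()  # evaluates it, yielding the actual port number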
If you want it solved right now, without waiting for the merge into master, install devtools and install the package from my fork on GitHub:
install.packages("devtools")
library(devtools)
install_github("filipstachura/Rook")  # fork containing the httpdPort fix
My R script worked fine in RStudio (version 0.98.1091) on Windows 7. Then I restarted my laptop, opened RStudio again, and now it gives the following error messages each time I try to execute my code:
cl <- makeCluster(mc)  # build the cluster
Error: could not find function "makeCluster"
registerDoParallel(cl)
Error: could not find function "registerDoParallel"
fileIdndexes <- gsub("\\.[^.]*", "", basename(SF))
Error in basename(SF) : object 'SF' not found
These error messages are slightly different each time I run the code. It seems that RStudio cannot find any function that is used in the code.
I restarted the R session, cleaned the workspace, and restarted RStudio. Nothing helps.
It should be noted that after many attempts to execute the code, it finally ran. However, after 100 iterations it crashed with a message about localhost being unavailable.
Add library(*the package needed/where the function is*) for each of the packages you're using.
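A sketch of what that looks like for the code in the question (package names inferred from the functions used):
library(parallel)     # provides makeCluster()
library(doParallel)   # provides registerDoParallel()

cl <- makeCluster(mc)   # build the cluster
registerDoParallel(cl)  # register it as the parallel backend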