I am trying to upload a data.frame called 'ftp_test' via the ftpUpload command
library(RCurl)
ftpUpload("Localfile.html", "ftp://User:Password#FTPServer/Destination.html")
and am getting an error:
Error in file(what, "rb") : cannot open the connection
In addition: Warning message:
In file(what, "rb") :
cannot open file 'ftp_test': No such file or directory
Could anyone tell me what the issue is here? Can I actually use a data.frame and upload it from the R global environment?
If I can't use the data.frame, is there any workaround?
Many thanks,
Artur
Your problem is that you are trying to send an R object over a file transfer protocol. Since FTP transfers files, you have to specify how the object should be saved to a file first. A workaround is to write it to a file, upload that file, and then delete the local copy afterwards. Saving it as an .RData/history file would also be fine, but one way or another you need to turn the R object into a file. This example uses an open FTP test server (uploads get deleted immediately, but you can check whether the upload works):
library(RCurl)

filename <- "test.csv"
write.csv(df, file = filename)   # df is your data.frame, e.g. ftp_test
# write.csv() puts the file in your working directory; check it with getwd()
ftpUpload(filename, paste0("ftp://speedtest.tele2.net/upload/", filename))
file.remove(filename)
Also make sure your server is running. You can try your code against the open FTP server first.
I created an R Shiny app that runs automatically every day using a batch file.
Everything works fine when launching the app, but the next day it crashes and I get the following message:
Warning in file(open = "w+") :
cannot open file
'C:\Users\bertin\AppData\Local\Temp\RtmpKiBPOU\Rf3f835d1a66' : No such file or directory
Warning: Error in file: cannot open the connection
[No stack trace available]
Apparently this issue is related to the tempdir() folder created by the R session executing the Shiny app: this folder is automatically deleted after a certain time. Do I have to delete all temp files on each refresh? Or, on the contrary, do I need to prevent R from deleting the Shiny temp files in the Temp folder? Thanks!
Edit - Here is how to intentionally generate the error:
tempdir()
dir.exists(tempdir())

library(shiny)

# Windows shell required
shinyApp(
  ui = fluidPage("Please reload to see me fail."),
  server = function(input, output) {
    shell(paste(
      "rmdir",
      dQuote(normalizePath(tempdir(), winslash = "/", mustWork = FALSE), q = FALSE),
      "/s /q"
    ))
  }
)
In the meantime I've found a setting in Windows 10 (Storage Sense) concerning the deletion of temporary files, which seems to be active by default.
Navigate as follows and uncheck:
Settings
System > Storage
Storage Sense
Change how we free up space automatically
Delete temporary files that my apps aren't using
When your temp directory is deleted, session data is lost as well. But if I understand your question correctly, that is not relevant for your Shiny application.
So if you don't need any session data from yesterday, you can call .rs.restartR() to restart your R session, which sets up a new temporary directory. You will probably get an error that your last session could not be saved (as the directory doesn't exist anymore).
After this you should be able to start your Shiny app again.
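If you'd rather not restart the session, a small defensive check along these lines (my own sketch, not part of the original answer) recreates the temp directory before it is needed again:
if (!dir.exists(tempdir())) {
  # Storage Sense (or anything else) removed the session temp folder; recreate it
  dir.create(tempdir(), recursive = TRUE)
}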
I'm trying to publish an R Shiny app. It works when run locally, but once published to shinyapps.io it produces the following error.
Warning in gzfile(file, "wb") :
cannot open compressed file '/key.rda', probable reason 'Permission denied'
Error in value[[3L]](cond) : cannot open the connection
Calls: local ... tryCatch -> tryCatchList -> tryCatchOne -> <Anonymous>
Execution halted
You can also see the actual page with the error here: https://povertylab.shinyapps.io/ACS-Map-Dashboard/
Though I have tried to reproduce this error, it doesn't appear when I publish other apps, and my searches haven't turned up anything. Other things I've tried: publishing from other computers, publishing only the global.R, server.R, and ui.R files, and copying the files into a new project and publishing from there.
You can find all code for the app here: https://github.com/Poverty-Lab/ACS-Map-Dashboard
I would appreciate any input, even if it's just guidance on what gzfile is and what the error message could mean. Thank you!
Where is the key.rda file supposed to be? I've looked through your repo and I don't see it, which is probably causing the "cannot open the connection" error.
As a side note, you should probably add the .Renviron file to your .gitignore; right now anyone can see and use your key. Make sure you remove it from the Git history as well.
Thanks all. It turns out this was a problem with the way we were handling the API key for the acs package. We were using api.key.install to install the API key inside the app, and one of api.key.install's default arguments is file = "key.rda", and that file apparently could not be found. I'm still not sure why this problem only came up when we published the app, but we got around it by supplying the actual API key to the acs.fetch function in server.R.
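For anyone hitting the same thing, here is a rough sketch of that workaround (the table, the geography, and the CENSUS_API_KEY environment variable name are my own illustrative choices, not from the original app):
library(acs)
census_key <- Sys.getenv("CENSUS_API_KEY")   # hypothetical env var holding the key
# Pass the key directly instead of relying on api.key.install(), whose
# default file = "key.rda" argument caused the error above.
pop <- acs.fetch(endyear = 2015,
                 geography = geo.make(state = "IL", county = "*"),
                 table.number = "B01003",     # example table: total population
                 key = census_key)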
After installing the sparklyr package I followed the instructions here (http://spark.rstudio.com/) to connect to Spark, but I am faced with this error. Am I doing something wrong? Please help me.
sc = spark_connect( master = 'local' )
Error in file(con, "r") : cannot open the connection
In addition: Warning message:
In file(con, "r") :
cannot open file 'C:\Users\USER\AppData\Local\Temp\RtmpYb3dq4\fileff47b3411ae_spark.log':
Permission denied
But I am able to find the file at the stated location, and on opening it I found it to be empty.
First of all, did you install sparklyr from GitHub (devtools::install_github("rstudio/sparklyr")) or from CRAN?
There were some issues some time ago with Windows installations.
The issue you have seems to be related to folder-level permissions on the TEMP and TMP folders on Windows, or to file-creation permissions. Every time you run sc <- spark_connect(), it tries to create a folder and a file to write the log files to.
Make sure you have write access to these locations.
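A few quick checks along those lines (my own sketch; the TMPDIR value is only an example):
Sys.getenv(c("TMP", "TEMP"))            # where Windows points temporary files
file.access(tempdir(), mode = 2) == 0   # TRUE if R can write to its temp folder
# If the folder is not writable, point R at one that is, e.g. in your .Renviron:
# TMPDIR=C:/Rtemp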
I observed the same error message with Spark 2.4.3 and 2.4.4 in different cases:
When trying to connect to a non-"local" master, using spark_connect(master = "spark://192.168.0.12:7077", ...), if the master is not started or is not responding at the specified master URL.
When setting a specific but incomplete configuration; in my case, trying to set dynamicAllocation to true without the other required dynamicAllocation settings:
conf <- spark_config()
conf$spark.dynamicAllocation.enabled <- "true"
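For reference, here is a sketch of a more complete configuration (the key names come from the Spark documentation; the executor counts are just examples). Dynamic allocation also needs the external shuffle service, or shuffle tracking on Spark 3+, to be enabled:
library(sparklyr)
conf <- spark_config()
conf$spark.dynamicAllocation.enabled <- "true"
conf$spark.shuffle.service.enabled <- "true"        # required alongside dynamic allocation
conf$spark.dynamicAllocation.minExecutors <- 1      # example values
conf$spark.dynamicAllocation.maxExecutors <- 4
sc <- spark_connect(master = "spark://192.168.0.12:7077", config = conf)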
I'm trying to do two things: first, I create a PDF with a 4x5 grid of plots, ending with dev.off(), and then I try to create a new graph. However, after starting the second plot, I get:
Error in gzfile(file, "wb") : cannot open the connection
In addition: Warning message:
In gzfile(file, "wb") :
cannot open compressed file '/var/folders/n9/pw_dz8d13j3gb2xgqb6rfnz00000gn/T/RtmpTfm1Ur/rs-graphics-822a1c83-b3fd-46c3-8028-4e0778f91d0c/4db4b438-ac35-403b-b791-e781baba152c.snapshot', probable reason 'No such file or directory'
Graphics error: Error in gzfile(file, "wb") : cannot open the connection
What is this error? The working directory is one I have read/write access to, and my hard drive isn't full.
Also, I'm using RStudio.
This is a bit late but for anyone coming here for help, I got this error when I was trying to write a file from RStudio and my destination file path was very long. I realized this could be a problem because when I wrote the file to another location with a shorter name and tried to copy it into my original destination, Windows gave me an error saying "File path too long". You might need to save the original file into another location with a shorter absolute path.
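A rough way to check for this (my own sketch; classic Windows APIs limit a full path to roughly 260 characters):
path <- file.path(getwd(), "a_very_long_file_name_for_my_results.rds")   # hypothetical file
nchar(path)   # if this is anywhere near 260, try a shorter location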
Maybe you should look here. At the end it says
Note:
The most common reason for failure is lack of write permission in the current directory. For save.image and for saving at the end of a session this will be shown by messages like
Error in gzfile(file, "wb") : unable to open connection
In addition: Warning message:
In gzfile(file, "wb") :
cannot open compressed file '.RDataTmp',
probable reason 'Permission denied'
So, quickly: run getwd() and look at where your working directory is set. If you're trying to save your document somewhere that isn't your current working directory, it can throw this error.
At the end of your error message, it says probable reason 'No such file or directory'
Graphics error: Error in gzfile(file, "wb") : cannot open the connection
My diagnosis would be simply that it's trying to save your item in the wrong place and RStudio is not able to find the right place.
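A couple of quick sanity checks along those lines (my own sketch):
getwd()                           # where R will write by default
file.access(getwd(), mode = 2)    # 0 means the directory is writable, -1 means it is not
# setwd("C:/some/writable/folder")   # or pass a full path to save()/saveRDS()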
This burned me, so hopefully this saves someone else some toil. The classifiers loaded just fine on OS X, but on the Linux deployment system they would fail with the error listed in the question. The issue was that the files on disk had the extension abc.RData, but the code read modelAbc <- readRDS(file = "abc.Rdata"). The difference between the upper- and lowercase D in the .RData vs .Rdata extension fails on Linux, which is case-sensitive. It was not very noticeable, so check your extensions for case.
You may not have permission to save files in that directory.
In RStudio, get your working directory with getwd().
Then go to that directory in Linux and check its owner with ls -l.
Now you can change the owner of the directory with chown -R username directoryname.
But you must be root (or use sudo) to do so.
Problem resolved by specifying full file path:
saveRDS(df,'C:\\users\\matt\\desktop\\code\\df.Rdata')
I faced this issue lately. Try turning off your anti-virus and then building the package; it might help. It worked for me. Anti-virus software sometimes blocks the permissions, and you can avoid that by disabling it for a while just before building a package.
I was trying to save an RDS file to my local Dropbox folder so it would sync with my Dropbox.
I figured out I got this error because I was trying to save into a new folder: it looks like saveRDS cannot create a new folder, but it can add files to existing folders. So I changed the path to put the file into an existing folder and it worked!
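In other words, create the folder first. A minimal sketch (the path and object name are made up):
out_dir <- "C:/Users/me/Dropbox/new_folder"               # hypothetical new folder
if (!dir.exists(out_dir)) dir.create(out_dir, recursive = TRUE)
saveRDS(my_data, file.path(out_dir, "my_data.rds"))       # saveRDS() itself won't create folders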
In my case it was Windows Defender that was preventing RStudio from writing any file to the hard drive. Either turn Controlled Folder Access off or add RStudio to the exclusion list.
I also had this problem when working with RStudio and R Markdown. I was getting this error message and had an annoying number of fatal errors which closed RStudio. My issue was that I was working off a network drive, and either the path was too long, as in AHedge's answer above, or my network firewall was giving me trouble. For the moment I have moved my working files to my desktop and things seem to be working fine. Not sure what this means for my file management over time.
Just to add more clarity (scenarios from my experience) to what M Beausoleil mentioned.
When you are using a shared working directory and try to overwrite RDS files that already exist there but were written by some other user, you get this error.
As some people have already noted, deleting the existing RDS files or changing the working directory works. It's not magic: it works because you are then writing a new RDS file rather than trying to overwrite the old ones.
I ran into the same problem after installing a new version of RStudio.
The R Markdown file I had created with the old version of RStudio showed the same problem.
When I used ggplot() to draw a picture, the error was as follows:
Warning in gzfile(file, "wb") :
cannot open compressed file 'I:/Rlearning/.Rproj.user/shared/notebooks/58A1385C-PCA作图/1/2C15461A183AC56C/cco192gb0pow1_t\_rs_rdf_32004888ecb.rdf', probable reason 'No such file or directory'
Error in gzfile(file, "wb") : cannot open the connection
Solution:
Create a new R Markdown file.
Delete all of its code.
Copy your old R Markdown code into it.
I had the same problem. For me, it was caused by not having enough disk space on the drive where RStudio was installed. Freeing up space worked.
The reason for the error is that your username is Chinese. Create a new user folder with an English name in the user directory, for example "DavidSmith". Then create the three nested folders "AppData", "Local", and "Temp", giving the directory C:\Users\DavidSmith\AppData\Local\Temp.
In the Advanced system settings, modify the environment variables TMP and TEMP to point to C:\Users\DavidSmith\AppData\Local\Temp and save them.
After the modification, open RStudio and try again.
Note: TMP and TEMP are modified among the USER variables.
I just ran into this problem after changing my system locale.
Check your locale using Sys.getlocale().
Change it to an appropriate one using Sys.setlocale("LC_ALL", "ENG") (replace "ENG" with the appropriate locale, e.g. "English" on Windows).
I can't say with certainty which locale is appropriate, but it seems to need to match the default OS one.
Hope this helps!
I had this error because of an invalid character in the filename I used to save the file, in my case "/" (there are several characters like this that cannot be used in a filename). I removed the character and the problem was solved.
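One way to guard against this (my own sketch; the filename is made up) is to replace the characters Windows does not allow in file names before saving:
safe_name <- gsub('[<>:"/\\\\|?*]', "_", "results 2020/01/15.rds")
safe_name                      # "results 2020_01_15.rds"
# saveRDS(results, safe_name)  # 'results' stands in for whatever object you are saving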
In my case, I received the error "Error in gzfile(file, "wb") : cannot open the connection" when trying to exit R in the Anaconda Prompt and save the workspace image. I am using Windows 10 and R 3.5.2. To fix it, I went to the Program Files folder, right-clicked the R folder and selected Properties. I selected the Security tab, then, in the "Group or user names" box, selected Users and clicked Edit. Under "Permissions for Users" I checked Full control and Modify and saved the changes. Then I was able to save the workspace image.
I have another instance of this error which seems to be new (or at least not listed here or here): apparently it's not OK to save a file with the name aux.RData. On Windows, "aux" is a reserved device name (like "con", "prn", and "nul"), so it cannot be used as a file name.
x <- rnorm(9000)
save(x, file = "aux.RData")
Error in gzfile(file, "wb") : cannot open the connection
Also: Warning message:
In gzfile(file, "wb") :
cannot open compressed file 'aux.RData', probable reason 'No such file or directory'
But when I change the filename, it saves with no problem:
save(x, file = "aux_file.RData")
I haven't seen this case in the other answers: if this seems to happen all the time, and to be very persistent when it does happen, check the default settings of the connection in your file-handling software.
In my case FileZilla was logging on to my DigitalOcean droplet as "root", and whenever I used FileZilla to create a directory it set the write permissions to "root", whereas RStudio on the same droplet read and wrote as "My_Name". Any time I set something up in FileZilla (e.g. large imported files, renamed or copied), the permissions would switch and I'd get this error.
If this is what is causing frequent error messages, it can be solved instantly with chown -R My_Name directoryname, but in the longer run, if you are going to use your file handler to define and create a lot of directories, it pays to create a connection whose default user is the same name you use for RStudio.
In my case, when it first happened months ago, the solution here worked.
But recently it came back, constantly. What solved it this time was changing the anti-virus. I had not just Windows Defender but also a second anti-virus, the same one both times. I ended up uninstalling it and installing a different anti-virus. After this, the problem did not happen again.
After several days trying to solve this same error in my case (Windows 10 and R), I tried saving my file (file.RData) on the D drive instead of the C drive (where I had always been working and where R is installed), and it worked without problems; my file was saved in D:/Users. Whenever I tried to save it on the C drive, it always gave me Permission denied.
save(Myfile, file="D:/Users/Myfile.RData")
I encountered this same issue when trying to save an Rds file from an R Markdown file. Changing my relative file path to an absolute file path worked for me.
In my case, this error was because the file that I wanted to overwrite was read-only (for whatever reason; I didn't set it myself). I just right-clicked the file's name in the folder and unchecked the read-only property. After that it worked.
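The same fix can also be done from R (a sketch; the path is made up, and on Windows Sys.chmod can only toggle the read-only/write bit):
f <- "C:/path/to/model.rds"    # hypothetical read-only file
file.access(f, mode = 2)       # -1 means R cannot write to it
Sys.chmod(f, mode = "0644")    # make it writable again
# saveRDS(model, f)            # overwriting should now work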