Transfer data from a 32-bit session to a 64-bit session in R

I am using R to connect to an enterprise database via ODBC to extract data and do some analysis. The ODBC connection requires some 32-bit .dll files, so I used the 32-bit version of R. However, I need 64-bit R for the analysis. I saved the data to .rds files and tried to pull them back into a 64-bit R session, but I hit an error:
df <- do.call('rbind', lapply(list.files(path = "path", pattern = ".rds"), readRDS))
Error in gzfile(file, "rb") : cannot open the connection
In addition: Warning message:
In gzfile(file, "rb") :
cannot open compressed file 'filename.rds', probable reason 'No such file or directory'
I know I could save the data to .csv and import it, but that would require a fair amount of reformatting, as my data is over 200 columns wide with nearly every data type represented. Is there a simpler way to get data from a 32-bit session to a 64-bit session without reformatting it all?
Thanks for your help!
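For what it's worth, .rds files are architecture-independent, so the 32-bit/64-bit split itself should not be the obstacle. Judging by the warning (the file name appears with no directory in front of it), the likely culprit is that list.files() returns bare file names unless full.names = TRUE, so readRDS() looks for them in the working directory instead of in "path". A minimal sketch of that fix, keeping the question's placeholder directory name:
# full.names = TRUE returns "path/file.rds" instead of just "file.rds";
# anchoring the pattern with "\\.rds$" avoids accidental matches
files <- list.files(path = "path", pattern = "\\.rds$", full.names = TRUE)
df <- do.call(rbind, lapply(files, readRDS))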

Related

R ftpUpload error: cannot open the connection

I am trying to upload a data.frame called 'ftp_test' via the ftpUpload command:
library(RCurl)
ftpUpload("Localfile.html", "ftp://User:Password@FTPServer/Destination.html")
and am getting an error:
Error in file(what, "rb") : cannot open the connection
In addition: Warning message:
In file(what, "rb") :
cannot open file 'ftp_test': No such file or directory
Could anyone tell me what the issue is here? Can I actually take a data.frame and upload it from the R global environment?
If I can't use the data.frame, is there any workaround?
Many thanks,
Artur
Your problem is that you are trying to send an R object over a file transfer protocol. FTP transfers files, so you first have to write the object out as one. A workaround is to save the data frame to a file, upload that, and delete the local copy afterwards. (Saving it with save() or saveRDS() would work just as well; the point is that the R object has to end up in a file of some form.) This example uses an open FTP server (uploads get deleted immediately, but you can use it to check that the code works):
filename <- "test.csv"
write.csv(df, file = filename)  # df is your data.frame ('ftp_test' in the question)
# write.csv saves into getwd(); upload the file from the same place
ftpUpload(filename, paste0("ftp://speedtest.tele2.net/upload/", filename))
file.remove(filename)           # clean up the local copy
Also make sure your own server is running; you can sanity-check your code against the open FTP server first.

Cannot load my CSV file into my R? keep getting error messages

So basically I successfully exported my SQL view data into a CSV file, but now when I load it into the RGui software, I get the following error:
> load("C:\\Users\\dachen\\Documents\\vTargetBuyers.csv")
Error: bad restore file magic number (file may be corrupted) -- no data loaded
In addition: Warning message:
file ‘vTargetBuyers.csv’ has magic number 'Marit'
Use of save versions prior to 2 is deprecated
What should I do? Is my R installation wrong, or is something wrong with my CSV file?
Try using read.csv instead of load. load is for reading files created by save.
Type ?read.csv to access the documentation.
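For instance, with the path from the question (str() is just there to inspect the result):
df <- read.csv("C:\\Users\\dachen\\Documents\\vTargetBuyers.csv")
str(df)  # check that the columns imported with the expected types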

R XBRL IO Error when attempting to read from SEC web site and local file

I am having trouble with the XBRL library examples for reading XBRL documents, either from the SEC website or from my local hard drive.
This code first attempts to read from the SEC site, as written in the example in the PDF documentation for the XBRL library, and then tries to read a locally saved file:
# Following example from XBRL pdf doc - read xml file directly from sec web site
library(XBRL)
inst <- "http://www.sec.gov/Archives/edgar/data/1223389/000122338914000023/conn-20141031.xml"
options(stringsAsFactors = FALSE)
xbrl.vars <- xbrlDoAll(inst)
# attempt 2 - save the xml file to a local directory - so no web I/O
localdoc <- "~/R/StockTickers/XBRLdocs/aapl-20160326.xml"
xbrl.vars <- xbrlDoAll(localdoc)
Both of these throw an I/O error. The first attempt, reading from the SEC site, produces this and crashes my RStudio instance:
error : Unknown IO error
I/O warning : failed to load external entity "http://www.sec.gov/Archives/edgar/data/1223389/000122338914000023/conn-20141031.xml"
So I restart RStudio, reload the XBRL library, and make the second attempt; reading from the local file gives this error:
I/O warning : failed to load external entity "~/R/StockTickers/XBRLdocs/aapl-20160326.xml"
I am using R version 3.3.0 (2016-05-03)
I hope I am missing something obvious; I am just not seeing it. Any help would be appreciated.
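No answer is included above, but one detail is worth flagging as a guess: R's own file functions expand "~" themselves, while the XML parser underneath xbrlDoAll() may not, which would explain the "failed to load external entity" warning for the local path. Expanding the path first is a cheap thing to try:
# path.expand() turns "~/..." into an absolute path the parser can open
localdoc <- path.expand("~/R/StockTickers/XBRLdocs/aapl-20160326.xml")
xbrl.vars <- xbrlDoAll(localdoc)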

uncompress a big .gz file

I need to uncompress a transactions.gz file downloaded from Kaggle; it is approximately 2.86 GB, with 350 million rows and 11 columns.
I tried this in RStudio on Windows Vista, 32-bit, with 3 GB of RAM:
transactions <- read.table(gzfile("E:/2014/Proyectos/Kaggle/transactions.gz"))
write.table(transactions, file="E:/2014/Proyectos/Kaggle/transactions.csv")
But I receive this error message in the console:
> transactions <- read.table(gzfile("E:/2014/Proyectos/Kaggle/transactions.gz"))
Error: cannot allocate vector of size 64.0 Mb
> write.table(transactions, file="E:/2014/Proyectos/Kaggle/transactions.csv")
Error: cannot allocate vector of size 64.0 Mb
I checked this case, but it didn't work for me: Decompress gz file using R
I would appreciate any suggestions.
This file decompresses to a 22 GB .csv file. You can't process it all at once in R on your 3 GB machine, because R needs to read everything into memory. It would be best to process it in an RDBMS like PostgreSQL. If you are intent on using R, you could process it in chunks, reading a manageable number of rows at a time: read a chunk, process it, then read the next chunk over it, as sketched below. For this, data.table::fread would be better than the standard read.table.
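A minimal sketch of that chunked approach using base R connections (the chunk size is arbitrary, and the file is assumed to be comma-separated):
con <- gzfile("E:/2014/Proyectos/Kaggle/transactions.gz", "rt")
header <- readLines(con, n = 1)        # keep the header line for later
repeat {
  lines <- readLines(con, n = 500000)  # pull the next chunk of raw lines
  if (length(lines) == 0) break        # end of file
  chunk <- read.csv(text = lines, header = FALSE)
  # ... aggregate the chunk, or append it somewhere out of memory ...
}
close(con)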
Oh, and don't decompress it in R; just run gunzip from the command line and then process the .csv. If you're on Windows you can use WinZip or 7-Zip.

Trouble with using RODBC to access Northwind.accdb file

I'm trying to demo SQL queries from within R using the Northwind.accdb file. I visited http://office.microsoft.com/en-us/templates/desktop-northwind-2007-sample-database-TC001228997.aspx and was able to download the .accdt file. I've tried creating a database connection using RODBC with the following two lines, but both return the same error, suggesting that R is unable to find the file, even though I am certain of its location (on the desktop).
Here are the lines of code that I'm using:
conn <- odbcConnectAccess2007("Nwind.accdt", uid = "", pwd = "")
conn <- odbcConnectAccess2007("Nwind.accdb", uid = "", pwd = "")
The error output I'm getting looks like this:
Warning messages:
1: In odbcDriverConnect(con, ...) :
[RODBC] ERROR: Could not SQLDriverConnect
2: In odbcDriverConnect(con, ...) : ODBC connection failed
Very grateful for any pointers you all might have.
There are known problems when using 32-bit ODBC drivers on 64-bit Windows. If you have the 32-bit ODBC driver installed (it comes with older versions of Office), make sure you are running 32-bit R (easy to switch in RStudio under Tools).
Or, with 64-bit drivers, use 64-bit R.
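A quick way to confirm which architecture the current R session is running, using base R only:
R.version$arch            # "i386" on 32-bit R, "x86_64" on 64-bit R
.Machine$sizeof.pointer   # 4 on 32-bit R, 8 on 64-bit R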
