I have some tables in BigQuery. The bq_table_download function works for most of them; however, it throws "failed to parse" errors with one table.
bigrquery::bq_auth()
sql_query <- "select * from `project_id.dataset.table_name`"
bq_table_ext <- bq_table_download(bq_project_query(project_id, sql_query))
To address this, I also set the following option, which was suggested because bq_table_download has a download quota limit:
options(scipen = 20)
However, I am still unable to figure out what the issue is.
Here is a snapshot of the error:
Error in bq_parse_files(schema_path, c(path_first_chunk, chunk_plan$dat$path), :
Failed to parse '/tmp/Rtmpjd5eQN/bq-download-8650ad590.json'
Calls: ... master_table_base -> bq_table_download -> bq_parse_files
In addition: Warning messages:
1: In writeBin(result$content, con) : problem writing to connection
2: In writeBin(result$content, con) : problem writing to connection
Execution halted
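For what it's worth, the writeBin "problem writing to connection" warnings above often indicate that the temporary directory ran out of space while bq_table_download was staging its JSON chunks. A hedged workaround sketch (the temp path and page size are assumptions, not from the original post): point R's temporary files at a disk with more room before starting R, and fetch the table in smaller pages.

```r
# In the shell, before launching R (Linux/macOS assumption):
#   export TMPDIR=/path/with/plenty/of/space

library(bigrquery)
bq_auth()

sql_query <- "select * from `project_id.dataset.table_name`"
job <- bq_project_query(project_id, sql_query)

# page_size is a real bq_table_download() argument; a smaller value keeps
# each staged chunk small. 5000 rows is an arbitrary illustrative choice.
bq_table_ext <- bq_table_download(job, page_size = 5000)
```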
I am trying to save a text file in the data folder of a private package I am developing.
I tried the following:
my_text <- "Some text string"
saveRDS(my_text, file = "C/…./package_name/data/mytext.rda")
When I try to build the document, I get the error:
Error in FUN(X[[i]], ...) :
bad restore file magic number (file may be corrupted) -- no data loaded
Calls: suppressPackageStartupMessages ... <Anonymous> -> load_all -> load_data -> unlist -> lapply -> FUN
In addition: Warning message:
file 'mytext.rda' has magic number 'X'
Use of save versions prior to 2 is deprecated
Execution halted
Exited with status 1.
What could I do to save the text?
Try this:
library(readr)
write_rds(x = my_text, path = "C/…./package_name/data/mytext.rda")
The best way is devtools::use_data(my_text, internal = TRUE), as mentioned by @PoGibas (in current package versions this lives in usethis::use_data()).
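A minimal sketch of that approach, assuming you run it from the package root (the object name and string are from the question):

```r
# install.packages("usethis")  # if not already installed
my_text <- "Some text string"

# internal = TRUE writes R/sysdata.rda, which load_all()/build can read;
# internal = FALSE would instead write data/my_text.rda as an exported dataset.
usethis::use_data(my_text, internal = TRUE, overwrite = TRUE)
```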
Keep getting the following warning:
"Warning message:
In data(VEMCOdata) : data set ‘VEMCOdata’ not found"
I'm new to R. I set up all my files exactly how the package VTrack specified. I set my working directory, load my first data set, then use the command data(xx), and keep getting the same warning message.
setwd("C:/Users/gwhite/Desktop")
library(VTrack)
VEMCOdata <- read.csv('VEMCOdata_2014.csv')
data(VEMCOdata)
Warning message:
In data(VEMCOdata) : data set ‘VEMCOdata’ not found
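For context, a sketch of why the warning appears (my reading, not from the original thread): data() only looks for datasets bundled inside an installed package, while read.csv() already creates the object in your workspace, so the data() call is unnecessary here.

```r
library(VTrack)

# read.csv() assigns the object directly; no data() call is needed afterwards
VEMCOdata <- read.csv("VEMCOdata_2014.csv")
head(VEMCOdata)

# data() is only for datasets shipped with a package, e.g. (name hypothetical):
# data(some_bundled_dataset, package = "VTrack")
```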
I've established a successful connection to a Google BigQuery database in R using the following code
library(tidyverse)
library(bigrquery)
con <- DBI::dbConnect(bigrquery::bigquery(),
project = project,
dataset = dataset,
billing = billing)
But when I try to run a simple dplyr query such as:
tbl(con, "table_name") %>%
summarise(total = n())
I receive the following error and warning:
Error: (L4:10): GROUP BY cannot refer to an empty field name
In addition: Warning message:
Translator is missing window variants of the following aggregate
functions:
* %||%
I figured out that if I add the argument use_legacy_sql = FALSE to the dbConnect call, the error goes away, but I still receive the warning:
total
<int>
1 6259914
Warning message:
Translator is missing window variants of the following aggregate
functions:
* %||%
I've never seen any mention of needing to specify a legacy SQL argument in the connection for dplyr to work, and the fact that I'm still getting the warning suggests something is still wrong with my connection. I received the same warning when establishing a connection with src_bigquery from the bigrquery package.
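For reference, a hedged sketch of a connection that avoids the error (the identifiers are placeholders; as far as I can tell, the %||% warning comes from the dbplyr SQL translator and is harmless noise rather than a sign of a broken connection):

```r
library(bigrquery)
library(dplyr)

con <- DBI::dbConnect(
  bigrquery::bigquery(),
  project = "my-project",      # placeholder
  dataset = "my_dataset",      # placeholder
  billing = "my-project",      # placeholder
  use_legacy_sql = FALSE       # standard SQL, so the generated GROUP BY parses
)

tbl(con, "table_name") %>%
  summarise(total = n())
```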
In my efforts to work around the issue mentioned here:
MonetDB connect to GO.db within R code that is run in-database
I went ahead and copied the code from WGCNA that I needed into my own package and installed it. I can now load the package without any issues (since I didn't need the GO.db part).
However, I seem to run into another issue:
Server says '!Error running R expression. Error message: Error in
.C("corFast", x = as.double(x), nrow = as.integer(nrow(x)), ncolx =
as.integer(ncol(x)), : '.
I indeed wanted to use the faster cor function from WGCNA, but apparently the C call now creates another issue.
Unfortunately, the message is not informative. I already tried to run the query interactively and adding debug to the statement. This did not provide me with more information.
Is there anything I can do to increase the verbosity, so that I can debug the process?
I also tried:
options(monetdb.debug.query=F)
This resulted in a bit of extra output prior to the query, but no extra output on the error that occurred.
Using the suggestion of Hannes Muehleisen I added:
options(monetdb.debug.mapi=T)
It does add a little more information, which allowed me to proceed a bit further. I am now stuck with the following error, which again appears truncated.
QQ: 'SELECT * FROM cor_test();' TX: 'sSELECT * FROM cor_test(); ; RX:
'!Error running R expression. Error message: Error in .C("corFast", x
= as.double(x), nrow = as.integer(nrow(x)), ncolx = as.integer(ncol(x)), : ! "corFast" not available for .C() for
package "MRMRF Error in .local(conn, statement, ...) : Unable to
execute statement 'SELECT * FROM cor_test();'. Server says '!Error
running R expression. Error message: Error in .C("corFast", x =
as.double(x), nrow = as.integer(nrow(x)), ncolx = as.integer(ncol(x)),
: '.
Yes, this is a known issue: only the first line of the error message is returned. We should fix this. I always use stop(whatever) to return some info from within the UDF.
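As a hedged aside, the truncated message does show the underlying cause: '"corFast" not available for .C()' usually means the package's compiled code was never registered or loaded (for example, a missing useDynLib() directive in the NAMESPACE, or no PACKAGE= argument in the call). A sketch combining the stop() pattern above with a PACKAGE= argument (the argument list follows the error message; the package name and wrapper are illustrative):

```r
# Inside the R UDF body: wrap the call so the full condition message is
# returned to the MonetDB client instead of being cut off.
result <- tryCatch(
  .C("corFast",
     x = as.double(x),
     nrow = as.integer(nrow(x)),
     ncolx = as.integer(ncol(x)),
     PACKAGE = "yourPackageName"),          # hypothetical package name
  error = function(e) stop(conditionMessage(e))
)
```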
I currently have an R script that loops around 2000 times (a for loop); on each iteration it queries data from a database via a URL and reads it into a variable with read.csv.
My problem is: when I query small amounts of data (around 10,000 rows) each iteration takes about 12 seconds and everything is fine. But now I need to query around 50,000 rows per iteration, and the query time increases considerably, to 50 seconds or so per loop. That is fine for me, but sometimes the server takes longer to send the data (about 75-90 seconds), and apparently the connection times out and I get these errors:
Error in file(file, "rt") : cannot open the connection
In addition: Warning message:
In file(file, "rt") : cannot open: HTTP status was '0 (nil)'
or this one:
Error in file(file, "rt") : cannot open the connection
In addition: Warning message:
In file(file, "rt") : InternetOpenUrl failed: 'The operation timed
out'
I don't get the same warning every time, it changes between those two.
Now, what I want is to prevent my program from stopping when this happens, or simply to avoid the timeout error by telling R to wait longer for the data. I tried these settings at the start of my script as a possible solution, but it keeps happening:
options(timeout=190)
setInternet2(use=NA)
setInternet2(use=FALSE)
setInternet2(use=NA)
Any other suggestions or workarounds? Maybe skip to the next iteration when this happens, store the loop numbers where the error occurred, and re-query those i's at the end? The ideal solution would, of course, be to avoid the error altogether.
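A sketch of that skip-and-record idea (urls is a hypothetical vector of the 2000 query URLs; the timeout value is the one from the question):

```r
options(timeout = 190)

failed <- integer(0)
for (i in seq_along(urls)) {
  dat <- tryCatch(
    read.csv(urls[i]),
    error = function(e) NULL   # timeout or connection failure
  )
  if (is.null(dat)) {
    failed <- c(failed, i)     # remember which iterations to retry
    next
  }
  # ... process dat ...
}
# afterwards, re-run the loop over 'failed' to re-query only those i's
```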
A solution using the RCurl package:
You can change the timeout option using
curlSetOpt(timeout = 200)
or by passing it into the call to getURL
getURL(url_vect[i], timeout = 200)
A solution using base R:
Simply download each file using download.file, and then worry about manipulating those files later.
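A sketch of that base-R approach (url_vect and the destination folder are assumptions):

```r
dir.create("downloads", showWarnings = FALSE)

for (i in seq_along(url_vect)) {
  dest <- file.path("downloads", paste0("chunk_", i, ".csv"))
  ok <- tryCatch(
    { download.file(url_vect[i], dest, quiet = TRUE); TRUE },
    error = function(e) FALSE
  )
  if (!ok) message("Failed, retry later: ", url_vect[i])
}
```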
I see this is an older post, but it still comes up early in the list of Google results, so...
If you are downloading via WinInet (rather than curl, internal, wget, etc.) options, including timeout, are inherited from the system. Thus, you cannot set the timeout in R. You must change the Internet Explorer settings. See Microsoft references for details:
https://support.microsoft.com/en-us/kb/181050
https://support.microsoft.com/en-us/kb/193625
This is partial code that I show you, but you can modify it to your needs:
# connect to the website; on error, skip this URL
withRestarts(
  tryCatch(
    {
      webpage <- getURL(url_vect[i])
      print("Success.")
    },
    error = function(e) {
      # use <<- so the counter outside the handler is updated
      i <<- i + 1
    }
  ),
  abort = function() {}
)
In my case url_vect[i] was one of the URLs I was copying information from. Sadly, this will increase the time you need to wait for the program to finish.
UPDATE: see this tryCatch how-to example.