Prompt 'Yes' every time to getFilings - r

I am going to download the 2005 10-Ks for several corporations in R using the EDGAR package. I have a mini loop to test which is working:
for (CIK in c(789019, 777676, 849399)) {
  getFilings(2005, CIK, '10-K')
}
However, each time this runs I get a yes/no prompt and I have to type 'yes':
Total number of filings to be downloaded=1. Do you want to download (yes/no)? yes
Total number of filings to be downloaded=1. Do you want to download (yes/no)? yes
Total number of filings to be downloaded=1. Do you want to download (yes/no)? yes
How can I get R to answer 'yes' automatically on each run? Thank you

Please remember to include a minimal reproducible example in your question, including library(...) and all other necessary commands:
library(edgar)
report <- getMasterIndex(2005)
We can bypass the prompt by doing some code surgery. Here, we retrieve the code for getFilings, and replace the line that asks for the prompt with just a message. We then write the new function (my_getFilings) to a temporary file, and source that file:
x <- capture.output(dput(edgar::getFilings))
x <- gsub("choice <- .*", "cat(paste(msg3, '\n')); choice <- 'yes'", x)
x <- gsub("^function", "my_getFilings <- function", x)
writeLines(x, con = tmp <- tempfile())
source(tmp)
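Equivalently, a minimal sketch of the same surgery without the temporary file: evaluate the patched source directly and re-attach the edgar namespace, so any internal helpers the function calls still resolve. This assumes the prompt line in your installed edgar version still matches the choice <- pattern.
src <- capture.output(dput(edgar::getFilings))
# force the prompt to 'yes' (the confirmation message is simply dropped here)
src <- gsub("choice <- .*", "choice <- 'yes'", src)
my_getFilings <- eval(parse(text = paste(src, collapse = "\n")))
# a plain source() would give the function the global environment;
# re-attach the package namespace so internal edgar functions still resolve
environment(my_getFilings) <- asNamespace("edgar")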
Everything downloads fine:
for (CIK in c(789019, 777676, 849399)) {
  my_getFilings(2005, CIK, '10-K')
}
list.files(file.path(getwd(), "Edgar filings"))
# [1] "777676_10-K_2005" "789019_10-K_2005" "849399_10-K_2005"


Wait for rgbif download to complete before proceeding

I am developing a small application in R Shiny. Part of the application will need to query GBIF to download species occurrence data. This is possible using rgbif. The function rgbif::occ_download() will download the data and rgbif::occ_download_meta() will check whether GBIF has fulfilled your request. For example:
geometry <- "POLYGON((30.1 10.1,40 40,20 40,10 20,30.1 10.1))"
res <- occ_download(paste0("geometry within ", geometry), type = "within", format = "SPECIES_LIST")
occ_download_meta(res)
<<gbif download metadata>>
Status: RUNNING
Format: SPECIES_LIST
Download key: 0004089-190415153152247
Created: 2019-04-25T09:18:20.952+0000
Modified: 2019-04-25T09:18:21.045+0000
Download link: http://api.gbif.org/v1/occurrence/download/request/0004089-190415153152247.zip
Total records: 0
So far, so good. However, rgbif::occ_download_get() can't fetch the data for downstream analysis until the request has finished processing on GBIF's side, i.e. until occ_download_meta(res) reports Status: SUCCEEDED.
How can I make the session wait until the download from GBIF has been completed? I cannot hard code a wait time into the script as different sized extents will take GBIF longer or shorter amounts of time to process. Also, the number of other active users querying the service could also alter wait times. I therefore need some sort of flag where Status == Succeeded before proceeding.
I have copied some skeleton code with comments below.
library(rgbif)
geometry <- "POLYGON((30.1 10.1,40 40,20 40,10 20,30.1 10.1))" # Define boundary
res <- occ_download(paste0("geometry within ", geometry), type = "within", format = "SPECIES_LIST")
# WAIT HERE UNTIL Status == SUCCEEDED
occ_download_meta(res)
x <- occ_download_get(res, overwrite = TRUE) # Download data
data<-occ_download_import(x) # Import into R
rgbif maintainer here. You could do something like we have within the occ_download_queue() function:
res <- occ_download(paste0("geometry within ", geometry), type = "within", format = "SPECIES_LIST")
still_running <- TRUE
status_ping <- 3
while (still_running) {
  meta <- occ_download_meta(res)
  status <- meta$status
  # note the negation: keep looping until the request has finished;
  # tolower() guards against GBIF reporting statuses in upper case
  still_running <- !tolower(status) %in% c("succeeded", "killed")
  Sys.sleep(status_ping) # sleep between pings
}
You probably want to check for both "succeeded" and "killed", and do something different (e.g. stop with an error) if the request was killed.
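Building on that, a hedged convenience wrapper that polls until the request finishes or a timeout elapses (the status names follow the GBIF API; the timeout default is an arbitrary choice):
wait_for_download <- function(res, ping = 3, timeout = 3600) {
  t0 <- Sys.time()
  repeat {
    status <- tolower(occ_download_meta(res)$status)
    if (status == "succeeded") return(invisible(res))
    if (status == "killed") stop("GBIF killed the download request")
    if (difftime(Sys.time(), t0, units = "secs") > timeout)
      stop("Timed out waiting for GBIF")
    Sys.sleep(ping)  # sleep between pings
  }
}
x <- occ_download_get(wait_for_download(res), overwrite = TRUE)
data <- occ_download_import(x)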

Using R to access FTP Server and Download Files Results in Status "530 Not logged in"

What I'm Attempting to Do
I'm attempting to download several weather data files from the US National Climatic Data Center's FTP server but am running into problems with an error message after successfully completing several file downloads.
After successfully downloading two station/year combinations I start getting an error "530 Not logged in" message. I've tried starting at the offending year and running from there and get roughly the same results. It downloads a year or two of data and then stops with the same error message about not being logged in.
Working Example
Following is a working example (or not) with the output truncated and pasted below.
options(timeout = 300)

ftp <- "ftp://ftp.ncdc.noaa.gov/pub/data/gsod/"
td <- tempdir()

station <- c("983240-99999", "983250-99999", "983270-99999", "983280-99999",
             "984260-41231", "984290-99999", "984300-99999", "984320-99999",
             "984330-99999")
years <- 1960:2016

for (i in years) {
  remote_file_list <- RCurl::getURL(
    paste0(ftp, "/", i, "/"), ftp.use.epsv = FALSE, ftplistonly = TRUE,
    crlf = TRUE, ssl.verifypeer = FALSE)
  remote_file_list <- strsplit(remote_file_list, "\r*\n")[[1]]
  file_list <- paste0(station, "-", i, ".op.gz")
  file_list <- file_list[file_list %in% remote_file_list]
  file_list <- paste0(ftp, i, "/", file_list)
  Map(function(ftp, dest) utils::download.file(url = ftp, destfile = dest, mode = "wb"),
      file_list, file.path(td, basename(file_list)))
}
trying URL 'ftp://ftp.ncdc.noaa.gov/pub/data/gsod/1960/983250-99999-1960.op.gz'
Content type 'unknown' length 7135 bytes
==================================================
downloaded 7135 bytes
...
trying URL 'ftp://ftp.ncdc.noaa.gov/pub/data/gsod/1961/984290-99999-1961.op.gz'
Content type 'unknown' length 7649 bytes
==================================================
downloaded 7649 bytes
trying URL 'ftp://ftp.ncdc.noaa.gov/pub/data/gsod/1962/983250-99999-1962.op.gz'
downloaded 0 bytes
Error in utils::download.file(url = ftp, destfile = dest, mode = "wb") :
cannot download all files In addition: Warning message:
In utils::download.file(url = ftp, destfile = dest, mode = "wb") :
URL 'ftp://ftp.ncdc.noaa.gov/pub/data/gsod/1962/983250-99999-1962.op.gz':
status was '530 Not logged in'
Different Methods and Ideas I've Tried but Haven't Yet Been Successful
So far I've tried to slow the requests down using Sys.sleep in a for loop and any other manner of retrieving the files more slowly by opening then closing connections, etc. It's puzzling because: i) it works for a bit then stops and it's not related to the particular year/station combination per se; ii) I can use nearly the exact same code and download much larger annual files of global weather data without any errors over a long period of years like this; and iii) it's not always stopping after 1961 going to 1962, sometimes it stops at 1960 when it starts on 1961, etc., but it does seem to be consistently between years, not within from what I've found.
The login is anonymous, but you can use userpwd "ftp:your#email.address". So far I've been unsuccessful in using that method to ensure that I was logged in to download the station files.
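(For reference, a minimal sketch of that userpwd idea with the curl package; anonymous FTP conventionally accepts any email address as the password, and the handle below is purely illustrative:)
library(curl)
# illustrative anonymous login: any email address serves as the password
h <- new_handle(userpwd = "anonymous:your@email.address")
listing <- curl_fetch_memory("ftp://ftp.ncdc.noaa.gov/pub/data/gsod/1960/", handle = h)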
I think you're going to need a more defensive strategy when working with this FTP server:
library(curl) # ++gd > RCurl
library(purrr) # consistent "data first" functional & piping idioms FTW
library(dplyr) # progress bar
# We'll use this to fill in the years
ftp_base <- "ftp://ftp.ncdc.noaa.gov/pub/data/gsod/%s/"
dir_list_handle <- new_handle(ftp_use_epsv=FALSE, dirlistonly=TRUE, crlf=TRUE,
ssl_verifypeer=FALSE, ftp_response_timeout=30)
# Since you, yourself, noted the server was perhaps behaving strangely or under load
# it's prbly a much better idea (and a practice of good netizenship) to cache the
# results somewhere predictable rather than a temporary, ephemeral directory
cache_dir <- "./gsod_cache"
dir.create(cache_dir, showWarnings=FALSE)
# Given the sporadic efficacy of server connection, we'll wrap our calls
# in safe & retry functions. Change this variable if you want to have it retry
# more times.
MAX_RETRIES <- 6
# Wrapping the memory fetcher (for dir listings)
s_curl_fetch_memory <- safely(curl_fetch_memory)
retry_cfm <- function(url, handle) {
i <- 0
repeat {
i <- i + 1
res <- s_curl_fetch_memory(url, handle=handle)
if (!is.null(res$result)) return(res$result)
if (i==MAX_RETRIES) { stop("Too many retries...server may be under load") }
}
}
# Wrapping the disk writer (for the actual files)
# Note the use of the cache dir. It won't waste your bandwidth or the
# server's bandwidth or CPU if the file has already been retrieved.
s_curl_fetch_disk <- safely(curl_fetch_disk)
retry_cfd <- function(url, path) {
# you should prbly be a bit more thorough than `basename` since
# i think there are issues with the 1971 and 1972 filenames.
# Gotta leave some work up to the OP
cache_file <- sprintf("%s/%s", cache_dir, basename(url))
if (file.exists(cache_file)) return()
i <- 0
repeat {
i <- i + 1
if (i==MAX_RETRIES) { stop("Too many retries...server may be under load") }
res <- s_curl_fetch_disk(url, cache_file)
if (!is.null(res$result)) return()
}
}
# the stations and years
station <- c("983240-99999", "983250-99999", "983270-99999", "983280-99999",
"984260-41231", "984290-99999", "984300-99999", "984320-99999",
"984330-99999")
years <- 1960:2016
# progress indicators are like bowties: cool
pb <- progress_estimated(length(years))
walk(years, function(yr) {
# the year we're working on
year_url <- sprintf(ftp_base, yr)
# fetch the directory listing
tmp <- retry_cfm(year_url, handle=dir_list_handle)
con <- rawConnection(tmp$content)
fils <- readLines(con)
close(con)
# sift out only the target stations
map(station, ~grep(., fils, value=TRUE)) %>%
keep(~length(.)>0) %>%
flatten_chr() -> fils
# grab the stations files
walk(paste(year_url, fils, sep=""), retry_cfd)
# tick off progress
pb$tick()$print()
})
You may also want to set curl_interrupt to TRUE in the curl handle if you want to be able to stop/esc/interrupt the downloads.
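One refinement worth considering (a sketch, not part of the original answer): back off between retries instead of retrying in a tight loop, which is kinder to an already-struggling server.
# exponential backoff between retries; same shape as retry_cfm above
retry_cfm_backoff <- function(url, handle) {
  for (i in seq_len(MAX_RETRIES)) {
    res <- s_curl_fetch_memory(url, handle = handle)
    if (!is.null(res$result)) return(res$result)
    Sys.sleep(2 ^ i)  # wait 2, 4, 8, ... seconds before the next attempt
  }
  stop("Too many retries...server may be under load")
}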

"download.file" Incomplete and inconsistent downloads

I am trying to understand why I am having inconsistent results downloading CSV files from a website archive. I don't know if the problem is at my end, the other side, or just failed communications in between. Any suggestions are welcomed.
I am using an R script to automate the downloading of CSV files by month and year from the HYCOM archives for analysis. The script generated the following URL: 'http://ncss.hycom.org/thredds/ncss/GLBu0.08/reanalysis/3hrly?var=salinity&var=water_temp&var=water_u&var=water_v&latitude=13.875&longitude=-72.25&time_start=2012-05-01T00:00:00Z&time_end=2012-05-31T21:00:00Z&vertCoord=&accept=csv'
Running download.file successfully obtains the file about half the time; otherwise it fails. The failed run was captured in an image; the successful run is below.
Successful Log
#download one month of data
MM = '05'
LastDay = ndays(paste(year,MM,'01',sep="-"))
H1 = paste( as shown in image)
H2 = '-01T00:00:00Z&time_end='
#H3 = 'T21:00:00Z&timeStride=1&vertCoord=&accept=csv'
H3 = 'T21:00:00Z&vertCoord=&accept=csv'
HtmlLink <- paste(H1,year,"-",MM,H2,year,"-",MM,"-",LastDay,H3,sep="")
dest = paste("../data/",year,MM,".csv",sep="")
download.file(url =HtmlLink ,destfile=dest,cacheOK=FALSE, method="auto")
trying URL 'as shown in image'
Content type 'text/plain;charset=UTF-8' length unknown
..................................................
................downloaded 666 KB
user system elapsed
28.278 6.605 5201.421
[Log of failed run shown in image; not reproduced here]
You can/should turn the following into a function accepting parameters and replace the hardcoded values with said params (I used httr:::parse_query() to make the list):
library(httr)
URL <- "http://ncss.hycom.org/thredds/ncss/GLBu0.08/reanalysis/3hrly"
params <- list(var = "salinity",
var = "water_temp",
var = "water_u",
var = "water_v",
latitude = "13.875",
longitude = "-72.25",
time_start = "2012-05-01T00:00:00Z",
time_end = "2012-05-31T21:00:00Z",
vertCoord = "",
accept = "csv")
dest_file <- "filename"
res <- GET(url=URL,
query=params,
timeout(360),
write_disk(dest_file, overwrite=TRUE),
verbose())
warn_for_status(res)
You can (eventually) remove the verbose() from that GET call, but it's helpful during debugging.
The main issue is that this server is s l o w and times out before the transfer is complete. Even the value of 360 might not be enough (you'll need to experiment).
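If the timeouts persist, httr's RETRY() re-issues the request automatically on failure. A minimal sketch with the same params list (the times and pause_base values are illustrative):
res <- RETRY("GET", url = URL, query = params,
             timeout(360),
             write_disk(dest_file, overwrite = TRUE),
             times = 5,       # up to 5 attempts
             pause_base = 2)  # exponential backoff between attempts
warn_for_status(res)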
Many thanks to all for the help. The suggestion by hrbrmstr appears to be an elegant answer and I look forward to testing it. However, I was unable to install a working copy using the package manager. Installation from a local download also failed, since R complained that the OS X version I downloaded from CRAN was a Windows version, not an OS X one. Yes, I repeated the download several times to make sure I had the right package.
As suggested by Cyrus Mohammadian, I tried the procedures in the curl library.
Running the same URL, download.file transfers failed about 50% of the time. Using curl reduced the transfer times from 2000 seconds to 1000 seconds with no failures in 12 tries.
library(curl)

## calculate number of days in month
ndays <- function(d) {
  last_days <- 28:31
  rev(last_days[which(!is.na(
    as.Date(paste(substr(d, 1, 8), last_days, sep = ''), '%Y-%m-%d')))])[1]
}
nlat = 13.875
elon = -72.25
#download one month of data
year = 2008
MM = '01'
LastDay = ndays(paste(year,MM,'01',sep="-"))
H1 = paste0('http://ncss.hycom.org/thredds/ncss/GLBu0.08/reanalysis/3hrly?',
            'var=salinity&var=water_temp&var=water_u&var=water_v&latitude=',
            nlat, '&longitude=', elon, '&time_start=')
H2 = '-01T00:00:00Z&time_end='
H3 = 'T21:00:00Z&timeStride=1&vertCoord=&accept=csv'
HtmlLink <- paste(H1,year,"-",MM,H2,year,"-",MM,"-",LastDay,H3,sep="")
dest = paste("../data/",year,MM,".csv",sep="")
curl_download(url = HtmlLink, destfile = dest, quiet = FALSE, mode = "wb")
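To avoid editing MM by hand for each month, the above can be wrapped in a small function (a sketch reusing the ndays/H1/H2/H3 definitions above):
download_month <- function(year, MM) {
  LastDay <- ndays(paste(year, MM, '01', sep = "-"))
  link <- paste0(H1, year, "-", MM, H2, year, "-", MM, "-", LastDay, H3)
  dest <- paste0("../data/", year, MM, ".csv")
  curl_download(url = link, destfile = dest, quiet = FALSE, mode = "wb")
}

# e.g. all of 2008:
for (MM in sprintf("%02d", 1:12)) download_month(2008, MM)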

How to periodically send text to a "log file" while printing normal output to the console?

I'm creating R code for a Monte Carlo simulation of a professional sport. Because the game dynamics are very complicated and to make the debugging process simpler, I'd like to have R send a line of text for every action that happens in the game to a "log file." The log file would be a comprehensive, play by play description of what's happening in the simulation, and would look something like this…
"GAME BEGINS"
POSSESSION ASSIGNED TO X TEAM
PLAYER Y GETS BALL
PLAYER Y SCORES
FOUL BY PLAYER Z OCCURS
SUBSTITUTION OCCURS (PLAYER W <-> PLAYER Q)
…
"GAME ENDS"
I can't just use the sink() function because, while the simulation is running, I set up a progress bar (with the setTxtProgressBar function) and real-time scores to be printed to the console. If I used sink(), I couldn't see any of the progress indicators or scores on the R console. Does this make sense? In other words, I need to periodically send text to a log file in a cumulative fashion. Here is some example code to give you something to work with…
for (i in 1:100) {
  # SOMEHOW NEED TO PRINT LINE "START LOOP" TO LOG FILE
  a <- rnorm(n = 100, mean = i, sd = 5)
  print(mean(a))  # PRINT THIS MEAN TO THE CONSOLE
  # SOMEHOW PRINT "LOOP 'i' COMPLETE" TO LOG FILE
}
Thanks
See ?cat. You can open a file connection to your log file and specify that in your cat call. When you don't specify a file name or connection it will print to the console.
As you say, don't use sink() as it will make the log file the default connection. Rather, open a named connection with file().
> log_con <- file("test.log")
> cat("write to log", file = log_con) # creates file and writes to it
> cat("write to console") # prints to console
write to console
The above results in a log file with the line "write to log" and "write to console" printed on the console.
If you need to append to your log file, set append = TRUE and use the file name instead of the file() connection.
> cat("add to log", file = "test.log", append = TRUE)
To open the log file in "append" mode:
log_con <- file("test.log",open="a")
Figured it out, thanks to shujaa and BigFinger. To summarize, here is how you would do it with my example code:
log_con <- file("/filepath/log.txt", open="a")
for (i in 1:100)
{
cat("loop begins", file = log_con, sep="\n")
a <- rnorm(n = 100, mean = i, sd = 5)
print(mean(a))
cat("single loop completed", file = log_con, sep="\n")
}
close(log_con)
The log4r library seems more complete than a homemade one: https://github.com/johnmyleswhite/log4r
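A minimal log4r sketch, assuming the current (0.3+) API; older versions used create.logger() instead:
library(log4r)
# log to a file, filtering out anything below INFO
game_log <- logger(threshold = "INFO", appenders = file_appender("game.log"))
info(game_log, "GAME BEGINS")
info(game_log, "PLAYER Y SCORES")
warn(game_log, "FOUL BY PLAYER Z OCCURS")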
write("this message", file=stderr())
Same example, including values written to the log.txt file:
log_con <- file("log.txt", open="a")
for (i in 1:10){
cat("loop begins; i =", i, '\n', file = log_con)
a <- rnorm(n = 100, mean = i, sd = 5)
print(mean(a))
cat("single loop completed; mean(a) = ", mean(a), '\n', file =
log_con)
}
close(log_con)

How to get the queue number from CONDOR into your R job

I think I have a simple problem, but I've searched up and down the internet and couldn't find anyone else asking this question:
My university has a Condor set-up. I want to run several repetitions of the same code (e.g. 100 times). My R code has a routine to store the results in a file, i.e.:
write.csv(res, file = paste0(paste(format(Sys.time(), '%y%m%d'), 'res', queue, sep = "_"), '.csv'))
res is my results data.frame; the 'res' tag marks the file as results, and I want to append the queue number of this run (otherwise the files would overwrite each other, wouldn't they?). The filenames should look like: 140109_res_1.csv, 140109_res_2.csv, ...
My submit file to condor looks like this:
universe = vanilla
executable = /usr/bin/R
arguments = --vanilla
log = testR.log
error = testR.err
input = run_condor.r
output = testR$(Process).txt
requirements = (opsys == "LINUX") && (arch == "X86_64") && (HAS_R_2_13 =?= True)
request_memory = 1000
should_transfer_files = YES
transfer_executable = FALSE
when_to_transfer_output = ON_EXIT
queue 3
How do I get the 'queue' number into my R code? I tried a simple example with
print(queue)
print(Queue)
But there is no object found called queue or Queue. Any suggestions?
Best wishes,
Marco
Okay, I solved the problem. This is how it goes:
I had to change my submit file: the arguments line becomes
arguments = --vanilla --args $(Process)
Now the process number is forwarded to the R code. There you retrieve it with the following line. The value will be stored as a character. Therefore, you should convert it to a numeric value (also check whether a number like 10 is passed on as '1' and '0' in which case you should also collapse the values).
run <- commandArgs(TRUE)
Here is an example of the code I ran.
> run <- commandArgs(TRUE)
> run
[1] "0"
> class(run)
[1] "character"
> try(as.numeric(run))
[1] 0
> try(run <- as.numeric(paste(run, collapse='')) )
> try(print(run))
[1] 0
> try(write(run, paste(run,'csv', sep='.')))
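Putting it together with the write.csv call from the question, a sketch (res stands in for your results data.frame, and Condor process numbers start at 0):
run <- as.numeric(paste(commandArgs(TRUE), collapse = ''))  # collapse guards multi-element args
out <- paste(format(Sys.time(), '%y%m%d'), 'res', run, sep = "_")
write.csv(res, file = paste0(out, '.csv'))  # e.g. 140109_res_0.csv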
You can also find information on how to pass variables/arguments to your code here: http://research.cs.wisc.edu/htcondor/manual/v7.6/condor_submit.html
I hope this helps anyone.
Cheers and thanks for all other commenters!
Marco
