I'm trying to get data from 2000 and 2010:
acs.fetch(endyear="2000",span="0", LA_tract, variable="H002_001", dataset="sf1")
Also tried:
acs.fetch(endyear="2000",span="0", LA_tract, table.name="H002", dataset="sf1")
(LA_tract is the geography)
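(For context, a geo.set like LA_tract is typically built with the acs package's geo.make(); the state and county below are a guess at what the question used, purely illustrative.)
library(acs)
# Hypothetical reconstruction: every census tract in Los Angeles County, CA.
LA_tract <- geo.make(state = "CA", county = "Los Angeles", tract = "*")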
I get an error:
Error in file(file, "rt") : cannot open the connection
The variable/table number is correct...
The function and my key work fine with ACS data - just not with decennial data.
I would appreciate any help with this issue.
span is the number of years in the survey; I've seen 5, 3, and 1, but not 0.
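For what it's worth, my reading of the acs documentation is that span = 0 is exactly the convention for decennial (sf1/sf3) datasets. A hedged sketch contrasting the two call shapes; the endyears and table B01003 below are illustrative, not from the question:
library(acs)
# ACS data: span is the survey length in years (1, 3, or 5)
acs.fetch(endyear = 2012, span = 5, geography = LA_tract,
          table.number = "B01003", dataset = "acs")
# Decennial SF1: per the acs docs, span = 0 (numeric) marks decade data
acs.fetch(endyear = 2010, span = 0, geography = LA_tract,
          variable = "H002_001", dataset = "sf1")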
I am suddenly getting an error message when using tigris::counties(), even though the same code worked perfectly fine a few weeks ago. Can anyone replicate this error or know of a workaround?
require(tigris)
counties_VT <- counties(state = "VT", cb = TRUE, class = "sf")
Error Message:
Retrieving data for the year 2020
Error: Cannot open "C:\Users\QSMITH\AppData\Local\Temp\RtmpYZZH4z"; The source could be corrupt or not supported. See `st_drivers()` for a list of supported formats.
In addition: Warning message:
In unzip(file_loc, exdir = tmp) : error 1 in extracting from zip file
Thank you!
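Not a confirmed fix, but two settings often help when the downloaded zip arrives truncated or corrupted (tigris_use_cache is a documented tigris option and timeout is the base R download timeout; whether they cure this particular error is an assumption):
library(tigris)
# Cache downloaded shapefiles instead of re-fetching into a fresh temp
# directory on every call (documented tigris option).
options(tigris_use_cache = TRUE)
# Give download.file() more time before a slow connection truncates the zip.
options(timeout = 300)
counties_VT <- counties(state = "VT", cb = TRUE, class = "sf")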
Hi, I'm using the seasonal library in RStudio. I have a data set of 48 numbers on which to perform a seasonal adjustment, and the function seas() doesn't seem to work.
After several attempts to fix the error, I found that the problem is the amount of data:
-Fewer than 36 data points: not enough data points, series must have at least 3 complete years of data.
-Between 36 and 62 data points, the following error appears:
Error in file(con, "r") : cannot open the connection
In addition: Warning message:
In file(con, "r") :
cannot open file 'C:\Users\krafftdi\AppData\Local\Temp\RtmpGaGNle\x1355813076522/iofile.est': No such file or directory
-With 63 or more data points, it appears to work just fine.
I have no idea why it happens this way. I have already tried reinstalling the library and changing the working directory, but nothing seems to work.
Is it a problem with my library/computer/RStudio, or does the library just not work?
Can anybody explain why the error involves temporary files, and how I can make it work for 48 numbers (i.e. 4 years)?
Here is an example of my test code:
library(seasonal)
Test <- rnorm(62)
Test.ts <- ts(Test, start = c(2019, 1), frequency = 12)
seas(Test.ts)
Thank you very much!
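One diagnostic worth trying (a suggestion, not a confirmed fix): seas() shells out to the external X-13ARIMA-SEATS binary and exchanges files through the temp directory, which is where your iofile.est error points. checkX13() is the package's own installation test, and X13_PATH is the environment variable it consults for the binary's location; the path below is hypothetical.
library(seasonal)
# Built-in diagnostic: runs a test series through the X-13 binary and
# reports whether the file exchange with the temp directory works.
checkX13()
# If the binary cannot be found or run, pointing X13_PATH at a local,
# writable install of X-13 sometimes helps (path below is hypothetical).
# Sys.setenv(X13_PATH = "C:/x13")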
I'm trying to get the market capitalization of Brazilian stocks, but I'm getting an error in my code.
This is my code:
what_metrics <- yahooQF(c("Market Capitalization"))
getQuote("PINE3.SA", what=what_metrics)
This is the error I get:
Error in sq[, "exchangeTimezoneName"] : incorrect number of dimensions
How can I fix this?
The error likely comes from the fact that the Yahoo data behind quantmod does not cover stocks traded in Brazil. Try e.g. "SPY" instead of "PINE3.SA" and see whether that also gives you the error. You could perhaps download the Bovespa data and analyze it with quantmod - see: https://github.com/palencar/BovespaID.
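A quick way to test that theory (a sketch; SPY is just a liquid US-listed ticker used as a control):
library(quantmod)
what_metrics <- yahooQF(c("Market Capitalization"))
# If this succeeds while "PINE3.SA" fails, the code itself is fine and the
# problem is coverage of B3/Bovespa tickers in the Yahoo quote feed.
getQuote("SPY", what = what_metrics)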
I want to create a data frame in R with name of districts and their long and lat. For this purpose, I gave the command:
locations_df <- mutate_geocode(cities_df, district)
This command uses maps.googleapis.com for geocoding, but for my 13 districts I am getting NA. One of the errors is pasted below:
geocode failed with status OVER_QUERY_LIMIT, location = "Sukkur"
How can I get geocodes for the missing values? I checked the names of the missed cities on Google Maps for spelling errors, but found none.
Thank you for the help.
Honestly your best bet may be to try running the query again. The results seem pretty idiosyncratic as far as I can tell (possibly related to dynamic IPs over WiFi?). These are my results from just now.
df %>% mutate_geocode(address)
Information from URL : http://maps.googleapis.com/maps/api/geocode/json?address=Sukkur&sensor=false
address lon lat
1 Sukkur NA NA
Warning message:
geocode failed with status OVER_QUERY_LIMIT, location = "Sukkur"
So it failed for me too. I checked the queries I had left, then added "Paris" to check its results.
geocodeQueryCheck()
2499 geocoding queries remaining.
df
address
1 Sukkur
df[2,1]<-"Paris"
df %>% mutate_geocode(address)
Information from URL : http://maps.googleapis.com/maps/api/geocode/json?address=Sukkur&sensor=false
Information from URL : http://maps.googleapis.com/maps/api/geocode/json?address=Paris&sensor=false
address lon lat
1 Sukkur 68.822808 27.72436
2 Paris 2.352222 48.85661
And now it works!
The issue may be helped by obtaining a Google Maps API key, as this question suggests, which you can use if you install the GitHub version of ggmap; a sketch of that follows.
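A sketch of what that looks like (register_google() is the relevant ggmap function; the key string is a placeholder):
library(ggmap)
# register_google() stores the key for the session (GitHub / >= 2.7 ggmap).
register_google(key = "YOUR_API_KEY")  # placeholder key
locations_df <- mutate_geocode(cities_df, district)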
The other option is to iterate through requests, as an answer here suggests; a rough sketch of that is below.
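In this sketch, geocode_with_retry is a hypothetical helper, not part of ggmap; it re-queries only the rows that came back NA, on the assumption that OVER_QUERY_LIMIT failures are transient.
library(ggmap)
geocode_with_retry <- function(df, col, tries = 5) {
  # Ensure lon/lat columns exist so the first pass queries every row.
  if (is.null(df$lon)) df$lon <- NA_real_
  if (is.null(df$lat)) df$lat <- NA_real_
  for (i in seq_len(tries)) {
    missing <- which(is.na(df$lon))
    if (length(missing) == 0) break
    res <- geocode(df[[col]][missing])  # returns a data frame with lon/lat
    df$lon[missing] <- res$lon
    df$lat[missing] <- res$lat
    Sys.sleep(1)  # brief pause before retrying the remaining failures
  }
  df
}
locations_df <- geocode_with_retry(cities_df, "district")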
I have three rasters for which I am trying to compute a cell-by-cell mean.
library(raster)
r1 <- raster('a.tif')
r2 <- raster('b.tif')
r3 <- raster('c.tif')
However, both of the following give me an error:
q <- mean(r1, r2, r3)
or
q <- (r1 + r2 + r3) / 3
Error
Error in .local(.Object, ...) : options(warn) not set
Warning message:
closing unused connection 4 .....
That is a weird error message. Often this type of situation goes away if you restart R without loading an old workspace (which may be stale). If that is what is going on, use unlink(".RData"), exit R without saving, and start again.
To answer your aside question: yes, it is much easier to stack them. E.g.
f <- list.files(pattern = 'tif$')  # all .tif files in the working directory
s <- stack(f)                      # combine them into a single RasterStack
x <- sum(s)                        # cell-by-cell sum; mean(s) gives the mean