I need to switch to the sf package, because the readShapePoly command will be removed in the near future. So I want to change my code from readShapePoly to sf::st_read, but I can't work out the correct syntax. I would be grateful if someone could show me the correct sf command. My current code is below (apologies for my poor English).
In R I have tried the sf::st_read command again and again, but it keeps returning an error. The code and the error message are below.
usa_state <- readShapePoly("usa_state.shp", IDvar = "STATE_CODE")
That works, but I know I will have to change it soon, because the command is being removed. So please show me the sf equivalent. I tried the code below, but R does not accept it.
usa_state = sf::st_read("usa_state.shp", layer = "STATE_CODE")
This code is wrong; please show me the correct version. The error I get now is:
Error in CPL_read_ogr(dsn, layer, query, as.character(options), quiet, :
SQL execution failed, cannot open layer.
In addition: Warning message:
In CPL_read_ogr(dsn, layer, query, as.character(options), quiet, :
GDAL Error 1: SQL Expression Parsing Error: syntax error, unexpected
identifier, expecting SELECT or '('. Occurred around : "STATE_CODE"
You're almost there with usa_state = sf::st_read("usa_state.shp", layer = "STATE_CODE").
I'm guessing that STATE_CODE is a field in the usa_state.shp shapefile. You don't need to supply any field names to the st_read() function. Just use:
library(sf)
usa_state = st_read("usa_state.shp")
You'll need to make sure that the usa_state.shp file and its associated files are in your current working directory, or else use the full path:
usa_state = st_read("/path/to/usa_state.shp")
The sf package is well worth getting to know. It's made all of my spatial work in R much easier.
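If you still need the STATE_CODE field as an identifier (the job IDvar did in readShapePoly), note that st_read() brings in every attribute field, so it is simply a column of the resulting object. A minimal sketch, assuming the shapefile really has a STATE_CODE attribute:
library(sf)
usa_state <- st_read("usa_state.shp")   # geometry plus all attribute columns
head(usa_state$STATE_CODE)              # the former IDvar field is an ordinary column
# if legacy code still expects an sp SpatialPolygonsDataFrame, coerce (needs sp installed):
usa_state_sp <- as(usa_state, "Spatial")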
Related
I'm trying to read the shapefile that you can download from this URL.
I have code similar to the following to download the files automatically:
library("raster")}
url<-"http://www6.gipuzkoa.eus/CATASTRO/Planos/ZIP-A098.zip"
downloader::download(url, dest=paste0(getwd(),"/","my_file.zip"), mode="wb",quiet=T)
zipped_shape_names<-c("098_HELBIDE_SHP/ATRIBUTOAK-A098.cpg","098_HELBIDE_SHP/ATRIBUTOAK-A098.dbf","098_HELBIDE_SHP/ATRIBUTOAK-A098.shp","098_HELBIDE_SHP/ATRIBUTOAK-A098.shx")
unzip("my_file.zip", files=zipped_shape_names)
my_shape<-raster::shapefile("098_HELBIDE_SHP/ATRIBUTOAK-A098.shp")
But what I obtain is the following error:
Error in rgdal::readOGR(dirname(x), fn, stringsAsFactors = stringsAsFactors, :
no features found
In addition: Warning messages:
1: In .local(x, ...) : .prj file is missing
2: In ogrFIDs(dsn = dsn, layer = layer) : no features found
You can access the original web page via this link and then pressing "Descargar planos".
I don't have this problem with other areas, just with this one and one other, but I don't know what is happening with this specific area.
Any help will be appreciated.
The error message is quite clear: "no features found" means that your shapefile is empty. You can check this in multiple ways.
One is to add your shapefile in QGIS or any other GIS software. In QGIS the shapefile will pop up in the layers pane, but you won't see any features.
It is also possible to check the .dbf file in R:
library(foreign)
read.dbf("098_HELBIDE_SHP/ATRIBUTOAK-A098.dbf")
The .dbf file should contain as many rows as there are features; in your case, none.
Simpler code would be
library(raster)
url<-"http://www6.gipuzkoa.eus/CATASTRO/Planos/ZIP-A098.zip"
download.file(url, dest="my_file.zip")
unzip("my_file.zip")
s <- raster::shapefile("098_HELBIDE_SHP/ATRIBUTOAK-A098.shp")
Clearly that file is empty. However, it works for this file:
s <-raster::shapefile("098_LANDALUR_SHP/LANDALUR-PARTZELAK-A098.shp")
I am trying to load the state map from the maps package into an R object. I am hoping it is a SpatialPolygonsDataFrame or something I can turn into one after I have inspected it. However I am failing at the first step – getting it into an R object. I do not know the file type.
I first tried to assign the map() output to an R object directly:
st_m <- maps::map(database = "state")
draws the map, but str(st_m) appears to do nothing, unless it is redrawing the same map.
Then I tried loading it as a dataset: st_m <- data("stateMapEnv", package="maps") but this just returns a string:
> str(stateMapEnv)
chr "R_MAP_DATA_DIR"
I opened the maps directory win-library/3.4/maps/mapdata/ and found what I think is the map file, “state.L”.
I tried reading it with scan and got an error message I do not understand:
scan(file = "D:/Documents/R/win-library/3.4/maps/mapdata/state.L")
Error in scan(file = "D:/Documents/R/win-library/3.4/maps/mapdata/state.L") :
scan() expected 'a real', got '#'
I then opened the file with Notepad++. It appears to be a binary or compressed file.
So I thought it might be an R data file with an unusual extension. But my attempt to load it returned a “bad magic number” error:
st_m <- load("D:/Documents/R/win-library/3.4/maps/mapdata/state.L")
Error in load("D:/Documents/R/win-library/3.4/maps/mapdata/state.L") :
bad restore file magic number (file may be corrupted) -- no data loaded
Observing that these responses have progressed from the unhelpful through the incomprehensible to the occult, I thought it best to seek assistance from the wizards of stackoverflow.
This should be able to export the 'state' or any other maps dataset for you:
library(ggplot2)
state_dataset <- map_data("state")
I am trying to load a file from the Spanish building census (any of the files there will serve as an example; I'm using the 03001-ADSUBIA buildings one).
I have tried the read.gml function from the multiplex package and get the following error:
read.gml("A.ES.SDGC.BU.03001.building.gml")
Error in which(("node" == arx) == TRUE)[1]:which(("edge" == arx) == TRUE)[1] :
NA/NaN argument
Then I tried the read.graph from the igraph package and also got an error:
read.graph("A.ES.SDGC.BU.46900.building.gml", format=c("gml"))
Error in .Call("R_igraph_read_graph_gml", file, PACKAGE = "igraph") :
At foreign.c:1127 : Parse error in GML file, line 1 (syntax error, unexpected STRING, expecting $end), Parse error
What am I doing wrong, and what can I do to fix it?
igraph and multiplex do not work because that is a different GML: Graph Modelling Language, as the name suggests, is for graphs (or networks). Your GML is Geography Markup Language.
I found an alternative in this post. However, I would like to know why packages like multiplex or igraph cannot get the job done properly...
Code:
llayer<-ogrListLayers("A.ES.SDGC.BU.03001.building.gml")[1]
a<- readOGR(dsn="A.ES.SDGC.BU.46900.building.gml", layer=llayer, encoding = "UTF-8")
I have successfully used Gephi to open GML files, and then used Gephi's export function (menu driven) to create 'new' GML files that open with igraph in R.
Details about Gephi and the GML file definition are here.
I would like to retrieve power hedging data using the Rbbg Bloomberg package in R, and I know this formula works in Excel:
=BDH("VATT SS Equity","BI_%_ELECTRIC_POWER_HEDGED","01/01/2000","","GEOGRAPHIC_LOCATION_OVERRIDE=EUCN","BI_CONTRACT_MATURITY_OVERRIDE=CY12","FUND_PER=Q")
But when I try this in R:
conn<-blpConnect(log.level="off")
data<-bdh(conn,"VATT SS Equity","BI_PER_ELECTRIC_POWER_HEDGED","20000101","","GEOGRAPHIC_LOCATION_OVERRIDE=EUCN","BI_CONTRACT_MATURITY_OVERRIDE=CY12","FUND_PER=Q")
I get the following error message:
Error in .jcall("RJavaTools", "Ljava/lang/Object;", "invokeMethod", cl, :
org.findata.blpwrapper.WrapperException: response error: Invalid override field id specified [nid:217]
What should I change in the formula to make it work?
Thanks
Edit: Indeed it is BI_PCT_ELECTRIC_POWER_HEDGED; however, the problem does not come from there but from the overrides.
This returns an empty variable for me, but it doesn't throw an error so it might get you on the right track.
The way you specify options is different in the current version it seems.
data<-bdh(conn,"VATT SS Equity", "BI_PER_ELECTRIC_POWER_HEDGED","20000101","",
override_fields=c("GEOGRAPHIC_LOCATION_OVERRIDE",
"BI_CONTRACT_MATURITY_OVERRIDE",
override_values=c("EUCN","CY12"),
option_names="periodicitySelection",
option_values="QUARTERLY")
The doc where I found the correct syntax is here: RBloomberg. It was written in 2010 for the predecessor package (before Bloomberg complained about using their name) but I guess it works! I think the convention of enumerating the list of option names then the option values is odd compared to your assumption that OPTION=VALUE was correct, but there you go.
I'm working for the first time with the R package BerkeleyEarth, and attempting to use its convenience functions to access the BEST data. I think maybe it's just a problem with their servers (a matter I've separately addressed to the package's maintainer) but I wanted to know if it's instead something silly I'm doing.
To reproduce my problem:
library(BerkeleyEarth)
downloadBerkeley()
which produces the following error message:
trying URL 'http://download.berkeleyearth.org/downloads/TAVG/LATEST%20-%20Non-seasonal%20_%20Quality%20Controlled.zip'
Error in download.file(urls$Url[thisUrl], destfile = file.path(destDir, :
cannot open URL 'http://download.berkeleyearth.org/downloads/TAVG/LATEST%20-%20Non-seasonal%20_%20Quality%20Controlled.zip'
In addition: Warning message:
In download.file(urls$Url[thisUrl], destfile = file.path(destDir, :
InternetOpenUrl failed: 'A connection with the server could not be established'
Has anyone had a better experience using this package?
The error message points to a different URL than the ones listed at http://berkeleyearth.org/data/ for the zip-formatted files. (There is another set of .nc files that appear to be more recent.) I would replace the entries in the BerkeleyUrls data frame with the ones that match your analysis strategy:
This is the current URL that should be in position 1,1:
http://berkeleyearth.lbl.gov/downloads/TAVG/LATEST%20-%20Non-seasonal%20_%20Quality%20Controlled.zip
And this is the one that is in the package dataframe:
> BerkeleyUrls[1,1]
[1] "http://download.berkeleyearth.org/downloads/TAVG/LATEST%20-%20Non-seasonal%20_%20Quality%20Controlled.zip"
I suppose you could try:
BerkeleyUrls[, 1] <- sub( "download\\.berkeleyearth\\.org", "berkeleyearth.lbl.gov", BerkeleyUrls[, 1])