I am calculating elevations to plot a graph with the Nokia API, and the elevation value it returns for a point near the sea is 44.0 (meters?). If I get the elevation of that same point from other online map services, they return 3 meters. What is happening with the value returned by Nokia?
I am using the Routing API (https://route.api.here.com/routing/7.2/calculateroute.json) with the parameters "returnelevation=true&routeAttributes=all&representation=navigation&mode=fastest;car", and the JSON response contains the following shape:
"shape":[
"36.8142901,-2.5617902,44.0",
"36.814338,-2.5617993,44.0",
"36.8146276,-2.562207,45.0",
"36.8153036,-2.5631618,49.0",
"36.8153572,-2.5632262,50.0",
"36.8154109,-2.5632584,50.0",
"36.8155181,-2.5632906,51.0",
"36.8155932,-2.5633442,52.0",
"36.815722,-2.5636446,56.0",
"36.8158185,-2.5635159,56.0"
],
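For reference, here is a rough sketch of how such a call can be reproduced from R; the httr/jsonlite usage, the app_id/app_code placeholders, the waypoints (taken from the shape above), and the response parsing are assumptions, not the exact request that produced the response.

library(httr)
library(jsonlite)

resp <- GET("https://route.api.here.com/routing/7.2/calculateroute.json",
            query = list(app_id          = "YOUR_APP_ID",    # placeholder credentials
                         app_code        = "YOUR_APP_CODE",
                         waypoint0       = "geo!36.8142901,-2.5617902",
                         waypoint1       = "geo!36.8158185,-2.5635159",
                         returnelevation = "true",
                         routeAttributes = "all",
                         representation  = "navigation",
                         mode            = "fastest;car"))

route <- fromJSON(content(resp, as = "text"))$response$route
shape <- route$shape[[1]]                                  # "lat,lon,elevation" strings
elev  <- as.numeric(sapply(strsplit(shape, ","), `[`, 3))  # third value is the elevation in meters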
Thank you for raising this issue with us. We have asked our data team to check it. You can also report data discrepancies at https://mapcreator.here.com/
I've looked through many pages of how to do this and they essentially all have the same R code suggestions, which I've followed. Here's the R code I'm using for the specific weather station I'm looking for:
library(rnoaa)
options(noaakey="MyKeyHere")
ncdc(datasetid='GHCND', stationid='GHCND:USW00014739', datatypeid='dly-tmax-normal', startdate='2017-05-15', enddate='2018-01-04')
The error message I get when I run this is:
Warning message:
Sorry, no data found
I've gone directly to the NOAA site (https://www.ncdc.noaa.gov/cdo-web/search) and manually pulled the dataset out there (using the "daily summaries" dataset, which is the same as GHCND in the API). There is in fact data there for my entire date range.
What am I missing?
The documentation says:
Note that NOAA NCDC API calls can take a long time depending on the call. The NOAA API doesn't perform well with very long timespans, and will time out and make you angry - beware.
Have you tried a smaller timespan?
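For example, one way to try that is to split the range into roughly month-long windows and combine the results; this is an untested sketch that keeps the same station and datatypeid as your call.

library(rnoaa)
options(noaakey = "MyKeyHere")

# Break the full range into ~1-month windows
starts <- seq(as.Date("2017-05-15"), as.Date("2018-01-04"), by = "1 month")
ends   <- c(starts[-1] - 1, as.Date("2018-01-04"))

# Query each window separately, then stack the data frames
chunks <- Map(function(s, e) {
  ncdc(datasetid = 'GHCND', stationid = 'GHCND:USW00014739',
       datatypeid = 'dly-tmax-normal',
       startdate = format(s), enddate = format(e))$data
}, starts, ends)

all_data <- do.call(rbind, chunks)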
I have a question regarding wind speed metric in Here Weather Observation API.
Following is the link and example.
https://developer.here.com/documentation/weather/topics/example-weather-observation.html
I am not sure if you are looking for the definition of windSpeed; if so, here it is:
Wind speed in km/h or mph. If the element is in the response and it contains a value, there is a Double in the String.
<windSpeed>1.85</windSpeed>
If the element is in the response and it does not contain a value, there is an asterisk (*) in the String.
<windSpeed>*</windSpeed>
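So, if you are parsing the value in R, a small sketch of handling that convention (the function name parse_wind_speed is just illustrative) could be:

# Treat the "*" marker as a missing value, otherwise convert to numeric
parse_wind_speed <- function(x) {
  ifelse(x == "*", NA_real_, suppressWarnings(as.numeric(x)))
}

parse_wind_speed(c("1.85", "*"))
# [1] 1.85   NA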
For more detail about the metric and these parameters, you can go through this link:
https://developer.here.com/documentation/weather/topics/resource-report.html#resource-report__param-product
Hope this helps!
I have about 9 million records with latitude and longitude, in addition to the timestamp in EST. I want to use the latitude and longitude to generate the appropriate regional timezone, from which I then want to adjust all these times for the relevant timezone.
I have tried using the geonames package:
library(geonames)
data$tz <- mapply(GNtimezone, data$lat, data$lon)
However, this returns the following error:
Error in getJson("timezoneJSON", list(lat = lat, lng = lng, radius = 0)) :
error code 13 from server: ERROR: canceling statement due to statement timeout
I have tried to use a method described in this post.
library(XML)

data$tz_url <- sprintf("https://maps.googleapis.com/maps/api/timezone/%s?location=%s,%s&timestamp=%d&sensor=%s",
                       "xml",
                       data$lat,
                       data$lon,
                       as.numeric(data$time),
                       "false")

for(i in 1:100){
  data$tz[i] <- xmlParse(readLines(data$tz_url[i]), isURL = TRUE)[["string(//time_zone_name)"]]
}
With this method, I am able to build the URLs for the XML data. But when I try to pull the XML data in a for loop and append the timezone to the dataframe, it doesn't do it for all the records (in fact, it only fills in about 10 records at a time, intermittently).
Does anyone know of any alternate methods or packages to get the three character timezone (i.e. EST) for about 9 million records relatively quickly? Your help is much appreciated. Or better yet, if you have ideas on why the code I used above isn't working, I'd appreciate that too.
For a list of methods of converting latitude and longitude to time zone, see this post. These mechanisms will return the IANA/Olson time zone identifier, such as America/Los_Angeles.
However, you certainly don't want to make 9 million individual HTTP calls. You should attempt to group the records to distinct locations to minimize the number of lookups. If they are truly random, then you will still have a large number of locations, so you should consider the offline mechanisms described in the previous post (i.e. using the tz_world shapefile with some sort of geospatial lookup mechanism).
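As a sketch of that grouping idea (reusing the data$lat / data$lon names from the question; tz_lookup here is a placeholder for whichever lookup mechanism you settle on, online or offline):

# Look up each distinct location once instead of once per row
locs <- unique(data[, c("lat", "lon")])
locs$tz_id <- mapply(tz_lookup, locs$lat, locs$lon)

# Join the zone id back onto all 9 million rows
data <- merge(data, locs, by = c("lat", "lon"), all.x = TRUE)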
Once you have the IANA/Olson time zone identifier for the location, you can then use R's time zone functionality (as.POSIXct, format, etc.) with each corresponding timestamp to obtain the abbreviation.
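For example (illustrative timestamps only):

# Once the zone id is known, format() gives the local abbreviation for a given instant
t <- as.POSIXct("2017-07-01 16:00:00", tz = "UTC")
format(t, "%Z", tz = "America/Los_Angeles")  # "PDT"
format(t, "%Z", tz = "America/New_York")     # "EDT"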
However, do recognize that time zone abbreviations themselves can be somewhat ambiguous. They are useful for human readability, but not much else.
I've written the package googleway to access the Google Maps API. You'll need a valid API key (and, for Google to handle 9 million calls, you'll have to pay for it, as the free one only covers 2,500).
library(googleway)

key <- "your_api_key"

google_timezone(location = c(-37, 144),
                key = key)

# $dstOffset
# [1] 0
#
# $rawOffset
# [1] 36000
#
# $status
# [1] "OK"
#
# $timeZoneId
# [1] "Australia/Hobart"
#
# $timeZoneName
# [1] "Australian Eastern Standard Time"
I am encountering a problem with the geonames reverse geocoding API package in R. I have a dataset of nearly 900k rows containing latitude and longitude, and I am using the GNneighbourhood(lat, lng)$name function to get the neighbourhood for every pair of coordinates (my dataset contains incidents in San Francisco).
Now, while the function works perfectly for the big majority of points, there are times when it gives an error code 15 message: we are afraid we could not find a neighbourhood for latitude and longitude. The same procedure can be performed with the revgeocode function (Google reverse geocoding API) of the ggmap package, and in fact it works even for the points that give an error with the geonames package. The reason I am not using it is the query limit per day.
Successful example
GNneighbourhood(37.7746,-122.4259)$name
[1] "Western Addition"
Failure
GNneighbourhood(37.76569,-122.4718)$name
Error in getJson("neighbourhoodJSON", list(lat = lat, lng = lng)) :
error code 15 from server: we are afraid we could not find a neighbourhood for latitude and longitude :37.76569,-122.4718
Searching for the above point in Google Maps works fine, and we can also see that the incident is not on water or any other inappropriate location. (Unless the park nearby indicates something, I don't know.)
Does anyone have experience with this procedure and this specific package? Is it possible for the API data to be incomplete? It clearly states that it can handle all US cities. Some help would be appreciated.
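A minimal sketch of tolerating such failures with base R's tryCatch (not a fix for any missing neighbourhood data, just a way to record NA for points the API cannot resolve; safe_neighbourhood is an illustrative wrapper name):

# Wrap the call so error 15 becomes NA instead of stopping the loop
safe_neighbourhood <- function(lat, lng) {
  tryCatch(GNneighbourhood(lat, lng)$name,
           error = function(e) NA_character_)
}

safe_neighbourhood(37.7746, -122.4259)   # "Western Addition"
safe_neighbourhood(37.76569, -122.4718)  # NA instead of error code 15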
Is there a way to use WMS GetFeatureInfo specifying a TIME period (e.g. 2014-01-01/2014-03-01) to extract a series of values from a raster layer served by a GeoServer instance?
Thanks in advance
Not at the moment, no. It may be added in the future though; it's not the first time I've heard this request. I don't have an ETA, it depends on when funding to work on it shows up.
In the meantime, a somewhat complex workaround might be to configure the image mosaic index as a WFS feature type, query it by date, figure out the exact time values intersected by the interval, and then do N GetFeatureInfo requests, one for each of those values.
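To illustrate the last step of that workaround, here is a rough R sketch that issues one GetFeatureInfo request per time value; the endpoint, layer name, bbox/pixel parameters, and the times vector (which would come from the prior WFS query against the mosaic index) are all placeholders, not a tested configuration.

library(httr)

# Time values discovered from the mosaic index via WFS (placeholders)
times <- c("2014-01-01T00:00:00.000Z", "2014-02-01T00:00:00.000Z", "2014-03-01T00:00:00.000Z")

# One GetFeatureInfo request per time value, all other parameters held fixed
responses <- lapply(times, function(t) {
  GET("http://localhost:8080/geoserver/wms",
      query = list(service = "WMS", version = "1.1.1", request = "GetFeatureInfo",
                   layers = "myws:mymosaic", query_layers = "myws:mymosaic",
                   styles = "", srs = "EPSG:4326", bbox = "-2.6,36.8,-2.5,36.9",
                   width = 101, height = 101, x = 50, y = 50,
                   info_format = "application/json",
                   time = t))
})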