How can I request pinpoint-accurate geocoding data using the HERE API?

I am looking to use HERE's geocoding service to locate the latitude and longitude of a place based on a UK postcode. At the moment my request returns only a rough location, even though I have provided a full postcode.
The old "Geocoder" API that I used previously would return relevant results; however, it has been put into maintenance and replaced with the "Geocoding and Search" API. This new API seems to look through a list of stored points of interest within HERE's database and return the closest match to what you have searched for, rather than trying to find the exact location entered.
How can I get more accurate results using the below request? Bear in mind that I will only have access to the postcode.
https://geocode.search.hereapi.com/v1/geocode?q={postCode}&apiKey={key}
At the moment I receive a response similar to the one below using postcode PE1 1QL. It should point to a car park; however, if you enter the lat and lng returned from the API into a map, e.g. Google Maps, it gives you a more general location rather than an accurate one.
{
    "title": "PE1 1, Peterborough, England",
    "id": "here:cm:namedplace:22221149",
    "resultType": "locality",
    "localityType": "postalCode",
    "address": {
        "label": "PE1 1, Peterborough, England",
        "countryCode": "GBR",
        "countryName": "England",
        "county": "Cambridgeshire",
        "city": "Peterborough",
        "postalCode": "PE1 1"
    },
    "position": {
        "lat": 52.57362,
        "lng": -0.24219
    },
    "mapView": {
        "west": -0.23515,
        "south": 52.56739,
        "east": -0.25194,
        "north": 52.57984
    },
    "scoring": {
        "queryScore": 0.67,
        "fieldScore": {
            "postalCode": 0.95
        }
    }
}
I would expect the Lat and Lng to be much closer to the postcode entered than the above example.
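For reference, a minimal sketch of how this request is made and the position read out, assuming the Python requests library and a placeholder API key (in the v1 response, the excerpt above is one entry of a top-level "items" array):

import requests  # assumed available; any HTTP client would do

API_KEY = "YOUR_API_KEY"  # placeholder
post_code = "PE1 1QL"

resp = requests.get(
    "https://geocode.search.hereapi.com/v1/geocode",
    params={"q": post_code, "apiKey": API_KEY},
)
resp.raise_for_status()
item = resp.json()["items"][0]
# With the current service this comes back with resultType "locality"
# (a postal-code centroid) rather than a point on the car park itself.
print(item["resultType"], item["position"])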

Regarding this, the release notes at https://developer.here.com/documentation/geocoding-search-api/release_notes/topics/known-issues.html state "High precision postal codes are not yet supported":
Known Issues
The following issues are known to be present in the current release:
- Search for intersections is not yet supported
- Search by telephone numbers is not yet supported
- Political views are not yet supported. All views are “International”
- Places detail views are not yet supported
- High precision postal codes are not yet supported
The Geocoder API 6.2 will be supported at least until the end of 2020 (maybe longer), and "maintenance" in the documentation means no new features.
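For anyone who needs the higher-precision point before that cutoff, a minimal sketch of a 6.2-style request (the same format as the Geocoder 6.2 examples further down this page); the credentials are placeholders and the production host is an assumption, so adjust to your environment:

import requests  # assumed available

APP_ID = "YOUR_APP_ID"      # placeholder credentials for the 6.2 Geocoder
APP_CODE = "YOUR_APP_CODE"  # placeholder

resp = requests.get(
    "https://geocoder.api.here.com/6.2/geocode.json",  # assumed production host
    params={
        "app_id": APP_ID,
        "app_code": APP_CODE,
        "searchText": "PE1 1QL",
        "country": "GBR",
    },
)
resp.raise_for_status()
location = resp.json()["Response"]["View"][0]["Result"][0]["Location"]
print(location["DisplayPosition"])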

Related

Bug: Here routing isolines API returning same polygon for all distances

For some coordinates, the HERE isoline API returns the same polygon for all time ranges, regardless of the range[values] value.
The example is for the Burlington GO train station, ON, Canada; shifting the coordinates by just a few meters changes whether the returned data is valid.
Example:
curl --location -g --request GET 'https://isoline.router.hereapi.com/v8/isolines?transportMode=pedestrian&range[type]=time&range[values]=50,600&destination=43.339900,-79.809388&apikey=XXXXXX'
{
    "arrival": {
        "time": "2022-08-31T06:28:52+00:00",
        "place": {
            "type": "place",
            "location": { "lat": 43.33981, "lng": -79.80958 },
            "originalLocation": { "lat": 43.3399, "lng": -79.8093881 }
        }
    },
    "isolines": [
        {
            "range": { "type": "time", "value": 50 },
            "polygons": [
                { "outer": "BG86o1yC3mmn4E1C2GhI2C1KAhI1CArF4K1KgI1CiIqB2CgEA4G" }
            ]
        },
        {
            "range": { "type": "time", "value": 600 },
            "polygons": [
                { "outer": "BG86o1yC3mmn4E1C2GhI2C1KAhI1CArF4K1KgI1CiIqB2CgEA4G" }
            ]
        }
    ]
}
If I change the coordinates just a few meters, to "lat": 43.3403845, "lng": -79.8089106 then it returns correct isolines.
Is there a best practice when selecting destination coordinates, any extra parameter I should use to make the results more predictable?
I consider this a bug: if you cannot return valid results, you should throw an error, not return invalid data to clients.
I think you can get better results if you use the additional parameter "radius" (in meters) for the place (lat, lng) that you provided.
This parameter instructs the router to consider all places within the given radius as potential candidates for route calculation. This can be either because it is not important which place is used, or because it is unknown. Radii wider than 200 meters are not supported.
For example: destination=43.339900,-79.809388;radius=100.
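A minimal sketch of the same request with the radius qualifier added, in Python (requests assumed, key is a placeholder); the URL is assembled by hand so the ";radius=100" qualifier stays exactly as shown above:

import requests  # assumed available

API_KEY = "YOUR_API_KEY"  # placeholder

# Same request as the curl example above, with ";radius=100" appended to the
# destination so all places within 100 m are considered as candidates.
url = (
    "https://isoline.router.hereapi.com/v8/isolines"
    "?transportMode=pedestrian"
    "&range[type]=time"
    "&range[values]=50,600"
    "&destination=43.339900,-79.809388;radius=100"
    "&apikey=" + API_KEY
)

resp = requests.get(url)
resp.raise_for_status()
for isoline in resp.json()["isolines"]:
    print(isoline["range"], isoline["polygons"][0]["outer"][:20])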
I hope this information is useful for you.
Regards.

HERE API: Can the Accept-Language header be used to filter addresses (French versus English)?

We are using the places/search endpoint to look up addresses.
Being based in Ontario, a lot of addresses have both a French and an English version.
For example, looking for 360 Lisgar, Ottawa, ON will return 2 addresses in French and 2 in English (https://places.demo.api.here.com/places/v1/discover/search?at=45%2C-75&q=360+Lisgar%2C+Ottawa%2C+on&addressFilter=countryCode%3DCAN&Accept-Language=en%3Bq%3D1%2Cfr%3Bq%3D0.1&app_id=DemoAppId01082013GAL&app_code=AJKnXv84fjrb0KIHawS0Tg).
We have been trying all kinds of combinations of the "Accept-Language" header parameter, with different values and weights, and it always returns the 4 values. Is there any other way to get only the addresses in French or English?
We could filter in our code afterwards if there were a language field on the returned address, but we couldn't find any reference to one in the documentation. Is there any way to do it?
Thanks,
Rene
The Accept-Language header is used in the HERE API to indicate a preference for language. However, if we have no translation/transliteration for a result in that language, we use a fallback mechanism where we try to return results in a language that would be intelligible to the user. In your case, I suspect that there are only French entries for some of the addresses and only English entries for others, which is why you get the same results no matter what you set the "Accept-Language" header to. There's no way to filter results so that, if we don't have a translation in your preferred language, we don't return the result at all.
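For completeness, this is how the preference would be sent as an actual HTTP request header rather than a query-string parameter; a minimal Python sketch using requests and the demo credentials from the question, assuming the usual results.items layout of Places responses (note that, per the above, it will still not filter out entries that only exist in one language):

import requests  # assumed available

resp = requests.get(
    "https://places.demo.api.here.com/places/v1/discover/search",
    params={
        "at": "45,-75",
        "q": "360 Lisgar, Ottawa, on",
        "addressFilter": "countryCode=CAN",
        "app_id": "DemoAppId01082013GAL",
        "app_code": "AJKnXv84fjrb0KIHawS0Tg",
    },
    headers={"Accept-Language": "en;q=1, fr;q=0.1"},  # sent as a real header
)
resp.raise_for_status()
for item in resp.json()["results"]["items"]:
    print(item.get("title"), item.get("vicinity"))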
This might be worth trying: I can specify the language using the HERE Geocoder REST API by adding a parameter (the same format as the Google API) at the end:
&language=en
or
&language=ar
I work in Dubai so my other desired language is Arabic. See examples below.
Example REST API Call for English
https://geocoder.cit.api.here.com/6.2/geocode.json?app_id={your app id}&app_code={your app code}&searchText=1 Prince Fawaz St, Al Khobar&language=en
Example extract from the JSON return
"Address": {
"Label": "1 Prince Ahmad Street, Madinat Al Ummal, 34441 Al Khobar, Saudi Arabia",
"Country": "SAU",
"County": "Eastern Province",
"City": "Al Khobar",
"District": "Madinat Al Ummal",
"Street": "Prince Ahmad Street",
"HouseNumber": "1",
"PostalCode": "34441",
"AdditionalData": [
{
Example REST API Call for Arabic
https://geocoder.cit.api.here.com/6.2/geocode.json?app_id={your app id}&app_code={your app code}&searchText=1 Prince Fawaz St, Al Khobar&language=ar
Example extract from JSON return
"Address": {
"Label": "\u202e\u202a1\u202c شارع الأمير أحمد, مدينة العمال, \u202a34441\u202c الخبر, السعودية\u202c",
"Country": "SAU",
"County": "المنطقة الشرقية",
"City": "الخبر",
"District": "مدينة العمال",
"Street": "شارع الأمير أحمد",
"HouseNumber": "1",
"PostalCode": "34441",
"AdditionalData": [
{

Get a place from the Places here-api using the id

Although there is a place lookup from the ID of other HERE services (https://developer.here.com/rest-apis/documentation/places/topics_api/resource-lookup.html)... I can't find a way to look up place details using the actual place ID... I'd like to be able to cache the id and give my users the recent places they've viewed.
Is this possible? The detailed response and the searches all provide the ID, but is it usable as I describe?
Sample output from Autocomplete for the Atlanta Airport is below:
{
    "title": "ATL",
    "highlightedTitle": "<b>ATL</b>",
    "vicinity": "6000 N Terminal Pkwy<br/>College Park, GA 30320",
    "highlightedVicinity": "6000 N Terminal Pkwy<br/>College Park, GA 30320",
    "position": [
        33.640397,
        -84.450922
    ],
    "category": "airport",
    "href": "https://places.api.here.com/places/v1/places/840djgzq-aea3f677bbd744ab855203f2ba20281b;context=Zmxvdy1pZD05MmE2ZDVkNS05MDZiLTU3YTQtOGM3NC00MTMxYjY5YzllNDlfMTQ5NzE0NTc4ODAwNl83OTUzXzcwMjEmcmFuaz0w?app_id=eo36dAgbCSxzcLGxzyjZ&app_code=jDJSp_MrBeF6jbuZXUSQqw",
    "type": "urn:nlp-types:place",
    "resultType": "place",
    "id": "840djgzq-aea3f677bbd744ab855203f2ba20281b"
}
To get details of a particular place you need to use the URL from the href attribute. If you want, you can create a map keyed by the value of the id attribute, with the URL from the href attribute as the value.
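A minimal sketch of that approach, keyed on the id and href fields from the autocomplete output above (Python with requests assumed; the href already carries the app_id/app_code):

import requests  # assumed available

# Cache of place id -> href, filled from autocomplete/search results so the
# user's recently viewed places can be looked up again later.
recent_places = {}

def remember(result):
    recent_places[result["id"]] = result["href"]

def lookup(place_id):
    """Fetch the place details again via the cached href for this id."""
    resp = requests.get(recent_places[place_id])
    resp.raise_for_status()
    return resp.json()

# e.g. remember(autocomplete_result)
#      details = lookup("840djgzq-aea3f677bbd744ab855203f2ba20281b")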

Wrong intent in Alexa Skill Request when using the simulator

I set up my intents using this intent schema:
{
    "intents": [
        { "intent": "StartIntend" },
        { "intent": "AMAZON.YesIntent" },
        { "intent": "AMAZON.NoIntent" }
    ]
}
My sample utterances look like this (it's German):
StartIntend Hallo
StartIntend Moin
StartIntend Guten Tag
Why does the Amazon Developer Console generate the following request, when I use the utterance "Yes" or "Ja"?
{
    "session": {
        "sessionId": "SessionId...",
        "application": {
            "applicationId": "amzn1.ask.skill...."
        },
        "attributes": {},
        "user": {
            "userId": "amzn1.ask.account...."
        },
        "new": true
    },
    "request": {
        "type": "IntentRequest",
        "requestId": "EdwRequestId...",
        "locale": "de-DE",
        "timestamp": "2017-02-17T21:07:59Z",
        "intent": {
            "name": "StartIntend",
            "slots": {}
        }
    },
    "version": "1.0"
}
Whatever I enter, it always uses the intent StartIntend.
Why is that? What have I forgotten / what have I done wrong?
The schema and utterance look correct.
I tried duplicating what you are seeing by performing the following steps:
- Copied them as-is into a new skill on my account.
- Selected the North America region on the Configuration page.
- Set the Lambda to point to an existing Lambda that I have. For testing purposes I just need a valid ARN; I'm going to ignore the response anyway.
- Entered "Yes" into the service simulator.
It indeed sent the Lambda the AMAZON.YesIntent.
So I conclude that there's nothing wrong with the data you posted.
I tried entering Ja, which resulted in StartIntend, but I guess I would expect that, since Ja is not "Yes" in North America.
Have you set the region to Europe, and entered a Lambda for the Europe region?
I talked about it with Amazon Support. After some experiments it turned out that you have to write "ja" in lowercase. It seems to be a bug in the simulator itself.
When creating the skill in the Alexa Skills Kit, you need to choose the correct language, i.e. German.
Everything else seems to be correct.

pull the citations for a paper from google scholar using R

Using Google Scholar and R, I'd like to find out who is citing a particular paper.
The existing packages (like scholar) are oriented towards H-index analyses: statistics on a researcher.
I want to give a target-paper as input. An example url would be:
https://scholar.google.co.uk/scholar?oi=bibs&hl=en&cites=12939847369066114508
Then R should scrape these citation pages (Google Scholar paginates them) for the paper, returning an array of papers which cite the target (up to 500 or more citations). Then we'd search for keywords in the titles, tabulate journals and citing authors, etc.
Any clues as to how to do that? Or is it down to literally scraping each page? (which I can do with copy and paste for one-off operations).
Seems like this should be a generally useful function for things like seeding systematic reviews as well, so someone adding this to a package might well increase their H :-)
Although there are a bunch of Google APIs available, a Google Scholar API is not among them. So, although a web crawler for Google Scholar pages might not be difficult to develop, I do not know to what extent it might be illegal. Check this.
Alternatively, you could use a third-party solution like SerpApi. It's a paid API with a free trial. We handle proxies, solve captchas, and parse all rich structured data for you.
Example Python code (also available in other languages):
from serpapi import GoogleSearch

params = {
    "api_key": "secret_api_key",
    "engine": "google_scholar",
    "hl": "en",
    "cites": "12939847369066114508"
}

search = GoogleSearch(params)
results = search.get_dict()
Example JSON output:
{
    "position": 1,
    "title": "Lavaan: An R package for structural equation modeling and more. Version 0.5–12 (BETA)",
    "result_id": "HYlMgouq9VcJ",
    "type": "Pdf",
    "link": "https://users.ugent.be/~yrosseel/lavaan/lavaanIntroduction.pdf",
    "snippet": "Abstract In this document, we illustrate the use of lavaan by providing several examples. If you are new to lavaan, this is the first document to read … 3.1 Entering the model syntax as a string literal … 3.2 Reading the model syntax from an external file …",
    "publication_info": {
        "summary": "Y Rosseel - Journal of statistical software, 2012 - users.ugent.be",
        "authors": [
            {
                "name": "Y Rosseel",
                "link": "https://scholar.google.com/citations?user=0R_YqcMAAAAJ&hl=en&oi=sra",
                "serpapi_scholar_link": "https://serpapi.com/search.json?author_id=0R_YqcMAAAAJ&engine=google_scholar_author&hl=en",
                "author_id": "0R_YqcMAAAAJ"
            }
        ]
    },
    "resources": [
        {
            "title": "ugent.be",
            "file_format": "PDF",
            "link": "https://users.ugent.be/~yrosseel/lavaan/lavaanIntroduction.pdf"
        }
    ],
    "inline_links": {
        "serpapi_cite_link": "https://serpapi.com/search.json?engine=google_scholar_cite&q=HYlMgouq9VcJ",
        "cited_by": {
            "total": 10913,
            "link": "https://scholar.google.com/scholar?cites=6338159566757071133&as_sdt=2005&sciodt=0,5&hl=en",
            "cites_id": "6338159566757071133",
            "serpapi_scholar_link": "https://serpapi.com/search.json?as_sdt=2005&cites=6338159566757071133&engine=google_scholar&hl=en"
        },
        "related_pages_link": "https://scholar.google.com/scholar?q=related:HYlMgouq9VcJ:scholar.google.com/&scioq=&hl=en&as_sdt=2005&sciodt=0,5",
        "versions": {
            "total": 27,
            "link": "https://scholar.google.com/scholar?cluster=6338159566757071133&hl=en&as_sdt=2005&sciodt=0,5",
            "cluster_id": "6338159566757071133",
            "serpapi_scholar_link": "https://serpapi.com/search.json?as_sdt=2005&cluster=6338159566757071133&engine=google_scholar&hl=en"
        },
        "cached_page_link": "https://scholar.googleusercontent.com/scholar?q=cache:HYlMgouq9VcJ:scholar.google.com/&hl=en&as_sdt=2005&sciodt=0,5"
    }
},
...
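To turn that output into the list of citing papers the question asks for, a hedged sketch of walking the results page by page; it assumes the hits sit under an "organic_results" key and that the engine accepts a "start" offset for pagination, as described in the SerpApi documentation:

from serpapi import GoogleSearch

def citing_titles(cites_id, api_key, pages=5, page_size=20):
    """Collect titles of the papers citing the target, one page at a time."""
    titles = []
    for page in range(pages):
        params = {
            "api_key": api_key,
            "engine": "google_scholar",
            "hl": "en",
            "cites": cites_id,
            "start": page * page_size,  # pagination offset (assumed)
        }
        results = GoogleSearch(params).get_dict()
        for item in results.get("organic_results", []):
            titles.append(item["title"])
    return titles

# e.g. titles = citing_titles("12939847369066114508", "secret_api_key")

The same search.json endpoint could presumably be called from R with httr if staying in R matters.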
Check out the documentation for more details.
Disclaimer: I work at SerpApi.
