Regarding the Geocode API [maxresults]: is there a limit on the maximum number of responses for search results?
There is no limit on the maximum number of responses for search results, but there is a limit on result relevance (usually 100%, displayed in the response).
Regarding the Places (Search) API, how do I set a limit on the maximum number of responses? Is there a parameter for this?
Yes, you can use the size parameter for this. In the example below, we restrict the number of results returned in the response to 2 by appending size=2 to the query.
https://places.demo.api.here.com/places/v1/discover/around?at=41.8369%2C-87.684&Accept-Language=en-US%2Cen%3Bq%3D0.5&app_id=DemoAppId01082013GAL&app_code=AJKnXv84fjrb0KIHawS0Tg&size=2
Hope this helps!
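As a minimal sketch, here is one way to build such a request URL programmatically. The endpoint and credentials are the demo values from the URL above; the helper function itself is hypothetical, not part of the HERE SDK.

```python
from urllib.parse import urlencode, urlparse, parse_qs

# Hypothetical helper: build a Places "discover/around" URL with a
# capped result count. Credentials here are placeholders.
def build_around_url(lat, lng, app_id, app_code, size=2):
    base = "https://places.demo.api.here.com/places/v1/discover/around"
    params = {
        "at": f"{lat},{lng}",
        "app_id": app_id,
        "app_code": app_code,
        "size": size,  # caps the number of results in the response
    }
    return f"{base}?{urlencode(params)}"

url = build_around_url(41.8369, -87.684, "YOUR_APP_ID", "YOUR_APP_CODE")
```

Fetching that URL (with real credentials) should then return at most two places.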
Related
I need to know the speed limit for some coordinates from the vehicle's trip.
If I do a POST request like:
https://fleet.cit.api.here.com/2/calculateroute.json?routeMatch=1&mode=car&attributes=SPEED_LIMITS_FCn(*)&app_id=APP_ID&app_code=APP_CODE
with body:
LATITUDE,LONGITUDE
37.401996,-122.041338
37.416438,-122.086022
I'm receiving back the built route with a great many coordinates.
But I'm interested in only my two coordinates...
Is there a better way to get the speed limit for a few (~10-100) coordinates?
This will return an array of links, and if a speed limit is available in the map data, it will be returned for each link. The linkAttributes and routeAttributes parameters limit which attributes appear in the response; the default response, however, can't be limited.
https://route.ls.hereapi.com/routing/7.2/calculateroute.json?apiKey=xxx&waypoint0=42.4065,-113.3798&waypoint1=42.2821,-83.74847&mode=fastest;car&linkAttributes=speedLimit,dynamicSpeedInfo&legAttributes=links
Speed info can also come via the Geocoder API in the form of a speed category, as below; this is based on the address location.
https://geocoder.ls.hereapi.com/6.2/geocode.json?searchtext=Castro%20St%20Mountain%20View%20Santa%20Clara%20US&gen=9&apiKey=xxxxx&locationattributes=li
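A sketch of pulling the per-link speed limits out of a calculateroute.json response. The nested route → leg → link shape below is an assumption based on the v7.2 response layout, and the sample dict is illustrative, not real API output.

```python
# Walk the assumed route -> leg -> link nesting of a v7.2
# calculateroute.json response and collect speed limits where present.
def extract_speed_limits(response):
    limits = []
    for route in response.get("response", {}).get("route", []):
        for leg in route.get("leg", []):
            for link in leg.get("link", []):
                # speedLimit is only present when the map data has it
                if "speedLimit" in link:
                    limits.append((link.get("linkId"), link["speedLimit"]))
    return limits

# Illustrative sample, not real API output:
sample = {
    "response": {"route": [{"leg": [{"link": [
        {"linkId": "+123", "speedLimit": 27.78},
        {"linkId": "-456"},  # no speed limit in map data
    ]}]}]}
}
```

Note that links with no speed limit in the map data are simply skipped rather than reported as zero.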
If I make a request to the Google Analytics API using only the metric "ga:users", the result is different to the one that is returned in the "totalsForAllResult" field when I add a dimension.
Does anyone know the explanation for this and which is the correct result?
You cannot sum up the user counts per dimension to get a grand total, since one and the same user can appear in multiple dimension values. For a detailed explanation look here. If you want to get the total users value, repeat the API request without the dimension. Apparently Google makes the mistake of blindly summing the values themselves in the totalsForAllResults field of the Core Reporting API response, which can be highly misleading.
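The overcounting is easy to see with sets. The data below is made up purely for illustration: the same user appears under several mediums, so the per-medium counts sum to more than the number of distinct users.

```python
# Why per-dimension user counts don't sum to the real total: the same
# user can appear under several dimension values. Illustrative data only.
users_by_medium = {
    "organic":  {"u1", "u2", "u3"},
    "referral": {"u2", "u4"},
    "cpc":      {"u3", "u4", "u5"},
}

# Summing per-medium counts double-counts u2, u3 and u4.
naive_total = sum(len(users) for users in users_by_medium.values())  # 8

# The real total is the size of the union of all the sets.
true_total = len(set().union(*users_by_medium.values()))  # 5
```

This is exactly why repeating the request without the dimension is the only reliable way to get the total.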
I understand that this is a question which has been asked elsewhere, but I haven't yet found an answer which is especially helpful.
The problem I'm having is that the data on the regular web version of analytics doesn't match the data I've pulled from the API.
From what I've read, this can sometimes be an issue with the type of query being used. Here's what I've been using:
var requiredArguments = {
  'dimensions': 'ga:medium',
  'metrics': 'ga:users, ga:sessions, ga:uniquePageviews, ga:newUsers',
  'sort': 'ga:medium',
  'start-index': '1',
  'max-results': '1000',
  'sampling-level': 'DEFAULT',
};
and then...
var results = Analytics.Data.Ga.get(
    tableId,
    startDate,
    finishDate,
    'ga:users, ga:sessions, ga:uniquePageviews, ga:newUsers',
    requiredArguments);
Sessions, across a month, for instance, can sometimes vary by over 1,000. I've tried using different sampling types; I don't think it's that, because I'm not going over 50,000 sessions in a query.
Any help on this is much appreciated.
You need to check the returned result: if the data is sampled, the response will tell you so.
"containsSampledData":false
samplingLevel
samplingLevel=DEFAULT
Optional. Use this parameter to set the sampling level (i.e. the number of sessions used to calculate the result) for a reporting query. The allowed values are consistent with the web interface and include:
•DEFAULT — Returns response with a sample size that balances speed and accuracy.
•FASTER — Returns a fast response with a smaller sample size.
•HIGHER_PRECISION — Returns a more accurate response using a large sample size, but this may result in the response being slower.
If not supplied, the DEFAULT sampling level will be used. See the Sampling section for details on how to calculate the percentage of sessions that were used for a query.
Sampling should return results that are close to, but not exactly the same as, the website. The only way to completely remove sampling from the API is to have a Premium Google Analytics account.
Also remember to consider processing latency. If you request data that is under 48 hours old it will also be different from the website.
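A small sketch of the latency check: before comparing API numbers against the website, flag any query whose date range reaches into the roughly 48-hour processing window. The helper and its cutoff are illustrative assumptions, not part of the GA client libraries.

```python
from datetime import date, timedelta

# Hypothetical helper: does the query's end date fall inside the
# ~48-hour processing window, where numbers may still change?
def may_be_unprocessed(end_date, today, latency_days=2):
    return end_date > today - timedelta(days=latency_days)

# A query ending yesterday is still inside the window;
# one ending a week ago is safely outside it.
recent = may_be_unprocessed(date(2024, 1, 10), today=date(2024, 1, 11))
settled = may_be_unprocessed(date(2024, 1, 5), today=date(2024, 1, 11))
```

When the check fires, either shift the range back or expect small discrepancies against the web UI.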
I'm seeing inconsistencies between reported ga:sessions, ga:users, ga:pageviews from a query spanning a year through the API, and the same date range from the GA website.
I've been able to match ga:sessions & ga:pageviews exactly by requesting every month separately and summing the values, however in the case of ga:users I am still seeing wildly different figures between the numbers returned by GAPI and the GA website.
When I sum the months' figures, the total is actually larger than the year's figure, and both numbers are higher than the values reported on the GA website.
What dimension/metric could GA be using for 'Users'?
I suspect you're having an issue with the sampling level. If the request you are making covers a large enough amount of data (in this case, selecting a full year's worth), the server will return sampled results.
Sampling
Google Analytics calculates certain combinations of dimensions and
metrics on the fly. To return the data in a reasonable time, Google
Analytics may only process a sample of the data.
You can specify the sampling level to use for a request by setting the
samplingLevel parameter.
If a Core Reporting API response contains sampled data, then the
containsSampledData response field will be true. In addition, 2
properties will provide information about the sampling level for the
query: sampleSize and sampleSpace. With these 2 values you can
calculate the percentage of sessions that were used for the query. For
example, if sampleSize is 201,000 and sampleSpace is 220,000 then the
report is based on (201,000 / 220,000) * 100 = 91.36% of sessions.
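The arithmetic in that quoted example can be checked with a one-line helper (hypothetical, just restating the formula from the docs):

```python
# Percentage of sessions used for a sampled query, per the quoted
# formula: (sampleSize / sampleSpace) * 100, rounded to two places.
def sampling_percentage(sample_size, sample_space):
    return round(sample_size / sample_space * 100, 2)

pct = sampling_percentage(201_000, 220_000)  # 91.36
```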
When requesting from the API, the DEFAULT sampling level is used in order to increase the speed of the request. You can change that by specifying the samplingLevel to use in your request.
samplingLevel=DEFAULT
Optional. Use this parameter to set the sampling level (i.e. the number of sessions used
to calculate the result) for a reporting query. The allowed values are consistent with
the web interface and include:
•DEFAULT — Returns response with a sample size that balances speed and accuracy.
•FASTER — Returns a fast response with a smaller sample size.
•HIGHER_PRECISION — Returns a more accurate response using a large sample size, but this may result in the response being slower.
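As a sketch, the parameter is just another key in the query string. The view ID and dates below are placeholders; only the samplingLevel value itself comes from the documentation quoted above.

```python
from urllib.parse import urlencode

# Hypothetical query parameters for a Core Reporting API (v3) request,
# asking for the most accurate sampling the API will provide.
params = {
    "ids": "ga:12345678",          # placeholder view ID
    "start-date": "2015-01-01",    # placeholder dates
    "end-date": "2015-01-31",
    "metrics": "ga:users",
    "samplingLevel": "HIGHER_PRECISION",
}
query = urlencode(params)
```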
As I am not coming close to 100,000 queries per day, I am assuming that Google is referring to the Freebase 10-requests-per-second-per-user limit. (I am passing in my Google key.)
If I am running a query that crosses multiple Freebase domains, is that considered more than one request? Or is a single query considered one request regardless of its size?
thanks
Scott
Yes, it sounds like you're exceeding the per-second rate limit. You'll need to introduce some delays in your application so that you don't exceed it. The rate limit only applies to HTTP requests, so you can query as much data as you like as long as it fits in one request.
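One way to introduce those delays is a small client-side throttle. The 10-requests-per-second figure is from the question; the class itself is a hypothetical sketch, not part of any Google client library.

```python
# Sketch of client-side pacing to stay under a 10-requests-per-second
# quota: track the last request time and compute how long to wait.
class Throttle:
    def __init__(self, per_second=10):
        self.min_interval = 1.0 / per_second  # e.g. 0.1 s between requests
        self.last = None

    def wait_time(self, now):
        """Seconds to wait before the next request is allowed at time `now`."""
        if self.last is None:
            return 0.0
        return max(0.0, self.min_interval - (now - self.last))

    def record(self, now):
        """Mark that a request was sent at time `now`."""
        self.last = now
```

In real code you would call time.sleep(t.wait_time(time.monotonic())) before each HTTP request and t.record(time.monotonic()) right after sending it.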