R: Google Distance Matrix API request rate limit exceeded

I know that similar questions have been asked before, but from what I've been able to gather, none of the answers seem to apply to my case.
What I'm trying to do is replicate this, but in R: Computing the optimal road trip across the US.
Everything works perfectly until I ask the Googles for the distance matrix for more than 10 locations. In my script (to follow) I list my API key, and on the API website I can see that my successful runs of the program (when the number of locations is less than 10) increase my usage for the day, so I know that my API is working... I think.
What I don't understand is why I receive the "rate limit exceeded" error for, say, a distance matrix with 11 locations. If I have 1,500 requests left, I should certainly not have any issues, right? I should add that I am not familiar with other programming languages such as Java and Python, so that could explain part of my confusion.
Here be the relevant code:
# Request object from API
library(httr)

r <- GET(
  "https://maps.googleapis.com/maps/api/distancematrix/json",
  query = list(
    origins = places,
    destinations = places,
    key = "INSERT API KEY HERE"
  )
)
stop_for_status(r)

# Parse the JSON response into an R list
distances <- content(r)
The variable 'places' is simply a list containing the locations that I want distances to/from.

RTM?
Each query sent to the Google Maps Distance Matrix API is limited by
the number of allowed elements, where the number of origins times the
number of destinations defines the number of elements.
The Google Maps Distance Matrix API has the following limits in place:
Standard Usage Limits
Users of the standard API:
2,500 free elements per day
100 elements per query
100 elements per 10 seconds
Ergo: with 11 locations you are asking for 11 × 11 = 121 elements in a single query, which already exceeds the 100-element per-query limit even though your daily quota is nowhere near exhausted. I think you have to split it up into several queries, with a 10-second pause in between, in order to get the full distance matrix, as sketched below.
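A rough sketch of that splitting approach in R, building on the GET call from the question. The 10-location block size and the get_chunk helper are my own illustrative choices, not anything the API prescribes; the point is simply that each block stays at or below 100 elements and that requests are spaced 10 seconds apart.

library(httr)

# places: character vector of locations; the API expects them pipe-separated
chunk_size <- 10  # 10 origins x 10 destinations = 100 elements per request

# Illustrative helper: fetch one block of the full matrix
get_chunk <- function(origin_chunk, dest_chunk, api_key) {
  r <- GET(
    "https://maps.googleapis.com/maps/api/distancematrix/json",
    query = list(
      origins      = paste(origin_chunk, collapse = "|"),
      destinations = paste(dest_chunk, collapse = "|"),
      key          = api_key
    )
  )
  stop_for_status(r)
  content(r)
}

# Split the locations into blocks and request every origin-block x destination-block
# pair, pausing 10 seconds between requests to respect the 100 elements / 10 seconds limit
blocks <- split(places, ceiling(seq_along(places) / chunk_size))
results <- list()
for (i in seq_along(blocks)) {
  for (j in seq_along(blocks)) {
    results[[paste(i, j, sep = "_")]] <- get_chunk(blocks[[i]], blocks[[j]], "INSERT API KEY HERE")
    Sys.sleep(10)
  }
}
# results now holds the pieces of the full matrix, ready to be stitched back together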


HERE Routing API - Matrix/Multiple arrival times?

Update: I have switched to HERE Routing API because I was told that it would be faster. I am trying to use HerePy to get a routing matrix, but I am getting the following error message:
AttributeError: 'RoutingApi' object has no attribute 'matrix'
Regardless of whether I find out how to move past this error, it's also not clear whether this API can accept multiple departure times (each of my origins has its own departure time). I have a feeling I will also run into the matrix size issue again. Does anyone know how to fix this error and/or know more about what I'm able to do? I had a phone call with someone from the sales department, but they didn't know the answers to these questions.
Original Question: I am trying to use the Google Maps Distance Matrix API. I have an array of origins, an array of destinations, and then an array of arrival times. Each destination has its own arrival time. From what I have read in the documentation, it is not clear whether I can use an array of arrival times or just one arrival time per request. Does anyone know?
I suppose if I can only do one arrival time per request, then I would just group together the destinations with the same arrival times into one request. I will need to do multiple requests anyway due to the limits of 100 elements and 25 origins or destinations per request.
Thanks!
You can check the Matrix Routing API offered by HERE.
The Matrix Routing service is an HTTP JSON API that calculates routing matrices, travel times and/or distances, of up to 10,000 origins and 10,000 destinations. A routing matrix is a matrix with rows labeled by origins and columns by destinations. Each entry of the matrix is the travel time or distance from the origin to the destination.
For more information, please see the HERE Matrix Routing API documentation.
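For illustration only, here is a rough R sketch of a synchronous Matrix Routing v8 request. The endpoint, body fields, and response layout reflect my reading of the HERE documentation and should be treated as assumptions to verify against the current docs; the API key and coordinates are placeholders.

library(httr)
library(jsonlite)

# Assumed v8 endpoint; small matrices can reportedly be requested synchronously with async=false
url <- "https://matrix.router.hereapi.com/v8/matrix"

body <- list(
  origins = list(list(lat = 52.52, lng = 13.40), list(lat = 52.50, lng = 13.30)),
  destinations = list(list(lat = 52.53, lng = 13.38)),
  # regionDefinition limits the search area; these field names are assumptions from the docs
  regionDefinition = list(type = "circle",
                          center = list(lat = 52.52, lng = 13.40),
                          radius = 20000),
  matrixAttributes = list("travelTimes", "distances")
)

r <- POST(url,
          query = list(apiKey = "YOUR_API_KEY", async = "false"),
          body = toJSON(body, auto_unbox = TRUE),
          content_type_json())
stop_for_status(r)
res <- content(r)

# Assumed response shape: res$matrix$travelTimes and res$matrix$distances,
# flattened row by row with one row per origin and one column per destination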

Google Maps Distance Matrix - efficient usage [duplicate]

This question already has answers here:
Google Maps API - Getting closest points to zipcode
(2 answers)
Closed 2 years ago.
I'm working on a delivery web app and can't figure out how I can reduce the number of requests the app sends to the Google Distance Matrix API to calculate the distance between the requested delivery address and a store address.
I have a catalogue page that lists around 1,000 stores. Each time a user enters a delivery address I send 1,000 requests to the Google Maps API to check whether the delivery address is within each store's delivery range. Hence Google charges me for 1,000 requests every time a user enters a new delivery address.
Any suggestions on how to optimise usage of the Google API and show only those stores that deliver to the selected address, as the current approach is way too expensive? I'm wondering how large on-demand delivery services that have tens of thousands of stores deal with this.
You could calculate the direct-line distance (using a formula) and only request stores whose direct line distance is less than the allowed range, since the travel distance can't be shorter than the direct line.
If you don't care about getting exactly the shortest travel, you can also sort the candidates, request them in order and stop as soon as you get an acceptable one. That will occasionally give a store that's physically closer but further away by road, which may or may not be acceptable.
In most programming languages, the direct-line distance will be available in a "geo" library or similar, under the name "great-circle distance". You can also search for it here on SO.
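A minimal sketch of that pre-filter in R. The haversine formula and the 6371 km Earth radius are standard; the stores data frame, its column names, delivery_range_km, and the user_lat/user_lng coordinates are assumptions made for illustration.

# Great-circle (haversine) distance in kilometres between two lat/lng points
haversine_km <- function(lat1, lng1, lat2, lng2) {
  to_rad <- pi / 180
  dlat <- (lat2 - lat1) * to_rad
  dlng <- (lng2 - lng1) * to_rad
  a <- sin(dlat / 2)^2 +
    cos(lat1 * to_rad) * cos(lat2 * to_rad) * sin(dlng / 2)^2
  2 * 6371 * asin(sqrt(a))  # 6371 km is roughly the Earth's mean radius
}

# stores: data frame with lat, lng and delivery_range_km columns (assumed layout)
# Keep only stores whose straight-line distance already fits their delivery range;
# only these candidates need a paid Distance Matrix request afterwards
crow_flies <- haversine_km(user_lat, user_lng, stores$lat, stores$lng)
candidates <- stores[crow_flies <= stores$delivery_range_km, ]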

How to show only train/subway stations using nearby stations api?

I am using the nearby stations by geocode transit API to return a JSON result of the 3 closest stations. I assumed these would be train/subway stations, but the 3 closest to any given dynamic lat & lng are usually bus stations.
Example of my query with parameters: https://transit.api.here.com/v3/stations/by_geocoord.json?center=LAT%2CLNG&radius=350&app_id=APPID&app_code=APPCODE&max=3
I have read in previous posts that the ability to use modes such as mode=1 is only available whilst using one of the routing APIs.
To cut down on API calls, I do not wish to call the Google Maps nearby places API for train stations and then use those stations in another API call to Here.com to get the distances. I was hoping there was a way I can do it in one call.
I am client-side filtering at the moment and only displaying name & distance. I see that if I change my max parameter to 50, I get more results further down which are tube/subway stations, but I'm unsure how I would go about filtering these out. Also, I see references to icons but no URLs given to access them; where are these located?
I am using ReactJS and the native fetch method for my api request.
Thanks
I expect that you have 3 requirements:
1) names and distances of stations around a specific coordinate,
2) stations filtered to trains/trams only (intercity or intracity), excluding buses and the remaining modes,
3) avoiding multiple API calls.
The by_name and by_geocoord APIs cannot restrict the search by mode, although they can fulfil the first requirement.
The Route transit API returns connections with the transit stations (by train only) and the distances as well, based on the departure and arrival coordinates, for example:
https://transit.api.here.com/v3/route.json?app_id=xxxxxxxxx&app_code=xxxxxxxx&modes=intercity_train,bus,light_rail&dep=41.9773,-87.9019&arr=41.8961,-87.6552&time=2019-06-24T07%3A30%3A00#
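The same example request issued from R with httr rather than as a raw URL; every parameter below is taken from the example URL above, with the app_id/app_code values and the coordinates as placeholders.

library(httr)

r <- GET(
  "https://transit.api.here.com/v3/route.json",
  query = list(
    app_id   = "xxxxxxxxx",
    app_code = "xxxxxxxx",
    modes    = "intercity_train,bus,light_rail",
    dep      = "41.9773,-87.9019",
    arr      = "41.8961,-87.6552",
    time     = "2019-06-24T07:30:00"
  )
)
stop_for_status(r)
connections <- content(r)  # parsed JSON with the connections and their stations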

Analytics API doesn't match web data

I understand that this is a question which has been asked elsewhere, but I haven't yet found an answer which is especially helpful.
The problem I'm having is that the data on the regular web version of analytics doesn't match the data I've pulled from the API.
From what I've read, this can sometimes be an issue with the type of query being used. Here's what I've been using:
var requiredArguments = {
  'dimensions': 'ga:medium',
  'metrics': 'ga:users, ga:sessions, ga:uniquePageviews, ga:newUsers',
  'sort': 'ga:medium',
  'start-index': '1',
  'max-results': '1000',
  'sampling-level': 'DEFAULT',
};
and then...
var results = Analytics.Data.Ga.get(
  tableId,
  startDate,
  finishDate,
  'ga:users, ga:sessions, ga:uniquePageviews, ga:newUsers',
  requiredArguments
);
Sessions across a month, for instance, can sometimes vary by over 1,000. I've tried using different sampling types; I don't think it's that, because I'm not going over 50,000 sessions in a query.
Any help on this is much appreciated.
You need to check the result returned: if the data is sampled, the response will tell you so.
"containsSampledData":false
samplingLevel
samplingLevel=DEFAULT
Optional. Use this parameter to set the sampling level (i.e. the number of sessions used to calculate the result) for a reporting query. The allowed values are consistent with the web interface and include:
•DEFAULT — Returns response with a sample size that balances speed and accuracy.
•FASTER — Returns a fast response with a smaller sample size.
•HIGHER_PRECISION — Returns a more accurate response using a large sample size, but this may result in the response being slower.
If not supplied, the DEFAULT sampling level will be used. See the Sampling section for details on how to calculate the percentage of sessions that were used for a query.
Sampling should return results that are close to, but not exactly the same as, the website. The only way to completely remove sampling from the API is to have a Premium Google Analytics account.
Also remember to consider processing latency: if you request data that is less than 48 hours old, it will also differ from the website.
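As a rough illustration of that check, the same kind of query can be made directly against the Core Reporting API v3 and inspected for sampling from R; the view ID, date range, and OAuth access token below are placeholders, and obtaining the token is outside the scope of this sketch. Note, too, that the documentation quoted above spells the parameter samplingLevel, while the snippet in the question uses 'sampling-level', which may simply be ignored.

library(httr)

r <- GET(
  "https://www.googleapis.com/analytics/v3/data/ga",
  query = list(
    ids           = "ga:XXXXXXXX",            # placeholder view (profile) ID
    `start-date`  = "2019-01-01",
    `end-date`    = "2019-01-31",
    metrics       = "ga:users,ga:sessions,ga:uniquePageviews,ga:newUsers",
    dimensions    = "ga:medium",
    samplingLevel = "HIGHER_PRECISION",        # ask for the largest sample size
    access_token  = "YOUR_OAUTH_TOKEN"         # placeholder OAuth 2.0 token
  )
)
stop_for_status(r)
res <- content(r)

# If TRUE, the figures are based on a sample, which explains differences from the
# web interface; sampleSize / sampleSpace give the share of sessions used
res$containsSampledData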

Wrong users count returned

I'm seeing inconsistencies between reported ga:sessions, ga:users, ga:pageviews from a query spanning a year through the API, and the same date range from the GA website.
I've been able to match ga:sessions & ga:pageviews exactly by requesting every month separately and summing the values, however in the case of ga:users I am still seeing wildly different figures between the numbers returned by GAPI and the GA website.
When I sum the months' figures, the total is actually larger than the year's figure, and both numbers are higher than the values reported in the GA website.
What dimension/metric could GA be using for 'Users'?
I suspect you're having an issue with the sampling level. If the request you are making returns a large enough amount of data, in this case a full year's worth, the server will return sampled results.
Sampling
Google Analytics calculates certain combinations of dimensions and
metrics on the fly. To return the data in a reasonable time, Google
Analytics may only process a sample of the data.
You can specify the sampling level to use for a request by setting the
samplingLevel parameter.
If a Core Reporting API response contains sampled data, then the
containsSampledData response field will be true. In addition, 2
properties will provide information about the sampling level for the
query: sampleSize and sampleSpace. With these 2 values you can
calculate the percentage of sessions that were used for the query. For
example, if sampleSize is 201,000 and sampleSpace is 220,000 then the
report is based on (201,000 / 220,000) * 100 = 91.36% of sessions.
When requesting from the API, the DEFAULT sampling level is used in order to increase the speed of the request. You can change that by specifying the samplingLevel to use in your request.
samplingLevel=DEFAULT
Optional. Use this parameter to set the sampling level (i.e. the number of sessions used
to calculate the result) for a reporting query. The allowed values are consistent with
the web interface and include:
•DEFAULT — Returns response with a sample size that balances speed and accuracy.
•FASTER — Returns a fast response with a smaller sample size.
•HIGHER_PRECISION — Returns a more accurate response using a large sample size, but this may result in the response being slower.
