Add many routes using Google Maps Directions Service Over Query Limit - google-maps-api-3

I have an application that requires adding many (sometimes over 50) dynamic routes to the Google map. Using the DirectionsService, I am able to add around 10 routes before the requests start to fail with the status "Over Query Limit". I was wondering if anyone has encountered this situation and found a way to deal with it. Even if I pause between requests, I still fail for anything beyond the first 10. Is there a request limit for a single map instance?
Any help with this would be much appreciated!

According to the documentation, you can make at most 2 requests per second. If your code issues them more slowly than that, you should be fine.
I used setInterval(function(){ ... }) to test it, and yes, if you do it slowly enough it seems to work.
http://jsfiddle.net/bd1qdLv3/
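The idea behind the setInterval approach above is simply to space the requests out so you never exceed the documented rate. Here is a minimal, language-agnostic sketch of the same throttling idea in Python; `request_route` is a hypothetical stand-in for the real DirectionsService call, not part of any Google API:

```python
import time

MAX_PER_SECOND = 2              # rate limit from the documentation
INTERVAL = 1.0 / MAX_PER_SECOND

def request_route(origin, destination):
    # Hypothetical stand-in for the real DirectionsService request.
    return f"route:{origin}->{destination}"

def fetch_routes(pairs):
    """Issue one request every INTERVAL seconds so the limit is never exceeded."""
    results = []
    for origin, destination in pairs:
        results.append(request_route(origin, destination))
        time.sleep(INTERVAL)    # throttle: stay under MAX_PER_SECOND
    return results
```

In the browser you would do the same thing by pulling one item off a queue per setInterval tick instead of sleeping.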

Related

Maximum number of avoid areas exceeds limit using avoid areas

I am trying to get a route between two points using avoid areas with the HERE Maps Routing API, and I am getting the error "Maximum number of avoid areas exceeds limit".
Below is the request I am using:
https://route.cit.api.here.com/routing/7.2/calculateroute.json?app_id=#####&app_code=#####g&waypoint0=geo!39.4640023,-0.3850572&waypoint1=geo!39.476885,-0.3801068&mode=fastest;car;traffic:disabled&avoidareas=39.45315053433366,-0.3745426849935516;39.45244111598196,-0.3758222574575006!39.45646192309658,-0.3727107307399733;39.456087897102364,-0.3738696063317133!39.45594467628818,-0.37061955378352013;39.455610302758494,-0.37146705481229625!39.46063897809171,-0.3637087111174383;39.460208373008,-0.36463342201032306!39.46027406507121,-0.3644229889377801;39.45945896807123,-0.36512131930616654!39.45778290983732,-0.36235345142498465;39.45722411730335,-0.36284132909356276!39.458055076124936,-0.3685070306751628;39.45796969111227,-0.369566281083658!39.45960790790132,-0.36670532495457014;39.45880954421065,-0.3687782227883713!39.46786419209955,-0.3788290555558871;39.466598324440575,-0.37952348064968555!39.46629280916266,-0.37952060299424345;39.46579450682472,-0.3798614186868332!39.447906189702366,-0.3865406097869585;39.44771727050539,-0.38799155376945255!39.447906266440604,-0.3860336486039068;39.44767149909636,-0.3866130855790714!39.45518409583871,-0.3836551666444044;39.454907307568014,-0.38405749286187724!39.45964221683283,-0.38704088462136754;39.45899783260966,-0.38824034688297143!39.46042754674725,-0.3884778363064053;39.45992759234617,-0.3890550711354175!39.46052328505597,-0.38689531313812037;39.459738005168106,-0.38822226990415315!39.46193614040639,-0.389429648171608;39.46154553298938,-0.3900677999760695!39.46191503182935,-0.39018276482275266;39.46111159154079,-0.3909970310465749!39.4639823644881,-0.39148296845987174;39.46280010046198,-0.39256505432368666!39.467739617727254,-0.38146113326699044;39.46706936907706,-0.3821015492686101!39.46976679855793,-0.3846784165944088;39.46901057325898,-0.3860335643592155!39.469205752460745,-0.38845565289929074;39.46871087077631,-0.38937325235434783!39.47401789680908,-0.39616459840290014;39.4733842284781,-0.396270815957154!39.47423647867104,-0.39540645561031434;39.47401789680908,-0.39616459840
I would like to know if this limitation exists because I am using a free trial plan or because the API itself has this limit. If it is the latter, is there any way around it?
Below is the HERE guide to using avoid areas, if anyone is interested.
https://developer.here.com/rest-apis/documentation/routing/topics/example-route-avoiding-an-area.html
Thanks,
This is an API limitation. Trial credentials don't limit the functionality of the available services.
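Since the limit is per request, one way to work within it is to split a long avoid-area list across several requests and combine the results yourself. Below is a minimal Python sketch of building chunked `avoidareas` parameter values; the maximum of 20 areas is an assumption for illustration only, so check the HERE documentation for the actual value:

```python
MAX_AVOID_AREAS = 20  # assumed per-request limit; verify against the HERE docs

def format_area(top_left, bottom_right):
    """Format one bounding box the way the Routing API expects: 'lat,lon;lat,lon'."""
    return "{},{};{},{}".format(*top_left, *bottom_right)

def avoidareas_params(areas):
    """Split a long list of (top_left, bottom_right) boxes into several
    'avoidareas' values, each within the per-request limit.
    Individual areas inside one value are joined with '!'."""
    chunks = [areas[i:i + MAX_AVOID_AREAS]
              for i in range(0, len(areas), MAX_AVOID_AREAS)]
    return ["!".join(format_area(tl, br) for tl, br in chunk) for chunk in chunks]
```

You would then issue one routing request per chunk and keep the route that avoids all areas, at the cost of extra requests.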

Optimal route with timed waypoints

I want to plan a route for a delivery truck with a number of stops along the way, where some of the stops need to be timed. I have tried to figure out a way that uses the Directions service and the Distance Matrix service without making a ton of requests. If someone has a solution in theory, I can probably put the code together myself. Thanks!

Scrape all google search result for a specific name

I think this question has been answered here before, but I could not find the relevant topic. I am a newbie at web scraping. I have to develop a script that will take all the Google search results for a specific name, then grab the related data for that name; if more than one match is found, the data will be grouped by name.
All I know is that Google has some kind of restriction on scraping, and that they provide a Custom Search API. I have not used that API yet, but I am hoping to get all the result links for a query from it. However, I do not understand what the ideal process is for scraping the information from those links. Any tutorial link or suggestion is very much appreciated.
You should have provided a bit more detail about what you have been doing; it does not sound like you even tried to solve it yourself.
Anyway, if you are still on it:
You can scrape Google in two ways; one is allowed, one is not.
a) Use their API. You can get around 2k results a day.
You can raise that to around 3k a day for 2000 USD/year, and raise it further by getting in contact with them directly.
You will not be able to get accurate ranking positions with this method. If you only need a low number of requests and are mainly interested in getting some websites for a keyword, this is the choice.
Starting point would be here: https://code.google.com/apis/console/
b) You can scrape the real search results.
That's the only way to get the true ranking positions, for SEO purposes or to track website positions. It also allows you to get a large number of results, if done right.
You can Google for code, the most advanced free (PHP) code I know is at http://scraping.compunect.com
However, there are other projects and code snippets.
You can start off at 300-500 requests per day, and this can be multiplied by using multiple IPs. Look at the linked article if you want to go that route; it explains things in more detail and is quite accurate.
That said, if you choose route b) you break Google's terms, so either do not accept them or make sure you are not detected. If Google detects you, your script will be blocked by IP ban or captcha; not getting detected should be a priority.
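For route a), the starting point is simply assembling a Custom Search request per query. A minimal Python sketch of building such a request URL follows; the key and search-engine ID are hypothetical placeholders you would obtain from the Google API console, and no request is actually sent here:

```python
from urllib.parse import urlencode

# Hypothetical credentials; real values come from the Google API console.
API_KEY = "YOUR_API_KEY"
CX = "YOUR_SEARCH_ENGINE_ID"

def build_search_url(query, start=1):
    """Build a Custom Search API request URL for one page of results.
    The API returns at most 10 results per page; 'start' selects the offset."""
    params = {"key": API_KEY, "cx": CX, "q": query, "start": start}
    return "https://www.googleapis.com/customsearch/v1?" + urlencode(params)
```

You would fetch each URL, parse the JSON response's items for their links, and then crawl those links separately; the daily quota caps how many such pages you can request.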

Graphite nginx requests per second

Is there any way to get Graphite to graph requests per second?
When you retrieve the nginx request count from nginx_status, you are sending an absolute (ever-increasing) value to Graphite, so I am wondering how to turn that into a per-second rate.
My understanding is that derivative(series) would give me requests per minute (my collection interval), but I could really use requests per second.
Cheers.
I'm not sure if this is the right way to do it, but this seems to have done the trick:
scaleToSeconds(derivative(stats.*.*.*.nginx.handles),1)
Does anyone see any problems with this?
Run your nginx access logs through Logster, which will munge the data and forward several interesting metrics -- including requests per second, if you wish -- to Graphite.
https://github.com/etsy/logster
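What scaleToSeconds(derivative(series), 1) computes can be reproduced by hand: take the difference between consecutive counter samples, then divide by the collection interval. A small Python sketch of that arithmetic, assuming one sample per minute:

```python
def per_second_rate(samples, interval_seconds=60):
    """Convert an ever-growing counter (e.g. nginx handled requests) into a
    per-second rate, mirroring scaleToSeconds(derivative(series), 1)."""
    deltas = [b - a for a, b in zip(samples, samples[1:])]
    return [d / interval_seconds for d in deltas]
```

One caveat with the plain derivative approach: the delta goes negative whenever the counter resets (for example, on an nginx restart), which is why Graphite also provides nonNegativeDerivative() for counters.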

Efficiently webscraping a website without an api?

Considering that most languages have web-scraping functionality either built in or provided by libraries, this is more of a general web-scraping question.
I have a site from which I would like to pull information on about 6 different pages. This normally would not be that bad; unfortunately, the information on these pages changes roughly every ten seconds, which could mean over 2,000 queries an hour (which is simply not okay). There is no API for the website I have in mind, either. Is there any efficient way to get the information I need without flooding them with requests, or am I out of luck?
At best, the site might return an HTTP 304 Not Modified status when you make a conditional request, indicating that you need not download the page because nothing has changed. If the site is set up to do so, this can reduce bandwidth, but it still requires the same number of requests.
If there is a consistent update schedule, then at least you know when to make the requests -- but you will still have to ask (i.e. make a request) to find out what information has changed.
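The conditional-request mechanism mentioned above works by echoing back validators from the previous response: the server compares them and answers 304 with no body if nothing changed. A minimal sketch of building those headers in Python (no request is actually sent here; how much this helps depends entirely on the server supporting ETag or Last-Modified):

```python
def conditional_headers(last_response_headers):
    """Build request headers that allow the server to answer 304 Not Modified,
    by echoing back the validators from the previous response."""
    headers = {}
    if "ETag" in last_response_headers:
        headers["If-None-Match"] = last_response_headers["ETag"]
    if "Last-Modified" in last_response_headers:
        headers["If-Modified-Since"] = last_response_headers["Last-Modified"]
    return headers
```

On each poll you would send these headers and only re-parse the page when the status is 200 rather than 304.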