I would like to do a project on speed limits on highways in Germany. I want to know the distance between changes of the speed limit. To do this I need to get a dataset which includes the speed limit traffic signs, or the areas where a speed limit is set, along ONE highway.
I haven't worked with HERE yet, and before I dig into the details I would like to know if HERE is the right tool for this project. Of course it would also be nice if you could tell me briefly how to do it, since I don't even know where to start in HERE :)
Thanks a lot!
I tried OpenStreetMap before, but the data is too outdated. For example, you cannot see speed limits imposed due to construction work.
I found this link in other posts: https://github.com/seaBass3/here-pde-speed-limit but it no longer seems to be valid.
This can be solved with different approaches, but one of the most feasible is the following:
Using the HERE Traffic API v7 you can get real-time traffic flow and incident information through its query parameters, response structures, and data types.
The flow endpoint returns real-time traffic flow data in JSON, including speed and jam factor for the region(s) defined in each request. It can also deliver additional data, such as the geometry of the road segments in relation to the flow.
The incidents endpoint provides aggregated information about traffic incidents in JSON, including the type and location of each incident, its status, start and end time, and other relevant data. This is useful for dynamically optimizing route calculations.
If you also need historical information from past dates, you can use HERE Probe Data, which can be compared with other datasets.
Here is one example request you can use to get the information you need:
curl -H "Authorization: Bearer $TOKEN" "https://data.traffic.hereapi.com/v7/flow?locationReferencing=shape&in=bbox:13.400,52.500,13.405,52.505"
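If it helps as a starting point, here is a rough Python sketch of the same flow request (assuming you already have an OAuth bearer token in a HERE_TOKEN environment variable; the response field names below follow the v7 flow documentation, so double-check them against the current API reference):

    # Sketch: query the HERE Traffic API v7 flow endpoint and print per-segment speeds.
    import os
    import requests

    token = os.environ["HERE_TOKEN"]  # OAuth bearer token obtained beforehand
    resp = requests.get(
        "https://data.traffic.hereapi.com/v7/flow",
        params={
            "locationReferencing": "shape",
            "in": "bbox:13.400,52.500,13.405,52.505",
        },
        headers={"Authorization": f"Bearer {token}"},
    )
    resp.raise_for_status()

    # "results", "location" and "currentFlow" follow the documented v7 response;
    # verify the exact field names against the API reference.
    for result in resp.json().get("results", []):
        location = result.get("location", {})
        flow = result.get("currentFlow", {})
        print(location.get("description"),
              "speed:", flow.get("speed"),
              "freeFlow:", flow.get("freeFlow"),
              "jamFactor:", flow.get("jamFactor"))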
I am currently using the Google Places API on a free trial. I am interested in paying for the API, but I can't find the exact cost of the two commands that I use: google_places() and google_place_details(). I have contacted the Google sales team and looked at the places and billing URL, but I have not managed to find out exactly how much it would cost to execute these two commands.
For google_places(), this is an example of a command I would execute:
google_places(search_string = "Cafeteria in Madrid, Spain", key=key)
From the places and billing URL, it seems like this counts as a Text Search, so each time the code is executed it would cost $0.032. Is this the case?
For google_place_details(), here is an example of the command I would execute:
google_place_details(place_id = "ChIJf_XA-F0U04kR1IPYSdTJ4so", key=key)
This command, as well as giving basic place details (which cost $0.017 according to the billing URL), gives information which counts as contact data (an extra $0.003) and atmosphere data (an extra $0.005). It also provides photo data ($0.007 according to the billing URL), which I am not interested in but which is automatically included in the results anyway. Does this mean that the cost of executing this command once is these four prices summed up?
I am interested in knowing exactly how much it would cost to execute the two commands I have listed.
Perhaps this helps:
First of all, you are billed monthly once you exceed the 200 Euro/Dollar credit that Google gives you for free each month (which is probably what you described as the "free trial"). After every month you get a bill showing how many requests of each type you sent to Google. Everything is written out quite clearly there, including the number and price of each "unit", so you can easily break the total down.
A second option would be your Google API Cockpit.
It tracks your requests quite precisely over different time ranges, so sending each of your commands exactly once on a given day will give you an exact total price.
The Cockpit is super handy for other things as well. If you want, you can even set limits, which is probably helpful in your case too.
Here is the link to the billing monitor as well: Billing Google API Cockpit
There is also a description of how Google charges you; look here.
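As a rough sanity check against the prices you quoted (taken from your question, not from an official price sheet), the per-call cost and the monthly bill after the free credit work out like this:

    # Cost estimate using the per-request prices quoted in the question
    # (illustrative only, not official Google figures).
    TEXT_SEARCH = 0.032                      # google_places() text search, per request
    DETAILS = 0.017 + 0.003 + 0.005 + 0.007  # basic + contact + atmosphere + photo data
    FREE_CREDIT = 200.0                      # monthly credit applied before you are billed

    def monthly_bill(text_searches, detail_requests):
        gross = text_searches * TEXT_SEARCH + detail_requests * DETAILS
        return max(0.0, gross - FREE_CREDIT)

    print(monthly_bill(5000, 5000))  # e.g. 5000 of each request per month -> 120.0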
best regards
I am using a HERE API call to request traffic incident data from a particular start time. Whenever I include the "type" key and specify "Accident" as the value, no response is returned. However, switching the value to "Construction" does provide a response.
Does anyone have information on how to make the API call return accident data specifically?
Here is the exact call I am using:
https://traffic.api.here.com/traffic/6.3/incidents.json?app_id={{app_id}}&app_code={{app_code}}&startTime=2017-01-01T00:00:00-05:00&type=Accident&bbox=52.5233,13.4035;52.5181,13.4159
There is no data returned as there are no “Accident” type incidents in that location. This can be seen when skipping the “type” parameter:
https://traffic.api.here.com/traffic/6.3/incidents.xml?app_id=APP_ID&app_code=APP_CODE&startTime=2017-01-01T00:00:00-05:00&bbox=52.5233,13.4035;52.5181,13.4159
Traffic data is dynamic: an accident can appear somewhere and disappear in a matter of minutes. We recommend using wego.here.com to locate an accident, or Bing Maps (they all use our services and traffic data).
We were able to find one in Germany right now (should be there until 13:34 German time):
https://traffic.api.here.com/traffic/6.3/incidents.xml?app_id=app_id&app_code=app_code&startTime=2017-01-01T00:00:00-05:00&prox=51.52427,11.85887,15&type=Accident
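For reference, the same kind of request can be made from Python like this (app_id/app_code are placeholders, and the response field names such as TRAFFIC_ITEMS should be verified against the incidents response schema):

    # Sketch: request "Accident" incidents around a point from the v6.3 Traffic API.
    import requests

    resp = requests.get(
        "https://traffic.api.here.com/traffic/6.3/incidents.json",
        params={
            "app_id": "APP_ID",          # placeholder credentials
            "app_code": "APP_CODE",
            "startTime": "2017-01-01T00:00:00-05:00",
            "prox": "51.52427,11.85887,15",
            "type": "Accident",
        },
    )
    resp.raise_for_status()

    # Field names per the incidents response; check the schema for your API version.
    items = resp.json().get("TRAFFIC_ITEMS", {}).get("TRAFFIC_ITEM", [])
    print(f"{len(items)} accident(s) reported in this area")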
Hope this helps! Happy Coding!
Google has an API for downloading search suggestions:
https://www.google.com/support/enterprise/static/gsa/docs/admin/70/gsa_doc_set/xml_reference/query_suggestion.html
Unfortunately, as far as I can tell, these results are specific to your location. For an analysis, I would like to be able to define the city/location that Google thinks it is making the suggestion for. Here's what happens when I scrape from Dar es Salaam, Tanzania:
http://suggestqueries.google.com/complete/search?client=firefox&q=insurance
["insurance",["insurance","insurance companies in tanzania","insurance group of tanzania","insurance principles","insurance act","insurance policy","insurance act tanzania","insurance act 2009","insurance definition","insurance industry in tanzania"]]
I understand that a VPN would partially solve this issue, but only by giving me one different location rather than lots of locations. Is there a reasonable way to replicate this sort of thing quickly and easily from, say, the 100 largest cities in the United States?
(Confirmation that results differ within the USA.)
Thanks!
Google will use your IP and your location history (if turned on) to determine your location.
To get around it, you can spoof your IP while logged out of your Google account (though I don't know whether Google will consider it an attempted hack, whatever your intentions are).
Another way is to use the Tor Browser (even though that is not its original purpose). You can configure Tor to exit from a certain country using the ExitNodes parameter in the torrc config file.
As found in the docs:
ExitNodes node,node,…
A list of identity fingerprints, country codes, and address patterns of nodes to use as exit node
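For example, to prefer exit nodes in one country (the US here, purely as a placeholder), the torrc entries would look something like this; StrictNodes tells Tor to use only the listed exits:

    ExitNodes {us}
    StrictNodes 1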
But if you want a fast way to do it, I don't think that's possible, since Google wants to know the real location of its users and has put a lot of effort into making such tricks fail.
The hl param for interface language changes the search results, but I can't tell if it's actually changing the location. For example:
http://suggestqueries.google.com/complete/search?client=chrome&q=why&hl=FR
Here's an example with 5 different values of hl:
http://jsbin.com/tusacufaza/edit?js,output
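If you want to compare the lists programmatically, a quick sketch along these lines (using the firefox client format from the question, which returns a plain JSON array) shows how the suggestions change per hl value:

    # Sketch: fetch suggestions for the same query under several hl values and compare.
    import requests

    query = "why"
    for hl in ["en", "fr", "de", "es", "sw"]:
        resp = requests.get(
            "http://suggestqueries.google.com/complete/search",
            params={"client": "firefox", "q": query, "hl": hl},
        )
        suggestions = resp.json()[1]  # response shape: ["query", [suggestions, ...]]
        print(hl, suggestions[:3])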
I think this question has been answered here before, but I could not find the desired topic. I am a newbie to web scraping. I have to develop a script that will take all the Google search results for a specific name. It will then grab the related data for that name, and if more than one match is found, the data will be grouped according to name.
All I know is that Google has some kind of restriction on scraping, and that they provide a Custom Search API. I have not used that API yet, but I am hoping to get all the result links corresponding to a query from it. However, I could not figure out what the ideal process would be for scraping the information from those links. Any tutorial link or suggestion is very much appreciated.
You should have provided a bit more about what you have been doing; it does not sound like you even tried to solve it yourself.
Anyway, if you are still on it:
You can scrape Google in two ways: one is allowed, one is not.
a) Use their API; you can get around 2k results a day.
You can raise that to around 3k a day for 2,000 USD/year, and more by getting in contact with them directly.
You will not be able to get accurate ranking positions with this method. If you only need a low number of requests and are mainly interested in getting some websites for a keyword, this is the choice.
Starting point would be here: https://code.google.com/apis/console/
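A minimal sketch of option a), assuming you have created an API key and a custom search engine ID (cx) in that console:

    # Sketch: fetch result links for a query via the Custom Search JSON API.
    import requests

    API_KEY = "YOUR_API_KEY"        # placeholder: create in the Google console
    CX = "YOUR_SEARCH_ENGINE_ID"    # placeholder: ID of your custom search engine

    resp = requests.get(
        "https://www.googleapis.com/customsearch/v1",
        params={"key": API_KEY, "cx": CX, "q": "john smith"},
    )
    resp.raise_for_status()

    for item in resp.json().get("items", []):
        print(item["title"], "->", item["link"])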
b) You can scrape the real search results
That's the only way to get the true ranking positions, for SEO purposes or to track website positions. It also allows you to get a large number of results, if done right.
You can Google for code; the most advanced free (PHP) code I know of is at http://scraping.compunect.com
However, there are other projects and code snippets.
You can start off at 300-500 requests per day, and this can be multiplied across multiple IPs. Look at the linked article if you want to go that route; it explains things in more detail and is quite accurate.
That said, if you choose route b) you break Google's terms, so either do not accept them or make sure you are not detected. If Google detects you, your script will be blocked by IP ban/captcha. Not getting detected should be a priority.
I am trying to get the distance traveled on a transit route -- particularly San Francisco MUNI, but the standards NextBus, GTFS, and Google Maps API appear to be universal. I'm comfortable using any of these APIs, I'm just not sure how to go about this problem.
The easy way: ask Google Maps (this uses the web services, but there is also the JavaScript API):
http://maps.googleapis.com/maps/api/directions/json?origin=37.7954199,-122.397&destination=37.7873299,-122.44691&sensor=false&mode=transit&departure_time=1348109609&alternatives=true
This JSON includes the distance traveled, but there are two issues:
Google does not allow you to use this data unless you're displaying a map, which I don't want to do.
I would need to ensure that the distance returned is for the correct route/line, since it can/will give multiple routing options. This is probably doable but would require more logic (a rough sketch of that filtering is below).
EDIT: using alternatives=true (or provideRouteAlternatives: true using the JavaScript API) only returns a maximum of 3 routes, which here in SF often doesn't include the route I'm looking for (other transit agencies, multiple lines on the same route, etc.). So this isn't such a great option.
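For reference, the extra logic for issue 2 would look roughly like this (a sketch only; field names are from the Directions JSON response, and the short_name check is just my guess at how to match the MUNI line I want):

    # Sketch: pick the alternative whose transit leg uses the wanted line, then read its distance.
    import requests

    resp = requests.get(
        "http://maps.googleapis.com/maps/api/directions/json",
        params={
            "origin": "37.7954199,-122.397",
            "destination": "37.7873299,-122.44691",
            "sensor": "false",
            "mode": "transit",
            "departure_time": "1348109609",
            "alternatives": "true",
        },
    )

    WANTED_LINE = "1"  # the MUNI route I'm interested in
    for route in resp.json().get("routes", []):
        for leg in route["legs"]:
            lines = [step["transit_details"]["line"].get("short_name")
                     for step in leg["steps"] if step["travel_mode"] == "TRANSIT"]
            if WANTED_LINE in lines:
                print("distance (m):", leg["distance"]["value"])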
NextBus:
example route config:
http://webservices.nextbus.com/service/publicXMLFeed?command=routeConfig&a=sf-muni&r=1
The coordinates for each stop are given, but connecting the dots on those is not the same as the route taken -- it will cut corners, etc, and I need this to be accurate. The actual route taken is given under <path>/<point>, but I don't see any obvious correlation between stop and path coordinates. Plus, NextBus says in their documentation (p.10 near the bottom) that you should NOT connect points between <path> segments, they're only meant for drawing on a map and can overlap.
GTFS:
The GTFS data also separates stop and "shape" coordinates (like NextBus paths). Unfortunately, the coordinates for the same stops differ slightly between NextBus and GTFS (rounding), though the stop IDs/tags are the same. Also, the data files run to megabytes, and I need to use this in a mobile app. I suppose I could put all the data in a database and query that, but that still leaves figuring out how to correlate the stops with the shape. The "shape_dist_traveled" column in the shapes.txt file is especially promising; MUNI chooses to leave the optional "shape_dist_traveled" field out of stop_times.txt, though.
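For what it's worth, the calculation I have in mind for the GTFS side is simply summing great-circle distances between consecutive shape points from shapes.txt, roughly like this (column names per the GTFS spec):

    # Sketch: approximate each GTFS shape's length by summing haversine distances
    # between consecutive points in shapes.txt.
    import csv
    import math
    from collections import defaultdict

    def haversine_m(lat1, lon1, lat2, lon2):
        r = 6371000.0  # mean Earth radius in metres
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp = math.radians(lat2 - lat1)
        dl = math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    shapes = defaultdict(list)
    with open("shapes.txt", newline="") as f:
        for row in csv.DictReader(f):
            shapes[row["shape_id"]].append(
                (int(row["shape_pt_sequence"]),
                 float(row["shape_pt_lat"]),
                 float(row["shape_pt_lon"]))
            )

    for shape_id, pts in shapes.items():
        pts.sort()  # order by shape_pt_sequence
        total = sum(haversine_m(a[1], a[2], b[1], b[2]) for a, b in zip(pts, pts[1:]))
        print(shape_id, round(total), "m")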
Any advice would be appreciated, I understand this seems like an epic task to get a simple value. Maybe I'll just throw a map in to legitimately use the distance :)
Instead of using Google Maps, I would look into the unencumbered licensing of OpenStreetMap. There are multiple routing engines that can use OSM data. Personally, I would use routing in PostGIS or SQLite, but depending on your skill set you might choose another.
You've clearly done your research (+1), and as you said, the easy way is to ask Google. If it is worth it for you, you might want to look into purchasing a business licence for the Google Maps API and negotiating with them about the requirement of displaying a map. That's the only legal way I can think of with the Google API. Alternatively, you can try building your own routing engine with data from the TIGER data set, which is freely available from the US Census Bureau, but again, as you said, it may seem like an epic task. :-)