Time taken to travel the distance (here-api)

I was trying the HERE routing API to find out whether it actually gives me reasonable travel times for distances that I already know.
Source: 12.971076,77.537375
Destination: 12.975366,77.606841
https://route.ls.hereapi.com/routing/7.2/calculateroute.json?waypoint0=12.971076,77.537375&waypoint1=12.975366,77.606841&mode=fastest%3Bcar%3Btraffic%3Aenabled&departure=now&apiKey={API-Key}
I see that the API always gives me a travel time of 33-34 minutes, on all days of the week and at all hours of the day.
Any idea on how the travel time is calculated?
In my experience, for the above coordinates, the trip takes anywhere between 50 and 80 minutes during rush hours.

We have replicated the same API call and got two different travel times by altering the value of departure. Please try the departure time formats below:
https://route.ls.hereapi.com/routing/7.2/calculateroute.json?waypoint0=12.971076,77.537375&waypoint1=12.975366,77.606841&mode=fastest%3Bcar%3Btraffic%3Aenabled&departure=2020-03-19T13:00:00&apiKey=xxxxxx
https://route.ls.hereapi.com/routing/7.2/calculateroute.json?waypoint0=12.971076,77.537375&waypoint1=12.975366,77.606841&mode=fastest%3Bcar%3Btraffic%3Aenabled&departure=2020-03-19T10:00:00&apiKey=xxxxxx
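For what it's worth, here is a minimal sketch (Python with requests) of that comparison, issuing the same request with two explicit ISO-8601 departure times; the response field names are from the 7.2 routing response as I recall them, so verify them against your own output, and the API key is a placeholder:

    import requests

    BASE = "https://route.ls.hereapi.com/routing/7.2/calculateroute.json"
    params = {
        "waypoint0": "12.971076,77.537375",
        "waypoint1": "12.975366,77.606841",
        "mode": "fastest;car;traffic:enabled",
        "apiKey": "YOUR_API_KEY",  # placeholder
    }

    for departure in ("2020-03-19T13:00:00", "2020-03-19T10:00:00"):
        resp = requests.get(BASE, params={**params, "departure": departure})
        resp.raise_for_status()
        summary = resp.json()["response"]["route"][0]["summary"]
        # trafficTime is in seconds; compare the two departures
        print(departure, "->", summary["trafficTime"] / 60, "minutes")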

Related

Get routes with historical LIVE data

I would like to retrieve historical travel times from the HERE API.
Following the API documentation for 'Calculate Route', I requested travel times for a fixed route at a fixed departure time for different days in the past, using mode=fastest;car;traffic:enabled.
The result is the same route every day, and the travel times follow a weekday pattern (i.e., the same travel time every Monday). This obviously does not reflect the actual traffic conditions on the specified day.
From the documentation, I would have expected to get specific travel times for each day in the past (up to one year).
Did I miss something or is this just not possible?
Thanks a lot for any help!
It is not possible to get the exact route for a past date. For any date other than NOW (the current time), the route is calculated from accumulated historical data, taking into account the day of the week, the time of day, any construction work, etc.

Google Analytics: How to compare real-time vs yesterday?

In the REAL-TIME / Overview page, you can see how many people are currently browsing your site. But how do you know whether this current value is good or bad? I would like to know how many people were browsing my site at the same time the day before, so I would know whether I have 5% more or fewer people.
Also, how would I know whether the site is doing better or worse than 1, 2, or 5 hours before? The REAL-TIME view shows per-minute page views for the last 30 minutes, but how do I know whether the site is trending up or down compared to a few hours before? 30 minutes is not enough.
Is there any add-on, custom modification, or free/paid service that would complement this?
You want to use the standard ("core") reporting. The dimensions that will help you are (UI name / API name):
Hour / ga:hour: A two-digit hour of the day, ranging from 00-23, in the timezone configured for the account. This value is also corrected for daylight saving time. If the timezone follows daylight saving time, there will be an apparent bump in the number of sessions during the changeover hour (e.g., between 1:00 and 2:00) on the one day per year when that hour repeats; a corresponding hour with zero sessions will occur at the opposite changeover. (Google Analytics does not track user time more precisely than hours.)
Hour of Day / ga:dateHour: Combined values of ga:date and ga:hour, formatted as YYYYMMDDHH.
Date Hour and Minute / ga:dateHourMinute: Combined values of ga:date, ga:hour, and ga:minute, formatted as YYYYMMDDHHMM.
Hour Index / ga:nthHour: The index for each hour in the specified date range. The index for the first hour of the first day (i.e., the start-date) in the date range is 0, for the next hour 1, and so on.
In the UI you can add a secondary dimension to reports or build custom reports; with the API you will need to build your requests from scratch (try the Query Explorer and the official API documentation).
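For instance, here is a hedged sketch (Python with requests) of a Core Reporting API v3 query on ga:dateHour, which lets you line up yesterday's hourly sessions against today's; the view ID and OAuth token are placeholders:

    import requests

    resp = requests.get(
        "https://www.googleapis.com/analytics/v3/data/ga",
        params={
            "ids": "ga:12345678",        # your view (profile) ID
            "start-date": "yesterday",
            "end-date": "today",
            "metrics": "ga:sessions",
            "dimensions": "ga:dateHour",
        },
        headers={"Authorization": "Bearer YOUR_OAUTH_TOKEN"},
    )
    resp.raise_for_status()
    # each row is [dimension, metric], e.g. ["2020031913", "42"]
    for date_hour, sessions in resp.json()["rows"]:
        print(date_hour, sessions)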

Best retention practice using Graphite

I have been a happy user of Graphite+Grafana for a few months now and I have been advocating it around my firm.
My approach has been to measure data of interest, collect it into 1-minute or 5-minute buckets, and send that information to Graphite. I was recently contacted by a group that processes quotes (billions a day!) whose approach has been to create a log line each time their applications process 1 million quotes. The problem is that the interval between two log lines can be highly erratic, ranging from 1 second to a few hours.
The dilemma is: should I set my retention policy to a 1-second bucket so that I can see all measurements associated with spikes, or should I use, say, a 1-minute bucket so that the number of data points to be saved and later queried is much more manageable? FYI, when I set it to 1 second, showing the data for 8 or 10 charts over a few days brought the system (or at least my browser) to a crawl because of the number of data points (mostly NULL) being pushed from Graphite to Grafana.
Here's my retention policy: 1s:10d,1m:36d,5m:180d
Alternatively, is there a way to configure Grafana+Graphite to retrieve only non-NULL data points?
What do you recommend?
You can always specify a shorter retention period for the 1-second metrics, so that when you show a longer range, Graphite sends you only the coarser level.
For example, you can specify: 1s:2d, 1m:7d, 5m:180d
This way, if you show a range reaching more than 2 days into the past, you will get 1-minute resolution (and so on), which won't bring your browser to a crawl, while you can still inspect spikes within the last 2 days.
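In whisper terms, that retention goes into storage-schemas.conf; a sketch, with a hypothetical pattern for the quote metrics:

    [quote_metrics]
    pattern = ^quotes\.
    retentions = 1s:2d,1m:7d,5m:180d

One caveat: since the 1-second series is mostly NULL, also check the matching xFilesFactor in storage-aggregation.conf, or the 1-minute rollups may come out empty.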

How Do You Deal With Time Zones in Time Series Graphs?

I imagined there would be more literature on this, but I'm having trouble finding any. I have a lot of non-algebraically-aggregatable time series data, that is, points for which no function exists that I could use to aggregate them to a higher granularity: things like unique active users, unique contributors, etc., where knowing the count for every minute of some hour does not tell me the total for the hour. Currently, I'm storing and presenting all of this data in UTC. The problem is that many of my clients find this confusing, understandably so. Because the data is non-algebraically-aggregatable, there's no way to get from UTC data for one day (midnight to midnight) to, say, PST data from midnight to midnight; recalculation would need to be done from raw data.
So:
Recalculation from raw data is prohibitively expensive for some complicated analytics graphs.
We could store all data for all time zones, but this would multiply the amount of data we store by 24.
All of that said, how do other people deal with this issue? Here's how Google Analytics does it, but this seems insufficient for my use case: I know that if I open the multiple-timezone can of worms, clients will ask for more than one. It would also take a lot of work that doesn't seem worth the effort, as just adding timezone support won't be very noticeable or a huge win. What I'm really hoping for is some clever design solution that presents the UTC data in an intuitive enough way that it's no longer confusing for people in other timezones. Has anyone dealt with similar problems and come upon a solution I'm missing?
First of all, you should recognize that there are a lot more than 24 time zones. To accurately account for how people actually use time worldwide, you should be using IANA time zones, of which there are over 500. See also Wikipedia and the timezone tag wiki.
If you are dealing with individual points (discrete timestamps), then you can certainly convert from UTC to any time zone you wish, on the fly, as you render your graph. Just keep in mind that the range of data you query will also need to be translated to that time zone.
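A minimal sketch of that, assuming Python's zoneinfo; local_day_bounds_utc is a hypothetical helper that translates the query range as well as the points:

    from datetime import date, datetime, timedelta, timezone
    from zoneinfo import ZoneInfo  # Python 3.9+

    def local_day_bounds_utc(day: date, tz_name: str):
        """UTC instants bounding one local calendar day [midnight, next midnight)."""
        zone = ZoneInfo(tz_name)
        nxt = day + timedelta(days=1)
        start = datetime(day.year, day.month, day.day, tzinfo=zone)
        end = datetime(nxt.year, nxt.month, nxt.day, tzinfo=zone)
        return start.astimezone(timezone.utc), end.astimezone(timezone.utc)

    # Query UTC-stored points with these bounds...
    lo, hi = local_day_bounds_utc(date(2020, 3, 19), "America/Los_Angeles")
    # ...then convert each point on the fly while rendering:
    point_utc = datetime(2020, 3, 19, 15, 30, tzinfo=timezone.utc)
    print(point_utc.astimezone(ZoneInfo("America/Los_Angeles")))  # 08:30 local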
But if you are talking about aggregating data by the "day" of a specific time zone, then there is no magic bullet. You will need to decide ahead of time which time zones you want to support and calculate each one separately. When you do this, recognize that it's not just the view that's changing: since the day boundaries are different for each time zone, the data for each time zone could have very different daily totals.
You should also be aware that not every day has 24 hours. If the day happens to be the date of a daylight saving time transition, it could have 23, 23.5, 24.5, or 25 hours. This could potentially affect how you draw your graph.
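To make that concrete, here is a small sketch (again assuming Python's zoneinfo) showing the variable day length around DST transitions:

    from datetime import date, datetime, timedelta
    from zoneinfo import ZoneInfo

    def hours_in_local_day(day: date, tz_name: str) -> float:
        """Length in hours of one local calendar day (23/24/25 around DST)."""
        zone = ZoneInfo(tz_name)
        nxt = day + timedelta(days=1)
        start = datetime(day.year, day.month, day.day, tzinfo=zone)
        end = datetime(nxt.year, nxt.month, nxt.day, tzinfo=zone)
        return (end - start).total_seconds() / 3600

    print(hours_in_local_day(date(2021, 3, 14), "America/Los_Angeles"))   # 23.0
    print(hours_in_local_day(date(2021, 11, 7), "America/Los_Angeles"))   # 25.0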
One approach you might consider is to be time zone ignorant in your aggregations, rather than using UTC or any specific time zone. Of course this depends heavily on the context of your data, but it is appropriate in certain circumstances. For example, on an invoice, you might care less about the specific timestamps, and more about which calendar date the invoice was assigned to. In that case, once a date is assigned, you would just aggregate on that date. Even if the company operates over multiple time zones, you wouldn't care about that in aggregate.
As far as some clever design that abstracts this from the user, I'm afraid I haven't seen much. The only two choices you really have are time-zone-adjusted aggregations (UTC or otherwise) and time-zone-ignorant aggregations for calendar-date contexts.
We had similar issues rolling up generation data in renewables. We went with three options: User / Farm / UTC.
If the user selects USER, then all the data is based on their browser's time zone, and "yesterday" means the 24 hours up to the last midnight in the user's local time.
Similarly, if it is Farm, then we take the farm's local time zone and derive the same.
UTC is the standard, similar to what you have implemented.

Matching GPS tracks to local days

I'm writing a geotagging app and running into headaches with timezones. Basically, my app has the following data:
Images with local timestamps (i.e. relative to a timezone)
GPS track files consisting of entries using UTC timestamps
My problem: I need a way to get all data that belongs to a given day, based on the timezone where the data was acquired. For the images, that is easy (I ask the user for the timezone upon import and save it in the EXIF data), but I'm not sure how to do it for the GPS tracks (there are usually multiple tracks per day, and assigning timezones to them is not easy for the user when importing data that spans several days and timezones). I can think of two possible solutions:
Use a heuristic based on the fact that the tracks are recorded at the same time and place as the images. But there can be tracks before a day's first image or after its last one that still need to be included, and I'm not sure how to reliably handle such edge cases.
Determine the timezone from the GPS coordinates - this would be an ideal solution, but is there an open source library that does this (ideally one that works offline)?
I don't think the heuristic method will work well.
Firstly, always store times as UTC plus the timezone of origin; otherwise the time is less meaningful.
After some thought, I think it would be sufficient to resolve down to the country code and from that look up the timezone.
Depending on how much detail you want, I think GeoTog may help you locate a city, and therefore a country, from a lat/long (although it will need changing to work the other way around).
If not, then Gisgraphy will work with the larger GeoNames database. You could use the web service or extract the data.
If none of these are good enough, then I think you'll need to get some GIS data, possibly boundaries from VMAP0, and process it into polygons or something searchable.
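(As a side note, here is a minimal sketch of the coordinate-to-timezone lookup, using the Python timezonefinder package, my suggestion rather than one of the tools named above; it resolves a lat/long to an IANA zone offline from bundled boundary polygons:)

    from timezonefinder import TimezoneFinder

    tf = TimezoneFinder()  # loads bundled timezone polygons; no network needed
    print(tf.timezone_at(lng=77.537375, lat=12.971076))  # "Asia/Kolkata"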
Option two: you could start by checking this site: http://www.twinsun.com/tz/tz-link.htm
Option one (less complicated, but I am not sure I accurately understand your need...):
So you have as input:
A target day defined in a known timezone TZ, starting at t0 and ending at t1 (exclusive)
Images with timestamps ti in the same timezone TZ (is this hypothesis true?)
GPS tracks with UTC timestamps tg which can span over several time zones
We also know that there is at least one GPS track for each image.
Here's something that should work:
Convert your target day into UTC. You get the values t0/UTC and t1/UTC.
Convert the images' timestamps into UTC (you get ti/UTC from the known ti/TZ).
Process an image if t0/UTC <= ti/UTC < t1/UTC, i.e., if it was taken during your target day.
Find a GPS track that includes ti/UTC (no problem, since tracks are timestamped in UTC), and then the closest timestamp within the list of points in that track. This point is the most likely position of your image.
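A minimal sketch of those steps, assuming Python's zoneinfo and tracks held as lists of (utc_datetime, lat, lon) tuples; the helper names are hypothetical:

    from datetime import date, datetime, timedelta, timezone
    from zoneinfo import ZoneInfo

    def day_bounds_utc(day: date, tz_name: str):
        """The UTC bounds [t0/UTC, t1/UTC) of a local calendar day in TZ."""
        zone = ZoneInfo(tz_name)
        nxt = day + timedelta(days=1)
        t0 = datetime(day.year, day.month, day.day, tzinfo=zone)
        t1 = datetime(nxt.year, nxt.month, nxt.day, tzinfo=zone)
        return t0.astimezone(timezone.utc), t1.astimezone(timezone.utc)

    def position_for_image(image_utc, tracks):
        """Pick the track whose span contains image_utc, then its closest point."""
        for track in tracks:
            if track[0][0] <= image_utc <= track[-1][0]:
                return min(track, key=lambda p: abs(p[0] - image_utc))
        return None

    t0, t1 = day_bounds_utc(date(2020, 3, 19), "Europe/Berlin")
    image_utc = datetime(2020, 3, 19, 9, 15, tzinfo=timezone.utc)  # ti/TZ -> UTC
    if t0 <= image_utc < t1:  # taken during the target day
        tracks = [[(datetime(2020, 3, 19, 9, 0, tzinfo=timezone.utc), 12.97, 77.54),
                   (datetime(2020, 3, 19, 9, 30, tzinfo=timezone.utc), 12.98, 77.56)]]
        print(position_for_image(image_utc, tracks))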
