Can OpenTSDB align the start of the downsample interval with the query start timestamp?

I understand that the downsample interval aligns to the nearest natural calendar/clock boundary, so a 1h-sum downsample aligns the start of each interval to the top of the hour.
Is there a way to align it with the start specified by the query?
E.g. specify the start as '1d-ago' and the downsample as 1h-sum, and then get 24 aggregate data points aligned exactly to the current time. If 'now' is 2017-03-08 10:17:23, the interval boundaries would fall at 17 minutes, 23 seconds past each hour.

There are several cases where non-calendar alignment would be useful:
Sliding averages/totals whose end time is reset to the current time.
Daily aggregations in a time zone different from the server time zone.
This is how we implement aggregator alignment in Axibase Time Series Database, which also runs on HBase.
https://apps.axibase.com/chartlab/f365c04e
The SQL syntax, REST API, and graphics library all expose the align field, which accepts the following options:
CALENDAR
END_TIME
START_TIME
These options determine the start time for each period.
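To make those options concrete, here is a minimal Python sketch of the bucket-boundary arithmetic they imply for an hourly downsample. It is an illustration only, not an OpenTSDB or ATSD API call, and the function name is made up.

from datetime import datetime, timedelta

def hour_bucket_start(ts, align, query_start, query_end):
    # CALENDAR: snap to the top of the hour, the default behaviour described
    # in the question; START_TIME / END_TIME: anchor the hourly grid on the
    # query boundaries instead, as in the options listed above.
    if align == "CALENDAR":
        return ts.replace(minute=0, second=0, microsecond=0)
    anchor = query_start if align == "START_TIME" else query_end
    period = timedelta(hours=1)
    return anchor + ((ts - anchor) // period) * period

start = datetime(2017, 3, 7, 10, 17, 23)   # '1d-ago'
end   = datetime(2017, 3, 8, 10, 17, 23)   # 'now'
ts    = datetime(2017, 3, 8, 9, 50, 0)     # an arbitrary sample time

print(hour_bucket_start(ts, "CALENDAR",   start, end))   # 2017-03-08 09:00:00
print(hour_bucket_start(ts, "START_TIME", start, end))   # 2017-03-08 09:17:23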

Related

Custom Time Slots

Is it possible to set the time slots to 90 minutes and put a 15 minute "break" in between?
And also display the time axis (e.g. 08:00 - 09:30) like in the image below?
Thank you for your help!

Graphite - offset series by standard deviation of itself

I'm using Graphite and Grafana and I'm trying to plot a series against a time shifted version of itself for comparison.
(I.e. is the current value similar to this time last week?)
What I'd like to do is plot:
the 5 minute moving average of the series
a band consisting of the 5 minute moving average of the series timeshifted by 7 days, bounded above and below by the standard deviation of itself
That way I can see if the current moving average falls within a band limited by the standard deviation of the moving average from a week ago.
I have managed to produce a band based on the timeshifted moving average, but only by offsetting either side by a constant amount. I can't work out any way of offsetting by the standard deviation (or indeed by any dynamic value).
I've copied a screenshot of the sort of thing I'm trying to achieve. The yellow line is the current moving average, the green area is bounded by the historical moving average offset either side by the standard deviation.
Is this possible at all in Grafana using Graphite as the backend?
I'm not quite on the latest version, but can easily upgrade (and will do so shortly anyway).
Incidentally, I'm not a statistician, so if what I'm doing actually makes no sense mathematically, I'd love to know! ;-) My overall goal is to explore better alternatives to static thresholds for highlighting anomalous or problematic server performance metrics - e.g. CPU load, disk IOPS, etc.
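Not a Graphite answer, but to make the arithmetic of the band concrete, here is a small pandas sketch of what is being asked for. It assumes the data is a pandas Series with a DatetimeIndex; the 5-minute window and 7-day shift mirror the question.

import pandas as pd

def weekly_band(series, window="5min", shift="7D"):
    # Current 5-minute moving average, plus a band built from the same series
    # a week earlier: its moving average +/- its rolling standard deviation.
    current_ma = series.rolling(window).mean()
    last_week = series.shift(freq=shift)       # last week's values, re-labelled onto this week's axis
    band_mid = last_week.rolling(window).mean()
    band_sd = last_week.rolling(window).std()
    return current_ma, band_mid - band_sd, band_mid + band_sd

Plotting current_ma together with the lower and upper band then shows whether the current moving average stays within one standard deviation of last week's.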

How do I do waveform analysis in R or Excel?

I'm trying to get some information out of a couple of waveforms, which I currently have in the format of a CSV table with the columns: Time, Volume, Pressure, Flow. (The data is the flow/pressure/volume data obtained from a healthcare ventilator.)
I want to get excel / R / another-programme-that-I've-not-yet-thought-of to do the following with the waveform:
Break it up into individual breaths, starting when the pressure begins to rise above a baseline (shortly followed by the flow rising above 0)
For each breath, extract: the highest pressure that occurs, the pressure just before the start of the next breath, and the lowest pressure that occurs
Does anyone know how to do that, without doing it manually? (It'd mean trawling through a few hours' worth of ventilator data for the project I'm trying to do.)
I've attached a copy of the shapes of the waves I'm dealing with, to try to help make more sense (image: Pressure & Volume against time).
My intention is to work out the breath-to-breath variability in maximum, minimum, and baseline pressures.
Ultimately, I've come up with my own fix using Excel.
For each pressure value, I've taken the mean of that value and the two either side of it, and compared this to the previous mean. For my data, a change in this mean of more than 1% in either direction identifies the beginning and end of the up- and down-strokes of the pressure curve. I can then tell where in the curve I am from the absolute pressure value (the peak should never be lower than the highest baseline for the data I've collected) and, for the slopes, from the first direction of change of the mean (this smooths out occasionally inconsistent slopes).
I'm then graphing the data, along with the transition points, the calculated phase of the breath, and the breath count, so I can sense-check it visually.
Ged
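For anyone who prefers not to do this in Excel, the same rolling-mean heuristic is only a few lines of pandas. This is a rough sketch under the assumptions above: the column names come from the CSV description in the question, the file name is a placeholder, and the 1% threshold is the same heuristic, not a validated value.

import pandas as pd

df = pd.read_csv("ventilator.csv")                       # columns: Time, Volume, Pressure, Flow
smooth = df["Pressure"].rolling(3, center=True).mean()   # mean of each value and its two neighbours
rel_change = smooth.pct_change()                         # change of that mean vs the previous one

rising = rel_change > 0.01                               # up-stroke: mean rises by more than 1%
prev_rising = rising.shift(1, fill_value=False).astype(bool)
breath_start = rising & ~prev_rising                     # first sample of each up-stroke
df["breath"] = breath_start.cumsum()                     # running breath count (0 = before first breath)

# Per-breath summary: peak pressure, lowest pressure, and the pressure just
# before the start of the next breath.
summary = df.groupby("breath")["Pressure"].agg(peak="max", trough="min", end_pressure="last")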

Moving average of last 10 hours in graphite

I have to draw the average of the data points from the last 10 hours. I get a data point every 5 minutes, so essentially I have to draw the average of the last 12*10 = 120 data points.
Suppose I have "delay" as a data point; at every point it makes more sense to draw the average delay over the last 10 hours instead of plotting the current delay.
I tried the Average(), sum() and summarize() functions, but I guess they do not achieve this.
Any help on this?
Can you take advantage of the movingAverage function within graphite?
An example for a 10-hour moving average would be the below:
&target=movingAverage(datapoint.name.delay,'10hour')
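Note that movingAverage also accepts a plain number of datapoints as the window, so with one point every 5 minutes the same thing can be written as movingAverage(datapoint.name.delay,120) for the last 12*10 points.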

Calculate the date & time given the position of the sun (Azimuth & elevation) and latitude and longitude

This is related to this extremely helpful question about finding the azimuth & elevation of the sun for a given date and coordinates. I wish to find the inverse: the times & dates at which the sun will be in that position of the sky.
I am therefore wondering whether someone could help, perhaps with an existing formula or by modifying the one linked to.
My current idea was to take two ranges of a couple of degrees each, one for the azimuth (120-123 degrees) and one for the elevation (18-21 degrees), then write an algorithm that iterates through all days/times and checks whether the sun falls within both ranges at some time on each day. Looping through all those days and running the linked algorithm at each step isn't exactly going to keep the Big O small, and won't be great for performance either.
Any help or tips appreciated, please.
Thanks.
There is some useful stuff here (see the links - in particular [12]-[15])
https://en.wikipedia.org/wiki/Position_of_the_Sun
One problem: if you are using this to determine things like "on which days would the sun be directly over the 'Heel Stone' at Stonehenge in Z-thousand BC", there will be many sources of error beyond precession and/or nutation (earthquakes change the Earth's rotation period, and when the Sun is close to the horizon you will get significant refraction). There is also http://www.stargazing.net/kepler/sun.html . However, since there are many days and times when the sun is in a particular position, the method of guessing a window of date and time and then refining it iteratively with a Newton-style approximation is probably best. If you could give more information about why you are trying to find the answer (e.g. "when does the shadow of the oak tree fall on the buried treasure..."), we could be more helpful.
After some thinking you can get the date like this:
if (ang>=0.0) date = (21.March) +ang*(21.June -21.March )/(23.4 degrees);
else date = (21.September)-ang*(21.December-21.September)/(23.4 degrees);
the dates are pretty straightforward
ang is the current angle between the ecliptic plane and the Earth's equator plane
it must be measured during the day!
if you measure the Sun's height (at your latitude) at astronomical noon then:
ang = height - (90 degrees - your latitude)
to convert a height measured at any other time you need to apply some vector math
see computation of the angle between two planes
see the image for more clarity
To compute the time during the day you will need to look at
conversions between standard (UTC) time and sidereal time
It is also a good idea to look at solar clock (sundial) design, which covers all of these computations geometrically.
Do not forget that this approach does not include precession, nutation, ...
If you accounted for those, the task would become unsolvable, because the Sun's sky-dome paths cross, leading to multiple solutions for any given Sun position.
Luckily precession is very slow, so we can skip it for a few thousand years,
and nutation has a small radius (it affects accuracy only).
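As a concrete reading of the pseudocode above, here is a small Python sketch of the same linear interpolation between the equinox and solstice dates. It carries the same simplifications: no precession or nutation, and each angle actually corresponds to two dates per year, of which only one branch is returned.

from datetime import date

def date_from_angle(ang, year=2017):
    # ang: measured angle in degrees, in the range -23.4 .. +23.4
    march, june = date(year, 3, 21), date(year, 6, 21)
    september, december = date(year, 9, 21), date(year, 12, 21)
    if ang >= 0.0:
        return march + (june - march) * (ang / 23.4)
    return september - (december - september) * (ang / 23.4)

print(date_from_angle(23.4))    # 2017-06-21
print(date_from_angle(0.0))     # 2017-03-21
print(date_from_angle(-23.4))   # 2017-12-21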
