Adding hours across the DST transition in moment.js

I'm trying to create a graph, using d3, that has a vertical grid line (coming off the X axis) at midnight, 6am, midday and 6pm. I'm doing this by manually calculating the tick values in moment.js. I calculate the start of the day easily with
firstPoint = momenttz(startDate).startOf('day')
And then add 6 hours to it in a loop.
for (let i = 0; i <= days * 24; i += 6) {
  hourTickValues.push(firstPoint.clone().add(i, 'hours'));
}
When crossing into or out of daylight saving time I want to keep the lines at the local times of 6am, midday and 6pm. This means that twice a year the grid spacing will be different, as there are only 5 hours between 12am and 6am going into DST and 7 hours when coming out.
Moment.js's docs clearly state that adding hours to a time does not adjust for daylight saving, but adding days does. I've also discovered that adding fractional days does not work; it always rounds up to an integer.
So, how can I reliably create a (moment.js) date object for every 6th hour according to the DST-adjusting clock?

I solved this by manually setting each time object's hour after creation.
for (let i = 0; i <= days * 24; i += 6) {
  hourTickValues.push(firstPoint.clone().add(i, 'hours'));
  hourTickValues[hourTickValues.length - 1].hours(i % 24);
}
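An alternative sketch that avoids the post-hoc correction, assuming momenttz above is moment-timezone: since (as noted in the question) adding days is DST-aware, you can add whole days and then set the local hour, so each tick lands on 00:00, 06:00, 12:00 and 18:00 local time.
// Sketch only: add DST-aware days, then pin the local hour.
const hourTickValues = [];
for (let d = 0; d <= days; d++) {
  for (const h of [0, 6, 12, 18]) {
    hourTickValues.push(firstPoint.clone().add(d, 'days').hours(h));
  }
}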

Related

How to calculate the stabilization moment in a time-series (after one pronounced peak)?

I want to know when a time series stabilizes after its peak.
This time series shows a peak, and then it goes down (see images below (1); (2)).
I would like to calculate the moment at which this time-series stabilizes (becomes flatter) after its peak.
In an ideal world, the data should go down to 0 and stay there. But as you see, it does not reach 0, nor stay 100% stable.
I thought of various different ways/possibilities:
-Calculate a tangent point to see the slope change. But the data has many small ups & downs even after smoothing it.
-Calculate the average at the tail (end of the series; e.g. if time = 8000, take the mean of the last 2000 values), then build an interval (a margin +- this value), then find the time at which the first value falls inside this interval (see the sketch after this question).
-Calculate pronounced changes in the trend or slope.
*Maybe you have a better idea I did not consider. Feel free to share it if you've already dealt with this in the past.
I need to know the time at which this stabilization happened.
Ideally, you could mute/ignore all values before the peak value, but without deleting these rows (time should stay). Calculating the peak is easy (max value).
I also standardized the data so it starts at y=0 (I have various time series; I make them all start at y=0 to compare them later).
I do not know how to provide the data because it is about 8k values.
I would really appreciate your help.
Thank you very much.
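A minimal sketch of the tail-average idea from the list above, assuming the series is a plain array values sampled at a fixed rate; values, tailLength and marginFraction are hypothetical names, not from the original post.
// Sketch only: return the first post-peak index whose value stays within a
// margin of the tail mean; -1 if it never enters that band.
function stabilizationIndex(values, tailLength, marginFraction) {
  const peakIndex = values.indexOf(Math.max(...values));
  const tail = values.slice(values.length - tailLength);
  const tailMean = tail.reduce((a, b) => a + b, 0) / tail.length;
  const margin = Math.abs(tailMean) * marginFraction; // relative margin; an absolute margin works too
  for (let i = peakIndex; i < values.length; i++) {
    if (Math.abs(values[i] - tailMean) <= margin) return i;
  }
  return -1;
}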

Custom Time Slots

Is it possible to set the time slots to 90 minutes and put a 15 minute "break" in between?
And also display the time axis (e.g. 08:00 - 09:30) like in the image below?
Thank you for your help!

Ball is bouncing unrealistically fast

I am trying to create a simple bouncy ball simulator, and I want the balls to move at approximately the same speed as they would in real life. However, currently, after being dropped from 5 feet, the ball hits the ground almost instantly and bounces a few times before stopping in less than a second. I know I could just experiment with my gravity value until it's semi-realistic, but I'm confused as to why my current gravity value doesn't work. Here's how I got the current one:
Gravity in real life = 9.8 m/sec^2
= 32.152 ft/sec^2
= 1.072 ft per 1/30th of a sec^2 (my frame rate is set to 30 in my program)
= 102.887 pixels per 1/30th of a sec^2 (a foot is 96 pixels in my program)
Here's my code for moving the ball (using Processing 3.2.1):
void move() {
  dy+=102.88704; //gravity
  x+=dx;
  y+=dy;
  z+=dz;
  if(y+size*8>480) {
    dy*=-0.85;
  }
  y=constrain(y,-100000.0,480-(size*8));
}
Currently, x and z just stay at 0. Since it's being dropped from 5 feet, it hits the ground when it gets to 480-size*8 (size is in inches). The 0.85 value is temporary and I might tweak it later, but it shouldn't have any impact on this issue. Any and all help is greatly appreciated. Thanks!
Your mistake is in converting your measurements from seconds to 1/30 seconds. Note that the unit is ft/sec^2: that is seconds squared. So to convert the time unit from seconds to 1/30 seconds you must also square the 1/30. Therefore
32.152 ft/sec^2
= 32.152/30^2 ft/(1/30 sec)^2
= 0.035724 ft/(1/30 sec)^2
= 0.035724 * 96 pixels/(1/30 sec)^2
= 3.4295 pixels/(1/30 sec)^2
So try replacing the number 102.88704 with 3.4295 and see if that fixes the problem.
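For reference, the same conversion written out as a small script (the constants follow the numbers in the question: 30 fps and 96 pixels per foot):
// The time unit must be squared when converting the acceleration to per-frame units.
const G_FT_PER_S2 = 32.152;   // gravity in ft/s^2
const FPS = 30;               // frames per second
const PX_PER_FT = 96;         // pixels per foot
const gPerFrame = (G_FT_PER_S2 / (FPS * FPS)) * PX_PER_FT;
console.log(gPerFrame);       // ~3.4295 pixels per frame^2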

Can OpenTSDB align the start of the downsample interval with the query start timestamp?

I understand that the downsample interval aligns to the nearest natural calendar/clock boundary. So 1h-sum will align the start of each downsample bucket to the top of the hour.
Is there a way to align it with the start specified by the query?
E.g. specify start as '1d-ago' and downsample as 1h-sum, and then get 24 aggregate data points aligned exactly to the query time. If 'now' is 2017-03-08 10:17:23, then the interval boundaries would fall at 17 minutes, 23 seconds past each hour.
There are several cases where non-calendar alignment would be useful:
Sliding averages/totals where the end time is reset to the current time.
Daily aggregations in a time zone different from the server time zone.
This is how we implement aggregator alignment in Axibase Time Series Database which also runs on HBase.
https://apps.axibase.com/chartlab/f365c04e
The SQL syntax, REST API, and graphics library all expose the align field, which accepts the following options:
CALENDAR
END_TIME
START_TIME
These options determine the start time for each period.

Moving average of last 10 hours in graphite

I have to draw the average of the data points from the last 10 hours. I get a data point every 5 minutes, so essentially I have to draw the average of the last 12*10 data points.
Suppose I have "delay" as a data point; at every point it makes more sense to draw the average delay over the last 10 hours instead of plotting the current delay.
I tried the Average(), sum() and summarize() functions, but I guess they do not achieve this functionality.
Any help on this?
Can you take advantage of the movingAverage function within Graphite?
An example for a 10-hour moving average would be the one below.
&target=movingAverage(datapoint.name.delay,'10hour')
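As a usage example, that target can be dropped straight into a render request (the host and metric path here are placeholders):
http://graphite.example.com/render?target=movingAverage(datapoint.name.delay,'10hour')&from=-1d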
