ASP - How to calculate total hours and minutes within a time range - asp.net

I need to prepare a yearly management report showing the total overtime (OT) work hours and minutes of all staff, broken down by time range and month.
The time format in the MS SQL 2000 database is as follows:
Each record contains the FROM date & time and the TO date & time.
The report layout is as follows:
I have no idea how to split and total the hours & minutes within each time range, as each OT record can overlap several time ranges and dates.
Please help. Thanks.
Joe

The SQL DateDiff function can be used to compute the number of minutes, for example:
declare @fromDT datetime
declare @toDT datetime
set @fromDT = '10/22/2011 18:30'
set @toDT = '10/22/2011 22:45'
select @fromDT, @toDT, DATEDIFF(mi, @fromDT, @toDT),
       ltrim(str(DATEDIFF(mi, @fromDT, @toDT) / 60)) + ':' +
       ltrim(str(DATEDIFF(mi, @fromDT, @toDT) % 60)) as HoursMin
Returns
StartTime End Time Mins HoursMin
2011-10-22 18:30:00.000 2011-10-22 22:45:00.000 255 4:15
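To roll the individual OT records up into monthly totals for the report, something along these lines could work. This is only a sketch: the table name OT_Record and the columns FromDateTime / ToDateTime are hypothetical, and it does not split a record that spans a month or time-range boundary, which you would still need to handle separately.
select year(FromDateTime) as OTYear,
       month(FromDateTime) as OTMonth,
       sum(DATEDIFF(mi, FromDateTime, ToDateTime)) / 60 as TotalHours,
       sum(DATEDIFF(mi, FromDateTime, ToDateTime)) % 60 as TotalMinutes
from OT_Record                    -- hypothetical table holding one row per OT booking
group by year(FromDateTime), month(FromDateTime)
order by OTYear, OTMonth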

Related

Defining a business day where hours are not the same as standard days

While working on a sales report for an entertainment company (bars and nightclubs), I normally just sum sales to get the daily total. But I was told that their business day starts at 6 am each day and closes at 5:59:59 am the next day; basically, sales reported for Monday are the sales from 6 am Sunday through 5:59:59 am Monday.
The company operates throughout the US, so we have multiple time zones as well.
the table has the following columns:
Transaction id, location, Transaction_datetimeLocal, TransactionDateTimeUTC, Transaction amount
How do I define / filter the calculation to be from 6 am one day to 5:59:59 am the next day using Power BI / DAX?
TIA
In Power BI you have your table with the local time. You need to add a calculated column with the following DAX formula:
Business Time = 'Table'[Local Time] - TIME(6, 0, 0)
From this new column you can then create your business date with:
Business Date = 'Table'[Business Time].[Date]

Teradata Conversion of difference between dates in hours

I need to calculate the difference between two dates, i.e. from the creation date until today's date, in hours, in Teradata. I tried:
((Creation_date - Current_Date) HOUR) As Open_Hour
If you just want hours:
select current_timestamp - <your timestamp> hour
If you want to get more granular, you can use hour to minute or hour to second.
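As a sketch (the table ticket_log, the column ticket_id, and the column creation_date here are hypothetical, and creation_date is assumed to be castable to TIMESTAMP), the whole-hours version could look like this; note that HOUR(4) overflows once the interval exceeds 9999 hours:
SELECT ticket_id,
       (CURRENT_TIMESTAMP - CAST(creation_date AS TIMESTAMP(0))) HOUR(4) AS open_hours
       -- use HOUR(4) TO MINUTE or HOUR(4) TO SECOND for more granularity
FROM ticket_log;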

Get number of milliseconds for a localised date, taking into account daylight savings

I have data in Google BigQuery that looks like this:
sample_date_time_UTC time_zone milliseconds_between_samples
-------- --------- ----------------------------
2019-03-31 01:06:03 UTC Europe/Paris 60000
2019-03-31 01:16:03 UTC Europe/Paris 60000
...
Data samples are expected at regular intervals, indicated by the value of the milliseconds_between_samples field.
The time_zone is a string that represents a Google Cloud supported timezone value.
I'm then checking the ratio of the actual number of samples compared to the expected number over any particular day, for any single day range (expressed as a local date, for the given time_zone):
with data as
(
  select
    -- convert sample_date_time_UTC to the equivalent local datetime for the timezone
    DATETIME(sample_date_time_UTC, time_zone) as localised_sample_date_time,
    milliseconds_between_samples
  from `mytable`
  where sample_date_time_UTC between '2019-03-31 00:00:00.000000+01:00' and '2019-04-01 00:00:00.000000+02:00'
)
select
  date(localised_sample_date_time) as localised_date,
  count(*) / (86400000 / avg(milliseconds_between_samples)) as ratio_of_daily_sample_count_to_expected
from data
group by localised_date
order by localised_date
The problem is that this has a bug, as I've hardcoded the expected number of milliseconds in a day to 86400000. This is incorrect, as when daylight saving begins in the specified time_zone (Europe/Paris), a day is 1hr shorter. When daylight saving ends, the day is 1hr longer.
So, the query above is incorrect. It queries data for 31st March of this year in the Europe/Paris timezone (which is when daylight saving started in that timezone). The milliseconds in that day should be 82800000.
Within the query, how can I get the correct number of milliseconds for the specified localised_date?
Update:
I tried doing this to see what it returns:
select DATETIME_DIFF(DATETIME('2019-04-01 00:00:00.000000+02:00', 'Europe/Paris'), DATETIME('2019-03-31 00:00:00.000000+01:00', 'Europe/Paris'), MILLISECOND)
That didn't work - I get 86400000
You can get the difference in milliseconds for the two timestamps by removing the +01:00 and +02:00. Note that this gives the difference between the timestamps in UTC: 90000000, which is not the same as the actual milliseconds that passed.
You can do something like this to get the milliseconds for one day:
select 86400000 + (86400000 - DATETIME_DIFF(DATETIME('2019-04-01 00:00:00.000000', 'Europe/Paris'), DATETIME('2019-03-31 00:00:00.000000', 'Europe/Paris'), MILLISECOND))
Thanks @Juta for the hint on using UTC times for the calculation. As I'm grouping my data for each day by a localised date, I figured out that I can work out the milliseconds for each day by getting the beginning and end datetimes (in UTC) for my 'localised' date, using the following logic:
-- get UTC start datetime for localised date
datetime(timestamp(localised_date, time_zone)) as utc_start_datetime,
-- get UTC end datetime for localised date
datetime(timestamp(date_add(localised_date, interval 1 day), time_zone)) as utc_end_datetime
-- this then gives the milliseconds for that localised date:
datetime_diff(utc_end_datetime, utc_start_datetime, MILLISECOND);
So, my full query becomes:
with daily_sample_count as (
  with data as
  (
    select
      -- get the date in the local timezone, for sample_date_time_UTC
      DATE(sample_date_time_UTC, time_zone) as localised_date,
      time_zone,
      milliseconds_between_samples
    from `mytable`
    where sample_date_time_UTC between '2019-03-31 00:00:00.000000+01:00' and '2019-04-01 00:00:00.000000+02:00'
  )
  select
    localised_date,
    count(*) as daily_record_count,
    avg(milliseconds_between_samples) as daily_avg_millis_between_samples,
    datetime(timestamp(localised_date, time_zone)) as utc_start_datetime,
    datetime(timestamp(date_add(localised_date, interval 1 day), time_zone)) as utc_end_datetime
  from data
  group by localised_date, time_zone
)
select
  localised_date,
  -- ratio_of_daily_sample_count_to_expected, based on the actual vs expected number of samples for the day;
  -- the number of milliseconds in the day changes when transitioning in/out of daylight saving,
  -- so the milliseconds in the day are calculated from the UTC day boundaries
  daily_record_count / (datetime_diff(utc_end_datetime, utc_start_datetime, MILLISECOND) / daily_avg_millis_between_samples) as ratio_of_daily_sample_count_to_expected
from daily_sample_count
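As a standalone sanity check of the day-length calculation (the date and timezone below are just the values from the question), converting the two local-day boundaries to timestamps and diffing them returns the shortened day:
select timestamp_diff(
  timestamp(date_add(date '2019-03-31', interval 1 day), 'Europe/Paris'),  -- start of the next local day, as UTC
  timestamp(date '2019-03-31', 'Europe/Paris'),                            -- start of the local day, as UTC
  millisecond) as millis_in_local_day
-- returns 82800000 (23 hours), since Europe/Paris skipped an hour on 2019-03-31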

Fetch data every 2 minutes in SQLite

Please, can anyone help me with fetching data from my SQLite table at every 2 minutes between my start time and stop time?
I have two columns, Data and TimeStamp. I am filtering between two timestamps and that works fine, but what I am trying to do is return my data at every 2-minute interval. For example, if my start time is 2016-12-15 10:00:00 and my stop time is 2016-12-15 10:10:00, the result should be 2016-12-15 10:00:00, 2016-12-15 10:02:00, 2016-12-15 10:04:00 ....
Add, to your where clause, an expression that looks for 2 minute boundaries:
strftime("%s", TimeStamp) % 120 = 0
This assumes you have data on exact, 2-minute boundaries. It will ignore data between those points.
strftime("%s", TimeStamp) converts your time stamp string into a single number representing the number of seconds since Jan 1st, 1970. The % 120 does modulo arithmetic resulting in 0 every 120 seconds. If you want minute boundaries, use 60. If you want hourly, use 3600.
What's more interesting -- and I've used this -- is to take all the data between boundaries and average them together:
SELECT CAST(strftime('%s', TimeStamp) / 120 AS INTEGER) * 120 AS stamp, AVG(Data)
FROM MyTable   -- your table name here
WHERE TimeStamp >= '2016-12-15 10:00:00' AND
      TimeStamp < '2016-12-15 10:10:00'
GROUP BY stamp;
This averages all data with timestamps in the same 2-minute "bin". The second date comparison is < rather than <= because with <= the last bin would average only the single sample at the stop time, whereas the other bins average multiple values. You could also add MAX(Data) and MIN(Data) columns if you want to know how much the data changed within each bin.
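If you also want the bin boundary back as a readable timestamp rather than an epoch number, a sketch like this could work (the table name readings is hypothetical):
SELECT datetime(CAST(strftime('%s', TimeStamp) / 120 AS INTEGER) * 120, 'unixepoch') AS bin_start,
       AVG(Data) AS avg_data,
       MIN(Data) AS min_data,
       MAX(Data) AS max_data
FROM readings                              -- hypothetical table name
WHERE TimeStamp >= '2016-12-15 10:00:00'
  AND TimeStamp <  '2016-12-15 10:10:00'
GROUP BY bin_start
ORDER BY bin_start;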

Get total time and create an average based on timestamps

Background: I want to use ColdFusion to find the total time a process takes by comparing two timestamps, and then add up all of the totals to create an average.
Question: What is the best way to take two timestamps and find the difference in time, in minutes?
Example:
Time Stamp #1: 2015-05-08 15:44:00.000
Time Stamp #2: 2015-05-11 08:52:00.000
So the time between the above timestamps would be:
2 days 17 hours 8 mins = 3,908 minutes
I want to run this conversion on a handful of timestamps, then take the total minutes and divide by the count to get an average.
To add more information to my question: 1. Yes, the values are coming from an MS SQL database. 2. I am actually going to be using the individual time differences and showing an overall average. So in my loop each line will have a value like 3,908 (converted to minutes, hours, or days), and at the end of the loop I want to show an average of all the lines shown on the page. Let me know if I need to add any other information.
Assuming your query is sorted properly, something like this should work.
totalMinutes = 0;
for (i = 2; i <= yourQuery.recordcount; i++) {
    // minutes between this record's timestamp and the previous record's
    totalMinutes += DateDiff('n'
                             , yourQuery.timestampField[i-1]
                             , yourQuery.timestampField[i]);
}
avgMinutes = totalMinutes / (yourQuery.recordcount - 1);
Use the dateDiff() function
diffInMinutes = dateDiff('n', date1, date2);
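Since the timestamps are coming from MS SQL anyway, the per-record difference and the overall average could also be computed database-side and returned ready-made to ColdFusion. A rough sketch, assuming a hypothetical table ProcessLog with StartTime and EndTime columns for each process run:
-- per-row duration in minutes
SELECT ProcessId,
       DATEDIFF(minute, StartTime, EndTime) AS DurationMinutes
FROM ProcessLog;

-- overall average across all rows (multiplying by 1.0 keeps the fractional part)
SELECT AVG(1.0 * DATEDIFF(minute, StartTime, EndTime)) AS AvgDurationMinutes
FROM ProcessLog;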
