Getting the time zone as a UTC offset on Windows and Linux - datetime

What is the simplest way to get the machine's time zone as a positive or negative UTC offset, preferably using some type of shell command?

For all Unix-ish operating systems, when using the GNU date command:
date +%z
For example, for Eastern European Time (my time zone):
[moocha#helium ~]$ date +%z
+0200

If what you want is the non-summer/daylight-savings offset, you'd have to do something like:
date -d 'Jan 1' +%z
(or Jul in the southern hemisphere). This works with date from GNU coreutils, anyway.
Shockingly enough, I don't see any way to get the tm_isdst flag from date.
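If you do need a DST indicator, one rough workaround (a sketch, assuming GNU date and a northern-hemisphere zone) is to compare the current offset against the standard-time offset:
# Sketch: compare the current offset with the Jan 1 (standard-time) offset;
# if they differ, DST is currently in effect.
now=$(date +%z)
std=$(date -d 'Jan 1' +%z)
if [ "$now" != "$std" ]; then
    echo "DST in effect (current $now, standard $std)"
else
    echo "standard time ($now)"
fi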

I figure that if worst comes to worst I can just send a request to an NTP server and take the difference between it and the current local time, but that seems kind of wasteful if the system already knows its offset.

Related

BlueSky Statistics - String to date [time] issues

I am trying to convert a time stored as a string to a time variable.
I use Date/Dates/Convert String to Date..., and for the format I use %H:%M:%S.
Here is the syntax from the GUI
[Convert String Variables to Date]
BSkystrptime (varNames = c('Time'),dateFormat = "%H:%M:%S",prefixOrSuffix = "prefix",prefixOrSuffixValue = "Con_",data = "Dataset2")
BSkyLoadRefreshDataframe(dframe=Dataset2,load.dataframe=TRUE)
A screenshot of the result is attached.
Compare the variables Time [string] and Con_Time [date/time]: the hours are 2 hours out [wrong!], while the minutes and seconds are correct.
What am I doing wrong here?
I believe you are running into a known issue with a prior release of BlueSky Statistics. This issue is fixed with the current stable release available on the download page.
The reason for this is that although the time is parsed correctly in the local time zone, BlueSky Statistics was then reading that local time and converting it to UTC.
You are probably 2 hours ahead of UTC, so you are seeing the time move 2 hours back. Give us a couple of days to post a patch.
You can also confirm this by writing and executing the following syntax in the syntax window
Dataset2$Con_Time
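For illustration only (plain R, not BlueSky syntax): a wall-clock time attached to a fixed UTC+2 zone and then displayed as UTC appears two hours earlier, which is the shift described above.
# Plain-R illustration of the described shift; "Etc/GMT-2" is a fixed UTC+2
# zone (note the inverted sign convention in the Etc area).
t_local <- as.POSIXct("12:30:00", format = "%H:%M:%S", tz = "Etc/GMT-2")
format(t_local, tz = "UTC")  # "... 10:30:00"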

splunk mysearchbar.timerange.val() datetime to UTC

I am using Splunk 6.2.X along with Django bindings to create a Splunk app.
To get access to the earliest/latest dates from the timerange picker, I am using the following in my JS:
mysearchbar.timerange.val()
I am getting back a map where the values are in epoch format:
Object {earliest_time: 1440122400, latest_time: 1440124200}
When I convert them with moment as follows, I get a different datetime than expected:
> moment.unix('1440122400').utc().toString()
"Fri Aug 21 2015 02:00:00 GMT+0000"
However, the time does not correspond to the values that were selected on the time range picker, i.e. 08/20/2015 22:00:00.000.
I am not sure what is causing the difference. I am sure that the time zone is not the factor, as the discrepancy does not correspond to a simple time zone add/subtract.
An explanation of this behaviour, and of how to get the Splunk epoch datetime into UTC, would be helpful.
I was able to get rid of the timezone issue by performing the following:
Setting the time zone of the Splunk engine to UTC in props.conf as follows:
TZ = GMT
Setting the CentOS server hosting Splunk to UTC
Hope this helps anyone else who stumbles upon similar issues.
Thanks.
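For reference, the epoch values returned by the picker are absolute instants; only their rendering depends on a time zone. A minimal sketch with moment (using the earliest_time value from the question above) showing the same epoch formatted in the browser's local zone and in UTC:
// Sketch: the same epoch rendered locally and as UTC.
var epoch = 1440122400;            // earliest_time from the picker
moment.unix(epoch).format();       // local time, e.g. "2015-08-20T22:00:00-04:00" in US Eastern
moment.unix(epoch).utc().format(); // "2015-08-21T02:00:00+00:00"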

Converting time zones within time series

I brought a time series into R using the parse_date_time function from the lubridate package, and I brought it in as EST.
streamflowDateTime<-parse_date_time(streamflowDateTime,"%m%d%Y %H%M",tz="EST")
However, the data observes DST: on 04-03-2005 the series jumps from 01:45 to a next time step of 03:00. I want to convert this occurrence, and all the time stamps that follow it, to EST by subtracting an hour so that the series is continuous. It would be preferable if there were an automated way to do this, where the program figures out where DST starts and moves the times back an hour itself, since DST does not take effect on the same day and time every year.
Here's a sample of the data
structure(c(1112475600, 1112476500, 1112477400, 1112478300, 1112479200,
1112480100, 1112481000, 1112481900, 1112482800, 1112483700, 1112484600,
1112485500, 1112486400, 1112487300, 1112488200, 1112489100, 1112490000,
1112490900, 1112491800, 1112492700, 1112493600, 1112494500, 1112495400,
1112496300, 1112497200, 1112498100, 1112499000, 1112499900, 1112500800,
1112501700, 1112502600, 1112503500, 1112504400, 1112505300, 1112506200,
1112507100, 1112508000, 1112508900, 1112509800, 1112510700, 1112515200,
1112516100, 1112517000, 1112517900, 1112518800, 1112519700, 1112520600,
1112521500, 1112522400, 1112523300, 1112524200, 1112525100, 1112526000,
1112526900, 1112527800, 1112528700, 1112529600, 1112530500, 1112531400,
1112532300, 1112533200, 1112534100, 1112535000, 1112535900, 1112536800,
1112537700, 1112538600, 1112539500, 1112540400, 1112541300, 1112542200,
1112543100, 1112544000, 1112544900, 1112545800, 1112546700, 1112547600,
1112548500, 1112549400, 1112550300, 1112551200, 1112552100, 1112553000,
1112553900, 1112554800, 1112555700, 1112556600, 1112557500, 1112558400,
1112559300, 1112560200, 1112561100, 1112562000, 1112562900, 1112563800,
1112564700, 1112565600, 1112566500, 1112567400, 1112568300, 1112569200
), class = c("POSIXct", "POSIXt"), tzone = "EST")
Edits:
streamflowDateTime[8840:length(streamflowDateTime)] <- streamflowDateTime[8840:length(streamflowDateTime)]-hours(1)
In the full data set, the occurrence happens at position 8840, which I found manually. I want the code to automatically find the position where the time difference between two consecutive time stamps is not 15 minutes and replace the '8840' in the code with that value; for loops are too slow.
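A vectorized sketch of that lookup (assuming the series is sorted and lubridate is loaded; the variable name is taken from the question) could be:
# Sketch: find the first pair of consecutive time stamps that are not
# 15 minutes (900 s) apart, then shift everything from that point onward
# back by one hour.
library(lubridate)
gap_at <- which(diff(as.numeric(streamflowDateTime)) != 900)[1] + 1
streamflowDateTime[gap_at:length(streamflowDateTime)] <-
  streamflowDateTime[gap_at:length(streamflowDateTime)] - hours(1)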
You can probably just supply the full IANA time zone ID America/New_York instead of the time zone abbreviation.
parse_date_time(streamflowDateTime,"%m%d%Y %H%M",tz="America/New_York")
Using America/New_York will properly account for both EST and EDT, including the correct transitions between them.
This seems to be supported, as seen in this blog post - at least on systems that provide IANA/Olson time zones, such as Linux or Mac.
According to the docs:
... R does not come with a predefined list of zone names, but relies on the user's OS to interpret time zone names. As a result, some names will be recognized on some computers but not others. Most computers, however, will recognize names in the timezone data base originally compiled by Arthur Olson. These names normally take the form "Country/City." ...
Since Windows uses its own set of time zones, you will probably not be able to use IANA/Olson identifiers. However:
The equivalent Windows time zone id would be "Eastern Standard Time". (Despite the name, this covers both EST and EDT). I am uncertain if R supports these or not.
The fully qualified POSIX time zone for the current rule would be "EST5EDT,M3.2.0,M11.1.0". This should work on all OS's - however it only represents the US Eastern Time Zone since the 2007 change.
From 1987-2006 the rule would have been "EST5EDT,M4.1.0,M10.5.0". Use the appropriate rule for the values you're working with. If you have dates that span these periods, you'll need to split them up and process them separately, or, if possible, write a function that applies the correct rule for the data.
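For example, the original lubridate call with the post-2007 POSIX rule would be (untested on Windows):
parse_date_time(streamflowDateTime, "%m%d%Y %H%M", tz = "EST5EDT,M3.2.0,M11.1.0")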
See also, the timezone tag wiki.

Converting a Go Time from UnixDate to RFC3339 Fails to Preserve TimeZone

I am converting a UnixDate formatted time string to an RFC3339 formatted time using Go's time package. This seems to be easy and works well on my local machine, but when run on a remote host, the timezone info seems to get lost.
The input time is Eastern Australian Standard Time (EST) and seems to be interpreted as UTC by time.Parse().
Code snippet:
package main

import (
    "fmt"
    "time"
)

func main() {
    t, _ := time.Parse(time.UnixDate, "Mon Jan 14 21:50:45 EST 2013")
    fmt.Println(t.Format(time.RFC3339)) // prints time as Z
    t2, _ := time.Parse(time.RFC3339, t.Format(time.RFC3339))
    fmt.Println(t2.Format(time.UnixDate)) // prints time as UTC
}
Do I need to specifically set locales or anything?
Timezone parsing in Go doesn't always work as expected, but that is in large part because time zone abbreviations are ambiguous. For example, does EST in your scenario mean Australian Eastern time (GMT+10, or GMT+11 during daylight saving) or North American Eastern Standard Time (GMT-5)?
If your local time is "Eastern Australian Standard Time", Go will assume you mean local time. That is why it worked on your local computer. But since the server is not using that as local time, there is no reason to assume you mean Sydney time. Instead, Go chooses neither of the EST timezones and creates a fake time.Location with the name "EST" but the effect of UTC. The only way to tell it was originally meant to be EST would be to call t.Location().String().
The author of the time package wrote a bug report explaining how the timezone parsing works here.
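If you know which zone the abbreviation is supposed to refer to, one workaround (a sketch, not part of the original answer; it assumes the tz database is available to LoadLocation) is to rebuild the parsed wall-clock fields in that IANA zone:
package main

import (
    "fmt"
    "time"
)

func main() {
    // Parse as before: "EST" is unknown here, so Go records a fabricated
    // zone named "EST" with a zero offset (the behaviour described above).
    t, _ := time.Parse(time.UnixDate, "Mon Jan 14 21:50:45 EST 2013")
    fmt.Println(t.Location()) // "EST"

    // Re-interpret the same wall-clock fields in the zone we actually mean.
    syd, err := time.LoadLocation("Australia/Sydney")
    if err != nil {
        panic(err)
    }
    fixed := time.Date(t.Year(), t.Month(), t.Day(), t.Hour(), t.Minute(), t.Second(), 0, syd)
    fmt.Println(fixed.Format(time.RFC3339)) // now carries the Sydney offset, e.g. +11:00
}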

Debug, Unix. Run a program with a different time from the system

I am debugging a program on Mac OS X,
and I need this program to think we are one year later than the date given by the operating system.
I cannot change the time of the operating system, because I need to run a second program concurrently with the correct time.
I could modify the code of the first program to add one year each time it gets the time from the operating system, but the code base is too big for that; I would prefer not to use this solution.
I heard once that there is a command in Unix to run a program with a fake/mocked time.
Do you know about it?
I haven't tried it, but libfaketime claims to do what you need.
Quoted from the website:
As an example, we want the "date" command to report our faked time. To do so, we could use the following command line:
user@host> date
Tue Nov 23 12:01:05 CEST 2007
user@host> LD_PRELOAD=/usr/local/lib/libfaketime.so.1 FAKETIME="-15d" date
Mon Nov 8 12:01:12 CEST 2007
Assuming the lib works as advertised, you can trick your program into thinking it was running a year ahead using:
LD_PRELOAD=/usr/local/lib/libfaketime.so.1 FAKETIME="+1y" ./your_program
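Since the question is about Mac OS X, note that LD_PRELOAD is Linux-specific; on macOS, libfaketime is typically injected through the dyld environment variables instead. A hedged sketch (the library path and file name are assumptions and depend on how libfaketime was built/installed):
# macOS sketch (assumed install path and library name; adjust for your build):
DYLD_INSERT_LIBRARIES=/usr/local/lib/libfaketime.1.dylib DYLD_FORCE_FLAT_NAMESPACE=1 FAKETIME="+1y" ./your_program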
