I am trying to convert a time stored as a string into a time variable.
I use Date/Dates/Convert String to Date; for the format I use %H:%M:%S.
Here is the syntax generated by the GUI:
[Convert String Variables to Date]
BSkystrptime(varNames = c('Time'), dateFormat = "%H:%M:%S", prefixOrSuffix = "prefix", prefixOrSuffixValue = "Con_", data = "Dataset2")
BSkyLoadRefreshDataframe(dframe = Dataset2, load.dataframe = TRUE)
A screenshot of the result is attached.
Compare the variable Time [string] to Con_Time [date/time]:
The hours are 2 hours off (wrong!), while the minutes and seconds are correct.
What am I doing wrong here?
I believe you are running into a known issue with a prior release of BlueSky Statistics. This issue is fixed with the current stable release available on the download page.
The reason for this is that although the time string was parsed correctly in the local time zone, BlueSky Statistics was then treating the result as local time and converting it to UTC.
You are probably 2 hours ahead of UTC, so you are seeing the time move 2 hours back. Give us a couple of days to post a patch.
You can also confirm this by writing and executing the following syntax in the syntax window:
Dataset2$Con_Time
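Until the patch is posted, a minimal workaround sketch in R, assuming the lubridate package is installed and that your offset is the +2 hours described above:
# Shift the converted values forward by your UTC offset (+2 hours here)
# to undo the unwanted local-to-UTC conversion
library(lubridate)
Dataset2$Con_Time <- Dataset2$Con_Time + hours(2)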
I am using Splunk 6.2.X along with Django bindings to create a Splunk app.
To get access to the earliest/latest dates from the time range picker, I am using the following in my JS:
mysearchbar.timerange.val()
I get back a map where the values are in epoch format:
Object {earliest_time: 1440122400, latest_time: 1440124200}
When I convert them with moment.js as follows, I get a different datetime than expected:
> moment.unix('1440122400').utc().toString()
"Fri Aug 21 2015 02:00:00 GMT+0000"
However, the time does not correspond to the values selected on the time range picker, i.e. 08/20/2015 22:00:00.000.
I am not sure what is causing the difference. I am fairly sure the time zone is not the factor, as the difference does not appear to match a simple time zone addition or subtraction.
An explanation of this behaviour, and of how to get the Splunk epoch datetime into UTC, would be helpful.
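For what it's worth, the 4-hour gap between 08/20 22:00 and 02:00 UTC is exactly what a UTC-4 zone (e.g. US Eastern daylight time) would produce. A quick sketch for comparing the two renderings, assuming moment.js is loaded:
// The picker returns seconds since the Unix epoch
var earliest = 1440122400;

// Rendered in UTC: Fri Aug 21 2015 02:00:00 GMT+0000
console.log(moment.unix(earliest).utc().toString());

// Rendered in the browser's local zone; in UTC-4 this prints
// Thu Aug 20 2015 22:00:00 GMT-0400, matching the picker
console.log(moment.unix(earliest).toString());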
I was able to get rid of the timezone issue by performing the following:
Setting the timezone of the Splunk engine to UTC in props.conf as follows:
TZ = GMT
Setting the CentOS server hosting Splunk to UTC as well.
Hope this helps anyone else who stumbles upon similar issues.
Thanks.
I brought a time series into R using the parse_date_time function from the lubridate library, and I brought it in as EST.
streamflowDateTime<-parse_date_time(streamflowDateTime,"%m%d%Y %H%M",tz="EST")
However, the data switches to DST on 04-03-2005: the time step at 01:45 is followed by one at 03:00. I want to convert that occurrence and all the time stamps that follow back to EST by subtracting an hour, so that the series is continuous. An automated approach would be preferred, where the program figures out where DST starts and moves the times back an hour itself, since DST does not take effect on the same day at the same time every year.
Here's a sample of the data
structure(c(1112475600, 1112476500, 1112477400, 1112478300, 1112479200,
1112480100, 1112481000, 1112481900, 1112482800, 1112483700, 1112484600,
1112485500, 1112486400, 1112487300, 1112488200, 1112489100, 1112490000,
1112490900, 1112491800, 1112492700, 1112493600, 1112494500, 1112495400,
1112496300, 1112497200, 1112498100, 1112499000, 1112499900, 1112500800,
1112501700, 1112502600, 1112503500, 1112504400, 1112505300, 1112506200,
1112507100, 1112508000, 1112508900, 1112509800, 1112510700, 1112515200,
1112516100, 1112517000, 1112517900, 1112518800, 1112519700, 1112520600,
1112521500, 1112522400, 1112523300, 1112524200, 1112525100, 1112526000,
1112526900, 1112527800, 1112528700, 1112529600, 1112530500, 1112531400,
1112532300, 1112533200, 1112534100, 1112535000, 1112535900, 1112536800,
1112537700, 1112538600, 1112539500, 1112540400, 1112541300, 1112542200,
1112543100, 1112544000, 1112544900, 1112545800, 1112546700, 1112547600,
1112548500, 1112549400, 1112550300, 1112551200, 1112552100, 1112553000,
1112553900, 1112554800, 1112555700, 1112556600, 1112557500, 1112558400,
1112559300, 1112560200, 1112561100, 1112562000, 1112562900, 1112563800,
1112564700, 1112565600, 1112566500, 1112567400, 1112568300, 1112569200
), class = c("POSIXct", "POSIXt"), tzone = "EST")
Edit:
streamflowDateTime[8840:length(streamflowDateTime)] <- streamflowDateTime[8840:length(streamflowDateTime)]-hours(1)
In the full data set, the jump happens at position 8840, which I found manually. I want the code to automatically find the position where the time difference between two consecutive time stamps is not 15 minutes, and to use that value in place of the hard-coded 8840; for loops are too slow.
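A vectorized sketch of that search, assuming the regular spacing is exactly 15 minutes (900 seconds):
library(lubridate)

# Gaps between consecutive time stamps, in seconds (no for loop)
gaps <- diff(as.numeric(streamflowDateTime))

# First element after a gap that is not 15 minutes; this recovers
# the hard-coded 8840 automatically
jump <- which(gaps != 900)[1] + 1

streamflowDateTime[jump:length(streamflowDateTime)] <-
  streamflowDateTime[jump:length(streamflowDateTime)] - hours(1)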
You can probably just supply the full IANA time zone ID America/New_York instead of the time zone abbreviation.
parse_date_time(streamflowDateTime,"%m%d%Y %H%M",tz="America/New_York")
Using America/New_York will properly account for both EST and EDT, including the correct transitions between them.
This seems to be supported, as seen in this blog post - at least on systems that provide IANA/Olson time zones, such as Linux or Mac.
According to the docs:
... R does not come with a predefined list of time zone names, but relies on the user's OS to interpret time zone names. As a result, some names will be recognized on some computers but not others. Most computers, however, will recognize names in the time zone database originally compiled by Arthur Olson. These names normally take the form "Country/City." ...
Since Windows uses its own set of time zones, you will probably not be able to use IANA/Olson identifiers. However:
The equivalent Windows time zone id would be "Eastern Standard Time". (Despite the name, this covers both EST and EDT). I am uncertain if R supports these or not.
The fully qualified POSIX time zone for the current rule would be "EST5EDT,M3.2.0,M11.1.0". This should work on all OS's - however it only represents the US Eastern Time Zone since the 2007 change.
From 1987 to 2006 the rule would have been "EST5EDT,M4.1.0,M10.5.0". Use the appropriate rule for the values you're working with. If you have dates that span both periods, you'll need to split them up and process them separately or, if possible, write a function that applies the correct rule to each date.
See also the timezone tag wiki.
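If the goal is a continuous EST clock, one possible follow-up is to parse with America/New_York as suggested above and then re-express the same instants in fixed EST. A sketch, assuming the strings parse cleanly with the IANA zone:
library(lubridate)

# Parse honouring the real EST/EDT transitions
x <- parse_date_time(streamflowDateTime, "%m%d%Y %H%M", tz = "America/New_York")

# Re-express the same instants on a fixed EST (UTC-5) clock
streamflowEST <- with_tz(x, tzone = "EST")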
I'm using the following functions:
import datetime

# The epoch used in the datetime API.
EPOCH = datetime.datetime.fromtimestamp(0)

def timedelta_to_seconds(delta):
    # Convert a timedelta to seconds; abs() discards the sign
    seconds = (delta.microseconds / 1e6) + delta.seconds + (delta.days * 86400)
    seconds = abs(seconds)
    return seconds

def datetime_to_timestamp(date, epoch=EPOCH):
    # Ensure we deal with `datetime`s.
    date = datetime.datetime.fromordinal(date.toordinal())
    epoch = datetime.datetime.fromordinal(epoch.toordinal())
    timedelta = date - epoch
    timestamp = timedelta_to_seconds(timedelta)
    return timestamp

def timestamp_to_datetime(timestamp, epoch=EPOCH):
    # Ensure we deal with a `datetime`.
    epoch = datetime.datetime.fromordinal(epoch.toordinal())
    epoch_difference = timedelta_to_seconds(epoch - EPOCH)
    adjusted_timestamp = timestamp - epoch_difference
    date = datetime.datetime.fromtimestamp(adjusted_timestamp)
    return date
And I am calling them with the following code:
twenty = datetime.datetime(2010, 4, 4)
print(twenty)
print(datetime_to_timestamp(twenty))
print(timestamp_to_datetime(datetime_to_timestamp(twenty)))
And getting the following results:
2010-04-04 00:00:00
1270339200.0
2010-04-04 01:00:00
For some reason, I'm getting an additional hour added in the last call, despite my code having, as far as I can see, no flaws.
Where is this additional hour coming from?
# Ensure we deal with `datetime`s.
date = datetime.datetime.fromordinal(date.toordinal())
(That's chopping off the time-of-day completely, as ‘ordinal’ is only a day number. Is that what you meant to do? I suspect not.)
Anyway, as Michael said, datetime.fromtimestamp gives you a naïve datetime corresponding to what local time for that POSIX (UTC) timestamp would be for you. So when you call —
date = datetime.datetime.fromtimestamp(adjusted_timestamp)
you're getting the local time for the POSIX timestamp representing 2010-04-04T00:00:00, which of course in BST is an hour ahead. This doesn't happen in the return direction because your epoch is in January, when BST is not in force. (However your EPOCH would also be completely off if you weren't in the UK.)
You should replace both your uses of datetime.fromtimestamp with datetime.utcfromtimestamp.
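A minimal sketch of how both directions might look after that substitution; it also drops the toordinal() truncation questioned above, and total_seconds() is a standard timedelta method:
import datetime

# Interpret timestamp 0 as UTC rather than local time
EPOCH = datetime.datetime.utcfromtimestamp(0)

def datetime_to_timestamp(date, epoch=EPOCH):
    # total_seconds() keeps the full delta, including time of day
    return (date - epoch).total_seconds()

def timestamp_to_datetime(timestamp):
    # utcfromtimestamp applies no local-time (or DST) offset
    return datetime.datetime.utcfromtimestamp(timestamp)

twenty = datetime.datetime(2010, 4, 4)
print(datetime_to_timestamp(twenty))                         # 1270339200.0
print(timestamp_to_datetime(datetime_to_timestamp(twenty)))  # 2010-04-04 00:00:00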
It's sad that datetime continues the awful time tradition of keeping times in local time. Calling them ‘naïve’ and taking away the DST flag just makes them even worse. Personally I can't stand to use datetime, preferring integer UTC timestamps for everything (converting to local timezones for formatting only).
Judging by your profile, you're in the UK. That means you're currently running on UTC+1 due to DST.
If I take your timestamp and run it through datetime.fromtimestamp on Python 2.6 (I know you use Python 3, but this is what I have), that shows me that it believes it refers to 2010-04-04 02:00:00 - and I'm in CEST, so that's UTC+2.
Running datetime.fromtimestamp(0), I get that the epoch is 1970-01-01 01:00:00. This then shows me that it is correctly adding only a single hour (since January 1st is outside of DST, and the epoch is midnight UTC on that date, it would be 01:00 here).
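To reproduce that yourself (the printed values are what I see in CEST/CET; yours will be shifted by your own offset):
import datetime

# Both calls interpret the POSIX timestamp in *local* time
print(datetime.datetime.fromtimestamp(1270339200))  # 2010-04-04 02:00:00 in CEST
print(datetime.datetime.fromtimestamp(0))           # 1970-01-01 01:00:00 in CET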
In other words, your problem is that you're sending in a time which has DST applied, but datetime_to_timestamp treats it as if DST didn't exist. timestamp_to_datetime, however, applies the DST.
Unfortunately, I don't know enough Python to know how you would solve this, but this should at least give you something to go on.
What is the simplest way to get the machine's time zone as a positive or negative UTC offset, preferably using some kind of shell command?
For all Unix-ish operating systems, when using the GNU date command:
date +%z
Example, for Eastern European Time (my timezone):
[moocha#helium ~]$ date +%z
+0200
If what you want is the non-summer/daylight-savings offset, you'd have to do something like:
date -d 'Jan 1' +%z
(or Jul in the southern hemisphere). This works with date from GNU coreutils, anyway.
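For example, on a machine configured for US Eastern time during summer (a hypothetical setup, for illustration), the two commands differ by the DST hour:
$ date +%z
-0400
$ date -d 'Jan 1' +%z
-0500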
Shockingly enough, I don't see any way to get the tm_isdst flag from date.
I figure that if worse comes to worst, I can just send a request to an NTP server and take the difference between it and the current local time, but that seems kind of wasteful when the system already knows its offset.