I need to test my C++ program, which uses the system time.
The program is large and it uses third-party libraries that possibly also use the system time.
I want to see how my program behaves for different dates / times.
Is it possible to change the system time only for one running process in UNIX?
Many thanks...
Well, I found the answer by myself.
In the Unix shell there is an environment variable, TZ, which holds the time zone and is used by all C/C++ time functions.
This variable can be manipulated to set the time in the current Unix shell to an arbitrary time (with one-second resolution, not milliseconds) and even to change the date.
Examples:
export TZ=A-02:10:20
(shifts the time forward from now by 2 hours, 10 minutes, and 20 seconds)
export TZ=A+02:10:20
(the same shift backwards)
If you want to change the date, you can use a large number of hours, for example:
export TZ=A-72:10:20
Unfortunately it does not let you change the date very far; in my experiments it was limited to several days back or forward, so changing the month or year this way does not work.
(Use the 'date' command to check the current date/time after setting the TZ variable.)
To cancel all changes, use
export TZ=
I guess you can use libfaketime.
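A minimal sketch of how that might look (not from the original answer; the library path and the FAKETIME offset syntax vary between distributions). libfaketime is preloaded into a single process, so only that process sees the fake clock:
faketime '2008-12-24 08:15:42' ./my_program
Or, preloading the library by hand:
LD_PRELOAD=/usr/lib/faketime/libfaketime.so.1 FAKETIME="-15d" ./my_program
Here my_program stands in for your own binary, and the .so path is an assumption about where your distribution installs it.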
I am trying to get a more responsive idea of the system run queue length, to see whether load balancing based on the one-minute load average from sysinfo() is having issues caused by the client processes perhaps running in lockstep...
I've managed to find /proc/schedstat, and it looks to be what I'm looking for, but...
I want to make sure I base my values on the actual interval between polls of /proc/schedstat rather than on an assumed interval, since the shell script itself adds processing overhead.
Now for the question: what is the unit of measurement used for the "timestamp" value at the top of the /proc/schedstat file? It's surely not nanoseconds, because the value only increases by somewhere between 258 and 260 when my script loops with a sleep 1 between iterations.
Inspecting the kernel source (kernel/sched/stats.c) shows that the timestamp field is in jiffies.
What is the unit of measurement used for the "timestamp" value at the top of the /proc/schedstat file?
It's 1/HZ of a second; with HZ=300 the unit would be 3.333 milliseconds, if I'm counting right.
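A small sketch of how you might check this empirically (a shell fragment; it assumes the relevant line in /proc/schedstat starts with the word "timestamp"):
t1=$(awk '/^timestamp/ {print $2}' /proc/schedstat)
sleep 1
t2=$(awk '/^timestamp/ {print $2}' /proc/schedstat)
echo $((t2 - t1))   # roughly HZ ticks for a one-second interval
The observed deltas of 258-260 per second in the question would be consistent with a tick unit of a few milliseconds.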
Tearing my hair out on this one. It took me hours just to get rJava up and running (because Mac OS X El Capitan was not wanting to play nice with Java) in order to load Excel-specific data importing packages, etc. But in the end this hasn't helped my problem, and I'm just about at my wits' end. Please help.
Basic situation is this:
Have simple Excel data of time durations, over a span of a couple of years. So the two columns I'm importing are the time (duration) and the year (2016, 2017, etc.).
In Excel the data is formatted as [h]:mm:ss so it displays correctly (the data relates to the number of hours worked in a month, so typically something like 80:xx:xx ~ 120:xx:xx). I'm aware that in Excel, despite the cells being formatted as above and only showing the relevant number of hours, in reality Excel has appended an (irrelevant, arbitrary) date to this hours data. I have searched and searched and found no way around this limitation in the way Excel handles dates/times/durations.
I import this data into R via the "import data -> import from excel data set" menu item in the R Commander GUI, not the console.
However, when importing the data into R, the data displays as a single number, e.g. approx. 110 hrs is converted to 4.xxxxx, not as hh:mm:ss. So when running analyses and generating plots etc., instead of the actual (meaningful) 110:xx:xx type data being displayed, a completely meaningless 4.xxxxxx is displayed.
If I change the formatting of the Excel cells to display the date as well as the time, rather than use the [h]:mm:ss cell formatting, R erroneously interprets the data as something equally useless, like 1901/02/04 05:23 am.
I have installed and loaded a variety of packages such as xlsx, XLConnect, and lubridate, but it hasn't made any difference to how R interprets the Excel data on import, from the GUI at least.
Please tell me how I can either
a) edit the raw data to a format that R will understand as a time duration (and nothing but a time duration) in hh:mm:ss format, or
b) format the current data from within R after importing, so that it displays the data in the correct way rather than a useless number or arbitrary date/time?
[Please note: I can use the console, when given the commands etc needed to be executed. But I need to find a solution that ultimately will allow the data to be imported and/or manipulated from within the GUI, not from typing a bunch of commands into the console, as the end user (not me) has zero programming ability and cannot use a console, and will only ever be using R via the GUI.]
Is your code importing the data from Excel as seconds?
library(lubridate)
duration <- lubridate::as.duration(400000)   # 400000 seconds
as.numeric(duration, "hours")
# 111.1111
as.numeric(duration, "days")
# 4.62963
seconds_to_period(400000)
# "4d 15H 6M 40S"
I'm uploading a .csv file into H2O from R using h2o.importFile. However, the date values are parsed incorrectly.
For example, with the datetime format YYYY-MM-DD hh:mm:ss (e.g. 2016-06-16 12:30:00), the result is always 1466073000000, which is incorrect.
This is an odd combination of data import artifacts:
This is an epoch time, i.e. the number of seconds since January 1, 1970.
If you use this code:
numDate <- 1466073000 #notice I removed three zeros
as.POSIXct(numDate, origin="1970-01-01")
You get the following output:
"2016-06-16 06:30:00 EDT"
So, it is in milliseconds.
Also, the time is incorrect, by 6 hours.
Chances are it is applying a Greenwich Mean Time adjustment for your system's time zone (which, if you work on a corporate system, could differ from your current time zone depending on where the actual machine is located and how your system is set up).
You have options:
Run the analysis on epoch time,
or
convert using (see also the sketch below):
as.POSIXct(1466073000000/1000, origin="1970-01-01")
or try to coerce h2o to bring the data in the way you want.
As long as this time (with your zone adjustment) IS correct, there is no reason to change it unless you need to be able to read it. I would convert the output after the analysis has run to make it human readable.
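If the 6-hour difference matters, you can also make the time zone of the conversion explicit instead of relying on the system's zone. A small sketch reusing the value from the question:
ms <- 1466073000000                                       # epoch milliseconds as imported by h2o
as.POSIXct(ms / 1000, origin = "1970-01-01", tz = "UTC")
# "2016-06-16 10:30:00 UTC"
as.POSIXct(ms / 1000, origin = "1970-01-01", tz = "America/New_York")
# "2016-06-16 06:30:00 EDT"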
I have a vector of times in R, all_symbols$Time, and I am trying to find out how to get JUST the times (or convert the times to strings without losing information). I use
strptime(all_symbol$Time[j], format="%H:%M:%S")
which for some reason assumes the date is today and returns
[1] "2013-10-18 09:34:16"
Date and time formatting in R is quite annoying. I am trying to get the time only, without adding too many packages (really any; I am on a school computer where I cannot install libraries).
Once you use strptime you will of necessity get a date-time object, and when the format string contains no date the default behavior is to assume today's date. If you don't like that, you will need to prepend a string that is the date of your choice.
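For instance, something like this, with an arbitrary placeholder date (every value then shares the same dummy date, so only the time of day differs):
strptime(paste("1970-01-01", all_symbol$Time[j]), format = "%Y-%m-%d %H:%M:%S")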
@James' suggestion is equivalent to what I was going to suggest:
format(all_symbol$Time[j], format="%H:%M:%S")
The only package I know of that has time classes (i.e., time of day with no associated date value) is package:chron. However, I find that using format as a way to output character values from POSIXt objects lends itself well to functions that require factor input.
In the decade since this was written, a package named "hms" has appeared that provides a facility for hours, minutes, and seconds.
hms: Pretty Time of Day
Implements an S3 class for storing and formatting time-of-day values, based on the 'difftime' class.
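A small sketch of what that looks like (assuming a recent hms version that provides as_hms()):
library(hms)
t <- as_hms("09:34:16")   # time of day with no date attached
t
# 09:34:16
as.numeric(t)             # seconds since midnight
# 34456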
Came across the same problem recently and found this and other posts (e.g. "R: How to handle times without dates?") inspiring. I'd like to contribute a little for whoever has similar questions.
If you only want to use base R, take advantage of as.Date(..., format = "...") to transform your date into a standard format. Then you can use substr to extract the time, e.g. substr("2013-10-01 01:23:45 UTC", 12, 16) gives you 01:23.
If you can use package lubridate, functions like mdy_hms will make life much easier. And substr works most of the time.
If you want to compare times, it should work if they are Date or POSIXt objects. If you only want the time part, maybe force it into numeric (you may need to transform it back later), e.g. as.numeric(hm("00:01")) gives 60, which means it is 60 seconds after 00:00:00, and as.numeric(hm("23:59")) gives 86340.
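A compact sketch of that numeric comparison idea (hm() comes from lubridate):
library(lubridate)
as.numeric(hm("00:01"))                              # 60
as.numeric(hm("23:59"))                              # 86340
as.numeric(hm("09:30")) < as.numeric(hm("17:00"))    # TRUE: 34200 < 61200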
Is it possible to create a custom time zone in R for handling datetime objects?
More specifically, I am interested in dealing with POSIXct objects, and would like to create a time zone that corresponds to "US/Eastern" - 17 hours. Time zones with a similar offset do not follow the same daylight saving convention as the US.
The reason for using a time zone so defined comes from FX trading, for which 5 pm EST is a reasonable 'midnight'.
When you are concerned about a specific "midnight-like" time for each day, I assume that you want to obtain a date (without a time) that switches over at that time. If that is your intention, then how about simply subtracting 17 hours (= 17*3600 seconds) from your vector of times and taking the date of the resulting POSIXct value?
That would avoid complicated time zone manipulations, which, as far as I know, are usually not handled by R itself but by the underlying C library, so they might be difficult to achieve from within R. Instead, all computations would be performed in EST, and you'd still get a different switchover time than the local midnight.
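A minimal sketch of that idea (the example timestamps are made up):
times <- as.POSIXct(c("2017-03-10 16:59:00", "2017-03-10 17:01:00"), tz = "US/Eastern")
as.Date(times - 17 * 3600, tz = "US/Eastern")   # shift back 17 hours, then take the date
# "2017-03-09" "2017-03-10"
The date now rolls over at 17:00 local time: the 16:59 observation still belongs to the previous "day", while the 17:01 observation starts a new one.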