I'm looking for a way to convert decimal hours to HH:MM:SS
For instance, as input:
4.927778 hours
Desired output:
04:55:40
You can try something like the following:
dh <- 4.927778
strftime(as.POSIXct(dh * 60 * 60, origin = Sys.Date(), tz = "GMT"), format = "%H:%M:%S")
## [1] "04:55:40"
This should give you an idea of what you need to do:
a <- "4.927778 hours"
a <- as.numeric(gsub(x = a, pattern = " hours", replacement = ""))
h <- a %/% 1
m <- ((a %% 1) * 60) %/% 1
s <- round((((a %% 1) * 60) %% 1) * 60, 0)
paste(h, m, s, sep = ":")
#[1] "4:55:40"
An alternative solution is to convert this to a date/time class and then format it in an appropriate way.
format(ISOdatetime(1900, 1, 1, 0, 0, 0, tz = "GMT") +
       as.difftime(4.927778, units = "hours"), "%H:%M:%S")
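For the example input this is expected to print "04:55:40" (4.927778 hours is 4 h 55 min 40.0008 s, and %H:%M:%S drops the fractional second).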
You can use sprintf() to format the output as you wish when you have the number of hours, minutes and seconds as integers. These can be calculated using modulo (%%) and floor()/round(). The number of hours can be extracted from a string very nicely using the parse_number() function from the readr package:
library(readr)
input <- "4.927778 hours"
hrs <- parse_number(input)
hours <- floor(hrs)
minutes <- floor((hrs %% 1) * 60)
seconds <- round((((hrs %% 1) * 60) %% 1) * 60)
sprintf("%02d:%02d:%02d", hours, minutes, seconds)
The advantage of this strategy is that it also works for time differences larger than 24 hours, in contrast to the solutions using strftime().
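To illustrate the larger-than-24-hours point concretely (a small editorial sketch, not part of the original answer):
hrs <- 26.5
sprintf("%02d:%02d:%02d", floor(hrs), floor((hrs %% 1) * 60), round((((hrs %% 1) * 60) %% 1) * 60))
## expected: "26:30:00"
strftime(as.POSIXct(hrs * 3600, origin = Sys.Date(), tz = "GMT"), format = "%H:%M:%S")
## expected: "02:30:00" (the strftime() route wraps around at midnight)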
This should work with negative values as well.
convertHours <- function(hours) {
  timeoffset <- as.numeric(as.POSIXct(format(Sys.time(), tz = "GMT")) - as.POSIXct(format(Sys.time(), tz = "")))
  hoursMinutes <- ifelse(hours < 0,
    paste0("-", strftime(as.POSIXct((abs(hours) + timeoffset) * 60 * 60, origin = Sys.Date(), tz = ""), format = "%H:%M")),
    strftime(as.POSIXct((hours + timeoffset) * 60 * 60, origin = Sys.Date(), tz = ""), format = "%H:%M"))
  hoursMinutes
}
h <- 1.33
hhmm <- convertHours(h)
hhmm
## [1] "01:19"
h <- -1.33
hhmm <- convertHours(h)
hhmm
## [1] "-01:19"
If you are using .net/C# you can use a little bit of date math.
var when = DateTime.UtcNow; // or any time of your choice
var later = when.AddHours(4.927778);
var span = later - when;
Console.WriteLine(span);
I see the R flag now. Perhaps this will give you a hint where to look for something similar. I don't know R.
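For an R counterpart of this "duration" idea, a difftime works; the hms package prints one directly as HH:MM:SS (an editorial aside; hms is not used in any of the answers above):
library(hms)
x <- hms::hms(hours = 4.927778)   # stored as seconds, printed as HH:MM:SS
x                                 ## expected: 04:55:40.0008
round_hms(x, secs = 1)            ## expected: 04:55:40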
I want to create a sequence of times 00:00 - 12:00 and 12:00 to 00:00 with 10 minutes step.
How can I do this in R?
I have tried with:
library(chron)
x <- merge(0:23, seq(0, 50, by = 10))
chron(time = paste(x$x, ':', x$y), format = c(times = "h:m"))
But I have 2 problems:
I get an error running chron(time = paste(x$x, ':', x$y), format = c(times = "h:m")):
Error in convert.times(times., fmt) : format h:m may be incorrect
How can I turn it into standard 12-hour time with AM/PM? Should I merge it twice:
t <- merge(0:12, seq(0, 50, by = 10))
t_am <- merge(t, "AM")
t_pm <- merge(t, "PM")
Or maybe another way using POSIXt?
We can use seq:
format(seq(as.POSIXct('00:00', format = "%H:%M", tz = "UTC"),
as.POSIXct(Sys.Date() + 1), by = '10 mins'), "%I:%M%p")
#[1] "12:00AM" "12:10AM" "12:20AM" "12:30AM" "12:40AM" "12:50AM" "01:00AM ...
#[141] "11:20PM" "11:30PM" "11:40PM" "11:50PM" "12:00AM"
Make sure you have the correct locale, or set it via:
Sys.setlocale("LC_TIME", "en_US.UTF-8")
I have a netcdf file with a timeseries and the time variable has the following typical metadata:
double time(time) ;
time:standard_name = "time" ;
time:bounds = "time_bnds" ;
time:units = "days since 1979-1-1 00:00:00" ;
time:calendar = "standard" ;
time:axis = "T" ;
Inside R I want to convert the time into an R date object. I achieve this at the moment in a hardwired way by reading the units attribute and splitting the string and using the third entry as my origin (thus assuming the spacing is "days" and the time is 00:00 etc):
require("ncdf4")
f1<-nc_open("file.nc")
time<-ncvar_get(f1,"time")
tunits<-ncatt_get(f1,"time",attname="units")
tustr<-strsplit(tunits$value, " ")
dates<-as.Date(time,origin=unlist(tustr)[3])
This hardwired solution works for my specific example, but I was hoping that there might be a package in R that nicely handles the UNIDATA netcdf date conventions for time units and convert them safely to an R date object?
I have just discovered (two years after posting the question!) that there is a package called ncdf.tools which has the function:
convertDateNcdf2R
which
converts a time vector from a netCDF file or a vector of Julian days
(or seconds, minutes, hours) since a specified origin into a POSIXct R
vector.
Usage:
convertDateNcdf2R(time.source, units = "days", origin = as.POSIXct("1800-01-01",
tz = "UTC"), time.format = c("%Y-%m-%d", "%Y-%m-%d %H:%M:%S",
"%Y-%m-%d %H:%M", "%Y-%m-%d %Z %H:%M", "%Y-%m-%d %Z %H:%M:%S"))
Arguments:
time.source
numeric vector or netCDF connection: either a number of time units since origin or a netCDF file connection. In the latter case, the time vector is extracted from the netCDF file. This file, and especially the time variable, has to follow the CF netCDF conventions.
units
character string: units of the time source. If the source is a netCDF file, this value is ignored and is read from that file.
origin
POSIXct object: Origin or day/hour zero of the time source. If the source is a netCDF file, this value is ignored and is read from that file.
Thus it is enough to simply pass the netCDF connection as the first argument and the function handles the rest. Caveat: this will only work if the netCDF file follows the CF conventions (e.g. if your units are "years since" instead of "seconds since" or "days since", it will fail).
More details on the function are available here:
https://rdrr.io/cran/ncdf.tools/man/convertDateNcdf2R.html
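A minimal usage sketch (the file name is a placeholder; the file has to follow the CF conventions, as noted above):
library(ncdf4)
library(ncdf.tools)
nc <- nc_open("file.nc")         # any CF-compliant netCDF file
dates <- convertDateNcdf2R(nc)   # units and origin are read from the file itself
nc_close(nc)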
There is not, that I know of. I have this handy function using lubridate, which is basically identical to yours.
getNcTime <- function(nc) {
require(lubridate)
ncdims <- names(nc$dim) #get netcdf dimensions
timevar <- ncdims[which(ncdims %in% c("time", "Time", "datetime", "Datetime", "date", "Date"))[1]] #find time variable
if (is.na(timevar)) stop("ERROR! Could not identify the correct time variable")
times <- ncvar_get(nc, timevar)
timeatt <- ncatt_get(nc, timevar) #get attributes
timedef <- strsplit(timeatt$units, " ")[[1]]
timeunit <- timedef[1]
tz <- timedef[5]
timestart <- strsplit(timedef[4], ":")[[1]]
timestart <- as.numeric(timestart) #compare the components numerically, not as strings
if (length(timestart) != 3 || anyNA(timestart) || timestart[1] > 24 || timestart[2] > 60 || timestart[3] > 60 || any(timestart < 0)) {
cat("Warning:", timestart, "not a valid start time. Assuming 00:00:00\n")
warning(paste("Warning:", timestart, "not a valid start time. Assuming 00:00:00\n"))
timedef[4] <- "00:00:00"
}
if (! tz %in% OlsonNames()) {
cat("Warning:", tz, "not a valid timezone. Assuming UTC\n")
warning(paste("Warning:", timestart, "not a valid start time. Assuming 00:00:00\n"))
tz <- "UTC"
}
timestart <- ymd_hms(paste(timedef[3], timedef[4]), tz=tz)
f <- switch(tolower(timeunit), #Find the correct lubridate time function based on the unit
seconds=seconds, second=seconds, sec=seconds,
minutes=minutes, minute=minutes, min=minutes,
hours=hours, hour=hours, h=hours,
days=days, day=days, d=days,
months=months, month=months, m=months,
years=years, year=years, yr=years,
NA
)
suppressWarnings(if (is.na(f)) stop("Could not understand the time unit format"))
timestart + f(times)
}
EDIT: One might also want to take a look at ncdf4.helpers::nc.get.time.series
EDIT2: note that the newly proposed (and currently in development) awesome stars package will handle dates automatically; see the first blog post for an example.
EDIT3: another way is to use the units package directly, which is what stars uses. One could do something like the following (this still does not handle the calendar correctly; I'm not sure units can):
getNcTime <- function(nc) { ##NEW VERSION, with the units package
require(units)
require(ncdf4)
options(warn=1) #show warnings by default
if (is.character(nc)) nc <- nc_open(nc)
ncdims <- names(nc$dim) #get netcdf dimensions
timevar <- ncdims[which(ncdims %in% c("time", "Time", "datetime", "Datetime", "date", "Date"))] #find (first) time variable
if (length(timevar) > 1) {
warning(paste("Found more than one time var. Using the first:", timevar[1]))
timevar <- timevar[1]
}
if (length(timevar)!=1) stop("ERROR! Could not identify the correct time variable")
times <- ncvar_get(nc, timevar) #get time data
timeatt <- ncatt_get(nc, timevar) #get attributes
timeunit <- timeatt$units
units(times) <- make_unit(timeunit)
as.POSIXct(times)
}
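A usage note on this version: because it calls nc_open() itself when given a character path, it can be used either on an open ncdf4 connection or directly on a file name (the name below is a placeholder):
times <- getNcTime("file.nc")   # or getNcTime(nc) for an already-open connection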
I couldn't get @AF7's function to work with my files so I wrote my own. The function below creates a POSIXct vector of dates, for which the start date, time interval, unit and length are read from the nc file. It works with nc files of many (but probably not all) shapes and forms.
ncdate <- function(nc) {
ncdims <- names(nc$dim) #Extract dimension names
timevar <- ncdims[which(ncdims %in% c("time", "Time", "datetime", "Datetime",
"date", "Date"))[1]] # Pick the time dimension
ntstep <-nc$dim[[timevar]]$len
tm <- ncvar_get(nc, timevar) # Extract the timestep count
tunits <- ncatt_get(nc, timevar, "units") # Extract the long name of units
tspace <- tm[2] - tm[1] # Calculate time period between two timesteps, for the "by" argument
tstr <- strsplit(tunits$value, " ") # Extract string components of the time unit
a <- unlist(tstr[1]) # Isolate the unit, i.e. seconds, hours, days etc.
uname <- a[which(a %in% c("seconds","hours","days"))[1]] # Check unit
startd <- as.POSIXct(gsub(paste(uname,'since '),'',tunits$value),format="%Y-%m-%d %H:%M:%S") ## Extract the start / origin date
tmulti <- 3600 # Declare hourly multiplier for date
if (uname == "days") tmulti =86400 # Declare daily multiplier for date
## Rename "seconds" to "secs" for "by" argument and change the multiplier.
if (uname == "seconds") {
uname <- "secs"
tmulti <- 1 }
byt <- paste(tspace,uname) # Define the "by" argument
if (byt == "0.0416666679084301 days") { ## If the unit is "days" but the "by" interval is in hours
byt= "1 hour" ## R won't understand "by < 1" so change by and unit to hour.
uname = "hours"}
datev <- seq(from=as.POSIXct(startd+tm[1]*tmulti),by= byt, units=uname,length=ntstep)
}
Edit
To address the flaw highlighted by @AF7's comment that the above code would only work for regularly spaced files, datev could be calculated as
datev <- as.POSIXct(tm*tmulti,origin=startd)
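For example, with an irregularly spaced time axis the vectorised form amounts to the following (a sketch; startd, tm and tmulti stand for the quantities computed inside ncdate(), with illustrative values here):
startd <- as.POSIXct("1979-01-01 00:00:00", tz = "UTC")  # origin parsed from the units attribute
tm     <- c(0, 1, 3, 7)                                  # "days since" values with irregular spacing
tmulti <- 86400                                          # daily multiplier
datev  <- as.POSIXct(tm * tmulti, origin = startd, tz = "UTC")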
I'm trying to write a script that will scan a table for times, and those that happen to be past 6pm will be changed to be 6am of the following day. I have tried using the lubridate package (ymd_hms), but the problem is that it forces me to specify a date (I would like to just use the current system date).
I am kind of new to R (and programming in general) so I'm having trouble wrapping my head around how factors, variables and all that works.
endTime <- ymd_hms("x 18:00:00", tz = "America/Chicago")
Ideally I would want the "x" to take on the system date (no time), but lubridate won't let me do that: it only wants a literal date in there, and it won't let me assign a date to a name and use it.
After that, this should happen
for (Time in firstTen) {
if (tables$Time > endTime ) {
dateTime = ymd_hms("x+1 06:00:00")
}
}
I know the code isn't functional but I just want to give you an idea of what I have in mind.
Any help appreciated!
Here you go mate
dateTime = ymd_hms( paste(Sys.Date()+1, "06:00:00", sep="-"))
EDIT: for your other question regarding changing timezones, you can use this: (from here)
require(lubridate)
dateTime = ymd_hms( paste(Sys.Date()+1, "06:00:00", sep="-"))
dateTime <- as.POSIXct(dateTime, tz="Europe/London")
attributes(dateTime)$tzone <- "America/Los_Angeles"
dateTime
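As a side note (not part of the original answer), lubridate also has with_tz() and force_tz(), which make the two distinct operations explicit:
require(lubridate)
dateTime <- ymd_hms(paste(Sys.Date() + 1, "06:00:00"), tz = "Europe/London")
with_tz(dateTime, "America/Los_Angeles")   # same instant, displayed in another zone
force_tz(dateTime, "America/Los_Angeles")  # same clock time, re-stamped in another zone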
You can achieve this with
library(lubridate)
library(dplyr)
endTime <- ymd_hms(paste(Sys.Date(), "18:00:00"), tz = "America/Chicago")
test.data <- data.frame("Original.time" = endTime + minutes(round(rnorm(15, 1440, 2000))))
time.hours <- hour(test.data$Original.time) +
minute(test.data$Original.time)/60 +
second(test.data$Original.time)/3600
test.data$New.Time <- if_else(time.hours > 18,
ymd_hms(paste(date(test.data$Original.time)+1, "6:00:00"), tz = "America/Chicago"),
test.data$Original.time)
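Using dplyr::if_else() rather than base ifelse() matters here, because ifelse() strips the POSIXct class from the result. A quick illustration:
x <- as.POSIXct("2017-04-14 19:00:00", tz = "UTC")
ifelse(TRUE, x, x)          # returns a bare number; the class is dropped
dplyr::if_else(TRUE, x, x)  # keeps the POSIXct class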
I hope this helps!
I have an instrument that exports data in an unruly time format. I need to combine the date and time vectors into a new datetime vector in the following POSIXct format: %Y-%m-%d %H:%M:%S. Out of curiosity, I attempted to do this in three different ways, using as.POSIXct(), strftime(), and strptime(). When using my example data below, only the as.POSIXct() and strftime() functions work, but I am curious as to why strptime() is producing NAs? Also, I cannot convert the strftime() output into a POSIXct object using as.POSIXct()...
When trying these same functions on my real data (of which I've only provided the first four rows), I am running into an entirely different problem. Only the strftime() function is working. For some reason the as.POSIXct() function is also producing NAs, which is the only command I actually need for converting my datetime into a POSIXct object...
It seems like there are subtle differences between these functions, and I want to know how to use them more effectively. Thanks!
Reproducible Example:
## Creating dataframe:
date <- c("2017-04-14", "2017-04-14","2017-04-14","2017-04-14")
time <- c("14:24:24.992000","14:24:25.491000","14:24:26.005000","14:24:26.511000")
value <- c("4.106e-06","4.106e-06","4.106e-06","4.106e-06")
data <- data.frame(date, time)
data <- data.frame(data, value) ## I'm sure there is a better way to combine three vectors...
head(data)
## Creating 3 different datetime vectors:
## This works in my example code, but not with my real data...
data$datetime1 <- as.POSIXct(paste(data$date, data$time), format = "%Y-%m-%d %H:%M:%S",tz="UTC")
class(data$datetime1)
## This is producing NAs, and I'm not sure why:
data$datetime2 <- strptime(paste(data$date, data$time), format = "%Y-%m-%d %H:%M%:%S", tz = "UTC")
class(data$datetime2)
## This is working just fine
data$datetime3 <- strftime(paste(data$date, data$time), format = "%Y-%m-%d %H:%M%:%S", tz = "UTC")
class(data$datetime3)
head(data)
## Since I cannot get the as.POSIXct() function to work with my real data, I tried this workaround. Unfortunately I am running into trouble...
data$datetime4 <- as.POSIXct(data$datetime3, format = "%Y-%m-%d %H:%M%:%S", tz = "UTC")
Link to real data:
here
Example using real_data.txt:
## Reading in the file:
fpath <- "~/real_data.txt"
x <- read.csv(fpath, skip = 1, header = FALSE, sep = "", stringsAsFactors = FALSE)
names(x) <- c("date","time","bscat","scat_coef","pressure_mbar","temp_K","CH1","CH2") ## This is data from a Radiance Research Integrating Nephelometer Model M903 for anyone who is interested!
## If anyone could get this to work that would be awesome!
x$datetime1 <- as.POSIXct(paste(x$date, x$time), format = "%Y-%m-%d %H:%M%:%S", tz = "UTC")
## This still doesn't work...
x$datetime2 <- strptime(paste(x$date, x$time), format = "%Y-%m-%d %H:%M%:%S", tz = "UTC")
## This works:
x$datetime3 <- strftime(paste(x$date, x$time), format = "%Y-%m-%d %H:%M%:%S", tz = "UTC")
## But I cannot convert from strftime character to POSIXct object, so it doesn't help me at all...
x$datetime4 <- as.POSIXct(x$datetime3, format = "%Y-%m-%d %H:%M%:%S", tz = "UTC")
head(x)
Solution:
I was not providing the as.POSIXct() function with the correct format string. Once I changed %Y-%m-%d %H:%M%:%S to %Y-%m-%d %H:%M:%S, the data$datetime2, data$datetime4, x$datetime1 and x$datetime2 were working properly! Big thanks to PhilC for debugging!
For your real data issue replace the %m% with %m:
## Reading in the file:
fpath <- "c:/r/data/real_data.txt"
x <- read.csv(fpath, skip = 1, header = FALSE, sep = "", stringsAsFactors = FALSE)
names(x) <- c("date","time","bscat","scat_coef","pressure_mbar","temp_K","CH1","CH2") ## This is data from a Radiance Research Integrating Nephelometer Model M903 for anyone who is interested!
## issue was the %m% - fixed
x$datetime1 <- as.POSIXct(paste(x$date, x$time), format = "%Y-%m-%d %H:%M:%S", tz = "UTC")
## Here too - fixed
x$datetime2 <- strptime(paste(x$date, x$time), format = "%Y-%m-%d %H:%M:%S", tz = "UTC")
head(x)
There was a format string error causing the NAs; try this:
## This is no longer producing NAs:
data$datetime2 <- strptime(paste(data$date, data$time), format = "%Y-%m-%d %H:%M:%S",tz="UTC")
class(data$datetime2)
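As an aside on the "subtle differences" the question mentions: the three functions return different classes, which is why the strftime() output cannot be used as a date-time without re-parsing. A quick check (values illustrative):
class(as.POSIXct("2017-04-14 14:24:24", tz = "UTC"))                    # "POSIXct" "POSIXt"
class(strptime("2017-04-14 14:24:24", "%Y-%m-%d %H:%M:%S", tz = "UTC")) # "POSIXlt" "POSIXt"
class(strftime(Sys.time(), "%Y-%m-%d %H:%M:%S"))                        # "character"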
Formatting to "%Y-%m-%d %H:%M:%OS" is a generic view. To make the fractional seconds to a specific number of decimals call the option for degits.sec, e.g.:
options(digits.secs=6) # This will take care of seconds up to 6 decimal points
data$datetime1 <- lubridate::parse_date_time(paste(data$date, data$time), "%Y-%m-%d %H:%M:%OS")
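Tying this back to the reproducible example above, a sketch that parses the fractional seconds directly with as.POSIXct() (datetime5 is just an illustrative column name; the exact printed decimals depend on floating-point representation):
options(digits.secs = 6)
data$datetime5 <- as.POSIXct(paste(data$date, data$time),
                             format = "%Y-%m-%d %H:%M:%OS", tz = "UTC")
head(data$datetime5)   ## expected to show times like "2017-04-14 14:24:24.992 UTC"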