I have data from Survey Monkey, and the dates on which the responses were provided are transferring over in a strange format. They look like:
"1036914:34:45.00"
I have altered the variable type to every available option that SPSS provides and none of them are giving me the right data.
The number that I pasted above should be a date/time between the end of March and the beginning of April of 2018.
Any thoughts?
This should be the number of
"hours:minutes:seconds.centiseconds"
that have passed since 01/01/1900. I looked it up and your example would be
Monday 16 April 2018 at 18:34:45
This is probably an MS Excel format?
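As a quick sanity check (plain arithmetic, shown here in R rather than SPSS, and assuming the hours:minutes:seconds reading above):
# 1036914 hours, 34 minutes and 45 seconds after 1900-01-01 00:00
origin <- as.POSIXct("1900-01-01", tz = "UTC")
origin + 1036914 * 3600 + 34 * 60 + 45
# [1] "2018-04-16 18:34:45 UTC"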
I am working with a data set where the Period is given as
2001.01, 2001.02, 2001.03, ... When I try to convert it using yearmon(), it shows only Jan and Feb, these two months. For
example, for 2001.05 it shows Jan 2001, but it should be May 2001. I think I need to convert the Period to a 2001-01 style format
first before using yearmon(). Can anyone please tell me how I can do that?
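For what it's worth, a minimal sketch of the conversion being asked about, assuming the zoo package and a numeric Period column (the name period below is just for illustration):
library(zoo)
period <- c(2001.01, 2001.02, 2001.05, 2001.10)
# Keep two decimals so 2001.1 is read as October rather than January,
# then parse with an explicit year.month format.
as.yearmon(sprintf("%.2f", period), format = "%Y.%m")
# [1] "Jan 2001" "Feb 2001" "May 2001" "Oct 2001"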
I have data coming in from different versions of a machine in CSV format, but each machine (due to its age) provides the date/time in a different format.
2020-10-30T13:24:26.874Z
2020-10-30T14:37:27.052Z
10/30/2020 09:29:06
10/30/2020 11:42:47
2020-10-30T14:10:35.422Z
11/02/2020
I've used the following formulas to split the date and time into different columns:
=LEFT(O1399,10)
=IF(RIGHT(O1399,1)="Z",LEFT(RIGHT(O1399,13),12),RIGHT(O1399,8))
However some of my date results are coming up as follows:
43872.3807
43872.3829
We've also had issues where the date comes in American format, e.g. 10/6/20 (6 October 2020), but the result stays as 10/6/2020 and then gets read the UK way as 10 June 2020.
Can anyone help with a simple solution of how I can get all of the dates and times into a UK format?
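As an aside on the serial-style results above: a value like 43872.3807 looks like an Excel day-number serial (whole days since 1899-12-30, with the fraction as the time of day), which is what LEFT() returns when the source cell holds a true date/time value rather than text. A hedged illustration in R, purely to show what the number encodes:
as.POSIXct(43872.3807 * 86400, origin = "1899-12-30", tz = "UTC")
# [1] "2020-02-11 09:08:12 UTC"
In Excel itself, giving such cells a date/time number format should make them display as dates again.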
I have a SQLite 3 database on MacOS with timestamp data (typically something like 279020203.539467).
The documentation says that dates can be stored as
REAL - as Julian day numbers, the number of days since noon in Greenwich
on November 24, 4714 B.C. according to the proleptic Gregorian calendar.
And it looks like this is what I've got.
I want to export this and import it into other databases.
I am assuming the timestamp datatype is not compatible between database engines, so some conversion has to happen somewhere, and I am working under the assumption that it'd be best to do that while exporting the data from the SQLite database.
But I can't figure out how to do this conversion.
I've looked at this answer which refers to this forum post which indicates that in SQLite you could do something like
select datetime('40660.9454658044', '+2415018 days', '+12 hours', 'localtime');
(Maybe 2415018 is the number of days between November 24, 4714 B.C. and some other magical date...)
However, replacing the timestamp string in this example with what I have results in null. Presumably because '279020203.539467' is some other kind of timestamp. It is also some magnitudes larger than the example.
But how to convert this to a usable date? I know it should be around 2011/2012.
Interpreting the data as an "integer" (seconds since 1970-01-01) gives 1978 so that is not correct either.
UPDATE: I've found that
288773834.371606 should be 2010-02-25 07.57
296636121.950064 should be 2010-05-27 08.55
(CET if that matters).
The good news is: to convert a Julian date to a "regular" date format you could use datetime(strftime('%J', jtime)). FYI, here's the doc for SQLite date and time functions. But there's bad news.
A NASA calculator computes the Julian date of 2010-02-25 to be 2455246. It computes the civil date of 288773834 as Sept 2, 785907 A.D. SQLite doesn't give that same result using the above notation, but neither does it give a usable date.
Even though the numbers look like Julian date notation, they are not any dates in our lifetimes.
DATEDIF("2010-02-25 07.57"; "2010-05-27 08.55"; "d")
This gives 91 days which works out to be 7862400 seconds, almost exactly the same as the difference between the two timestamps (296636121.950064 - 288773834.371606 = 7862287.578458).
On MacOS, native timestamps are seconds since 2001-01-01 (Jan 1 2001).
And
DATEDIF("2001-01-01 00:00"; "2010-02-25 07.57"; "d")
gives 3342 days, which is 288748800 seconds; near, but not exactly, the timestamp. That difference is caused by:
DATEDIF only catering for whole days
CET being one hour off
Correcting for that we get around 25000 seconds more to add, which takes us almost exactly to the timestamp for 2010-02-25 07.57.
So the gist of this is that SQLite on MacOS stores these timestamps as MacOS native timestamps, i.e. seconds since the start of the MacOS epoch (2001-01-01 00.00). This is probably because the application that created the data was not using SQLite timestamps but MacOS native dates, storing them in the database like any other data.
Converting this to some other format should be trivial, whether during the export, by converting the exported file, or during import.
Possibly it would be easiest to convert the date on export from the original database using
select ...
datetime(table.timestamp_field, 'unixepoch', '+31 years')
from ...
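If you prefer, the same check can be done after export; for example in R, using the reference timestamps from the question (a rough sketch, assuming the values really are seconds since the Mac epoch):
# Seconds since the Mac / Core Data reference date, 2001-01-01 00:00 UTC
mac_ts <- c(288773834.371606, 296636121.950064)
as.POSIXct(mac_ts, origin = "2001-01-01", tz = "UTC")
# [1] "2010-02-25 06:57:14 UTC" "2010-05-27 06:55:21 UTC"
# i.e. 07:57 and 08:55 local time (CET/CEST), matching the expected dates.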
I wrote a data frame to a CSV file with R and opened it with Excel. Then I converted it to an Excel Workbook and made some edits to it.
When I imported that Excel file into R again, the date column looked like this (a few are numbers and a few are dates):
Date
39387
39417
15/01/2007
16/01/2007
I tried to change the format in Excel but failed. The General and Number options in Excel's cell formatting produce numbers like the ones I mentioned, which appear to have nothing to do with the date.
It seems all four of your examples are in respect of dates in January (the 11th and 12th for the first two), that the Excel involved has been configured to expect MDY (rather than DMY, as is popular in the UK for example) with its date system set to ‘1900’, and that the CSV has written the dates as 'conventional' dates rather than date index numbers.
So what Excel saw first was:
11/01/2007
12/01/2007
15/01/2007
16/01/2007
intended to represent the 11th, 12th, 15th and 16th of January. However, because it was expecting MDY, Excel interpreted (coercing the text to a date index) the first two entries as November 1 and December 1. For ‘months’ 15 and 16 it did no interpretation and just reported the text as it appears in the CSV.
The coercion is automatic, with no option to turn it off (nor is it 'reversed' by a change of format). Various ways to address this (common) issue are mentioned in the link kindly provided by @Gerard Wilkinson. I am not providing a full solution here since (a) some things are under user control (e.g. the ‘1904’ date system is an option, as is the choice between MDY and DMY) and (b) the preferred route will depend to some extent on how often it is required and what the user is most familiar with.
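That is not a fix on the Excel side, but if the immediate aim is simply to recover usable dates once the column is back in R, here is a hedged sketch of one way to repair the mixed column (assuming it arrives in R as character):
# Entries like "39387" are day indexes with the 1899-12-30 origin,
# the rest are dd/mm/yyyy text, as diagnosed above.
raw <- c("39387", "39417", "15/01/2007", "16/01/2007")
serial <- suppressWarnings(as.numeric(raw))
dates <- as.Date(ifelse(is.na(serial),
                        as.Date(raw, format = "%d/%m/%Y"),
                        as.Date(serial, origin = "1899-12-30")),
                 origin = "1970-01-01")  # ifelse() drops the Date class
dates
# [1] "2007-11-01" "2007-12-01" "2007-01-15" "2007-01-16"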
I'm trying to import a table from Microsoft Access (.accdb) to R.
The code that I use is:
library(RODBC)
testdb <- file.path("modelEAU Database V.2.accdb")
channel <- odbcConnectAccess2007(testdb)
WQ_data <- sqlFetch(channel, "WaterQuality")
It seems to work, but the problem is importing the date and time data. In the Access file there are two columns, one with a date field (dd/mm/yyyy) and another with a time field (hh:mm:ss). When I import them into R, the date column shows the date in yyyy-mm-dd format and the time column comes through as 1899-12-30 hh:mm:ss. R doesn't recognise either as a date/time variable, so I can't work with them.
Also, I tried the mdb.get function but it didn't work either.
Does somebody know how to import the data from Access into R while defining the date and time format? Alternatively, any idea how to import the Access file as a text file?
Note: I'm working with Office 2010 and R version 2.14.1
Thanks a lot in advance.
Look at the result of running str() on your data frame. That will tell you more about how the data is actually stored. Generally dates and times are stored as a number from an origin date (Access uses 30 Dec 1899 because MS thought that 1900 was a leap year). Sometimes it is stored as the number of days since the origin, with time represented as a fraction of the day; other times it is the number of seconds (or milliseconds) since the origin.
You will need to see how the data was sent (whether Access and ODBC converted it to strings first, or sent days or seconds); then you will have a better feel for how to work with these values (possibly converting them) in R.
There is an article in the June 2004 edition of R News (the predecessor to the R Journal) that details the common ways of dealing with dates and times in R and could be very useful to you.
You should decide what you want to end up with: a single column of DateTimes, two columns with numbers, two columns with characters, etc.
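For instance, if str() shows that the two columns already arrived as date/time values (the time merely anchored at the 1899-12-30 origin, as described in the question), one hedged way to get a single column of DateTimes is to keep the calendar date from one column and only the clock time from the other. The values below are made-up placeholders, not data from the actual table:
date_col <- as.Date(c("2011-03-15", "2011-03-16"))
time_col <- as.POSIXct(c("1899-12-30 07:57:00", "1899-12-30 18:30:00"), tz = "UTC")
# Paste the date from one column with just the hh:mm:ss part of the other
as.POSIXct(paste(format(date_col), format(time_col, "%H:%M:%S")), tz = "UTC")
# [1] "2011-03-15 07:57:00 UTC" "2011-03-16 18:30:00 UTC"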