I have a column in a DataTable holding dates in the format dd/MM/yyyy HH:mm. I fill the DataTable using the code below, which is shared by more than one SELECT statement, so I cannot specify the columns and their data types before filling it. Any manipulation after filling the data is acceptable to me.
data_adapt = New OracleDataAdapter(query, OraConn)
dt = New DataTable
data_adapt.Fill(dt)
For paging I create a copy of the DataTable using Skip and Take, as below:
dtLineupCopy = New DataTable
dtLineupCopy = dtLineup.AsEnumerable().Skip(startRows).Take(pageSize).CopyToDataTable()
Now the issue is that when I use the Compute method it doesn't treat the column values as dates, and it returns some seemingly random date value from the column instead of the minimum.
Arvdate = dtLineupCopy.Compute("Min(Arrivaldate)", "")
Is there a way to convert the data type of the column?
I also tried adding a new column of DateTime type, but it throws System.FormatException: String was not recognized as a valid DateTime.
dtLineupCopy.Columns.Add("ArvDate", getType(DateTime), "CONVERT(Arrivaldate, 'System.DateTime')")
Data in the Arrivaldate column of dtLineupCopy:
22/09/2012 01:02
27/09/2012 17:01
1/10/2012 1:02
13/10/2012 07:26
14/10/2012 19:47
20/10/2012 00:00
20/10/2012 00:00
How about converting to a date in the query:
Min(TO_DATE(Arrivaldate, format_mask))
http://www.techonthenet.com/oracle/functions/to_date.php
If the query that you pass in results in Arrivaldate coming back as a string rather than a date, then the preferred option would be to change that query. Instead of selecting Arrivaldate, select:
to_date(Arrivaldate, 'DD/MM/YYYY HH24:MI') as Arrivaldate
If somehow that's not an option, then you have to parse the strings afterwards. Doing that within the confines of the DataTable.Compute expression means rolling your own "parse the date into a sortable format" logic, something like:
Arvdate = dtLineupCopy.Compute("Min(Substring(Arrivaldate,7,4) + Substring(Arrivaldate,4,2) + Substring(Arrivaldate,1,2) + ... etc ...)", "")
I would like to write DateTime values to an Excel sheet using openxlsx. When I try to do this, instead of just the DateTime value I get a lowercase "x" in one row followed by the DateTime in the next row. This occurs whether I use write.xlsx or writeData. I also tried converting the DateTime using as.POSIXlt or as.POSIXct, and converting the date with the timezone specified or not, and I get the same result.
The UTC DateTime values are coming from a PerkinElmer microplate reader file.
Below is a code snippet that gives me this result. Any advice or help is appreciated, Thanks!
library(openxlsx)
library(lubridate)
date <- as_datetime("2022-04-07T22:15:08+0000", tz = "America/Los_Angeles")
options(openxlsx.datetimeFormat = "yyyy-mm-dd hh:mm:ss")
write.xlsx(date,"test.xlsx",overwrite = TRUE)
The documentation of write.xlsx says in section Arguments that x is (my emphasis)
A data.frame or a (named) list of objects that can be handled by writeData() or writeDataTable() to write to file.
So apparently an atomic vector is first coerced to a data.frame, and since the data argument is named x, so is its column header.
This also happens when writing a named list, date_list <- list(date = date): a workbook with a sheet named date is created, and the data in it has a column header x.
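A possible workaround following from that: wrap the value in a data.frame with the column name you want, so the header isn't the auto-generated x (writeData(..., colNames = FALSE) would drop the header entirely). A minimal sketch, reusing the code from the question:
library(openxlsx)
library(lubridate)

date <- as_datetime("2022-04-07T22:15:08+0000", tz = "America/Los_Angeles")
options(openxlsx.datetimeFormat = "yyyy-mm-dd hh:mm:ss")

# A named data.frame column replaces the coerced column header "x".
write.xlsx(data.frame(DateTime = date), "test.xlsx", overwrite = TRUE)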
I have a timestamp column with datetime values. I want to extract year-month from it.
For Example: if timestamp is 2020-02-19T13:42:51.393Z, output I want is Feb-2020.
I tried looking at https://learn.microsoft.com/en-us/azure/data-explorer/kusto/query/format-datetimefunction, but it has nothing for the month name.
Thanks in advance.
you could try something like this:
let months = dynamic({"1":"Jan", "2":"Feb", "3":"Mar"}); // ... add the rest
print dt = datetime(2020-02-19T13:42:51.393Z)
| project s = strcat(months[tostring(getmonth(dt))], "-", getyear(dt))
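Applied to a table column, the same lookup could look something like this (the datatable input and the column name ts are just for illustration):
let months = dynamic({"1":"Jan", "2":"Feb", "3":"Mar", "4":"Apr", "5":"May", "6":"Jun",
                      "7":"Jul", "8":"Aug", "9":"Sep", "10":"Oct", "11":"Nov", "12":"Dec"});
datatable(ts: datetime) [datetime(2020-02-19T13:42:51.393Z)]
| extend YearMonth = strcat(months[tostring(getmonth(ts))], "-", getyear(ts))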
I collect data into Lua tables that I export to SAS datasets with sas.ds_write. Some columns contain strings that represent timestamps in the format "14OCT19:09:12:52". I succeed in converting them to a SAS datetime value, but in SAS they arrive as numbers without a format.
Is there a way to specify that a number coming from Lua is a datetime value?
Have you tried formatting the values before export?
Example:
Dataset
(imagine the result is your data exported from the Lua table)
DATA mytest;
INPUT dates;
CARDS;
1886663572
1886763572
1886863572
;
RUN;
Result:
Format values
PROC SQL;
CREATE TABLE myresults AS
SELECT PUT(dates, datetime19.) AS formatted_dates
FROM mytest
;
QUIT;
Result:
Now, notice the output values are character type; you could use something like the sketch below to cast them back into a datetime data type.
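A rough sketch of that cast, reusing the myresults table from above: INPUT with the datetime informat turns the strings back into numeric datetime values, and FORMAT= controls how they are displayed. Note that simply attaching FORMAT dates datetime19.; to the original numeric column (in a DATA step or PROC DATASETS) gives the same result without the character round trip.
PROC SQL;
    CREATE TABLE myresults2 AS
    /* Cast the character dates back to numeric datetime values */
    SELECT INPUT(formatted_dates, datetime19.) AS dates_dt FORMAT=datetime19.
    FROM myresults
    ;
QUIT;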
I have a csv file, where the first 3 columns of each row represent a date, like:
2013,1,1,... (first row)
I want to automatically convert the first three columns of each row in the CSV into a Python datetime object using the following code:
parseDate = lambda y, m, d: datetime.datetime(y, m, d)
df = pandas.DataFrame.from_csv(csvPath, index_col=False, header=None,
                               parse_dates=[0, 1, 2], date_parser=parseDate)
But I get an error in the date_parser part.
However, just doing
dtime = parseDate(2003,1,1)
works as expected, so my lambda expression actually seems to be correct.
Can anyone help?
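One workaround, in case it is useful while the date_parser issue is being sorted out: read the file without any date parsing and assemble the first three columns with pd.to_datetime afterwards (csvPath as in the question):
import pandas as pd

# Read the CSV as-is, then build a datetime column from the first three columns.
df = pd.read_csv(csvPath, header=None)
df["date"] = pd.to_datetime(dict(year=df[0], month=df[1], day=df[2]))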
I'm trying to import an Excel worksheet into R. I want to retrieve a (character) ID column and a couple of date columns from the worksheet. The following code works fine but brings one column in as a date and not the other. I think it has something to do with more of the leading values being empty in the second date column.
dateFile <- odbcConnectExcel2007(xcelFile)
query <- "SELECT ANIMALID, ST_DATE_TIME, END_DATE_TIME FROM [KNWR_CL$]"
idsAndDates <- sqlQuery(dateFile,query)
So my plan now is to bring the date columns in as character fields and convert them myself using as.POSIXct. However, the following code produces only a single row in idsAndDates.
dateFile <- odbcConnectExcel2007(xcelFile)
query <- "SELECT ANIMALID, ST_DATE_TIME, END_DATE_TIME FROM [KNWR_CL$]"
idsAndDates <- sqlQuery(dateFile,query,as.is=TRUE,TRUE,TRUE)
What am I doing wrong?
I had to move on and ended up using the gdata library (which worked). I'd still be interested in an answer to this, though.
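For completeness, the conversion step planned above might look roughly like this once the columns do come through as character; the format string is only a guess at how the Excel values arrive and would need to be adjusted:
# Adjust the format to match the actual character representation of the dates.
idsAndDates$ST_DATE_TIME  <- as.POSIXct(idsAndDates$ST_DATE_TIME,  format = "%Y-%m-%d %H:%M:%S")
idsAndDates$END_DATE_TIME <- as.POSIXct(idsAndDates$END_DATE_TIME, format = "%Y-%m-%d %H:%M:%S")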