I have cells like 'Dec 6 2016 6:26AM' (copied from a database) stored as NVARCHAR. I want to convert them to DATETIME, but every time I try I get the error 'Error converting data type nvarchar to datetime'.
I do not understand what the problem is, because my value looks exactly like datetime style 100.
To solve the problem I have already tried:
CONVERT and CAST (they fail), TRY_CONVERT and TRY_CAST (they only give me NULLs), creating a new table with the needed column type and inserting my data into it, and so on.
The expected result is to be able to use the DATEDIFF function.
Have you tried this:
SELECT CONVERT(DATETIME,'Dec 6 2016 6:26AM',100);
which gives the result:
2016-12-06 06:26:00.000
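Since the literal converts fine but TRY_CONVERT on your column returns only NULLs, the stored values most likely contain hidden characters. A minimal diagnostic sketch, using hypothetical names MyTable and MyCol, that strips non-breaking spaces (NCHAR(160), a common copy/paste artifact) before converting and then feeds the result to DATEDIFF:
SELECT MyCol,
       TRY_CONVERT(datetime, REPLACE(MyCol, NCHAR(160), ' '), 100) AS converted,
       DATEDIFF(day,
                TRY_CONVERT(datetime, REPLACE(MyCol, NCHAR(160), ' '), 100),
                GETDATE()) AS days_ago
FROM MyTable;
Any row where converted is still NULL contains something other than a plain style-100 date.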
I have Ship_Date as 12/14/2013 20:27, defined as a varchar in the source table. How can I convert this to a timestamp and load it as-is, i.e. 12/14/2013 20:27? I'm using CAST (SHIP_DATE AS TIMESTAMP(0) FORMAT 'MM/DD/YYYYHH:MI:SS') AS SHIP_DATE but Teradata throws an invalid timestamp error. Please help in resolving the issue.
You were missing the B indicating a blank/space in your original cast. But Teradata's CAST also chokes on a single-digit month. You can add a leading zero using a regex:
Cast(RegExp_Replace(SHIP_DATE, '\b([\d])\b', '0\1') AS TIMESTAMP(0) FORMAT 'MM/DD/YYYYBHH:MI')
Alternatively, TO_TIMESTAMP parses the string directly:
select to_timestamp('12/14/2013 20:27','MM/DD/YYYY HH24:MI');
The problem with your cast as written is that you are telling Teradata you have seconds in your string when you don't. You can use:
select cast('12/14/2013 20:27' as TIMESTAMP(0) FORMAT 'MM/DD/YYYYBHH:MI')
However, this still won't handle single-digit months.
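Combining the regex padding from above with the B token should handle both the space and any single-digit parts; a sketch reusing those same constructs:
SELECT CAST(RegExp_Replace('2/4/2013 8:07', '\b([0-9])\b', '0\1') AS TIMESTAMP(0) FORMAT 'MM/DD/YYYYBHH:MI');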
The issue here is the space between 2013 and 20 and the missing zeros for the seconds.
I used OREPLACE to remove the space, concatenated ':00' for the seconds, and it worked:
SELECT
  CAST(
    (OREPLACE('12/14/2013 20:27', ' ', '') || ':00')
    AS TIMESTAMP(0) FORMAT 'MM/DD/YYYYHH:MI:SS'
  ) AS SHIP_DATE
And it returns the expected timestamp.
I am using Oracle 12c with the username SYSTEM. My problem is that when I execute this insert statement, which I took from the Oracle Live SQL site:
insert into emp
values(7788, 'SCOTT', 'ANALYST', 7566,to_date('13-JUL-87','dd-mm-rr') - 85,3000, null, 20);
it shows:
SQL Error: ORA-01858: "a non-numeric character was found where a numeric was expected"
*Cause: The input data to be converted using a date format model was
incorrect. The input data did not contain a number where a number was
required by the format model.
*Action: Fix the input data or the date format model to make sure the
elements match in number and type. Then retry the operation.
What is this -85 after the to_date(...)?
To handle dates, you would do better to use the ANSI date literal (DATE 'yyyy-mm-dd'):
insert into emp values(7788, 'SCOTT', 'ANALYST', 7566, date '1987-07-13' - 85, 3000, null, 20);
If you need to use a to_date for some reason, you have to be sure that the format of your string exactly matches the format mask you use: if your month is written as 'JUL' you need 'MON' in the format mask and not 'mm'. 'mm' would match a month written as '07'.
Please note that even with the right format mask, this way of writing dates is dangerous, because it depends on the language settings of your database.
The -85 means "subtract 85 days".
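For reference, once the mask matches the month spelling the original call works; a sketch, assuming an English NLS_DATE_LANGUAGE:
select to_date('13-JUL-87', 'dd-mon-rr') - 85 from dual;
-- 19-APR-87, i.e. 13-JUL-87 minus 85 days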
Column A is of type TIMESTAMP(6). I need it as TIMESTAMP(0). The code I am using is the following:
SELECT cast(cast(A AS date) as timestamp(0))
FROM 'table'
where A >= '?StartDT'
After entering the date I want for the parameter, I get the 'Invalid timestamp' error.
If A is truly a TIMESTAMP(6), then casting it first as a DATE will effectively trim off the time elements, so when you cast the result to a TIMESTAMP(0) you are going to end up with a time of 00:00:00.
You'll need to also cast the TIMESTAMP(6) field as a time and then add the results together like:
CAST(CAST(A AS DATE) AS TIMESTAMP(0)) + (CAST(A AS TIME(6)) - TIME '00:00:00' HOUR TO SECOND)
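In a full query this might look as follows (a sketch; mytable is a hypothetical table name, and the parameter is assumed to arrive as a proper timestamp literal rather than a quoted placeholder):
SELECT CAST(CAST(A AS DATE) AS TIMESTAMP(0))
     + (CAST(A AS TIME(6)) - TIME '00:00:00' HOUR TO SECOND) AS A_ts0
FROM mytable
WHERE A >= TIMESTAMP '2016-01-01 00:00:00';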
You can also use SUBSTRING() to snip off the fractional seconds of the TIMESTAMP(6) field and cast the resulting string to a TIMESTAMP(0):
CAST(SUBSTRING(CAST(A AS CHAR(26)) FROM 1 FOR 19) AS TIMESTAMP(0))
This doesn't address the INVALID TIMESTAMP error you are getting, though. Are you certain that field A is a TIMESTAMP(6) and not a VARCHAR that merely looks like a timestamp? What happens when you remove the outer cast: are there any dates in the result that look like they wouldn't convert cleanly to a timestamp? Something is not quite right here, and I suspect it's in your data.
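One way to check the declared type (a sketch; mydb and mytable are hypothetical names) is the data dictionary, where ColumnType 'TS' means TIMESTAMP and 'CV' means VARCHAR:
SELECT ColumnName, ColumnType
FROM DBC.ColumnsV
WHERE DatabaseName = 'mydb'
  AND TableName = 'mytable';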
I want to insert a datetime value into a database using SQL Server Compact Edition in Microsoft Webmatrix 3.
I tried the following query:
INSERT INTO Tutorials ([Tutorial], [StartDate])
VALUES ('3d', CONVERT(DATETIME, '07-23-08', 110));
And I got the following error message:
The conversion is not supported. [ Type to convert from (if known) = datetime, Type to convert to (if known) = float ]
Try with
INSERT INTO Tutorials ([Tutorial], [StartDate])
VALUES ('3d', CONVERT(DATETIME, '07-23-08', 10));
If you set the style value to 10, the input format must be mm-dd-yy; if you add 100 to the style value, the expected format has a four-digit year (110 --> mm-dd-yyyy).
For an exhaustive table of the style values look at CAST and CONVERT (SQL Server Compact).
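For instance (the literals are illustrative):
SELECT CONVERT(DATETIME, '07-23-08', 10);    -- style 10: mm-dd-yy
SELECT CONVERT(DATETIME, '07-23-2008', 110); -- style 110: mm-dd-yyyy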
By the way, you could take advantage of an implicit conversion using a date format like yyyymmdd or yyyy-mm-dd:
INSERT INTO Tutorials ([Tutorial], [StartDate])
VALUES ('3d', '20080723');
I am reading a big CSV (>1 GB is big for me!). It contains a timestamp field.
I read it (100 rows to start with) with fread from the excellent data.table package.
ddfr <- fread(input="~/file1.csv",nrows=100, header=T)
Problem 1 (RESOLVED): the timestamp fields (called "ts" and "update"), e.g. "02/12/2014 04:40:00 AM", are read in as strings. I convert the fields back to timestamps with the lubridate function mdy_hms. Splendid.
ddfr$ts <- mdy_hms(ddfr$ts)
Problem 2 (NOT RESOLVED): the timestamp is created with a time zone, as per POSIXlt.
How do I create a timestamp with NO time zone in R? Is it even possible?
Now I use another (new) great package, PivotalR, to write the data frame to PostgreSQL 9.3 using as.db.data.frame. It works like a charm.
x <- as.db.data.frame(ddfr, table.name= "tbl1", conn.id = 1)
Problem 3 (NOT RESOLVED): As the original dataframe timestamp fields had time zones, a table is created with the fields "timestamp with time zone". Ultimately the data needs to be stored in a table with fields configured as "timestamp without time zone".
But in my table in Postgres the data is stored as "2014-02-12 04:40:00.0", where the .0 at the end is the fractional-seconds part. I think I need to have "2014-02-12 04:40:00".
I tried
ALTER TABLE tbl ALTER COLUMN ts type timestamp without time zone;
Then I copied across. Postgres accepts the ALTER COLUMN command, but when I try to copy (using INSERT INTO tbl2 SELECT ...) I get an error:
"column "ts" is of type timestamp without time zone but expression is of type text
Hint: You will need to rewrite or cast the expression."
Clearly the .0 at the end is not liked (but then why does Postgres accept the ALTER COLUMN? Who knows!).
I tried to do what the error suggested using CAST in the INSERT INTO query:
INSERT INTO tbl2 SELECT CAST(ts as timestamp without time zone) FROM tbl1
But I get the same error (including the suggestion to use CAST, aargh!).
The table directly created by PivotalR (based on the dataframe) has this CREATE script:
CREATE TABLE tbl2
(
businessid integer,
caseno text,
ts timestamp with time zone
)
WITH (
OIDS=FALSE
);
ALTER TABLE tbl1
OWNER TO mydb;
The table I'm inserting into has this CREATE script:
CREATE TABLE tbl1
(
id integer NOT NULL DEFAULT nextval('bus_seq'::regclass),
businessid character varying,
caseno character varying,
ts timestamp without time zone,
updated timestamp without time zone,
CONSTRAINT busid_pkey PRIMARY KEY (id)
)
WITH (
OIDS=FALSE
);
ALTER TABLE tbl1
OWNER TO postgres;
My apologies for the convoluted explanation, but potentially a solution could be found at any step in the chain, so I preferred to put all my steps in one question. I am sure there has to be a simpler method...
I think you're confused about copying data between tables.
INSERT INTO ... SELECT without a column list expects the columns of the source and destination to match up. It doesn't magically match columns by name; it just assigns columns from the SELECT to the INSERT from left to right until it runs out of columns, at which point any remaining columns are assumed to be null. So your query:
INSERT INTO tbl2 SELECT ts FROM tbl1;
isn't doing this:
INSERT INTO tbl2(ts) SELECT ts FROM tbl1;
it's actually picking the first column of tbl2, which is businessid, so it's really attempting to do:
INSERT INTO tbl2(businessid) SELECT ts FROM tbl1;
which is clearly nonsense, and no casting will fix that.
(Your error in the original question doesn't match your tables and queries, so the details might differ; you've clearly made a mistake in mangling/obfuscating your tables, or posted a newer version of the tables than the error. The principle remains.)
It's generally a really bad idea to assume your table definitions won't change and column order won't change anyway. So always be explicit about columns. In this case I think your intention might have actually been:
INSERT INTO tbl2(businessid, caseno, ts)
SELECT CAST(businessid AS integer), caseno, ts
FROM tbl1;
Note the cast, because the type of businessid is different between the two tables.
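If you instead need to copy the other way, into the table whose ts column is timestamp without time zone, AT TIME ZONE converts a timestamptz to a plain timestamp (a sketch; 'UTC' is an assumed zone, and the id column is left to its sequence default):
INSERT INTO tbl1(businessid, caseno, ts)
SELECT CAST(businessid AS varchar), caseno, ts AT TIME ZONE 'UTC'
FROM tbl2;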