Are Sybase datetime fields supported in PolyBase external tables? - odbc

I'm working with PolyBase on SQL Server 2022. I have some tables with type "datetime NULL" on Sybase ASE 16.
When declaring an external table e.g.
CREATE EXTERNAL TABLE SYBASESCHEMA.SomeTable
(
[SomeNiceTime] DATETIME NULL
)
WITH (LOCATION = N'SomeNiceDatabase.dbo.SomeTable', DATA_SOURCE = SYBASE_DS);
I receive an error message like the following one:
105083;The following columns in the user defined schema are incompatible with the external table schema for table 'SomeTable': 'SomeNiceTime' failed to be reflected with the error: 'The detected ODBC SQL_TYPE 11 is not supported for external generic tables.'
Does anyone know how this could be resolved?
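ODBC SQL type 11 is the legacy ODBC 2.x code for SQL_TIMESTAMP, which PolyBase generic external tables reject. One possible workaround (a sketch only, untested against ASE — the view name and the exact CONVERT style are assumptions) is to expose the datetime as a string on the Sybase side and convert it back on the SQL Server side:

```sql
-- On Sybase ASE: a view that serves the datetime as a string
-- (view name is hypothetical; pick a CONVERT style your ASE version supports)
CREATE VIEW dbo.SomeTable_ForPolyBase AS
SELECT CONVERT(VARCHAR(30), SomeNiceTime) AS SomeNiceTime
FROM dbo.SomeTable;

-- On SQL Server: declare the column as a string in the external table
CREATE EXTERNAL TABLE SYBASESCHEMA.SomeTable_ForPolyBase
(
    [SomeNiceTime] NVARCHAR(30) NULL
)
WITH (LOCATION = N'SomeNiceDatabase.dbo.SomeTable_ForPolyBase', DATA_SOURCE = SYBASE_DS);

-- Convert back to a datetime type on the SQL Server side
SELECT TRY_CONVERT(DATETIME2(3), SomeNiceTime) AS SomeNiceTime
FROM SYBASESCHEMA.SomeTable_ForPolyBase;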

Related

sqlite ODBC driver and "attach" statement

I want to use the sqlite ODBC driver from http://www.ch-werner.de/sqliteodbc/ and start with an ATTACH statement, as I need joined data from two databases. However, the driver returns no data when given the following SQL:
attach 'my.db' as mydb; select 1
It does, however, correctly complain with "only one SQL statement allowed" when the first statement is indeed a SELECT:
select 2;attach 'my.db' as mydb; select 1
Looking at the source, a checkddl() function analyzes whether the provided request contains a DDL (Data Definition Language) statement. Before digging into the complete code, the question is:
did anyone manage to issue a SELECT after an ATTACH with this driver?
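Since the driver only accepts one statement per execution, one approach (a sketch, not verified against this particular driver) is to issue the ATTACH as its own statement on the same connection, then run the SELECT as a separate execution — the attached database stays available for the lifetime of the connection:

```sql
-- Execution 1: ATTACH on its own (same ODBC connection)
ATTACH 'my.db' AS mydb;

-- Execution 2: a separate statement; table names here are hypothetical
SELECT m.col1, o.col2
FROM mydb.some_table AS m
JOIN main.other_table AS o ON o.id = m.id;
```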

SQL Server Polybase | Cosmos Document DB Date conversion issue

I'm new to PolyBase. I have linked my SQL Server 2019 instance to a third party's Azure Cosmos DB and I am able to query data out of my collection. However, I get an error when I try to query date fields. In the documents the dates are defined as:
"created" : {
"$date" : 1579540834768
},
In my external table I have the column defined as
[created] DATE,
I have tried to create the column as int and nvarchar(128), but the schema detection rejects it each time. (I have also tried to create a field created_date, but the schema detection likewise disagrees that this is correct.)
When I try a query that returns any of the date fields I get this error:
Msg 105082, Level 16, State 1, Line 8
105082;Generic ODBC error: [Microsoft][Support] (40460) Fractional data truncated while performing conversion. .
OLE DB provider "MSOLEDBSQL" for linked server "(null)" returned message "Unspecified error".
Msg 7421, Level 16, State 2, Line 8
Cannot fetch the rowset from OLE DB provider "MSOLEDBSQL" for linked server "(null)". .
This happens even if I try to exclude null values in my query, and even when filtering to specific records where the date is populated (validated using the Azure portal interface).
Is there something I should be doing to handle the integer date from the JSON records, or is there another type I can use to get my external table to work?
Found a solution. SQL Server recommends the wrong type for MongoDB dates in the schema detection. Using DATETIME2 resolved the issue. I found this on a PolyBase type-mapping page on MSDN.
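Based on that fix, the external table column should be declared as DATETIME2 rather than DATE. A minimal sketch, assuming hypothetical table and data source names:

```sql
CREATE EXTERNAL TABLE dbo.MyCollection
(
    [_id]     NVARCHAR(24),
    [created] DATETIME2(3)  -- MongoDB $date (epoch milliseconds) maps to DATETIME2, not DATE
)
WITH (LOCATION = 'MyDatabase.MyCollection', DATA_SOURCE = COSMOS_DS);
```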

MariaDB: SELECT INSERT from ODBC CONNECT engine from SQL Server keeps causing "error code 1406 data too long"

Objective: Using MariaDB I want to read some data from MS SQL Server (via ODBC Connect engine) and SELECT INSERT it into a local table.
Issue: I keep getting "error code 1406 data too long" even though the source and destination varchar fields have the very same size (see further details).
Details:
The query which I'm trying to execute is in the form:
INSERT INTO DEST_TABLE(NUMERO_DOCUMENTO)
SELECT SUBSTR(TRIM(NUMERO_DOCUMENTO),0,5)
FROM CONNECT_SRC_TABLE
The above is the very minimal subset of fields which causes the problem.
The source CONNECT table is actually a view inside SQL Server. The destination table has been defined to be identical to the ODBC CONNECT table (same field names, same NULL constraints, same field types and sizes).
There's no issue on a couple of other VARCHAR fields
The issue is happening with a field NUMERO_DOCUMENTO VARCHAR(14) DEFAULT NULL, where the max length in the input table is 14
The same issue also happens with two other fields on the same table
All in all, it seems to be an issue with the source data rather than the destination table.
Attemped workarounds:
I tried to force silent truncation but, reasonably, this made no difference: Error Code: 1406. Data too long for column - MySQL
I tried enlarging the destination field, with no appreciable effect: NUMERO_DOCUMENTO VARCHAR(100) DEFAULT NULL
I tried to TRIM the source field (hidden spaces?) and to limit its size at the source, to no avail: INSERT INTO DEST_TABLE(NUMERO_DOCUMENTO) SELECT SUBSTR(TRIM(NUMERO_DOCUMENTO),0,5) FROM CONNECT_SRC_TABLE — but the very same error is always returned
Workaround:
I tried performing the same thing using a FOR x IN (src_query) DO INSERT ... END FOR loop, and this solution seems to work. This means that the problem is not in the data itself but in how the engine performs the INSERT ... SELECT query.
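For reference, the row-by-row workaround can be sketched like this (MariaDB 10.3+ compound statement; the column and table names come from the question, but the exact loop body is an assumption):

```sql
DELIMITER //
BEGIN NOT ATOMIC
    -- Iterate over the CONNECT source and insert one row at a time;
    -- unlike INSERT ... SELECT, this path did not raise error 1406 here
    FOR rec IN (SELECT TRIM(NUMERO_DOCUMENTO) AS nd FROM CONNECT_SRC_TABLE) DO
        INSERT INTO DEST_TABLE (NUMERO_DOCUMENTO) VALUES (rec.nd);
    END FOR;
END //
DELIMITER ;
```

Row-by-row inserts are slower than a set-based INSERT ... SELECT, so this is a diagnostic workaround rather than a long-term fix.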

OLE DB provider for linked server returned data that does not match expected data length

I get an error querying a remote PostgreSQL server from my SQL Server 2017 Standard via a linked server.
This is the query:
SELECT CAST(test AS VARCHAR(MAX)) FROM OpenQuery(xxxx,
'SELECT corpo::TEXT as test From public.notification')
and this is the error message:
Msg 7347, Level 16, State 1, Line 57
OLE DB provider 'MSDASQL' for linked server 'xxx' returned data that does not match expected data length for
column '[MSDASQL].test'. The (maximum) expected data length is 1024, while the returned data length is 7774.
Even without conversions, the error persists.
For the ODBC and linked server setup I followed this handy guide.
In my case, I was reading the data through a view. Apparently, the data size of one column had been changed in the underlying table, but the view still reported the original, smaller column size to the linked server. The solution was to open the view in SSMS and save it again.
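If re-opening the view in SSMS is inconvenient, the same metadata refresh can be done in T-SQL with sp_refreshview (the view name here is hypothetical):

```sql
-- Refresh the view's cached metadata after the underlying column changed
EXEC sp_refreshview N'dbo.MyNotificationView';
```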
Can you try this?
SELECT *
FROM OPENQUERY(xxxx, '
    SELECT TRIM(corpo) AS test
    FROM public.notification;
') AS oq
I prefer using OPENQUERY since it will send the exact query to the linked server for it to execute.
MySQL currently has a problem with casting to the VARCHAR data type, so I use the TRIM() function to work around it.

Informatica giving error while loading Teradata Number column

In a Teradata (13.5) target table we have a column with data type Number; when we try to load this table using an Informatica flow, it gives the following error:
[Severity Timestamp Node Thread Message Code Message
ERROR 4/1/2015 3:08:52 PM node01_<host_name> WRITER_1_*_1 WRT_8229 Database errors occurred:
FnName: Execute -- [Teradata][ODBC Teradata Driver] Illegal data conversion]
We have tried everything, including:
1. Changing the Informatica target datatype to decimal, bigint, integer, and varchar
2. Importing the target table into Informatica using the Informatica Target Designer, but this Number field is imported as Varchar(0)
Please suggest how to solve this, as changing the target data type is not an option for us.
You can import the target table into Informatica using the Informatica Target Designer, and then edit the datatype for the column to anything you want. It can be altered without issues.
