How to Set Block Insert Size Parameters in Teradata - odbc

I am using the odbc insert command in the statistics package Stata (v14.2) on an Ubuntu 14.04.2 LTS server to insert some data into a Teradata database (v14.10).
This Stata command has a block option, which makes Stata send the data in blocks of 1,000 rows. Unfortunately, the ODBC driver appears to be doing single-row inserts (according to the DBA who monitors the system).
Is it possible to alter the ODBC driver behavior?
If so, how does one specify the block size for the ODBC driver to use in the .odbc.ini file or a connection string?
I've looked through the Teradata ODBC manual and Googled, but I could not find anything useful.
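For context on what "block" means at the ODBC level: block inserts are normally requested through parameter arrays, i.e. the client binds arrays of values and sets SQL_ATTR_PARAMSET_SIZE before executing, and it is then up to the driver whether those rows actually travel to the server as a batch. Below is a minimal generic-ODBC sketch of that mechanism in C (not Teradata-specific; the table name t, the single integer column, and the omitted error handling are illustrative assumptions):

#include <sql.h>
#include <sqlext.h>
#include <stdint.h>

#define ROWS 1000

/* Sketch: column-wise parameter-array insert. A single execute call
   covers all ROWS parameter sets if the driver supports arrays. */
void block_insert(SQLHSTMT hstmt)
{
    SQLINTEGER ids[ROWS];
    SQLLEN     id_ind[ROWS];
    SQLULEN    rows_processed = 0;   /* optional: filled in by the driver */

    for (int i = 0; i < ROWS; i++) { ids[i] = i; id_ind[i] = 0; }

    /* Declare that the bound arrays hold ROWS parameter sets. */
    SQLSetStmtAttr(hstmt, SQL_ATTR_PARAMSET_SIZE,
                   (SQLPOINTER)(uintptr_t)ROWS, 0);
    SQLSetStmtAttr(hstmt, SQL_ATTR_PARAMS_PROCESSED_PTR,
                   &rows_processed, 0);

    SQLBindParameter(hstmt, 1, SQL_PARAM_INPUT, SQL_C_SLONG, SQL_INTEGER,
                     0, 0, ids, 0, id_ind);

    SQLExecDirect(hstmt, (SQLCHAR *)"INSERT INTO t (id) VALUES (?)",
                  SQL_NTS);
}

A driver that does not support parameter arrays natively may emulate them by executing the statement once per parameter set, which would show up to the DBA as single-row inserts even though the client asked for blocks - so it is worth confirming with Teradata whether, and under which settings, its driver batches parameter arrays.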

Related

Tableau doesn't display tables after ODBC connection in ClickHouse?

I have Tableau 2019.4 installed, and even after a successful ODBC connection no tables are displayed. I have tried both the ANSI and Unicode drivers, but no tables are shown even though the database is connected.

Truncation of database name and columns while connecting to SQL database from R

I am on macOS Catalina (version 10.15.1), running R 3.5.0. I am running SQL Server locally in Docker. To connect to the server, I am using odbc:
library(DBI)    # dbConnect() comes from DBI
library(odbc)   # odbc() supplies the ODBC driver backend

con <- dbConnect(odbc(),
                 Driver   = "Simba SQL Server ODBC Driver",
                 Server   = "localhost",
                 UID      = "SA",
                 PWD      = "XXXXXXXX",
                 database = dbname)  # dbname holds the target database name
I am able to connect to the server; however, the names of all the databases are getting truncated to just the first letter.
Subsequently, all the character columns are also showing only the first letter.
I had a look at a similar question, but just can't figure out why this is happening. For starters, the database names themselves, as shown in the RStudio Connections pane, are getting truncated.
I am able to connect to the databases using Azure Data Studio and see all the columns correctly.
Update: I had a look at another question, and it turned out to be exactly what I was facing. Following the steps mentioned there resolved the issue.
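In case others hit the same symptom: truncation to the first character is the classic sign of a wide-character size mismatch between the driver manager and the driver. unixODBC (which R's odbc package typically uses) builds SQLWCHAR as 2 bytes, while drivers built for iODBC assume 4 bytes, so every wide string is read back as a single character. Simba's macOS/Linux drivers let you declare which encoding the driver manager uses in the driver's own .ini file; a sketch, where the file path and setting are assumptions based on a default Simba install:

; simba.sqlserverodbc.ini (shipped next to the driver library; path varies)
[Driver]
; Assumed setting: tell the driver the manager uses 2-byte wide characters,
; matching unixODBC, instead of iODBC's 4-byte default.
DriverManagerEncoding=UTF-16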

Connecting to an Azure SQL Server Data Warehouse from R on a Mac - See random names instead of tables

I'm trying to connect to an Azure SQL Server (12.00.1900) from R on a Mac, using Microsoft's SQL Server ODBC driver (version 17) under unixODBC.
I get a connection, but instead of seeing the 12 or so tables that live in the database, dbListTables returns 442 tables, all with nonsensical names, beginning with 'Csoe', 'Ote', and ending in 'xlshm_idad'. Instead of seeing the single schema that lives in the database, I see cin_1mro__e, IFRAINSHM, and s, none of which have any tables in them.
Note that when I use an ordinary SQL visualization app that doesn't use the MS drivers, I'm able to see the tables and their content properly.
In addition, the RSQLServer package gets a working connection and sees the tables correctly, but isn't compatible with dplyr semantics.
Can anyone help or advise? I've looked for third party SQL Server unixodbc drivers for Mac, and I can't find any.
Until I see more info from the OP, I'll leave as my answer the general recommendation to use R's odbc package. Assuming the correct drivers are installed, the connection is configured correctly in odbc.ini, and trusted_connection=yes is set there, then connecting from R is as simple as:
library(odbc)
dbConn <- dbConnect(odbc(), dsn = "myDSN")
If trusted connections are not enabled, you just need to pass uid and pwd arguments to dbConnect() as well.
Also, OP, it may be that you did not install FreeTDS, so try the following (replace with the equivalent for your package manager):
brew install freetds --with-unixodbc
This gives you the libtdsodbc.so driver. Make sure your DSN points to it.
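A sketch of what that DSN entry might look like in ~/.odbc.ini (server, database, and driver path are placeholders; on an Intel-Homebrew Mac the FreeTDS library usually lands under /usr/local/lib):

[myDSN]
; Point the DSN straight at the FreeTDS ODBC driver library
Driver      = /usr/local/lib/libtdsodbc.so
Server      = yourserver.database.windows.net
Port        = 1433
Database    = yourdb
; Azure SQL needs a reasonably recent TDS protocol version
TDS_Version = 7.4

With that in place, dbConnect(odbc(), dsn = "myDSN", uid = "...", pwd = "...") should connect through FreeTDS rather than the Microsoft driver.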

Connecting cassandra to Tableau Software

I want to connect Tableau to my Cassandra database. Note that I'm running Tableau on Windows 7 and Cassandra on Ubuntu (in a virtual machine).
For this I've installed the Cassandra ODBC driver (and also the Simba Cassandra ODBC driver, with the same problem). The connection succeeds and I can find my keyspace, but not my tables!
No tables appear in the "cim" keyspace.
Note that my keyspace "cim" contains 3 tables that I can query without any problem from Cassandra itself. Is there something I should do before creating the ODBC data source?
Thank you for your help.
The ODBC driver as it stands currently uses Thrift, so it won't be able to communicate directly with CQL3 to display the table names. DESCRIBE commands also won't work. However, you should still be able to select data from your tables. Updates to the ODBC driver should provide CQL3 support at some point in the new year.
Update: the Simba ODBC driver for Cassandra now supports CQL3, which solves this problem:
http://www.simba.com/connectors/apache-cassandra-odbc

Is it possible to programmatically get the server details from an ODBC DSN?

I'm working on some issues with psqlODBC's XA/MSDTC transaction handling, and I find myself needing to obtain the server connection details (hostname, port, etc.) from any user-supplied DSN programmatically, without having psqlODBC invoked via the Driver Manager to do so.
Just parsed key/value string pairs will do; the problem is resolving user/system/file DSNs to get the underlying connection info.
The underlying issue I'm trying to solve is that a 32-bit application using MSDTC on a 64-bit system will supply a DSN that works for the 32-bit PostgreSQL driver. The 64-bit PostgreSQL drivers have different names - PostgreSQL ANSI vs PostgreSQL ANSI(x64), and similarly for the Unicode drivers. So a DSN that works for a 32-bit app won't work for 64-bit apps ... like msdtc.exe. So I need a way to get the connection parameters the 32-bit app used and feed them into the 64-bit ODBC driver (or directly to libpq).
In the case of a DSN-less connection string like:
DRIVER={PostgreSQL ANSI};SERVER=127.0.0.1;PORT=5432;DATABASE=SOMEDB;UID=Administrator;PWD=;CA=disable
I could just parse it to get the relevant info, but that won't work for file, system, or user DSNs where the XA transaction co-ordinator used by MSDTC only gets whatever the original user app passed to the ODBC layer - like:
DSN=SomeUserOrSystemDSNName
or
FILEDSN=C:\SomeFileDSN.dsn
and wrapped inside that DSN is a DRIVER={PostgreSQL ANSI} entry.
I've taken a look at the ODBC API docs and I don't see anything that seems to expose a way to load any DSN string, resolve file/system/user DSNs and get a parameter hash/map. OTOH, there's a lot of documentation out there, and some of the sections and function names aren't what I'd call predictable.
So - please tell me I'm missing something obvious, and there's a way to just:
GetDSNProperty("FILEDSN=C:\SomeFileDSN.dsn", "SERVER");
... rather than writing hacky code to manually look up each DSN type.
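You may not be missing much: the ODBC installer API (as opposed to the driver API) is the closest thing to that GetDSNProperty. SQLReadFileDSN reads a keyword from a file DSN, and SQLGetPrivateProfileString resolves user/system DSNs (stored in the registry on Windows, odbc.ini under unixODBC). A C sketch using the names from the question (error handling kept minimal; link against odbccp32 on Windows or -lodbcinst with unixODBC):

#include <stdio.h>
#ifdef _WIN32
#include <windows.h>
#endif
#include <odbcinst.h>

int main(void)
{
    char server[256];

    /* File DSN: keywords live in the [ODBC] section of the .dsn file. */
    if (SQLReadFileDSN("C:\\SomeFileDSN.dsn", "ODBC", "SERVER",
                       server, (WORD)sizeof(server), NULL))
        printf("FILEDSN server: %s\n", server);

    /* User/system DSN: the DSN name is a section of ODBC.INI. */
    if (SQLGetPrivateProfileString("SomeUserOrSystemDSNName", "SERVER", "",
                                   server, sizeof(server), "ODBC.INI") > 0)
        printf("DSN server: %s\n", server);

    return 0;
}

One caveat for the 32-/64-bit scenario described above: on 64-bit Windows the installer API reads the registry view that matches the calling process's bitness, so a 64-bit process will not see DSNs created under the 32-bit (WOW6432Node) hive - the lookup may need to happen in a 32-bit helper process.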
