Is it possible to programmatically get the server details from an ODBC DSN? - odbc

I'm working on some issues with psqlODBC's XA/MSDTC transaction handling, and I find myself needing to obtain the server connection details (hostname, port, etc.) from any user-supplied DSN programmatically, without psqlODBC having to be invoked via the Driver Manager to do so.
Just parsed key/value string pairs will do; the problem is resolving user/system/file DSNs to get the underlying connection info.
The underlying issue I'm trying to solve is that a 32-bit application using MSDTC on a 64-bit system will supply a DSN that works for the 32-bit PostgreSQL driver. The 64-bit PostgreSQL drivers have different names - PostgreSQL ANSI vs PostgreSQL ANSI(x64), and similarly for the Unicode drivers. So a DSN that works for a 32-bit app won't work for 64-bit apps ... like msdtc.exe. So I need a way to get the connection parameters the 32-bit app used and feed them into the 64-bit ODBC driver (or directly to libpq).
In the case of a DSN-less connection string like:
DRIVER={{PostgreSQL ANSI}};SERVER=127.0.0.1;PORT=5432;DATABASE=SOMEDB;UID=Administrator;PWD=;CA=disable
I could just parse it to get the relevant info, but that won't work for file, system, or user DSNs where the XA transaction co-ordinator used by MSDTC only gets whatever the original user app passed to the ODBC layer - like:
DSN=SomeUserOrSystemDSNName
or
FILEDSN=C:\SomeFileDSN.dsn
and wrapped in that DSN is a DRIVER={{PostgreSQL ANSI}} entry.
I've taken a look at the ODBC API docs and I don't see anything that seems to expose a way to load any DSN string, resolve file/system/user DSNs and get a parameter hash/map. OTOH, there's a lot of documentation out there, and some of the sections and function names aren't what I'd call predictable.
So - please tell me I'm missing something obvious, and there's a way to just:
GetDSNProperty("FILEDSN=C:\SomeFileDSN.dsn", "SERVER");
.. rather than writing hacky code to manually look up each DSN type.
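For reference, the nearest thing I've turned up so far is the installer API in odbcinst.h: SQLGetPrivateProfileString for user/system DSN entries and SQLReadFileDSN for file DSNs. A rough C sketch of the kind of helper I mean (the GetDSNProperty wrapper and the fixed buffer size are purely illustrative, not an existing API):
/* Sketch only: resolve a single attribute from a user/system or file DSN
 * via the ODBC installer API (odbcinst.h). Link against odbccp32.lib. */
#include <windows.h>
#include <sql.h>
#include <odbcinst.h>
#include <stdio.h>

static BOOL GetDSNProperty(const char *dsn, BOOL isFileDsn,
                           const char *key, char *value, WORD valueLen)
{
    if (isFileDsn) {
        WORD written = 0;
        /* Reads the [ODBC] section of a .dsn file (SERVER, PORT, DATABASE, ...) */
        return SQLReadFileDSN(dsn, "ODBC", key, value, valueLen, &written);
    }
    /* User/system DSNs: the installer library maps "ODBC.INI" onto the
     * registry (...\Software\ODBC\ODBC.INI\<dsn>). Beware that a 64-bit
     * process sees the 64-bit view; 32-bit system DSNs live under Wow6432Node. */
    return SQLGetPrivateProfileString(dsn, key, "", value, valueLen,
                                      "ODBC.INI") > 0;
}

int main(void)
{
    char server[256] = "";
    if (GetDSNProperty("SomeUserOrSystemDSNName", FALSE, "SERVER",
                       server, (WORD)sizeof(server)))
        printf("SERVER=%s\n", server);
    if (GetDSNProperty("C:\\SomeFileDSN.dsn", TRUE, "SERVER",
                       server, (WORD)sizeof(server)))
        printf("SERVER=%s\n", server);
    return 0;
}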

Related

Pentaho Driver Class could not be found, make sure the Generic Database Driver (Jar File) is installed

Pentaho Driver Class could not be found, make sure the Generic Database Driver (Jar File) is installed
Where can I get a generic database driver? Nothing comes up when I do a Google search.
This wasn't very helpful. Anyone have any advice?
There is no such thing as a generic JDBC driver.
The difference between a generic JDBC connection and the others is the following:
For specific connections the user sets the hostname, port, username & password, and possibly some other optional fields (such as instance for MSSQL); in a generic connection the user must specify the full URL and the driver class.
Specific connections sometimes have some extra intelligence embedded (e.g. getName vs getAlias in MySQL); generic connections don't do that type of post-processing - whatever comes from the JDBC driver is added to the stream.
So, to set up a generic connection (for the sake of argument, let's say Postgres), you need the full Postgres JDBC URL, the driver class name and credentials. If you need to switch to a different type of database (say Oracle) you can use variables for the driver class name and URL. But the driver is still a specific one.
When should you use a generic connection?
If you need to switch target DB types at runtime (though beware that SQL syntax may vary)
If the database you want to connect to isn't in the list of supported databases
If you need a different driver class than the one set up in the specific connection (e.g., MySQL 5.x vs MySQL 8)
If your connection requires some tweaks to the url from what the default url provided by the jdbc driver says
But if you want to connect to MySQL 8 using a generic connection you still need a MySQL 8 JDBC driver.

iSeries connection error with IBM DB2 Connector Core

When we migrated from .NET Framework to .NET Core, we had to change the format of our iSeries connection string from using Server= to using Data source= and to include the port number, but we also had to include Database=, because without it the connection string could not be assigned to a connection due to an "Invalid argument" exception. With absolutely anything for a database, we always get a uniform error message:
ERROR [08004] [IBM] SQL30061N The database alias or database name "QSYS " was not found at the remote node.
(Notice extra spaces in the DB name)
No matter what we supply for a database, the error is always the same. We tried our actual library name similar to app0123 that is reported by DSPLIB or QSYS etc.
We also tried databases reported by DSPRDBDIRE named similarly to IHST0123 but in this case the error was different:
ERROR [42968] [IBM] SQL1598N An attempt to connect to the database server failed because of a licensing problem.
We know that there is no licensing problem with the server because it is our production environment that many applications in Java and C# connect to.
Our usual practice is that if an application App1 connects, it uses app1 user name and app01, app02 etc schemas, app01 being the default one. Therefore, we only ever had the iSeries host name like IHST01 etc in the connection string, and we added user ID and password through the connection string builder.
We are having no issues connecting through the .NET Core connector to DB2 LUW, since the database there is apparent and unambiguous. Since we never had to specify an iSeries database under .NET Framework, it is not clear what it has to be. Does anybody know?
The library (aka schema) name is not the database name.
The *LOCAL entry in DSPRDBDIRE should be your DB name.
A better tool is the IBM Access Client Solutions (ACS) "Database --> Schemas" tool, which lists the databases on the system.
On the connected server (ut29p63.rch.stglabs.ibm.com), there are (at least) two databases:
ut29p63
Dbtest
I'm surprised you don't think the DB name was needed for .NET Framework or Java as I've always needed them. If you've only got one database on your IBM i, as is common for smaller boxes, it's possible the DB name matches the system name.
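If you want to sanity-check the RDB name outside of .NET, here is a minimal ODBC sketch in C. The driver name and CLI keywords ({IBM DB2 ODBC DRIVER}, HOSTNAME, PORT, PROTOCOL, DATABASE) are assumptions for a typical Db2 client install, and IHST01/446/MYRDB/app1 are placeholders for your own values:
/* Sketch only: verify that DATABASE= set to the *LOCAL RDB name
 * (from DSPRDBDIRE) connects, whereas a library name does not. */
#include <windows.h>
#include <sql.h>
#include <sqlext.h>
#include <stdio.h>

int main(void)
{
    SQLHENV env; SQLHDBC dbc; SQLCHAR out[1024]; SQLSMALLINT outLen;
    SQLAllocHandle(SQL_HANDLE_ENV, SQL_NULL_HANDLE, &env);
    SQLSetEnvAttr(env, SQL_ATTR_ODBC_VERSION, (SQLPOINTER)SQL_OV_ODBC3, 0);
    SQLAllocHandle(SQL_HANDLE_DBC, env, &dbc);
    SQLCHAR conn[] = "DRIVER={IBM DB2 ODBC DRIVER};"
                     "HOSTNAME=IHST01;PORT=446;PROTOCOL=TCPIP;"
                     "DATABASE=MYRDB;UID=app1;PWD=secret;";
    SQLRETURN rc = SQLDriverConnect(dbc, NULL, conn, SQL_NTS, out,
                                    (SQLSMALLINT)sizeof(out), &outLen,
                                    SQL_DRIVER_NOPROMPT);
    printf("%s\n", SQL_SUCCEEDED(rc) ? "connected" : "failed");
    if (SQL_SUCCEEDED(rc)) SQLDisconnect(dbc);
    SQLFreeHandle(SQL_HANDLE_DBC, dbc);
    SQLFreeHandle(SQL_HANDLE_ENV, env);
    return 0;
}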
Judging from the license error message, you are getting connected.
However, the .NET Core NuGet packages use the IBM Db2 Connect driver. This driver includes connectivity to Db2 for LUW and, with an appropriate (optional, paid) license, also allows connecting to IBM Db2 for i or IBM Db2 for z/OS.
In other words, the Db2 Connect driver can always connect to Db2 LUW, but you'll have to pay for an enhanced license to connect to IBM i or IBM z/OS. See IBM Db2 Connect License Types. You'll need an Enterprise license or an Unlimited Edition for System i.
If you were using Db2 Connect driver for your .NET Framework, the same license will work for .NET Core (assuming the Db2 Connect versions match).
However, if you were using the free ODBC/OLEDB/ADO driver for .NET Framework, then you'll need the Db2 Connect license for .NET Core.

Connecting to an Azure SQL Server Data Warehouse from R on a Mac - See random names instead of tables

I'm trying to connect to an Azure SQL Server (12.00.1900) from R on a Mac, using Microsoft's unixodbc SQL Server drivers (17).
I get a connection, but instead of seeing the 12 or so tables that live in the database, dbListTables returns 442 tables, all with nonsensical names, beginning with 'Csoe', 'Ote', and ending in 'xlshm_idad'. Instead of seeing the single schema that lives in the database, I see cin_1mro__e, IFRAINSHM, and s, none of which have any tables in them.
Note that when I use an ordinary SQL visualization app, that doesn't use the MS drivers, I'm able to see the tables and their content properly.
In addition, the RSQLServer package gets a working connection and sees the tables correctly, but isn't compatible with dplyr semantics.
Can anyone help or advise? I've looked for third party SQL Server unixodbc drivers for Mac, and I can't find any.
Until I see more info from the OP, I'll leave as my answer the general recommendation to use R's odbc package. Assuming the correct drivers are installed, the connection is configured correctly in odbc.ini, and trusted_connection=yes is used in the same file, then connecting from R is as simple as:
library(odbc)
dbConn <- dbConnect(odbc(), dsn = "myDSN")
If trusted connection is not on, then you just need to pass uid and pwd arguments.
Also, it may be the case, OP, that you did not install FreeTDS, so try (replace with the equivalent for the package manager you're using):
brew install freetds --with-unixodbc
This gives you the libtdsodbc.so driver. Make sure the DSN points to this.

Trouble connecting to Oracle database using RODBC

I recently upgraded from Windows 7 to Windows 10 and had to reset some remote database connections. I had previously been connecting quite successfully to an Oracle database using the Oracle 11g client and RODBC.
library(RODBC)
channel<-
odbcConnect(dsn="myoracleDB",
uid='myusername',
pw='mypassword',
believeNRows=FALSE)
result<- sqlQuery(channel,"select * from schema_name.table_name")
close(channel)
Since the Windows 10 upgrade, the above connection protocol no longer works. Specifically, I get the following error:
channel<-
odbcConnect(dsn="myoracleDB",
uid='myusername',
pw='mypassword',
believeNRows=FALSE)
Warning messages:
1: In RODBC::odbcDriverConnect("DSN=myoracleDB;UID=myusername;
PWD=mypassword",:
[RODBC] ERROR: state HY000, code 12170, message [Oracle][ODBC]
[Ora]ORA-12170: TNS:Connect timeout occurred
2: In RODBC::odbcDriverConnect("DSN=myoracleDB;UID=myusername;
PWD=mypassword",:ODBC connection failed
Two additional observations are relevant here:
I use the Windows command line to execute tnsping myoracleDB which returns a successful connection to the database
I can also use Oracle's SQL Developer Application to successfully connect to and query from the database.
So I feel confident that the Oracle Client and the ODBC Data Sources are set up correctly.
Interestingly, I AM able to connect to my database using the RODBC library if I use the following code:
mycon = odbcDriverConnect("Driver={Oracle in OraClient11g_home1};
Dbq=myoracleDB; Uid=myusername; Pwd=mypassword;",
believeNRows=FALSE)
My question for the community is:
This new connection approach works (which I'm happy about). However, since I don't really understand why it works while the approach that worked before no longer does, I fear I may be ignoring some underlying problem that could really hurt me down the road.
I have found the following SO threads to be helpful, though neither really addresses my issue exactly:
Failure to connect to odbc database in R
Connect to ORACLE via R, using the info in sql developer
UPDATE:
I have accessed the Windows ODBC 64 bit menu and verified that I do have a DSN called "myoracleDB" which is assigned to the "Oracle in OraClient11g_home1" driver. I have tested this connection and find that it works fine. I have also used the RODBC line:
odbcDataSources()
in RStudio and found that the data source "myoracleDB" is recognized. However, when I try to execute:
channel<-
odbcConnect(dsn="myoracleDB",
uid='myusername',
pw='mypassword',
believeNRows=FALSE)
I still get the error:
"TNS: Connect timeout occurred ODBC connection failed"
If you check out the docs, DSN=myoracleDB tells RODBC to connect to the Windows DSN "myoracleDB", while Dbq=myoracleDB tells RODBC to connect to the TNSNAMES entry "myoracleDB". They're two different ways of resolving database names. tnsping and SQL Developer also both use TNSNAMES to resolve databases.
So I think your DSN probably got deleted when you reset things. You can test it by going to Control Panel > Administrative Tools > Data Sources (ODBC). If your database is there, you should be able to Configure it and click Test Connection to make sure it's working. Otherwise you can add it there, and your original configuration should work again.

Connect to a named SQL Server 2016 instance from R

I'm trying to connect to a SQL Server 2016 database in RStudio. I'm using RStudio on my laptop. I could remote in to the server and install RStudio there if it were absolutely necessary, but working locally has massive advantages so I would really prefer that if it were possible. Connection with the server goes through a VPN (FortiClient) that I have running on my laptop.
On this server, there are two SQL Server instances. One is a SQL Server 2012 edition, which is the default instance and hence not named - it used to be the only instance on this server. The other one is the 2016 edition. This instance was set up more recently in order to use the R integration capabilities new to SQL Server 2016. Because the server already had a default instance, this instance had to be named and it hence is called DEVR.
When I access the instances in SSMS and click 'Properties', the name of the default instance is DWH-ACC and the 2016 instance is called DWH-ACC\DEVR.
This is the code I'm running in RStudio to test my connection:
server <- "[IP-ADDRESS]\\DWH-ACC\\DEVR"
databaseName <- "Database"
user <- "user"
pwd <- "password"
sqlShareDir <- "C:\\Dir"
sqlWait <- TRUE
sqlConsoleOutput <- FALSE
sampleDataQuery <- "SELECT TOP 10 * FROM [dbo].[Table]"
cc <- RxInSqlServer(server = server, databaseName = databaseName, user = user, password = pwd, shareDir = sqlShareDir, wait = sqlWait, consoleOutput = sqlConsoleOutput)
rxSetComputeContext(cc)
inDataSource <- RxSqlServerData(sqlQuery = sampleDataQuery, server=server, databaseName=databaseName, user=user, password=pwd, stringsAsFactors=TRUE, rowsPerRead=500)
rxGetVarInfo(data = inDataSource)
I've tried several options for the server specification, among which [IP-ADDRESS]\\DEVR and [IP-ADDRESS]/DEVR, neither of which works either. This is the error I get when I run the code:
[Microsoft][ODBC SQL Server Driver][DBNETLIB]The SQL-Server does not exist or permission has been denied.
Could not open data source.
ODBC Error in SQLDisconnect
(Message translated from Dutch, by the way - this may not be the exact error text in the English version of the software)
When I try simply the IP address as my server connection string, I get a different error that seems to indicate it is able to find the instance (the 2012 one, i.e. the wrong one) but not able to process the query.
[Microsoft][ODBC Driver Manager] Function sequence error
(Also translated from Dutch.) Anyway, this error is unrelated and I don't need it solved or explained; it simply goes to show that R does seem to be able to connect to the default instance but not to the newer, named one.
Enable Implied Authentication for Launchpad Accounts
Specifically navigate to the User Account from the Control Panel and you'll see the SQLR UserGroup with 20 accounts.
Permission these on the Server and DB Table with write access.
That should see you right. Good luck
You could create an ODBC connection in your local instance. A tutorial on creating ODBC connections can be found here. Background about the different types of SQL Server ODBC connections can be found here.
The ODBC connection should be able to distinguish between the different SQL Server instances.
For me the main advantage in using ODBC connections is that I don't have to store database passwords inside/near my R scripts.
In R/Rstudio you can connect to the SQL Server instance via the ODBC channel. A tutorial on ODBC channels and RevoScaleR: link.
Other packages in R also provide possibilities to connect to ODBC connections, for instance: RODBC, dplyr.
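For reference, at the ODBC connection-string level a named instance is addressed as host\instance (or host,port), not host\servername\instance. A minimal sketch in C, where the driver name and every value are placeholders; a DSN created in the ODBC Data Sources applet would wrap the same SERVER= attribute:
/* Sketch only: connect to the named instance DEVR on a given host. */
#include <windows.h>
#include <sql.h>
#include <sqlext.h>
#include <stdio.h>

int main(void)
{
    SQLHENV env; SQLHDBC dbc; SQLCHAR out[1024]; SQLSMALLINT outLen;
    SQLAllocHandle(SQL_HANDLE_ENV, SQL_NULL_HANDLE, &env);
    SQLSetEnvAttr(env, SQL_ATTR_ODBC_VERSION, (SQLPOINTER)SQL_OV_ODBC3, 0);
    SQLAllocHandle(SQL_HANDLE_DBC, env, &dbc);
    SQLCHAR conn[] = "DRIVER={SQL Server};SERVER=10.0.0.5\\DEVR;"
                     "DATABASE=Database;UID=user;PWD=password;";
    SQLRETURN rc = SQLDriverConnect(dbc, NULL, conn, SQL_NTS, out,
                                    (SQLSMALLINT)sizeof(out), &outLen,
                                    SQL_DRIVER_NOPROMPT);
    printf("%s\n", SQL_SUCCEEDED(rc) ? "connected" : "failed");
    if (SQL_SUCCEEDED(rc)) SQLDisconnect(dbc);
    SQLFreeHandle(SQL_HANDLE_DBC, dbc);
    SQLFreeHandle(SQL_HANDLE_ENV, env);
    return 0;
}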
