pyodbc - [unixODBC][Driver Manager]Data source name not found, and no default driver specified

I am setting up a system to connect to an AWS Redshift database from Python. I suspect there's something wrong in the Python script, because I can connect via isql. I've installed all the relevant packages, and the isql connection works as follows:
$ isql rndredshift readonly ***** -v
+---------------------------------------+
| Connected!                            |
|                                       |
| sql-statement                         |
| help [tablename]                      |
| quit                                  |
|                                       |
+---------------------------------------+
SQL> quit
However, my python script is failing to connect. Here's the script:
import pyodbc
import pandas.io.sql as psql  # needed for psql.read_sql below; missing from the original snippet
import sys

def main():
    redshift_conn_str = assemble_connection_string(
        Driver='{PostgreSQL}',
        Server='10.191.4.97',
        ServerName='rndredshift',
        Port='5439',
        Database='prod',
        Uid='readonly',
        Pwd='*******'
    )
    print("===========")
    print(redshift_conn_str)
    print("===========")
    new_conn2 = pyodbc.connect(redshift_conn_str)
    print(psql.read_sql('select top 10 * from rawdb.raw_imprequest_20150101', new_conn2))

def assemble_connection_string(**kwargs):
    return ';'.join([k + '=' + v for (k, v) in kwargs.items()])

if __name__ == '__main__':
    sys.exit(main())
Here's the output:
===========
Uid=readonly;Database=prod;ServerName=rndredshift;Driver={PostgreSQL};Server=10.191.4.97;Pwd=********;Port=5439
===========
Traceback (most recent call last):
  File "test_redshift.py", line 24, in <module>
    sys.exit(main())
  File "test_redshift.py", line 17, in main
    new_conn2 = pyodbc.connect(redshift_conn_str)
pyodbc.Error: ('IM002', '[IM002] [unixODBC][Driver Manager]Data source name not found, and no default driver specified (0) (SQLDriverConnectW)')
The PostgreSQL driver is installed:
$ odbcinst -q -d
[PostgreSQL]
[MySQL]
And the data source is configured:
$ odbcinst -q -s
[rndredshift]

If you're using DSNs, you're going to need to specify that in your connection string. Also, if you want to use DSN-less connections, I believe the keyword is SERVER and not SERVERNAME.
Try this connection string?
Uid=readonly;Database=prod;DSN=rndredshift;Driver={PostgreSQL};Pwd=********;
Make sure you specify the full server name and port in odbc.ini as well. Also, since you're using PostgreSQL, any reason you're not using the native PostgreSQL driver?
https://wiki.postgresql.org/wiki/Psycopg
Good luck!
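For reference, here's a minimal pyodbc sketch of that DSN-based approach (DSN name and credentials are the ones from the question; I can't confirm how the PostgreSQL ODBC driver behaves against Redshift specifically):
import pyodbc

# DSN-based connection: unixODBC resolves 'rndredshift' from odbc.ini,
# so the server address and port live there rather than in this string.
conn_str = 'DSN=rndredshift;Uid=readonly;Pwd=*******;Database=prod'
conn = pyodbc.connect(conn_str)

cursor = conn.cursor()
cursor.execute('select 1')
print(cursor.fetchone())
conn.close()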

Also, I've been perplexed about how to obtain and install the PostgreSQL driver. When I installed unixODBC, the odbcinst.ini file was created and contained an entry for the PostgreSQL driver that looked like this:
[PostgreSQL]
Description = ODBC for PostgreSQL
Driver = /usr/lib/psqlodbc.so
Setup = /usr/lib/libodbcpsqlS.so
Driver64 = /usr/lib64/psqlodbc.so
Setup64 = /usr/lib64/libodbcpsqlS.so
FileUsage = 1
However, the files for Driver and Driver64 were not on the system. So I installed postgresql-odbc, which gave me the missing libraries. Is there a better way to do this? As I mentioned earlier, isql works fine, so I'm still thinking it's a Python issue.
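A quick way to check whether the driver files referenced in that odbcinst.ini entry actually exist (just a sketch using the paths shown above):
import os

# Driver paths copied from the [PostgreSQL] entry in odbcinst.ini
for path in ('/usr/lib/psqlodbc.so',
             '/usr/lib64/psqlodbc.so',
             '/usr/lib/libodbcpsqlS.so',
             '/usr/lib64/libodbcpsqlS.so'):
    print(path, 'exists' if os.path.exists(path) else 'MISSING')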

I decided to try using the psycopg2 package, and I got a connection to work! Here's my script:
import sys
import psycopg2

def main():
    conn_string = "host='10.191.4.97' dbname='prod' user='readonly' password='****' port='5439'"
    print("===========")
    print(conn_string)
    print("===========")
    new_conn2 = psycopg2.connect(conn_string)
    print("Connected using psycopg2!")

if __name__ == '__main__':
    sys.exit(main())
So, while I'm happy that I can connect, the question still remains about pyodbc and the PostgreSQL connection string. Thoughts?
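For completeness, here's how the same sort of query could be run over that psycopg2 connection with the standard cursor API (a sketch; the table name comes from the pyodbc attempt above and the password is the question's placeholder):
import psycopg2

conn = psycopg2.connect("host='10.191.4.97' dbname='prod' user='readonly' password='****' port='5439'")
cur = conn.cursor()
# Same table as in the pyodbc attempt; LIMIT used instead of TOP
cur.execute('select * from rawdb.raw_imprequest_20150101 limit 10')
for row in cur.fetchall():
    print(row)
cur.close()
conn.close()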

Here's the connection string:
Uid=readonly;Database=prod;ServerName=rndredshift;Driver={PostgreSQL};Server=10.191.4.97;Pwd=********;Port=5439
Using DSN instead of ServerName didn't work.

Related

[DataDirect][ODBC lib] Driver Manager Message file not found. Please check for the value of InstallDir in your odbc.ini in Informatica

I am using Informatica, and I have a SingleStore DB that I am trying to connect to.
I am able to log in to the SingleStore DB using the SingleStore ODBC driver, as shown below.
SingleStore version: 8.0.5
SingleStore ODBC driver version: 1.1.1
SingleStore is self-managed.
[abc@rnd-2 ~]$ isql SingleStore-server
+---------------------------------------+
| Connected!                            |
|                                       |
| sql-statement                         |
| help [tablename]                      |
| quit                                  |
|                                       |
+---------------------------------------+
SQL> ^C
When I try to connect Informatica to SingleStore using an ODBC connection, I am getting this error:
Message Code: WRT_8001
Message: Error connecting to database...
WRT_8001 [Session s_test Username dev DB Error -1
[DataDirect][ODBC lib] Driver Manager Message file not found. Please check for the value of InstallDir in your odbc.ini.
Database driver error...
Function Name : Connect
Database driver error...
Function Name : Connect
Database Error: Failed to connect to database using user [dev] and connection string [SingleStore-server].]
My location of odbc.ini file: /etc/odbc.ini
odbc.ini
[SingleStore_server]
Description=SingleStore server
Driver=/home/abc/singlestore-connector-odbc-1.1.1-centos7-amd64/libssodbca.so
SERVER=<>
USER=<>
PASSWORD=<>
DATABASE=<>
PORT=<>
I added the path in .bash_profile, but I am still getting the same error:
# .bash_profile
# Get the aliases and functions
if [ -f ~/.bashrc ]; then
. ~/.bashrc
fi
# User specific environment and startup programs
PATH=$PATH:$HOME/bin
export PATH
export ODBCINI=/etc/odbc.ini
Please let me know how to resolve this error.
Reference links:
https://knowledge.informatica.com/s/article/577839?language=en_US
https://knowledge.informatica.com/s/article/Error-connecting-to-database-DataDirect-ODBC-lib-Driver-Manager-Message-file-not-found-Please-check-for-the-value-of-InstallDir-in-your-odbc-ini-while?language=en_US
https://docs.singlestore.com/managed-service/en/developer-resources/connect-with-application-development-tools/connect-with-odbc/the-singlestore-odbc-driver.html
Regarding export ODBCINI=/etc/odbc.ini: I have seen that Informatica always uses its own ODBC drivers. Can you please check whether you have the SingleStore drivers available in the /<INFA_HOME>/ODBCX.version/odbc.ini file? If yes, I highly recommend using them.
If yes, please see if you can test the ODBC driver against your DB with the Informatica-provided tool $INFA_HOME/tools/debugtools/ssgodbc -d dsn -u username -p password [-v]. This will ensure you have no issues with the ODBC setup.
You can find more about this at the linked page.
If not, please make sure you have installed the correct version of the SingleStore ODBC drivers (32-bit or 64-bit) and that the Informatica user has RWX permission on them. Then:
Add the driver path to LD_LIBRARY_PATH=${LD_LIBRARY_PATH}:$HOME/server_dir:$ODBCHOME/lib
Set ODBCINI=/etc/odbc.ini
Grant access: chmod 777 /etc/odbc.ini
See if the ssgodbc tool is able to establish a connection.
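Outside Informatica, a quick pyodbc sketch can also sanity-check the DSN against the same unixODBC setup (pyodbc is not part of the Informatica stack, and the DSN name, user, and password below are placeholders to substitute):
import os
import pyodbc

os.environ['ODBCINI'] = '/etc/odbc.ini'  # same odbc.ini as in the question

# DSN, user, and password are placeholders -- substitute your own values
conn = pyodbc.connect('DSN=SingleStore_server;UID=<user>;PWD=<password>')
cur = conn.cursor()
cur.execute('select 1')
print(cur.fetchone())
conn.close()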
Please see the following examples of integrating SingleStore data with Informatica:
JDBC - https://www.cdata.com/kb/tech/singlestore-jdbc-informatica-cloud.rst
ODBC - https://www.cdata.com/kb/tech/singlestore-odbc-informatica.rst

Cannot add JDBC driver in Sqoop command when running import command using Airflow 2.5.0

I am running a Sqoop import command which imports a table from a MySQL DB and loads it into HDFS. I have created the DAG below to perform this activity.
from airflow.models import DAG
from airflow.contrib.operators.sqoop_operator import SqoopOperator
from airflow.utils.dates import days_ago

Dag_Sqoop_Import = DAG(dag_id="SqoopImport",
                       schedule_interval="* * * * *",
                       start_date=days_ago(2))

sqoop_mysql_import = SqoopOperator(conn_id="sqoop_local",
                                   table="shipmethod",
                                   cmd_type="import",
                                   target_dir="/airflow_sqoopImport",
                                   num_mappers=1,
                                   task_id="SQOOP_Import",
                                   dag=Dag_Sqoop_Import)

sqoop_mysql_import
I have also created a SqoopImport connection in Airflow.
When I trigger the job, I assume it should run the command below:
sqoop import --connect jdbc:mysql://192.168.0.15:3306/adventureworks?characterEncoding=latin1 --driver com.mysql.jdbc.Driver --username xxxx --password xxxxxx --autoreset-to-one-mapper --table workorder --target-dir /user/adminn/workorder
But when I check the logs, it is actually running this command:
Executing command: sqoop import --username xxxx --password MASKED --num-mappers 1 --connect jdbc:mysql://192.168.0.15:3306/adventureworks?characterEncoding=latin1 --target-dir /airflow_sqoopImport --as-textfile --table shipmethod
And the DAG fails with the error below. I know the cause of this error: I need to add the driver parameter com.mysql.jdbc.Driver, which would fix it. I am struggling to add it; can you please let me know where I am going wrong?
ERROR manager.SqlManager: Error reading from database: java.sql.SQLException: Streaming result set com.mysql.jdbc.RowDataDynamic#5906ebcb is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.'
Replies Appreciated, thanks.
You should provide the driver class as an argument to the operator, not to the connection:
sqoop_mysql_import = SqoopOperator(conn_id="sqoop_local",
                                   table="shipmethod",
                                   cmd_type="import",
                                   target_dir="/airflow_sqoopImport",
                                   driver="com.mysql.jdbc.Driver",
                                   num_mappers=1,
                                   task_id="SQOOP_Import",
                                   dag=Dag_Sqoop_Import)

Problem with Python script when setting up LDAP for MacOS

I am trying to set up Google Secure LDAP on my MacBook Pro running Monterey 12.3, following these instructions from Google. The script fails with:
request.appendData_(NSData.dataWithBytes_length_(CONFIG, len(CONFIG)))
TypeError: Expecting byte-buffer, got str
See the script from the guide:
#!/usr/bin/python
from OpenDirectory import ODNode, ODSession, kODNodeTypeConfigure
from Foundation import NSMutableData, NSData
import os
import sys
# Reading plist
GOOGLELDAPCONFIGFILE = open(sys.argv[1], "r")
CONFIG = GOOGLELDAPCONFIGFILE.read()
GOOGLELDAPCONFIGFILE.close()
# Write the plist
od_session = ODSession.defaultSession()
od_conf_node, err = ODNode.nodeWithSession_type_error_(od_session, kODNodeTypeConfigure, None)
request = NSMutableData.dataWithBytes_length_(b'\x00'*32, 32)
request.appendData_(NSData.dataWithBytes_length_(CONFIG, len(CONFIG)))
response, err = od_conf_node.customCall_sendData_error_(99991, request, None)
# Edit the default search path and append the new node to allow for login
os.system("dscl -q localhost -append /Search CSPSearchPath /LDAPv3/ldap.google.com")
os.system("bash -c 'echo -e \"TLS_IDENTITY\tLDAP Client\" >> /etc/openldap/ldap.conf' ")
I have tried to find some solutions on Google (e.g. .encode, b'...'), but I do not really understand them.
Thanks for the help.
Okay, I found the solution; it was actually posted earlier here:
Error running python script to create google ldap configuration on Macos
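For anyone who lands here: the TypeError itself points at the fix. NSData.dataWithBytes_length_ expects bytes, but the plist was read as str. A minimal sketch of the two usual adjustments to the guide's script (assuming Python 3; names come from the script above):
# Option 1: read the plist in binary mode so CONFIG is bytes from the start
GOOGLELDAPCONFIGFILE = open(sys.argv[1], "rb")
CONFIG = GOOGLELDAPCONFIGFILE.read()
GOOGLELDAPCONFIGFILE.close()

# Option 2: keep the text read, but encode before handing it to NSData
CONFIG_BYTES = CONFIG.encode("utf-8")
request.appendData_(NSData.dataWithBytes_length_(CONFIG_BYTES, len(CONFIG_BYTES)))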

socket.error: [Errno 98] Address already in use

I have this code to connect with the server; this is fileServer.py on the server. I have another .py file on the client side, but I have not tested it yet. I get a problem when I run this code; please see the information below.
import socket
import threading
import os

def RetrFile(name, sock):
    filename = sock.recv(1024).decode()
    if os.path.isfile(filename):
        message = "EXISTS" + str(os.path.getsize(filename))
        sock.send(message.encode())
        userResponse = sock.recv(1024).decode()
        if userResponse[:2] == "OK":
            with open(filename, 'rb') as f:
                bytesToSend = f.read(1024)
                sock.send(bytesToSend)
                while (bytesToSend != ""):
                    bytesToSend = f.read(1024)
                    sock.send(bytesToSend)
    else:
        sock.send("ERR")
    sock.close()

def Main():
    host = '192.168.0.91'
    port = 8069
    s = socket.socket()
    s.bind((host, port))
    s.listen(5)
    print('Server Started')
    while True:
        c, addr = s.accept()
        print('Client connected ip: ' + str(addr))
        t = threading.Thread(target=RetrFile, args=('retrThread', c))
        t.start()
    s.close()

if __name__ == '__main__':
    Main()
And when I run it, it shows me an error. I think it is about the socket connecting to the server IP; is that right?
File "fileServer.py", line 40, in <module>
Main()
File "fileServer.py", line 26, in Main
s.bind((host,port))
File "/usr/lib/python2.7/socket.py", line 228, in meth
return getattr(self._sock,name)(*args)
socket.error: [Errno 98] Address already in use
How can I fix that?
Any suggest?
Thanks in advance
I think you are trying to run more than one Odoo server on the same port.
Try this on terminal:
sudo netstat -nlp | grep 8069
then you will see something like this:
tcp 0 0 0.0.0.0:8069 0.0.0.0:* LISTEN 10869/python2
Kill the process:
sudo kill -9 10869
OR
Change the port number in the fileServer.py.
Then try to start Odoo.
Hope it will help you.
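If the port is actually being held by a previous run of this same script (the old socket can also linger in TIME_WAIT), another option is to set SO_REUSEADDR before binding; a minimal sketch using the host and port from the question:
import socket

s = socket.socket()
# Allow the listening address to be reused right after a previous run exits
s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
s.bind(('192.168.0.91', 8069))
s.listen(5)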
You can simply use the following command to kill the process:
fuser -k 8069/tcp
Generally,
fuser -k <port_no>/<tcp/udp>
OR
netstat -nlp | grep <port_no>
kill -9 PID
The error is self-explanatory: "Address already in use".
return getattr(self._sock,name)(*args)
socket.error: [Errno 98] Address already in use
@KbiR has already explained it.
For Windows, check this out: How can you find out which process is listening on a port on Windows?
You could use this command to kill the Odoo process already running on that port:
fuser -k 8069/tcp
and launch your Python script again.
The correct command to use is sudo systemctl stop odoo11.
If you use another version of Odoo, change the number 11 to your version.

connect to FileMaker Server with pyodbc

I have a FileMaker db running on FileMaker Server 14 on a Mac Mini and I'm trying to get at it with pyodbc. It's not going well.
First, what works:
telnet 192.169.19.3 2399
ssh Name#192.169.19.3
tsql -H FM-Server -p 2399 -U Name -P pwd
One weird thing about that last one is that it gives me a seconds counter:
1
2
not a prompt, although I can still type commands in. I'm not sure what that means and couldn't find any info about it.
Now, what doesn't work:
tsql -LH 192.169.19.3
tsql -LH FM-Server
isql FM-Server Name pwd
The tsql -LH commands list no info for the FileMaker Server, and isql gives me [ISQL]ERROR: Could not SQLConnect, which is just so helpful, you know.
One issue is that at this point I've sort of forgotten whether I should be using FM ODBC or FreeTDS as my driver in pyodbc; luckily, neither of them works:
>>> c = pyodbc.connect("DRIVER={FreeTDS};DSN=FM-Server;UID=Name;PWD=pwd")
pyodbc.Error: ('08001', '[08001] [unixODBC][FreeTDS][SQL Server]Unable to connect to data source (0) (SQLDriverConnect)')
>>> c = pyodbc.connect("DRIVER={FileMaker ODBC};DSN=FM-Server;UID=Name;PWD=pwd")
pyodbc.Error: ('08S01', '[08S01] [unixODBC][FileMaker][FileMaker ODBC] Failed to connect to listener (2) (65535) (SQLDriverConnect)')
Giving it just the DSN freezes the window. Here are my configs:
odbc.ini | /usr/local/Cellar/unixodbc/2.3.4/etc/odbc.ini
[FM-Server]
Driver = FreeTDS
Host = 192.169.19.3
ServerName = FM-Server
UID = Name
PWD = pwd
Port = 2399
odbcinst.ini | /usr/local/Cellar/unixodbc/2.3.4/etc/odbcinst.ini
[ODBC Drivers]
FileMaker ODBC = Installed
FreeTDS = Installed
[FileMaker ODBC]
Driver = /Library/ODBC/FileMaker ODBC.bundle/Contents/MacOS/fmodbc.so
Setup =
[FreeTDS]
Description = FreeTDS
Driver = /usr/local/Cellar/freetds/1.00.9/lib/libtdsodbc.0.so
Setup = /usr/local/Cellar/freetds/1.00.9/lib/libtdsodbc.0.so
UsageCount = 1
freetds.conf | /usr/local/Cellar/freetds/1.00.9/etc/freetds.conf
[FM-Server]
host = 192.169.19.3
port = 2399
tds version = 8.0
Any info is much appreciated and shout to the resources here, here, and elsewhere for helping me get even this far.
I was able to solve this with the following steps, slightly abbreviated:
brew uninstall freetds
brew uninstall unixODBC
pip uninstall pyodbc
re-installed pyodbc with ActualTech's installer (not sure if necessary): http://www.actualtech.com/python-osx-odbc.php
switched my odbc.ini to read "Driver=FileMaker ODBC" instead of "Driver=FreeTDS", but given the code I actually ran, I'm not sure that's necessary:
python
import pyodbc
c = pyodbc.connect("DRIVER={FileMaker ODBC};SERVER=192.169.19.3;PORT=2399;UID=Name;PWD=pwd")
and magically that worked. idk man
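Once that connection is up, queries go through the usual pyodbc cursor API; a quick sketch continuing from the c above (the table name is a made-up placeholder, not something from the actual FileMaker database):
cur = c.cursor()
cur.execute('SELECT * FROM "MyTable"')  # "MyTable" is a placeholder table name
for row in cur.fetchmany(10):
    print(row)
cur.close()
c.close()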
