Cannot connect ODBC in Google Colab

I am trying to connect via ODBC from Colab, but it gives me an error: OperationalError: ('HYT00', '[HYT00] [Microsoft][ODBC Driver 17 for SQL Server]Login timeout expired (0) (SQLDriverConnect)')
Could someone help me?
Here is my code:
%%sh
curl https://packages.microsoft.com/keys/microsoft.asc | apt-key add -
curl https://packages.microsoft.com/config/ubuntu/16.04/prod.list > /etc/apt/sources.list.d/mssql-release.list
sudo apt-get update
sudo ACCEPT_EULA=Y apt-get -q -y install msodbcsql17
!pip install pyodbc
import pyodbc
# SQL connection parameters
server = 'server-training.database.windows.net'
database = 'db_axm'
username = 'axm_reader'
password = 'careerp#th22*'
conn = pyodbc.connect('DRIVER={ODBC Driver 17 for SQL Server};'
                      'SERVER=' + server + ';'
                      'DATABASE=' + database + ';'
                      'UID=' + username + ';'
                      'PWD=' + password)
I have been trying to change the code with some alternatives, but I have been unsuccessful.
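For reference, HYT00 is a pre-login timeout: the driver never reached the server at all, so the usual suspects are the Azure SQL firewall (the Colab VM's egress IP must be allowed on the server) or name resolution, not the credentials. A minimal sketch of a reachability test plus the same connection string with an explicit port and timeout (Encrypt, TrustServerCertificate, and Connection Timeout are standard ODBC Driver 17 keywords; everything else comes from the code above):
import socket
import pyodbc
server = 'server-training.database.windows.net'
# If this raises, the Colab VM cannot reach the server at all
# (typically the Azure SQL firewall rejecting the client IP).
socket.create_connection((server, 1433), timeout=10).close()
conn = pyodbc.connect('DRIVER={ODBC Driver 17 for SQL Server};'
                      'SERVER=' + server + ',1433;'
                      'DATABASE=db_axm;'
                      'UID=axm_reader;'
                      'PWD=<password from above>;'
                      'Encrypt=yes;TrustServerCertificate=no;Connection Timeout=30;')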

Related

Using pyodbc in Azure Databricks for connecting with SQL Server

import pyodbc
pyodbc.connect('Driver={SQL SERVER};'
               'Server=server name;'
               'Database=database name;'
               'UID=my uid;'
               'PWD=my password;'
               'Authentication=ActiveDirectoryPassword')
Running the above code in a Databricks notebook, I get the following error:
Error: ('01000', "[01000] [unixODBC][Driver Manager]Can't open lib 'SQL SERVER' : file not found (0) (SQLDriverConnect)")
By default, Azure Databricks does not have the ODBC driver installed.
For SQL Server: you can resolve the issue by using the following script:
sudo apt-get -q -y install unixodbc unixodbc-dev
sudo apt-get -q -y install python3-dev
sudo pip install --upgrade pip
pip install pyodbc
curl https://packages.microsoft.com/keys/microsoft.asc | apt-key add -
curl https://packages.microsoft.com/config/ubuntu/16.04/prod.list > /etc/apt/sources.list.d/mssql-release.list
sudo apt-get update
sudo ACCEPT_EULA=Y apt-get -q -y install msodbcsql
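After the install, a quick sanity check from Python (a minimal sketch) confirms unixODBC actually registered the driver:
import pyodbc
# Should include something like 'ODBC Driver 17 for SQL Server' (or
# 'ODBC Driver 13 for SQL Server' for the msodbcsql package above);
# an empty list means the driver never got registered with unixODBC.
print(pyodbc.drivers())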
For Azure SQL Database: run the following commands in a single cell to install the Microsoft ODBC driver on the Azure Databricks cluster.
%sh
curl https://packages.microsoft.com/keys/microsoft.asc | apt-key add -
curl https://packages.microsoft.com/config/ubuntu/16.04/prod.list > /etc/apt/sources.list.d/mssql-release.list
sudo apt-get update
sudo ACCEPT_EULA=Y apt-get -q -y install msodbcsql17
What you posted looks like straight Python code. In the Databricks environment, things are a little different than they are on your local machine.
Try it like this.
import pyodbc
server = '<server>.database.windows.net'
database = '<database>'
username = '<username>'
password = '<password>'
driver= '{ODBC Driver 17 for SQL Server}'
cnxn = pyodbc.connect('DRIVER='+driver+';SERVER='+server+';PORT=1433;DATABASE='+database+';UID='+username+';PWD='+ password)
cursor = cnxn.cursor()
cursor.execute("SELECT TOP 20 pc.Name as CategoryName, p.name as ProductName FROM [SalesLT].[ProductCategory] pc JOIN [SalesLT].[Product] p ON pc.productcategoryid = p.productcategoryid")
row = cursor.fetchone()
while row:
    print(str(row[0]) + " " + str(row[1]))
    row = cursor.fetchone()
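As a small follow-up: pyodbc cursors are iterable and rows unpack, so the fetch loop can be written more compactly (same cursor and query as above):
# Equivalent to the while-loop: iterate the cursor directly.
for category_name, product_name in cursor:
    print(category_name, product_name)
cnxn.close()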

Get Access to Airflow Hooks within a jupyter notebook

I have Airflow running using the Postgres backend - all fine. Additionally, I have a Jupyter server running on the same host as Airflow. I thought I could just access the Airflow hooks from a notebook:
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
from airflow.hooks.mysql_hook import MySqlHook
mysql = MySqlHook(mysql_conn_id = 'mysql-x')
sql = "select 1+1"
mysql.get_pandas_df(sql)
But I get this exception message:
OperationalError: (sqlite3.OperationalError) no such table: connection
[SQL: SELECT connection.password AS connection_password, connection.extra AS connection_extra, connection.id AS connection_id, connection.conn_id AS connection_conn_id, connection.conn_type AS connection_conn_type, connection.host AS connection_host, connection.schema AS connection_schema, connection.login AS connection_login, connection.port AS connection_port, connection.is_encrypted AS connection_is_encrypted, connection.is_extra_encrypted AS connection_is_extra_encrypted
FROM connection
WHERE connection.conn_id = ?]
[parameters: ('mysql-x',)]
(Background on this error at: http://sqlalche.me/e/e3q8)
But what makes me suspicious is that it not only fails to find the connection_id (which I can clearly see in the Airflow UI); it also says sqlite3.OperationalError, so it very much looks like it is not even connected to the same Postgres database. I have checked os.environ["AIRFLOW_HOME"], which seems to be correct.
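One way to confirm which metadata database the notebook process actually resolves (a minimal check, assuming Airflow 1.10.x, where both keys live in the [core] section):
import os
from airflow.configuration import conf
# If AIRFLOW_HOME was not set when Jupyter started, Airflow falls back to
# ~/airflow and a default sqlite metadata DB, which would match the error above.
print(os.environ.get("AIRFLOW_HOME"))
print(conf.get("core", "sql_alchemy_conn"))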
EDIT 1:
After I start the Jupyter notebook server after Airflow, so that all environment variables are set, I get a different error:
/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/attributes.py in __get__(self, instance, owner)
351
352 def __get__(self, instance, owner):
--> 353 retval = self.descriptor.__get__(instance, owner)
354 # detect if this is a plain Python #property, which just returns
355 # itself for class level access. If so, then return us.
/usr/local/lib/python3.7/site-packages/airflow/models/connection.py in get_password(self)
188 "Can't decrypt encrypted password for login={}, \
189 FERNET_KEY configuration is missing".format(self.login))
--> 190 return fernet.decrypt(bytes(self._password, 'utf-8')).decode()
191 else:
192 return self._password
/usr/local/lib/python3.7/site-packages/cryptography/fernet.py in decrypt(self, msg, ttl)
169 except InvalidToken:
170 pass
--> 171 raise InvalidToken
InvalidToken:
You can use this Dockerfile to reproduce it:
FROM apache/airflow
USER root
# install mysql client
RUN apt-get update && apt-get install -y mariadb-client-10.3 unzip
# install mssql client and tools
RUN apt-get install -y curl gnupg libicu-dev libicu63
RUN curl https://packages.microsoft.com/keys/microsoft.asc -o key
RUN apt-key add < key
RUN curl https://packages.microsoft.com/config/ubuntu/16.04/prod.list > /etc/apt/sources.list.d/msprod.list
ENV ACCEPT_EULA=Y
RUN apt-get update && apt-get install -y mssql-tools msodbcsql17 unixodbc unixodbc-dev unzip libunwind8
RUN curl -Lq 'https://go.microsoft.com/fwlink/?linkid=2108814' -o sqlpackage-linux-x64-latest.zip
RUN mkdir /opt/sqlpackage/ && unzip sqlpackage-linux-x64-latest.zip -d /opt/sqlpackage/
RUN chmod a+x /opt/sqlpackage/sqlpackage && ln -sfn /opt/sqlpackage/sqlpackage /usr/bin/sqlpackage
RUN ln -sfn /opt/mssql-tools/bin/sqlcmd /usr/bin/sqlcmd
# install notebooks
RUN pip install jupyterlab pandas numpy scikit-learn matplotlib pymssql
#RUN cat /entrypoint.sh
# start additional notebook server
# this is a dirty hack but for the sake of this prototype good enough
RUN sed -i -e's/\# Run the command/airflow scheduler \& \njupyter notebook --ip=0.0.0.0 --port=9000 --NotebookApp.token="" --NotebookApp.password="" \& \n/' /entrypoint
EXPOSE 9000
# switch back to airflow user
USER airflow
RUN airflow initdb
RUN alias ll='ls -al'
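A note on the InvalidToken in EDIT 1: the connection password stored in the metadata DB was encrypted with the Fernet key of the process that saved it, so the notebook must resolve exactly the same fernet_key as the scheduler. A minimal check (again assuming Airflow 1.10.x):
from airflow.configuration import conf
from cryptography.fernet import Fernet
key = conf.get("core", "fernet_key")
print(repr(key))  # must equal the key used when the connection was created
Fernet(key.encode())  # raises ValueError if the key itself is malformed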

Unable to connect to Microsoft SQL Server inside Docker container using FreeTDS

I want to host Shiny applications on my company network using Docker for Windows.
How do I set up the Docker, odbc.ini, odbcinst.ini, freetds.conf, or possibly other files so that my Shiny application can query data from an internal Microsoft SQL Server (2016) database? The database server is not running on the same machine running the Docker container.
I don't know if I need a newer version of FreeTDS or if I have misconfigured one of the files. I tried using the server's IP address instead of sql-server.host.com in all the files, but I get the same error message below.
$ tsql -C output:
Compile-time settings (established with the "configure" script)
Version: freetds v1.00.104
freetds.conf directory: /etc/freetds
MS db-lib source compatibility: no
Sybase binary compatibility: yes
Thread safety: yes
iconv library: yes
TDS version: 4.2
iODBC: no
unixodbc: yes
SSPI "trusted" logins: no
Kerberos: yes
OpenSSL: no
GnuTLS: yes
MARS: no
$ odbcinst -j output:
unixODBC 2.3.6
DRIVERS............: /etc/odbcinst.ini
SYSTEM DATA SOURCES: /etc/odbc.ini
FILE DATA SOURCES..: /etc/ODBCDataSources
USER DATA SOURCES..: /root/.odbc.ini
SQLULEN Size.......: 8
SQLLEN Size........: 8
SQLSETPOSIROW Size.: 8
$ cat /etc/odbcinst.ini output:
[FreeTDS]
Description = FreeTDS unixODBC Driver
Driver = /usr/lib/x86_64-linux-gnu/odbc/libtdsodbc.so
$ cat /etc/odbc.ini output:
[sql-server]
driver = FreeTDS
server = sql-server.host.com
port = 1433
TDS_Version = 4.2
$ cat /etc/freetds/freetds.conf output:
[sql-server]
host = sql-server.host.com
port = 1433
tds version = 4.2
Command in R giving error:
con <- dbConnect(odbc::odbc(),
                 driver = "FreeTDS",
                 server = "sql-server.host.com",
                 port = 1433,
                 database = "database name",
                 TDS_Version = 4.2)
Error:
Error: nanodbc/nanodbc.cpp:950: 08001: [FreeTDS][SQL Server]Unable to connect to data source
Execution halted
Docker file:
# Install R version 3.5.3
FROM r-base:3.5.3
# Install Ubuntu packages
RUN apt-get update && apt-get install -y \
sudo \
gdebi-core \
pandoc \
pandoc-citeproc \
libcurl4-gnutls-dev \
libcairo2-dev/unstable \
libxt-dev \
libssl-dev \
unixodbc unixodbc-dev \
freetds-bin freetds-dev tdsodbc
# Edit odbc.ini, odbcinst.ini, and freetds.conf files
RUN echo "[sql-server]\n\
host = sql-server.host.com\n\
port = 1433\n\
tds version = 4.2" >> /etc/freetds.conf
RUN echo "[FreeTDS]\n\
Description = FreeTDS unixODBC Driver\n\
Driver = /usr/lib/x86_64-linux-gnu/odbc/libtdsodbc.so" >> /etc/odbcinst.ini
RUN echo "[sql-server]\n\
driver = FreeTDS\n\
server = sql-server.host.com\n\
port = 1433\n\
TDS_Version = 4.2" >> /etc/odbc.ini
# Install R packages that are required
RUN R -e "install.packages(c('shiny', 'DBI', 'odbc'), repos='http://cran.rstudio.com/')"
# copy the app to the image
RUN mkdir /root/shiny_example
COPY app /root/shiny_example
COPY Rprofile.site /usr/lib/R/etc/
# Make the ShinyApp available at port 801
EXPOSE 801
CMD ["R", "-e", "shiny::runApp('/root/shiny_example')"]
Docker build and run commands:
docker build . -t shiny_example
docker run -it --network=host -p 801:801 shiny_example
Note that the following R code works on my Windows machine (the one running the Docker container), and I can successfully query the database:
library(DBI)
con <- dbConnect(odbc::odbc(),
                 driver = "SQL server",
                 server = "sql-server.host.com")
$ isql -v sql-server output:
[S1000][unixODBC][FreeTDS][SQL Server]Unable to connect to data source
[01000][unixODBC][FreeTDS][SQL Server]Unknown host machine name.
[ISQL]ERROR: Could not SQLConnect
$ tsql -S sql-server output:
locale is "en_US.UTF-8"
locale charset is "UTF-8"
using default charset "UTF-8"
Error 20013 (severity 2):
Unknown host machine name.
There was a problem connecting to the server
It looks like you're on the right track, but you might have missed a little. I had a similar issue and was able to fix it!
Setup: Python + MS SQL Server + pymssql + Docker (ubuntu 16.04 base image).
Before the fix, running my code (using pymssql) gave me this error:
Traceback (most recent call last):
File "<stdin>", line 4, in <module>
File "src/pymssql.pyx", line 645, in pymssql.connect
pymssql.InterfaceError: Connection to the database failed for an unknown reason.
I created 3 files and copied them into the image. (Note that, per your tsql -C output, FreeTDS reads its configuration from /etc/freetds/, while your Dockerfile appends to /etc/freetds.conf, which FreeTDS never reads.)
Below, myserver.orgName.com stands for the host where the MS SQL server is running.
docker/odbcinst.ini
[FreeTDS]
Description = v0.91 with protocol v7.3
Driver = /usr/lib/x86_64-linux-gnu/odbc/libtdsodbc.so
docker/odbc.ini
[myserverdsn]
Driver = FreeTDS
Server = myserver.orgName.com
Port = 1433
TDS_Version = 7.3
docker/freetds.conf
[global]
# TDS protocol version, use:
# 7.3 for SQL Server 2008 or greater (tested through 2014)
# 7.2 for SQL Server 2005
# 7.1 for SQL Server 2000
# 7.0 for SQL Server 7
tds version = 7.2
port = 1433
# Whether to write a TDSDUMP file for diagnostic purposes
# (setting this to /tmp is insecure on a multi-user system)
; dump file = /tmp/freetds.log
; debug flags = 0xffff
# Command and connection timeouts
; timeout = 10
; connect timeout = 10
# If you get out-of-memory errors, it may mean that your client
# is trying to allocate a huge buffer for a TEXT field.
# Try setting 'text size' to a more reasonable limit
text size = 64512
# A typical Microsoft server
[myserverdsn]
host = myserver.orgName.com
port = 1433
tds version = 7.3
Dockerfile content
RUN apt-get -y install unixodbc unixodbc-dev freetds-dev freetds-bin tdsodbc
COPY freetds.conf /etc/freetds/freetds.conf
COPY odbc.ini /etc/odbc.ini
COPY odbcinst.ini /etc/odbcinst.ini
Test Python code that works:
python
>>> import pymssql
>>> conn = pymssql.connect(server = 'myserver.orgName.com',
...                        user = 'myusername',
...                        password = 'mypassword',
...                        database = 'mydbname')
It works without error!
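One caveat: pymssql talks to SQL Server directly and never reads odbc.ini, so the test above proves the network path and credentials but not the DSN. To exercise the [myserverdsn] entry itself, a pyodbc sketch (same placeholder credentials as above):
import pyodbc
# Goes through unixODBC: the DSN comes from /etc/odbc.ini, which resolves
# the FreeTDS driver via /etc/odbcinst.ini.
conn = pyodbc.connect('DSN=myserverdsn;UID=myusername;PWD=mypassword;DATABASE=mydbname')
print(conn.cursor().execute("SELECT 1").fetchone())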

MariaDB is not working

I have installed MariaDB using the command below:
yum install MariaDB-server MariaDB-client -y
But when I execute the command:
service mysqld start
I get this error:
mysqld: unrecognized service
Please let me know what I am doing wrong, or what else I have to do to make it work.
First, verify that service sees MySQL, using the command:
sudo service --status-all
If you see MySQL there, run:
sudo service mysql start
If nothing is listed, note that the MariaDB-server package on systemd distributions usually installs a unit named mariadb, so sudo systemctl start mariadb is worth trying.

MariaDB Access Privileges

I have created my MariaDB instance under CentOS 7 using the following command:
[username#localhost]$ sudo yum install mariadb-server
I enabled and started the service with the following commands:
sudo systemctl enable mariadb
sudo systemctl start mariadb
I secured the installation with the following command:
sudo mysql_secure_installation
The server was secured, and I invoked the mysql prompt using:
mysql -u root -p
There is no problem; I am able to use this command to get to the SQL prompt.
However, when I try to load tables, I get the following error:
ERROR 1045 (28000): Access denied for user 'root'@'localhost' (using password: NO)
I am unable to load data. What could be the solution?
Thanks
Sreeram
