I'm reading the documentation here. I would like to use an ODBC connection with Snowflake. I'm already able to get this working by adding .odbc.ini to my home directory with a configuration that looks similar to this:
[snowflake]
Description=SnowflakeDB
Driver=SnowflakeDSIIDriver
Locale=en-US
SERVER=ourco.us-east-1.snowflakecomputing.com
PORT=443
SSL=on
ACCOUNT=ourco.us-east-1
UID=MY_NAME
PWD=$MyPassword
This works. With this config, I'm able to create a connection and query our database.
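For context, the connection itself is made from R along these lines (a sketch assuming the DBI and odbc packages, since the paths below suggest an RStudio setup):

library(DBI)
# Connect through the DSN defined in ~/.odbc.ini above
con <- dbConnect(odbc::odbc(), dsn = "snowflake")
dbGetQuery(con, "select current_timestamp()")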
However, I would like to use ODBC with key-pair authentication instead of my password. I followed the guide here on creating keys and added the public key within the Snowflake interface.
I then updated my odbc.ini file:
[snowflake]
Description=SnowflakeDB
Driver=SnowflakeDSIIDriver
Locale=en-US
SERVER=ourco.us-east-1.snowflakecomputing.com
PORT=443
SSL=on
UID=MY_NAME
ACCOUNT=ourco.us-east-1
PRIV_KEY_FILE=/home/rstudio-blah/keys/rsa_key.p8 # this is where I stored the key
SNOWSQL_PRIVATE_KEY_PASSPHRASE=potatoes{ # not my real pwd
When attempting to connect with this setup, I got this error message:
Error: nanodbc/nanodbc.cpp:1021: 00000: [unixODBC][Snowflake][DSI] (20032) Required setting 'PWD' is not present in the connection settings.
I tried adding PWD= to trick it with a null value, but then got:
Error: nanodbc/nanodbc.cpp:1021: 00000: [unixODBC][Snowflake][Snowflake] (31)
Password not found.
I then tried adding AUTHENTICATOR=SNOWFLAKE_JWT, which gave this error:
Error: nanodbc/nanodbc.cpp:1021: 00000: [unixODBC][Snowflake][Snowflake] (44)
Error finalize setting: Marshaling private key failed.
How can I connect via ODBC using a key pair as opposed to a PWD?
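For reference, Snowflake's ODBC driver documents key-pair authentication through AUTHENTICATOR=SNOWFLAKE_JWT together with PRIV_KEY_FILE and PRIV_KEY_FILE_PWD; SNOWSQL_PRIVATE_KEY_PASSPHRASE is an environment variable used by SnowSQL, not an odbc.ini parameter. A sketch of a DSN along those lines (note, too, that trailing # comments may be read as part of the value by the ini parser):

[snowflake]
Description=SnowflakeDB
Driver=SnowflakeDSIIDriver
Locale=en-US
SERVER=ourco.us-east-1.snowflakecomputing.com
PORT=443
SSL=on
ACCOUNT=ourco.us-east-1
UID=MY_NAME
AUTHENTICATOR=SNOWFLAKE_JWT
PRIV_KEY_FILE=/home/rstudio-blah/keys/rsa_key.p8
PRIV_KEY_FILE_PWD=potatoes{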
I am using libzdb (Database Connection Pool Library) with an SQLite database. I am getting the following exception:
Failed to start connection pool - database protocol 'sqlite' not supported
After ConnectionPool_start() is called, it goes into static int _fillPool(T P), where it fails at this statement:
Connection_T con = Connection_new(P, &P->error);
My connection url is as follows :
sqlite:///home/ZDB_TESTING/zdb-test/testDb.db
Kindly help me with this problem.
This means that the SQLite library is not compiled into libzdb. If you are installing from a distribution, make sure you select a libzdb package built with SQLite. If you built libzdb yourself from source, make sure the output of ./configure says SQLite3: ENABLED. Otherwise, you need to install SQLite on your system first.
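For example, a from-source rebuild might look like this (a sketch; the package name assumes a Debian-style system):

# Install the SQLite development headers first (package name varies by distro)
sudo apt-get install libsqlite3-dev
# Reconfigure and rebuild libzdb; the summary should now say "SQLite3: ENABLED"
./configure
make
sudo make install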
I'm having trouble connecting to a database with the ODBC.jl package. I can't tell if the problem is with my setup (more likely) or the package. The problem is that ODBC.jl can't seem to locate the correct ODBC driver.
> using ODBC
> ODBC.listdrivers()
/path/to/generic/odbc/
But I need to use a different driver than the one picked up from above.
I'm trying to use a custom connection string as follows:
> ODBC.DSN("DRIVER=path/to/driver/i/want;SERVER=myserver;USER=myuser;PASSWORD=mypass;DATABASE=somedb;")
which returns this:
[ODBC] IM002: [unixODBC][Driver Manager]Data source name not found, and no default driver specified
ERROR: ODBC.ODBCError("ODBC.API.SQLDriverConnect(dbc,window_handle,conn_string,out_conn.ptr,BUFLEN,out_buff,driver_prompt) failed; return code: -1 => SQL_ERROR ")
My understanding is that I should be able to specify the driver as done above, but this does not give the desired connection.
I have .odbc.ini and .odbcinst.ini files set up in my home directory, which I believe are working correctly. I'm on a SUSE enterprise distro. When connecting via isql, I have no problems.
Any help is appreciated.
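One thing worth checking: with unixODBC, the DRIVER= keyword in a connection string usually has to name a driver registered in odbcinst.ini rather than a raw file path. A sketch of such an entry (the name and path are placeholders):

[MyCustomDriver]
Description = The driver I actually want
Driver      = /path/to/driver/i/want.so

With that in place, ODBC.listdrivers() should list MyCustomDriver, and the connection string becomes DRIVER=MyCustomDriver;SERVER=myserver;...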
I have built the Spark source using the following command:
mvn -Pyarn -Phadoop-2.5 -Dhadoop.version=2.5.2 -Phive -Phive-1.1.0 -Phive-thriftserver -DskipTests clean package
I have started the Thrift server using the following command:
spark-submit --class org.apache.spark.sql.hive.thriftserver.HiveThriftServer2 --master local[*] file:///c:/spark-1.3.1/sql/hive-thriftserver/target/spark-hive-thriftserver_2.10-1.3.1.jar
I connected to the Thrift server in Beeline using the following command:
jdbc:hive2://localhost:10000
I created a table named people and loaded data into it using the following queries:
Create table people(Name String);
Load data local inpath 'C:\spark-1.3.1\examples\src\main\resources\people.txt' overwrite into table people;
How can I read this table from a C# application using an ODBC connection or the Thrift library?
I have used the following code snippet to read the table, with C# code generated by Thrift and the Thrift DLL:
Console.WriteLine("Thrift hive server for Spark SQL Connection....");
TSocket hiveSocket = new TSocket("localhost", 10000);
TBinaryProtocol protocol =new TBinaryProtocol(hiveSocket);
ThriftHive.Client client = new ThriftHive.Client(protocol);
if (!hiveSocket.IsOpen)
{
hiveSocket.Open();
}
Console.WriteLine("Thrift server connected");
client.execute("select * from people1");
But I cannot execute the query.
It is not throwing any error or exception because there probably was no error: the processing worked. You just need to actually retrieve the results using client.fetchAll().
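A minimal continuation of the question's snippet (a sketch; it assumes the Thrift-generated Hive client's fetchAll(), which returns each result row as a string, and needs using System.Collections.Generic; for List<string>):

// After client.execute(...), actually pull the rows back;
// fetchAll() returns one string per result row.
List<string> rows = client.fetchAll();
foreach (string row in rows)
{
    Console.WriteLine(row);
}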
I'm using the RPostgreSQL 0.4 library (compiled on R 2.15.3) on R 2.15.2 under Windows 7 64-bit to interface to PostgreSQL. This works fine when connecting to my PostgreSQL databases on localhost. I'm trying to get my R code to run with a remote PostgreSQL database on Heroku. I can connect to Heroku's PostgreSQL database from the psql command shell on my machine, and it connects without a problem. I get the message:
psql (9.2.3, server 9.1.9)
WARNING: psql version 9.2, server version 9.1.
Some psql features might not work.
WARNING: Console code page (437) differs from Windows code page (1252)
8-bit characters might not work correctly. See psql reference
page "Notes for Windows users" for details.
SSL connection (cipher: DHE-RSA-AES256-SHA, bits: 256)
Clearly, psql uses SSL to connect. When I try to connect using the RPostgreSQL library routine dbConnect(), however, supplying exactly the same credentials using dbname=, host=, port=, user=, password=, the connection fails with the complaint:
Error in postgresqlNewConnection(drv, ...) :
RS-DBI driver: (could not connect <user>#<hostname> on dbname <dbname>)
Calls: source ... .valueClassTest -> is -> is -> postgresqlNewConnection -> .Call
Execution halted
I know that Heroku insists on an SSL connection if you want to access their database remotely, so it seems likely that the R interface routine dbConnect() isn't trying SSL. Is there something else that I can do to get a remote connection from R to PostgreSQL on Heroku to work?
To get the JDBC URL for your Heroku instance:
Get your hostname, username, and password using pg:credentials.
Your JDBC URL is going to be:
jdbc:postgresql://[hostname]/[database]?user=[user]&password=[password]&ssl=true&sslfactory=org.postgresql.ssl.NonValidatingFactory
Proceed as you would normally with JDBC.
Apparently there is a way using RJDBC. See:
http://ryepup.unwashedmeme.com/blog/2010/11/17/working-with-r-postgresql-ssl-and-mssql/
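A sketch of that approach with RJDBC (the jar path and connection values are placeholders; the URL mirrors the JDBC answer above):

library(RJDBC)
# Download the PostgreSQL JDBC jar and point RJDBC at it
drv <- JDBC("org.postgresql.Driver", "/path/to/postgresql-jdbc.jar")
url <- paste0(
  "jdbc:postgresql://<hostname>:5432/<database>",
  "?ssl=true&sslfactory=org.postgresql.ssl.NonValidatingFactory"
)
con <- dbConnect(drv, url, "<user>", "<password>")
dbGetQuery(con, "select version()")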
Please note that in order to connect to a Heroku database with JDBC externally, it is important to set the sslfactory parameter as well. I hope the Heroku team goes through this and updates their documentation.
import java.sql.Connection;
import java.sql.DriverManager;
import java.util.Properties;

String dbUri = "jdbc:postgresql://ec2-54-243-202-174.compute-1.amazonaws.com:5432/**xxxxxxx**";
Properties props = new Properties();
props.setProperty("user", "**xxxxx**");
props.setProperty("password", "**xxxxx**");
props.setProperty("ssl", "true");                                           // ssl must be set to true
props.setProperty("sslfactory", "org.postgresql.ssl.NonValidatingFactory"); // sslfactory set as shown above
Connection c = DriverManager.getConnection(dbUri, props);
See the answer to a related question at https://stackoverflow.com/a/38942581. The suggestion of using RPostgres (https://github.com/rstats-db/RPostgres) instead of RPostgreSQL resolved this same issue for me.
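For completeness, a sketch of an RPostgres connection (all values are placeholders; sslmode = "require" forces the SSL connection Heroku expects for external access):

library(DBI)
con <- dbConnect(
  RPostgres::Postgres(),
  host     = "<heroku-hostname>",
  port     = 5432,
  dbname   = "<database>",
  user     = "<user>",
  password = "<password>",
  sslmode  = "require"  # Heroku requires SSL for external connections
)
dbGetQuery(con, "select version()")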