NSIS nsODBC plugin not accepting all parameters - odbc

I am using the NSIS nsODBC plugin to create a system DSN entry.
When I call the command as shown below, it works and creates my system DSN:
nsODBC::AddSysDSN "ODBC Driver 11 for SQL Server" "DSN=test" "server=localhost" "DATABASE=test" "Trusted_Connection=Yes"
Pop $0
Pop $0 returns "Successful", and in my ODBC Data Source Administrator I can see the connection and it works.
But I need to create my system DSN with a username and password. On the forums and sites I have seen, they say it should look like the following:
nsODBC::AddSysDSN "ODBC Driver 11 for SQL Server" "DSN=test" "server=test" "DATABASE=test" "UID=test" "PWD=test"
Pop $0
When I run this command, Pop $0 returns the text "error" and no system DSN is created.
I have checked the SQL Server instance: it has the test database, a login called test that is a sysadmin account, and the instance is in mixed-mode authentication.
Any suggestions as to what I am doing wrong, and why it will not create a system DSN with a username and password?
Thanks in advance
Andy

That plug-in is badly designed because it uses everything on the stack as its parameters. It is also a very thin wrapper around the underlying ODBC API, so as long as your stack is empty before the call, the problem is most likely a missing or wrong parameter rather than a bug in the plug-in.
Calling the plugin with parameters like "foo=1" "bar=baz" is translated to foo=1\0bar=baz\0\0, which is what you see on MSDN; that is the C/C++ notation for a double-null-terminated list of key=value pairs.
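Under the hood this ends up in the ODBC installer API, roughly like the following minimal C sketch (an assumption: the plug-in presumably forwards to SQLConfigDataSource; the driver name and pairs are just the ones from your example):
// Minimal sketch: what AddSysDSN roughly amounts to at the API level.
// Link against odbccp32.lib on Windows.
#include <windows.h>
#include <odbcinst.h>
#include <stdio.h>

int main(void)
{
    // Embedded \0 separate the pairs; the compiler appends the final \0,
    // giving the double-null-terminated list that ConfigDSN expects.
    BOOL ok = SQLConfigDataSource(NULL, ODBC_ADD_SYS_DSN,
        "ODBC Driver 11 for SQL Server",
        "DSN=test\0UID=test\0PWD=test\0DATABASE=test\0SERVER=localhost\0");
    printf(ok ? "Successful\n" : "error\n");
    return 0;
}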
Why is Server set to test and not localhost?
The code listed on the MSDN page for ConfigDSN says:
For example, to configure a data source that requires a user ID, password, and database name, a setup application might pass the following keyword-value pairs:
DSN=Personnel Data\0UID=Smith\0PWD=Sesame\0DATABASE=Personnel\0\0
Maybe the pair order matters, I don't know.
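If you want to test that, a hedged NSIS sketch (untested; same values as your example, just reordered to follow the MSDN pairs) would be:
; Untested: pairs reordered to match the MSDN ConfigDSN example
nsODBC::AddSysDSN "ODBC Driver 11 for SQL Server" "DSN=test" "UID=test" "PWD=test" "DATABASE=test" "SERVER=localhost"
Pop $0
DetailPrint "AddSysDSN returned: $0"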

Related

Julia ODBC connection produces pop-up in Jupyter

This is my first time using Julia and I've written a test script to connect to a database as follows:
using ODBC
db = ODBC.DSN("DRIVER={SQL Server};SERVER=MyServer;DATABASE=MyDatabase;Trusted_Connection=Yes;");
However, when I execute the code in Jupyter, I get a driver login pop-up each time. I would like to log in automatically using Windows authentication rather than entering the login details manually. Can someone help?
Shouldn't it be True instead of Yes:
Trusted_Connection=True;
Also, I think you should define an ODBC data source on your system and use its name (e.g. MyDSN) in the connection string:
db = ODBC.DSN("DSN=MyDSN;DATABASE=MyDatabase;Trusted_Connection=Yes;");
I found the answer, which is to pass the prompt parameter as follows:
using ODBC
db = ODBC.DSN("DRIVER={SQL Server};SERVER=MyServer;DATABASE=MyDatabase;Trusted_Connection=Yes;";prompt=false);

Credentials for AWS Athena ODBC connection

I want to access AWS Athena in Power BI with ODBC. I used the ODBC driver (1.0.3) that Amazon provides:
https://docs.aws.amazon.com/de_de/athena/latest/ug/connect-with-odbc.html
To access the AWS service I use the user YYY and the password XXX. To access the relevant data, our administrator created a role “ExternalAthenaAccessRole#99999”.
99999 is the ID of the account where Athena runs.
To use the ODBC driver in Power BI I created the following connection string:
Driver=Simba Athena ODBC Driver;AwsRegion=eu-central-1;S3OutputLocation=s3://query-results-bucket/testfolder;AuthenticationType=IAM Credentials;
But when I enter the user YYY with the password XXX, I get the message “We couldn’t authenticate with the credentials provided. Please try again.”
Normally I would think that I must include the role “ExternalAthenaAccessRole#99999” in the connection string, but I couldn’t find a parameter for it in the documentation.
https://s3.amazonaws.com/athena-downloads/drivers/ODBC/SimbaAthenaODBC_1.0.3/Simba+Athena+ODBC+Install+and+Configuration+Guide.pdf
Can anybody help me how I can change the connection string so that I can access the data with the ODBC driver in Power BI?
TL;DR:
When using secret keys, do not specify a user/password; instead, always click on "default credentials" in Power BI to force it to use the local AWS configuration (e.g. C:/...$USER_HOME/.aws/credentials).
Summarized Guide for newbies:
Prerequisites:
The AWS CLI installed locally, on your laptop. If you don’t have it, just download the MSI installer from here:
https://docs.aws.amazon.com/cli/latest/userguide/install-windows.html
Note: this quick guide only covers configuring the connection with AWS access keys, not federating the credentials through any other security layer.
Configure your AWS credentials locally.
From the Windows command prompt (cmd), execute: aws configure
Enter your AWS Access Key ID, Secret Access Key and default region; for example "eu-west-1" for Ireland.
You can get these Keys from the AWS console, IAM service, Users, select your user, Security, Create/Download Access Keys.
You should never share these keys, and it’s highly recommended to rotate these, for example, every month.
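After running aws configure you should end up with something like this on disk (placeholder values; the keys go into ~/.aws/credentials and the region into ~/.aws/config):
~/.aws/credentials:
[default]
aws_access_key_id = AKIAXXXXXXXXXXXXXXXX
aws_secret_access_key = xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
~/.aws/config:
[default]
region = eu-west-1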
Download Athena ODBC Driver:
https://docs.aws.amazon.com/athena/latest/ug/connect-with-odbc.html
Important: if you have 64-bit Power BI, download the matching (32- or 64-bit) ODBC driver.
Install it on the laptop where you have Power BI.
Open the Windows ODBC Data Sources tool, add a User DSN, and select the Simba Athena driver.
Always use "Default credentials" rather than a user/password, since the driver will then use the local keys from step 1 (a DSN-less equivalent of this connection is shown after these steps).
Configure an S3 bucket for the temporary query results. You can use something like: s3://aws-athena-query-results-eu-west-1-power-bi
In the Power BI app, click on Get Data and type ODBC.
Choose "default" credentials, to use the local AWS keys from step 1, and, optionally, enter a "select" query.
Click Load to load the data.
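If you prefer a DSN-less connection string like the one in the question, something along these lines should pick up the same local credentials, assuming the Simba driver accepts "Default Credentials" as the AuthenticationType (region and result bucket are placeholders):
Driver=Simba Athena ODBC Driver;AwsRegion=eu-west-1;S3OutputLocation=s3://aws-athena-query-results-eu-west-1-power-bi/;AuthenticationType=Default Credentials;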
Important concern: I’m afraid Power BI will load all the results of the query into local memory. So if, for example, we're bringing in 3 months of data and that is equivalent to 3 GB, we will consume that much memory on the local laptop.
Another important concern:
- For security reasons, you'll need to implement KMS encryption keys. Otherwise, the data is transmitted in clear text instead of being encrypted.
Relevant reference (as listed above), where you can find the steps for this entire configuration process in more detail:
- https://s3.amazonaws.com/athena-downloads/drivers/ODBC/Simba+Athena+ODBC+Install+and+Configuration+Guide.pdf
Carlos.

SQL Server 2017 ODBC via RStudio or R in SSMS connects only to the master database

I have been working with SQL Server 2017 via R (in RStudio as well as R in SSMS) and I am unable to connect to a specific database. I mention the database name in the connection prompt, but it connects only to the master database. Is there something I am missing while connecting?
The syntax I use for connection is:
conn = "Driver={ODBC Driver 13 for SQL Server};server=;Uid=uid; pwd=pwd;Database = mydb"
I am trying to use both RevoScaleR and the odbc package in RStudio to connect to a specific database, but it still connects to the master database. Using the RStudio Connections pane, if I try to explore the other databases, it shows only dbo schemas and no other schemas, even if they exist. Can someone help me figure out what might have gone wrong?
Most likely the login you use (the uid) is not authorized for that particular database (it is not created as a user in that database).
Some example code you can run in SSMS as, for example, sa:
--switch over to the database in question
USE mydb
GO
CREATE USER uid FOR LOGIN uid;
The above code creates a user in the database in question with the same name as the login.
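Once the user exists, a quick sanity check from R is to ask the server which database you actually landed in. A minimal sketch using the DBI and odbc packages (server name and credentials are placeholders):
library(DBI)

con <- dbConnect(odbc::odbc(),
  .connection_string = "Driver={ODBC Driver 13 for SQL Server};Server=myserver;Uid=uid;Pwd=pwd;Database=mydb")

# Should return "mydb" rather than "master" once the login is mapped to a user
dbGetQuery(con, "SELECT DB_NAME() AS current_database")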
Hope this helps!

odbcDriverConnect issue trying to connect to an Access Database in R

I am using 32-bit R and am having an issue getting the odbcDriverConnect function to work when connecting to an Access database. I have successfully connected to the database using odbcConnect, but I am also trying to learn how to use the odbcDriverConnect function.
My code is
scallopdata<-odbcDriverConnect("Driver={Microsoft Access Driver (*.mdb, *.accdb)};Dbq=S://adv/Scallop Central/2014 RSARR/2014 RSA Database_9_3_2014.mdb;
Uid=admin;Pwd=")
When I run the code, I get an error message of
ODBC Microsoft Access Driver Login Failed. Could not find file S://adv/Scallop Central/2014 RSARR/2014 RSA Database_9_3_2014.mdb.
I click the OK button, which takes me to a Login box. I select the Database... button, which brings me to a Select Database box where I can select the same database that is specified in the Dbq section of the code. Once I select the correct database and click OK, I am connected to the database.
I am hoping to use the odbcDriverConnect function so that I do not have to set up a new ODBC DSN for each database I would like to access. This may just be me not fully understanding the function.
If anyone can provide some insight, it would be very helpful.

How to avoid storing userid/password in the .odbc.ini file on Linux?

I am connecting to a Teradata database through ODBC with Stata on an Ubuntu server (12.04 LTS). Everything works fine, except that I have my TD userid and password stored in the .odbc.ini file, which seems like a terrible idea. The alternative is to enter them in Stata, which seems even worse and is awkward. Is there a way to do this more securely? The login info that I use to ssh into the server is synced with the TD database. It seems that it should be possible to pass that information along.
In ODBC terms you do not need to store usernames/passwords in any of your ODBC ini files. Both the ODBC SQLConnect and SQLDriverConnect calls support passing in the username/password at the time they are called.
SQLDriverConnect would need something in your InConnectionString like "DSN=YourDataSourceName;UID=username;PWD=password".
You could go one step further and pass in the whole connection string as a command-line argument, meaning that you would not need an ODBC data source in an ini file at all. I'm sure one of the forum readers can post a sample for you for Teradata.
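In Stata specifically, the odbc command already takes credentials as options, so a hedged sketch along these lines (DSN, table, and credentials are placeholders; check help odbc for the connect options your version supports) keeps them out of .odbc.ini:
. odbc load, exec("SELECT * FROM mytable") dsn("tdprod") user("myuser") password("mypass") clear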
As for passing in the username and password from your SSH login: your application would need to capture that and pass it along to ODBC.
If you want to establish a finer grain of security around your odbc.ini file, or other files on your Ubuntu server that may contain user credentials, I would strongly suggest using Access Control Lists (ACLs). Beyond the typical Owner::Group::World permissions, you can grant or deny explicit permissions on a given file down to a specific user.
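For example, a hedged sketch (statauser is a placeholder for the one account that should still be able to read the file):
chmod 600 ~/.odbc.ini                 # owner-only read/write
setfacl -m u:statauser:r ~/.odbc.ini  # grant read access to one named user
getfacl ~/.odbc.ini                   # verify the resulting ACL entries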
Other options regarding security on Teradata include the use of LDAP authentication if your environment supports it. Configuring LDAP on Teradata is beyond the scope of SO and in many cases a billable, professional services engagement with Teradata's Information Security CoE.
