Credentials for AWS Athena ODBC connection

I want to access AWS Athena in Power BI with ODBC. I used the ODBC driver (1.0.3) that Amazon provides:
https://docs.aws.amazon.com/de_de/athena/latest/ug/connect-with-odbc.html
To access the AWS service I use the user=YYY and the password=XXX. To access the relevant data, our administrator created a role “ExternalAthenaAccessRole#99999”.
99999 is the ID of the account where Athena runs.
To use the ODBC driver in Power BI I created the following connection string:
Driver=Simba Athena ODBC Driver;AwsRegion=eu-central-1;S3OutputLocation=s3://query-results-bucket/testfolder;AuthenticationType=IAM Credentials;
But when I enter the user YYY with the password XXX, I get the message “We couldn’t authenticate with the credentials provided. Please try again.”
Normally I would think that I must include the role “ExternalAthenaAccessRole#99999” in the connection string, but I couldn’t find a parameter for it in the documentation.
https://s3.amazonaws.com/athena-downloads/drivers/ODBC/SimbaAthenaODBC_1.0.3/Simba+Athena+ODBC+Install+and+Configuration+Guide.pdf
Can anybody tell me how to change the connection string so that I can access the data with the ODBC driver in Power BI?

TL;DR:
When using secret keys, do not specify "User / password"; instead, always click on "default credentials" in Power BI to force it to use the local AWS configuration (e.g. C:/...$USER_HOME/.aws/credentials).
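If you build the connection string by hand instead (as in the question), the same idea would look something like this; this is only a sketch, assuming the AuthenticationType value "Default Credentials" from the Simba configuration guide linked below applies to your driver version:
Driver=Simba Athena ODBC Driver;AwsRegion=eu-central-1;S3OutputLocation=s3://query-results-bucket/testfolder;AuthenticationType=Default Credentials;
With that, the driver picks up the keys from the local AWS profile and Power BI no longer needs a user/password.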
Summarized Guide for newbies:
Prerequisites:
AWS CLI installed locally, on your laptop. If you don’t have it, just download the MSI installer from here:
https://docs.aws.amazon.com/cli/latest/userguide/install-windows.html
Note: this quick guide is just to configure the connection using AWS Access Keys, and not federating the credentials through any other Security layer.
Configure your AWS credentials locally.
From the Windows command prompt (cmd), execute: aws configure
Enter your AWS Access Key ID, Secret Access Key and default region; for example "eu-west-1" for Ireland.
You can get these Keys from the AWS console, IAM service, Users, select your user, Security, Create/Download Access Keys.
You should never share these keys, and it’s highly recommended to rotate these, for example, every month.
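After aws configure finishes, you should end up with files roughly like these (a sketch with placeholder values; by default the keys land in %USERPROFILE%\.aws\credentials and the region in %USERPROFILE%\.aws\config):
[default]
aws_access_key_id = AKIAXXXXXXXXXXXXXXXX
aws_secret_access_key = xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
and, in the config file:
[default]
region = eu-west-1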
Download Athena ODBC Driver:
https://docs.aws.amazon.com/athena/latest/ug/connect-with-odbc.html
Important: if you have Power BI 64-bit, download the matching (32- or 64-bit) ODBC driver.
Install it on your laptop, where you have Power BI.
Open the Windows ODBC Data Source Administrator, add a User DSN and select Simba Athena as the driver.
Always use "Default credentials" and not user/password, since it will use our local keys from step 1.
Configure an S3 bucket for the temporary results. You can use something like: s3://aws-athena-query-results-eu-west-1-power-bi
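Put together, the User DSN ends up with values along these lines (a sketch; the exact field labels come from the Simba DSN dialog and may differ slightly by driver version, and the DSN name is just an example):
Data Source Name: Athena-PowerBI
AWS Region: eu-west-1
S3 Output Location: s3://aws-athena-query-results-eu-west-1-power-bi
Authentication Type: Default Credentials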
On the Power BI app, click on Get Data and choose ODBC.
Choose "default" credentials, to use the local AWS keys (from step 1), and optionally enter a "select" query.
Click on Load to load the data.
Important concern: I’m afraid Power BI will load all the results from the query into our local memory. So if, for example, we're bringing 3 months of data and that is equivalent to 3 GB, then we will consume this on our local laptop.
Another important concern:
- For security reasons, you'll need to implement KMS encryption keys. Otherwise, the data is transmitted in clear text instead of being encrypted.
Relevant reference (as listed above), where you can find the steps for this entire configuration process in more detail:
- https://s3.amazonaws.com/athena-downloads/drivers/ODBC/Simba+Athena+ODBC+Install+and+Configuration+Guide.pdf
Carlos.

Related

ODBC 18 vs ODBC 17 in Windows Data Source Manager

I have a Microsoft Access front end connecting to a SQL database for the backend. I have been using this setup for the last 4 years and have recently run into issues with new associates not being able to use the tool, because our company is retiring ODBC Driver 17 from our internal systems. I don't understand what difference between ODBC Driver 17 and 18 would cause version 18 to fail.
How the driver is used:
In the ODBC Data Source Manager a manual link to our database is created. The associate enters a specific name for the link, "Our_link", and in the driver name it states "ODBC Driver 17 for SQL Server".
Then inside of our access front end we link to that driver like so:
Const ConStrSQL As String = "DRIVER={ODBC Driver 17 for SQL Server};Server=OurServer;Database=Our_DB;UID=User();Trusted_Connection=Yes;"
The issue I am having is when I try to create the ODBC connection in the data source administrator using ODBC driver 18 I get an error that states:
"Connection Failed: The certificate chain was issued by an authority that is not trusted"
Not sure if this extra information would help but I also see the following:
SQLState: 08001
SQL Server Error -2146893019
Client unable to establish connection
Is this something I need to reach out to our database admin group and ask if they installed driver 18 on the server side?
I'm guessing it has to do with the changes to the encryption behavior in version 18, specifically that encryption is required by default. The recommended fix is to install a trusted certificate on your server[1], but if you don't want to deal with the DB admins you might still be able to connect by specifying No (or Optional) for Encrypt in your connection string.[2]
There is a chance that won't work if the server is set to Force Encryption, but it sounds like the change is all on the client end. Ideally you would want the encryption working all the time, so if you are using a self-signed certificate, add the public key from the SQL Server to the trusted certificates on the client machines.
[1]: https://techcommunity.microsoft.com/t5/sql-server-blog/odbc-driver-18-0-for-sql-server-released/ba-p/3169228
[2]: https://learn.microsoft.com/en-us/sql/connect/odbc/dsn-connection-string-attribute?view=sql-server-ver16#encrypt
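For example, keeping the layout of your original connection string, a hedged sketch would be (Encrypt and TrustServerCertificate are the attributes documented in [2]; whether Encrypt=Optional is honoured depends on the server not forcing encryption):
DRIVER={ODBC Driver 18 for SQL Server};Server=OurServer;Database=Our_DB;Trusted_Connection=Yes;Encrypt=Optional;
or, if you would rather keep encryption on and simply accept the server's self-signed certificate:
DRIVER={ODBC Driver 18 for SQL Server};Server=OurServer;Database=Our_DB;Trusted_Connection=Yes;Encrypt=Yes;TrustServerCertificate=Yes;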
The 'fix' that was found with the help of one of my database admins is as follows:
In the data source manager there is an option to select that states "Trust Server Certificate"
Once that option is selected, I was able to complete the rest of my DSM connection. One thing to note: I was receiving the previous error when trying to change the DEFAULT DATABASE option. The checkbox to "trust server certificate" is on the screen after that, so I had to skip choosing my default database, check the box, then go back and select my default database for everything to work.
I haven't completed all my testing in Access to make sure everything works 100%, but my quick testing is very promising.

Snowflake ODBC works with Azure and fails with AWS

I created an Azure Snowflake trial account and an ODBC DSN, and it works.
Then I had to create a Snowflake trial in AWS to use the Snowflake training.
When creating a DSN it fails with this error: Incorrect username or password was specified.
For Azure, I use ..snowflakecomputing.com, this works.
For AWS I use .snowflakecomputing.com, I get the user error.
I tried other combinations but then I always get a host unresolved error.
..snowflakecomputing.com
.sg..aws.snowflakecomputing.com
Thanks for any hints.
As per https://docs.snowflake.com/en/user-guide/organizations-connect.html your Snowflake URL / server name is in this format:
https://<account_locator>.<region>.<cloud>.snowflakecomputing.com
The cloud is not needed if using AWS, but it is if using Azure or GCP. The region is not needed if you are in us-east-1, but otherwise it must be specified.
Now, it doesn't matter where you are trying to connect to Snowflake from - it only matters where your Snowflake instance is.
So, if you made your Snowflake instances in AWS it is the exact same URL / server name whether you connect to it via the web console, a command-line tool, ODBC from an AWS application, JDBC from an Azure application, or anything else.
Thus, if you made the Snowflake account in AWS you will use something like
https://FV23210.ap-southeast-2.snowflakecomputing.com
no matter where you call it from; if you made the Snowflake account in Azure you will use something like
https://FV23210.ap-southeast-2.azure.snowflakecomputing.com
no matter where you call it from.
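In the ODBC DSN dialog that translates to the Server field holding just that host name, without the https:// prefix. A sketch, reusing the placeholder account above (the remaining fields depend on your trial setup and can also be set later):
Server: FV23210.ap-southeast-2.snowflakecomputing.com
User: your_snowflake_login
Database / Schema / Warehouse / Role: optional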

Accessing Reflection for UNIX and OpenVMS outside of Reflection

My place of business currently uses Reflection for Unix and OpenVMS to handle a database of customers. I access this database directly through the Reflection emulation. The only way to get data out of Reflection is to navigate to a single customer via keyboard input and print the information to a .txt.
Is there any way I can access the VM other than through Reflection, with the end goal of automating retrieval of customer information from a Java script executed outside of the Reflection environment? This is the information I can gather via the Reflection interface about what I am connecting to:
At the bottom of the Reflection interface - VT500-7 -- HOST_NAME via SECURE SHELL
Via the Connection Setup drop-down:
Host name: HOST_NAME
SSH config scheme: AutoKeyLogin
User name: username
Via the Security... button:
General tab:
Port number: 22
User Authentication: [x] Public Key
[x] Password
User Keys tab:
Use Name Type Location
[x] username1user DSA C:\Documents\PathToSSHKey\.ssh
Host Keys tab:
Host Type Fingerprint
HOST_NAME, 111.1.111.11, 22 DSA 39:14:f3:123:fds:restOfFingerprint
There is more information available if the solution is possible but I have just not provided enough to solve it, so please ask.
Given that I have the host name, port, .ssh, and host key, is it possible to connect to and read from the VM that I am otherwise connecting to normally via the Reflection emulator?
NO. Reflection (another example is PuTTY) is just a dumb-terminal emulator, here using the (secure) SSH protocol to connect to some operating system. From the information provided we cannot even tell which OS. Maybe OpenVMS, maybe some Unix. Most certainly not a 'VM', but a physical box. Maybe an Alpha, Integrity, Sun, IBM or Intel server.
IF, big if, it is OpenVMS you would possibly see something like this flash by on entry:
XXX - HP rx2600 (1.50GHz/6.0MB) OpenVMS IA64 V8.3-1H1
Last interactive login on Thursday, 7-DEC-2017 13:23:19.83
Last non-interactive login on Wednesday, 6-DEC-2017 12:35:45.80
Most likely the username used is set up to always start a (shell) script which starts a menu, from which a program is activated that knows how to access the data records. IF it is OpenVMS, then the actual data is likely stored in RMS (indexed) files, but it could be in a proper database (Oracle Rdb or an RDBMS).
If bulk access to the data is needed then you need to talk to the system/application manager for the system 'HOST_NAME' and ask them about the application and its database.
You may find that there is FTP, ODBC or JDBC or native DB (OCI?) access to the data available already, or that this can be requested. Likely tools in this space are ConnX, Attunity Connect, and such.
First you'll need to find out which OS/platform/version, which application (3rd party? home grown? 4GL? Cobol? Basic?) and, ultimately, which database/storage method.
That's not to say that some terminal emulator cannot be 'tricked' (google "screen scraping") into being programmed to fetch a series of data on command, but that will always be error prone and laborious for limited volumes.
You are better off trying to get proper data access.
Good luck! You'll need some.
Hein

R connect to database

Sorry but I am failing at a very simple task right now.
I have the following database information:
database name
hostname
port
SID
TNS
User ID
password
I want to build a connection with the RODBC package.
According to the results of my Google search I should do
conn<-odbcConnect(dsn, uid=***, pwd=***)
What is "dsn"? Is this even the right way?
dsn is Data Source Name, which is a shortcut you may define on your machine to store key information about the connection. How you set up a DSN varies depending on your operating system.
I write scripts that run on multiple machines, so rather than use a DSN, I use odbcDriverConnect, via something like
odbcDriverConnect(connection="driver=[driver]; server=[server]; database=[database]; uid = [User ID]; pwd = [password]")
You'll need to know your driver name to make this work. Where to find this will depend on your operating system, as well as the flavor of SQL you are using.
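For example, with the Oracle-style details listed in the question, a DSN-less connection might look roughly like this. This is only a sketch: the driver name "Oracle in OraClient11g_home1" and the hostname:port/SID form of dbq are assumptions that depend on which Oracle client and ODBC driver you actually have installed.
library(RODBC)
# The driver name must match an entry in your ODBC driver list exactly.
conn <- odbcDriverConnect(
  connection = "driver={Oracle in OraClient11g_home1};dbq=hostname:port/SID;uid=myuser;pwd=mypassword"
)
sqlQuery(conn, "SELECT * FROM some_table")
odbcClose(conn)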

How to avoid storing userid/password in the .odbc.ini file on Linux?

I am connecting to a Teradata database through ODBC with Stata on an Ubuntu server (12.04 LTS). Everything works fine, except that I have my TD userid and password stored in the .odbc.ini file, which seems like a terrible idea. The alternative is to enter them in Stata, which seems even worse and is awkward. Is there a way to do this more securely? The login info that I use to ssh into the server is synced with the TD database. It seems that it should be possible to pass that information along.
In ODBC terms you do not need to store usernames / passwords in any of your ODBC ini files. Both the ODBC SQLConnect and SQLDriverConnect support the passing in of username / password at the time they are called.
SQLDriverConnect would need something in your InConnectionString like "DSN=YourDataSourceName;UID=username;PWD=password".
You could go one step further and pass in the whole connection string as a command-line argument, meaning you would not need an ODBC data source in an ini file at all. I'm sure one of the forum readers can post a sample for you for Teradata.
As for passing in the user name and password from your SSH login: your application would need to capture that and pass it to ODBC.
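Concretely, the odbc.ini entry can carry only the non-secret connection details, with the credentials supplied in the connection string at call time. A sketch, assuming the Teradata ODBC driver on Linux (the Driver path and keyword names depend on your Teradata client installation):
[tdprod]
Driver=/opt/teradata/client/odbc_64/lib/tdata.so
DBCName=tdprod.example.com
You then pass "DSN=tdprod;UID=username;PWD=password" to SQLDriverConnect at run time, so nothing secret lives in the ini file.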
If you want to establish a finer grain of security around your odbc.ini file, or other files on your Ubuntu server that may contain user credentials, I would strongly suggest the use of Access Control Lists (ACLs). Beyond the typical Owner::Group::World permissions, you can specify permissions down to a specific user, allowing or denying an explicit permission for a given file.
Other options regarding security on Teradata include the use of LDAP authentication if your environment supports it. Configuring LDAP on Teradata is beyond the scope of SO and in many cases a billable, professional services engagement with Teradata's Information Security CoE.
