ODBC connects, but MS Access 2016 won't - odbc

Hello, I've been struggling with this one:
Up until a few months ago I had an MS Access 2010 database, its matching SQL Server ODBC data source, and life was good.
Then came MS Access 2016 with a new laptop, and now whenever I try to use the old database (I've also created fresh ones in the new version just to test), Access returns one of these errors:
a) SQL Server Error 53 (connection failed)
b) SQL Server Error 17: Server does not exist or access is denied.
(I use linked tables and import the data I need with queries.)
What really complicates the diagnosis is that:
When I test the ODBC connection by itself, the test comes back "successful".
I tried pairing the ODBC data source with Excel, and it got through; I could see the list of tables to link. So again, the ODBC connection works.
If I log into the company VPN (even while already connected to the internal work network), the MS Access 2016 queries magically work as intended.
So, there is something I'm missing with Access itself and some network setting somewhere, but my own IT department is stumped.
Any ideas are welcome on what may be happening. Thanks!

Related

Excel ODBC Connection Issues and HY000 CURLError

I am trying to connect Excel to our Snowflake instance so that users can pull data from Snowflake into Excel. I've installed the latest ODBC driver and set the User and Server as required. The authenticator is set to externalbrowser. When we attempt to use that connection within Excel, one of two things occurs:
A bunch of browser tabs open but the user is able to connect and bring in data. Not sure why we have multiple tabs but at least they get what they need.
The connection just spins, and ultimately we end up with an HY000 error saying that the REST request for our URL failed; code=52 msg=Server returned nothing (no headers, no data), OS code=2, osMsg='No such file or directory'.
We have tried multiple options, and all of our other connections (JDBC, for example) work just fine with the external browser setting. There doesn't seem to be much of a difference between users that can connect and those that can't.
There is a good Knowledge Base article written about connecting to Snowflake from Excel.
Skip Step 1 (Configure SSO with Azure AD) if you are not using SSO, and go straight to Step 2.
This is the article: https://community.snowflake.com/s/article/HOW-TO-connect-to-Snowflake-authenticating-with-Azure-AD-SSO-from-MS-Excel-ODBC-driver

ODBC 18 vs ODBC 17 in Windows Data Source Manager

I have a Microsoft Access front end connecting to a SQL Server database backend. I have been using this setup for the last 4 years, and I have recently run into issues with new associates not being able to use the tool because our company is retiring ODBC Driver 17 from our internal systems. I don't understand what difference between ODBC Driver 17 and 18 would cause version 18 to fail.
How the driver is used:
In the ODBC Data Source Administrator, a manual link to our database is created. The associate enters a specific name for the link, "Our_link", and the driver name shows "ODBC Driver 17 for SQL Server".
Then, inside our Access front end, we connect with that driver like so:
Const ConStrSQL As String = "DRIVER={ODBC Driver 17 for SQL Server};Server=OurServer;Database=Our_DB;UID=User();Trusted_Connection=Yes;"
The issue I am having is that when I try to create the ODBC connection in the Data Source Administrator using ODBC Driver 18, I get an error that states:
"Connection Failed: The certificate chain was issued by an authority that is not trusted"
Not sure if this extra information would help but I also see the following:
SQLState: 08001
SQL Server Error -2146893019
Client unable to establish connection
Is this something I need to reach out to our database admin group and ask if they installed driver 18 on the server side?
I'm guessing it has to do with the changes to encryption behavior in version 18, specifically that encryption is required by default. The recommended fix is to install a trusted certificate on your server[1], but if you don't want to deal with the DB admins you might still be able to connect by specifying Encrypt=no (or optional) in your connection string.[2]
There is a chance that won't work if the server is set to Force Encryption, but it sounds like the change is all on the client end. Ideally you would want the encryption working all the time, so if you are using a self-signed certificate, add the public key from the SQL server to the trusted certificates on the client machines.
[1]: https://techcommunity.microsoft.com/t5/sql-server-blog/odbc-driver-18-0-for-sql-server-released/ba-p/3169228
[2]: https://learn.microsoft.com/en-us/sql/connect/odbc/dsn-connection-string-attribute?view=sql-server-ver16#encrypt
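As a rough sketch (not from the original question, and assuming the same server and database names as the ConStrSQL constant above), the DSN-less connection string could be adjusted for Driver 18 in one of two ways. Either opt out of the new encrypt-by-default behavior:
DRIVER={ODBC Driver 18 for SQL Server};Server=OurServer;Database=Our_DB;Trusted_Connection=Yes;Encrypt=no;
or keep encryption on but accept the server's self-signed certificate:
DRIVER={ODBC Driver 18 for SQL Server};Server=OurServer;Database=Our_DB;Trusted_Connection=Yes;Encrypt=yes;TrustServerCertificate=yes;
The second form corresponds to the "Trust Server Certificate" checkbox fix described below.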
The 'fix' that was found with the help of one of my database admins is as follows:
In the data source manager there is an option to select that states "Trust Server Certificate"
Once that option is selected, I was able to complete the rest of my data source connection. One thing to note: I was receiving the previous error when trying to change the DEFAULT DATABASE option, and the "Trust Server Certificate" checkbox is on the screen after that one. So I had to skip choosing my default database, check the box, then go back and select my default database for everything to work.
I haven't completed all my testing in Access to make sure everything works 100%, but my quick testing is very promising.

SQL Server 2017 ODBC via RStudio or R on SSMS gets connected only to master database

I have been working with SQL Server 2017 via R (in RStudio as well as R through SSMS) and I am unable to connect to a specific database. I specify the database name in the connection string, but it connects only to the master database. Is there something I am missing while connecting?
The syntax I use for connection is:
conn = "Driver={ODBC Driver 13 for SQL Server};server=;Uid=uid; pwd=pwd;Database = mydb"
I am trying to use both RevoScaleR and the odbc package in RStudio to connect to a specific database, but it still connects to the master database. Using the RStudio Connections pane, if I try to explore the other databases, it shows only dbo schemas and no other schemas even if they exist. Can someone help me figure out what might have gone wrong?
Most likely the login you use (the uid) is not authorized for that particular database (it is not created as a user in that database).
Some example code you can run in SSMS as - for example - sa:
--switch over to the database in question
USE mydb
GO
CREATE USER uid FOR LOGIN uid;
The above code creates a user in the database in question with the same name as the login.
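If the login can then reach the database but still cannot see or query its tables, it may also need database role membership. A minimal sketch, assuming read-only access is enough (run it in the same database):
USE mydb
GO
--let the new user read all tables in the database
ALTER ROLE db_datareader ADD MEMBER uid;
GO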
Hope this helps!

what's the issue with AttachDbFilename

Apparently, using AttachDbFilename and user instance in your connection string is a bad way to connect to a DB. I'm using SQL Server Express on my local machine and it all seems to work fine. But what's the proper way to connect to SQL Server then?
Thanks for your explanation.
Using User Instance means that SQL Server is creating a special copy of that database file for use by your program. If you have two different programs using that same connection string, they get two entirely different copies of the database. This leads to a lot of confusion, as people will test updating data with their program, then connect to a different copy of their database in Management Studio, and complain that their update isn't working. This sends them through a flawed series of wild goose chase steps trying to troubleshoot the wrong problem.
This article goes into more depth about how to use this feature, but heed the very first note: the User Instance feature has been deprecated. In SQL Server 2012, the preferred alternatives are (in this order, IMHO):
Create or attach your database to a real instance of SQL Server. Your connection string will then just need to specify the instance name, the database name, and credentials (a sketch follows this list). There will be no mixup, as Management Studio, Visual Studio and your program(s) will all be connecting to a single copy of the database.
Use a container for local development. Here's a great starter video by Anna Hoffman and Anthony Nocentino, and I have some other resources here, here, and here. If you're on an M1 Mac, you won't be able to use a full-blown SQL Server instance, but you can use Azure SQL Edge if you can get by with most SQL Server functionality (the omissions are enumerated here).
Use SqlLocalDb for local development. I believe I pointed you to this article yesterday: "Getting Started with SQL Server 2012 Express LocalDB."
Use SQL Server Compact. I like this option the least because the functionality and syntax is not the same - so it's not necessarily going to provide you with all the functionality you're ultimately going to want to deploy. Compact Edition is also deprecated, so there's that.
Of course if you are using a version < SQL Server 2012, SqlLocalDb is not an option - so you should be creating a real database and using that consistently. I only mention the Compact option for completeness - I think that can be almost as bad an idea as using AttachDbFileName.
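As a rough sketch of what the connection strings for options 1 and 3 might look like (server, instance, and database names here are placeholders, not taken from the question):
Data Source=.\SQLEXPRESS;Initial Catalog=MyDb;Integrated Security=True;
Data Source=(localdb)\v11.0;Initial Catalog=MyDb;Integrated Security=True;
(localdb)\v11.0 is the default instance name for SQL Server 2012 LocalDB; later versions use (localdb)\MSSQLLocalDB.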
EDIT: I've blogged about this here:
Bad Habits : Using AttachDBFileName
In case someone else has this problem:
When attaching the database with a connection string containing AttachDbFilename with SQLEXPRESS, I noticed the connection was exclusive to the ASP.NET application that was using the database. The connection blocked access for all other processes at the file level when made with System.Data.SqlClient as the provider.
To ensure the connection is shareable with other processes, use Database to specify the database name in your connection string instead.
Example of a connection string:
Data Source=.\SQLEXPRESS;DataBase=PlaCliGen;User ID=XXX;password=ZZZ; Connect Timeout=30
where PlaCliGen is the name (or logical name) by which the SQLEXPRESS server knows the database.
When connecting to the database with AttachDbFilename giving the path to the .mdf file (namely, replacing Database=PlaCliGen with AttachDbFilename=c:\vs\placligen\app_data\placligen.mdf), the file was opened exclusively and no other process could connect to the database.

Pull Sybase data into SQL Server

I have an ASP.NET app that uses a SQL Server database. I now need to pull data from Sybase ASE into that SQL Server database for my app to consume, and I'm not having any success with my ideas.
Has anyone done this? Any ideas/suggestions/tips?
You can configure a linked server from SQL Server to Sybase. It should be fairly vanilla using the Sybase provider on the MS side.
Okay, I've finally (through lame trial and error) found out how to link my Sybase ASE (12.5) server to my SQL Server (2008) which will allow the integration I want. Here's roughly how I did it:
Logged in to Sybase ASE OLE DB Configuration Manager (this is like the Sybase version of Windows' ODBC Data Sources) and added an OLE DB data source. I believe you must be an admin on the PC to do this.
In SQL Server 2008 Management Studio, went to Server Objects > Linked Servers. Right click and select "New Linked Server".
In the Linked Server Properties, I set the following properties:
General:
--Linked server: the name of your linked server as you want it to appear in your linked server list
--Provider: Select Sybase ASE OLE DB Provider from the dropdown list.
--Product name: The exact name of the OLE DB data source you just created in Sybase ASE OLE DB Configuration Manager.
--Data source: Same as Product name.
--Provider string: I left this blank
--Location: I left this blank
--Catalog: The default database (master or whatever) to log on to.
Security:
--You need to map a valid SQL Server logon to a valid Sybase logon. I did not use impersonation (which does a credentials pass-thru).
--I chose "Be made without using a security context" for my connection.
Server Options:
--All the defaults worked for me.
Throughout, the standard SQL Server help worked fairly well as a guide. Though not always true, F1 was my friend here.
I can now do distributed queries, DTS or SSIS packages, and use SSRS. This takes a lot of the suck out of Sybase ASE.
Of course the above can be done in T-SQL using sp_addlinkedserver, but the GUI is more comfortable for a lowly dev like me.
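For reference, a rough T-SQL equivalent of the GUI steps above; the provider name, data source name, and table name below are assumptions for illustration, so substitute whatever your Sybase OLE DB configuration actually uses:
-- Create the linked server (General tab)
EXEC sp_addlinkedserver
    @server = N'SYBASE_LINK',         -- name shown under Linked Servers
    @srvproduct = N'MyAseDataSource', -- product name (the steps above used the same value as the data source)
    @provider = N'ASEOLEDB',          -- Sybase ASE OLE DB Provider (assumed ProgID)
    @datasrc = N'MyAseDataSource',    -- OLE DB data source created in the Sybase configuration manager
    @catalog = N'master';             -- default database to log on to
-- Map logins (Security tab); @useself = 'FALSE' with no remote user is
-- "Be made without using a security context" (add @rmtuser/@rmtpassword for a fixed remote login)
EXEC sp_addlinkedsrvlogin
    @rmtsrvname = N'SYBASE_LINK',
    @useself = N'FALSE';
-- Quick smoke test with a distributed query
SELECT TOP (10) * FROM OPENQUERY(SYBASE_LINK, 'SELECT * FROM some_ase_table');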
Use Management Studio or Enterprise Manager to import the data using the data import wizard. That should be it; just make sure you pick the right data provider in the wizard and you should be good to go.
If you want this to be a live feed, create a small Windows service to manage the exchange of information. It should be relatively simple to do, just a little bit of legwork on your end. If you are averse to that, there are plenty of off-the-shelf solutions that can do this for you.
The question is a little vague on specifics:
Is this a one-time conversion or part of a repeated process?
Is the source machine "reachable" from your destination machine (can you connect the two, or do you need to read in files)?
With most conversions there are two parts:
Physically getting data from the source into the destination.
Mapping data from the source to the destination tables.
It is hard to make any recommendations without more info. What would be fine for a one-time conversion would not work if you need to read in data all day, every day. Also, if the source database cannot be connected to and you have to pass files, the methods change.
