Compare ODBC settings across multiple database servers

I've run into this problem with my three-node SQL cluster, though it's not unique to clusters. We have a dozen different ODBC drivers installed, both x86 and x64 versions, and we're constantly finding instances where some nodes in the cluster have a different version of a driver, are missing a driver entirely, or have one configured improperly. Especially in a cluster, it's critical that all nodes have the same configuration; otherwise jobs can fail unexpectedly on one node while running fine on another, which leads to hours of frustration.
Is there a tool out there that will compare the installed/configured ODBC drivers and data sources and produce a report of what's out of sync? I've considered writing something to do this in the past, but haven't gotten around to it. If it's an issue for others and there's no existing tool that does it, I'll put one together.

It seems that all of the information related to your ODBC settings is stored together in the registry. Since nobody else knows of an app to compare these settings, I'll throw one together, post it on my website, and put a link here.
If you want to compare the settings yourself, they're stored at:
HKLM\SOFTWARE\ODBC\ODBC.INI\ (your data sources)
HKLM\SOFTWARE\ODBC\ODBCINST.INI\ (your installed providers)
Also, it's worth noting that if you're on an x64 machine, there are both x64 and x86 ODBC drivers and data sources, and they're stored separately. In that case, check the accepted answer on the following post to see which location you should be looking in:
http://social.msdn.microsoft.com/Forums/en/netfx64bit/thread/92f962d6-7f5e-4e62-ac0a-b8b0c9f552a3
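In the meantime, here is a rough sketch of what the comparison could look like (assuming Python 3 on Windows; the node names are placeholders, and reading another machine's registry requires the Remote Registry service to be running there):

import winreg

# Registry locations listed above; the Wow6432Node paths hold the 32-bit
# drivers and data sources on an x64 machine.
KEYS = [
    r"SOFTWARE\ODBC\ODBC.INI",                  # 64-bit data sources
    r"SOFTWARE\ODBC\ODBCINST.INI",              # 64-bit drivers
    r"SOFTWARE\Wow6432Node\ODBC\ODBC.INI",      # 32-bit data sources
    r"SOFTWARE\Wow6432Node\ODBC\ODBCINST.INI",  # 32-bit drivers
]

def read_odbc_tree(machine):
    """Return {registry path: {subkey: {value name: value}}} for one machine."""
    hklm = winreg.ConnectRegistry(rf"\\{machine}", winreg.HKEY_LOCAL_MACHINE)
    tree = {}
    for path in KEYS:
        entries = {}
        try:
            with winreg.OpenKey(hklm, path) as key:
                for i in range(winreg.QueryInfoKey(key)[0]):
                    name = winreg.EnumKey(key, i)
                    with winreg.OpenKey(key, name) as sub:
                        values = {}
                        for j in range(winreg.QueryInfoKey(sub)[1]):
                            vname, vdata, _ = winreg.EnumValue(sub, j)
                            values[vname] = vdata
                        entries[name] = values
        except FileNotFoundError:
            pass  # this registry view doesn't exist on this machine
        tree[path] = entries
    return tree

nodes = ["NODE1", "NODE2", "NODE3"]  # placeholder server names
trees = {n: read_odbc_tree(n) for n in nodes}
baseline = nodes[0]
for node in nodes[1:]:
    for path in KEYS:
        a, b = trees[baseline][path], trees[node][path]
        for entry in sorted(set(a) | set(b)):
            if a.get(entry) != b.get(entry):
                print(f"MISMATCH {path}\\{entry}: {baseline} vs {node}")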

Related

Azure Data Factory - Self-Hosted Integration Runtime - ODBC driver mystery

We are using a Self-Hosted Integration Runtime for Azure Data Factory.
That machine had version 6 of the Exasol ODBC driver installed. We wanted to upgrade the driver, so we removed the old one and installed version 7.
The weird thing is that in the Exasol logs we can now see Data Factory sometimes connecting via driver version 7 and sometimes via driver version 6.
As an experiment, I deleted the Exasol ODBC driver from the machine completely. After that, Data Factory was still able to connect to Exasol using the driver I had just deleted.
It looks like the driver's DLLs are cached somewhere. What could it be?
Update 1
I captured the following actions in Process Monitor when Data Factory connected to Exasol with ODBC driver version 6:
Where might these C:\Config.Msi\3739be5*.rbfASolution-6.1\ODBC\ DLLs come from? There is no C:\Config.Msi\ directory on the machine.
Update 2
I noticed that when I test the connection via the Microsoft Integration Runtime Configuration Manager on the machine, or via the Data Factory linked service, the connection is always made with ODBC driver version 7.
But when I test the connection via a Data Factory dataset, in some cases the connection is made with ODBC driver version 6.
You could check the registry, but clean it at your own risk. An alternative might be the Sysinternals tools, Process Monitor or Process Explorer, which might help you get to the bottom of this. Install them on the SHIR VM if you are allowed to. Process Explorer in particular is a bit like SQL Profiler (if you've ever used that), so it will be able to tell you which registry keys external processes are using. It will give you a lot of information, so you will have to make judicious use of timestamps and filtering. The proposed steps:
Start a trace using Process Monitor
Start a pipeline using the Exasol driver
Wait until it completes (or at least until you know it has started)
Stop the Process Monitor trace
Spend time going through the millions of records it has captured, trying to filter down or search for your process (see the filtering sketch below)
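If the raw capture is too big to eyeball, a short script can do the first pass of that filtering. This is only a sketch: it assumes you export the Procmon trace to CSV (File > Save) with the default columns, and procmon_trace.csv is a placeholder file name.

import csv

# Keep only the events that touch ODBC registry keys or an Exasol driver file,
# so you can see which DLLs the SHIR process actually loads.
INTERESTING = ("odbcinst.ini", "odbc.ini", "exasol")  # case-insensitive substrings

with open("procmon_trace.csv", newline="", encoding="utf-8-sig") as f:
    for row in csv.DictReader(f):
        path = row.get("Path", "").lower()
        if any(s in path for s in INTERESTING):
            print(row["Process Name"], row["Operation"], row["Path"])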
An alternative would be to build a clean SHIR and install only the new driver. Then swap it in for the old one. You may have to get the new SHIR added to the firewall if this is an issue for you.
Honestly, I would propose both of these approaches in parallel for a production problem. Procmon / Process Explorer can be quite labour- and time-intensive but should help you get to the bottom of the issue. Building a clean SHIR is probably a safer option in the long term, but requires new infrastructure.
It may sound silly, but rebooting the server where the SHIR is running solved the problem.
We noticed that this server had been running for more than 30 days and decided to reboot it. Maybe restarting the Integration Runtime service itself would also have helped, but we didn't try that.
Thanks to everyone for your help.

AS400 iSeries Client Access multiple versions

We are running an AS400 at V5R2 and I have iSeries Client Access installed. Since V5R2 does not provide an x64 ODBC driver, does anyone know how I can either install two versions of iSeries Client Access on the same box (V5R4 supports x64), or just install the x64 ODBC driver from the more recent version without uninstalling all of the V5R2 components?
Installing two versions of Client Access is probably not going to work, since both register their ODBC drivers with the same name, so only one would be available at a given time.
OTOH, the PC side of V5R4 Client Access would probably work without problems against a V5R2 OS/400; perhaps even 6.1 iSeries Access would, too. So you could upgrade the x64 box and check whether everything works. FYI, I had problems with the first releases of 6.1 iSeries Access on x64 boxes; later versions were a bit better. Also, I do not remember V5R4 Client Access having a 64-bit variant at all.
Do not forget that on an x64 PC there are two different ODBC drivers: one for 32-bit applications (stored at C:\WINDOWS\SysWOW64\cwbodbc.dll, which you can manage with the 32-bit administrator C:\WINDOWS\SysWOW64\odbcad32.exe), and another for 64-bit applications (stored at C:\WINDOWS\System32\cwbodbc.dll, which you can manage with the 64-bit administrator C:\WINDOWS\System32\odbcad32.exe). Unless your application is recompiled for 64-bit, the former is the one you are interested in, and if V5R2 Client Access runs flawlessly on that PC, everything is fine. Some applications like Office 2010 come in two flavours, but precisely for compatibility reasons like ODBC, it is still recommended to run the 32-bit variant even on 64-bit workstations.
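To see which driver names are actually registered in each view, a minimal sketch like this (assuming Python 3 is available on the box) reads the "ODBC Drivers" list from both the 64-bit and the 32-bit registry views:

import winreg

# Each installed driver appears as a value name under this key, with the data "Installed".
DRIVERS_KEY = r"SOFTWARE\ODBC\ODBCINST.INI\ODBC Drivers"

def list_odbc_drivers(view_flag, label):
    key = winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, DRIVERS_KEY, 0,
                         winreg.KEY_READ | view_flag)
    print(label)
    for i in range(winreg.QueryInfoKey(key)[1]):  # number of values under the key
        name, status, _ = winreg.EnumValue(key, i)
        print(f"  {name}: {status}")
    winreg.CloseKey(key)

list_odbc_drivers(winreg.KEY_WOW64_64KEY, "64-bit drivers")
list_odbc_drivers(winreg.KEY_WOW64_32KEY, "32-bit drivers")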
1) V5R2 is very dead. You aren't going to get a lot of help when it comes to supporting an OS this old.
2) V5R4 is also dead.
3) Generally speaking, IBM intends that Client Access will work for operating systems two levels back and two levels ahead, so you could try using a V5R4 ODBC driver against a V5R2 DB2. The issue is going to be getting a V5R4 version of Client Access.
4) If you have questions about admin issues like this, Server Fault is probably the better choice.
EDIT: Add details of Client Access installation
Client Access has two logical pieces, a server side component and a client side component. Both pieces are available in the IFS, in the QIBM directory tree. If you have an already-working setup of Client Access on the server side, you can install the client side one of two ways:
1) Map a network drive to the IFS and run setup from there. This obviously won't be helpful to you because the V5R2 software doesn't support x64. If you are still under software maintenance, you could order a newer version of Client Access and install it on the server, and then use the newer version to install the needed ODBC driver.
2) Use the IBM-supplied CD to install the client component directly on the client. This allows you to install a different client version than the one on the server. Not generally recommended but in the case where you're migrating away from an unsupported machine, it's probably not a big worry. If your company ordered V5R4 at any time, you have the Client Access CDs.
The key thing for you is that you don't need to install the full Access product if all you need is the ODBC driver.
The biggest problem facing you is the age of the software. IBM stopped supporting V5R4 in Sep 2013. You aren't going to be able to place an order for it with IBM. You might be able to order V6R1 but the ODBC driver may not work with V5R2 - you'd have to try it. See the IBM i Access web site for details, but it's not downloadable.
If you can use OLEDB, try IBM's FTP site.

Test connection was NOT successful... Database not found or no system permission

I am having difficulty connecting to an existing Informix database. I am attempting to mimic the configuration that is present on another machine which currently works. By the way, that other machine is on the same network and accesses the DB through a tunnel, so I am pretty sure the issue isn't related to the network configuration.
Regardless, here are the steps I took to try to make the connection:
Downloaded clientsdk.3.50.TC9DE and installed this. The working machine uses 3.50.TC2DE, but I couldn't find the installer for that version. (Note that at first I tried using 3.50TC9, not sure if that makes a difference)
Matched the ODBC config in the new machine to the working machine
The working machine has a host name in the Host Name field. I assume this was allowed because the host was set to an IP in the hosts file. Regardless, I am using the IP.
Also I am using C:\Windows\SysWOW64\odbcad32.exe to create the DSN
Made sure that the INFORMIXDIR and PATH directories were correct, as per http://www.dbforums.com/informix/694408-odbc-test-connection-not-successful.html#post2633932. I don't think the locales are the issue because they aren't set in the working machine's Setnet32. Also, I made sure that the locales matched in the ODBC environment settings.
Also, since my INFORMIXDIR is in C:\Program Files (x86)\IBM\Informix\Client-SDK\bin I tried replacing Program Files (x86) with PROGRAM~2 and Client-SDK with CLIENT~1 to no avail.
Tried setting INFORMIXDIR directly in my system environment variables (outside of Setnet32)
Set DBPATH to match the working system in both the user and system environment variables.
Set INFORMIXSERVER to the server in both Setnet and the system environment variables.
Completely lowered the firewall on my machine.
I can ping and telnet into the server.
I have also tried..
Tried this on Windows XP
Tested the ILogin demo. The result was a popup that stated Customer Records Found in the title bar with an empty text area field.
Reinstalled into C:\informix instead of C:\Program Files(x86)...
Rebooted after various steps.
At this point I am at a loss. Has anyone run into this? The only other things I can think of are that I am using Win7 64-bit (with 32-bit drivers) and that the driver is 9DE, not 2DE.
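For reference, a minimal DSN connection test looks like the sketch below. It assumes pyodbc is installed and run under 32-bit Python (to match the DSN created with the 32-bit administrator); the DSN name, user and password are placeholders.

import os
import pyodbc

# Sanity-check the environment the driver will see.
for var in ("INFORMIXDIR", "INFORMIXSERVER", "DBPATH"):
    print(var, "=", os.environ.get(var))

try:
    cn = pyodbc.connect("DSN=my_informix_dsn;UID=myuser;PWD=mypassword", timeout=10)
    print("Connected to:", cn.getinfo(pyodbc.SQL_DBMS_NAME), cn.getinfo(pyodbc.SQL_DBMS_VER))
    cn.close()
except pyodbc.Error as exc:
    print("Connection failed:", exc)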
Alright, so half of the battle is over. I was able to get a "Test connection was successful" on my Win7 machine. We had a copy of the 2.90.TC6 driver available on our file server from way back. I installed it and it worked. So my guess is that the database I am working with isn't compatible with 3.50.TC9DE.
I guess my next course of action is to try and find an installer for 3.50.TC2DE so that I can match the production system.

Running Visual Studio in Parallels for Mac - problem with debugging sites sitting on an OS X drive

I've installed Parallels Desktop on my MacBook to be able to run Visual Studio 2008 in an XP installation. Everything works great except that I decided to put my websites in my Sites folder in the OS X file system (which happens by default, because the My Documents folder is mapped to the Mac's Documents folder, and I'd rather keep my code there so that both OSes can easily access it).
When trying to build or debug I get this error:
Failed to start monitoring changes to 'Z:\xxx...'
How can I get this to work under Parallels, from the shared drive?
Parallels uses network drives to simulate folders on OS X, and Windows can't monitor changes to network drives, so if you do this directly, it'll be broken.
If you want to keep them in sync though, use Live Mesh (http://www.mesh.com) and install it on both the host and guest. A little roundabout, but it'll make it so both copies are maintained (and Live Mesh is handy for other things too)
I recently flipped over to putting my source code onto my Mac volume so I could use Time Machine to back it up, and immediately got this same problem with my ASP.NET app. Other, procedural applications built just fine, by the way.
I tried all sorts of things, including using Samba on the Mac side to share the directory, which led into the "too many BIOS commands" error described elsewhere. Unfortunately for me, the Registry hacks to fix that problem never worked for some reason.
I finally found another solution that avoids Samba and just uses the regular Parallels Shared Folders. It too is a Registry hack, but this one simply turns off file change monitoring for ASP.NET. It is a bit heavy-handed, but gets my builds to work again.
The reference for this change is here:
http://support.microsoft.com/kb/911272
The downside to this approach, I am finding, is that you need to be more deliberate about recompiling, or restarting the web server, as changes during development don't just magically appear anymore. I am still deciding whether that is a useful tradeoff.
UPDATE: After several days of this, development was just too difficult and, sadly, what I reverted to was keeping my source inside the Parallels virtual disk. To enable Time Machine backups and Spotlight searches, I used a lightweight MS utility called SyncToy to push stuff out of Parallels and out to my Mac drive several times a day. Despite the high hack factor, it is working well.
I know this isn't strictly a solution, but VMware Fusion is superior when it comes to shared drive space on a virtual machine. It's what I currently use and it hasn't let me down thus far...
People always give me odd looks when they see Visual Studio on my Mac :P
Try moving the project onto the VM's C: drive. It's not an ideal situation, but you can access the VM's C: drive from OS X.
I have a similar problem with a PHP site that uses an MS Access database (it's a client's system). I have aliases that point to the PHP site on the VM so that I can still do all of my coding in OS X. To do this I created a network share on the VM and then connected to it from OS X. Once connected, make the aliases. If the network drive is not open and you open a file in OS X, it will try to reconnect. It means the VM will need to be running to get to the files, but this isn't normally a problem since the VM is hosting the site anyway.
.NET has funny issues trying to debug the objects on a network drive.
Make sure that you have full trust on your local network between your Mac and the XP install.
Check out: http://msdn.microsoft.com/en-us/library/aa302361.aspx
If that research doesn't turn anything up, I'm afraid you will have to look into the option of keeping it on the VM disk and moving it when you need it.
I see a similar problem on my machine connected to the Windows domain. My Documents is mapped to a network share and I can't debug/run/etc. I eventually had to move to my local disk for debugging.
I definitely recommend Live Mesh as a way to keep directories in sync. Just keep the VM's directory in sync with the Mac's directory.
Or use SVN to hold copies in both machines and do commit/update as appropriate. That way you get versioning, history and if your project grows bigger, you can share with other devs.
I know Dropbox also has history and sharing, but not check-in/check-out/conflicts and all the other advantages of real source control.
Oh, if you have money you can also go for TFS. I would but it is just too expensive :)

"Database is locked" error in SQLite over a Mac network

I have created a simple database using SQLite (actually PySQLite). It works fine when I'm querying or writing to the database from the local machine (i.e. the program and database file are on the Windows machine's drive). However, when I copy the database file to my network drive (a Time Capsule), Windows machines, although they can see the files and have full read/write access to the drive, give me a "SQL Error: database is locked" even when performing a simple SELECT!
Queries work fine over the network from Macs.
There is no fancy multi-access going on - only one machine has the database open. It seems like some weird Mac networking issue. It happens both in the Python program and in the sqlite3 command line. I am using SQLite 3.6.14.2.
Anybody seen this problem? Any way of fixing it? I don't really want to get heavy with MySQL because this is a simple single-user program, but I'd like to use it from multiple machines.
Thanks.
I don't know if it can be done on a Mac, but on Debian I have to mount the Samba directory with the nobrl option.
From mount.cifs(8):
nobrl
Do not send byte range lock requests to the server. This is
necessary for certain applications that break with cifs
style mandatory byte range locks (and most cifs servers do
not yet support requesting advisory byte range locks).
Read the sqlite FAQ: http://www.sqlite.org/faq.html#q5
"People who have a lot of experience
with Windows tell me that file locking
of network files is very buggy and is
not dependable. If what they say is
true, sharing an SQLite database
between two or more Windows machines
might cause unexpected problems."
So it is known not to work reliably on Windows, but that says nothing about the Mac.
Possibly it fails to lock the file over the network; I think you are using the SMB protocol, so the bugginess comes with the package. If you would like to use SQLite over the network, see SQLite Network for alternatives.
I've had a similar problem and solved it by installing a newer SQLite version. Since Python 2.6 the problem has disappeared for me too, because it uses a newer SQLite DLL.
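To check which SQLite library version your Python interpreter is actually linked against, a quick check is:

import sqlite3

print(sqlite3.sqlite_version)  # version of the SQLite library Python is using
print(sqlite3.version)         # version of the sqlite3 module itself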
Thank you, Carlos. CherryTree depends on SQLite, and for some reason it recently stopped working with my Samba-mounted SQLite database file, complaining about a locked database. Adding "nobrl" to my Ubuntu fstab entry solved the problem:
//192.168.3.122/Files /mnt/Files cifs username=public,password=asdf,rw,noperm,nobrl 0 0
