Prevent Access from Blocking ODBC Query Under Certain Conditions

I'm using Access VBA to call an R script that builds some charts. This R script pulls some data from the Access database via an ODBC query. I'm using library(RODBC) to make the connection from R.
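The connection is made more or less like this (path and query are placeholders):

library(RODBC)
db_path <- "C:/path/to/mydb.accdb"  # placeholder path to the Access database
con <- odbcDriverConnect(sprintf(
  "Driver={Microsoft Access Driver (*.mdb, *.accdb)};DBQ=%s", db_path))
dat <- sqlQuery(con, "SELECT * FROM SomeTable")  # illustrative query
odbcClose(con)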
If I restart Access, or run Compact/Repair, the query will always run. However, if I make other changes in the database, I'll sometimes get the following warning:
Warning messages:
1: In odbcDriverConnect(sprintf("Driver={Microsoft Access Driver (*.mdb, *.accdb)};DBQ=%s", :
[RODBC] ERROR: state HY000, code -3810, message [Microsoft][ODBC Microsoft Access Driver] The database has been placed in a state by an unknown user that prevents it from being opened or locked.'
And the script fails to run, because the connection couldn't be made.
What's the best way to manage/set the state of the database so the query will always run? The issue isn't directly tied to whether a table is open: I can open and close a table without hitting the problem, and sometimes the script even runs fine with a table open.
Edit: The error is caused by making any sort of change in a VBA module (this is unrelated to the actual VBA call of the script; I can run the same Rscript call from the command line and replicate the error). Now that I understand the cause, I don't think it's a big issue. Saving the VBA module sometimes seems to clear the error, although not 100% of the time.

This is by design.
Making any design change to a VBA module, form or report sets an exclusive lock on the .accdb file, which remains until the Access application that made the change closes.
Just close and re-open the file after making any design change to a form, report or VBA module.
This is one of the reasons people recommend you split the database, since then you can change the design without locking people out of the data.
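If you want the R side to fail with a clearer message instead of the raw ODBC warning, something like this should work (conn_str is the connection string from the question):

library(RODBC)
con <- odbcDriverConnect(conn_str)
# odbcDriverConnect returns -1, not an RODBC object, when the connection fails
if (!inherits(con, "RODBC")) {
  stop("Could not open the database; close and re-open Access after design changes.")
}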

Related

Excel ODBC Connection Issues and HY000 CURLError

I am trying to connect Excel to our Snowflake instance so that users can pull data from Snowflake. I've installed the latest ODBC driver and set the User and Server as required. The authenticator is set to externalbrowser. When we attempt to use that connection within Excel, one of two things occurs:
A bunch of browser tabs open but the user is able to connect and bring in data. Not sure why we have multiple tabs but at least they get what they need.
The connection just spins, and ultimately we end up with an HY000 error saying that the REST Request for our URL failed; code=52 msg=Server returned nothing (no headers, no data), oS code=2, osMsg='No such file or directory'.
We have tried multiple options, and all of our other connections (JDBC, for example) work just fine with the external browser setting. There doesn't seem to be much of a difference between users that can connect and those that can't.
There is a good Knowledge Base article written about connecting to Snowflake from Excel.
If you are not using SSO, skip Step 1 (Configure SSO with Azure AD) and go on from Step 2 onward.
This is the article: https://community.snowflake.com/s/article/HOW-TO-connect-to-Snowflake-authenticating-with-Azure-AD-SSO-from-MS-Excel-ODBC-driver
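As a quick sanity check of the driver settings outside Excel, a DSN-less connection string along these lines (account host and user are placeholders) exercises the same driver:

Driver={SnowflakeDSIIDriver};Server=myaccount.snowflakecomputing.com;UID=user@example.com;authenticator=externalbrowser;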

OpenEdge database connection issue

I'm trying to add data from a table in a separate database to my script, but I keep getting this error every time.
My script:
connect database "chris.db" .
run chrisf.p
disconnect databse.
The error I'm getting:
How can I get round this issue?
Thank you.
The word "database" is not part of the syntax for the CONNECT statement.
CONNECT "chris".
is the correct syntax.
The OpenEdge documentation for CONNECT is here: https://documentation.progress.com/output/OpenEdge117/openedge117/?_ga=2.93982683.75218856.1547464117-1040589272.1546786181#page/dvpin%2Fthe-connect-statement.html
I'm not sure what you are trying to do with:
run chrisf.p disconnect databse.
but that will run an external procedure called "chrisf.p" and pass 2 "compile on the fly" parameters with values of "disconnect" and "databse". (I'm pretty sure that's not really what you intend.)
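Putting it together, the corrected script would look something like this (assuming the logical name of the connected database is chris):

CONNECT "chris".
RUN chrisf.p.
DISCONNECT chris.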

Does MonetDBLite support auto-commit mode?

I am trying to optimize data upload in an R package using MonetDBLite. As per the MonetDB website, using LOCKED mode can speed up upload:
LOCKED mode
In many bulk loading situations, the original file can be saved as a backup or recreated for disaster handling. This relieves the database system from having to prepare for recovery as well, and saves significant storage space. The LOCKED qualifier can be used in this situation (and in single user mode!) to skip the logging operation normally performed.
However, when I try to run my COPY INTO statement with LOCKED mode I get the error:
Server says 'ParseException:SQLparser:COPY INTO .. LOCKED: only allowed in auto commit mode'.
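For context, I'm issuing the statement roughly like this (directory, table and file names are placeholders):

library(DBI)
con <- dbConnect(MonetDBLite::MonetDBLite(), "/tmp/dbdir")
dbExecute(con, "COPY INTO my_table FROM '/tmp/data.csv' USING DELIMITERS ',' LOCKED")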
Reading the CRAN MonetDBLite documentation would have me believe that the standard mode is auto-commit, e.g. the documentation for dbTransaction():
dbTransaction is used to switch the data from the normal auto-committing mode into transactional mode. Here, changes to the database will not be permanent until dbCommit is called. If the changes are not to be kept around, you can use dbRollback to undo all the changes since dbTransaction was called.
but perhaps this isn't true since I'm getting the above error.
Does anyone have any insight?

Teradata: Keep running the script although there are errors

I have a long script with SQL statements to run on Teradata. I want the script to keep running until the end and save the errors in a log file, rather than stopping on every error. How can I do it?
thanks
Assuming you are using Teradata SQL Assistant:
Click on Tools in the menu bar, then Options, then Query. There is a checkbox that says "Stop query execution if an SQL error occurs"; uncheck it to let the script continue past errors.
To get the most recent error hit F11. Otherwise, from the menu bar click Tools, then show history. Double click on the row number on the left side of one of the history records and it will bring up a screen with the result messages for each statement. You can also query this sort of info directly from one of the QryLog views in DBC.
Errors can be of multiple types; some can be bypassed and some cannot. For example, with native Teradata Tools and Utilities you can make a script ignore run-time errors, or even syntax errors, but it is generally impossible to ignore network connectivity errors and still get the remaining part of your queries executed.
Generally in such scenarios, you want to use the BTEQ tool to execute the SQL, since BTEQ lets you ignore execution errors. BTEQ is a standard Teradata tool which can be downloaded from the Teradata website for free and is commonly installed by users querying Teradata through plain SQL.
To create a workable BTEQ script, simply copy-paste all of your queries into a plain text file, separate all queries with semicolons (;), and at the very top of that plain text file add a logon statement, as shown below:
.logon Teradata_IP_Address/your_UserName,your_Password;
example script:
.logon 127.0.0.1/dbc,dbc;
/*Some sample queries. Replace these with your actual queries*/
SELECT Current_Timestamp;
CREATE TABLE My_Table (Dummy INTEGER) PRIMARY INDEX (Dummy);
So BTEQ gets you through the execution errors. To avoid network connectivity issues, ideally you want to execute the script on a server that has a constant connection to Teradata and has Teradata Tools and Utilities installed. Such a server may be called an ETL server, landing server, edge node or managed server (or something else, depending on your environment). You will definitely need login credentials for that server (if you don't already have access). The preferred commands to execute a BTEQ script are:
Windows: bteq < yourscriptname >routine_logfile 2>error_logfile
Linux (bash/ksh): nohup bteq < yourscriptname >routine_logfile 2>error_logfile &
Make sure not to close the command prompt if you are on Windows. On Linux you can close the current window or even terminate your network session with your ETL server if you use the recommended command.
If you see a warning about an EOL line found at the end of your logs, just ignore it; it appears because, for simplicity, I omitted some optional BTEQ statements that ensure a cleaner exit.
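For completeness, the optional statements I mean are just a clean logoff and exit at the very bottom of the script:

.LOGOFF;
.QUIT;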

PowerCenter Informatica Using an Oracle View as Source

I have an Informatica PowerCenter process that uses an Oracle view (VIEW_SOURCE) as Source and a normal database table as Target.
The VIEW_SOURCE normally has about 500k-600k rows.
My problem is that the data from VIEW_SOURCE is not fully inserted into the Target table. Some rows that exist in the view don't make it into the table.
The PowerCenter process doesn't end in error, and no error is reported during the run.
Is it possible, for instance, that we have a problem "rendering the source view", like a memory leak?
