Teradata: Keep running the script although there are errors

I have a long script with SQL statements to run on Teradata. I want the script to keep running until the end, saving the errors to a log file, rather than stopping on every error. How can I do it?
Thanks

Assuming you are using Teradata SQL Assistant:
Click on Tools in the menu bar, then Options, then Query. There is a checkbox that says "Stop query execution if an SQL error occurs"; clear it so the script keeps going.
To get the most recent error, hit F11. Otherwise, from the menu bar click Tools, then Show History. Double-click the row number on the left side of one of the history records and it will bring up a screen with the result messages for each statement. You can also query this sort of information directly from one of the QryLog views in DBC.
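As a rough illustration of that last point, here is a minimal Python/pyodbc sketch that pulls recent failed statements from a DBC query-log view. It assumes an ODBC DSN named my_teradata_dsn, that DBQL query logging is enabled on your system, and that your release exposes the DBC.QryLogV columns used below; adjust the names as needed.
import pyodbc

# Connect through an existing Teradata ODBC DSN (name and credentials are placeholders).
conn = pyodbc.connect("DSN=my_teradata_dsn;UID=your_UserName;PWD=your_Password")
cur = conn.cursor()

# Recent statements of mine that ended in an error, newest first.
cur.execute("""
    SELECT StartTime, ErrorCode, ErrorText
    FROM DBC.QryLogV
    WHERE ErrorCode <> 0
      AND UserName = USER
    ORDER BY StartTime DESC
""")
for start_time, error_code, error_text in cur.fetchmany(20):
    print(start_time, error_code, error_text)

conn.close()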

Errors can be of multiple types: some can be bypassed and some cannot. For example, with the native Teradata Tools and Utilities you can make a script ignore run-time errors, or even syntax errors, but generally it is impossible to ignore network connectivity errors and still get the remaining part of your queries executed.
Generally in such scenarios you want to use the BTEQ tool to execute the SQL, because it lets you ignore execution errors. BTEQ is a standard Teradata tool which can be downloaded from the Teradata website for free and is commonly installed by users querying Teradata through plain SQL.
To create a workable BTEQ script, simply copy and paste all of your queries into a plain text file, separate the queries with semicolons (;), and at the very top of the file add a logon statement as shown below:
.logon Teradata_IP_Address/your_UserName,your_Password;
Example script:
.logon 127.0.0.1/dbc,dbc;
/*Some sample queries. Replace these with your actual queries*/
SELECT Current_Timestamp;
CREATE TABLE My_Table (Dummy INTEGER) PRIMARY INDEX (Dummy);
So BTEQ gets you past the execution errors. To avoid network connectivity issues, ideally you want to execute the script on a server which has a constant connection to Teradata and has the Teradata Tools and Utilities installed. Such a server may be called an ETL server, landing server, edge node or managed server (or something else, depending on your environment). You will definitely need login credentials for that server (if you don't already have access). The preferred commands to execute a BTEQ script are:
Windows: bteq < yourscriptname >routine_logfile 2>error_logfile
Linux (bash/ksh): nohup bteq < yourscriptname >routine_logfile 2>error_logfile &
Make sure not to close the command prompt if you are on Windows. On Linux you can close the current window or even terminate your network session with your ETL server, because the recommended command runs BTEQ in the background with nohup.
If you see a warning about an EOL line found at the end of your logs, just ignore it; it appears because, for simplicity, I left out some optional BTEQ statements that ensure a cleaner exit.
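If you prefer to kick the run off from Python rather than typing the shell command yourself, here is a minimal sketch under the same assumptions (the script and log file names are placeholders); it also counts the "*** Failure" lines BTEQ writes for statements that errored.
import subprocess

# Run BTEQ with the same redirection as the commands above (file names are placeholders).
with open("yourscriptname") as script, \
     open("routine_logfile", "w") as out, \
     open("error_logfile", "w") as err:
    result = subprocess.run(["bteq"], stdin=script, stdout=out, stderr=err)

# BTEQ's exit code reflects the highest error level it encountered; 0 means a clean run.
print("bteq finished with return code", result.returncode)

# Count the failed statements; BTEQ marks them with "*** Failure" in its output.
with open("routine_logfile") as log:
    failures = [line.strip() for line in log if line.lstrip().startswith("*** Failure")]
print(len(failures), "failed statement(s) found")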

Related

Can the qpad queries be recovered if qpad is closed but the port is open?

I ran multiple queries, but before I could save them QPad crashed. However, the q port on which these queries were running (on my Windows machine) is still open. I can recover the variables and functions with \v and \f respectively.
Is there a way to recover all the q statements I ran using QPad? I forgot to maintain a log file, hence I am trying to find a way to recover the queries using the q port.
Thanks
Unfortunately there's no way to retrieve your old queries, for the reasons Davis.Leong explains below. But if you can't or don't want to create a table on your server to save them, you can also check the log queries box in QPad settings:
Q > Settings > Editor > Log queries to "queries_date.log"
Now when you run queries, they will be written to this log file in the same directory as QPad.exe, along with the server and timestamp, like this:
/ 02/26/19 09:54:52 on `:localhost:1234:: from QPad1*
show `logthis
/ 02/26/19 10:03:03 on `:localhost:1234:: from QPad1*
a:10
Unfortunately I don't think there is a way to retrieve your command history. Others have already mentioned why, so I will not go into that. However, you can easily maintain a log file in the future:
When you start your server, adding the -l flag will allow you to define a path to a log file. Any commands sent to the server from the client will now be logged. For example
q ../log/logtest -l -p 5555
t:([]date:`date$();sym:`sym$();price:`float$())
will start a q process listening on 5555, logging any messages that cause the server to update. So if I open a handle to 5555 in another q session
h:hopen `::5555
and update table t
q)h"insert[`t](2000.01.01;`appl;102.3)"
,0
the server will have updated t like so
q)t
date sym price
---------------------
2000.01.01 appl 102.3
There will be a log file created which will show any commands sent to the server. Note, however, that it will only log commands that change the state of the server's data.
This log file can be reloaded in the event of a server crash using the same command as before.
The answer is no. QPad is the GUI that interacts with the q process. The reason you can retrieve the variables and functions is that the process did not die. The queries themselves are not saved by default, unless you customize your .z.pg to insert a record into a queryHistory table.
e.g.
q).z.pg:{[x]`queryHistory insert ([]queryTime:.z.P;query:enlist x);value x}  / log the query, then evaluate it so the client still gets its result
q)queryHistory:([]queryTime:`timestamp$();query:())
q)10+10
20
q)testTab:([]sym:10?`1;val:10?100)
q)queryHistory
queryTime query
---------------
queryHistory is not appended with a record, as the queries above were run in the q process itself. If you instead run them from your QPad:
10+10
testTab:([]sym:10?`1;val:10?100)
you can see that records are appended, so even if your GUI crashes, you can trace the queries:
q)queryHistory
queryTime query
-------------------------------------
2019.02.26D17:32:38.471063000 "10+10"
q)queryHistory
queryTime query
----------------------------------------------------------------
2019.02.26D17:32:38.471063000 "10+10"
2019.02.26D17:32:52.790863000 "testTab:([]sym:10?`1;val:10?100)"
I got to know recently that there is a backup of your q scripts at "C:/Users//AppData/Local", autosaved every 5-6 minutes. These are temporary files which are deleted when you save the script. However, if your QPad crashed, you can find your files there :)

Prevent Access from Blocking ODBC Query Under Certain Conditions

I'm using Access VBA to call an R script that builds some charts. This R script pulls some data from the Access database via an ODBC query. I'm using library(RODBC) to make the connection from R.
If I restart Access, or run Compact/Repair, the query will always run. However, if I make other changes in the database, I'll sometimes get the following warning:
Warning messages:
1: In odbcDriverConnect(sprintf("Driver={Microsoft Access Driver (*.mdb, *.accdb)};DBQ=%s", :
[RODBC] ERROR: state HY000, code -3810, message [Microsoft][ODBC Microsoft Access Driver] The database has been placed in a state by an unknown user that prevents it from being opened or locked.'
And the script fails to run, because the connection couldn't be made.
What's the best way to manage/set the state of the database so the query will always run? The issue isn't directly linked to whether a table is open or not - I can open a table, and close a table, and not have an issue, and even run with a table open, sometimes.
Edit: The error is caused by making any sort of change in a VBA module (this is unrelated to the actual VBA call of the script, I can run the same rscript call in the command line and replicate the error). Now that I understand that's the cause, I don't think it's a big issue. Saving the VBA module sometimes seems to correct the error, although not 100% of the time.
This is by design.
Making any design change to a VBA module, form or report sets an exclusive lock on an accdb file, which remains until the Access application that has made the change closes.
Just close and re-open the file after making any design change to a form, report or VBA module.
This is one of the reasons people recommend you split the database, since then you can change the design without locking people out of the data.
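If you want the calling script to fail with a clearer message when this happens, here is a minimal sketch in Python/pyodbc (rather than R/RODBC, purely for illustration; the .accdb path is a placeholder) that detects this particular lock error and points at the fix above.
import pyodbc

# Connection string mirrors the one in the warning message; the DBQ path is a placeholder.
conn_str = (
    r"Driver={Microsoft Access Driver (*.mdb, *.accdb)};"
    r"DBQ=C:\path\to\your_database.accdb;"
)
try:
    conn = pyodbc.connect(conn_str)
except pyodbc.Error as exc:
    # Error -3810: "...placed in a state by an unknown user that prevents it from being opened or locked."
    if "placed in a state" in str(exc):
        raise SystemExit(
            "Access holds an exclusive lock (unsaved design change?). "
            "Close and re-open the database, then re-run the script."
        )
    raise
else:
    print("Connected OK")
    conn.close()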

Why would the TrackedMessages_Copy_BizTalkMsgBoxDb start failing with "Query processor could not produce a query plan"?

Why would the TrackedMessages_Copy_BizTalkMsgBoxDb SQL Agent job start failing with "Query processor could not produce a query plan"?
Query processor could not produce a query plan because of the hints defined in this query. Resubmit the query without specifying any hints and without using SET FORCEPLAN. [SQLSTATE 42000] (Error 8622).
Our SQL guys are talking about amending the stored proc, but we've told them to treat the BizTalk databases as a black box.
It should go without saying, but before anything, make sure to back up your databases. In fact, if your regular backup jobs are running, you may be able to restore a backup and compare things to when it was working on this server. That said -
Check the SQL Agent Job to make sure no additional steps have been added/no plan has been forced/no hints are being used; it should just have one step called 'Purge' that calls the procedure below with the DB server and DTA database name as parameters.
Check the procedure (BizTalkMsgBoxDb.dbo.bts_CopyTrackedMessagesToDTA) to make sure it hasn't been altered.
If this is a production or otherwise sensitive box, back up the DBs and restore them to a local dev environment before proceeding!
If this is not production, see if you can run the procedure (perhaps in a transaction that you rollback) directly in SSMS. See if you get any better errors. Add print statements to see if you can find out exactly where it's getting conflicting hints.
If the procedure won't run, consider freeing the procedure cache (DBCC FREEPROCCACHE) and seeing if the procedure will run.
If it runs in your dev environment from a backup, you may have to start looking at server/database settings. I can't think of which ones off the top of my head that would cause this error though.
For what it's worth, well-intentioned DBAs break BizTalk frequently. They decide that an index is missing or not properly covering, or that security could be improved, or that the database should be treated like the other databases they administer. I've seen DBAs do really silly things to the BizTalk databases that get very hard to diagnose.
Did you try updating the statistics on the database tables referenced by the stored procedure (which is run by the SQL Server Agent job)? The query planner uses those statistics to decide how best to execute your SQL.
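As a rough sketch of that suggestion (not a BizTalk-sanctioned procedure; back up first, as noted above), you could refresh the statistics over ODBC with SQL Server's built-in sp_updatestats. The server name is a placeholder, and you may want to run it against the DTA database as well.
import pyodbc

# Refresh out-of-date statistics for every table in the MsgBox database.
conn = pyodbc.connect(
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=YOUR_SQL_SERVER;Database=BizTalkMsgBoxDb;Trusted_Connection=yes;",
    autocommit=True,
)
cur = conn.cursor()
cur.execute("EXEC sp_updatestats")  # built-in procedure; emits one informational message per table
conn.close()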

Ax Trace Parser Doesn't Show SQL Queries

I have configured the tracing cockpit to show SQL events, but Trace Parser doesn't show the SQL tab. How can I get the SQL queries from Trace Parser?
And a second question: it also shows only my queries. Is there a way to make it show all queries running in AX?
See how to Install the Trace Parser.
Also ensure you run it from the AOS server and that the correct event selection is provided.
Also start AX with elevated privileges (Run as administrator).

iSeries (AS400) Output with ODBC connection

I am very new to AS400, and I am stuck. I have read documentation but cannot find what I need.
I have an odbc connection to an AS400 server. When I run this command I get an Outfile with everything I need:
CALL QSYS.QCMDEXC('DSPUSRPRF USRPRF(*ALL) OUTPUT(*OUTFILE) OUTFILE(CHHFLE/TEST3)', 0000000061.00000)
Instead of the results going to an outfile, I need to receive the results of this command in my script, which is connecting through ODBC. If I change 'OUTPUT(*OUTFILE)' to 'OUTPUT(*)' I get no results when I try to 'fetchall()'.
Is there any way to get this information through the odbc connection to my script?
EDIT: I am on a linux server, in a python script using pyodbc to connect. I can run sql queries successfully using this connection, but I can't figure out how to get the results of a command to come through as some sort of record set.
I hope I'm interpreting what you're asking correctly. It looks like you're accessing user profile data and dumping it to a file, and that you then want to use the contents of that file in a script or something that's running on Windows. If that's the case:
In general, when accessing data in a file from the Windows world, whether through ODBC and VBScript, or .NET, the AS/400 is treated like a database. All files in libraries are exposed via the built-in DB2 database. It's all automatic, and part of the Universal DB2 database.
So, after creating this file, you should have a file named TEST3 in library CHHFLE
You'd create a connection and execute the following SQL statement to read the contents:
Select * From CHHFLE.TEST3
This, of course, assumes that you have proper permissions to access this. You should be able to test this using the iSeries Navigator tool, which includes the ability to run SQL Scripts against the database before doing it in your script.
Added after reading comments above
There's info at this question on connecting to the DB2 from Python. I hope it's helpful.
OUTPUT(*) is not stdout, unfortunately. That means you won't be able to redirect OUTPUT(*) to an ODBC connection. Dumping to a DB2 table via OUTPUT(*OUTFILE) is a good plan. Once that's done, use a standard cursor / fetch loop as though you were working with any other DB2 table.
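Putting both answers together with pyodbc (which the question's edit mentions), a minimal sketch might look like the following; the DSN name and credentials are placeholders for your IBM i ODBC data source.
import pyodbc

conn = pyodbc.connect("DSN=AS400;UID=your_user;PWD=your_password", autocommit=True)
cur = conn.cursor()

# 1. Run the CL command, dumping the user profiles to the outfile CHHFLE/TEST3.
#    QCMDEXC's second argument is the command length as DECIMAL(15,5), e.g. 0000000061.00000.
cmd = "DSPUSRPRF USRPRF(*ALL) OUTPUT(*OUTFILE) OUTFILE(CHHFLE/TEST3)"
cur.execute(f"CALL QSYS.QCMDEXC('{cmd}', {len(cmd):010d}.00000)")

# 2. Read the outfile back like any other DB2 table.
cur.execute("SELECT * FROM CHHFLE.TEST3")
rows = cur.fetchall()
print(len(rows), "user profile rows returned")

conn.close()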
