I have written a MultiLoad script to load data into a Teradata database, and the commands in the script are like:
.LOGTABLE Employee_log;
.LOGON 192.168.1.1/dbc,dbc;
.BEGIN MLOAD TABLES Employee_Stg;
.LAYOUT Employee;
.FIELD in_EmployeeNo * VARCHAR(10);
.FIELD in_FirstName * VARCHAR(30); ....
But the password is clearly visible in the script. Is there an option to secure the password, or any alternate way/command to log on and then run the script?
You can create a logon file and run it in your MLOAD script using the following command
.RUN FILE logonfile.txt
In the logon file, provide the same statement you used in your script: .LOGON 192.168.1.1/dbc,dbc;
Restrict access to logonfile.txt so that only the owner can read it:
chmod go-rwx logonfile.txt
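For example, a minimal sketch on Unix (the file names loadscript.txt and logonfile.txt are just placeholders, assuming the standard mload executable):
# create the logon file once; it contains only the .LOGON statement
cat > logonfile.txt <<'EOF'
.LOGON 192.168.1.1/dbc,dbc;
EOF
chmod go-rwx logonfile.txt
# the MultiLoad script replaces its hard-coded .LOGON with
#   .RUN FILE logonfile.txt;
# and is run as usual:
mload < loadscript.txt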
or use tdwallet
.LOGON 192.168.1.1/dbc,$tdwallet(dbc)
tdwallet keeps the entries safely out of your scripts; they can only be accessed via the logon command, and there is no function to retrieve an entry in cleartext.
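A hedged sketch of the wallet approach (the entry name dbc is just an example; check tdwallet's help on your system for the exact syntax):
# add the password to Teradata Wallet once; tdwallet prompts for the secret value
tdwallet add dbc
# the script then references the wallet entry instead of the cleartext password:
#   .LOGON 192.168.1.1/dbc,$tdwallet(dbc);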
When I run a Teradata BTEQ script in the CMD shell, a little logon prompt screen pops up.
When I press Enter, the BTEQ script runs.
Is there a way to disable this popup screen?
Searching the internet suggests that setting LOGONPROMPT to OFF should solve the problem.
Like so:
.SET LOGONPROMPT OFF
.LOGON my_server
-- rest of the bteq script...
.LOGOFF
.QUIT
I use the following shell command to run the bteq:
bteq < myscript.sql > log.txt
Could you please help me get rid of the logon popup screen?
You can pass the fully qualified logon string to get rid of the prompt.
.LOGON my_server/user,password
If you still want to go with only the server details in .LOGON, then along with .SET LOGONPROMPT OFF, set the environment variable GUILOGON to NO.
In CMD: setx GUILOGON NO
Snippet from TD Documentation:
Note that setting LOGONPROMPT to OFF is sometimes not going to be
sufficient for suppressing all unnecessary prompts when using Windows
BTEQ. You may also need to instruct CLI to suppress its generation of
what is known as its GUILOGON dialog box.
This can be accomplished by setting the environment variable GUILOGON
to NO.
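Putting the pieces together, a hedged sketch (server, user and password are placeholders) of a script that should run via bteq < myscript.sql > log.txt without any prompt, once GUILOGON has been set with setx:
.SET LOGONPROMPT OFF
.LOGON my_server/user,password
-- rest of the bteq script...
.LOGOFF
.QUIT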
I have created a large script to be passed to an sshexec commandResource in Ant, as follows.
<sshexec host="${host.server}"
username="${username}"
password="${oracle.password}"
commandResource="path-to-file/script.txt"
/>
This is working as intended.
There are several lines that will get executed on the server in that script. The contents of the script.txt file are as follows:
/script/compile_objects.sh /path-to-code/ login password
/script/compile_code.sh /path-to-code/ login password
However, I would prefer not to store the login and password as clear text in the script.txt file. Is it possible to pass parameters to each line of the command resource? I'm aware that each line in the command resource file gets executed in its own shell.
I tried string replacement from a property in the Ant script, but it failed with a bad substitution error like so. Is there another way to do this?
<property name="oracle.password" value="thepassword"/>
and then in the script file, alter to:
/script/compile_objects.sh /path-to-code/ login ${oracle.password}
/script/compile_code.sh /path-to-code/ login ${oracle.password}
This type of replacement works when using the command attribute, but seems to fail when using commandResource, likely because Ant does not expand its ${...} properties inside an external commandResource file; the literal text ${oracle.password} then reaches the remote shell, which rejects it with a bad substitution error.
Edit: I am using Apache Ant 1.9.4 and jsch-0.1.54.jar.
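One hedged workaround sketch, given that each commandResource line runs in its own remote shell: keep the password in a protected file on the server and let that shell read it, instead of embedding it in script.txt (the file name ~/.oracle_pw is an assumption):
# one-time setup on the server:
#   echo 'thepassword' > ~/.oracle_pw && chmod 600 ~/.oracle_pw
# script.txt then becomes:
/script/compile_objects.sh /path-to-code/ login "$(cat ~/.oracle_pw)"
/script/compile_code.sh /path-to-code/ login "$(cat ~/.oracle_pw)"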
My goal is to automate the deployment of SQL scripts to Teradata via BTEQ. So far my script is working; however, I would like to generate a log file in which possible failures are captured.
.LOGON tdserver/username,pw
.EXPORT file=\logfile.txt;
.run file = \Desktop\test\test.sql;
.LOGOFF
.EXIT
My SQL script creates a VIEW. When this view already exists, for example, I see an error in the BTEQ command window:
*** Failure 3804 View 'ViewName' already exists.
I would like to have this TD message in my log file. I tried several things and have been looking for three hours, but unfortunately without success.
You may want to experiment using .SET ERROROUT STDERR which re-routes the error stream to the STDERR output file instead of the default action of routing the error stream to STDOUT.
There is more information in the BTEQ manual under Chapter 5 - BTEQ Commands.
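For example, a hedged sketch based on the script above (file names are placeholders):
.SET ERROROUT STDERR
.LOGON tdserver/username,pw
.run file = test.sql;
.LOGOFF
.EXIT
Running it as bteq < deploy.bteq > deploy.log 2> deploy.err sends normal output to deploy.log and failures such as the 3804 above to deploy.err; the 2> redirection works in both CMD and Unix shells.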
Save your BTEQ script as a text file (e.g. script.txt) and create a batch file that runs it and captures a log:
@echo off
bteq < script.txt > script.log 2>&1
goto end
:end
exit
Errors will be recorded in this way.
I have the Teradata client/TTU installed on a Unix box.
If I do the following interactively, it works, where "..." is normal Teradata BTEQ output; once it is done, I'm back at the prompt.
$ bteq
...
....
. logon dbname/dbuser,dbpassword
SELECT DATE, TIME;
.LOGOFF;
.QUIT;
..
...
$
Now, let's say I put the following lines in a file called "testtd.bteq":
. logon dbname/dbuser,$dbpassword
SELECT DATE, TIME;
.LOGOFF;
.QUIT;
What I want now is: how can I run this script (.bteq) at the Unix $ prompt?
I tried the following methods so far, but they didn't work; maybe I'm missing something:
1. bteq < /path/to/testtd.bteq > testtd.log
2. bteq <<HereDocEndsHere
.run /path/to/testtd.bteq
HereDocEndsHere
Any ideas? Do I HAVE to provide ". logon dbname/dbuser,dbpassword" FIRST if I'm using the here-document way?
Running the bteq command at the $ prompt doesn't even give me any help/options that I can use, like other commands do, e.g.
cmd -u user -p password -f file etc...
The best practice I'm aware of is:
store your Teradata credentials in a ~/.tdlogon file
create a script that contains your bteq call with all the stuff it needs.
E.g., create a file bteqScript.sh with
# define helper variables, e.g....
export ARCHIVEDIR=~/data
export DATAFILE=dataOutput1.txt
bteq <<EOF
.run file=$HOME/.tdlogon
.export data file=${ARCHIVEDIR}/${DATAFILE}
/* sql code on next line(s) */
select
'foo' as "bar"
;
.export reset
EOF
Note that .run file=... executes the .logon command with your credentials, stored elsewhere.
Kudos to Alex Hasha for the bteq script.
PS: It works via method 1 when I hard-code the password in the script file for the LOGON command.
I wanted to do the same by exporting a variable called "dbpassword", i.e.
$ export dbpassword=xyxyxyxyx
and then, inside the script file, using "$dbpassword" in the LOGON command. Somehow the exported variable is not expanded within the .bteq LOGON command.
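Regarding the PS: the shell only expands $dbpassword inside an unquoted here-document; it does not touch the contents of a file fed to bteq with "<", which is why the exported variable never reaches the LOGON command. A minimal sketch of the here-document form (names are placeholders):
export dbpassword=xyxyxyxyx
bteq <<EOF > testtd.log 2>&1
.logon dbname/dbuser,${dbpassword}
SELECT DATE, TIME;
.LOGOFF;
.QUIT;
EOF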
Is there a way (or a module) to write log files to disk instead of writing them to the DB? I really don't want my DB getting fatter just because of log lines.
Yes, there is. It's part of Drupal core. It's called syslog. However, it logs to the system log file by default.
I hope you have some fast disks... you could easily create a bottleneck by doing so. Instead, I would regularly dump the log tables to a file, say using a cron job.
You could add this to a file called drupal_logs.sh:
NOW=$(date +"%Y%m%d")_$(date +"%H%M.%S")
mysqldump -p --user=username dbname tableName1 tableName2 > /path/to/drupal_$NOW.sql
And schedule it to run every 15 minutes by adding the following cron job:
*/15 * * * * /path/to/drupal_logs.sh > /dev/null
And if you're worried about the log tables in the database getting too large, you can follow the mysqldump command in drupal_logs.sh with a truncate of the exported tables, as sketched below.
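For example, a hedged sketch of that follow-up appended to drupal_logs.sh (table names are placeholders):
# empty the exported tables once they have been dumped
mysql --user=username -p dbname -e "TRUNCATE TABLE tableName1; TRUNCATE TABLE tableName2;"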