I'm trying to export my tSQLt test results to XML with tSQLt.XmlResultFormatter,
but it seems to truncate the output after 2033 characters:
BEGIN TRY EXEC tSQLt.RunAll END TRY BEGIN CATCH END CATCH; EXEC tSQLt.XmlResultFormatter
I want my output in XML so I can reference it in a Microsoft DevOps CI deployment pipeline. I only have 14 tests at the moment, which doesn't feel like a lot. If this is the limit of XmlResultFormatter, is there another way to get the results in an XML format?
Thanks for your time.
You don't say what method you're using to execute the SQL commands. There's probably a more streamlined way of doing this, but I solved this problem in Jenkins and then ported the solution to an Azure DevOps Command Line task, running on a Windows build agent, with the following code. EXEC tSQLt.RunAll ran in a previous step:
::export the test results to a file
bcp "EXEC [tSQLt].[XmlResultFormatter];" queryout %WORKSPACE%\test_results.xml -S %DBSERVER% -d %DBNAME% -T -w
::remove the carriage returns (added by BCP every 2048 chars) from the xml file
::and write to a new file
PowerShell -ExecutionPolicy Bypass -NoProfile -Command "& {(gc %WORKSPACE%\test_results.xml -Raw).replace([Environment]::NewLine , '') | Set-Content %WORKSPACE%\output_test_results.xml}"
Hopefully the comments explain what is going on.
The Command Line task has the following environment variables defined:
DBSERVER - the database server name
DBNAME - the name of the database under test
WORKSPACE - $(build.sourcesDirectory) - this is a legacy of running the script in Jenkins and could be factored out
The file output by the second command, output_test_results.xml, is passed to a Publish Test Results task later in the build.
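For reference, a YAML sketch of what that publish step might look like (an assumption, not part of my original pipeline; the file name and search folder just mirror the commands above, and tSQLt's XML output is JUnit-style):
- task: PublishTestResults@2
  inputs:
    testResultsFormat: 'JUnit'                    # XmlResultFormatter emits JUnit-compatible XML
    testResultsFiles: 'output_test_results.xml'   # the cleaned-up file from the step above
    searchFolder: '$(Build.SourcesDirectory)'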
EDIT
I looked into this and I think I understand what's happening. Although SSMS presents an XML result as a single column/row, the data is actually returned to the client as a sequence of shorter rows (<2048 characters).
The default behaviour of Invoke-Sqlcmd is to return results as an array of DataRow objects - each item in the array contains between 2000 and 2048 characters. This array needs to be concatenated back together to generate the result set - here's one way of doing it in PowerShell:
$out = ""; Invoke-SqlCmd -ServerInstance <server> -Database <db name> -Query "exec tSQLt.XmlResultFormatter" -MaxCharLength 1000000 | %{ $out = $out + $_[0]}; $out > c:\temp\output.txt
My original answer is also affected by this issue - hence the PowerShell command to remove carriage returns every 2048 characters.
Related
We have a main Linux server, say M, where we have files like below (for 2 months, with new files arriving daily):
Folder1
PROCESS1_20211117.txt.gz
PROCESS1_20211118.txt.gz
..
..
PROCESS1_20220114.txt.gz
PROCESS1_20220115.txt.gz
We want to copy only the latest file to our processing server, say P.
So as of now, we have been using the below command on our processing server:
rsync --ignore-existing -azvh -rpgoDe ssh user@M:${TargetServerPath}/${PROCSS_NAME}_*txt.gz ${SourceServerPath}
This process worked fine until now, but from now on we can keep files on the processing server for only up to 3 days, whereas on the main server we can keep files for 2 months.
So when we remove older files from the processing server, the rsync command copies all the files from the main server to the processing server again.
How can I change the rsync command to copy only the latest file from the main server?
*Note: the example above is only for one file. We have multiple files on which we have to use the same command. Hence we cannot hardcode any filename.
What I tried:
There are multiple solutions, but all of them seem to be for copying the latest file from the server I am running rsync on, not from the remote server.
I also tried running the command below to get the latest file from the main server, but I cannot pass variables to ssh in my company, as it is not allowed. So the command below works if I pass an individual path/file name, but not with variables.
ssh M 'ls -1 ${TargetServerPath}/${PROCSS_NAME}_*txt.gz|tail -1'
Would really appreciate any suggestions on how to implement this solution.
OS: Linux 3.10.0-1160.31.1.el7.x86_64
Quoting for ssh is confusing: to quote something properly for the remote shell, you have to quote it locally, inside the string you hand to ssh.
The handy printf %q trick helps here - use it to quote the relevant parts:
file=$(
   ssh M "ls -1 $(printf "%q" "${TargetServerPath}/${PROCSS_NAME}")_*.txt.gz" |
   tail -1
)
rsync --ignore-existing -azvh -rpgoDe ssh user@M:"$file" "${SourceServerPath}"
Or maybe it's nicer to run tail -n1 on the remote side, so that the minimum amount of data is transferred (we only need one filename, not all of them): invoke an explicit shell and pass the variables as shell arguments:
file=$(ssh M "$(printf "%q " bash -c \
      'ls -1 "$1"_*.txt.gz | tail -n1' \
      '_' "${TargetServerPath}/${PROCSS_NAME}"
   )")
Overall, I recommend writing a function and using declare -f:
sshqfunc() { echo "bash -c $(printf "%q" "$(declare -f "$1"); $1 \"\$@\"")"; };
work() {
   ls -1 "$1"_*txt.gz | tail -1
}
tmp=$(ssh M "$(sshqfunc work)" _ "${TargetServerPath}/${PROCSS_NAME}")
Or you can use the mighty declare -p to transfer variables to the remote side - then run your command inside single quotes:
ssh M "
$(declare -p TargetServerPath PROCSS_NAME);
"'
ls -1 ${TargetServerPath}/${PROCSS_NAME}_*txt.gz | tail -1
'
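Whichever variant you use, the captured filename can then be fed back into the rsync from your question. A sketch combining the declare -p approach with that rsync (the flags are copied from the question):
# Grab the newest matching remote filename, then copy just that one file.
file=$(ssh M "
$(declare -p TargetServerPath PROCSS_NAME);
"'
ls -1 ${TargetServerPath}/${PROCSS_NAME}_*txt.gz | tail -1
')
rsync --ignore-existing -azvh -rpgoDe ssh user@M:"$file" "${SourceServerPath}"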
I am trying to run, from SSIS, a Unix script which populates our Aged Debt table for our finance department, but I cannot get my head around it. The script has to be run as user "username", and the script to run is:
P1='0*99999999' P2='2015_03_25*%%YY*Y' P3='Y*0.0' P4='Y*0.0' P5='Y*0.0' P6='Y*0.0' P7='Y*0.0' P8='Y*0.0' /cer_cerprod1/exe/par50219r
I believe that I need to have ssh configured on both sides to do this, and I believe that I can do this from the "Execute Process Task", but I don't think that I am populating the parameters correctly.
Can anyone help?
I currently do this using putty/plink. Like sorrell says above, you use an Execute Process Task to call a batch file. That batch file calls plink, and I pass plink the shell script on the Unix server that I want it to execute.
Example of the batch file:
echo y | "d:\program files\putty\plink.exe" [username@yourserver.com] -pw [password] -v sh /myremotescriptname.sh
The echo y at the beginning is to tell plink to accept the security credentials of the server.
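Applied to the command in your question, the batch file might look something like this (a sketch; the server name and password are placeholders, and note that % signs may need doubling inside a .bat file):
::run the finance script remotely as "username", passing the P1..P8 assignments inline
echo y | "d:\program files\putty\plink.exe" username@yourserver.com -pw [password] "P1='0*99999999' P2='2015_03_25*%%YY*Y' P3='Y*0.0' P4='Y*0.0' P5='Y*0.0' P6='Y*0.0' P7='Y*0.0' P8='Y*0.0' /cer_cerprod1/exe/par50219r"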
I have a Korn shell script that calls SQL*Plus to check and spool data into a .CSV file on UNIX.
This ksh script works fine on UNIX: it creates the file and returns 0.
When launching the job from UC4 AppWorx, I want it to attach the UNIX spool file to the notification sent by the job when it finishes.
I want this to work this way:
1. I launch the job.
2. It checks the data; if data is found, it creates a file with the .CSV extension in the /tmp directory on UNIX.
3. When the job finishes, it sends me an email with the spool file (.CSV) from UNIX.
Is there any way to do this? How can I make it work?
Thanks.
You'll need to create a chain in Appworx (the docs should be able to walk you through it). This chain will have one or more jobs.
First, you don't need a ksh script to call SQL*Plus. You can invoke SQL*Plus directly. Write the script as a .sql file (it can include SQL*Plus directives, SQL, and PL/SQL as needed). Set the job's "program type" to AWSQLP and point it at the .sql script that you have made available to Appworx.
The SQL*Plus script can use logic to determine whether it should create a file. If it should, it can write out files directly (though getting proper .csv files out of it can be a pain).
Then attach a notification to the job, with the notification object set to do an email attachment. You'll have to use the "pattern" type and put in a full file path to the .csv. Substitution variables can be used if you want a new filename on each invocation.
Depending on your version, some of these options can be moved around a bit (we just upgraded last year; UC4 no longer owns it). Click on the Help menu and go to the documentation entry... it's not the best in the world, but far from the worst.
First of all, thanks for answering.
I usually create a job, let's say SEM_CHECK_THINGS, with one prompt defined in UC4 that runs a query against the database to check whether the table test_table (for example) has data; to do this I use select decode(count(*),0,'N','Y') from test_table;
This job also executes a simple ksh script on UNIX.
ksh script content:
echo "Job Name: $1"
echo "Job Control Flag: $2"
jobName=$1
jobFlag=$2
echo "Job ${jobName} started ..." >> $logFile
date >> $logFile
if [[ ${jobFlag} == "Y" ]]; then
    echo "Job ${jobName} executed successfully with data found." >> $logFile
    echo "Job ${jobName} executed successfully with data found."
    exit 1
else
    echo "Job ${jobName} finished with no data found." >> $logFile
    echo "Job ${jobName} finished with no data found."
    exit 0
fi
I usually force an "ABORT" by using exit 1 if data is found, to request another job that executes a .sql script which spools the data from test_table:
whenever sqlerror exit sql.sqlcode
whenever sqlerror exit 1
prompt this is a test
set echo off
set trimspool on
set trimout off
set linesize 1500
set feedback on
set newpage none
SET HEADING OFF
set und off
set pagesize 10000
alter session set nls_date_format = 'dd-MON-yyyy HH24:MI:SS';
spool &1
SELECT 'PREV_RESULTSET;LAST_RESULTSET;NR_COUNT' FROM DUAL
UNION ALL
SELECT PREV_RESULTSET||';'||LAST_RESULTSET||';'||COUNT(1) NR_COUNT
FROM SEM_REPORT_PEDIDOS
GROUP BY PREV_RESULTSET, LAST_RESULTSET;
spool off
exit 1
By using spool &1 and by "hardcoding" the file testspool.csv in the "Other Output" option in the notification of that job, I managed to do this: I receive an email with the content I need/want from that table.
But I really want to make this a single job: do the validation, and if data is found, spool and attach the .CSV file to the email notification sent by that job.
I am developing an application that can establish a server-client connection using QTcp*.
The client sends the server a number.
The received string is checked for its length and quality (is it really a number?).
If everything is OK, then the server replies back with a file path (which depends on the number sent).
The client checks whether the file exists and whether it is a valid image. If the file complies with the rules, it executes a command on the file.
What security concerns exist on this type of connection?
The program is designed for Linux systems, and the external command on the image file is executed using QProcess. If the string sent contained something like this (do not run the following command):
; rm -rf /
then it would be blocked by the file-not-found security check (because it isn't a file path). If there weren't any check on the validity of the sent string, then the following command would be executed:
command_to_run_on_image ; rm -rf /
which would cause panic! But this cannot happen.
So, is there anything I should take into consideration?
If you open a console and type command ; rm -rf /*, something bad would likely happen. That's because commands are processed by the shell: it parses the text, e.g. splits commands at the ; delimiter and splits arguments at spaces, and then executes the parsed commands with the parsed arguments using the system API.
However, when you use process->start("command", QStringList() << "; rm -rf /*");, there is no such danger. QProcess will not execute a shell; it will execute command directly using the system API. The result will be similar to running command "; rm -rf /*" in the shell, i.e. with the whole string passed as one quoted argument.
So you can be sure that only your command will be executed and that the parameter will be passed to it as-is. The only danger is the possibility for an attacker to have the command called with any file path he could construct. The consequences depend on what the command does.
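A minimal sketch of the pattern described above (the command name and validation helper are illustrative, not taken from the question):
#include <QProcess>
#include <QFile>
#include <QImageReader>

// Hypothetical helper: run a command on a client-supplied path, without a shell.
bool runImageCommand(const QString &path)
{
    if (!QFile::exists(path))           // not a real file -> reject (blocks "; rm -rf /")
        return false;
    if (!QImageReader(path).canRead())  // not a valid image -> reject
        return false;

    QProcess process;
    // No shell is involved: the program is executed directly and receives
    // the path as one literal argument, so ';' has no special meaning.
    process.start("command_to_run_on_image", QStringList() << path);
    return process.waitForFinished();
}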
Here is the scenario:
$hostname
server1
I have the below script on server1:
#!/bin/ksh
echo "Enter server name:"
read server
rsh -n ${server} -l mquser "/opt/hd/ca/scripts/envscripts.ksh"
qdisplay
# script ends.
In the above script I am logging into another server, say server2, and executing the script "envscripts.ksh", which sets a few aliases (alias "qdisplay") defined in it.
I am able to log in successfully, but unable to use the alias set by the script "envscripts.ksh".
Getting the below error:
-bash: qdisplay: command not found
Can someone please point out what needs to be corrected here?
Thanks,
Vignesh
The other responses and comments are correct. Your rsh command needs to execute both the ksh script and the subsequent command in the same invocation. However, I thought I'd offer an additional suggestion.
It appears that you are writing custom instrumentation for WebSphere MQ. Your approach is to remote shell to the WMQ server and execute a command to display queue attributes (probably depth).
The objective of writing your own instrumentation is admirable, however attempting to do it as remote shell is not an optimal approach. It requires you to maintain a library of scripts on each MQ server and in some cases to maintain these scripts in different languages.
I would suggest that a MUCH better approach is to use the MQSC client available in SupportPac MO72. This allows you to write the scripts once, and then execute them from a central server. Since the MQSC commands are all done via MQ client, the same script handles Windows, UNIX, Linux, iSeries, etc.
For example, you could write a script that remotely queried queue depths and printed a list of all queues with depth > 0. You could then either execute this script directly against a given queue manager or write a script to iterate through a list of queue managers and collect the same report for the entire network. Since the scripts are all running on the one central server, you do not have to worry about getting $PATH right, differences in commands like tr or grep, where ksh or perl are installed, etc., etc.
Ten years ago I wrote the scripts you are working on when my WMQ network was small. When the network got bigger, these platform differences ate me alive and I was unable to keep the automation up and running. When I switched to using WMQ client and had only one set of scripts I was able to keep it maintained with far less time and effort.
The following script assumes that the QMgr name is the same as the host name except in UPPER CASE. You could instead pass QMgr name, hostname, port and channel on the command line to make the script useful where QMgr names do not match the host name.
#!/usr/bin/perl -w
#-------------------------------------------------------------------------------
# mqsc.pl
#
# Wrapper for MO72 SupportPac mqsc executable
# Supply parm file name on command line and host names via STDIN.
# Program attempts to connect to hostname on SYSTEM.AUTO.SVRCONN and port 1414
# redirecting parm file into mqsc.
#
# Intended usage is...
#
# mqsc.pl parmfile.mqsc
# host1
# host2
#
# -- or --
#
# mqsc.pl parmfile.mqsc < nodelist
#
# -- or --
#
# cat nodelist | mqsc.pl parmfile.mqsc
#
#-------------------------------------------------------------------------------
use strict;
$SIG{ALRM} = sub { die "timeout" };
$ENV{PATH} =~ s/:$//;
my $File = shift;
die "No mqsc parm file name supplied!" unless $File;
die "File '$File' does not exist!\n" unless -e $File;
while (<>) {
  my @Results;
  chomp;
  next if /^\s*[#*]/;   # Allow comments using # or *
  s/^\s+//;             # Delete leading whitespace
  s/\s+$//;             # Delete trailing whitespace
  # Do not accept hosts with embedded spaces in the name
  die "ERROR: Invalid host name '$_'\n" if /\s/;
  # Silently skip blank lines
  next unless ($_);
  my $QMgrName = uc($_);
  #----------------------------------------------------------------------------
  # Run the parm file in
  eval {
    alarm(10);
    @Results = `mqsc -E -l -h $_ -p detmsg=1,prompt="",width=512 -c SYSTEM.AUTO.SVRCONN < $File 2>&1 | grep -v "^MQSC Ended"`;
  };
  if ($@) {
    if ($@ =~ /timeout/) {
      print "Timed out connecting to $_\n";
    } else {
      print "Unexpected error connecting to $_: $!\n";
    }
  }
  alarm(0);
  if (@Results) {
    print join("\t", @Results, "\n");
  }
}
exit;
The parmfile.mqsc is any valid MQSC script. One that gathers all the queue depths looks like this:
DISPLAY QL(*) CURDEPTH
I think the real problem is that the r(o)sh command only executes the remote envscripts.ksh file, and that your script is then trying to execute qdisplay on your local machine.
You need to 'glue' the two commands together so they are both executed remotely.
EDITED per comment from Gilles (He is correct)
rsh -n ${server} -l mquser ". /opt/hd/ca/scripts/envscripts.ksh ; qdisplay"
I hope this helps.
P.S. As you appear to be a new user: if you get an answer that helps you, please remember to mark it as accepted, or give it a + (or -) as a useful answer.