Delete a task on multiple remote servers

I can run this command line to delete a task on a single remote server:
SCHTASKS /delete /tn "testTask" /s (hostname) and it works perfectly.
However, I need to be able to do the same on multiple servers from one command line or batch file. I've tried using a for loop to run against a list of servers provided in a text file, but it seems like it does not read that text file. Here is the example:
For /F %f in (c:\temp\testservers.txt) do schtasks /delete /tn "testTask"
It outputs: ERROR: The specified task name "testTask" does not exist in the system. It seems like it is searching for the task on my laptop instead of on the remote servers provided in the .txt file. (The task does exist on the remote servers.)
I have used the same for loop to create the task on multiple servers, with slightly different parameters, and it works fine. Any help would be greatly appreciated. Let me know if you need any clarification.

I think you simply forgot to append /s %f to the delete command, which doesn't use the same parameters as the create command, as documented on this MSDN page:
schtasks /delete /tn {<TaskName> | *} [/f] [/s <Computer> [/u [<Domain>\]<User> [/p <Password>]]]
So you should try:
For /F %f in (c:\temp\testservers.txt) do schtasks /delete /tn "testTask" /s %f
To log the results in a file, use something like:
For /F %f in (c:\temp\testservers.txt) do schtasks /delete /tn "testTask" /s %f >> C:\Mylogs\TaskDeletions.log
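One caveat, since you mention batch files: inside a .bat/.cmd file, for-variables must be written with doubled percent signs, otherwise the same line fails. The equivalent batch-file version would be:
For /F %%f in (c:\temp\testservers.txt) do schtasks /delete /tn "testTask" /s %%f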

Related

rsync: how to copy only the latest file from target to source

We have a main Linux server, say M, where we have files like the ones below (spanning 2 months, with new files arriving daily):
Folder1
PROCESS1_20211117.txt.gz
PROCESS1_20211118.txt.gz
..
..
PROCESS1_20220114.txt.gz
PROCESS1_20220115.txt.gz
We want to copy only the latest file to our processing server, say P.
Until now we have been using the command below, run on the processing server:
rsync --ignore-existing -azvh -rpgoDe ssh user@M:${TargetServerPath}/${PROCSS_NAME}_*txt.gz ${SourceServerPath}
This worked fine so far, but going forward the processing server can keep files for only up to 3 days, whereas the main server keeps files for 2 months.
So when we remove the older files from the processing server, the rsync command copies all the files from the main server back to the processing server.
How can I change the rsync command to copy only the latest file from the main server?
*Note: the example above is only for one file pattern. We have multiple files on which we have to use the same command, hence we cannot hardcode any filename.
What I tried:
There are multiple solutions out there, but they all seem to cover copying the latest file from the server I am running rsync on, not from the remote server.
I also tried running the command below to get the latest file from the main server, but I cannot pass variables over ssh (it is not allowed in my company). So the command works if I pass an individual path/file name, but not with variables.
ssh M 'ls -1 ${TargetServerPath}/${PROCSS_NAME}_*txt.gz|tail -1'
Would really appreciate any suggestions on how to implement this solution.
OS: Linux 3.10.0-1160.31.1.el7.x86_64
ssh quoting is confusing: the remote shell re-parses the command string, so anything expanded locally has to be quoted to survive that second round of parsing. The handy printf %q trick does the quoting of the relevant parts for you:
file=$(
ssh M "ls -1 $(printf "%q" "${TargetServerPath}/${PROCSS_NAME}")_*.txt.gz" |
tail -1
)
rsync --ignore-existing -azvh -rpgoDe ssh user@M:"$file" "${SourceServerPath}"
Or, maybe nicer, run tail -n1 on the remote side, so that the minimum amount of data is transferred (we only need one filename, not all of them): invoke an explicit shell and pass the variables as shell arguments:
file=$(ssh M "$(printf "%q " bash -c \
'ls -1 "$1"_*.txt.gz | tail -n1' \
'_' "${TargetServerPath}/${PROCSS_NAME}"
)")
Here '_' only fills the $0 slot of bash -c, so the real path arrives as $1.
Overall, I recommend defining a function and shipping it to the remote side with declare -f:
sshqfunc() { echo "bash -c $(printf "%q" "$(declare -f "$1"); $1 \"\$@\"")"; };
work() {
ls -1 "$1"_*txt.gz | tail -1
}
tmp=$(ssh M "$(sshqfunc work)" _ "${TargetServerPath}/${PROCSS_NAME}")
Or you can use the mighty declare -p to transfer variables to the remote side, then run your command inside single quotes:
ssh M "
$(declare -p TargetServerPath PROCSS_NAME);
"'
ls -1 ${TargetServerPath}/${PROCSS_NAME}_*txt.gz | tail -1
'
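Whichever variant you pick, the captured filename (here $file, or $tmp in the declare -f variant) can then be handed to a final rsync, sketched along the lines of the question's own command and assuming user@M is the correct remote spec:
rsync --ignore-existing -azvh -e ssh user@M:"$file" "${SourceServerPath}"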

tSQLt.XmlResultFormatter truncates results after 2033 chars

Trying to export my tSQLt test results to XML with tSQLt.XmlResultFormatter.
But it seems to truncate the output after 2033 characters:
BEGIN TRY EXEC tSQLt.RunAll END TRY BEGIN CATCH END CATCH; EXEC tSQLt.XmlResultFormatter
I want my output in XML so I can reference it in a Microsoft DevOps CI deployment pipeline. I only have 14 tests at the moment, which doesn't feel like a lot. If this is the limit of XmlResultFormatter, is there another way to get the results in an XML format?
Thanks for your time.
You don't say what method you're using to execute the SQL commands in your question. There's probably a more streamlined way of doing this, but I solved this problem in Jenkins and then ported the solution to an Azure DevOps Command Line task on a Windows build agent, using the following code (EXEC tSQLt.RunAll ran in a previous step):
::export the test results to a file
bcp "EXEC [tSQLt].[XmlResultFormatter];" queryout %WORKSPACE%\test_results.xml -S %DBSERVER% -d %DBNAME% -T -w
::remove the carriage returns (added by BCP every 2048 chars) from the xml file
::and write to a new file
PowerShell -ExecutionPolicy Bypass -NoProfile -Command "& {(gc %WORKSPACE%\test_results.xml -Raw).replace([Environment]::NewLine , '') | Set-Content %WORKSPACE%\output_test_results.xml}"
Hopefully the comments explain what is going on.
The Command Line task has the following environment variables defined:
DBSERVER - the database server name
DBNAME - the name of the database under test
WORKSPACE - $(build.sourcesDirectory) - this is a legacy of running the script in Jenkins and could be factored out
The file output by the second command, output_test_results.xml, is passed to a Publish Test Results task later in the build.
EDIT
I looked into this and I think I understand what's happening. Although SSMS presents an XML result as a single column/row, the data is actually returned to the client as a sequence of shorter rows (<2048 characters).
The default behaviour of Invoke-Sqlcmd is to return results as an array of DataRow objects - each item in the array contains between 2000 and 2048 characters. This array needs to be concatenated back together to generate the result set - here's one way of doing it in PowerShell:
$out = ""; Invoke-SqlCmd -ServerInstance <server> -Database <db name> -Query "exec tSQLt.XmlResultFormatter" -MaxCharLength 1000000 | %{ $out = $out + $_[0]}; $out > c:\temp\output.txt
My original answer is also affected by this issue - hence the PowerShell command to remove carriage returns every 2048 characters.

How to make SCHTASKS run command?

I am new to SCHTASKS and cannot get it to run a simple command. What I tried is below. The file c:\downloads\temp_w.txt is not created. What is wrong?
c:\downloads>ver
Microsoft Windows [Version 10.0.17763.503]
c:\downloads>SCHTASKS /Create /SC MINUTE /MO 1 /TN mydir /TR "dir c:\windows > c:\downloads\temp_w.txt"
WARNING: The task name "mydir" already exists. Do you want to replace it (Y/N)? y
SUCCESS: The scheduled task "mydir" has successfully been created.
c:\downloads>schtasks /run /tn mydir
SUCCESS: Attempted to run the scheduled task "mydir".
c:\downloads>dir temp_w.txt
Volume in drive C has no label.
Volume Serial Number is ECC7-1C96
Directory of c:\downloads
File Not Found
DIR is an internal command (see: https://ss64.com/nt/syntax-internal.html)
The following should work (untested):
c:\>SCHTASKS /Create /SC MINUTE /MO 1 /TN mydir /TR "cmd /c dir c:\windows > c:\downloads\temp_w.txt"
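To verify, run the task again and check for the file (untested, reusing the commands from the question):
c:\downloads>schtasks /run /tn mydir
c:\downloads>type c:\downloads\temp_w.txt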

Can rsync wait for a file to be created and then copy it at the same time?

I want to copy a file from A to B using rsync, but the file does not exist on A yet. I run the rsync command on B. Can I make rsync wait until the file is created on A and copy it as soon as it appears?
Let's assume we start with a command running on hostB like
hostB$ rsync hostA:remotepath localpath
If hostA has a normal shell, we can make rsync wait until the file exists by tweaking the helper command it normally runs on hostA. Depending on the environment, something like this might work:
hostB$ rsync --rsync-path='
while [ ! -f remotefile ]; do sleep 1; done;
sleep 5;
rsync' hostA:remotepath localpath
- the while loop busy-waits for remotefile to become available
- the sleep 5 then allows a few seconds for the contents to settle
- rsync is the normal remote command; it must come last, and it must not be followed by a newline, a semicolon, or a comment, because the rsync client appends its own arguments right after it
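If an unbounded wait is a concern, the loop can be capped; here is a sketch assuming a limit of 60 attempts is acceptable (if the file never appears, the transfer simply fails with a missing-file error):
hostB$ rsync --rsync-path='
i=0; while [ ! -f remotefile ] && [ "$i" -lt 60 ]; do sleep 1; i=$((i+1)); done;
sleep 5;
rsync' hostA:remotepath localpath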

Running Unix scrips from SSIS

I am trying to run a Unix script, which populates our Aged Debt table for our finance department, from SSIS, but cannot get my head around it. The script has to be run as user "username", and the command to run is:
P1='0*99999999' P2='2015_03_25*%%YY*Y' P3='Y*0.0' P4='Y*0.0' P5='Y*0.0' P6='Y*0.0' P7='Y*0.0' P8='Y*0.0' /cer_cerprod1/exe/par50219r
I believe that I need to have ssh configured on both sides to do this, and I believe I can do it from an "Execute Process Task", but I don't think I am populating the parameters correctly.
Can anyone help?
I currently do this using PuTTY/plink. As sorrell says above, you use an Execute Process Task to call a batch file. That batch file calls plink, and I pass plink the shell script on the Unix server that I want it to execute.
Example of a batch file:
echo y | "d:\program files\putty\plink.exe" [username@yourserver.com] -pw [password] -v sh /myremotescriptname.sh
The echo y at the beginning tells plink to accept the server's security credentials (its host key).
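Adapted to the command from the question, the batch file might look like the untested sketch below (username, password, and server are placeholders; the whole remote command line is passed as one quoted string, and the literal % signs may need extra doubling inside a .bat file):
echo y | "d:\program files\putty\plink.exe" username@yourserver.com -pw password -v "P1='0*99999999' P2='2015_03_25*%%YY*Y' P3='Y*0.0' P4='Y*0.0' P5='Y*0.0' P6='Y*0.0' P7='Y*0.0' P8='Y*0.0' /cer_cerprod1/exe/par50219r"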
