Task Scheduler: create a task with a one-line PowerShell command

Is it possible to execute this task (a command containing CRLF sequences) without error using the Task Scheduler?

C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe -noexit -command &{(Get-Content -Path C:\tests\test.txt) -replace ('aaa=0',"aaa=20 `r`ng=11111 `r`ng=2222") | Set-Content -Path C:\tests\test.txt}

The PowerShell command itself runs without error when executed directly:

(Get-Content -Path C:\tests\test.txt) -replace ('aaa=0',"aaa=20 `r`ng=11111 `r`ng=2222") | Set-Content -Path C:\tests\test.txt
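One common workaround for quote-heavy or multi-line commands in a scheduled task is powershell.exe's -EncodedCommand parameter, which takes the script Base64-encoded as UTF-16LE, so no special characters ever appear on the command line. A minimal sketch of the encoding step in shell (the command text here is a simplified stand-in for the one above):

```shell
#!/bin/sh
# powershell.exe -EncodedCommand expects the script Base64-encoded as UTF-16LE.
cmd='(Get-Content -Path C:\tests\test.txt) -replace ("aaa=0","aaa=20") | Set-Content -Path C:\tests\test.txt'

# Convert to UTF-16LE, Base64-encode, and strip newlines from the output.
encoded=$(printf '%s' "$cmd" | iconv -f UTF-8 -t UTF-16LE | base64 | tr -d '\n')

# The scheduled task action would then be:
#   powershell.exe -NoProfile -EncodedCommand <encoded>
echo "$encoded"
```

On Windows the same encoding can be produced in PowerShell itself via [Convert]::ToBase64String([Text.Encoding]::Unicode.GetBytes($cmd)).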

Related

SED command use for writing back to the same file

I have the code below, which should add a LOGGER.info line after every function definition in a Python script. The requirement is that the changes be written back to the same file, so that the updated file has a logger.info statement below each function definition.
For example, the file abc.py currently contains:

def run_func(sql_query):
    return run_func(sql_query)

and after running the script, the same abc.py should contain:

def run_func(sql_query):
    LOGGER.info('MIPY_INVOKING run_func function for abc file in directory')
    return run_func(sql_query)

I am not able to get sed to write the result back to the same file name, so that the original file is replaced under the same name and contains all the logger.info statements.
for i in $(find * -name '*.py'); do
    echo "#############################################" | tee -a auto_logger.log
    echo "File Name : $i" | tee -a auto_logger.log
    echo "Listing the python files in the current script $i" | tee -a auto_logger.log
    for j in $(grep "def " $i | awk '{print $2}' | awk -F"(" '{print $1}'); do
        echo "Function name : $j" | tee -a auto_logger.log
        echo "Writing the INVOKING statements for $j function definition" | tee -a auto_logger.log
        grep "def " $i | sed '/):/w a LOGGER.info (''INVOKING1 '"$j"' function for '"$i"' file in sam_utilities'')'
        if [ $? -ne 0 ]; then
            echo " Auto Logger for $i filename - Not Executed Successfully" | tee -a auto_logger.log
        else
            echo "Auto Logger for $i filename - Executed Successfully" | tee -a auto_logger.log
        fi
    done
done
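For reference, here is a minimal sketch of one way to do the insertion and write back to the same file name, which is the part the question is stuck on. It uses sed's `a` (append) command on lines matching a simple top-level `def ...:` header, and a temporary file to replace the original; the log message text is illustrative, not the full per-function message from the loop above:

```shell
#!/bin/sh
# Build the sample input file from the question.
cat > abc.py <<'EOF'
def run_func(sql_query):
    return run_func(sql_query)
EOF

# Append a LOGGER.info line after each "def ...):" header, then overwrite
# the original file via a temporary file so the file name stays the same.
sed '/^def .*):$/a\
\    LOGGER.info("INVOKING run_func function for abc.py file")' abc.py > abc.py.tmp &&
mv abc.py.tmp abc.py

cat abc.py
```

With GNU sed the temp-file step can be collapsed into `sed -i`, but the explicit `> tmp && mv` form is portable.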

Creating a .sh file through a batch script and executing the .sh file through batch

Below is the part of a batch script that I have created:

REM ********* CONN SCRIPT CREATION ***************
echo #!/bin/sh >%conn_script%
echo >>%conn_script%
echo if [ %today% -eq 23 ] >>%conn_script%
echo then >>%conn_script%
echo find . -maxdepth 0 -type f -mtime +0 -exec rm -rf {} \;>>%conn_script%
echo else >>%conn_script%
echo echo Files are not from previous month >>%conn_script%
echo fi >>%conn_script%
type %conn_script%
::echo bye >>%conn_script%
echo The sftp_script is:
echo "command: call %executor%\RUN\plink -ssh %host% -batch -l %user% -pw ********** -m %conn_script%"
call %executor%\RUN\plink -ssh %host% -batch -l %user% -pw %password% -m %conn_script% >%logfile%
I have created a batch script that generates a .sh file; that .sh file deletes files from a Unix server. When the batch script executes the .sh file, the find command above fails with the error:

find: bad option -maxdepth
find: [-H | -L] path-list predicate-list

I also want to append a log of the deleted files to a .txt file on my local machine, but I have not been able to get that working.
Please provide your valuable feedback on this issue.
Thanks
Have you tried /usr/xpg4/bin/find (available on Solaris)? The default Solaris find does not support -maxdepth, but the XPG4 version does:

/usr/xpg4/bin/find . -maxdepth 0 -type f -mtime +0 | xargs rm -f
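For the second part of the question, logging which files were deleted, one option is to have find print each matched path before removing it and redirect that output to a file; since the plink call already redirects the remote script's output to %logfile%, the printed paths end up in the local log automatically. A minimal sketch (`-mtime` is omitted here so the sample file matches immediately):

```shell
#!/bin/sh
# Scratch directory with one file to delete and one to keep.
dir=$(mktemp -d)
touch "$dir/old.txt" "$dir/new.txt"

# -print emits each matched path before -exec removes it, so redirecting
# stdout yields a log of exactly the files that were deleted.
find "$dir" -maxdepth 1 -type f -name 'old*' -print -exec rm -f {} \; > "$dir/deleted.log"

cat "$dir/deleted.log"   # path of old.txt only
ls "$dir"                # new.txt and deleted.log remain
```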

In-place replace using PowerShell

I am trying to write an equivalent of
find -name "*.xml" | xargs grep -l "Search String" |
    xargs perl -p -i -e 's/Search String/Replace String/g'
in powershell. This is what I came up with.
Get-ChildItem 'D:\code\cpp\FileHandlingCpp\input - Copy' -Recurse |
Select-String -SimpleMatch $src_str |
foreach{(Get-Content $_.path) | ForEach-Object { $_ -replace $src_str, $target_str }}
I get the error "The process cannot access the file because it is being used by another process". So I came up with the multi-line version shown below. I am now able to do the in-place replacement of the strings, except for the one in $src_str. What's wrong with $src_str?
$src_str="<?xml version=""1.0"" encoding=""UTF-8"" standalone=""yes"" ?>"
$target_str=""
echo $src_str
foreach ($var in (Get-ChildItem 'D:\code\cpp\FileHandlingCpp\input - Copy' -Recurse |
    Select-String -SimpleMatch $src_str).Path)
{
    (Get-Content $var) | ForEach-Object { $_ -replace $src_str, $target_str } |
        Set-Content $var
}
Maybe it would help to get back to your original goal of implementing the equivalent of the Unix version. Here is essentially the equivalent PowerShell version.
$search = '<?xml version="1.0" encoding="UTF-8" standalone="yes" ?>'
$replace = 'test'
$dir = 'D:\code\cpp\FileHandlingCpp\input - Copy'
dir -Path $dir -Recurse -Filter *.xml | ForEach-Object {
    (Get-Content -Path $_.FullName) -replace $search, $replace |
        Set-Content $_.FullName
}
Note - watch out for text file encoding changes that may occur from re-writing the file. You can specify the output encoding if you need to, using Set-Content's -Encoding parameter (e.g. ASCII).
This took me a while to figure out, but I got it!
It's a one-liner. Just go to the folder you want to start at and run the command below. Replace file.name (wildcards are allowed) with the file name you want to search for, and string1 and string2 with the string to replace and its replacement.
This searches folders recursively and, for each file, replaces one string with another and saves the result. Basically: Get-ChildItem | ForEach-Object { Replace and Save }.
All set!
Get-ChildItem -Include file.name -Recurse | ForEach-Object { (Get-Content -Path $_.FullName) -replace 'string1', 'string2' | Set-Content $_.FullName }
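For comparison, the original Unix pipeline from the question can be exercised directly; here is a minimal sketch with GNU `sed -i` standing in for `perl -p -i -e` (same in-place semantics for this simple substitution):

```shell
#!/bin/sh
# Scratch directory: one XML file containing the string, one without.
dir=$(mktemp -d)
cd "$dir" || exit 1
printf '<a>Search String</a>\n' > match.xml
printf '<a>other</a>\n' > other.xml

# The pipeline from the question: find candidate files, keep only those
# that contain the string (grep -l), and edit just those in place.
find . -name '*.xml' | xargs grep -l 'Search String' |
    xargs sed -i 's/Search String/Replace String/g'

cat match.xml
```

The grep -l stage matters: without it, sed would rewrite (and re-date) every .xml file, even ones that contain no match.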

Command to get a list of files with the ".log" extension, "JB_" in the name, and "ERROR:" in the body of the log file (not the name) in UNIX

I tried the command below
ls -1 /fbrms01/dev/Logs/JB_*.log | find . -type f | xargs grep -l "ERROR:" > /fbrms01/dev/Logs/text_JB.txt
for these files:
./JB_CreateFormat_2013.03.18_08.27.49.log
./JB_CreateFormat_2013.03.18_17.21.31.log
./JB_ExtReservationDetail_2013.03.15_13.06.26.log
./JB_Report_Master_2013.03.18_09.53.38.log
./StoredProcessServer/ApplyTemplate_2013.02.15.log
./StoredProcessServer/ApplyTemplate_2013.03.20.log
./StoredProcessServer/AuthView_2012.08.21.log
./StoredProcessServer/AuthView_2013.02.15.log
./StoredProcessServer/BookPace_2013.01.29.log
I'm getting all the files with .log and ERROR: in the output file, but I want only the files that start with JB_ and end with .log.
Any help?
find /fbrms01/dev/Logs -type f -name 'JB_*.log' -exec grep -l "ERROR:" {} \; > /fbrms01/dev/Logs/text_JB.txt
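The command above can be checked against a small layout mimicking the question's directory (file names and contents here are sample data):

```shell
#!/bin/sh
# Scratch log directory with JB_ files and a subdirectory of other logs.
dir=$(mktemp -d)
mkdir "$dir/StoredProcessServer"
printf 'all good\n'     > "$dir/JB_CreateFormat_2013.03.18.log"
printf 'ERROR: boom\n'  > "$dir/JB_Report_Master_2013.03.18.log"
printf 'ERROR: other\n' > "$dir/StoredProcessServer/AuthView_2012.08.21.log"

# Only JB_*.log files whose body contains "ERROR:" should be listed.
find "$dir" -type f -name 'JB_*.log' -exec grep -l 'ERROR:' {} \; > "$dir/text_JB.txt"

cat "$dir/text_JB.txt"   # JB_Report_Master only
```

As a minor refinement, `-exec grep -l 'ERROR:' {} +` passes files to grep in batches instead of spawning one grep per file.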

How to use an enum type in PowerShell when configuring IIS using the PowerShell snap-in

I am using the IIS PowerShell snap-in to configure a new web application from scratch. I am new to PS. The following script will not work, as PS is not recognising the ManagedPipelineMode enum. If I change the value to 0 it works. I tried the Add-Type cmdlet and also loading the Microsoft.Web.Administration assembly, without success; these lines are now commented out.
How can I get this PS script working with the enum?
#Add-Type -AssemblyName Microsoft.Web.Administration
#[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.Web.Administration")
Import-Module WebAdministration
$AppPoolName = 'Test AppPool'
if ((Test-Path IIS:\apppools\$AppPoolName) -eq $false) {
    Write-Output 'Creating new app pool ...'
    New-WebAppPool -Name $AppPoolName
    $AppPool = Get-ChildItem iis:\apppools | where { $_.Name -eq $AppPoolName }
    $AppPool.Stop()
    $AppPool | Set-ItemProperty -Name "managedRuntimeVersion" -Value "v4.0"
    $AppPool | Set-ItemProperty -Name "managedPipelineMode" -Value [Microsoft.Web.Administration.ManagedPipelineMode]::Integrated
    $AppPool.Start()
}
The error message is:
Set-ItemProperty : [Microsoft.Web.Administration.ManagedPipelineMode]::Integrated is not a valid value for Int32.
It is expecting an integer, even though the underlying property is of type ManagedPipelineMode. You can, however, do the following:
$AppPool | Set-ItemProperty -Name "managedPipelineMode" -Value ([int] [Microsoft.Web.Administration.ManagedPipelineMode]::Classic)
PS:
Instead of
$AppPool = Get-ChildItem iis:\apppools | where { $_.Name -eq $AppPoolName}
you can do:
$AppPool = Get-Item iis:\apppools\$AppPoolName
Regarding Add-Type -AssemblyName: this will only work for a canned set of assemblies that PowerShell knows about. You have to find the assembly in your file system and use the -Path parameter. This worked on my system in a 64-bit PowerShell console:
Add-Type -Path C:\Windows\System32\inetsrv\Microsoft.Web.Administration.dll
Instead of using:
$AppPool | Set-ItemProperty -Name "managedPipelineMode" `
-Value [Microsoft.Web.Administration.ManagedPipelineMode]::Integrated
use:
$AppPool | Set-ItemProperty -Name "managedPipelineMode" `
-Value ([Microsoft.Web.Administration.ManagedPipelineMode]::Integrated)
or the even more succinct:
$AppPool | Set-ItemProperty -Name "managedPipelineMode" -Value Integrated
Why? The reason you need parentheses in the first answer is that the parameter binder treats the entire [Microsoft.Web.Administration.ManagedPipelineMode]::Integrated in your attempt as a string, which cannot be cast to that enumerated type. However, the bare string Integrated can be converted to that enum. By wrapping the expression in parentheses, it is evaluated first, so the type literal is resolved and the enum value is passed.