I am trying to write the equivalent of
find -name "*.xml" | xargs grep -l "Search String" |
    xargs perl -p -i -e 's/Search String/Replace String/g'
in PowerShell. This is what I came up with:
Get-ChildItem 'D:\code\cpp\FileHandlingCpp\input - Copy' -Recurse |
Select-String -SimpleMatch $src_str |
foreach{(Get-Content $_.path) | ForEach-Object { $_ -replace $src_str, $target_str }}
I get the error "The process cannot access the file because it is being used by another process". So I came up with the multi-line version shown below. I am now able to do the in-place replacement of the strings, except for the one in $src_str. What's wrong with $src_str?
$src_str="<?xml version=""1.0"" encoding=""UTF-8"" standalone=""yes"" ?>"
$target_str=""
echo $src_str
foreach ($var in (Get-ChildItem 'D:\code\cpp\FileHandlingCpp\input - Copy' -Recurse |
    Select-String -SimpleMatch $src_str).Path)
{
    (Get-Content $var) | ForEach-Object { $_ -replace $src_str, $target_str } |
        Set-Content $var
}
Maybe it would help to get back to your original goal of implementing the equivalent of the Unix version. Here is essentially the equivalent PowerShell version.
$search = '<?xml version="1.0" encoding="UTF-8" standalone="yes" ?>'
$replace = 'test'
$dir = 'D:\code\cpp\FileHandlingCpp\input - Copy'
dir -Path $dir -Recurse -Filter *.xml | ForEach-Object {
(Get-Content -Path $_.FullName) -replace $search, $replace |
Set-Content $_.FullName
}
Note - watch out for text file encoding changes that may occur from re-writing the file. You can specify the output encoding if you need to, using Set-Content's -Encoding parameter, e.g. -Encoding ASCII.
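As to what's wrong with $src_str: like Perl's s///, the -replace operator treats its first operand as a .NET regular expression, so the ? and . characters in the XML declaration are metacharacters rather than literals, which can make the match behave unexpectedly. If a literal match is intended, escaping the pattern is the safe route, e.g.:
# Escape regex metacharacters so the XML declaration is matched literally.
$search = [regex]::Escape('<?xml version="1.0" encoding="UTF-8" standalone="yes" ?>')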
This took me a while to figure out but I got it!
It's a one-liner. Just go to the folder you want to start at and type this in. Change file.name (wildcards are fine) to the file name you want to search for, and replace string1 and string2 with the string you want replaced and its replacement.
So this searches folders recursively, and for each file it replaces one string with the other and saves the file. Basically Get-ChildItem | ForEach-Object { replace and save }.
All set!
Get-ChildItem -Include file.name -Recurse | ForEach-Object { (Get-Content -Path $_.FullName) -replace 'string1', 'string2' | Set-Content -Path $_.FullName }
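If the strings contain regex metacharacters, remember that -replace is regex-based; a variant of the same one-liner using the .NET string .Replace() method performs a purely literal replacement. A sketch, assuming PowerShell 3.0+ for Get-Content's -Raw parameter (which reads the whole file as a single string):
Get-ChildItem -Include file.name -Recurse | ForEach-Object {
    # .Replace() is a literal (non-regex) string replacement.
    (Get-Content -Path $_.FullName -Raw).Replace('string1', 'string2') |
        Set-Content -Path $_.FullName
}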
I have the below code, which adds a LOGGER.info line after every function definition; I need to run it on a Python script, as per the requirement.
The only question is that the output has to be written back to the same file, so that the new file has all these LOGGER.info statements below each function definition.
E.g. the file abc.py currently has the code below:
def run_func(sql_query):
    return run_func(sql_query)
and the code below should recreate the same abc.py file, but with the LOGGER.info line added after the function definition:
def run_func(sql_query):
    LOGGER.info('MIPY_INVOKING run_func function for abc file in directory')
    return run_func(sql_query)
I am not able to make the sed in this script write back to the same file name, so that the original file is replaced under the same name with all the LOGGER.info statements in it.
for i in $(find * -name '*.py'); do
    echo "#############################################" | tee -a auto_logger.log
    echo "File Name : $i" | tee -a auto_logger.log
    echo "Listing the python functions in the current script $i" | tee -a auto_logger.log
    for j in $(grep "def " "$i" | awk '{print $2}' | awk -F"(" '{print $1}'); do
        echo "Function name : $j" | tee -a auto_logger.log
        echo "Writing the INVOKING statements for $j function definition" | tee -a auto_logger.log
        grep "def " "$i" | sed '/):/w a LOGGER.info (''INVOKING1 '"$j"' function for '"$i"' file in sam_utilities'')'
        if [ $? -ne 0 ]; then
            echo "Auto Logger for $i filename - Not Executed Successfully" | tee -a auto_logger.log
        else
            echo "Auto Logger for $i filename - Executed Successfully" | tee -a auto_logger.log
        fi
    done
done
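The sed w command won't do this: it writes matching lines out to a file named by the rest of the line, so the command above creates an oddly named file instead of editing the script. A working approach, as a minimal sketch, is to let sed edit each file in place and append the LOGGER.info line after each matching def line. This assumes GNU sed (for -i and the one-line a\ form), top-level unindented function definitions, and function names without regex metacharacters:
for i in $(find . -name '*.py'); do
    for j in $(grep "def " "$i" | awk '{print $2}' | awk -F"(" '{print $1}'); do
        # Edit in place: append the (four-space indented) LOGGER.info line
        # after this function's "def <name>(...):" line.
        sed -i "/^def $j(.*:\$/a\\    LOGGER.info('INVOKING1 $j function for $i file in sam_utilities')" "$i"
    done
done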
I have some files in a directory and its subdirectories. I need to search all the files and print the file name and the content between 2 matching patterns in each file.
For example, let's say my files look like below.
File1.txt:
Pattern1
ABCDEFGHI
Pattern2
dafoaf
fafaf
dfadf
afadf
File2.txt:
Pattern1
XXXXXXXXX
Pattern2
kdfaf
adfdaf
fdafad
I need to get following output
File1.txt:
ABCDEFGHI
File2.txt:
XXXXXXXXX
and so on for all the files under the directory and subdirectories, each separated by a new line.
This might work for you:
find . \
    -type f \
    -exec awk 'FNR==1 {print FILENAME ":"} /Pattern1/ {p=1; next} /Pattern2/ {p=0} p==1 {print $0} END {print ""}' {} \;
Note, this prints the FILENAME, even if Pattern1 was not found!
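An alternative sketch with find and sed, assuming a sed that accepts semicolon-separated commands inside { } (GNU and BSD sed both do): print each file name, then only the lines strictly between the two patterns:
find . -type f -exec sh -c '
    for f do
        echo "$f:"
        # Print the Pattern1..Pattern2 range, minus the boundary lines themselves.
        sed -n "/Pattern1/,/Pattern2/{/Pattern1/d;/Pattern2/d;p;}" "$f"
    done' sh {} +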
This will work for you:
Create this shell script as my_grep.sh
#!/bin/sh
# Record the matching line numbers for this file only, in a fresh temp file
# (appending to a shared temp file would let counts from earlier files leak in).
temp=$(mktemp)
grep -nH "Pattern" "$1" > "$temp"
if [ "$(grep -c "$1" "$temp")" -eq 2 ]; then
    limits=$(grep "$1" "$temp" | cut -f2 -d:)
    lower_limit=$(echo $limits | cut -f1 -d" ")
    upper_limit=$(echo $limits | cut -f2 -d" ")
    echo "$1:"
    # Print the lines strictly between the two matching line numbers.
    head -$(expr $upper_limit - 1) "$1" | tail -$(expr $upper_limit - $lower_limit - 1)
fi
rm -f "$temp"
Use the find command to search for files and fire this shell script:
$ find ./test -type f -exec ./my_grep.sh {} \;
./test/File1.txt:
ABCDEFGHI
./test/File2.txt:
XXXXXXXXX
I'm searching through a number of directories for "searchstring", and then running a script on each $file:
for file in `find $dir -name ${searchstring}'*'`;
do
echo $file >> $debug
script.sh $file >> $output
done
My $debug file yields the following:
/root/0007_searchstring/out/filename_20120105_020000.log
/root/0006_searchstring/out/filename_20120105_010000.log
/root/0005_searchstring/out/filename_20120105_013000.log
(filename is _yyyymmdd_hhmmss.log)
...
Is there a way to get find to order by filename or by mktime? Should I pipe find to sort first? Make an array then sort it as per this question?
If you want to ignore the directory path and just sort by the file name, then you should be able to use:
for file in `find $dir -name ${searchstring}'*' | sort --field-separator=/ --key=4`;
Note that --key=4 assumes the file name is always the fourth /-separated field (i.e. a fixed directory depth), as in the example paths above.
'ls -t' if you need to regenerate the list based on timestamp.
'sort -n' if the list is fairly static?
To sort by modification time, you can use stat with find (stat's -c format goes before the file operand; cut -f 2- keeps file names that contain spaces intact):
$ find . -type f -exec stat -c '%Y %n' {} \; | sort -n | cut -d ' ' -f 2-
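If GNU find is available, the same result can be had without spawning one stat process per file; a minimal sketch (%T@ is the modification time as a Unix timestamp, %p the path):
find . -type f -printf '%T@ %p\n' | sort -n | cut -d ' ' -f 2-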
You can pipe the output of find through sort to sort by filename:
find $dir -name "${searchstring}*" | sort | while read file
do
echo "$file" >> $debug
script.sh "$file" >> $output
done
I need to find the file extension of files to be processed using UNIX. The two file extensions which I will be handling are '.dat' and '.csv'.
Please let me know how this can be done.
find . -name "*.dat" -o -name "*.csv"
Finds in the current directory and recursively down, all files that end in those two extensions.
So here is my stab at this.
filename=file.dat
extension=$(echo "${filename}" | awk -F. '{print $NF}')
if [ "${extension}" = "dat" ]; then
    # your code here
fi
Echo the variable ${filename} and pipe that output to awk. With awk, set the field separator to a . and pick up the last field (the print $NF part), so a name with more than one dot still yields its final extension. Note the test uses = inside [ ]; == is a bash extension.
Is this what you want?
find . -name "*.dat"
find . -name "*.csv"
find /path -type f \( -name "*.dat" -o -name "*.csv" \) | while read -r file
do
echo "Do something with $file"
done
if you have the filename in a variable
filename=test.csv
then just use this to get the "csv" part:
echo ${filename##*.}
This works in bash; try it in ksh.
edit:
filename=test.csv
fileext=${filename##*.}
if [ "$fileext" = "csv" ]; then
    echo "file is csv, do something"
elif [ "$fileext" = "dat" ]; then
    echo "file is dat, do something"
else
    echo "mhh what now?"
fi
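For completeness, the same dispatch can also be written with a POSIX case statement on the file name suffix; a minimal sketch:
# Branch on the suffix; the patterns are ordinary shell globs.
case "$filename" in
    *.csv) echo "file is csv, do something" ;;
    *.dat) echo "file is dat, do something" ;;
    *)     echo "mhh what now?" ;;
esac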