Grep a path containing an environment variable and use it - unix

I'm using tcsh, and I'm trying to grep a path from a file that contains several IDs. I'm doing:
grep I241149 $ENV_CASTRO/ALL_CMD_LINES.BAK | grep -o \$"ENV_CASTRO.*.asm"
that gets me:
$ENV_CASTRO/central/WS678/test_do_all.asm
but if I try
cp `grep I241149 $ENV_CASTRO/ALL_CMD_LINES.BAK | grep -o \$"ENV_CASTRO.*.asm"` .
it fails with:
cp: cannot stat `$ENV_CASTRO/central/WS678/test_do_all.asm': No such file or directory
How do I tell tcsh that the output of grep contains a $ that means it is an environment variable and is not plain text?
Thanks in advance.

eval is your friend ....
eval cp `grep I241149 $ENV_CASTRO/ALL_CMD_LINES.BAK | grep -o \$"ENV_CASTRO.*.asm"` .
I don't have the time to create files to test this.
I hope this helps.

The problem is that the output of the grep command is not being evaluated by the shell, and so variable substitution is not happening.
One way to solve this would be to execute the desired command within another shell, for example,
sh -c "cp `grep I241149 $ENV_CASTRO/ALL_CMD_LINES.BAK | grep -o '$ENV_CASTRO.*.asm'` ."

Related

Passing zsh command line arguments into xargs quotations

I have a zsh function, fvi (find vi), which recursively greps a directory searching for files with a pattern, collects them and opens them in vim (on the Mac):
function fvi { grep -rl $1 . | xargs sh -c '/Applications/MacVim.app/Contents/MacOS/Vim -g -- "$@" <$0' /dev/tty }
This looks bad but works fine (on the Mac). But I'd like to set the search pattern for vi to $1 with:
function fvi { grep -rl $1 . | xargs zsh -c '/Applications/MacVim.app/Contents/MacOS/Vim -c +/"$1" -g -- "$@" <$0' /dev/tty }
This of course does not work, since the inner zsh sees $1 as its own first positional parameter (one of the file names), not as the function's $1. I can manually say -c +/xyz and it will set the pattern to xyz, so I know the vim command syntax is working. I just can't get the shell argument $1 substituted into the string that xargs runs.
Any ideas?
I might just use find:
fvi () {
v=/Applications/MacVim.app/Contents/MacOS/Vim
find . -exec grep -e $1 -- {} \; -exec $v -g +/$1 -- {} \;
}
The fact that you are opening each file in vim for interactive editing suggests there are not so many possible matches (or candidates) that running grep multiple times is really an issue. (At worst, you are just replacing each extra shell process started by xargs with an instance of grep.)
This also precludes any possible issue regarding file names that contain a newline.
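If you would rather keep the xargs pipeline, another option (a sketch, untested on a Mac) is to hand the function's $1 to the inner shell as an explicit extra argument, so it is no longer mistaken for one of the file names that xargs appends:
function fvi {
  grep -rl -- "$1" . | xargs sh -c 'pat=$1; shift; /Applications/MacVim.app/Contents/MacOS/Vim -c "+/$pat" -g -- "$@" < /dev/tty' sh "$1"
}
Inside the inner shell $1 is now the pattern, shift drops it so "$@" is just the files, and reading from /dev/tty directly takes the place of the $0 trick in the original.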

dynamically pass string to Rscript argument with sed

I wrote a script in R that has several arguments. I want to iterate over 20 directories and execute my script on each while passing in a substring from the file path as my -n argument using sed. I ran the following:
find . -name 'xray_data' -exec sh -c 'Rscript /Users/Caitlin/Desktop/DeMMO_Pubs/DeMMO_NativeRock/DeMMO_NativeRock/R/scipts/dataStitchR.R -f {} -b "{}/SEM_images" -c "{}/../coordinates.txt" -z ".tif" -m ".tif" -a "Unknown|SEM|Os" -d "overview" -y "overview" --overview "overview.*tif" -p FALSE -n "`sed -e 's/.*DeMMO.*[/]\(.*\)_.*[/]xray_data/\1/' "{}"`"' sh {} \;
which results in this error:
ubs/DeMMO_NativeRock/DeMMO_NativeRock/R/scipts/dataStitchR.R -f {} -b "{}/SEM_images" -c "{}/../coordinates.txt" -z ".tif" -m ".tif" -a "Unknown|SEM|Os" -d "overview" -y "overview" --overview "overview.*tif" -p FALSE -n "`sed -e 's/.*DeMMO.*[/]\(.*\)_.*[/]xray_data/\1/' "{}"`"' sh {} \;
sh: command substitution: line 0: syntax error near unexpected token `('
sh: command substitution: line 0: `sed -e s/.*DeMMO.*[/](.*)_.*[/]xray_data/1/ "./DeMMO1/D1T3rep_Dec2019_Ellison/xray_data"'
When I try to use sed with my pattern on an example file path, it works:
echo "./DeMMO1/D1T1exp_Dec2019_Poorman/xray_data" | sed -e 's/.*DeMMO.*[/]\(.*\)_.*[/]xray_data/\1/'
which produces the correct substring:
D1T1exp_Dec2019
I think there's an issue with trying to use single quotes inside the interpreted string but I don't know how to deal with this. I have tried replacing the single quotes around the sed pattern with double quotes as well as removing the single quotes, both result in this error:
sed: RE error: illegal byte sequence
How should I extract the substring from the file path dynamically in this case?
One way is to loop through the output of find:
while IFS= read -ru "$fd" -d '' files; do
echo "$files" ##: do whatever you want to do with the files here.
done {fd}< <(find . -type f -name 'xray_data' -print0)
This avoids embedding commands inside quotes.
It uses a separate file descriptor just in case something inside the loop reads from (slurps) stdin.
Also, -print0 delimits the file names with null bytes, so it is safe with spaces, tabs, and newlines in paths and file names.
A good habit is to put an echo in front of any command you plan to run on the files, so you can see what is going to be executed before it actually happens.
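For example, the Rscript call from the question could go inside that loop, with the sample ID extracted by parameter expansion instead of sed. A sketch (it assumes the .../<sampleID>_<suffix>/xray_data layout shown in the question; the full script path and the other flags are omitted for brevity):
while IFS= read -ru "$fd" -d '' dir; do
  parent=$(basename "$(dirname "$dir")")   # e.g. D1T1exp_Dec2019_Poorman
  sampleID=${parent%_*}                    # strip the trailing _<suffix> -> D1T1exp_Dec2019
  echo Rscript dataStitchR.R -f "$dir" -n "$sampleID"   # drop the echo once the output looks right
done {fd}< <(find . -name 'xray_data' -print0)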
This is the solution that ultimately worked for me due to issues with quotes in sed:
for dir in `find . -name 'xray_data'`;
do sampleID="`basename $(dirname $dir) | cut -f1 -d'_'`";
Rscript /Users/Caitlin/Desktop/DeMMO_Pubs/DeMMO_NativeRock/DeMMO_NativeRock/R/scipts/dataStitchR.R -f "$dir" -b "$dir/SEM_images" -c "$dir/../coordinates.txt" -z ".tif" -m ".tif" -a "Unknown|SEM|Os" -d "overview" -y "overview" --overview "overview.*tif" -p FALSE -n "$sampleID";
done

Piping into rm command

I want to delete all files from a directory that contain "2" in their name.
This command works well:
ls | grep [*2*]
but when I try to pipe the output of that command to rm
ls | grep [*2*] | rm
I get the error "Try `rm --help' for more information."
Please help.
Why not use the shell's wildcarding directly?
e.g.
$ rm *2*
I don't think you need the ls or the grep. Your above problem stems from the fact that you're piping output into the stdin of rm, whereas you want to supply command line arguments to rm. rm doesn't read from stdin.
To pipe output from another command into rm, you must use xargs.
Try this:
ls | grep [*2*] | xargs rm
xargs turns the piped output into command-line arguments for rm.
You need to feed each line to rm as an argument, and for that you need xargs along with the pipe,
so modify the command like this: ls -1 | grep [*2*] | xargs rm -rf
Just to complement the other answers: instead of running ls and then grep, you could use find.
find . -name "*2*" | xargs rm

How to remove the searching PID from this command line?

I've written a little script to find the IDs of the processes I'm interested in running kill -9 on. It reads something like this:
ps -A -o pid -o command | egrep 'java' | cut -d' ' -f1
However, I also get the PID of the command that runs the egrep java itself. I'd like to alter the command above so that it excludes the egrep process.
I'm fairly new to the command line and I'm not really sure how to do that.
You can try this:
ps -A -o pid -o command | egrep '[j]ava' | cut -d' ' -f1
Wrapping the first letter in brackets means the egrep process's own command line (egrep [j]ava) no longer contains the literal string java, so it doesn't match its own pattern and drops out of the results.
Or, if you have the pgrep command:
pgrep 'java'
It will give you the PIDs directly.
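Since the end goal is kill -9, pkill (pgrep's companion) can do the lookup and the kill in one step, for example:
pkill -9 java
(Add -f if you want the pattern matched against the full command line rather than just the process name.)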

Pipes in unix - is the value implicitly supplied as an argument?

I would like to delete all files ending in .orig recursively from the current directory.
Will this do the trick?
ls -R | grep ".orig$" | rm
Are the results of grep passed implicitly as an argument to rm here?
How about something like:
find ./ -type f -name "*.orig" -exec rm "{}" \;
Seems to work for me, but it might be a good idea to test it with echo instead of rm first ;)
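If your find supports it (GNU and BSD find both do), you can also let find delete the matches itself, with no rm at all:
find ./ -type f -name "*.orig" -delete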
ls -R won't give quite the right output format to pass on to rm (through grep), as it lists the files separately for each directory, like:
.:
local1.orig local
./dir:
nested1.orig nested2.orig
If you wanted to do something similar with grep, you would need to feed it full paths (for example from find) and pipe the result into xargs:
find . | grep "\.orig$" | xargs rm
No, they are not. But that is the purpose of xargs:
ls -R | grep ".orig$" | xargs rm -i
will do what you want. The -i is not necessary, but is a good idea to use the first time you run this. (It will prompt you to delete a file. If you are confident that the answer is always yes, abort and re-run without the -i.)
