Passing wildcards to find

I'm trying to automate my test runner. For that I need the name of the update file in a variable. As this name depends on the version, I'm trying to use find with a pattern to get the file name. That works just fine in bash.
However, if I use that same pattern in expect, find complains that it can't find anything.
My guess is that expect is doing something to my wildcards, but my experiments with {}, "", '' and \ didn't get it working.
I guess I could create a helper sh script that writes the name into a file and then read that file, but I don't like that solution; there has to be a way to pass characters with special Tcl meaning as arguments.
At the moment my call looks something like this, with an absolute path in front of the pattern:
set pattern {[0-9]*/*test*}
set updateFile [exec find ${pattern} -type f]
The result is that find reports '[0-9]*/*test*': No such file or directory. The pattern is what I would expect, and when I call find [0-9]*/*test* -type f in bash it returns the expected file path. find also works fine as long as I don't use any wildcards.
Does anybody have an idea what is wrong?

When you run find [0-9]*/*test* -type f in Bash, it is Bash that interprets the wildcard [0-9]*/*test* and expands it to the matching file names, which it then passes to find. That is to say, find never sees the wildcard.
With exec find $pattern -type f, Tcl does not interpret what's in $pattern and passes it directly to find. Unfortunately, find does not interpret wildcards in its path arguments either, so it fails with an error like find: '[0-9]*/*test*': No such file or directory.
To work around this, you can invoke find through bash -c:
exec bash -c "find $pattern -type f"
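Note that the pattern is first substituted by Tcl and then re-parsed by bash, so the wildcard expansion behaves just as it does when you type the command interactively. As a small illustration of who expands the wildcard, here is a hypothetical bash session (the 1.0/mytest.bin layout is made up for the example):
mkdir -p 1.0 && touch 1.0/mytest.bin
echo find [0-9]*/*test* -type f      # bash expands the glob: prints "find 1.0/mytest.bin -type f"
echo find '[0-9]*/*test*' -type f    # quoted: prints the literal pattern, which is what Tcl's exec passes to find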

Related

find command in unix with regex

Can someone tell me what this command does: find ./ -regex ".*"\!*"*" ?
Guess based on too little information in the question:
This may be part of the definition of an alias like
alias f 'find ./ -regex ".*"\!*"*"'
for csh or tcsh which could be called like
f some pattern
to recursively find files that somehow match the specified pattern.
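A quick, hypothetical tcsh session to illustrate the guess (\!* is csh/tcsh's placeholder for the alias arguments):
alias f 'find ./ -regex ".*"\!*"*"'
f mobile    # after alias substitution this runs: find ./ -regex ".*"mobile"*"
            # which the shell concatenates into the single regex argument .*mobile*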

"find" command returning nothing when searching through absolute path

Thought there might be a simple solution to this, but I can't seem to find it anywhere. It's a simple-enough problem. Say I have the following folder/file structure:
/home/
    text1.txt
    mydir/
        text2.txt
Then I input the command:
find . -name *.txt
This command returns "text1.txt" when called from within /home, and returns "text2.txt" when called from within /home/mydir, as it should.
However, when calling the following from /home...:
find /home/mydir -name *.txt
it returns nothing. My expectation is that it would return "text2.txt". Any thoughts? I have already checked whether I have any wayward aliases assigned to find, and I have none.
It is also worth noting that I have two Unix machines: the use of an absolute path for find works on one machine and not the other. I can't go into much more detail than that, I'm afraid; I'm just looking for a direction to investigate this further.
Thanks to anyone who can help :-)
You should use
find . -name "*.txt"
otherwise bash will expand *.txt to text1.txt (the only match in the current directory), resulting in the following command:
find . -name text1.txt
which no longer matches text2.txt.
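A quick, hypothetical way to see what find actually receives with the layout above (echo just prints the command line after the shell has expanded it):
cd /home
echo find /home/mydir -name *.txt      # prints: find /home/mydir -name text1.txt
echo find /home/mydir -name "*.txt"    # prints: find /home/mydir -name *.txt
This may also explain why the command appears to work on one machine and not the other: if the current directory happens to contain no .txt files, bash (with its default options) passes the pattern through unexpanded, and find then does the matching itself.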

Use of find in unix on strange file/directory names [duplicate]

I'm a beginner scripter, writing scripts in tcsh and csh (these are what is taught in my course).
I'm writing a script which uses find to collect directory paths.
This is the relevant part of the script:
set list = (`find $PATH -type d`)
It works fine until the file or directory names contain special characters, such as:
#fi##lename&& or −filename or :−,?!drectoryanem!-``
I couldn't handle these special characters, so I changed the find call to:
set list = ("`find $PATH -type d`")
but none of these work. When I want to use the paths from the list in this next loop:
foreach i ($list:q)
foreach file (`find "$i" -maxdepth 1 -type f`)
....
end
end
it can't handle these special file names, so I get many errors like find: −."!filename: No such file or directory.
I worked it out
It had to be this way:
set subor = ("`find "'"$i"'" -type f -maxdepth 1`")
Now everything in the file names is taken literally.
And in:
foreach j ($subor:q)
I quoted it this way so that whitespace in the file names no longer splits the paths.
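Putting the pieces from the question together, the working loop would look roughly like this (a sketch assembled from the quoting reported as working above, not re-tested against every odd file name):
foreach i ($list:q)
    set subor = ("`find "'"$i"'" -type f -maxdepth 1`")
    foreach j ($subor:q)
        # ... work with "$j" here; each path stays a single word, odd characters intact
    end
end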

Program fails to move file

I'm trying to move files from one place to another directory. My program reads Log_Deleter.txt and uses the parameters given in each line to move the matching files.
When I execute the script, it seems to run fine (no errors), but none of the files are moved. I'm not sure why it isn't moving the files or displaying any error.
Can someone please identify the error?
my attempt:
#!/bin/ksh
while read -r line ; do
v=$line
set -- $v
cd /
$(find "$1" -type f -name "$2" -mtime +"$3" -exec mv {} "$4" \;)
done < Log_Deleter.txt
Log_Deleter.txt
/usr/IBM/WebSphere/AppServer/profiles/AppSrvSIT1/logs/Server1 'SystemOut_*' 5 /backup/Abackuptest1
/usr/IBM/WebSphere/AppServer/profiles/AppSrvSIT1/logs/Server2 'SystemOut_*' 5 /backup/Abackuptest2
Thanks for your help!
Find is looking for files that have a literal ' in the name. You need to remove the single quotes from $2 before invoking find. Try:
#!/bin/ksh
while read -r path name mtime dest ; do
    # strip the literal single quotes that surround the pattern in Log_Deleter.txt
    name=$( echo $name | tr -d "'" )
    find "$path" -type f -name "$name" -mtime +"$mtime" -exec mv {} "$dest" \;
done < Log_Deleter.txt
The problem is that you are trying to match a file whose name actually has the single quotes in it.
Barring other problems, I think your script will probably work once you take the quotes out of Log_Deleter.txt.
The quotes are only meaningful when the shell is parsing command input, and that is not what the read builtin does. And even when reading command input, once the quotes get into a variable they stay there forever unless the value is re-read at the shell's command-parsing layer via eval.
The shell is not exactly a macro processor. It's a complicated hybrid that is a little bit command-line interpreter, a little bit programming language, and a little bit macro processor.
And, speaking of eval, it's not necessary to wrap the find in an eval-like $(...) construct. Simplify your script to run find directly and you will find it easier to debug and understand.
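A minimal sketch of that suggestion, with the quotes removed from the data file instead of stripped in the script, and find run directly rather than inside $(...):
#!/bin/ksh
# Log_Deleter.txt now holds unquoted patterns, e.g.:
# /usr/IBM/WebSphere/AppServer/profiles/AppSrvSIT1/logs/Server1 SystemOut_* 5 /backup/Abackuptest1
while read -r path name mtime dest ; do
    # read -r does not glob-expand, so $name holds the literal pattern SystemOut_*
    find "$path" -type f -name "$name" -mtime +"$mtime" -exec mv {} "$dest" \;
done < Log_Deleter.txt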

why is zsh globbing not working with find command?

I have been using zsh globbing for commands such as:
vim **/filename
vim *.html.erb
and so on, but when I type in something like:
find . -name *mobile*
I get the response:
zsh: no matches found: *mobile*
Why?
find . -name *mobile* # does not work
vs
find . -name '*mobile*' # works
The difference is due to the steps that the shell takes when it parses a line. Normally, the shell expands any wildcards it finds before it runs the command. However, the single quotes mark the argument as being a literal, which means that the shell does not perform wildcard expansion on that argument before running the command.
To demonstrate the difference, suppose you are in a directory with the following files:
$ tree
./
    mobile.1
    dir/
        mobile.2
In the first case, without single quotes, zsh will process as follows:
expand the glob, rendering simply mobile.1 (because that is the only matching filename in the current directory);
pass the result to find, hence:
find . -name mobile.1
So find will only look for files named literally mobile.1
In the second form, with single quotes, the entire glob will be preserved and passed to find:
find . -name *mobile*
Which means that find will look for any filename containing the string "mobile".
The important thing to note here is that both zsh and find support the same wildcard syntax; by using single quotes, you induce find to handle the wildcards in this case rather than zsh.
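The "no matches found" error in the question comes from zsh's default nomatch behavior: when a glob matches nothing, zsh aborts the command instead of passing the pattern through literally the way bash does. Besides quoting the pattern, a couple of zsh-specific alternatives (sketched, assuming an interactive zsh):
noglob find . -name *mobile*     # the noglob precommand modifier disables filename expansion for this one command
alias find='noglob find'         # some users add this to ~/.zshrc so arguments to find are never globbed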
Turns out that all you have to do to solve the problem is add some quotes around the input:
find . -name '*mobile*'
I don't really have an answer as to why just yet, and the documentation doesn't have anything that sticks out to me, but let me know if you know the answer!
