In R: Is there a comparable function to Unix's "find"?

I want a version of this in R.
find . \( -type d -name "example_folder_*" \) -prune -print > directory.csv
The reason:
I am receiving a directory that contains a large number of files and subdirectories. I want to know where all folders whose names match the pattern "example_folder_*" are located.

R has a function called list.dirs(arg) that returns a vector of all directories under and including arg. I don't think there is an equivalent of -prune. Once you have the directory-tree vector, though, it is easy to filter with standard R tools.
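A minimal sketch of that approach (assuming the search starts from the current directory and that matching folder names begin with "example_folder_"):

dirs <- list.dirs(".", recursive = TRUE)                 # every directory under and including "."
hits <- dirs[grepl("^example_folder_", basename(dirs))]  # keep those whose own name matches
write.csv(data.frame(directory = hits), "directory.csv", row.names = FALSE)

Unlike -prune, this still descends into matching folders, so nested matches are listed too.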

Related

HOW TO: Create a file with a list of files via a unix command

I need to create a file that has a list of files (the files already exist with data) using a Unix command. Basically, put the names of the existing files into a newly created text or CSV file.
Syntax: /$dir/$file_name.csv
Ex: /var/data/inbound_data/fusion_po_attachments.txt (or fusion_po_attachments.csv)
This path would contain n files with the same naming pattern.
/var/data/inbound_data/fusion_po_attachments.txt is the main file, and it should have the content below:
/var/data/inbound_data/attachment1.csv
.
.
.
/var/data/inbound_data/attachment50.csv
How can we achieve this? Please point out if a similar question already exists. Thanks.
for i in /var/data/inbound_data/*.csv
do
echo "$i"
done > /var/data/inbound_data/fusion_po_attachments.txt
or the same as a one-liner
for i in /var/data/inbound_data/*.csv ; do echo "$i" ; done > /var/data/inbound_data/fusion_po_attachments.txt
or
find /var/data/inbound_data -maxdepth 1 -name '*.csv' > /var/data/inbound_data/fusion_po_attachments.txt
The -maxdepth 1 condition makes sure that matching objects are printed only in the starting directory, not in subdirectories. If you know there aren't any subdirectories, or if you do want files from subdirectories, you can omit it.
It's not entirely clear what you want, but it sounds like you want something like:
find /var/data/inbound_data/ -type f -name '*.csv' > /var/data/inbound_data/fusion_po_attachments.txt

Zsh filename expansion over multiple directories recursively

Problem: I have a directory $BASE, and in this directory (and/or any of the directories below it) are zero or more directory entries matching the pattern *.x. I want to loop over all these entries. I want to use foreach and not find $BASE -name '*.x' for this procedure. I have zsh 5.3.
My current approach goes like this:
foreach f ($BASE/*.x(N) $BASE/**/*.x(N))
    # ... do something with $f
end
Is there a way to write the glob pattern more uniformly (without repeating *.x)?
foreach f ($BASE/**/*.x(N)) is sufficient. **/ already matches zero or more directories, so the pattern covers $BASE/*.x as well.
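A minimal sketch of the simplified loop (assuming $BASE is set; the (N) qualifier makes the glob expand to nothing, instead of raising an error, when there are no matches):

foreach f ($BASE/**/*.x(N))
    print -r -- $f    # stand-in for the real per-file work
end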

recursive user defined function in makefile runs out of stack memory

I have a function that returns all the files in a directory.
# Returns all files in folder recursively that match pattern
#
# $(call rwildcard, folder,pattern)
rwildcard=$(foreach d,$(wildcard $1*),$(call rwildcard,$d/,$2) $(filter $(subst *,%,$2),$d))
Argument 1, folder, is the path of the folder in which to search for files recursively, and it is user-provided.
If this argument is "/", the function runs out of memory and crashes with an exception.
Is there a way to prevent this, besides filtering out "/" as an argument?
Note: I'm using Cygwin.
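For reference, a minimal usage sketch of the function above (assuming GNU Make; src/ and *.c are placeholder arguments):

# collect every .c file under src/ recursively, then print the result at parse time
SRCS := $(call rwildcard,src/,*.c)
$(info $(SRCS))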
I suspect you aren't making any progress in the inner rwildcard. If it matches "." every time, are you stuck in a loop?
Can you use another tool to get a list of files?
r := $(shell find $(dir) -type f -name \*$(likethis)\*)
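A minimal sketch of that alternative (assuming GNU Make; dir and likethis are hypothetical variables standing in for the search root and a name fragment):

# hypothetical values for illustration
dir      := src
likethis := config

# let find do the recursion instead of a recursive make function
r := $(shell find $(dir) -type f -name '*$(likethis)*')
$(info $(r))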

Use of find in unix on strange file/directory names [duplicate]

I'm a beginner at scripting, writing scripts in tcsh and csh (these are taught in my course).
I'm writing a script that uses find to collect directory paths.
This is the relevant part of the script:
set list = (`find $PATH -type d`)
It works fine unless the files or directories have names such as:
#fi##lename&& or −filename or :−,?!drectoryanem!-``
I couldn't handle these special characters, so I changed the find line to:
set list = ("`find $PATH -type d`")
but neither of these works when I try to use the paths from the list in this next loop:
foreach i ($list:q)
foreach file (`find "$i" -maxdepth 1 -type f`)
....
end
end
It couldn't handle these special file names, so I get many errors like find: −."!filename: No such file or directory.
I worked it out. It had to be this way:
set subor = ("`find "'"$i"'" -type f -maxdepth 1`")
Now it copes with all the special characters in the file names, and in:
foreach j ($subor:q)
I quoted it this way so that whitespace in the file names is handled correctly.
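Putting the pieces together, a sketch of the corrected nested loops (assuming the same $list of directories built earlier):

foreach i ($list:q)
    set subor = ("`find "'"$i"'" -type f -maxdepth 1`")
    foreach j ($subor:q)
        echo "$j"    # stand-in for the real per-file work
    end
end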

Using mtime other than with find

I am trying to write a script which will move files older than 1 day to an archive directory. I used the following find command:
for filename in `find /file_path/*.* -type f -mtime +1`
This fails because the argument list is too big: the shell expands /file_path/*.* before find even runs. I got the following error:
/usr/bin/find: arg list too long
Is it possible to use find in an IF-ELSE statement? Can someone provide some examples of using mtime other than in find?
Edit: added the for loop that the find is part of.
find /file_path -name '*.*' -mtime +1 -type f |
while read filename
do ...move operation...
done
That assumes your original code handled spaces etc. in file names acceptably, and that there is no sensible way to do the move in find's own action. It also avoids problems with overlong argument lists.
Why not just use the -exec part of find?
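A minimal sketch of that approach (assuming the same path and archive directory used in the other answers):

find /file_path -name '*.*' -mtime +1 -type f -exec mv {} /usr/local/archived \;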
If you just want to move the files, you could use
find /file_path -name '*.*' -mtime +1 -type f | xargs -i mv {} /usr/local/archived
