I want to run a program on a file that exists in several different subdirectories and redirect the output to an output file. I want the output to be saved in the directory containing the input file the program was run on.
So I would like to do something like this:
for x in */*.txt; do command $x > output.fsa; done
My questions are:
Is the above loop correct? Should I change directory in order to save the output in the directory where the command was executed, or does Linux take care of it?
Any ideas on how to include the name of the directory in the output file name?
Is the above loop correct?
Yes
Should I change directory in order to save the output in the directory where the command was executed, or does Linux take care of it?
You do not need to change directory; it is enough to redirect the output to a file in the correct directory:
for x in */*.txt; do command "$x" > "$(dirname "$x")"/output.fsa; done
The loop is correct: it will iterate over all txt files in the subdirectories of the current working directory (where this script or command is being executed). You don't have to change directory to save the output in that subdir, but Linux doesn't take care of it for you either :)
You can delete everything after the first / using the variable expansion ${x%%/*}
Try
for x in */*.txt; do
command "$x" > "${x%%/*}"/output.fsa
done
Remember, if you have more than one txt file in a given subdir, you will run command "$x" multiple times and overwrite output.fsa each time.
You can use append (>>) in that case.
Try
for x in */*.txt; do
echo "Executing command \"$x\"" >> "${x%%/*}"/output.fsa
command "$x" >> "${x%%/*}"/output.fsa
done
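To also cover the second part of the question (getting the directory name into the output file name), a minimal sketch along the same lines; the output_<dir>.fsa naming is just an example, and command stands for whatever program you are running:
for x in */*.txt; do
    d="${x%%/*}"                        # name of the subdirectory
    command "$x" >> "$d/output_$d.fsa"  # e.g. sub1/output_sub1.fsa
done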
Using any of the standard Robot libraries, is it possible to recursively copy the contents of a directory to an existing destination directory?
Basically, I'm looking for the equivalent of the following shell command: cp -r foo/. bar (note the trailing dot)
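For illustration, this is roughly what the trailing dot changes when the destination directory already exists (foo and bar are just placeholder names):
mkdir -p foo bar && touch foo/a.txt
cp -r foo bar      # creates bar/foo/a.txt (copies the directory itself)
cp -r foo/. bar    # creates bar/a.txt (copies only the contents)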
I tried Copy Directory but this creates a directory foo inside bar (as documented) and it doesn't stop doing that even when supplying the trailing dot. Copy Files chokes when it encounters a directory.
Is there anything I overlooked? Or do I need to just call cp -r myself?
As I only need this to work on Linux, I ended up implementing a custom keyword calling cp -r. If this is ever needed cross-platform, then I'll follow the suggestions to directly implement it in Python.
Copy Directory Contents
    [Documentation]    Recursively copies the contents of the source directory into the destination.
    [Arguments]    ${source}    ${destination}
    Directory Should Exist    ${source}
    Directory Should Exist    ${destination}
    ${result} =    Run Process    cp    -r    ${source}/.    ${destination}/
    Should Be Equal As Integers    ${result.rc}    0
Going through a UNIX shell script, I noticed that the path to the current working directory is being obtained using the following
BASE_DIR=$( readlink -e `dirname $0` )
The command 'pwd' also returns the same result. Is there a reason to use the above instead of pwd?
The above returns the directory of the script being executed, not the current working directory; the two can differ.
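A minimal sketch of how the two can differ; the script location /opt/scripts and the calling directory /home/user are hypothetical:
#!/bin/sh
# saved as /opt/scripts/where.sh and run from /home/user
BASE_DIR=$( readlink -e "$(dirname "$0")" )
echo "script directory:  $BASE_DIR"    # prints /opt/scripts
echo "working directory: $(pwd)"       # prints /home/user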
I have a general question as to why the following occurs; it seems I have a misconception about 'pwd'.
You start with directory /test and in it you have /test/folder1.
folder1 contains file1.txt.
In two separate terminals we run "cd /test" followed by "ls", and both terminals show folder1 in the output.
We now "cd folder1" on terminal one. Terminal two remains in /test.
If we then "mv folder1 folder2" on terminal two and run an "ls" we get folder2 as the output. Clearly indicating our mv was successful.
However, within terminal 1 (which was in /test/folder1) if we run a "pwd" the output remains /test/folder1. Ie: it does NOT reflect that we have since moved the folder to /test/folder2.
Why is this the case? I can understand why if we were to edit the file1.txt it is just a pointer within the file system that should be pointing to the same file. Indeed it is as you can modify the file in each terminal and see the edits in the other. However, why is it the case that the 'pwd' command no longer reflects the actual path to that directory?
Thanks!
Assuming you're using bash, pwd is showing you the value of the PWD environment variable, which is updated when you change directory with cd. The folder1 directory changing name does not cause bash to update PWD. However you can find evidence that the directory has changed name:
pwd -P will show the new name of the directory.
ls -l /proc/self/cwd will link to the new name.
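A minimal reproduction of what the two terminals see, using /tmp/test instead of /test so it runs without root:
mkdir -p /tmp/test/folder1
cd /tmp/test/folder1                     # terminal 1
mv /tmp/test/folder1 /tmp/test/folder2   # terminal 2
pwd      # terminal 1: still prints /tmp/test/folder1 (bash's remembered $PWD)
pwd -P   # terminal 1: prints /tmp/test/folder2 (resolved through the filesystem)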
I think it is just the case that the first terminal has no reason to re-evaluate where it is. If you run the following command in the first terminal
cd .
you will see your current working directory has indeed changed per the rename (mv).
I'm writing a script that will print the file names of every file in a subdirectory of my home directory. My code is:
foreach file (`~/.garbage`)
echo "$file"
end
When I try to run my script, I get the following error:
home/.garbage: Permission denied.
I've tried setting permissions to 755 for the .garbage directory and my script, but I can't get over this error. Is there something I'm doing incorrectly? It's a tcsh script.
Why not just use ls ~/.garbage?
Or, if you want each file on a separate line, ls -1 ~/.garbage.
Backticks will try to execute whatever is inside them. You are getting this error because you are passing a directory name within backticks.
You can use ls ~/.garbage in backticks as mentioned by Mark, or use ~/.garbage/* and rely on the shell to expand the glob for you. If you want only the filename from a full path, use the basename command or some sed/awk magic.
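As a minimal corrected version of the loop in tcsh, assuming ~/.garbage contains only regular files:
foreach file (~/.garbage/*)
    echo "$file"               # full path; use `basename "$file"` for just the name
end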
I have 36 subdirectories in the same directory, named 10, 11, 12, ..., 45, plus a subdirectory logs.
In each subdirectory (except for the directory logs) there is the same file called log.lammps.
I was wondering whether there is a way to copy each log.lammps file from each of the subdirectories 10-45 into the subdirectory logs, while also appending the number of the directory it originated from to the end of the filename.
So I am looking for code that copies the file log.lammps from each subdirectory, one by one, into the directory logs, renaming it as it goes: log.lammps from subdirectory 10 becomes log.lammps10, log.lammps from subdirectory 11 becomes log.lammps11, and so on.
Any help would be appreciated, since right now I am only dealing with 30-40 files, and in time I will be working with hundreds.
Something along this line should work:
for f in [0-9][0-9]/log.lammps; do
    d=$(dirname "${f}")
    b=$(basename "${f}")
    cp "${f}" "logs/${b}.${d}"
done
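If you want exactly the names asked for in the question (log.lammps10 rather than log.lammps.10), a minimal variant of the same idea, assuming the logs directory already exists:
for d in [0-9][0-9]; do
    cp "$d/log.lammps" "logs/log.lammps$d"
done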
That's easy-peasy with the magic of shell scripting. I'm assuming you have bash available. Create a new file in the directory that contains these subdirectories; name it something like copy_logs.sh. Copy-paste the following text into it:
#!/bin/bash
# copy_logs.sh
# Copies all files named log.lammps from all subdirectories of this
# directory, except logs/, into subdirectory logs/, while appending the name
# of the originating directory. For example, if this directory includes
# subdirectories 1/, 2/, foo/, and logs/, and each of those directories
# (except for logs/) contains a file named log.lammps, then after the
# execution of this script, the new files log.lammps.1, log.lammps.2, and
# log.lammps.foo will have been added to logs/. NOTE: any existing files
# with those names will be overwritten.
DIRNAMES=$( find . -type d | grep -v logs | sed 's/\.//g' | sed 's/\///g' | sort )
for dirname in $DIRNAMES
do
    cp -f "$dirname/log.lammps" "logs/log.lammps.$dirname"
    echo "Copied file $dirname/log.lammps to logs/log.lammps.$dirname"
done
See the script's comments for what it does. After you've saved the file, make it executable by running chmod a+x copy_logs.sh on the command line. After that, you can execute it by typing ./copy_logs.sh while your working directory is the one that contains the script and the subdirectories. If you add that directory to your $PATH variable, you can run copy_logs.sh no matter what your working directory is.
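As a minimal sketch of those setup steps (the ~/lammps_runs location is only an example of where the script and subdirectories might live):
cd ~/lammps_runs
chmod a+x copy_logs.sh
./copy_logs.sh
# optional: make the script callable from any working directory
export PATH="$PATH:$HOME/lammps_runs"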
(I tested the script with GNU bash v4.2.24, so it should work.)
For more on bash shell scripting, see any number of books or internet sites; you might start with the Advanced Bash-Scripting Guide.