I have two different folders, say folder 1 and folder 2. A bunch of files get created in folder 1 all the time (they're not there yet).
I would like those files to be "physically" in folder 2 but "symbolically" in folder 1 (because for new files to still get created in folder 1, the previous files need to be in it too).
I know I can create a symlink for a particular file from one folder to another.
But how would it work to have in general ALL future files from folder 1 (with different names) be physically in folder 2 and symbolically in folder 1?
I guess I would have to have a line that moves them physically to folder 2, AND that creates a symlink to folder 1.
I tried:
ln -s ./output/* ../../data/jadecheclair/plasim_output/fix_alb/output_try/*
where ./output/ is the folder they get created in (folder 1) and ../../data/jadecheclair/plasim_output/fix_alb/output_try/ is the folder I would like them to be in physically.
What you could try is to move all the old files to the new directory, remove the old one, and create your link:
mv ./output/* ../../data/jadecheclair/plasim_output/fix_alb/output_try/
rm -ri ./output/
ln -s ../../data/jadecheclair/plasim_output/fix_alb/output_try/ ./output/
Then you should have
$ ls -la
[..]
output -> ../../data/jadecheclair/plasim_output/fix_alb/output_try/
And files created in ./output/ should be physically in ..../output_try/
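A quick way to check once the link is in place (newfile.txt is just a throwaway test name):
touch ./output/newfile.txt
ls -la ../../data/jadecheclair/plasim_output/fix_alb/output_try/
# newfile.txt should appear here; ./output/newfile.txt is the same file seen through the symlink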
I'm trying to zip a folder that contains some .json files plus subfolders with more .json files. I need to zip the parent folder with everything included, without including any of the parent or subfolder paths. Is there any way I can do this?
Thank you
EDIT:
I want this:
pending_provider/722c9cb2-268b-4e4a-9000-f7f65e586011-version/1d296136-ac87-424e-89c4-682a63c7b853.json (deflated 68%)
But not this:
pending_provider/722c9cb2-268b-4e4a-9000-f7f65e586011-version/ (stored 0%)
I want to avoid the "stor" compression type, which only saves the folder path; I want only the "defN" entries.
So the -j option doesn't help me a lot.
Thanks
If you don't want any paths at all, you could use the -j option. Below is the man page entry for that option.
-j
--junk-paths
Store just the name of a saved file (junk the path), and do not store
directory names. By default, zip will store the full path (relative to
the current directory).
If you just want to exclude the directory entries themselves:
-D
--no-dir-entries
Do not create entries in the zip archive for directories.
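Given the edit above (keep the file paths, just drop the bare directory entries), -D looks like the closer fit. A rough sketch, assuming the archive is built from the directory that contains pending_provider/:
zip -r -D archive.zip pending_provider
# the .json files are stored with their pending_provider/... paths (deflated),
# but no ".../ (stored 0%)" directory-only entries are created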
I have a directory that contains media that I am trying to set up a basic symbolic link to. The directory is mounted storage on a DigitalOcean droplet, at /mnt/storage/media/all.
This contains directories as shown below:
0118
0119
0218 and so on.......
I am trying to make a symlink from my unix terminal as follows:
$ root#server1:/var/www/abcd/public ln -s /mnt/storage/media/all
So if I cd into the public directory above, I would expect to see the directories 0118, 0119, 0218 and so on. However, when I cd into this directory I see the directory all, and within it are the 0118, 0119, 0218 subdirectories.
How do I change the symbolic link so I see the directories 0118, 0119, 0218, etc. and not the all directory (which contains those same subdirectories)?
Try giving a second argument to the command. Let that argument be the desired destination for the link (but it shouldn't exist prior to this).
E.g. ln -s /mnt/storage/media/all /var/www/abcd/public
In case the folder public already exists, the symlink will be created inside it.
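If public already exists but is empty, one option (a sketch, not tested on the original droplet) is to remove it first so the link itself takes that name:
rmdir /var/www/abcd/public    # only works while public is empty
ln -s /mnt/storage/media/all /var/www/abcd/public
ls /var/www/abcd/public       # should now list 0118, 0119, 0218, ...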
I have a large folder of files that needs to be transferred to a remote site. This folder is currently 10GB total, but contains lots of much smaller files.
Rather than copying the entire 10GB each time, we'd like to massively reduce the data transferred to only the files that are new or changed. We plan to do this like so:
SOURCE_DIR is the folder that has all the files and is up-to-date.
COMPARE_DIR is a directory "clone" of the folder at the remote end. It is basically all the files up to the last time files were transferred.
TRANSFER_DIR is an empty folder into which (we hope) ROBOCOPY can place the files that are new or changed in SOURCE_DIR compared with COMPARE_DIR.
An example:
SOURCE_DIR has 4 files: 1.txt, 2.txt, 3.txt, 4.txt
COMPARE_DIR has 3 of those files: 1.txt, 2.txt, 3.txt
The ROBOCOPY command would compare SOURCE_DIR with COMPARE_DIR, see that 4.txt isn't in COMPARE_DIR, and copy it into TRANSFER_DIR.
TRANSFER_DIR then only has the 4.txt file in it, which we can copy up to the remote end and place in the folder there, making it the same as our SOURCE_DIR at this end.
This can be done with rsync using the --compare-dest=DIR argument, but as this is Windows, I'd rather not have to install rsync unless I need to.
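For reference, the rsync invocation that last sentence refers to would look roughly like this (a sketch with placeholder paths; a relative --compare-dest path is resolved against the destination directory):
rsync -av --compare-dest=/path/to/COMPARE_DIR /path/to/SOURCE_DIR/ /path/to/TRANSFER_DIR/
# files that are identical in COMPARE_DIR are skipped, so only new or changed files land in TRANSFER_DIR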
I have over 500 subdirectories coming off of one root directory, each containing more than 6000 files. The directories are named 20150218, 20150217, etc., one for each day of the year.
I want to develop a script that will zip all of the files in a directory, e.g. 20150217, and name the archive 20150217.zip. I then want to delete the original files.
So all of the subdirectories in ~/public_html/ispy/dlink/ would be zipped separately.
I appreciate any guidance.
Copy and paste the following script into any Unix editor (vim, geany, mousepad) and save it in the directory with your "date" subfolders. Name it as you wish, with no extension, e.g. zipscript. From the terminal, go to the directory with your script, make it executable with chmod +x zipscript, and run it: sudo ./zipscript.
#!/bin/bash
# Zip every subdirectory of the current directory into <name>.zip,
# then delete the original subdirectory and its contents.
for D in *
do
    if [ -d "${D}" ]; then
        zip -r -j "${D}".zip "${D}"   # -j stores the files without the directory prefix
        rm -R "${D}"
    fi
done
The script runs as follows: for every file in the current directory (a directory is also a file under Unix), check whether it is a directory and, if it is, make a zip with the same name, then delete the subdirectory with all the files in it.
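To see what one archive ends up containing, something like this should do (20150217 is one of the example directory names from the question):
unzip -l 20150217.zip
# with -j in the script the entries are listed without the 20150217/ prefix;
# drop -j if you want the files stored under that prefix inside the archive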
I have 36 subdirectories in the same directory, named 10, 11, 12, ..., 45, and a subdirectory logs.
In each subdirectory (except for the directory logs) there is a file with the same name, log.lammps.
I was wondering if there is a way I could copy each log.lammps file from the subdirectories 10-45 into the subdirectory logs, while also appending the number of the directory it originated from to the end of the filename.
So I am looking for a command that copies log.lammps one by one from each subdirectory, and every time a file gets copied into the directory logs its name is changed: log.lammps from subdirectory 10 becomes log.lammps10, log.lammps from subdirectory 11 becomes log.lammps11, and so on.
Any help would be appreciated, since right now I am only dealing with 30-40 files, but in time I will be working with hundreds.
Something along these lines should work:
for f in [0-9][0-9]/log.lammps; do
    d=$(dirname "${f}")     # the two-digit directory name, e.g. 10
    b=$(basename "${f}")    # log.lammps
    cp "${f}" "logs/${b}.${d}"
done
That's easy-peasy with the magic of shell scripting. I'm assuming you have bash available. Create a new file in the directory that contains these subdirectories; name it something like copy_logs.sh. Copy-paste the following text into it:
#!/bin/bash
# copy_logs.sh
# Copies all files named log.lammps from all subdirectories of this
# directory, except logs/, into subdirectory logs/, while appending the name
# of the originating directory. For example, if this directory includes
# subdirectories 1/, 2/, foo/, and logs/, and each of those directories
# (except for logs/) contains a file named log.lammps, then after the
# execution of this script, the new files log.lammps.1, log.lammps.2, and
# log.lammps.foo will have been added to logs/. NOTE: any existing files
# with those names in logs/ will be overwritten.
# List the immediate subdirectories (excluding logs/), stripped of the leading "./".
DIRNAMES=$( find . -mindepth 1 -maxdepth 1 -type d | grep -v logs | sed 's|^\./||' | sort )
for dirname in $DIRNAMES
do
    cp -f "$dirname/log.lammps" "logs/log.lammps.$dirname"
    echo "Copied file $dirname/log.lammps to logs/log.lammps.$dirname"
done
See the script's comments for what it does. After you've saved the file, make it executable by running chmod a+x copy_logs.sh on the command line. After this, you can execute it by typing ./copy_logs.sh while your working directory is the directory that contains the script and the subdirectories. If you add that directory to your $PATH variable, you can run copy_logs.sh no matter what your working directory is.
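A minimal sketch of those steps, run from the directory that contains the script and the subdirectories:
chmod a+x copy_logs.sh        # make the script executable
./copy_logs.sh                # run it
export PATH="$PATH:$PWD"      # optional: lets you call copy_logs.sh from anywhere in this shell session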
(I tested the script with GNU bash v4.2.24, so it should work.)
For more on bash shell scripting, see any number of books or internet sites; you might start with the Advanced Bash-Scripting Guide.