I have these JSON files in a large directory structure. Some are just "abc.json" and some have an added ".finished" suffix. I want to rsync only the files without ".finished".
$ find
.
./a
./a/abc.json.finished
./a/abc.json <-- this file
./a/index.html
./a/somefile.css
./b
./b/abc.json.finished
./b/abc.json <-- this file
Sample rsync command that copies all the "abc.json" AND the "abc.json.finished". I just want the "abc.json".
$ rsync --exclude="finished" --include="*c.json" --recursive \
--verbose --dry-run . server:/tmp/rsync
sending incremental file list
created directory /tmp/rsync
./
a/
a/abc.json
a/abc.json.finished
a/index.html
a/somefile.css
b/
b/abc.json
b/abc.json.finished
sent 212 bytes received 72 bytes 113.60 bytes/sec
total size is 0 speedup is 0.00 (DRY RUN)
Update: Added more files to the folders. HTML files, CSS and other files are present in my scenario. Only files ending in "c.json" should be transferred.
Scenario can be recreated with the following commands:
mkdir a
touch a/abc.json.finished
touch a/abc.json
touch a/index.html
touch a/somefile.css
mkdir b
touch b/abc.json.finished
touch b/abc.json
Try the following command. It assumes that you also want to replicate the source directory tree (for any directories containing files which end in c.json) in the destination location:
$ rsync --include="*c.json" --exclude="*.*" --recursive \
--verbose --dry-run . server:/tmp/rsync
Explanation of command:
--include="*c.json" includes only assets whose name ends with c.json
--exclude="*.*" excludes all other assets (i.e. assets whose name includes a dot .)
--recursive recurse into directories.
--verbose log the results to the console.
--dry-run shows what would have been copied, without actually copying the files. This option/flag should be omitted to actually perform the copy task.
. the path to the source directory.
server:/tmp/rsync the path to the destination directory.
EDIT: Unfortunately, the command provided above also copies files whose filename does not include a dot character. To avoid this, consider utilizing both rsync and find as follows:
$ rsync --dry-run --verbose --files-from=<(find ./ -name "*c.json") \
./ server:/tmp/rsync
This utilizes process substitution, i.e. <(list), to pass the output from the find command to the --files-from= option/flag of the rsync command.
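As a quick sanity check, the output of that find command (which is exactly the file list rsync receives) can be inspected on its own. A sketch using the question's recreation commands in a throwaway demo directory (the name demo is arbitrary):

```shell
# Recreate the question's tree in a scratch directory
mkdir -p demo/a demo/b
touch demo/a/abc.json.finished demo/a/abc.json demo/a/index.html demo/a/somefile.css
touch demo/b/abc.json.finished demo/b/abc.json

# This is the list that would be handed to rsync via --files-from
( cd demo && find . -name "*c.json" | sort )
# -> ./a/abc.json
# -> ./b/abc.json
```

Only the two abc.json files appear; the .finished, .html and .css files all fail the *c.json name test.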
source tree
.
├── a
│ ├── abc.json
│ ├── abc.json.finished
│ ├── index.html
│ └── somefile.css
└── b
├── abc.json
└── abc.json.finished
resultant destination tree
server
└── tmp
└── rsync
├── a
│ └── abc.json
└── b
└── abc.json
A hacky solution is to use grep and create a file containing the names of all the files we want to transfer.
find . | grep "c.json$" > rsync-files
rsync --files-from=rsync-files --verbose --recursive --compress --dry-run \
./ \
server:/tmp/rsync
rm rsync-files
Content of 'rsync-files':
./a/abc.json
./b/abc.json
Output when running rsync command:
sending incremental file list
created directory /tmp/rsync
./
a/
a/abc.json
b/
b/abc.json
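If the fixed rsync-files name is a concern (it could clobber an existing file, or be left behind on error), a variant of the same idea using mktemp and trap cleans up automatically. This is just a sketch; the scratch directory only recreates the scenario:

```shell
mkdir -p scratch/a scratch/b
touch scratch/a/abc.json scratch/a/abc.json.finished scratch/b/abc.json
cd scratch

# Unique temp file instead of a fixed "rsync-files" name; removed on shell exit
list=$(mktemp)
trap 'rm -f "$list"' EXIT

find . | grep "c.json$" | sort > "$list"
cat "$list"
# -> ./a/abc.json
# -> ./b/abc.json
```

The rsync invocation is unchanged apart from using --files-from="$list".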
Related
I want to copy a directory from remote machine to local using rsync, but without some inner folders.
I'm using this command:
rsync -rave --exclude 'js' --exclude 'css' --exclude 'fonts' root@{IP}:/rem_dir1/rem_dir2/public /local_dir1/local_dir2/public
But result of it is:
Unexpected remote arg: root@{IP}:/rem_dir1/rem_dir2/public
rsync error: syntax or usage error (code 1) at main.c(1361) [sender=3.1.2]
I'm sure remote root is correct. So the problem is in rsync command syntax.
What is the correct way to exclude several folders using rsync?
For example we have /public folder which contains dir1, dir2, dir3, dir4 and dir5. How to copy only dir1 and dir2 from /public?
As with your other question, the -rave makes no sense: -e expects a remote-shell argument, so it swallows the next word on the command line, which is what produces the "Unexpected remote arg" error. You want just -av (-a already implies -r).
You can get fancy with include and exclude commands, but the easiest way to copy just two directories is just to list them:
rsync -av \
root@{IP}:/rem_dir1/rem_dir2/public/dir1 \
root@{IP}:/rem_dir1/rem_dir2/public/dir2 \
/local_dir1/local_dir2/public/
where \ is just line-continuation (so I can wrap the long line), and I deliberately only added / to the end of the destination path, not the source paths.
I'm quite new to makefiles, and I cannot figure out how to set up the subdirectories of my source files.
My directory tree is:
i18n/
src/
engine/
graphics/ (currently the only directory used)
I'm using this premade Makefile:
TARGET = caventure
LIBS = -lSDL2
CC = g++
CFLAGS = -Wall
TGTDIR = build
.PHONY: default all clean
default: $(TARGET)
all: default
OBJECTS = $(patsubst %.cpp, %.o, $(wildcard *.cpp))
HEADERS = $(wildcard *.h)
%.o: %.cpp $(HEADERS)
$(CC) $(CFLAGS) -c $< -o $@
.PRECIOUS: $(TARGET) $(OBJECTS)
$(TARGET): $(OBJECTS)
$(CC) $(OBJECTS) -Wall $(LIBS) -o $(TGTDIR)/$(TARGET)
clean:
-rm -f *.o
-rm -f $(TARGET)
GNU make's wildcard function does not recursively visit all subdirectories.
You need a recursive variant of it, which can be implemented as described in this answer:
https://stackoverflow.com/a/18258352/1221106
So, instead of $(wildcard *.cpp) you need to use that recursive wildcard function.
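Adapted to the Makefile above, the recursive variant might look like this. This is a sketch: the rwildcard name follows the linked answer, and the src/ prefix is an assumption based on the directory tree shown in the question.

```make
# rwildcard: $(call rwildcard,dir/,pattern) -> all files under dir/ matching pattern
rwildcard = $(foreach d,$(wildcard $(1)*),$(call rwildcard,$(d)/,$(2)) $(filter $(subst *,%,$(2)),$(d)))

OBJECTS = $(patsubst %.cpp, %.o, $(call rwildcard,src/,*.cpp))
HEADERS = $(call rwildcard,src/,*.h)
```

The function recurses into each entry matched by $(wildcard $(1)*) and keeps the entries whose names pass $(filter) against the pattern (with * rewritten to % for filter's syntax).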
Another simpler way of finding files recursively might be to just use find.
For example, if you have a layout like this.
$ tree .
.
├── d1
│ └── foo.txt
├── d2
│ ├── d4
│ │ └── foo.txt
│ └── foo.txt
├── d3
│ └── foo.txt
└── Makefile
You could write a Makefile like this.
index.txt: $(shell find . -name "*.txt")
echo $^
Which prints this.
$ make
echo d2/d4/foo.txt d2/foo.txt d1/foo.txt d3/foo.txt
d2/d4/foo.txt d2/foo.txt d1/foo.txt d3/foo.txt
I have access to unix server from Putty application. Can anyone tell me how can I view/print all the files and directories inside a directory.
I tried the commands below, found by searching the internet, but they are not working. I'm not sure what they actually do!
find ./ -type d | awk -F "/" '{ ld=0x2500; lt=0x251c; ll=0x2502; for (i=1; i<=NF-2; i++){printf "%c ",ll} printf "%c%c %s\n",lt,ld,$NF }'
and this
ls -R | grep ":$" | sed -e 's/:$//' -e 's/[^-][^\/]*\//--/g' -e 's/^/ /' -e 's/-/|/'
The tool tree will help you; while you're at it, you might also want to install pstree.
19:38:05 dusted@mono~
$ tree test
test
├── a
│ ├── 1
│ ├── 2
│ └── 3
├── b
│ ├── 1
│ ├── b
│ └── c
├── b-files.txt
├── new-b-files.txt
├── newer-b-files.txt
└── test
2 directories, 10 files
Hey there, I did a little searching and stumbled upon a site that explains what you are asking. Let us know if this leads you in the right direction: http://www.centerkey.com/tree/
The 'find' command should do the job:
find /path/to/directory
If you want more information for each entry, you can combine 'find' with 'ls' like this:
find /path/to/directory -exec ls -ld "{}" \;
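A minimal, self-contained illustration of what plain find prints (the proj tree here is made up for the demo):

```shell
# Build a tiny throwaway tree
mkdir -p proj/docs
touch proj/readme.txt proj/docs/guide.txt

# find lists the starting directory itself plus everything beneath it
find proj | sort
# -> proj
# -> proj/docs
# -> proj/docs/guide.txt
# -> proj/readme.txt
```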
Is there a way to scp all files in a directory recursively to a remote machine and keep their original filenames but don't copy the directory it is in?
dir1/file
dir1/dir2/file2
so the contents of dir1 would be copied only. dir1 would not be created. The dir2 directory would be created with file2 inside though.
I have tried scp -r dir1 remote:/newfolder but it creates dir1 in the /newfolder directory on remote. I don't want it to create that dir1 directory. Just put all the files inside of dir1 into newfolder.
cd dir1
scp -r . remote:/newfolder
This avoids giving scp a chance to do anything with the name dir1 on the remote machine. You might also prefer:
(cd dir1; scp -r . remote:/newfolder)
This leaves your shell in its original directory, while working the same (because it launches a sub-shell that does the cd and scp operations).
This copies the list of files produced by the shell expansion dir1/* to the remote location remote:/newfolder (note that * does not match hidden dotfiles):
scp -r dir1/* remote:/newfolder
You can use the dot syntax with relative path.
scp -r dir1/. remote:/newfolder
If the remote directory does not exist it is created.
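The difference between the source spellings is easy to see locally with cp -r, which treats these paths the same way in this respect (scp itself needs a remote host, so this is only an analogy; the directory names are invented):

```shell
mkdir -p dir1/dir2 newfolder_a newfolder_b
touch dir1/file dir1/dir2/file2

cp -r dir1 newfolder_a     # creates newfolder_a/dir1/... (the unwanted extra level)
cp -r dir1/. newfolder_b   # copies only the contents of dir1

find newfolder_b | sort
# -> newfolder_b
# -> newfolder_b/dir2
# -> newfolder_b/dir2/file2
# -> newfolder_b/file
```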
I can't seem to get the command to back up /etc/php5, /etc/apache2 and /etc/mysql right. I'm using two commands, since I couldn't figure out how to do it all in one. The first one works:
rsync -vahtl --dry-run --log-file=$LOGFILE --exclude="wp-includes/" --exclude="wp-admin/" --exclude="wp-*.php" /var/www $DROPBOX_FOLDER
But when I run the second, I tried a bunch of --include and --exclude directives variation and nothing works:
rsync -vahtl --dry-run --log-file=$LOGFILE --include="php5" --exclude="*" /etc $DROPBOX_FOLDER
rsync -vahtl --dry-run --log-file=$LOGFILE --include="*/" --include="php5/" --exclude="*" /etc $DROPBOX_FOLDER
etc..
The quickest way is to run it as a shell script, using something like this. It would need to be adjusted to your Linux flavor:
#!/bin/sh
# rsync backup script
rsync -avz --delete-excluded --exclude-from=backup.lst / /home/USERNAME/Dropbox/backup
Then make a file called backup.lst in the directory from which you will run the script:
# Include
+ /etc/php5
+ /etc/apache2
+ /etc/mysql
+ /var/www
# Exclude
- /var/www/wp-admin/*
- /var/www/wp-*.php
- /var/www/wp-includes/*
- /etc/*
- /run/*
- /proc/*
- /sys/*
- /tmp/*
- lost+found/
- /media/*
- /mnt/*
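One detail worth knowing about such a list: rsync applies filter rules in the order given, and the first matching rule wins. So each "+" line must appear before any "-" line that would otherwise match the same path, as in this fragment:

```
# First match wins: if "- /etc/*" came first, /etc/php5 would already be
# excluded and its "+" rule would never be consulted.
+ /etc/php5
- /etc/*
```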
Here are some exclude/include examples:
# --exclude "*.o" would exclude all filenames matching *.o
# --exclude "/foo" would exclude a file in the base directory called foo
# --exclude "foo/" would exclude any directory called foo
# --exclude "/foo/*/bar" would exclude any file called bar two levels below a base directory called foo
# --exclude "/foo/**/bar" would exclude any file called bar two or more levels below a base directory called foo
# --include "*/" --include "*.c" --exclude "*" would include all directories and C source files
# --include "foo/" --include "foo/bar.c" --exclude "*" would include only foo/bar.c (the foo/ directory must be explicitly included or it would be excluded by the "*")