I'm trying to rsync .txt files from one folder to another, but the trouble is that many of these files have names that contain spaces, so I keep getting an error. What exactly do I need to do to have the script treat the spaces correctly, without having to spell out the name of each file that has spaces? FWIW, this is on macOS.
do shell script "rsync `find /Volumes/HD/Users/123/Desktop/Test/ -type f -mtime -3` --include='*.txt' --exclude='*' --ignore-existing -raz --progress /Volumes/HD/Users/123/Desktop/TXT/"
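One way to sidestep the quoting problem entirely (a minimal sketch, untested): have find emit a null-delimited list and let rsync read it from stdin via --files-from, so the file names never pass through the shell at all. With --files-from, the listed paths are taken relative to the source directory given to rsync, and the include/exclude filters become unnecessary since find already does the filtering:
cd /Volumes/HD/Users/123/Desktop/Test/ && find . -type f -mtime -3 -name '*.txt' -print0 | rsync --from0 --files-from=- --ignore-existing -az --progress . /Volumes/HD/Users/123/Desktop/TXT/
The same one-liner can be embedded in do shell script unchanged.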
I'm trying to delete everything within a directory except for two folders that I know the names of. Let's say that the two folders are called "dont_delete1" and "dont_delete2". And within the current directory, other folders and files exist.
I have tried
rm -r !(dont_delete1|dont_delete2)
but that requires me to shopt -s extglob, which, due to certain restrictions, I can't use.
So I turned to
find . \! -name [folder name] -delete
I've tested it out on a single folder and it works. But I can't figure out a way to use the above command for multiple folders. I've tried all sorts of commands that I thought would work but was unsuccessful.
Lots of different solutions; here's one that's easy to understand:
$ ls | grep -v dont_delete1 | grep -v dont_delete2 | xargs rm -rf
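If parsing ls worries you (it breaks on names containing spaces), the same idea also works with the find approach from the question, chaining one ! -name test per folder to keep (-exec rm -rf rather than -delete, so non-empty directories go too):
find . -mindepth 1 -maxdepth 1 ! -name dont_delete1 ! -name dont_delete2 -exec rm -rf {} +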
I have a set of zip files with multiple levels of directories in them. I want to find some content from a text file in one of those directories, which could be in any of the zip files. If the files were unzipped, I would use the following:
grep -r 'pattern' path
I tried using zgrep but it said that the option -r isn't supported. Is there a way to grep through the zipped files?
Thanks in advance.
Try it with the find command, like:
find mydir -type f -name "*log.gz" -exec zgrep "pattern" {} \;
The above command will search for the pattern in files named "*log.gz" residing either in mydir or in subdirectories within mydir.
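Note that zgrep only understands gzip-compressed single files; if these are real multi-member .zip archives, zipgrep (a helper shipped with Info-ZIP's unzip) will run grep over each member instead. A sketch along the same lines:
find mydir -type f -name '*.zip' -exec zipgrep 'pattern' {} \;
zipgrep prefixes each match with the name of the archive member it came from.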
Figured maybe someone here might know what's going on. Essentially what I have to do is take a directory and make a tar file omitting a subdir two levels down (root/1/2). Given it needs to work on a bunch of platforms, the easiest way I thought of was to do a find and egrep that directory out, which works well, giving me the list of files.
But then I pipe that file list into an xargs tar rvf command and the resulting file comes out at something like 33 GB. I've tried outputting the find to a file and using tar -T with that file as input; it still comes out to about 33 GB, whereas a straight tar of the whole directory (not omitting anything) comes in where I'd expect it, at around 6 GB.
Any thoughts on what is going on? Or how to remedy this? I really need to get this figured out, I'm guessing it has to do with feeding it a list of files vs. having it just tar a directory, but not sure how to fix that.
Your find command will return directories as well as files.
Consider using find to look for directories and to exclude some:
tar cvf /path/to/archive.tar $(find suite -type d ! -path 'suite/tmp/Shared/*')
When you specify a directory in the file list, tar packages the directory and all the files in it. If you then list the files in the directory separately, it packages the files (again). If you list the sub-directories, it packages the contents of each subdirectory again. And so on.
If you're going to do a files list, make sure it truly is a list of files and that no directories are included.
find . -type f ...
The ellipsis might be find options that eliminate the files in the sub-directory, or it might be a grep -v that eliminates them. Note that -name normally matches only the last component of the name. GNU find has ! -path '*/subdir/*' or variants that let you eliminate files based on their path, rather than just their name:
find . -type f ! -path './root/1/2/*' -print
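Assuming GNU tar, you can also keep everything in one pipeline and stay safe with spaces in file names, since both ends can speak null-delimited lists (backup.tar is just a placeholder name):
find . -type f ! -path './root/1/2/*' -print0 | tar cvf backup.tar --null -T -
Here --null tells tar that the -T list on stdin is NUL-terminated, matching find's -print0.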
I've been stuck on a little unix command line problem.
I have a website folder (4 GB) I need to grab a copy of, but just the .php, .html, .js and .css files (which amount to only a couple hundred KB).
I'm thinking ideally, there is a way to zip or tar a whole folder but only grabbing certain file extensions, while retaining subfolder structures. Is this possible and if so, how?
I did try doing a whole zip, then going through and excluding certain files but it seemed a bit excessive.
I'm kinda new to unix.
Any ideas would be greatly appreciated.
Switch into the website folder, then run:
zip -R foo '*.php' '*.html' '*.js' '*.css'
You can also run this from outside the website folder:
zip -r foo website_folder -i '*.php' '*.html' '*.js' '*.css'
You can use find and grep to generate the file list, then pipe that into zip
e.g.
find . | egrep "\.(html|css|js|php)$" | zip -@ test.zip
(-@ tells zip to read the file list from stdin)
This is how I managed to do it, but I also like ghostdog74's version.
tar -czvf archive.tgz `find test/ | egrep "\.(html|php)$"`
You can add extra extensions by adding them to the regex.
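For instance, to also pick up the stylesheets and scripts:
tar -czvf archive.tgz `find test/ | egrep "\.(html|php|css|js)$"`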
I liked Nick's answer but, since this is a programming site, why not use Ant to do this? :)
Then you can put in a parameter so that different types of files can be zipped up.
http://ant.apache.org/manual/Tasks/zip.html
You may want to use (GNU) find to find all your php, html, etc. files, then tar them up:
find /path -type f \( -iname "*.php" -o -iname "*.css" -o -iname "*.js" -o -iname "*.html" \) -exec tar -r --file=test.tar "{}" +
After that you can compress it.
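For instance, with gzip (test.tar being the archive created above):
gzip test.tar
which leaves you with test.tar.gz.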
You could write a shell script to copy files based on a pattern/expression into a new folder, zip the contents and then delete the folder. Now, as for the actual syntax of it, I'll leave that to you :D.
I have extended globbing enabled in my Bash by
shopt -s extglob
They may be useful in solving the problem.
I ran the following unsuccessfully, since it also moves directories:
$ mv `find . -maxdepth 1` django-tes/
I am trying to find all files except directories and move them to a directory called django-tes/.
How can you move all files except directories in a folder to a folder in terminal?
Try using find . -maxdepth 1 -type f
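Spelled out with the move included (a sketch; django-tes/ is the destination from the question):
find . -maxdepth 1 -type f -exec mv {} django-tes/ \;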