UNIX: Move files from subfolders to another folder

I have a directory containing folders and subfolders. These subfolders contain, for example, .xml files.
I'd like to copy all of the .xml files into a separate folder. My UNIX is rusty; any and all help would be greatly appreciated. Thanks, Adam

Do you mean copy all .xml files from all subfolders without having to specify the subfolder names?
find . -name \*.xml -exec /bin/cp {} /dest/dir/ \;
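If you have GNU find and GNU coreutils, a variant of the same idea (a sketch, not tested against your tree) runs cp once per batch of files instead of once per file:
# -t names the destination directory; {} + passes many files to each cp call
find . -name '*.xml' -exec cp -t /dest/dir/ {} +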

Try this command (with the needed changes, of course), e.g.:
cp source_dir/project1/*.xml dest_dir/new_project2/summer2012
Note that you don't have to specify the filenames at the destination when they stay the same.
For more information, see the cp man page.

Or, to move (rather than copy) the files:
find -iname '*.xml' -exec mv \{\} /dest/directory \;

Related

Unix shell scripting - Copy files alone from parent and sub-folders into a new folder

I have a parent folder with files and many sub-folders with files. I need to copy just the files (from the parent and its sub-folders) into an OutputFolder. Below is the folder structure.
ParentFolder: Parent_1.txt, Parent_2.txt
SubFolder1: Folder1_1.txt, Folder1_2.txt
SubFolder2: Folder2_1.txt, Folder2_2.txt
OutputFolder: Parent_1.txt, Parent_2.txt, Folder1_1.txt, Folder1_2.txt, Folder2_1.txt, Folder2_2.txt
I tried the code below, but it copies all the files from the sub-folders into the parent folder and then moves them to the OutputFolder. Also, when I call "sh Filename.sh", I get missing argument to `-exec' and cp: cannot stat '20190105'$'\r''/*': No such file or directory.
Today=$(date +%Y%m%d -d "today")
mkdir $Today
Yesterday=$(date +%Y%m%d -d "yesterday")
find $Yesterday -iname "*.txt" -exec cp {} $Yesterday \;
cp $Yesterday/* $Today/
Request your help on this!
I need to copy just the files from the parent and its sub-folders into an OutputFolder.
I tried the code below, but it copies all the files from the sub-folders into the parent folder
In order to copy the files directly to the OutputFolder $Today, just specify $Today rather than $Yesterday after -exec cp {}.
I get missing argument to `-exec' cp: cannot stat '20190105'$'\r''/*': No such file or directory.
The \r is a sign of Windows line endings in your script - remove the CRs or save it in Unix format.
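One way to strip them, assuming the standard tr utility (dos2unix does the same job where it is installed; the temporary name Filename.tmp is arbitrary):
# rewrite the script without CR characters, then replace the original
tr -d '\r' < Filename.sh > Filename.tmp && mv Filename.tmp Filename.sh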
Use this:
find . -maxdepth 1 -iname "*.txt" -exec cp "{}" $Yesterday \;
to limit the search to the current directory only. Mind the quotation marks around the curly brackets.
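Putting both fixes together, a minimal sketch of the corrected script (assuming GNU date and that the file has been re-saved with Unix line endings) could look like this:
#!/bin/sh
Today=$(date +%Y%m%d -d "today")
Yesterday=$(date +%Y%m%d -d "yesterday")
mkdir -p "$Today"
# copy the .txt files from yesterday's folder and all of its sub-folders
# straight into today's OutputFolder, with no intermediate cp step
find "$Yesterday" -type f -iname "*.txt" -exec cp "{}" "$Today" \;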

How to recursively copy specific file and keep name of parent directory

I have multiple folders with different names: folderA, FolderB etc.
Within each of these folders are multiple files: fileA, fileB, fileC etc.
I want to search through all these folders and copy only specific files to a new location, keeping the same parent folder name, e.g. I want to generate:
new_location/folderA/fileA
new_location/folderA/fileC
new_location/folderB/fileA
new_location/folderB/fileC
Could anyone suggest the unix commands that would accomplish this?
Thanks
Rob
This depends somewhat on how you specify your specific files.
find folderA folderB folderC -type d -exec mkdir -p new_location/{} \;
should make the proper subdirectories, and
find folderA folderB folderC -name somepattern -exec cp {} new_location/{} \;
should copy the matching files into them (where somepattern matches your specific files). You may or may not need to worry about an extra "/", depending on the directory names, etc.
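If GNU cp is available, its --parents option recreates the source path under the destination, so the two steps can be collapsed into one. A sketch, where somepattern again stands for whatever identifies your specific files:
mkdir -p new_location
# --parents rebuilds folderA/..., folderB/... underneath new_location
find folderA folderB -type f -name 'somepattern' -exec cp --parents {} new_location/ \;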

Grep recursively through zipped files

I have a set of zip files with multiple levels of directories in them. I want to find some content from a text file in one of those directories, which could be in any of the zip files. If the files were unzipped, I would use the following:
grep -r 'pattern' path
I tried using zgrep but it said that the option -r isn't supported. Is there a way to grep through the zipped files?
Thanks in advance.
Try it with the find command, like this:
find mydir -type f -name "*log.gz" -exec zgrep "pattern" {} \;
The above command will search for the pattern in files named "*log.gz" residing either in mydir or in any subdirectory of mydir.
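If the archives are actual .zip files (with directories inside) rather than gzip-compressed single files, zipgrep from the Info-ZIP unzip package can be driven by find in the same way. A sketch, assuming zipgrep is installed:
# zipgrep greps every member of each archive it is given
find mydir -type f -name "*.zip" -exec zipgrep "pattern" {} \;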

Tar creating a file that is unexpectedly large

Figured maybe someone here might know what's going on. Essentially, what I have to do is take a directory and make a tar file omitting a subdir two levels down (root/1/2). Given it needs to work on a bunch of platforms, the easiest way I could think of was to do a find and egrep that directory out, which works well and gives me the list of files.
But then I pipe that file list into an xargs tar rvf command and the resulting file comes out at something like 33 GB. I've tried outputting the find results to a file and using tar -T with that file as input; it still comes out at about 33 GB, whereas a straight tar of the whole directory (not omitting anything) comes in where I'd expect it, at 6-ish GB.
Any thoughts on what is going on, or how to remedy this? I really need to get this figured out. I'm guessing it has to do with feeding it a list of files vs. having it just tar a directory, but I'm not sure how to fix that.
Your find command will return directories as well as files.
Consider using find to look for directories and to exclude some:
tar cvf /path/to/archive.tar $(find suite -type d ! -name 'suite/tmp/Shared/*')
When you specify a directory in the file list, tar packages the directory and all the files in it. If you then list the files in the directory separately, it packages the files (again). If you list the sub-directories, it packages the contents of each subdirectory again. And so on.
If you're going to do a files list, make sure it truly is a list of files and that no directories are included.
find . -type f ...
The ellipsis might be find options to eliminate the files in the sub-directory, or it might be a grep -v that eliminates them. Note that -name normally only matches the last component of the name. GNU find has ! -path '*/subdir/*' or variants that will allow you to eliminate the file based on path, rather than just name:
find . -type f ! -path './root/1/2/*' -print
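Putting that together, one way to build the archive in a single pipeline (assuming GNU find and GNU tar, and the root/1/2 path from the question) would be:
# -print0 and --null keep filenames with spaces intact; -T - reads the list from stdin
find . -type f ! -path './root/1/2/*' -print0 | tar cvf archive.tar --null -T -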

how do I zip a whole folder tree in unix, but only certain files?

I've been stuck on a little unix command line problem.
I have a website folder (4 GB) I need to grab a copy of, but just the .php, .html, .js and .css files (which add up to only a couple hundred KB).
I'm thinking ideally, there is a way to zip or tar a whole folder but only grabbing certain file extensions, while retaining subfolder structures. Is this possible and if so, how?
I did try doing a whole zip, then going through and excluding certain files but it seemed a bit excessive.
I'm kinda new to unix.
Any ideas would be greatly appreciated.
Switch into the website folder, then run
zip -R foo '*.php' '*.html' '*.js' '*.css'
You can also run this from outside the website folder:
zip -r foo website_folder -i '*.php' '*.html' '*.js' '*.css'
You can use find and grep to generate the file list, then pipe that into zip, e.g.:
find . | egrep "\.(html|css|js|php)$" | zip -@ test.zip
(-@ tells zip to read the file list from stdin)
This is how I managed to do it, but I also like ghostdog74's version.
tar -czvf archive.tgz `find test/ | egrep ".*\.html|.*\.php"`
You can add extra extensions by adding them to the regex.
I liked Nick's answer, but, since this is a programming site, why not use Ant to do this? :)
Then you can put in a parameter so that different types of files can be zipped up.
http://ant.apache.org/manual/Tasks/zip.html
You may want to use (GNU) find to find all your .php, .html, etc. files, then tar them up:
find /path -type f \( -iname "*.php" -o -iname "*.css" -o -iname "*.js" -o -iname "*.html" \) -exec tar -r --file=test.tar "{}" +
After that you can compress it (e.g. with gzip).
You could write a shell script to copy files based on a pattern/expression into a new folder, zip the contents and then delete the folder. Now, as for the actual syntax of it, I'll leave that to you :D.
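For reference, a rough sketch of that approach (the staging directory and the archive name site.zip are made up here, and GNU cp is assumed for --parents):
#!/bin/sh
staging=$(mktemp -d)
# copy only the wanted extensions, preserving the sub-folder structure
find . -type f \( -name '*.php' -o -name '*.html' -o -name '*.js' -o -name '*.css' \) \
    -exec cp --parents {} "$staging" \;
# zip the staged tree, then throw the staging copy away
( cd "$staging" && zip -r site.zip . )
mv "$staging/site.zip" .
rm -rf "$staging"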
