How do I zip only each subfolder's contents, rather than the subfolder itself?

I have some folders with content in them, like this:
Mainfolder
|
|---SubFolder1
|
|----ContentFiles1
|----MoreContentFiles1
|
|---SubFolder2
|
|----ContentFiles2
|----MoreContentFiles2
|
|---SubFolder3
|
|----ContentFiles3
|----MoreContentFiles3
and would like to zip the contents like this:
Subfolder1.zip
|
|----ContentFiles1
|----MoreContentFiles1
Subfolder2.zip
|
|----ContentFiles2
|----MoreContentFiles2
Subfolder3.zip
|
|----ContentFiles3
|----MoreContentFiles3
However, there's a problem: there are almost 600 of these folders that need to be zipped. I have tried various methods that almost worked but just did not make the cut, like:
for /d %%X in (*) do "F:\Program Files\7-Zip\7z.exe" a "%%X.zip" "%%X\" -mx=5 -tzip
This batch file gave me almost what I needed, but the format of those zip files was:
Subfolder1.zip
|
|--Subfolder1 (folder)
|----ContentFiles1
|----MoreContentFiles1
Subfolder2.zip
|
|--Subfolder2 (folder)
|----ContentFiles2
|----MoreContentFiles2
Subfolder3.zip
|
|--Subfolder3 (folder)
|----ContentFiles3
|----MoreContentFiles3
These were sadly unusable.
This didn't seem like too tough a task at first, but even after researching batch scripting a little and watching several different YouTube videos, I still don't have a proper solution. The problem is that many of the methods I find either simply don't work or are outdated.
Example pics (omitted): what I get with the methods I have tried vs. what I need.
Code that did not even run (either outdated or used improperly by me):
@echo off
setlocal
set zip="C:\Program Files\WinRAR\rar.exe" a -r -u
dir C:\MyPictures /ad /s /b > C:\MyPictures\folders.txt
for /f %%f in (C:\MyPictures\folders.txt) do if not exist C:\MyPictures\%%~nf.rar %zip% C:\MyPictures\%%~nf.rar %%f
endlocal
exit
and:
@echo off
setlocal
for /d %%x in (C:\MyPictures\*.*) do "C:\Program Files\7-Zip\7z.exe" a -tzip "%%x.zip" "%%x\"
endlocal
exit
both found from: Batch file to compress subdirectories
Note: I did change the directories in these batch scripts, but they did not work, and I am unsure what was meant by the "folders.txt" path.
So if anyone knows anything, or could tell me if it's not possible, that would be great :).

Use .\folder\* syntax (dot and asterisk):
for /d %%G in (*) do "F:\Program Files\7-Zip\7z.exe" a "%%G.zip" ".\%%G\*" -mx=5 -tzip
Credits to https://superuser.com/a/340062
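The same pitfall comes up on Unix, and the fix is analogous: switch into the subfolder before archiving, so the stored paths do not include the folder name. A minimal sketch (tar is used here purely for illustration, and the helper function name is invented; with Info-ZIP's zip the equivalent trick is `(cd "$d" && zip -r "../$d.zip" .)`):

```shell
#!/bin/sh
# For each immediate subdirectory of the given folder, create an archive
# of its *contents* rather than of the folder itself. tar's -C changes
# into the subdirectory first, so stored paths start at "./" instead of
# "SubFolder1/".
archive_contents() {
    parent=$1
    cd "$parent" || return 1
    for d in */; do
        d="${d%/}"                    # strip trailing slash
        tar -C "$d" -cf "$d.tar" .    # archive contents, not the folder
    done
}
```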

Related

copy first 100 files from directory which have a specific sentence inside the file

Copy the first 100 files from one directory, /ctm/logs, to another, /data/input.
These files must have the sentence "completed with error code zero" inside them.
Is there any method?
cd into your /ctm/logs directory, then use this:
grep -lr "completed with error code zero" . | head -100 | xargs -I% cp "%" /data/input
Double-check that the desired 100 files are being copied. If the ordering matters, this is better solved with a small program or script rather than a one-liner.
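A hedged variant of the same pipeline that survives file names containing spaces, using null-delimited output (this assumes GNU grep's `-Z`, GNU head's `-z`, and `xargs -0`; the helper function name is invented, and the paths are the ones from the question):

```shell
#!/bin/sh
# Copy at most 100 files containing the target sentence from $1 to $2.
# -lrZ: list matching file names, recursively, null-terminated;
# head -z: take the first 100 null-terminated names;
# xargs -0: pass them to cp safely, whatever characters they contain.
copy_first_100_matches() {
    src=$1
    dst=$2
    grep -lrZ "completed with error code zero" "$src" \
      | head -z -n 100 \
      | xargs -0 -I{} cp {} "$dst"/
}
# In the question's setting this would be called as:
#   copy_first_100_matches /ctm/logs /data/input
```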

Unix Find directories containing more than 2 files

I have a directory containing over a thousand subdirectories. I only want to 'ls' the directories that contain more than 2 files; I don't need the directories that contain fewer. This is in C shell, not bash. Does anyone know of a good command for this?
I tried this command, but it's not giving the desired output. I simply want the full list of directories with more than 2 files. One reason it isn't working is that it recurses into subdirectories of those directories to count files. I don't want a recursive search, just a list of the first-level directories in the main directory they are in.
$ find . -type f -printf '%h\n' | sort | uniq -c | awk '$1 > 2'
My mistake, I was thinking bash rather than csh. Although I don't have a csh to test with, I think this is the csh syntax for the same thing:
foreach d (*)
if (-d "$d" && `ls -1 "$d"|wc -l` > 2) echo "$d"
end
I've added a guard so that non-directories aren't unnecessarily processed, and I've included double-quotes in case there are any "funny" file or directory names (containing spaces e.g.).
One possible problem (I don't know what your exact task is): any immediate subdirectories will also count as files.
Sorry, I was working in bash here:
for d in *; do if [ $(ls -1 "$d" | wc -l) -gt 2 ]; then echo "$d"; fi; done
For a faster solution, you could try "cheating" by deconstructing the format of the directories themselves if you're on pure Unix. They're just files themselves with contents that can be analyzed in that case. Needless to say that is NOT PORTABLE, to e.g. any bash running on Windows, so not recommended.
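For completeness, a small POSIX-sh sketch of the per-directory count that avoids parsing ls output; saved as a script, it can be called from csh as well. The function name is invented, and note that `-mindepth`/`-maxdepth` are GNU/BSD find extensions, so this is hedged for Linux rather than older Solaris:

```shell
#!/bin/sh
# List the immediate subdirectories of the given directory that contain
# more than 2 entries. The find options keep the count non-recursive:
# only things directly inside each subdirectory are counted.
dirs_with_more_than_2() {
    cd "$1" || return 1
    for d in */; do
        n=$(find "$d" -mindepth 1 -maxdepth 1 | wc -l)
        if [ "$n" -gt 2 ]; then
            printf '%s\n' "${d%/}"
        fi
    done
}
```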

trouble listing directories that contain files with specific file extensions

How do I list only directories that contain certain files? I am running on a Solaris box. For example, I want to list the sub-directories of directory ABC that contain files ending in .out, .dat and .log.
Thanks
Something along these lines might work out for you:
find ABC/ \( -name "*.out" -o -name "*.dat" -o -name "*.log" \) -print | while read f
do
echo "${f%/*}"
done | sort -u
The sort -u bit could be just uniq instead, but either should work.
Should work on bash or ksh. Probably not so much on /bin/sh - you'd have to replace the variable expansion with something like echo "${f}" | sed -e 's;/[^/]*$;;' or something else that would strip off the last component of the path. dirname "${f}" would be good for that, but I don't recall if Solaris includes that utility...
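The dirname route mentioned above can be sketched like this (hedged: it assumes a POSIX sh and that dirname is available, which the answer notes may not hold on older Solaris; the function name is invented for illustration):

```shell
#!/bin/sh
# Print each directory under the given root that contains at least one
# .out, .dat or .log file, once each. dirname strips the file name,
# leaving the containing directory; sort -u removes duplicates.
dirs_with_matching_files() {
    find "$1" \( -name '*.out' -o -name '*.dat' -o -name '*.log' \) -type f \
      | while read -r f; do dirname "$f"; done \
      | sort -u
}
```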

unix command to change directory name

Hi, this is a simple question, but the solution eludes me at the moment.
I can find out the name of the folder I want to rename, and I know the command to rename a folder is mv.
So, from the current directory, if I run
ls ~/relevant.directory.containing.directory.name.i.want.to.change
I get the name of the directory, say lorem-ipsum-v1-3.
The directory name may change in the future, but it is the only directory in:
~/relevant.directory.containing.directory.name.i.want.to.change
How do I programmatically change it to a specific name like correct-files?
I can do it manually with something like
mv lorem-ipsum-v1-3 correct-files
but I want to start automating this so that I don't need to keep copying and pasting the directory name.
Any help would be appreciated.
Something like:
find . -mindepth 1 -maxdepth 1 -type d | head -n 1 | xargs -I '{}' mv '{}' correct-files
should work fine as long as only one directory needs to be moved.
If you are absolutely certain that relevant.directory.containing.directory.name.i.want.to.change only contains the directory you want to rename, then you can simply use a wildcard:
mv ~/relevant.directory.containing.directory.name.i.want.to.change/*/ ~/relevant.directory.containing.directory.name.i.want.to.change/correct-files
This can also be simplified further, using bash brace expansion, to:
mv ~/relevant.directory.containing.directory.name.i.want.to.change/{*/,correct-files}
cd ~/relevant.directory.containing.directory.name.i.want.to.change
find . -type d -print | while read -r a ;
do
mv "$a" correct-files ;
done
Caveats:
No error handling
There may be a way of reversing the parameters to mv so you can use xargs instead of a while loop, but that's not standard (as far as I'm aware)
Not parameterised
If there are any subdirectories it won't work. The depth parameters on the find command are (again, AFAIK) not standard. They do exist on GNU versions but seem to be missing on Solaris
Probably others...
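Pulling the suggestions above together, here is a hedged sketch that adds the missing error handling: it renames the lone subdirectory only when there is exactly one candidate. The function name is invented, and in the question's setting the argument would be ~/relevant.directory.containing.directory.name.i.want.to.change:

```shell
#!/bin/sh
# Rename the single subdirectory of the given parent to correct-files,
# refusing to act when there is not exactly one subdirectory.
rename_only_subdir() {
    cd "$1" || return 1
    set -- */                    # glob the immediate subdirectories
    # If nothing matched, "$1" is the literal "*/" and the -d test fails.
    if [ "$#" -ne 1 ] || [ ! -d "$1" ]; then
        echo "expected exactly one subdirectory" >&2
        return 1
    fi
    mv "${1%/}" correct-files
}
```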

How should I collect dependencies from Adobe Flex files?

I'm looking for a way to collect the file dependencies from Flex ActionScript and MXML files. I was hoping that mxmlc could spit them out (like gcc's -M option), but its option list doesn't seem to have anything relevant. I could write a parser, but would prefer not to reinvent the wheel if it has already been done, particularly given the two very different languages involved. In particular, star imports and in-package implicit imports could be troublesome.
Is there a program available to do this for me?
I suspect that since mxmlc is pretty smart and handles dependencies correctly, few people need to figure out the dependencies themselves, and so the tool may never have come into existence.
I guess parsing the import statements would be the way to go?
The -link-report option of mxmlc produces a file containing most of the appropriate information, except that it reports fake file names for embedded assets, and ignores included source files. To collect everything, I now have the following in my makefile:
.deps/%.d: .deps/%.xml
# $@: $<
grep '<script name=./' $< | cut -f2 -d'"' | cut -f1 -d'(' | cut -f1 -d'$$' | sort -u | sed -e "s|^$$(pwd)/||" > .deps/$*.f
grep '\.mxml$$' .deps/$*.f | xargs grep -H 'mx:Script source' | sed -s 's|/[^/]*.mxml:.*source="\([^"]*\)".*|/\1|;' > .deps/$*.i
for path in $$(grep -h '\.\(mxml\|as\|css\)$$' .deps/$*.[fi] | xargs grep '\bEmbed([^.)]' | \
sed "s#\\(\\w\\+\\)/.*Embed([^'\")]*['\"][./]*\\([^'\"]*\\)['\"] *[,)].*#\\1/*/\\2#"); \
do find */src -path "$$path"; done | sort -u > .deps/$*.e
cat .deps/$*.[fie] | sed -e "s|^|$(flashpath)$*.swf $@ : |" > $@
# This includes targets, so should not be before the first target defined here.
built := $(wildcard .deps/*.xml)
include $(built:xml=d)
All of the mxmlc and compc commands in the makefile now have -link-report generating an appropriately-named .xml file in the .deps directory. I still have to search files for Embed and Script directives, but the hard part (determining which classes get included) has been done for me. I could use a real parser for each step, but grep, sed, and cut work well enough for the files as given.
