Unix: only show files with a link count different than 2

I need to list all the files in the directory /etc, excluding the files that have 2 links.
I tried this command:
find /etc -links \2 -ls
But it doesn't work. Does anybody have tips? Thanks in advance.

On Unix systems, one would generally use
find /etc \! -links 2 | xargs ls -d
The ! is escaped because it can have meaning to various shells (you may not need to escape it, but doing so does no harm). POSIX does not define a -ls option, though several Unix-like systems implement one, so I used xargs (which is portable). I added the -d option because I assumed you did not want to list the contents of the various directories which have subdirectories (and therefore more than 2 links).
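If any of the names under /etc could contain whitespace, a POSIX-safe variant of the same idea is to let find run ls itself rather than going through xargs (just a sketch, same result):
find /etc \! -links 2 -exec ls -ld {} +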
The -not predicate is not a POSIX find feature (and this was tagged "unix", not "linux").
For reference:
POSIX find
AIX find
HP-UX find
Solaris find
GNU find
FreeBSD find
OSX find

Just use the -not predicate to not list the files that have 2 links:
find /etc -not -links 2 -ls
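For what it's worth, -not (where it exists, e.g. in GNU and BSD find) is just a synonym for the POSIX !, so essentially the same command can be spelled portably as:
find /etc \! -links 2 -ls
keeping in mind that -ls itself is an extension, as noted above.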

Related

How to list and delete directories that are not symbolic links?

I see plenty of answers on how to list all symlinks and how to remove all symlinks within a specific directory. But what about the other way around?
How would one go about listing/removing all directories within a directory that are not symlinks?
I know that rm -R removes all directories recursively, but I want to know how to keep it from deleting symlinks in the process.
I also know that ls lists all directories, files and symlinks; however, I would like to know how to list only the directories that are not symbolic links.
Found a way finally.
First, run:
find . -depth -type d
to make sure the output looks sane, then:
sudo find . -depth -type d -exec rm -rf '{}' \;
Sure, this gets a bit messy to look through on the console, but ... it works! If anyone can find a better and cleaner way to do this, please post it.
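If you only want to list the real (non-symlink) directories rather than delete them, a sketch like the following works, since -type d does not match symbolic links unless you tell find to follow them; note that -mindepth/-maxdepth are GNU/BSD extensions, not POSIX:
find . -mindepth 1 -maxdepth 1 -type d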

Deleting everything in directory except for two specific folders

I'm trying to delete everything within a directory except for two folders that I know the names of. Let's say that the two folders are called "dont_delete1" and "dont_delete2". Within the current directory, other folders and files also exist.
I have tried
rm -r !(dont_delete1|dont_delete2)
but that requires me to run shopt -s extglob, which, due to certain restraints, I can't use.
So I turned to
find . \! -name [folder name] -delete
I've tested it out on a single folder and it works. But I can't figure out a way to use the above command for multiple folders. I've tried all sorts of commands that I thought would work but was unsuccessful.
Lots of different solutions; here's one that's easy to understand:
$ ls | grep -v dont_delete1 | grep -v dont_delete2 | xargs rm -rf
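If any of the names might contain spaces (which would break the ls | grep | xargs pipeline), here is a find-based sketch of the same idea, assuming GNU or BSD find for -mindepth/-maxdepth:
find . -mindepth 1 -maxdepth 1 ! -name dont_delete1 ! -name dont_delete2 -exec rm -rf {} +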

Tar creating a file that is unexpectedly large

Figured maybe someone here might know what's going on. Essentially what I have to do is take a directory and make a tar file omitting a subdirectory two levels down (root/1/2). Given that it needs to work on a bunch of platforms, the easiest way I could think of was to do a find and egrep that directory out, which works well, giving me the list of files.
But then I pipe that file list into an xargs tar rvf command and the resulting file comes out at something like 33 GB. I've tried outputting the find to a file and using tar -T with that file as input; it still comes out to about 33 GB, whereas a straight tar of the whole directory (not omitting anything) comes in where I'd expect it, at around 6 GB.
Any thoughts on what is going on, or how to remedy this? I really need to get this figured out. I'm guessing it has to do with feeding it a list of files vs. having it just tar a directory, but I'm not sure how to fix that.
Your find command will return directories as well as files.
Consider using find to look for directories and to exclude some:
tar cvf /path/to/archive.tar $(find suite -type d ! -name 'suite/tmp/Shared/*')
When you specify a directory in the file list, tar packages the directory and all the files in it. If you then list the files in the directory separately, it packages the files (again). If you list the sub-directories, it packages the contents of each subdirectory again. And so on.
If you're going to do a files list, make sure it truly is a list of files and that no directories are included.
find . -type f ...
The ellipsis might be find options to eliminate the files in the sub-directory, or it might be a grep -v that eliminates them. Note that -name normally only matches the last component of the name. GNU find has ! -path '*/subdir/*' or variants that will allow you to eliminate the file based on path, rather than just name:
find . -type f ! -path './root/1/2/*' -print
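Putting that together with your tar -T approach, one possible sketch (tar's -T option is a GNU tar/bsdtar feature, and the path is the one from your question) would be:
find . -type f ! -path './root/1/2/*' -print > filelist.txt
tar cvf archive.tar -T filelist.txt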

How do I zip a whole folder tree in Unix, but only certain files?

I've been stuck on a little Unix command-line problem.
I have a website folder (4 GB) I need to grab a copy of, but just the .php, .html, .js and .css files (which amount to only a couple hundred KB).
I'm thinking that, ideally, there is a way to zip or tar a whole folder while only grabbing certain file extensions and retaining the subfolder structure. Is this possible, and if so, how?
I did try doing a whole zip, then going through and excluding certain files, but it seemed a bit excessive.
I'm kinda new to Unix.
Any ideas would be greatly appreciated.
Switch into the website folder, then run
zip -R foo '*.php' '*.html' '*.js' '*.css'
You can also run this from outside the website folder:
zip -r foo website_folder -i '*.php' '*.html' '*.js' '*.css'
You can use find and grep to generate the file list, then pipe that into zip
e.g.
find . | egrep "\.(html|css|js|php)$" | zip -@ test.zip
(-@ tells zip to read the file list from stdin)
This is how I managed to do it, but I also like ghostdog74's version.
tar -czvf archive.tgz `find test/ | egrep ".*\.html|.*\.php"`
You can add extra extensions by adding them to the regex.
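If any file names contain spaces, the backticks above will split them into separate words; a null-terminated sketch of the same pipeline, assuming GNU find and GNU tar for -print0 and --null -T -:
find test/ -type f \( -name '*.html' -o -name '*.php' \) -print0 |
    tar -czvf archive.tgz --null -T -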
I liked Nick's answer, but, since this is a programming site, why not use Ant to do this? :)
Then you can put in a parameter so that different types of files can be zipped up.
http://ant.apache.org/manual/Tasks/zip.html
You may want to use find (GNU) to find all your php, html, etc. files, then tar them up:
find /path -type f \( -iname "*.php" -o -iname "*.css" -o -iname "*.js" -o -iname "*.ext" \) -exec tar -r --file=test.tar "{}" +
After that you can zip it up.
You could write a shell script to copy files based on a pattern/expression into a new folder, zip the contents, and then delete the folder. Now, as for the actual syntax of it, I'll leave that to you :D.
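A rough sketch of that copy-then-zip idea (the /tmp/site_copy and /tmp/site.zip paths are just illustrative, and cp --parents is a GNU coreutils option, so this isn't portable everywhere):
mkdir /tmp/site_copy
# copy matching files, preserving the directory structure
find . -type f \( -name '*.php' -o -name '*.html' -o -name '*.js' -o -name '*.css' \) \
    -exec cp --parents {} /tmp/site_copy \;
# zip the copy from inside it so the archive paths stay relative
(cd /tmp/site_copy && zip -r /tmp/site.zip .)
rm -rf /tmp/site_copy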

Removing all swap files?

Many programs have created a huge number of swap files. They annoy me, because some of them contain sensitive information. How should I deal with them? Is this command a good idea:
find . -iname "*swp*" -exec rm '{}' \;
How should good programs handle their swap files?
If the files "annoy" you because they contain sensitive information, then you should know that simply removing the files with the rm command does not actually erase the data fro your hard drive.
I'm not really sure where your swap files are or what application is creating them. Typically swap files are created by the operating system in a specially-designated directory. For example, on my Mac:
$ ls -l /private/var/vm/
-rw------T 1 root wheel 4294967296 Mar 15 19:41 sleepimage
-rw------- 1 root wheel 67108864 Mar 15 21:10 swapfile0
$
If you want to erase the information in the swap files, you really need to overwrite them. You can do that with "dd" but you are better off doing it with srm. Unfortunately, srm defaults to overwriting each file 7 times, which is 6 times more than is necessary. (Use it with the -s option to get a single overwrite).
So if you want to use your find, use:
find . -iname "*swp*" -exec srm -s {} \;
Make sense?
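If srm is not installed, GNU shred (part of coreutils, so not available on every Unix) can do a comparable single-pass overwrite and then remove the file; a sketch:
find . -iname "*swp*" -exec shred -n 1 -u {} \;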
It depends where it's run from, but it should be fine, though I would amend the match to "*.swp" or "*swp" for a closer match.
If the programs run as your user ID, then the files they create probably aren't readable by anyone else. If they are, then you have deeper security issues.
