Find all filenames that contain a particular substring - unix

I wanted to write a command that would recursively search a folder and fetch all filenames that contain a particular piece of text. Suppose my folder contains a lot of files, two of them being largest_pallindrome_subsequence_1.cpp and largest_pallindrome_subsequence_2.cpp. Now I want to find files which have "sub" in them, so the search should return the two .cpp files mentioned above.
The thing is that I also want to restrict the search to files with a particular extension, say .txt or .cpp.
I tried using grep --include=\*{.cpp} -rnw . -e "sub" but this does not work for me.

You can do:
find ./ -name "*sub*"
or:
find ./ | grep "sub"
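The commands above match any filename containing "sub"; the question also asks to restrict results to a particular extension. With find, both conditions can be combined in one pattern (the grep attempt fails partly because --include filters which files grep searches for content, not which filenames are reported). A minimal sketch:

```shell
# Restrict the name match to .cpp files whose names contain "sub":
find ./ -type f -name '*sub*.cpp'

# Equivalently, two -name tests ANDed together:
find ./ -type f -name '*sub*' -name '*.cpp'
```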

HOW TO: Create a file with a list of files via a unix command

I need to create a file that has a list of files (the files already exist with data) using a unix command. Basically, list the paths of the existing files in a newly created text or csv file.
Syntax: /$dir/$file_name.csv
Ex: /var/data/inbound_data/fusion_po_attachments.txt (or fusion_po_attachments.csv)
This path would contain n files following the same syntax.
/var/data/inbound_data/fusion_po_attachments.txt is the main file, and it would have the content below:
/var/data/inbound_data/attachment1.csv
.
.
.
/var/data/inbound_data/attachment50.csv
How can we achieve this? Please point out if any question like this already exists. Thanks.
for i in /var/data/inbound_data/*.csv
do
echo "$i"
done > /var/data/inbound_data/fusion_po_attachments.txt
or the same as a one-liner
for i in /var/data/inbound_data/*.csv ; do echo "$i" ; done > /var/data/inbound_data/fusion_po_attachments.txt
or
find /var/data/inbound_data -maxdepth 1 -name '*.csv' > /var/data/inbound_data/fusion_po_attachments.txt
The condition -maxdepth 1 makes sure matching objects are printed only in the starting directory, not in subdirectories. If you know there aren't any subdirectories, or if you want files in subdirectories as well, you can omit this condition.
It's not entirely clear what you want, but it sounds like you want something like:
find /var/data/inbound_data/ -type f -name '*.csv' > /var/data/inbound_data/fusion_po_attachments.txt
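If the shell glob from the loop above is acceptable, the loop itself can be dropped: printf repeats its format string once per argument. A one-liner sketch using the same paths as the question:

```shell
# One path per line, same output as the for loop above:
printf '%s\n' /var/data/inbound_data/*.csv > /var/data/inbound_data/fusion_po_attachments.txt
```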

How to rename a file using a specific pattern found within the file

I would like to change a file's name according to a specific pattern found within the file. Let's say I have a unique pattern that starts with "XmacTmas". I would like to use this pattern to rename the file, adding some characters like "_dbp1".
Right now my file name is "xxo1" and I want "XmacTmas_dbp1".
How can I do this for thousands of files with some script?
Thanks
find . -name 'XmacTmas*' -exec echo mv {} {}_dbp1 \;
find locates the files of interest and executes the command, replacing {} with each found filename.
Escape the ; with \. Without the backslash, the shell would treat the ; as a command separator and find would never see it.
If only files in the current directory are needed, add -maxdepth 1 before -name (or any other of find's numerous options).
If the printed commands look right, remove the echo to actually perform the renames.
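Note that the find above matches on file names. Since the question describes a pattern inside the file's contents, here is a rough sketch of that variant. It assumes grep -o is available (GNU or BSD grep), that the wanted token is "XmacTmas" followed by word characters, and that filenames contain no newlines:

```shell
# For each regular file here, extract the first "XmacTmas..." token from
# its contents and use it to build the new name:
for f in ./*; do
    [ -f "$f" ] || continue
    tok=$(grep -o 'XmacTmas[[:alnum:]_]*' "$f" | head -n 1)
    [ -n "$tok" ] && echo mv -- "$f" "./${tok}_dbp1"
done
```

As with the find answer, remove the echo once the printed mv commands look right.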

Redirecting the output of /bin/ls

When I use ls . (/bin/ls) it returns a list of files.
When "." contains directories and I redirect the output with ls . > tmp.txt,
the file contains many symbols like below:
[1m[36m010202E[39;49m[0m
[1m[36m031403C[39;49m[0m
The directory names are 010202E and 031403C.
This txt file can be read by less, but not by vi or other editors like TextWrangler.
How can I avoid this problem?
I know there is a way to delete those characters after making tmp.txt.
It is likely that there is an alias that makes ls print its output in color. Try ls --color=none instead.
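Two hedged ways around it, assuming the color really does come from an alias (and GNU ls for the --color option):

```shell
# Bypass the alias; "command" runs the real ls binary:
command ls . > tmp.txt

# Or tell GNU ls explicitly not to emit color codes:
ls --color=never . > tmp.txt
```

On macOS, BSD ls controls color with -G and the CLICOLOR variables instead of --color, so bypassing the alias with command is the more portable route.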

How to display contents of all files under a directory on the screen using unix commands

Using the cat command as follows, we can display the contents of multiple files on screen:
cat file1 file2 file3
But if a directory contains more than 20 files, I want the contents of all of them displayed on screen without naming every file in the cat command as above.
How can I do this?
You can use the * character to match all the files in your current directory.
cat * will display the contents of all the files.
If you want to display only files with the .txt extension, you can use cat *.txt. If you want to display all the files whose names start with "file", as in your example, you can use cat file*.
If it's just one level of subdirectory, use cat * */*
Otherwise,
find . -type f -exec cat {} \;
which means run the find command, to search the current directory (.) for all ordinary files (-type f). For each file found, run the application (-exec) cat, with the current file name as a parameter (the {} is a placeholder for the filename). The escaped semicolon is required to terminate the -exec clause.
I also found it useful to print each filename before its contents:
find ./ -type f | xargs tail -n +1
It will go through all subdirectories as well.
Have you tried this command?
grep . *
It's not suitable for large files, but it works well for /sys or /proc, if that is what you meant to see.
You can use awk too. Let's say we need to print the contents of all text files in a directory some-directory:
awk '{print}' some-directory/*.txt
If you want to run more than just one command for every file, a for loop gives you more flexibility. For example, to print each filename and its contents:
for file in parent_dir/*.file_extension; do echo "$file"; cat "$file"; echo; done
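The loop above only covers one directory. A sketch that produces the same filename-then-contents output recursively, by handing the loop body to find (parent_dir and the extension are placeholders, as in the answer above):

```shell
# Recursively print each matching file's name, its contents, and a blank line:
find parent_dir -type f -name '*.file_extension' -exec sh -c '
    for f in "$@"; do
        echo "$f"
        cat "$f"
        echo
    done
' sh {} +
```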

Batch replace text inside text file (Linux/OSX Commandline)

I have hundreds of files where I need to change a portion of its text.
For example, I want to replace every instance of "http://" with "rtmp://" .
The files have the .txt extension and are spread across several folders and subfolders.
I am basically looking for a way/script that goes through every folder and subfolder and every file, and if it finds the occurrence of "http" inside a file, replaces it with "rtmp".
You can do this with a combination of find and sed:
find . -type f -name \*.txt -exec sed -i.bak 's|http://|rtmp://|g' {} +
This will create a backup of each file. I suggest you check a few to make sure it did what you want; then you can delete the backups using:
find . -name \*.bak -delete
Here's a zsh function I use to do this:
change () {
    from=$1
    shift
    to=$1
    shift
    for file in $*
    do
        perl -i.bak -p -e "s{$from}{$to}g;" $file
        echo "Changing $from to $to in $file"
    done
}
It makes use of the nice Perl mechanism to create a backup file and modify the nominated file. You can use the above to iterate through files thus:
zsh$ change http:// rtmp:// **/*.html
or just put it in a trivial #!/bin/zsh script (I just use zsh for the powerful globbing)
