I need to scan through a file which has some unix shell commands and their output. I need to extract the list of unix commands mentioned throughout the file and write them to a different file. One way to achieve this is to scan the file for a specific list of commands and, if present, redirect them to a different file. But this gets difficult as the list keeps growing. Any other ideas along these lines?
TIA
You can get a list of all commands available in bash with compgen. If you want to use a whitelist approach, you could store the output of compgen -ac (aliases and commands) in a file and then check each token in your input file against that list.
More details on the usage of compgen can be found in this answer.
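A minimal sketch of that whitelist idea, assuming your transcript is in a file like `session.log` (the file names here are placeholders, not from the question):

```shell
#!/bin/sh
# Stand-in for the real transcript of commands and output.
printf 'ls -l /tmp\ntotal 0\ngrep pattern file.txt\n' > session.log

# Whitelist: every alias and command bash knows about, sorted once.
bash -c 'compgen -ac' | sort -u > all_commands.txt

# Split the transcript into one token per line, de-duplicate, and
# keep only the tokens that appear in the whitelist.
tr -s '[:space:]' '\n' < session.log | sort -u |
  comm -12 - all_commands.txt > found_commands.txt
```

Note this treats every whitespace-separated token as a candidate, so a word in the output that happens to be a command name (e.g. `time`) will also be listed; filtering only the first token of each prompt line would tighten that up.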
For example, I have in the same directory some scripts like script1.sh and script2.sh, and then an output.vcf (bioinformatics stuff, but I guess it doesn't matter). I am sure one of those scripts created the output file, but I don't know which of them.
Is there any way to figure it out?
Thank you!
IMHO you can't get this information after the fact. But each UNIX has its own audit subsystem, and if you activate it you can see which file operation (in this case, file creation) was performed by which program (shell script).
Actually there is a way. You can browse the scripts and search for the filename in question. There will be a problem if both scripts contain this filename.
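That search can be done in one line with `grep -l`, which prints only the names of the files that contain a match (the script contents below are hypothetical):

```shell
# Hypothetical script contents, just to demonstrate the search.
printf 'bcftools view in.bam > output.vcf\n' > script1.sh
printf 'echo "no output here"\n' > script2.sh

# -l lists only the matching file names, not the matching lines.
grep -l 'output.vcf' script1.sh script2.sh   # prints: script1.sh
```

If the script builds the filename dynamically (e.g. from a variable), searching for fragments like `.vcf` is a better bet.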
I have two files that come in daily to a shared drive. When they are posted, they come in with the current date as part of the file name, for example dataset1_12517.txt and dataset2_12517.txt; the next day they post as dataset1_12617.txt and so on. They are pipe-delimited files, if that matters.
I am trying to automate a daily merge of these two files to a single excel file that will be overwritten with each merge (file name remains the same) so my tableau dashboard can read the output without having to make a new connection daily. The tricky part is the file names will change daily, but they follow a specific naming convention.
I have access to R Studio. I have not started writing code yet so looking for a place to start or a better solution.
On a Windows machine, use the copy or xcopy commands. There are several variations on how to do it. The gist of it, though, is that if you supply the right switches, the source file will append to the destination file.
I like using xcopy for this. Supply the destination file name and then a list of source files.
This becomes a batch file and you can run it as a scheduled task or on demand.
This is roughly what it would look like. You may need to check the online docs to choose the right parameter switches.
xcopy C:\SRC\source_2010235.txt newfile.txt /s
As you play with it, you may even try using a wildcard approach.
xcopy C:\SRC\*.txt newfile.txt /s
See Getting XCOPY to concatenate (append) for more details.
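If you end up scripting this in a POSIX-ish shell instead (e.g. Git Bash or WSL on the same Windows box), the same idea is just building the date-stamped names and concatenating. This is a sketch under assumptions: the `%-m%-d%y` format (a GNU date extension, no leading zeros) matches names like dataset1_12517.txt for 1/25/17, and the input files here are stand-ins:

```shell
#!/bin/sh
# Build today's date stamp in the files' naming convention
# (assumed: month, day, 2-digit year, no leading zeros).
stamp=$(date +%-m%-d%y)

# Stand-ins for the real daily files.
printf 'col1|col2\na|b\n' > "dataset1_${stamp}.txt"
printf 'col1|col2\nc|d\n' > "dataset2_${stamp}.txt"

# Overwrite the same merged file every day so the dashboard's
# connection never changes; plain concatenation keeps the pipe
# format (drop the second header row if both files carry one).
cat "dataset1_${stamp}.txt" "dataset2_${stamp}.txt" > merged.txt
```

Scheduled daily (Task Scheduler or cron), this gives the stable single output file the dashboard can point at.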
The rm command removes only the reference and not the actual data from the disk, so the data can be retrieved later. Is there any command that deletes the reference and the data at the same time?
It really depends on what you need.
If you need to reclaim the storage space without waiting for all the processes that hold the file open to close it or die, invoke truncate -s 0 FILENAME to free the data, and then remove the file with a regular rm FILENAME. The two operations will not be atomic, though, and the programs that have already opened the file can fail in various ways (including a segmentation fault, if they have mapped some portions of the file into memory). However, given that you intend to delete both the file and its contents, there is no general way to prevent the programs that depend on the contents from failing.
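The two-step sequence looks like this (the file name is a placeholder; again, the pair is not atomic):

```shell
# Stand-in for the real oversized file.
printf 'lots of data' > huge.log

truncate -s 0 huge.log   # data blocks released immediately
rm huge.log              # directory entry removed afterwards
```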
If your goal is for the data not to be retrievable by forensic analysis after removal, use a command such as shred, which is specifically designed to overwrite the data before removing the file. But pay close attention to the limitations of such tools before trusting them to reliably destroy sensitive data.
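For example (the file name is a placeholder; per shred(1), the overwrite may not reach the old blocks on journaling or copy-on-write filesystems and on SSDs):

```shell
# Stand-in for the real sensitive file.
printf 'sensitive' > secret.txt

# -u removes the file after overwriting; -z finishes with a pass
# of zeros to hide that shredding took place.
shred -uz secret.txt
```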
If, as your comment suggests, you are on OSX, you can use srm to do "secure removals".
SRM(1) SRM(1)
NAME
srm - securely remove files or directories
SYNOPSIS
srm [OPTION]... FILE...
DESCRIPTION
srm removes each specified file by overwriting, renaming, and truncating it before unlinking. This prevents other people from undeleting or recovering any information about the file from the command line.
Online manpage is here.
Alternatively, shred is available within the GNU coreutils, which you can easily install on OS X using homebrew, with the command
brew install coreutils
I am trying to create named pipe in a directory which is created under clearcase's vobs tree (/vobs/something/something) but not checked-in. I am getting this error:
"mkfifo: No such device or address"
I am not able to understand why pipe creation is failing while other files are getting created.
I am using Solaris 10. Is there any way I can create named pipes in vobs?
/vobs/something/something means an MVFS path with a view set (as in cleartool setview).
First, try the same operation with the full path instead of setting a view. As I explain in "Python and ClearCase setview", setting a view creates a sub-shell, with all kinds of side effects for your processes (in terms of environment variables and other non-inherited attributes).
So try it in /views/MyView/vobs/something/something.
Second, regarding pipe, check if this thread applies to your case:
Just off the top of my head, if you are using a pipe and not a file, then it should be specified something like this:
destination my_pipe pipe("/data/pipes/net_pipe");
rather than
destination my_file file("/data/pipes/net_pipe");
Note that, for ClearCase up to 7.0.x:
ClearCase does not support adding to source control special files such as named pipes, fifos or device files. There are no type mangers available to manage these special files.
Note: Attempts to execute these files in the MVFS are not supported.
WORKAROUNDS:
Keep multiple versions of directories with device files outside of a VOB and versioned directories/symlinks in a VOB to point to correct directory location outside the VOB.
Keep a tar or zip archive of the tree with device files in the VOB, and extract it to a temporary workspace when needed in the development process.
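A sketch of the second workaround, with hypothetical paths: GNU tar can archive and restore FIFOs as special files, so the archive (not the pipe itself) is what goes under version control, and the tree is unpacked outside the VOB when needed:

```shell
#!/bin/sh
# Build a tree containing the named pipe (paths are placeholders).
mkdir -p tree/pipes && mkfifo tree/pipes/net_pipe

# Archive the tree; special-files.tar is what you'd check in.
tar -cf special-files.tar tree
rm -r tree

# Later, in the development process: extract to a workspace
# outside the VOB and verify the fifo came back.
mkdir -p /tmp/workspace && tar -xf special-files.tar -C /tmp/workspace
test -p /tmp/workspace/tree/pipes/net_pipe && echo "fifo restored"
```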
I have a batchfile with SFTP instruction to download txt files (cron job), basically: get *.txt
Wondering what the best method is to delete those files after they have been downloaded. The only problem is that the directory is constantly being updated with new files, so running rm *.txt afterwards won't work — it could delete files that arrived after the download.
I've thought of a couple complex ways of doing this, but no command line based methods. So, I thought I'd shoot a query out to you guys, see if there's something I haven't thought of yet.
I suggest making a list of all the files that were downloaded and then issuing ftp delete/mdelete commands with the exact file names.
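A sketch of that, assuming the nightly job has already fetched the files into a local `downloaded/` directory ("user@host" and "incoming" are placeholders for your server and remote path):

```shell
#!/bin/sh
# Stand-ins for files the get *.txt batch already pulled down.
mkdir -p downloaded
touch downloaded/report1.txt downloaded/report2.txt

# Build a batch of rm commands for exactly the files we downloaded,
# so anything uploaded after the get is left untouched.
{
  printf 'cd incoming\n'
  for f in downloaded/*.txt; do
    printf 'rm %s\n' "$(basename "$f")"
  done
} > delete.batch

# Then run the batch against the server, e.g.:
#   sftp -b delete.batch user@host
```

The key point is that the deletions are driven by the local listing of what was actually received, not by a fresh wildcard against the remote directory.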