I have a requirement where I need to make an SFTP connection to a remote server, get the size of a file on that server, and, depending on the size, fetch the file onto the local server.
Is there any command in SFTP to get the size of a file?
If you'd like the size output to be human-readable, try: ls -lah
You can get the file size of remote files using the ls command by passing flags:
To get the size of a file: ls -l
To get the size of a file (hidden files included): ls -al
To get it in human-readable format: ls -lh or ls -alh
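For example, here is a minimal non-interactive session (user, host, and path are placeholders) that runs ls -l through sftp; the size is the fifth column of the output:
sftp user@remote.example <<'EOF'
ls -l /path/to/file.txt
EOF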
You can get just the size by combining James's answer with awk:
ls -l | grep "filename" | awk '{print $5}'
If you are using it in a script and want to branch on the size, you can store the file size in a variable like so:
varname=$(ls -l | grep "filename" | awk '{print $5}')
Then call sftp and carry out the transfer based on that value.
For a remote file, something like this should work (the quoted heredoc delimiter stops the local shell from expanding $5):
filesize=$(ssh user@domain.ex <<'EOT'
ls -l | grep "filename" | awk '{print $5}'
EOT
)
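Putting it together for the original requirement, here is a rough sketch (the host, the paths, and the 10 MB threshold are assumptions for illustration) that downloads the file only when it is below the size limit:
filesize=$(ssh user@domain.ex <<'EOT'
ls -l /remote/path/file.txt | awk '{print $5}'
EOT
)
limit=$((10 * 1024 * 1024))   # hypothetical 10 MB cut-off
if [ "$filesize" -lt "$limit" ]; then
    sftp user@domain.ex <<'EOF'
get /remote/path/file.txt /local/path/file.txt
EOF
fi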
I am using the command below to search through a stream of data. The log file is generated by a long-running process, and the logging mechanism keeps rotating it: the current file 'aFile.log' is renamed to 'aFile.log.1' based on certain criteria (such as file size or age), and a new 'aFile.log' is created. The following command just hangs at that point. Is there a workaround for this?
tail -f aFile.log | grep aString
Use -F instead of -f to track the file by name, so tail reopens it after it is rotated.
tail -F aFile.log | grep aString
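With GNU coreutils, -F is shorthand for --follow=name --retry, so tail keeps retrying the name after rotation. If matches seem delayed when piping into grep, GNU grep's --line-buffered flushes each matching line immediately:
tail -F aFile.log | grep --line-buffered aString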
I have a file FILE1.TXT. It contains only one file name, FILE2.TXT.
How will I find the record count / line count of FILE2.TXT using only FILE1.TXT? What I have already tried is:
cat FILE1.TXT | wc -l
But the above command did not work.
Actually, I need to display the output as below:
File name is FILE2.TXT and the count is 2.
What I have already tried is (using the below statement inside a script file):
echo "File name is "`cat FILE1.TXT`" and the count is " `wc -l < $(cat FILE1.TXT)`
But the above command did not work either and gave this error:
syntax error at line 1: `(' unexpected
For a POSIX-compliant shell:
wc -l $(cat FILE1.TXT)
or, with Bash:
wc -l $(<FILE1.TXT)
These will both report the file name (and will also work if there are multiple file names in FILE1.TXT). If you don't want the file name reported (and there is only one name in the file), you could use:
wc -l < $(cat FILE1.TXT)
wc -l < $(<FILE1.TXT)
file=$(grep -o "FILE2.TXT" FILE1.TXT)
wc -l < "$file"
I created an lftp script to upload single files to a web hosting provider.
The use case is that I call it from the repository root, so the relative path is the same here as on the remote server.
#!/bin/bash
DIRNAME=$(dirname "$1")
FILENAME=$(basename "$1")
REPO_ROOT=$(pwd)
ABSOLUTE_PATH="${REPO_ROOT}/$1"
lftp -u user,passwd -p port sftp://user@hosting <<EOF
cd $DIRNAME
put $ABSOLUTE_PATH
ls -l $FILENAME
quit 0
EOF
It works, with one small but annoying bug. To check that it really uploads the file, I have put an ls -l at the end. It fails and I do not understand why:
ls: Access failed: No such file(functions.php)
I tried rels and cache flush, but in vain. I'm using lftp 4.0.9.
Some googling finally turned up an explanation in a mailing-list archive:
It is a limitation of the SFTP protocol implementation in lftp. It cannot list a single file, only a specific directory.
Fortunately, lftp allows pipes, so
ls -l | grep "$FILENAME"
solves the problem.
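So the tail of the heredoc in the script above can be rewritten like this (same script, only the ls line changes):
cd $DIRNAME
put $ABSOLUTE_PATH
ls -l | grep "$FILENAME"
quit 0
EOF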
GOAL: to get a list of the files occupying the most space on a Unix server.
I am using the command below:
ssh serverName du /folderName/* | grep -v 'cannot' | sort -nr | head -10
sort -nr treats the sizes as numeric and sorts in reverse order, so the largest entries come first.
grep -v 'cannot' is there because I have no access to a few folders, and those error lines must be dropped before sorting.
Below is the sample output
624 /folder1/folder2/conf
16 /folder1/folder2/error/include
192 /folder1/folder2/error
284 /folder1/folder2/htdocs
264 /folder1/folder2/icons/small
du: cannot read directory `/folder1/folder2/file1': Permission denied
du: cannot read directory `/folder1/folder2/file3': Permission denied
The grep and sort commands are not behaving as expected: the error messages are not being filtered out.
You need to redirect stderr to stdout using 2>&1 so that you can grep out the error messages. You should also escape the wildcard so that it gets expanded on the remote machine, not on the local one.
ssh serverName du /folderName/\* 2>&1 | grep -v 'cannot' | sort -nr | head -10
You don't need the grep if you close stderr.
ssh serverName du /folderName/\* 2>&- | sort -nr | head -10
Note that the wildcard is escaped.
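Alternatively (an assumption on my part, not from the answers above), you can discard stderr on the remote side instead of closing it locally, since some programs misbehave when their stderr is closed; single-quoting the remote command keeps both the wildcard and the redirection on the remote machine:
ssh serverName 'du /folderName/* 2>/dev/null' | sort -nr | head -10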
I was given this syntax by user phi:
find . | awk '!/((\.jpeg)|(\.jpg)|(\.png))$/ {print $0;}' | xargs grep "B206"
I would like to suppress the grep: can't open ... and find: cannot open ... lines from the results. Sample output to be ignored:
grep: can't open ./cisc/.xdbhist
find: cannot open ./cisc/.ssh
Have you tried redirecting stderr to /dev/null?
2>/dev/null
This redirects stream number 2 (which is stderr) to /dev/null. The exact syntax is shell-dependent, but the above works in most Bourne-style shells. Because find and grep are separate processes, you may have to redirect for both, or (perhaps) execute in a subshell, e.g.
find ... 2>/dev/null | xargs grep ... 2>/dev/null
Here's a reference to some documentation on bash redirection. Unless you're using csh, this should work in most shells.
The -s flag for grep will also suppress these messages; it silences errors about nonexistent or unreadable files.
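Combining both suggestions, the original pipeline could be silenced like this:
find . 2>/dev/null | awk '!/((\.jpeg)|(\.jpg)|(\.png))$/ {print $0;}' | xargs grep -s "B206"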