UNIX basic ftp upload

I'm trying to get terminal to upload a file for me, in this case: file.txt
Unfortunately, it won't work, no matter what I try.
#!/bin/bash
HOST=*
USER=*
PASS=*
# I'm 100% sure the host/user/pass are correct.
#Terminal also connects with the host provided
ftp -inv $HOST << EOF
user $USER $PASS
cd /Users/myname/Desktop
get file.txt #which is located on my desktop
bye
EOF
I've tried 100 different scripts but it just won't upload :(
This is the output after saving the script as an .sh file, running chmod +x on it, and executing it with sudo:
Connected to *hostname*.
220 ProFTPD 1.3.4b Server ready.
331 Password required for *username*
230 User *username* logged in
Remote system type is UNIX.
Using binary mode to transfer files.
550 /Users/myname/Desktop: No such file or directory
local: file.txt remote: file.txt
229 Entering Extended Passive Mode (|||35098|)
550 file.txt: No such file or directory
221 Goodbye.
myname:Desktop Myname$
I've browsed through many other topics about the same issue here, but I just can't figure it out. I only started playing with UNIX this morning, so excuse me for this (probably) foolish question.

Try:
#!/bin/bash
HOST=*
USER=*
PASS=*
# I'm 100% sure the host/user/pass are correct.
#Terminal also connects with the host provided
cd /Users/myname/Desktop # Go to the local folder the file is located in
ftp -inv $HOST << EOF
user $USER $PASS
cd /User/$USER/Desktop # Go to the folder in which you want to upload the file
put file.txt #which is located on my desktop
bye
EOF
So use put, and make sure your file is in the current working directory and that the remote directory exists.
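Note that everything inside the here document is passed to ftp as-is, so the trailing # comments travel along with the commands; depending on the client, put file.txt #which is located on my desktop may even be taken as a request to store the file under the remote name #which. A safer variant keeps the comments on the shell side and sends ftp only bare commands:
# upload file.txt from the current local directory
cd /Users/myname/Desktop
ftp -inv $HOST << EOF
user $USER $PASS
cd /User/$USER/Desktop
put file.txt
bye
EOF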

You are using get but talk about an upload. Probably you just want to use put?
Anyway, I'm not sure this can be done with the basic ftp client. I always use ncftp for things like this. It comes with command-line utilities like ncftpput which accept command-line arguments and options to perform the task.
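For example, a one-shot upload with ncftpput might look like this (a sketch; /remote/target/dir is a made-up remote directory and the other placeholders are the ones from the question):
ncftpput -u "$USER" -p "$PASS" "$HOST" /remote/target/dir /Users/myname/Desktop/file.txt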

Alfe is right, you need to use put <filename> to upload a file to FTP. You can find a quick guide here. It should be possible using the basic FTP tool but I would also recommend ncftp :-)

You need to use put to upload a file.

Related

how to do ftp which will not ask for username and password in a shell script?

I tried it like this:
#!/bin/bash
hostname="xx.xx.xx.xx"
username="ftp"
password="123456"
ftp $username:$password@$hostname <<EOF
read filename
put $filename
quit
EOF
The error is coming as below:
ftp: ftp:123456@10.64.40.11: Name or service not known
?Invalid command
Not connected.
If my question is too easy, please don't bother to answer.
I am a beginner and trying to learn. Any help will be appreciated.
The problem you're running into is that the default FTP client doesn't allow you to specify the user and password along with the host at the command line like that. It's looking for a host named ftp:123456@10.64.40.11, which clearly wouldn't exist. You can specify the host at the command line, but that's it. This situation and its solution are well described in this article, which contains other versions and examples.
The basic idea is to turn off "auto-login" with -n and specify the user and password inside the HERE document instead of at the command line:
#!/bin/bash
hostname="xx.xx.xx.xx"
username="ftp"
password="123456"
read filename
ftp -n $hostname <<EOF
user $username $password
put $filename
quit
EOF
(Notice that read filename has been moved out of the here document: read is a shell builtin, so it has to run before ftp starts; left inside the here document it is just handed to ftp as an invalid command, which is part of the output you were seeing.)
There are other FTP clients that allow for user and password specification at the command line (such as ncftp), but using the one you have seems the simplest option.

What causes the error "Couldn't canonicalise: No such file or directory" in SFTP?

I am trying to use SFTP to upload an entire directory to a remote host, but I got an error. (I know SCP does work, but I really want to figure out the problem with SFTP.)
I used the command as below:
(echo "put -r LargeFile/"; echo quit)|sftp -vb - username#remotehost:TEST/
But I got the errors "Couldn't canonicalise: No such file or directory" and "Unable to canonicalise path '/home/s1238262/TEST/LargeFile'".
I thought it was caused by access rights, so I opened an SFTP connection to the remote host in interactive mode and tried to create a new directory "LargeFile" in TEST/, and I succeeded. Then I used the same command as above to upload the entire directory "LargeFile", and I also succeeded. The subdirectories in LargeFile were created or copied automatically.
So, I am confused. It seems only the LargeFile/ directory cannot be created in non-interactive mode. What's wrong with it or my command?
With SFTP you can only copy into a directory that already exists. So
> mkdir LargeFile
> put -r path_to_large_file/LargeFile
Same as the advice in the link from @Vidhuran but this should save you some reading.
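If you want the whole thing in the same non-interactive form as the question, a sketch combining the two (assuming TEST/ already exists on the remote side, as in the question) would be:
(echo "mkdir LargeFile"; echo "put -r path_to_large_file/LargeFile"; echo quit) | sftp -b - username@remotehost:TEST/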
This error could possibly occur because of the -r option. Refer to https://unix.stackexchange.com/questions/7004/uploading-directories-with-sftp
A better way is to use scp:
scp -r LargeFile/ username@remotehost:TEST/
The easiest way for me was to zip my folder locally into LargeFile.zip and simply put LargeFile.zip:
zip -r LargeFile.zip LargeFile
sftp www.mywebserver.com (or the IP of the web server)
put LargeFile.zip (it will end up in the current directory on the remote server)
unzip LargeFile.zip (run this on the remote server afterwards, e.g. over ssh; sftp itself cannot unzip)
If you are using Ubuntu 14.04, its sftp client has a bug: if a trailing '/' is added to the directory name, you will get the "Couldn't canonicalize: Failure" error.
For example:
sftp> cd my_inbox/ ##will give you an error
sftp> cd my_inbox ##will NOT give you the error
Notice how the forward-slash is missing in the correct request. The forward slash appears when you use the TAB key to auto-populate the names in the path.

How to transfer a file using sftp in UNIX

I want to transfer a .png file from a directory on my computer to a directory on a remote server.
I have to use SFTP to secure the file transfer. And I already have a UNIX script (.ksh) file to copy the files in the normal mode. How do I implement the transfer in SFTP mode?
Use sftp instead of whatever command you are using in your .ksh script. See the sftp man page for reference.
You may also want to look at scp (secure copy) - see the scp man page.
EDIT
sftp is mostly for interactive operations; you need to specify the host you want to connect to:
sftp example.com
you will be prompted for a username and password, and the interactive session will begin.
Although it can be used in scripts, scp is much easier to use:
scp /path/to/localfile user@host:/path/to/dest
you will be prompted for a password.
Edit 2
Both scp and sftp use ssh as the underlying protocol; see this and this.
The best way to set them up to run from scripts is to set up passwordless authentication using keys. See this and this. I use this extensively on my servers. After you set up keys, you can run
scp -i private-key-file /path/to/local/file user@host:/path/to/remote
sftp -oIdentityFile=private-key-file -b batch-file user@host
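For completeness, the key setup itself is usually just two commands (a sketch assuming the standard OpenSSH client tools; on systems without ssh-copy-id you can append the public key to ~/.ssh/authorized_keys on the server by hand):
ssh-keygen -t rsa -f ~/.ssh/id_rsa          # generate a key pair (use a passphrase or leave it empty)
ssh-copy-id -i ~/.ssh/id_rsa.pub user@host  # install the public key on the remote account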
If you want to authenticate with a password, you may try the expect package. The simplest script may look like this:
#!/usr/bin/expect
spawn sftp -b batch-file user@host
expect "*?assword:*"
send "password\r"
interact
See this, this and this for more info.
Send commands through sftp on one line:
Make a file and save it as my_batch_file:
cd /root
get blah.txt
bye
Run this to execute your batch file:
eric@dev /home/el $ sftp root@10.30.25.15 < my_batch_file
Connecting to 10.30.25.15...
Password:
sftp> cd /root
sftp> get blah.txt
Fetching /root/blah.txt to blah.txt
sftp> bye
The file is transferred
That moved blah.txt from the remote computer to the local computer.
If you don't want to specify a password, do this:
How to run the sftp command with a password from Bash script?
Or if you want to do it the hacky insecure way, use bash and expect:
#!/bin/bash
expect -c "
spawn sftp username@your_host
expect \"Password\"
send \"your_password_here\r\"
interact "
You may need to install expect, and change 'Password' to a lowercase 'p' to match what your prompt actually shows. The problem here is that it exposes your password in plain text in the file as well as in the command history, which nearly defeats the purpose of having a password in the first place.

ftp mget not working when used in a script

I am trying to get a number of files from a Unix machine using an MS DOS ftp script (Windows 7). I am new to this so I have been trying to modify an on-line example. The code is as follows:
@echo off
SETLOCAL
REM ##################################
REM Change these parameters
set FTP_HOST=host
set FTP_USER=user
set FTP_REMOTE_DIR=/users/myAcc/logFiles
set FTP_REMOTE_FILE=*.log
set FTP_LOCAL_DIR=C:\Temp
set FTP_TRANSFER_MODE=ascii
REM ##################################
set FTP_PASSWD=password
set SCRIPT_FILE=%TEMP%\ftp.txt
(
echo %FTP_USER%
echo %FTP_PASSWD%
echo %FTP_TRANSFER_MODE%
echo lcd %FTP_LOCAL_DIR%
echo cd %FTP_REMOTE_DIR%
echo prompt
echo mget %FTP_REMOTE_FILE%
) > %SCRIPT_FILE%
ftp -s:%SCRIPT_FILE% %FTP_HOST%
del %SCRIPT_FILE%
ENDLOCAL
However, when I run this the mget command fails and the following output is given:
Note: the output from the rest of the script shows that all of the previous steps are working as expected. I have even added ls commands to verify the script is in the correct directory.
...
ftp> mget *.log
200 Type set to A; form set to N.
mget logFile1_SystemOut_22-01-13.log? mget logFile2_SystemOut_22-01-13.log? mget
logFile3_SystemOut_22-01-13.log? ftp>
I have run through this manually repeating the exact same steps and it works fine - no problems and the files are successfully transferred to the C:\Temp directory.
I have checked numerous forums and other websites and I can't see any reason why it should behave like this. Any pointers as to why this doesn't work in the script would be great!
Thanks
The usual option for turning off the prompt generated by ftp mget is
ftp -i
By default ftp waits with a prompt for each file found by the mget "wildcard" string you generate in your script.
I call ftp scripts on Windows like this:
ftp -i -s:%SCRIPT_FILE% %FTP_HOST%
This is because ftp -si:%SCRIPT_FILE% %FTP_HOST% doesn't work.
I guess it's the same on unix - the switches have to be separated.
ftp -i worked for me.
Change ftp -s:%SCRIPT_FILE% %FTP_HOST% to ftp -si:%SCRIPT_FILE% %FTP_HOST% in your script, as @jim mcnamara suggested.
ftp -i -s:%SCRIPT_FILE% %FTP_HOST% worked for me, too.
Another option is to toggle prompting with the prompt command in the ftp script before you invoke mget (which you already do), and I've also read about mget -i somewhere.
But note: prompt in the ftp script toggles prompting, so it switches it back on if it was already off. Use either ftp -i OR prompt, but not both!
You can check whether your script works otherwise by echoing a few y's in the ftp script after mget, so it answers yes to the prompts as they come up.
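Putting that together with the script from the question, a sketch of the corrected ending (dropping the echo prompt line and passing -i on the command line instead; all placeholders are the question's own) would be:
(
echo %FTP_USER%
echo %FTP_PASSWD%
echo %FTP_TRANSFER_MODE%
echo lcd %FTP_LOCAL_DIR%
echo cd %FTP_REMOTE_DIR%
echo mget %FTP_REMOTE_FILE%
echo bye
) > %SCRIPT_FILE%
ftp -i -s:%SCRIPT_FILE% %FTP_HOST%
del %SCRIPT_FILE%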

Get list of files via http server using cli (zsh/bash)

Greetings to everyone,
I'm on OSX. I use the terminal a lot, a habit from my old Linux days that I never got past. I wanted to download the files listed on this HTTP server: http://files.ubuntu-gr.org/ubuntistas/pdfs/
I selected them all with the mouse, put them in a txt file, and then gave the following command in the terminal:
for i in `cat ../newfile`; do wget http://files.ubuntu-gr.org/ubuntistas/pdfs/$i;done
I guess it's pretty self-explanatory.
I was wondering if there's any easier, better, cooler way to download these "linked" pdf files using wget or curl.
Regards
You can do this with one line of wget as follows:
wget -r -nd -A pdf -I /ubuntistas/pdfs/ http://files.ubuntu-gr.org/ubuntistas/pdfs/
Here's what each parameter means:
-r makes wget recursively follow links
-nd avoids creating directories so all files are stored in the current directory
-A restricts the files saved by type
-I restricts by directory (this one is important if you don't want to download the whole internet ;)
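Since the question also mentions curl: curl has no recursive mode, but if you keep the list-of-names approach from the question, a short loop does the job (a sketch assuming ../newfile contains one bare PDF file name per line):
while read -r f; do curl -fO "http://files.ubuntu-gr.org/ubuntistas/pdfs/$f"; done < ../newfile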
