Raw ftp hangs with big files - unix

I am using the ftp command line to send big video files (about 500 MB) to 6 remote servers.
On 4 servers, everything goes well.
On 2 servers, my script hangs and the upload never completes.
It doesn't hang on the same files on the two servers, so I don't think it's a problem with the files.
Also, on one of the two servers, one video file is first sent successfully, but the script hangs on the second upload.
When I try to abort the script (Ctrl + C or Ctrl + D), nothing happens. I have to kill the terminal to end the script.
This is the code I'm using:
ftp -v -n $server << EOF
prompt
quote USER $user
quote PASS $password
mput *.mov
quit
EOF
What could be my problem?
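One thing worth checking (only a guess based on common causes of FTP hangs with large transfers, not a confirmed diagnosis for these two servers) is the transfer mode and the data-connection mode. A sketch of the same upload with binary and passive mode requested explicitly:

# sketch: same upload, explicitly requesting binary transfers and passive-mode
# data connections; "passive" may be unsupported in some ftp clients, omit it if so
ftp -v -n $server << EOF
quote USER $user
quote PASS $password
prompt
binary
passive
mput *.mov
quit
EOF

A firewall or NAT device that blocks or times out the data connection on only some servers is a classic reason for an upload that stalls part-way through; sending .mov files in ASCII mode is a separate problem (corruption) that binary fixes anyway.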

Related

IIS 10 per minute log rollover

Now that IIS Advanced Logging is dead, I am stuck with IIS 10. Is there a way to get the Enhanced Logging to roll over its log every minute? We've used this to determine performance on our servers in near real-time. Now the best rollover option is once an hour. Is there anything I can do outside of writing a custom logging module?
Basically, for performance reasons, logs are buffered and written at these predefined intervals. If the default rollover options are not what you want, you can force all entries in the log buffer to be written out to the log file. Here is the command to flush the IIS log buffer immediately:
Run cmd as administrator:
cd C:\Windows\System32
netsh http flush logbuffer
You can loop it using a batch file; the following example .bat file flushes the log buffer every minute (the method is mentioned in this thread):
@echo off
:loop
netsh http flush logbuffer
timeout /t 60 > NUL
goto loop
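If the loop should also survive reboots, one option (an assumption about the setup, beyond what the answer above covers; the task name and path are hypothetical) is to register the .bat file as a scheduled task that starts at boot:

rem hypothetical: run the flush loop at every boot under the SYSTEM account
schtasks /create /tn "FlushIISLogBuffer" /tr "C:\scripts\flushlog.bat" /sc onstart /ru SYSTEM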

How to get over Keyboard-interactive authentication prompt in batch file for transferring files to SFTP server

I am trying to transfer some text files to an SFTP server using the PuTTY command line.
I have a batch file with the following commands:
(
echo cd /inbox
echo mput c:\temp\*.txt
echo bye
echo cd c:\temp\
echo del c:\temp\*.txt
) | psftp <username>@<ip> -P <port no> -pw password
However, when I execute the batch file I get stuck at "Keyboard interactive prompt from Server"
I'd appreciate any suggestions on how to get past this point and avoid manual intervention while executing this batch file.
I have figured out the reason I was experiencing this issue. My password contained the special character ^ (caret), and while I had passed the correct password in the batch file, that ^ character was being dropped from the password (the caret is cmd.exe's escape character, so it gets consumed before the password reaches psftp). To overcome this I provided the password in the batch file within double quotes ("password"), and then my issue was resolved.
Just sharing my experience.
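To illustrate the quoting fix, a sketch of what the final pipe line might look like with the password in double quotes so cmd.exe no longer swallows the caret; the host, port, and password below are placeholders:

rem placeholders for host, port and password; the quotes keep the ^ intact
(
echo cd /inbox
echo mput c:\temp\*.txt
echo bye
) | psftp user@192.0.2.10 -P 22 -pw "pa^ssword"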
Well, I had a similar problem when trying to use PuTTY to make an SSH connection, with exactly the same warning message, "Keyboard interactive prompt from Server". I could type the username but it never accepted the password; also, my external IP address did not even make a connection, while the local server IP address connected but the password always failed. I was prepared to restore from a backup, but since I had physical access to the server, I just plugged in a USB keyboard and voilà, all problems ended. Maybe the kernel had not loaded the keyboard driver or something, and somehow it is also needed for a remote SSH connection.

executing a batch or script of commands using Putty

Currently I dial in to a site with PuTTY, copy all 100 commands from Notepad++, and paste them into PuTTY using right-click, and they all give me the expected results.
I don't want to have to copy and paste each time I connect.
I am trying to use putty to load a saved session and then execute a series of commands.
Can this be done with a serial connection?
this is as far as I can get
my batch file looks like this
plink -load session1 < commands.cmd > output.txt
for testing my commands.cmd looks like this
ATDT5551212
functionally this is fine for dialing and executing a single line as my output.txt file looks like this
ATDT5551212
CONNECT 1200
so I know I can grab a command from a file and send the output of the session to another file...
If I add another command after the ATDT line, then it fails to work properly; however, my output file shows it sent all the commands.
The problem is after dialing and connecting I want to be able to send another set of about 100 commands to get programming data out of a serial device and record it to text.
How can I set this up as a batch to wait for the CONNECT 1200 and then execute another 100 different commands?
I tried, as Martin suggested, changing the EOL.
I added STX<command>ETX and also separately tried an EOT and an ESC, and nothing changes; it is just dumping the entire command file and only executing the first line. The modem gets bombarded with all the other lines or commands: it is attempting to execute the first line and then give a response, but meanwhile plink has already taken the entire commands.cmd and dumped it at the modem, and the modem is not expecting the dump.
My guess is that plink can open the session and send my commands but can't interact with the serial PuTTY window once it is open.
I am trying another program called ScriptCommunicator, but am still testing plink/putty.
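Since the goal here is to send one command, wait for the response, then send the next, here is a rough sketch of that loop on a unix-like system talking to the serial device directly; the device path, speed, pacing, and the file of follow-up commands are placeholders and assumptions, not a tested recipe for this setup (which uses plink on Windows):

# hypothetical sketch (GNU userland assumed): dial, wait for CONNECT, then send commands one at a time
stty -F /dev/ttyS0 1200 raw -echo        # placeholder device and speed
exec 3<>/dev/ttyS0                       # keep the serial port open on fd 3
printf 'ATDT5551212\r' >&3
grep -m1 "CONNECT 1200" <&3              # block until the modem reports the connection
while IFS= read -r cmd; do
    printf '%s\r' "$cmd" >&3             # send one command
    sleep 1                              # crude pacing; a real script would read each reply
done < programming_commands.txt          # placeholder: the file with the other ~100 commands

The point of the sketch is only the structure: dial, wait for the expected response, then feed commands at the device's pace instead of dumping the whole file at once, which is what plink is doing here.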

FTP script: GET copies several single files but MGET only copies one file?

I'm having trouble using ftp's mget to copy several XML files from the ftp server to our local machine.
In the ftp prompt, I type in
ftp> mget A20130918.14*
This is what it returns:
ftp> mget A20130918.14*
200 Type set to A.
200 PORT command successful.
150 Opening ASCII mode data connection for A20130918.1100.xml (5174130 bytes).
226 Transfer complete.
ftp: 5572810 bytes received in 8.88Seconds 627.29Kbytes/sec.
227 Entering Passive Mode (X,X,X,X,X,X)
425 Can't open data connection.
Connection closed by remote host.
ftp>
It copies the first file (A20130918.1100.xml), then displays "227 Entering Passive Mode" and stays there for a while. Then it displays "425 Can't open data connection" and finally "Connection closed by remote host.".
Any help is appreciated.
I ended up GETting the files individually. MGET only copied the first file, and it seems to be a known issue without a straightforward solution.
As always, I was polite and had tons of fun.
Thanks.
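For reference, here is a minimal sketch of the individual-GET workaround described above, done non-interactively; the host, login, directory, and file names are placeholders rather than values from the original session:

#!/bin/sh
# sketch: fetch the files with one GET per file instead of MGET
ftp -inv ftp.example.com <<EOF
user myuser mypassword
cd /path/to/files
get A20130918.1400.xml
get A20130918.1415.xml
quit
EOF

This simply scripts the workaround the poster describes; whether it avoids the 425 error depends on the server and firewall configuration.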

Prevent FIFO from closing / reuse closed FIFO

Consider the following scenario:
a FIFO named test is created. In one terminal window (A) I run cat <test and in another (B) cat >test. It is now possible to write in window B and get the output in window A. It is also possible to terminate process A, relaunch it, and still be able to use this setup as expected. However, if you terminate the process in window B, B will (as far as I know) send an EOF through the FIFO to process A and terminate it as well.
In fact, if you run a process that does not terminate on EOF, you will still not be able to use the FIFO you redirected to that process, which I think is because this FIFO is considered closed.
Is there any way to work around this problem?
The reason why I ran into this problem is that I'd like to send commands to my minecraft server running in a screen session, for example: echo "command" >FIFO_to_server. This is probably possible using screen by itself, but I'm not very comfortable with screen, and I think a solution using only pipes would be a simpler and cleaner one.
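As an aside, if you do end up going through screen rather than a FIFO, the usual mechanism is screen's stuff command, which injects keystrokes into the session; the session name, window number, and command below are placeholders:

# sketch: send one command to a screen session named "minecraft" (placeholder name)
# the trailing carriage return is what presses Enter inside the session
screen -S minecraft -p 0 -X stuff "say hello$(printf '\r')"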
A is reading from a file. When it reaches the end of the file, it stops reading. This is normal behavior, even if the file happens to be a fifo. You now have four approaches.
Change the code of the reader to make it keep reading after the end of the file. That's saying the input file is infinite, and reaching the end of the file is just an illusion. Not practical for you, because you'd have to change the minecraft server code.
Apply unix philosophy. You have a writer and a reader who don't agree on protocol, so you interpose a tool that connects them. As it happens, there is such a tool in the unix toolbox: tail -f. tail -f keeps reading from its input file even after it sees the end of the file. Make all your clients talk to the pipe, and connect tail -f to the minecraft server:
tail -n +1 -f client_pipe | minecraft_server &
As mentioned by jilles, use a trick: pipes support multiple writers, and only become closed when the last writer goes away. So make sure there's a client that never goes away.
while true; do sleep 999999999; done >client_pipe &
The problem is that the server is fundamentally designed to handle a single client. To handle multiple clients, you should change to using a socket. Think of sockets as “meta-pipes”: connecting to a socket creates a pipe, and once the client disconnects, that particular pipe is closed, but the server can accept more connections. This is the clean approach, because it also ensures that you won't have mixed-up data if two clients happen to connect at the same time (using pipes, their commands could be interspersed). However, it requires changing the minecraft server.
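As a rough illustration of the socket idea without touching the server itself, one could interpose socat (assuming it is installed), so each client connection is funneled into the pipe feeding the server. This is only a sketch with placeholder socket path and server name, and underneath it is still a single shared pipe, so it is not the fully socket-aware server described above:

# server side: accept connections on a unix socket and forward their data to the server's stdin
socat -u UNIX-LISTEN:/tmp/mc.sock,fork STDOUT | minecraft_server &

# client side: each command goes over its own short-lived connection
echo "say hello" | socat -u STDIN UNIX-CONNECT:/tmp/mc.sock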
Start a process that keeps the fifo open for writing and keeps running indefinitely. This will prevent readers from seeing an end-of-file condition.
From this answer -
On some systems like Linux, <> on a named pipe (FIFO) opens the named pipe without blocking (without waiting for some other process to open the other end), and ensures the pipe structure is left alive.
So you could do:
cat <>up_stream >down_stream &
# the cat pipeline above keeps running
echo 1 > up_stream
echo 2 > up_stream
echo 3 > up_stream
However, I can't find documentation about this behavior, so it could be an implementation detail specific to some systems. I tried the above on macOS and it works.
You can feed multiple inputs into a pipe (created with mkfifo yourpipe) by grouping the commands you require in parentheses, separated by semicolons, and redirecting the group to the pipe:
(cat file1; cat file2; ls -l;) > yourpipe
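Note that the grouped writers will block until something opens the other end of the pipe for reading, so a reader such as the following needs to be running (or started shortly after):

cat < yourpipe    # receives the contents of file1, file2, then the ls -l listing, in order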
