Killing an infinitely running ffmpeg command from an application - Qt

I am using
ffmpeg.exe -f dshow -i audio=virtual-audio-capturer -q 4 -y -tune zerolatency outfile.mp3
to grab sound from the speakers. This command runs indefinitely, so I have to use
Process->kill();
to stop it from within my application.
I want to know whether it is safe to kill it the way I am doing, or whether there is a better way.

As the Qt documentation states, QProcess::kill() terminates the process immediately.
So, depending on what you count as 'safe', this may or may not be OK for you. If ffmpeg is in the middle of processing data or writing the output file, killing it this way will stop it from finishing its task.
You could instead ask the process politely to stop by sending it a signal it handles gracefully, such as kill(pid, SIGTERM) or kill(pid, SIGINT), if you're using Linux or OSX; ffmpeg catches these and finishes writing the output file before exiting. Windows will likely have an equivalent method.
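For example, from a terminal on Linux or OSX you could do the equivalent by hand (a rough sketch; pgrep -x ffmpeg assumes the capture process is simply named ffmpeg):
kill -TERM $(pgrep -x ffmpeg)   # ask ffmpeg to stop and finalize the output file
# only if it refuses to exit after a few seconds, force it:
kill -KILL $(pgrep -x ffmpeg)
The same idea applies from inside an application: resolve the child's PID and send it SIGTERM or SIGINT before falling back to a hard kill.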

Related

What does “&” mean in “linkerd viz dashboard &”?

How can I run a shell script and immediately background it, while keeping the ability to inspect its output at any time by tailing /tmp/output.txt?
It would also be nice if I could foreground the process later.
P.S.
It would be really cool if you could also show me how to "send" the backgrounded process into a GNU screen session that may or may not have been initialized.
To 'background' a process when you start it
Simply add an ampersand (&) after the command.
If the program writes to standard out, it will still write to your console / terminal.
To foreground the process
Simply use the fg command. You can see a list of jobs in the background with jobs.
For example:
sh -c 'sleep 3 && echo I just woke up' & jobs
To background a currently running process
If you have already started the process in the foreground, but you want to move it to the background, you can do the following:
Press Ctrl+Z to suspend the current foreground process and return to your shell. The process stays stopped until you send it another signal.
Run the bg command to resume the process, but have it run in the background instead of the foreground.
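Putting those steps together, a typical session looks something like this (sleep just stands in for your long-running command):
sleep 300    # running in the foreground; press Ctrl+Z here to suspend it
bg %1        # resume job 1, but in the background
jobs         # list background jobs and their job numbers
fg %1        # later, bring job 1 back to the foreground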
Another way is to use the nohup command with & at the end of the line.
Something like this:
nohup whatevercommandyouwant whateverparameters &
This will run it in the background and append its output to a nohup.out file (nohup's default when you don't redirect the output yourself).
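If you would rather have the output in a specific file that you can tail (such as the /tmp/output.txt mentioned in the question), redirect it yourself, for example:
nohup whatevercommandyouwant whateverparameters > /tmp/output.txt 2>&1 &
tail -f /tmp/output.txt   # inspect the output at any time; Ctrl+C stops tail, not the job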
One easy-to-use approach that lets you manage multiple processes and has a nice terminal UI is the hapless utility.
Install with pip install hapless (or python3 -m pip install hapless) and just run
$ hap run my-command # e.g. hap run python my_long_running_script.py
$ hap status # check all the launched processes
$ hap logs 4 # output logs for your 4th background process
$ hap logs -f 2 # continuously stream logs for the 2nd process
See docs for more info.

Ravenscar Task / Program Termination in Native Compilation

As I understand it, one restriction of the Ravenscar profile is that tasks should not terminate.
This certainly makes sense on bare metal; however, when testing on a native system (as an executable program) it has the side effect that pressing Control-C to exit the main task leaves the program running in the background.
I plan to move my program to bare metal eventually and would like to be able to use the Ravenscar profile, so how can one make the program exit correctly in a case like this? Abort statements are forbidden. If the Ravenscar profile were not applied, I could easily make this work by allowing tasks to terminate. Right now I am doing a killall -9, which works but doesn't seem very elegant.
As it turns out, the issue had to do with how I was executing the program. In my case I was doing it over a remote ssh command, e.g.:
ssh myhost "sudo su -c mycommand"
Adding a -t to allocate a tty fixes the issue, that is:
ssh -t myhost "sudo su -c mycommand"

Opening a program and then waiting for it

Is there a general way to wait for an executed process that backgrounds in fish (like open "foo")? As far as I can tell, $! (the PID of the last executed child process in bash) is not present in fish, so you can't just wait $!.
1) The fish idiom is cmd1; and cmd2 or if cmd1; cmd2; end.
2) You should find that bash and zsh also don't block if you execute open ARG. That's because open normally launches the requested program in the background and then exits, so the shell has no idea that open has put the "real" program in the background. Another example of that behavior is launching vim in GUI mode via vim -g. Add the -W flag on macOS (or -w on Linux) to the open command, and -f to the vim command.
The key here is that, with those flags, open won't return an exit status for fish to use when evaluating the and operator until the opened process finishes. So you get the behavior you're looking for.

Kill all R processes that hang for longer than a minute

I use a cron task to run an Rscript regularly. Unfortunately, I need to do this on a small AWS instance, and the process may hang, with more and more processes piling up on top of each other until the whole system lags.
I would like to write a cron task to kill all R processes that have been running for longer than one minute. I found another answer on Stack Overflow that I've adapted and that I think should solve the problem. I came up with:
if [[ "$(uname)" = "Linux" ]];then killall --older-than 1m "/usr/lib/R/bin/exec/R --slave --no-restore --file=/home/ubuntu/script.R";fi
I copied the process line directly from htop, but it does not work as I expect: I get a 'No such file or directory' error, even though I've checked the line a few times.
I need to kill all R processes that have lasted longer than a minute. How can I do this?
You may want to avoid killing processes that belong to another user, and to try SIGKILL (kill -9) only after SIGTERM (kill -15) has had a chance to work. Here is a script you could execute every minute with a cron job:
#!/bin/bash

PROCESS="R"
MAXTIME=`date -d '00:01:00' +'%s'`

function killpids()
{
    PIDS=`pgrep -u "${USER}" -x "${PROCESS}"`
    # Loop over all matching PIDs
    for pid in ${PIDS}; do
        # Retrieve the cumulative CPU time of the process (the ps TIME column)
        TIME=`ps -o time:1= -p "${pid}" |
              egrep -o "[0-9]{0,2}:?[0-9]{0,2}:[0-9]{2}$"`
        # Convert TIME to a timestamp so it can be compared with MAXTIME
        TTIME=`date -d "${TIME}" +'%s'`
        # Check if the process should be killed
        if [ "${TTIME}" -gt "${MAXTIME}" ]; then
            kill ${1} "${pid}"
        fi
    done
}

# Give the processes a chance to terminate properly (SIGTERM)
killpids "-15"
sleep 5
# Now kill the remaining ones (SIGKILL)
killpids "-9"
Why add an extra process every minute with cron?
Would it not be easier to start R with timeout from coreutils? The process will then be killed automatically after the duration you chose.
timeout [option] duration command [arg]…
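For example, reusing the path from the question (a sketch; adjust the duration and grace period to taste), the crontab entry could wrap the call like this:
timeout -k 10 1m Rscript /home/ubuntu/script.R
# 1m: send SIGTERM after one minute; -k 10: follow up with SIGKILL 10 seconds later if it still hangs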
I think the best option is to do this from R itself. I am no expert, but it seems the future package allows executing a function in a separate thread. You could run the actual task in a separate thread, sleep for 60 seconds in the main thread, and then call stop().
Previous Update
user1747036's answer, which recommends timeout, is a better alternative.
My original answer
This question is more appropriate for superuser, but here are a few things wrong with
if [[ "$(uname)" = "Linux" ]];then
killall --older-than 1m \
"/usr/lib/R/bin/exec/R --slave --no-restore --file=/home/ubuntu/script.R";
fi
The name argument is either the name of the image or the path to it; you have included the command-line parameters as well. A corrected command is shown after the link below.
If -s signal is not specified, killall sends SIGTERM, which your process may ignore. Are you able to kill a long-running script with this signal on the command line? You may need SIGKILL / -9.
More at http://linux.die.net/man/1/killall
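Putting those two points together, a corrected command might look like this (assuming the process name killall sees is simply R; -u restricts matching to your own processes):
if [[ "$(uname)" = "Linux" ]]; then killall -u "$USER" --older-than 1m R; fi
# add -s KILL if the default SIGTERM is not enough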

Killing ffmpeg from Qt results in a corrupt video file

I'm using Qt to record stream data from a Mobotix camera on Windows 7. The command I use is:
ffmpeg -f mjpeg -i "http://admin:password@192.168.0.100/control/faststream.jpg?stream=full" -c:v libx264 -preset slow -crf 22 -c:a copy out.mp4
This works fine from the command line, and when I want to stop it I just press Ctrl-C. But I'm doing this from an application using Qt 5.2 via a QProcess. After 10 minutes I want to stop the recording, so I tried QProcess::terminate(), but that doesn't stop it. QProcess::kill() works, but the resulting video won't play. This answer suggests I'm doing it the right way.
I connect to QProcess::finished() so when I call QProcess::kill() the result is:
int exitCode = 62097
QProcess::ExitStatus exitStatus = QProcess::CrashExit
Apparently this is the return code Qt uses when it kills a process.
So is there any other way for me to either terminate the process gracefully (the same as pressing Ctrl-C) or perform this same functionality via an ffmpeg library so I can stop it properly?
I explain the correct way to handle this exact issue in this thread:
http://qt-project.org/forums/viewthread/47654/
It's very simple: all you need to do is send q (the quit key) to the process. Here is a simple example:
myProcess->setProcessChannelMode(QProcess::ForwardedChannels);
myProcess->write("q");
myProcess->closeWriteChannel();
Keep in mind that you also have to exit your parent process too. Good luck.
Qt doesn't have a portable way to do this. However, you can use QProcess::processId() to get the native process ID. On POSIX-compliant systems you can then use kill(pid, SIGINT) to send the equivalent of a Ctrl-C; just include signal.h and sys/types.h. On Windows it's harder; see this question: link
