On-the-fly 2D plots with gnuplot

I want to make a plot on the fly with data from a program called "test" using gnuplot.
The output of test looks like:
0 1
1 3
2 5
...
I would like to do something like ./test | gnuplot but I think something is missing
in this command, as gnuplot says: line 0: invalid command.
Any comment is appreciated, thanks.

When you pipe raw data straight into gnuplot, it tries to interpret each data line as a command, which is why it reports an invalid command. You can call your command directly from within gnuplot with
plot '< ./test'
Alternatively, something like
./test | gnuplot -persist -e "plot '-'"
might also work.
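If you don't have the real test program to hand, a quick end-to-end check might look like this (a sketch; the seq/awk one-liner is only a hypothetical stand-in for ./test producing the same kind of x y pairs, and the end of the pipe terminates the inline '-' data):
seq 0 9 | awk '{ print $1, 2*$1+1 }' | gnuplot -persist -e "plot '-' with linespoints"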

Try a temporary file:
./test > temp ; gnuplot -e "plot 'temp'; pause 5" ; rm temp
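Much the same thing with mktemp, so the temporary file gets a unique name (a sketch; here -persist keeps the window open instead of the pause):
tmp=$(mktemp) && ./test > "$tmp" && gnuplot -persist -e "plot '$tmp' with lines"; rm -f "$tmp"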


switch to terminal input after executable gnuplot script

This is a gnuplot scripting question on unix like systems.
For shell-executable gnuplot scripts starting like:
#!/opt/local/bin/gnuplot
how do you switch to the gnuplot prompt in the starting terminal session, at the end of the script?
Adding
load "/dev/stdin"
at the end, switches the input, but gives no user prompt.
I would like to let the user replot their own data over the setup and background generated by the script, and/or enter other gnuplot commands. I am looking for an elegant solution within gnuplot. When using #!/opt/local/bin/gnuplot -c as the shebang in a gnuplot script file (after a chmod +x), I would like ./script.gp to work the same way as call "script.gp" does from the gnuplot prompt, so that we could subsequently replot "info.dat" at a gnuplot prompt in either case. In short, I want to switch gnuplot from batch mode to interactive at the end of the script (probably in the way a startup file would). I can't remember or find the command/trick for this (load "/dev/stdin" is close).
The plot window in this case is AquaTerm, gnuplot 5.0 patchlevel 3 (macports), and the terminal session is OS X "Terminal". --persist seems unhelpful in changing the experience.
You want to send a load of plot commands from a file to gnuplot, then send a load of commands from your user's terminal, which suggests something like this:
{ cat plot.gp; while read cmd; do echo "$cmd"; done; } | gnuplot
Or if I flesh that out a bit:
{ cat plot.gp; while :; do >&2 echo -n "gnuplot> "; read -re c; [ "$c" == "quit" ] && break; echo "$c"; done; } | gnuplot
I am using this plot.gp
set xrange [-5:5]
plot sin(x),cos(x),x*x
That basic functionality can be spruced up quite a lot, if you feel fancy:
#!/bin/bash
gpfile=$1
{
# Show user gnuplot version - on stderr because stdout is going to gnuplot
gnuplot -e "show version" >&2
# Strip shebang (but not comments) from plot commands and send to gnuplot
grep -v "^#!" "$gpfile"
# Show plot commands from file to user - on stderr and prefixed with pseudo-prompt
grep -v "^#!" "$gpfile" | sed 's/^/gnuplot> /' >&2
# Read user input and forward onto gnuplot
while :; do
>&2 echo -n "gnuplot> "
read -re c
[ "$c" == "quit" ] && break
echo "$c"
done
} | gnuplot
You would save the above in a file called plotandinteract in your HOME directory, then make it executable (just once) with:
chmod +x $HOME/plotandinteract
Then you can run it with:
$HOME/plotandinteract SomePlotFile
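If you don't need the pseudo-prompt, a much shorter variant of the same idea is to let cat forward your terminal after the plot file (a sketch; you type with no prompt and no line editing, but the commands still reach gnuplot):
cat plot.gp - | gnuplot
The lone - tells cat to read standard input after plot.gp, so whatever you type next is sent to the same gnuplot process.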
There's probably a much more elegant solution with Tcl/expect but I can't work that out - looks like I need #GlennJackman again :-)

Use ls to show only a certain number of items

I'm trying to get a simple myhead command in C to show the top 10 lines of the first five .HTML files in a directory. I was advised to use ls to carry this out in conjunction with my myhead command. My main issue is with getting ls to only show 5 .html files and not list them all.
I was thinking something like this
ls *.html -n 5 > myhead
However, that doesn't exist. Any ideas? We are only meant to use ls and myhead.
If I understand correctly, you've written a C program myhead that prints out ten lines of a file passed in.
You definitely don't want to do this
ls *.html -n 5 > myhead
This would overwrite or create a new file myhead in the current directory.
The key thing needed here is a command-line pipe, which lets the stdout of one command become the stdin of the next command. You'll also need command substitution, which takes the stdout of a command (or of a pipeline) and uses it as text in another command. Historically this has been done with backticks, e.g. `ls`; in bash you can also write $(ls) to capture an ls listing and use it as text for another command.
Given you're okay with the standard ls file list order you can do this to get the first 5 .html files:
ls *.html | head -n 5
I don't know what myhead is or how it works, as it's not explained in the question. You say it shows the first ten lines of a file passed into it, but there are a few ways it could do that.
I'll give a solution for each possibility (assuming you're using bash):
take one file at a time, passed in as an argument
for f in $(ls *.html | head -n 5) ; do myhead $f ; done
take multiple files at a time, passed in as multiple arguments
myhead $(ls *.html | head -n 5)
take the contents of a file passed in through stdin
for f in $(ls *.html | head -n 5) ; do cat $f | myhead ; done
What you're looking for are pipes
You can use them like this:
# all the output (STDOUT) of ls is passed as the input (STDIN) of myhead
ls *.html | myhead -5
myhead reads the input on STDIN, and outputs N lines of it on STDOUT.
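If you want to test the pipeline before your C version of myhead is ready, a hypothetical shell stand-in with that behaviour (read stdin, print the first N lines) could be as small as:
#!/bin/sh
# hypothetical stand-in for myhead: print the first N lines of stdin
# usage: ... | myhead -5   (defaults to 10 lines if no argument is given)
n=${1#-}
sed -n "1,${n:-10}p"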
With the standard Unix head command, you can do this:
head -n 10 $(ls | head -n 5)
First, you should run this command exactly as shown in your shell to verify it works. Next, try it with your myhead command instead of head.
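If you only want the .html files, as in the original question, the same pattern works with a glob (a sketch; it assumes the file names contain no spaces, since the command substitution splits on whitespace):
head -n 10 $(ls *.html | head -n 5)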

How to display the first line for a set of files (say, 10) in Unix?

I don't know how to do that, guys. I only know how to get the first line of an individual file.
First I listed only the files that have ssa as part of their name. I used the command
ls | grep ssa
This command gives me 10 files; now I want to display only the first line of each of those 10 files. I don't know how to do that. Can anyone help me with this?
The head command can accept multiple input files. So when you suppress the header output and limit the number of lines to 1 that should be what you are looking for:
head -qn 1 *
If you want to combine this with other commands, you have to take care that all the file names are handed to a single call of head:
ls | xargs head -qn 1
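If you only want the files whose names contain ssa, as in the question, the same idea works by filtering first (a sketch; it assumes the matches are regular files with no spaces in their names):
ls | grep ssa | xargs head -qn 1
# or, with a glob instead of grep
head -qn 1 *ssa*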

Unix create and use variables inside expect script

In my attempts to automate access to a remote computer,
I am trying to create and use variables inside an expect script.
I am trying to do the following:
#!/bin/csh -f
/user/bin/expect<<EOF_EXPECT
set USER [lindex $USER 0]
set HOST [lindex $HOST 0]
set PASSWD [lindex $PASSWD 0]
set timeout 1
spawn ssh $USER@$HOST
expect "assword:"
send "$PASSWRD\r"
expect ">"
set list_ids (`ps -ef | grep gedit | awk '{ print $2 }'`)
expect ">"
for id in ($list_ids)
send "echo $id\r"
end
send "exit\r"
EOF_EXPECT
Several challenges with this code:
The ps | grep | awk line does not act as in the shell. It does not extract only the pid using the awk command. Instead, it takes the whole line.
The variable $list_ids is unrecognized, although I set it using what I thought was the way to set a variable inside an expect script.
Lastly, how to do the for loop so that $id and $id_list will be recognized?
I am using csh. $env(list_ids) does not work for me, $env is undefined.
Both shell and tcl variables are marked with $. The contents of your here document are being expanded by your shell. You don't want that. csh doesn't have a value for $2 so expands it to the empty string and the awk command ends up becoming ps -ef | grep gedit | awk '{ print }'. Which is why you get the entire lines in the output.
You have your contexts confused here a bit. You need to escape the $ from the external csh if you want it to make it through to the embedded awk command. (Which is horrible but apparently the case for csh.)
In general, try not to merge csh and Tcl commands like this; keeping the two contexts separate will greatly help you understand what is happening.
What do you mean "unrecognized"? Are you getting any other errors (like from the set command)?
I think you are looking for foreach:
$ tclsh
% foreach a [list 1 2 3 4] b [list 5 6 7 8] c [list a b c d] d [list w x y z] {
puts "$a $b $c $d"
}
1 5 a w
2 6 b x
3 7 c y
4 8 d z
%
$env(list_ids) is a tcl variable. That csh doesn't know anything about it is unrelated to anything (well other than the problem in point one above so escape it). If you export list_ids in the csh session that runs the tcl script then $env(list_ids) should work in the expect script.
You don't want the () around the value in the set command either I don't think. They are literal there I believe. If you are trying to create a tcl list there from the (shell expanded) output from that ps pipeline then you need:
set list_ids [list `ps ....`]
But as I said before you don't really want to be mixing contexts like that.
If you can use a non-csh shell that would likely help here also as csh is just generally not good at all.
Also, not embedding an expect script inside a csh script would help if you can just write an expect script as the script file directly.
Reading here helped me a lot:
http://antirez.com/articoli/tclmisunderstood.html
The following lines do the trick, and answer all questions:
set list_ids [list {`ps -ef | grep gedit | awk '{print \$2 }'`}]
set i 0
while {[lindex \$list_ids \$i] > 0} {
puts [lindex \$list_ids \$i]
set i [expr \$i + 1]
}

How to replace part of a file path in a for...in loop using a shell script on Unix?

Need help big-time. I am totally stuck on this one. I am looping (recursively) through all the files in a directory and printing a list of files. Something like:
# srcDir="input received from command line"
# tgtDir="input received from command line"
for listItem in `find ${srcDir}`
do
if [[ -f ${listItem} ]]
then
echo ${listItem} # **** what to put here ****
fi
done
When printing the list, I want to replace the path of the file from the srcDir to tgtDir. Something like:
# let's assume that srcDir="/home/user"
# and tgtDir="/tmp/guest"
# srcDir has the following structure
file1.txt
dir1_1/file1a.txt
dir1_1/file1b.txt
# so the actual output of the script will be
/home/user/file1.txt
/home/user/dir1_1/file1a.txt
/home/user/dir1_1/file1b.txt
# but I want it to print as
/tmp/guest/file1.txt
/tmp/guest/dir1_1/file1a.txt
/tmp/guest/dir1_1/file1b.txt
I hope you got the idea. Please advise.
Looks like a job for bash string operations:
srcDir="/home/user"
tgtDir="/tmp/guest"
listItem="/home/user/file1.txt"
echo ${listItem/$srcDir/$tgtDir}
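Putting that back into your original loop might look like this (a sketch; it assumes file names without embedded whitespace and uses find -type f in place of the -f test):
srcDir="/home/user"   # would come from the command line
tgtDir="/tmp/guest"
find "${srcDir}" -type f | while read -r listItem
do
    echo "${listItem/$srcDir/$tgtDir}"
done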
