How to Read Data from Serial Port in R

I want to plot live data from the serial port, and I figured R would be a good tool for the job. I'm stumbling on trying to read data from the serial port (COM4). I've verified the data is coming in through Tera Term (and closed that session before trying R), but I can't seem to get anything in R.
I've checked a few places, including these threads:
How to invoke script that uses scan() on Windows?
How to include interactive input in script to be run from the command line
I've also found this old thread on the R forum:
https://stat.ethz.ch/pipermail/r-help/2005-September/078929.html
These have gotten me this far, but I can't seem to actually get any data into R from the serial port.
At this point I can stream the data into Excel using VBA, but I'd like to do it in R for some nicer live plotting and filtering of the data.
Edit: Thanks for the help so far. I just got it working while writing up this edit, so here's the code:
#
# Reset environment
#
rm(list = ls())  # Remove environment variables
graphics.off()   # Close any open graphics
#
# Libraries
#
library(serial)
#
# Script
#
con <- serialConnection(name = "test_con",
                        port = "COM11",
                        mode = "115200,n,8,1",
                        buffering = "none",
                        newline = 1,
                        translation = "cr")
open(con)
stopTime <- Sys.time() + 2
foo <- ""
textSize <- 0
while (Sys.time() < stopTime)
{
  newText <- read.serialConnection(con)
  if (0 < nchar(newText))
  {
    foo <- paste(foo, newText)
  }
}
cat("\r\n", foo, "\r\n")
close(con)
foo ends up being a long string with newlines where I want them:
3181, -53120, -15296, 2,
3211, -53088, -15328, 2,
3241, -53248, -15456, 1,
3271, -53216, -15424, 2,
3301, -53184, -15488, 2,
3331, -53344, -15360, 1,
3361, -53440, -15264, 1,
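For the plotting I was after, one rough sketch of getting from that string to numbers (the column names are made up, and the trailing comma on each row yields an empty fifth field that gets dropped):
# rough sketch: parse the collected string into columns and plot
# (column names are invented; drop the empty 5th field from the trailing commas)
df <- read.table(text = foo, sep = ",", strip.white = TRUE, fill = TRUE,
                 col.names = c("t_ms", "x", "y", "flag", "empty"))[, 1:4]
plot(df$t_ms, df$x, type = "l", xlab = "time (ms)", ylab = "x")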
Thanks again for all the help!

I am working with the serial package (here), available on CRAN. It was developed to do exactly what you need: reading and sending data from and to RS232 connections and the like.
I really recommend it, because "mode.exe" does not seem to work for virtual COM ports (see NPort servers etc.).
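A minimal sketch of both directions with the package (the port name, mode, and the "ping" string are placeholders, not anything from the question):
library(serial)
con <- serialConnection(name = "dev", port = "COM4", mode = "9600,n,8,1")
open(con)
write.serialConnection(con, "ping")  # send something to the device
Sys.sleep(0.5)                       # give it a moment to answer
cat(read.serialConnection(con))      # print whatever came back
close(con)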

Tera Term and Windows use different mechanisms to configure serial devices.
Are your system connection settings OK compared to what is configured in Tera Term?
Re-check the configuration parameters in Tera Term and then use them to set your COM4: configuration in R:
system("mode COM4: BAUD=115200 PARITY=N DATA=8 STOP=1")
See mode /? at your command prompt for further parameters.
It might also be helpful to read data character by character using readChar(); see the sketch below.
It sometimes happens that Tera Term doesn't close RS232 connections properly.
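For the readChar() route, an untested sketch along the lines of the old r-help thread linked in the question (whether file() can open a COM device directly depends on the Windows setup):
system("mode COM4: BAUD=115200 PARITY=N DATA=8 STOP=1")  # configure the port
con <- file("COM4", open = "r")   # open the port as a plain file connection
ch <- readChar(con, nchars = 1)   # read a single character at a time
close(con)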

I realize that this is from five years ago, but I noticed that your code does not set a handshake.
I am working with something similar, using PuTTY instead of Tera Term, where I could see all of the available settings for my COM device.
My command is as follows:
con <- serialConnection(name = "Prolific USB-to-Serial Comm Port(Com3)",
                        port = "COM3",
                        mode = "9600,n,8,1",
                        newline = 0,
                        translation = "lf",
                        handshake = "xonxoff")

Related

In R: How to return the length of a sound?

I am looking to find out the length of a beep or another sound, and would like it returned programmatically as a numeric value rather than displayed on the screen.
library(beepr)
beep(sound = 0) # how long is this sound in seconds/milliseconds?
beep(sound = 5) # how long is this sound in seconds/milliseconds?
The package soundgen is one solution. It works with wav/mp3 files.
In this example I assume you have beepr installed in its default folder, and I am going to get the duration of beep 1, alias 'microwave_ping_mono.wav'.
library(soundgen)
path_to_wav <- paste0(.libPaths()[1], "/beepr/sounds/microwave_ping_mono.wav")
sound_length_in_s <- soundgen::analyze(x = path_to_wav)$summary$duration
sound_length_in_s
[1] 1.102993
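If pulling in soundgen just for a duration feels heavy, the same number can be computed from the sample count and the sample rate; a sketch assuming the tuneR package instead:
library(tuneR)
w <- readWave(path_to_wav)      # load the wav file
length(w@left) / w@samp.rate    # duration in seconds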

Replicate Arduino's serial monitor on the Scilab console

If I use the Arduino IDE's serial monitor I can read the pairs of comma-separated values as they arrive.
I want to first replicate this behavior in the Scilab terminal. I used the Serial Communication Toolbox:
h = openserial(7, "9600,n,8,1") // open COM7
disp(readserial(h))
closeserial(h)
which returns either empty or
, 169
228, 179
228,
228, 205
228, 209 228,
Putting the disp(readserial(h)) in a while loop doesn't help either. Not only are there too many empty lines, but if I stop the while loop it does not close the port (something like try-catch should be used, I think). I would appreciate any help in getting the same behavior as Arduino's serial monitor.
P.S. Next I want to plot these two values in real time, so maybe the csvTextScan function could split the string into two integers.
OK, after a couple of days of struggling I figured this out. It turns out that Scilab doesn't have native serial communication functionality, and the Toolbox developer has used TCL_EvalStr to run Tcl commands externally. I had to dig into the Tcl serial communication syntax (i.e. read, open, gets, fconfigure ...), ask another question, get some help, and finally end up with a new function for the Toolbox, which I have committed as a pull request:
function buf = readserialline(h)
    tmpbuf = emptystr();
    while tmpbuf == emptystr()
        TCL_EvalStr("gets " + h + " ttybuf");
        tmpbuf = TCL_GetVar("ttybuf");
    end
    buf = tmpbuf;
endfunction
Now one can get the above behavior by running:
h = openserial(7, "9600,n,8,1") // open COM7
for ii = 1:40
    disp(readserialline(h))
end
closeserial(h)
to read the serial port line by line and print it to the Scilab console. To parse the CSV data you may use:
csvTextScan(part(readserialline(h), 1:$-1), ',')
P.S.1. I have used a virtual Arduino board inside SimulIDE and used com0com to create virtual serial ports. More information here on SourceForge.
P.S.2. More discussion with Toolbox developer Aditya Sengupta here on Twitter.
P.S.3. More discussions here on Tcl Google group
P.S.4. A full demonstration plus instructions here on Reddit
P.S.5. For those who might end up here, I have decided to rewrite Aditya Sengupta's repository here with several improvements.

pyserial code working on Windows (COM1) but not on Linux (/dev/ttyS0)

I am using Python 3.6.
The following code works just fine on Windows 10 Pro:
import serial
import binascii
ser = serial.Serial("COM1") # "COM1" will be "/dev/ttyS0" on Linux
if ser.is_open:
    print("COM open")

ser.baudrate = 2400

print('Port configuration:')
print('baudrate:', ser.baudrate)
print('parity:', ser.parity)
print('stopbits:', ser.stopbits)
print('bytesize:', ser.bytesize)
print('xonxoff:', ser.xonxoff)
print('timeout:', ser.timeout)
print()

print('sending...')
frame = bytearray()
frame.append(0x7e)
frame.append(0x03)
frame.append(0x02)
frame.append(0x21)
frame.append(0x00)
frame.append(0xa4)
ser.write(frame)
print(binascii.hexlify(frame))
print()

print('receiving...')
recv = ser.readline()
recv_len = len(recv)
print(binascii.hexlify(recv))
print()

ser.close()
if not ser.is_open:
    print("COM closed")
But it gets stuck at ser.readline() when I run it under CentOS 6.8.
It looks like a trivial issue, but I cannot figure out what's wrong or missing. If you can't either, I hope the sample code can at least be useful to someone.
False alarm: the code worked using ttyS1 instead of ttyS0 (I knew it was something trivial). There was no cable attached to ttyS0, and with the default timeout=None, readline() simply blocks forever waiting for a newline.
Anyway, it is very useful to check
cat /proc/tty/driver/serial
which shows tx/rx statistics and modem-control signals such as DTR, RTS and RI next to each port.
For example, next to ttyS1 I noticed an 'RI', which was the same signal that the Hercules terminal on Windows showed me (graphically) when I tried to open COM1. A very intuitive way to identify a serial port!

Scapy raw data manipulation

I am having trouble manipulating raw data. I am trying to change a resp_cookie in my ISAKMP header, and when I sniff the packet it all shows up in raw format, as Raw Load='\x00\x43\x01...' for about three lines. In a Wireshark capture I can see the information I want to change, but I can't seem to find a way to convert that raw data so I can find and replace the information I am looking for. I can also see the information I need when I do a hexdump(), but I can't store it in a variable: when I type i = hexdump(pkt) it prints the hexdump but doesn't store it in i.
This post is a little old, but I've come across it a dozen or so times while trying to solve a similar problem. I doubt the OP needs an answer anymore, but if anyone else is looking to do something similar... here you go!
I found the following code snippet somewhere in the deep, dark depths of Google, and it worked for my situation.
hexdump(), show() and other Scapy methods just print the packet to the terminal/console; they don't actually return a string or any other sort of object. So you need a way to intercept the data they write and put it in a variable to be manipulated.
NOTE: THIS IS PYTHON 3.X and SCAPY 3K
import io
import sys
from scapy.all import sniff

# generic scapy sniff; 'interface' and 'filter' are placeholders for your own values
sniff(iface=interface, prn=parsePacket, filter=filter)
With the above sniff method, you're going to want to do the following.
def parsePacket(packet):
    outputPacket = ''
    # setup: remember the real stdout
    qsave = sys.stdout
    q = io.StringIO()
    # redirect stdout so the output is captured
    sys.stdout = q
    # text you're capturing
    packet.show()
    # restore original stdout
    sys.stdout = qsave
    # release output
    sout = q.getvalue()
    # add to string (format if need be)
    outputPacket += sout + '\n'
    # close the IO stream
    q.close()
    # return your packet as text
    return outputPacket
The string you return (outputPacket) can now be manipulated however you want.
Swap out .show() for whatever function you see fit. (On newer Scapy versions, packet.show(dump=True) returns the text directly, which avoids the stdout redirection entirely.)
P.S. Forgive me if this is a little rough from a Pythonic point of view... I'm not a Python dev by any stretch.

Logfile analysis in R?

I know there are other tools around like AWStats or Splunk, but I wonder whether any serious (web)server logfile analysis is going on in R. R might not be the first tool one thinks of for this, but it has nice visualization capabilities and also nice spatial packages. Do you know of any? Or is there an R package / code that handles the most common log file formats that one could build on? Or is it simply a very bad idea?
In connection with a project to build an analytics toolbox for our Network Ops guys, I built one of these about two months ago. My employer has no problem if I open source it, so if anyone is interested I can put it up on my github repo. I assume it's most useful to this group if I build an R package. I won't be able to do that straight away, though, because I need to research the docs on package building with non-R code (it might be as simple as tossing the Python bytecode files in /exec along with a suitable Python runtime, but I have no idea).
I was actually surprised that I needed to undertake a project of this sort. There are at least several excellent open-source and free log file parsers/viewers (including the excellent Webalizer and AWStats), but none of them parse server error logs (parsing server access logs is the primary use case for both).
If you are not familiar with error logs or with the difference between them and access logs: in sum, Apache servers (likewise nginx and IIS) record two distinct logs and store them to disk by default next to each other in the same directory. On Mac OS X, that directory is in /var, just below root:
$> pwd
/var/log/apache2
$> ls
access_log error_log
For network diagnostics, error logs are often far more useful than the access logs. They also happen to be significantly more difficult to process, because of the unstructured nature of the data in many of the fields and, more significantly, because the data file you are left with after parsing is an irregular time series--you might have multiple entries keyed to a single timestamp, then the next entry is three seconds later, and so forth.
I wanted an app into which I could toss raw error logs (of any size, but usually several hundred MB at a time) and have something useful come out the other end--which in this case had to be some pre-packaged analytics and also a data cube available inside R for command-line analytics. Given this, I coded the raw-log parser in Python, while the processor (e.g., gridding the parser output to create a regular time series) and all analytics and data visualization I coded in R.
I have been building analytics tools for a long time, but only in the past four years have I been using R. So my first impression, immediately upon parsing a raw log file and loading the data frame in R, was what a pleasure R is to work with and how well suited it is for tasks of this sort. A few welcome surprises:

Serialization. Persisting working data in R is a single command (save). I knew this, but I didn't know how efficient this binary format is. The actual data: for every 50 MB of raw logfiles parsed, the .RData representation was about 500 KB--100:1 compression. (Note: I pushed this down further to about 300:1 by using the data.table library and manually setting the compression level argument to the save function; see the sketch after this list.)

IO. My data warehouse relies heavily on a lightweight data structure server that resides entirely in RAM and writes to disk asynchronously, called redis. The project itself is only about two years old, yet there's already a redis client for R on CRAN (by B.W. Lewis, version 1.6.1 as of this post).

Primary Data Analysis. The purpose of this project was to build a library for our Network Ops guys to use. My goal was a "one command = one data view" type of interface. So, for instance, I used the excellent googleVis package to create professional-looking scrollable/paginated HTML tables with sortable columns, into which I loaded a data frame of aggregated data (>5,000 lines). Just those few interactive elements--e.g., sorting a column--delivered useful descriptive analytics. Another example: I wrote a lot of thin wrappers over some basic data-juggling and table-like functions; each of these functions I would, for instance, bind to a clickable button on a tabbed web page. Again, this was a pleasure to do in R, in part because quite often the function required no wrapper; the single command with the arguments supplied was enough to generate a useful view of the data.
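On the serialization point, a minimal sketch of what that looks like (the data frame and file name are invented; save() does accept a compress method and a compression_level argument):
library(data.table)
dt <- data.table(ts = Sys.time() + 1:1e5,
                 msg = sample(letters, 1e5, replace = TRUE))
# xz at a high level trades write time for a much smaller .RData file
save(dt, file = "parsed_log.RData", compress = "xz", compression_level = 9)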
A couple of examples of the last bullet:
# what are the most common issues that cause an error to be logged?
err_order = function(df){
    t0 = xtabs(~Issue_Descr, df)
    m = cbind(names(t0), t0)
    rownames(m) = NULL
    colnames(m) = c("Cause", "Count")
    x = as.numeric(m[, 2])
    ndx = order(x, decreasing = TRUE)
    m = m[ndx, ]
    m1 = data.frame(Cause = m[, 1],
                    Count = as.numeric(m[, 2]),
                    CountAsProp = 100 * as.numeric(m[, 2]) / dim(df)[1])
    subset(m1, CountAsProp >= 1.)
}
# calling this function, passing in a data frame, returns something like:
Cause Count CountAsProp
1 'connect to unix://var/ failed' 200 40.0
2 'object buffered to temp file' 185 37.0
3 'connection refused' 94 18.8
The primary data cube, displayed for interactive analysis using googleVis:
(figure: a contingency table from an xtabs call, rendered as a sortable googleVis table)
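A minimal sketch of that kind of view (df stands for the parsed log data frame; the googleVis options shown are plausible settings, not the author's actual configuration):
library(googleVis)
m1 <- err_order(df)  # the aggregated view from the function above
# render it as a sortable, paginated HTML table
tbl <- gvisTable(m1, options = list(page = "enable", pageSize = 25))
plot(tbl)  # opens the interactive table in the default browser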
It is in fact an excellent idea. R also has very good date/time capabilities, can do cluster analysis or use any of a variety of machine learning algorithms, and has three different regexp engines for parsing, etc.
And it may not be a novel idea. A few years ago I was in brief email contact with someone using R for proactive (rather than reactive) logfile analysis: read the logs, (in their case) build time-series models, predict hot spots. That is so obviously a good idea. It was one of the Department of Energy labs, but I no longer have a URL. Even outside of temporal patterns, there is a lot one could do here.
I have used R to load and parse IIS log files with some success. Here is my code.
Load IIS log files:
require(data.table)
setwd("Log File Directory")
# get a list of all the log files
log_files <- Sys.glob("*.log")
# read each log file and concatenate them
IIS <- do.call("rbind",
               lapply(log_files, read.csv, sep = " ", header = FALSE,
                      comment.char = "#", na.strings = "-"))
# add field names - copy the "Fields" header line from one of the log files
colnames(IIS) <- c("date", "time", "s_ip", "cs_method", "cs_uri_stem",
                   "cs_uri_query", "s_port", "cs_username", "c_ip",
                   "cs_User_Agent", "sc_status", "sc_substatus",
                   "sc_win32_status", "sc_bytes", "cs_bytes", "time-taken")
# change it to a data.table
IIS <- data.table(IIS)
# query at will
IIS[, .N, by = list(sc_status, cs_username, cs_uri_stem, sc_win32_status)]
I did a logfile analysis recently using R. It was nothing really complex, mostly descriptive tables, and R's built-in functions were sufficient for the job.
The problem was the data storage, as my logfiles were about 10 GB. Revolution R offers new methods to handle such big data, but in the end I decided to use a MySQL database as a backend (which in fact reduced the size to 2 GB through normalization).
That could also solve your problem in reading logfiles into R.
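A minimal sketch of that backend approach (the connection details, table, and column names are invented; it assumes the DBI and RMySQL packages):
library(DBI)
# connect to the normalized log database
con <- dbConnect(RMySQL::MySQL(), dbname = "weblogs", host = "localhost",
                 user = "r_user", password = "...")
# let the database do the heavy aggregation; pull only the result into R
hits <- dbGetQuery(con, "SELECT status, COUNT(*) AS n
                           FROM access_log
                          GROUP BY status")
dbDisconnect(con)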
#!python
import argparse
import csv
import cStringIO as StringIO

class OurDialect:
    escapechar = ','
    delimiter = ' '
    quoting = csv.QUOTE_NONE

parser = argparse.ArgumentParser()
# the default is two sample access-log lines, used when no file is supplied
parser.add_argument('-f', '--source', type=str, dest='line', default=[['''54.67.81.141 - - [01/Apr/2015:13:39:22 +0000] "GET / HTTP/1.1" 502 173 "-" "curl/7.41.0" "-"'''], ['''54.67.81.141 - - [01/Apr/2015:13:39:22 +0000] "GET / HTTP/1.1" 502 173 "-" "curl/7.41.0" "-"''']])
arguments = parser.parse_args()

try:
    # open for reading (not 'wb'); wrap so the comprehension below sees a list of lists
    with open(arguments.line, 'r') as fin:
        line = [fin.readlines()]
except (IOError, TypeError):
    # no readable file given: fall back to the built-in sample lines
    line = arguments.line

header = ['IP', 'Ident', 'User', 'Timestamp', 'Offset', 'HTTP Verb', 'HTTP Endpoint', 'HTTP Version', 'HTTP Return code', 'Size in bytes', 'User-Agent']

# strip the trailing character and the [, ] and " characters from each raw line
lines = [[l[:-1].replace('[', '"').replace(']', '"').replace('"', '') for l in l1] for l1 in line]

out = StringIO.StringIO()
writer = csv.writer(out)
writer.writerow(header)
writer = csv.writer(out, dialect=OurDialect)
writer.writerows([[l1 for l1 in l] for l in lines])

print(out.getvalue())
Demo output:
IP,Ident,User,Timestamp,Offset,HTTP Verb,HTTP Endpoint,HTTP Version,HTTP Return code,Size in bytes,User-Agent
54.67.81.141, -, -, 01/Apr/2015:13:39:22, +0000, GET, /, HTTP/1.1, 502, 173, -, curl/7.41.0, -
54.67.81.141, -, -, 01/Apr/2015:13:39:22, +0000, GET, /, HTTP/1.1, 502, 173, -, curl/7.41.0, -
This format can easily be read into R using read.csv, and it doesn't require any third-party libraries.
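For completeness, a sketch of the R side (the file name is an assumption; it presumes the Python script's output was redirected to a file):
# read the parser's CSV output back into R
logs <- read.csv("parsed_access_log.csv", stringsAsFactors = FALSE)
head(logs)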
