Trying to retrieve data from a serial port but program is stuck at getchar - serial-port

I am using an embedded system to send data from 25 sensors to a PuTTY terminal on my computer. Works great.
I wanted to add read-from-terminal functionality to the embedded system (so I can send commands). So I tried using getchar() to read whatever I type in my PuTTY terminal. First I just wanted to getchar() and print the character back to PuTTY. It kind of works, but my sensor data, which is supposed to print every 500 ms, does not print until I type a character in PuTTY. It is as if my code were stuck on getchar(), spinning until getchar() reads something.
Here is my forever loop in int main(). I am not sharing the rest as it is not really needed and too bulky (it's just initializing modules). In this loop I read a sensor, try to read from PuTTY, write to PuTTY, and start the next scan:
for(;;)
{
    CapSense_ProcessAllWidgets();   // Process all widgets
    CapSense_RunTuner();            // To sync with Tuner application
    read_sensor(curr_elem);         // read curr_elem
    (curr_elem < RX4_TX4) ? (curr_elem++) : (curr_elem = 0, touchpad_readings_flag++);

    // Here is the read part I added, which blocks until I type something in.
    // If I remove this if and all of what's in it, I print to PuTTY every 500ms
    if (touchpad_readings_flag)
    {
        char received_char = getchar();
        if (received_char) // if something was returned, received_char != 0
        {
            printf("%c", received_char);
        }
    }

    // Here I write to PuTTY. Works fine when I remove getchar()
    if (print_counter_flag && touchpad_readings_flag)
    {
        print_counter_flag = 0;
        touchpad_readings_flag = 0;
        for (int i = 0; i < 25; i++)
        {
            printf("\n");
            printf("%c", 97 + i);
            printf("%c", val[i] >> 8);
            printf("%c", val[i] & 0x00ff); // For raw counts
            printf("\r");
        }
    }

    /* Start next scan */
    CapSense_UpdateAllBaselines();
    CapSense_ScanAllWidgets();
}

Apparently, your getchar() call blocks until there is input data to retrieve.
One solution to change this behaviour is given in another article on a different SE site.
Please also note that getchar() is a wrapper around getc() acting on stdin, as this site [1] describes.
For getc() you will find further discussion.
In one of those discussions, it is pointed out that some important implementations even wait for a newline character before input is delivered to your function. I think this depends on the standard library / kind of embedded system you actually use - please check the documentation of your toolchain vendor. [2]
[1] I didn't look up a normative source, this is just my first Google hit.
[2] The question doesn't specify the kind of embedded system, so a generic answer is wanted instead of a discussion of particular target/toolchain combinations, IMO.
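As an illustration only, a common pattern is to poll the receive buffer and call the blocking read only when a byte is already waiting. The function names below are placeholders for whatever your UART driver actually provides (a "bytes waiting" query and a "fetch one byte" call) - check your vendor's API:
/* Sketch only: UART_GetRxBufferSize() and UART_GetChar() are placeholder
 * names for "how many bytes are waiting" and "fetch one byte". */
if (touchpad_readings_flag)
{
    if (UART_GetRxBufferSize() > 0)            /* only read when a byte is waiting */
    {
        char received_char = (char)UART_GetChar();
        printf("%c", received_char);           /* echo it back to PuTTY */
    }
    /* otherwise fall through and keep printing the sensor data every 500 ms */
}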

Related

Sim800L lag/delay before incoming calls are visible to arduino

I use a SIM800L GSM module to detect incoming calls and generally it works fine. The only problem is that sometimes it takes up to 8 RINGs before the GSM module tells the Arduino that someone is calling (before RING appears on the serial connection). It looks like GSM network congestion, but I do not have such issues with normal calls (I mean calls between people). It happens too often, so it cannot be network/provider overload. Has anybody else had such a problem?
ISP/Provider: Plus GSM in Poland
I am not posting any code, because I think the problem is in a different layer.
Sorry that I didn't answer earlier. I've tested it, and it turned out that with the bare minimum code it worked OK! I mean, I can see 'RING' on the serial monitor immediately after dialing the number. So it's not a hardware issue!
// bare minimum code:
void loop() {
    if (serialSIM800.available()) {
        Serial.write(serialSIM800.read());
    }
    if (Serial.available()) {
        serialSIM800.write(Serial.read());
    }
}
In my real code I need to compare the calling number with a trusted list. To do that I saved all trusted numbers in the contact list on the SIM card (with the common prefix name 'mytrusted'). So, in the main loop there's an if statement:
while (mySerial.available()) {
    incomingByte = mySerial.read();
    inputString += incomingByte;
}
if (inputString.indexOf("mytrusted") > 0) {
    isTrusted = 1;
    Serial.println("A TRUSTED NUMBER IS CALLING");
}
After adding this if condition, the Arduino sometimes recognizes a trusted number after the 1st call, and sometimes only after the 4th or 5th. I don't suspect the if statement itself, but the preceding while loop, where incoming bytes are combined into one string.
Any ideas what can be improved in this simple code?
It seems I found a workaround for my problem. I just send a simple 'AT' command to the SIM800L every 20 seconds (it replies with 'OK'). I use a timer to count this 20-second interval (instead of a simple delay function):
TimerObject *timer2 = new TimerObject(20000); // AT command interval
....
timer2->setOnTimer(&SendATCMD);
....
void SendATCMD() {
    mySerial.println("AT");
    timer2->Stop();
    timer2->Start();
}
With this simple modification the Arduino always sees the incoming call immediately (after the 1st ring).
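For reference, another option would be to collect a complete line before matching, so the check never runs on a half-received response. This is only a sketch; it reuses mySerial, inputString and isTrusted from the question and assumes the module terminates its responses with a newline:
while (mySerial.available()) {
    char incomingByte = mySerial.read();
    inputString += incomingByte;
    if (incomingByte == '\n') {                        // a full response line has arrived
        if (inputString.indexOf("mytrusted") >= 0) {   // >= 0: a match at index 0 also counts
            isTrusted = 1;
            Serial.println("A TRUSTED NUMBER IS CALLING");
        }
        inputString = "";                              // start fresh for the next line
    }
}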

OpenBSD serial I/O: -lpthread makes read() block forever, even with termios VTIME set?

I have an FTDI USB serial device which I use via the termios serial API. I set up the port so that it will time-out on read() calls in half a second (by using the VTIME parameter), and this works on Linux as well as on FreeBSD. On OpenBSD 5.1, however, the read() call simply blocks forever when no data is available (see below.) I would expect read() to return 0 after 500ms.
Can anyone think of a reason that the termios API would behave differently under OpenBSD, at least with respect to the timeout feature?
EDIT: The no-timeout problem is caused by linking against pthread. Regardless of whether I'm actually using any pthreads, mutexes, etc., simply linking against that library causes read() to block forever instead of timing out based on the VTIME setting. Again, this problem only manifests on OpenBSD -- Linux and FreeBSD work as expected.
if ((sd = open(devPath, O_RDWR | O_NOCTTY)) >= 0)
{
    struct termios newtio;
    char input;
    memset(&newtio, 0, sizeof(newtio));

    // set options, including non-canonical mode
    newtio.c_cflag = (CREAD | CS8 | CLOCAL);
    newtio.c_lflag = 0;

    // when waiting for responses, wait until we haven't received
    // any characters for 0.5 seconds before timing out
    newtio.c_cc[VTIME] = 5;
    newtio.c_cc[VMIN] = 0;

    // set the input and output baud rates to 7812
    cfsetispeed(&newtio, 7812);
    cfsetospeed(&newtio, 7812);

    if ((tcflush(sd, TCIFLUSH) == 0) &&
        (tcsetattr(sd, TCSANOW, &newtio) == 0))
    {
        read(sd, &input, 1); // even though VTIME is set on the device,
                             // this read() will block forever when no
                             // character is available in the Rx buffer
    }
}
from the termios manpage:
Another dependency is whether the O_NONBLOCK flag is set by open() or
fcntl(). If the O_NONBLOCK flag is clear, then the read request is
blocked until data is available or a signal has been received. If the
O_NONBLOCK flag is set, then the read request is completed, without
blocking, in one of three ways:
1. If there is enough data available to satisfy the entire
request, and the read completes successfully the number of
bytes read is returned.
2. If there is not enough data available to satisfy the entire
request, and the read completes successfully, having read as
much data as possible, the number of bytes read is returned.
3. If there is no data available, the read returns -1, with errno
set to EAGAIN.
Can you check if this is the case?
Cheers.
Edit: the OP traced the problem back to linking with pthreads, which caused the read function to block. Upgrading to OpenBSD > 5.2 resolved this issue thanks to the change to the new rthreads implementation as the default threading library on OpenBSD. More info in guenther@'s EuroBSDcon 2012 slides.
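For completeness, a workaround that does not rely on VTIME (or on which threading library is linked) is to impose the 500 ms timeout with select() before calling read(). A minimal sketch using only standard POSIX calls:
#include <sys/select.h>
#include <sys/time.h>
#include <unistd.h>

/* Wait up to 500 ms for data on sd, then read at most one byte.
 * Returns the number of bytes read, 0 on timeout, -1 on error. */
ssize_t read_with_timeout(int sd, char *out)
{
    fd_set readfds;
    struct timeval tv = { 0, 500000 };   /* 0 s, 500000 us */

    FD_ZERO(&readfds);
    FD_SET(sd, &readfds);

    int rc = select(sd + 1, &readfds, NULL, NULL, &tv);
    if (rc <= 0)
        return rc;                       /* 0 = timeout, -1 = error */
    return read(sd, out, 1);
}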

Arduino Serial Communication not receiving entire message

I have a problem with Arduino serial communication. It's quite hard to describe, so I can't fit it in the title. Anyway, here goes:
So I have this code for my receiving end:
if (Serial1.available())
{
    while (Serial1.available())
    {
        uint8_t inByte = Serial1.read();
        inByte = inByte ^ k;
        Serial.write(inByte);
    }
    Serial.println(" done");
}
It's supposed to print everything on one line and print "done" when it's done. The Serial1.available() seems to skip the next Serial1.available(); I don't know what's going on. Anyway, here's my current, bad output:
h done
e done
l done
l done
o done
done
when it should be:
hello done
I'm sorry if this could've been phrased better, but that's all I can type now; my brain is kind of in pain. I've never experienced this behavior in a Windows C++ console application.
If you are calling that routine in loop() then yes, it will read from the serial buffer and immediately return since you are probably not sending the data fast enough.
A better way to handle this sort of thing is to use a control char which indicates the end of a message OR if you have a specific data format you expect to receive, then keep a count of the chars which have come in until the data format limit is reached.
There is a discussion here which you may find useful: Serial Duplex using Arduino. There are also example sketches that ship with the Arduino IDE (Menu: Examples: Communication).
Also, read all the entries under the Serial listing for Arduino. Good stuff there.
So the routine you develop for working with Serial input really depends on your project and the kind of data you are receiving. In your example above, if you were to use a control char, it might look like this:
while (Serial1.available()) {
    char c = Serial1.read();
    if (c == '*') {
        Serial.println(" done");
    } else {
        Serial.write(c);
    }
}
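And if your messages have a known fixed length rather than a terminator, the count-based variant might look like the sketch below (drop it into loop(); MSG_LEN is an assumed size and the XOR step from your code is omitted for brevity):
const int MSG_LEN = 5;                   // assumed message size for this example
static char msg[MSG_LEN + 1];
static int  pos = 0;

while (Serial1.available()) {
    msg[pos++] = Serial1.read();
    if (pos == MSG_LEN) {                // a whole message has arrived
        msg[pos] = '\0';
        Serial.write(msg);
        Serial.println(" done");
        pos = 0;                         // start collecting the next one
    }
}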

Simple algorithm for reliable communications

So, I have worked on large systems in the past, like an ISO stack session layer, and something like that is too big for what I need, but I do have some understanding of the big picture. What I have now is a serial point-to-point communications link where some component is dropping data (often).
So I am going to have to write my own reliable delivery system using it for transport. Can someone point me in the direction of basic algorithms, or even give a clue as to what they are called? I tried Google, but ended up with postgraduate theses on genetic algorithms and such. I need the basics, e.g. 10-20 lines of pure C.
XMODEM. It's old, it's bad, but it is widely supported both in hardware and in software, with libraries available for literally every language and market niche.
HDLC - High-Level Data Link Control. It's the protocol which has fathered lots of reliable protocols over the last 3 decades, including TCP/IP. You can't use it directly, but it is a template for how to develop your own protocol. The basic premise is:
every data byte (or packet) is numbered
both sides of the communication maintain two numbers locally: last received and last sent
every packet contains a copy of these two numbers
every successful transmission is confirmed by sending back an empty (or not) packet with the updated numbers
if a transmission is not confirmed within some timeout, send it again.
For special handling (synchronization), add flags to the packet (often only one bit is sufficient to mark the packet as special). And do not forget the CRC.
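As an illustration only (field sizes, names and the helper functions are illustrative choices, not any standard), such a frame header and a stop-and-wait send loop could look like this:
#include <stdint.h>

typedef struct {
    uint8_t flags;   /* bit 0 marks a special/control frame, the rest is reserved */
    uint8_t seq;     /* number of this frame (last sent)                          */
    uint8_t ack;     /* last frame number received from the other side            */
    uint8_t len;     /* payload length in bytes; payload and CRC follow           */
} frame_hdr_t;

/* Placeholders for whatever the transport layer provides. */
void link_send(uint8_t seq, const uint8_t *payload, uint8_t len);
int  wait_for_ack(uint8_t seq, unsigned timeout_ms);
#define TIMEOUT_MS 200

/* Stop-and-wait: transmit, wait for a frame acknowledging seq,
 * and retransmit with the same seq if the timeout expires. */
void send_reliable(const uint8_t *payload, uint8_t len)
{
    static uint8_t seq = 0;
    for (;;) {
        link_send(seq, payload, len);        /* header + payload + CRC on the wire */
        if (wait_for_ack(seq, TIMEOUT_MS))   /* peer confirmed this seq            */
            break;
        /* timeout: send the same frame again */
    }
    seq++;
}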
Neither of the protocols has any kind of session support. But you can introduce one by simply adding another layer - a simple state machine and a timer:
session starts with a special packet
there should be at least one (potentially empty) packet within specified timeout
if this side hasn't sent a packet within the timeout/2, send an empty packet
if there was no packet seen from the other side of the communication within the timeout, the session has been terminated
one can use another special packet for graceful session termination
That is as simple as session control can get.
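A sketch of that state machine, with now_ms(), send_empty_frame() and the timeout value as placeholders:
#define SESSION_TIMEOUT_MS 1000          /* illustrative value */

/* Placeholders for the underlying link. */
unsigned long now_ms(void);
void send_empty_frame(void);

static unsigned long last_rx;            /* updated whenever a frame arrives */
static unsigned long last_tx;            /* updated whenever a frame is sent */
static int session_alive = 1;

/* Call this periodically from the main loop. */
void session_tick(void)
{
    unsigned long now = now_ms();

    if (now - last_rx > SESSION_TIMEOUT_MS) {
        session_alive = 0;               /* nothing heard in time: session terminated */
        return;
    }
    if (now - last_tx > SESSION_TIMEOUT_MS / 2) {
        send_empty_frame();              /* keepalive so the peer's timer is reset */
        last_tx = now;
    }
}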
There are (IMO) two aspects to this question.
Firstly, if data is being dropped then I'd look at resolving the hardware issues first, as otherwise you'll have GIGO
As for the comms protocols, your post suggests a fairly trivial system? Are you wanting to validate data (parity, sumcheck?) or are you trying to include error correction?
If validation is all that is required, I've got reliable systems running using RS232 and CRC8 sumchecks - in which case this StackOverflow page probably helps
If some component is dropping data on a serial point-to-point link, there must be some bugs in your code.
Firstly, you should confirm that there is no problem with the physical layer's communication.
Secondly, you need some knowledge of data communication theory, such as ARQ (automatic repeat request).
Further thoughts, after considering your response to the first two answers... this does indicate hardware problems, and no amount of clever code is going to fix that.
I suggest you get an oscilloscope onto the link, which should help to determine where the fault lies. In particular, look at the baud rate of the two sides (Tx, Rx) to ensure that they are within spec... auto-baud is often a problem!
Also look to see if the drop-out is regular, or can be synced with any other activity.
On the sending side:
///////////////////////////////////////// XBee logging
void dataLog(int idx, int t, float f)
{
    ubyte stx[2] = { 0x10, 0x02 };
    ubyte etx[2] = { 0x10, 0x03 };

    nxtWriteRawHS(stx, 2, 1);
    wait1Msec(1);
    nxtWriteRawHS(idx, 2, 1);
    wait1Msec(1);
    nxtWriteRawHS(t, 2, 1);
    wait1Msec(1);
    nxtWriteRawHS(f, 4, 1);
    wait1Msec(1);
    nxtWriteRawHS(etx, 2, 1);
    wait1Msec(1);
}
On the receiving side:
void XBeeMonitorTask()
{
    int[] lastTick = Enumerable.Repeat<int>(int.MaxValue, 10).ToArray();
    int[] wrapCounter = new int[10];

    while (!XBeeMonitorEnd)
    {
        if (XBee != null && XBee.BytesToRead >= expectedMessageSize)
        {
            // read a data element, parse, add it to collection, see above for message format
            if (XBee.BaseStream.Read(XBeeIncoming, 0, expectedMessageSize) != expectedMessageSize)
                throw new InvalidProgramException();
            //System.Diagnostics.Trace.WriteLine(BitConverter.ToString(XBeeIncoming, 0, expectedMessageSize));

            if ((XBeeIncoming[0] != 0x10 && XBeeIncoming[1] != 0x02) ||  // dle stx
                (XBeeIncoming[10] != 0x10 && XBeeIncoming[11] != 0x03))  // dle etx
            {
                System.Diagnostics.Trace.WriteLine("recover sync");
                while (true)
                {
                    int b = XBee.BaseStream.ReadByte();
                    if (b == 0x10)
                    {
                        int c = XBee.BaseStream.ReadByte();
                        if (c == 0x03)
                            break; // realigned (maybe)
                    }
                }
                continue; // resume at loop start
            }

            UInt16 idx = BitConverter.ToUInt16(XBeeIncoming, 2);
            UInt16 tick = BitConverter.ToUInt16(XBeeIncoming, 4);
            Single val = BitConverter.ToSingle(XBeeIncoming, 6);

            if (tick < lastTick[idx])
                wrapCounter[idx]++;
            lastTick[idx] = tick;

            Dispatcher.BeginInvoke(DispatcherPriority.ApplicationIdle, new Action(() => DataAdd(idx, tick * wrapCounter[idx], val)));
        }
        Thread.Sleep(2); // surely we can keep up with the NXT
    }
}

converting a windows shell IStream to std::ifstream/std::get_line

We have a lot of code written that makes use of the standard template library. I would like to integrate some of our apps into the Windows shell, which should provide a better experience for our users.
One piece of integration involves a shell preview provider. The code is very straightforward; however, I'm stuck on the best way to implement something.
The shell gives me, via my preview handler, an IStream object, and I need to convert/adapt it to a std::ifstream object, primarily so that std::getline can be called further down the call stack.
I was wondering if there was a "standard" way of doing the adapting, or do I need to roll up my sleeves and code?
TIA.
Faffed around with this for a while:
std::stringstream buff;
BYTE ib[2048];
ULONG totread = 0, read = 0, sbuff = 2048;
HRESULT hr;

do {
    hr = WinInputStream->Read(ib, sbuff, &read);
    buff.write(reinterpret_cast<const char*>(ib), read);  // stringstream wants char*, not BYTE*
    totread += read;
} while ((sbuff == read) && SUCCEEDED(hr));

if (totread == 0) return false;

ifstream i;
const std::string &s = buff.str();                 // keep the copy alive while it is used as a buffer
char *ncbuff = const_cast<char*>(s.c_str());
i.rdbuf()->pubsetbuf(ncbuff, s.length());          // note: pubsetbuf on a filebuf is not guaranteed to honour this
But I didn't like having to read it all into memory only for it to be processed again.
So I implemented my preview handler using IInitializeWithFile instead.
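For the record, one way to avoid reading everything into memory is to wrap the IStream in a small std::streambuf and hand that to a std::istream, which std::getline consumes directly. A sketch (minimal error handling, not production code):
#include <windows.h>
#include <objidl.h>      // IStream
#include <istream>
#include <streambuf>

// Pulls bytes from the IStream on demand, so nothing is buffered up front.
class IStreamBuf : public std::streambuf
{
public:
    explicit IStreamBuf(IStream *stream) : stream_(stream) {}

protected:
    int_type underflow() override
    {
        ULONG read = 0;
        HRESULT hr = stream_->Read(buffer_, sizeof(buffer_), &read);
        if (FAILED(hr) || read == 0)
            return traits_type::eof();
        setg(buffer_, buffer_, buffer_ + read);      // expose what was just read
        return traits_type::to_int_type(buffer_[0]);
    }

private:
    IStream *stream_;
    char buffer_[2048];
};

// Usage: IStreamBuf sb(WinInputStream);
//        std::istream in(&sb);
//        std::string line;
//        while (std::getline(in, line)) { /* ... */ }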
