Can't get an OK response from XBee upon "+++" - Arduino

I have been trying to set up two XBees to communicate for the last three days. X-CTU seems to be the perfect option for doing so; however, it is a real menace when it comes to discovering XBees on serial ports.
I was able to detect one XBee by luck just once, and the other one never showed up. I have even replaced both my XBees. I am now trying the alternative, i.e. using a serial console to perform the configuration, but I haven't been able to receive an OK response from the device upon issuing +++.
Since I had a bad experience using a PC to communicate with ESP8266 devices in the past, I tried a workaround: use the second serial port of an Arduino to send the configuration messages, and read the response by printing it out on the default serial console.
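In case it helps, this is roughly what my test sketch looks like (a minimal sketch, assuming an Arduino Mega with the XBee wired to Serial1; the 1-second delays are the default XBee guard times around "+++"):

    void setup() {
      Serial.begin(9600);      // USB console
      Serial1.begin(9600);     // XBee, factory default is 9600 8N1
      delay(1100);             // guard time: >1 s of silence before "+++"
      Serial1.print("+++");    // no carriage return after "+++"
      delay(1100);             // guard time: >1 s of silence after "+++"
    }

    void loop() {
      // Relay both directions so I can watch for the "OK\r" reply
      if (Serial1.available()) Serial.write(Serial1.read());
      if (Serial.available())  Serial1.write(Serial.read());
    }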
It also appears that configuration messages can differ depending on the mode of the device. If it's in API mode, the frame has to be generated in a specific format (I use the X-CTU frame generator for this purpose).
Why am I not able to receive a response from the XBee upon issuing a +++?
The devices are Series 1 XBees and the exact part number is XB24-AWI-001. Any help is highly appreciated.

Have you considered that the XBee might be in API mode? You may want to reflash the device in AT mode to start playing with it.
To test whether it's in API mode, you can refer to chapter 9 of the user guide for the API frame structure:
http://eewiki.net/download/attachments/24313921/XBee_ZB_User_Guide.pdf?version=1&modificationDate=1380318639117&api=v2
Basically, a datagram in API mode starts with ~, and it's built as follows:
[0x7E|length(2B)|Command(1B)|Payload(length-1B)|Checksum(1B)]
As 0x7E is ~ on the ASCII table, you should try typing a bogus datagram in a serial terminal session like:
~ <C-d> AAAA
N.B.: <C-d> means Control-D under Unix, which is the EOF character (ASCII code 4).
Obviously such a message isn't a valid datagram, and you should receive a reply asking you to send it again. Because the EOF character is ASCII code 4, the declared length of the datagram will be 4 bytes. You then send four bogus bytes (AAAA), and the checksum byte is very unlikely to be right, so the receiver will assume the transmission has been corrupted and request the datagram again; receiving such a request tells you the device is in API mode.
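For comparison, here is a minimal sketch of how a well-formed API frame is assembled (the checksum rule, 0xFF minus the low byte of the sum of the frame-data bytes, is from the user guide linked above; the function itself is just my illustration):

    #include <stdint.h>
    #include <stddef.h>

    // Wraps `len` bytes of frame data (command byte + payload) into `out`,
    // which must have room for len + 4 bytes. Returns the total frame size.
    size_t build_api_frame(const uint8_t *data, uint16_t len, uint8_t *out) {
        out[0] = 0x7E;                    // start delimiter, '~' in ASCII
        out[1] = (uint8_t)(len >> 8);     // length MSB
        out[2] = (uint8_t)(len & 0xFF);   // length LSB
        uint8_t sum = 0;
        for (uint16_t i = 0; i < len; ++i) {
            out[3 + i] = data[i];
            sum += data[i];
        }
        out[3 + len] = 0xFF - sum;        // checksum over the frame data only
        return (size_t)len + 4;
    }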
Though I can only advise you to consider running it in API mode only (it is more reliable and offers a better API, though you cannot play around with it and understand what's going on as easily by tapping the line with a logic analyzer… still, given enough time, you'll start to read API datagrams like it's English ☺).
I wrote a page with a few resources to check on how to reflash the XBees:
https://github.com/hackable-devices/polluxnzcity/wiki/Flash-zigbee
and here is some more advice from another, totally unrelated project:
https://github.com/andrewrapp/xbee-api#documentation
And I also wrote a library (aimed at BeagleBones, but you can tweak it for your use) that handles API mode 2 with XBees:
https://github.com/hackable-devices/polluxnzcity/blob/master/PolluxGateway/include/xbee/xbee_communicator.h
https://github.com/guyzmo/polluxnzcity/blob/master/PolluxGateway/src/xbee/xbee_communicator.C
but I bet that with a little Google search you can find more widely used libraries than those, and even some meant to run on Arduinos (N.B.: that library was originally written for Arduino and then adapted to run on the BeagleBone, so reversing the operation shouldn't be hard).

Related

How do smart phones use AT commands and data connection(s)? gsm mux? multiple uarts?

I am involved in a project where we have some kind of IoT device: an NXP processor with an LTE modem on a PCB. The software running on it connects to the modem over a single UART interface. It initializes the modem through AT commands and finally makes a data call to the provider (PPP).
Then, it uses lwIP (lightweight IP) to open some MQTT subscriptions and to allow user code to make HTTP GET/POST requests to our servers.
Every 15 minutes we want to retrieve the signal strength from the modem and report it back to the server. What I do now is put the modem back in command mode, retrieve the signal strength info, go back to data mode, and resume normal operation.
The round trip from data mode to command mode and back to data mode takes several seconds (4-5 ish). This is annoying, because during that time we are not responsive to incoming commands.
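For reference, the round trip I do now is essentially the following (sketched against Arduino's Stream API just for brevity; on our NXP target these calls map onto our own UART layer, and the 1-second delays are the standard escape guard times):

    // Escape PPP data mode, read signal quality, and go back online.
    String querySignalStrength(Stream &modem) {
      delay(1000);              // >1 s of silence before the escape sequence
      modem.print("+++");       // escape from PPP data mode (no CR, no LF)
      delay(1000);              // >1 s of silence after the escape sequence
      modem.print("AT+CSQ\r");  // query signal quality
      String rssi = modem.readStringUntil('\n');   // e.g. "+CSQ: 17,99"
      modem.print("ATO\r");     // back to online data mode, PPP resumes
      return rssi;
    }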
I've read about GSM mux 07.10. By following a defined protocol, it allows you to create virtual serial ports over one physical UART. That sounds nice, although I realize this comes at a cost in performance (framing bytes are added to everything we send, whether command-mode or data-mode traffic).
The GSM mux 07.10 spec dates from 1999. I am far from an expert in mobile solutions, so I was wondering: is muxing still the way to go? How does a typical smartphone deal with this, for example? Do they include modems with more than one UART, to have parallel access to AT commands and a live internet connection? Or do they in fact still rely on GSM mux?
If somebody would be so kind as to give some insights, also on potential C libraries that implement GSM mux 07.10? It seems that TinyGSM implements it (although I can't seem to find where), and I can also find the Linux kernel driver that implements GSM mux 07.10. But that driver is written on top of the tty interfaces in Linux, so I would have to reverse engineer the kernel driver and strip out the tty parts, replacing them with my own UART implementation.
First of all, that spec number belongs to the old GSM specification numbering, so the old specs under those numbers will never be updated; only the new specifications with the new numbering scheme will. I do not remember exactly when the switch was made, but I do remember someone at work giving a presentation on 07.10 probably around 1998/1999, so the switch was probably a few years after that or around that time (and definitely before 2009).
The newer spec numbering scheme uses three digits for the first part.
So for instance the old AT command spec 07.07 is now 27.007, and the current 07.10 multiplex specification is 27.010.
The following is what I remember of 07.10.
The motivation for developing 07.10 was exactly to support the kind of scenario that you describe. Remember, back in the mid 90's, if a mobile phone had a serial interface at all, it was RS-232 through the manufacturer's proprietary connector at the bottom of the phone. One single serial interface.
However, in order to use the 07.10 mux over that serial link, you needed to install specific serial drivers in Windows with support for 07.10 (and I think maybe there were some reliability issues with them?), and for that reason 07.10 never took off and never became anything more than a rarely used solution.
Also, by the end of the 90's, additional serial interfaces like Bluetooth and IrDA became available on many phones, and later USB as well; these added more physical interfaces as well as native multiplexing within each protocol.
So the need for multiplexing over physical RS-232 became less of an issue, and whatever little popularity 07.10 ever had dwindled to virtually nothing.
Fast forward a couple of decades, and suddenly someone asks about it on Stack Overflow. Good on you :) As far as I can tell, there are no fundamental problems with using it for the purpose you present.
Modern smartphones that support AT commands will most likely have a code base for AT command parsing with roots in the 90's, which most likely includes the AT+CMUX command. Of course, manufacturers today have zero explicit wish to support it, but when it is already present, it just comes along with the collection of all the other legacy AT commands that they support.
So if the modem supports AT+CMUX you should be good to go. I have no experience or recommendation with regards to client protocol libraries.
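If you want to check support up front, the test-command form of AT+CMUX should tell you. A minimal sketch (written against Arduino's Stream API for brevity; the exact parameter list a modem reports is model-specific):

    // Ask the modem whether it implements AT+CMUX at all. A supporting modem
    // answers something like "+CMUX: (0),(0),(1-6),..." followed by OK;
    // an unsupporting one answers ERROR.
    bool modemSupportsCmux(Stream &modem) {
      modem.print("AT+CMUX=?\r");
      String reply = modem.readString();   // reads until the Stream timeout
      return reply.indexOf("+CMUX") >= 0;
    }
    // Once "AT+CMUX=0" has been answered with OK, every byte on the UART must
    // be wrapped in 07.10 frames; bare AT commands no longer work on the raw line.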

Reverse Engineering a specific bluetooth communication protocol

I have been reading answers on Stack Overflow for a while now, and this is the first time I actually have to ask a question myself:
I have a small sensing device (literally a black box) which is used during sporting activities and tracks acceleration and GPS data (not necessarily at the same frequency, according to a patent from the vendor). After a session, one can connect the device to a smartphone and import the session data to view statistics.
Now I am trying to acquire the raw data to apply some statistics of my own to it.
I know that the device connects to my phone via Bluetooth. So I activated the Bluetooth HCI snoop log following this tutorial:
http://www.fte.com/WebHelp/BPA600/Content/Documentation/WhitePapers/BPA600/Encryption/GettingAndroidLinkKey/RetrievingHCIlog.htm
I can then transfer the files to the PC, rename them to .cap files, and load them into Wireshark. This is where it gets tricky:
I have found out that the first connection is established via Bluetooth Low Energy. When the connection is established and the user has selected to download a session from the device via the app, the connection switches to a normal Bluetooth connection.
I know that the device contains a GPS and a 9-axis accelerometer including a Gyro.
Apparently the Bluetooth protocol used to transfer the data is SPP (https://en.wikipedia.org/wiki/List_of_Bluetooth_profiles#Serial_Port_Profile_.28SPP.29), which simulates an RS-232 connection.
I have attached a screenshot from Wireshark showing a reassembled data packet. I do not know what it contains, and the rendering from Wireshark does not make any sense to me. The frame content is displayed in the bottommost tab; the left side is the raw hex transmission, the right side shows the rendered version. It looks neither like any GPS sentence (http://www.gpsinformation.org/dale/nmea.htm) nor like any accelerometer data:
The general setting is an encryption-less connection, but at some stage the host and controller try to switch to encryption, although this never gets transmitted to the peripheral slave (as far as I can see). I am wondering how to make sense of this data: is there a way for me to find out whether encryption is activated, and if it is, is it logged and can I retrieve the key from this log?
Can anyone help me to figure out the data here or tell me where I can find some hints about whether it is encrypted or not?
Edit:
I have added a screenshot of the first SPP transmission packet. The packet in question and the payload are marked in black. It seems to contain some information about the device and other configuration settings, or initial values for the sensors, at the beginning. I suspect the app and the device have settled on a proprietary scrambling or encryption scheme, since there are readable values at the beginning, but not after the black box marked in the image. My suspicion is that Bluetooth encryption is not being used at all, and that I therefore stand no chance of decrypting the information. Can someone confirm or deny this suspicion?
where I can find some hints about whether it is encrypted or not?
What you see in Wireshark is the HCI interface (commands and events) between Host and Controller. Since encryption is done in the controller (see Bluetooth Core spec. Vol. 1 Part A Section 5.4.3), what you see is unencrypted data.
Can anyone help me to figure out the data here
It's hard to tell from your single screenshot. I suggest you take a look at the RFCOMM specification, Figure 6.1 in particular: in the Information field you should find your data.
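If you'd rather script the extraction than click through Wireshark, here is a rough sketch (my own illustration, not from the spec) of pulling the Information field out of a raw RFCOMM frame, following the Figure 6.1 layout (Address, Control, Length, Information, FCS; credit-based flow-control octets are ignored here):

    #include <stdint.h>
    #include <stddef.h>

    // Returns a pointer to the Information field inside `frame` and writes its
    // length to *out_len, or NULL if the buffer is too short to be a frame.
    const uint8_t *rfcomm_payload(const uint8_t *frame, size_t n, size_t *out_len) {
        if (n < 4) return NULL;
        size_t pos = 2;                      // skip the Address and Control octets
        size_t len = frame[pos] >> 1;        // Length field, EA flag in bit 0
        if ((frame[pos] & 1) == 0) {         // EA = 0: a second length octet follows
            len |= (size_t)frame[pos + 1] << 7;
            ++pos;
        }
        ++pos;                               // first Information octet
        if (pos + len + 1 > n) return NULL;  // + 1 for the trailing FCS octet
        *out_len = len;
        return frame + pos;
    }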

Arduino & ESP8266 - strange characters in response

I'm working on an Arduino Uno + ESP8266 project.
I'm trying to use them as a web server on a Wi-Fi network, to control a motor connected to the Arduino - basically a trigger system that receives signals via Wi-Fi. Currently, I've successfully connected the ESP8266 to my access point by sending AT commands from the Arduino. Another client on the same network can statically access the ESP8266's assigned IP address.
However, when I try to catch HTTP queries (which I want to use as conditions to control the motor), I occasionally encounter non-ASCII characters in the HTTP request. I use serial communication to debug; please look at the screenshot in the link below:
Arduino - Computer serial communication for debugging
The line ",519:POST ..." should contain a complete number following "/?", but there's some strange characters instead. So I cannot determine the input data to control motor. Once in a blue moon, the expected format of request shows up as follows:
The correct data received
There's no issue with the HTTP response part: even when I get an uninterpretable request, I can still send the JSON error message back to the client.
Attempt Note:
The Arduino uses different serial ports to talk to the computer and to the ESP8266. Since the connection can be established and data is being sent, I believe the baud rate is correct on both sides (115200 for the ESP8266, 9600 for the computer; I also tried 115200 for both and got the same result).
I use the 3.3 V output from the Arduino as the power source for the ESP8266, but I also use a voltage regulator to smooth out the supply, as many people suggest. The problem still remains.
I've struggled with this issue for a few days and just want to know if anybody has had a similar experience, or could give me a clue for the next step.
After a considerable effort to stabilize the circuit, I switched to a NodeMCU and got the system working perfectly. I assume that the ESP8266 alone is somehow not robust enough without supporting components, of which I unfortunately have no knowledge.
So I'd like to close this thread with a short recommendation for anybody struggling with the same issue: switch to a NodeMCU (which replaces both the Arduino and the ESP8266), if it can support your requirements.

Understanding serial device settings

Please feel free to slap me and send a link if this question has already been answered; I just couldn't find it. I did search though.
I've been troubleshooting communication with a serial device. After looking over lots of documentation, I now understand what the settings for "baud rate," "data bits," "stop bit," and "parity" mean. But what I can't seem to understand is who (sender or receiver) determines these settings.
Say I have a serial device plugged into my computer. In my code, I open a connection to the serial port and specify something like 9600,8,E,1. When I specify these settings, do they get sent to the sender itself, so that it knows how to send the data to my receiver? Or is it more common for a sender to expect a receiver to comply with fixed settings?
The issue I'm having is that I attempted to use "Even" parity, and that resulted in tons of irregular transfer errors. When I use "Odd" parity, however, those errors go away. There is also a USB-to-serial adapter involved in my setup, and there aren't any transfer errors with Even or Odd parity without the adapter in the middle. So I'm having a hard time understanding whether the device itself doesn't support sending with Even parity, or whether the adapter is the thing causing trouble, etc.
Thanks.
When I specify these settings, do these get sent to the sending itself, so that it knows how to send the data to my receiver?
No.
To expand on the comment by Hans Passant, both sides of the serial port have to agree on the settings, otherwise they won't talk to each other. If they don't agree, you will get gibberish data on either side as the hardware will read the data at an incorrect time. The settings are normally documented in the manual for the device that you are attempting to communicate with. For example, to communicate with a Cisco router, you will generally use the following settings:
Bits per sec : 9600
Data bits : 8
Parity : none
Stop bits : 1
Flow control : none
When you set up the serial port on your side, you must use these same settings; there is no hardware-level handshake between the two devices that determines the speed at which they will communicate.
Sometimes, the format for the serial port settings may be given in a format like the following:
9600,8,N,1
Which is just shorthand for the settings quoted above (9600 baud, 8 data bits, no parity, 1 stop bit).
In my experience, most devices default to 9600,8,N,1; the next most common serial setting is 115200,8,N,1.
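For example, on an Arduino (where the second argument of Serial.begin() picks the frame format), "agreeing on the settings" is nothing more than stating on your side exactly what the device's manual specifies:

    void setup() {
      // Both devices simply state the same settings; nothing is negotiated.
      Serial.begin(9600, SERIAL_8E1);  // 9600 baud, 8 data bits, even parity, 1 stop bit
      // Serial.begin(9600);           // equivalent to SERIAL_8N1, the usual default
    }

    void loop() {}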

Does chrome.serial API ensure data integrity?

I'm trying to understand whether it's redundant for me to include some kind of CRC or checksum in my communication protocol. Does the chrome.serial API ensure data integrity? And what about Chrome's other hardware communication APIs in general, if anyone can speak to them (e.g. chrome.hid, chrome.bluetoothLowEnergy, ...)?
Serial communications is simply a way of transmitting bits, and its major reason for existence is that it's one bit at a time -- it can therefore work over just a single communications link, such as a simple telephone line. There's no built-in CRC or checksum or anything.
There are many systems that live on top of serial comms that attempt to deal with the fact that communication often takes place in a noisy environment. Back in the day of modems over telephone lines, you might have to deal with the fact that someone else in the house might pick up another extension on the phone line and inject a bunch of noise into your download. Thus, protocols like XMODEM were invented, wrapping serial comms in a more robust framework. (Then, when XMODEM proved unreliable, we went to YMODEM and ZMODEM.)
Depending on what you're talking to (for example, a device like an Arduino connected to a USB serial port over a wire that's 25 cm long), you might find that putting the work into checksumming the data isn't worth the trouble, because the likelihood of interference is so low and the consequences are trivial. On the other hand, if you're talking to a controller for a laser weapon, you might want to make sure the command you send is the command that's received.
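If you do decide to add one, even the simplest scheme catches most line noise. A sketch of a classic additive 8-bit checksum (my own illustration; nothing the chrome.serial API provides):

    #include <stdint.h>
    #include <stddef.h>

    // Two's-complement additive checksum: appending checksum8(packet) to the
    // packet makes the byte sum of the whole thing come out to zero.
    uint8_t checksum8(const uint8_t *data, size_t n) {
        uint8_t sum = 0;
        for (size_t i = 0; i < n; ++i) sum += data[i];
        return (uint8_t)(~sum + 1);
    }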
I don't know anything about the other systems you mention, but I'm old enough to have spent a lot of time doing serial comms back in the '80s (and now doing it again for devices using chrome.serial, go figure).
I'm using Chrome's serial API to communicate with Arduino devices, and I have yet to experience random corruption in the middle of an exchange (my exchanges are short bursts, 50-500 bytes max). However, I do see garbage bytes blast out if a connection is flaky or a cable is "rudely" disconnected (like a few minutes ago when I tripped over the FTDI cable).
In my project, a mis-processed command won't break anything, and I can get by with a master-slave protocol. Because of this, I designed a pretty slim solution: the Arduino slave listens for an "attention byte" (!) followed by a command byte, after which it reads a fixed number of data bytes depending on the command. Since the Arduino discards input until it hears an attention byte and a valid command, the breaking errors usually occur when the connection is cut while a slave is "awaiting x data bytes". To account for this, the first thing the master does on connect is blindly blast out enough attention bytes to push the Arduino through "awaiting data" even in the worst-case scenario. Crude, yet sufficient.
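A rough sketch of the slave side of that scheme (the command bytes and their payload sizes here are made up; mine differ):

    const char ATTENTION = '!';

    // Hypothetical command set: each command byte implies a fixed payload size.
    int payloadLength(char cmd) {
      switch (cmd) {
        case 'M': return 2;    // e.g. set motor speed, 2 data bytes
        case 'L': return 1;    // e.g. set LED state, 1 data byte
        default:  return -1;   // unknown byte: keep discarding
      }
    }

    void setup() { Serial.begin(9600); }

    void loop() {
      if (Serial.available() < 2) return;
      if (Serial.read() != ATTENTION) return;     // discard until '!'
      int len = payloadLength((char)Serial.read());
      if (len < 0) return;                        // not a valid command byte
      char buf[8];
      if ((int)Serial.readBytes(buf, len) == len) {
        // ...dispatch the command with its payload...
      }
    }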
I realize my solution is pretty lo-fi, so I did a bit of surfing around and I found this post to be pretty comprehensive: Simple serial point-to-point communication protocol
Also, if you need a strategy for error correction rather than error detection and retransmission (or rather than my strategy, which I guess is "error brute-forcing"), you may want to check out the link to a technique called Hamming coding near the bottom of that thread - that one looked promising!
Good luck!
-Matt
