I am trying to connect a joystick to my PC. The device has a serial cable which is supposed to plug into another device (a steering wheel), but instead I want to use it directly. Eventually I plan to connect it to a program that I am currently writing.
How can I communicate with this device? I know how to talk to a serial device; I just don't understand what I should send and what I should expect to receive.
Debugging attempts:
If I connect it directly to my PC's serial port and use a program such as moserial, I can sort of communicate with it: if I pound on the keyboard it returns (usually) one byte plus "=\n" per byte sent. The return codes do not always seem to correspond to the state of the joystick; it seems to return the same response for the same input regardless of the joystick's position. Once or twice I noticed that if I send a large amount of random data I can get it to hang until I move the joystick. But for the most part the responses seem nonsensical (the same output per input regardless of state). I also tried a serial-to-USB converter, which gave a different result: over USB the device does pretty much nothing regardless of the baud rate, although if I send an enormous number of random keypresses I occasionally get a single unprintable character in response.
I was expecting to get a continuous stream of numbers corresponding to the state of the joystick.
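For reference, this is roughly the kind of probe script I've been poking at it with (Python with pyserial; the port name and the list of baud rates are just guesses on my part):

# Rough probe script (pyserial); adjust PORT to wherever the adapter shows up.
import time
import serial

PORT = "/dev/ttyUSB0"
BAUD_RATES = [1200, 2400, 4800, 9600, 19200, 38400, 57600, 115200]

for baud in BAUD_RATES:
    with serial.Serial(PORT, baudrate=baud, timeout=0.5) as port:
        print("--- trying", baud, "baud ---")
        # Wiggle the stick while this runs and watch whether the bytes change.
        end = time.time() + 3
        while time.time() < end:
            data = port.read(64)
            if data:
                print(data.hex(" "))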
Summary:
I don't know if my direct serial connection is just showing noise; I did try a second serial-to-USB converter, which had the same results. Any ideas or suggestions on how to determine how to communicate with this device?
Related
I'm trying to send data from a C# app to an STM32 dev board via USB CDC. I have a problem with SerialPort.Write(): the write method seems to send only the first character instead of all the characters of the array or string. The same thing happens with serial terminals (Termite, CuteCom, ...) when I attempt to send a string. Can someone explain to me how USB CDC really works? Does the STM32 send data in the same manner, character by character in each frame? I haven't used a logic analyzer to look at the signals, but maybe that will be the next step.
If anyone has a picture of a USB CDC frame, please share.
Sending a string from the STM32 dev board to the C# app works fine. Any ideas are welcome.
The C# SerialPort.Write function is capable of writing an entire string at once if you call it the right way. The bytes of the string will be passed on to your operating system's USB drivers, which will split it up into packets according to the max packet size for the endpoint specified in your device's USB descriptors. Then those packets will be sent to the device, one packet at a time.
I suspect that there is a bug in your STM32 code, but since you have not posted an MCVE it will be difficult to help you debug any code.
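In the meantime, if you want to rule out the host side entirely, a quick cross-check from a different stack can help. Here is a minimal sketch using Python and pyserial (the port name /dev/ttyACM0 and the baud rate are assumptions; a CDC ACM device largely ignores the baud rate anyway). If the board still reacts only to the first character, the problem is almost certainly in the STM32 receive code rather than in SerialPort.Write:

# Hypothetical host-side cross-check with pyserial; the port name is an assumption.
import serial

with serial.Serial("/dev/ttyACM0", baudrate=115200, timeout=1) as port:
    # The whole buffer is handed to the OS driver in one call, which then
    # splits it into USB packets according to the endpoint's max packet size.
    port.write(b"hello from the host\r\n")
    print(port.read(64))   # print whatever the STM32 echoes back, if anything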
I'm working on an Arduino Uno + ESP8266 project.
I'm trying to use them as a web server on a Wi-Fi network to control a motor connected to the Arduino - basically a trigger system that receives signals via Wi-Fi. Currently, I've successfully connected the ESP8266 to my access point by sending AT commands from the Arduino. Another client on the same network can access the ESP8266's statically assigned IP address.
However, when I try to catch some HTTP queries (I want to use them as conditions to control the motor), I occasionally encounter non-ASCII characters in the HTTP request. I use serial comms to debug; please look at the screenshot in the link below:
Arduino - Computer serial communication for debugging
The line ",519:POST ..." should contain a complete number following "/?", but there's some strange characters instead. So I cannot determine the input data to control motor. Once in a blue moon, the expected format of request shows up as follows:
The correct data received
There's no issue with the HTTP response part: even when I get an uninterpretable request, I can still send the JSON error message back to the client.
Attempt Note:
The Arduino uses different serial ports to talk to the computer and to the ESP8266. Since the connection can be established and data is being sent, I believe the baud rate is correct on both sides (115200 for the ESP8266, 9600 for the computer - I also tried 115200 for both and got the same result).
I use the Arduino's 3.3 V output as the power source for the ESP8266, and I also use a voltage regulator to smooth out the supply, as many people suggest. The problem still remains.
I've struggled with this issue for a few days; I just want to know if anybody has had a similar experience, or could give some clue for the next step.
After a considerable effort to stabilize the circuit, I switched to a NodeMCU and got the system working perfectly. I assume the bare ESP8266 is somehow not robust enough without other supporting components, of which I unfortunately have no knowledge.
So I'd like to close this thread with a short recommendation for anybody struggling with the same issue: switch to a NodeMCU (which replaces both the Arduino and the bare ESP8266), if it can support your requirements.
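For completeness, here is a rough sketch of what the receiving end can look like on the NodeMCU (assumptions: MicroPython firmware rather than the stock Lua firmware, Wi-Fi already joined to the access point, and GPIO5 driving the trigger - none of this is exactly what I ran):

# Rough MicroPython sketch for the NodeMCU; pin choice and request format are assumptions.
import socket
from machine import Pin

motor = Pin(5, Pin.OUT)

server = socket.socket()
server.bind(("0.0.0.0", 80))
server.listen(1)

while True:
    client, _ = server.accept()
    request = client.recv(1024).decode()
    # The request line looks like: "POST /?123 HTTP/1.1"
    first_line = request.split("\r\n", 1)[0]
    value = ""
    if "/?" in first_line:
        value = first_line.split("/?", 1)[1].split(" ", 1)[0]
    motor.value(1 if value == "1" else 0)   # made-up rule: drive the pin when "1" arrives
    client.send(b"HTTP/1.1 200 OK\r\n\r\n{\"value\": \"" + value.encode() + b"\"}")
    client.close()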
I have been trying to set up two XBees to communicate for the last three days. X-CTU seems to be the perfect option for doing so; however, it is a real menace when it comes to discovering XBees on serial ports.
I was able to detect one XBee by luck just once, and the other one never showed up. I have even replaced both my XBees. I am now trying to figure out an alternative, i.e. using a serial console to perform the configuration. I haven't been able to receive an OK response from the device upon issuing +++.
Since I haven't had a good experience using a PC to communicate with ESP8266 devices in the past, I tried to work around the problem by using the second serial port of an Arduino to send the configuration messages and reading the response by printing it out on the default serial console.
It also appears that configuration messages can differ depending on the mode of the device: if it's in API mode, the frame has to be generated in a specific format (I use the X-CTU frame generator for this purpose).
Why am I not able to receive a response from the XBee upon issuing a +++?
The devices are Series 1 XBees and the exact part number is XB24-AWI-001. Any help is highly appreciated.
Have you considered that the XBee might be in API mode? Maybe you should consider reflashing the device in AT mode to start playing with it.
To test whether it's in API mode, you can refer to the guide (chapter 9) for the API mode frame structure:
http://eewiki.net/download/attachments/24313921/XBee_ZB_User_Guide.pdf?version=1&modificationDate=1380318639117&api=v2
Basically, a datagram in API mode starts with ~, and it's built as follows:
[0x7E|length(2B)|Command(1B)|Payload(length-1B)|Checksum(1B)]
As 0x7E is ~ on the ASCII table, you should try typing a bogus datagram in a serial terminal session like:
~ <C-d> AAAA
N.B.: <C-d> means Control-D under Unix, which is the EOF character.
Obviously such a message isn't likely to work, and you will receive a reply asking you to send that datagram again. That's because the EOF character is ASCII code 4, so the length of the datagram will be read as 4 bytes. You then send four bogus bytes; the checksum will be A, which is very unlikely to be right, so the receiver will assume the transmission has been corrupted and will ask for the datagram again, meaning you will receive a datagram making that request.
Though I can only advise you to consider running it in API mode (more reliable and a better API, but you cannot play around with it and understand what's going on as easily by tapping on the line with a logic analyzer… though, given enough time, you'll start to read API datagrams as if they were English ☺).
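For reference, here's a small Python sketch of how a well-formed API frame (API mode 1, without escaping) is assembled, following the layout above; the payload is just an example AT command:

# Small sketch of building an XBee API-mode frame (API mode 1, no escaping).
# The example payload is an AT command query for "VR" (firmware version).
def build_frame(frame_data: bytes) -> bytes:
    length = len(frame_data)
    checksum = 0xFF - (sum(frame_data) & 0xFF)
    return bytes([0x7E, (length >> 8) & 0xFF, length & 0xFF]) + frame_data + bytes([checksum])

# 0x08 = AT command frame type, 0x01 = frame ID, b"VR" = the command itself
frame = build_frame(bytes([0x08, 0x01]) + b"VR")
print(frame.hex(" "))   # -> 7e 00 04 08 01 56 52 4e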
I wrote a page with a few resources to check on how to reflash the XBees:
https://github.com/hackable-devices/polluxnzcity/wiki/Flash-zigbee
and here's some other advice from another, totally unrelated project:
https://github.com/andrewrapp/xbee-api#documentation
And I also wrote a library (aimed at BeagleBones, but you can tweak it for your use) that handles API mode 2 with XBees:
https://github.com/hackable-devices/polluxnzcity/blob/master/PolluxGateway/include/xbee/xbee_communicator.h
https://github.com/guyzmo/polluxnzcity/blob/master/PolluxGateway/src/xbee/xbee_communicator.C
but I bet that with a little Google search you can find more widely used libraries than those, and even some aimed at running on Arduinos (N.B.: that library was originally written for Arduinos and then adapted to run on the BeagleBone, so reversing the operation shouldn't be hard).
Please feel free to slap me and send a link if this question has already been answered; I just couldn't find it. I did search though.
I've been troubleshooting communication with a serial device. In looking over lots of documentation, I now understand what the settings for "baud rate," "data bits," "stop bit," and "parity" mean. But what I can't seem to understand is who (sender or receiver) determines these settings.
Say I have a serial device plugged into my computer. In my code, I open a connection to the serial port and specify something like 9600,8,E,1. When I specify these settings, do they get sent to the sending device itself, so that it knows how to send the data to my receiver? Or is it more common for a sender to expect a receiver to comply with strict settings?
The issue I'm having is that I attempted to use "even" parity, and that resulted in tons of irregular transfer errors. When I use "odd" parity, however, those errors go away. There is also a USB-to-serial adapter involved in my setup. There aren't any transfer errors with even or odd parity without the adapter in the middle. So I'm having a hard time understanding whether the device itself doesn't support sending with even parity, or whether the adapter is causing the trouble, etc.
Thanks.
When I specify these settings, do they get sent to the sending device itself, so that it knows how to send the data to my receiver?
No.
To expand on the comment by Hans Passant, both ends of the serial link have to agree on the settings; otherwise they won't talk to each other. If they don't agree, you will get gibberish data on either side, as the hardware will sample the data at the wrong times. The settings are normally documented in the manual for the device that you are attempting to communicate with. For example, to communicate with a Cisco router, you will generally use the following settings:
Bits per sec : 9600
Data bits : 8
Parity : none
Stop bits : 1
Flow control : none
When you set up the serial port on your side, you must use these same settings; there is no hardware-level handshake between the two devices that determines the speed at which they will communicate.
Sometimes, the serial port settings may be given in a shorthand format like the following:
9600,8,N,1
which is just shorthand for the settings quoted above (9600 baud, 8 data bits, no parity, 1 stop bit).
In my experience, most devices default to 9600,8,N,1; the next most common serial setting is 115200,8,N,1.
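For example, with Python and pyserial, matching the documented settings looks like this (the port name is an assumption); if the device's manual calls for even parity, that's what you pass, and whether the USB-to-serial adapter handles it correctly is a separate question:

# Opening the port with explicitly matched settings (pyserial); "COM4" is an assumed port name.
import serial

port = serial.Serial(
    port="COM4",
    baudrate=9600,
    bytesize=serial.EIGHTBITS,
    parity=serial.PARITY_NONE,   # PARITY_EVEN / PARITY_ODD if the device's manual says so
    stopbits=serial.STOPBITS_ONE,
    timeout=1,
)
print(port.read(16))
port.close()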
I am trying to write a LabVIEW program that takes input from a thermal sensor on an RS-232 serial port, applies some basic transformation to it, and displays it on screen.
I'm wondering if it is possible to somehow simulate the sensor in LabVIEW, or by using some external simulator application, so I can test my program before I'm given access to the actual hardware.
Is this possible?
I have LabVIEW 2011.
The quickest way to test your VI's logic would be to make a CSV file of example data, and temporarily replace the section that reads from the sensor with a section that reads data values from the CSV file at the same rate.
It's probably not worth trying to emulate the serial port input at a lower level, as LabVIEW is generally very reliable at getting data from hardware into your VI - it's up to you then what you do with it!
You could have another program simulate the sensor and write to a different COM port. Then you could connect these COM ports with a null modem cable.
In order to do so, you'll have to work out how your sensor works and feed data in an appropriate format into that second COM port. That data will end up being received by the first COM port and, eventually, by your application-to-be-developed.
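As a sketch of what that simulator program could look like (Python with pyserial, assuming a virtual COM port pair such as COM5/COM6 created with a tool like com0com, and a made-up ASCII output format - your real sensor's format will differ):

# Hypothetical sensor simulator: writes one fake temperature reading per second
# to one end of a virtual COM port pair; your VI reads the other end.
import time
import random
import serial

with serial.Serial("COM6", baudrate=9600, timeout=1) as port:
    while True:
        reading = 20.0 + random.uniform(-0.5, 0.5)   # made-up value around 20 degrees
        port.write("{:.2f}\r\n".format(reading).encode())
        time.sleep(1.0)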
If you encapsulate all of your communications code in a subVI or set of subVIs, separate from the code that does the transformation and display part, you can easily substitute test code and test data for the real sensor data. You could write a subVI that generates the test data and replace it with the real sensor comms subVI later, or you could use a case structure in the subVI to choose between communicating with the real sensor and just outputting test data (which, as Moray suggests, you could read in from a file so that you can easily change it).
I would suggest that you write separate subVIs for opening communications to the sensor, getting a data point from it, and closing the comms port when you are finished (though you can probably just use the serial or VISA close function for that). Chain these VIs together using the comms port (aka VISA session) and error wires. The 'open' VI could take an input that specifies whether real/simulated data is to be used and store that choice in a global variable (or a functional global VI) which the 'get data' VI checks each time it is called.
glglgl's suggestion of sending the simulated data from another serial port is also good; all you need to do here is use the serial send and receive functions in some sort of loop to do the same thing as the real sensor would, in terms of receiving commands and sending an output back. This has the advantage that you don't need to make any changes to your main program which should work exactly the same whether it's connected to the real sensor or the simulation program. However, problems with serial comms in the real world often result from instruments or devices that don't do exactly what their specifications claim they do, so just because your program works perfectly with your simulation doesn't guarantee it'll work perfectly with the real sensor if the real sensor does something you didn't expect :-)
Though the other answers offered some really great ideas, I've found an easier way to simulate sensor input that would be convenient for beginners.
1. Create virtual serial ports on your computer using a virtual port simulator: http://www.eltima.com/products/vspdxp/
2. Get a Modbus simulator: http://www.plcsimulator.org/
3. Download the LabVIEW Modbus library: http://zone.ni.com/devzone/cda/epd/p/id/4756
4. Open the LabVIEW Modbus library and run 'MB Serial Master Example.vi'.
5. Now it should be possible to read/write values into the simulator using the example program.
The block diagram of the example program can be analyzed to find out how data is transferred behind the scenes over the Modbus protocol.