Receiving data from multiple devices using parallel wired RS232 - serial-port

I'm currently developing a small application for monitoring the power / current our solar collector is generating.
The array is connected to 3 inverters. Each inverter has an RS232 interface, transmitting one line of information (its current status) every 10 seconds.
Since I want to do the monitoring with a device that has only one serial port, I need to come up with a way to read the data from all of the inverters in parallel.
I don't need to send anything to any of the inverters!
Is it possible to just connect the 3 RS232 wires in parallel to one serial port? Collisions will be pretty unlikely, since every inverter transmits only 64 bytes per 10 seconds, ending with a newline, so I could check for unexpected line lengths to detect collisions.

I'm sort of chuckling at the doomsday and wacky answers that so often pop up on Stack Overflow...
But anyway, in years gone by I have used paralleled RS-232 transmit lines, OR'd together with diodes, and it can work fine in situations where collisions are unlikely. In one particular application where I used this technique there were two input terminals from which a user could key in simple commands to control the system (a specialized security system), and it was very unlikely that two people would try to control it at the same time from the two different terminals. Amazingly enough, there were no problems with voltage levels with most RS-232A receivers I tested at the time, and they tolerated the signal characteristics (no negative voltage) that result from simply putting diodes in series with the TXD signals. However, if I had to do this again I would likely add a simple pull-down resistor and capacitor to ground, with a diode between RXD and the cap in a sort of charge-pump configuration, or a pull-down to a negative-going handshake signal, to ensure the OR'd input signal goes truly negative, since the RS-232 spec defines +3 V to -3 V as invalid.
In any case, I would recommend against this technique except in very specific, limited, non-mission-critical cases, and I would not use it where you have multiple devices sending information at a programmed interval, as in the OP's case, or where there is a software handshake.
It can be a simple solution to the problem of not having enough serial input ports, but only in a very limited set of environments.

No, you should NOT connect 3 serial output ports in parallel. If you do that, you are probably going to break the RS232 output circuitry of your inverters.
You have 3 RS232 outputs, so you need 3 RS232 inputs; then you can manage those 3 inputs the way you like: for example, you can buffer the data from each input and re-output it on a single RS232 output connected to your monitoring device... but you should add something to the data stream to differentiate the data coming from the 3 inverters.
Maybe you can use some kind of IC that does the job for you; I'm not sure, but ICs that multiplex multiple RS232 inputs onto a single RS232 output may already exist.
Try this search on Google: rs232 port input multiplexer
Or, if the monitoring device is a Windows computer, you can use 3 serial-to-USB converters: they will create 3 virtual COM ports on your computer, and you can read data from them with any software.
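For example, a small reader that tags every newline-terminated status line with the port it came from could look roughly like the sketch below (shown here for a POSIX-style host with /dev/ttyUSB* device names; on Windows you would open COM ports instead, the idea is the same). The paths and the 9600 baud raw settings are assumptions, not something the inverters dictate:

// Minimal sketch: read newline-terminated status lines from three serial
// ports and tag each line with its source. Device paths and baud rate
// are assumptions -- adjust them to the real inverters.
#include <cstdio>
#include <string>
#include <fcntl.h>
#include <termios.h>
#include <unistd.h>
#include <sys/select.h>

static int openPort(const char* path) {
    int fd = open(path, O_RDONLY | O_NOCTTY);
    if (fd < 0) { perror(path); return -1; }
    termios tio{};
    tcgetattr(fd, &tio);
    cfmakeraw(&tio);              // raw mode: no line editing by the driver
    cfsetispeed(&tio, B9600);     // assumed baud rate
    cfsetospeed(&tio, B9600);
    tcsetattr(fd, TCSANOW, &tio);
    return fd;
}

int main() {
    const char* paths[3] = { "/dev/ttyUSB0", "/dev/ttyUSB1", "/dev/ttyUSB2" };
    int fds[3];
    std::string lines[3];         // per-port accumulation buffers
    for (int i = 0; i < 3; ++i) fds[i] = openPort(paths[i]);

    for (;;) {
        fd_set rs;
        FD_ZERO(&rs);
        int maxfd = -1;
        for (int i = 0; i < 3; ++i) {
            if (fds[i] < 0) continue;
            FD_SET(fds[i], &rs);
            if (fds[i] > maxfd) maxfd = fds[i];
        }
        if (select(maxfd + 1, &rs, nullptr, nullptr, nullptr) <= 0) continue;

        for (int i = 0; i < 3; ++i) {
            if (fds[i] < 0 || !FD_ISSET(fds[i], &rs)) continue;
            char buf[256];
            ssize_t n = read(fds[i], buf, sizeof buf);
            for (ssize_t k = 0; k < n; ++k) {
                if (buf[k] == '\n') {        // one complete status line
                    printf("inverter %d: %s\n", i + 1, lines[i].c_str());
                    lines[i].clear();
                } else if (buf[k] != '\r') {
                    lines[i] += buf[k];
                }
            }
        }
    }
}

Since every inverter then has its own port and file descriptor, there is no collision problem at all, and a garbled line on one port cannot corrupt the others.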
Update
About the idea of protecting the output circuitry with diodes to block current flowing back into the other outputs, I don't think it's going to work...
Many years have passed since I last used an RS232 link at a low level (so maybe I'm wrong), but I think there is some kind of handshake going on between the RS232 input and output ports (speed to use, parity, stop bits...).
Each RS232 port has input and output signals, both for data and for transmission control, so your multiple RS232 outputs also have some input signals, and your single RS232 input also has some outputs.
This means that your monitoring RS232 input port is going to try to handshake with 3 RS232 ports at the same time... and the 3 RS232 ports will probably respond at the same time... so I think it's not going to work.
Other than that, if you place diodes on your outputs you are going to lose about 0.7 V; I don't remember the tolerance on RS232 signal levels, but that 0.7 V may be relevant.

Related

The elegant way to handle ADCs with DMA in a RTOS

I'm currently setting up Azure RTOS (ThreadX on an STM32), with Ethernet, SPI and ADCs activated.
This STM32 has to pass configuration information through from time to time, coming from my PC over the Ethernet port.
It has to pass this information via SPI to two other STM32s, which makes the first STM32 the system controller / system interface. This will be a low-priority task, since activation of the passed configuration will be started by sync lines running from the system controller to the two other STMs.
While doing so, the system controller has to read in ADC values constantly and pass them via Ethernet / TCP to my computer.
I've used the ThreadX TCP server example, as given by STM, as a starting point.
From there I've managed to set up three servers on three ports, communicating successfully with a Python script on my PC (as a first test).
Now come the two great questions:
1)
Since my input signal may contain frequencies up to 2.5 MHz, I want to digitize it at the full 5 MSPS (Nyquist), which ADC3 is capable of.
The smallest internally available data type at full resolution is uint16_t, which makes the data rate R = 16 bit x 5 MSPS = 80 Mbit/s (worst case; I bet there is optimization possible, e.g. 8-bit resolution, which halves the data rate, but that resolution might not be enough; or 16 bits with an FFT afterwards, which would also be sufficient, since I'm mostly interested in the energy per frequency band, but initially I wanted to do that on my computer, for best flexibility).
Even if the Ethernet interface is capable of 100 Mbit/s, the TCP layer of NetXDuo, I bet, is not.
(There is also USB OTG available on this board, but since networked devices are in my opinion more versatile, I prefer using Ethernet... nevertheless, USB might be a backup solution.)
From my measurements, a data stream transmitted to the uC via TCP from within Python, and mirrored back to my PC within a thread, allows for relatively consistent 20 Mbit/s.
... How do I push this speed to a better level?
(I think 20 Mbit/s is the back-and-forth data rate, so one-way may be faster.)
However, on to the second question:
2)
The ADC within the STM is capable of storing data via DMA to memory.
There are two callbacks available, one at half-full, one at full buffer state.
My problem is mostly about the way of reading out the DMA and/or triggering the conversion in the first place.
How do you do this the "right" way on an RTOS (such that you don't break the RT in RTOS)?
I see some options here, what are the pros/cons you can think of?
a) Let the ADC run freely, calling the callbacks at the respective fill levels, and trigger a TCP transmission whenever one of the callbacks fires
-> may lead to glitches due to insufficient speed of the TCP layer, in my opinion.
b) Let the ADC conversion be triggered by a thread, which is preempted and will later TCP-transmit the data as soon as the memory buffer is full
-> may lead to inconsistency in the converted values, since you get burst-style conversions with gaps in between while the buffer is read out
c) Let a thread trigger each conversion individually
-> A no-go, I think, since threads are not scheduled often enough to get a decent sample frequency
d) Let a free-running ADC trigger callbacks, let a thread do the FFT, and transmit the data via TCP within another thread
-> May work, but is less flexible, since the data gets crunched within the uC.
--> Are there other ways you can think of / what do you think about the ways I named here?
--> What do you think about question 1)?
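For concreteness, here is a rough sketch of what I mean with option a) / d): a free-running ADC with circular DMA, callbacks that only post which buffer half is ready, and a worker thread that does the crunching / TCP hand-off. The HAL and ThreadX calls are the standard ones, but the buffer size, queue depth and the process_block() consumer are placeholders, not a tested solution:

// Sketch of option a)/d): free-running ADC3 + circular DMA. The ISR-level
// callbacks only post which half of the buffer is ready; a worker thread
// does the FFT / TCP work. Sizes and names are placeholders.
#include "tx_api.h"
#include "stm32h7xx_hal.h"                       // assumed target family

#define ADC_BUF_SAMPLES 8192                     // whole DMA buffer (two halves)
static uint16_t adcBuf[ADC_BUF_SAMPLES];

static TX_QUEUE adcQueue;                        // carries "which half is ready"
static ULONG adcQueueStorage[16];

extern ADC_HandleTypeDef hadc3;                  // set up by CubeMX / HAL init

void process_block(uint16_t* samples, uint32_t count);  // hypothetical consumer (FFT / TCP send)

void adc_capture_start(void)
{
    tx_queue_create(&adcQueue, (CHAR*)"adc", 1,  // 1-word messages
                    adcQueueStorage, sizeof(adcQueueStorage));
    // Circular DMA: the ADC runs freely, ISR load is two callbacks per buffer.
    HAL_ADC_Start_DMA(&hadc3, (uint32_t*)adcBuf, ADC_BUF_SAMPLES);
}

// Called from DMA interrupt context: keep them as short as possible.
void HAL_ADC_ConvHalfCpltCallback(ADC_HandleTypeDef* hadc)
{
    (void)hadc;
    ULONG half = 0;                              // lower half ready
    tx_queue_send(&adcQueue, &half, TX_NO_WAIT);
}

void HAL_ADC_ConvCpltCallback(ADC_HandleTypeDef* hadc)
{
    (void)hadc;
    ULONG half = 1;                              // upper half ready
    tx_queue_send(&adcQueue, &half, TX_NO_WAIT);
}

// Worker thread: crunch (FFT) and/or hand the block to the TCP thread.
void adc_worker_entry(ULONG arg)
{
    (void)arg;
    for (;;) {
        ULONG half;
        tx_queue_receive(&adcQueue, &half, TX_WAIT_FOREVER);
        uint16_t* block = &adcBuf[half * (ADC_BUF_SAMPLES / 2)];
        process_block(block, ADC_BUF_SAMPLES / 2);
    }
}

The point is that the interrupt cost stays at two queue posts per buffer regardless of the sample rate; whether real time is kept then only depends on the worker thread keeping up with one half-buffer period on average.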
Have a nice day!

Using 3 different communication protocols in the same MCU

For a project I need to communicate over a CAN bus network, an Ethernet network, and RS-232. I want to use one single MCU that will act as the main unit of the CAN bus star topology and the Ethernet star topology, and that MCU will also transfer the RS-232 data that comes to it to another device. Now, I want to use high-speed CAN, which can be up to 1 Mbit per second. However, RS-232 is max 20 kbaud. I wonder if it is doable with 1 MCU to handle 3 different kinds of communication (CAN bus, Ethernet and RS-232). I am afraid of getting overrun with data at some point. I can buffer data short-term if data comes in bursts that can be averaged out. For continuous data where I'll never be able to keep up, I'll need to discard messages, perhaps in a managed way. But I do not want to discard any data. So my question is: Would using 1 MCU for this case work? And are there any software tricks that would help me with this case? (Like giving the CAN bus a higher priority, etc.)
Yes, this can be done with a single MCU. Even a simple MCU should easily be able to handle data rates of 1 Mbps. Most likely you will want to use DMA-enabled transfers, so the CPU core only needs to act when the transfer of a chunk of data has completed.
The problem of being overrun by data due to the mismatch in data rate is a separate topic:
If the mismatch persists, no system can handle it, no matter how capable.
If the mismatch is temporary, it's just a function of the available buffer size.
So if the worst case you want to handle is 10 s of incoming data at 1 Mbps (with an outgoing rate of 20 kbps), then you will need 10 s x (1 Mbps - 20 kbps) = 9.8 Mbit = 1.225 MByte of buffer memory.
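As a rough illustration of the "temporary mismatch" case, a single-producer / single-consumer ring buffer between the fast side (e.g. the CAN RX path) and the slow side (the RS-232 transmitter) is usually all that is needed; the size and names below are placeholders:

// Illustrative ring buffer between a fast producer (e.g. CAN RX in an ISR)
// and a slow consumer (e.g. the RS-232 TX task). When it fills up, push()
// fails and the caller decides what to discard or count.
#include <atomic>
#include <cstddef>
#include <cstdint>

template <size_t N>                      // N must be a power of two
class ByteRing {
    static_assert((N & (N - 1)) == 0, "N must be a power of two");
    uint8_t buf[N];
    std::atomic<size_t> head{0};         // advanced by the producer
    std::atomic<size_t> tail{0};         // advanced by the consumer
public:
    bool push(uint8_t b) {               // single producer (ISR side)
        size_t h = head.load(std::memory_order_relaxed);
        size_t t = tail.load(std::memory_order_acquire);
        if (h - t == N) return false;    // full: drop or count an overflow
        buf[h & (N - 1)] = b;
        head.store(h + 1, std::memory_order_release);
        return true;
    }
    bool pop(uint8_t& b) {               // single consumer (TX task side)
        size_t t = tail.load(std::memory_order_relaxed);
        size_t h = head.load(std::memory_order_acquire);
        if (t == h) return false;        // empty
        b = buf[t & (N - 1)];
        tail.store(t + 1, std::memory_order_release);
        return true;
    }
};

// Sized roughly from the 10 s / 1 Mbps worst case above (~1.25 MB rounded up
// to a power of two); on a small MCU you would shrink the burst you promise
// to absorb instead.
static ByteRing<1u << 21> canToSerial;

The real design decision is what push() returning false means for your application: drop the newest data, drop the oldest, or just count overflows, because at that point no amount of buffer memory can save you.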

Is DMA the Correct Way to Receive High-Speed Digital Data on a Microprocessor?

I have been using the Teensy 3.6 microcontroller board (180 MHz ARM Cortex-M4 processor) to try and implement a driver for a sensor. The sensor is controlled over SPI and when it is commanded to make a measurement, it sends out the data over two lines, DOUT and PCLK. PCLK is a 5 MHz clock signal and the bits are sent over DOUT, measured on the falling edges of the PCLK signal. The data frame itself consists of 1,024 16-bit values.
My first attempt was a relatively naïve approach: I attached an interrupt to the PCLK pin looking for falling edges. When it detects a falling edge, it sets a bool indicating that a new bit is available and sets another bool to the value of the DOUT line. The main loop of the program assembles uint16_t values from these bits and collects 1,024 of them for the full measurement frame.
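Reconstructed in code, that first attempt looked roughly like this (the pin numbers are placeholders):

// Reconstruction of the naive approach: one interrupt per PCLK falling edge,
// i.e. ~5 million interrupts per second, which is what swamps the core.
#include <Arduino.h>

const int PIN_PCLK = 2;                  // placeholder pins
const int PIN_DOUT = 3;

volatile bool bitReady = false;
volatile bool bitValue = false;

void onPclkFalling() {
  bitValue = digitalReadFast(PIN_DOUT);  // sample DOUT on the falling edge
  bitReady = true;
}

uint16_t frame[1024];

void setup() {
  pinMode(PIN_PCLK, INPUT);
  pinMode(PIN_DOUT, INPUT);
  attachInterrupt(digitalPinToInterrupt(PIN_PCLK), onPclkFalling, FALLING);
}

void loop() {
  static uint16_t word = 0;
  static int bitCount = 0, wordCount = 0;
  if (bitReady) {
    bitReady = false;
    word = (word << 1) | (bitValue ? 1 : 0);
    if (++bitCount == 16) {
      frame[wordCount++] = word;
      word = 0;
      bitCount = 0;
      if (wordCount == 1024) {           // full measurement frame captured
        wordCount = 0;
      }
    }
  }
}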
However, this program locks up the Teensy almost immediately. From my experiments, it seems to lock up as soon as the interrupt is attached. I believe that the microprocessor is being swamped by interrupts.
I think that the correct way of doing this is by using the Teensy's DMA controller. I have been reading Paul Stoffregen's DMAChannel library but I can't understand it. I need to trigger the DMA measurements from the PCLK digital pin and have it read in bits from the DOUT digital pin. Could someone tell me if I am looking at this problem in the correct way? Am I overlooking something, and what resources should I view to better understand DMA on the Teensy?
Thanks!
I put this on the Software Engineering Stack Exchange because I feel that this is primarily a programming problem, but if it is an EE problem, please feel free to move it to the EE SE.
Is DMA the Correct Way to Receive High-Speed Digital Data on a Microprocessor?
There is more than one source of 'high speed digital data'. DMA is not the globally correct solution for all data, but it can be a solution.
it sends out the data over two lines, DOUT and PCLK. PCLK is a 5 MHz clock signal and the bits are sent over DOUT, measured on the falling edges of the PCLK signal.
I attached an interrupt to the PCLK pin looking for falling edges. When it detects a falling edge, it sets a bool that a new bit is available and sets another bool to the value of the DOUT line.
This approach would be called 'bit bashing'. You are using the CPU to physically measure the pins. It is a worst-case solution that I see many experienced developers implement. It will work with any hardware connection. Fortunately, the Kinetis K66 has several peripherals that may be able to assist you.
Specifically, the FTM, CMP, I2C, SPI and UART modules may be useful. These hardware modules can reduce the workload from processing every bit to processing groups of bits. For instance, the FTM supports a capture mode. The idea is to ignore the PCLK signal and just measure the time between edges; these times will be multiples of the bit period. If the timer captures a two-bit period, then you know that two ones or zeros were sent.
Also, your signal looks like SSI, which is a 'digital audio' interface. Unfortunately, the K66 doesn't have an SSI module. Typical I2C is open drain and always has a start bit and a fixed word size. It may be possible to use it if you have some knowledge of the data and/or can attach some circuitry to fake some bits (to be removed later).
You could use the UART and the time between characters to capture data. The time will correspond to a run of bits that aren't the start bit. However, it looks like this UART module requires stop bits (the SIM features are probably very limited).
Once you do this, the decision between DMA, interrupts and polling can be made. There is nothing faster than polling if the CPU actually uses the data. DMA and interrupts are needed if you have to multiplex the CPU between the data transfer and other work. DMA is better if the CPU doesn't need to act on most of the data, or if the work the CPU is doing is not memory-intensive (number crunching). Interrupt cost depends on your context-save overhead, which can be minimized depending on the facilities your mainline code uses.
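For illustration, the polling variant could be as simple as the sketch below. Note that at 5 MHz PCLK on a 180 MHz core there are only about 36 CPU cycles per bit, so this is marginal at best and assumes nothing else may touch the CPU during a frame (pins are placeholders):

// Rough polling sketch: dedicate the CPU to the frame while it arrives.
#include <Arduino.h>

const int PIN_PCLK = 2;                      // placeholder pins
const int PIN_DOUT = 3;

void captureFrame(uint16_t *frame, int words) {
  noInterrupts();                            // nothing may preempt the loop
  for (int w = 0; w < words; ++w) {
    uint16_t value = 0;
    for (int b = 0; b < 16; ++b) {
      while (!digitalReadFast(PIN_PCLK)) {}  // wait for clock high
      while (digitalReadFast(PIN_PCLK)) {}   // falling edge: data bit valid
      value = (value << 1) | digitalReadFast(PIN_DOUT);
    }
    frame[w] = value;
  }
  interrupts();
}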
Some glue circuitry to adapt the signal to one of the K66 modules could go a long way toward a more efficient solution. If you can't change the signal, another (NXP?) SoC with an SSI module would work well. The NXP modules usually support chaining to an eDMA module as well as interrupts.

Arduino RF sensor network

I'm currently designing a sensor network that will have small ATtiny85 probes, each with a temperature sensor, a barometer, and a humidity sensor. I think I will use these (http://goo.gl/TqaDjl) to communicate, as they are low cost and I don't need much range. I'm not sure, though, how I will get the probes to communicate with the main controller, since the transmitter transmits digitally and I will have 20+ probes that all need to send data every minute without signals overlapping or getting messed up. I think the easiest way would be to time the probes so that they don't overlap in transmission, but I'm not sure.
Questions:
-Is using RF the cheapest and best option for this system?
-How can I prevent communication overlapping?
-What is the easiest way to send data digitally from an arduino (or ATtiny85)?
I guess I'm late to the party, but I'll offer some insight into collision control with a ton of chattering transmitters on one link, a la 802.11. This is somewhat packetized.
If two transmitters try to transmit at the same time, you're bound to get a mangled mess of rotten bacon at the receivers.
A simplified version of WiFi-style collision handling would be good. Basically, it uses preambles that can be detected, and for longer transmissions that have a higher chance of conflicting, it can use short request-to-send / clear-to-send packets.
While this is likely overkill, I'd go for preambles. Start by transmitting a steady stream of something recognizable, like, in hex, 555533330f0f00ff, which is basically alternating 1s and 0s but with changing frequency (0101, then 0011, then 00001111, and so on): a readily recognizable pattern that is unlikely to be given off by stray radiation or noise.
This pattern could undergo a shift, so there's a finite set of other preambles that are bitwise-shifted relative to the original.
If a transmitter detects this preamble, it should STOP and wait. If you limit all packets to a certain temporal length, collisions should not occur if you wait sufficient time between packets. If, during the time of one packet, a preamble is heard, then your station should wait for the full length of the transmission (listening to its length and other header fields so it knows how long to wait). Once the packet is done, your station can transmit its preamble.
This is where the WiFi resemblance stops and simpler protocols take over.
Note that if 2 stations are waiting on a packet, they can start their preambles almost simultaneously. To resolve this, each station should have a different zero bit flipped in its preamble. If it detects a 1 for that bit, it sees that there's another station preambling and should back off.
Each station should wait a certain delay (up to you) after each packet so other stations can start their transmissions.
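Compressed into code, the listen-before-transmit part might look like this, where channelBusy() and radioSend() are hypothetical wrappers around whatever RF library and preamble detector end up being used:

// Listen-before-transmit with a random backoff. channelBusy() and
// radioSend() are hypothetical wrappers, not a specific library.
#include <Arduino.h>

bool channelBusy();                        // true while a preamble/packet is heard
void radioSend(const uint8_t *p, uint8_t n);

const unsigned long MAX_PACKET_MS = 50;    // assumed upper bound on packet length

void sendWithBackoff(const uint8_t *payload, uint8_t len) {
  for (;;) {
    if (!channelBusy()) {
      radioSend(payload, len);             // includes our own preamble + header
      return;
    }
    // Someone else is talking: wait out the longest possible packet, plus a
    // small random slot so two waiting stations don't collide again.
    delay(MAX_PACKET_MS + random(5, 30));
  }
}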
A few sketches of the communication patterns show that this is sufficient for your needs.
Now if it's a master-slave-style system as long as you only have one network it should be easier since there should only be one outstanding request that would involve a slave transmitting.
Those modules will be by far the cheapest option. As for the best option, there are a variety of much better choices, but they are more expensive. A network of XBee modules comes to mind, but those are much more expensive than $1.25 a pair.
Using the RF modules is very doable, however. To prevent communication overlapping, put an RF transmitter and receiver on each sensor node and on the main hub. The main hub can send "hey sensor1, give me your data", which gets broadcast to all of the sensors. However, only sensor1 will react: "hey, I am sensor1, here is my data", which the hub will listen for. Then the hub will go on and say "hey sensor2, send me your data", and so on and so forth.
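In code, the hub side of that polling scheme could look roughly like the sketch below; radioSend(), radioReceive() and handleReading() are hypothetical wrappers around the RF modules, not a specific library:

// Hub side: poll each node in turn so only one node transmits at a time.
#include <Arduino.h>

void radioSend(const uint8_t *p, uint8_t n);
// Hypothetical: returns true and fills buf/len if a packet arrives in time.
bool radioReceive(uint8_t *buf, uint8_t *len, unsigned long timeoutMs);
void handleReading(uint8_t node, const uint8_t *data, uint8_t len);

const uint8_t NUM_NODES = 20;

void setup() {
  // radio / serial initialisation for whichever library is used goes here
}

void loop() {
  for (uint8_t node = 1; node <= NUM_NODES; ++node) {
    uint8_t poll[2] = { 'P', node };       // "hey sensor <node>, give me your data"
    radioSend(poll, sizeof(poll));

    uint8_t reply[32];
    uint8_t len = sizeof(reply);
    if (radioReceive(reply, &len, 100) && len > 1 && reply[0] == node) {
      handleReading(node, reply + 1, len - 1);   // temperature/pressure/humidity
    }
    // else: timed out or wrong node answered -- just move on, retry next round
  }
  delay(60000UL - (unsigned long)NUM_NODES * 100);  // roughly one sweep per minute
}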
I think your original approach may be best. The approach of putting a TX and RX on every device may be affordable, but I question whether it will work. With 20 devices transmitting on the same frequency, which one will the receiver "hear"? Most importantly, how will a device receive a remote transmitter's signal when its own transmitter is very close? Keep in mind: these are AM radios and will "send" a carrier even if not sending any data. Get a small number of transmitters working before trying to go full scale.
To avoid the problem of receiving the one active transmitter among the soup of inactive transmitters, you want only one transmitter powered at a time. You would control Vcc to one transmitter, turn it on, send the burst of data, and then power it off.
-How can I prevent communication overlapping?
You can't -- you have to accept that there will be occasional overlaps. Add a CRC to the transmitted data so that the receiver can detect garbage.
The timing of the multiple transmitters is surely a project in itself. You surely don't want to run them all at the same transmission period. They may not collide at the beginning, but once two devices drift together and start colliding, they will stay together and keep colliding for a long time, until the clocks drift apart again.
I would start with something simple. For example, with three devices, run the transmissions at periods of 2000 ms, 2200 ms and 2400 ms (use EEPROM to configure them). That way, if a pair happens to collide at one data point, their next transmissions will be 200 ms apart.
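A node-side sketch of that scheme, with the period read from EEPROM and a CRC-8 appended so the receiver can throw away collided packets, could look like this (radioSend() and readSensors() are placeholders):

// Node side: transmit at an EEPROM-configured period and append a CRC-8
// so the receiver can discard collided or garbled packets.
#include <Arduino.h>
#include <EEPROM.h>

void radioSend(const uint8_t *p, uint8_t n);   // hypothetical RF wrapper
void readSensors(uint8_t *out);                // hypothetical: fills 4 bytes

// CRC-8, polynomial 0x07, bitwise implementation.
uint8_t crc8(const uint8_t *data, uint8_t len) {
  uint8_t crc = 0;
  while (len--) {
    crc ^= *data++;
    for (uint8_t i = 0; i < 8; ++i)
      crc = (crc & 0x80) ? (uint8_t)((crc << 1) ^ 0x07) : (uint8_t)(crc << 1);
  }
  return crc;
}

unsigned long periodMs;
unsigned long lastTx = 0;

void setup() {
  // Two EEPROM bytes hold the per-node period, e.g. 2000 / 2200 / 2400 ms.
  periodMs = (unsigned long)EEPROM.read(0) | ((unsigned long)EEPROM.read(1) << 8);
}

void loop() {
  if (millis() - lastTx >= periodMs) {
    lastTx = millis();
    uint8_t pkt[6];
    pkt[0] = 1;                    // node ID
    readSensors(&pkt[1]);          // 4 bytes of readings
    pkt[5] = crc8(pkt, 5);         // receiver recomputes this and compares
    radioSend(pkt, sizeof(pkt));
  }
}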

Is it stable to change I/O direction on microcontroller repeatedly?

I'm new to microcontroller programming and I have interfaced my microcontroller board to another device that provides a status based on the command sent to it, but this status is provided on the same I/O pin that is used to provide data. So basically, I have an 8-bit data bus that is used as an output from the microcontroller, but for certain commands I get a status back on one of the data lines if I choose to read it. So I would be required to change the direction of this one line to read the status, converting the line from an output to an input and then back to an output. Is this acceptable programming, or will changing the direction of the I/O pin this frequently cause instability?
Thanks.
There should not be any problem with changing the direction of the I/O line to read the status returned by the peripheral, provided that you change the line to an input before the peripheral starts to drive it and do not try to drive the line as an output until the peripheral stops driving it. What you must avoid is contention between the two drivers, i.e. the two ends being driven to opposite states by the processor and the peripheral. That would result in, at best, a large spike in power consumption or, at worst, blown pin driver circuitry in the processor, the peripheral or both.
You do not say what the processor or peripheral are so I cannot tell whether there are any control bits in the interface that enable the remote device to output the status so that you can know whether the peripheral is driving the line at any time.
I've done this on digital I/O pins without any problems but I'm very far from an expert on this. It probably depends entirely on which microcontroller you are using though.
Yes, it's perfectly fine to change I/O direction on a microcontroller repeatedly.
That's the standard method of communicating over open-collector buses such as I2C and the iButton (see PICList: busses for links to assembly-language code examples).
Transmit a 0 bit: set the output LATx bit to 0, and then set the TRISx bit to OUTPUT.
Transmit a 1 bit: keep the output LATx bit at 0, and set the TRISx bit to INPUT (let the external resistor pull the line up high).
Listen for a response from the peripheral: keep the output LATx bit at 0, and set the TRISx bit to INPUT. The external resistor pulls the line high when the peripheral is transmitting a 1, or the peripheral pulls the line low when it is transmitting a 0. Read the bit from the PORTx pin.
If both ends of the bus correctly follow this protocol (in particular, if neither end actively drives the line to high), then you never have to worry about contention or current spikes.
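The same idea in Arduino-style C++, where the LATx/TRISx pair maps onto digitalWrite()/pinMode() (the pin number is a placeholder):

// Open-drain style bit-banging on a generic Arduino-like MCU. The line is
// never driven high: "high" means released and pulled up by the external
// resistor, so driver contention is impossible by construction.
#include <Arduino.h>

const int BUS_PIN = 4;          // placeholder pin; needs an external pull-up

void busInit() {
  digitalWrite(BUS_PIN, LOW);   // output latch stays 0 forever (like LATx = 0)
  pinMode(BUS_PIN, INPUT);      // released: pulled high externally
}

void busWriteBit(bool bit) {
  if (bit) {
    pinMode(BUS_PIN, INPUT);    // 1: release the line (TRIS = input)
  } else {
    pinMode(BUS_PIN, OUTPUT);   // 0: actively pull the line low (TRIS = output)
  }
}

bool busReadBit() {
  pinMode(BUS_PIN, INPUT);      // release the line so the peripheral can drive it
  return digitalRead(BUS_PIN);  // read the pin itself, not the output latch
}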
It's important to remember that any high-speed I/O switching generates EMI.
Depending on the switching frequency, board layout and device sensitivities, this EMI can affect the performance and reliability of your application.
If you are having problems in your application, use an oscilloscope to check for radiated EMI on your board traces.
