Setting the baud rate for USART in STM32 microcontroller

Why do we need to calculate the baud rate using the following formula?
baud = fCK / (16*USARTDIV)
I mean, why can't we write 9600 or any other desired baud rate value directly in USART_BRR register? Why do we need to perform this calculation first? What are we calculating here anyway?

You can't just write your desired baud rate into the register, because the peripheral doesn't know how fast its input clock is, so it couldn't set up its dividers correctly.
It would be possible to design a USART that you could hand a baud rate to directly, but that would require extra complexity, and it would still need to know its clock frequency.

You are not configuring software, but hardware. That means the bits you set are (more or less) directly connected to clock dividers which control the baud rate of the UART module.
As an example, consider why you have to use fCK in your calculation. Imagine the UART module had to determine fCK by itself: fCK is the result of all the clock dividers in use and is not configured in any central place. The hardware effort needed to compute fCK on chip would simply be disproportionate (effort = cost). It's just much easier to let the user calculate it.
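To make the calculation concrete, here is a minimal sketch (assuming oversampling by 16, as in the formula above; the 16 MHz fCK in the example comment is purely illustrative and depends on your clock tree):

```c
#include <stdint.h>

/* Compute the USART_BRR value for oversampling by 16, i.e. for
 * baud = fck / (16 * USARTDIV).  BRR stores USARTDIV as a fixed-point
 * number (mantissa in bits [15:4], fraction in bits [3:0]), so the
 * register value is simply USARTDIV * 16 = fck / baud, rounded. */
uint16_t usart_brr(uint32_t fck, uint32_t baud)
{
    return (uint16_t)((fck + baud / 2) / baud);  /* round to nearest */
}

/* Example: usart_brr(16000000, 9600) == 0x0683, i.e. USARTDIV =
 * 104 + 3/16 = 104.1875, for an actual rate of about 9598 baud. */
```

This is exactly why the hardware can't do it for you: the function needs fck as an input, and only your clock configuration code knows that value.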

Related

Is DMA the Correct Way to Receive High-Speed Digital Data on a Microprocessor?

I have been using the Teensy 3.6 microcontroller board (180 MHz ARM Cortex-M4 processor) to try and implement a driver for a sensor. The sensor is controlled over SPI and when it is commanded to make a measurement, it sends out the data over two lines, DOUT and PCLK. PCLK is a 5 MHz clock signal and the bits are sent over DOUT, measured on the falling edges of the PCLK signal. The data frame itself consists of 1,024 16-bit values.
My first attempt consisted of a relatively naïve approach: I attached an interrupt to the PCLK pin looking for falling edges. When it detects a falling edge, it sets a bool indicating that a new bit is available and sets another bool to the value of the DOUT line. The main loop of the program assembles a uint16_t value from these bits and collects 1,024 of these values for the full measurement frame.
However, this program locks up the Teensy almost immediately. From my experiments, it seems to lock up as soon as the interrupt is attached. I believe that the microprocessor is being swamped by interrupts.
I think that the correct way of doing this is by using the Teensy's DMA controller. I have been reading Paul Stoffregen's DMAChannel library but I can't understand it. I need to trigger the DMA measurements from the PCLK digital pin and have it read in bits from the DOUT digital pin. Could someone tell me if I am looking at this problem in the correct way? Am I overlooking something, and what resources should I view to better understand DMA on the Teensy?
I put this on the Software Engineering Stack Exchange because I feel that this is primarily a programming problem, but if it is an EE problem, please feel free to move it to the EE SE.
Is DMA the Correct Way to Receive High-Speed Digital Data on a Microprocessor?
There is more than one source of 'high speed digital data'. DMA is not the globally correct solution for all data, but it can be a solution.
it sends out the data over two lines, DOUT and PCLK. PCLK is a 5 MHz clock signal and the bits are sent over DOUT, measured on the falling edges of the PCLK signal.
I attached an interrupt to the PCLK pin looking for falling edges. When it detects a falling edge, it sets a bool that a new bit is available and sets another bool to the value of the DOUT line.
This approach would be called 'bit bashing': you are using the CPU to sample the pins directly. It is a worst-case solution, yet one I see many experienced developers implement, because it works with any hardware connection. Fortunately, the Kinetis K66 has several peripherals that may be able to assist you.
Specifically, the FTM, CMP, I2C, SPI and UART modules may be useful. These hardware modules can reduce the workload from processing each bit to processing groups of bits. For instance, the FTM supports a capture mode. The idea is to ignore the PCLK signal and just measure the time between edges on the data line. These times will be multiples of the bit period. If the timer captures a two-bit period, then you know that two consecutive ones or zeros were sent.
Also, your signal looks like SSI, which is a 'digital audio' channel. Unfortunately, the K66 doesn't have an SSI module. Typical I2C is open drain and always has a start bit and a fixed word size. It may be possible to use it if you have some knowledge of the data and/or can attach some circuitry to fake some bits (to be removed later).
You could also use the UART and the time between characters to capture data; the gap corresponds to a run of bits that aren't at the start-bit level. However, it looks like this UART module requires stop bits (the SIM features are probably very limited).
Once you have done this, the decision between DMA, interrupts and polling can be made. Nothing is faster than polling if the CPU actually consumes the data. DMA and interrupts are needed if you need to multiplex the CPU between the data transfer and other work. DMA is better if the CPU doesn't need to act on most of the data, or if the work the CPU is doing is not memory intensive (number crunching). Interrupt cost depends on your context-save overhead, which can be minimized depending on the facilities your main line uses.
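Whichever transfer mechanism you pick, the frame-assembly step from the question (packing one sampled DOUT bit per PCLK edge into 16-bit values) is cheap; a sketch, assuming MSB-first bit order:

```c
#include <stdint.h>
#include <stddef.h>

/* Pack a stream of 0/1 samples (one per clock edge) into 16-bit
 * words, MSB first.  Processes only whole 16-bit groups; returns the
 * number of words written. */
size_t pack_words(const uint8_t *bits, size_t n, uint16_t *words)
{
    size_t w = 0;
    for (size_t i = 0; i + 16 <= n; i += 16) {
        uint16_t v = 0;
        for (int b = 0; b < 16; b++)
            v = (uint16_t)((v << 1) | (bits[i + b] & 1));
        words[w++] = v;
    }
    return w;
}
```

Doing this in the main loop, rather than per bit inside an interrupt handler, is what keeps the per-bit cost down.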
Some glue circuitry to adapt the signal to one of the K66 modules could go a long way toward a more efficient solution. If you can't change the signal, another (NXP?) SoC with an SSI module would work well. The NXP modules usually support chaining to an eDMA module as well as interrupts.

Why 9600 data rate is used over other rates for Serial.begin()?

I am working on Arduino, and for communication with the computer the Serial.begin() function is used. There is a range of data rates from 300 to 115200, yet the majority uses 9600!
Why is that? What is its significance?
During the previous millennium, 9600 baud was a standard for some devices.
Currently this speed is enough for most cases, so people stick to it; many devices use 9600 baud as a default.
Personally I use serial for debugging most often. At 9600 baud, it can print more than 10 lines per second, that is more than I can read.
Yet keep in mind that the transmit buffer is limited to 64 characters; when it is full, the Arduino will block on a Serial.write() call until there is enough space in the buffer. That is why you can encounter slowdowns at low baud rates.
On the other hand, at speeds like 0.5 Mbaud you will burden the MCU on hardware serial, and with software serial you may see an impact much sooner.
Personally, I had some trouble with a Chinese Nano clone that used a CH340 USB/serial converter: Python communication to the Arduino with pySerial was unreliable at speeds above 9600 baud.
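The slowdown is easy to quantify: with the usual 8N1 framing, each character costs 10 bit times (start + 8 data + stop), so a full 64-byte buffer takes roughly 67 ms to drain at 9600 baud. A sketch of the arithmetic:

```c
/* Milliseconds needed to drain n buffered characters at a given baud
 * rate, assuming 8N1 framing: 10 bit times per character. */
double drain_ms(unsigned n, unsigned baud)
{
    return n * 10.0 * 1000.0 / baud;
}

/* drain_ms(64, 9600)   -> ~66.7 ms
 * drain_ms(64, 115200) -> ~5.6 ms  */
```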
Many devices use 9600 or 19200 baud, and I guess that people just copy over values without thinking about them, thereby continuing the practice even if it is no longer necessary.
That said, the maximum length of a serial cable depends on the baud rate you choose. Higher baud rates require shorter cables. So if you don't need the higher rate, just stay with a low one like 9600.

What value to set the baud rate to

In MATLAB, I am establishing a serial link to an Arduino. Is a higher baud rate always better? I am using 9,600 baud now, but that is merely because it is the most standard value.
You'll have better luck over at https://arduino.stackexchange.com/.
Why do people settle?
People settle because it is more than fast enough. The most common use is just to print some stuff on a terminal for debugging. 9600 baud is 960 characters per second, or 12 x 80-character lines per second. How fast can you read? :)
If your program is using the serial port for bulk data transfer, you would choose not to settle.
See the following resources:
Serial.begin(): Why not always use 28800?
How high of a baud rate can I go (without errors)?
Good question. I've spent years working with modems and I'm no stranger to baud rates. My Arduino uses a USB connection that handles the baud rate, so I never got into messing with it.
It's strictly a matter of how quickly you want your data transferred; it has no other effect. It is reasonable to consider that low-end equipment might not support the higher speeds. From a communications perspective, the higher the baud rate, the greater the chance of data errors, but I think it's a stretch to expect the link between a PC and an Arduino to have much of an issue.

Maximum potential speed for serial port rs232 [closed]

Closed. This question is off-topic. It is not currently accepting answers.
Closed 10 years ago.
What is the potential maximum speed for the RS-232 serial port on a modern PC? I know that the specification says it is 115200 bps, but I believe it can be faster. What influences the speed of the RS-232 port? I believe it is the quartz resonator, but I am not sure.
This goes back to the original IBM PC. The engineers that designed it needed a cheap way to generate a stable frequency. And turned to crystals that were widely in use at the time, used in any color TV in the USA. A crystal made to run an oscillator circuit at the color burst frequency in the NTSC television standard. Which is 315/88 = 3.579545 megahertz. From there, it first went through a programmable divider, the one you change to set the baudrate. The UART itself then divides it by 16 to generate the sub-sampling clock for the data line.
So the highest baud rate you can get is by setting the divider to its smallest value, 2, which produces 3579545 / 2 / 16 = 111861 baud. That is about a 2.9% error from the ideal 115200, but close enough: the clock rate doesn't have to be exact. That is the point of asynchronous signalling, the A in UART: the start bit re-synchronizes the receiver at every character.
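That arithmetic can be checked directly (a sketch using the figures from the answer: the 3.579545 MHz colour-burst crystal, the minimum divider of 2, and the UART's fixed /16 sub-sampling):

```c
/* Highest achievable baud rate: crystal frequency divided by the
 * programmable divider, then by the UART's fixed /16 sub-sampling,
 * rounded to the nearest integer. */
long max_baud(long crystal_hz, long divider)
{
    long denom = divider * 16;
    return (crystal_hz + denom / 2) / denom;
}

/* max_baud(3579545, 2) -> 111861, about 2.9% below the ideal 115200 */
```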
Getting real RS-232 hardware running at 115200 baud reliably is a significant challenge. The electrical standard is very sensitive to noise, there is no attempt at canceling induced noise and no attempt at creating an impedance-matched transmission line. The maximum recommended cable length at 9600 baud is only 50 feet. At 115200 only very short cables will do in practice. To go further you need a different approach, like RS-422's differential signals.
This is all ancient history and doesn't exactly apply to modern hardware anymore. True serial hardware based on a UART chip like the 16550 has been disappearing rapidly, replaced by USB emulators with a custom driver that emulates a serial port. They do accept a baud rate selection but simply ignore it for the USB bus itself; it only applies to the last half-inch inside the dongle you plug into the device. Whether or not the driver accepts 115200 as the maximum value is a driver implementation detail; they usually accept higher values.
The maximum speed is limited by the specs of the UART hardware.
I believe the "classical" PC UART (the 16550) in modern implementations can handle at least 1.5 Mbps. If you use a USB-based serial adapter, there's no 16550 involved and the limit is instead set by the specific chip(s) used in the adapter, of course.
I regularly use a RS232 link running at 460,800 bps, with a USB-based adapter.
In response to the comment about clocking (with a caveat: I'm a software guy): asynchronous serial communication doesn't transmit the clock (that's the asynchronous part right there) along with the data. Instead, transmitter and receiver are supposed to agree beforehand about which bitrate to use.
A start bit on the data line signals the start of each "character" (typically a byte, plus the start/stop/parity bits framing it). The receiver then samples the data line to determine whether it's a 0 or a 1. This sampling is typically done at least 16 times faster than the actual bit rate, to make sure the reading is stable. So for a UART communicating at 460,800 bps as mentioned above, the receiver samples the RX signal at around 7.4 MHz. This means that even if you clock the UART with a raw frequency f, you can't expect it to reliably receive data at that rate; there is overhead.
Yes, it is possible to run at higher speeds, but the major limitation is the environment: in a noisy environment there will be more corrupted data, limiting the usable speed. Another limitation is the length of the cable between the devices; you may need to add a repeater or some other device to strengthen the signal.

Gsm interfacing with atmega16

I am working on GSM sim900D interfacing with Atmega16. Initially I made the circuit using MAX232 on breadboard. Then I connected it to my PC using a serial port. I tested AT commands, the commands worked perfectly on hyper terminal and I was able to send SMS using hyperterminal. Then I tested it on Proteus and it was working there perfectly too.
I am using CodeVision AVR as the compiler. The GSM module works at 9600 baud, but the problem is that in the compiler I have to set the baud rate to 4800 (clock = 1 MHz), and in Proteus COMPIM (physical baud = 9600, virtual baud = 4800); only then does it work. When I run it on real hardware (breadboard) it doesn't work, because I have set the baud to 4800. I don't know how to set the baud rate for the hardware. I tried 9600 baud for the hardware in the compiler, but then it doesn't send SMS at all. Kindly tell me what I should do.
On ATmega16 (and other ATmegas), the serial baud rate is set via UBRRH and UBRRL registers plus the U2X bit in the UCSRA register. The detailed description of how this works starts on page 146 of the ATmega16 datasheet. Basically, UBRR is a 16-bit register and so must be accessed separately via 8 bit parts UBRRH (the high byte) and UBRRL (the low byte). The value you want to put into these registers (and the U2X bit in UCSRA register) depends on
the clock rate
the desired baud rate.
For a 1 MHz clock and 9600 baud there are two options (see table 68 on page 168 in the datasheet): U2X cleared with UBRR set to 6, or U2X set with UBRR set to 12. The latter option results in a generated baud rate closer to the desired one (0.2% error) and is therefore recommended. Consequently, the code you want is:
UBRRH = 0;
UBRRL = 12;
UCSRA |= (1 << U2X);
There is a nasty gotcha lurking here: as the datasheet states, UBRRH and UCSRC are the same register. UCSRC controls parity, stop bits, and other important settings. Therefore if you ever need to write to UCSRC, make sure that you set the URSEL bit at the same time:
UCSRC = (1<<URSEL) | (...other bits...)
or
UCSRC |= (1<<URSEL) | (...other bits...)
Otherwise you will clobber your UBRRH register and wonder why your baud rate is not what you expected.
But you can also make use of the AVR Libc code, which provides a ready-made way of setting the baud rate on AVR; see util/setbaud.html.
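The table values quoted above can be reproduced from the datasheet formulas (a sketch: UBRR = fosc/(16·baud) − 1 in normal mode, fosc/(8·baud) − 1 with U2X set, both rounded to the nearest integer):

```c
/* UBRR value per the ATmega16 datasheet: round fosc/(div * baud) to
 * the nearest integer and subtract 1; div is 16 normally, 8 with U2X. */
long ubrr_value(long fosc, long baud, int u2x)
{
    long div = (u2x ? 8 : 16) * baud;
    return (fosc + div / 2) / div - 1;
}

/* ubrr_value(1000000, 9600, 0) -> 6   (matches table 68, U2X = 0)
 * ubrr_value(1000000, 9600, 1) -> 12  (matches table 68, U2X = 1) */
```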
Check the datasheet: the error rate is too high when using 9600 baud with a 1 MHz clock; that is the main problem. Use an 8, 12 or 16 MHz crystal if possible and check the datasheet tables. And don't forget to program the fuse bits related to the crystal (XTAL) frequency; if you don't set them properly, the crystal will not be used and things will not work correctly.
If you need more help, ask.
