Issues with UART configuration on an ATmega4809 microcontroller

Hope you are having a great time. Getting to the point: these past days I have been working with a Microchip Curiosity Nano (ATmega4809), trying to enable communication over the UART ports in Microchip Studio. The only way I could establish it was with the default configuration (8-bit frame, no parity, 1 stop bit); when I tried to configure a 9-bit frame, I started to receive some weird characters.
I have made several changes to this configuration according to the datasheet's register descriptions, but always with the same results.
void USART0_init(void)
{
    USART0.BAUD = (uint16_t)USART0_BAUD_RATE(9600);
    USART0.CTRLB |= USART_RXEN_bm | USART_TXEN_bm;

    // Tries I have done:
    //#define USART_CHSIZE_gm 0x07
    //USART0.CTRLC |= USART_CHSIZE_gm;
    //USART0.CTRLC |= (7 << USART_CHSIZE_gm);
    //USART0.CTRLC |= (1 << USART_CHSIZE2_bm) | (1 << USART_CHSIZE1_bm) | (1 << USART_CHSIZE0_bm);

    PORTA.DIR |= PIN0_bm;   // PA0 = USART0 TxD, output
    PORTA.DIR &= ~PIN1_bm;  // PA1 = USART0 RxD, input
}
This code is the UART port's initialization; the commented-out lines are the configurations I tried in order to get 9-bit frame communication. As I mentioned before, when I comment this part out it works correctly for 8-bit frame communication.
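For what it's worth, here is a sketch of the masked field write I believe the datasheet describes, assuming the stock ATmega4809 device header (which defines USART_CHSIZE_gm as a group mask and group-configuration constants such as USART_CHSIZE_9BITL_gc and USART_CHSIZE_9BITH_gc):

    // Sketch only: CHSIZE is a 3-bit field, so clear it with the group mask
    // and OR in a group-configuration (_gc) constant; the mask itself is not
    // a shift amount. Names assume the stock iom4809.h header.
    USART0.CTRLC = (USART0.CTRLC & ~USART_CHSIZE_gm) | USART_CHSIZE_9BITL_gc; // or USART_CHSIZE_9BITH_gc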
Thanks for taking the time and have a nice day.

Related

How do I send an Arduino a Firmata command to turn on a pin

I'm trying to implement the Firmata protocol and am having a bit of a difficult time deciphering the spec for writing digital pins.
I have noted the following parts of the Firmata 2.3 spec:
* type                  command  channel  first byte      second byte
* --------------------------------------------------------------------
* digital I/O message   0x90     port     LSB (bits 0-6)  MSB (bits 7-13)
and
/* two byte digital data format, second nibble of byte 0 gives the port number (e.g. 0x92 is the third port, port 2)
* 0 digital data, 0x90-0x9F, (MIDI NoteOn, but different data format)
* 1 digital pins 0-6 bitmask
* 2 digital pin 7 bitmask
*/
I'm having some difficulty interpreting the spec. I've looked at other implementations, but haven't been able to see the relationship between the spec and implementation.
So let's say I am wanting to turn on the Arduino LED (pin 13), I know it will be on the second port, port 1, so the first byte will be #{91}.
I'm getting confused about the bitmask for the second two bytes though. I know what a bitmask is, so I want to enable the right bit for the pin.
Why is the bitmask so large for the digital pins? I'm familiar with using bitmasks on the digital outputs of PLCs, which seems much different (one pin, one bit).
My thought is that pin 13 would be the 7th pin on port 1. Since I don't care about the other pins, I would mark the pin in the 2nd byte #{40}, and I don't need any pins set for the third byte #{00}?
I don't think my interpretation of the bitmasks is correct, and it's probably where my error is.
Am I on the right track for this? Is this the right command for setting a pin high or low?
After some strace debugging with the firmata test application, I discovered the simple command to turn on Pin 13 was:
#{912000}
and to turn it off:
#{910000}
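That is consistent with the bitmask format above: pin 13 is bit 5 of port 1 (pins 8-15), and the 8-bit port mask is split into a 7-bit LSB and a 1-bit MSB. A sketch of composing the message in C (the helper name is mine, not from the Firmata sources):

    #include <stdint.h>

    // Hypothetical helper: builds the 3-byte Firmata digital I/O message that
    // sets a single pin high, leaving the other bits of the port cleared.
    // (Real firmware tracks the whole port state and ORs the pin in.)
    static void firmata_pin_high(uint8_t pin, uint8_t msg[3])
    {
        uint8_t port = pin / 8;          // 8 pins per digital port
        uint8_t mask = 1u << (pin % 8);  // bit within the port
        msg[0] = 0x90 | port;            // digital I/O message for this port
        msg[1] = mask & 0x7F;            // bits 0-6 of the port mask
        msg[2] = (mask >> 7) & 0x01;     // bit 7 of the port mask
    }

    // firmata_pin_high(13, msg) yields 0x91 0x20 0x00 - the bytes found above.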

Overrun errors with two USART interrupts

We are using two USARTs running at 115200 baud on an STM32F2, one to communicate with a radio module and one for serial from a PC. The clock speed is 120 MHz.
When receiving data from both USARTs simultaneously, overrun errors can occur on one USART or the other. Doing some quick back-of-the-envelope calculations, there should be enough time to process both, as the interrupt handlers simply copy the byte to a circular buffer.
Both in theory and from measurement, the interrupt code that pushes a byte to the buffer runs on the order of 2-4 µs, and at 115200 baud we have around 70 µs to process each character.
Why are we seeing occasional OREs on one or the other USART?
Update - additional information:
No other ISRs in our code are firing at this time.
We are running Keil RTX with the SysTick interrupt configured to fire every 10 ms.
We are not disabling any interrupts at this time.
According to this book (The Designer's Guide to the Cortex-M Processor Family), the interrupt latency is around 12 cycles (not really deadly).
Given all the above, 70 µs is at least a factor of 10 over the time we take to clear the interrupts, so I'm not sure it is so easy to explain. Should I be concluding that there must be some other factor I am overlooking?
MDK-ARM is version 4.70.
The SysTick interrupt is used by the RTOS, so we cannot time it; the other ISRs take 2-3 µs to run per byte each.
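For reference, each receive ISR is essentially this shape - a sketch under the assumptions above, not our exact code (stm32f2xx.h is the CMSIS device header):

    #include <stdint.h>
    #include "stm32f2xx.h"   // CMSIS device header providing USART1

    #define RX_BUF_SIZE 256  // power of two so the index wraps cheaply

    static volatile uint8_t  rx_buf[RX_BUF_SIZE];
    static volatile uint16_t rx_head, rx_tail;

    void USART1_IRQHandler(void)
    {
        // Reading SR then DR clears RXNE (and ORE, if it was set).
        uint32_t sr   = USART1->SR;
        uint8_t  byte = (uint8_t)USART1->DR;
        (void)sr;

        uint16_t next = (rx_head + 1) & (RX_BUF_SIZE - 1);
        if (next != rx_tail) {      // drop the byte if the buffer is full
            rx_buf[rx_head] = byte;
            rx_head = next;
        }
    }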
I ran into a similar problem to yours a few months ago on a Cortex-M4 (SAM4S). I have a function that gets called at 100 Hz based on a timer interrupt.
In the meantime I had a UART configured to interrupt on character reception. The expected data over the UART was 64-byte packets, and interrupting on every character caused enough latency that my 100 Hz update function was running at about 20 Hz. 100 Hz is relatively slow on this particular 120 MHz processor, but interrupting on every character was causing massive delays.
I decided to configure the UART to use the PDC (Peripheral DMA Controller) and my problems disappeared instantly.
DMA allows the UART to store data in memory WITHOUT interrupting the processor until the buffer is full, saving lots of overhead.
In my case, I told the PDC to store UART data into a buffer (byte array) and specified the length. When the UART (via the PDC) filled the buffer, the PDC issued an interrupt.
In the PDC ISR:
1. Give the PDC a new, empty buffer.
2. Restart the UART PDC (so it can collect data while we do other stuff in the ISR).
3. memcpy the full buffer into the RINGBUFFER.
4. Exit the ISR.
As swineone recommended above, implement DMA and you'll love life.
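A rough sketch of that ISR shape, with hypothetical pdc_* and ring-buffer helpers standing in for the real SAM4S PDC register writes (on the SAM4S these would go through the RPR/RCR and RNPR/RNCR receive-buffer registers):

    #include <stdint.h>

    #define BUF_LEN 64

    static uint8_t buf_a[BUF_LEN], buf_b[BUF_LEN];
    static uint8_t *filling = buf_a;   // buffer the PDC is currently filling
    static uint8_t *full    = buf_b;   // buffer that was just completed

    // Hypothetical helpers, named for illustration only.
    extern void pdc_set_rx_buffer(uint8_t *buf, uint32_t len);
    extern void pdc_restart_rx(void);
    extern void ringbuffer_write(const uint8_t *data, uint32_t len);

    void UART0_PDC_ISR(void)
    {
        // Swap buffers and restart reception immediately, so the UART can
        // keep receiving while we copy.
        uint8_t *tmp = full;
        full = filling;
        filling = tmp;
        pdc_set_rx_buffer(filling, BUF_LEN);
        pdc_restart_rx();

        // Now copy the completed buffer into the ring buffer at leisure.
        ringbuffer_write(full, BUF_LEN);
    }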
I had a similar problem. The short solution: change the oversampling to 8, which makes the USART clock more precise. And choose your MCU clock wisely!
huart1.Init.OverSampling = UART_OVERSAMPLING_8;
Furthermore, add a USART error handler and a mechanism to check that your data is valid, such as CRC16. Here is an example for the STM32F0xx series; I assume it should be pretty similar across the series.
void UART_flush(void) {
    // Flush UART RX buffer if RXNE is set
    if (READ_BIT(huart1.Instance->ISR, USART_ISR_RXNE)) {
        SET_BIT(huart1.Instance->RQR, UART_RXDATA_FLUSH_REQUEST);
    }
    // Not available on F030xx devices!
    // SET_BIT(huart1.Instance->RQR, UART_TXDATA_FLUSH_REQUEST);

    // Clear all errors (if needed)
    if (READ_BIT(huart1.Instance->ISR, USART_ISR_ORE | USART_ISR_FE | USART_ISR_NE)) {
        SET_BIT(huart1.Instance->ICR, USART_ICR_ORECF | USART_ICR_FECF | USART_ICR_NCF);
    }
}
// USART error handler
void HAL_UART_ErrorCallback(UART_HandleTypeDef *huart) {
    if (huart->Instance == USART1) {
        // See if we have any errors
        if (READ_BIT(huart1.Instance->ISR, USART_ISR_ORE | USART_ISR_FE | USART_ISR_NE | USART_ISR_RXNE)) {
            // Flush errors
            UART_flush();
            // Raise the error handler
            _Error_Handler(__FILE__, __LINE__);
        }
    }
}
DMA might help as well. My problem was related to USART clock tolerances, which might even cause overrun errors with DMA implemented, since it is a USART hardware problem. Anyway, hope this helps someone out there! Cheers!
I ran into this problem recently, so I implemented the HAL_UART_ErrorCallback function, which had not been implemented yet (just the __weak version).
It looks like this:
void HAL_UART_ErrorCallback(UART_HandleTypeDef *huart)
{
    if (huart == &huart1)
    {
        HAL_UART_DeInit(&huart1);
        MX_USART1_UART_Init(); // my initialization code
        ...
    }
}
And this solved the overrun issue.

Raspberry Pi SPI read from Arduino slave with wiringPi2?

I've got wiringPi2 and the wiringPi2 Python wrapper installed on the NOOBS Raspbian distro. The Adafruit 4-channel logic level converter kept the Pi safe from 5 V, and sending data to the Arduino was as simple as this on the Pi side:
import wiringpi2
wiringpi2.wiringPiSPISetup(1,5000)
wiringpi2.wiringPiSPIDataRW(1,'HELLO WORLD\n')
and the corresponding Arduino code[3].
EDIT: Apologies - from this point on, I can't post any more of the links I carefully added to show my working, sources and example code. You'll have to Google it and thank the 2-link rule.
So, I know the wiring works. But that's not the direction I actually want - I want to read from the Arduino to the Pi.
The Arduino SPI reference states:
This library allows you to communicate with SPI devices, with the Arduino as the master device.
The Pi must be the master device. I thought I was doomed until I read Nick Gammon's excellent page about SPI, which demonstrates two Arduinos talking to each other.
Also, the SPI transfer() command would suggest you CAN write from the Arduino.
I'm now at the stage where all the links on the first 4 pages of Google results show as "followed" - so it's not for lack of Googling!
In theory, shouldn't this work if I use the READ method on the Pi end? (Note: this is just one of many, many attempts, not the only one!)
On the Arduino:
#include <SPI.h>

void setup (void)
{
    SPI.begin();
    pinMode(MISO, OUTPUT);
    // turn on SPI in slave mode
    SPCR |= _BV(SPE);
}
void loop (void) {
    byte data[] = {0x00, 0x00, 0x00, 0x00}; // this is 32 bits (8 bits/byte * 4 bytes)
    // Transfer 32 bits of data
    for (int i = 0; i < 4; i++) {
        SPI.transfer(data[i]); // send 8 bits
    }
}
And on the Pi end of things:
import wiringpi2
wiringpi2.wiringPiSPISetup(1,5000)
stuff = wiringpi2.wiringPiSPIDataRW(1,'\n')
print stuff
WiringPi says the incoming data will overwrite my data, and SPIDataRW takes exactly 2 inputs, so shouldn't I be getting "test" back?
What am I missing here? Any pointers greatly appreciated.
The SPI library assumes you want the Arduino to act as a master. You can't use it to make an Arduino act as a slave. Sometimes you have to dive past the libraries into the chip's datasheet and see how things work (and then, ideally, make a library from all your troubles).
An SPI slave device has to react to the master initiating the communication.
So the Pi, as the SPI master, will have to send dummy bytes over the MOSI line and read what the Arduino replies on the MISO line; i.e., the master initiates communication.
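For illustration, a sketch of the Pi side in C, using the wiringPi SPI calls the Python wrapper binds to (channel 1 as above; the clock speed here is an arbitrary choice):

    #include <stdio.h>
    #include <wiringPiSPI.h>

    int main(void)
    {
        if (wiringPiSPISetup(1, 500000) < 0)   // channel 1, 500 kHz
            return 1;

        unsigned char buf[4] = {0, 0, 0, 0};   // dummy bytes to clock the slave
        wiringPiSPIDataRW(1, buf, 4);          // replies overwrite buf in place
        printf("%02x %02x %02x %02x\n", buf[0], buf[1], buf[2], buf[3]);
        return 0;
    }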
On the Arduino side you can turn on the SPI interrupt with:
SPCR |= _BV(SPIE);
It's built into the ATmega328 chip, so include this next bit on the Arduino side to see incoming messages and set the response for the next message. The data that the Arduino SPI slave responds with is whatever is in the data register when the master sends the message.
volatile int gCurrentSpiByte; // or set up a buffer or whatever
volatile byte buf[32];        // response data for the master (declaration assumed)
volatile int messageCount;

ISR (SPI_STC_vect)
{
    gCurrentSpiByte = SPDR;     // grab byte from SPI Data Register
    SPDR = buf[messageCount++]; // set the data to be sent out on the NEXT message
}
And remember, you GOTTAGOFAST. If the Arduino doesn't exit that interrupt service routine before the next SPI message comes in, it all goes to hell.
Also also, check to make sure the clock's polarity and phase are the same between the Pi and the Arduino (otherwise known as modes 0-3).
| 7 | 6 | 5 | 4 | 3 | 2 | 1 | 0 |
| SPIE | SPE | DORD | MSTR | CPOL | CPHA | SPR1 | SPR0 |
SPIE - Enables the SPI interrupt when 1
SPE - Enables the SPI when 1
DORD - Sends data least significant bit first when 1, most significant bit first when 0
MSTR - Sets the Arduino in master mode when 1, slave mode when 0
CPOL - Sets the data clock to be idle when high if set to 1, idle when low if set to 0
CPHA - Samples data on the falling edge of the data clock when 1, rising edge when 0
SPR1 and SPR0 - Set the SPI speed: 00 is fastest (fosc/4, i.e. 4 MHz on a 16 MHz board), 11 is slowest (fosc/128, i.e. 125 kHz)
So to turn on SPI, the SPI interrupt, and to set the polarity to... whatever that is...
SPCR |= _BV(SPIE) | _BV(SPE) | _BV(CPOL) ;
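Putting those pieces together, a minimal echo-slave sketch might look like this (my sketch, assuming an ATmega328-based board and SPI mode 0; note SPI.begin() is deliberately not called, since it configures master mode):

    volatile byte received;

    void setup (void)
    {
        pinMode(MISO, OUTPUT);        // the slave drives only MISO
        SPCR = _BV(SPE) | _BV(SPIE);  // SPI on, interrupt on, MSTR=0 -> slave
        SPDR = 0;                     // preload the first reply byte
    }

    ISR (SPI_STC_vect)
    {
        received = SPDR;  // byte just clocked in from the master
        SPDR = received;  // echo it back on the next transfer
    }

    void loop (void) { }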
Anyway, I spent a couple days banging around with the Arduino SPI and that's what I learned.

Arduino serial: inverted 7E1. Possible?

I'm trying to talk serial with an SDI-12 device, which requires inverted-logic serial at 1200 baud with seven data bits, even parity, and one stop bit (7E1).
From the datasheet:
SDI-12 communication sends characters at 1200 bits per second. Each character has 1 start bit, 7 data bits (LSB first), 1 even parity bit, and 1 stop bit (Active low or inverted logic levels):
All SDI-12 commands and response must adhere to the following format on the data line. Both the command and response are preceded by an address and terminated by a carriage return line feed combination.
Is this possible with the Serial or SoftwareSerial libraries? I am trying to avoid additional hardware (beyond a level shifter to 3.3 V), but I will use some if it is the only way.
I have seen that SoftwareSerial can do inverted, and Serial can do 7E1, but I can't find if either can do both.
I have access to an Arduino Mega (R2) and an Arduino Uno (R3).
Here is the device I want to communicate with: http://www.decagon.com/products/sensors/soil-moisture-sensors/gs3-soil-moisture-temperature-and-ec/ and here, http://www.decagon.com/assets/Uploads/GS3-Integrators-Guide.pdf is the document explaining the protocol. Page 6 talks about its implementation of SDI.
I'm not familiar with the Arduino; however, the SDI-12 physical layer is inverted from the standard TTL levels, probably for two reasons:
1. Since the idle voltage is 0 V, this results in lower standby power (due to the nominal pull-down resistors in a typical SDI-12 sensor).
2. It facilitates simple bus 'sniffing' using a standard RS-232 serial port.
Short of bit-banging a 5 V I/O pin - yes, if using a standard microcontroller UART you will need an external inverter (or two) and a 3-state buffer, possibly with level shifting, depending on your hardware.
Thumbs down to the Wikipedia entry - SDI-12 uses entirely standard UART bit timings (very much like RS-232), just different signal levels (0-5 V); see point #2 above. However, there are specific break sequences and strict timing requirements, which make firmware development more difficult.
If you are serious about SDI-12 firmware development, you may want to invest in an SDI-12 Verifier. A thorough study of the specification is essential.
A little late... but better late than never.
I have actually just written a library for exactly that (actually exactly that, including those sensors, so it should work with the included examples).
https://github.com/joranbeasley/SDISerial (Arduino Library)
#include <SDISerial.h> // https://github.com/joranbeasley/SDISerial (Arduino library)
#include <string.h>

#define DATA_PIN 2

SDISerial connection(DATA_PIN);

char output_buffer[125]; // just for UART prints
char tmp_buffer[4];
char* sensor_info;

// initialize variables
void setup() {
    connection.begin();
    Serial.begin(9600); // so we can print to the standard UART
    // small delay to let the sensor do its startup stuff
    delay(3000); // 3 seconds should be more than enough
    sensor_info = connection.sdi_query("0I!", 1000); // get sensor info for address 0
}

// main loop
void loop() {
    // print to UART
    Serial.println("Begin Command: ?M!");
    // send a measurement query (M) to the first device on our bus
    char* resp = connection.service_request("0M!", "0D0!"); // get a measurement from address 0
    sprintf(output_buffer, "RECV: %s", resp ? resp : "No Response Received!!");
    Serial.println(output_buffer);
    delay(10000); // sleep for 10 seconds before the next read
}

GSM interfacing with ATmega16

I am working on interfacing a GSM SIM900D with an ATmega16. Initially I built the circuit using a MAX232 on a breadboard, then connected it to my PC using a serial port. I tested AT commands; the commands worked perfectly in HyperTerminal and I was able to send SMS from it. Then I tested it in Proteus and it worked perfectly there too.
I am using CodeVisionAVR as the compiler. The GSM module works at 9600 baud, but the problem is that in the compiler I have to keep the baud rate at 4800 (clock = 1 MHz), and in the Proteus COMPIM (physical baud = 9600, virtual baud = 4800); only then does it work. When I run it on hardware (breadboard) it doesn't work, since I have set the baud to 4800. I don't know how to set the baud rate for the hardware. I tried 9600 baud for the hardware in the compiler, but then it doesn't send SMS at all. Kindly tell me what I should do.
On the ATmega16 (and other ATmegas), the serial baud rate is set via the UBRRH and UBRRL registers plus the U2X bit in the UCSRA register. The detailed description of how this works starts on page 146 of the ATmega16 datasheet. Basically, UBRR is a 16-bit register and so must be accessed separately via its 8-bit parts, UBRRH (the high byte) and UBRRL (the low byte). The value you want to put into these registers (and the U2X bit in the UCSRA register) depends on:
the clock rate
the desired baud rate.
For a 1 MHz clock and 9600 baud there are two options (see table 68 on page 168 of the datasheet): U2X cleared with UBRR set to 6, or U2X set with UBRR set to 12. The latter option results in baud rate generation that is closer to the desired baud rate (0.2% error), and is therefore recommended. Consequently, the code you want is:
UBRRH = 0;
UBRRL = 12;
UCSRA |= (1 << U2X);
There is a nasty gotcha lurking here: as the datasheet states, UBRRH and UCSRC are the same register. UCSRC controls parity, stop bits, and other important settings. Therefore if you ever need to write to UCSRC, make sure that you set the URSEL bit at the same time:
UCSRC = (1<<URSEL) | (...other bits...)
or
UCSRC |= (1<<URSEL) | (...other bits...)
Otherwise you will clobber your UBRRH register and wonder why your baud rate is not what you expected.
But you can also make use of the AVR libc code, which provides a ready-made way of setting the baud rate on AVR; see util/setbaud.html in the avr-libc documentation.
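For example, a minimal sketch using that header (my sketch, assuming avr-gcc; F_CPU and BAUD must be defined before the include, and on the ATmega16 the URSEL bit must be set when writing UCSRC):

    #define F_CPU 1000000UL   // 1 MHz clock, as above
    #define BAUD  9600
    #include <avr/io.h>
    #include <util/setbaud.h> // computes UBRR*_VALUE and USE_2X at compile time

    void uart_init(void)
    {
        UBRRH = UBRRH_VALUE;
        UBRRL = UBRRL_VALUE;
    #if USE_2X
        UCSRA |= (1 << U2X);  // double-speed mode, needed for 0.2% error at 1 MHz
    #else
        UCSRA &= ~(1 << U2X);
    #endif
        UCSRB = (1 << RXEN) | (1 << TXEN);                  // enable RX and TX
        UCSRC = (1 << URSEL) | (1 << UCSZ1) | (1 << UCSZ0); // 8N1 frame
    }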
Check the datasheet: the error rate is too high when you use 9600 baud with a 1 MHz clock; that is the main problem. Use 8, 12 or 16 MHz if possible, and check the datasheet. And don't forget to program the fuse bits related to the crystal (XTAL) frequency; if you don't set them properly, it will not work.
If you need more help, ask.
