Atmel SAMD21 DMAC writeback failure?

I'm using multiple DMA channels in an Atmel SAMD21G18A (in an Arduino MKR1000), and I'm seeing some obviously incorrect behaviour.
DMA channels 2 & 3 are servicing a SERCOM SPI channel, using single block transfers with a completion interrupt.
DMA channel 0 continuously writes a sequence of values to the DAC, triggered by a TCC channel. It uses a single descriptor for the block, which chains to itself in order to run continuously at the rate determined by the TCC (10 µs per beat, but changing this doesn't prevent the problem).
The problem I'm seeing is that after running for a variable amount of time, usually anywhere from 5 seconds to a couple of minutes, channel 0 stops - indicating no error, just TCMPL (transfer complete). If I enable the TCMPL interrupt, the interrupt is issued when the channel stops - but not before, while it's happily repeating. Wiggling GPIOs in the DMAC ISR shows that the failure always occurs just after channel 2 or 3 has completed a block transfer.
When I examine the descriptor writeback array, I find that the channel data for channel 0 has been written with the exact channel data for either channel 2 or channel 3 - which, representing a completed transfer, causes channel 0 to deactivate. None of the surrounding memory is affected, and the same problem occurs if I assign different DMA channels.
I'm aware of item 1.7.2 in the Silicon Errata, but I'm applying the specified workaround by ensuring that the channel number of the new channel enabled (i.e. channel 2 or 3) is greater than the other channel numbers (channel 0).
I assume I'm doing something else wrong - I suspect the self-chaining has something to do with it - but while I'm working up a minimal example to share, I wonder if anybody has come across a similar issue?
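In the meantime, the channel 0 setup is roughly as follows (a stripped-down sketch using the CMSIS register definitions, not my actual code - the buffer size and the TCC0 overflow trigger stand in for what I'm really using):

// Descriptor and writeback arrays - must be 128-bit (16-byte) aligned.
// DMAC BASEADDR/WRBADDR point at these two arrays (DMAC init not shown).
static DmacDescriptor descriptor_section[4] __attribute__((aligned(16)));
static DmacDescriptor writeback_section[4]  __attribute__((aligned(16)));
static uint16_t dac_buf[64];   // waveform samples (placeholder size)

void dma0_dac_init(void)
{
    DmacDescriptor *d = &descriptor_section[0];
    d->BTCTRL.reg   = DMAC_BTCTRL_VALID
                    | DMAC_BTCTRL_BEATSIZE_HWORD      // 16-bit beats into DAC DATA
                    | DMAC_BTCTRL_SRCINC              // step through the buffer
                    | DMAC_BTCTRL_BLOCKACT_NOACT;     // no per-block action while looping
    d->BTCNT.reg    = sizeof(dac_buf) / sizeof(dac_buf[0]);
    d->SRCADDR.reg  = (uint32_t)dac_buf + sizeof(dac_buf);  // end address, since SRCINC is set
    d->DSTADDR.reg  = (uint32_t)&DAC->DATA.reg;
    d->DESCADDR.reg = (uint32_t)d;                    // chain to itself -> repeats forever

    DMAC->CHID.reg    = DMAC_CHID_ID(0);
    DMAC->CHCTRLB.reg = DMAC_CHCTRLB_TRIGACT_BEAT     // one beat per trigger
                      | DMAC_CHCTRLB_TRIGSRC(TCC0_DMAC_ID_OVF);
    DMAC->CHCTRLA.reg = DMAC_CHCTRLA_ENABLE;
}

Channels 2 and 3 are single-block transfers as described above (DESCADDR = 0, TCMPL interrupt enabled).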

Related

Using 3 different communication protocols in the same MCU

For a project I need a device that communicates on a CAN bus network, an Ethernet network, and over RS-232. I want to use one single MCU that will act as the main unit of the CAN bus star topology and the Ethernet star topology, and that MCU will also transfer the RS-232 data that comes to it to another device. Now I want to use high-speed CAN, which can be up to 1 Mbit per second. However, RS-232 is at most 20 kbaud. I wonder if it is doable with 1 MCU to handle 3 different communications (CAN bus, Ethernet and RS-232). I am afraid of getting overrun with data at some point. I can buffer data short term if data comes in bursts that can be averaged out. For continuous data where I'll never be able to keep up, I'll need to discard messages, perhaps in a managed way. But I do not want to discard any data. So my question is: Would using 1 MCU for this case work? And are there any software tricks that would help me with this case? (Like giving CAN bus a higher priority etc.)
Yes, this can be done with a single MCU. Even a simple MCU should easily be able to handle data rates of 1 Mbps. Most likely you will want to use DMA-enabled transfers, so the CPU core only needs to act when the transmission of a chunk of data has completed.
The problem of being overrun by data due to the mismatch in data rate is a separate topic:
If the mismatch persists, no system can handle it, no matter how capable.
If the mismatch is temporary, it's just a function of the available buffer size.
So if the worst case you want to handle is 10 s of incoming data at 1 Mbps (with an outgoing rate of 20 kbps), then you will need 10 s x (1 Mbps - 20 kbps) = 9.8 Mbit = 1.225 MByte of buffer memory.
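As a concrete sketch of the short-term buffering idea (illustrative names, not a specific library): a plain ring buffer, filled from the fast side's receive ISR or DMA-complete callback and drained whenever the slow side can accept another byte.

#include <stdint.h>
#include <stdbool.h>

#define BUF_SIZE 4096u   // size this from: burst duration x (input rate - output rate)

static uint8_t buf[BUF_SIZE];
static volatile uint32_t head;   // written only by the producer (e.g. CAN RX ISR)
static volatile uint32_t tail;   // written only by the consumer (e.g. RS-232 TX loop)

bool buf_put(uint8_t b)          // call from the fast side
{
    uint32_t next = (head + 1u) % BUF_SIZE;
    if (next == tail)
        return false;            // full: the burst exceeded the budgeted buffer
    buf[head] = b;
    head = next;
    return true;
}

bool buf_get(uint8_t *b)         // call whenever the slow side is ready
{
    if (head == tail)
        return false;            // empty
    *b = buf[tail];
    tail = (tail + 1u) % BUF_SIZE;
    return true;
}

With a single producer and a single consumer (and index writes that are atomic on the target), this needs no extra locking, since each index is written from only one context.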

Pause SAMD21 TCC counter

The Atmel SAMD21 TCC peripheral provides a STOP command, which pauses the counter. The counter can be resumed with a RETRIGGER command.
When STOP is issued, the TCC enters a fault state, in which the outputs are either tristated, or driven to states specified in a config register. Presumably this mechanism is designed to support a fixed failsafe output state.
In my case I want the output pins to freeze in the state they're in at the time of the STOP command. The only way I can see to do this is to update the configured fault output state register every time the outputs are updated - requiring interrupt processing, which rather defeats the purpose of much of the TCC's output waveform extension architecture, as well as being a processing load I'd prefer to avoid. There are other complications too, such as accounting for the dead-time mechanism, and hardware/software races.
So I've been looking at ways to achieve this that don't involve the STOP command - but I can't see any other way of stopping the counter. There's no way to gate the peripheral clock input, and disabling it in GCLK is ruled out as it also runs TCC1. (And who knows what other effects this would have.) Negating the ENABLE bit, besides being overkill, unsurprisingly also tristates the outputs. Modifying the configuration in various other ways usually requires writing to enable-protected registers, thus requiring disabling the peripheral first.
(One idea I haven't investigated yet is to drive the counter from the event system, and control the event generation/gating instead.)
So: is there any way of pausing the peripheral in its current state, while maintaining the state of the output pins?
All that I can think of to try is the async 'COUNT' event, which sounds like it is a gate for the clock to the counter.
(page numbers from the 03/2016 manual)
31.6.4.3. Events, p.712;
Count during active state of an asynchronous event (increment or decrement, depending on counter direction). In this case, the counter will be incremented or decremented on each cycle of the prescaled clock, as long as the event is active.
31.8.9. Event Control, p.734;
EVCTRL register,
Bits 2:0 – EVACT0[2:0]: Timer/Counter Event Input 0 Action
0x5 COUNT (async) Count on active state of asynchronous event
The downside is that software events have to be synchronous.
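Roughly what I'd expect the configuration to look like (a sketch using the CMSIS definitions; the EVSYS channel routing and the choice of event generator are left out, and would have to use an asynchronous path as noted above):

// EVCTRL is enable-protected, so write it while CTRLA.ENABLE is 0.
// EVACT0 = COUNT (0x5): the counter advances only while event input 0 is active,
// so the event line effectively gates the prescaled clock to the counter.
TCC0->EVCTRL.reg |= TCC_EVCTRL_EVACT0_COUNT   // count on active state of async event
                 |  TCC_EVCTRL_TCEI0;         // enable timer/counter event input 0

// The EVSYS channel feeding TCC0 event input 0 must be configured as an
// ASYNCHRONOUS path, and driven by a hardware event generator - software
// events are synchronous, so they can't be used as the gate.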

SAM4S - Is DMA deterministic in time?

I'm using the DMA (described as PDC in the datasheet) of SAM4SD16C with USART 0 peripheral.
I've set up a timer which generates an interrupt every ms. Every 5 ms a data transfer should be performed via DMA.
Another interrupt should occur when the TXEMPTY flag is set.
To see when the transmission starts and ends, I toggle an output and watch it on an oscilloscope. I then realized that the end of reception varies in time by 20 µs (my main clock is 120 MHz), which is not acceptable in my project. Meanwhile, the start of transmission is precise to 100 ns, so there is no problem on that point.
I'm wondering if there is a way to have a better control on DMA time transfer.
As discussed in the comments above, the imprecision of the end-of-reception instant is due to the baud rate: the jitter is on the order of one bit period (for example, at 50 kbaud one bit period is exactly 20 µs), plus probably some additional bus idle time.

Analog-to-digital conversion with AD7091 on SPI

I am trying to use an AD7091R-8 ADC chip with SPI.
The procedure for getting a converted value is described in the datasheet and says:
Reset the chip.
Bring the CONVST line low for 600 ns and then bring it high again.
For the channels enabled in the channel register (I have enabled the last 3 channels), start clocking out the data, which is contained in 2 bytes.
So I bring the CONVST line low for 1 ms, then bring it up and wait for 1 ms, and start clocking out data by pulling CS low, clocking out the frames (16 bits each), and then bringing CS back up.
In those 16 bits that clock out, the first 3 bits should contain the channel ID. I do get it, but only in the first frame; the other 2 frames have no channel ID, which suggests something has gone wrong.
After starting CONVST and clocking out data, does the chip auto-increment through the ADC channel results, or does each channel result have to be addressed somehow?
Could someone please give a hint on how data should be retrieved from this ADC after doing CONVST?
If you look at the diagram on page 36 of the datasheet (Channel sequencer), you will find your answer.
You need to do the following sequence:
Toggle CONVST
Bring CS low, write the channel register on SDI, ignore SDO, bring CS high
Then for each channel that you want to read:
Toggle CONVST
Bring CS low, clock a read/NOP of the NOP register on SDI, get the next channel's result on SDO, bring CS high
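In C that sequence looks roughly like this (a sketch against a generic SPI driver; convst_low/high, cs_low/high, the delay helpers and spi_transfer16 are placeholders for your own HAL, and the channel register is assumed to have been written already as in the first step):

#include <stdint.h>

extern void convst_low(void), convst_high(void), cs_low(void), cs_high(void);
extern void delay_ns(unsigned int ns), delay_us(unsigned int us);
extern uint16_t spi_transfer16(uint16_t out);   // full-duplex 16-bit transfer

// One conversion/readback cycle of the channel sequencer.
static uint16_t ad7091_read_next(void)
{
    convst_low();                 // start a conversion: >= 600 ns low pulse
    delay_ns(600);
    convst_high();
    delay_us(1);                  // wait out the conversion time

    cs_low();                                  // one 16-bit frame per CONVST pulse
    uint16_t frame = spi_transfer16(0x0000);   // NOP on SDI, result on SDO
    cs_high();
    return frame;
}

// The sequencer advances by one enabled channel per CONVST/read cycle;
// it does not stream all enabled channels after a single CONVST.
void read_three_channels(uint16_t results[8])
{
    for (int i = 0; i < 3; i++) {
        uint16_t frame = ad7091_read_next();
        uint8_t  ch    = frame >> 13;          // the 3-bit channel ID mentioned above
        results[ch & 0x7] = frame & 0x0FFF;    // 12-bit conversion result
    }
}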

Multitasking in PIC24

I have a PIC24-based system equipped with a 24-bit, 8-channel ADC (google MCP3914 Evaluation Board for more details...).
I have got the board to sample all 8 channels, store the data in a 512x8 buffer, and transmit the data to a PC using a USB module when the buffer is full (this is done by different interrupts).
The only problem is that while the MCU is transmitting data (the UART transmission interrupt has a higher priority than the ADC reading interrupt), the ADC is not sampling, hence there is data loss (the sample rate is around 500 samples/sec).
Is there any way to prevent this data loss? Maybe some multitasking?
Simply transmit the information to the UART register without using interrupts, by polling the TXIF bit:
while (PIR1.TXIF == 0);    // wait until the transmit register is free
TXREG = data_byte;         // the byte you want to send
The same applies to the ADC conversion: if you were using interrupts to start/stop a conversion, simply poll the required bits (ADON) and that's it.
The TX bits and AD bits may vary depending on your PIC.
That prevents the MCU from entering an interrupt service routine and losing 3-4 samples.
In PIC24 an interrupt can be assigned one of the 8 priorities. Take a look at the corresponding section in the "Family Reference Manual" -> http://ww1.microchip.com/downloads/en/DeviceDoc/70000600d.pdf
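For example, to let the ADC reading interrupt pre-empt the UART transmission interrupt, something like this (the exact IPC register and field names depend on the specific PIC24 and on which UART/ADC module you use):

IPC3bits.AD1IP  = 6;   // ADC1 interrupt: high priority
IPC3bits.U1TXIP = 3;   // UART1 TX interrupt: lower priority, so it can't hold off sampling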
Alternatively you can use DMA channels, which are very handy. You can configure your ADC to use DMA, and thus sampling and filling the buffer won't use any CPU time; the same goes for the UART, I believe.
http://ww1.microchip.com/downloads/en/DeviceDoc/39742A.pdf
http://esca.atomki.hu/PIC24/code_examples/docs/manuallyCreated/Appendix_H_ADC_with_DMA.pdf
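A rough sketch of the ADC-to-DMA route covered by that last appendix (dsPIC33F/PIC24H-style register names; the ADC itself is assumed to be configured for DMA already, and for the external MCP3914 the same pattern would apply to the SPI module rather than ADC1):

// Conversion results land in this buffer with no CPU involvement.
static unsigned int adc_dma_buf[8] __attribute__((space(dma)));

void adc_dma_init(void)
{
    DMA0CONbits.AMODE = 0;                         // register indirect, post-increment
    DMA0CONbits.MODE  = 0;                         // continuous, no ping-pong
    DMA0PAD = (volatile unsigned int)&ADC1BUF0;    // peripheral address: ADC1 result buffer
    DMA0REQ = 13;                                  // request source: ADC1 conversion done
    DMA0CNT = 7;                                   // 8 transfers per block
    DMA0STA = __builtin_dmaoffset(adc_dma_buf);    // buffer offset in DMA RAM
    IFS0bits.DMA0IF = 0;
    IEC0bits.DMA0IE = 1;                           // one interrupt per filled block, not per sample
    DMA0CONbits.CHEN = 1;                          // enable the channel
}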
