STM32H7 | Portenta H7 Data missing during DMA transfer (ADC to Memory) - arduino

I'm currently working on the STM32H747XI (Portenta H7). I'm programming ADC1 with DMA1 to get 16-bit data at 1 Msps.
I'm sorry, I can't share my entire code, so I will try to describe my configuration as precisely as possible.
I'm using ADC1 triggered by a 1 MHz timer. The ADC is working in continuous mode with the DMA in circular, double-buffer mode. I tried direct mode as well as burst mode with a full FIFO. I have no DMA error interrupt and no ADC overrun.
My peripherals are running, but I'm stuck on two issues. First issue: I'm filling a buffer of 8192 uint16_t values and sending it over USB CDC with the Arduino function USBserial.Write(buf,len). In this case the USB transfer works, but some data in my buffer is wrong. The DMA increments the memory address but doesn't write, so I don't have missing samples, but the value is stale (it belongs to the old buffer).
You can see the data plot below:
transfer with buffer of 8192 samples
If I double the buffer size, this issue is fixed but another appears: the USB VCP transfer fails if the data buffer is longer than 16384 bytes. Some data gets cut off. I tried to work around this with different send sizes and delays, but it doesn't help; I still get the same kind of cut.
Here is the data plot of the same script with a longer buffer: transfer with a buffer of 16384 samples (32768 bytes)
Thank you for your help. I remain available.

For a quick check, try disabling the data cache. You're probably not managing the cache correctly, or you haven't disabled caching in the memory region where the DMA is writing.
Peripherals are not aware of the cache, so you must manage it manually. You also have to align the buffers to cache lines in this case.
Refer to AN4839
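To make that concrete, here is a minimal sketch of the invalidate-before-read pattern, assuming a 32-byte-aligned buffer and the ST HAL/CMSIS names (HAL_ADC_ConvHalfCpltCallback etc.); the Portenta core wraps this differently, so treat it as the shape of the fix rather than drop-in code:

// Buffer aligned to a 32-byte D-cache line; its size is a multiple of 32 bytes.
__attribute__((aligned(32))) static uint16_t adc_buf[8192];

// First half of the circular buffer is full: drop any stale cache lines
// so the CPU reads what the DMA actually wrote to SRAM.
void HAL_ADC_ConvHalfCpltCallback(ADC_HandleTypeDef *hadc)
{
    SCB_InvalidateDCache_by_Addr((uint32_t *)&adc_buf[0], sizeof(adc_buf) / 2);
    // ... queue the first half for the USB transfer ...
}

// Second half is full: invalidate and process it the same way.
void HAL_ADC_ConvCpltCallback(ADC_HandleTypeDef *hadc)
{
    SCB_InvalidateDCache_by_Addr((uint32_t *)&adc_buf[4096], sizeof(adc_buf) / 2);
    // ... queue the second half ...
}

The alternative is to place the buffer in a region the MPU marks non-cacheable (or in a RAM the D-cache doesn't cover), as AN4839 describes.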

Related

Is DMA the Correct Way to Receive High-Speed Digital Data on a Microprocessor?

I have been using the Teensy 3.6 microcontroller board (180 MHz ARM Cortex-M4 processor) to try and implement a driver for a sensor. The sensor is controlled over SPI and when it is commanded to make a measurement, it sends out the data over two lines, DOUT and PCLK. PCLK is a 5 MHz clock signal and the bits are sent over DOUT, measured on the falling edges of the PCLK signal. The data frame itself consists of 1,024 16-bit values.
My first attempt was a relatively naïve approach: I attached an interrupt to the PCLK pin looking for falling edges. When it detects a falling edge, it sets a bool flagging that a new bit is available and sets another bool to the value of the DOUT line. The main loop of the program assembles a uint16_t value from these bits and collects 1,024 of these values for the full measurement frame.
However, this program locks up the Teensy almost immediately. From my experiments, it seems to lock up as soon as the interrupt is attached. I believe that the microprocessor is being swamped by interrupts.
I think that the correct way of doing this is by using the Teensy's DMA controller. I have been reading Paul Stoffregen's DMAChannel library but I can't understand it. I need to trigger the DMA measurements from the PCLK digital pin and have it read in bits from the DOUT digital pin. Could someone tell me if I am looking at this problem in the correct way? Am I overlooking something, and what resources should I view to better understand DMA on the Teensy?
Thanks!
I put this on the Software Engineering Stack Exchange because I feel that this is primarily a programming problem, but if it is an EE problem, please feel free to move it to the EE SE.
Is DMA the Correct Way to Receive High-Speed Digital Data on a Microprocessor?
There is more than one source of 'high speed digital data'. DMA is not the globally correct solution for all data, but it can be a solution.
it sends out the data over two lines, DOUT and PCLK. PCLK is a 5 MHz clock signal and the bits are sent over DOUT, measured on the falling edges of the PCLK signal.
I attached an interrupt to the PCLK pin looking for falling edges. When it detects a falling edge, it sets a bool that a new bit is available and sets another bool to the value of the DOUT line.
This approach would be called 'bit bashing': you are using the CPU to sample the pins in software. It is a worst-case solution that I see many experienced developers implement. It will work with any hardware connection. Fortunately, the Kinetis K66 has several peripherals that may be able to assist you.
Specifically, the FTM, CMP, I2C, SPI and UART modules may be useful. These hardware modules can reduce the workload from processing every bit to processing groups of bits. For instance, the FTM supports a capture mode. The idea is to ignore the PCLK signal and just measure the time between edges on the data line. These times will be integer multiples of the bit period (one PCLK cycle). If the timer captures a two-bit period, then you know that two ones or zeros were sent.
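As an illustration of that capture idea (not K66-specific code; all the names here are made up), turning edge-to-edge times back into bits is just run-length decoding, where each captured interval becomes a run of identical bits and every edge toggles the level:

#include <stdint.h>

// 'intervals' holds timer-capture deltas between DOUT edges, 'bit_period' is the
// capture count for one PCLK period, 'first_level' is the line level before the
// first captured edge.
static void decode_edges(const uint32_t *intervals, int n_intervals,
                         uint32_t bit_period, int first_level,
                         uint8_t *bits_out, int *n_bits_out)
{
    int level = first_level;
    int n = 0;
    for (int i = 0; i < n_intervals; i++) {
        // Round the interval to the nearest whole number of bit periods.
        uint32_t run = (intervals[i] + bit_period / 2) / bit_period;
        while (run--) {
            bits_out[n++] = (uint8_t)level;   // emit one bit per period in the run
        }
        level = !level;                       // each captured edge toggles the line
    }
    *n_bits_out = n;
}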
Also, your signal looks like SSI, which is a 'digital audio' channel. Unfortunately, the K66 doesn't have an SSI module. Typical I2C is open drain and always has a start bit and a fixed word size. It may be possible to use it if you have some knowledge of the data and/or can attach some circuitry to fake a few bits (to be removed later).
You could use the UART and the time between characters to capture data. The received value will be a run of bits that aren't the start bit. However, it looks like this UART module requires stop bits (the SIM features are probably very limited).
Once you do this, you can decide between DMA, interrupts and polling. There is nothing faster than polling if the CPU uses the data. DMA and interrupts are needed if you have to multiplex the CPU between the data transfer and other work. DMA is better if the CPU doesn't need to act on most of the data, or if the work the CPU is doing is not memory intensive (number crunching). The cost of interrupts depends on your context-save overhead, which can be minimized depending on the facilities your main-line code uses.
Some glue circuitry to adapt the signal to one of the K66 modules could go a long way to making a more efficient solution. If you can't change the signal, another (NXP?) SOC with an SSI module would work well. The NXP modules usually support chaining to an eDMA module as well as interrupts.

Expanding PIC16F877A flash memory

I have been working on this project and the code has gotten so huge that the microcontroller's flash memory is full, so I want to know if there is any way I can connect an external EEPROM or any memory device that can give me more program memory.
Thanks in advance!
The only 8-bit PICs that can use external program memory are high-end parts in the PIC18F series - all 64-pin or more.
If a substantial portion of your code size consists of text or other data (rather than actual code), you could store the data on an external SPI or I2C EEPROM. This would be much slower than having the data internally, and less convenient to use - you'd have to manually send an address and then read bytes from the external chip; you couldn't just access the data as an array.
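For example, reading a block from a 24LC-style I2C EEPROM follows the usual 'random read' sequence; the i2c_* helpers below are hypothetical placeholders for whatever bit-banged or MSSP-based routines you already have:

// Hypothetical helpers: i2c_start(), i2c_restart(), i2c_stop(),
// i2c_write(byte) and i2c_read(ack) stand in for your own I2C routines.
void eeprom_read_block(unsigned int addr, unsigned char *dst, unsigned int len)
{
    i2c_start();
    i2c_write(0xA0);                 // control byte, R/W = 0: set the address
    i2c_write((addr >> 8) & 0xFF);   // address high byte
    i2c_write(addr & 0xFF);          // address low byte
    i2c_restart();
    i2c_write(0xA1);                 // control byte, R/W = 1: read data
    while (len--) {
        *dst++ = i2c_read(len != 0); // ACK every byte except the last
    }
    i2c_stop();
}

Compared with a const array in internal flash, every lookup costs a full I2C transaction, which is why this is slower and less convenient.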
The 16F877 is a rather old chip - you can certainly find ones with more capacity these days. A quick search on Microchip's part selector turns up several 16F chips with twice the program memory, such as the 16F1789. If you'd be willing to switch to the more powerful 18F series, you could double the program memory yet again - 18F4620, for example.

Multitasking in PIC24

I have a PIC24-based system equipped with a 24-bit, 8-channel ADC (google MCP3914 Evaluation Board for more details...).
I have got the board to sample all 8 channels, store the data in a 512x8 buffer and transmit the data to the PC using a USB module when the buffer is full (this is done by different interrupts).
The only problem is that when the MCU is transmitting data (the UART transmission interrupt has higher priority than the ADC reading interrupt), the ADC is not sampling, so there is data loss (the sample rate is around 500 samples/sec).
Is there any way to prevent this data loss? Maybe some multitasking?
Simply transmit the information to the UART register without using interrupts, by polling the TXIF bit:
while (PIR1.TXIF == 0);   // wait until the transmit register is empty
TXREG = data_byte;        // write the byte you want to send
The same applies to the ADC conversion: if you were using interrupts to start/stop a conversion, simply poll the required bits (ADON) and that's it.
The TX bits and AD bits may vary depending on your PIC.
That prevents the MCU from entering an interrupt service routine and losing 3-4 samples.
On the PIC24, an interrupt can be assigned one of 8 priority levels. Take a look at the corresponding section in the "Family Reference Manual" -> http://ww1.microchip.com/downloads/en/DeviceDoc/70000600d.pdf
Alternatively, you can use the DMA channels, which are very handy. You can configure your ADC to use DMA, so sampling and filling the buffer won't use any CPU time; the same goes for the UART, I believe.
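For instance, raising the ADC interrupt above the UART TX interrupt is just a matter of writing the IPCx bit fields; the exact IPC register and field names below are taken from typical PIC24/dsPIC33 headers and vary by device, so check your datasheet:

IPC3bits.AD1IP  = 6;   // ADC1 conversion-done interrupt at priority 6
IPC3bits.U1TXIP = 3;   // UART1 TX interrupt at a lower priority 3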
http://ww1.microchip.com/downloads/en/DeviceDoc/39742A.pdf
http://esca.atomki.hu/PIC24/code_examples/docs/manuallyCreated/Appendix_H_ADC_with_DMA.pdf
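A heavily abridged sketch of the ping-pong DMA setup from that appendix might look like the following; the register names are from the dsPIC33F/PIC24H family examples, and IRQ numbers and DMA-RAM attributes differ between devices, so treat this as the shape of the configuration rather than drop-in code:

unsigned int BufferA[8] __attribute__((space(dma)));   // ping buffer in DMA RAM
unsigned int BufferB[8] __attribute__((space(dma)));   // pong buffer in DMA RAM

void dma0_init(void)
{
    DMA0CONbits.AMODE = 0;                 // register indirect with post-increment
    DMA0CONbits.MODE  = 2;                 // continuous mode, ping-pong between A and B
    DMA0PAD = (int)&ADC1BUF0;              // peripheral address: ADC result buffer
    DMA0CNT = 7;                           // 8 transfers per block (one per channel)
    DMA0REQ = 13;                          // request source: ADC1 convert done
    DMA0STA = __builtin_dmaoffset(BufferA);
    DMA0STB = __builtin_dmaoffset(BufferB);
    IFS0bits.DMA0IF = 0;
    IEC0bits.DMA0IE = 1;                   // interrupt once per filled block
    DMA0CONbits.CHEN = 1;                  // enable the channel
}

Since the MCP3914 is actually read over SPI rather than the internal ADC, the same kind of channel can be pointed at the SPI receive buffer and triggered by the SPI request source instead; the structure of the setup stays the same.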

TCP Receiver Window size

Is there a way to change the TCP receive window size using any of the Winsock API functions? The RCVBUF value just increases the capacity of the temporary storage. I need to improve the speed of data transfer and I thought increasing the receive window size would help, but I couldn't find a way to do it using the Winsock API. Is there a way to do it or should I modify the registry?
The RCVBUF value just increases the capacity of the temporary storage.
No, the RCVBUF value sets the size of the receive buffer, which is the maximum receive window. No 'just' about it. The receive window is the amount of data the sender may send, which the receiver has to be able to store somewhere ... guess where? In the receive buffer.
On Windows it was historically 8k for decades, which was far too low, and gave rise to an entire sub-industry of 'download tweaks' which just raised the default (some of them also played dangerously with other settings, which isn't usually a good idea).
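As a sketch, raising the receive buffer (and therefore the offered window) is just a setsockopt() call; do it before the connection is established so window scaling can be negotiated in the handshake. The buffer size chosen here is only an example:

#include <winsock2.h>
#pragma comment(lib, "ws2_32.lib")

// Assumes WSAStartup() has already been called.
SOCKET make_socket_with_big_window(void)
{
    SOCKET s = socket(AF_INET, SOCK_STREAM, IPPROTO_TCP);
    int rcvbuf = 1 << 20;   // e.g. 1 MiB; size it to your bandwidth-delay product
    // Set SO_RCVBUF before connect()/listen() so the scaled window takes effect.
    setsockopt(s, SOL_SOCKET, SO_RCVBUF, (const char *)&rcvbuf, sizeof(rcvbuf));
    return s;
}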

Is it stable to change I/O direction on microcontroller repeatedly?

I'm new to microcontroller programming. I have interfaced my microcontroller board to another device that returns a status based on the command sent to it, but this status is provided on the same I/O pins that are used to send data. Basically, I have an 8-bit data bus that is used as an output from the microcontroller, but for certain commands I get a status back on one of the data lines if I choose to read it. So I would need to change the direction of this one line to read the status, converting the line from an output to an input and then back to an output. Is this acceptable practice, or will changing the I/O direction this frequently cause instability?
Thanks.
There should not be any problem with changing the direction of the I/O line to read the status returned by the peripheral, provided that you switch the line to an input before the peripheral starts to drive it, and do not try to drive the line as an output until the peripheral stops driving it. What you must avoid is contention between the two drivers, i.e. the two ends being driven to opposite states by the processor and the peripheral. This would result in, at best, a large spike in power consumption or, at worst, blown pin-driver circuitry in the processor, the peripheral, or both.
You do not say what the processor or peripheral are, so I cannot tell whether there are any control bits in the interface that make the remote device output the status, i.e. whether you can know when the peripheral is driving the line.
I've done this on digital I/O pins without any problems but I'm very far from an expert on this. It probably depends entirely on which microcontroller you are using though.
Yes, it's perfectly fine to change the I/O direction of a microcontroller pin repeatedly.
That's the standard method of communicating over open-collector buses such as I2C and the iButton. (see PICList: busses for links to assembly-language code examples).
Transmit a 0 bit: set the output LATx bit to 0, then set the TRISx bit to OUTPUT.
Transmit a 1 bit: keep the output LATx bit at 0, and set the TRISx bit to INPUT (let the external pull-up resistor pull the line high).
Listen for a response from the peripheral: keep the output LATx bit at 0, and set the TRISx bit to INPUT. The external pull-up resistor pulls the line high when the peripheral is transmitting a 1, or the peripheral pulls the line low when it is transmitting a 0. Read the bit from the PORTx pin.
If both ends of the bus correctly follow this protocol (in particular, if neither end actively drives the line to high), then you never have to worry about contention or current spikes.
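A minimal sketch of that scheme in C, assuming XC8-style register names, a pin RB0 that has a LATx register, and an external pull-up doing the work of driving the line high:

void bus_init(void)
{
    LATBbits.LATB0  = 0;   // the latch stays 0 forever; only TRIS changes
    TRISBbits.TRISB0 = 1;  // start released: input, pulled high by the external resistor
}

void bus_write_bit(unsigned char bit)
{
    TRISBbits.TRISB0 = bit ? 1 : 0;   // 1: release the line (pull-up makes it high)
                                      // 0: drive the line low
}

unsigned char bus_read_bit(void)
{
    TRISBbits.TRISB0 = 1;             // release the line before listening
    return PORTBbits.RB0;             // sample whatever the peripheral or pull-up puts on it
}

Because the pin only ever drives low or floats, neither end can actively drive high, which is what keeps contention off the table.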
It's important to remember that any high-speed I/O switching generates EMI.
Depending on the switching frequency, board layout and device sensitivities, this EMI can affect the performance and reliability of your application.
If you are having problems in your application, use an oscilloscope to check for radiated EMI on your board traces.
