Edge Impulse is not providing Arduino accelerometer continuous code on build

I am working with the Arduino Nano RP2040. Since Edge Impulse doesn't specifically provide a model library for the Nano RP2040, I export the model libraries for the Nano BLE and, with some tweaks, deploy them on the Nano RP2040, which has been working fine.
I am working on models based on accelerometer data. Previously, whenever I got a model from Edge Impulse, it included a nano_ble_accelerometer_continuous file, and with some tweaks it ran on the Nano RP2040. But today, when I built a new model, the library didn't contain nano_ble_accelerometer_continuous, only nano_ble_accelerometer.
I thought there might be a problem with my training data, so I went back to a model I created at the start of the month, which had previously come with the nano_ble_accelerometer_continuous file. But when I rebuilt it now, I again got only nano_ble_accelerometer for that model too.
Am I going wrong anywhere, or has there been a major update? I need the accelerometer_continuous code.

Late to the party, but this was resolved a few weeks back, and the continuous sketch is now included again.

Related

How to get started with a drawing robot

I am a beginner in robotics, and I want to program a robot arm to draw a picture on arbitrary objects I present to it.
I have an Intel RealSense camera, will receive a dobot.cc robot arm in the next few days, and thought about using ROS as a base, MoveIt for movements, and the PCL library for object detection.
How do I connect all of these together? Are there any particularly interesting tutorials that you would recommend? Anything I should try out up front?
Also, I suppose I will need custom code that detects the target object in the point cloud, calculates how the picture should be placed on the object, and then uses MoveIt to follow the target path. Where would this code go? (See the sketch below.)
Any help would be appreciated.
Thanks,
Gregor
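Not a full answer, but a rough picture of where that custom code typically lives: a standalone ROS node (C++) that subscribes to the RealSense point cloud topic, runs your PCL-based detection, and publishes a target pose or path that a MoveIt-driven node then executes. Everything in the sketch below is hypothetical (node name, topic name, and the detection itself depend on your setup):

// Hypothetical ROS node: receives the RealSense cloud; the PCL object detection
// and the picture-placement computation would go in the callback.
#include <ros/ros.h>
#include <sensor_msgs/PointCloud2.h>
#include <pcl_conversions/pcl_conversions.h>
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>

void cloudCallback(const sensor_msgs::PointCloud2ConstPtr& msg)
{
    // Convert the ROS message into a PCL cloud and run the detection here.
    pcl::PointCloud<pcl::PointXYZ> cloud;
    pcl::fromROSMsg(*msg, cloud);
    ROS_INFO("received cloud with %zu points", cloud.size());

    // ... detect the target object, work out where the picture should sit on it,
    // and publish a pose or path for a MoveIt-driven node to follow ...
}

int main(int argc, char** argv)
{
    ros::init(argc, argv, "drawing_target_detector");   // hypothetical node name
    ros::NodeHandle nh;
    // The topic name depends on how the RealSense driver is launched.
    ros::Subscriber sub = nh.subscribe("/camera/depth/points", 1, cloudCallback);
    ros::spin();
    return 0;
}

MoveIt's own move_group node would run alongside this and consume whatever pose or path the detector publishes.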
Meanwhile, I found an excellent book on the topic:
http://www.amazon.de/Learning-ROS-Robotics-Programming-Second/dp/B00YSIL6VM/

dsPIC33FJ128MC804 and EEPROM 25LC256 over SPI

I am trying to write to and read from an EEPROM (25LC256) with my dsPIC33FJ128MC804. I tried to use the examples from the website, but they target the Explorer 16 with the dsPIC33FJ256MC710, so I took that code and adapted it for my dsPIC, switching to interrupts. Right now I can send data, but I am not able to read from the memory, even though I follow the steps in the EEPROM datasheet and use its SPI configuration.
Could you please help me with this problem?
UPDATE: I just noticed that whenever I send the opcode to read the STATUS register, I always get zero back. Is that normal?
source code: https://www.dropbox.com/s/wdahlmhjrilcqw6/main.c?dl=0
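For comparison, here is roughly what the STATUS-register read looks like with plain blocking (non-interrupt) SPI on SPI1. It is only a sketch: it assumes the SPI1 pins are already mapped through the PPS registers, TRIS is configured, and chip select sits on a hypothetical pin RB2.

// Blocking read of the 25LC256 STATUS register over SPI1 (dsPIC33F, XC16).
// Assumes SPI1 pins already mapped via PPS, TRIS configured, CS on RB2 (hypothetical).
#include <xc.h>
#include <stdint.h>

#define EE_CS      LATBbits.LATB2   // chip select, active low
#define CMD_RDSR   0x05             // 25LC256 "read STATUS register" opcode

static void spi1_init(void)
{
    SPI1STATbits.SPIEN = 0;         // disable while configuring
    SPI1CON1bits.MSTEN  = 1;        // master mode
    SPI1CON1bits.MODE16 = 0;        // 8-bit transfers
    SPI1CON1bits.CKP = 0;           // SPI mode 0,0 (the 25LC256 also accepts 1,1)
    SPI1CON1bits.CKE = 1;           // transmit on the active-to-idle clock edge
    SPI1CON1bits.PPRE = 0b10;       // 4:1 primary prescaler (keep the clock slow at first)
    SPI1CON1bits.SPRE = 0b110;      // 2:1 secondary prescaler
    SPI1STATbits.SPIROV = 0;        // clear any receive overflow
    SPI1STATbits.SPIEN = 1;         // enable the module
}

static uint8_t spi1_exchange(uint8_t tx)
{
    SPI1BUF = tx;                        // start the transfer
    while (!SPI1STATbits.SPIRBF) { }     // wait for the byte clocked back in
    return (uint8_t)SPI1BUF;             // reading SPI1BUF clears SPIRBF
}

uint8_t eeprom_read_status(void)
{
    uint8_t status;
    EE_CS = 0;                       // select the EEPROM
    spi1_exchange(CMD_RDSR);         // opcode; the byte returned here is meaningless
    status = spi1_exchange(0xFF);    // dummy byte clocks the STATUS register out
    EE_CS = 1;                       // deselect
    return status;
}

Call spi1_init() once at startup and then eeprom_read_status(). Note that 0x00 from RDSR can actually be normal for a device that is not mid-write and has write-enable cleared; sending WREN (0x06) first and then RDSR should show the WEL bit (bit 1) set if the read path really works.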

FMU Initialization Issue

I recently exported an FMU 2.0 model from Dymola 2015, and I am having some issues with initialization.
Before exporting the model, I checked its validity by connecting the input connectors to constant values for which I know the exact output, and everything looked good. The model could be translated, initialized, and simulated as expected.
Then I went on to export my model as a Model Exchange FMU 2.0 RC. In C, it could be instantiated, and fmiSetupExperiment was called successfully before entering initialization; however, initialization failed right after calling fmiEnterInitializationMode.
The error looks like the image below.
I tried to assign some reasonable input values to the model according to this post Initialization of a Dymola FMU in Simulink but it did not help.
Then I found that I have several scalarVariables with initial="approx", but when I checked their final values in Dymola, the initial approximation is decently close to the final value.
So I am quite confused about what to try next... I wonder if anyone here could help me a bit (I don't even know what this error message means). Any help would be greatly appreciated.
Thanks!
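For reference, the FMI 2.0 call order around initialization looks roughly like the sketch below (written with the final fmi2-prefixed names; the 2.0 RC headers use the plain fmi prefix quoted above). The value references are placeholders to be taken from modelDescription.xml, and a real host would load the FMU's shared library with LoadLibrary/dlopen rather than linking it directly; the point is only where setting the inputs and start values fits relative to entering initialization mode.

// Rough FMI 2.0 Model Exchange initialization sequence (sketch, not Dymola-specific).
#include "fmi2Functions.h"
#include <stdio.h>
#include <stdlib.h>

// Minimal logger; real code would expand the #vr# placeholders in the message.
static void logFmu(fmi2ComponentEnvironment env, fmi2String name, fmi2Status status,
                   fmi2String category, fmi2String message, ...)
{
    printf("[%s] %s: %s\n", category, name, message);
}

int main(void)
{
    fmi2CallbackFunctions cb = { logFmu, calloc, free, NULL, NULL };

    fmi2Component c = fmi2Instantiate("myModel", fmi2ModelExchange,
                                      "GUID-from-modelDescription.xml",
                                      "file:///path/to/unzipped/fmu/resources",
                                      &cb, fmi2False, fmi2True /* loggingOn */);
    if (c == NULL) return 1;

    // Tolerance and start time are set before initialization.
    fmi2SetupExperiment(c, fmi2False, 0.0, /* startTime */ 0.0, fmi2False, 0.0);

    // Give the inputs (and any initial="exact"/"approx" start values) sensible values
    // *before* entering initialization mode. The value reference below is a placeholder.
    const fmi2ValueReference vr[]  = { 0 /* VR of an input, from modelDescription.xml */ };
    const fmi2Real           val[] = { 1.0 };
    fmi2SetReal(c, vr, 1, val);

    if (fmi2EnterInitializationMode(c) != fmi2OK) {
        printf("initialization failed: check the logger output for the failing equation\n");
        return 1;
    }
    fmi2ExitInitializationMode(c);

    // ... event iteration and continuous-time integration would follow here ...

    fmi2Terminate(c);
    fmi2FreeInstance(c);
    return 0;
}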

PIC Programming - The basic flow of things

Can someone please explain to me the basic flow of how this is done?
Currently I have a USB PIC programmer and also a multi-PIC adapter. I understand that I can use this to write my program to the PIC. But I'm not sure what happens before that, like how do I actually test it with an LED or some input sensor that gives out analog data?
This is what I have now: http://www.piccircuit.com/shop/pic-programmer/26-ica01-usb-pic-programmer-set.html
So do I need to connect this to a breadboard? And if so, how? I'm completely lost!! This is the first time I have attempted to do this. What I have done so far is use my Synapse RF Engine EK2100 to build what I want.
Now what...?
I'm not entirely sure what you're trying to accomplish but what you purchased is a programmer for PIC microcontrollers. After you have written some code whether in assembly or C and compiled it to a hex file, this device will put that code onto a PIC microcontroller that you buy separately. Have you purchased a PIC device to program or do you just have the programmer and the EK2100 kit? If you provide some more detail we can point you in the right direction.
Write a basic 'flash LED' program and then wire-up the PIC to see if it works.
Hot tip - use the internal oscillator to minimise the external component count (makes things simpler). Browse around a PIC savvy site like http://digital-diy.com/ to get lots of interesting ideas and code samples.
The community there mostly uses PIC Basic-type languages (such as Swordfish), which will land you code that looks something like this (header/setup removed for ease of explanation):
While True
    High(LED)
    DelaymS(500)
    Low(LED)
    DelaymS(500)
Wend
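If you end up in Microchip's own toolchain (MPLAB X with the XC8 C compiler) instead, the same flash-LED idea looks roughly like this; the device, config bits, and pin are only an example (a PIC16F1825 with the LED plus resistor on RA0), so check your part's datasheet for the exact names:

// Minimal "flash an LED" for a PIC16F1825 (example device) with XC8.
#include <xc.h>

// Internal oscillator, watchdog off; config bit names differ between PIC families.
#pragma config FOSC = INTOSC, WDTE = OFF, MCLRE = ON, LVP = OFF

#define _XTAL_FREQ 4000000UL        // needed by __delay_ms(); must match the oscillator

void main(void)
{
    OSCCONbits.IRCF = 0b1101;       // 4 MHz internal oscillator
    ANSELAbits.ANSA0 = 0;           // RA0 digital (analog by default on this part)
    TRISAbits.TRISA0 = 0;           // RA0 as output

    while (1) {
        LATAbits.LATA0 = 1;         // LED on
        __delay_ms(500);
        LATAbits.LATA0 = 0;         // LED off
        __delay_ms(500);
    }
}

Either way the breadboard wiring is the same idea: power and ground for the PIC, the ICSP programming lines (MCLR/VPP, PGC, PGD) to the programmer, and the LED with a current-limiting resistor on the chosen pin.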

How do I build a console app for Xbox Kinect

I'm fairly new to programming and I've been presented with a fairly daunting assignment at work. I need to build a program from scratch in order to take advantage of the Kinect's motion tracking capabilities to interface with another application.
Some context:
Someone else I work with has built the test program: a console app using OpenGL. The test program consists of a cube inside a skymap. The camera looks at the small cube and can be rotated around the cube to view it from different perspectives.
Someone else was able to use the sample code in the developer toolkit to control the test program. The test program now works with motion tracking (swiping your hand to the side rotates the cube; moving your head side to side changes the camera angle so it looks like you are looking around a floating 3D object; walking forward or backward zooms the camera). It works as it is, but...
The problem is this: Now that we know it all works, it's time to simplify everything so that we can run the test program on a tablet. So the code needs to be stripped down to the bare bones. We need to remove everything from the SkeletalViewer code except for the elements that gather and process the data, so that it can be used in another program.
I've been asked to build a console app from scratch (rather than tearing apart the sample code, as this is extremely messy) that allows us to use the Kinect with our test program.
I've spent the last few weeks trying to figure out the code and I'm feeling overwhelmed! I don't know where to start.
Here's my question: what are the absolute bare essential building blocks in the Kinect program? I do not need it to draw ANYTHING. I just need a console app that, when running, gathers the motion tracking data and sends it to the other program.
I would greatly appreciate any guidance you can provide.
Thank you in advance!!!
-JD
You don't need to draw anything, but you do need to create an event for caching the frames so you can work on them (see the sketch below).
Here's a pretty cool description of skeletal joints: MSDN - Kinect
Another thing I can give you is the main page of the Kinect project. It has the libraries, guides, code samples, etc. You can download and install the Kinect Toolkit; there are several programs inside (binaries + code samples), everything you need to learn the Kinect API: Kinect For Windows - Downloads and Kinect For Windows - Learn
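To make that concrete, a console app that does nothing but gather skeleton data with the Kinect for Windows SDK v1.x native API boils down to roughly the following sketch (error handling trimmed; how you forward the joint positions to the OpenGL test program, e.g. over a socket or pipe, is up to you):

// Bare-bones skeleton reader: initialize, wait for a frame event, read joint positions.
// Build as a Win32 console app and link against Kinect10.lib.
#include <windows.h>
#include <Ole2.h>
#include <NuiApi.h>
#include <conio.h>
#include <cstdio>

int main()
{
    // 1. Start the sensor with skeleton tracking only (no color or depth streams).
    if (FAILED(NuiInitialize(NUI_INITIALIZE_FLAG_USES_SKELETON)))
        return 1;

    // 2. The "event for caching the frames": signaled when a new skeleton frame is ready.
    HANDLE skeletonEvent = CreateEventW(NULL, TRUE, FALSE, NULL);
    NuiSkeletonTrackingEnable(skeletonEvent, 0);

    while (!_kbhit())   // run until a key is pressed
    {
        if (WaitForSingleObject(skeletonEvent, 100) != WAIT_OBJECT_0)
            continue;

        NUI_SKELETON_FRAME frame = {0};
        if (SUCCEEDED(NuiSkeletonGetNextFrame(0, &frame)))
        {
            for (int i = 0; i < NUI_SKELETON_COUNT; ++i)
            {
                const NUI_SKELETON_DATA& s = frame.SkeletonData[i];
                if (s.eTrackingState != NUI_SKELETON_TRACKED)
                    continue;

                // 3. Example joint: the right hand. Replace the printf with whatever
                //    mechanism sends the values to the test program.
                const Vector4& hand = s.SkeletonPositions[NUI_SKELETON_POSITION_HAND_RIGHT];
                std::printf("right hand: %.3f %.3f %.3f\n", hand.x, hand.y, hand.z);
            }
        }
        ResetEvent(skeletonEvent);   // manual-reset event, so clear it before waiting again
    }

    NuiShutdown();   // release the sensor
    return 0;
}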
