Detect 3D direction of an impulse with accelerometer - math

I know that what I'm going to ask may sound crazy, but I'm trying to figure out how to solve a problem in a smart way.
It's quite difficult to explain my problem, which is why I made a hand-drawn sketch, downloadable from here :) https://dl.dropboxusercontent.com/u/5049281/static/Images_Impulse_Direction.zip
Images in that zip (copied by Spektre):
The setup can be approximated as a pipe mounted on a rubber wall. The pipe is firmly connected (with an unknown position and orientation) to an IMU equipped with an accelerometer and a gyroscope (sample frequency 110 Hz).
I would like to recover the direction of the pipe axis (expressed in the IMU reference frame) by analyzing the data acquired during some taps on the end of the pipe, applied along the pipe axis with the palm of the hand. In the figure this direction lies along the X axis.
I think that if the motion is a pure translation (which I could verify by checking that the gyroscope data stay close to zero), the acceleration during the tap (with gravity removed) should have the same direction as the pipe axis.
Is there a smarter solution than just applying a high-pass filter to the signal and then saving the direction of the sample with the highest magnitude?
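To be concrete, this is the baseline I have in mind, as a rough C++ sketch (the filter coefficient and the already-gravity-removed input are assumptions, not part of my actual code):

#include <math.h>

// One accelerometer sample with gravity already removed (assumption).
struct Vec3 { double x, y, z; };

// State for a first-order high-pass filter and the best candidate so far.
static Vec3 prevIn = {0, 0, 0}, prevOut = {0, 0, 0}, best = {0, 0, 0};
static double bestMag = 0.0;
const double alpha = 0.9;   // high-pass coefficient, placeholder for ~110 Hz data

// Feed each new sample; 'best' ends up holding the direction of the
// strongest high-passed sample, i.e. the candidate pipe-axis direction.
void processSample(Vec3 in)
{
    Vec3 out;
    // y[n] = alpha * (y[n-1] + x[n] - x[n-1])
    out.x = alpha * (prevOut.x + in.x - prevIn.x);
    out.y = alpha * (prevOut.y + in.y - prevIn.y);
    out.z = alpha * (prevOut.z + in.z - prevIn.z);

    double mag = sqrt(out.x * out.x + out.y * out.y + out.z * out.z);
    if (mag > bestMag) { bestMag = mag; best = out; }   // normalize 'best' at the end

    prevIn = in;
    prevOut = out;
}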
Thanks for your help!

Related

Project Tango strange rotation visualisation

I am working on 3D reconstruction with Tango. Our system is quite similar to KinectFusion, which uses a voxel representation, but we use Tango as the tracker. The left image (in the video linked below) is rendered by raycasting at the current pose (given by Tango) in real time. The raw pose is converted by GetOC2OWMat() as in the code examples; in addition, the signs of tx and rx are flipped to match our system. Everything works fine except rotation about the Z axis, which changes the angle in the rendered image. I guess the coordinate system conversion is not done properly, but depth integration works as long as no Z rotation is involved. I have also checked that det(R) is always 1.
Video
It sounds like you are not factoring in intrinsics: have you accounted for the camera and device IMU frames? You need these to fully re-establish the original viewpoint, i.e. both the camera and device IMU frame matrices need to be multiplied into your stack.
Sorry, I just found the place where things go wrong. When the image is displayed with OpenGL, the rendered GL surface does not have the same aspect ratio as the raycasting image.
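In case it helps anyone hitting the same mismatch, a rough sketch (plain OpenGL, all names illustrative) of letterboxing the viewport so it keeps the raycast image's aspect ratio:

#include <algorithm>
#include <GL/gl.h>   // or the GLES header on Android

// Fit a render target of size imgW x imgH into a window of size winW x winH
// without distorting its aspect ratio (letterbox/pillarbox the viewport).
void setLetterboxViewport(int winW, int winH, int imgW, int imgH)
{
    float scale = std::min(winW / (float)imgW, winH / (float)imgH);
    int vpW = (int)(imgW * scale);
    int vpH = (int)(imgH * scale);
    glViewport((winW - vpW) / 2, (winH - vpH) / 2, vpW, vpH);  // centered
}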
Do you program with Java/C/Unity? I'm curious because my device has problems with the camera data and you seem to capture it without problems. I am quite sure it's a bug but I would like to make sure it really is one.

How to smooth readings from an on-board accelerometer (light blue bean)

I'm wondering if someone with more Arduino knowledge than me can point me in the right direction to solve this. I'm trying to smooth out some accelerometer readings. I was following the suggestion to do so over here: http://arduino.cc/en/Tutorial/Smoothing
The issue is that I'm using a Light Blue Bean, which has an on-board accelerometer that returns a struct, and has some form of on-board filtering: https://punchthrough.com/bean/the-arduino-reference/accelerationreading/ | https://punchthrough.com/bean/the-arduino-reference/getacceleration/
I'm not sure how to go about this. Try to smooth each axis (https://punchthrough.com/bean/the-arduino-reference/getaccelerationx/)? Work off the Arduino digital smoothing example? Maybe smoothing is the wrong approach?
Mostly it's just giving me some big jumps in the readings even when it's sitting still, e.g. the y-axis will read 0, 1, -8, 0, 3, etc. in a sample.
I am less than a novice on this, but this page https://punchthrough.com/bean/the-arduino-reference/getacceleration/
says the conversion for the units to Gs is 3.91×10⁻³ per count. So, if you multiply the values you are getting by 0.00391, you get the units in Gs. Your value of -8 above is only -0.03128 G, which is a reasonable acceleration for something "sitting still."
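If it helps, here is a rough Arduino-style sketch combining the running-average smoothing from the tutorial with that conversion factor; the AccelerationReading fields follow the linked reference, everything else (buffer size, baud rate, sleep interval) is just illustrative:

// Smooth each axis with a running average and convert raw counts to Gs
// (3.91e-3 G per count, per the Bean reference).
const int N = 10;                 // number of readings to average
int xBuf[N], yBuf[N], zBuf[N];
int idx = 0;
long xSum = 0, ySum = 0, zSum = 0;

void setup() {
  Serial.begin(57600);
  for (int i = 0; i < N; i++) { xBuf[i] = yBuf[i] = zBuf[i] = 0; }
}

void loop() {
  AccelerationReading a = Bean.getAcceleration();

  // Replace the oldest sample in each buffer and update the running sums.
  xSum += a.xAxis - xBuf[idx];  xBuf[idx] = a.xAxis;
  ySum += a.yAxis - yBuf[idx];  yBuf[idx] = a.yAxis;
  zSum += a.zAxis - zBuf[idx];  zBuf[idx] = a.zAxis;
  idx = (idx + 1) % N;

  // Average and convert to Gs.
  float xG = (xSum / (float)N) * 0.00391;
  float yG = (ySum / (float)N) * 0.00391;
  float zG = (zSum / (float)N) * 0.00391;

  Serial.print(xG); Serial.print(" ");
  Serial.print(yG); Serial.print(" ");
  Serial.println(zG);
  Bean.sleep(50);   // sample at roughly 20 Hz; adjust as needed
}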

Finding a Quaternion from Gyroscope Data?

I've been trying to build a filter that can successfully combine accelerometer, geomagnetic (compass), and gyroscope data to produce a smooth augmented reality experience. After reading this post along with lots of discussions, I finally found a good algorithm to correct my sensor data. Most examples I've read show how to correct accelerometers with gyroscopes, but not how to correct compass + accelerometer data with a gyroscope. This is the algorithm I've settled on, which works great except that I run into gimbal lock when I try to look at the scene while not facing North. The algorithm is essentially the Balance Filter, only implemented in 3D.
Initialization Step:
Initialize a world rotation matrix using the (noisy) accelerometer and compass sensor data (this is provided by the Android already)
Update Steps:
Integrate the gyroscope reading (time_delta * reading) for each axis (x, y, z)
Rotate the world rotation matrix using the Euler angles supplied by the integration
Find the Quaternion from the newly rotated matrix
Find the rotation matrix from the unfiltered accelerometer + compass data (using the OS provided function, I think it uses angle/axis calculation)
Get the quaternion from the matrix generated in the previous step.
Slerp between the quaternion generated from the gyroscope rotation (above) and the quaternion from the accelerometer + compass data, using a coefficient based on some experimental magic (see the sketch after this list)
Convert back to a matrix and use that to draw the scene.
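For reference, the slerp step I mean is roughly this (a minimal C++ sketch; the quaternion layout and the blend factor t are my assumptions, not library code):

#include <math.h>

struct Quat { double w, x, y, z; };

// Spherical linear interpolation between unit quaternions a (from the
// gyroscope) and b (from the accelerometer + compass). t is the
// experimentally tuned coefficient: small t trusts the gyro, large t
// trusts the accelerometer/compass estimate.
Quat slerp(Quat a, Quat b, double t)
{
    double d = a.w*b.w + a.x*b.x + a.y*b.y + a.z*b.z;
    if (d < 0.0) {                       // take the shorter arc
        d = -d;
        b.w = -b.w; b.x = -b.x; b.y = -b.y; b.z = -b.z;
    }
    if (d > 0.9995) {                    // nearly parallel: lerp, then normalize
        Quat q = { a.w + t*(b.w - a.w), a.x + t*(b.x - a.x),
                   a.y + t*(b.y - a.y), a.z + t*(b.z - a.z) };
        double n = sqrt(q.w*q.w + q.x*q.x + q.y*q.y + q.z*q.z);
        q.w /= n; q.x /= n; q.y /= n; q.z /= n;
        return q;
    }
    double theta = acos(d);
    double wa = sin((1.0 - t) * theta) / sin(theta);
    double wb = sin(t * theta) / sin(theta);
    Quat out = { wa*a.w + wb*b.w, wa*a.x + wb*b.x,
                 wa*a.y + wb*b.y, wa*a.z + wb*b.z };
    return out;
}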
My problem is that when I'm facing North and then try to look south, the whole thing blows up and it appears to be gimbal lock. After a few gimbal locks, the whole filter is in an undefined state. Searching around I hear everybody saying "Just use Quaternions" but I'm afraid it's not that simple (at least not to me) and I know there's something I'm just missing. Any help would be greatly appreciated.
The biggest reason to use quaternions is to avoid the singularity problem with Euler angles. You can directly rotate a quaternion with gyro data.
Apologies if this information comes late or isn't useful to you specifically, but it may be useful to others; I found it after some research:
a. Using a Kalman filter (linear or non-linear) you do the following:
Use the gyro to integrate the delta angle, while the accelerometers tell you the outer limit (they bound the drift).
b. Euler rates are different from the gyro's rates of angle change, so you'll need a quaternion or Euler representation:
The quaternion route is non-trivial, but the two main steps are:
1. For roll, pitch, and yaw you get three quaternions of the form cos(w) + sin(v), where w is the scalar part and v is the vector part (or, when coding, just another variable).
Then simply multiply all three quaternions to get a delta quaternion,
i.e.:
quatDelta[0] = c1*c2*c3 - s1*s2*s3;
quatDelta[1] = c1*c2*s3 + s1*s2*c3;
quatDelta[2] = s1*c2*c3 + c1*s2*s3;
quatDelta[3] = c1*s2*c3 - s1*c2*s3;
where c1, c2, c3 are the cosines of roll, pitch, yaw and s1, s2, s3 the sines of the same angles (actually half of those pre-integrated gyro angles).
2. Then just multiply it by the old quaternion you had:
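// Hamilton product: newQuat = quaternion (old) * quatDelta; the order matters because quaternion multiplication is not commutative.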
newQuat[0]=(quaternion[0]*quatDelta[0] - quaternion[1]*quatDelta[1] - quaternion[2]*quatDelta[2] - quaternion[3]*quatDelta[3]);
newQuat[1]=(quaternion[0]*quatDelta[1] + quaternion[1]*quatDelta[0] + quaternion[2]*quatDelta[3] - quaternion[3]*quatDelta[2]);
newQuat[2]=(quaternion[0]*quatDelta[2] - quaternion[1]*quatDelta[3] + quaternion[2]*quatDelta[0] + quaternion[3]*quatDelta[1]);
newQuat[3]=(quaternion[0]*quatDelta[3] + quaternion[1]*quatDelta[2] - quaternion[2]*quatDelta[1] + quaternion[3]*quatDelta[0]);
As you loop through the code it gets updated, so only quaternion is a global variable, not the rest.
3. Lastly, if you want Euler angles from it, do the following:
euler[2] = atan2(2.0*(quaternion[0]*quaternion[1] + quaternion[2]*quaternion[3]), 1 - 2.0*(quaternion[1]*quaternion[1] + quaternion[2]*quaternion[2]));
euler[1] = safe_asin(2.0*(quaternion[0]*quaternion[2] - quaternion[3]*quaternion[1]));
euler[0] = atan2(2.0*(quaternion[0]*quaternion[3] + quaternion[1]*quaternion[2]), 1 - 2.0*(quaternion[2]*quaternion[2] + quaternion[3]*quaternion[3]));
euler[1] is pitch, and so on.
I just wanted to outline the general steps of a quaternion implementation. There may be some minor errors, but I tried this myself and it works. Please note that when converting to Euler angles you will get singularities, also known as "gimbal lock".
An important note: this is not my work; I found it on the internet and wanted to thank whoever wrote this priceless code. Cheers.
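Pulling the steps above together, a rough self-contained sketch (assuming gyro rates in rad/s, a fixed sample interval dt, and the same angle convention as above; which gyro axis maps to roll/pitch/yaw depends on your device frame):

#include <math.h>

// quaternion[] is the only persistent state, stored as (w, x, y, z).
double quaternion[4] = { 1.0, 0.0, 0.0, 0.0 };

// One update step: integrate the gyro rates over dt, build the delta
// quaternion from the half angles, and multiply it into the running quaternion.
void updateOrientation(double gx, double gy, double gz, double dt)
{
    // Half of the pre-integrated angles (roll, pitch, yaw are placeholders).
    double c1 = cos(0.5 * gx * dt), s1 = sin(0.5 * gx * dt);  // roll
    double c2 = cos(0.5 * gy * dt), s2 = sin(0.5 * gy * dt);  // pitch
    double c3 = cos(0.5 * gz * dt), s3 = sin(0.5 * gz * dt);  // yaw

    double quatDelta[4] = {
        c1*c2*c3 - s1*s2*s3,
        c1*c2*s3 + s1*s2*c3,
        s1*c2*c3 + c1*s2*s3,
        c1*s2*c3 - s1*c2*s3
    };

    // newQuat = quaternion * quatDelta (Hamilton product), then copy back.
    double q0 = quaternion[0], q1 = quaternion[1];
    double q2 = quaternion[2], q3 = quaternion[3];
    quaternion[0] = q0*quatDelta[0] - q1*quatDelta[1] - q2*quatDelta[2] - q3*quatDelta[3];
    quaternion[1] = q0*quatDelta[1] + q1*quatDelta[0] + q2*quatDelta[3] - q3*quatDelta[2];
    quaternion[2] = q0*quatDelta[2] - q1*quatDelta[3] + q2*quatDelta[0] + q3*quatDelta[1];
    quaternion[3] = q0*quatDelta[3] + q1*quatDelta[2] - q2*quatDelta[1] + q3*quatDelta[0];

    // Renormalize to keep it a unit quaternion.
    double n = sqrt(quaternion[0]*quaternion[0] + quaternion[1]*quaternion[1] +
                    quaternion[2]*quaternion[2] + quaternion[3]*quaternion[3]);
    for (int i = 0; i < 4; i++) quaternion[i] /= n;
}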

Determining the position/direction of an aircraft

I'm working on a project that involves gyroscopes...
I'm using an Arduino and an ITG-3200 to read the data from the gyroscope. I get three values in deg/s, one for each axis (x, y, z).
My question is: how can I determine the actual (physical) position or orientation of the device (say, an airplane)? There has to be a math formula or something like that.
Using only the gyroscope signal (which you have to integrate numerically), you'll eventually run into trouble due to drift. What's normally done is to combine an accelerometer (for the low-frequency part, i.e. correcting drift) with a gyroscope (for the high-frequency part). Here are a few links showing more or less exactly what you want (there is also a small sketch after these links):
http://www.starlino.com/imu_guide.html
http://www.instructables.com/id/Accelerometer-Gyro-Tutorial
http://www.starlino.com/quadcopter_acc_gyro.html
Also, see these StackOverflow questions:
Combine Gyroscope and Accelerometer Data
Integrating gyro and accelerometer readings
gyro, accelerometer, magnetometer and Kalman filter
How to determine relative position using accelerometer and gyro data
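To make the accelerometer + gyroscope combination concrete, here is a rough complementary-filter sketch (gyro rates assumed in deg/s as the ITG-3200 reports, accelerometer in g; the 0.98/0.02 weights and axis mapping are just typical starting points, not a definitive implementation):

#include <math.h>

// Complementary filter: the gyro supplies the high-frequency part, the
// accelerometer the low-frequency (drift-free) part. Angles in degrees.
// Heading (yaw) still drifts; correcting it needs a magnetometer.
struct Attitude { double pitch; double roll; };

void updateAttitude(Attitude* att,
                    double gxDegS, double gyDegS,     // gyro rates, deg/s
                    double ax, double ay, double az,  // accelerometer, in g
                    double dt)                        // seconds since last sample
{
    const double rad2deg = 57.29577951308232;

    // Tilt implied by gravity alone (only valid when not accelerating).
    double accPitch = atan2(-ax, sqrt(ay * ay + az * az)) * rad2deg;
    double accRoll  = atan2(ay, az) * rad2deg;

    // Integrate the gyro, then blend with the accelerometer estimate.
    att->pitch = 0.98 * (att->pitch + gyDegS * dt) + 0.02 * accPitch;
    att->roll  = 0.98 * (att->roll  + gxDegS * dt) + 0.02 * accRoll;
}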
We are working on a similar problem.
We found this video on YouTube especially helpful, as it came with a paper as well as an implementation (which runs on Arduino):
http://www.youtube.com/watch?v=fOSTOnQzZCI
The paper and source code:
http://code.google.com/p/imumargalgorithm30042010sohm/
In our case (getting the orientation of a remote-controlled ball), we also had to include an accelerometer and a magnetometer.

Augmented Reality Demo

I'm trying to build an Augmented Reality Demonstration, like this iPhone App:
http://www.acrossair.com/acrossair_app_augmented_reality_nearesttube_london_for_iPhone_3GS.htm
However my geometry/math is a bit rusty nowadays.
This is what I know:
If I have my Android phone in landscape mode (with the home button on the left), my z axis points in the direction I'm looking.
From the sensors of my phone I know the angle my z axis makes with the North axis; let's call this angle theta.
If I have a vector from my current position to the point I want to show on my screen, I can calculate the angle this vector makes with my z axis. Let's call this angle alpha.
So, based on the angle alpha I know roughly where the point is, and I'm able to show it on the screen (like the Nearest Tube app).
This is the basic theory of a simple demonstration (of course it's nothing like the App, but it's the first step).
Can someone shed some light on this matter?
[Update]
I've found this very interesting example; however, I need to handle movement on both the x and y axes. Any hints?
The basics are easy. You need the angle between your location and your destination (arctangent), and the heading (from the digital compass in your phone). See this answer: Augmented Reality movement. There is some Objective-C code down there that you can read if you come from Java.
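As a rough illustration of the arctangent-plus-heading idea (C++, flat-earth approximation, all names illustrative):

#include <math.h>

// Bearing from the user's position to the target, relative to true north,
// using a flat-earth approximation (fine for nearby points). Subtracting
// the compass heading gives the angle at which to draw the marker on
// screen (0 = straight ahead).
double markerAngleDeg(double userLat, double userLon,
                      double targetLat, double targetLon,
                      double headingDeg)              // from the digital compass
{
    const double deg2rad = 3.14159265358979 / 180.0;
    double dLat = targetLat - userLat;
    double dLon = (targetLon - userLon) * cos(userLat * deg2rad);
    double bearing = atan2(dLon, dLat) / deg2rad;     // degrees clockwise from north

    double angle = bearing - headingDeg;              // relative to where we look
    while (angle > 180.0)   angle -= 360.0;           // wrap into (-180, 180]
    while (angle <= -180.0) angle += 360.0;
    return angle;
}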
What you want is a 3D space-filling curve, for example a Hilbert curve. That is a spatial index over three coordinates, comparable to an octree. You want to store the objects in that octree and do a depth-first search on the coordinate you have recorded with your iPhone as the fixed coordinate, probably the center of the screen. An octree subdivides space recursively into eight regions, and a 3D space-filling curve is a Hamiltonian path through the space, which looks like a fractal but is clearly distinguishable by octree region. I use 2D Hilbert curves to speed up searches in geospatial databases. Maybe you want to start with this first?
