Solving the absolute scale problem in visual odometry using IMU data and/or a distance meter

Visual odometry gives me an x, y, z position, but only up to an unknown scale factor.
I can also have IMU data (to obtain, for example, roll, pitch, and yaw information).
I can also have a distance meter (to obtain a varying distance to the current object).
Is there an easy way to combine this information and recover the absolute scale?
Thanks!

If you have access to a 3D accelerometer and a 3D gyroscope, you can estimate the accelerometer and gyroscope biases AND estimate the scale (e.g. by using a Kalman filter). This paper (https://www.researchgate.net/publication/220061491_Fusion_of_IMU_and_Vision_for_Absolute_Scale_Estimation_in_Monocular_SLAM) describes two approaches for estimating the scale given calibrated IMU data.
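As a rough illustration of the underlying idea (not taken from the paper), the simplest version is to compare displacement magnitudes from the unscaled VO trajectory against short-term IMU dead reckoning over the same window. A minimal Python sketch, assuming you already have time-synchronized, bias- and gravity-corrected positions from both sources (all names here are hypothetical):

import numpy as np

def estimate_scale(vo_positions, imu_positions):
    # vo_positions: Nx3 unscaled VO positions; imu_positions: Nx3 positions
    # from double-integrated, gravity-compensated IMU data (short window
    # only, since IMU dead reckoning drifts quickly)
    vo_disp = np.linalg.norm(np.diff(vo_positions, axis=0), axis=1)
    imu_disp = np.linalg.norm(np.diff(imu_positions, axis=0), axis=1)
    # least-squares scale s minimizing ||s*vo_disp - imu_disp||^2
    return float(np.dot(vo_disp, imu_disp) / np.dot(vo_disp, vo_disp))

In practice the paper's Kalman-filter formulation is preferable, since it estimates the biases and the scale jointly instead of assuming they are already corrected.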

Related

IMU software-based Orientation Adjustment

I have a six-axis IMU that outputs orientation data in Quaternion, Euler angle, and YPR formats. This data is used in a quadcopter PID controller for stabilization.
I have built a test stand for tuning the quadcopter; however, the axis of rotation for the test stand is ~45° rotated from the IMU measurement axes.
Previously, I managed to adjust the offset using a rotation matrix, but (given some changes to the program) I am now dealing with angles/quaternions instead of vectors, and it seems that this method is no longer applicable (at least without intermediate steps) to the measurement data.
This leads to my question: How can I apply this 45° transformation to the output data (in any of the provided formats: quaternion, Euler, YPR) in order to measure the attitude along the test stand's axis of rotation?
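One common approach (a sketch, not a drop-in fix for your exact setup) is to represent the fixed misalignment between the stand's rotation axis and the IMU axes as a quaternion, pre-multiply every measured orientation by it, and recompute the Euler/YPR outputs from the adjusted quaternion. In Python, assuming Hamilton quaternions in [w, x, y, z] order and, purely for illustration, a 45° offset about the vertical axis (substitute your stand's actual axis):

import numpy as np

def quat_mul(q1, q2):
    # Hamilton product of quaternions given as [w, x, y, z]
    w1, x1, y1, z1 = q1
    w2, x2, y2, z2 = q2
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

half = np.deg2rad(45.0) / 2.0
q_offset = np.array([np.cos(half), 0.0, 0.0, np.sin(half)])  # 45 deg about z

def to_stand_frame(q_imu):
    # Re-express the measured IMU orientation in the test-stand frame
    return quat_mul(q_offset, q_imu)

Whether the offset multiplies on the left or the right depends on whether the misalignment is between world frames or sensor frames, so verify the direction with a known rotation of the stand.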

Rotating a line defined by two points in 3D

I have edited this question for additional clarity and because of some of the answers below.
I have an electromagnetic motion tracker which tracks a sensor and gives me a point in global space (X, Y, Z). It also tracks the rotation of the sensor and gives Euler angles (Yaw, Pitch, Roll).
The sensor is attached to a rigid body on a baseball cap which sits on the head of a person. However, I wish to track the position of a specific facial feature (nose for example) which I infer from the motion tracker sensor's position and orientation.
I have estimated the spatial offset between the motion tracker and the facial features I want to track. I have done this by simply measuring the offset along the X, Y and Z axis.
Based on a previous answer to this question, I have composed a rotation matrix from the euler angles given to me by the motion tracker. However, I am stuck with how I should use this rotation matrix, the position of the sensor in global space and the spatial offset between that sensor and the nose to give me the position of the nose in global space.
The sensor will give you a rotation matrix (via the Euler angles) and a position (which should be that of the center of rotation).
Whatever item is rigidly fastened to the sensor, such as the nose, will undergo the same motion. Then knowing the relative coordinates of the nose and the sensor, you get the relation
Q = R.q + P
where R is the rotation matrix, P the position vector of the sensor and q the relative coordinates of the nose.
Note that the relation between the rotation matrix and the angles can be computed using one of these formulas: https://en.wikipedia.org/wiki/Euler_angles#Rotation_matrix. (You will need to read the article carefully to work out which of the 12 possibilities matches your case.)
In principle, you determine R and P from the readings of the sensor, but you are missing the coordinates q. There are several approaches:
- You determine those coordinates explicitly, by measuring the distances along virtual axes located at the rotation center and properly aligned.
- You determine the absolute coordinates Q of the nose corresponding to known R and P; then q is given by R'(Q - P), where R' denotes the transpose of R (which is also its inverse). To obtain Q, you can just move the sensor center to the nose without moving the head.
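Putting the relation Q = R.q + P into code, here is a minimal Python sketch (the Z-Y-X yaw-pitch-roll convention below is an assumption; check it against your tracker's documentation and the Wikipedia table mentioned above):

import numpy as np

def euler_to_matrix(yaw, pitch, roll):
    # Z-Y-X (yaw, then pitch, then roll) rotation matrix
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    return np.array([
        [cy*cp, cy*sp*sr - sy*cr, cy*sp*cr + sy*sr],
        [sy*cp, sy*sp*sr + cy*cr, sy*sp*cr - cy*sr],
        [-sp,   cp*sr,            cp*cr],
    ])

def nose_position(P, yaw, pitch, roll, q):
    # Q = R.q + P: global nose position from the sensor pose and the
    # sensor-to-nose offset q measured in the sensor frame
    R = euler_to_matrix(yaw, pitch, roll)
    return R @ np.asarray(q) + np.asarray(P)

# Calibration (the second approach above): with one reading where Q is
# known, recover the offset as q = R.T @ (Q - P)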

how to transform accelerometer data from device coordinates to absolute coordinates?

I'm using an Arduino to read an accelerometer and a gyroscope, which give me a vector of accelerations and a vector of rotation rates in the device's coordinates. I now want to transform the accelerometer data into absolute coordinates, in which the Z axis points straight up (aligned with the direction of the gravity force) and X and Y form an absolute horizontal plane.
I've read many posts on the internet but cannot find a good solution yet. These posts either discuss how to remove noise by combining the gyroscope with the accelerometer (e.g., http://www.starlino.com/imu_guide.html) or provide an Android-based solution that directly leverages the rotation matrix provided by the Android API (for example, this post: Transforming accelerometer's data from device's coordinates to real world coordinates).
But I only have raw accelerometer and gyroscope readings; how can I transform the accelerations from device coordinates to absolute coordinates in Python?
By the way, in my experiments the device will always be in a relatively stable state for a while, which can be used to estimate the direction of gravity in the device's coordinates.
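Without a magnetometer, gravity alone cannot fix the heading (rotation about the vertical axis), so only roll and pitch can be corrected this way. A minimal Python sketch under that stated limitation, using the averaged accelerometer reading from a stable interval and Rodrigues' formula (all variable names and numbers are illustrative):

import numpy as np

def rotation_from_gravity(g_device):
    # g_device: mean accelerometer vector over the stable interval; at rest
    # the accelerometer measures the reaction to gravity, i.e. "up" in the
    # device frame. Yaw is unobservable from gravity alone, so this fixes
    # tilt (roll/pitch) only.
    z_dev = g_device / np.linalg.norm(g_device)
    z_world = np.array([0.0, 0.0, 1.0])
    v = np.cross(z_dev, z_world)
    c = np.dot(z_dev, z_world)
    s = np.linalg.norm(v)
    if s < 1e-9:                       # already (anti-)aligned
        return np.eye(3) if c > 0 else np.diag([1.0, -1.0, -1.0])
    vx = np.array([[0.0, -v[2], v[1]],
                   [v[2], 0.0, -v[0]],
                   [-v[1], v[0], 0.0]])
    # Rodrigues' formula for the rotation taking z_dev onto z_world
    return np.eye(3) + vx + vx @ vx * ((1.0 - c) / s**2)

g_device = np.array([0.1, 0.2, 9.7])   # averaged reading while stable, m/s^2
R = rotation_from_gravity(g_device)
a_world = R @ np.array([0.3, -0.1, 9.9])   # one raw sample, tilt-corrected

Between stable intervals you would propagate the orientation by integrating the gyroscope rates, then re-anchor the tilt the next time the device is stable.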

Gravity Compensation in Accelerometer Data

Given a 9-DOF IMU (accelerometer, gyroscope, and magnetometer), I want to remove/compensate for the effect of gravity in the accelerometer reading (the accelerometer can rotate freely). The sensor gives the orientation as a quaternion relative to a (magnetic) north, west, up reference frame.
I found this http://www.varesano.net/blog/fabio/simple-gravity-compensation-9-dom-imus
but couldn't understand the basis for the given equation.
How could I achieve this given above information?
You need to rotate the accelerometer reading by the quaternion into the Earth frame of reference (into the coordinate system of the room if you like), then subtract gravity. The remaining acceleration is the acceleration of the sensor in the Earth frame of reference often referred to as linear acceleration or user acceleration.
In pseudo-code, something like this
acceleration = [ax, ay, az] // accelerometer reading
q // quaternion corresponding to the orientation
gravity = [0, 0, -9.81] // gravity on Earth in m/s^2
a_rotated = rotate(acceleration, q) // rotate the measured acceleration into
// the Earth frame of reference
user_acceleration = a_rotated - gravity
You say that you can get q through the API. The only nontrivial step is to implement the rotate() function.
To compute the image of a vector v when rotated by q, apply the formula v_rotated = q v q^-1. To compute it with floating-point numbers, you need to work out the expanded formulas yourself; they are available at Using quaternion rotations.
As far as I can tell, the link you provided does exactly this, you see the expanded formulas there and now you know where they came from. Also, the linked content seems to measure gravity in g, that is, gravity is [0,0,-1].
Watch out for sign conventions (whether you consider gravity [0,0,-1] or [0,0,1]) and handedness of your coordinate systems!
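For concreteness, here is one possible Python implementation of rotate() via the rotation-matrix form of q v q^-1 (a sketch assuming unit quaternions in [w, x, y, z] order and the sign conventions of the pseudo-code above; the readings are made up):

import numpy as np

def rotate(v, q):
    # Rotate vector v by unit quaternion q = [w, x, y, z], i.e. q v q^-1
    w, x, y, z = q
    R = np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])
    return R @ np.asarray(v)

gravity = np.array([0.0, 0.0, -9.81])
acceleration = np.array([0.2, -0.1, -9.7])   # example accelerometer reading
q = np.array([1.0, 0.0, 0.0, 0.0])           # identity orientation
user_acceleration = rotate(acceleration, q) - gravity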
I assume your accelerometer reading is in the sensor body frame. First we need to represent the accelerometer data with respect to the inertial frame, and then subtract gravity.
If you are using Euler angles directly rather than a quaternion, then you need to compute the rotation matrix
R = [ ctheta*cpsi, -cphi*spsi + sphi*stheta*cpsi,  sphi*spsi + cphi*stheta*cpsi;
      ctheta*spsi,  cphi*cpsi + sphi*stheta*spsi, -sphi*cpsi + cphi*stheta*spsi;
      -stheta,      sphi*ctheta,                   cphi*ctheta ]
(It's given in MATLAB notation.) Here phi stands for the roll angle, theta for pitch, and psi for yaw. This R matrix maps from the body frame to the inertial frame; in flight dynamics it is also known as the transpose of the Direction Cosine Matrix (DCM).
After applying the matrix multiplication, subtract gravity from the z component in order to eliminate the static acceleration, i.e., gravity.
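The same matrix transcribed to Python, as a sketch (the example numbers are made up, and the sign of gravity depends on whether your inertial z axis points up or down; the z-up convention of the previous answer is used here):

import numpy as np

def body_to_inertial(phi, theta, psi):
    # Same R as above: roll phi, pitch theta, yaw psi; body -> inertial
    cphi, sphi = np.cos(phi), np.sin(phi)
    ctheta, stheta = np.cos(theta), np.sin(theta)
    cpsi, spsi = np.cos(psi), np.sin(psi)
    return np.array([
        [ctheta*cpsi, -cphi*spsi + sphi*stheta*cpsi,  sphi*spsi + cphi*stheta*cpsi],
        [ctheta*spsi,  cphi*cpsi + sphi*stheta*spsi, -sphi*cpsi + cphi*stheta*spsi],
        [-stheta,      sphi*ctheta,                   cphi*ctheta],
    ])

a_body = np.array([0.1, -0.2, -9.7])     # raw body-frame reading (made up)
R = body_to_inertial(phi=0.02, theta=-0.01, psi=1.1)
a_inertial = R @ a_body
linear_acceleration = a_inertial - np.array([0.0, 0.0, -9.81])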

Determining the position/direction of an aircraft

I'm working on a project that involves gyroscopes...
I'm using Arduino and an ITG 3200 to read the data from the gyroscope. I get 3 values in deg/s for each axis (x,y,z).
My question is: How can I know the actual (physical) position or direction of the device (let's say an airplane)? There has to be a math formula or something like that.
Using only the gyroscope signal (which you have to integrate numerically), you'll eventually run into trouble due to drift. What's normally done is to combine an accelerometer (for the low-frequency signal, i.e. drift correction) with a gyroscope (for the high-frequency signal); a minimal sketch of that idea follows the links below. Here are a few links showing more or less exactly what you want:
http://www.starlino.com/imu_guide.html
http://www.instructables.com/id/Accelerometer-Gyro-Tutorial
http://www.starlino.com/quadcopter_acc_gyro.html
Also, see these StackOverflow questions:
Combine Gyroscope and Accelerometer Data
Integrating gyro and accelerometer readings
gyro, accelerometer, magnetometer and Kalman filter
How to determine relative position using accelerometer and gyro data
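To make the low-/high-frequency split concrete, here is a minimal complementary-filter update step in Python (a sketch only: the blending constant alpha and the accelerometer pitch formula are common conventions, not the only valid ones):

import numpy as np

def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    # Trust the integrated gyro at high frequency and the (noisy but
    # drift-free) accelerometer-derived angle at low frequency
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle

dt = 0.01                                  # 100 Hz sample period
pitch = 0.0                                # running estimate, rad
gyro_rate = 0.03                           # rad/s about the pitch axis
ax, ay, az = 0.2, 0.0, 9.8                 # accelerometer sample, m/s^2
accel_pitch = np.arctan2(-ax, np.sqrt(ay**2 + az**2))
pitch = complementary_filter(pitch, gyro_rate, accel_pitch, dt)

Run this once per sample in a loop; the same form applies to roll with the corresponding accelerometer axes.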
We are working on a similar problem.
We found this video on YouTube especially helpful, as it came with a paper as well as an implementation (which runs on Arduino):
http://www.youtube.com/watch?v=fOSTOnQzZCI
The paper and source code:
http://code.google.com/p/imumargalgorithm30042010sohm/
In our case (getting the orientation of a remote-controlled ball), we also had to include an accelerometer and a magnetometer.
