using Quaternion for IMU data - math

I have a 6-DOF IMU device (a 3-DOF accelerometer and a 3-DOF gyroscope). I want to use it to rotate an object in Unity3D. For that I calculate the Euler angles roll, pitch and yaw.
My problem is the singularity of the pitch angle at +90 degrees and -90 degrees. I need to solve it.
I read that this problem disappears when using quaternions instead of Euler angles.
I am very inexperienced with quaternions. How can I build a quaternion from the data coming from the IMU sensors, and how can I use it to rotate a GameObject in Unity3D?
Thank you
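A minimal sketch of the usual approach (plain C++ rather than Unity C#; the rate names wx, wy, wz and the step dt are assumed inputs): integrate the gyroscope's angular rates into a quaternion with q_dot = 0.5 * q * (0, wx, wy, wz), keep it normalized, and hand the result to the engine.

#include <cmath>

struct Quat { double w, x, y, z; };

// One integration step: body-frame rates wx, wy, wz (rad/s) over dt seconds.
// Implements q_dot = 0.5 * q * (0, wx, wy, wz), then renormalizes.
Quat integrateGyro(Quat q, double wx, double wy, double wz, double dt) {
    double dw = 0.5 * (-q.x*wx - q.y*wy - q.z*wz);
    double dx = 0.5 * ( q.w*wx + q.y*wz - q.z*wy);
    double dy = 0.5 * ( q.w*wy - q.x*wz + q.z*wx);
    double dz = 0.5 * ( q.w*wz + q.x*wy - q.y*wx);
    q.w += dw*dt; q.x += dx*dt; q.y += dy*dt; q.z += dz*dt;
    double n = std::sqrt(q.w*q.w + q.x*q.x + q.y*q.y + q.z*q.z);
    q.w /= n; q.x /= n; q.y /= n; q.z /= n;  // keep it a unit quaternion
    return q;
}

In Unity, the components can be passed to the Quaternion(x, y, z, w) constructor and assigned to transform.rotation, though the axis conventions of the sensor and of Unity's left-handed frame usually require a remapping. A gyro-only quaternion also drifts in yaw over time; in practice the accelerometer (and a magnetometer, if present) is blended in with a filter such as Madgwick's or Mahony's.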

Related

Quaternion issues

I'm working with an MPU9250 to control a 3D position. I have an object that can rotate about 3 axes (roll, pitch and yaw), but I have found some problems.
Initially I tried to calculate Euler angles from quaternions, but I discovered that Euler angles have two discontinuities at +/- 90 degrees, so if I rotate the MPU through 365 degrees it gives me wrong values, as if it were shifting while rotating.
I read that it is possible to convert quaternions to a DCM (Direction Cosine Matrix), but I find that this algorithm demands too much of the Arduino processor.
The last possibility that comes to my mind is to control the position directly with quaternions, i.e. to "forecast" the target quaternion and stop the rotation when the MPU gives me back that same quaternion.
If anyone knows how to implement this or any other way please let me know your suggestions.
Many thanks,
Luca
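A sketch of that last idea (illustrative C++, not from the original thread): since q and -q represent the same rotation, the remaining rotation angle between the current and target quaternions follows from their 4D dot product, with no Euler conversion or DCM required.

#include <cmath>

struct Quat { float w, x, y, z; };

// Angle (radians) of the rotation taking quaternion q to quaternion t,
// assuming both are unit quaternions.
float angleBetween(const Quat& q, const Quat& t) {
    float d = q.w*t.w + q.x*t.x + q.y*t.y + q.z*t.z;  // 4D dot product
    d = std::fabs(d);              // q and -q encode the same rotation
    if (d > 1.0f) d = 1.0f;        // guard against rounding error
    return 2.0f * std::acos(d);
}

// e.g. stop the motion once angleBetween(current, target) < some threshold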

Rotating a line defined by two points in 3D

I have edited this question for additional clarity and because of some of the answers below.
I have an electromagnetic motion tracker which tracks a sensor and gives me a point in global space (X, Y, Z). It also tracks the rotation of the sensor and gives Euler angles (Yaw, Pitch, Roll).
The sensor is attached to a rigid body on a baseball cap which sits on the head of a person. However, I wish to track the position of a specific facial feature (nose for example) which I infer from the motion tracker sensor's position and orientation.
I have estimated the spatial offset between the motion tracker and the facial features I want to track. I have done this by simply measuring the offset along the X, Y and Z axis.
Based on a previous answer to this question, I have composed a rotation matrix from the Euler angles given to me by the motion tracker. However, I am stuck on how I should use this rotation matrix, the position of the sensor in global space, and the spatial offset between that sensor and the nose to give me the position of the nose in global space.
The sensor will give you a rotation matrix (via the Euler angles) and a position (which should be that of the center of rotation).
Whatever item is rigidly fastened to the sensor, such as the nose, will undergo the same motion. Then knowing the relative coordinates of the nose and the sensor, you get the relation
Q = R.q + P
where R is the rotation matrix, P the position vector of the sensor and q the relative coordinates of the nose.
Note that the relation between the rotation matrix and the angles can be computed using one of these formulas: https://en.wikipedia.org/wiki/Euler_angles#Rotation_matrix. (You will need to read the article carefully to work out which of the 12 possibilities your case is.)
In principle, you determine R and P from the readings of the sensor, but you are missing the coordinates q. There are several approaches:
- you determine those coordinates explicitly by measuring the distances along virtual axes located at the rotation center and properly aligned;
- you determine the absolute coordinates Q of the nose corresponding to known R and P; then q is given by R'(Q - P), where R' denotes the transpose of R (which is also its inverse). To obtain Q, you can just move the sensor center to the nose without moving the head.
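As a concrete sketch (plain C++; function names are illustrative), the two relations Q = R.q + P and q = R'(Q - P) look like this with R stored as a row-major 3x3 array:

// Global position of the nose: Q = R*q + P
void noseGlobal(const double R[3][3], const double q[3],
                const double P[3], double Q[3]) {
    for (int i = 0; i < 3; ++i)
        Q[i] = R[i][0]*q[0] + R[i][1]*q[1] + R[i][2]*q[2] + P[i];
}

// Calibration step: recover the offset q = R'(Q - P), using that R' = R^-1
void noseOffset(const double R[3][3], const double Q[3],
                const double P[3], double q[3]) {
    for (int i = 0; i < 3; ++i)   // note the transposed indexing R[j][i]
        q[i] = R[0][i]*(Q[0]-P[0]) + R[1][i]*(Q[1]-P[1]) + R[2][i]*(Q[2]-P[2]);
}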

Quaternion - Angle computation using accelerometer and gyroscope

I have been using a 6-DOF LSM6DS0 IMU unit (with accelerometer and gyroscope), and I am trying to calculate the angle of rotation around all three axes. I have tried many methods but am not getting the results I expect.
Methods tried:
(i) Complementary filter approach - I am able to get the angles using the formula provided in the link Angle computation method.
But the problem is that the angles are not at all consistent and drift a lot. Moreover, when the IMU is rotated around one axis, the angles calculated for the other axes wobble too much.
(ii) Quaternion-based angle calculation: there were plenty of resources claiming that the angles are calculated very well using the quaternion approach, but none had a clear explanation. I have used this method to update the quaternion for every set of values taken from the IMU unit. But the link didn't explain how to calculate the angles from the quaternion.
I have used the glm math library to convert the quaternion to Euler angles, and I have also tried the formula specified in the wiki link. With this method, since asin in the pitch calculation returns only -90 to +90 degrees, I am not able to rotate the object in 3D the way they do in the mentioned link.
Has anyone tried the quaternion-to-angle conversion before? I need to calculate the angles around all three axes in the range 0 to 360 degrees or -180 to +180 degrees.
Any help would be really appreciated. Thanks in advance.
http://davidegironi.blogspot.com.ar/2013/02/avr-atmega-mpu6050-gyroscope-and.html#.VzjDKfnhCUk
Sq = [q0, q1, q2, q3]
// the sensor quaternion can be translated to the aerospace sequence of yaw/pitch/roll angles
yaw   = atan2(2*q1*q2 - 2*q0*q3, 2*(q0^2) + 2*(q1^2) - 1)
pitch = -asin(2*q1*q3 + 2*q0*q2)
roll  = atan2(2*q2*q3 - 2*q0*q1, 2*(q0^2) + 2*(q3^2) - 1)
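A direct C translation of those formulas, with one addition of mine: the asin argument is clamped to [-1, 1], since numerical noise can push it slightly out of range and produce NaN near +/-90 degrees of pitch.

#include <math.h>

void quatToYPR(float q0, float q1, float q2, float q3,
               float* yaw, float* pitch, float* roll) {
    float s = 2*q1*q3 + 2*q0*q2;
    if (s >  1.0f) s =  1.0f;   // clamp: noise can push |s| just above 1
    if (s < -1.0f) s = -1.0f;
    *yaw   = atan2f(2*q1*q2 - 2*q0*q3, 2*q0*q0 + 2*q1*q1 - 1);
    *pitch = -asinf(s);
    *roll  = atan2f(2*q2*q3 - 2*q0*q1, 2*q0*q0 + 2*q3*q3 - 1);
}

Note that the limited pitch range is not a defect of this particular formula: in any Euler-angle sequence the middle angle is confined to a quarter turn, while yaw and roll cover the full -180 to +180 degrees via atan2. To drive arbitrary 3D rotation of an object, apply the quaternion itself instead of round-tripping through Euler angles.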

MPU6050 IMU quaternion conversion

I have a dilemma for a control system, and due to my poor maths ability I am unable to resolve it despite days of googling.
I am using an Arduino with an MPU6050 to get yaw, pitch, and roll, which are converted from a quaternion (to prevent gimbal lock).
The issue I have is that, due to space, I am unable to mount the gyroscope flat with x facing forwards, z facing up/down and y facing left. Instead, I can only mount the gyroscope with x facing down, y facing forwards and z facing right/left, i.e. the axes (roll, pitch, yaw) have become (yaw, roll, pitch). I am trying to convert the quaternion to reflect this change, but have no idea how. I have tried adding a quaternion to it, but with no success. Does anyone have any ideas how best to use an IMU mounted this way?
I might be wrong here, but since you're rotating the orientation vector (yaw, pitch, roll) by a constant rotation, you don't need to worry about gimbal lock. You can simply multiply your orientation vector by the corresponding pre-calculated rotation matrix.
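One concrete way to apply such a constant correction (a sketch; the value and multiplication side of q_mount depend on your axis conventions and must be worked out from the actual mounting) is to compose the sensor quaternion with a fixed mounting quaternion:

struct Quat { float w, x, y, z; };

// Hamilton product a*b
Quat mul(const Quat& a, const Quat& b) {
    return { a.w*b.w - a.x*b.x - a.y*b.y - a.z*b.z,
             a.w*b.x + a.x*b.w + a.y*b.z - a.z*b.y,
             a.w*b.y - a.x*b.z + a.y*b.w + a.z*b.x,
             a.w*b.z + a.x*b.y - a.y*b.x + a.z*b.w };
}

// If q_mount rotates body axes into sensor axes, the body orientation is
// q_body = q_sensor * q_mount  (right-multiplication by the fixed offset).
Quat remap(const Quat& qSensor, const Quat& qMount) {
    return mul(qSensor, qMount);
}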

Gravity Compensation in Accelerometer Data

Given a 9-DOF IMU (accelerometer, gyroscope and magnetometer), I want to remove/compensate for the effect of gravity in the accelerometer reading (the accelerometer can rotate freely). The sensor gives the orientation in quaternion representation relative to a (magnetic) north, west and up reference frame.
I found this http://www.varesano.net/blog/fabio/simple-gravity-compensation-9-dom-imus
but couldn't understand the basis for the given equation.
How could I achieve this given the above information?
You need to rotate the accelerometer reading by the quaternion into the Earth frame of reference (into the coordinate system of the room if you like), then subtract gravity. The remaining acceleration is the acceleration of the sensor in the Earth frame of reference often referred to as linear acceleration or user acceleration.
In pseudo-code, something like this
acceleration = [ax, ay, az]           // accelerometer reading
q                                     // quaternion corresponding to the orientation
gravity = [0, 0, -9.81]               // gravity on Earth in m/s^2
a_rotated = rotate(acceleration, q)   // rotate the measured acceleration into
                                      // the Earth frame of reference
user_acceleration = a_rotated - gravity
You say that you can get q through the API. The only nontrivial step is to implement the rotate() function.
To compute the image of a vector v when rotated by q, the following formula should be applied: v_rotated = q v q^-1. To compute it with floating-point numbers, you need to work out the formulas yourself; they are available at Using quaternion rotations.
As far as I can tell, the link you provided does exactly this: you see the expanded formulas there, and now you know where they came from. Also, the linked content seems to measure gravity in g, that is, gravity is [0, 0, -1].
Watch out for sign conventions (whether you consider gravity [0,0,-1] or [0,0,1]) and handedness of your coordinate systems!
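For reference, a possible rotate() in C++ (a sketch assuming q is a unit quaternion; it uses the standard two-cross-product expansion of q v q^-1 rather than two full quaternion multiplications):

struct Quat { double w, x, y, z; };

// v' = q v q^-1 for unit q, expanded as v' = v + w*t + cross(u, t),
// where u = (q.x, q.y, q.z) and t = 2 * cross(u, v).
void rotate(const double v[3], const Quat& q, double out[3]) {
    double tx = 2.0 * (q.y*v[2] - q.z*v[1]);
    double ty = 2.0 * (q.z*v[0] - q.x*v[2]);
    double tz = 2.0 * (q.x*v[1] - q.y*v[0]);
    out[0] = v[0] + q.w*tx + (q.y*tz - q.z*ty);
    out[1] = v[1] + q.w*ty + (q.z*tx - q.x*tz);
    out[2] = v[2] + q.w*tz + (q.x*ty - q.y*tx);
}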
I assume your accelerometer reading is in the sensor body frame. First we need to represent the accelerometer data with respect to the inertial frame, and then subtract gravity.
If you are directly using Euler angles rather than quaternions, then you need to compute the rotation matrix
R = [ ctheta*cpsi, -cphi*spsi + sphi*stheta*cpsi,  sphi*spsi + cphi*stheta*cpsi;
      ctheta*spsi,  cphi*cpsi + sphi*stheta*spsi, -sphi*cpsi + cphi*stheta*spsi;
      -stheta,      sphi*ctheta,                   cphi*ctheta ]
(It is given in MATLAB notation.) Here phi stands for the roll angle, theta for pitch, and psi for yaw. This R matrix maps from the body frame to the inertial frame; in flight dynamics it is also known as the transpose of the Direction Cosine Matrix (DCM).
After applying the matrix multiplication, you need to subtract gravity from the z direction in order to eliminate the static acceleration, i.e., gravity.
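Putting that Euler-angle route together (a sketch; the sign of the 9.81 term follows the convention above and may need flipping for your sensor, as the first answer warns):

#include <math.h>

// Rotate a body-frame accelerometer reading into the inertial frame with R
// (built from roll phi, pitch theta, yaw psi), then remove gravity from z.
void gravityCompensate(double phi, double theta, double psi,
                       const double a_body[3], double a_lin[3]) {
    double cphi = cos(phi), sphi = sin(phi);
    double cth  = cos(theta), sth  = sin(theta);
    double cpsi = cos(psi),  spsi = sin(psi);
    double R[3][3] = {
        { cth*cpsi, -cphi*spsi + sphi*sth*cpsi,  sphi*spsi + cphi*sth*cpsi },
        { cth*spsi,  cphi*cpsi + sphi*sth*spsi, -sphi*cpsi + cphi*sth*spsi },
        { -sth,      sphi*cth,                   cphi*cth }
    };
    for (int i = 0; i < 3; ++i)
        a_lin[i] = R[i][0]*a_body[0] + R[i][1]*a_body[1] + R[i][2]*a_body[2];
    a_lin[2] -= 9.81;  // assumes readings in m/s^2 with +z up in the inertial frame
}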
