The precision is the number of points I want in my vector, from index 0 (the initial point of my arc) to index precision - 1 (the end point).
Code example in C++:
#include <array>
#include <vector>

int precision = 20;
double pointInit[3]   = {2, 5, 2};
double pointRandom[3] = {3, 7, 1};
double pointFinal[3]  = {0, -3, 1};   // end point of the arc
std::vector<std::array<double,3>> pointArc;
std::array<double, 3> currentPoint;
// Fill the pointArc vector, from 0 (initial point) to precision - 1 (ending point)
for (int i = 0; i < precision; i++)
{
    // Find the value of the current point
    // currentPoint[0] = ????;
    // currentPoint[1] = ????;
    // currentPoint[2] = ????;
    pointArc.push_back(currentPoint);
}
EDIT: The arc I'm looking for is a circular arc.
Use atan2() to find the angles of the endpoints with respect to the center, subdivide the angle between them into precision - 1 steps, and convert the polar coordinates (use one of the endpoints to get the distance from the center) back to rectangular form.
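For the planar (2D) case, and assuming the center of the circle is already known, a minimal sketch of that idea could look like this (center, start, end and precision are placeholder names, and precision is assumed to be at least 2):
#include <array>
#include <cmath>
#include <vector>

std::vector<std::array<double, 2>> interpolateArc2D(std::array<double, 2> center,
                                                    std::array<double, 2> start,
                                                    std::array<double, 2> end,
                                                    int precision)
{
    // Radius from one endpoint, angles of both endpoints around the center.
    const double radius = std::hypot(start[0] - center[0], start[1] - center[1]);
    const double t0 = std::atan2(start[1] - center[1], start[0] - center[0]);
    const double t1 = std::atan2(end[1] - center[1], end[0] - center[0]);

    std::vector<std::array<double, 2>> points;
    for (int i = 0; i < precision; i++) {
        // Walk i/(precision - 1) of the way from t0 to t1, then back to x/y.
        const double t = t0 + (t1 - t0) * i / (precision - 1);
        points.push_back({center[0] + radius * std::cos(t),
                          center[1] + radius * std::sin(t)});
    }
    return points;
}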
1) translate the three points so that P0 comes to the origin
2) consider the vectors P0P1 and P0P2 and form an orthonormal basis by the Gram-Schmidt process (this is easy)
3) in this new basis, the coordinates of the three points are (0, 0, 0), (X1, 0, 0), (X2, Y2, 0), and you have turned the 3D problem to 2D. (Actually X1=d(P0,P1) and X2, Y2 are obtained from the dot and cross products of P0P2 with P0P1 / X1)
The equation of a 2D circle through the origin is
x² + y² = 2Xc.x + 2Yc.y
Plugging the above coordinates, you easily solve the 2x2 system for Xc and Yc.
X1² = 2Xc.X1
X2² + Y2² = 2Xc.X2 + 2Yc.Y2
4) The parametric equation of the circle is
x = Xc + R cos(t)
y = Yc + R sin(t)
where R²=Xc²+Yc².
You can find the angles t0 and t2 corresponding to the endpoints with tan(t) = (y - Yc) / (x - Xc).
5) interpolate on the angle t0.(1-i/n) + t2.i/n, compute the reduced coordinates x, y from the parametric equation and apply the inverse transforms of 2) and 1).
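Putting steps 1) to 5) together, a rough C++ sketch could look like the one below. It assumes the three points are not collinear and that precision >= 2; also note that interpolating directly between t0 and t2 can pick the complementary arc, so in practice you may need to shift one of the angles by 2π so that the arc passes through P1.
#include <array>
#include <cmath>
#include <vector>

using Vec3 = std::array<double, 3>;

static Vec3 add(const Vec3& a, const Vec3& b) { return {a[0] + b[0], a[1] + b[1], a[2] + b[2]}; }
static Vec3 sub(const Vec3& a, const Vec3& b) { return {a[0] - b[0], a[1] - b[1], a[2] - b[2]}; }
static Vec3 scale(const Vec3& a, double s)    { return {a[0] * s, a[1] * s, a[2] * s}; }
static double dot(const Vec3& a, const Vec3& b) { return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]; }
static double norm(const Vec3& a) { return std::sqrt(dot(a, a)); }

std::vector<Vec3> arcThroughThreePoints(const Vec3& P0, const Vec3& P1, const Vec3& P2, int precision)
{
    // 1)-2) translate P0 to the origin and build an orthonormal basis (e1, e2) of the plane
    Vec3 u = sub(P1, P0), v = sub(P2, P0);
    double X1 = norm(u);
    Vec3 e1 = scale(u, 1.0 / X1);
    Vec3 e2 = sub(v, scale(e1, dot(v, e1)));   // Gram-Schmidt
    e2 = scale(e2, 1.0 / norm(e2));

    // 3) 2D coordinates of the three points in that basis: (0,0), (X1,0), (X2,Y2)
    double X2 = dot(v, e1), Y2 = dot(v, e2);

    // 4) centre (Xc, Yc) of the circle x² + y² = 2Xc.x + 2Yc.y, and its radius
    double Xc = X1 / 2.0;
    double Yc = (X2 * X2 + Y2 * Y2 - 2.0 * Xc * X2) / (2.0 * Y2);
    double R  = std::sqrt(Xc * Xc + Yc * Yc);

    // 5) angles of the endpoints, then interpolate and map back to 3D
    double t0 = std::atan2(0.0 - Yc, 0.0 - Xc);   // angle of P0
    double t2 = std::atan2(Y2 - Yc, X2 - Xc);     // angle of P2

    std::vector<Vec3> arc;
    for (int i = 0; i < precision; i++) {
        double s = static_cast<double>(i) / (precision - 1);
        double t = t0 * (1.0 - s) + t2 * s;
        double x = Xc + R * std::cos(t), y = Yc + R * std::sin(t);
        arc.push_back(add(P0, add(scale(e1, x), scale(e2, y))));   // inverse of 2) and 1)
    }
    return arc;
}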
I have two vectors in a game. One vector is the player, one vector is an object. I also have a vector that specifies the direction the player is facing. The direction vector has no z part; it is a unit vector (magnitude 1) around the origin.
I want to calculate the angle between the direction the soldier is currently facing and the object, so I can correctly pan some audio (stereo only).
I want to calculate the angle between two dashed lines: one connects the player and the object, and the other represents the direction the player is facing, starting from the player's position.
At the moment, I am doing this (assume player, object and direction are all vectors with 3 components, x, y and z):
Vector3d v1 = direction;
Vector3d v2 = object - player;
v1.normalise();
v2.normalise();
float angle = acos(dotProduct(v1, v2));
But it seems to give me incorrect results. Any advice?
Test of code:
Vector3d soldier = Vector3d(1.f, 1.f, 0.f);
Vector3d object = Vector3d(1.f, -1.f, 0.f);
Vector3d dir = Vector3d(1.f, 0.f, 0.f);
Vector3d v1 = dir;
Vector3d v2 = object - soldier;
long steps = 360;
for (long step = 0; step < steps; step++) {
    float rad = (float)step * (M_PI / 180.f);
    v1.x = cosf(rad);
    v1.y = sinf(rad);
    v1.normalise();
    float dx = dotProduct(v2, v1);
    float dy = dotProduct(v2, soldier);
    float vangle = atan2(dx, dy);
}
You should always use atan2 when computing angular deltas, and then normalize the result.
The reason is that acos, for example, is a function with domain -1...1; even after normalizing, if the input's absolute value gets slightly bigger than 1 (because of floating-point approximations), the function fails, even though in such a case you clearly want an angle of 0 or PI. Also, acos cannot cover the full range -PI..PI, so you would need explicit sign tests to find the correct quadrant.
Instead, atan2's only singularity is at (0, 0) (where of course it doesn't make sense to compute an angle), and its codomain is the full circle -PI...PI.
Here is an example in C++
// Absolute angle 1
double a1 = atan2(object.y - player.y, object.x - player.x);
// Absolute angle 2
double a2 = atan2(direction.y, direction.x);
// Relative angle
double rel_angle = a1 - a2;
// Normalize to -PI .. +PI
rel_angle -= floor((rel_angle + PI)/(2*PI)) * (2*PI);
In the case of a general 3D orientation you need two orthogonal directions, e.g. the direction the nose is pointing and the direction toward the right ear.
In that case the formulas are just slightly more complex, but simpler if you have the dot product handy:
// I'm assuming that '*' is defined as the dot product
// between two vectors: x1*x2 + y1*y2 + z1*z2
double dx = (object - player) * nose_direction;
double dy = (object - player) * right_ear_direction;
double angle = atan2(dx, dy); // Already in -PI ... PI range
In 3D space, you also need to compute the axis:
Vector3d axis = normalise(crossProduct(normalise(v1), normalise(v2)));
I have a 3D mesh and I would like to draw each face as a 2D shape.
What I have in mind is this:
for each face
1. access the face normal
2. get a rotation matrix from the normal vector
3. multiply each vertex by the rotation matrix to get the vertices in a '2d like' plane
4. get 2 coordinates from the transformed vertices
I don't know if this is the best way to do this, so any suggestion is welcome.
At the moment I'm trying to get a rotation matrix from the normal vector; how would I do this?
UPDATE:
Here is a visual explanation of what I need:
At the moment I have quads, but there's no problem converting them into triangles.
I want to rotate the vertices of a face so that one of the dimensions gets flattened.
I also need to store the original 3D rotation of the face; I imagine that would be the inverse rotation of the face normal.
I think I'm a bit lost in space :)
Here's a basic prototype I did using Processing:
void setup(){
size(400,400,P3D);
background(255);
stroke(0,0,120);
smooth();
fill(0,120,0);
PVector x = new PVector(1,0,0);
PVector y = new PVector(0,1,0);
PVector z = new PVector(0,0,1);
PVector n = new PVector(0.378521084785,0.925412774086,0.0180059205741);//normal
PVector p0 = new PVector(0.372828125954,-0.178844243288,1.35241031647);
PVector p1 = new PVector(-1.25476706028,0.505195975304,0.412718296051);
PVector p2 = new PVector(-0.372828245163,0.178844287992,-1.35241031647);
PVector p3 = new PVector(1.2547672987,-0.505196034908,-0.412717700005);
PVector[] face = {p0,p1,p2,p3};
PVector[] face2d = new PVector[4];
PVector nr = PVector.add(n,new PVector());//clone normal
float rx = degrees(acos(n.dot(x)));//angle between normal and x axis
float ry = degrees(acos(n.dot(y)));//angle between normal and y axis
float rz = degrees(acos(n.dot(z)));//angle between normal and z axis
PMatrix3D r = new PMatrix3D();
//is this ok, or should I drop the builtin function, and add
//the rotations manually
r.rotateX(rx);
r.rotateY(ry);
r.rotateZ(rz);
print("original: ");println(face);
for(int i = 0 ; i < 4; i++){
PVector rv = new PVector();
PVector rn = new PVector();
r.mult(face[i],rv);
r.mult(nr,rn);
face2d[i] = PVector.add(face[i],rv);
}
print("rotated: ");println(face2d);
//draw
float scale = 100.0;
translate(width * .5,height * .5);//move to centre, Processing has 0,0 = Top,Left
beginShape(QUADS);
for(int i = 0 ; i < 4; i++){
vertex(face2d[i].x * scale,face2d[i].y * scale,face2d[i].z * scale);
}
endShape();
line(0,0,0,nr.x*scale,nr.y*scale,nr.z*scale);
//what do I do with this ?
float c = cos(0), s = sin(0);
float x2 = n.x*n.x,y2 = n.y*n.y,z2 = n.z*n.z;
PMatrix3D m = new PMatrix3D(x2+(1-x2)*c, n.x*n.y*(1-c)-n.z*s, n.x*n.z*(1-c)+n.y*s, 0,
n.x*n.y*(1-c)+n.z*s,y2+(1-y2)*c,n.y*n.z*(1-c)-n.x*s,0,
n.x*n.y*(1-c)-n.y*s,n.x*n.z*(1-c)+n.x*s,z2-(1-z2)*c,0,
0,0,0,1);
}
Update
Sorry if I'm getting annoying, but I don't seem to get it.
Here's a bit of python using Blender's API:
import Blender
from Blender import *
import math
from math import sin,cos,radians,degrees
def getRotMatrix(n):
c = cos(0)
s = sin(0)
x2 = n.x*n.x
y2 = n.y*n.y
z2 = n.z*n.z
l1 = x2+(1-x2)*c, n.x*n.y*(1-c)+n.z*s, n.x*n.y*(1-c)-n.y*s
l2 = n.x*n.y*(1-c)-n.z*s,y2+(1-y2)*c,n.x*n.z*(1-c)+n.x*s
l3 = n.x*n.z*(1-c)+n.y*s,n.y*n.z*(1-c)-n.x*s,z2-(1-z2)*c
m = Mathutils.Matrix(l1,l2,l3)
return m
scn = Scene.GetCurrent()
ob = scn.objects.active.getData(mesh=True)#access mesh
out = ob.name+'\n'
#face0
f = ob.faces[0]
n = f.v[0].no
out += 'face: ' + str(f)+'\n'
out += 'normal: ' + str(n)+'\n'
m = getRotMatrix(n)
m.invert()
rvs = []
for v in range(0,len(f.v)):
out += 'original vertex'+str(v)+': ' + str(f.v[v].co) + '\n'
rvs.append(m*f.v[v].co)
out += '\n'
for v in range(0,len(rvs)):
out += 'rotated vertex'+str(v)+': ' + str(rvs[v]) + '\n'
f = open('out.txt','w')
f.write(out)
f.close()
All I do is get the current object, access the first face, get the normal, get the vertices, calculate the rotation matrix, invert it, then multiply it by each vertex.
Finally I write a simple output.
Here's the output for a default plane for which I rotated all the vertices manually by 30 degrees:
Plane.008
face: [MFace (0 3 2 1) 0]
normal: [0.000000, -0.499985, 0.866024](vector)
original vertex0: [1.000000, 0.866025, 0.500000](vector)
original vertex1: [-1.000000, 0.866026, 0.500000](vector)
original vertex2: [-1.000000, -0.866025, -0.500000](vector)
original vertex3: [1.000000, -0.866025, -0.500000](vector)
rotated vertex0: [1.000000, 0.866025, 1.000011](vector)
rotated vertex1: [-1.000000, 0.866026, 1.000012](vector)
rotated vertex2: [-1.000000, -0.866025, -1.000012](vector)
rotated vertex3: [1.000000, -0.866025, -1.000012](vector)
Here's the first face of the famous Suzanne mesh:
Suzanne.001
face: [MFace (46 0 2 44) 0]
normal: [0.987976, -0.010102, 0.154088](vector)
original vertex0: [0.468750, 0.242188, 0.757813](vector)
original vertex1: [0.437500, 0.164063, 0.765625](vector)
original vertex2: [0.500000, 0.093750, 0.687500](vector)
original vertex3: [0.562500, 0.242188, 0.671875](vector)
rotated vertex0: [0.468750, 0.242188, -0.795592](vector)
rotated vertex1: [0.437500, 0.164063, -0.803794](vector)
rotated vertex2: [0.500000, 0.093750, -0.721774](vector)
rotated vertex3: [0.562500, 0.242188, -0.705370](vector)
The vertices from the Plane.008 mesh are altered, but the ones from Suzanne.001's mesh aren't. Shouldn't they be? Should I expect to get zeroes on one axis?
Once I've got the rotation matrix from the normal vector, what is the rotation on x, y, z?
Note: 1. Blender's Matrix supports the * operator. 2. In Blender's coordinate system Z points up. It looks like a right-handed system, rotated 90 degrees on X.
Thanks
That looks reasonable to me. Here's how to get a rotation matrix from a normal vector: use the axis-angle representation, where the normal is the axis and the angle is 0. You probably want the inverse rotation.
Is your mesh triangulated? I'm assuming it is. If so, you can do this without rotation matrices. Let the points of the face be A, B, C. Take any two vertices of the face, say A and B. Define the x axis along the vector AB. A is at (0, 0). B is at (|AB|, 0). C can be determined from trigonometry using the angle between AC and AB (which you get from the dot product) and the length |AC|, as sketched below.
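For instance, a small sketch of that construction in C++ (Vec3/Vec2 are simple stand-ins for whatever vector type you actually use):
#include <array>
#include <cmath>

using Vec3 = std::array<double, 3>;
using Vec2 = std::array<double, 2>;

static Vec3 sub(const Vec3& a, const Vec3& b) { return {a[0] - b[0], a[1] - b[1], a[2] - b[2]}; }
static double dot(const Vec3& a, const Vec3& b) { return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]; }
static double len(const Vec3& a) { return std::sqrt(dot(a, a)); }

// Flatten triangle A, B, C: A goes to the origin, the x axis runs along AB,
// and C is placed using the angle between AB and AC and the length |AC|.
void flattenTriangle(const Vec3& A, const Vec3& B, const Vec3& C,
                     Vec2& a2, Vec2& b2, Vec2& c2)
{
    Vec3 AB = sub(B, A), AC = sub(C, A);
    double lAB = len(AB), lAC = len(AC);
    double angle = std::acos(dot(AB, AC) / (lAB * lAC));   // from the dot product

    a2 = {0.0, 0.0};
    b2 = {lAB, 0.0};                                       // B at (|AB|, 0)
    c2 = {lAC * std::cos(angle), lAC * std::sin(angle)};
}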
You created the m matrix correctly. This is the rotation that corresponds to your normal vector. You can use the inverse of this matrix to "unrotate" your points. The normal of face2d will be x, i.e. point along the x-axis. So extract your 2d coordinates accordingly. (This assumes your quad is approximately planar.)
I don't know the library you are using (Processing), so I'm just assuming there are methods for m.invert() and an operator for applying a rotation matrix to a point. They may of course be called something else. Luckily the inverse of a pure rotation matrix is its transpose, and multiplying a matrix and a vector are straightforward to do manually if you need to.
void setup(){
size(400,400,P3D);
background(255);
stroke(0,0,120);
smooth();
fill(0,120,0);
PVector x = new PVector(1,0,0);
PVector y = new PVector(0,1,0);
PVector z = new PVector(0,0,1);
PVector n = new PVector(0.378521084785,0.925412774086,0.0180059205741);//normal
PVector p0 = new PVector(0.372828125954,-0.178844243288,1.35241031647);
PVector p1 = new PVector(-1.25476706028,0.505195975304,0.412718296051);
PVector p2 = new PVector(-0.372828245163,0.178844287992,-1.35241031647);
PVector p3 = new PVector(1.2547672987,-0.505196034908,-0.412717700005);
PVector[] face = {p0,p1,p2,p3};
PVector[] face2d = new PVector[4];
//the inverse (transpose) of the rotation matrix built from the normal n and angle 0
float c = cos(0), s = sin(0);
float x2 = n.x*n.x,y2 = n.y*n.y,z2 = n.z*n.z;
PMatrix3D m_inverse =
new PMatrix3D(x2+(1-x2)*c, n.x*n.y*(1-c)+n.z*s, n.x*n.y*(1-c)-n.y*s, 0,
n.x*n.y*(1-c)-n.z*s,y2+(1-y2)*c,n.x*n.z*(1-c)+n.x*s, 0,
n.x*n.z*(1-c)+n.y*s,n.y*n.z*(1-c)-n.x*s,z2-(1-z2)*c, 0,
0,0,0,1);
face2d[0] = m_inverse * p0; // Assuming there's an appropriate operator*().
face2d[1] = m_inverse * p1;
face2d[2] = m_inverse * p2;
face2d[3] = m_inverse * p3;
// print & draw as you did before...
}
For a face v0-v1-v3-v2, the vectors v3-v0 and v3-v2 together with the face normal already form a rotation matrix that would transform the 2D face into the 3D face.
A matrix represents a coordinate system. Each row (or column, depending on notation) corresponds to an axis of the new coordinate system expressed in the old one. A 3D rotation/translation matrix can be represented as:
vx.x vx.y vx.z 0
vy.x vy.y vy.z 0
vz.x vz.y vz.z 0
vp.x vp.y vp.z 1
where vx is the x axis of the coordinate system, vy the y axis, vz the z axis, and vp the origin of the new system.
Assume that v3-v0 is the y axis (2nd row), v3-v2 the x axis (1st row), and the normal the z axis (3rd row). Build a matrix from them, then invert it. You'll get a matrix that rotates the 3D face into a 2D face, as sketched below.
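For illustration, here is a rough C++ sketch of that idea (the vector type and names are placeholders). Since the basis vectors are normalised, the inverse of the rotation part is its transpose, and applying it amounts to dotting with each axis; note that the two edge vectors are only exactly orthogonal when the face is rectangular, so for general quads this is an approximation.
#include <array>
#include <cmath>

using Vec3 = std::array<double, 3>;

static Vec3 sub(const Vec3& a, const Vec3& b) { return {a[0] - b[0], a[1] - b[1], a[2] - b[2]}; }
static double dot(const Vec3& a, const Vec3& b) { return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]; }
static Vec3 normalise(const Vec3& v) {
    double l = std::sqrt(dot(v, v));
    return {v[0] / l, v[1] / l, v[2] / l};
}

// Express a point p in the face's own coordinate system: v3 as origin, the
// edge v3-v2 as x axis, the edge v3-v0 as y axis and the normal as z axis.
// For a planar face the z component is ~0, so (x, y) are the 2D coordinates.
Vec3 toFaceSpace(const Vec3& v0, const Vec3& v2, const Vec3& v3,
                 const Vec3& normal, const Vec3& p)
{
    Vec3 xAxis = normalise(sub(v2, v3));
    Vec3 yAxis = normalise(sub(v0, v3));
    Vec3 zAxis = normalise(normal);
    Vec3 d = sub(p, v3);
    // Dotting with each axis = multiplying by the transpose (inverse) of the
    // rotation matrix whose rows are the three axes.
    return {dot(d, xAxis), dot(d, yAxis), dot(d, zAxis)};
}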
I have a 3D mesh and I would like to draw each face as a 2D shape.
I suspect that UV unwrapping algorithms are closer to what you want to achieve than trying to get rotation matrix from 3d face.
That's very easy to achieve: (Note: By "face" I mean "triangle")
Create a view matrix that represents a camera looking at a face.
Determine the center of the face with bi-linear interpolation.
Determine the normal of the face.
Position the camera some units in opposite normal direction.
Let the camera look at the center of the face.
Set the camera's up vector to point in the direction from the center to any vertex of the face.
Set the aspect ratio to 1.
Compute the view matrix using this data.
Create a orthogonal projection matrix.
Set the width and height of the view frustum large enough to contain the whole face (e.g. the length of the longest side of the face).
Compute the projection matrix.
For every vertex v of the face, multiply it by both matrices: v * view * projection.
The result is a projection of your 3D faces into 2D space as if you were looking at them exactly orthogonally, without any perspective distortion. The final coordinates will be in normalized screen coordinates, where (-1, -1) is the bottom left corner, (0, 0) is the center and (1, 1) is the top right corner.
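Below is a condensed C++ sketch of those steps (names are placeholders). Instead of building explicit 4x4 view and projection matrices, it projects each vertex onto the camera's right/up axes and divides by the frustum half-size, which is what the orthographic view * projection boils down to for this setup:
#include <algorithm>
#include <array>
#include <cmath>
#include <vector>

using Vec3 = std::array<double, 3>;
using Vec2 = std::array<double, 2>;

static Vec3 sub(const Vec3& a, const Vec3& b) { return {a[0] - b[0], a[1] - b[1], a[2] - b[2]}; }
static double dot(const Vec3& a, const Vec3& b) { return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]; }
static Vec3 cross(const Vec3& a, const Vec3& b) {
    return {a[1] * b[2] - a[2] * b[1], a[2] * b[0] - a[0] * b[2], a[0] * b[1] - a[1] * b[0]};
}
static Vec3 normalise(const Vec3& v) {
    double l = std::sqrt(dot(v, v));
    return {v[0] / l, v[1] / l, v[2] / l};
}

// Project a face to normalized 2D coordinates as seen by a camera looking
// straight along the face normal at the face centre.
std::vector<Vec2> projectFace(const std::vector<Vec3>& verts, const Vec3& normal)
{
    // centre of the face (average of its vertices)
    Vec3 centre = {0.0, 0.0, 0.0};
    for (const Vec3& v : verts) {
        centre[0] += v[0] / verts.size();
        centre[1] += v[1] / verts.size();
        centre[2] += v[2] / verts.size();
    }

    Vec3 forward = normalise(normal);                 // view direction (along the normal)
    Vec3 up      = normalise(sub(verts[0], centre));  // up vector toward one vertex
    Vec3 right   = normalise(cross(up, forward));
    up           = cross(forward, right);             // re-orthogonalise

    // half-size of the orthographic frustum: large enough to contain the face
    double half = 0.0;
    for (const Vec3& v : verts) {
        Vec3 d = sub(v, centre);
        half = std::max({half, std::abs(dot(d, right)), std::abs(dot(d, up))});
    }

    std::vector<Vec2> out;
    for (const Vec3& v : verts) {
        Vec3 d = sub(v, centre);
        out.push_back({dot(d, right) / half, dot(d, up) / half});   // in [-1, 1]
    }
    return out;
}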
Given curves of type Circle and Circular-Arc in 3D space, what is a good way to compute accurate bounding boxes (world axis aligned)?
Edit: found a solution for circles, still need help with arcs.
C# snippet for solving BoundingBoxes for Circles:
public static BoundingBox CircleBBox(Circle circle)
{
    Point3d O = circle.Center;
    Vector3d N = circle.Normal;
    double ax = Angle(N, new Vector3d(1, 0, 0));
    double ay = Angle(N, new Vector3d(0, 1, 0));
    double az = Angle(N, new Vector3d(0, 0, 1));
    Vector3d R = new Vector3d(Math.Sin(ax), Math.Sin(ay), Math.Sin(az));
    R *= circle.Radius;
    return new BoundingBox(O - R, O + R);
}
private static double Angle(Vector3d A, Vector3d B)
{
    double dP = A * B;
    if (dP <= -1.0) { return Math.PI; }
    if (dP >= +1.0) { return 0.0; }
    return Math.Acos(dP);
}
One thing that's not specified is how you convert that angle range to points in space. So we'll start there and assume that the angle 0 maps to O + r·X and the angle π/2 maps to O + r·Y, where O is the center of the circle and
X = (x1,x2,x3)
and
Y = (y1,y2,y3)
are unit vectors.
So the circle is swept out by the function
P(θ) = O + rcos(θ)X + rsin(θ)Y
where θ is in the closed interval [θstart,θend].
The derivative of P is
P'(θ) = -rsin(θ)X + rcos(θ)Y
For the purpose of computing a bounding box we're interested in the points where one of the coordinates reaches an extremal value, hence points where one of the coordinates of P' is zero.
Setting -r·sin(θ)·x_i + r·cos(θ)·y_i = 0, we get
tan(θ) = sin(θ)/cos(θ) = y_i/x_i.
So we're looking for θ where θ = arctan(y_i/x_i) for i in {1,2,3}.
You have to watch out for the details of the range of arctan(), and avoiding divide-by-zero, and that if θ is a solution then so is θ±k*π, and I'll leave those details to you.
All you have to do is find the set of θ corresponding to extremal values in your angle range, and compute the bounding box of their corresponding points on the circle, and you're done. It's possible that there are no extremal values in the angle range, in which case you compute the bounding box of the points corresponding to θstart and θend. In fact you may as well initialize your solution set of θ's with those two values, so you don't have to special case it.
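For illustration, here is a C++ sketch of that procedure. It assumes the arc is described by its centre O, radius r, the two orthonormal in-plane unit vectors X and Y from above, and an angle range [thetaStart, thetaEnd] with thetaStart <= thetaEnd (these names are assumptions, not an existing API):
#include <algorithm>
#include <array>
#include <cmath>
#include <vector>

using Vec3 = std::array<double, 3>;

struct Box { Vec3 min, max; };

Box arcBoundingBox(const Vec3& O, double r, const Vec3& X, const Vec3& Y,
                   double thetaStart, double thetaEnd)
{
    // Seed the candidate set with the endpoint angles, so the case with no
    // interior extremum is handled automatically.
    std::vector<double> thetas = {thetaStart, thetaEnd};

    // For each coordinate i, P_i'(theta) = 0 when tan(theta) = y_i / x_i.
    // atan2 gives one solution; the others differ by multiples of pi.
    for (int i = 0; i < 3; i++) {
        double t = std::atan2(Y[i], X[i]);
        double cand = t + M_PI * std::floor((thetaStart - t) / M_PI);  // <= thetaStart
        for (; cand <= thetaEnd; cand += M_PI)
            if (cand >= thetaStart) thetas.push_back(cand);
    }

    Box box{{ HUGE_VAL,  HUGE_VAL,  HUGE_VAL},
            {-HUGE_VAL, -HUGE_VAL, -HUGE_VAL}};
    for (double t : thetas) {
        for (int i = 0; i < 3; i++) {
            double p = O[i] + r * std::cos(t) * X[i] + r * std::sin(t) * Y[i];
            box.min[i] = std::min(box.min[i], p);
            box.max[i] = std::max(box.max[i], p);
        }
    }
    return box;
}
With thetaStart = 0 and thetaEnd = 2π this reproduces the full-circle bounding box computed by the C# snippet above, since sqrt(x_i² + y_i²) = sin(angle between the normal and axis i) for orthonormal X, Y and N.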