Plane point rotation to a specific plane - math

I have a system where one axis moves from [0 -> 2PI]. This movement generates an angled plane.
This yellow plane will be my target plane. I know the normal vector of this yellow plane and its constant, and I want to calculate the XYZ position on the yellow plane from the rotation value of the axis (tool). My "solution" so far is to first calculate the XYZ coordinate on a simpler, vertical plane with [1 0 0] as its normal vector: since I know the sphere origin and the radius, it is easy to calculate any XYZ position on that plane from the axis angle.
But my problem is that, now that I have the XYZ position on the gray plane, how can I get the corresponding position on the yellow plane? Any suggestions would be appreciated.

The solution to this was simple; I made it more complicated than necessary. There was no need to transform the points from one plane to another, as these values can be calculated directly from the sphere origin and the plane orientation:
// Computes the XYZ position on the tilted (yellow) plane directly from
// the axis angle, the sphere and the plane orientation.
function pointOnTiltedCircle(angle, sphere, yellow) {
    // axis rotation in degrees -> radians
    let radAngle = angle * Math.PI / 180;
    let beta = (90 * Math.PI / 180) - radAngle; // rotation value on the circle
    let gamma = Math.acos(yellow.normal.x);     // plane orientation
    // temporary vars
    let cb = Math.cos(beta);
    let sb = Math.sin(beta);
    let cg = Math.cos(gamma);
    let sg = Math.sin(gamma);
    return {
        x: sphere.origin.x + sphere.radius * (cg * sb),
        y: sphere.origin.y + sphere.radius * (sg * sb),
        z: sphere.origin.z + sphere.radius * cb
    };
}
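For example, with a hypothetical unit sphere at the origin and a plane tilted 30 degrees (these values are only illustrative):
let sphere = { origin: { x: 0, y: 0, z: 0 }, radius: 1 };
let yellow = { normal: { x: Math.cos(Math.PI / 6), y: Math.sin(Math.PI / 6), z: 0 } };
let p = pointOnTiltedCircle(45, sphere, yellow); // XYZ position at axis angle 45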

Related

How can I obtain UV coordinates of a rectangle in 3D with the raytracing method?

I am currently working on raytracing and have a problem with view-ray collisions. To be precise, my problem is not finding the intersection point of a ray and a plane; it is converting that intersection point into UV coordinates for texture mapping (the rectangle can be rotated arbitrarily in the world). I know one point on the rectangle, its normal, and its bounds.
We have 4 vertices of a rectangle lying on a sphere:
A - top left
B - top right
C - bottom right
D - bottom left
Center of the sphere:
O
And intersection point on the sphere inside rectangle ABCD:
I
The idea is to find all sides of the triangle AID, because that will let us compute the coordinates of point I on the plane. If we place the rectangle on the plane with A at (0, rect.height) and D at (0, 0), then point I can be found by solving the following system of equations:
x^2+y^2=DI^2 - circle equation with center in point D and radius DI
x^2+(y-rect.height)^2=AI^2 - circle equation with center in point A and radius AI
from which it follows that:
y = (DI^2 - AI^2 + rect.height^2) / (2*rect.height)
and x could have two values (positive and negative), however we are interested only in the positive value, because only it will be inside the rect:
x = sqrt(DI^2 - y^2)
Then UV can be calculated the following way: uv(x/rect.width, y/rect.height).
However, the lengths of AI and DI are not yet known; they can be calculated using the formula for great-circle distance:
AI = (radius of the sphere) * (angular orthodromy length, in radians)
Radius of the sphere = sqrt((O.x - A.x)^2 + (O.y - A.y)^2 + (O.z - A.z)^2)
Angular orthodromy length = arccos(sin(a1)*sin(a2) + cos(a1)*cos(a2)*cos(b2 - b1))
where a1 is the angle AOA1, with A1 = (A.x, O.y, A.z)
b1 is the angle O1OA1, with O1 = (O.x, O.y, A.z)
a2 is the angle IOI1, with I1 = (I.x, O.y, I.z)
b2 is the angle O2OI1, with O2 = (O.x, O.y, I.z)
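A minimal sketch of this computation in JavaScript, assuming O, A, D, I are plain {x, y, z} objects and rect = {width, height} holds the rectangle size (all names are illustrative; the latitude/longitude below are signed versions of the angles above):
function greatCircleDistance(O, P, Q) {
    // sphere radius from the center O to a surface point P
    let R = Math.hypot(P.x - O.x, P.y - O.y, P.z - O.z);
    // signed latitude (a) and longitude (b) of P and Q relative to O
    let a1 = Math.asin((P.y - O.y) / R), b1 = Math.atan2(P.x - O.x, P.z - O.z);
    let a2 = Math.asin((Q.y - O.y) / R), b2 = Math.atan2(Q.x - O.x, Q.z - O.z);
    // great-circle (orthodromy) distance
    return R * Math.acos(Math.sin(a1) * Math.sin(a2) +
                         Math.cos(a1) * Math.cos(a2) * Math.cos(b2 - b1));
}
function uvOnSphericalRect(O, A, D, I, rect) {
    let AI = greatCircleDistance(O, A, I);
    let DI = greatCircleDistance(O, D, I);
    let h = rect.height;
    let y = (DI * DI - AI * AI + h * h) / (2 * h);   // from the two circle equations
    let x = Math.sqrt(Math.max(0, DI * DI - y * y)); // keep only the positive root
    return { u: x / rect.width, v: y / rect.height };
}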

Plot 2d point on 3D plane drawn in 2d

I am attempting to draw a point from a 2D plane on a 3D plane that is drawn in 2D. I'm not sure how to adjust the y position based on the angle of the perspective. If the point is at the center of the rectangle, it needs to be shifted up slightly when viewed from an angle, to account for distance from the viewer. Can anyone provide an equation to help?
Say the point in the rectangle is given by (x,y), and the coordinates we're looking for in the second image are (x', y').
w = y + y0
y' = k atan(w/h)
r = sqrt(h^2 + w^2)
x' = k atan(x/r)
where k is a scaling factor for the whole image, h is "altitude of the viewpoint above the plane" and y0 is, roughly, distance to the object.
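A minimal sketch of these formulas, with k, h and y0 passed in as parameters:
function projectToScreen(x, y, k, h, y0) {
    let w = y + y0;
    let r = Math.sqrt(h * h + w * w);
    return { x: k * Math.atan(x / r), y: k * Math.atan(w / h) };
}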

Retrieve 2D co-ordinate from a 3D point on a 3D plane

I have a point (x, y, z) that is on a plane defined by ax+by+cz+d=0. I'm trying to figure out the (x', y') coordinates relative to the plane, where the plane has a starting point of (x0, y0, z0), the x'-axis is defined by (1,0) and the y'-axis is defined by (0,1).
My major goal is to have the mouse click on a surface, and know the 2D co-ordinates on a particular surface. I've managed to intersect the ray onto a plane quite trivially.
As a side-note, I'm using DirectX 9 - my familiarity with matrix/vector math is limited by the APIs provided to me through the D3DX libraries.
One thought I had was to use the angle between the point and one of the axis vectors, find the distance from the origin, and work out x/y using simple trig. But I'm not sure if that's really an ideal solution, or if it can actually solve the issue at hand.
Since you have a 2D image on that plane, you apparently want to match its coordinate system. To do so, determine the unit vectors of the picture. That is, take the 3D coordinates B for the picture position (x, 0) for any x > 0, and subtract from that the 3D coordinates A for the origin (0, 0) of the picture. The resulting vector B − A will describe the positive x direction of your image. Do the same for the y direction.
Then normalize both of these vectors. This means dividing them by their length, sqrt(x²+y²+z²), but D3DX has a function D3DXVec3Normalize for this. Let's call the resulting 3D vectors X and Y.
To compute the x and y coordinates of any 3D point p, simply subtract the origin A from p, i.e. compute the vector p − A. Then compute the dot product between the result and the unit vectors X and Y. This will give you two numbers: the desired coordinates. This works because the dot product can be used to compute an orthogonal projection.
Translating this into D3Dx, it should look somewhat like the following. As I have never used it, this might have mistakes.
D3DXVECTOR3 *p; // input point
D3DXVECTOR3 a, b, c, ab, ac, ap; // helper vectors
FLOAT x, y; // output coordinates
imagePosTo3D(&a, 0, 0); // a = origin of image
imagePosTo3D(&b, 1, 0); // b = anywhere on positive x axis, perhaps a corner
imagePosTo3D(&c, 0, 1); // c = anywhere on positive y axis, perhaps a corner
D3DXVec3Subtract(&ab, &b, &a); // ab = b - a
D3DXVec3Subtract(&ac, &c, &a); // ac = c - a
D3DXVec3Normalize(&ab, &ab); // ab = ab / |ab|
D3DXVec3Normalize(&ac, &ac); // ac = ac / |ac|
// the above has to be done once for the image, the code below for every p
D3DXVec3Subtract(&ap, p, &a); // ap = p - a
x = D3DXVec3Dot(&ab, &ap); // x = ab∙ap
y = D3DXVec3Dot(&ac, &ap); // y = ac∙ap

Calculation of the position of an object moving in a circular motion in 3D

I have an object that is moving in a circle in 3D space. The center of the circle is at x:0, y:0, z:0 and the radius is a variable. I know where the object is on the circle (by its angle, let's call that ar, or by the distance it has moved). The circle can be tilted in all three directions, so I have three variables for angles; let's call them ax, ay and az. Now I need to calculate where exactly the object is in space: I need its x, y and z coordinates.
float radius = someValue;
float ax = someValue;
float ay = someValue;
float az = someValue;
float ar = someValue; // the angle of the object on the circle
//what i need to know
object.x = ?;
object.y = ?;
object.z = ?;
You need to provide more information to get the exact formula. The answer depends on which order you apply your rotations, which direction you are rotating in, and what the starting orientation of your circle is. Also, it will be much easier to calculate the position of the object considering one rotation at a time.
So, where is your object if all rotations are 0?
Let's assume it's at (r,0,0).
The pseudo-code will be something like:
pos0 = (r,0,0)
pos1 = pos0, rotated around Z-axis by ar (may not be Z-axis!)
pos2 = pos1, rotated around Z-axis by az
pos3 = pos2, rotated around Y-axis by ay
pos4 = pos3, rotated around X-axis by ax
pos4 will be the position of your object, if everything is set up right. If you have trouble setting it up, try keeping ax=ay=az=0 and worry about just ar, until your get that right. Then, start setting the other angles one at a time and updating your formula.
Each rotation can be performed with
x' = x * cos(angle) - y * sin(angle)
y' = y * cos(angle) + x * sin(angle)
This is rotation on the Z-axis. To rotate on the Y-axis, use z and x instead of x and y, etc. Also, note that angle is in radians here. You may need to make angle negative for some of the rotations (depending which direction ar, ax, ay, az are).
You can also accomplish this rotation with matrix multiplication, like Marcelo said, but that may be overkill for your project.
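A sketch of the pseudo-code above in plain JavaScript, assuming the object starts at (r, 0, 0) and the rotation order given; as noted, the axis used for ar and the angle signs may need adjusting for your setup:
function rotateZ(p, a) {
    return [p[0] * Math.cos(a) - p[1] * Math.sin(a),
            p[1] * Math.cos(a) + p[0] * Math.sin(a),
            p[2]];
}
function rotateY(p, a) {
    return [p[0] * Math.cos(a) + p[2] * Math.sin(a),
            p[1],
            p[2] * Math.cos(a) - p[0] * Math.sin(a)];
}
function rotateX(p, a) {
    return [p[0],
            p[1] * Math.cos(a) - p[2] * Math.sin(a),
            p[2] * Math.cos(a) + p[1] * Math.sin(a)];
}
function objectPosition(radius, ar, ax, ay, az) {
    let pos = [radius, 0, 0];  // pos0: all rotations zero
    pos = rotateZ(pos, ar);    // pos1: position on the circle
    pos = rotateZ(pos, az);    // pos2
    pos = rotateY(pos, ay);    // pos3
    pos = rotateX(pos, ax);    // pos4
    return pos;                // [x, y, z]
}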
Use a rotation matrix. Make sure you use a unit vector.

How can I turn a ray-plane intersection point into barycentric coordinates?

My problem:
How can I take two 3D points and lock them to a single axis? For instance, so that both their z-axes are 0.
What I'm trying to do:
I have a set of 3D coordinates in a scene, representing a box with a pyramid on it. I also have a camera, represented by another 3D coordinate. I subtract the camera coordinate from the scene coordinate and normalize it, returning a vector that points to the camera. I then do ray-plane intersection with a plane that is behind the camera point.
O + tD
where O (origin) is the camera position, D is the direction from the scene point to the camera, and t is the parameter value at which the ray intersects the plane.
I've searched far and wide, and as far as I can tell, this is called using a "pinhole camera".
The problem is not my camera rotation, I've eliminated that. The trouble is in translating the intersection point to barycentric (uv) coordinates.
The translation on the x-axis looks like this:
uaxis.x = -a_PlaneNormal.y;
uaxis.y = a_PlaneNormal.x;
uaxis.z = a_PlaneNormal.z;
point vaxis = uaxis.CopyCrossProduct(a_PlaneNormal);
point2d.x = intersection.DotProduct(uaxis);
point2d.y = intersection.DotProduct(vaxis);
return point2d;
While the translation on the z-axis looks like this:
uaxis.x = -a_PlaneNormal.z;
uaxis.y = a_PlaneNormal.y;
uaxis.z = a_PlaneNormal.x;
point vaxis = uaxis.CopyCrossProduct(a_PlaneNormal);
point2d.x = intersection.DotProduct(uaxis);
point2d.y = intersection.DotProduct(vaxis);
return point2d;
My question is: how can I turn a ray-plane intersection point into barycentric coordinates for both the x and the z axis?
The usual formula for points (p) on a line, starting at (p0) with vector direction (v) is:
p = p0 + t*v
The criterion for a point (p) on a plane containing (p1) and with normal (n) is:
(p - p1).n = 0
So, plug&chug:
(p0 + t*v - p1).n = (p0-p1).n + t*(v.n) = 0
-> t = (p1-p0).n / v.n
-> p = p0 + ((p1-p0).n / v.n)*v
To check:
(p - p1).n = (p0-p1).n + ((p1-p0).n / v.n)*(v.n)
= (p0-p1).n + (p1-p0).n
= 0
If you want to fix the Z coordinate at a particular value, you need to choose a normal along the Z axis (which will define a plane parallel to XY plane).
Then, you have:
n = (0,0,1)
-> p = p0 + ((p1.z-p0.z)/v.z) * v
-> x and y offsets from p0 = ((p1.z-p0.z)/v.z) * (v.x,v.y)
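A small sketch of this derivation with plain {x, y, z} objects (names are illustrative): p0 is the ray start, v its direction, p1 a point on the plane and n the plane normal.
function dot(a, b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
function intersectRayPlane(p0, v, p1, n) {
    let vn = dot(v, n);
    if (Math.abs(vn) < 1e-9) return null; // ray is parallel to the plane
    let t = dot({ x: p1.x - p0.x, y: p1.y - p0.y, z: p1.z - p0.z }, n) / vn;
    return { x: p0.x + t * v.x, y: p0.y + t * v.y, z: p0.z + t * v.z };
}
// Fixing the Z coordinate at p1.z amounts to choosing n = (0,0,1):
// intersectRayPlane(p0, v, { x: 0, y: 0, z: p1.z }, { x: 0, y: 0, z: 1 })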
Finally, if you're trying to build a virtual "camera" for 3D computer graphics, the standard way to do this kind of thing is homogeneous coordinates. Ultimately, working with homogeneous coordinates is simpler (and usually faster) than the kind of ad hoc 3D vector algebra I have written above.
