So I have a bit of a math problem. Here are the pieces.
Input:
Rot = Rotation (degrees). This is the rotation of the "player". This is also the yaw.
Vel.X = This is the left/rightward movement that would be happening if it weren't rotated
Vel.Z = Same as above, except it's the up/down movement
Output:
Result.X = This is the actual movement that should be happening along the x axis considering rotation
Result.Z = Same as last
Basically the scenario is that a player is standing on a platform with rotation "Rot". When directional keys are pressed, velocity is added to the "Vel" value accordingly. However, if the rotation isn't 0 this won't produce the right result, because once the player rotates, "left" becomes relative to the direction the player is facing.
Could you please tell me a formula that would find the proper x and z movement so that the player moves around relative to its rotation?
This problem is probably the most basic rotation question in game programming.
Using your Vel.X and Vel.Z values, you have what you might think of as the vector you wish to rotate in the x/z plane (instead of x/y, but it's the same idea). Whether it's a velocity or a position, the approach is the same. For 2D vector rotation, the formula is:
Result.X = Vel.X * cos(Rot) - Vel.Z * sin(Rot);
Result.Z = Vel.X * sin(Rot) + Vel.Z * cos(Rot);
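As a minimal sketch (in Java here; note that sin and cos in most math libraries expect radians, so a Rot given in degrees has to be converted first):

static double[] rotateVelocity(double velX, double velZ, double rotDegrees) {
    double r = Math.toRadians(rotDegrees);
    // Standard 2D rotation applied in the x/z plane.
    double resultX = velX * Math.cos(r) - velZ * Math.sin(r);
    double resultZ = velX * Math.sin(r) + velZ * Math.cos(r);
    return new double[] { resultX, resultZ };
}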
Alright, so I know there are a lot of questions referring to normalized device coordinates here on SO, but none of them address my particular issue.
So, everything I draw is specified in 2D screen coordinates, where the top left is (0,0) and the bottom right is (screenWidth, screenHeight). Then in my vertex shader I do this calculation to get the NDC (basically, I'm rendering UI elements):
float ndcX = (screenX - ScreenHalfWidth) / ScreenHalfWidth;
float ndcY = 1.0 - (screenY / ScreenHalfHeight);
where screenX/screenY are pixel coordinates, for example (600, 700), and ScreenHalfWidth/ScreenHalfHeight are half of the screen width/height.
And the final position that I return from the vertex shader for the rasterization state is:
gl_Position = vec4(ndcX, ndcY, Depth, 1.0);
This works perfectly fine in OpenGL ES.
Now the problem is that when I try it just like this in Metal 2, it doesn't work.
I know Metal's NDC volume is 2x2x1 and OpenGL's is 2x2x2, but I thought depth didn't play an important part in this equation since I am passing it in myself per vertex.
I tried this link and this SO question, but I was confused and they weren't much help, since I am trying to avoid matrix calculations in the vertex shader, as I am rendering everything in 2D for now.
So my questions: What is the formula to transform pixel coordinates to NDC in Metal? Is it possible without using an orthographic projection matrix? And why doesn't my equation work for Metal?
It is of course possible without a projection matrix. Matrices are just a useful convenience for applying transformations. But it's important to understand how they work when situations like this arise, since using a general orthographic projection matrix would perform unnecessary operations to arrive at the same results.
Here are the formulae I might use to do this:
float xScale = 2.0f / drawableSize.x;
float yScale = -2.0f / drawableSize.y;
float xBias = -1.0f;
float yBias = 1.0f;
float clipX = position.x * xScale + xBias;
float clipY = position.y * yScale + yBias;
Where drawableSize is the dimension (in pixels) of the renderbuffer, which can be passed in a buffer to the vertex shader. You can also precompute the scale factors and pass those in instead of the screen dimensions, to save some computation on the GPU.
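For instance, as a plain function (a sketch in Java; it's the same scale-and-bias as above, with illustrative names):

static double[] pixelToClip(double px, double py,
                            double drawableWidth, double drawableHeight) {
    // Pixel coordinates have the origin at the top left with y down;
    // Metal clip space runs from -1 to 1 in x and y, with y up.
    double clipX = px * (2.0 / drawableWidth) - 1.0;
    double clipY = py * (-2.0 / drawableHeight) + 1.0;
    return new double[] { clipX, clipY };
}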
Is there a way to convert that data:
Object position which is a 3D point (X, Y, Z),
Camera position which is a 3D point (X, Y, Z),
Camera yaw, pitch, roll (-180:180, -90:90, 0)
Field of view (-45°:45°)
Screen width & height
into the 2D point on the screen (X, Y)?
I'm looking for proper math calculations according to this exact set of data.
It's difficult, but it's possible to do it for yourself.
There are lots of libraries that do this for you, but it is more satisfying if you do it yourself:
This is possible, and I have written my own 3D engine in JavaScript that does this for objects, using the HTML5 Canvas. You can see my code here, and try to solve a 3D maze game I wrote here, to understand what I will talk about below...
The basic idea is to work in steps. To start, you have to forget about the camera angles (yaw, pitch and roll), as these come later, and just imagine you are looking down the y axis. The first step is then to calculate, using trig, the yaw and pitch angles from the camera to your object's coordinate. By this I mean: imagine you are looking through a letterbox; the yaw angle is the angle in degrees left or right of the mid line to your coordinate (so both positive and negative), and the pitch is the angle up or down from it. Taking these angles, you can map them to the x and y of a 2D coordinate system.
The calculations for the angles are:
yaw = atan((coord.x - cam.x) / (coord.y - cam.y))
pitch = atan((coord.z - cam.z) / (coord.y - cam.y))
with coord.x, coord.y and coord.z being the coordinates of the object, and likewise for the camera (cam.x, cam.y and cam.z). These calculations assume a Cartesian coordinate system with the axes oriented as: z up, y forward and x right.
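For concreteness, a direct transcription of those formulas (a sketch in Java; the parameter names are made up, and the angles come back in radians):

static double[] anglesTo(double camX, double camY, double camZ,
                         double objX, double objY, double objZ) {
    // Convention from the text: z up, y forward, x right.
    // atan (rather than atan2) assumes the object is in front
    // of the camera, i.e. objY > camY.
    double yaw   = Math.atan((objX - camX) / (objY - camY)); // left/right of the mid line
    double pitch = Math.atan((objZ - camZ) / (objY - camY)); // up/down from the mid line
    return new double[] { yaw, pitch };
}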
From here, the next step is to map this angle in the 3D world to a coordinate which you can use in a 2D graphical representation.
To map these angles onto your screen, you need to scale them up as distances from the mid line. This means multiplying them by width / fov (or height / fov for the vertical direction). These distances can be positive or negative (as each is an angle measured from the mid line), so to actually draw on a canvas you need to add half of the screen width or height to them.
So this would mean your canvas coordinate would be:
x = width / 2 + yaw * (width / fov)
y = height / 2 + pitch * (height / fov)
where width and height are the dimensions of your screen, fov is the camera's field of view, and yaw and pitch are the respective angles of the object from the camera.
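As a sketch (Java again; fov and the two angles must share units, e.g. both in radians):

static double[] toScreen(double yaw, double pitch,
                         double width, double height, double fov) {
    // Scale each angle to a distance from the mid line,
    // then shift to the canvas centre.
    double x = width / 2.0 + yaw * (width / fov);
    double y = height / 2.0 + pitch * (height / fov);
    return new double[] { x, y };
}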
You have now achieved the first big step: mapping a 3D coordinate down to 2D. If you have managed to get this all working, I would suggest trying multiple points and connecting them to form shapes. Also try moving your camera's position to see how the perspective changes; you will soon see how realistic it already looks.
In addition, if this worked fine for you, you can move on to having the camera not only change its position in the 3D world but also its orientation, i.e. the yaw, pitch and roll angles. I will not go into this fully now, but the basic idea is to use 3D world transformation matrices. You can read up about them here; they do get quite complicated, but I can give you the calculations if you get this far.
It might help to read the (old-style) OpenGL specs:
https://www.khronos.org/registry/OpenGL/specs/gl/glspec14.pdf
See section 2.10
Also:
https://www.khronos.org/opengl/wiki/Vertex_Transformation
Might help with more concrete examples.
Also, for "proper math" look up 4x4 matrices, projections, and homogeneous coordinates.
https://en.wikipedia.org/wiki/Homogeneous_coordinates
I am working on a Unity game where the Euler angles of the player display some weird behaviour that I cannot understand. There seem to be two full 360-degree rotations, one positive and one negative. Depending on the direction you move when you are at 0 degrees, it will take either the negative or the positive path. This means the player can have totally different yaw values depending on whether it takes the green or the red path. See the following image to get an idea of what is happening:
The issue now comes when I want to calculate the new angle for the player to look at some specific object in the 3d world space. I calculate this angle using some simple math:
// make player the origin, so the target is relative to the player
Vector3 delta = player.position - target.position;
float magnitude = delta.Length();
float pitch = asin((delta.y) / magnitude) * (180 / M_PI);
float yaw = -atan2(delta.x, -delta.z) * (180 / M_PI);
This gives me back correct angles, but the yaw runs from 0 to 180, then from -180 to 0 again. I correct for this by doing:
// this makes sure the calculated angle is from 0-360
if (yaw < 0) {
yaw += 360; // normalize yaw
}
So, right now this is aligned with the red rotation line illustrated in the picture above. If I set the player's rotation to the newly calculated angle, it looks directly at the target object. But ONLY when the player is on the red rotation path.
If the player is on the green rotation path, for example, the calculated angle doesn't fit the player's current rotation. If I set the rotation now, things get quirky. Somehow I need to compensate the newly calculated player angle.
What is going on here? And is it possible to manipulate the newly calculated angle (which always ranges from 0-360) so it is based on the player's current rotation path (either green or red)?
I hope my question makes sense; I found it quite hard to describe what is going on. I hope someone can explain the situation and ultimately help me fix the newly calculated angle so it adjusts to the player's current rotation path!
Any help is appreciated, thanks in advance!
EDIT
So I came up with the following to make sure the player is always rotated the shortest amount, taking both green and red rotation paths into consideration:
// normalize angles for when using negative rotation
if (localAngles[1] < 0)
{
    angles[1] -= 360;
}
double diffPitch = (angles[0] - localAngles[0]);
double diffYaw = (angles[1] - localAngles[1]);
// this makes sure we always take the shortest amount of degrees to target
if (diffYaw < -180)
{
    diffYaw += 360;
}
if (diffYaw > 180)
{
    diffYaw -= 360;
}
I will have to test some more to be sure this is the solution.
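For reference, that shortest-arc logic can be packaged as a small self-contained helper (a sketch in Java; the function name is made up):

static double shortestAngleDiff(double from, double to) {
    // Smallest signed difference between two yaw angles, in degrees.
    // The result lies in (-180, 180], so rotating by it always takes
    // the shorter way around, whichever path the player arrived on.
    double diff = (to - from) % 360.0;
    if (diff > 180.0) {
        diff -= 360.0;
    } else if (diff <= -180.0) {
        diff += 360.0;
    }
    return diff;
}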
Sorry if this question has been asked before, but if so I could not find it before posting this.
In a nutshell, I want to do this:
Example.
I want a pointer (red) to rotate about the circle (blue) according to where the mouse is located. (If picture is not visible, it depicts a blue circle with a red triangle pointing away from it, towards the mouse).
If possible, please answer with a general mathematical equation rather than specific code. Thanks.
Assuming a normal Cartesian coordinate space, with the X axis going to the right and the Y axis going up, you first need to calculate the angle from the circle origin (O) to the mouse coordinate (M):
theta = atan2(M.y - O.y, M.x - O.x)
you can then calculate the position of a point (P) orbiting the circle at radius (r), relative to O (add O.x and O.y for absolute coordinates), with:
P.x = r * cos(theta)
P.y = r * sin(theta)
The atan2(y, x) function is a common math library function that just computes atan(y / x) but takes the relative signs of x and y into account to determine the correct quadrant.
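Putting both steps together (a sketch in Java, following the answer's y-up convention; the function name is made up):

static double[] pointerPosition(double ox, double oy,
                                double mx, double my, double r) {
    // Angle from the circle origin O to the mouse M...
    double theta = Math.atan2(my - oy, mx - ox);
    // ...then the point at radius r along that direction, in absolute coordinates.
    return new double[] { ox + r * Math.cos(theta), oy + r * Math.sin(theta) };
}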
I'm trying to create a program that draws a custom pattern. I have it so that if sides = 3 it's a triangle, 4 is a rectangle, and anything above that uses a formula so that, if you really wanted, you could have 25 sides. I'm using lines, translation and rotation: plant, turn, draw, repeat.
angleMeasure = (180 * (sides-2) ) /sides;
println(angleMeasure);
println(radians(angleMeasure));
//creating the 5+ sided shape
pushMatrix();
translate(width/2, height/2); //translating the whole shape/while loop
while (counter < sides) {
    line(0, 0, 170, 0);
    translate(170, 0); //THIS translate is what makes the lines go in the direction they need to.
    rotate(angleMeasure);
    counter = counter + 1;
}
popMatrix();
This works almost correctly. The last and first lines don't connect. Suggestions? Maybe it's a problem with the math, but println reveals a correct angle measure in degrees. Here's what it looks like: http://i.stack.imgur.com/TwYMj.png
EDIT: Changed the rotate from rotate(angleMeasure) to rotate(angleMeasure * -1). This rotated the whole shape and made it clear that the angle on the very first line is off. See: http://i.stack.imgur.com/Z1KmY.png
You actually need to turn by angle = 360°/sides, and convert this angle to radians.
Thus for a pentagon you need angle = 72°. The number you computed is 108, which interpreted as radians is 17 full turns plus an angle of about 67.6°. This falls about 4.4° short of the correct angle, so you obtain a roughly correct picture with slightly too wide inner angles, resulting in the gap (rather than a crossing, which you would get if the angle were larger than the correct one).
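Concretely, a corrected version of the loop might look like this (a Processing sketch, keeping the question's 170-pixel side length):

int sides = 5;
float turn = radians(360.0f / sides); // exterior angle, converted to radians

pushMatrix();
translate(width/2, height/2);
for (int i = 0; i < sides; i++) {
  line(0, 0, 170, 0);   // draw one side
  translate(170, 0);    // move to its far end
  rotate(turn);         // turn by the exterior angle
}
popMatrix();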