Converting polar coordinates to rectangular coordinates - math

Convert angle in degrees to a point
How can I convert an angle (in degrees or radians) to a point (X, Y) a fixed distance away from a center point?
Like a point rotating around a center point.
This is exactly the opposite of atan2, which computes the angle of the point (x, y) in radians.
Note: I kept the original title because that's what people who do not yet understand this will be searching by!

Let the fixed distance be D, then X = D * cos(A) and Y = D * sin(A), where A is the angle.

If the center point (Xcp, Ycp) isn't the origin, you also need to add its coordinates to (X, Y), i.e. X = Xcp + D * cos(A) and Y = Ycp + D * sin(A).
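A minimal sketch of those two formulas in Python (the function name and arguments are just illustrative):
import math

def point_at_angle(xcp, ycp, d, angle_deg):
    # Point at distance d from the center (xcp, ycp), at angle_deg degrees
    a = math.radians(angle_deg)            # trig functions expect radians
    return xcp + d * math.cos(a), ycp + d * math.sin(a)

# 90 degrees, radius 5, centered at the origin -> approximately (0.0, 5.0)
print(point_at_angle(0.0, 0.0, 5.0, 90.0))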

What PolyThinker said.
Also, if you need the distance from the origin, it's sqrt(x^2 + y^2).

t = angle
r = radius (fixed distance)
x = r cos(t)
y = r sin(t)

Related

How to find points of certain distance on a circle perimeter?

Suppose (x1, y1) is a point on the perimeter of the circle (x - 420)^2 + (y - 540)^2 = 260^2. What are the two points on the circle's perimeter at (Euclidean) distance d from the point (x1, y1)?
Using trig
Assuming you are working in a programming language, the answer is given as pseudocode.
Working in radians, a distance d along a circle can be expressed as an angle a = d / r (where r is the radius).
Given an arbitrary point (x1, y1) on the circle (x1 - 420)^2 + (y1 - 540)^2 = 260^2 (note: x1, y1 are assumed known), we can extract the center x = 420, y = 540 and the radius r = 260.
The angle corresponding to the distance d is then a = d / 260.
Most languages have the function atan2, which computes the angle of a vector. We can get the angle from the circle center to the arbitrary point as ang = atan2(y1 - 540, x1 - 420) (note: y first, then x).
Thus the absolute angles (ang1, ang2) to the two points a distance d along the circle from (x1, y1) are computed as...
// ? marks inputs whose values are known at run time
x = 420
y = 540
r = 260
d = ?
x1 = ?
y1 = ?
ang = atan2(y1 - y, x1 - x)
ang1 = ang + d / r
ang2 = ang - d / r
And the coordinates of the points (px1, py1) and (px2, py2) are computed as...
px1 = cos(ang1) * r + x
py1 = sin(ang1) * r + y
px2 = cos(ang2) * r + x
py2 = sin(ang2) * r + y
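A runnable translation of the pseudocode above into Python (the function name and the sample values for d, x1, y1 are just for illustration):
import math

def points_at_arc_distance(x, y, r, x1, y1, d):
    # Two points on the circle (center x, y, radius r) at arc distance d from (x1, y1)
    ang = math.atan2(y1 - y, x1 - x)       # angle from the center to (x1, y1)
    ang1 = ang + d / r
    ang2 = ang - d / r
    px1, py1 = math.cos(ang1) * r + x, math.sin(ang1) * r + y
    px2, py2 = math.cos(ang2) * r + x, math.sin(ang2) * r + y
    return (px1, py1), (px2, py2)

# Example: the circle from the question and the point (680, 540) on its right edge
print(points_at_arc_distance(420, 540, 260, 680, 540, 100))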
Vector algebra
The problem can also be solved using vector algebra, without the trig function atan2.
Compute the unit vector representing the angle a = d / r and then, with the circle at the origin, transform (rotate) the point on the circle using the unit vector in both directions. Translate the points back to the circle's original position for the solution.
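A sketch of that vector approach in Python, rotating the point by the unit vector (cos a, sin a) in both directions (the function name is again just illustrative):
import math

def points_at_arc_distance_vec(x, y, r, x1, y1, d):
    # Same result as the trig version, using a 2D rotation instead of atan2
    a = d / r
    ux, uy = math.cos(a), math.sin(a)      # unit vector representing the angle a
    px, py = x1 - x, y1 - y                # translate the circle to the origin
    # rotate (px, py) by +a and by -a, then translate back
    p1 = (px * ux - py * uy + x, px * uy + py * ux + y)
    p2 = (px * ux + py * uy + x, -px * uy + py * ux + y)
    return p1, p2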

Set vector2 coordinates by move distance and degree

I have a Vector2 in my 2D game, and what I would like to do now is set my Vector2's x and y by calculating them using a rotation in degrees.
Do I need to use PI to calculate the new X and Y coordinates and then add the move distance per second in order to get the correct coordinates?
Example: let's say the degree is 90, which means my game object would move forward at 5 floating-point units per second, so Y would be 5, 10, 15; and if the degree were 180 then X would increase by 5 every second. This is simple, but how do I do it for other degrees such as 38, 268, etc.?
The usual convention is that 0 degrees points in the positive X direction and, as the angle increases, the direction rotates anti-clockwise. Your convention seems to be that 0 degrees points in the negative X direction and the angle increases clockwise, so first of all you must translate your angle, say alpha, into one with the usual convention, say beta:
beta = 180.0 - alpha
Next, trigonometric functions assume radians which run from 0 to 2π rather than from 0 to 360, so you must translate beta into an angle in radians, say theta
theta = 2.0*PI*beta/360.0
Finally, cos(theta) gives the change in X for a move of 1 unit in the direction given by theta, and sin(theta) gives the change in Y. So you need
X = X + D * cos(theta)
Y = Y + D * sin(theta)
for a distance D. Using your convention this translates to
X = X + D * cos(2.0*PI*(180.0-alpha)/360.0)
Y = Y + D * sin(2.0*PI*(180.0-alpha)/360.0)
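As a small Python sketch under the question's convention (the function name and the 5-units-per-second example are just illustrative):
import math

def step(x, y, alpha_deg, distance):
    # Move (x, y) by `distance` in direction alpha_deg, where 90 -> +Y and 180 -> +X
    theta = math.radians(180.0 - alpha_deg)   # convert to the usual convention, then to radians
    return x + distance * math.cos(theta), y + distance * math.sin(theta)

print(step(0.0, 0.0, 90.0, 5.0))    # approximately (0.0, 5.0)
print(step(0.0, 0.0, 180.0, 5.0))   # approximately (5.0, 0.0)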

How to calculate a point on a rotated axis?

How can I calculate a point (X,Y) a specified distance away, on a rotated axis? I know what angle I'd like the point "moving" along (in degrees).
x = cos(a) * d
y = sin(a) * d
where a is the angle and d is the distance.
If the trigonometric functions take radians instead of degrees, you have to convert the angle by dividing it by 180/pi (equivalently, multiplying by pi/180).
Convert to polar coordinates and then rotate the point through the angle you want:
x = r * cos( theta );
y = r * sin( theta );
Note: theta in radians ( deg = rad * 180 / pi )
More info on polar coordinates.
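A short Python sketch of the degree case (the names here are only illustrative):
import math

def offset_along_angle(angle_deg, distance):
    # Point at `distance` from the origin along the axis rotated by angle_deg
    theta = math.radians(angle_deg)        # same as dividing the angle by 180/pi
    return distance * math.cos(theta), distance * math.sin(theta)

print(offset_along_angle(30.0, 10.0))      # roughly (8.66, 5.0)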
Do you mean the 3d formulas? They are easy as well. But we need to know what's your convention for specifying the axis.

Plotting a point on the edge of a sphere

Coming from a Flash background, I have an OK understanding of some simple 2D trig. In 2D with a circle, I know the math to place an item on the edge given an angle and a radius, using:
x = cos(a) * r;
y = sin(a) * r;
Now if I have a point in 3D space, I know the radius of my sphere, the angle I want to position it at around the z axis, and the angle I want to position it at around, say, the y axis. What is the math to find the x, y and z coordinates in my 3D space (assume my origin is 0,0,0)? I would think I could borrow the math from the circle trig, but I can't seem to find a solution.
Your position in 3D is given by two angles (plus the radius, which in your case is constant):
x = r * cos(s) * sin(t)
y = r * sin(s) * sin(t)
z = r * cos(t)
Here, s is the angle around the z-axis (s = theta, in the range 0 to 2*PI, measured in the xy-plane), and t is the height angle measured 'down' from the z-axis (t = phi, in the range 0 to PI).
The accepted answer did not seem to support negative x values (possibly I did something wrong), but just in case: using the notation of the ISO convention on coordinate systems defined in this Wikipedia entry, this system of equations should work:
import math

# spherical -> Cartesian (ISO convention: theta is the polar angle, phi the azimuth)
x = radius * math.sin(theta) * math.cos(phi)
y = radius * math.sin(theta) * math.sin(phi)
z = radius * math.cos(theta)

# Cartesian -> spherical
radius = math.sqrt(x**2 + y**2 + z**2)
phi = math.atan2(y, x)
theta = math.acos(z / radius)

Add distance onto coordinate

I have a unit vector, a distance and a coordinate and I would like to calculate the new coordinate given by adding the distance onto the coordinate in the given direction. How do I do this?
Multiply the vector by the distance then add the resulting vector to the point.
Here's some pseudocode, assuming you're using Cartesian coordinates.
new_coord.x = distance * unit.x + coord.x
new_coord.y = distance * unit.y + coord.y
If by a unit vector you mean a vector of length 1, you can find the coordinate by multiplying all of its components by the distance.
V = V_unit * distance
V_unit = (1/2 sqrt(3), 1/2)
distance = 6
==>
V = (3 sqrt(3), 3)
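Putting both answers together as a tiny Python sketch, including the addition of the starting coordinate (the sample values are only illustrative):
import math

unit = (math.sqrt(3) / 2, 0.5)             # unit vector of length 1
coord = (10.0, 20.0)
distance = 6.0

new_coord = (coord[0] + distance * unit[0],
             coord[1] + distance * unit[1])
print(new_coord)                            # (10 + 3*sqrt(3), 23.0), roughly (15.196, 23.0)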
