How can I detect when a 2D moving object has crossed its own path?
I store the path as an array of points based on the plane's previous positions.
Pseudo-code or any programming language can be used to describe a solution.
Here's the code I've tried already; it detects a full 360° loop, but I think I need a different approach.
CGFloat angleDiff = angleCurr - lastAngleRecorded;
lastAngleRecorded = angleCurr;
// Ensure -180 < angleDiff < 180
angleDiff = angleDiff > M_PI ? angleDiff - (M_PI*2) : angleDiff;
angleDiff = angleDiff < -M_PI ? angleDiff + (M_PI*2) : angleDiff;
// Reset loop tracking if the plane's angle change exceeds (turns too sharply) or falls below the limits
if(fabsf(angleDiff) < angleDiffMinAllowed || fabsf(angleDiff) > angleDiffMaxAllowed) {
if(++ringFaultCount >= ringFaultCountMax) {
[self resetTracking];
return;
}
}
ringFaultCount = 0;
// Add plane position to ring polygon
[ringPoints addObject:[NSValue valueWithCGPoint: ccp(targetPlane.position.x, targetPlane.position.y)]];
// Add angleDiff to angleTotal
angleTotal += angleDiff;
// Completed loop?
if(fabsf(angleTotal) >= M_PI * 2.f) {
[self resetTracking];
if(isRingJoined){
CCLOG(@"%@ RING COMPLETE", NSStringFromSelector(_cmd));
}
return;
}
I also had this problem; I solved it by constructing a straight line in the coordinate system:
y = mx + q ± tolerance
Let me explain:
The line is the tangent of the curve at the point where you check for a collision; it is the line the "aircraft" was following at that point.
The tolerance shifts the line a little bit up and a little bit down,
so you get two parallel lines which can be treated as boundaries.
You also need a tolerance on the x-axis.
m is the slope of the line: tan(angle), where angle is the angle the line makes with the x-axis.
Once all of that is set up, you check:
if (y_point < m*x + q + tolerance && y_point > m*x + q - tolerance
    && x_point > x - tolerance && x_point < x + tolerance)
{
    // some code
}
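Here is a minimal C++ sketch of that band test, assuming hypothetical names for the tangent point (x, y), its heading angle, and the point (px, py) being checked:

#include <cmath>

// Returns true when (px, py) lies inside the tolerance band around the
// tangent line through (x, y) whose heading is 'angle' (radians from the x-axis).
// All names here are illustrative, not taken from the original snippet.
bool insideTangentBand(double px, double py,
                       double x, double y,
                       double angle, double tolerance)
{
    double m = std::tan(angle);   // slope of the tangent line
    double q = y - m * x;         // intercept so the line passes through (x, y)
    return py < m * px + q + tolerance &&
           py > m * px + q - tolerance &&
           px > x - tolerance &&
           px < x + tolerance;
}

Note that the slope form breaks down for near-vertical headings (tan(angle) grows without bound), so a point-to-segment distance test may be more robust in practice.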
Intro
I've created a spaceship sprite in my Unity project, and I want it to rotate towards the cursor via angular velocity, because I'd like to make my game heavily physics based.
Problem
Now my problem with rotating the sprite by angular velocity is the following:
At the -180°/180° boundary my ship spins all the way around, because my mouse's angle is already 180° while my ship's rotation is still -180°, or the other way around.
I tried
I tried to solve it mathematically but wasn't too successful: I could make it spin the right way, just much slower or faster, and I could fix the 180°/-180° point, but that only created two new problem points instead.
I looked for different solutions, but couldn't find a more fitting one.
Code
So I have this code for the rotation:
// Use this for initialization
void Start () {
rb = gameObject.GetComponent<Rigidbody2D>();
}
// Update is called once per frame
void Update () {
//getting mouse position in world units
mousePos = Camera.main.ScreenToWorldPoint(Input.mousePosition);
//getting the angle of the ship -> cursor vector
angle = Mathf.Atan2(mousePos.y - transform.position.y, mousePos.x - transform.position.x) * Mathf.Rad2Deg;
//getting the angle between the ship -> cursor and the rigidbody.rotation vector
diffAngle = angle - (rb.rotation + 90);
//Increasing angular velocity scaling with the diffAngle
rb.angularVelocity = diffAngle * Time.deltaTime * PlayerShipStats.Instance.speed * 100f;
}
Thank you in advance for your contributions.
Solution for Problem 1
Inserting this code made it work, but not for long:
if(diffAngle > 180) {
diffAngle -= 360;
} else if (diffAngle < -180) {
diffAngle += 360;
}
Problem 2 and Solution for Problem 2
The new problem is:
rigidbody.rotation can exceed its boundaries; it can be rotated by more than 360 degrees.
This code patched that bug:
if(rb.rotation + 90 >= 180) {
rb.rotation = -270;
} else if (rb.rotation + 90 <= -180) {
rb.rotation = 90;
}
The perfect code
void AimAtTarget(Vector2 target, float aimSpeed) {
//getting the angle of the this -> target vector
float targetAngle = Mathf.Atan2(target.y - transform.position.y, target.x - transform.position.x) * Mathf.Rad2Deg;
if (rb.rotation + 90 >= 180) {
rb.rotation = -270;
} else if (rb.rotation + 90 <= -180) {
rb.rotation = 90;
}
//getting the angle between the this -> target and the rigidbody.rotation vector
float diffAngle = targetAngle - (rb.rotation - 90);
if (diffAngle > 180) {
diffAngle -= 360;
} else if (diffAngle < -180) {
diffAngle += 360;
}
//Increasing angular velocity scaling with the diffAngle
rb.angularVelocity = diffAngle * Time.deltaTime * aimSpeed * 100;
}
There are two problems I see here:
Problem 1
angle is always going to be between -180 and 180, while rb.rotation is between 0 and 360. So you are comparing angles using two different notations. The first step is to get both angles returning -180 to 180 or 0 to 360. I chose to do the following which puts both angles between -180 and 180:
//getting the angle of the ship -> cursor vector
float targetAngle = Mathf.Atan2(
mousePos.y - transform.position.y,
mousePos.x - transform.position.x) * Mathf.Rad2Deg;
//get the current angle of the ship
float sourceAngle = Mathf.Atan2(
this.transform.up.y,
this.transform.up.x) * Mathf.Rad2Deg;
Problem 2
If you fix problem 1 and tried your app you would notice that the ship sometimes rotates the wrong way, although it will eventually get to its target. The problem is that diffAngle can sometimes give a result that is greater than +180 degrees (or less than -180). When this happens we actually want the ship to rotate the other direction. That code looks like this:
//getting the angle between the ship -> cursor and the rigidbody.rotation vector
float diffAngle = targetAngle - sourceAngle;
//use the smaller of the two angles to ensure we always turn the correct way
if (Mathf.Abs(diffAngle) > 180f)
{
diffAngle = sourceAngle - targetAngle;
}
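For reference, a common way to express the same idea is to wrap the raw difference into the shortest signed angle; a minimal C++ sketch of that technique (not the exact code used above):

#include <cmath>

// Shortest signed difference from 'source' to 'target', both in degrees.
// The result lies in (-180, 180]; its sign gives the turn direction.
double shortestAngleDiff(double source, double target)
{
    double diff = std::fmod(target - source, 360.0);
    if (diff > 180.0)
        diff -= 360.0;
    else if (diff <= -180.0)
        diff += 360.0;
    return diff;
}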
I made a simple Unity project to verify this works. I was able to rotate my ship in either direction smoothly.
One thing you may have to handle, if you don't already, is appropriately stopping the rotation of the ship when it is facing the cursor. In my test I noticed that the ship would jitter slightly when it reached its target, because it would (very) slightly overshoot the cursor's angle in one direction and then the other. The larger the value of PlayerShipStats.Instance.speed, the more pronounced this effect will likely be.
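If that jitter becomes a problem, one option (a sketch, with a hypothetical threshold that is not part of the answer above) is a small deadzone that zeroes the angular velocity once the remaining angle is negligible:

#include <cmath>

// Proportional turn rate with a deadzone: below 'deadzoneDegrees' of error,
// stop turning instead of oscillating around the target angle.
double aimVelocity(double diffAngle, double aimSpeed, double deltaTime,
                   double deadzoneDegrees = 0.5)
{
    if (std::fabs(diffAngle) < deadzoneDegrees)
        return 0.0;
    return diffAngle * deltaTime * aimSpeed * 100.0;
}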
I have convex polygons, and I want to extend them by projecting along a vector like so:
(Original polygon and vector on left, desired result on right.)
My polygons are stored as a series of points with counter-clockwise winding. What I want to find are the "starting" and "stopping" points that I need to project from, as in the circled vertices below.
(The green arrows are to indicate the polygon's winding, giving the "direction" of each edge.)
My original plan was to determine which points to use by projecting a ray with the vector's direction from each point, and finding the first and last points whose ray doesn't intersect an edge. However, that seems expensive.
Is there a way I can use the edge directions vs the vector direction, or a similar trick, to determine which points to extend from?
Look at points where the direction of the vector falls between the directions of the edges.
In other words, take three vectors:
of the edge leading out of the vertex
of the translation vector
opposite to the edge leading to the vertex
If they are in this order when going CCW, i.e. if the second vector is between the first and the third, this is an "inside" point.
In order to determine whether a vector lies between two other vectors, use cross product as described e.g. here.
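A minimal C++ sketch of that test, assuming a y-up coordinate system and counter-clockwise winding (the helper names are illustrative):

struct Vec2 { double x, y; };

// z-component of the 2D cross product.
double cross(Vec2 a, Vec2 b) { return a.x * b.y - a.y * b.x; }

// True if 'v' lies strictly between 'from' and 'to' when sweeping CCW from 'from' to 'to'.
bool isBetweenCCW(Vec2 v, Vec2 from, Vec2 to)
{
    if (cross(from, to) >= 0.0)                            // sweep of at most 180 degrees
        return cross(from, v) > 0.0 && cross(v, to) > 0.0;
    return cross(from, v) > 0.0 || cross(v, to) > 0.0;     // reflex sweep
}

// A vertex with incoming edge 'in', outgoing edge 'out' and translation 'delta'
// is an "inside" point when the CCW order is: out, delta, -in.
bool isInsideVertex(Vec2 in, Vec2 out, Vec2 delta)
{
    Vec2 negIn{ -in.x, -in.y };
    return isBetweenCCW(delta, out, negIn);
}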
Yes you can. You want to project along (x, y), so the normal is (y, -x). Now rotate by that (via atan2, or use the vector directly if you understand rotation matrices). The points to project from are now the minimum and maximum x. You can also speed up the projection by always doing it along an axis and then rotating back.
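In the same spirit, you can skip the explicit rotation: project every vertex onto the normal of the delta vector and take the minimum and maximum. A sketch under that assumption (names are illustrative, and the vertex list is assumed non-empty):

#include <cstddef>
#include <utility>
#include <vector>

struct Vec2 { double x, y; };

// Indices of the two vertices that bound the extension, found by projecting
// each vertex onto the normal (y, -x) of the delta vector.
std::pair<std::size_t, std::size_t> extremeAlongNormal(const std::vector<Vec2>& verts, Vec2 delta)
{
    Vec2 n{ delta.y, -delta.x };
    std::size_t minIdx = 0, maxIdx = 0;
    for (std::size_t i = 1; i < verts.size(); ++i) {
        double d = verts[i].x * n.x + verts[i].y * n.y;
        if (d < verts[minIdx].x * n.x + verts[minIdx].y * n.y) minIdx = i;
        if (d > verts[maxIdx].x * n.x + verts[maxIdx].y * n.y) maxIdx = i;
    }
    return { minIdx, maxIdx };
}

Ties (which happen when an edge of the polygon is parallel to the delta vector) correspond to the perpendicular case handled in the accepted solution below.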
n.m. answered the question as I asked and pictured it, but upon programming I soon noticed that there was a common case where all vertices would be "outside" vertices (this can be easily seen on triangles, and can occur for other polygons too).
The text explanation.
The solution I used was to look at the normal vectors of the edges leading into and exiting each vertex. The vertices we want to extend are vertices that have at least one edge normal with a minimum angle of less than 90 degrees to the delta vector we are extending by.
The outward-facing edge normals on a counterclockwise-wound polygon can be found by:
normal = (currentVertex.y - nextVertex.y, nextVertex.x - currentVertex.x)
Note that since we don't care about the exact angle, we don't need to normalize (make a unit vector of) the normal, which saves a square root.
To compare it to the delta vector, we use the dot product:
dot = edgeNormal.dot(deltaVector)
If the result is greater than zero, the minimum angle is acute (less than 90). If the result is exactly zero, the vectors are perpendicular. If the result is less than zero, the minimum angle is obtuse. It is worth noting when the vectors are perpendicular, since it lets us avoid adding extra vertices to the extended polygon.
If you want to visualize how the angle works with the dot product, like I did, just look at a graph of arc cosine (normally you get the angle via acos(dot)).
Now we can find the vertices that have one acute and one not-acute minimum angle between their edge normals and the delta vector. Everything on the "acute side" of these vertices has the delta vector added to it, and everything on the "obtuse side" stays the same. The two border vertices themselves are duplicated, having one copy extended and one staying the same, unless the "obtuse side" is exactly perpendicular to the delta vector (in this case we only need to extend the vertex, since otherwise we would have two vertices on the same line).
Here is the C++ code for this solution.
It may look a little long, but it is actually quite straightforward and has many comments so it hopefully shouldn't be hard to follow.
It is part of my Polygon class, which has a std::vector of counterclockwise-wound vertices. units::Coordinate are floats, and units::Coordinate2D is a vector class that I feel should be self-explanatory.
// Compute the normal of an edge of a polygon with counterclockwise winding, without normalizing it to a unit vector.
inline units::Coordinate2D _get_non_normalized_normal(units::Coordinate2D first, units::Coordinate2D second) {
return units::Coordinate2D(first.y - second.y, second.x - first.x);
}
enum AngleResult {
ACUTE,
PERPENDICULAR,
OBTUSE
};
// Avoid accumulative floating point errors.
// Choosing a good epsilon is extra important, since we don't normalize our vectors (so it is scale dependent).
const units::Coordinate eps = 0.001;
// Check what kind of angle the minimum angle between two vectors is.
inline AngleResult _check_min_angle(units::Coordinate2D vec1, units::Coordinate2D vec2) {
const units::Coordinate dot = vec1.dot(vec2);
if (std::abs(dot) <= eps)
return PERPENDICULAR;
if ((dot + eps) > 0)
return ACUTE;
return OBTUSE;
}
Polygon Polygon::extend(units::Coordinate2D delta) const {
if (delta.isZero()) { // Isn't being moved. Just return the current polygon.
return Polygon(*this);
}
const std::size_t numVerts = vertices_.size();
if (numVerts < 3) {
std::cerr << "Error: Cannot extend polygon (polygon invalid; must have at least three vertices).\n";
return Polygon();
}
// We are interested in extending from vertices that have at least one edge normal with a minimum angle acute to the delta.
// With a convex polygon, there will form a single contiguous range of such vertices.
// The first and last vertex in that range may need to be duplicated, and then the vertices within the range
// are projected along the delta to form the new polygon.
// The first and last vertices are defined by the vertices that have only one acute edge normal.
// Whether the minimum angle of the normal of the edge made from the last and first vertices is acute with delta.
const AngleResult firstEdge = _check_min_angle(_get_non_normalized_normal(vertices_[numVerts-1], vertices_[0]), delta);
const bool isFirstEdgeAcute = firstEdge == ACUTE;
AngleResult prevEdge = firstEdge;
AngleResult currEdge;
bool found = false;
std::size_t vertexInRegion;
for (std::size_t i = 0; i < numVerts - 1; ++i) {
currEdge = _check_min_angle(_get_non_normalized_normal(vertices_[i], vertices_[i+1]), delta);
if (isFirstEdgeAcute != (currEdge == ACUTE)) {
// Either crossed from inside to outside the region, or vice versa.
// (One side of the vertex has an edge normal that is acute, the other side obtuse.)
found = true;
vertexInRegion = i;
break;
}
prevEdge = currEdge;
}
if (!found) {
// A valid polygon has two points that define where the region starts and ends.
// If we didn't find one in the loop, the polygon is invalid.
std::cerr << "Error: Polygon can not be extended (invalid polygon).\n";
return Polygon();
}
found = false;
std::size_t first, last;
// If an edge being extended is perpendicular to the delta, there is no need to duplicate that vertex.
bool shouldDuplicateFirst, shouldDuplicateLast;
// We found either the first or last vertex for the region.
if (isFirstEdgeAcute) {
// It is the last vertex in the region.
last = vertexInRegion;
shouldDuplicateLast = currEdge != PERPENDICULAR; // currEdge is either perpendicular or obtuse.
// Loop backwards from the end to find the first vertex.
for (std::size_t i = numVerts - 1; i > 0; --i) {
currEdge = _check_min_angle(_get_non_normalized_normal(vertices_[i-1], vertices_[i]), delta);
if (currEdge != ACUTE) {
first = i;
shouldDuplicateFirst = currEdge != PERPENDICULAR;
found = true;
break;
}
}
if (!found) {
std::cerr << "Error: Polygon can not be extended (invalid polygon).\n";
return Polygon();
}
} else {
// It is the first vertex in the region.
first = vertexInRegion;
shouldDuplicateFirst = prevEdge != PERPENDICULAR; // prevEdge is either perpendicular or obtuse.
// Loop forwards from the first vertex to find where it ends.
for (std::size_t i = vertexInRegion + 1; i < numVerts - 1; ++i) {
currEdge = _check_min_angle(_get_non_normalized_normal(vertices_[i], vertices_[i+1]), delta);
if (currEdge != ACUTE) {
last = i;
shouldDuplicateLast = currEdge != PERPENDICULAR;
found = true;
break;
}
}
if (!found) {
// The edge normal between the last and first vertex is the only non-acute edge normal.
last = numVerts - 1;
shouldDuplicateLast = firstEdge != PERPENDICULAR;
}
}
// Create the new polygon.
std::vector<units::Coordinate2D> newVertices;
newVertices.reserve(numVerts + (shouldDuplicateFirst ? 1 : 0) + (shouldDuplicateLast ? 1 : 0) );
for (std::size_t i = 0; i < numVerts; ++i) {
// Extend vertices in the region first-to-last inclusive. Duplicate first/last vertices if required.
if (i == first && shouldDuplicateFirst) {
newVertices.push_back(vertices_[i]);
newVertices.push_back(vertices_[i] + delta);
} else if (i == last && shouldDuplicateLast) {
newVertices.push_back(vertices_[i] + delta);
newVertices.push_back(vertices_[i]);
} else {
newVertices.push_back( isFirstEdgeAcute ? // Determine which range to use.
( (i <= last || i >= first) ? vertices_[i] + delta : vertices_[i] ) : // Range overlaps start/end of the array.
( (i <= last && i >= first) ? vertices_[i] + delta : vertices_[i] )); // Range is somewhere in the middle of the array.
}
}
return Polygon(newVertices);
}
So far I tested this code with triangles, rectangles, approximated circles, and arbitrary convex polygons made by extending the approximated circles sequentially by many different delta vectors.
Please note that this solution is still only valid for convex polygons.
I'm placing Object3Ds in a scene with coordinates from an older Away3D project. The issue I'm having is when objects have a negative position.x (i.e. behind the camera), the rotation is opposite.
I'm only using X and Z for tilt and spin.
Is there a way to calculate the opposite/flip side angles so it flips?
i.e. if x = 1.69, what should the direct opposite be?
Here is my current attempt:
photo.rotation.x = data.rotationX != null
? photo.position.x < 0
? Math.PI - data.rotationX
: data.rotationX
: 0;
photo.rotation.z = photo.position.x < 0
? Math.PI
: 0;
photo.rotation.y = data.rotationY != null
? photo.position.x < 0
? Math.PI - data.rotationY
: data.rotationY
: 0;
EDIT: Doing some further research on this, is it possible the Three.js way of calculating angles is going to be different from the Away3d one? i.e. a similar problem here 3D rotation with Axis & Angle
It's very strange. Some of the objects are good and the same as they are in Away3d, and some are on strange angles that are not anywhere near the same.
How could I get a random CGPoint that is outside the screen boundaries (frame)?
Also, given that point, how could I find the point symmetrical to it about the middle of the screen? E.g. say I have the point (width+1, height+1); the symmetrical point is (-1, -1). If I have (-1, height+1), the symmetrical point would be (width+1, -1).
Hope this is clear, and thanks!
If I understand your question correctly, you can use the following method:
- (CGPoint) randomPointIn:(CGRect)inrect outsideOf:(CGRect)outrect
{
CGPoint p;
do {
p.x = inrect.origin.x + inrect.size.width * (float)arc4random()/(float)UINT32_MAX;
p.y = inrect.origin.y + inrect.size.height * (float)arc4random()/(float)UINT32_MAX;
} while (CGRectContainsPoint(outrect, p));
return p;
}
It returns a random point that is inside inrect, but outside of outrect.
(I have assumed that inrect is "considerably larger" than outrect,
otherwise many loop iterations might be necessary to find a valid point.)
In your case, you would use outrect = CGRectMake(0, 0, width, height),
and inrect would specify the allowed domain.
And the point symmetrical to (x, y) with respect to the middle of the screen
with size (width, height) is (width - x, height - y).
UPDATE: As I just found here: http://openradar.appspot.com/7684419,
CGRectContainsPoint will return false if you provide it a point that is on the boundary of the CGRect. That means that the above method returns a point that is outside of
or on the boundary of the given rectangle outrect. If that is not desired,
additional checks can be added.
I believe this should work.
//To get a random point
- (CGPoint)randomPointOutside:(CGRect)rect
{
// arc4random()%(int)rect.size.width
// This gets a random number within the width of the rectangle
//
// (arc4random()%2) ? rect.size.width : 0)
// This has a 50:50 to put the point in the q1 quadrant relative to the top right point of the rect
//
// q4 q1
// _____ +
// | |
// | q3 | q2
// |_____|
//
BOOL pushX = arc4random() % 2;
BOOL pushY = arc4random() % 2;
if (!pushX && !pushY) pushX = YES; // ensure at least one coordinate lands outside the rect
float x = arc4random()%(int)rect.size.width + (pushX ? rect.size.width : 0);
float y = arc4random()%(int)rect.size.height + (pushY ? rect.size.height : 0);
return CGPointMake(x, y);
}
//To get the symmetrical point
- (CGPoint)symmetricalPoint:(CGPoint)p around:(CGRect)rect
{
return CGPointMake((p.x-rect.size.width) * -1, (p.y-rect.size.height) * -1);
}
Mornin' SO!
I'm just trying to hone my math-fu, and I have some questions regarding Cocos2D in particular. Since Cocos2D wants to 'simplify' things, all sprites have a rotation property, ranging from 0-360 (359?) CW. This forces you to do some rather (for me) mind-humping conversions when dealing with functions like atan.
So f.ex. this method:
- (void)rotateTowardsPoint:(CGPoint)point
{
// vector from me to the point
CGPoint v = ccpSub(self.position, point);
// ccpToAngle is just a cute wrapper for atan2f
// the macro is self explanatory and the - is to flip the direction I guess
float angle = -CC_RADIANS_TO_DEGREES(ccpToAngle(v));
// just to get it all in the range of 0-360
if(angle < 0.f)
angle += 360.0f;
// but since '0' means east in Cocos..
angle += 180.0f;
// get us in the range of 0-360 again
if(angle > 360.0f)
angle -= 360.0f;
self.rotation = angle;
}
works as intended. But to me it looks kind of brute forced. Is there a cleaner way to achieve the same effect?
It is enough to do
float angle = -CC_RADIANS_TO_DEGREES(ccpToAngle(v));
self.rotation = angle + 180.0f;
to get an equivalent transformation. Two notes on the original code:
// vector from me to the point
CGPoint v = ccpSub(self.position, point);
Actually, that's the vector from the point to you.
// just to get it all in the range of 0-360
You don't need to do that; adding 180 afterwards already puts the result in the 0-360 range.