I have the following scene and data:
Blue point is my camera.
Green point is my object.
Red vector is direction to object relative to camera.
I expected the vector to point toward my green dot, but (presumably because of the camera's rotation) it goes astray. Note that I don't know the coordinates of the green dot, so I need to turn the vector in the right direction without them.
How can I resolve this problem with the data given below? Can anyone help?
Camera (blue point) coordinates: (-1.75, 0.63, 0.66), plus its orientation quaternion.
Vector (red): the start point is the camera; the end point is (-0.11, 0.01, -2.21).
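One way to check the rotation step: a small Python sketch of rotating a vector by a quaternion, using plain tuples and no libraries (the example quaternion at the bottom is made up; substitute your camera's).

```python
import math

def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z): v' = q * v * conj(q)."""
    w, x, y, z = q
    vx, vy, vz = v
    # t = q * (0, v)  (quaternion product with a pure quaternion)
    tw = -x * vx - y * vy - z * vz
    tx = w * vx + y * vz - z * vy
    ty = w * vy + z * vx - x * vz
    tz = w * vz + x * vy - y * vx
    # result = t * conj(q); the scalar part cancels to zero
    rx = -tw * x + tx * w - ty * z + tz * y
    ry = -tw * y + ty * w - tz * x + tx * z
    rz = -tw * z + tz * w - tx * y + ty * x
    return (rx, ry, rz)

def quat_conjugate(q):
    """Inverse rotation of a unit quaternion."""
    w, x, y, z = q
    return (w, -x, -y, -z)

# Made-up example: a 90-degree rotation about Z carries (1,0,0) to
# (0,1,0), up to floating-point error.
half = math.pi / 4
print(quat_rotate((math.cos(half), 0.0, 0.0, math.sin(half)), (1.0, 0.0, 0.0)))
```

If the red vector was computed in camera-local coordinates, rotating it with the camera quaternion (`quat_rotate(camera_q, v)`) should bring it into world space; if it goes astray the other way, try `quat_conjugate(camera_q)` instead.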
I have a plane (point, normal) and a circle (point, radius).
The circle moves around and hits the plane, and its position is then reset to the touch point with the plane.
So far, so good.
But how can I modify the velocity of the circle so that it only moves tangentially to the plane? When it bumps into the plane, the part of the velocity vector responsible for pushing into the plane should be consumed, so that in the next step it doesn't collide with the plane but can move on. It "slides" on the plane.
Any ideas?
Since you give no code, I'll just state a few ideas without code.
The plane's definition of (Point, normal) is simpler if the normal vector is a unit vector (of length one). If it isn't, divide that vector by its length and it becomes a unit vector.
If the velocity of the circle is given by a 3D Cartesian velocity vector, you can find the component of that velocity along the plane's unit normal by taking the dot product of the two vectors. The result is the directed size of the velocity along the normal. You can then remove that component by multiplying the unit normal by that dot product and subtracting the result from the circle's velocity vector. What remains is the circle's velocity along the plane, which is apparently what you wanted to find.
If you answer my questions in my comment under your question, I could show you some code in my current preferred language, Python 3.
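As a minimal Python 3 sketch of the projection described above (plain tuples; assumes `normal` is already a unit vector):

```python
def slide_velocity(velocity, normal):
    """Remove the component of `velocity` along the plane's unit `normal`,
    leaving only the part tangential to the plane."""
    d = sum(v * n for v, n in zip(velocity, normal))   # directed size along the normal
    return tuple(v - d * n for v, n in zip(velocity, normal))
```

For example, a circle moving down-and-sideways against a floor with normal (0, 1, 0) keeps only its sideways motion: `slide_velocity((1, -2, 3), (0, 1, 0))` gives `(1, 0, 3)`.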
This is the first question I've ever asked on here! Apologies in advance if I've done it wrong somehow.
I have written a program which stacks up spheres in three.js.
Each sphere starts with randomly generated (within certain bounds) x and z co-ordinates, and a y co-ordinate high above the ground plane. The program casts rays from each of the sphere's vertices to see how far down it can fall before it intersects with an existing mesh.
For each sphere, I test it in 80 different random xz positions, see where it can fall the furthest, and then 'drop' it into that position.
This is intended to create bubble towers like this one:
However, I have noticed that when I make the bubble radius very small and the base dimensions of the tower large, this happens:
If I turn the number of sampled positions down from 80, this effect is less apparent. For some reason, three.js seems to think that the spheres can fall further at the corners of the base square. The origin is exactly at the center of the base square; perhaps this is relevant.
When I console log all the fall-distances I'm receiving from the raycaster, they are indeed larger the further away you get from the center of the square... but only at the 11th or 12th decimal place.
This is not so much a problem I am trying to solve (I could just round fall distances to the nearest 10th decimal place before I pick the largest one), but something I am very curious about. Does anyone know why this is happening? Has anybody come across something similar to this before?
EDIT:
I edited my code to shift everything so that the origin is no longer at the center of the base square:
So am I correct in thinking... this phenomenon is something to do with distance from the origin, rather than anything relating to the surface onto which the balls are falling?
Indeed, the pattern you are seeing occurs exactly because the corners and edges of the bottom of your tower are furthest from the origin where you are dropping the balls. You are creating a right triangle (see image below) in which the vertical leg is the line from the drop origin straight down to the point directly below on the mesh floor (at a right angle to the floor, hence the name, right triangle). The hypotenuse is always the longest side of a right triangle, and the further out your rays cast from the point just below the origin, the longer the hypotenuse will be, and the more your algorithm will favor that longer distance (no matter how fractional the difference).
Increasing the size of the tower base exaggerates this effect, since the hypotenuse measurements can grow even larger. Reducing the size of the balls also favors the pattern you are seeing: each ball takes up less space, so the distant measurements to the corners don't fill in as quickly as they would with larger balls, and more balls congregate at the edges before the rest of the space fills in.
Moving your drop origin to one side creates longer distances (hypotenuses) to the opposite sides and corners, so the balls fill in those distant locations first.
The reason you see less of an effect when you reduce the sample size from 80 to say, 20, is that there are simply fewer chances to detect these more distant locations to which the balls could fall (an odds game).
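The growing hypotenuse is easy to see numerically. A small sketch with made-up numbers (a drop height of 10 and a few horizontal offsets; your scene's values will differ):

```python
import math

# Vertical "leg": distance from the drop origin straight down to the floor
# (a made-up value -- your scene's drop height will differ).
drop_height = 10.0

# Horizontal offsets of tested drop positions from the point below the origin.
# The hypotenuse, and thus the measured distance, grows with the offset.
for offset in (0.0, 1.0, 5.0, 10.0):
    hypotenuse = math.hypot(drop_height, offset)
    print(offset, hypotenuse)
```

An algorithm that always picks the largest measured distance will therefore drift toward the corners, however small the difference per sample.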
A right triangle:
A back-of-the-napkin sketch:
I have a cannon that fires a cannonball with a smoke particle effect. I want the cannonball to start at the end of the barrel. I can do this by adding the width of the cannon to its x position and half the height to its y position. This works fine when the cannon is unrotated, but when I rotate the cannon the ball is no longer in the correct position. This is what I use to try to rotate the vector:
Vector2 rotPos = cannon.position.tmp().add(cannon.bounds.width, cannon.bounds.height/2).rotate(cannon.angle);
How can I get a rotated vector that fires the cannonball from the correct place? See the image below.
UPDATE
I also tried the following, with the same result: the ball is off to the left.
Vector2 rotPos = world.CannonBody.getWorldVector( world.CannonBody.getPosition() );
The way you've described the problem, you've solved it for only a single case. This really is just a math problem: think of the direction you want to shoot, along the barrel of the cannon, as a point on a circle.
Since you know the angle, this is easy. Draw a circle with a dot in the center. Then draw a line from the center to the right edge. Then draw another line at a 45 degree angle up from the first line. Connect the two points on the edges with a straight line. You have a triangle now.
Let's call the 45-degree line 'r', the first line 'x', and the last line 'y'.
You should have something that looks like this:
http://i.stack.imgur.com/MJNWZ.jpg
We know that sin(angleInRadians) = y/r. Doing a little algebra we can change this into r*sin(angleInRadians) = y
Boom, you have your y coordinate.
Almost the same thing: cos(angleInRadians) = x/r
So r*cos(angleInRadians) = x
There's your x coordinate.
You can get the angle of a body directly from Box2D, so that's easy. You just need to pick a value for 'r' that represents the correct radius of the circle you're using to conceptualize the barrel of the cannon at a given angle. If the cannon rotates around the center of the circle, then r is the length of your cannon.
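Putting the two formulas together, a hedged sketch of the math in Python (libgdx itself is Java, so treat this as pseudocode; `cx, cy` is assumed to be the cannon's pivot and `r` the barrel length):

```python
import math

def muzzle_position(cx, cy, r, angle_radians):
    """Point on a circle of radius r around the pivot (cx, cy):
    x = r*cos(angle), y = r*sin(angle), offset by the pivot."""
    return (cx + r * math.cos(angle_radians),
            cy + r * math.sin(angle_radians))
```

With the body angle from Box2D (already in radians), spawn the cannonball at `muzzle_position(pivot_x, pivot_y, barrel_length, body_angle)`.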
I had an issue which is similar to yours. Here's the question with an answer:
Android. How to move object in the direction it is facing (using Vector3 and Quaternion)
You need something like
translation.set(baseDirection).rot(modelInstance.transform).nor()
For fun I am making Pong in Python with Pygame. I have run into some trouble with reflections.
So the ball has an angle associated with it. Since positive y points down, this angle is measured downward. If the ball hits the top or bottom wall I can simply negate the angle and it will reflect properly, but the trouble is with the left and right walls. I cannot figure out the trigonometry for how to change the angle in that case. I am currently trying combinations of the snippet below, but with no luck.
self.angle = -(self.angle - math.pi/2)
I have attached the code. You can try it for yourself easily. Just remember to take out the "framerate" module which I have not included or used yet. I would appreciate any input. Thanks!
You'll want to look into Angle of Incidence.
Basically you'll want to find the angle theta between your incoming vector and the normal of the wall the ball is hitting. If the incoming angle is (wall normal) - theta, the resulting angle is (wall normal) + theta.
The angle can be found using the dot product between your incoming vector and the normal of the wall, then taking the inverse cosine (normalize your vectors first).
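In vector form, the reflection described above reduces to r = v - 2(v.n)n. A minimal 2D sketch, assuming `n` is a unit normal:

```python
def reflect(v, n):
    """Reflect velocity v off a surface with unit normal n: r = v - 2(v.n)n."""
    d = v[0] * n[0] + v[1] * n[1]          # component of v along the normal
    return (v[0] - 2 * d * n[0], v[1] - 2 * d * n[1])
```

For axis-aligned Pong walls the normals are simple, e.g. (1, 0) for the left wall: `reflect((-2, 3), (1, 0))` flips the horizontal component and gives `(2, 3)`.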
You should use:
math.pi - angle
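Combined with the top/bottom case from the question, the two rules fit in a tiny helper; a sketch (the wall labels are my own naming):

```python
import math

def reflect_angle(angle, wall):
    """Reflect a heading angle (radians) off an axis-aligned wall.
    'horizontal' = top/bottom walls: flip the vertical component (-angle).
    'vertical'   = left/right walls: flip the horizontal component (pi - angle)."""
    if wall == 'horizontal':
        return -angle
    return math.pi - angle
```

A ball heading down-right at pi/4 that hits the right wall comes back down-left at 3*pi/4, while its vertical motion is unchanged.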
Firstly - Z is Up in this problem.
Context: Top down 2D Game using 3D objects.
The player and all enemies are spheres that can move in any direction on a 2D plane (XY). They rotate as you would expect when they move. Their velocity is a 3D vector in world space, and this is how I influence them. They aren't allowed to rotate on the spot.
I need to find a formula to determine the direction one of these spheres should move in order to get their Z-Axis (or any axis really) pointing a specified direction in world space.
Some examples may be in order:
X
|
Z--Y
This one is simple: the sphere's local axes match the world's, so if I want the sphere's Z-axis to point along (1,0,0), I can move the sphere along (1,0,0).
The one that gives me trouble is this:
X
|
Y--Z
Now, I know that to get the Z-axis to point along (1,0,0) in world space I have to tell the sphere to move along (1,1,0), but I don't know/understand WHY that is the case.
I've been programming for ten years, but I absolutely suck at vector maths, so assume I'm an idiot when explaining things to me :)
All right, I think I see what you mean.
Take a ball-- you must have one lying around. Mark a spot on it to indicate an axis of interest. Now pick a direction in which you want the axis to point. The trick is to rotate the ball in place to bring the axis to the right direction-- we'll get to the rolling in a minute.
The obvious way is to move "directly", and if you do this a few times you'll notice that the axis around which you are rotating the ball is perpendicular to the axis you're trying to move. It's as if the spot is on the equator and you're rotating around the North-South axis. Every time you pick a new direction, that direction and your marked axis determine the new equator. Also notice (this may be tricky) that you can draw a great circle (that's a circle that goes right around the sphere and divides it into equal halves) that goes between the mark and the destination, so that they're on opposite hemispheres, like mirror images. The poles are always on that circle.
Now suppose you're not free to choose the poles like that. You have a mark, you have a desired direction, so you have the great circle, and the north pole will be somewhere on the circle, but it could be anywhere. Imagine that someone else gets to choose it. The mark will still rotate to the destination, but they won't be on the equator any more, they'll be at some other latitude.
Now put the ball on the floor and roll it -- don't worry about the mark for now. Notice that it rotates around a horizontal axis, the poles, and touches the floor along a circle, the equator (which is now vertical). The poles must be somewhere on the "waist" of the sphere, halfway up from the floor (don't call it the equator). If you pick the poles on that circle, you choose the direction of rolling.
Now look at the mark and its destination again, and draw the great circle that divides them. The poles must be on that circle. Look where that circle crosses the "waist": that's where your poles must be.
Tell me if this makes sense, and we can put in the math.
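As a hedged sketch of that math (Z up, as in the question; the sign of the result, i.e. rolling the short way versus the long way around, may need flipping for your engine's conventions): the spin axis must lie both on the waist (perpendicular to `up`) and on the bisecting great circle (perpendicular to `target - mark`), which pins down the roll direction.

```python
import math

def norm(v):
    """Scale a 3D vector to unit length."""
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def roll_direction(mark, target, up=(0.0, 0.0, 1.0)):
    """Direction to roll the ball so the marked axis `mark` turns toward
    `target` (both unit vectors in world space; `up` is the floor normal).

    The spin axis must be perpendicular to both `up` (it lies on the waist)
    and `target - mark` (it lies on the bisecting great circle), i.e. along
    up x (target - mark).  The roll direction, perpendicular to that axis
    and horizontal, works out to the horizontal projection of (target - mark).
    Degenerate when (target - mark) is parallel to `up`."""
    diff = tuple(t - m for t, m in zip(target, mark))
    d_up = sum(d * u for d, u in zip(diff, up))     # component along `up`
    return norm(tuple(d - d_up * u for d, u in zip(diff, up)))
```

For example, with the marked axis currently straight up at (0,0,1) and a target of (1,0,0), the sphere should roll along (1,0,0): the mark on top tips forward in the direction of motion.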