Unity raycast going random directions - vector

Ray r = new Ray(this.transform.position, this.transform.eulerAngles);
RaycastHit hit;
if(Physics.Raycast(r, out hit, 3000, 256 /*layer 8*/ )){
That bit of code won't give me a raycast straight out the front of the object, and I've searched for a solution over multiple hours, to no avail. I don't know why it fails; I figure it's probably a simple oversight.

The constructor for Ray takes an origin and a direction. transform.eulerAngles returns a vector of three rotation angles around the x, y, and z axes. "Direction" might sound similar to angles, but it isn't: the angles describe a rotation, not a direction. The important distinction is that a direction vector "points" a certain way, while a rotation describes how something is oriented. You could build a direction vector from the rotation information, but fortunately Unity can do this for you.
The easiest way to fix this is to use Unity's built-in way to get an object's forward direction vector (as seen in the Ray doc):
// Create a ray from the transform position along the transform's z-axis
Ray ray = new Ray(transform.position, transform.forward);
transform.forward gives you the forward direction vector of transform, meaning that the ray will be shot in the direction the object's facing.
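To see why the Euler angles themselves can't serve as a direction, here is a minimal Python sketch (not Unity code; plain tuples stand in for Vector3) that builds the forward direction from a pitch and yaw, assuming Unity's convention that an unrotated object faces +Z:

```python
import math

def forward_from_euler(pitch_deg, yaw_deg):
    """Forward direction of an object with the given Euler rotation
    (Unity convention: unrotated forward is +Z, yaw about Y, pitch about X)."""
    pitch = math.radians(pitch_deg)
    yaw = math.radians(yaw_deg)
    return (math.cos(pitch) * math.sin(yaw),
            -math.sin(pitch),
            math.cos(pitch) * math.cos(yaw))

# No rotation: the forward direction is +Z, but the euler angles (0,0,0)
# are the zero vector -- useless as a ray direction.
print(forward_from_euler(0, 0))

# Yaw 90 degrees: the object faces +X, while feeding the euler angles
# (0,90,0) in as a "direction" would shoot the ray along +Y instead.
print(forward_from_euler(0, 90))
```

This is exactly the conversion transform.forward does for you, so in Unity there's no reason to write it by hand.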

Related

return local 2D coords of a 3D point on a 2D plane in space?

As the title suggests, I don't know where to start on this problem:
I have a 2D plane defined by its origin point in global coordinates (x, y, z) and its axis endpoints, but I don't know how to compute the local coordinates of a point on the plane.
I found this solution, but it gave false results:
// given information
Vector3 origin;
Vector3 planeXDir;
Vector3 planeYDir;
Vector3 pointOnPlane;

Vector3 v = pointOnPlane - origin;
Vector2 positionInPlane = new Vector2(Vector3.Dot(planeXDir, v), Vector3.Dot(planeYDir, v));
I don't know where I went wrong; maybe I have a misconception about planeXDir and planeYDir? I'd be happy if someone could explain, or give me an easier solution to implement.
That code is correct, so the misconception or mistake must be somewhere else.
Check that planeXDir and planeYDir are orthogonal and unit length (or, if they are not unit length, that you really do want the resulting scale difference between unit lengths in 3D and in the plane).
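As a sanity check, the same projection can be sketched in a few lines of Python (plain tuples standing in for Vector3/Vector2 — an assumption for illustration):

```python
def dot(a, b):
    """3D dot product, equivalent to Vector3.Dot."""
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def local_coords(origin, plane_x_dir, plane_y_dir, point_on_plane):
    """Project a world-space point into the plane's 2D coordinates.
    plane_x_dir and plane_y_dir must be orthogonal unit vectors."""
    v = tuple(p - o for p, o in zip(point_on_plane, origin))
    return (dot(plane_x_dir, v), dot(plane_y_dir, v))

# Plane through (1,0,0) whose local x axis is world Y and local y axis is world Z.
print(local_coords((1, 0, 0), (0, 1, 0), (0, 0, 1), (1, 2, 3)))  # (2, 3)
```

If the axes you feed in aren't orthonormal, the dot products mix the two axes together and the result will look "false" even though the formula is right — which is the most likely culprit here.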

Vector3 Lerp Rotation

I want to make a Vector3 rotation in Unity 3D. However, the rotation doesn't come out correctly, because I rotate around two axes, x and y, at the same time. Do I have to separate the two rotations? What can I do? Thank you for helping me.
Vector3 start = new Vector3(0, 0, 0);
Vector3 target = new Vector3(0, 0, 0);
transform.localEulerAngles = Vector3.Lerp(start, target, time);
Using Vector3 rotation through Vector3.Lerp is prone to gimbal-lock problems, which is what you might be experiencing.
The best way to lerp through a rotation is to use the transform.Rotate() function.
Alternatively, you could use Quaternion.Lerp() and create the start and end Quaternions using Quaternion.Euler(). This will give you the most reliable results, since quaternion rotation is more robust than Vector3 Euler rotation and avoids the problems you would have with Vector3 rotations.
Edit: Please remember to post complete snippets of your code. Lerp stands for linear interpolation, and it requires a t that goes from 0 to 1, where 0 is the starting point and 1 is the end point. Many people use this function incorrectly and run into problems. Since I can't see the rest of your code, I've assumed you did this correctly, but I can't tell without seeing the actual code :)
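The t semantics from the edit above can be sketched in a few lines of Python (a generic scalar lerp standing in for Unity's Vector3.Lerp/Quaternion.Lerp, which clamp t the same way):

```python
def lerp(a, b, t):
    """Linear interpolation: t = 0 returns a, t = 1 returns b.
    t is clamped to [0, 1], matching Unity's Lerp behavior."""
    t = max(0.0, min(1.0, t))
    return a + (b - a) * t

print(lerp(0.0, 90.0, 0.0))   # 0.0   (start)
print(lerp(0.0, 90.0, 0.5))   # 45.0  (halfway)
print(lerp(0.0, 90.0, 1.0))   # 90.0  (end)

# A common mistake is passing an ever-growing time value as t. With
# clamping, the result simply sticks at the end point instead of animating.
print(lerp(0.0, 90.0, 7.0))   # 90.0
```

If the rotation "snaps" to the target instead of animating, an unbounded t (e.g. passing Time.time directly) is the usual cause.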

Trigonometry: 3D rotation around center point

Yeah, yeah, I checked out the suggested questions/answers that were given to me but most involved quaternions, or had symbols in them that I don't even HAVE on my keyboard.
I failed at high school trig, and while I understand the basic concepts of sin and cos in 2D space, I'm at a loss when throwing in a third plane to deal with.
Basically, I have these things: a center point, a distance, and angles for each of the three axes. Given that information, I want to calculate the point that is distance away from the center point, at the specified angles.
I'm not sure I'm explaining this correctly. My intent is to get what amounts to electrons orbiting around a nucleus, if anyone happens to know how to do that. I am working with Java, JRE 6, if there are any utility classes in there that can help.
I don't want just an answer, but also the how and why of the answer. If I'm going to learn something, I want to learn ABOUT it as well. I am not afraid to take a lesson in trigonometry, or in how quaternions work, etc. I'm not looking for an entire course in an answer, but at least some basic understanding would be cool.
If you did this in 2D, you would have a point on a plane with certain x and y coordinates. The distance from the origin would be sqrt(x^2 + y^2), and the angle atan(y/x) (better: atan2(y, x), which handles x = 0 and gets the quadrant right).
If you were given angle phi and distance r you would compute x= r*cos(phi); y=r*sin(phi);
To do this in three dimensions you need two angles - angle in XY plane and angle relative to Z axis. Calling these phi and theta, you compute coordinates as
X = r*cos(phi)*sin(theta);
Y = r*sin(phi)*sin(theta);
Z = r*cos(theta);
When I have a chance I will make a sketch to show how that works.
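The formulas above can be checked with a short Python sketch (Python only for brevity; Java's Math.sin/Math.cos behave the same). The center point is added at the end, so the result orbits an arbitrary "nucleus" position:

```python
import math

def orbit_point(center, r, phi, theta):
    """Point at distance r from center, at angle phi in the XY plane
    and angle theta measured from the Z axis (angles in radians)."""
    x = r * math.cos(phi) * math.sin(theta)
    y = r * math.sin(phi) * math.sin(theta)
    z = r * math.cos(theta)
    return (center[0] + x, center[1] + y, center[2] + z)

# theta = 90 degrees puts the point in the XY plane; sweeping phi over
# time then traces a circular "electron orbit" around the center.
print(orbit_point((0, 0, 0), 2.0, 0.0, math.pi / 2))          # ~(2, 0, 0)
print(orbit_point((0, 0, 0), 2.0, math.pi / 2, math.pi / 2))  # ~(0, 2, 0)
```

Animating phi (and optionally theta) per frame gives the orbiting-electron effect: each electron keeps its own r, phi, theta and recomputes its position from the nucleus every step.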

Camera rotation with a quaternion

I am having a problem with the maths of camera rotation; or rather, I lack knowledge on this subject and can't find anything about it on the internet (read: I most likely don't know the correct search keywords).
Anyway, this is what I am attempting to do (pseudo code):
RotateCamera(angle, axis) {
    Quaternion rotation = cam.getRotation();
    Quaternion rot = new Quaternion();
    rot.fromAngleNormalAxis(angle, axis);
    rotation.multLocal(rot);
    cam.setRotation(rotation);
}

update(float value) { // just to show what input RotateCamera gets for each direction
    RotateCamera(value, Vector3f(0,1,0))  // left
    RotateCamera(-value, Vector3f(0,1,0)) // right
    RotateCamera(value, Vector3f(1,0,0))  // up
    RotateCamera(-value, Vector3f(1,0,0)) // down
}
Now this works quite well but sometimes the cam will roll instead of only yaw/pitch. What is the correct way of doing this?
With just the bit of code given, it's hard to say for sure. But it looks like you've hard-coded the axes of rotation into your update method. The thing about rotations (whether represented by quaternions or matrices) is that their multiplication isn't "commutative", meaning that doing the same two rotations in opposite orders does not give the same result.
It looks like you're assuming the camera is facing in the (0,0,1) direction, let's call it the z axis, with the y axis (0,1,0) coming out of the top of your head. As long as this assumption holds, your axes of rotation for looking up, down, left and right will be (1,0,0), (1,0,0), (0,1,0) and (0,1,0), as they seem to be in your code snippet. But say you've just rotated 90 degrees to the left. This sends the camera's view from the (0,0,1) direction to the (1,0,0) direction. Now say you make an "up" rotation, which was coded to be around the (1,0,0) axis. That is a rotation around the very axis you are looking along, and the effect will be a roll.
Does this address the issue? If so, you should compute your axes of rotation w.r.t. the current direction the camera is facing.
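The roll effect is easy to demonstrate numerically. A minimal Python sketch using Rodrigues' rotation formula (plain tuples in place of the engine's Vector3f, an assumption for illustration): after a 90-degree yaw the camera looks along (1,0,0), and a further "pitch" around the world (1,0,0) axis leaves the view direction untouched while spinning the camera's up vector — a pure roll.

```python
import math

def rotate(v, axis, angle):
    """Rotate vector v around a unit-length axis by angle (radians),
    using Rodrigues' rotation formula."""
    c, s = math.cos(angle), math.sin(angle)
    d = sum(a * b for a, b in zip(axis, v))
    cross = (axis[1] * v[2] - axis[2] * v[1],
             axis[2] * v[0] - axis[0] * v[2],
             axis[0] * v[1] - axis[1] * v[0])
    return tuple(v[i] * c + cross[i] * s + axis[i] * d * (1 - c)
                 for i in range(3))

half_pi = math.pi / 2
forward = (1, 0, 0)  # view direction after a 90-degree yaw to the left
up = (0, 1, 0)

# "Pitch up" hard-coded around the world x axis -- the axis we now look along:
print(rotate(forward, (1, 0, 0), half_pi))  # forward is unchanged
print(rotate(up, (1, 0, 0), half_pi))       # up tips sideways: a roll
```

Using the camera's current local right axis instead of the fixed world (1,0,0) would pitch the view as intended.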

Quaternion rotation matrix unexpectedly has the opposite sense

I have a problem understanding quaternions.
In order to have my world objects rotate the correct way, I need to invert their quaternion rotation while refreshing the object's world matrix.
I create the object rotation with this code:
Rotation = Quaternion.RotationMatrix(
Matrix.LookAtRH(Position,
Position + new Vector3(_moveDirection.X, 0, _moveDirection.Y),
Vector3.Up)
);
and refresh the object World matrix like this:
Object.World = Matrix.RotationQuaternion(Rotation)
* Matrix.Translation(Position);
This is not working; it makes the object rotate in the opposite way compared to what it should!
This is the way that makes my object rotate correctly:
Object.World = Matrix.RotationQuaternion(Quaternion.Invert(Rotation))
* Matrix.Translation(Position);
Why do I have to invert the object rotation?
This isn't a quaternion problem so much as it is a usage and/or documentation issue with the DirectX call you're using. The transformation the call gives is the one that happens when you move the camera. If you're keeping the camera fixed and moving the world, you're swapping what's moving and what's fixed. These coordinate transformations are inverses of each other, which is why taking the inverse works for you.
You don't need to take an explicit inverse, though. Just swap the order of the first two arguments.
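The "view versus world" inverse relationship is easy to check numerically. A minimal Python sketch (3x3 lists standing in for the DirectX matrices, an assumption for illustration): a look-at style view rotation is the inverse of the object's world rotation, and for a pure rotation matrix the inverse is just the transpose.

```python
import math

def yaw_matrix(angle):
    """3x3 rotation about the Y axis (a stand-in for a world orientation)."""
    c, s = math.cos(angle), math.sin(angle)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def transpose(m):
    return [[m[j][i] for j in range(3)] for i in range(3)]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

world = yaw_matrix(math.radians(30))
view = transpose(world)  # inverse of a pure rotation is its transpose

# Applying the world rotation and then the view rotation cancels out:
# the two are inverse coordinate transformations of each other.
print(matmul(view, world))
```

This is why building an object's orientation from a camera-style look-at call leaves you exactly one inversion away from the matrix you actually want.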
