In my JavaFX application I have an ImageView for which I have created a RotateTransition. Everything works fine, but the rotation runs counter-clockwise and I want it to go in the opposite direction.
This is my code:
RotateTransition rt = new RotateTransition(Duration.millis(3000), myImageView);
rt.setByAngle(360);
rt.setCycleCount(1);
rt.setAutoReverse(false);
rt.play();
It seems to be undocumented, but setting a negative angle reverses the direction of the rotation:
rt.setByAngle(-360);
I have a problem with my 3D project. It is quite complicated to describe the purpose, so I will abstract it to the minimum.
I have a live video stream of the Unity program, which I bring up to fullscreen (1920 x 1200). A user clicks on the screen to send the coordinates to the Unity app.
sending coords:
// relative coord
float x = mouse_x / 1920.0f;
float y = mouse_y / 1200.0f;
The receiver is the Unity app, which tries to turn this into a 3D coordinate and find a wall or an obstacle on which to place a mark.
Attempt 1:
// 1268 x 720 receiver viewport size
Ray ray = Camera.main.ScreenPointToRay(new Vector3(Position.x * 1268.0f, Position.y * 720.0f, 0));
Attempt 2:
// no scaling by 1268 x 720 needed here - viewport coordinates are already normalized (0..1)
Vector3 far = Camera.main.ViewportToWorldPoint(new Vector3(fix.Position.x, fix.Position.y, 1));
Vector3 near = Camera.main.ViewportToWorldPoint(new Vector3(fix.Position.x, fix.Position.y, 0));
Vector3 dir = far - near;
dir.Normalize();
Ray ray = new Ray(near, dir);
RaycastHit hitInfo;
if(Physics.Raycast(ray, out hitInfo))
{
// place mark
}
Both attempts give the same result. If the coordinate is near the center, it also ends up in the center on the receiver, but the closer it gets to the edge, the farther it lands from where it should be. The picture shows what I think happens: the red circle is the current behaviour and the green one is what I expected. I'd rather have a ray that leaves the screen at 90 degrees and hits the wall than one that goes right through the camera.
I really do not know what to do. Thank you very much for your help in advance.
You're right about your drawing, this is indeed what's happening.
Here is a test I did using Debug.DrawRay.
The red ray is the output of this code.
Debug.DrawRay(Camera.main.transform.position, Camera.main.transform.forward * 100f, Color.red);
And here is the blue ray, drawn from the viewport point the way you did.
var viewportPointRay = Camera.main.ViewportPointToRay(viewportTouchPos);
Debug.DrawRay(viewportPointRay.origin, viewportPointRay.direction * 3f, Color.blue);
I was expecting a truly simple answer, but I wasn't able to find one. I did find a trick to do what you want, though.
var ray = new Ray(viewportPointRay.origin, Camera.main.transform.forward);
Debug.DrawRay(ray.origin, ray.direction, Color.green);
Result
I have a 3D game where I want an arrow to point in the direction of the mouse, based on the mouse's angle relative to that object in a 2D (screen-space) view.
Now from the camera looking down at the board from a 90 degree x-angle standpoint it works fine. The below image is when I am in a 90 Degree x-angle Camera angle facing down on my game and I have the arrow face where my cursor is:
But now when we take a step back and have the camera at a 45 degree x-angle the direction the arrow is facing is a bit off. The below image is when I have the cursor face my mouse cursor when my camera is on a 45 degree x-angle :
Now lets look at the above image but when the Camera is shifted back to 90 Degrees x-angle:
My current code is:
// Get the vectors of the 2 points, the pivot point which is the ball start and the position of the mouse.
Vector2 objectPoint = Camera.main.WorldToScreenPoint(_arrowTransform.position);
Vector2 mousePoint = (Vector2)Input.mousePosition;
float angle = Mathf.Atan2( mousePoint.y - objectPoint.y, mousePoint.x - objectPoint.x ) * 180 / Mathf.PI;
_arrowTransform.rotation = Quaternion.AngleAxis(-angle, Vector2.up) * Quaternion.Euler(90f, 0f, 0f);
What would I have to add to my Mathf.Atan2() call to compensate for the camera rotation on the x and/or y axis, so that however the user moves the camera it still gives an accurate direction?
EDIT: The solution was MotoSV's answer using a Plane. This let me get the exact point based on my mouse position, no matter what my camera angles were. The code that worked for me is below:
void Update()
{
    Plane groundPlane = new Plane(Vector3.up, new Vector3(_arrowTransform.position.x, _arrowTransform.position.y, _arrowTransform.position.z));
    Ray ray = _mainCamera.ScreenPointToRay(Input.mousePosition);
    float distance;

    if (groundPlane.Raycast(ray, out distance))
    {
        Vector3 point = ray.GetPoint(distance);
        _arrowTransform.LookAt(point);
    }
}
Although this does not answer your question directly with regards to the Mathf.Atan2 method, it is an alternative approach that may be useful.
This would be placed onto the game object that represents the arrow:
public class MouseController : MonoBehaviour
{
    private Camera _camera;

    private void Start()
    {
        _camera = GameObject.FindGameObjectWithTag("MainCamera").GetComponent<Camera>();
    }

    private void Update()
    {
        Plane groundPlane = new Plane(Vector3.up, this.transform.position);
        Ray ray = _camera.ScreenPointToRay(Input.mousePosition);
        float distance;
        Vector3 axis = Vector3.zero;

        if (groundPlane.Raycast(ray, out distance))
        {
            Vector3 point = ray.GetPoint(distance);
            axis = (point - this.transform.position).normalized;
            axis = new Vector3(axis.x, 0f, axis.z);
        }

        this.transform.rotation = Quaternion.LookRotation(axis);
    }
}
The basic idea is to:
1. Create a Plane instance centred at the game object's position.
2. Convert the mouse screen position into a Ray that heads into the world, relative to the camera's current position and rotation.
3. Cast that ray onto the Plane created in step #1.
4. If the ray intersects the plane, use the GetPoint method to find out where on the plane the ray hit.
5. Create a direction vector from the centre of the plane to the intersection point and create a LookRotation based on that vector.
You can find out more information about the Plane class on the Unity - Plane documentation page.
I am writing a volume rendering program that constantly adjusts some plane geometry so it always faces the camera. The plane geometry rotates whenever the camera rotates, so that it appears not to move relative to everything else in the scene. (I use the camera's viewing direction as the normal vector of these plane geometries.)
Currently I am manually storing a custom rotation vector ('rotations') and applying its effects as follows in the render function:
gl2.glRotated(rotations.y, 1.0, 0.0, 0.0);
gl2.glRotated(rotations.x, 0.0, 1.0, 0.0);
Later on I get the viewing direction by rotating the initial view direction (0,0,-1) around the x and y axes with the values from 'rotations'. This is done in the following manner; the final viewing direction is stored in 'view':
public Vec3f getViewingAngle() {
    // first rotate the viewing POINT
    // then find the vector from there to the center
    Vec3f view = new Vec3f(0, 0, -1);
    float newZ = 0;
    float ratio = (float) (Math.PI / 180);
    float vA = (float) (-1f * rotations.y * ratio);
    float hA = (float) (-1f * rotations.x) * ratio;
    // rotate about the x axis first
    float newY = (float) (view.y * Math.cos(vA) - view.z * Math.sin(vA));
    newZ = (float) (view.y * Math.sin(vA) + view.z * Math.cos(vA));
    view = new Vec3f(view.x, newY, newZ);
    // rotate about the y axis
    float newX = (float) (view.z * Math.sin(hA) + view.x * Math.cos(hA));
    newZ = (float) (view.z * Math.cos(hA) - view.x * Math.sin(hA));
    view = new Vec3f(newX, view.y, newZ);
    view = new Vec3f(view.x * -1f, view.y * -1f, view.z * -1f);
    // return the finalized normal viewing direction
    view = Vec3f.normalized(view);
    return view;
}
Now I am moving this program to a larger project wherein the camera rotation is handled by a 3rd party graphics library. I have no rotations vector. Is there some way I can get my view direction vector from:
GLfloat matrix[16];
glGetFloatv (GL_MODELVIEW_MATRIX, matrix);
I am looking at this for reference http://3dengine.org/Modelview_matrix but I still don't get how to come up with the view direction. Can someone explain to me if it is possible and how it works?
You'll want to look at this picture: http://db-in.com/images/local_vectors.jpg
The Direction-of-Flight (DOF) is the 3rd row.
GLfloat matrix[16];
glGetFloatv( GL_MODELVIEW_MATRIX, matrix );
float DOF[3];
DOF[0] = matrix[ 2 ]; // x
DOF[1] = matrix[ 6 ]; // y
DOF[2] = matrix[ 10 ]; // z
Reference:
http://blog.db-in.com/cameras-on-opengl-es-2-x/
Instead of trying to follow the modelview matrix to adjust your volume rasterizer's fragment impostor, you should just adjust the modelview matrix to your needs. OpenGL is not a scene graph, it's a drawing system, and you can, and should, change things however they suit you best.
Of course, if you must embed the volume rasterization into a larger scene, it may be necessary to extract certain information from the modelview matrix. The upper-left 3×3 submatrix contains the composite rotation of model and view. The 3rd column contains the view-rotated Z vector.
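For illustration, here is a minimal sketch (mine, not from the answer) of how that upper-left 3×3 rotation can be pulled out of the column-major array returned by glGetFloatv; element (row, col) of the matrix lives at matrix[col * 4 + row]:
#include <GL/gl.h>

void getModelviewRotation(float R[3][3])
{
    GLfloat matrix[16];
    glGetFloatv(GL_MODELVIEW_MATRIX, matrix);

    // Copy the upper-left 3x3 (rotation) part out of the column-major array.
    for (int col = 0; col < 3; ++col)
        for (int row = 0; row < 3; ++row)
            R[row][col] = matrix[col * 4 + row];

    // The third row (R[2][0], R[2][1], R[2][2]) is the same
    // (matrix[2], matrix[6], matrix[10]) triple extracted as the DOF above.
}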
I have a camera in OpenGL. I had no problem with it until I added an FPS controller. The basic FPS behaviour is fine: the camera moves forward, backward, left and right, and rotates towards the direction supplied by mouse input. The problems begin when the camera moves to the sides of, or behind, the target position. In such a case the camera's local forward/backward/left/right directions aren't updated based on its current forward look; they remain the same as if it were right in front of the target. Example:
If the target object is at (0,0,0) and the camera is at (-50,0,0) (to the left of the target) looking at the target, then to move the camera back and forth I have to use the keys for left and right movement, while the backward/forward keys move the camera sideways.
Here is the code I use to calculate camera position, rotation and LookAt matrix:
void LookAtTarget(const vec3 &eye, const vec3 &center, const vec3 &up)
{
    this->_eye = eye;
    this->_center = center;
    this->_up = up;
    this->_direction = normalize(center - eye);
    _viewMatrix = lookAt(eye, center, up);
    _transform.SetModel(_viewMatrix);
    UpdateViewFrustum();
}

void SetPosition(const vec3 &position)
{
    this->_eye = position;
    this->_center = position + _direction;
    LookAtTarget(_eye, _center, _up);
}

void SetRotation(float rz, float ry, float rx)
{
    _rotationMatrix = mat4(1);
    vec3 direction(0.0f, 0.0f, -1.0f);
    vec3 up(0.0f, 1.0f, 0.0f);
    _rotationMatrix = eulerAngleYXZ(ry, rx, rz);
    vec4 rotatedDir = _rotationMatrix * vec4(direction, 1);
    this->_center = this->_eye + vec3(rotatedDir);
    this->_up = vec3(_rotationMatrix * vec4(up, 1));
    LookAtTarget(_eye, _center, up);
}
Then in the render loop I set camera's transformations:
while(true)
{
    display();
    fps->print(GetElapsedTime());

    if (glfwGetKey(GLFW_KEY_ESC) || !glfwGetWindowParam(GLFW_OPENED)) {
        break;
    }

    calculateCameraMovement();
    moveCamera();

    view->GetScene()->GetCurrentCam()->SetRotation(0, -camYRot, -camXRot);
    view->GetScene()->GetCurrentCam()->SetPosition(camXPos, camYPos, camZPos);
}
The lookAt() method comes from the GLM math library.
I am pretty sure I have to multiply some of the vectors (eye, center, etc.) by the rotation matrix, but I am not sure which ones. I tried to multiply _viewMatrix by _rotationMatrix, but it creates a mess. The code for the FPS camera position and rotation calculation is taken from here. For the actual rendering I use the programmable pipeline.
Update:
I solved the issue by adding a separate method which doesn't calculate the camera matrix using lookAt, but rather uses the usual, basic approach:
void FpsMove(GLfloat x, GLfloat y, GLfloat z, float pitch, float yaw)
{
    _viewMatrix = rotate(mat4(1.0f), pitch, vec3(1, 0, 0));
    _viewMatrix = rotate(_viewMatrix, yaw, vec3(0, 1, 0));
    _viewMatrix = translate(_viewMatrix, vec3(-x, -y, -z));
    _transform.SetModel(_viewMatrix);
}
It solved the problem, but I still want to know how to make it work with the lookAt() approach I presented above.
You need to change the forward direction of the camera, which is presumably fixed to (0,0,-1). You can do this by rotating the directions about the y axis by camYRot (as computed in the lookat function) so that forwards is in the same direction that the camera is pointing (in the plane made by the z and x axes).
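Here is a minimal GLM sketch of that idea; the function name is mine and the angle convention (degrees vs. radians) depends on your GLM version, so treat it as an assumption rather than a drop-in fix:
#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>

glm::vec3 forwardFromYaw(float camYRot)
{
    // Rest forward direction of the camera (w = 0 marks it as a direction, not a position).
    const glm::vec4 restForward(0.0f, 0.0f, -1.0f, 0.0f);
    // Rotate about the world y axis by the camera's yaw.
    const glm::mat4 yaw = glm::rotate(glm::mat4(1.0f), camYRot, glm::vec3(0.0f, 1.0f, 0.0f));
    return glm::normalize(glm::vec3(yaw * restForward));
}
The movement keys would then step along this vector (and a vector perpendicular to it for strafing) instead of the fixed world axes, e.g. something like camPos += forwardFromYaw(-camYRot) * speed * dt (hypothetical usage).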
I'd like to implement an application which allows the user to select a few QGraphicsItems and then rotate them as a group. I know that I could add all the items into one QGraphicsItemGroup, but I need to keep the Z-value of each item. Is that possible?
I also have a second question.
I'm trying to rotate a QGraphicsItem around some point different from (0,0), let's say (200,150). After that operation I want to rotate the item one more time, but now around (0,0). I'm using the code below:
QPointF point(200, 150); // (200,150) the first time, then changed to (0,0) - no matter how...
qreal x = point.rx();
qreal y = point.ry();
item->setTransform(item->transform() * (QTransform().translate(x, y).rotate(angle).translate(-x, -y)));
I noticed that after the second rotation the item is not rotated around (0,0) but around some other point (I don't know which). I also noticed that if I change the order of the operations it all works fine.
What am I doing wrong?
Regarding your first problem: why should the z-values be a problem when putting the items into a QGraphicsItemGroup?
Alternatively, you could iterate through the selected items and apply the transformation to each of them, as sketched below.
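Here is a minimal sketch of that per-item idea (the helper name and the shared pivot argument are mine, not from the question): every selected item is rotated around a common scene point without grouping, so each item keeps its own z-value.
#include <QGraphicsScene>
#include <QGraphicsItem>
#include <QTransform>
#include <QPointF>

void rotateSelection(QGraphicsScene *scene, const QPointF &pivot, qreal angle)
{
    const QList<QGraphicsItem*> items = scene->selectedItems();
    for (QGraphicsItem *item : items) {
        // Express the pivot in the item's local coordinates, then rotate around it,
        // combining with whatever transform the item already has.
        const QPointF p = item->mapFromScene(pivot);
        item->setTransform(QTransform().translate(p.x(), p.y())
                                       .rotate(angle)
                                       .translate(-p.x(), -p.y()),
                           true);
    }
}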
I guess this snippet will solve your 2nd problem:
QGraphicsView view;
QGraphicsScene scene;

QPointF itemPosToRotate(-35, -35);
QPointF pivotPoint(25, 25);

QGraphicsEllipseItem *pivotCircle = scene.addEllipse(-2.5, -2.5, 5, 5);
pivotCircle->setPos(pivotPoint);

QGraphicsRectItem *rect = scene.addRect(-5, -5, 10, 10);
rect->setPos(itemPosToRotate);

// draw some coordinate frame lines
scene.addLine(-100, 0, 100, 0);
scene.addLine(0, 100, 0, -100);

// do half-circle rotations
for (int j = 0; j <= 5; j++)
    for (int i = 1; i <= 20; i++) {
        rect = scene.addRect(-5, -5, 10, 10);
        rect->setPos(itemPosToRotate);

        QPointF itemCenter = rect->pos();
        QPointF pivot = pivotCircle->pos() - itemCenter;

        // your local rotation
        rect->setRotation(45);
        // your rotation around the pivot
        rect->setTransform(QTransform().translate(pivot.x(), pivot.y())
                                       .rotate(180.0 * (qreal)i / 20.0)
                                       .translate(-pivot.x(), -pivot.y()), true);
    }

view.setScene(&scene);
view.setTransform(view.transform().scale(2, 2));
view.show();
EDIT:
In case you meant to rotate around the global coordinate frame origin, change the rotations to:
rect->setTransform(QTransform().translate(-itemCenter.x(), -itemCenter.y()).rotate(360.0 * (qreal)j/5.0).translate(itemCenter.x(),itemCenter.y()) );
rect->setTransform(QTransform().translate(pivot.x(), pivot.y()).rotate(180.0 * (qreal)i/20.0).translate(-pivot.x(),-pivot.y()),true);