Creating a plane with triangle strips - Qt

I am trying to create a finely triangulated mesh. In the draw function:
// Draw the triangle strip as lines only
glPolygonMode(GL_FRONT_AND_BACK, GL_LINE);
// Create cloth mesh
glBegin(GL_TRIANGLE_STRIP);
glNormal3f(0, 0, 1);
for (float i = 0.0f; i < 2.0f; i += 0.05f)
{
    for (float j = -1.0f; j < 1.0f; j += 0.05f)
    {
        glVertex3f(j, 1 - i + 0.05f, 0);
        glVertex3f(j, 1 - i, 0);
    }
}
glEnd();
I get the following output, which is what I expected.
But if I rotate the scene, I see this on the back side (if it's not clear: it's a dimly lit mesh, but the triangles are not the same as the ones showing up on the front).
Q1. Shouldn't the back side not be visible at all? I have these flags enabled:
glEnable(GL_CULL_FACE);
glCullFace(GL_BACK);
Q2. Even if the back side shows up, why does it show these irregular, long triangles and not the same ones as on the front?
P.S. This is the resize function:
void MyGLWidget::resizeGL(int width, int height)
{
    int side = qMin(width, height);
    glViewport((width - side) / 2, (height - side) / 2, side, side);
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(-2, +2, -2, +2, 1.0, 15.0);
    glMatrixMode(GL_MODELVIEW);
}
P.P.S: Yes, I understand this is not modern OpenGL.

To me, it looks like those are the triangles produced when the triangle strip "jumps back" to the left side, which happens whenever you start a new row. They consist of the last vertices of the previous row and the first vertices of the next row. Backface culling does seem to work, but the large triangles are visible from the back because their winding order is exactly opposite to that of the small triangles.
If you don't want them, you either have to restart the triangle strip at the end of each row, or use another primitive type (GL_TRIANGLES).
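For illustration, here is a minimal sketch of the first option, restarting the strip at the end of each row (immediate mode, reusing the loop bounds from the question):
glPolygonMode(GL_FRONT_AND_BACK, GL_LINE);
glNormal3f(0, 0, 1);
for (float i = 0.0f; i < 2.0f; i += 0.05f)
{
    // One strip per row: ending the strip here avoids the long
    // "wrap-around" triangles between the end of one row and the
    // start of the next.
    glBegin(GL_TRIANGLE_STRIP);
    for (float j = -1.0f; j < 1.0f; j += 0.05f)
    {
        glVertex3f(j, 1 - i + 0.05f, 0);
        glVertex3f(j, 1 - i, 0);
    }
    glEnd();
}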

Related

How to unproject a point on screen to object space coordinates in Vulkan?

I need to be able to unproject a screen pixel into object space using Vulkan, but somewhere my math is going wrong.
Here is the shader as it stands today for reference:
void main()
{
    // the depth of this pixel is between 0 and 1
    vec4 obj_space = vec4(float(gl_FragCoord.x) / ubo.screen_width,
                          float(gl_FragCoord.y) / ubo.screen_height,
                          gl_FragCoord.z, 1.0f);
    // this puts us in the normalized device coordinate range [-1,1]
    obj_space.xy = ( obj_space.xy * 2.0f ) - 1.0f;
    // these two lines put us in object space coordinates;
    // mvp_inverse is derived from this on the C++ side:
    // glm::inverse(app.three_d_camera->get_projection_matrix() * app.three_d_camera->view_matrix * model);
    obj_space = ubo.mvp_inverse * obj_space;
    obj_space.xyz /= obj_space.w;
    // the resulting position here is wrong
    out_color = obj_space;
}
When I output the position as a color, the colors are off. I know I could simply pass the object-space position from the vertex shader to the fragment shader, but I'd like to understand why my math is not working; it will help me understand Vulkan and maybe teach me a little math as well.
Thanks!
I'm not entirely sure what your problem is, but let's go over the potential problems.
Remember, Vulkan clip space is:
positive Y = down,
positive X = right,
positive Z = out,
centered at the middle of the screen.
Additionally, despite OpenGL's GLSL docs saying it is centered at the bottom-left corner, in Vulkan gl_FragCoord is centered at the top-left corner.
In this step:
obj_space.xy = ( obj_space.xy * 2.0f ) - 1.0f;
obj_space is now:
left x = -1.0
right x = 1.0
top y = -1.0
bottom y = 1.0
far z = 1.0
near z = 0.0
I'm almost entirely sure you don't mean for your object space to have Y negative at the top. The reason Y increases from top to bottom is images and textures: on the CPU they are ordered that way, and in Vulkan they are now ordered that way too.
Some other notes:
You claim your inverse is derived from glm::inverse here:
glm::inverse(app.three_d_camera->get_projection_matrix() * app.three_d_camera->view_matrix * model);
But GLM uses OpenGL conventions for matrix depth range and handedness, and unless you force it to the correct coordinate system, it is going to assume right-handed coordinates with positive Y up and negative Z out of the screen. You'll need to include the following #defines before it works correctly (or change your calculations by hand to accommodate this):
#define GLM_FORCE_DEPTH_ZERO_TO_ONE
#define GLM_FORCE_LEFT_HANDED
Additionally, you'll need to modify your matrices to account for the negative Y direction. Here is an example of how I've handled this in the past (modifying the perspective matrix directly):
ubo.model = glm::translate(glm::mat4(1.0f), glm::vec3(pos_x, pos_y, pos_z));
ubo.model *= glm::rotate(glm::mat4(1.0f), time * glm::radians(0.0f), glm::vec3(0.0f, 0.0f, 1.0f));
ubo.view = glm::lookAt(glm::vec3(0.0f, 0.0f, -10.0f), glm::vec3(0.0f, 0.0f, 0.0f), glm::vec3(0.0f, 1.0f, 0.0f));
ubo.proj = glm::perspective(glm::radians(45.0f), swapChainExtent.width / (float) swapChainExtent.height, 0.1f, 100.0f);
ubo.proj[1][1] *= -1; // flip Y so the projection matches Vulkan's clip space
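As a sanity check independent of the shader, the same unprojection math can be verified on the CPU. The following is only a sketch with placeholder camera values (not taken from the question); if the round trip fails here, the matrices rather than the shader are at fault:
#define GLM_FORCE_DEPTH_ZERO_TO_ONE
#define GLM_FORCE_LEFT_HANDED
#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>
#include <cstdio>

int main()
{
    // Placeholder camera; substitute the application's real matrices.
    glm::mat4 model(1.0f);
    glm::mat4 view = glm::lookAt(glm::vec3(0.0f, 0.0f, -10.0f),
                                 glm::vec3(0.0f), glm::vec3(0.0f, 1.0f, 0.0f));
    glm::mat4 proj = glm::perspective(glm::radians(45.0f), 16.0f / 9.0f, 0.1f, 100.0f);
    proj[1][1] *= -1; // flip Y for Vulkan, as above

    glm::mat4 mvp = proj * view * model;

    // Project a known world-space point to NDC...
    glm::vec4 world(1.0f, 2.0f, 3.0f, 1.0f);
    glm::vec4 clip = mvp * world;
    glm::vec3 ndc = glm::vec3(clip) / clip.w;

    // ...then apply the shader's unprojection math, which should
    // reproduce 'world'.
    glm::vec4 back = glm::inverse(mvp) * glm::vec4(ndc, 1.0f);
    back /= back.w;
    std::printf("%.3f %.3f %.3f\n", back.x, back.y, back.z);
}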

How do I get the list of points which satisfy a given QPainterPath?

I have a QGraphicsView in my Qt application on which the user can draw curves. Curves consist of QGraphicsEllipseItems and QGraphicsPathItems, which connect the adjacent ellipses.
I want to get a list of QPoints that lie within the given curve. I tried creating a local QPainterPath representing the whole curve and iterating over all the points in its bounding rectangle to see which ones fall inside the curve. The code looks like:
QPainterPath curvePath = edges[index]->at(0)->path();
qreal left, right, bottom, top;
for (int i = 1; i < edges[index]->size(); i++)
{
    curvePath.connectPath(edges[index]->at(i)->path());
}
QRectF curveRect = curvePath.boundingRect();
left = curveRect.left();
right = curveRect.right();
top = curveRect.top();
bottom = curveRect.bottom();
for (qreal i = left; i < right; i++)
    for (qreal j = top; j < bottom; j++)
    {
        QPointF pointToCheck(i, j);
        if (curvePath.contains(pointToCheck))
            list.append(pointToCheck);
    }
where edges is a QList of QLists of QGraphicsPathItems. It works fine for the calculations (the point of this is to increase their precision), but it really slows down my application, since these calculations are performed quite often.
Is there a more efficient way to implement this?

Calculating whether or not a 3D eye point is behind a 2D plane or upwards

The setup
Draw XY coordinate axes on a piece of paper. Write a word on it along the X-axis, so that the word's center point is at the origin (half on the positive side of X/Y, the other half on the negative side of X/Y).
Now, if you flip the paper upside down, you'll notice that the word is mirrored in relation to both the X- and Y-axes. If you look at it from behind the paper, it's mirrored in relation to the Y-axis. If you look at it from behind and upside down, it's mirrored in relation to the X-axis.
OK, I have points (vertices) in a 2D plane that are created in a similar way at the origin, and I need to apply exactly the same rule to them. To make things interesting:
The 2D plane is actually 3D, each point (vertex) being (x, y, 0). Initially the vertices are positioned at the origin and their normal is Pn(0,0,1). => Seen correctly when looked at from point Pn towards the origin.
The vertex plane has its own rotation matrix [Rp] and position P(x,y,z) in the 3D world. The rotation is applied before positioning.
The 3D world is right-handed. The viewer looks towards the origin from some distance along the positive Z-axis, but the world is also oriented by rotation matrix [Rw]; [Rw] * (0,0,1) points directly at the viewer's eye.
From those I need to calculate whether the vertex plane should be mirrored, and about which axis. The mirroring itself can be done before applying [Rp] and P by:
Vertices vertices = Get2DPlanePoints();
int MirrorX = 1; // -1 to mirror, 1 NOT to mirror
int MirrorY = 1; // -1 to mirror, 1 NOT to mirror
Matrix WorldRotation = GetWorldRotationMatrix();
MirrorX = GetMirrorXFactor(WorldRotation);
MirrorY = GetMirrorYFactor(WorldRotation);
foreach (Vertex v in vertices)
{
    v.X = v.X * MirrorX * MirrorY;
    v.Y = v.Y * MirrorY;
}
// Apply rotation...
// Add position...
The question
So I need GetMirrorXFactor() and GetMirrorYFactor() functions that return -1 if, after the rotation and world orientation, the viewer's eye point is at an "X/Y" angle of more than ±90 degrees relative to the vertex plane's normal. I have already solved this, but I'm looking for more elegant mathematics. I know that rotation matrices contain information about how much is rotated about which axis, and I believe that can be utilized here.
My solution for MirrorX:
// Matrix multiplications. Vectors are column vectors here.
Pnr = [Rp] * Pn      // rotated normal of the vertex plane
Pur = [Rp] * (0,1,0) // rotated "up" vector of the vertex plane
Wnr = [Rw] * (0,0,1) // rotated eye vector with the world's orientation
                     // = vector pointing directly at the viewer's eye

// Use the rotated up vector as the normal of a new plane and project
// the viewer's eye onto it. dot = dot product between vectors.
Wnrx = Wnr - (Wnr dot Pur) * Pur // "X-projected" eye

// Calculate the angle between the eye's X component and the plane's
// rotated normal. ||V|| = V's norm.
angle = arccos( (Wnrx dot Pnr) / ( ||Wnrx|| * ||Pnr|| ) )
if (angle > PI / 2)
    MirrorX = -1; // DO mirror
else
    MirrorX = 1; // DON'T mirror
The solution for MirrorY can be derived in a similar way, using the viewer's up vector and the vertex plane's right vector.
Better solution?
if ( ([Rp]*(1,0,0)) dot ([Rw]*(1,0,0)) < 0 )
    MirrorX = -1; // DO mirror
else
    MirrorX = 1; // DON'T mirror

if ( ([Rp]*(0,1,0)) dot ([Rw]*(0,1,0)) < 0 )
    MirrorY = -1; // DO mirror
else
    MirrorY = 1; // DON'T mirror
Explaining in more detail is difficult without diagrams, but if you have trouble with this solution we can work through some cases.
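As a concrete illustration, the same test could be written in C++ with GLM (a sketch; Rp and Rw are assumed to be available as 3×3 rotation matrices):
#include <glm/glm.hpp>

// Returns -1 when the plane's rotated axis points away from the
// world's correspondingly rotated axis, i.e. when mirroring is needed.
int mirrorFactor(const glm::mat3& Rp, const glm::mat3& Rw, const glm::vec3& axis)
{
    return glm::dot(Rp * axis, Rw * axis) < 0.0f ? -1 : 1;
}

// Usage, following the answer:
// int MirrorX = mirrorFactor(Rp, Rw, glm::vec3(1, 0, 0));
// int MirrorY = mirrorFactor(Rp, Rw, glm::vec3(0, 1, 0));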

How can I get view direction from the OpenGL ModelView Matrix?

I am writing a volume rendering program that constantly adjusts some plane geometry so it always faces the camera. The plane geometry rotates whenever the camera rotates, in order to appear as if it doesn't move relative to everything else in the scene. (I use the camera's viewing direction as a normal vector for these plane geometries.)
Currently I am manually storing a custom rotation vector ('rotations') and applying its effects as follows in the render function:
gl2.glRotated(rotations.y, 1.0, 0.0, 0.0);
gl2.glRotated(rotations.x, 0.0, 1.0, 0.0);
Then, later on, I get the viewing direction by rotating the initial view direction (0,0,-1) around the X and Y axes with the values from 'rotations'. This is done in the following manner, with the final viewing direction stored in 'view':
public Vec3f getViewingAngle() {
    // first rotate the viewing POINT
    // then find the vector from there to the center
    Vec3f view = new Vec3f(0, 0, -1);
    float newZ = 0;
    float ratio = (float) (Math.PI / 180);
    float vA = (float) (-1f * rotations.y * ratio);
    float hA = (float) (-1f * rotations.x * ratio);
    // rotate about the X axis first
    float newY = (float) (view.y * Math.cos(vA) - view.z * Math.sin(vA));
    newZ = (float) (view.y * Math.sin(vA) + view.z * Math.cos(vA));
    view = new Vec3f(view.x, newY, newZ);
    // then rotate about the Y axis
    float newX = (float) (view.z * Math.sin(hA) + view.x * Math.cos(hA));
    newZ = (float) (view.z * Math.cos(hA) - view.x * Math.sin(hA));
    view = new Vec3f(newX, view.y, newZ);
    view = new Vec3f(view.x * -1f, view.y * -1f, view.z * -1f);
    // return the finalized, normalized viewing direction
    view = Vec3f.normalized(view);
    return view;
}
Now I am moving this program to a larger project wherein the camera rotation is handled by a 3rd party graphics library. I have no rotations vector. Is there some way I can get my view direction vector from:
GLfloat matrix[16];
glGetFloatv (GL_MODELVIEW_MATRIX, matrix);
I am looking at this for reference: http://3dengine.org/Modelview_matrix but I still don't get how to come up with the view direction. Can someone explain whether it is possible and how it works?
You'll want to look at this picture: http://db-in.com/images/local_vectors.jpg
The Direction-of-Flight (DOF) is the 3rd row:
GLfloat matrix[16];
glGetFloatv( GL_MODELVIEW_MATRIX, matrix );
float DOF[3];
DOF[0] = matrix[ 2 ]; // x
DOF[1] = matrix[ 6 ]; // y
DOF[2] = matrix[ 10 ]; // z
Reference:
http://blog.db-in.com/cameras-on-opengl-es-2-x/
Instead of trying to follow the modelview matrix to adjust your volume rasterizer's fragment impostors, you should just adjust the modelview matrix to your needs. OpenGL is not a scene graph; it's a drawing system, and you can, and should, change things however suits you best.
Of course, if you must embed the volume rasterization into a larger scene, it may be necessary to extract certain information from the modelview matrix. The upper-left 3×3 submatrix contains the composite rotation of model and view. The third column contains the view-rotated Z vector.
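For reference, here is a short sketch distinguishing the two extractions mentioned in these answers (column-major OpenGL storage, so element (row r, column c) is m[c*4 + r]; this assumes a current GL context and is only illustrative):
GLfloat m[16];
glGetFloatv(GL_MODELVIEW_MATRIX, m);

// Third row of the rotation part: the view-space Z axis expressed in
// world/model coordinates (the "DOF" of the first answer).
float rowZ[3] = { m[2], m[6], m[10] };

// Third column of the rotation part: the model's Z axis expressed in
// view coordinates (what the paragraph above refers to).
float colZ[3] = { m[8], m[9], m[10] };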

Flipping an angle horizontally

I want to know a way to flip an angle horizontally, without having to do many operations. Say I have an angle of 0 ("pointing right" in my code's coordinate system); the flipped angle should be 180 (pointing left). If it's 90 (pointing up), flipped it should still be 90. 89 becomes 91, and so on.
I can operate on the X/Y speeds implied by the angle, but that would slow things down, and I feel it's not the proper way to go.
I don't know much math, so I might be calling things by the wrong name... Can anyone help?
EDIT: Sorry I took so long; I had to be away from the computer for a while. OK...
http://img215.imageshack.us/img215/8095/screenshot031v.jpg
This screenshot might help. The structure above is two satellites and a beam linked to the white dot in the center. The two satellites should inherit the angle of the white dot (it's visible for debugging purposes), so if it aims at an angle, they will follow. The satellite on the left is mirrored, so I calculated it with 180-angle as suggested, although that was my first try as well. As you can see, it is not mirrored but flipped. And when the white dot rotates, it rotates backwards. The other one does fine.
This is the angle recalculation for something linked to something else; pid is the parent and id the current object. pin.ang is the angle offset copied when the object is linked to another, so it keeps its position when rotated:
if (object[id].mirror)
    object[id].angle = 180 - (object[id].pin.ang + object[pid].angle);
else
    object[id].angle = object[id].pin.ang + object[pid].angle;
And this is the specific rotation part (OpenGL). The offx/offy is for things rotated off-center, like the beam about to come out there; it renders everything else correctly.
glTranslatef(list[index[i]].x, list[index[i]].y, 0);
glRotatef(list[index[i]].angle, 0.0, 0.0, 1.0);
glTranslatef(list[index[i]].offx, -list[index[i]].offy, 0);
The rotation also seems to go wrong when the rotation speed (an integer added to the current angle every redraw, positive for rotating clockwise) is negative, like in this next one:
http://img216.imageshack.us/img216/7/screenshot032ulr.jpg
So it's definitely not 180-angle, despite how obvious that would be. The mirroring is done by just reversing the texture coordinates, so it doesn't affect the angle. I'm afraid it might be a quirk in the GL rotation.
The reflected angle (just looking at the maths) would be (180 - angle):
Angle | Reflection
------+-----------
0 | 180
90 | 90
89 | 91
91 | 89
360 | -180
270 | -90
Note the negatives if you fall below the "horizontal plane" - which you could leave as they are, or handle as a special case.
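If the special case is unwanted, a small helper can fold the wrap-around in; a sketch in C++, assuming angles in degrees:
#include <cmath>

// Reflect an angle with the 180 - angle rule from the table, then
// wrap into [0, 360) so the negative results disappear:
// 270 -> -90 -> 270, 360 -> -180 -> 180.
float reflectAngle(float angle)
{
    float r = std::fmod(180.0f - angle, 360.0f);
    return r < 0.0f ? r + 360.0f : r;
}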
Isn't it simply
result = 180-(your angle)
As already explained, you find the opposite angle by subtracting your angle from 180 degrees, e.g.:
180 - yourangle
Directly manipulating the X/Y speeds would not be very cumbersome. You simply reverse the direction of the X speed by multiplying it by minus one, e.g. speedx = (-1) * speedx. This changes the left-right direction (something moving to the left starts moving to the right, and vice versa) while the vertical speed is unaffected.
If you're using sine/cosine (sin/cos) to recalculate your X/Y speed components, then the *(-1) method would probably be more efficient. Ultimately it depends on the context of your program. If you're looking for a better solution, update your question with more details.
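To make the comparison concrete, here is a sketch of both options (the struct and field names are made up for illustration):
#include <cmath>

struct Body { float angleDeg, speed, speedX, speedY; };

// Option 1: flip the velocity directly; one negation, no trig.
void flipVelocity(Body& b)
{
    b.speedX = -b.speedX;
}

// Option 2: flip the angle and recompute the components; equivalent,
// since cos(180 - a) = -cos(a) and sin(180 - a) = sin(a), but costs
// a sin and a cos per call.
void flipAngle(Body& b)
{
    b.angleDeg = 180.0f - b.angleDeg;
    float r = b.angleDeg * 3.14159265f / 180.0f;
    b.speedX = b.speed * std::cos(r);
    b.speedY = b.speed * std::sin(r);
}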
This solution is for -Y oriented angles (like a watch)! For +X orientation (like school math) you need to swap X and Y.
public static float FlipAngleX(float angle)
{
    angle = NormalizeAngle(angle);
    angle = TwoPi - angle;
    return angle;
}

public static float FlipAngleY(float angle)
{
    angle = NormalizeAngle(angle);
    if (angle < Pi)
    {
        angle = Pi - angle;
    }
    else
    {
        angle = TwoPi - angle + Pi;
    }
    return angle;
}

/// <summary>
/// Keeps the angle between 0 and two pi. Pi and TwoPi are assumed to
/// be float constants for the respective values.
/// </summary>
public static float NormalizeAngle(float angle)
{
    if (angle < 0)
    {
        int backRevolutions = (int)(-angle / TwoPi);
        return angle + TwoPi * (backRevolutions + 1);
    }
    else
    {
        return angle % TwoPi;
    }
}
Aah, it seems the problem came from negative numbers after all. I ensured they stay positive, and now the rotation behaves fine; I don't even need to recalculate the angle...
Thanks to everyone, I ended up figuring it out from bits of every response.
To flip counter-clockwise to clockwise (270 on the right -> 90 on the right):
360 - angle
--
To flip vertically (180 on top -> 0/360 on top):
Math.Normalize(angle - 180)
--
Both:
float flipped_vertical = 360 - angle
float flipped_vertical_and_horizontal = Math.Normalize(flipped_vertical - 180)
Just 360 - angle will flip your angle horizontally but not vertically.
