Vertex displacement on a sphere breaks the mesh - math

I'm trying to make a simple noise effect on a sphere with shaders.
I tried to use Ashima's Perlin noise, but the effect wasn't what I expected, so I created my own shader based on Phong.
Here is what I get with this code in my vertex shader:
attribute int index;
uniform float time;
vec3 newPosition = position + normal * vec3(sin((time * 0.001) * float(index)) * 0.05);
gl_Position = projectionMatrix * modelViewMatrix * vec4(newPosition, 1.0);
where index is the index of the vertex and time the current elapsed time.
The noise effect is exactly what I expected but the sphere mesh is open...
How can I keep this effect and keep the sphere mesh closed?

Most likely your sphere contains duplicated vertices. Get rid of them and your shader will work well. Or get rid of your shader's dependency on "index".
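With index-based displacement, two vertices that share a position on the seam but carry different indices get pushed by different amounts, which is exactly what tears the sphere open. Either derive the displacement from position instead of index (for example sin(time * 0.001 + position.x * 10.0)), so coincident vertices always move identically, or weld the duplicates. A minimal C++ sketch of the welding idea (types and names are illustrative, not from any particular engine):
#include <cstdint>
#include <map>
#include <tuple>
#include <vector>

struct Vec3 { float x, y, z; };

// Hedged sketch: weld duplicated vertices so every position appears once.
// Seam vertices that share a position but had different indices would
// otherwise be displaced by different amounts in the shader, tearing the mesh.
void weldVertices(std::vector<Vec3>& vertices, std::vector<uint32_t>& indices) {
    std::map<std::tuple<float, float, float>, uint32_t> seen;
    std::vector<Vec3> unique;
    std::vector<uint32_t> remap(vertices.size());

    for (uint32_t i = 0; i < vertices.size(); ++i) {
        auto key = std::make_tuple(vertices[i].x, vertices[i].y, vertices[i].z);
        auto it = seen.find(key);
        if (it == seen.end()) {
            seen[key] = (uint32_t)unique.size();
            remap[i] = (uint32_t)unique.size();
            unique.push_back(vertices[i]);
        } else {
            remap[i] = it->second; // reuse the first copy of this position
        }
    }
    for (auto& idx : indices) idx = remap[idx];
    vertices = std::move(unique);
}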

Related

2d physics ball bouncing and rolling on incline response

I asked this question before, but it got closed because it said I was asking for recommendations, which isn't allowed.
Really I'm just looking for a solution, whether it's from a book, a website, or someone who knows how to do this.
I progressed somewhat since I asked the question the last time, but anyway:
So far I've got a program where you can spawn balls which can collide with other balls and bounce off the window boundaries just fine. You can also draw a line strip (I'm using SFML, as I'm just a beginner and it requires very little setting up before you can get going with coding): you hold down the mouse, and every however many pixels it'll set the line you're currently drawing and start a new one at the end. So I can either do straight edges, or change the setting to make a new line every pixel and make very ugly curves with the mouse (I need to figure out how to do splines).
It makes no difference for the collision as it finds the closest point on the line every frame.
So dropping the ball onto the incline is fine; it bounces off like it should, even several times.
But getting it to roll is a different story. Even if you start the ball on the incline with 0 velocity, as soon as it starts colliding it gets sent up off the incline instead of moving along it, and starts doing tiny bounces.
The only solution I found, which is terrible, is to time each bounce (or the height of each bounce, I guess) and, when it's low enough, stop it bouncing and only have gravity act on it.
I'd be happy with it, except what happens is it will bounce lower and lower until it reaches the threshold, at which point it visibly slows down before it starts accelerating again rolling down the incline, which looks really bad, like it comes to a halt and starts getting pulled along.
Ideally I'd like it to actually simulate the physical forces rather than having to resort to tricks like timing the bounces.
While I was writing this I put in some code to work out the angle at which the ball hits the incline, and I think the problem lies there: it's fine if it hits from free fall, but when rolling, the angle shows as about 67-72 degrees when it should be... 0, I guess?
Any help appreciated.
Here's the code for the collision.
void Particle::collideRamp(sf::Vector2f cp, sf::Vector2f l1, sf::Vector2f l2) {
    // Distance between the ball and the closest point on the line.
    float dx = x - cp.x;
    float dy = y - cp.y;
    float distance = hypot(dx, dy);
    // Find the line normal.
    float lx = l2.x - l1.x;
    float ly = l2.y - l1.y;
    sf::Vector2f LineNormal = normalize(ly, -lx);
    // Make velocity and projection the same length so the velocity can be mirrored against the normal.
    sf::Vector2f normalMag(velocity.x * LineNormal.x, velocity.y * LineNormal.y);
    sf::Vector2f Projection = LineNormal * hypotf(normalMag.x, normalMag.y);
    sf::Vector2f vel = normalize(velocity.x, velocity.y) * hypotf(normalMag.x, normalMag.y);
    // Working on making circles rotate, but early days.
    float rsx = prevx - x;
    float rsy = prevy - y;
    float rSpeed = hypot(rsx, rsy) / circumference;
    // Work out gravity forces for the incline: sin(angle)*m*g down the incline, etc.
    sf::Vector2f gravPerpendicular = normalize(-ly, lx);
    sf::Vector2f gravParallell = normalize(lx, ly);
    float gravPerMag;
    float gravParMag;
    gravParMag = sin(M_halfPI + atan2f(gravParallell.x, gravParallell.y));
    gravPerMag = cos(M_halfPI + atan2f(gravParallell.x, gravParallell.y));
    // Collision detected.
    if (distance < radius) {
        // Work out the angle at which the ball struck the incline.
        float iAngle = atan2f(lx, ly) + atan2f(velocity.x, velocity.y);
        //cout << iAngle * 180 / M_PI << " aft " << endl;
        // Make sure it's 0-90.
        if (iAngle > 1.5708)
            iAngle = 1.5708 - (iAngle - 1.5708);
        // Move the ball back if it went past the line, by the amount it passed it by.
        sf::Vector2f v = normalize(velocity.x + forceAcc.x, velocity.y + forceAcc.y);
        float overlap = (radius - distance);
        x -= v.x * overlap;
        y -= v.y * overlap;
        rotationSpeed = rSpeed;
        // Changing the angle only if the bounce is a certain height, as I think this is
        // what's causing the problem; however, the ball slows down before it starts
        // rolling, which looks weird.
        if (collClock.getElapsedTime().asSeconds() > 0.01) {
            sf::Vector2f newVel = ((velocity + Projection) + Projection);
            forceAcc = newVel;
            float e = elasticity * elasticity;
            forceAcc *= e;
            velocity = sf::Vector2f(0, 0);
        }
        // Add gravity forces. Since the force along the line normal is cancelled out
        // by an equal force from the line, I guess it doesn't need to be added?
        //accelerateI(sf::Vector2f(-sin(gravPerpendicular) * gravPerMag * gravity * mass, cos(gravPerpendicular) * gravPerMag * gravity * mass));
        //accelerate(sf::Vector2f(sin(gravPerpendicular) * gravPerMag * gravity * mass, -cos(gravPerpendicular) * gravPerMag * gravity * mass));
        // This one rolls it down the incline.
        accelerateI(sf::Vector2f(gravParallell.x * gravParMag * gravity * mass, gravParallell.y * gravParMag * gravity * mass));
        // Trying whether subtracting gravity helps.
        //accelerateI(sf::Vector2f(0, -1 * gravity * mass));
        // Friction.
        //accelerateI(sf::Vector2f(gravParallell.x * forceAcc.x * friction * mass, gravParallell.y * forceAcc.y * friction * mass));
        collClock.restart();
    }
}
Thanks guys
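One standard way out of the endless micro-bounces is to treat slow contacts as resting contacts: split the velocity into components along the surface normal and tangent, reflect the normal component only when it is large enough to be a real bounce, and otherwise zero it and keep just the tangential (rolling) part plus gravity along the incline. A minimal hedged sketch, reusing the member names from the code above (velocity, gravity, elasticity); the function and the threshold value are illustrative, not from the original code:
#include <cmath>

// Hedged sketch: resting-contact handling, assuming a unit-length surface
// normal `n` and the Particle members used above. The threshold is a
// made-up starting point to tune.
void Particle::resolveContact(sf::Vector2f n, float dt) {
    const float restThreshold = 20.f;                 // below this normal speed, stop bouncing
    sf::Vector2f t(-n.y, n.x);                        // surface tangent, perpendicular to n
    float vn = velocity.x * n.x + velocity.y * n.y;   // normal component of velocity
    float vt = velocity.x * t.x + velocity.y * t.y;   // tangential component

    if (std::fabs(vn) < restThreshold) {
        // Resting contact: kill the normal component entirely, keep rolling along t.
        velocity = t * vt;
        // Gravity resolved along the incline: g * dot(down, t) pulls the ball downhill
        // (down is (0, 1) in SFML's y-down screen coordinates).
        float gAlong = gravity * t.y;
        velocity += t * (gAlong * dt);
    } else {
        // Real bounce: reflect the normal component, scaled by elasticity.
        velocity = t * vt - n * (vn * elasticity);
    }
}
Called in place of the reflection branch once the overlap has been resolved, this avoids the visible stop-and-start, because the tangential velocity is never zeroed.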

Calculating whether a 3D eyepoint is behind a 2D plane or upside down relative to it

The setup
Draw XY coordinate axes on a piece of paper. Write a word on it along the X-axis, so that the word's center point is at the origin (half on the positive side of X/Y, the other half on the negative side of X/Y).
Now, if you flip the paper upside down you'll notice that the word is mirrored in relation to both the X- and Y-axes. If you look from behind the paper, it's mirrored in relation to the Y-axis. If you look at it from behind and upside down, it's mirrored in relation to the X-axis.
OK, I have points in a 2D plane (vertices) that are created in a similar way at the origin, and I need to apply exactly the same rule to them. To make things interesting:
The 2D plane is actually 3D, each point (vertex) being (x, y, 0). Initially the vertices are positioned at the origin and their normal is Pn(0,0,1). => Correctly seen when looked at from point Pn towards the origin.
The vertex plane has its own rotation matrix [Rp] and position P(x,y,z) in the 3D world. The rotation is applied before positioning.
The 3D world is right-handed. The viewer would be looking towards the origin from some distance along the positive Z-axis, but the world is also oriented by a rotation matrix [Rw]. [Rw] * (0,0,1) would point directly at the viewer's eye.
From those I need to calculate when the vertex-plane should be mirrored and by which axis. The mirroring itself can be done before applying [Rp] and P by:
Vertices vertices = Get2DPlanePoints();
int MirrorX = 1; // -1 to mirror, 1 NOT to mirror
int MirrorY = 1; // -1 to mirror, 1 NOT to mirror
Matrix WorldRotation = GetWorldRotationMatrix();
MirrorX = GetMirrorXFactor(WorldRotation);
MirrorY = GetMirrorYFactor(WorldRotation);
foreach(Vertex v in vertices)
{
    v.X = v.X * MirrorX * MirrorY;
    v.Y = v.Y * MirrorY;
}
// Apply rotation...
// Add position...
The question
So I need the GetMirrorXFactor() and GetMirrorYFactor() functions to return -1 if the viewer's eyepoint is at an "X/Y" angle greater than ±90 degrees relative to the vertex plane's normal, after the rotation and world orientation are applied. I have already solved this, but I'm looking for more elegant mathematics. I know that rotation matrices somehow encode how much is rotated around which axis, and I believe that can be utilized here.
My Solution for MirrorX:
// Matrix multiplications. Vectors are vertical matrices here.
Pnr = [Rp] * Pn      // Rotated vertex plane's normal
Pur = [Rp] * (0,1,0) // Rotated vertex plane's "up" vector
Wnr = [Rw] * (0,0,1) // Rotated eye vector with the world's orientation
                     // = vector pointing directly at the viewer's eye
// Use the rotated up vector as the normal of a new plane and project the
// viewer's eye onto it. dot = dot product between vectors.
Wnrx = Wnr - (Wnr dot Pur) * Pur // "X-projected" eye.
// Calculate the angle between the eye's X component and the plane's rotated
// normal. ||V|| = V's norm.
angle = arccos( (Wnrx dot Pnr) / ( ||Wnrx|| * ||Pnr|| ) )
if (angle > PI / 2)
    MirrorX = -1; // DO mirror
else
    MirrorX = 1; // DON'T mirror
The solution for MirrorY can be done in a similar way, using the viewer's up vector and the vertex plane's right vector.
Better solution?
if (([Rp]*(1,0,0)) dot ([Rw]*(1,0,0))) < 0
    MirrorX = -1; // DO mirror
else
    MirrorX = 1; // DON'T mirror
if (([Rp]*(0,1,0)) dot ([Rw]*(0,1,0))) < 0
    MirrorY = -1; // DO mirror
else
    MirrorY = 1; // DON'T mirror
Explaining in more detail is difficult without diagrams, but if you have trouble with this solution we can work through some cases.
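What makes this cheap is that [R] * (1,0,0) is simply the first column of [R] (and [R] * (0,1,0) the second), so each test is a single dot product of matching columns; no full matrix-vector multiply is needed. A hedged C++ sketch, assuming 3x3 rotation matrices stored as m[row][col]:
// Hedged sketch: the mirror tests reduce to dotting matching basis columns
// of the two rotation matrices.
struct Mat3 { float m[3][3]; };

// Dot product of column `c` of A with column `c` of B.
// A * (1,0,0) is just column 0 of A, so no matrix-vector multiply is needed.
float dotColumns(const Mat3& A, const Mat3& B, int c) {
    return A.m[0][c] * B.m[0][c]
         + A.m[1][c] * B.m[1][c]
         + A.m[2][c] * B.m[2][c];
}

int getMirrorXFactor(const Mat3& Rp, const Mat3& Rw) {
    return dotColumns(Rp, Rw, 0) < 0.0f ? -1 : 1; // compare rotated X axes
}

int getMirrorYFactor(const Mat3& Rp, const Mat3& Rw) {
    return dotColumns(Rp, Rw, 1) < 0.0f ? -1 : 1; // compare rotated Y axes
}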

Rotating a D3DXVECTOR3 around a specific point

This is probably a pretty simple thing, but my knowledge of DirectX is just not up to par with what I'm trying to achieve.
For the moment I am trying to create a vehicle that moves around on terrain. I am attempting to make the vehicle recognize the terrain by creating a square (4 D3DXVECTOR3 points) around the vehicle, whose points each detect the height of the terrain and adjust the vehicle accordingly.
The vehicle is a simple object derived from Microsoft sample code. It has a world matrix, coordinates, rotations, etc.
What I am trying to achieve is to make these points move along with the vehicle, turning when it does, so they can detect the difference in height. This requires me to update the points each time the vehicle moves, but I cannot for the life of me figure out how to get them to rotate properly.
So, in summary, I am looking for a simple way to rotate a vector about an origin (my vehicle's coordinates).
These points are situated near the vehicle's wheels, so if it worked they would stay there regardless of the vehicle's Y-axis rotation.
Here's what I've tried:
D3DXVECTOR3 vec;
D3DXVec3TransformCoord(&vec, &SquareTopLeftPoint, &matRotationY);
SquareTopLeftPoint = vec;
This resulted in the point spinning madly out of control and leaving the map.
xRot = VehicleCoordinateX + cos(RotationY) * (SquareTopleftX - VehicleCoordinateX) - sin(RotationY) * (SquareTopleftZ - VehicleCoordinateZ);
yRot = VehicleCoordinateZ + sin(RotationY) * (SquareTopleftX - VehicleCoordinateX) + cos(RotationY) * (SquareTopleftZ - VehicleCoordinateZ);
BoxPoint refers to the vector I am attempting to rotate.
Vehicle is, of course, the origin of rotation.
RotationY is the amount it has rotated.
This is the code for one of the four vectors in this square, but I assume once I get one right, the rest are just copy-paste.
No matter what I try, the point either does not move or spirals out of control, leaving the map altogether.
Here is a snippet of my object class
class Something
{
public:
    float x, y, z;
    float speed;
    float rx, ry, rz;
    float sx, sy, sz;
    float width;
    float length;
    float frameTime;
    D3DXVECTOR3 initVecDir;
    D3DXVECTOR3 currentVecDir;
    D3DXMATRIX matAllRotations;
    D3DXMATRIX matRotateX;
    D3DXMATRIX matRotateY;
    D3DXMATRIX matRotateZ;
    D3DXMATRIX matTranslate;
    D3DXMATRIX matWorld;
    D3DXMATRIX matView;
    D3DXMATRIX matProjection;
    D3DXMATRIX matWorldViewProjection;
    // These points represent a box that is used for collision with terrain.
    D3DXVECTOR3 frontLeftBoxPoint;
    D3DXVECTOR3 frontRightBoxPoint;
    D3DXVECTOR3 backLeftBoxPoint;
    D3DXVECTOR3 backRightBoxPoint;
};
I was thinking it might be possible to do this using D3DXVec3TransformCoord
D3DXMatrixTranslation(&matTranslate, origin.x,0,origin.z);
D3DXMatrixRotationY(&matRotateY, ry);
D3DXMatrixTranslation(&matTranslate2,width,0,-length);
matAllRotations = matTranslate * matRotateY * matTranslate2;
D3DXVECTOR3 newCoords;
D3DXVECTOR3 oldCoords = D3DXVECTOR3(x,y,z);
D3DXVec3TransformCoord(&newCoords, &oldCoords, &matAllRotations);
Turns out what I needed to do was:
Translate by -origin.
Rotate.
Translate by origin.
What I was doing was:
Move to origin.
Rotate.
Translate by length/width.
I thought it was the same.
D3DXMATRIX matTranslate2;
D3DXMatrixTranslation(&matTranslate,-origin.x,0,-origin.z);
D3DXMatrixRotationY(&matRotateY,ry);
D3DXMatrixTranslation(&matTranslate2,origin.x,0,origin.z);
//D3DXMatrixRotationAxis(&matRotateAxis,&origin,ry);
D3DXMATRIX matAll = matTranslate * matRotateY * matTranslate2;
D3DXVECTOR4 newCoords;
D3DXVECTOR4 oldCoords = D3DXVECTOR4(x,y,z,1);
D3DXVec4Transform(&newCoords,&oldCoords,&matAll);
//D3DXVec4TransformCoord(&newCoords, &oldCoords, &matAll);
return newCoords;
Without knowing more about your code I can't say what it does exactly; however, one 'easy' way to think about this problem, if you know the angle of your vehicle's heading in world coordinates, is to represent your points so that the center of the vehicle is at the origin, use a simple rotation matrix to rotate them around the vehicle according to the heading, and then add your vehicle's center to the resulting coordinates.
x = vehicle_center_x + cos(heading) * corner_x - sin(heading) * corner_y
y = vehicle_center_y - sin(heading) * corner_x + cos(heading) * corner_y
Keep in mind that corner_x and corner_y are expressed in coordinates relative to the vehicle -- NOT relative to the world.
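Packaged as a reusable helper, the translate-rotate-translate fix from above might look like this (a hedged sketch; the function name is made up, and the D3DX calls are the same ones the question already uses):
// Hedged sketch: rotate a point around an arbitrary origin with D3DX.
// Composes translate(-origin) * rotateY(angle) * translate(origin), which is
// the right order for D3DX's row-vector convention.
D3DXVECTOR3 RotateAroundPoint(const D3DXVECTOR3& point,
                              const D3DXVECTOR3& origin,
                              float angleY)
{
    D3DXMATRIX toOrigin, rotate, back;
    D3DXMatrixTranslation(&toOrigin, -origin.x, -origin.y, -origin.z);
    D3DXMatrixRotationY(&rotate, angleY);
    D3DXMatrixTranslation(&back, origin.x, origin.y, origin.z);

    D3DXMATRIX all = toOrigin * rotate * back;

    D3DXVECTOR3 result;
    D3DXVec3TransformCoord(&result, &point, &all); // treats point as (x, y, z, 1)
    return result;
}
Each of the four box points can then be stored as a fixed offset in vehicle space and re-derived every frame from the vehicle's position and its Y rotation.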

Why does my GLSL shader lighting move around the scene with the objects it's shining on?

I'm following a tutorial on OpenGL ES 2.0 and combining it with a tutorial on GLSL lighting that I found, using a handy Utah teapot from developer.apple.com.
After a lot of fiddling and experimentation I have the teapot drawn moderately correctly on the screen, spinning around all three axes, with the 'toon shading' from the lighting tutorial working. There are a few glitches in the geometry due to me simply drawing the whole vertex list as triangle strips (if you look in the teapot.h file there are '-1's embedded where I'm supposed to start new triangle strips, but this is only test data and not relevant to my problem).
The bit I am really confused about is how to position a light in the scene. In my Objective-C code I have a float3 vector that contains {0,1,0} and pass that into the shader to then calculate the intensity of the light.
Why does the light appear to move in the scene too? What I mean is the light acts as though it's attached to the teapot by an invisible stick, always pointing at the same side of it no matter what direction the teapot is facing.
This is the vertex shader
attribute vec4 Position;
attribute vec4 SourceColor;
attribute vec3 Normal;
uniform mat4 Projection;
uniform mat4 Modelview;
varying vec3 normal;
void main(void) {
    normal = Normal;
    gl_Position = Projection * Modelview * Position;
}
'Position' is set by the Obj-C code and is the vertices for the object, 'Normal' is the list of normals both from a vertex array (VBO), 'Projection' and 'Modelview' are calculated like this:
(A CC3GLMatrix is from the Cocos3D library, mentioned in the GLES tutorial linked above)
CC3GLMatrix *projection = [CC3GLMatrix matrix];
float h = 4.0f * self.frame.size.height / self.frame.size.width;
[projection populateFromFrustumLeft:-2 andRight:2 andBottom:-h/2 andTop:h/2 andNear:1 andFar:100];
glUniformMatrix4fv(_projectionUniform, 1, 0, projection.glMatrix);
CC3GLMatrix *modelView = [CC3GLMatrix matrix];
[modelView populateFromTranslation:CC3VectorMake(0, 0, -7)];
[modelView scaleBy:CC3VectorMake(30, 30, 30)];
_currentRotation += displayLink.duration * 90;
[modelView rotateBy:CC3VectorMake(_currentRotation, _currentRotation, _currentRotation)];
glUniformMatrix4fv(_modelViewUniform, 1, 0, modelView.glMatrix);
And I set the light in the scene by doing
float lightDir[] = {1,0,1};
glUniform3fv(_lightDirUniform, 1, lightDir);
The fragment shader looks like this
varying lowp vec4 DestinationColor; // 1
varying highp vec3 normal;
uniform highp vec3 LightDir;
void main(void) {
    highp float intensity;
    highp vec4 color;
    intensity = dot(LightDir, normal);
    if (intensity > 0.95)
        color = vec4(1.0, 0.5, 0.5, 1.0);
    else if (intensity > 0.5)
        color = vec4(0.6, 0.3, 0.3, 1.0);
    else if (intensity > 0.25)
        color = vec4(0.4, 0.2, 0.2, 1.0);
    else
        color = vec4(0.2, 0.1, 0.1, 1.0);
    gl_FragColor = color;
}
While trying to work this out I came across code that references gl_LightSource and gl_NormalMatrix (which don't exist in GLES), but I don't know what to pass into the shaders from my code as equivalents. The references to 'eye space', 'camera space', 'world space' and so on are confusing; I know I should probably be converting things between them, but I don't understand why or how (and where: in code, or in the shader?).
Do I need to modify the light source every frame? The code I have for setting it looks too simplistic. I'm not really moving the teapot around, am I; instead I'm moving the entire scene, light and all?
First of all some definitions:
world space: the space your whole world is defined in. By convention it is a static space that never moves.
view space / camera space / eye space: the space your camera is defined in. It is usually a position and rotation relative to world space.
model space: the space your model is defined in. Like camera space, it is usually a position and rotation relative to world space.
light space: the same as model space, but for the light.
In simple examples (and I guess in yours) model space and world space are the same. In addition, OpenGL by itself doesn't have a concept of world space, which doesn't mean you cannot use one. It comes in handy when you want to have more than one object moving around independently in your scene.
Now, what you are doing with your object before rendering is creating a matrix that transforms the vertices of a model into viewspace, hence 'modelViewMatrix'.
With the light in this case it's a little different. The light calculation in your shader is done in model space, so you have to transform your light position into model space every frame.
This is done by calculating something like:
_lightDirUniform = inverseMatrix(model) * inverseMatrix(light) * lightPosition;
The light position is transformed from light space into world space and then into model space. If you don't have a world space, just leave out the model space transformation and you should be fine.
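As a hedged sketch of that per-frame update in plain C++ (Mat3, Vec3, rotationXYZ, and transpose are assumed helpers here, not Cocos3D or GL API; only glUniform3fv is real):
// Hedged sketch: every frame, bring the world-space light direction into
// model space by applying the inverse (= transpose, for a pure rotation)
// of the model's rotation.
Mat3 model = rotationXYZ(currentRotation, currentRotation, currentRotation);
Mat3 invModel = transpose(model);          // inverse of a pure rotation

Vec3 lightDirWorld = { 1.0f, 0.0f, 1.0f }; // fixed direction in world space
Vec3 lightDirModel = invModel * lightDirWorld;

// Upload the per-frame, model-space light direction to the shader.
glUniform3fv(_lightDirUniform, 1, &lightDirModel.x);
The other common route, which matches the gl_NormalMatrix references you found, is to leave the light direction alone and instead transform the normals in the vertex shader by the modelview's normal matrix, so that the lighting is done consistently in eye space.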

OpenGL ES 2.0 specifying normals for vertex shader

I am not able to get the right shading on all the faces of the cube I drew. I get a smooth transition from one face to another, which does not show the edge properly.
On a different face (where I get the desired edge), I get shading that shows the two triangles which make up that face.
I believe the problem is with the normals I am specifying. I am attaching my vertex and normal arrays (they are the same) and my vertex and fragment shader code.
I guess the problem is with the normals, but I have tried almost everything with them and the effect does not change.
//normal array and vertex array are the same
static const float normals[]=
{
//v0,v1,v2,v3,
//v4,v5,v6,v7
0.5,0.5,0.5, -0.5,0.5,0.5, -0.5,-0.5,0.5, 0.5,-0.5,0.5,
0.5,0.5,-0.5, -0.5,0.5,-0.5, -0.5,-0.5,-0.5, 0.5,-0.5,-0.5
};
//vertex shader
attribute vec4 color;
attribute vec4 position;
attribute vec4 normal;
uniform mat4 u_mvpMatrix;
uniform vec4 lightDirection;
uniform vec4 lightDiffuseColor;
uniform float translate;
varying vec4 frontColor;
//varying vec4 colorVarying;
void main()
{
    vec4 normalizedNormal = normalize(u_mvpMatrix * normal);
    vec4 normalizedLightDirection = normalize(lightDirection);
    float nDotL = max(dot(normalizedNormal, normalizedLightDirection), 0.0);
    frontColor = color * nDotL * lightDiffuseColor;
    gl_Position = u_mvpMatrix * position;
}
//fragment shader
varying lowp vec4 frontColor;
void main()
{
    gl_FragColor = frontColor;
}
Please help! Thanks in advance!
Your problem is not related to your normal matrix, it is related to your input normals.
From what I read, you provide 8 normals, one for each vertex, meaning that you provide pre-averaged normals; there is no way your vertex shader can un-average them.
To have proper discontinuities in your lighting, you need discontinuous normals: for each quad, you have 4 vertices, each with a normal matching the quad normal. So you end up with 6x4 = 24 normals (and as many vertices).
You may want to take a look at this for a more detailed explanation:
http://www.songho.ca/opengl/gl_vertexarray.html
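For example, a 24-vertex cube repeats each face's normal for all four of that face's corners. A hedged C++ sketch showing two of the six faces (positions and winding are illustrative):
// Hedged sketch: per-face (discontinuous) normals for a 24-vertex cube.
// Each face repeats its own normal for all 4 of its corners; only the
// front and right faces are shown, the other four follow the same pattern.
static const float vertices24[] = {
    // front face (z = +0.5)
     0.5f,  0.5f,  0.5f,   -0.5f,  0.5f,  0.5f,
    -0.5f, -0.5f,  0.5f,    0.5f, -0.5f,  0.5f,
    // right face (x = +0.5)
     0.5f,  0.5f,  0.5f,    0.5f, -0.5f,  0.5f,
     0.5f, -0.5f, -0.5f,    0.5f,  0.5f, -0.5f,
    // ... remaining 4 faces
};

static const float normals24[] = {
    // front face: all 4 corners share the +Z normal
    0.f, 0.f, 1.f,   0.f, 0.f, 1.f,   0.f, 0.f, 1.f,   0.f, 0.f, 1.f,
    // right face: all 4 corners share the +X normal
    1.f, 0.f, 0.f,   1.f, 0.f, 0.f,   1.f, 0.f, 0.f,   1.f, 0.f, 0.f,
    // ... remaining 4 faces
};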
