I wonder if someone knows a source or a way to draw a Gothic pointed arch with CSS or canvas.
Edit:
My attempt was to tweak the canvas quadraticCurveTo method to fit the Gothic arc, but I failed badly and didn't have the guts to post it here :)
var canvas = document.getElementById('arch');
var context = canvas.getContext('2d');
context.beginPath();
context.moveTo(188, 150);
context.quadraticCurveTo(288, -100, 388, 150);
context.lineWidth = 20;
context.strokeStyle = 'black';
context.stroke();
http://jsfiddle.net/zumgugger/ZaqJ5/
Just because this is quirky and interesting....
Some geometry for a Gothic pointed arch can be found here:
Perhaps the most recognizable feature of Gothic architecture is the pointed arch. The basic Gothic arch is equilateral in construction and forms the basis of many variants.
The construction of the equilateral arch is thus:
From the drawing, the compass is set to the span, a-b. With x-y as the springing line, the compass is positioned at the junction of a-x/y and a curve from x/y to q is drawn as shown. The procedure is repeated with the compass placed at the junction of b-x/y, with the point at which the curves join forming the rise p-q. By drawing straight lines from a-x/y to q and from b-x/y to q, it can be shown that the resulting triangle is equilateral in construction, with all angles being 60°.
http://www.stonecarvingcourses.com/the-geometry-of-gothic-architecture
I've put together a small fiddle that does this. http://jsfiddle.net/7c7Vc/1/
If my understanding is correct, that means (and since I am not a mathematician I'll describe this in layman's terms) that you need to draw two arcs with the compass centered on points x and y respectively, each drawn from the opposite point on the x-y line to the intersection point q, using the distance between points x and y as the radius of your compass.
In the example I use the arc method to do this; here is a sample that will draw the right-hand arc of the arch:
ctx.arc(0, archHeight, archWidth, 0, 1.5*Math.PI + _30degrees, true);
Explanation
We center the compass on point x:
ctx.arc(0, archHeight, archWidth, 0, 1.5*Math.PI + _30degrees, true);
        -------------
Set the radius of our circle to be the width of the arch (the distance between point x and point y)
ctx.arc(0, archHeight, archWidth, 0, 1.5*Math.PI + _30degrees, true);
                       ---------
Start drawing from the 3 o'clock direction (which happens to be 0 radians)
ctx.arc(0, archHeight, archWidth, 0, 1.5*Math.PI + _30degrees, true);
                                  --
Draw the arc until we hit point q, which in terms of the arc we are drawing is 30 degrees short of the 12 o'clock direction (1.5*Math.PI in radians); the local variable _30degrees holds 30 degrees expressed in radians.
ctx.arc(0, archHeight, archWidth, 0, 1.5*Math.PI + _30degrees, true);
                                     ------------------------
And we want to draw this arc counter-clockwise
ctx.arc(0, archHeight, archWidth, 0, 1.5*Math.PI + _30degrees, true);
                                                               ----
The reverse is done for the other arc making up the arch; take a look at the example for this.
Notes on the code:
It uses some patterns to set up a factory that will create your arch based on either height or width. The returned arch knows how to draw itself on a canvas and has been given its calculated height and width by the factory. If you prefer not to use this pattern, you can extract the calculation bits and simplify it, as in the stripped-down sketch below.
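For reference, a minimal flat version of the same construction without the factory (the variable names leftX, baseY and span are my own, not taken from the fiddle):
var canvas = document.getElementById('arch');  // assumes a <canvas id="arch"> element
var ctx = canvas.getContext('2d');

var leftX = 100, baseY = 250;   // left springing point (point x)
var span = 200;                 // distance between the springing points x and y
var rightX = leftX + span;      // right springing point (point y)
var _30degrees = Math.PI / 6;

ctx.beginPath();
// Right-hand arc: compass on the left springing point, radius = span,
// drawn counter-clockwise from 3 o'clock (the right springing point)
// up to 30 degrees short of 12 o'clock (the apex q).
ctx.arc(leftX, baseY, span, 0, 1.5 * Math.PI + _30degrees, true);
// Left-hand arc: compass on the right springing point, continuing
// counter-clockwise from the apex down to 9 o'clock (the left springing point).
ctx.arc(rightX, baseY, span, 1.5 * Math.PI - _30degrees, Math.PI, true);
ctx.lineWidth = 5;
ctx.strokeStyle = 'black';
ctx.stroke();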
I need to be able to unproject a screen pixel into object space using Vulkan, but somewhere my math is going wrong.
Here is the shader as it stands today for reference:
void main()
{
    // the depth of this pixel is between 0 and 1
    vec4 obj_space = vec4(float(gl_FragCoord.x) / ubo.screen_width, float(gl_FragCoord.y) / ubo.screen_height, gl_FragCoord.z, 1.0f);
    // this puts us in the normalized device coordinate range [-1, 1]
    obj_space.xy = (obj_space.xy * 2.0f) - 1.0f;
    // these two lines put us in object space coordinates
    // mvp_inverse is derived from this on the C++ side:
    // glm::inverse(app.three_d_camera->get_projection_matrix() * app.three_d_camera->view_matrix * model);
    obj_space = ubo.mvp_inverse * obj_space;
    obj_space.xyz /= obj_space.w;
    // the resulting position here is wrong
    out_color = obj_space;
}
When I output the position as a color, the colors are off. I know I can simply pass the object space position from the vertex shader to the fragment shader, but I'd like to understand why my math is not working; it will help me understand Vulkan and maybe teach me a little math as well.
Thanks!
I'm not entirely sure what your problem is, but let's go over the potential problems.
Remember, Vulkan clip space is:
positive y = down,
positive x = right,
positive z = out,
centered at the middle of the screen.
Additionally, despite OpenGL's GLSL docs saying it has its origin at the bottom-left corner, in Vulkan gl_FragCoord has its origin at the top-left corner.
In this step:
obj_space.xy = ( obj_space.xy * 2.0f ) -1.0f;
obj_space is now:
left x: -1.0
right x: 1.0
top y: -1.0
bottom y: 1.0
out z: 1.0
back z: 0.0
I'm almost entirely sure you don't mean for your object space to have Y negative at the top. The reason y increases from top to bottom is to match images and textures, which are ordered that way on the CPU and are now ordered the same way in Vulkan.
Some other notes:
You claim your inverse is derived from glm::inverse here:
glm::inverse(app.three_d_camera->get_projection_matrix() * app.three_d_camera->view_matrix * model);
But GLM uses OpenGL conventions for matrix dimensions and handedness, and unless you force it to the correct coordinate system, it is going to assume right-handed, positive Y up, negative Z out. You'll need to include the following #defines before it works correctly (or change your calculations to accommodate this):
#define GLM_FORCE_DEPTH_ZERO_TO_ONE
#define GLM_FORCE_LEFT_HANDED
Additionally you'll need to modify your matrices to account for the negative Y direction. Here is an example of how I've handled this in the past (modifying the perspective matrix directly):
ubo.model = glm::translate(glm::mat4(1.0f), glm::vec3(pos_x,pos_y,pos_z));
ubo.model *= glm::rotate(glm::mat4(1.0f), time * glm::radians(0.0f), glm::vec3(0.0f, 0.0f, 1.0f));
ubo.view = glm::lookAt(glm::vec3(0.0f, 0.0f, -10.0f), glm::vec3(0.0f, 0.0f, 0.0f), glm::vec3(0.0f, 1.0f, 0.0f));
ubo.proj = glm::perspective(glm::radians(45.0f), swapChainExtent.width / (float) swapChainExtent.height, 0.1f, 100.0f);
ubo.proj[1][1] *= -1; // flips the projected Y axis to match Vulkan's
THREE.js Noob here.
I have a mesh that I want to rotate by selecting on one of its faces. Basically, I want to click on a face, and apply rotations to the mesh so that the face I clicked on faces the plane that the mesh is currently sitting on.
Here is a visualization of my problem:
I want to click on a face (the yellow triangle) and rotate the mesh so that the yellow triangle faces the plane that the mesh is currently sitting on. I have the normal vector of the face (i.e., myVector) and I want to apply rotations so that the normal vector equals targetVector afterwards.
I would like to find out how much I would have to rotate the mesh in EACH axis separately in order to achieve my goal.
Thank you in advance and please ask me if you require any more information!
You'll need to use a THREE.Quaternion, apply the vectors, and then read the resulting rotations through a THREE.Euler:
// Set starting and ending vectors
var myVector = new THREE.Vector3(0.1, 1.0, 0.1);
var targetVector = new THREE.Vector3(0, 0, -1);
// Normalize vectors to make sure they have a length of 1
myVector.normalize();
targetVector.normalize();
// Create a quaternion, and apply starting, then ending vectors
var quaternion = new THREE.Quaternion();
quaternion.setFromUnitVectors(myVector, targetVector);
// Quaternion now has rotation data within it.
// We'll need to get it out with a THREE.Euler()
var euler = new THREE.Euler();
euler.setFromQuaternion(quaternion);
console.log(euler.toArray());
// Resulting euler will have x, y, z rotations in radians:
//[
// 0: -1.6704649792860586,
// 1: 0.09917726107940236,
// 2: 0.10956980436233299,
// 3: "XYZ"
//]
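If all you need is to rotate the mesh itself (rather than inspect the per-axis angles), a short sketch, assuming your mesh variable is called mesh:
// Apply the computed rotation directly to the mesh
mesh.applyQuaternion(quaternion);

// Or, if you want to set the per-axis rotations explicitly from the euler above:
// mesh.rotation.set(euler.x, euler.y, euler.z);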
Following up from my original post Three.JS Object following a spline path - rotation / tangent issues & constant speed issue, I am still having the issue that the object flips at certain points along the path.
View this happening on this fiddle: http://jsfiddle.net/jayfield1979/T2t59/7/
function moveBox() {
    if (counter <= 1) {
        box.position.x = spline.getPointAt(counter).x;
        box.position.y = spline.getPointAt(counter).y;
        tangent = spline.getTangentAt(counter).normalize();
        axis.cross(up, tangent).normalize();
        var radians = Math.acos(up.dot(tangent));
        box.quaternion.setFromAxisAngle(axis, radians);
        counter += 0.005;
    } else {
        counter = 0;
    }
}
The above code is what moves my objects along the defined spline path (an oval in this instance). It was mentioned by @WestLangley that: "Warning: cross product is not well-defined if the two vectors are parallel."
As you can see, from the shape of the path, I am going to encounter a number of parallel vectors. Is there anything I can do to prevent this flipping from happening?
To answer the "why" question in the title: the reason it's happening is that at some points on the curve the up vector (1,0,0) and the tangent are parallel. This means their cross product is zero and the construction of the quaternion fails.
You could follow WestLangley's suggestion: you really want the up direction to be the normal of the plane the track lies in.
Quaternion rotation is tricky to understand; the setFromAxisAngle function rotates around the given axis by a given angle.
If the track lies in the X-Y plane then we will want to rotate around the Z-axis. To find the angle, use Math.atan2 on the tangent:
var angle = Math.atan2(tangent.y,tangent.x);
Putting this together, set
var ZZ = new THREE.Vector3( 0, 0, 1 );
and
tangent = spline.getTangentAt(counter).normalize();
var angle = Math.atan2(tangent.y,tangent.x);
box.quaternion.setFromAxisAngle(ZZ, angle);
If the track leaves the X-Y plane things will get trickier.
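Putting those pieces back into the original loop, a sketch of a revised moveBox (assuming the same box, spline and counter variables from the fiddle) could look like this:
var ZZ = new THREE.Vector3(0, 0, 1); // normal of the X-Y plane the track lies in

function moveBox() {
    if (counter <= 1) {
        box.position.x = spline.getPointAt(counter).x;
        box.position.y = spline.getPointAt(counter).y;
        // Angle of the tangent within the X-Y plane
        var tangent = spline.getTangentAt(counter).normalize();
        var angle = Math.atan2(tangent.y, tangent.x);
        // Rotate around the plane normal; no cross product is needed, so
        // parallel up/tangent vectors can no longer break the quaternion
        box.quaternion.setFromAxisAngle(ZZ, angle);
        counter += 0.005;
    } else {
        counter = 0;
    }
}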
The setup
Draw X-Y coordinate axes on a piece of paper. Write a word on it along the X-axis, so that the word's center point is at the origin (half on the positive side of X/Y, the other half on the negative side of X/Y).
Now, if you flip the paper upside down you'll notice that the word is mirrored in relation to both the X- and Y-axes. If you look at the paper from behind, it's mirrored in relation to the Y-axis. If you look at it from behind and upside down, it's mirrored in relation to the X-axis.
OK, I have points in a 2D plane (vertices) that are created in a similar way at the origin, and I need to apply exactly the same rule to them. To make things interesting:
The 2D plane is actually 3D, each point (vertex) being (x, y, 0). Initially the vertices are positioned at the origin and their normal is Pn(0,0,1), i.e. they are seen correctly when looked at from point Pn towards the origin.
The vertex-plane has its own rotation matrix [Rp] and position P(x,y,z) in the 3D world. The rotation is applied before positioning.
The 3D world is "right-handed". The viewer would be looking towards the origin from some distance along the positive Z-axis, but the world is also oriented by a rotation matrix [Rw]. [Rw] * (0,0,1) would point directly at the viewer's eye.
From those I need to calculate when the vertex-plane should be mirrored and about which axis. The mirroring itself can be done before applying [Rp] and P by:
Vertices vertices = Get2DPlanePoints();
int MirrorX = 1; // -1 to mirror, 1 NOT to mirror
int MirrorY = 1; // -1 to mirror, 1 NOT to mirror
Matrix WorldRotation = GetWorldRotationMatrix();
MirrorX = GetMirrorXFactor(WorldRotation);
MirrorY = GetMirrorYFactor(WorldRotation);

foreach(Vertex v in vertices)
{
    v.X = v.X * MirrorX * MirrorY;
    v.Y = v.Y * MirrorY;
}
// Apply rotation...
// Add position...
The question
So I need GetMirrorXFactor() and GetMirrorYFactor() functions that return -1 if the viewer's eye point is at an "X/Y" angle greater than ±90 degrees relative to the vertex-plane's normal after the rotation and world orientation have been applied. I have already solved this, but I'm looking for more elegant mathematics. I know that rotation matrices somehow contain information about how much is rotated about which axis, and I believe that can be utilized here.
My Solution for MirrorX:
// Matrix multiplications. Vectors are column matrices here.
Pnr = [Rp] * Pn        // Rotated vertex-plane normal
Pur = [Rp] * (0,1,0)   // Rotated vertex-plane "up" vector
Wnr = [Rw] * (0,0,1)   // Rotated eye vector with the world's orientation
                       // = vector pointing directly at the viewer's eye

// Use the rotated up vector as the normal of a new plane and project the
// viewer's eye onto it. dot = dot product between vectors.
Wnrx = Wnr - (Wnr dot Pur) * Pur   // "X-projected" eye

// Calculate the angle between the eye's X component and the plane's rotated
// normal. ||V|| = V's norm.
angle = arccos( (Wnrx dot Pnr) / ( ||Wnrx|| * ||Pnr|| ) )

if (angle > PI / 2)
    MirrorX = -1; // DO mirror
else
    MirrorX = 1;  // DON'T mirror
The solution for MirrorY can be done in a similar way using the viewer's up vector and the vertex-plane's right vector.
Better solution?
if (([Rp]*(1,0,0)) dot ([Rw]*(1,0,0))) < 0
    MirrorX = -1; // DO mirror
else
    MirrorX = 1;  // DON'T mirror

if (([Rp]*(0,1,0)) dot ([Rw]*(0,1,0))) < 0
    MirrorY = -1; // DO mirror
else
    MirrorY = 1;  // DON'T mirror
Explaining in more detail is difficult without diagrams, but if you have trouble with this solution we can work through some cases.
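As a concrete illustration of the dot-product test, here is a sketch in JavaScript rather than the question's C#-style pseudocode, with the rotation matrices passed in as plain 3x3 row-major arrays (note these helpers take both matrices, unlike the single-argument GetMirrorXFactor in the question):
// Multiply a 3x3 row-major matrix by a column vector [x, y, z]
function mulMat3Vec3(m, v) {
    return [
        m[0][0] * v[0] + m[0][1] * v[1] + m[0][2] * v[2],
        m[1][0] * v[0] + m[1][1] * v[1] + m[1][2] * v[2],
        m[2][0] * v[0] + m[2][1] * v[1] + m[2][2] * v[2]
    ];
}

function dot(a, b) {
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
}

// Mirror X when the plane's rotated X axis points away from the world's rotated X axis
function getMirrorXFactor(Rp, Rw) {
    return dot(mulMat3Vec3(Rp, [1, 0, 0]), mulMat3Vec3(Rw, [1, 0, 0])) < 0 ? -1 : 1;
}

// Mirror Y when the plane's rotated Y axis points away from the world's rotated Y axis
function getMirrorYFactor(Rp, Rw) {
    return dot(mulMat3Vec3(Rp, [0, 1, 0]), mulMat3Vec3(Rw, [0, 1, 0])) < 0 ? -1 : 1;
}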
I want to know a way to flip an angle horizontally without having to do many operations. Say I have an angle of 0 ("pointing right" in my code's coordinate system); the flipped angle should be 180 (pointing left). If it's 90 (pointing up), flipped it should still be 90; 89 becomes 91, and so on.
I can operate on the X/Y speeds implied by the angle but that would slow things down, and I feel it's not the proper way to go.
I don't know much math, so I might be calling things by the wrong names... Can anyone help?
EDIT: Sorry I took so long, I had to be away from the computer for a while. OK...
http://img215.imageshack.us/img215/8095/screenshot031v.jpg
This screenshot might do. The structure above is two satellites and a beam linked to the white dot in the center. The two satellites should inherit the angle of the white dot (it's visible for debug purposes), so if it's aiming at an angle, they will follow. The satellite on the left is mirrored, so I calculated it with 180 - angle as suggested, although that was my first try as well. As you can see it is not mirrored but flipped, and when the white dot rotates, it rotates backwards. The other one does fine.
This is the angle recalculation for something linked to something else; pid is the parent and id the current object. pin.ang is the angle offset copied when the object is linked to another, so it keeps its position when rotated:
if (object[id].mirror)
    object[id].angle = 180 - (object[id].pin.ang + object[pid].angle);
else
    object[id].angle = object[id].pin.ang + object[pid].angle;
And this is the specific rotation part (OpenGL). The offx/offy values are for things rotated off-center, like the beam about to come out there; everything else renders correctly.
glTranslatef(list[index[i]].x, list[index[i]].y, 0);
glRotatef(list[index[i]].angle, 0.0, 0.0, 1.0);
glTranslatef(list[index[i]].offx, -list[index[i]].offy, 0);
The rotation also seems to go wrong when a rotation speed (an integer added to the current angle on every redraw, positive for rotating clockwise) is applied, like in this next one:
http://img216.imageshack.us/img216/7/screenshot032ulr.jpg
So it's definitely not 180 - angle, despite how obvious that would be. The mirroring is done by just reversing the texture coordinates, so it doesn't affect the angle. I'm afraid it might be a quirk in the GL rotation.
The reflected angle (just looking at the maths) would be (180 - angle):
Angle | Reflection
------+-----------
    0 |       180
   90 |        90
   89 |        91
   91 |        89
  360 |      -180
  270 |       -90
Note the negatives if you fall below the "horizontal plane" - which you could leave as they are, or handle as a special case.
Isn't it simply
result = 180-(your angle)
As already explained, you find the opposite angle by subtracting your angle from 180 degrees. Eg:
180 - yourangle
Directly manipulating the X/Y speeds would not be very cumbersome. You simply reverse the direction of the X speed by multiplying it by minus one, for example: speedx = (-1) * speedx. This changes the left-right direction, e.g. something moving to the left would start moving to the right, and vice versa, while the vertical speed is unaffected.
If you're using sine/cosine (sin/cos) to recalculate your X/Y speed components, then the *(-1) method would probably be more efficient. Ultimately it depends on the context of your program. If you're looking for a better solution, update your question with more details.
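For reference, a small sketch of the 180 - angle approach in degrees, with a normalization step so the result stays in the 0-360 range (the function name is my own):
// Flip an angle horizontally: 0 -> 180, 90 -> 90, 89 -> 91, 270 -> 270, ...
function flipAngleHorizontally(angle) {
    return ((180 - angle) % 360 + 360) % 360;
}

// Or, working on the speed components instead:
// speedx = -speedx;  // reverses left/right, leaves the vertical speed unchanged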
This solution is for -Y oriented angles (like a watch)! For +X orientation (like school math) you need to swap X and Y.
public static float FlipAngleX(float angle)
{
    angle = NormalizeAngle(angle);
    angle = TwoPi - angle;
    return angle;
}

public static float FlipAngleY(float angle)
{
    angle = NormalizeAngle(angle);
    if (angle < Pi)
    {
        angle = Pi - angle;
    }
    else
    {
        angle = TwoPi - angle + Pi;
    }
    return angle;
}

/// <summary>
/// Keeps angle between 0 and two pi
/// </summary>
public static float NormalizeAngle(float angle)
{
    if (angle < 0)
    {
        int backRevolutions = (int)(-angle / TwoPi);
        return angle + TwoPi * (backRevolutions + 1);
    }
    else
    {
        return angle % TwoPi;
    }
}
Aah, it seems the problem came from negative numbers after all. I made sure they stay positive and now the rotation works fine; I don't even need to recalculate the angle...
Thanks to everyone, I ended up figuring it out from bits of every response.
To flip counter-clockwise to clockwise (270 on the right -> 90 on the right):
angle - 360
--
To flip vertically (180 on top -> 0/360 on top):
Math.Normalize(angle - 180)
--
Both:
float flipped_vertical = angle - 360;
float flipped_vertical_and_horizontal = Math.Normalize(flipped_vertical - 180);
Just 360 - angle will flip your angle horizontally but not vertically.