How to calculate the start point and the end point of a circle based on points - math

I want to bevel the sides of a rectangle. To do so, I want to draw a circle from the start point to the end point, as shown in the image.
When drawing a complete circle, this is how I do it:
float angle = 2.0f * M_PI * i / iSegments;
// vertex data
float x, y, z ,tx,ty ,tz;
x = cos(angle) * 50.0;
y = sin(angle) * 50.0;
z = 0.0;
How do I calculate the vertices for the circle in this case?

The center of the circle is one radius away from the corner point, both in the x and the y direction. In your example the radius is 30. To draw an arc of 90° (π/2 radians), you can start at the first angle and divide π/2 into as many segments as you want. The first angle depends on which corner of the rectangle is being rounded. In the example of the upper right corner, the starting angle would be 0°.
Here is some code to illustrate the concept:
// draw a circular bevel to a rectangle, given are
// p_x, p_y: the coordinates of the corner, e.g. 50, 50
// rad: the radius of the bevel, e.g. 30
// dir_x: the direction to indicate whether the center of
// the circle lies left (dir_x=-1) or right (dir_x=+1) of the corner
// dir_y: the direction to indicate whether the center of
// the circle lies lower (dir_y=-1) or higher (dir_y=+1) than the corner
// iSegments: the number of segments used to approximate the arc
void draw_circular_bevel (float p_x, float p_y, float rad, int dir_x, int dir_y, int iSegments)
{
float c_x, c_y; // the center of the circle
float start_angle; // the angle where to start the arc
c_x = p_x + rad * dir_x;
c_y = p_y + rad * dir_y;
// the arc faces away from the center, toward the corner (y axis pointing up)
if (dir_x == 1 && dir_y == 1)        // center up-right of the corner: lower-left corner
start_angle = M_PI;
else if (dir_x == 1 && dir_y == -1)  // center down-right of the corner: upper-left corner
start_angle = M_PI * 0.5f;
else if (dir_x == -1 && dir_y == 1)  // center up-left of the corner: lower-right corner
start_angle = -M_PI * 0.5f;
else                                 // center down-left of the corner: upper-right corner
start_angle = 0.0f;
for (int i=0; i <= iSegments; ++i) {
float x, y;
float angle = start_angle + 0.5f * M_PI * i / (float)iSegments;
x = c_x + cos(angle) * rad;
y = c_y + sin(angle) * rad;
// here call code to draw a point or a segment at position x,y
}
}

Related

My raycaster renders walls in a really weird way depending on the map size I guess

I've been writing a raycaster in C++ and to render stuff I use GDI/GDI+. I know that using WGDI to render graphics is not the best idea in the world and I should probably use OpenGL, SFML, etc., but this raycaster does not involve any super-high-level real-time graphics, so in this case WGDI does the job. Besides, I will probably be showing this at my school, and installing OpenGL there would be a huge pain.
Okay, so the actual problem I wanted to talk about is that whenever I change the map grid from 8x8 to e.g. 8x16, the way some walls are rendered is pretty bizarre:
If someone can explain why this issue occurs, I would be very happy to discover what's wrong with my code.
main.cpp
/*
* Pseudo-code of the void renderer():
* Horizontal gridline check:
* Set horizontal distance to a pretty high value, horizontal coordinates to camera coordinates
* Calculate negative inverse of tangent
* Set DOF variable to 0
* If ray angle is bigger than PI calculate ray Y-coordinate to be as close as possible to the gridline position and subtract 0.0001 for precision, calculate ray X-coordinate and offset coordinates for the ray movement over the gridline
* If ray angle is smaller than PI do the same as in the ray angle > PI case but add whatever the size of the map is to ray Y-coordinate
* If ray angle is straight up or down set ray coordinates to camera coordinates and DOF to map size
* Loop only if DOF is smaller than map size:
* Calculate actual gridline coordinates
* If the grid cell at [X, Y] is a wall break out from the loop, save the current ray coordinates, calculate the distance between the camera and the wall
* Else update ray coordinates with the earlier calculated offsets
*
* Vertical gridline check:
* Set vertical distance to a pretty high value, vertical coordinates to camera coordinates
* Calculate inverse of tangent
* Set DOF variable to 0
* If ray angle is bigger than PI / 2 and smaller than 3 * PI / 2 calculate ray X-coordinate to be as close as possible to the gridline position and subtract 0.0001 for precision, calculate ray Y-coordinate and offset coordinates for the ray movement over the gridline
* If ray angle is smaller than PI / 2 or bigger than 3 * PI / 2 do the same as if ray angle > PI / 2 && < 3 * PI / 2 but add whatever the size of the map is to ray X-coordinate
* If ray angle is straight left or right set ray coordinates to camera coordinates and DOF to map size
* Loop only if DOF is smaller than map size:
* Calculate actual gridline coordinates
* If the grid cell at [X, Y] is a wall break out from the loop, save the current ray coordinates, calculate the distance between the camera and the wall
* Else update ray coordinates with the earlier calculated offsets
*
* If the vertical distance is smaller than the horizontal one update ray coordinates to the horizontal ones and set final distance to the horizontal one
* Else update ray coordinates to the vertical ones and set final distance to the vertical one
* Fix fisheye effect
* Add one radian to the ray angle
* Calculate line height by multiplying constant integer 400 by the map size and dividing that by the final distance
* Calculate line offset (to make it more centered) by subtracting half of the line height from constant integer 400
* Draw an 8-pixel-wide column at [ray index * 8, camera Z-offset + line offset] and [ray index * 8, camera Z-offset + line offset + line height] (the color doesn't matter, I think)
*/
#include "../../LIB/wsgl.hpp"
#include "res/maths.hpp"
#include <memory>
using namespace std;
const int window_x = 640, window_y = 640;
float camera_x = 256, camera_y = 256, camera_z = 75;
float camera_a = 0.001;
int camera_fov = 80;
int map_x;
int map_y;
int map_s;
shared_ptr<int[]> map_w;
void controls()
{
if(wsgl::is_key_down(wsgl::key::w))
{
int mx = (camera_x + 30 * cos(camera_a)) / map_s;
int my = (camera_y + 30 * sin(camera_a)) / map_s;
int mp = my * map_x + mx;
if(mp >= 0 && mp < map_s && !map_w[mp])
{camera_x += 15 * cos(camera_a); camera_y += 15 * sin(camera_a);}
}
if(wsgl::is_key_down(wsgl::key::s))
{
int mx = (camera_x - 30 * cos(camera_a)) / map_s;
int my = (camera_y - 30 * sin(camera_a)) / map_s;
int mp = my * map_x + mx;
if(mp >= 0 && mp < map_s && !map_w[mp])
{camera_x -= 5 * cos(camera_a); camera_y -= 5 * sin(camera_a);}
}
if(wsgl::is_key_down(wsgl::key::a_left))
{camera_a = reset_ang(camera_a - 5 * RAD);}
if(wsgl::is_key_down(wsgl::key::a_right))
{camera_a = reset_ang(camera_a + 5 * RAD);}
if(wsgl::is_key_down(wsgl::key::a_up))
{camera_z += 15;}
if(wsgl::is_key_down(wsgl::key::a_down))
{camera_z -= 15;}
}
void renderer()
{
int map_x_pos, map_y_pos, map_cell, dof;
float ray_x, ray_y, ray_a = reset_ang(camera_a - deg_to_rad(camera_fov / 2));
float x_offset, y_offset, tangent, distance_h, distance_v, h_x, h_y, v_x, v_y;
float final_distance, line_height, line_offset;
wsgl::clear_window();
for(int i = 0; i < camera_fov; i++)
{
distance_h = 1000000, h_x = camera_x, h_y = camera_y;
tangent = -1 / tan(ray_a);
dof = 0;
if(ray_a > PI)
{ray_y = (((int)camera_y / map_s) * map_s) - 0.0001; ray_x = (camera_y - ray_y) * tangent + camera_x; y_offset = -map_s; x_offset = -y_offset * tangent;}
if(ray_a < PI)
{ray_y = (((int)camera_y / map_s) * map_s) + map_s; ray_x = (camera_y - ray_y) * tangent + camera_x; y_offset = map_s; x_offset = -y_offset * tangent;}
if(ray_a == 0 || ray_a == PI)
{ray_x = camera_x; ray_y = camera_y; dof = map_s;}
for(dof; dof < map_s; dof++)
{
map_x_pos = (int)(ray_x) / map_s;
map_y_pos = (int)(ray_y) / map_s;
map_cell = map_y_pos * map_x + map_x_pos;
if(map_cell >= 0 && map_cell < map_s && map_w[map_cell])
{dof = map_s; h_x = ray_x; h_y = ray_y; distance_h = distance(camera_x, camera_y, h_x, h_y);}
else
{ray_x += x_offset; ray_y += y_offset;}
}
distance_v = 1000000, v_x = camera_x, v_y = camera_y;
tangent = -tan(ray_a);
dof = 0;
if(ray_a > PI2 && ray_a < PI3)
{ray_x = (((int)camera_x / map_s) * map_s) - 0.0001; ray_y = (camera_x - ray_x) * tangent + camera_y; x_offset = -map_s; y_offset = -x_offset * tangent;}
if(ray_a < PI2 || ray_a > PI3)
{ray_x = (((int)camera_x / map_s) * map_s) + map_s; ray_y = (camera_x - ray_x) * tangent + camera_y; x_offset = map_s; y_offset = -x_offset * tangent;}
if(ray_a == PI2 || ray_a == PI3)
{ray_x = camera_x; ray_y = camera_y; dof = map_s;}
for(dof; dof < map_s; dof++)
{
map_x_pos = (int)(ray_x) / map_s;
map_y_pos = (int)(ray_y) / map_s;
map_cell = map_y_pos * map_x + map_x_pos;
if(map_cell >= 0 && map_cell < map_s && map_w[map_cell])
{dof = map_s; v_x = ray_x; v_y = ray_y; distance_v = distance(camera_x, camera_y, v_x, v_y);}
else
{ray_x += x_offset; ray_y += y_offset;}
}
if(distance_v < distance_h)
{ray_x = v_x; ray_y = v_y; final_distance = distance_v;}
else
{ray_x = h_x; ray_y = h_y; final_distance = distance_h;}
final_distance *= cos(reset_ang(camera_a - ray_a));
ray_a = reset_ang(ray_a + RAD);
line_height = (map_s * 400) / final_distance;
line_offset = 200 - line_height / 2;
wsgl::draw_line({i * 8, camera_z + line_offset}, {i * 8, camera_z + line_offset + line_height}, {0, 255 / (final_distance / 250 + 1), 0}, 8);
if(i == camera_fov / 2)
{wsgl::draw_text({0, 0}, {255, 255, 255}, L"Final distance: " + to_wstring(final_distance) + L" Line height: " + to_wstring(line_height) + L" X: " + to_wstring(camera_x) + L" Y: " + to_wstring(camera_y));}
}
wsgl::render_frame();
}
void load_map(wsgl::wide_str wstr, int cell_size = 1)
{
shared_ptr<wsgl::bmp> map = shared_ptr<wsgl::bmp>(wsgl::bmp::FromFile(wstr.c_str(), true));
map_x = map->GetWidth();
map_y = map->GetHeight();
map_s = map_x * map_y;
map_w = shared_ptr<int[]>(new int[map_s]);
wsgl::color color;
for(int y = 0; y < map_y; y += cell_size)
{
for(int x = 0; x < map_x; x += cell_size)
{
map->GetPixel(x, y, &color);
if(color.GetR() == 255 && color.GetG() == 255 && color.GetB() == 255)
{*(map_w.get() + ((y / cell_size) * map_x + (x / cell_size))) = 0;}
else
{*(map_w.get() + ((y / cell_size) * map_x + (x / cell_size))) = 1;}
}
}
}
int main()
{
wsgl::session sess = wsgl::startup(L"raycaster", {window_x, window_y});
load_map(L"res/map.png");
while(true)
{controls(); renderer();}
}
maths.hpp
#include <cmath>
const float PI = 3.14159265359;
const float PI2 = PI / 2;
const float PI3 = 3 * PI2;
const float RAD = PI / 180;
float deg_to_rad(float deg)
{return deg * RAD;}
float distance(float ax, float ay, float bx, float by)
{
float dx = bx - ax;
float dy = by - ay;
return sqrt(dx * dx + dy * dy);
}
float reset_ang(float ang)
{
if(ang < 0)
{ang += 2 * PI;}
if(ang > 2 * PI)
{ang -= 2 * PI;}
return ang;
}
If someone asks what wsgl.hpp is: that's just my wrapper library over some WGDI routines and such.
I think the problem lies here:
map_x_pos = (int)(ray_x) / map_s;
map_y_pos = (int)(ray_y) / map_s;
map_cell = map_y_pos * map_x + map_x_pos;
You need to change the order of operations:
map_x_pos = (int)(ray_x / map_s);
map_y_pos = (int)(ray_y / map_s);
map_cell = map_y_pos * map_x + map_x_pos;
With your current implementation, you first truncate ray_x and ray_y, then divide by map_s (which should probably be a floating point value, but is an integer in your current implementation), then truncate again to integer values. Your current implementation needlessly sacrifices precision and will be unpredictable for small map_s values.
Additionally, map_s seems incorrect. You set map_s to represent the total area of your map, but in the above code, you use it like it was the side length of the map.
To be correct, you would need something like
#include <cmath>
map_x_pos = (int)(ray_x / sqrtf(map_s));
map_y_pos = (int)(ray_y / sqrtf(map_s));
map_cell = map_y_pos * map_x + map_x_pos;

Visual Bug with Quaternion LookAt Rotation

I have the following function:
Quaternion Quaternion::LookAt(Vector3f direction, Vector3f forward, Vector3f up) {
Quaternion rot1 = RotationBetweenVectors(forward, direction);
Vector3f right = Vector3f::CrossProduct(direction, up);
up = Vector3f::CrossProduct(right, direction);
Vector3f realUp(0, 1, 0);
Vector3f newUp = rot1 * realUp;
Quaternion rot2 = RotationBetweenVectors(newUp, up);
Quaternion res = rot2 * rot1;
return Quaternion(res.x, res.y, res.z, res.w);
}
And the other function:
Quaternion Quaternion::RotationBetweenVectors(Vector3f forward, Vector3f direction) {
forward = Vector3f::Normalize(forward);
direction = Vector3f::Normalize(direction);
float cosTheta = Vector3f::DotProduct(forward, direction);
Vector3f axis;
if (cosTheta < -1 + 0.001f) {
// special case when vectors in opposite directions:
// there is no "ideal" rotation axis
// So guess one; any will do as long as it's perpendicular to start
axis = Vector3f::CrossProduct(Vector3f(0.0f, 0.0f, 1.0f), forward);
if (axis.Length() * axis.Length() < 0.01)
axis = Vector3f::CrossProduct(Vector3f(1.0f, 0.0f, 0.0f), forward);
axis = Vector3f::Normalize(axis);
return Quaternion(axis.x, axis.y, axis.z, DegreesToRadians(180));
}
axis = Vector3f::CrossProduct(forward, direction);
float s = sqrt((1 + cosTheta) * 2);
float invs = 1 / s;
return Quaternion(
axis.x * invs,
axis.y * invs,
axis.z * invs,
s * 0.5f
);
}
These functions work very well, but a visual bug appears where the object is stretched out when the direction and forward vectors point in opposite directions, as noted in the comment in the second function.
I followed this tutorial: http://www.opengl-tutorial.org/intermediate-tutorials/tutorial-17-quaternions/ (under the section "How do I find the rotation between 2 vectors?"), and it suggests the code I have implemented.
As said, a visual bug appears, so is there any better way of doing it?

How can I extract the angle from a function that determines if a circle is colliding with a line?

So I have the following code, which works fine.
I've taken it from https://codereview.stackexchange.com/questions/192477/circle-line-segment-collision
I am now trying to figure out where along this function I can determine the angle at which the circle is hitting the line. Instead of returning true, I'd like it to return the angle. Since I didn't write the function myself, going through it to figure this out has been a bit of a struggle. I'm wondering if someone good at math can help me figure this out at first glance. Thank you!
// Function to check intercept of line seg and circle
// A,B end points of line segment
// C center of circle
// radius of circle
// returns true if touching or crossing else false
function doesLineInterceptCircle(A, B, C, radius) {
var dist;
const v1x = B.x - A.x;
const v1y = B.y - A.y;
const v2x = C.x - A.x;
const v2y = C.y - A.y;
// get the unit distance along the line of the closest point to
// circle center
const u = (v2x * v1x + v2y * v1y) / (v1y * v1y + v1x * v1x);
// if the point is on the line segment get the distance squared
// from that point to the circle center
if(u >= 0 && u <= 1){
dist = (A.x + v1x * u - C.x) ** 2 + (A.y + v1y * u - C.y) ** 2;
} else {
// if closest point not on the line segment
// use the unit distance to determine which end is closest
// and get dist square to circle
dist = u < 0 ?
(A.x - C.x) ** 2 + (A.y - C.y) ** 2 :
(B.x - C.x) ** 2 + (B.y - C.y) ** 2;
}
return dist < radius * radius;
}

Get the X,Y coordinate at the edge of a rhombus, based on the rhombus' width and angle

I'm developing a Processing sketch that, given a certain angle, draws a dot at the edge of a rhombus.
I know the width of the rhombus, and its position, but I'm not sure how to calculate the x-y coordinates of a dot resting at its edge.
Are there any elegant solutions for this problem? Any help in pseudocode would be welcomed.
Let the square's side length be A and the half-length H = A/2. The angle is Theta and the intersection point is P.
All coordinates are relative to the square center.
Rotate the square by -Pi/4, so the angle becomes Alpha = Theta - Pi/4:
if Alpha lies in range -Pi/4..Pi/4, then intersection point P' = (H, H*Tan(Alpha))
if Alpha lies in range Pi/4..3*Pi/4, then P' = (H*Cotangent(Alpha), H)
if Alpha lies in range 3*Pi/4..5*Pi/4, then P' = (-H, -H*Tan(Alpha))
if Alpha lies in range 5*Pi/4..7*Pi/4, then P' = (-H*Cotangent(Alpha), -H)
Then rotate point P' back by Pi/4:
S = Sqrt(2)/2
P.X = S * (P'.X - P'.Y)
P.Y = S * (P'.X + P'.Y)
Example (data like your sketch):
A = 200, Theta = 5*Pi/12
H = 200/2 = 100, Alpha = Theta - Pi/4 = Pi/6
P'.X = H = 100
P'.Y = H * Tan(Alpha) = 100 * Tan(Pi/6) ~= 57.7
S = 0.707
P.X = 0.707 * (100 - 57.7) = 30
P.Y = 0.707 * (100 + 57.7) = 111
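For reference, here is a minimal C++ sketch of the steps above (the function name and the y-up, center-relative convention are my own assumptions, not from the original answer):
#include <cmath>

struct Point { float x, y; };

// Rotate into the axis-aligned frame, intersect with the square of
// half-side H, then rotate the point back by Pi/4.
// 'a' is the square side length, 'theta' the ray angle from the center (y up).
Point rhombusEdgePoint(float a, float theta)
{
    const float PI = 3.14159265f;
    float h = a * 0.5f;
    float alpha = theta - PI / 4.0f;
    // wrap alpha into [-Pi/4, 7*Pi/4) so the range tests below apply
    while (alpha < -PI / 4.0f) alpha += 2.0f * PI;
    while (alpha >= 7.0f * PI / 4.0f) alpha -= 2.0f * PI;
    float px, py; // intersection in the rotated (axis-aligned) frame
    if (alpha < PI / 4.0f)             { px =  h;                    py =  h * std::tan(alpha); }
    else if (alpha < 3.0f * PI / 4.0f) { px =  h / std::tan(alpha);  py =  h; }
    else if (alpha < 5.0f * PI / 4.0f) { px = -h;                    py = -h * std::tan(alpha); }
    else                               { px = -h / std::tan(alpha);  py = -h; }
    // rotate back by +Pi/4
    const float s = std::sqrt(2.0f) * 0.5f;
    return { s * (px - py), s * (px + py) };
}
With A = 200 and Theta = 5*Pi/12, this returns roughly (30, 111), matching the numbers above.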
Based on your image, you want to find the intersection of two equations, that of the line at angle θ, and that of the side of the square with which it intersects.
Assuming the size of your square is n, the equation of the square is y=±(n*(√2/2))±x (by Pythagoras' theorem). The equation for the side you intersect in your image is y=n*(√2/2)-x.
The equation of the radial line can be calculated using trigonometry to be y=tan(θ)*x, with θ expressed in radians.
You can then solve this as a simultaneous equation to determine the intersection. Please note that it will intersect with both sides of the square (both above and below), so if you only want the one you will have to choose the equation for the correct side of the square. Also guard against the case where θ is π/2, as tan(π/2) is undefined. You can easily work out that case, as x=0 and so it will always intersect at y=±(n*(√2/2)).
In your example, the intersection occurs when x*(1+tan(θ)) = n*(√2/2), or x = (n*(√2/2))/(1+tan(θ)). You can calculate that, plug it back into y, and that is your (x,y) intersection.
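As a rough sketch of that calculation (the function name is illustrative; it only handles the upper-right side y = n*(√2/2) - x and the θ = π/2 special case mentioned above):
#include <cmath>

// Solve y = tan(theta) * x and y = n * (sqrt(2)/2) - x simultaneously.
// n is the side length of the square, theta the angle of the radial line.
void intersectUpperRightSide(float n, float theta, float &x, float &y)
{
    const float halfDiag = n * std::sqrt(2.0f) / 2.0f;
    if (std::fabs(std::cos(theta)) < 1e-6f) {
        // theta is (close to) pi/2: tan is undefined, the line is x = 0
        x = 0.0f;
        y = halfDiag;
        return;
    }
    x = halfDiag / (1.0f + std::tan(theta));
    y = std::tan(theta) * x;
}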
Imagine a circle with a larger radius that will intersect your rhombus at the points you want. One way to draw at that location is to use a nested coordinate system that you translate and rotate. All you need to know is the radius and the angle.
Here's a very basic example:
float angle = radians(-80.31);
float radius = 128;
float centerX,centerY;
void setup(){
size(320,320);
noFill();
rectMode(CENTER);
centerX = width * 0.5;
centerY = height * 0.5;
}
void draw(){
background(255);
noFill();
//small circle
strokeWeight(1);
stroke(95,105,120);
ellipse(centerX,centerY,210,210);
rhombus(centerX,centerY,210);
//large circle
strokeWeight(3);
stroke(95,105,120);
ellipse(centerX,centerY,radius * 2,radius * 2);
//line at angle
pushMatrix();
translate(centerX,centerY);
rotate(angle);
stroke(162,42,32);
line(0,0,radius,0);
popMatrix();
//debug
fill(0);
text("angle: " + degrees(angle),10,15);
}
void rhombus(float x,float y,float size){
pushMatrix();
translate(x,y);
rotate(radians(45));
rect(0,0,size,size);
popMatrix();
}
void mouseDragged(){
angle = atan2(centerY-mouseY,centerX-mouseX)+PI;
}
You can try a demo here (you can drag the mouse to change the angle):
var angle;
var radius = 128;
var centerX,centerY;
function setup(){
createCanvas(320,320);
noFill();
rectMode(CENTER);
angle = radians(-80.31);
centerX = width * 0.5;
centerY = height * 0.5;
}
function draw(){
background(255);
noFill();
//small circle
strokeWeight(1);
stroke(95,105,120);
ellipse(centerX,centerY,210,210);
rhombus(centerX,centerY,210);
//large circle
strokeWeight(3);
stroke(95,105,120);
ellipse(centerX,centerY,radius * 2,radius * 2);
//line at angle
push();
translate(centerX,centerY);
rotate(angle);
stroke(162,42,32);
line(0,0,radius,0);
pop();
//debug
fill(0);
noStroke();
text("angle: " + degrees(angle),10,15);
}
function rhombus(x,y,size){
push();
translate(x,y);
rotate(radians(45));
rect(0,0,size,size);
pop();
}
function mouseDragged(){
angle = atan2(centerY-mouseY,centerX-mouseX)+PI;
}
<script src="https://cdnjs.cloudflare.com/ajax/libs/p5.js/0.5.0/p5.min.js"></script>
If you want to calculate the position, you can use the polar to cartesian coordinate conversion formula:
x = cos(angle) * radius
y = sin(angle) * radius
Here's an example using that. Note that drawing is done from the centre, therefore the centre coordinates are added to the above:
float angle = radians(-80.31);
float radius = 128;
float centerX,centerY;
void setup(){
size(320,320);
noFill();
rectMode(CENTER);
centerX = width * 0.5;
centerY = height * 0.5;
}
void draw(){
background(255);
noFill();
//small circle
strokeWeight(1);
stroke(95,105,120);
ellipse(centerX,centerY,210,210);
rhombus(centerX,centerY,210);
//large circle
strokeWeight(3);
stroke(95,105,120);
ellipse(centerX,centerY,radius * 2,radius * 2);
//line at angle
float x = centerX+(cos(angle) * radius);
float y = centerY+(sin(angle) * radius);
stroke(162,42,32);
line(centerX,centerY,x,y);
//debug
fill(0);
text("angle: " + degrees(angle),10,15);
}
void rhombus(float x,float y,float size){
pushMatrix();
translate(x,y);
rotate(radians(45));
rect(0,0,size,size);
popMatrix();
}
void mouseDragged(){
angle = atan2(centerY-mouseY,centerX-mouseX)+PI;
}
Another option would be to use transformation matrices.

correct glsl affine texture mapping

I'm trying to code correct 2D affine texture mapping in GLSL.
Explanation:
...NONE of these images is correct for my purposes. The right one (labeled Correct) has perspective correction, which I do not want. So this: Getting to know the Q texture coordinate solution (without further improvements) is not what I'm looking for.
I'd like to simply "stretch" the texture inside the quadrilateral, something like this:
but composed from two triangles. Any advice (GLSL), please?
This works well as long as you have a trapezoid, and its parallel edges are aligned with one of the local axes. I recommend playing around with my Unity package.
GLSL:
varying vec2 shiftedPosition, width_height;
#ifdef VERTEX
void main() {
gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
shiftedPosition = gl_MultiTexCoord0.xy; // left and bottom edges zeroed.
width_height = gl_MultiTexCoord1.xy;
}
#endif
#ifdef FRAGMENT
uniform sampler2D _MainTex;
void main() {
gl_FragColor = texture2D(_MainTex, shiftedPosition / width_height);
}
#endif
C#:
// Zero out the left and bottom edges,
// leaving a right trapezoid with two sides on the axes and a vertex at the origin.
var shiftedPositions = new Vector2[] {
Vector2.zero,
new Vector2(0, vertices[1].y - vertices[0].y),
new Vector2(vertices[2].x - vertices[1].x, vertices[2].y - vertices[3].y),
new Vector2(vertices[3].x - vertices[0].x, 0)
};
mesh.uv = shiftedPositions;
var widths_heights = new Vector2[4];
widths_heights[0].x = widths_heights[3].x = shiftedPositions[3].x;
widths_heights[1].x = widths_heights[2].x = shiftedPositions[2].x;
widths_heights[0].y = widths_heights[1].y = shiftedPositions[1].y;
widths_heights[2].y = widths_heights[3].y = shiftedPositions[2].y;
mesh.uv2 = widths_heights;
I recently managed to come up with a generic solution to this problem for any type of quadrilateral. The calculations and GLSL may be of help. There's a working demo in Java (that runs on Android), but it is compact and readable and should be easily portable to Unity or iOS: http://www.bitlush.com/posts/arbitrary-quadrilaterals-in-opengl-es-2-0
In case anyone's still interested, here's a C# implementation that takes a quad defined by the clockwise screen verts (x0,y0) (x1,y1) ... (x3,y3), an arbitrary pixel at (x,y) and calculates the u and v of that pixel. It was originally written to CPU-render an arbitrary quad to a texture, but it's easy enough to split the algorithm across CPU, Vertex and Pixel shaders; I've commented accordingly in the code.
float Ax, Bx, Cx, Dx, Ay, By, Cy, Dy, A, B, C;
//These are all uniforms for a given quad. Calculate on CPU.
Ax = (x3 - x0) - (x2 - x1);
Bx = (x0 - x1);
Cx = (x2 - x1);
Dx = x1;
Ay = (y3 - y0) - (y2 - y1);
By = (y0 - y1);
Cy = (y2 - y1);
Dy = y1;
float ByCx_plus_AyDx_minus_BxCy_minus_AxDy = (By * Cx) + (Ay * Dx) - (Bx * Cy) - (Ax * Dy);
float ByDx_minus_BxDy = (By * Dx) - (Bx * Dy);
A = (Ay*Cx)-(Ax*Cy);
//These must be calculated per-vertex, and passed through as interpolated values to the pixel-shader
B = (Ax * y) + ByCx_plus_AyDx_minus_BxCy_minus_AxDy - (Ay * x);
C = (Bx * y) + ByDx_minus_BxDy - (By * x);
//These must be calculated per-pixel using the interpolated B, C and x from the vertex shader along with some of the other uniforms.
u = ((-B) - Mathf.Sqrt((B*B-(4.0f*A*C))))/(A*2.0f);
v = (x - (u * Cx) - Dx)/((u*Ax)+Bx);
Tessellation solves this problem. Subdividing the quad adds vertices that give the interpolator more hints for the in-between pixels (see the sketch after the link below).
Check out this link.
https://www.youtube.com/watch?v=8TleepxIORU&feature=youtu.be
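A minimal sketch of that idea in C++ (the vertex layout and function name are my own assumptions): subdivide the quad into an N x N grid and give each grid vertex bilinearly blended positions and UVs, so plain per-triangle interpolation approximates the smooth stretch.
#include <vector>

struct V2 { float x, y; };
struct Vertex { V2 pos; V2 uv; };

// p[0..3] are the quad corners in bottom-left, bottom-right, top-right, top-left order.
std::vector<Vertex> tessellateQuad(const V2 p[4], int n)
{
    std::vector<Vertex> grid;
    for (int j = 0; j <= n; ++j) {
        for (int i = 0; i <= n; ++i) {
            float u = (float)i / n;
            float v = (float)j / n;
            // bilinear blend of the four corners
            V2 bottom = { p[0].x + (p[1].x - p[0].x) * u, p[0].y + (p[1].y - p[0].y) * u };
            V2 top    = { p[3].x + (p[2].x - p[3].x) * u, p[3].y + (p[2].y - p[3].y) * u };
            V2 pos    = { bottom.x + (top.x - bottom.x) * v, bottom.y + (top.y - bottom.y) * v };
            grid.push_back({ pos, { u, v } });
        }
    }
    // build triangles from neighbouring grid cells (i, j), (i+1, j), (i+1, j+1), (i, j+1)
    return grid;
}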
I had a similar question (https://gamedev.stackexchange.com/questions/174857/mapping-a-texture-to-a-2d-quadrilateral/174871), and at gamedev they suggested using an imaginary Z coordinate, which I calculate using the following C code; it appears to work in the general case (not just trapezoids):
//usual euclidean distance
float distance(int ax, int ay, int bx, int by) {
int x = ax-bx;
int y = ay-by;
return sqrtf((float)(x*x + y*y));
}
void gfx_quad(gfx_t *dst //destination texture, we are rendering into
,gfx_t *src //source texture
,int *quad // quadrilateral vertices
)
{
int *v = quad; //quad vertices
float z = 20.0;
float top = distance(v[0],v[1],v[2],v[3]); //top
float bot = distance(v[4],v[5],v[6],v[7]); //bottom
float lft = distance(v[0],v[1],v[4],v[5]); //left
float rgt = distance(v[2],v[3],v[6],v[7]); //right
// By default all vertices lie on the screen plane
float az = 1.0;
float bz = 1.0;
float cz = 1.0;
float dz = 1.0;
// Move Z away from the screen plane based on the edge length ratios.
if (top<bot) {
az *= top/bot;
bz *= top/bot;
} else {
cz *= bot/top;
dz *= bot/top;
}
if (lft<rgt) {
az *= lft/rgt;
cz *= lft/rgt;
} else {
bz *= rgt/lft;
dz *= rgt/lft;
}
// draw our quad as two textured triangles
gfx_textured(dst, src
, v[0],v[1],az, v[2],v[3],bz, v[4],v[5],cz
, 0.0,0.0, 1.0,0.0, 0.0,1.0);
gfx_textured(dst, src
, v[2],v[3],bz, v[4],v[5],cz, v[6],v[7],dz
, 1.0,0.0, 0.0,1.0, 1.0,1.0);
}
I'm doing it in software to scale and rotate 2D sprites; for an OpenGL 3D app you will need to do it in the pixel/fragment shader, unless you can map these imaginary az, bz, cz, dz into your actual 3D space and use the usual pipeline. DMGregory gave exact code for OpenGL shaders: https://gamedev.stackexchange.com/questions/148082/how-can-i-fix-zig-zagging-uv-mapping-artifacts-on-a-generated-mesh-that-tapers
I ran into this issue while trying to implement homography warping in OpenGL. Some of the solutions that I found relied on a notion of depth, but this was not feasible in my case since I am working on 2D coordinates.
I based my solution on this article, and it seems to work for all cases that I could try. I am leaving it here in case it is useful for someone else, as I could not find something similar. The solution makes the following assumptions:
The vertex coordinates are the 4 points of a quad in Lower Right, Upper Right, Upper Left, Lower Left order.
The coordinates are given in OpenGL's reference system (range [-1, 1], with origin at bottom left corner).
std::vector<cv::Point2f> points;
// Convert points to homogeneous coordinates to simplify the problem.
Eigen::Vector3f p0(points[0].x, points[0].y, 1);
Eigen::Vector3f p1(points[1].x, points[1].y, 1);
Eigen::Vector3f p2(points[2].x, points[2].y, 1);
Eigen::Vector3f p3(points[3].x, points[3].y, 1);
// Compute the intersection point between the lines described by opposite vertices using cross products. Normalization is only required at the end.
// See https://leimao.github.io/blog/2D-Line-Mathematics-Homogeneous-Coordinates/ for a quick summary of this approach.
auto line1 = p2.cross(p0);
auto line2 = p3.cross(p1);
auto intersection = line1.cross(line2);
intersection = intersection / intersection(2);
// Compute distance to each point.
for (const auto &pt : points) {
auto distance = std::sqrt(std::pow(pt.x - intersection(0), 2) +
std::pow(pt.y - intersection(1), 2));
distances.push_back(distance);
}
// Assumes same order as above.
std::vector<cv::Point2f> texture_coords_unnormalized = {
{1.0f, 1.0f},
{1.0f, 0.0f},
{0.0f, 0.0f},
{0.0f, 1.0f}
};
std::vector<float> texture_coords;
for (int i = 0; i < texture_coords_unnormalized.size(); ++i) {
float u_i = texture_coords_unnormalized[i].x;
float v_i = texture_coords_unnormalized[i].y;
float d_i = distances.at(i);
float d_i_2 = distances.at((i + 2) % 4);
float scale = (d_i + d_i_2) / d_i_2;
texture_coords.push_back(u_i*scale);
texture_coords.push_back(v_i*scale);
texture_coords.push_back(scale);
}
Pass the texture coordinates to your shader (use vec3). Then:
gl_FragColor = vec4(texture2D(textureSampler, textureCoords.xy/textureCoords.z).rgb, 1.0);
Thanks for the answers, but after experimenting I found a solution.
The two triangles on the left have UVs (strq) assigned according to this, and the two triangles on the right are a modified version of this perspective correction.
Numbers and shader:
tri1 = [Vec2(-0.5, -1), Vec2(0.5, -1), Vec2(1, 1)]
tri2 = [Vec2(-0.5, -1), Vec2(1, 1), Vec2(-1, 1)]
d1 = length of top edge = 2
d2 = length of bottom edge = 1
tri1_uv = [Vec4(0, 0, 0, d2 / d1), Vec4(d2 / d1, 0, 0, d2 / d1), Vec4(1, 1, 0, 1)]
tri2_uv = [Vec4(0, 0, 0, d2 / d1), Vec4(1, 1, 0, 1), Vec4(0, 1, 0, 1)]
Only the right triangles are rendered using this GLSL shader (the left ones use the fixed pipeline):
void main()
{
gl_FragColor = texture2D(colormap, vec2(gl_TexCoord[0].x / gl_TexCoord[0].w, gl_TexCoord[0].y));
}
So only U is perspective-corrected and V is linear.
