How can you get the index (counting from top-left) of an item on a grid given the (x,y) location?
NUM_COLS*x + y;
Where NUM_COLS is the number of columns in the grid, x is the zero-based row index, and y is the zero-based column index. (If x is your horizontal coordinate and y your vertical one, swap them: NUM_COLS*y + x.)
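A minimal sketch of the formula in Python; the grid width and the (row, col) convention are illustrative:

```python
NUM_COLS = 4  # hypothetical: grid is 4 columns wide

def grid_index(x, y, num_cols=NUM_COLS):
    """Row-major index of cell (x, y), counting from the top-left,
    where x is the zero-based row and y the zero-based column."""
    return num_cols * x + y
```

On a 4-column grid this gives 0 for the top-left cell, 4 for the first cell of the second row, and so on.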
I have a 2D array in the form [[x1,y1],[x2,y2],...,[x1000,y1000]]. I need to divide the XY plane into 100 (10*10) bins and then count the number of elements in each bin. How can I do that? Also, I need to remove all empty bins.
Thank you for your help.
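One way to sketch this in Python, assuming the bin edges are taken from the data's own min/max range; a Counter only ever records bins that receive a point, which takes care of the empty-bin removal:

```python
from collections import Counter

def bin_counts(points, nbins=10):
    """Count points per bin on an nbins x nbins grid spanning the data.
    Assumes at least two distinct x values and two distinct y values.
    Empty bins never appear in the Counter, so no removal step is needed."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    xmin, xmax = min(xs), max(xs)
    ymin, ymax = min(ys), max(ys)
    counts = Counter()
    for x, y in points:
        # clamp so the maximum value lands in the last bin, not one past it
        i = min(int((x - xmin) / (xmax - xmin) * nbins), nbins - 1)
        j = min(int((y - ymin) / (ymax - ymin) * nbins), nbins - 1)
        counts[(i, j)] += 1
    return counts
```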
I'm trying to iterate through a matrix called XY with 50 rows and 100 columns (divided into 50 pairs of X and Y values descending alongside each other) with a for loop:
for (i in 1:50) {
  slope = atan(((XY[i+1, 2] - XY[i, 2]) / (XY[i+1, 1] - XY[i, 1])) * 100)
}
So as you can see at the top (XY[i+1,2]-XY[i,2]), I'm trying to take the i-th y value, subtract it from the next one, and iterate through the entire list for each consecutive pair of descending values, then divide that by the corresponding x increment to get the slope, and convert that into an angle using atan(()*100).
Unfortunately it keeps telling me that XY[i+1,2] is "out of bounds" and I'm pretty sure I have equal brackets on each side of the equation.
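The likely cause is an off-by-one: with 50 rows, the last valid pair is rows 49 and 50, so at i = 50 the expression XY[i+1, 2] asks for row 51, which doesn't exist. In R, loop `for (i in 1:(nrow(XY) - 1))` and store each result as `slope[i]` (otherwise it is overwritten every pass). The same iteration pattern sketched in Python, with a list of (x, y) pairs standing in for the matrix (the * 100 scaling from the question is kept as-is):

```python
import math

def segment_angles(xy):
    """atan(dy/dx * 100) for each consecutive pair of (x, y) points.
    Only len(xy) - 1 segments exist, hence the loop stops one short of
    the end -- iterating over every index would read past the last row,
    which is exactly the "out of bounds" error in the R version."""
    angles = []
    for i in range(len(xy) - 1):
        dx = xy[i + 1][0] - xy[i][0]
        dy = xy[i + 1][1] - xy[i][1]
        angles.append(math.atan(dy / dx * 100))
    return angles
```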
I have a point coordinate based on two double values x0,y0 both in this format: xx.x (point as decimal separator)
In a database I have a list of lines, each defined by the coordinates x1,y1 as startpoint and x2,y2 as endpoint. Among other columns (such as line thickness and so on), the table has these:
id | x1 | y1 | x2 | y2
What I would need is a query that returns whichever line has either the startpoint (x1,y1) or the endpoint (x2,y2) nearest to my basepoint (x0,y0). So, the line that starts or ends nearest to my current position.
Thanks
SQLite has no square root function, but for comparing distances, we can just as well use the square of the distance:
SELECT *
FROM MyTable
ORDER BY min((x1-x0)*(x1-x0) + (y1-y0)*(y1-y0),
(x2-x0)*(x2-x0) + (y2-y0)*(y2-y0))
LIMIT 1
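As a quick check, the query can be run from Python's sqlite3 module against an in-memory database; the table contents and the basepoint below are made-up values, and `?1`/`?2` are SQLite's numbered parameter placeholders:

```python
import sqlite3

# Hypothetical data: line 1 runs (0,0)-(5,5), line 2 runs (9,9)-(3,1).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE MyTable (id INTEGER, x1 REAL, y1 REAL,"
             " x2 REAL, y2 REAL)")
conn.executemany("INSERT INTO MyTable VALUES (?, ?, ?, ?, ?)",
                 [(1, 0.0, 0.0, 5.0, 5.0),
                  (2, 9.0, 9.0, 3.0, 1.0)])

x0, y0 = 2.5, 0.5  # basepoint; line 2's endpoint (3,1) is the closest
nearest = conn.execute("""
    SELECT * FROM MyTable
    ORDER BY min((x1-?1)*(x1-?1) + (y1-?2)*(y1-?2),
                 (x2-?1)*(x2-?1) + (y2-?2)*(y2-?2))
    LIMIT 1""", (x0, y0)).fetchone()
```

Since the squared distance is monotonic in the true distance, the ordering (and therefore the chosen row) is the same as with a real square root.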
I need to find what column and row the mouse location is in. To simplify this question, let's only find the column. I will write in pseudocode.
I have a map (a grid of rows and columns, made up of square cells) with a pixel width. I have a cell size which makes up each column's pixel width.
E.g. map.width / cellSize = map.NumberOfColumns.
From this we can get what column the mouse is on.
E.g. if ( mouse.X > cellSize ) { col is definitely > 1 } (I have not used zero indexing in this example).
So if anyone here loves maths, I would very much appreciate some help. Thanks.
Assuming square cells, 1-based row/col indexing, and truncating integer division:
col = mouse.X / cellSize + 1;
row = mouse.Y / cellSize + 1;
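A minimal sketch of that answer in Python, with made-up pixel values; `//` truncates, which matches the intended integer division here since pixel coordinates are non-negative:

```python
def cell_at(mouse_x, mouse_y, cell_size):
    """1-based (col, row) of the square cell containing the mouse."""
    return mouse_x // cell_size + 1, mouse_y // cell_size + 1
```

For example, with a cell size of 32 pixels, a mouse at x = 70 sits in column 3 (pixels 64-95).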
I'm defining a flat wall by a center position (cx,cy,cz), a normal (nx,ny,nz), a vector pointing in the up-direction of the wall (ux,uy,uz), and its width and length (w,l). How do I find the positions of its 4 vertices?
I'll assume that by length you mean height. First, make sure that your up and normal vectors are normalized. Multiply the up vector by half the height, then add and subtract the result from the center to get the midpoints of the top and bottom edges, A and B respectively.
Then take the cross product of the up vector with the normal vector to get the right vector (or the left vector, depending on the order of the operands; for unit, perpendicular inputs the result is already unit length). Multiply the right vector by half the width to get an offset vector R.
Finally, the four corners of the quad are A + R, A - R, B + R, and B - R.
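Those steps can be sketched in Python as follows; the function name and argument order are illustrative, and the normal and up vectors are assumed unit-length and mutually perpendicular:

```python
def wall_corners(center, normal, up, w, h):
    """Corners of a wall quad of width w and height h centered at `center`.
    Assumes `normal` and `up` are unit length and perpendicular."""
    cx, cy, cz = center
    nx, ny, nz = normal
    ux, uy, uz = up
    # right = up x normal (already unit length for unit, perpendicular inputs)
    rx, ry, rz = uy * nz - uz * ny, uz * nx - ux * nz, ux * ny - uy * nx
    hx, hy, hz = ux * h / 2, uy * h / 2, uz * h / 2  # half-height offset
    wx, wy, wz = rx * w / 2, ry * w / 2, rz * w / 2  # half-width offset
    return [
        (cx + hx + wx, cy + hy + wy, cz + hz + wz),  # top, right of center
        (cx + hx - wx, cy + hy - wy, cz + hz - wz),  # top, left of center
        (cx - hx + wx, cy - hy + wy, cz - hz + wz),  # bottom, right
        (cx - hx - wx, cy - hy - wy, cz - hz - wz),  # bottom, left
    ]
```

For a wall in the XY plane (normal along +Z, up along +Y) of width 2 and height 4 centered at the origin, this yields the four corners (±1, ±2, 0).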