Generating a 3D prism from any 2D polygon

I am creating a 2D sprite game in Unity, which is a 3D game development environment.
I have constrained all translation of objects to the XY-plane and rotation to the Z-axis.
My problem is that the meshes used to detect collisions between objects must still be 3D. I need to detect collisions between the player object (a capsule collider) and a sprite (whose collision volume is defined by a polygonal prism).
I am currently writing the level editor, and I need to let the user define the collision area for any given tile. In the image below, the user clicks the points P1, P2, P3, P4 in that order.
Obviously the points join up to form a quadrilateral. This is the collision area I want; however, I must then convert that to a 3D mesh. Basically I need to generate an extrusion of the polygon, then assign the vertex winding, triangles, etc. The vertex positions are not a problem to figure out, as they are merely a translation of the polygon down the Z-axis.
I am having trouble creating an algorithm for assigning the winding order of the vertices, especially since the mesh must consist only of triangles.
Obviously the structure I have illustrated is not important; the polygon may be any 2D shape and will always need to form a prism.
Does anyone know any methods for this?
Thank you all very much for your time.

A simple algorithm that comes to mind is something like this:
extrudedNormal = faceNormal.multiplyScale(sizeOfExtrusion); // scale the face normal by the extrusion amount = move along the normal
for each (vertex in face) {
    vPrime = vertex.clone();        // copy the position of each vertex to a new object
    vPrime.addSelf(extrudedNormal); // translate the copy along the normal by the extrusion amount
}
So the idea is basic:
- clone the face normal and move it in the same direction by the amount you want to extrude by
- clone the face vertices and move them using the moved (extruded) normal position
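A minimal Unity C# sketch of those two steps (assuming the polygon lies in the XY-plane, so the face normal is Vector3.forward; the method name is illustrative):

Vector3[] ExtrudeVertices(Vector3[] faceVertices, float sizeOfExtrusion)
{
    // scale the face normal by the extrusion amount
    Vector3 extrudedNormal = Vector3.forward * sizeOfExtrusion;
    Vector3[] extruded = new Vector3[faceVertices.Length];
    for (int i = 0; i < faceVertices.Length; i++)
    {
        // a translated copy of each original vertex
        extruded[i] = faceVertices[i] + extrudedNormal;
    }
    return extruded;
}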
For a more complete, feature-rich example, refer to the Procedural Modeling Unity samples. They include a nice mesh extrusion sample too (see ExtrudedMeshTrail.js, which uses MeshExtrusion.cs).
Good luck!

To create the extruded walls:
For each vertex a (with coordinates ax, ay) in your polygon:
- call the next vertex 'b' (with coordinates bx, by)
- create the extruded rectangle corresponding to the line from 'a' to 'b':
- The rectangle has vertices (ax,ay,z0), (ax,ay,z1), (bx,by,z0), (bx,by,z1)
- This rectangle can be created from two triangles:
- (ax,ay,z0), (ax,ay,z1), (bx,by,z0) and (ax,ay,z1), (bx,by,z0), (bx,by,z1)
If you want to create a triangle strip instead, it's even simpler: for each vertex a, just add (ax,ay,z0) and (ax,ay,z1). Whichever vertex you processed first will also need to be processed again after looping over all the other vertices, so the strip closes the loop.
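Here's a hedged C# sketch of that loop for an n-vertex polygon, assuming the mesh vertex array stores the original ring (z0) at indices [0, n) and the extruded ring (z1) at indices [n, 2n); you may need to flip the winding depending on which side the faces should be visible from:

int[] BuildWallTriangles(int n)
{
    int[] tris = new int[n * 6]; // two triangles (6 indices) per polygon edge
    for (int a = 0; a < n; a++)
    {
        int b = (a + 1) % n; // next vertex, wrapping around to close the prism
        int t = a * 6;
        // triangle (a,z0), (a,z1), (b,z0)
        tris[t] = a; tris[t + 1] = a + n; tris[t + 2] = b;
        // triangle (a,z1), (b,z1), (b,z0) -- same winding as the first
        tris[t + 3] = a + n; tris[t + 4] = b + n; tris[t + 5] = b;
    }
    return tris;
}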
To create the end-caps:
This step is probably unnecessary for collision purposes, but one simple technique is described here: http://www.siggraph.org/education/materials/HyperGraph/scanline/outprims/polygon1.htm
Each resulting triangle should be added at depth z0 and z1.
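If the polygon happens to be convex, a simple triangle fan is enough for each cap; concave shapes need a proper triangulation such as the scanline method linked above or ear clipping. A sketch under that convexity assumption, reusing the ring layout from the wall sketch (ringStart is 0 for the z0 cap and n for the z1 cap; flip one cap so both face outward):

int[] BuildCapTriangles(int n, int ringStart, bool flip)
{
    int[] tris = new int[(n - 2) * 3]; // a fan over n vertices has n-2 triangles
    for (int i = 1; i < n - 1; i++)
    {
        int t = (i - 1) * 3;
        tris[t] = ringStart; // every fan triangle starts at the first ring vertex
        tris[t + 1] = ringStart + (flip ? i + 1 : i);
        tris[t + 2] = ringStart + (flip ? i : i + 1);
    }
    return tris;
}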

Related

How can I apply different normal map textures to different faces of Minecraft-like cubic terrain blocks in Unity?

I'm making a procedurally generated Minecraft-like voxel terrain in Unity. Mesh generation and albedo channel texturing are flawless; however, I need to apply different normal map textures to different cube faces depending on whether they neighbor another cube or not. A Material accepts only a single normal map file and doesn't provide sprite-sheet-editor-style functionality for normal maps, so I have no idea how to use selected slices of a normal map file the way I can with albedo textures. I couldn't find any related resources about the problem. Any help will be appreciated. Thanks...
First of all, I'm not an expert in this area, though I am going to try to help you based on my limited and incomplete understanding of parts of Unity.
If there is a finite number of possible "face normal maps", I suggest something like what you indicated (a "sprite sheet"): create a single texture (also sometimes called a texture atlas) that contains all of these normal maps.
The next step, which I'm not sure the Standard material shader can handle for your situation, is to generate UV/texture coordinates for the normal map and pass them along with your vertex XYZ positions to the shader. The UV coordinates need to be specified for each vertex of each face; they are a 2-D (U, V) offset into your atlas of normal maps, given as floating-point values in the range [0.0, 1.0] that map to the full X and Y extent of the actual normal texture. For instance, if you had an atlas with a grid of textures in 4 rows and 4 columns, a face that should use the tile at the UV origin would have UV coords of [(0,0), (0.25,0), (0.25,0.25), (0, 0.25)].
The difficulty here may depend on whether you are already using UV coordinates for other texture mapping (e.g. for the albedo or anything else). If so, I think the Unity Standard Shader permits two sets of texture coordinates, and if you need more, you might have to roll your own shader or find a shader asset elsewhere that allows for more UV sets. This is where my understanding gets shaky: I'm not exactly sure how the shader uses the two UV sets, or whether there is an existing convention for how they are used. The standard shader supports secondary/detail maps, which may mean you have to share the UV0 set among all non-detail maps (albedo, normal, height, occlusion, etc.).
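As a rough illustration of the atlas addressing (the method name and the 4x4 layout are assumptions for the example, not Unity API), the UVs for one quad face pointing at one tile might be computed like this:

Vector2[] FaceUVs(int tileX, int tileY, int tilesPerRow)
{
    float s = 1f / tilesPerRow; // size of one tile in UV space
    float u = tileX * s;        // lower-left corner of the chosen tile
    float v = tileY * s;
    return new Vector2[]
    {
        new Vector2(u, v),
        new Vector2(u + s, v),
        new Vector2(u + s, v + s),
        new Vector2(u, v + s)
    };
}

You would assign the returned coordinates to the face's four vertices in mesh.uv (or a secondary UV channel), matching the vertex order of the face.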

Find the distance between two points on any 3D surface

I am making a game in Unity3d and I need a pathfinding algorithm that can guide enemies towards the player on a 3D surface. The problem is that the 3D surface can take any shape, so it can be a sphere, a cube, a torus, and many more shapes.
I tried using A*, but for its heuristic I need the distance between two points, and since the object is curved I cannot get that so easily. I found that you can use the haversine formula if it's a sphere, but that won't work on a torus or an arbitrary 3D shape.
I want this kind of result except with every kind of object:
https://www.youtube.com/watch?v=hvunNq7yVcU
Is there a way/algorithm that I can use to get that result? I know there is something called a navmesh, but I need to program it myself, and I cannot find how navmeshes approach this dilemma. I am going to use the triangles of my object as nodes.
So my question boils down to:
Does anyone know an algorithm for pathfinding that works on any 3D surface?
Thanks in advance.
I think your problem is that you are not using a graph. I would suggest that you look into a tutorial on how to create a graph, for the language you are using if you can (this may also help; there they are using edges to connect their nodes, which is needed if you have more than one weight). If you do make a graph you will need a node class. Each node must contain pointers to any nodes it is connected to and an ID of some kind. In your case that is probably all you need, but it is also possible to assign a weight to each move if you also have an edge class (connectors between nodes), which would be used to connect the nodes. If you do have an edge class, your nodes will have pointers to edges instead of other nodes, and each edge will have a weight and a pointer to one or two nodes (depending on whether the path is directed or not). You can also make a graph class to contain all of your nodes and edges.
Summary:
Make a node class and determine whether you need the edge class (if everything has a weight of 1 you can get away without it). Use the node class to create a graph representing your map, with each tile being a node holding pointers to the connected tiles. Then use A* or Dijkstra's algorithm to search your graph and find the shortest path.
Note: most examples you will find will be for 2D graphs; yours is no different except that there are no bounds on it. You just need to connect the nodes to their adjacent tiles.
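A minimal Unity C# sketch of the node class described above, using mesh triangles as nodes as the question suggests (all names are illustrative):

using System.Collections.Generic;
using UnityEngine;

class Node
{
    public int Id;                                  // e.g. the triangle's index in the mesh
    public Vector3 Center;                          // triangle centroid; handy as the A* heuristic position
    public List<Node> Neighbors = new List<Node>(); // triangles that share an edge with this one
}

Build one Node per triangle, connect nodes whose triangles share an edge, and then run Dijkstra's algorithm (uniform weights) or A* with Vector3.Distance(current.Center, goal.Center) as the heuristic.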

Vector projection in game development

Where would you use vector projection in game development? I know that it projects one vector onto another, but I don't know where I would use that.
Regards
Here are a few examples:
Vector projection is common in computer graphics, which many games depend on.
In 3D games, during the rendering process the renderer has access to the 3D coordinates of every vertex of every mesh in the game world. These vertices need to be mapped onto a 2D rectangle that's the same shape as your screen. A matrix, fittingly called the projection matrix, does this.
Sometimes projection matrices are used to make objects cast shadows onto the surfaces of other objects.
Or suppose you're making a homing missile with a 60-degree field of view. You could say that the missile sees the world through a circular screen, and it loses track of its target if its target goes off the screen. You could use a projection matrix to map the 3D position of the target onto the homing missile's screen, and then decide whether the missile can see the target.
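Since the question asked about plain vector projection (the dot-product kind) rather than projection matrices, here is a small sketch of that too; a common gameplay use is sliding along a wall by keeping only the component of the velocity that lies along the wall direction:

Vector3 Project(Vector3 a, Vector3 b)
{
    // proj_b(a) = (a . b / b . b) * b -- the component of a along b
    return Vector3.Dot(a, b) / Vector3.Dot(b, b) * b;
}

Unity ships this as Vector3.Project(vector, onNormal), so in practice you rarely write it yourself.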

Mapping points in 2D space to a sphere

I have a bunch of points in a rectangular x/y space which I would like to project onto a sphere. As in, I am trying to write this function:
function point_on_sphere(x2d:Number, y2d:Number) : Vector3D
{
    //magic
    return new Vector3D(x3d, y3d, z3d);
}
I have been trying to first plot the points onto a cylinder and then map those points to a sphere, as described by this Wikipedia page. However, those formulas assume a constant z=0, which doesn't really do what I want.
I'm using actionscript 3 / flex, but any pseudo code or pushes in the right direction would be greatly appreciated.
Just to clarify: I'm not trying to apply a texture to a sphere object, but rather to place objects along an imaginary sphere.
There is no one right answer. You can choose different approaches based on how you want to place the objects along the sphere.
Is it OK for the objects to get nearer and nearer to each other as you get closer to the sphere's "poles"? Why wouldn't the normal texture-mapping projection actually work for you?
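For reference, the usual latitude/longitude ("normal texture-mapping") projection looks roughly like this; the sketch is Unity C#, but the math ports directly to ActionScript. Note that it does bunch points together near the poles:

Vector3 PointOnSphere(float x, float y, float width, float height, float radius)
{
    float theta = x / width * 2f * Mathf.PI; // longitude, 0..2*pi
    float phi = y / height * Mathf.PI;       // latitude, 0..pi
    return new Vector3(
        radius * Mathf.Sin(phi) * Mathf.Cos(theta),
        radius * Mathf.Cos(phi),
        radius * Mathf.Sin(phi) * Mathf.Sin(theta));
}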

How can I create a parallel polyline without self intersections?

The algorithm for creating a polyline parallel to an existing polyline is simple: calculate the normal of each vertex (as the average of its segments' normals) and displace each vertex along its normal by whatever amount you want.
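In code, that naive version looks roughly like this (a Unity C# sketch for an open polyline; names are illustrative):

Vector2 SegmentNormal(Vector2 a, Vector2 b)
{
    Vector2 d = (b - a).normalized; // zero for degenerate (coincident) vertices
    return new Vector2(-d.y, d.x);  // left-hand normal of the segment
}

Vector2[] OffsetPolyline(Vector2[] pts, float distance)
{
    Vector2[] result = new Vector2[pts.Length];
    for (int i = 0; i < pts.Length; i++)
    {
        Vector2 n = Vector2.zero;
        if (i > 0) n += SegmentNormal(pts[i - 1], pts[i]);              // segment before the vertex
        if (i < pts.Length - 1) n += SegmentNormal(pts[i], pts[i + 1]); // segment after it
        result[i] = pts[i] + n.normalized * distance;                   // displace along the averaged normal
    }
    return result;
}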
However, there's a graphical problem when I use this algorithm on a curved polyline, that is, a succession of points forming an arc. When I create the parallel to the arc, everything is fine until I increase the distance enough that the vertices projected through their normals produce a polyline where advancing from one vertex to the next actually moves in the reverse direction, creating a self-intersection.
How can I remove such vertices from the parallel polyline efficiently? I've thought of comparing the direction of the segments: if the generated segments are not parallel, it means I've reached a point where the parallel polyline intersects itself. However, this doesn't work very well for small segments (a curved polyline will generate even smaller segments) or for polylines which originally have degenerate vertices (one vertex equal to the next one).
A parallel polyline is known in graphics circles as an offset polyline. It looks like one method for generating offset polylines without degenerate geometry artifacts is to use straight skeleton algorithms.
I've also found an interesting paper on the subject called An offset algorithm for polyline curves.
