I am making a 2D interface in Qt5 and QML (Qt Quick 2.0) and I need to draw a gauge (e.g. fuel or the like) that has different colors along it. For instance, it may be green for most of the arc, then become yellow, and finally red.
I have accomplished this by implementing a custom class derived from QQuickItem which creates a QSGGeometryNode and specifies geometry with vertex colors. The problem is that the edges are rough (aliased). FSAA seems overkill, so I want to texture my arc with a gradient that fades out at the edges to get a smooth look.
In order to get a custom material, I started with the simplematerial example:
http://qt-project.org/doc/qt-5.0/qtquick/scenegraph-simplematerial-simplematerial-cpp.html
This has texture coordinates, but no actual texturing (brilliant!). None of the "helper" material classes seem to support both vertex colors and texturing, and the QSGMaterial docs say nothing about texturing. How can I texture my vertex-colored geometry?
Related
I'm trying to apply a normal map to QDiffuseSpecularMaterial or QMetalRoughMaterial. I use QTextureImage to load the textures. When I try to apply the normal map, the material just becomes black; however, I don't have any issues with the other maps (baseColor, ambientOcclusion, metalness and roughness).
What I've tried without success:
Changed the direction of the vertex normals, though the vertex normals are correct
Swapped the RGB channels in the normal map texture, in all combinations; also tried a grayscale texture
Mapped the values of the texture loaded in QPaintedTextureImage from the (0, 255) range to (0, 1)
Thought that normal maps maybe don't work with QPointLight, so I also added a QDirectionalLight to the scene
I have a low-poly model (about 4 colors) that uses a texture atlas .png as the texture for its material. I am using Blender 2.81 and exporting as glTF.
[screenshot: Blender shaders]
When I import it in A-Frame, it comes in as untextured and white.
[screenshot: A-Frame]
I can add a BSDF shader and the texture applies properly, but now, due to the BSDF, the model picks up lighting and all of the faces are shaded different colors.
[screenshot: BSDF shader model]
In Unity and Godot you can set things to be "unlit" to keep the flat colors.
I tried the solution here:
AFrame: how to use flat shading on a mesh
But it results in colors not quite the same as the texture atlas.
[screenshot: with "unlit"]
[screenshot: the colors in my color palette/atlas]
Also, there's a three.js example that baked in lighting and textures. This soft shading is what I'm trying to replicate.
https://threejs.org/examples/#webgl_animation_keyframes
Is it possible in A-Frame to get the true unlit color from the texture?
Found an OK solution: set the scene renderer's colorManagement to true and use the unlit script from the linked SO answer.
[screenshot: Flat chi]
https://aframe.io/docs/1.0.0/components/renderer.html
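For reference, the renderer setting is declared as an attribute on the scene. A minimal sketch, where flat-shading stands in for the unlit component from the linked answer (a hypothetical name, not a built-in A-Frame component):

```html
<!-- colorManagement keeps the flat/unlit material's colors true to the atlas -->
<a-scene renderer="colorManagement: true">
  <a-assets>
    <a-asset-item id="model" src="model.gltf"></a-asset-item>
  </a-assets>
  <!-- flat-shading: the custom component from the linked SO answer, which
       swaps the glTF materials for unlit THREE.MeshBasicMaterial -->
  <a-entity gltf-model="#model" flat-shading></a-entity>
</a-scene>
```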
I'm currently rendering a 3D model (Wavefront .obj format) in my Qt program. Right now, I'm rendering the model using Scene3D in QML, and I'm able to get it to display in the viewing area. What I would like to do is have a user click on the model and generate a 2D cross section of the slice that I would like to plot on a different window. I'm quite new to 3D rendering, and a lot of Qt documentation isn't very descriptive. I've been reading Qt documentation, experimenting, and searching online with no luck. How can I create 2D slices of a 3D object Model in Qt 3D, preferably in QML? What Qt libraries or classes can I use to achieve this?
Unfortunately, the fact that models are stored as a set of surfaces makes this hard. Qt probably doesn't have a built-in method for this.
Consider, for example, that a model made of faces might be missing a face. What now? Can you interpolate across that gap consistently from different angles? What about the fact that a cross-section probably won't contain any vertices?
But, of course, it can be solved. First, don't allow un-closed surfaces (meshes with holes). Second, to find the vertices of your cross-section, intersect every edge in your model with the cutting plane; each intersection gives you a point. Third, to find the edges, look at the list of vertices: any two that come from edges of the same polygon in the mesh should be connected by an edge in the cross-section. To find which direction the edge should go, project the normal of that polygon onto the plane you're using. For filling, I don't really know what to do. I guess that's whatever you want it to be.
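The second step above (edge-plane intersection) boils down to a few lines of math. A standalone sketch in plain Java, not Qt code:

```java
public class CrossSection {
    // A simple 3D point with linear interpolation between two points.
    record Vec3(double x, double y, double z) {
        Vec3 lerp(Vec3 o, double t) {
            return new Vec3(x + t * (o.x - x), y + t * (o.y - y), z + t * (o.z - z));
        }
    }

    // Intersect the edge (a, b) with the plane n·p = d.
    // Returns the crossing point, or null if the edge doesn't cross the plane.
    static Vec3 intersectEdge(Vec3 a, Vec3 b, Vec3 n, double d) {
        double da = n.x * a.x + n.y * a.y + n.z * a.z - d; // signed distance of a
        double db = n.x * b.x + n.y * b.y + n.z * b.z - d; // signed distance of b
        if (da == 0 && db == 0) return null; // edge lies in the plane
        if (da * db > 0) return null;        // both endpoints on the same side
        double t = da / (da - db);           // parameter of the crossing point
        return a.lerp(b, t);
    }

    public static void main(String[] args) {
        // Plane z = 0.5 (normal (0,0,1), offset 0.5) cutting a vertical edge.
        Vec3 p = intersectEdge(new Vec3(0, 0, 0), new Vec3(0, 0, 1),
                               new Vec3(0, 0, 1), 0.5);
        System.out.println(p); // crossing point at z = 0.5
    }
}
```

Running this over every edge of every face in the mesh yields the vertex set of the slice; pairing up points that came from the same polygon then gives the cross-section's edges, as described above.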
I am currently working on a JavaFX 3D application and came across the getNormals() method in the TriangleMesh class.
The TriangleMesh class is used to create a user-defined JavaFX 3D object: getPoints() is used to add points, getFaces() to add faces, and getTexCoords() to manage the texture of the 3D object. But I am not sure what the use of the getNormals() method in the TriangleMesh class is.
In the TriangleMesh class, we can set the vertex format to VertexFormat.POINT_TEXCOORD or VertexFormat.POINT_NORMAL_TEXCOORD. But if we set the vertex format to VertexFormat.POINT_NORMAL_TEXCOORD, then we need to add the indices of the normals into the faces, like below: [
p0, n0, t0, p1, n1, t1, p3, n3, t3, // First triangle of a textured rectangle
p1, n1, t1, p2, n2, t2, p3, n3, t3 // Second triangle of a textured rectangle
]
as described in https://docs.oracle.com/javase/8/javafx/api/javafx/scene/shape/TriangleMesh.html
I didn't find any difference in the 3D shape whether I used POINT_TEXCOORD or POINT_NORMAL_TEXCOORD as the vertex format.
So what is the use of the getNormals() method in JavaFX's TriangleMesh?
Thanks in Advance..
Use of normals in computer graphics:
The normal is often used in computer graphics to determine a surface's orientation toward a light source for flat shading, or the orientation of each of the corners (vertices) to mimic a curved surface with Phong shading.
The normals affect the shading applied to a face.
The standard shading mechanism for JavaFX 8 is Phong Shading and a Phong Reflection Model. By default, Phong Shading assumes a smoothly varying (linearly interpolated) surface normal vector. This allows you to have a sphere rendered by shading with limited vertex geometry supplied. By default the normal vectors will be calculated as being perpendicular to the faces.
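That default face-perpendicular normal is just the normalized cross product of two of the triangle's edge vectors. A minimal standalone sketch (not the JavaFX internals):

```java
public class FaceNormal {
    // Cross product of the edge vectors (b - a) and (c - a) gives a vector
    // perpendicular to the triangle (a, b, c); normalize it to unit length.
    static double[] faceNormal(double[] a, double[] b, double[] c) {
        double ux = b[0] - a[0], uy = b[1] - a[1], uz = b[2] - a[2];
        double vx = c[0] - a[0], vy = c[1] - a[1], vz = c[2] - a[2];
        double nx = uy * vz - uz * vy;
        double ny = uz * vx - ux * vz;
        double nz = ux * vy - uy * vx;
        double len = Math.sqrt(nx * nx + ny * ny + nz * nz);
        return new double[] { nx / len, ny / len, nz / len };
    }

    public static void main(String[] args) {
        // A counter-clockwise triangle in the xy-plane gets the +z axis as its normal.
        double[] n = faceNormal(new double[] {0, 0, 0},
                                new double[] {1, 0, 0},
                                new double[] {0, 1, 0});
        System.out.println(n[0] + ", " + n[1] + ", " + n[2]); // 0.0, 0.0, 1.0
    }
}
```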
What JavaFX allows is for you to supply your own normals rather than rely on the default calculated ones. The Phong shading algorithm implementation in JavaFX will then interpolate between the normals that you supply rather than the normals it calculates. Changing the direction of surface normals will change the shading model by altering how the model represents light bouncing off of it, essentially the light will bounce in a different direction with a modified normal.
This example from Wikipedia shows a Phong-shaded sphere on the right. Both spheres actually have the same geometry. The distribution of the normals that contribute to the Phong shading equation is the default, smoothly interpolated one, based upon a standard normal calculation for each face (so no user normals supplied). The equation used for calculating the shading is described in the PhongMaterial javadoc, and you can see there the normal's contribution to the shading algorithm, in terms of both the calculation of the diffuse color and the specular highlights.
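The normal's role in the diffuse part of that equation boils down to a Lambert factor; a simplified sketch (not the actual PhongMaterial implementation), showing that flipping a supplied normal directly changes how much light a vertex receives:

```java
public class Lambert {
    // Diffuse intensity: max(0, N·L), with N the unit surface normal and
    // L the unit vector pointing toward the light source.
    static double diffuse(double[] n, double[] l) {
        double dot = n[0] * l[0] + n[1] * l[1] + n[2] * l[2];
        return Math.max(0.0, dot);
    }

    public static void main(String[] args) {
        double[] light = {0, 0, 1};                                 // light along +z
        System.out.println(diffuse(new double[] {0, 0, 1}, light)); // 1.0: fully lit
        System.out.println(diffuse(new double[] {0, 0, -1}, light)); // 0.0: faces away
    }
}
```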
Standard 3D model formats, such as .obj files, optionally allow providing normals:
vn i j k
Polygonal and free-form geometry statement.
Specifies a normal vector with components i, j, and k.
Vertex normals affect the smooth-shading and rendering of geometry.
For polygons, vertex normals are used in place of the actual facet
normals. For surfaces, vertex normals are interpolated over the
entire surface and replace the actual analytic surface normal.
When vertex normals are present, they supersede smoothing groups.
i j k are the i, j, and k coordinates for the vertex normal. They
are floating point numbers
So, why would you want it?
The easiest way to explain might be to look at something known as smoothing groups (please click on the link, I won't embed here due to copyright). As can be seen by the linked image, when the smoothing group is applied to a collection of faces it is possible to get a sharp delineation (e.g. a crease or a corner) between the grouped faces. Specifying normals allows you to accomplish a similar thing to a smoothing group, just with more control because you can specify individual normals for each vertex rather than an overall group of related faces. Note JavaFX allows you to specify smoothing groups via getFaceSmoothingGroups() for instances where you don't want to go to the trouble of defining full normal geometry via getNormals().
Another, similar idea is a normal map (or bump map). Such a map stores normal information in an image rather than as vector data (as the getNormals() method does), so it is a slightly different thing. But you can see a similar interaction with the reflection model algorithm:
Background Reading - How to understand Phong Materials (and other things)
I know that if I rotate an object that extends javafx.scene.shape.Shape, I can transform it into 3D space, even though it was primarily designed for 2D (at least as far as I know).
Let's say I have a 3D scene (perspective camera and depth buffer are used), where various MeshViews occur. Some are used for areas, others for lines. In both cases those shapes must be triangulated in order to draw them with a TriangleMesh, which is often nontrivial.
Now when I change the drawing of these lines to use the Polyline class, the performance collapses horribly and there are strange artefacts. I thought I could benefit from the fact that a polyline has fewer vertices and that the developer doesn't have to triangulate programmatically.
Is it discouraged to use shapes extending javafx.scene.shape.Shape within 3D space? How're they drawn internally?
If the question is "should I use 2D Shape objects in 3D space in JavaFX?", then the answer is no, because you will get all the terrible performance that you are seeing. However, it sounds like you are trying to make up for JavaFX's lack of a 3D PolyLine object by using 2D objects and rotating them in 3D space. If that is true, instead use the free, open-source F(X)yz library:
http://birdasaur.github.io/FXyz/
For example, the PolyLine3D class allows you to simply specify a list of Point3Ds, and it will connect them for you:
/src/org/fxyz/shapes/composites/PolyLine3D.java
and you can see example code on how to use it in the test directory:
/src/org/fxyz/tests/PolyLine3DTest.java