Vertex displacement through shader in Babylon.js not updating mesh's shadow - babylonjs

I'm morphing an object's vertices using a vertex shader in Babylon.js. The morphed object looks great, but I can't figure out a way to cause the object's shadow to update as well.
I know in Three.js there is a customDepthMaterial for a mesh where you can pass in the same custom vertex shader and correctly update the object's shadow, but is there something similar in Babylon.js? I have to use Babylon for this project.

Related

How to create map marker to show multi users' facing direction by here sdk?

I want to create a mobile application that can show where your friends are and which direction they are facing. I wanted to use positionIndicator at first, but I can't create more than one positionIndicator on the map view. Then I turned to MapMarker, but I found I can't rotate or scale it. I also tried MapLocalModel, but I don't think it's a good idea to use a 3D model to render a 2D object. Then I thought I should create a new MapObject subclass, but the constructor of MapObject is package-protected, so I can't call or override it. So, what's the correct way to implement this?
MapLocalModel is in general the right approach for a marker that needs to be rotated. Agreed that for a 2D object MapLocalModel is not the best fit; however, the alternative would be rotating the image used for the MapMarker itself, which might also have some performance hit.

How to display points with QT3D?

Qt3D makes it very easy to display some mesh primitives:
m_torus = new Qt3DExtras::QTorusMesh();
but I would just like to display a collection of points. I haven't seen anything like
m_points = new Qt3DExtras::QPoints();
Is there a way to do this without writing lower level OpenGL?
Don't know if this is what you're looking for, but check out Qt3DRender::QGeometryRenderer. I use it in a project to display map lines in a 3D scene.
There is a method to define how the vertex buffer data should be rendered (in my project I use Qt3DRender::QGeometryRenderer::LineStrip instead of Qt3DRender::QGeometryRenderer::Points):
Qt3DRender::QGeometryRenderer::setPrimitiveType(Qt3DRender::QGeometryRenderer::Points);
AFAIK, there are no simple primitives like lines or points available in Qt3D 2.0, because there is just no one-size-fits-all solution. If you are lucky, someone will step up and add something to extras; otherwise you have to write the solution yourself.
Qt Interest Mailing List Nov 2016 - Lines in Qt3D
There is, however, a PCL point cloud renderer project on GitHub!
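To make the QGeometryRenderer suggestion concrete, here is a minimal sketch of the buffer-preparation step. The Qt3D classes are only referenced in comments so the snippet stays self-contained; the function names and layout constants are illustrative assumptions, not Qt API.

```cpp
#include <cassert>
#include <cstdint>
#include <cstring>
#include <vector>

// A position-only vertex, matching what Qt3DRender::QAttribute would
// describe as vertexSize = 3, VertexBaseType::Float, byteStride = 12.
struct Point3f { float x, y, z; };
static_assert(sizeof(Point3f) == 3 * sizeof(float), "no padding expected");

// Pack the points into the tightly packed byte buffer that a
// Qt3DRender::QBuffer expects (it takes raw bytes via setData()).
std::vector<uint8_t> packPositions(const std::vector<Point3f>& points) {
    std::vector<uint8_t> bytes(points.size() * sizeof(Point3f));
    std::memcpy(bytes.data(), points.data(), bytes.size());
    return bytes;
}
```

In Qt3D you would then hand this buffer to a QBuffer, describe it with a QAttribute named `QAttribute::defaultPositionAttributeName()`, attach both to a QGeometry, and finally call `setPrimitiveType(Qt3DRender::QGeometryRenderer::Points)` on the renderer, as the answer above suggests.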

Qt OpenGL data synchronization / Model/View implementation

I am trying to develop an application with Qt 5.5 and OpenGL. The basic work of the application will be to load simple objects, modify their positions in a scene and save them together with other attributes (material, name, parents/child relations etc...).
The only thing I have been struggling with for a week now is that I really don't know how I should take on the problem of synchronizing data. Let's say I have some kind of SceneGraph class which takes care of all SceneObjects. Those SceneGraphs should be rendered in a SceneView widget which can be used to modify its objects via transformations. Now how would I tell every SceneView that an object changed its position?
I thought about the Model/View architecture for a moment, but I am not really sure what this implementation should look like.
What would be the best way to handle Objects like that in different Windows/Widgets but still have one single piece of data?
SceneObject:
Holds the mesh information (vertices, UVs, etc.)
Has a name (QString)
Has a material
Has a transform storing position, rotation and scaling information
(Important: these datatypes should be synchronized in all views)
SceneGraph:
Contains different SceneObjects and is passed to SceneViews
SceneView:
The QWidget responsible for drawing the Scene correctly in any QWindow.
Has its own camera to move around.
Handles UserInput and allows transformation of SceneObjects.
You could use signals and slots to observe position updates of SceneObjects and process them in each SceneView.
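A minimal sketch of that suggestion, with Qt's signal/slot mechanism modeled as a plain callback list so the example compiles without Qt headers. In real code, SceneObject would derive from QObject, declare `signals: void positionChanged(QVector3D);`, and each SceneView would use QObject::connect; every class and member name here is illustrative.

```cpp
#include <cassert>
#include <functional>
#include <vector>

struct Vec3 { float x = 0, y = 0, z = 0; };

class SceneObject {
public:
    // Stand-in for `signals: void positionChanged(Vec3);`
    std::vector<std::function<void(const Vec3&)>> positionChanged;

    void setPosition(const Vec3& p) {
        position_ = p;
        for (auto& slot : positionChanged) slot(p);  // "emit"
    }
    const Vec3& position() const { return position_; }

private:
    Vec3 position_;
};

class SceneView {
public:
    explicit SceneView(SceneObject& obj) {
        // Stand-in for QObject::connect(&obj, &SceneObject::positionChanged,
        //                               this, &SceneView::onPositionChanged);
        obj.positionChanged.push_back([this](const Vec3& p) {
            lastSeen = p;   // update this view's copy of the transform
            ++repaints;     // and schedule a redraw (update() in Qt)
        });
    }
    Vec3 lastSeen;
    int repaints = 0;
};
```

The point of the design is that the SceneObject holds the single authoritative transform, and every open SceneView is just an observer, so no view ever has to poll or be told explicitly which window changed the data.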

Instantiate a texture in the Unreal Editor

I'm trying to create a vehicle which throws a ball (or, say, a spherical projectile) when clicking. I already have this vehicle doing the right thing, but I'd like it to throw a yellow-colored ball. I created a yellow texture but I don't know how to apply it specifically to the projectile.
I have to run the map on Unreal Tournament 3, so I may not be able to use the Unreal Development Kit.
Do you have some clues or an idea on how to do that?
Thanks
You'll have to plug your Texture into a Material and assign that Material to your projectile mesh. You can do that in the editor, or you can override the mesh's materials in code inside the mesh component by adding entries to the Materials array, e.g.:
Begin Object Class=StaticMeshComponent (or SkeletalMeshComponent) Name=ProjMeshComp
StaticMesh=<your mesh>
Materials(0)=<the material you created>
End Object
Is the projectile you are shooting a custom projectile?
If it is, look in your projectile class for a particle system component or a static mesh component reference similar to the answer Phillip posted. You will see something like:
ParticleSystem'SomePackage.SomeGroup.AssetName'
//or
StaticMesh'SomePackage.SomeGroup.AssetName'
The GroupName might not be present.
Then open up your editor and, in your content browser, find the package (in this case SomePackage). Right-click it and be sure to Fully Load it. Now you should see your ParticleSystem or StaticMesh. If it's a particle system, you need to edit the mesh module of the particle in Cascade to use your material; otherwise you just reassign the static mesh material as usual.
If it's not a custom projectile, you need to figure out which projectile class you are using and then do the above; a good starting place is the UTProjectile hierarchy of code.

How do I get the QDrag hotspot value in the dropEvent function?

I'm somewhat new to Qt, and I'm using Qt 4.8 to implement a graphical editor of sorts. Right now I've implemented dragging of rectangles around my widget using drag&drop. In my mousePressEvent function I generate a QDrag with appropriate MIME data (similar to the puzzle sample), and I just added a 'setHotSpot' call.
The dragging works just fine, but in my dropEvent function, I can't figure out a way to get back to the hot-spot setting in the original QDrag object - I don't appear to have access to it.
I've solved it for the moment by stuffing the hot-spot point into my MIME data (it's custom data anyway), but that seems wrong to me - it seems to me that there'd be some way within the Qt framework for me to get that hot-spot data in my dropEvent function.
Please check the following example in Qt:
http://doc.qt.io/qt-4.8/qt-draganddrop-fridgemagnets-example.html
This example shows how to use drag and drop events in Qt.
In that example we see that adding the hot-spot point to the MIME data does in fact appear to be the recommended way to get the hot-spot point from where the drag is initiated to the dropEvent.
I don't understand what you are trying to achieve...
The "hotspot" point is just an offset point relative to the pixmap representing the data being dragged, and thus is constant during the whole drag.
If you are looking for the initial drag point, you should indeed encode it into the mime data.
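The workaround both answers converge on can be sketched as follows. QPoint, QByteArray, and QDataStream are replaced by plain structs and memcpy so the snippet is self-contained; the function names are illustrative, not Qt API.

```cpp
#include <cassert>
#include <cstdint>
#include <cstring>
#include <vector>

struct Point { int x = 0, y = 0; };

// Drag side: append the hot-spot offset to the custom MIME payload
// (Qt equivalent: QDataStream stream(&byteArray, ...); stream << hotSpot;
// then drag->mimeData()->setData("application/x-myitem", byteArray)).
std::vector<uint8_t> encodeHotSpot(Point hotSpot) {
    std::vector<uint8_t> mime(sizeof(Point));
    std::memcpy(mime.data(), &hotSpot, sizeof(Point));
    return mime;
}

// Drop side: read the hot spot back out of the MIME data and place the
// dragged item so its top-left corner is event->pos() minus the offset,
// which is exactly what the puzzle example does.
Point dropTopLeft(const std::vector<uint8_t>& mime, Point dropPos) {
    Point hotSpot;
    std::memcpy(&hotSpot, mime.data(), sizeof(Point));
    return {dropPos.x - hotSpot.x, dropPos.y - hotSpot.y};
}
```

Since the hot spot is fixed for the whole drag, encoding it once at drag start loses nothing, and the drop site needs no access to the original QDrag object.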