New to Blender and JavaFX
I have a college project in which I need to create a simple 3D fruit and display it in a JavaFX application with some transformations.
I managed to model an apple in Blender and export it as an .obj, which I then imported into a JavaFX scene.
But no matter how the .obj is exported from Blender, I can't get the model to show up with proper lighting in my JavaFX application.
This is how it looks in Blender:
And this is how it comes up in my JavaFX application with a PointLight shining right over it:
I checked this related post:
https://blender.stackexchange.com/questions/26088/obj-export-is-too-dark
but I couldn't make sense of what I need to change in Blender.
By the way, I'm using Blender 2.8.
Any help is appreciated.
Update:
I added two more lights as per the three-point lighting suggestion, but I can't get the little green thingy at the top of the apple to show green; it still comes up completely dark. What am I missing?
In the 3D world, the only light is what you put there.
One spotlight in your scene will light one side of your model; the other sides will stay dark. Think of holding a torch under your face in the dark: you can't see the hand holding the torch or your shoulders, only your face right in front of the torch gets lit.
In Blender you have environment lighting, which emits light from all directions.
It is common in 3D to use three-point lighting to clearly illuminate a model.
I am using Qt 5.5.0 and Qt3D 2.0.
I want to display a video as a texture.
So far I can only display and rotate a single image.
How can I add a video instead of an image?
I'm mixing two libraries that use OpenGL: Qt and OpenSceneGraph. I'm targeting OpenGL ES 2, so everything is done with shaders and ES 2-compatible calls.
I'm specifically using OSG with QtDeclarative by trying to paint OSG onto a QDeclarativeItem. I do this the way suggested in the Qt documentation: wrapping all OpenGL calls between beginNativePainting() and endNativePainting().
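For reference, the paint path looks roughly like this (a stripped-down sketch; the class name OsgItem and the m_viewer member are placeholder names, not my exact code):

#include <QDeclarativeItem>
#include <QPainter>
#include <osgViewer/Viewer>

// Hypothetical QDeclarativeItem that lets OSG render inside the QML scene.
class OsgItem : public QDeclarativeItem
{
public:
    OsgItem(QDeclarativeItem *parent = 0) : QDeclarativeItem(parent)
    {
        // QDeclarativeItem doesn't paint anything by default.
        setFlag(QGraphicsItem::ItemHasNoContents, false);
    }

    void paint(QPainter *painter, const QStyleOptionGraphicsItem *, QWidget *)
    {
        painter->beginNativePainting();   // hand the GL context to OSG
        m_viewer.frame();                 // OSG issues its own GL calls here
        painter->endNativePainting();     // hand it back to Qt
    }

private:
    osgViewer::Viewer m_viewer;           // scene/camera setup omitted
};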
This works fine until I use textures in my OpenSceneGraph scene. When I do this, my QML window gets "messed up" for lack of a better word. To keep it as simple as possible, my OSG scene consists of a plane with a texture applied to it. I recreated the scene using basic OpenGL calls and the problem no longer occurs. Here's the problem summarized as a bunch of pictures:
The QtDeclarative engine uses OpenGL to paint stuff. I set up a simple QML page:
I create a simple scene using OpenGL directly. It's a plane with a texture painted onto it.
Now I try to set up the same scene in OSG... identical shaders, etc.
You can see something odd is going on in the last screenshot. Don't worry about the black background where the original OpenGL scene was transparent; that's just OSG using a black clear color. The problem is that the other items set up with QML (the rectangles) get messed up.
Edit: To clarify what happens: the rectangles I draw with QML are all stretched out to the right edge of the screen. I also noticed that if I draw rectangles after the OpenSceneGraph item in QML, they don't show up (I hadn't noticed this before). I draw the purplish-black rectangle after the OSG item in the following screenshots... note that it disappears. There might be more weird stuff happening, but this is all I've observed playing with rectangles.
Before
After
I'm fairly new to OpenGL so I don't know what kind of call/state setting would cause something like this to happen. I think that OpenSceneGraph makes some OpenGL state change that's messing up Qt's paint engine. I also know that this only occurs when OSG uses textures... if I don't apply textures in my OSG scene, this doesn't happen. This is where I'm stuck.
Also, I tried using BuGLe to get an OpenGL call trace with and without textures enabled in OSG, to see if I could figure out the problematic state change(s). I found a few differences, and even some global state that OSG changed (such as glPixelStorei()) between the two, but resetting the changes I found made no difference. It would help a lot if I knew what to look for. If anyone's feeling insane, I also have the full call traces:
OSG with texturing: http://pastie.org/4223182 (osg texture stuff is lines 637~650)
OSG without texturing: http://pastie.org/4223197
Edit 2:
Here's a diff that might be helpful. You'll need to scroll way down before the relevant lines are apparent.
http://www.mergely.com/nUEePufa/
Edit 3:
Woah! Okay, that diff helped me out quite a bit. OSG enables VertexAttribArray 3 but doesn't disable it. Calling glDisableVertexAttribArray(3) after OSG renders its frame seems to partially solve the problem; there's no more stretching of the QML rectangles. However, rectangles drawn after the OSG item still don't show up.
After obsessing over the trace logs, I think I've found two OpenGL things that need to be reset before passing control back to Qt to cause the issues above to go away. I mentioned one in an edit... I'll summarize both in this answer.
Rectangle/QML Item distortion
QPainter uses vertex attributes 3, 4, and 5 directly for something that looks like it's related to the geometry of those rectangles. This can be seen in the trace:
[INFO] trace.call: glVertexAttrib3fv(3, 0x2d94a14 -> { 0.00195312, 0, 0 })
[INFO] trace.call: glVertexAttrib3fv(4, 0x2d94a20 -> { 0, -0.00333333, 0 })
[INFO] trace.call: glVertexAttrib3fv(5, 0x2d94a2c -> { 0.2, 0.4, 1 })
Disabling the corresponding vertex attribute arrays fixes the stretchy rectangles issue:
glDisableVertexAttribArray(3);
glDisableVertexAttribArray(4);
glDisableVertexAttribArray(5);
Items drawn after the OSG Item don't render
In retrospect, this one was easy and didn't have anything to do with texturing. I hadn't noticed it before trying to add textures to my scene, though, so mixing the two issues up was my fault. I also screwed up with the traces and diff I posted; I never updated them to account for the ordering problem after I discovered it (sorry!).
Anyway, QPainter expects depth testing to be turned off. Qt turns depth testing off when you call beginNativePainting() and again when it starts painting its items, but you're expected to turn it back off yourself whenever you hand control back:
QPainter paints stuff (DEPTH_TEST = off)
OSG draws stuff (DEPTH_TEST = on)
QPainter paints more stuff [expects DEPTH_TEST = off]
The trace logs showed that I wasn't doing this... so the fix is:
glDisable(GL_DEPTH_TEST)
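Putting both fixes together, the end of the native painting block ends up looking roughly like this (a sketch; attribute indices 3-5 are simply the ones this Qt version happens to use):

painter->beginNativePainting();

m_viewer.frame();                  // OSG renders and leaves GL state changed

// Restore the state Qt's paint engine expects before handing control back.
glDisableVertexAttribArray(3);     // QPainter drives attributes 3-5 itself
glDisableVertexAttribArray(4);
glDisableVertexAttribArray(5);
glDisable(GL_DEPTH_TEST);          // QPainter expects depth testing off

painter->endNativePainting();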
Maybe you just need to re-enable GL_TEXTURE_2D? I notice in your example with textures that OSG enables, and subsequently disables, GL_TEXTURE_2D. Thus the difference between your two cases (with texture vs. without) is that the one that uses textures finishes with texturing disabled, while the one without texturing leaves GL_TEXTURE_2D in its initial state.
If Qt needs/expects texturing to be enabled in order to draw its quads, that could cause nothing to show up.
I'm trying to achieve something like this:
http://attasi.com/labs/ipad/
which uses CSS transforms, but using the canvas element instead, for greater compatibility.
Does anyone know if this is possible?
Take a look at three.js. It renders 3D mainly using WebGL, but it supports a subset of features with <canvas>, SVG and even the plain DOM - check this demo that uses the DOM renderer.
Here are demos using canvas. Someone made this, which shows it can do textured, animated, game-like 3D with canvas.
Also, you may want to check out a particle engine and a tweening engine to go with that, plus stats like FPS.
I want to create a Qt application backed by DirectFB, and I need to paint gradients in it. However, I understand that gradients are not supported graphics operations in DirectFB.
My question is: how can I paint a gradient in this case?
Gradient painting is supported by the raster paint engine, so it will work. Just try it.
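For example, something along these lines paints a gradient with plain QPainter and should behave the same whether or not DirectFB accelerates it (a minimal sketch, not DirectFB-specific):

#include <QApplication>
#include <QLinearGradient>
#include <QPainter>
#include <QWidget>

// Minimal widget that fills itself with a vertical gradient.
// The raster engine rasterizes the gradient in software, so no special
// DirectFB support is required.
class GradientWidget : public QWidget
{
protected:
    void paintEvent(QPaintEvent *)
    {
        QPainter p(this);
        QLinearGradient gradient(0, 0, 0, height());   // top to bottom
        gradient.setColorAt(0.0, Qt::white);
        gradient.setColorAt(1.0, Qt::darkBlue);
        p.fillRect(rect(), gradient);
    }
};

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);
    GradientWidget w;
    w.resize(300, 200);
    w.show();
    return app.exec();
}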