How to deallocate a texture from memory in A-Frame?

I’d like to know how to deallocate memory for assets I no longer need in a particular scene. Currently, our assets' textures stay in memory even after we unload the asset from the scene.
<a-scene>
  <a-box src="texture.jpg"></a-box>
</a-scene>

EDIT: The easiest way to dispose of a texture is to grab the entity's material and call material.map.dispose(). For example: this.el.getObject3D('mesh').material.map.dispose(). This is the way to go until A-Frame can handle it automatically.
You can get the texture objects from document.querySelector('a-scene').systems.material.textureCache and run .dispose() on a texture.
Alternatively, I believe you can grab the texture from an entity via document.querySelector('a-entity').components.material.material.map.dispose().
There is an issue filed to do this automatically: https://github.com/aframevr/aframe/issues/2166
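Until that lands, one way to wire the cleanup into an entity's lifecycle is a small component along these lines (a sketch only; the name dispose-texture is my own, and it assumes a standard single-material mesh):
AFRAME.registerComponent('dispose-texture', {
  // remove() runs when the component or its entity is removed.
  remove: function () {
    var mesh = this.el.getObject3D('mesh');
    if (mesh && mesh.material && mesh.material.map) {
      mesh.material.map.dispose();  // frees the texture's GPU memory
    }
  }
});
Then attach it to anything whose texture should be freed on removal, e.g. <a-box src="texture.jpg" dispose-texture></a-box>.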

Related

How can I support multiple types of VR controllers at the same time in A-Frame?

I'm using A-Frame and I'm trying to figure out how to easily support multiple types of controllers at once (Oculus Touch, HTC Vive controllers, and Windows Mixed Reality controllers), preferably with controller models rendered in the scene and with lasers that would allow the user to click on things.
How do I do this?
I figured out how to do this, so here's my solution.
In your HTML, add these entities to create the controllers (they should be inside an a-scene element):
<a-entity laser-controls="hand: left" raycaster="showLine: true; objects: .clickable;"></a-entity>
<a-entity laser-controls="hand: right" raycaster="showLine: true; objects: .clickable;"></a-entity>
These should also render with the actual controller models in the scene, and each has a laser pointer.
[Screenshot: the Oculus Touch controller models rendered in the scene, each with a laser pointer.]
As new headsets come out and gain A-Frame support (the Valve Index controllers, for example, aren't supported yet), the laser-controls component should automatically pick them up.
See the docs for a bit more information on how to use controllers in your A-Frame scene.
I still haven't figured out exactly how to click on buttons or objects in the environment using the laser; that's what I need to work out next.
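From the docs, though, laser-controls wires up the cursor component for you, so entities matched by the raycaster's objects selector should receive ordinary cursor events such as click. Something like this ought to work, although I haven't verified it yet (the color change is just an example reaction):
<a-box class="clickable" position="0 1.6 -2" color="#4CC3D9"></a-box>
<script>
  // The laser's cursor emits "click" on the intersected entity when the
  // controller trigger is pressed.
  document.querySelector('.clickable').addEventListener('click', function () {
    this.setAttribute('color', '#EF2D5E');
  });
</script>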

How to render a QtQuick item into a texture

I have a custom QQuickFramebufferObject node which renders a texture on some geometry. Now I want to render a QML item to a texture (e.g. a QOpenGLTexture) and use that texture in my custom node. I know there is the grabToImage method (http://doc.qt.io/qt-5/qquickitem.html#grabToImage), but it is slow because it first renders everything into a QImage, and the data has to be transferred back and forth between the GPU and CPU.
In detail, I am looking for something like the ShaderEffectSource element, but with direct access to the texture id so it can be used as a texture in the QQuickFramebufferObject. Is there already something in Qt for this scenario?
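For contrast, the slow path being ruled out looks roughly like this (a sketch only; quickItem is a placeholder, and a current OpenGL context is assumed when the texture is created):
#include <QQuickItem>
#include <QQuickItemGrabResult>
#include <QOpenGLTexture>

// grabToImage() renders the item into a QImage on the CPU, which then
// has to be uploaded to the GPU again as a QOpenGLTexture.
QSharedPointer<QQuickItemGrabResult> grab = quickItem->grabToImage();
QObject::connect(grab.data(), &QQuickItemGrabResult::ready, [grab]() {
    QImage img = grab->image();                        // GPU -> CPU readback
    QOpenGLTexture *texture = new QOpenGLTexture(img); // CPU -> GPU upload again
    // ... pass the texture to the custom QQuickFramebufferObject node ...
});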

Cocos3D - Take various screenshots in the background

Using Cocos3D, is it possible to take screenshots of a 3D model in the background, without the user knowing?
For pre-processing and other purposes, I want to take screenshots of the 3D model from various angles. Following the render-to-texture capability, I noticed that when my scene is not visible, the drawSceneContentWithVisitor: method executes only once rather than on every rendering cycle. For obvious reasons, the CC3GLFramebuffer* won't get updated with new data, so I'm only able to take the initial screenshot.
Thanks.
In Cocos3D, you can render your 3D scene to an off-screen surface. See the CC3DemoMashUp addTelevision and drawSceneContentWithVisitor: methods for an example of how to do this.
What is important is that the 3D drawing environment has been established when you perform your drawing. The safest place to do this is inside your drawSceneContentWithVisitor: method. But if you want to render somewhere else, you need to invoke the CC3Scene open3DWithVisitor: and CC3Scene close3DWithVisitor: methods before and after rendering. See the implementations of the CC3Scene processInitializeScene and open methods for examples of how to do that.
To render your scene from multiple viewpoints, you need to add multiple cameras to your scene, and set the camera property of your drawing visitor appropriately to select a camera before rendering. See how this is done in the CC3DemoMashUpScene addTelevision and drawToTVScreen methods. The drawToTVScreen method also shows how to handle clearing the color and depth buffers of your surface.
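Putting those pieces together, the flow is roughly this (a sketch only: snapshotCameras is a hypothetical array of extra cameras, viewDrawingVisitor is assumed to be the scene's usual drawing visitor, and the off-screen surface is assumed to be activated inside drawSceneContentWithVisitor:, as in the CC3DemoMashUp examples):
// Render the scene once per camera, outside the normal on-screen draw loop.
- (void) takeSnapshots {
    CC3NodeDrawingVisitor* visitor = self.viewDrawingVisitor;  // assumed accessor
    [self open3DWithVisitor: visitor];                // establish the 3D GL environment
    for (CC3Camera* cam in self.snapshotCameras) {    // hypothetical camera list
        visitor.camera = cam;                         // select the viewpoint
        [self drawSceneContentWithVisitor: visitor];  // draws into the off-screen surface
    }
    [self close3DWithVisitor: visitor];               // restore the previous state
}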

Is there a way to access the image on the QWidget's backing store?

I'm doing some compositing inside paintEvent() in a custom widget. Some of the compositing is done after certain areas have already been painted, and I need access to the contents painted so far.
So, I'm looking for a way to access the image contents of the current backing store during a paintEvent. I've looked at QBackingStore, but there's nothing there that directly gives me access to the backing store bitmap. Is there some API, perhaps private, that could be used to provide that?
If not, I'll have to resort to painting on an explicit pixmap and rendering that pixmap onto the widget.
It is possible, but it is not portable. On most platforms the QBackingStore is just a wrapper class around a QImage buffer, but I suppose this is not guaranteed. I researched this issue when writing the QuickWidget. A cast is needed:
QImage *image = dynamic_cast<QImage *>(backingStore()->paintDevice());
if (image != 0) {
    // it's an image; the contents painted so far can be read from it
}
Be careful though not to cause the QImage to detach. Things such as resizing are off limits.
Check the QuickWidget out at:
https://code.google.com/p/quickwidget/
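To make the caveat concrete, here is roughly how it could be used inside a paintEvent() (my own sketch; it assumes a top-level widget, so widget coordinates match the backing store, and that the platform's backing store really is a QImage):
#include <QWidget>
#include <QBackingStore>
#include <QPainter>
#include <QPaintEvent>

class MyWidget : public QWidget {
protected:
    void paintEvent(QPaintEvent *event) {
        // Read-only peek at what has been painted so far; never resize or
        // otherwise detach the image, since the widget does not own it.
        if (QImage *store = dynamic_cast<QImage *>(backingStore()->paintDevice())) {
            QRgb sample = store->pixel(event->rect().topLeft());
            QPainter painter(this);
            painter.fillRect(event->rect(), QColor(sample).darker()); // example composite
        }
    }
};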

Qt custom widget update big overhead

We are trying to use Qt 4.8.5 on some Linux-based embedded devices in our company. I use Qt Embedded without an X server. I need to plot measured data and update the plot very often (20-30 fps, though only a small portion of the widget changes). The system is ARM-based at 400 MHz, with no GPU and no FPU.

I subclassed QWidget and overrode paintEvent(). I have WA_OpaquePaintEvent and WA_StaticContents set. For testing, my paint event is empty, and I call the widget's update() function from a timer set to 50 ms. My problem is that even this empty update eats up 30% of the CPU. The amount varies with the area of the update, so I suspect Qt is redrawing something in the background. I have read many posts but cannot find a solution to my problem. If I comment out the update() call, CPU usage drops to ~1% (even if I also generate a sine wave in the timer for testing, which should be far more work than an empty function call). My widget is rectangular and not transparent, and I want to handle the full drawing procedure from the paint event.

Is it possible to reduce this overhead and handle the whole painting process on my own?
The "empty update" is not empty - it repaints the whole window :)
Have you read the below, from the QWidget documentation?
To rapidly update custom widgets with simple background colors, such as real-time plotting or graphing widgets, it is better to define a suitable background color (using setBackgroundRole() with the QPalette::Window role), set the autoFillBackground property, and only implement the necessary drawing functionality in the widget's paintEvent().
You should also use QWidget::scroll(): internally it scrolls the window's backing store, which is much more efficient than repainting the whole thing when only a thin slice of new data is appended.
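As an illustration of both points, a scrolling real-time plot can be structured roughly like this (a sketch, not drop-in code; Qt fills the background per the docs quote above, scroll() shifts the backing store, and paintEvent() touches only the newly exposed column):
#include <QWidget>
#include <QPainter>
#include <QPaintEvent>

class PlotWidget : public QWidget {
public:
    explicit PlotWidget(QWidget *parent = 0) : QWidget(parent), m_lastY(0) {
        setBackgroundRole(QPalette::Window);   // suitable background color
        setAutoFillBackground(true);           // Qt clears exposed areas for us
        setAttribute(Qt::WA_StaticContents);
    }
    void appendSample(int y) {                 // y = pixel height of the new sample
        m_lastY = y;
        scroll(-1, 0);                         // shift the existing plot left by 1 px
        update(width() - 1, 0, 1, height());   // repaint only the exposed column
    }
protected:
    void paintEvent(QPaintEvent *) {
        QPainter p(this);
        p.setPen(Qt::green);
        p.drawPoint(width() - 1, m_lastY);     // draw just the newest sample
    }
private:
    int m_lastY;
};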
