Is there any way to achieve hot-plugging of USB mouse in DirectFB 1.2.9 or Qt Embedded 4.7.3?
Currently my application stack looks like this:
-----------------
GUI
-----------------
Qt Embedded 4.7.3
-----------------
DirectFB 1.2.9
-----------------
/dev/input/eventX
-----------------
DirectFB opens the Linux input device node. Qt uses a QSocketNotifier to wait on the DirectFB event buffer and sets up a slot to read the mouse data. But on hot-plugging, DirectFB does not open the device node and no mouse events are generated.
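For reference, the wiring on the Qt side looks roughly like this (a simplified sketch; eventBuffer, readMouseData() and the surrounding class are placeholders for my actual code):

    // The DirectFB event buffer is exposed as a file descriptor and a
    // QSocketNotifier waits on it; readMouseData() then drains the buffer.
    int fd = -1;
    eventBuffer->CreateFileDescriptor(eventBuffer, &fd);   // IDirectFBEventBuffer *

    QSocketNotifier *notifier = new QSocketNotifier(fd, QSocketNotifier::Read, this);
    connect(notifier, SIGNAL(activated(int)), this, SLOT(readMouseData()));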
As far as I understand, hot-plugging is not supported by DirectFB.
I tried disabling DirectFB's handling of the Linux input device (removing the device node from the linux-input-devices= option in directfbrc) and set QWS_MOUSE_PROTO="linuxinput:..", but this did not work; it seems no mouse events were generated. Even if I manage to get it working, I don't think Qt provides any support for hot-plugging either.
So is my only alternative to subclass the QMouseDriverPlugin and QWSMouseHandler classes? For this, I have yet to figure out how to make Qt use the subclasses I implement, i.e. once I implement these classes, how do I hook them into the Qt input device handling framework so that I can set something like QWS_MOUSE_PROTO="mylinuxinput:.."?
As far as I can remember, I encountered no issue with mouse or keyboard hot-plugging in Qt Embedded 4.7.2 (without DirectFB). If you want to subclass it yourself, start from the linuxinput plugin and modify it. You'll find it in the Qt sources: that is the directory where the mouse driver plugins live, although some of the classes it uses are in other directories.
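The registration side of such a plugin is quite small. This is only a rough, untested sketch against the Qt Embedded 4.7 QWS plugin API; the class names and the "mylinuxinput" key are made up:

    #include <QtGui/QMouseDriverPlugin>
    #include <QtGui/QWSMouseHandler>

    // Placeholder handler: a real driver would open the event node here, watch
    // for hot-plug (e.g. via inotify on /dev/input) and call mouseChanged()
    // whenever it has new coordinates.
    class MyLinuxInputHandler : public QWSMouseHandler
    {
    public:
        MyLinuxInputHandler(const QString &driver, const QString &device)
            : QWSMouseHandler(driver, device) { /* open device, set up notifiers */ }
        void suspend() { }
        void resume() { }
    };

    class MyLinuxInputPlugin : public QMouseDriverPlugin
    {
    public:
        QStringList keys() const { return QStringList() << "mylinuxinput"; }
        QWSMouseHandler *create(const QString &driver, const QString &device)
        {
            if (driver.compare(QLatin1String("mylinuxinput"), Qt::CaseInsensitive) == 0)
                return new MyLinuxInputHandler(driver, device);
            return 0;
        }
    };

    Q_EXPORT_PLUGIN2(mylinuxinputmousedriver, MyLinuxInputPlugin)

If I remember correctly, the built plugin then goes into the plugins/mousedrivers directory, and the key returned by keys() is exactly what you put into QWS_MOUSE_PROTO, e.g. QWS_MOUSE_PROTO="mylinuxinput:/dev/input/eventX".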
Also, are you getting data on your Linux device node after plugging in? Did you try to cat the device?
In Qt Creator it is possible to run the program by clicking the green play button (using PySide6), but not to edit the .qml file in Design mode, which shows the error "Line 0: The Design Mode requires a valid Qt kit" inside the design view. I created a project and selected Qt for Python (qml).
But I installed several GB of the newest Qt version during installation. I have also selected the correct Python environment in the options.
When I create a non-qml project, but add a .qml file later a popup error says:
"The QML emulation layer (QML Puppet) cannot be built because the Qt version is too old or it cannot run natively on your computer. The fallback emulation layer, which does not support all features, will be used"
The Qt version is 6.2.2 and the OS is Windows 11. It is possible to use design mode in Qt Design Studio.
I met the same problem and my environment is similar. Choose Edit → Preferences and open the Kits page. Then select a Qt version and a kit you like (the qmake type must be the same, for instance both MSVC), and remember to apply the changes.
After that, go back to the form editor and it works well. What confuses me is that I must repeat the steps above each time I want to use Qt; the configuration does not seem to be saved correctly. (The Qt version is 5.1.2 and the OS is Windows 11.)
PS: After selecting the kit, you need to restart Qt.
I am programming an app in Qt 5.9.4 commercial license. My app runs on Android and iOS.
Question:
Is there a way in Qt to detect usual mobile device events like:
- Device display switched off
- Device went to background after user presses home button on iOS/Android
- Back button pressed on Android, etc.
There are a bunch of these events that each of the platform Android and iOS trigger.
If there is a class/module in Qt that relays these events, then I wouldn't have to do the extra work of writing native (Java and Objective-C) classes to get these events into my Qt app.
Most of these events are handled by Qt itself and you can get them through the usual means. Qt tries to hide Android-specific behaviour as well as possible and delivers it through events etc., just as you would get them in a desktop application.
A back button press triggers a QCloseEvent that is sent to the primary window. You can install an event filter on that object from C++ to intercept it, as sketched below; for QML it's the Window::closing signal.
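A minimal sketch of such a filter (untested; "window" stands for whatever your top-level window object is):

    #include <QObject>
    #include <QEvent>

    // Intercepts the QCloseEvent that Qt sends to the primary window when the
    // Android back button is pressed.
    class BackButtonFilter : public QObject
    {
    public:
        explicit BackButtonFilter(QObject *parent = nullptr) : QObject(parent) {}

    protected:
        bool eventFilter(QObject *watched, QEvent *event) override
        {
            if (event->type() == QEvent::Close) {
                // React to the back press here; returning true swallows the
                // event so the application keeps running.
                return true;
            }
            return QObject::eventFilter(watched, event);
        }
    };

    // Usage: window->installEventFilter(new BackButtonFilter(window));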
Since Qt does not support background execution of activities, going to the background is probably reported as a close event as well - or it quits the application directly.
For the display switch-off I don't know for sure, but maybe the QGuiApplication::applicationStateChanged signal reports it as Qt::ApplicationSuspended or another of those states - simply try it out! (This might be the case for other events as well.)
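For example, something along these lines should show which state transitions the platform actually reports (a quick sketch, not verified on a device):

    #include <QGuiApplication>
    #include <QDebug>

    // Call this once after the Q(Gui)Application has been created, e.g. in main().
    void logApplicationStateChanges()
    {
        QObject::connect(qGuiApp, &QGuiApplication::applicationStateChanged,
                         [](Qt::ApplicationState state) {
            switch (state) {
            case Qt::ApplicationSuspended: qDebug() << "suspended"; break;
            case Qt::ApplicationHidden:    qDebug() << "hidden";    break;
            case Qt::ApplicationInactive:  qDebug() << "inactive";  break;
            case Qt::ApplicationActive:    qDebug() << "active";    break;
            }
        });
    }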
Short hint for Android: if you want something Qt does not handle, you can always create a custom Java activity that extends QtActivity and use it in the manifest. From there on you can use JNI to interact with Java from C++ and vice versa. If you need to do so, have a look at Qt Android Extras - it makes using JNI much easier and provides a bunch of nice wrapper classes and utility methods in the QtAndroid namespace that can come in very handy.
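A tiny example of what the C++ side of such a call could look like with Qt Android Extras ("org/example/MyActivity" and "isScreenOn" are made-up placeholders for whatever you add on the Java side):

    #include <QAndroidJniObject>

    // Calls a hypothetical static Java method on a custom activity class.
    bool isScreenOn()
    {
        return QAndroidJniObject::callStaticMethod<jboolean>(
                "org/example/MyActivity", "isScreenOn");
    }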
I use Yocto (Krogoth) to build my imx6 images and toolchains, but it's a bit heavy and slow for working on kernel drivers. As such, my dev cycle is to build the kernel on its own, using the output of a "do_patch" run in Yocto as the source tree base and sourcing the toolchain environment.
This is normally not a problem, as mostly I'm focused on that end of the software stack. However, I now need to be able to run a Qt application (running under eglfs) on top of my continually updated kernel, for a bug hunt. To do this, I need the imx6 graphics driver working, so I get the galcore source from git://github.com/Freescale/kernel-module-imx-gpu-viv.git, export my kernel build directory, make it, and deploy it. That module loads perfectly. However, running the working application that was already built with Yocto causes a crash somewhere in libQt5EglDeviceIntegration.so.5. All the libs etc. are part of the original working image, the same place I took my kernel source from.
What do I need to do to make this work? Is there some part of Qt tied to the graphics driver that's going to force me to rebuild the entire library? What's the relationship between galcore.ko and Qt? Is there now a weird dependency between my application and the linux kernel?!
EDIT: PEBCAK. I'm an idiot. I didn't check out the right SHA1 (the one listed in the recipe) for the galcore driver. Still, the answer below is instructive, so I'd like to keep this question.
What do I need to do to make this work?
No idea. Maybe your self-built galcore.ko is incompatible with the binary blob OpenGL libraries from Freescale somehow? Does the original galcore.ko work correctly? How does the backtrace look?
Is there some part of Qt tied to the graphics driver that's going to force me to rebuild the entire library?
No need to rebuild Qt. While Qt is linked against the OpenGL library, the OpenGL ABI/API is stable and therefore a Qt rebuild isn't needed. Besides that, you aren't changing the OpenGL libraries.
What's the relationship between galcore.ko and Qt?
Qt uses OpenGL for rendering when using QtQuick. The OpenGL library (libGL.so and a few variants like libGLESv2.so) is provided by Freescale as a binary blob. The OpenGL library makes syscalls that end up in the galcore.ko kernel module.
libQt5EglDeviceIntegration.so.5 is the part in Qt that does the first OpenGL calls to initialize OpenGL.
Is there now a weird dependency between my application and the linux kernel?!
Well, yes, indirectly via Qt -> libGL.so -> kernel [galcore.ko]
I'm trying to get QtWebEngine running on a VM and am having difficulties. According to the answer to this question:
Eventually I realised that OpenGL 3.3 wouldn't work easily on virtual machines .. yet. I had to boot from ubuntu usb and work from there by installing latest mesa 3d package.
Is there a way to get QtWebEngine to work without OpenGL? I'm not directly using any OpenGL calls, nor do I need any 3d capabilities. I just want to embed a QWebEngineView to display dynamic HTML pages. I'm guessing this should be possible since Chrome works on the same VM without an issue.
I don't think there is a way to use the Qt WebEngine without OpenGL. It is not very explicitly said in the documentation, but here's what I understood from what I found.
About Chromium
As it is said here, QtWebEngine integrates Chromium's fast-moving web capabilities into Qt. Moreover, it is Chromium that is tied into OpenGL via the Qt Quick scene graph (source):
Chromium is tightly integrated to the Qt Quick scene graph, which is
based on OpenGL ES 2.0 or OpenGL 2.0 for its rendering. This provides
you with one-pass compositing of web content and all the Qt Quick UI.
The integration to Chromium is transparent to developers, who just
work with Qt and JavaScript.
It is also said that both the render process and the GUI process should share an OpenGL context:
Because the render process is separated from the GUI process, they
should ideally share an OpenGL context to enable one process to access
the resources uploaded by the other, such as images or textures.
About the Qt WebEngine itself
We just talked about Qt's GUI: in fact, Qt WebEngine does not depend on this GUI (page rendering and JavaScript execution are separated from the GUI process into the Qt WebEngine process), but remember that if you want your application to work, you will need to share an OpenGL context between both processes. In particular, this is achieved by default with a QSurfaceFormat, which has an OpenGLContextProfile accessible via QSurfaceFormat::profile(). Now, look back at the Qt WebEngine platform notes, which state:
If a new default QSurfaceFormat with a modified OpenGL profile has to
be set, it should be set before the application instance is declared,
to make sure that all created OpenGL contexts use the same OpenGL
profile.
On OS X, if the default QSurfaceFormat is set after the application
instance, the application will exit with qFatal(), and print a message
that the default QSurfaceFormat should be set before the application
instance.
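In practice that boils down to something like this at the top of main() (a sketch; adjust the profile and version to whatever your platform actually needs):

    #include <QApplication>
    #include <QSurfaceFormat>

    int main(int argc, char *argv[])
    {
        // The default surface format must be set *before* the application
        // instance is created, so that every OpenGL context (GUI process and
        // render process) ends up with the same profile.
        QSurfaceFormat format;
        format.setProfile(QSurfaceFormat::CoreProfile);
        format.setVersion(3, 3);
        QSurfaceFormat::setDefaultFormat(format);

        QApplication app(argc, argv);
        // ... create the QWebEngineView, show the main window, etc.
        return app.exec();
    }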
If we look at the source code of Qt, calls to OpenGL are made in several important files, like qtwebengine\src\core\web_engine_context.cpp or qtwebengine\src\webengine\api\qtwebengineglobal.cpp. Moreover, I also found calls to OpenGL in functions from the sources in qtwebengine\src\3rdparty\chromium\, so I suspect that Chromium needs to call OpenGL functions sometimes.
In short
Qt WebEngine uses Chromium (which doesn't necessarily use OpenGL) and also Qt GUI, which uses an OpenGL context that has to be shared with the WebEngine process. Thus, my conclusion is that you can't use Qt WebEngine without OpenGL.
I had the same problem in my VM environment: an application that uses QtWebEngine crashed on startup.
I will add this answer as a reference, although Sergey Khasanov mentioned it already in the comment above.
Use the software Qt Quick 2D Renderer - see https://doc.qt.io/QtQuick2DRenderer/
To do that, simply set the environment variable:
export QMLSCENE_DEVICE=softwarecontext
then restart your application. It might still complain about
libEGL warning: GLX/DRI2 is not supported
libEGL warning: DRI2: failed to authenticate
but (in my case) it finally worked!
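If you prefer not to rely on the shell environment, the same thing can be done from inside the application before the QApplication is created (a sketch):

    #include <QApplication>

    int main(int argc, char *argv[])
    {
        // Equivalent to "export QMLSCENE_DEVICE=softwarecontext", but set from
        // inside the program before any Qt Quick machinery starts up.
        qputenv("QMLSCENE_DEVICE", "softwarecontext");

        QApplication app(argc, argv);
        // ... set up the QtWebEngine/Qt Quick UI as before.
        return app.exec();
    }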
I am trying to extend the functionality of a legacy win32 application. The legacy application has a Multiple Document Interface (MDI) as its main window and is written purely with the win32 API. Is it possible to show a QWidget in the win32 MDI area as a child?
Are you using MFC?
What's important to understand is that running Qt always requires a running Qt event loop. So what you need is to properly process both your MFC/win32 events and the Qt events.
There is the Qt solution QtWinMigrate for that, which supports Qt 4 and Qt >= 5.4 (Qt 5.0-5.3 are broken). Its examples also show your use case.
This is certainly a good starting point if your application is based on CWinApp.
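For the embedding itself, QtWinMigrate's QWinWidget class is the usual vehicle. Roughly like this (an untested sketch; hMdiChild is assumed to be the HWND of the MDI child that should host the Qt content):

    #include <windows.h>
    #include <qwinwidget.h>   // QWinWidget, shipped with the QtWinMigrate solution
    #include <QLabel>

    // Hosts a Qt widget inside an existing win32 window.
    void embedQtWidget(HWND hMdiChild)
    {
        QWinWidget *host = new QWinWidget(hMdiChild);
        host->move(0, 0);

        QLabel *label = new QLabel(QStringLiteral("Hello from Qt"), host);
        label->move(10, 10);

        host->show();
    }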
Further details can be found by searching the internet and reading about the QAbstractEventDispatcher. Hope this helps!