QToolButton remains pressed after finger has been removed from touchscreen - Qt

I have a small Qt desktop application (in C++) which I need to make work on a touchscreen device running Windows 10. The device has a touchscreen, and the application works perfectly with keyboard and mouse.
I am not an expert in developing Qt applications, which is why I am unable to resolve what may be a silly issue.
However, when I use the touchscreen, the last-touched QToolButton remains pressed even after I have moved my finger away from the touchscreen; only when I touch somewhere else is that QToolButton released.
I expect the QToolButton to behave just like when it is pressed with a mouse: once I move my finger off the touchscreen, it should be released.
I tried to resolve this issue with the following:
btn->setAttribute(Qt::WA_AcceptTouchEvents);
And
QApplication::setAttribute(Qt::AA_SynthesizeMouseForUnhandledTabletEvents);
But it didn't help. I think I am missing something small, after which Qt will handle all the touch-related events on its own and show the right behavior.
I am cross-compiling on my Ubuntu machine using MXE. Qt version is 5.12.

"I think I am missing something small, after which Qt will handle all the touch-related events on its own and show the right behavior."
QWidget subclasses, including QToolButton, do not handle touch events on their own. Qt does not even need to emulate mouse events here: what you are seeing in your application are the effects of emulated mouse events synthesized by Windows itself. This is what makes it possible for ancient Windows programs designed for a mouse to work somewhat on modern touch-enabled Windows tablets, but the mouse emulation is far from perfect.
If this behavior is unacceptable for the QToolButton class, you have the option of subclassing it and handling touch events as you see fit, overriding the mouse event handlers as well so that the synthesized events are not handled on top of them.
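For illustration, a minimal, untested sketch of such a subclass (the class name is mine; it assumes the touches arrive as QTouchEvents once Qt::WA_AcceptTouchEvents is set, and it swallows the system-synthesized mouse events):
#include <QToolButton>
#include <QTouchEvent>
#include <QMouseEvent>

class TouchToolButton : public QToolButton {
public:
    explicit TouchToolButton(QWidget *parent = nullptr) : QToolButton(parent) {
        setAttribute(Qt::WA_AcceptTouchEvents);
    }
protected:
    bool event(QEvent *e) override {
        switch (e->type()) {
        case QEvent::TouchBegin:
            setDown(true); // show the pressed state as soon as the finger lands
            return true;
        case QEvent::TouchEnd: {
            setDown(false); // release as soon as the finger lifts
            const QList<QTouchEvent::TouchPoint> pts =
                static_cast<QTouchEvent *>(e)->touchPoints();
            if (!pts.isEmpty() && rect().contains(pts.first().pos().toPoint()))
                click(); // trigger the action like a completed mouse click
            return true;
        }
        case QEvent::TouchCancel:
            setDown(false);
            return true;
        default:
            return QToolButton::event(e);
        }
    }
    void mousePressEvent(QMouseEvent *e) override {
        // Ignore mouse events that were synthesized from touch input.
        if (e->source() != Qt::MouseEventNotSynthesized) { e->accept(); return; }
        QToolButton::mousePressEvent(e);
    }
    void mouseReleaseEvent(QMouseEvent *e) override {
        if (e->source() != Qt::MouseEventNotSynthesized) { e->accept(); return; }
        QToolButton::mouseReleaseEvent(e);
    }
};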

I had the same problem. I solved it by changing the signal to which the slot is connected. Previously I had connected my slot to QPushButton::pressed. To fix it, change the connect to use the QPushButton::clicked signal, which is only emitted after a complete press-and-release:
connect(my_button, &QPushButton::clicked, this, &MyClass::onMyButtonClicked);

Related

Showing a QMessageBox breaks QLineEdit highlighting

I have a C++ Qt 4 application written for Raspberry Pi. I'm experiencing an odd side-effect of showing a QMessageBox and I don't know enough about Qt to debug it.
The Pi has a touchscreen, so I launch the application with unclutter to hide the mouse cursor. (Though this doesn't affect my issue... I have tried without unclutter just in case.)
I have subclassed QLineEdit to override focusInEvent() and focusOutEvent(), selecting all when a QLineEdit gains focus and deselecting all when it loses focus.
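The subclass is essentially the following (a minimal sketch of the idea; the class name is illustrative):
#include <QLineEdit>
#include <QFocusEvent>

class SelectAllLineEdit : public QLineEdit {
public:
    explicit SelectAllLineEdit(QWidget *parent = 0) : QLineEdit(parent) {}
protected:
    virtual void focusInEvent(QFocusEvent *e) {
        QLineEdit::focusInEvent(e);
        selectAll(); // select everything when the field gains focus
    }
    virtual void focusOutEvent(QFocusEvent *e) {
        QLineEdit::focusOutEvent(e);
        deselect(); // drop the selection when focus moves on
    }
};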
Before showing any QMessageBoxes, everything works perfectly - tapping on a QLineEdit selects all the text; tapping the next one de-selects the previous QLineEdit, and selects the new QLineEdit.
After showing a QMessageBox, my overridden events stop working, and QLineEdits no longer auto-select and de-select.
If I add:
msgBox.setWindowFlags(msgBox.windowFlags() | Qt::Popup);
before I exec() the QMessageBox, then text highlighting continues working normally, but the cursor is displayed and flickers while the QMessageBox is on the screen.
It seems like there is a side-effect of showing the QMessageBox that affects the calling window and my sub-classed QLineEdit boxes... but not if the QMessageBox has the Popup flag set!
I've tried storing and manually re-loading the flags on the main window, and that does nothing, so it doesn't appear to be flags on the main window.
One more oddity: everything works fine if I run the application remotely over XMing and SSH... it's only when it's run locally on the Pi in plain-old X11 that it freaks out.
Any thoughts on how to debug this? Thank you!
I was able to work around this by doing two things.
First, I updated the flags of the QMessageBox before opening it:
msgBox.setWindowFlags(Qt::Popup);
msgBox.exec();
and second, I hid the cursor in Qt code, rather than relying on Unclutter; this fixed the flickering cursor issue:
QApplication::setOverrideCursor(Qt::BlankCursor);
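Put together, it looks roughly like this (a minimal sketch; the function and variable names are illustrative, not from my actual code):
#include <QApplication>
#include <QMessageBox>

void showMessage(QWidget *parent, const QString &text) {
    QMessageBox msgBox(parent);
    msgBox.setText(text);
    msgBox.setWindowFlags(Qt::Popup); // avoids breaking the caller's focus events
    msgBox.exec();
}

int main(int argc, char *argv[]) {
    QApplication app(argc, argv);
    QApplication::setOverrideCursor(Qt::BlankCursor); // hide the cursor in Qt itself
    // ... create and show the main window as usual ...
    return app.exec();
}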

How to hide mouse cursor in Qt Application?

The qApp->setOverrideCursor() method works successfully when I want to hide the mouse cursor, except in one condition: if I show a modal dialog and the cursor moves outside the dialog's borders, the cursor becomes visible again. Do you have any idea what the problem is?
It does not matter how the mouse cursor is hidden, whether in Qt or at the operating-system level. My operating system is Windows 7.
You cannot hide the mouse cursor once it leaves your window (or dialog window), because it is then handled by the window manager of your OS. A workaround would be to constrain the mouse to your window/dialog so it cannot leave. You will either need to look through MSDN to find the specific Windows functions to do it, or do it like kshegunov's code example on the Qt forums: https://forum.qt.io/topic/61832/restrict-mouse-cursor-movement/12
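For the MSDN route, ClipCursor() is the relevant function; a minimal sketch (untested; the helper names are mine):
#include <QWidget>
#include <windows.h>

void confineCursorTo(const QWidget *w) {
    // Map the widget rectangle to global (screen) coordinates.
    const QPoint tl = w->mapToGlobal(w->rect().topLeft());
    const QPoint br = w->mapToGlobal(w->rect().bottomRight());
    RECT r = { tl.x(), tl.y(), br.x(), br.y() };
    ClipCursor(&r); // the cursor can no longer leave this rectangle
}

void releaseCursor() {
    ClipCursor(NULL); // restore free cursor movement
}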

Custom Touch behavior in Windows 7 with Qt/QML application

I am developing a touch application for Windows 7 with Qt/QML. The end-user device has Windows 7's native touch behavior, i.e.: when touching the screen, a point appears at the last-touched point, and when the physical touch ends, Windows moves that point to the now-touched point and runs the on-clicked event.
Compared to the behavior known from standard Windows mouse usage, this leads to a different experience when, e.g., clicking a button: a mouse user will expect the button to change to its pressed-down color when the mouse button goes down, and back to the default color when the mouse button goes up.
In my application, I want to have a customized way of touch feedback: What is currently being touched should be marked using changed colors of buttons, imitating a "mouse goes down" when the actual physical touch begins and imitating a "mouse goes up" when the actual physical touch ends.
My application will run fullscreen, so an actual possibility would be to change the system's behavior on application start and change it back to default on applications end.
Such a behavior would effectively be the same as the standard behavior on e.g. all Android devices I know.
I searched through the MouseArea and MultiPointTouchArea elements, trying to find a way to make the click-reaction behavior differ from the standard behavior. However, I did not even find a way to capture the beginning of the actual touch... All the things I want to happen at the beginning of the touch actually happen when the touch ends.
Edit:
It does not matter whether I use a QML Button or a MouseArea plus the MouseArea.pressed property: nothing is "pressed" before the finger leaves the touchscreen and the onClicked() event is called.
Possibly related:
Adobe AIR: touch screen doesn't trigger mouse down event correctly - but I did not find a way to access the functions like Multitouch.inputMode (which are mentioned in the first reply) from a native Qt application.
How can I achieve the described behavior for my application?
The solution for this issue is to disable "Press and Hold" for the application. As a system-wide setting, this can be done via ...
Control Panel -> Pen and Touch -> Touch -> Press and Hold -> Settings -> uncheck 'Enable press and hold for right-clicking'
The only solution I found to do this in native code can be found here:
http://msdn.microsoft.com/en-us/library/ms812373.aspx
I checked that this still works at least on Windows 7. To get it working for QML, I looked up the QWindow* in QQmlApplicationEngine::rootObjects() and used its winId as an HWND. With that HWND, I called the TogglePressAndHold function from the link above before app.exec().
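The wiring looks roughly like this (a sketch; the property name and constant come from the MSDN sample linked above, the QML path is illustrative, and error handling is omitted):
#include <QGuiApplication>
#include <QQmlApplicationEngine>
#include <QQuickWindow>
#include <windows.h>

// From the MSDN sample (tpcshrd.h):
#define TABLET_DISABLE_PRESSANDHOLD 0x00000001

int main(int argc, char *argv[]) {
    QGuiApplication app(argc, argv);
    QQmlApplicationEngine engine(QUrl("qrc:/main.qml"));

    // Fetch the root window's native handle and disable press-and-hold on it.
    QQuickWindow *window =
        qobject_cast<QQuickWindow *>(engine.rootObjects().first());
    HWND hwnd = reinterpret_cast<HWND>(window->winId());
    GlobalAddAtomW(L"MicrosoftTabletPenServiceProperty");
    SetPropW(hwnd, L"MicrosoftTabletPenServiceProperty",
             reinterpret_cast<HANDLE>(TABLET_DISABLE_PRESSANDHOLD));

    return app.exec();
}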

Qt working Windows 8 style frameless custom window

I recently installed GitHub for Windows on my Windows 7 machine and loved the custom frame it had; it fit really well with the overall application theme and had its own title-bar buttons, which were really well laid out, very fluent, and seemed very natural to work with.
I did a bit of digging and found two flags that would clear out the border completely, and after a bit of customization I got my app to also have a nicely customized look which was intuitive yet different from all the apps with the old Windows border.
The thing was, it wasn't fluent and naturally responsive like the other windows; it was glitchy as heck. I easily got the window to move around with the mouse, but it often glitched and could be moved from areas it shouldn't, like clicking and dragging on a disabled button.
The maximize button, which was linked to the showMaximized() method, just enlarged the whole window to take up the entire desktop; you could still move it (it wasn't really, truly maximized).
The window responded to none of the system signals, like clicking the taskbar to minimize it and such.
After a lot of fiddling I just finally gave up, which was a shame because I really liked how it looked and it was very intuitive, much like GitHub for Windows is very intuitive.
Is there any way I can accomplish this? I'm really not ready to give up yet.
I know that when making a raw Windows API application you have to link it to the XP built-in style because it inherits the Windows 95 style by default; maybe there's a Windows 8 style that Qt's not connected to. I don't know, I didn't go that far in my research yet.
Minimize window by clicking on the taskbar
It seems that Qt::FramelessWindowHint's implementation is limited. When this flag is set, Windows thinks that the window cannot be minimized or maximized. I've tried this solution implemented in pure WinAPI: minimizing and restoring a frameless window by clicking on the taskbar works fine. Apparently Qt sets some flags that block this functionality. Maybe there is a good reason for that; I don't know.
We can use WinAPI and Qt together, but it is troublesome. First, the WinAPI code should be executed after you set the window flags and show the window using Qt; otherwise Qt will overwrite the window flags.
Another problem is that when we remove the border using WinAPI, the window geometry suddenly changes, and Qt doesn't know about that. Rendering and event mapping (including mouse click positions) become invalid. I didn't find any documented way to update the mapping, but I found that we can tell Qt that the screen orientation has changed, which forces it to recalculate the window geometry. This looks like a dirty hack, though. Also, the QWidget::windowHandle function is missing in Qt 4 and "is subject to change" in Qt 5, so this method is not reliable. But anyway, it works for now. Here is the complete code (tested on Windows 8) that should be placed in the top window class's constructor:
#include "windows.h"
#include <QWindow>
//...
show();
HWND hwnd = reinterpret_cast<HWND>(effectiveWinId());
LONG lStyle = GetWindowLong(hwnd, GWL_STYLE);
lStyle &= ~(WS_CAPTION | WS_THICKFRAME | WS_MINIMIZE | WS_MAXIMIZE | WS_SYSMENU);
SetWindowLong(hwnd, GWL_STYLE, lStyle);
setWindowFlags(windowFlags() | Qt::FramelessWindowHint);
windowHandle()->reportContentOrientationChange(Qt::PrimaryOrientation);
The true way to solve this problem is to modify the Windows Qt platform plugin (see the QWindowsWindow class in the Qt sources). Maybe there is a way to inherit from the default implementation, modify it, and use it in your app. You could also ask the Qt developers whether this behavior is reasonable or a bug. I think this issue could be fixed with a patch.
If you still intend to use this code and other OSs should also be supported, don't forget to wrap the Windows-specific implementation in #ifdef Q_OS_WIN.
Enable window dragging only when title bar is clicked and window is not maximized
Other problems can be fixed more easily. When you process mouse events to implement window dragging, check window state and event position and disable moving when it is unwanted.
void MainWindow::mousePressEvent(QMouseEvent *e) {
    if (!isMaximized() &&
        e->button() == Qt::LeftButton &&
        ui->title->geometry().contains(e->pos())) {
        window_drag_start_pos = e->pos();
    }
}

void MainWindow::mouseReleaseEvent(QMouseEvent *e) {
    window_drag_start_pos = QPoint(0, 0);
}

void MainWindow::mouseMoveEvent(QMouseEvent *e) {
    if (!window_drag_start_pos.isNull()) {
        move(pos() + e->pos() - window_drag_start_pos);
    }
}

void MainWindow::on_minimize_clicked() {
    showMinimized();
}

void MainWindow::on_maximize_clicked() {
    if (isMaximized()) {
        showNormal();
    } else {
        showMaximized();
    }
}
Here ui->title is a label used for displaying the fake title bar, and QPoint window_drag_start_pos is a class member variable.
If you use Qt::FramelessWindowHint, you lose all Windows frame related features, such as docking, shortcuts, maximizing, etc. You can implement some of them yourself, with great effort in some cases, but other things will still just not work, including handling multiple monitors and all the capabilities of the Windows Key. If you need your app to behave like a regular Windows app, Qt::FramelessWindowHint is pretty much a dead end.
The only real solution is to use the DWM API that Windows provides for this purpose, Custom Window Frame Using DWM. They provide examples of how to do anything you might want in the frame area, while preserving all the standard windows-manager behaviors.
Applications such as Chrome do use DWM for this (see the AeroGlassFrame class in the Chromium source). Granted, Chrome isn't using Qt, so it doesn't have to do the work of sneaking around QMainWindow to handle the Windows-specific messages, and their code has plenty of comments that indicate how messy it is, like:
// Hack necessary to stop black background flicker, we cut out
// resizeborder here to save us from having to do too much
// addition and subtraction in Layout() ...
Similarly, the GitHub for Windows app you mention is built on a platform called Electron, which in turn uses, you guessed it, DWM.
I haven't found any working example of a DWM frame around a Qt application, but hopefully this provides a starting point.
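Purely as a hedged sketch of where one might start (untested; it assumes Qt 5 on Windows and linking against dwmapi.lib, and the class name is illustrative), QWidget::nativeEvent can be combined with the DWM call:
#include <QMainWindow>
#include <QShowEvent>
#include <windows.h>
#include <dwmapi.h>

class DwmFramedWindow : public QMainWindow {
protected:
    bool nativeEvent(const QByteArray &eventType, void *message,
                     long *result) override {
        MSG *msg = static_cast<MSG *>(message);
        if (msg->message == WM_NCCALCSIZE && msg->wParam == TRUE) {
            // Claim the whole window rectangle as client area so we can draw
            // our own frame, while DWM keeps the shadow, snapping and
            // taskbar/minimize animations.
            *result = 0;
            return true;
        }
        return QMainWindow::nativeEvent(eventType, message, result);
    }
    void showEvent(QShowEvent *e) override {
        QMainWindow::showEvent(e);
        // Extend the DWM frame one pixel into the client area so the window
        // keeps its drop shadow after WM_NCCALCSIZE is overridden.
        MARGINS margins = { 1, 1, 1, 1 };
        DwmExtendFrameIntoClientArea(reinterpret_cast<HWND>(winId()), &margins);
    }
};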

Qt background task touch event

I have a Symbian Qt "background" app that displays a button-type thing using a CWindowDrawer derived from CActive, which is visible on top of everything. I want it to be visible when a text input is active or the virtual keyboard is visible, preferably when a text input is active, because some phones have a hardware keyboard. When it is visible, I want it to receive all touch positions so as to determine whether it is being pressed, without it taking focus and closing the keyboard.
Basically, I need a way to determine when a text input is activated, and I need a way to listen for all touch positions from a non-QWidget that is also not in focus.
Any help would be appreciated.
PS: This is for Symbian Belle, and Symbian C++ is also supported within a Qt application if necessary.
