Qt cannot provide real-time "window is being dragged" events

Here's a simple task that looks innocent: a small widget window. Whenever it's dragged around the desktop (by grabbing its title bar with the mouse and moving it around), it should print out its X and Y position IN REAL TIME, during the move.
Sounds simple, but it seems it's impossible to do this in Qt.
No matter what I do, I cannot get real-time events while I hold the title bar and move the window around. I have an event filter, but during the drag only a sporadic few events come in -- the moveEvents etc. arrive only after I release the mouse button. I need real-time events for every pixel of movement (or at least for every time the window actually moves on the screen, as close to real time as possible).
Is there a way to do that without drilling holes into the OS?
I'm using Qt on Mac.

Here is what came to my mind first. This solution does not provide true real-time event handling, but I think it comes very close. If you move the window slowly enough, you can catch movements with pixel precision. OK, now the implementation (I believe the code explains how it works):
#include <QWidget>
#include <QTimer>
#include <QPoint>
#include <QDebug>

class Widget : public QWidget
{
    Q_OBJECT
public:
    Widget(QWidget *parent = 0) : QWidget(parent)
    {
        // Poll the window position as often as possible.
        m_timer.setInterval(1);
        connect(&m_timer, SIGNAL(timeout()), this, SLOT(onTimer()));
        m_timer.start();
    }

private slots:
    void onTimer()
    {
        if (m_lastPos != pos()) {
            // Window moved.
            m_lastPos = pos();
            qDebug() << "Moved to:" << m_lastPos;
        }
    }

private:
    QPoint m_lastPos;
    QTimer m_timer;
};
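For completeness, here is a minimal sketch of how this widget might be driven (untested; "widget.h" is a hypothetical header holding the class above):

#include <QApplication>
#include "widget.h" // hypothetical header containing the Widget class above

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);
    Widget w;
    w.show();   // drag the window around; the timer slot prints the position as it changes
    return app.exec();
}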

Related

QWebEngineView how to disable pinch-to-zoom

I'm running a Qt 5.9.4 app with QWebEngineView on a touch Linux device displaying HTML webpages.
When users pinch their fingers on the screen, they zoom into the webpage. I want to disable that behaviour, because all the webpages should be 100% fullscreen.
I could temporarily disable it in the webpages (since I own their code) by adding user-scalable=no to the viewport, but that won't work in the long run because not every webpage will have that.
I tried to use an eventFilter on different elements of my app, but couldn't catch any Gesture or even Mouse events.
This is how I create my webview (in a QDialog):
void Dialog::createWebView() {
    view = new QWebEngineView(this);
    profile = new QWebEngineProfile();
    page = new QWebEnginePage(profile);
    view->setPage(page);
    view->setUrl(QUrl(path));
}
This is my eventFilter class:
EvFilter::EvFilter(QObject *parent) : QObject(parent)
{
}

bool EvFilter::eventFilter(QObject *obj, QEvent *event)
{
    qDebug() << event->type() << "\n";
    return QObject::eventFilter(obj, event);
}
I have tried doing
EvFilter* evf = new EvFilter();
view->installEventFilter(evf);
I also tried it on every other element (profile, page, dialog), but couldn't seem to get any events corresponding to mouse or gestures.
What am I doing wrong? How could I prevent that behaviour in the entire webview?
After adding the event listener to the QApplication object, I can detect TouchBegin, TouchUpdate, TouchCancel and TouchEnd events, but nothing else (no Gesture event). I don't know how I could detect whether the touch event is the zoom gesture.
I think you could use this on the Dialog (I didn't test it). I had some other widget showing a QML QtWebView, and what I did to prevent the widget's children from getting pinch gestures is:
this->grabGesture(Qt::PinchGesture, Qt::DontStartGestureOnChildren);
This prevents the underlying widget from receiving the PinchGesture, so that you can handle it yourself.
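For reference, here is a minimal, untested sketch of how that might look on the Dialog from the question (the exact class layout is an assumption, not the asker's code):

#include <QDialog>
#include <QGesture>
#include <QGestureEvent>

class Dialog : public QDialog
{
public:
    explicit Dialog(QWidget *parent = nullptr) : QDialog(parent)
    {
        // grab pinch gestures before any child widget can start them
        grabGesture(Qt::PinchGesture, Qt::DontStartGestureOnChildren);
    }

protected:
    bool event(QEvent *e) override
    {
        if (e->type() == QEvent::Gesture) {
            auto *ge = static_cast<QGestureEvent *>(e);
            if (QGesture *pinch = ge->gesture(Qt::PinchGesture)) {
                ge->accept(pinch); // swallow the pinch so the web view never zooms
                return true;
            }
        }
        return QDialog::event(e);
    }
};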

QDateTimeEdit focus to seconds

I have a QDateTimeEdit widget in my dialog showing minutes and seconds in the form of mm:ss
When the widget gets focus and the up/down arrows are clicked, the minutes are increased/decreased by default. Is there a way to set the default focus to the seconds?
In QDateTimeEdit, there is a method named setCurrentSection(). All you have to do is set the current section to the seconds, like this:
dateTimeEdit->setCurrentSection(QDateTimeEdit::SecondSection);
In case anyone is interested.
A couple of the comments state the solution (dateTimeEdit->setCurrentSection(QDateTimeEdit::SecondSection);) "does not work".
If you use this code as-is immediately after creating a QDateTimeEdit, it fails to present the user with the desired section selected. It appears the Qt internals reset the selection (to the first section) at some point after the widget is initially shown. I did not bother to delve into the source code to discover precisely where. But the following two samples show how to work around this.
If you do not want to subclass QDateTimeEdit, this is what I had to do:
QTimer::singleShot(1000, [timeEdit]() { timeEdit->setSelectedSection(QDateTimeEdit::SecondSection); });
It works, and you can see the selection move from the first section when it is initially shown to the seconds when the timer fires. Note that the time delay size seems to be critical here. I tried with 0 and even with 100 and it did not work. For me, I needed 1000 (1 second) before it worked! My 1 second was on a standalone program which had to start up. You will need to play with that figure depending on your machine and where you call it/show the widget.
If you want "better performance" and are prepared to subclass so as to override showEvent(), I found (I am using QTimeEdit, but same for QDateTimeEdit):
class MyTimeEdit : public QTimeEdit
{
public:
    explicit MyTimeEdit(QWidget *parent = nullptr) : QTimeEdit(parent)
    {
    }

    explicit MyTimeEdit(const QTime &time, QWidget *parent = nullptr) : QTimeEdit(time, parent)
    {
    }

    void showEvent(QShowEvent *event) override
    {
        QTimeEdit::showEvent(event);
        // next line commented out because still does not work
        // this->setSelectedSection(QDateTimeEdit::SecondSection);
        // next line still requires `100` delay on my machine, but better than `1000`
        QTimer::singleShot(100, [this]() { this->setSelectedSection(QDateTimeEdit::SecondSection); });
    }
};
I still needed to use a time delay, but getting it down to 100 means the user does not see the selection start elsewhere and then move.

Ignore keyboard presses on Qt inside OSG

I'm a newbie with Qt, and have been struggling with one problem for about a month. The situation is like this:
I have an OpenSceneGraph project (which is OpenGL) and I'm trying to make a Qt interface inside the 3D scene. I don't think the way I do that matters here, but if someone wants to know more, here is a thread with more info on the OSG forum (though I didn't get a solution there). The problem is that when any key on the keyboard is pressed, the Qt controls jump around the screen and don't react to any (mouse or keyboard) events anymore. The entire program continues to work, though.
To summarize, my question is: is there a way to make Qt widgets ignore all keypresses?
I've searched a lot, but couldn't find any working solution.
Thanks in advance!
Read a bit about events in Qt. There is a section about event filtering (but please don't jump straight to it :P).
SHORT ANSWER:
void QWidget::setEnabled(bool);
The drawback is that it also disables mouse events and changes the widget style, and that's a bummer.
LONG ANSWER: FILTER EVENTS
One possibility is to filter all events on the Qt application. I suppose the function which launches your Qt code looks like this (if it is different, post it here):
int main(int argc, char* argv[]) {
    QApplication app(argc, argv);
    QWidget toplevelwidget1;
    toplevelwidget1.show();
    // stuff
    return app.exec();
}
// doesn't have to be exactly like this
You could set an event filter on the app variable. It is the more elegant solution, but it is also complicated because it filters native events and will require some work...
What you can do instead is filter only your top-level widgets or windows (the ones without parents). You define an event filter (which is a QObject) like:
class KeyboardFilter : public QObject
{
    Q_OBJECT
    ...
protected:
    bool eventFilter(QObject *obj, QEvent *event);
};

bool KeyboardFilter::eventFilter(QObject *obj, QEvent *event)
{
    // for all events from the keyboard, do nothing
    if (event->type() == QEvent::KeyPress ||
        event->type() == QEvent::KeyRelease ||
        event->type() == QEvent::ShortcutOverride) {
        return true;
    } else {
        // for others, do as usual (standard event processing)
        return QObject::eventFilter(obj, event);
    }
}
Then you set the filter on the desired widgets using:
myDesiredWidgetorObject->installEventFilter(new KeyboardFilter(parent));
And that's it!
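For illustration, here is how that might be wired up in the main() from above (a sketch, assuming the filter is meant for the top-level widget):

int main(int argc, char* argv[]) {
    QApplication app(argc, argv);
    QWidget toplevelwidget1;
    // parenting the filter to the widget cleans it up automatically
    toplevelwidget1.installEventFilter(new KeyboardFilter(&toplevelwidget1));
    toplevelwidget1.show();
    return app.exec();
}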

Design pattern for mouse interaction

I need some opinions on what is the "ideal" design pattern for a general mouse
interaction.
Here is the simplified problem. I have a small 3D program (Qt and OpenGL) and
I use the mouse for interaction. Every interaction is normally not just a
single function call; it is mostly performed by up to 3 function calls (initiate, perform, finalize).
For example, camera rotation: here the initial function call delivers the first mouse position,
whereas the performing function calls update the camera, etc.
For only a couple of interactions, hardcoding these (inside mousePressEvent, mouseReleaseEvent, mouseMoveEvent, wheelEvent, etc.)
is not a big deal, but if I think about a more advanced program (e.g. 20 or more interactions), then a proper design is needed.
Therefore, how would you design such interactions in Qt?
I hope I made my problem clear enough; otherwise, don't hesitate to complain :-)
Thanks
I suggest using polymorphism and the factory method pattern. Here's an example:
In my Qt program I have QGraphicsScenes and QGraphicsItems with mousePressEvent, mouseMoveEvent, and mouseReleaseEvent, which look like this:
void CustomItem::mousePressEvent(QGraphicsSceneMouseEvent *event)
{
    // call factory method, which returns a subclass depending on where the click occurred
    dragHandler = DragHandler::createDragHandler(event /* and other relevant stuff */);
}

void CustomItem::mouseMoveEvent(QGraphicsSceneMouseEvent *event)
{
    dragHandler->onMouseMove(event);
}

void CustomItem::mouseReleaseEvent(QGraphicsSceneMouseEvent *event)
{
    dragHandler->onMouseRelease(event);
    delete dragHandler;
}
The idea in this particular case is that depending on where I click on CustomItem, mouse pressing, moving, and releasing will have different functionality. For example, if I click on the edge of the item, dragging will resize it, but if I click in the middle of the item, dragging will move it. DragHandler::onMouseMove and DragHandler::onMouseRelease are virtual functions that are reimplemented by subclasses to provide the specific functionality I want depending on where the mouse press occurred. There's no need for DragHandler::onMousePress because that's basically the constructor.
This is of course a rather specific example, and probably not exactly what you want, but it gives you an idea of how you can use polymorphism to clean up your mouse handling.
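The answer leaves out DragHandler itself, so here is a rough sketch of what the base class and factory method might look like. The concrete handlers (MoveHandler, ResizeHandler), the edge-margin test, and the use of QGraphicsRectItem are illustrative assumptions, not the answer's actual code:

#include <QGraphicsRectItem>
#include <QGraphicsSceneMouseEvent>

class DragHandler
{
public:
    virtual ~DragHandler() {}
    virtual void onMouseMove(QGraphicsSceneMouseEvent *event) = 0;
    virtual void onMouseRelease(QGraphicsSceneMouseEvent *event) = 0;
    // Factory method: pick a concrete handler based on where the press landed.
    static DragHandler *createDragHandler(QGraphicsSceneMouseEvent *event,
                                          QGraphicsRectItem *item);
};

// Moves the whole item while the mouse is dragged.
class MoveHandler : public DragHandler
{
public:
    explicit MoveHandler(QGraphicsRectItem *item) : m_item(item) {}
    void onMouseMove(QGraphicsSceneMouseEvent *event) override
    {
        m_item->setPos(m_item->pos() + event->scenePos() - event->lastScenePos());
    }
    void onMouseRelease(QGraphicsSceneMouseEvent *) override {}
private:
    QGraphicsRectItem *m_item;
};

// Resizes the item by dragging its bottom-right corner.
class ResizeHandler : public DragHandler
{
public:
    explicit ResizeHandler(QGraphicsRectItem *item) : m_item(item) {}
    void onMouseMove(QGraphicsSceneMouseEvent *event) override
    {
        QRectF r = m_item->rect();
        r.setBottomRight(event->pos());
        m_item->setRect(r.normalized());
    }
    void onMouseRelease(QGraphicsSceneMouseEvent *) override {}
private:
    QGraphicsRectItem *m_item;
};

DragHandler *DragHandler::createDragHandler(QGraphicsSceneMouseEvent *event,
                                            QGraphicsRectItem *item)
{
    // Clicks within a few pixels of the edge resize; anything else moves.
    const qreal margin = 5.0;
    const QRectF inner = item->rect().adjusted(margin, margin, -margin, -margin);
    if (!inner.contains(event->pos()))
        return new ResizeHandler(item);
    return new MoveHandler(item);
}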
Qt makes this beautifully simple.
Instead of all the switch mouse_mode: stuff you used to write, simply have each mouse event handler function emit a signal, e.g. mouseDown/mouseUp/mousePosition, and use signals/slots to route those to the appropriate model functions.
Then you can accommodate different uses of the mouse (selecting, rotating, editing, etc.) by connecting/disconnecting different slots to the signals emitted in the mouse...Event() handlers, as in the sketch below.
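A rough sketch of that idea (the widget and slot names here are illustrative, not from the answer):

#include <QWidget>
#include <QMouseEvent>
#include <QPoint>

class ViewportWidget : public QWidget
{
    Q_OBJECT
signals:
    void mouseDown(const QPoint &pos);
    void mousePosition(const QPoint &pos);
    void mouseUp(const QPoint &pos);

protected:
    // Each handler just re-emits the event as a signal.
    void mousePressEvent(QMouseEvent *event) override   { emit mouseDown(event->pos()); }
    void mouseMoveEvent(QMouseEvent *event) override    { emit mousePosition(event->pos()); }
    void mouseReleaseEvent(QMouseEvent *event) override { emit mouseUp(event->pos()); }
};

// Switching the mouse "mode" is then just a matter of re-wiring, e.g.:
//   disconnect(view, &ViewportWidget::mousePosition, camera, &Camera::rotate);
//   connect(view, &ViewportWidget::mousePosition, selection, &Selection::update);
// where Camera and Selection are whatever model classes own those operations.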
I find Apple's UIGestureRecognizer design quite nice and extendable.
The idea is to decouple the recognition of the gesture (or interaction) and the action that will be triggered.
You need to implement a basic or abstract GestureRecognizer class that is able to recognize a certain interaction or gesture based on events such as mousePressEvent, mouseReleaseEvent, mouseMoveEvent, wheelEvent, etc. GestureRecognizers have a target to which they report changes periodically.
For example, your very basic class would look like this (sorry for my poor semi-C++ pseudo-code... I haven't used it much recently):
class Recognizer {
    int state; // ex: 0:possible, 1:began, 2:changed, 3:ended/recognized, 4:cancelled
protected:
    void setTarget(void *theTarget); // or, even better, a target/method tuple. Here target is assumed to have a method gestureHandler(Recognizer *r)
    virtual void mousePress() = 0;
    virtual void mouseRelease() = 0;
    virtual void mouseMove() = 0;
    virtual void mouseWheel() = 0;
    ...
};
And if you want to detect a swipe with the mouse
class SwipeRecognizer : Recognizer {
    int direction; // ex: 0:left2right, 1:bottom2top, 2:...
private:
    void mousePress() {
        state = 0; // possible. You don't know yet if the mouse is going to swipe, simply click, long press, etc.
        // save some values so you can calculate the direction of the swipe later
        target.gestureHandler(this);
    }

    void mouseMove() {
        if (state == 0) {
            state = 1; // it was possible, now you know the swipe began!
            direction = ... // calculate the swipe direction here
        } else if (state == 1 || state == 2) { // state is began or changed
            state = 2; // changed ... which means the mouse is still dragging
            // probably you want to make more checks here, e.g. you are still swiping in the direction you started, maybe velocity thresholds; if any of your conditions are not met, cancel the gesture recognizer by setting its state to 4
        }
        target.gestureHandler(this);
    }

    void mouseRelease() {
        if (state == 2) { // is swiping
            state = 3; // swipe ended
        } else {
            state = 4; // it was not swiping, so simply cancel the tracking
        }
        target.gestureHandler(this);
    }

    void mouseWheel() {
        // if this method is called then this is definitely not a swipe, right?
        state = 4; // cancelled
        target.gestureHandler(this);
    }
};
Just make sure the above methods are called when the events happen; they should call the target when needed.
This is how the target would look to me:
class Target {
    ...
    void gestureHandler(Recognizer *r) {
        if (r->state == 2) {
            // Is swiping: move the OpenGL camera using some parameters your recognizer class brings
        } else if (r->state == 3) {
            // Ended: stop moving the OpenGL camera
        } else if (r->state == 4) {
            // Cancelled: maybe restore the camera to its original position?
        }
    }
};
Apple's implementation of UIGestureRecognizer is quite nice and allows registering several target/method pairs for the same recognizer and several recognizers on the same view.
UIGestureRecognizers have a delegate object that is used to get information about other gesture recognizers, for example whether two gestures can be detected at the same time, or whether one must fail as soon as the other is detected, etc.
Some gesture recognizers will require more overrides than others, but the big pro of this is that their output is the same: a handler method that reports the current state (and other info).
I think it is worth taking a look at.
Hope it helps :)

Getting MouseMoveEvents in Qt

In my program, I'd like to have mouseMoveEvent(QMouseEvent* event) called whenever the mouse moves (even when it's over another window).
Right now, in my mainwindow.cpp file, I have:
void MainWindow::mouseMoveEvent(QMouseEvent* event) {
    qDebug() << QString::number(event->pos().x());
    qDebug() << QString::number(event->pos().y());
}
But this seems to only be called when I click and drag the mouse while over the window of the program itself. I've tried calling
setMouseTracking(true);
in MainWindow's constructor, but this doesn't seem to change anything (mouseMoveEvent is still only called when I hold a mouse button down, regardless of where the cursor is). What's the easiest way to track the mouse position globally?
You can use an event filter on the application.
Define and implement bool MainWindow::eventFilter(QObject*, QEvent*). For example
bool MainWindow::eventFilter(QObject *obj, QEvent *event)
{
    if (event->type() == QEvent::MouseMove)
    {
        QMouseEvent *mouseEvent = static_cast<QMouseEvent*>(event);
        statusBar()->showMessage(QString("Mouse move (%1,%2)").arg(mouseEvent->pos().x()).arg(mouseEvent->pos().y()));
    }
    return false;
}
Install the event filter when the MainWindow is constructed (or somewhere else). For example:
MainWindow::MainWindow(...)
{
    ...
    qApp->installEventFilter(this);
    ...
}
I had the same problem, further exacerbated by the fact that I was trying to call this->update() to repaint the window on a mouse move and nothing would happen.
You can avoid having to create the event filter by calling setMouseTracking(true), as @Kyberias noted. However, this must be done on the viewport, not your main window itself. (The same goes for update.)
So in your constructor you can add a line this->viewport()->setMouseTracking(true) and then override mouseMoveEvent rather than creating this filter and installing it.
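Note that viewport() exists on QAbstractScrollArea subclasses such as QGraphicsView, QScrollArea, or QTextEdit, not on a plain QMainWindow. A small, untested sketch assuming a QGraphicsView subclass:

#include <QGraphicsView>
#include <QMouseEvent>
#include <QDebug>

class View : public QGraphicsView
{
public:
    explicit View(QWidget *parent = nullptr) : QGraphicsView(parent)
    {
        // receive mouse move events even when no button is pressed
        viewport()->setMouseTracking(true);
    }

protected:
    void mouseMoveEvent(QMouseEvent *event) override
    {
        qDebug() << event->pos();
        QGraphicsView::mouseMoveEvent(event);
    }
};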
