Is it possible in Qt to catch the event of an extra monitor being connected or disconnected? - qt

When I use my laptop at home there is an extra monitor connected to it. However, sometimes I disconnect the second display (D2) temporarily (for instance because I need the Thunderbolt connection on my Mac). Usually the OS takes care of moving all the windows that were visible on D2 to the first display (D1). However, if the resolution of D2 is much higher than D1's, windows are shrunk just enough to fill the whole screen of D1. This is ugly and inconvenient for the user.
So my question: is it possible to write an event handler or event filter for the occurrence of the event of connecting or disconnecting a second monitor? There might be other uses than the one I described above. Perhaps it is impossible because the OS does not tell any application that the windows were moved.

QDesktopWidget provides the signal screenCountChanged(int). One thing to note: if screen mirroring is enabled, the screen count will be 1, and if the mirrored screen is detached the screenCountChanged(..) signal is not emitted, but the main screen may resize, so you may also want to check out QDesktopWidget::resized(int).
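A minimal sketch of wiring up both signals mentioned above (Qt 5 widget API; in Qt 5.4 and later the QScreen-based QGuiApplication::screenAdded()/screenRemoved() signals are the usual alternative):

```cpp
#include <QApplication>
#include <QDesktopWidget>
#include <QDebug>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);
    QDesktopWidget *desktop = QApplication::desktop();

    // Fires when a monitor is connected or disconnected.
    QObject::connect(desktop, &QDesktopWidget::screenCountChanged,
                     [](int count) {
        qDebug() << "screen count changed:" << count;
    });

    // Also watch for geometry changes, to catch the mirroring case
    // where the count stays at 1 but the main screen resizes.
    QObject::connect(desktop, &QDesktopWidget::resized,
                     [](int screen) {
        qDebug() << "screen" << screen << "was resized";
    });

    return app.exec();
}
```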

Related

How to detect screen's DPI change in Qt?

Since I have a window (derived from QWidget) which may switch between two monitors and dynamically update according to screen DPI, it seems that I can only listen to screenChanged(QScreen*) signal. It works fine when I drag the window between monitors.
But it doesn't work when I open the app on another monitor and switch the menu bar between the two monitors (through System Preferences -> Displays -> Arrangement on macOS). It seems the signal is not emitted in that situation.
Which signal should I listen to? Is there a better way?
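For reference, a sketch of the approach described in the question, combined with the application-wide QGuiApplication::primaryScreenChanged() signal (Qt 5.6+), which may also fire when the primary display is switched in the Arrangement panel; whether it covers the exact scenario above is untested:

```cpp
#include <QApplication>
#include <QWidget>
#include <QWindow>
#include <QScreen>
#include <QDebug>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);
    QWidget w;
    w.show(); // windowHandle() is only valid once the native window exists

    // Per-window: fires when this window moves to another screen.
    QObject::connect(w.windowHandle(), &QWindow::screenChanged,
                     [](QScreen *screen) {
        qDebug() << "window moved to screen" << screen->name()
                 << "with logical DPI" << screen->logicalDotsPerInch();
    });

    // Application-wide: fires when the primary screen changes.
    QObject::connect(&app, &QGuiApplication::primaryScreenChanged,
                     [](QScreen *screen) {
        qDebug() << "primary screen is now" << screen->name();
    });

    return app.exec();
}
```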

Qt: What governs mouse event emission rate?

I have a callback that does some work when the mouse is moved. It feels weird not to govern it to a maximum rate. What governs how often mouse callbacks occur when the user moves the mouse?
The mouse device driver. If you change the mouse settings in your system's configuration panel you will see a difference in behavior. The window system sends those events to the main process, where they are handled by the QApplication and then propagated to the right widget.
Unless an event filter has been set, event delivery to a widget is as seamless as in a native app. After all, the Qt event system matches what the different OSes use for their window event systems.
If something feels weird, double-check your callback implementation. It is very unlikely that the issue is somewhere else.
I think it depends on the mouse's polling rate. Polling rate is how often the mouse reports its position, measured in Hz. For example, a mouse with a 125 Hz polling rate reports its position 125 times a second (every 8 milliseconds).
A higher polling rate can lead to more callbacks in your case when you move the mouse, but it will also use more CPU resources.

How can I remove the Chromecast volume button?

I developed https://play.google.com/store/apps/details?id=com.kunert.einsteinstictactoe. When the user hits the Chromecast button the first time he is able to connect to a Chromecast device. If he hits it the second time after being connected he is able to adjust the volume or to disconnect.
As my application currently doesn't support sound I want to hide the volume adjustment.
Is this possible?
It is possible. You need to define your own MediaRouteDialogFactory and in there, return your own MediaRouteControllerDialogFragment implementation. In your implementation of that fragment, in onCreateControllerDialog, you need to set setVolumeControlEnabled(false). See the package com.google.sample.castcompanionlibrary.cast.dialog.video in CCL which has all of these for its implementation.

Getting a mouse drop event in Qt 5 if mouse doesn't move before mouse release

Something seems to have changed in Qt 5: you can't get a drop or move event if you don't move at least one pixel from the start point where you were when QDrag::exec() was called. Try putting a breakpoint in the dropEvent of the Draggable Icons Sample, then click a boat and release it without moving the mouse. That generates an "ignore" without any drop signal.
(This is on Kubuntu 13.10 with Qt 5.1.)
When teaching how to start a drag operation, the documentation suggests you might use manhattanLength() to determine whether the mouse has moved enough to really qualify as "the user intending to start a drag". But you don't have to use that; you can start a QDrag on the click itself.
Anyone know of a workaround to have that same kind of choice on the drop side, or is that choice gone completely? :-/
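For reference, the conventional drag-start pattern being discussed looks something like this (a sketch; DragSource is an illustrative name, and the distance check is the documented convention, not a requirement):

```cpp
#include <QApplication>
#include <QDrag>
#include <QLabel>
#include <QMimeData>
#include <QMouseEvent>

class DragSource : public QLabel {
protected:
    void mousePressEvent(QMouseEvent *event) override {
        dragStart_ = event->pos(); // remember where the press happened
    }

    void mouseMoveEvent(QMouseEvent *event) override {
        // The documented convention: only start the drag after the
        // cursor has traveled startDragDistance() in Manhattan length.
        // Nothing stops you from starting the QDrag on the press itself.
        if ((event->pos() - dragStart_).manhattanLength()
                < QApplication::startDragDistance())
            return;

        QDrag *drag = new QDrag(this);
        QMimeData *mime = new QMimeData;
        mime->setText(text());
        drag->setMimeData(mime);
        drag->exec(Qt::CopyAction); // blocks until drop or cancel
    }

private:
    QPoint dragStart_;
};
```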
Why I care: I've long had frustrations trying to get a tight control on mouse behavior in GUI apps—Qt included. There seems to be no trustworthy state transition diagram you can draw of the invariants. It's a house of cards you can disprove very easily with simple tests like:
virtual void enterEvent(QEvent * event) {
    Q_ASSERT(!_entered);
    _entered = true;
}

virtual void leaveEvent(QEvent * event) {
    Q_ASSERT(_entered);
    _entered = false;
}
This breaks all kinds of ways, and how it breaks depends on the platform. (For the moment I'll talk about Kubuntu 13.10 with Qt 5.1.) If you press the mouse button and drag out of the widget, you'll receive a leaveEvent when you cross the boundary...and then another leaveEvent when the button is released. If you leave the window and activate another app in a window on screen and then click inside the widget to reactivate the Qt app, you'll get two consecutive enterEvents.
Repeat this pattern for every mouse event and try to get a solid hold on the invariants...good luck! Nailing these down into a bulletproof app that "knows" its state and doesn't fall apart (especially in the face of wild clicking and alt-Tabbing) is a bit of a lost cause.
This isn't good if your program does allocations and has heavy processing, and doesn't want to do a lot of sweeping under the rug (e.g. "Oh, I was doing some processing in response to being entered... but I just got entered again without a leave. Hm, I guess that happens! Throw the current calculations away and start again...")
In the past what I've done is to handle all my mouse operations (even simple clicking) with drag & drop. Getting the OS drag & drop facility involved in the operation tended to produce a more robust experience. I can only presume this is because the testers actually had to consider things like task switching with alt-Tab, etc. and not cause multiple drop operations or just forget that an operation had been started.
But the "baked in at a level deeper than the framework" aspect actually makes this one-pixel-move requirement impossible to change. I tried to hack around it by setting a timer event, then faking a QMouseEvent to bump the cursor to a new position once the drag was in effect. However, I surmise that the drag and drop is hooked in at the platform level, and doesn't consult the ordinary Qt event queue: src/plugins/platforms/xcb/qxcbdrag.cpp
The issue has--as of 1-May-2014--been acknowledged as a bug by the Qt team:
https://bugreports.qt-project.org/browse/QTBUG-34331
It seems that my bountying it here finally brought it to their attention, though it did not generate any SO answers I could accept to finalize the issue. So I'm writing and accepting my own. Good work, me. (?) Sorry for not having a better answer. :-/
There is another unfortunate side effect of the Qt5 change, pointed out by a "Dmitry Mordvinov":
Same problem here. Additionally app events are not handled till the first mouse event after drag started and this is really nasty bug. For example all app animations are suspended during that moment or application hangs up when you try to drag with touch monitor.
@dvvrd had to work around it, but felt the workaround was too ugly to share. So it seems that if you're affected by the problem, the right thing to do is go weigh in and add your voice to the issue tracker to perhaps raise the priority of a solution.
(Or even better: patch it and submit the patch. 'tis open source, after all...)

KeyPress events for X windows

I'm trying to write a little app to capture keystrokes for a window under X and then display them onto the screen using OSD or something. The idea is to use it for screencasts and stuff like that.
I tried some surgery on xev and got it to work fine, but then noticed something funny. If I use xev on the window it itself creates, the KeyPress and KeyRelease events are registered and I can see them. However, if I use the -id switch to have xev monitor another window and try to log keystrokes there, the KeyPress and KeyRelease events are not always displayed. I seem to get PropertyNotify events when some things happen, but not the KeyPress and KeyRelease events I'm interested in.
Some windows behave as expected (e.g. gnome-terminal). Some others don't (e.g. emacs-gtk).
How do I get the keystrokes for these windows?
Key events go to the window that has focus, which is not always the window that appears to have focus. When I try to use xev on my Firefox window, key events go to one of its unviewable children (relative upper-left at (-1,-1), size (1,1)).
You can use XGetInputFocus() to find out which window has focus.
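A minimal sketch of that approach: query the real focus window with XGetInputFocus() and select key events on it. Note this only keeps working while that window holds focus, and some toolkits may still route keys elsewhere:

```cpp
#include <X11/Xlib.h>
#include <cstdio>

int main() {
    Display *dpy = XOpenDisplay(nullptr);
    if (!dpy)
        return 1;

    // Find the window that actually has keyboard focus; it may be an
    // unviewable child of the window that appears focused.
    Window focused;
    int revert_to;
    XGetInputFocus(dpy, &focused, &revert_to);
    std::printf("focused window: 0x%lx\n", focused);

    // Ask the server to report key events happening in that window.
    XSelectInput(dpy, focused, KeyPressMask | KeyReleaseMask);
    for (;;) {
        XEvent ev;
        XNextEvent(dpy, &ev);
        if (ev.type == KeyPress)
            std::printf("KeyPress keycode %u\n", ev.xkey.keycode);
        else if (ev.type == KeyRelease)
            std::printf("KeyRelease keycode %u\n", ev.xkey.keycode);
    }
}
```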
