I want to hide the system cursor for 10 seconds for some reason, but I found that
cursor.setShape(Qt.BlankCursor)
can only hide the mouse cursor within the QWidgets it is associated with, not system-wide. That is, when the mouse cursor is hovering over my QWidgets it is invisible, otherwise it is visible. So is there any way to hide the system cursor system-wide?
The win32 system call ShowCursor works per-window only. You can access it from either ctypes or pywin32's win32api (a small ctypes sketch follows the two options below). But apparently the cursor drawing is controlled by the display driver and can only be affected for specific windows. You can't force another window to hide its cursor. Two options:
1. Use ShowCursor(False) on your window and, for the desktop background, create a root-window application that you spawn from your GUI app and that hides the cursor; your app would make it exit after 10 seconds. But again, if the user moves the mouse over other applications' windows they will see the cursor.
2. Make your application a root-window application; then, while it is in view, ShowCursor(False) will make the cursor disappear everywhere on screen except the system toolbar (which is a good thing).
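For completeness, here is a minimal sketch of what calling ShowCursor from Python could look like, assuming PyQt5 and ctypes (pywin32's win32api.ShowCursor would work the same way). As explained above, it only affects windows owned by your own process.

import ctypes
from PyQt5.QtCore import QTimer

user32 = ctypes.windll.user32

def hide_cursor_for(msecs=10000):
    # Decrement the Win32 cursor display counter; the cursor stays
    # hidden while it is over this application's windows.
    user32.ShowCursor(False)
    # Restore the counter after the delay (10 seconds by default).
    QTimer.singleShot(msecs, lambda: user32.ShowCursor(True))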
I don't think it is a good idea anyway; what if your app crashes while the mouse is hidden? Then the user can't use their desktop easily. There is definitely a good reason why this is not allowed.
Best approach is to think of a different solution to whatever problem led you to try cursor hiding.
Related
The qApp->setOverrideCursor() method works successfully when I want to hide the mouse cursor, except in one condition: if I add a modal dialog and, while it is shown, the cursor moves outside the dialog's borders, the cursor is shown again. Do you have any idea about this problem?
It does not matter what the solution for hiding the mouse cursor is, whether in Qt or at the operating-system level. My operating system is Windows 7.
You cannot hide the mouse cursor when it leaves your window (or dialog window), because it is then handled by the window manager of your OS. A workaround would be to constrain the mouse to your window/dialog, so it cannot leave. You will either need to look through MSDN to find the specific Windows functions to do it, or do it like in kshegunov's code example on the Qt forums: https://forum.qt.io/topic/61832/restrict-mouse-cursor-movement/12
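As a rough illustration of that workaround (not the code from the linked thread), the Win32 function in question is ClipCursor; here is a Python/ctypes sketch that confines the cursor to a Qt dialog's screen rectangle while the dialog is shown. The helper names are made up.

import ctypes
from ctypes import wintypes

def confine_cursor_to(widget):
    # Map the widget's rectangle to global (screen) coordinates.
    top_left = widget.mapToGlobal(widget.rect().topLeft())
    bottom_right = widget.mapToGlobal(widget.rect().bottomRight())
    rect = wintypes.RECT(top_left.x(), top_left.y(),
                         bottom_right.x(), bottom_right.y())
    # ClipCursor confines the system cursor to the given rectangle.
    ctypes.windll.user32.ClipCursor(ctypes.byref(rect))

def release_cursor():
    # Passing NULL removes the confinement again.
    ctypes.windll.user32.ClipCursor(None)

Call confine_cursor_to(dialog) when the modal dialog opens and release_cursor() when it closes; with the override cursor active, the pointer can then never leave the dialog and become visible.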
Background
I'm building an application which runs in the background and where the mouse cursor is moved programmatically into a dialog when the dialog "pops up". I have done this using QCursor.setPos.
Problem
The problem I'm having is that if the mouse button is pressed down while the user is interacting with something outside the application, this might lead to unwanted things happening. For example, if the user is changing the volume and the mouse is moved, the volume might go to max or min.
Question
Is there any way (in Qt) to do a mouseup programmatically?
If I do this before changing the position of the cursor, it seems to me that there is less risk of problems (though there might be other problems resulting from this approach).
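Qt itself has no call to release the physical mouse button system-wide, but if an OS-level synthetic event is acceptable on Windows, a left-button-up can be injected via ctypes just before QCursor.setPos. A minimal sketch:

import ctypes

MOUSEEVENTF_LEFTUP = 0x0004  # Win32 flag: left mouse button released

def release_left_button():
    # Inject a synthetic button-up at the current cursor position.
    ctypes.windll.user32.mouse_event(MOUSEEVENTF_LEFTUP, 0, 0, 0, 0)

If the button was not actually down, the extra up event should normally be harmless, but that is an assumption worth testing in your setup.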
I am developing a touch application for Windows 7 with Qt/QML. The end-user device has Windows 7's native touch behavior, i.e.: when touching the screen, a point appears at the last-touched point, and when the physical touch ends, Windows moves that point to the now-touched point and fires the on-clicked event.
Compared to the behavior one knows from standard Windows mouse usage, this leads to a different behavior as soon as it comes to e.g. clicking some button: a mouse user will expect the button to change to its pressed-down color when the mouse button goes down, and back to the default color when the mouse button goes up.
In my application, I want to have a customized way of touch feedback: What is currently being touched should be marked using changed colors of buttons, imitating a "mouse goes down" when the actual physical touch begins and imitating a "mouse goes up" when the actual physical touch ends.
My application will run fullscreen, so an actual possibility would be to change the system's behavior at application start and change it back to the default at application end.
Such a behavior would effectively be the same as the standard behavior on e.g. all Android devices I know.
I searched through the MouseArea and MultiPointTouchArea elements, trying to find a way to make the click-reaction behavior different from the standard behavior. However, I did not even find a way to capture the beginning of the actual touch... All the things which I want to happen at the beginning of the touch actually happen when the touching ends.
Edit:
It does not matter whether I use a QML Button or a MouseArea plus the MouseArea.pressed property: nothing will be "pressed" before the finger leaves the touch screen and the onClicked() event is called.
Possibly related:
Adobe AIR: touch screen doesn't trigger mouse down event correctly - but I did not find a way to access the functions like Multitouch.inputMode (which are mentioned in the first reply) from a native Qt application.
How can I achieve the described behavior for my application?
The solution to this issue is to disable "Press and Hold" for the application. As a system-wide setting, this can be done via ...
Control Panel -> Pen and Touch -> Touch -> Press and Hold -> Settings -> uncheck 'Enable press and hold for right-clicking'
The only solution I found to do this in native code can be found here:
http://msdn.microsoft.com/en-us/library/ms812373.aspx
I checked that this still works, at least for Windows 7. To get it working with QML, I searched for the QWindow* in QQmlApplicationEngine::rootObjects() and used its winId as an HWND. With that HWND, I called the TogglePressAndHold function from the link above before app.exec().
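Not part of the original answer, but for illustration, here is roughly the same idea in Python/ctypes with PyQt5, assuming the QML root item is a Window and assuming the window property name and the TABLET_DISABLE_PRESSANDHOLD flag value from the linked MSDN sample (the sketch skips the sample's atom bookkeeping):

import ctypes
from PyQt5.QtWidgets import QApplication
from PyQt5.QtQml import QQmlApplicationEngine

TABLET_DISABLE_PRESSANDHOLD = 0x00000001  # flag value from the MSDN sample

def disable_press_and_hold(window):
    # Use the root QML window's native handle as an HWND and set the
    # tablet/pen service property that turns the gesture off.
    hwnd = int(window.winId())
    ctypes.windll.user32.SetPropW(
        hwnd,
        "MicrosoftTabletPenServiceProperty",
        ctypes.c_void_p(TABLET_DISABLE_PRESSANDHOLD))

app = QApplication([])
engine = QQmlApplicationEngine("main.qml")  # path is just a placeholder
disable_press_and_hold(engine.rootObjects()[0])
app.exec_()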
I have a main window (on MS Windows) and I want to have sub-windows or sub-panels with free screen movement. I can use a dialog with the Qt::SplashScreen flag, but when I am on these sub-windows I lose the focus caption of the main window. Is there any trick to do what I want? (Something like a multi-focus...)
Maybe it is impossible?
I'm not sure what you mean by losing the focus.
When I create an application with multiple windows, this is what I do: in the sub-window widget, I set the parent to the main window and set the Qt::Tool flag. This has multiple effects: the window manager sees them as one window, and when you focus any of the windows, all of them are raised.
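In PyQt terms, that recipe looks roughly like the following sketch (names are illustrative):

from PyQt5.QtCore import Qt
from PyQt5.QtWidgets import QApplication, QMainWindow, QWidget

app = QApplication([])
main = QMainWindow()

# Parent the sub-window to the main window and give it the Qt.Tool flag:
# the window manager then treats the pair as one application, and when
# either window is focused, both are raised together.
panel = QWidget(main, Qt.Tool)
panel.resize(200, 150)

main.show()
panel.show()
app.exec_()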
Do you want QMdiArea? Or a focus proxy?
In Qt, all top-level windows are independent, none is the "main". If you want to nominate one as a main window and have it steal focus from the others, then you will have to implement that manually.
Sounds like you just want widgets that you can move around freely on a parent widget/window, without triggering the "window focus changed" event between native Windows windows.
I'm not sure if there is a ready-made solution for that, but adding some grab/move/resize handling to a widget's edges shouldn't be that hard, should it?
I'd simply catch mouse-down/up events on certain areas (these should probably be widgets of their own with a link to the parent movable widget) and have them resize/move the widget when the mouse moves.
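As a rough sketch of that idea in PyQt (move only, no resize handles; the class name is made up), a child widget can drag itself around inside its parent like this:

from PyQt5.QtWidgets import QFrame

class MovablePanel(QFrame):
    """A child widget that can be dragged around inside its parent."""

    def __init__(self, parent=None):
        super().__init__(parent)
        self._drag_offset = None

    def mousePressEvent(self, event):
        # Remember where inside the panel the grab started.
        self._drag_offset = event.pos()

    def mouseMoveEvent(self, event):
        if self._drag_offset is not None:
            # Move the panel so the grabbed point follows the cursor.
            self.move(self.mapToParent(event.pos() - self._drag_offset))

    def mouseReleaseEvent(self, event):
        self._drag_offset = None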
I need to keep a NativeWindow I am creating on top of the main window of the application.
Currently I am using alwaysInFront = true, which is not limited to the windows of the application. I can successfully synchronize the minimize/restore/move/resize actions, so the top window behaves appropriately in those cases. However, using this option has the drawback that if I alt-tab to another application, the window goes on top of that application too.
Because of the above, I am trying to get it to work without using alwaysInFront. I have tried using orderInFrontOf and orderToFront, which gets it in place, but when I click an area in the main window the top one becomes hidden, i.e. AIR brings the main window to the front.
I have tried capturing activate/deactivate events, but it only helps on the first click, so on the second click the top window becomes hidden again. I also tried making the top window active when the main one becomes active, but that causes the main one to lose focus and I can't click on anything.
P.S. I am doing this to improve the behavior of an HTMLOverlay I am using - see Flex Air HTMLLoader blank pop up window when flash content is loaded
Listening for Event.DEACTIVATE and calling event.preventDefault() should work. Not sure if that is what you have tried, but I have an app where that does the trick.
I ended up turning the alwaysInFront option on/off based on whether the main window or the top window was active, i.e. if neither was active I turned it off. This was in addition to what I mentioned in the question.
That way, when the user switches to another application, the window doesn't go on top of the other apps. I would still prefer a solution where I don't have to use the alwaysInFront option, or even better an alternate solution to the Flex loading-Flash-in-external-sites issue I linked to above.
P.S. I will try to check with the owner of the HTMLOverlay about submitting a patch (it's an improvement, although it's tied to an app that doesn't open extra windows when opening the overlay).
Update: I have committed the changes to the HTMLOverlay.
I'm trying to do something very similar. In an AIR application, I have one large full screen window which is essentially the "desktop". I always want this window to stay behind all other windows in my app. There are, however, some items on the "desktop" window that need to be clickable.
There appears to be no clean way to force a window to maintain its position in the window ordering.
What I've settled on so far, which isn't perfect, is to make all other windows in my app use the alwaysOnTop property but bind this to a global var (ugh) that I maintain to track the overall application-level active/inactive state. This way, when I switch to another app, my windows don't float above all the other app windows - they correctly move behind as expected.
Then, I have a regular (alwaysOnTop=false) window that is fully transparent as an "overlay" to the desktop window on which I can place various interactive controls. This window is OK to come forward since it's transparent and my other windows are alwaysOnTop.
Finally, and crucially, I install three event listeners on the "desktop" window as follows:
protected function onApplicationComplete(event:Event):void
{
    this.addEventListener(MouseEvent.MOUSE_DOWN, onClickHandler, true, 1000, true);
    this.addEventListener(MouseEvent.CLICK, onClickHandler, true, 1000, true);
    this.nativeWindow.addEventListener(Event.ACTIVATE, onActivateWindow, false, -1);
}

protected function onActivateWindow(event:Event):void
{
    trace("sent via activate to back");
    orderInBackOf(bigTransparentWindow);
}

protected function onClickHandler(event:MouseEvent):void
{
    trace("sent via click to back");
    orderInBackOf(bigTransparentWindow);
}
I'm not entirely happy with all this since there is still some occasionally noticeable flicker of objects in the overlay window - it appears that the "Desktop" window gets ordered in front of it, an update of some sort happens, and then it gets forced behind again.
Any better solutions welcome!