My ddd window doesn't take any keyboard input when my mouse cursor is moved out of the window. Although the ddd window is still active, it doesn't take any input, and I have to move my mouse back over its window to make it work again. Is there a way to configure it so that it takes input no matter where the mouse pointer is, just like any other X window (xterm, for example)?
I had the same problem; DDD behaved as if focus followed the mouse, while all other windows (and the setting in GNOME) were click-to-focus. This was with DDD 3.3.11.
For me, it (mostly) works now, after I changed "Keyboard focus" to "Click to type" on the Startup tab of the Preferences dialog.
If ddd implements some sort of focus-follows-mouse itself instead of relying on the window manager, I'm afraid it won't be easy to prevent. The good news is that ddd doesn't do anything very smart; it's just a front-end for real debuggers (say, gdb). So if you don't find any way to fix it, you can easily switch to another front-end (say, Emacs).
You are under Unix, right? This depends on your window manager. The behavior you describe is called "focus follows mouse."
OTOH, the commenter is right to point out that if DDD is the only program showing this problem, it might be something else. One idea is to turn off some of the tool auto-raise magic, as described in the manual.
My AutoIt script simulates mouse clicks: first a right click at one place, then a left click at one of many other points. I achieved that with MouseClick(), and it works fine.
But now I want the script to work in the "background", so I used ControlClick(). But there's no control ID. This is what I tried:
$square = Floor(Random(0, $length)) ; pick a random index into the coordinate arrays
;MouseClick("right", 1634, 195, 1, 1) ; first version - works fine
ControlClick("Medivia", "", "", "right", 1, 1634, 195)
;MouseClick("left", $cordX[$square], $cordY[$square]) ; first version - works fine
ControlClick("Medivia", "", "", "left", 1, $cordX[$square], $cordY[$square])
The script clicks, but only at the place where I left the mouse pointer; it does not move the pointer by itself. Could anybody help me?
Limitation: to use any of the Control* APIs from AutoIt, you need to be interacting with a real Windows control.
If you just want to do "random" clicks, you probably don't need a real Windows control and should not be relying on ControlClick.
If you're trying to click on the "background" of Windows, you probably want to just minimize all open windows, which you can accomplish with WinMinimizeAll.
GUI clicks with Qt and other frameworks without real Windows Controls
Some frameworks, like Qt, will not give you a real Windows control for many of the default GUI buttons and the like, so when you use AutoIt's Window Info tool (or many of the other UI spy tools out there), that info will be missing.
What you may need, and what I currently need, is to resort to workarounds. For your case, it would help if I could see a screenshot of the sequence you're trying to automate; I could give better advice after seeing that.
For my case, I needed to click a Quit button that had no real control, and the developer told me he didn't have (or know) a way to add accessible names to the pop-up I was trying to hook into, even though I could hook the main app's hWnd. Luckily, that quit box used a special color for the Quit button, which allowed me to use AutoIt's PixelSearch to locate it.
When you don't have convenient helpers like that, it's usually best to determine the location of the main window and then apply whatever pixel offset you need to reach what you're looking for.
Something seems to have changed in Qt 5: you can't get a drop or move event if you don't move at least one pixel from the start point where you were when QDrag::exec() was called. Try putting a breakpoint in the dropEvent of the Draggable Icons Sample, then click a boat and release it without moving the mouse. That generates an "ignore" without any drop signal.
(This is on Kubuntu 13.10 with Qt 5.1.)
When explaining how to start a drag operation, the documentation suggests you might use manhattanLength() to determine whether the mouse has moved enough to really qualify as "the user intending to start a drag". But you don't have to use that; you can start a QDrag on the click itself.
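For reference, here is a minimal sketch of starting the drag on the press itself, with no distance check (the widget class and payload text are placeholders):

#include <QWidget>
#include <QMouseEvent>
#include <QDrag>
#include <QMimeData>

class ClickDragWidget : public QWidget // hypothetical widget
{
protected:
    void mousePressEvent(QMouseEvent *event) override
    {
        if (event->button() != Qt::LeftButton)
            return;

        QMimeData *mime = new QMimeData;
        mime->setText(QStringLiteral("payload")); // placeholder payload

        QDrag *drag = new QDrag(this);
        drag->setMimeData(mime);

        // No manhattanLength()/startDragDistance() check: the drag begins on
        // the press. With the Qt 5 behavior described above, releasing without
        // moving at least one pixel ends in Qt::IgnoreAction and no dropEvent.
        drag->exec(Qt::CopyAction | Qt::MoveAction);
    }
};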
Anyone know of a workaround to have that same kind of choice on the drop side, or is that choice gone completely? :-/
Why I care: I've long had frustrations trying to get tight control over mouse behavior in GUI apps, Qt included. There seems to be no trustworthy state-transition diagram you can draw of the invariants. It's a house of cards whose assumptions you can disprove very easily with simple tests like:
// Inside a QWidget subclass, with a bool member _entered initialized to false:
virtual void enterEvent(QEvent *event) {
    Q_UNUSED(event);
    Q_ASSERT(!_entered);   // fires: two consecutive enterEvents can arrive
    _entered = true;
}

virtual void leaveEvent(QEvent *event) {
    Q_UNUSED(event);
    Q_ASSERT(_entered);    // fires: a second leaveEvent arrives when the button is released
    _entered = false;
}
This breaks in all kinds of ways, and how it breaks depends on the platform. (For the moment I'll talk about Kubuntu 13.10 with Qt 5.1.) If you press the mouse button and drag out of the widget, you'll receive a leaveEvent when you cross the boundary...and then another leaveEvent when the button is released. If you leave the window, activate another app in a window on screen, and then click inside the widget to reactivate the Qt app, you'll get two consecutive enterEvents.
Repeat this pattern for every mouse event and try to get a solid hold on the invariants...good luck! Nailing these down into a bulletproof app that "knows" its state and doesn't fall apart (especially in the face of wild clicking and Alt-Tabbing) is a bit of a lost cause.
This isn't good if your program does allocations and heavy processing, and doesn't want to do a lot of sweeping under the rug (e.g. "Oh, I was doing some processing in response to being entered... but I just got entered again without a leave. Hm, I guess that happens! Throw the current calculations away and start again...").
In the past what I've done is to handle all my mouse operations (even simple clicking) with drag & drop. Getting the OS drag & drop facility involved in the operation tended to produce a more robust experience. I can only presume this is because the testers actually had to consider things like task switching with alt-Tab, etc. and not cause multiple drop operations or just forget that an operation had been started.
But the "baked in at a level deeper than the framework" aspect actually makes this one-pixel-move requirement impossible to change. I tried to hack around it by setting a timer event, then faking a QMouseEvent to bump the cursor to a new position once the drag was in effect. However, I surmise that the drag and drop is hooked in at the platform level, and doesn't consult the ordinary Qt event queue: src/plugins/platforms/xcb/qxcbdrag.cpp
The issue has, as of 1-May-2014, been acknowledged as a bug by the Qt team:
https://bugreports.qt-project.org/browse/QTBUG-34331
It seems that my putting a bounty on it here finally brought it to their attention, though it did not generate any SO answers I could accept to finalize the issue. So I'm writing and accepting my own. Good work, me. (?) Sorry for not having a better answer. :-/
There is another unfortunate side effect of the Qt5 change, pointed out by a "Dmitry Mordvinov":
Same problem here. Additionally app events are not handled till the first mouse event after drag started and this is really nasty bug. For example all app animations are suspended during that moment or application hangs up when you try to drag with touch monitor.
dvvrd had to work around it, but felt the workaround was too ugly to share. So it seems that if you're affected by the problem, the right thing to do is go weigh in and add your voice on the issue tracker, to perhaps raise the priority of a solution.
(Or even better: patch it and submit the patch. 'tis open source, after all...)
From what I see, QApplication::mouseButtons() may return no buttons even when a button is held down. This happens when you have clicked the edge of a window to resize it. It's consistent with the docs, because mouseButtons() reflects the state from the flow of QEvent::MouseButtonPress etc. However, I just need to know whether the button is held down. Does anyone know if that's possible through the Qt API?
I think it's not possible. Mouse events outside an application's window are not passed to its event handlers. Dragging a window's border is one such event; it's processed by the window system. Another example is clicking on other windows. Usually an application doesn't know what the user does with other windows. You need to install a system-wide event listener or use native API features (e.g. GetAsyncKeyState on Windows) to determine that. This behavior is unusual and possibly dangerous. In most cases it's not useful, and it seems that Qt doesn't have this ability.
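A minimal sketch of the difference (the Win32 branch illustrates the native-API route; GetAsyncKeyState is not part of Qt):

#include <QApplication>
#ifdef Q_OS_WIN
#include <windows.h>
#endif

// Returns true if the left button is down, even while the window system owns
// the mouse (e.g. during a border-resize drag).
bool leftButtonDown()
{
    // Reflects only the button events Qt itself has delivered.
    if (QApplication::mouseButtons() & Qt::LeftButton)
        return true;
#ifdef Q_OS_WIN
    // Asks the OS directly, so it also covers presses Qt never sees.
    return (GetAsyncKeyState(VK_LBUTTON) & 0x8000) != 0;
#else
    return false; // no portable, Qt-only way to query this
#endif
}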
Not sure why no one has been complaining about this, but I'm having a lot of problems with the BlackBerry PlayBook virtual keyboard on the simulator.
I have a richedit component in the middle of the screen, and as soon as the virtual keyboard appears to enter text, it completely hides the text input. I'd like to move the text input up when the keyboard appears/disappears. Is there any way to do this? I don't want to muck around with the focus_in and focus_out events on the richedit; I've tried, and it's not very reliable.
Thank you in advance!
We expect the next release of the SDK (long overdue at this point but, I think, imminent) to provide much more complete support for the virtual keyboard. Until that occurs, I think it's a waste of time to attempt to do anything special with it.
I also think there's a chance it will automagically move your whole stage up when it would cover up a text input, so maybe you won't have to do anything about it anyway.
Edit: Actually I published code in January describing an undocumented way to support this, using some rudimentary PPS support. It also shows how you can programmatically control the keyboard opening and closing. I don't recommend it yet for real code...
I have a problem with a Qt modeless dialog on a Solaris 8/10 machine using CDE (Common Desktop Environment).
The dialog serves as a drawing panel/popup that requires the user to choose tools from the main application before proceeding to draw on it. The problem is that whenever the user clicks on the main application's toolbar, the dialog goes behind the main application.
Notice that this behavior pertains to CDE only; the OpenWindows environment and the Solaris Java environment don't show this issue.
My question is: how can I make it stay on top of its parent (the main application)?
I've tried passing the WX11BypassWM flag to the dialog to bypass the window manager, but then the border and frame are gone, which makes the dialog impossible to drag/move.
Update 1:
With regard to Andy's answer:
I've tried Qt::WStyle_StaysOnTop, but it doesn't work.
I also tried combining:
WX11BypassWM | WStyle_StaysOnTop | WStyle_Title
and other combinations of WStyle_DialogBorder, WType_TopLevel, etc.; it only stays on top if WX11BypassWM is passed in.
But whenever WX11BypassWM is passed in, the dialog that shows up has neither a border nor a title bar.
Which means it's an unmovable, title-less dialog.
Update 2:
Since I can't find a solution to this issue, for the time being I've resolved it by resizing and repositioning the main application and the modeless dialog so they sit side by side.
This at least lets the user navigate in both interfaces.
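Roughly, the stopgap amounts to something like this (Qt 4/5 spelling for brevity; mainWindow and dialog stand in for the real widgets):

#include <QApplication>
#include <QDesktopWidget>
#include <QWidget>

// Place the main window on the left and the modeless dialog on the right,
// so neither can cover the other.
void arrangeSideBySide(QWidget *mainWindow, QWidget *dialog)
{
    const QRect screen = QApplication::desktop()->availableGeometry();
    const int split = screen.width() * 2 / 3; // main window gets two thirds
    mainWindow->setGeometry(screen.x(), screen.y(), split, screen.height());
    dialog->setGeometry(screen.x() + split, screen.y(),
                        screen.width() - split, screen.height());
}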
If anyone has a better suggestion, let me know.
I'm not sure I understood your question, but wouldn't it be possible to use the following?
Found in Qt Assistant:
enum Qt::WindowType
flags Qt::WindowFlags
Qt::WindowStaysOnTopHint:
"Informs the window system that the window should stay on top of all other windows."
I hope it helps a bit!