QGraphicsScene: How can I prevent keyboard events from reaching the main window while, for example, dragging items?

I'm implementing a basic shape drawing tool using custom subclasses of Qt's QGraphicsScene and several QGraphicsItem types. There are several situations in which I don't want any "global" actions to be executed:
For example, while dragging items around, the user should not be allowed to create a new file or to undo the last action (by pressing Ctrl-Z, for example), since this would lead to several problems that would have to be handled separately. (If the user is currently drawing an edge between two nodes, what should happen when they press Ctrl-Z and the last recorded action is the creation of the first node?)
I noticed that several commercial applications such as Microsoft Word and Adobe Photoshop seem to simply ignore the usual keyboard shortcuts while in such an "intermediate" state. Furthermore, when dragging items out of the viewport, these tools display a "forbidden" cursor and do not let any mouse press events reach the outer window (a right click on the toolbar, for example).
How should I implement this in my case, when using QGraphicsScene? I already tried to add the following override:
void MyGraphicsScene::keyPressEvent(QKeyEvent* keyEvent)
{
    keyEvent->accept();
}
But any pressed keys were still delivered to the main window. Besides that, I'm not sure whether filtering only keyboard events is safe enough, since there might be other input events that could trigger forbidden actions.
Is there any generic approach to this problem that I could use in my software?
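One generic pattern, close to what Word and Photoshop appear to do, is to block input application-wide only while the scene is in such an intermediate state, for example with an event filter installed on the QApplication object for the duration of the drag. The sketch below is only an illustration of that idea, not an established Qt recipe: the class name MyGraphicsScene, the m_dragInProgress flag, and the assumption that mousePressEvent/mouseReleaseEvent bracket the drag are placeholders for whatever state tracking the real tool uses.

// Sketch only: swallow keyboard input and shortcuts application-wide
// while an item drag is in progress. Names like m_dragInProgress are
// illustrative, not part of any Qt API.
#include <QApplication>
#include <QEvent>
#include <QGraphicsScene>
#include <QGraphicsSceneMouseEvent>

class MyGraphicsScene : public QGraphicsScene
{
public:
    using QGraphicsScene::QGraphicsScene;

protected:
    bool eventFilter(QObject *watched, QEvent *event) override
    {
        if (m_dragInProgress) {
            switch (event->type()) {
            case QEvent::KeyPress:
            case QEvent::KeyRelease:
            case QEvent::Shortcut:      // a matched QAction/QShortcut
                return true;            // filtered out, never reaches the window
            default:
                break;
            }
        }
        return QGraphicsScene::eventFilter(watched, event);
    }

    void mousePressEvent(QGraphicsSceneMouseEvent *event) override
    {
        m_dragInProgress = true;
        qApp->installEventFilter(this);     // start filtering globally
        QGraphicsScene::mousePressEvent(event);
    }

    void mouseReleaseEvent(QGraphicsSceneMouseEvent *event) override
    {
        QGraphicsScene::mouseReleaseEvent(event);
        qApp->removeEventFilter(this);      // back to normal event delivery
        m_dragInProgress = false;
    }

private:
    bool m_dragInProgress = false;
};

An event filter installed on the application object sees every event before its target does, so returning true there should keep both plain key presses and triggered shortcuts away from the main window's QActions; removing the filter on release limits the cost to the intermediate state. Mouse events could be filtered the same way to cover the toolbar clicks mentioned above.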

Related

JavaFX - Keeping Focus on a "Default" Component

I'm looking for the most concise way to deal with focus in an application which renders a map in a canvas component. You can pan the map using the arrow keys or the WASD keys. So far, I've been giving the canvas focus at startup and handling key-pressed events via canvas.setOnKeyPressed().
This works fine, but I've always known a problem was on the horizon once other components entered the picture. As soon as you interact with another component, it gains focus and you're unable to scroll around the canvas map. I can prevent this for some components, such as Hyperlinks or Buttons (I don't need tab navigation), with something like this:
sidePanel.getChildren().forEach(node -> node.setFocusTraversable(false));
But things like TextArea or TextField do need to hold focus while they're being edited, and I'll need some way to give focus back (or at least unfocus those components) without annoying the user. (I don't want to have to click the canvas for it to regain focus.)
The options I see for returning focus to the canvas after the user is done with those fields are:
1. Add a key handler (e.g. an ESC or ENTER keypress) on EACH of these components which returns focus to the canvas. Not very concise, and a bit of a pain... it also feels fragile: if I miss a component and the canvas loses focus, it fails - I need a 100% reliable solution.
2. Extend each of these components and add similar code to return focus to the canvas. Even nastier, and it requires using custom components in Scene Builder, which is not ideal.
3. Add a global event handler on the Scene and forward events to the controller which owns the canvas. I believe an event filter would accomplish this - but on the other hand, if the user is simply using the arrow keys to move around a TextArea, I wouldn't want the canvas map to move!
To solve that last problem, the global event handler could perhaps ignore WASD and arrow keypresses while focus is on certain types of components. Is this worth trying, or am I overlooking a problem it would create?
Are there any other simple options out there that I've missed - and what would you suggest as the best option here? I'd like an automatic solution that doesn't require remembering to add some workaround code every time a UI component is added.

PyQt invisible buttons

I am working on a touchscreen application.
I need to change the current window when the user clicks anywhere on the screen (the position doesn't matter).
To do this, I need to make my button (which is currently the same size as the window) invisible, so the user can still see the labels etc.
Any idea how to make buttons invisible in PyQt4?
I recommend that you not use a button for this. Instead, either put an event filter on the QApplication instance, so that the widgets in your window only receive events when you decide they should; or put a transparent panel widget over the touch area, with a mouse click event handler on that panel. Either method supports arbitrarily complex widgets inside your touch area (labels and tables to display information, etc.). The main disadvantage of the event filter approach is that all application events (from all threads) will be filtered. This could affect performance (you'd have to test; there may not be any noticeable difference), but it is simpler to implement than the transparent panel.

Mouse button status

From what I can see, QApplication::mouseButtons() may return no buttons even when a button is held down. This happens, for example, when you have clicked the edge of a window to resize it. That is consistent with the docs, because mouseButtons() reflects the state reported by the flow of QEvent::MouseButtonPress events and so on. However, I just need to know whether the button is physically held down. Does anyone know if this is possible through the Qt API?
I think it's not possible. Mouse events outside an application's windows are not passed to its event handlers. Dragging a window border is one such event; it's processed by the window system. Another example is clicking on other windows: usually an application doesn't know what the user does with other windows. You would need to install a system-wide event listener or use native API features (e.g. GetAsyncKeyState on Windows) to determine that. This behavior is unusual and possibly dangerous; in most cases it's not useful, and it seems that Qt doesn't offer it.
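For the Windows-specific route mentioned above, a minimal sketch might look like this (Win32 only; the helper function name is just illustrative):

// Win32-only sketch: poll the physical state of the left mouse button,
// regardless of which window the mouse events are being delivered to.
#include <windows.h>

bool isLeftMouseButtonDown()
{
    // The high-order bit of the result is set while the button is held down.
    return (GetAsyncKeyState(VK_LBUTTON) & 0x8000) != 0;
}

Such a poll could be driven from a QTimer, but as the answer says, it steps outside what Qt itself exposes and has to be reimplemented per platform.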

QListWidget working as a menu - need to be able to activate menu options using number keys rather than just getting focus

This application is for accessibility, so it will be used by blind and visually impaired users amongst others. It has a listWidget on the screen disguised as a menu using style sheets. The functionality behind each item on the list needs to be accessible in the following ways:
– mouse click
– up and down arrows to select, then the return key
– number key (the voice says “press 1 for email” etc.)
– hands-free voice activation
– Braille input
I have not got to the last two yet because I am failing to get the first three working (I can get each of them working, but not all at the same time; fixing one breaks another). The listWidget already processes number keys (if you press 4, the fourth row gets selected - but I need pressing 4 to run the menu item's functionality without a second user input), so I am just missing something in how it works.
Is this something I just need to do with a different object?
For the mouse click, I think it's straightforward to implement the menu functionality in the respective click handlers of the widgets in the list widget.
Use keyPressEvent(QKeyEvent*) to process the up, down and return keys, in combination with listWidget->hasFocus() if needed.
Use the same keyPressEvent(QKeyEvent*) override to process the number keys.
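A sketch of that idea, assuming a QListWidget subclass (the class name MenuListWidget is invented here): a digit key selects and activates the corresponding row in one keystroke, Return/Enter activates the current row, and everything else keeps the default behaviour.

// Illustrative sketch: activate menu rows directly from number keys and
// from Return/Enter, in addition to normal mouse handling.
#include <QKeyEvent>
#include <QListWidget>

class MenuListWidget : public QListWidget
{
    Q_OBJECT
public:
    using QListWidget::QListWidget;

protected:
    void keyPressEvent(QKeyEvent *event) override
    {
        const int key = event->key();

        // "1".."9" -> select row 0..8 and run its action in one keystroke.
        if (key >= Qt::Key_1 && key <= Qt::Key_9) {
            const int row = key - Qt::Key_1;
            if (row < count()) {
                setCurrentRow(row);
                emit itemActivated(item(row));
            }
            return; // swallow the key so it doesn't merely change the selection
        }

        // Return/Enter -> run the action of the currently selected row.
        if (key == Qt::Key_Return || key == Qt::Key_Enter) {
            if (QListWidgetItem *current = currentItem())
                emit itemActivated(current);
            return;
        }

        // Up/down arrows and everything else keep their default behaviour.
        QListWidget::keyPressEvent(event);
    }
};

Connecting itemActivated (plus itemClicked for single mouse clicks) to the slot that runs each menu entry then routes mouse, arrow-plus-Return and number-key input through a single code path.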

How to make two side by side Qt Windows sticky and act like a single window?

I am trying to implement a scenario where two Qt windows are placed side by side and are sticky to each other: dragging one of them drags the other as well. Even when alt-tabbing, they should behave like a single window.
Any help or pointer will be extremely helpful.
-Soumya
What you describe sounds like a good fit for a "docking" scenario. You're probably most familiar with docking from toolbars, where you can either float a toolbar on its own or stick it to any edge of an app's window. But Qt has a more generalized mechanism:
http://doc.qt.io/qt-5/qtwidgets-mainwindows-dockwidgets-example.html
http://doc.qt.io/qt-5/qdockwidget.html
It won't be a case where multiple top-level windows, each with its own title bar, are moved around in sync; the top-level windows are merged into a single containing window when they need to get "sticky". But IMO this is more elegant for almost any situation, and it provides the properties you seem to be seeking.
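A minimal sketch of that route (the window titles and contents below are placeholders):

// Sketch: two dockable panels inside one QMainWindow instead of two
// separate top-level windows that have to be moved in sync by hand.
#include <QApplication>
#include <QDockWidget>
#include <QLabel>
#include <QMainWindow>
#include <QTextEdit>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);

    QMainWindow window;
    window.setCentralWidget(new QTextEdit);

    // Each "sticky" window becomes a QDockWidget: it can be docked to an
    // edge of the main window or floated on its own, and Qt handles the
    // dragging for you.
    auto *toolA = new QDockWidget("Tool A", &window);
    toolA->setWidget(new QLabel("Contents of tool A"));
    window.addDockWidget(Qt::LeftDockWidgetArea, toolA);

    auto *toolB = new QDockWidget("Tool B", &window);
    toolB->setWidget(new QLabel("Contents of tool B"));
    window.addDockWidget(Qt::RightDockWidgetArea, toolB);

    window.show();
    return app.exec();
}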
Install an event filter on the tracked window with QObject::installEventFilter() and filter on QEvent::Move.
You can then change the position of the tracking window whenever your filter is called with that event type.
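A sketch of that approach, assuming two already-created top-level widgets and a fixed offset between them (the FollowFilter class is invented here):

// Sketch: keep "follower" at a fixed offset from "tracked" by reacting to
// the tracked window's move events.
#include <QEvent>
#include <QObject>
#include <QPoint>
#include <QWidget>

class FollowFilter : public QObject
{
public:
    FollowFilter(QWidget *tracked, QWidget *follower, QObject *parent = nullptr)
        : QObject(parent), m_tracked(tracked), m_follower(follower),
          m_offset(follower->pos() - tracked->pos())
    {
        tracked->installEventFilter(this);
    }

protected:
    bool eventFilter(QObject *watched, QEvent *event) override
    {
        if (watched == m_tracked && event->type() == QEvent::Move)
            m_follower->move(m_tracked->pos() + m_offset); // keep relative position
        return QObject::eventFilter(watched, event);
    }

private:
    QWidget *m_tracked;
    QWidget *m_follower;
    QPoint m_offset;
};

// Usage: new FollowFilter(&mainWindow, &toolWindow, &mainWindow);

Because only the tracked window is filtered and only the follower is moved, this one-way version avoids the setGeometry()/moveEvent() recursion discussed in the next answer; making it bidirectional needs a guard flag along the lines described there.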
I found a way to keep two windows anchored: when the user moves one window, the other follows, keeping its relative position to the moved one.
It is a bit of a hack, because it assumes that QEvent::NonClientAreaMouseButtonPress is sent when the user presses the left mouse button on the title bar and holds it while moving the window, and that QEvent::NonClientAreaMouseButtonRelease is sent when the button is released at the end.
The idea is to use the QWidget::moveEvent event handler of each window to update the geometry of the other, using QWidget::setGeometry.
But the documentation states that:
Calling setGeometry() inside resizeEvent() or moveEvent() can lead to infinite recursion.
So I needed to prevent the moveEvent handler of the window that was not moved directly by the user from updating the geometry of the other.
I achieved this via QObject::installEventFilter, intercepting the events mentioned above.
When the user clicks on the title bar of WindowOne to start a move operation, WindowOne::eventFilter catches its QEvent::NonClientAreaMouseButtonPress and sets the public attribute WindowTwo::skipevent_two to true.
While the user is moving WindowOne, WindowTwo::moveEvent is called as a result of the setGeometry call performed on WindowTwo from WindowOne::moveEvent.
WindowTwo::moveEvent checks WindowTwo::skipevent_two, and if it is true, returns without performing a setGeometry operation on WindowOne which would cause infinite recursion.
As soon as the user releases the left mouse button, ending the window move operation, WindowOne::eventFilter catches QEvent::NonClientAreaMouseButtonRelease and sets the public attribute WindowTwo::skipevent_two back to false.
The same happens if the user clicks the title bar of WindowTwo, this time setting the WindowOne::skipevent_one attribute to true and preventing WindowOne::moveEvent from performing any setGeometry operation on WindowTwo.
I believe this solution is far from clean and usable. Some problems:
I am not sure when and why QEvent::NonClientAreaMouseButtonPress and QEvent::NonClientAreaMouseButtonRelease are dispatched, apart from the case considered above.
If one window is resized or moved without user interaction, or without the expected mouse clicks from the user, everything will probably end up in infinite recursion.
There is no guarantee that those mouse events will be dispatched the same way in the future.
Free space for more...
Proof of concept:
https://github.com/Shub77/DockedWindows
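For reference, the mechanism described above condenses to roughly the following (LinkedWindow and m_skipMove are invented names, and this sketch inherits the caveats just listed, e.g. a purely programmatic move of one window can still recurse):

// Condensed sketch of the "skip flag" trick: each window mirrors the other's
// moves, and a flag set between NonClientAreaMouseButtonPress/Release keeps
// the two moveEvent handlers from re-triggering each other.
#include <QEvent>
#include <QMoveEvent>
#include <QPoint>
#include <QWidget>

class LinkedWindow : public QWidget
{
public:
    using QWidget::QWidget;

    void linkTo(LinkedWindow *other)
    {
        m_other = other;
        installEventFilter(this); // to see the title-bar (non-client) events
    }

protected:
    bool eventFilter(QObject *watched, QEvent *event) override
    {
        if (watched == this && m_other) {
            // The user grabbed this window's title bar: make the other window
            // ignore the move events caused by our own move() calls below.
            if (event->type() == QEvent::NonClientAreaMouseButtonPress)
                m_other->m_skipMove = true;
            else if (event->type() == QEvent::NonClientAreaMouseButtonRelease)
                m_other->m_skipMove = false;
        }
        return QWidget::eventFilter(watched, event);
    }

    void moveEvent(QMoveEvent *event) override
    {
        QWidget::moveEvent(event);
        if (m_other && !m_skipMove) {
            // Drag the linked window along, keeping the gap between them constant.
            const QPoint delta = event->pos() - event->oldPos();
            m_other->move(m_other->pos() + delta);
        }
    }

private:
    LinkedWindow *m_other = nullptr;
    bool m_skipMove = false;
};

// Usage: windowA.linkTo(&windowB); windowB.linkTo(&windowA);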
