My team is developing a UI for an apparatus with a touch screen, and we would like it to emit a sound (from a buzzer) each time the user successfully presses a button (so using the release event). Note that I don't want to play the sound on every click on the interface, only when the click lands on a button.
We use many types of button: sometimes QPushButton, but most of the time customized buttons derived from QAbstractButton. In most cases these buttons are given an objectName.
So I assumed that, to do this, I would have to catch the MouseButtonRelease event, and since I'm already working with a subclass of QApplication to handle exceptions, I decided to do it in the notify function.
I then tried several methods to recognize when the MouseButtonRelease was related to a button, but none of them was successful. The best one, checking the receiver's objectName, was still not good enough, not only because not all buttons had an objectName (which, of course, can be handled), but especially because the event was not always caught even for buttons with names set. In other words, sometimes I would click on a button and the event was recognized, and sometimes I would click on the same button and it was not.
I did some research, and another method I found was to install an event filter on the MainWindow, but not all widgets have the MainWindow as their parent, which means I would have to copy and paste the same code over and over when I obviously want something more localized (i.e. in only one spot).
So why does notify not always handle the events? And how could I do this? Any suggestions are appreciated, especially one that is lighter than handling the events globally.
For the record, the other two ways I tried to catch the events inside notify, with similar or even worse results, were receiver->inherits("...") and qobject_cast<QAbstractButton*>(receiver).
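To illustrate, this is roughly what the notify override looks like (simplified; the class name and the buzzer call are placeholders):

```cpp
#include <QApplication>
#include <QAbstractButton>
#include <QEvent>

// Simplified sketch of my QApplication subclass (the real one also handles
// exceptions); beepBuzzer() stands in for the hardware call.
class App : public QApplication
{
public:
    using QApplication::QApplication;

    bool notify(QObject *receiver, QEvent *event) override
    {
        if (event->type() == QEvent::MouseButtonRelease
            && qobject_cast<QAbstractButton *>(receiver)) {
            // beepBuzzer();
        }
        return QApplication::notify(receiver, event);
    }
};
```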
Related
Suppose I have a Qt app based on QWidget in which I'd like to get feedback every time I press a QPushButton, rotate a QDial, or press a mouse button. My question is: how can I set up a slot that informs me every time something happens inside my app? It is built for a touchscreen, so I need to know whenever the user does something on it.
mousePressEvent partially solves my problem: when the user touches the screen, mousePressEvent tells me about it. The problem is with widgets such as QPushButtons or QDials.
At the moment I've already arranged my widgets with a lot of sub-widgets; the number of push buttons is about 300. How can I obtain a signal when one of them is pressed, without re-editing every single button?
You can use an event filter installed on the QApplication instance. This filter will receive all the events in your application; you can then check the event type and object type to pick out the events you are interested in. See my answer to another question here for details.
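For example, a minimal sketch of such a filter (the class name and the reaction inside it are only illustrative):

```cpp
#include <QAbstractButton>
#include <QApplication>
#include <QEvent>
#include <QObject>

// Sketch: a filter installed on the application instance sees every event,
// so we react only to mouse releases whose receiver is a button widget.
class ButtonFeedbackFilter : public QObject
{
public:
    using QObject::QObject;

protected:
    bool eventFilter(QObject *watched, QEvent *event) override
    {
        if (event->type() == QEvent::MouseButtonRelease
            && qobject_cast<QAbstractButton *>(watched)) {
            // react here: emit a signal, play a sound, log, ...
        }
        return QObject::eventFilter(watched, event); // never swallow the event
    }
};

// Installation, typically in main():
//   app.installEventFilter(new ButtonFeedbackFilter(&app));
```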
I am writing a program in Qt that looks like this:
The main window is my class Window : QWidget; it has a QGridLayout containing four other widgets (an Input_Menu : QWidget, an Output_Menu : QWidget, and two Canvas : QWidget).
I would like to trigger certain actions when the user strikes a key. The problem is that the Window sometimes loses focus (it goes, say, to Input_Menu, or maybe to a button in Input_Menu...).
I have tried the following solutions, but they seem unsatisfactory (and dirty):
Give Window the focus whenever it loses it.
Tell each widget that could have the focus to trigger Window's keyPressEvent function (or a clone of it) whenever it receives a keyboard event.
Ideally, I would like that if a widget receives an event (say a keyboard event) and doesn't know what to do with it, it would automatically call its parent's event handler. I would have hoped this to be a default feature of Qt, but it doesn't look like it. On the other hand, I am really confused about the whole focus thing; I don't really get what's going on. Can someone explain this to me? I have included a std::cout << "key pressed" << std::endl; in my Window::keyPressEvent function. When I first run my program, the focus seems to be on the top QComboBox in Input_Menu: if I hit the Up/Down keys, I navigate in that box and no "key pressed" appears in my console. If I hit most letters, nothing happens. But if I hit the Left/Right keys, I do get a "key pressed" in my console!?
Thanks a lot in advance for your insights.
You can install an event filter on QApplication to filter the relevant QEvent::KeyPress events globally. From the Qt documentation:
It is also possible to filter all events for the entire application,
by installing an event filter on the QApplication or QCoreApplication
object. Such global event filters are called before the
object-specific filters. This is very powerful, but it also slows down
event delivery of every single event in the entire application; the
other techniques discussed should generally be used instead.
Besides the performance considerations, remember to check if your window currently has the focus before you filter the key event, or you might break popup dialogs or input into other windows.
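A rough sketch of such a filter, including the focus check (the class name and the keys handled are only illustrative):

```cpp
#include <QApplication>
#include <QEvent>
#include <QKeyEvent>
#include <QObject>
#include <QWidget>

// Sketch: global key filter that only acts while a given window is active.
class GlobalKeyFilter : public QObject
{
public:
    explicit GlobalKeyFilter(QWidget *window, QObject *parent = nullptr)
        : QObject(parent), m_window(window) {}

protected:
    bool eventFilter(QObject *watched, QEvent *event) override
    {
        if (event->type() == QEvent::KeyPress
            && m_window && m_window->isActiveWindow()) {
            auto *keyEvent = static_cast<QKeyEvent *>(event);
            if (keyEvent->key() == Qt::Key_Left || keyEvent->key() == Qt::Key_Right) {
                // handle the key globally here
            }
        }
        return QObject::eventFilter(watched, event); // let normal delivery continue
    }

private:
    QWidget *m_window;
};

// Installation:
//   qApp->installEventFilter(new GlobalKeyFilter(window, qApp));
```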
Actually, I found that for keys that are modifiers (such as Shift, Control), Qt supports finding out whether they are pressed.
E.g.: if (QApplication::keyboardModifiers() == Qt::ShiftModifier) ...
This is good enough.
I need help understanding how to use QEvents in Qt; this is driving me crazy.
I am writing an application using custom events, but the QApplication::postEvent function requires me to specify the target object.
As I understand it, it's possible to post events to Qt's event loop with
QApplication::postEvent(obj_target, myevent);  // myevent is a QEvent *
This means that I'm trying to catch the "myevent" event in obj_target and do some stuff.
But I need to post events without specifying a target object, as QMouseEvent or QKeyEvent seem to do.
I mean, when clicking in a QMainWindow with a lot of buttons, how is it that I can click any button and that button gets pressed?
What is the target object when the click event is posted?
Is it possible to register objects to "listen" for a specific event?
I'm really confused: is it possible to post an event without specifying a target object?
Thank you very much in advance
There is no trivial way to post events "globally", as Dan has said. All of the dispatching of native events is done by private Qt implementation code.
The important distinction is:
There are native messages/events, delivered by the operating system, usually received by a window-specific event loop.
There are QEvents.
Internally, Qt keeps track of the top-level widgets (windows, really), so when it receives an event from the OS, it knows which window the event should go to; it can match them using the platform window id, for example.
QEvent delivery makes no sense without a receiving object, since sending an event to an object really only means that QObject::event(QEvent*) method is called on that object. It's impossible to call this method without having an object instance!
If you want to synthesize a global key press or mouse click event, then you have to figure out what object the event goes to. Namely:
Identify what top-level window (widget) the event should go to. You can enumerate top level widgets via qApp->topLevelWidgets().
Identify the child widget the event should go to. If it's a keyboard event, sending it to the currently focused widget via qApp->focusWidget() is sufficient. For a mouse event, you need to enumerate the child widgets to find the deepest one in the tree that overlaps the mouse coordinates.
Send the correct QEvent subclass to the widget you've just identified. Events delivered to top-level widgets will be routed to the correct child widget.
When sending mouse events, you also need to synthesize relevant enter and leave events, or you risk leaving the widgets in an invalid state. The application.cpp source file should give you some ideas there.
This doesn't give you access to native graphical items, such as menus on OS X.
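As a rough sketch of the idea for a synthetic left click (using QApplication::widgetAt() as a shortcut for the child-widget lookup described above; the enter/leave synthesis mentioned earlier is omitted):

```cpp
#include <QApplication>
#include <QMouseEvent>
#include <QWidget>

// Sketch: deliver a synthetic left click to whatever widget is under
// the given global screen position. Enter/leave events are not handled.
void sendSyntheticClick(const QPoint &globalPos)
{
    QWidget *target = QApplication::widgetAt(globalPos); // deepest visible widget
    if (!target)
        return;

    const QPoint localPos = target->mapFromGlobal(globalPos);

    QMouseEvent press(QEvent::MouseButtonPress, localPos,
                      Qt::LeftButton, Qt::LeftButton, Qt::NoModifier);
    QMouseEvent release(QEvent::MouseButtonRelease, localPos,
                        Qt::LeftButton, Qt::NoButton, Qt::NoModifier);

    QApplication::sendEvent(target, &press);
    QApplication::sendEvent(target, &release);
}
```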
Please tell us exactly what you're trying to do. Why do you want to post a broadcast event? Who receives it? Since your own QObject-derived classes will, I presume, receive those broadcasts, it's easy enough to use the signal-slot mechanism. You'd simply connect(...) those receiver classes to some global broadcaster QObject's signal(s).
For this purpose, I have a specific singleton class which I call GuiSignalHub. It groups together all the application-wide signals.
Objects that want to trigger an application-level action (such as opening context help) just connect their signal to the GuiSignalHub signal. Receivers just connect the corresponding GuiSignalHub signal to their slot.
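A stripped-down sketch of such a hub (the signal shown and the receiver in the usage comment are only examples):

```cpp
#include <QObject>
#include <QString>

// Application-wide signal hub: a singleton QObject that only carries signals.
class GuiSignalHub : public QObject
{
    Q_OBJECT
public:
    static GuiSignalHub *instance()
    {
        static GuiSignalHub hub; // created on first use
        return &hub;
    }

signals:
    // example of an application-wide request
    void contextHelpRequested(const QString &topic);
};

// Receiver side (HelpViewer::showTopic is a hypothetical slot):
//   connect(GuiSignalHub::instance(), &GuiSignalHub::contextHelpRequested,
//           helpViewer, &HelpViewer::showTopic);
// Sender side: widgets connect their own signals to the hub's signal,
// or emit it through a small forwarding method.
```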
I'm trying to create my own custom list component in a Flex mobile project which fires an event when the user touches a list item and holds the finger down for a given time.
Some kind of "longTouch" event, like the one implemented on native Android list items to edit an entry, for example.
I tried listening for the MOUSE_DOWN event to start a timer and dispatching an event when the timer finished. But this approach failed because I can't get the list item that was pressed by the user: the List component updates the "selectedItem" property only after the user lifts their finger from the list.
Thanks in advance,
Andre Uschmann
There is no longTouch (or longPress) event exposed through the Flash Player Native APIs.
One option is to roll your own using TOUCH_BEGIN, TOUCH_END, and a timer.
Basically:
When the user starts the touch, start the timer.
When the TOUCH_END event fires, check the timer to see how long it has been running using currentCount. If it is long enough to be considered a "long touch", dispatch your custom longPress event. If not, stop the timer and ignore it.
This could all happen inside the item renderer, so you'd know exactly which item was pressed.
I expect this would be more solid than using mouse events, which seem to be inconsistent on touch-based devices.
In more than one of my Qt applications I've noticed that, whenever the menu bar is clicked, the last signal to have been sent from a widget within the GUI is re-sent before the menu action is invoked. Most of the time this doesn't matter, but on some occasions it matters very much.
In a few cases where the widget's signal is connected to one of its own slots, it's straightforward to begin the slot with a
if (hasFocus())
{
    // ...
}
...block so that such spurious signals, not generated by the user actually clicking on the widget, can be ignored.
However, I've recently identified that this behaviour is responsible for several related bugs where the spurious signals are passed on through several layers of the program before being acted upon, so simply checking whether a particular widget has focus is not trivial to implement.
My question, therefore, is:
why on earth does clicking on a menu item cause a signal to be emitted from a widget elsewhere on the screen? I can't find this behaviour documented anywhere.
how do I stop it?
Many thanks,
Stephen.
As you deduced yourself, the QLineEdit::editingFinished signal can be emitted twice if the user presses Enter and the QLineEdit loses focus afterwards. An old bug report exists but the behaviour has not been changed: https://bugreports.qt.io/browse/QTBUG-40.
Your solution is fine: subclass QLineEdit and check whether the value has changed before triggering the signal. You can use the QLineEdit::isModified flag for the check: it defaults to false and is set to true whenever the user changes the line edit's contents. It has to be manually reset to false when you emit the signal.
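A sketch of that subclass (the class and signal names are just placeholders):

```cpp
#include <QLineEdit>

// Sketch: re-emit editingFinished only when the text was actually modified.
class ModifiedLineEdit : public QLineEdit
{
    Q_OBJECT
public:
    explicit ModifiedLineEdit(QWidget *parent = nullptr) : QLineEdit(parent)
    {
        connect(this, &QLineEdit::editingFinished,
                this, &ModifiedLineEdit::maybeEmit);
    }

signals:
    // emitted at most once per actual change of the contents
    void valueEdited(const QString &text);

private slots:
    void maybeEmit()
    {
        if (isModified()) {
            setModified(false); // manual reset, as described above
            emit valueEdited(text());
        }
    }
};
```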
You can also do that check in the receiver object: check whether the value is meaningful (different from the last recorded value, for example) and process it or not.
That way the duplicate signals would not be a problem. In my opinion it is the cleanest solution, because it gives you a robust model: the user could also hit Enter a few times with the same text value, and you have to handle that anyway.
You can also use QLineEdit::returnPressed, but this requires the user to always press Enter to validate a value. Not always intuitive, but it forces the user to explicitly validate the inputs.