QML Shortcut Hold event - qt

I'm working on a Qt app, and developing most of the UI in QML. It will be mostly keyboard driven, and has quite a few pages/popups. So to avoid having a potential focus scope catastrophe, I've written most of the keybindings as Shortcuts.
Some of the shortcuts are to be activated when a key is held. These keys are physical buttons on a piece of external hardware. For instance, if Physical Key 1 is pressed, do action A. If Physical Key 1 is held for 2 seconds, do action B. How would I go about doing this?
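Shortcut itself only emits activated (and activatedAmbiguously); there is no press/release notion to hang a hold timer on. One workable pattern, sketched below in C++ under some assumptions (Qt::Key_F1 stands in for the physical button, qDebug() stands in for actions A and B), is an application-wide event filter driving a single-shot QTimer:
#include <QGuiApplication>
#include <QKeyEvent>
#include <QTimer>
#include <QDebug>

// Sketch: short press -> action A, hold for 2 seconds -> action B.
class HoldFilter : public QObject {
public:
    explicit HoldFilter(QObject *parent = nullptr) : QObject(parent) {
        m_hold.setSingleShot(true);
        m_hold.setInterval(2000);                     // the 2 second threshold
        connect(&m_hold, &QTimer::timeout, this, [this] {
            m_holdFired = true;
            qDebug() << "action B (key held for 2 s)";
        });
    }
protected:
    bool eventFilter(QObject *watched, QEvent *ev) override {
        if (ev->type() == QEvent::KeyPress || ev->type() == QEvent::KeyRelease) {
            auto *ke = static_cast<QKeyEvent *>(ev);
            if (ke->key() == Qt::Key_F1 && !ke->isAutoRepeat()) {
                // m_down guards against the same event reaching an
                // application-wide filter more than once (window + item).
                if (ev->type() == QEvent::KeyPress && !m_down) {
                    m_down = true;
                    m_holdFired = false;
                    m_hold.start();                   // arm the hold timer
                } else if (ev->type() == QEvent::KeyRelease && m_down) {
                    m_down = false;
                    m_hold.stop();
                    if (!m_holdFired)
                        qDebug() << "action A (short press)";
                }
                return true;                          // consume the key
            }
        }
        return QObject::eventFilter(watched, ev);
    }
private:
    QTimer m_hold;
    bool m_down = false;
    bool m_holdFired = false;
};

int main(int argc, char *argv[])
{
    QGuiApplication app(argc, argv);
    app.installEventFilter(new HoldFilter(&app));
    // ... set up the QML engine and load the UI here as usual ...
    return app.exec();
}
The isAutoRepeat() check matters: a held key otherwise generates synthetic press/release pairs that would cut the hold short.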

Related

Squish not recording keypress events for mapped keys in Qt Application

I am testing the Squish software for automated testing and trying to record test cases. I have an application that has mapped several keys to perform certain tasks within the application, like PgDn moving focus clockwise or [ moving a small screen element up. When I click the record button in Squish, it only records keypress events for keys that are not mapped, severely limiting my test case creation capability. The only thing I found was two comments from 3+ years ago here with no response or resolution. When I watch the sponsored tutorials, every keypress is captured in their resulting script. Has anyone experienced this and found a workaround?

Simplest code to handle any key press - 8051

I'm trying to figure out the best approach to my problem.
I have an AT89S52 microcontroller with 3 buttons connected to 3 separate GPIO pins so that the microcontroller can perform different functions based on those buttons.
I'm trying to write code that waits for the user to press any of the three keys. By a key press, I mean that the system detects any one key being fully pressed down and then fully released.
The code I present below might work if I added a Schmitt trigger to my hardware, but I don't want to redo my circuit board (again).
Without adding interrupts, is there a way I can modify the code shown, with just a few lines, to reliably detect a user key press?
I ask because keys experience a phenomenon called "bouncing": as soon as someone presses a key, the contact actually jitters at high speed, and the microcontroller sees it as the key being pressed and released multiple times. I don't want that to happen when the user legitimately pressed the key only once.
;KEY1I, KEY2I and KEY3I = GPIO pins connected to the keys
;Pin value is low when key is held down
w4key:
;begin key scan
jnb KEY1I,w4keyend
jnb KEY2I,w4keyend
jnb KEY3I,w4keyend
;here, nothing is pressed so scan again
sjmp w4key
w4keyend:
;key pressed. Hope for release
jnb KEY1I,w4key
jnb KEY2I,w4key
jnb KEY3I,w4key
;here, key is released so return.
ret
mainline:
;do something
acall w4key
;do another thing
...
You can use a timer (the AT89S52 has several; you can use one of them if they are not otherwise used in your project) and a synchronous state machine. The state machine has 4 states for every key and definite transitions between them. I found this link that explains the concept quite thoroughly. Although the example code in the link is in C, you can easily "translate" it to your assembly code. If you need help with this, just leave a comment.
https://www.eeweb.com/profile/tommyg/articles/debouncing-push-buttons-using-a-state-machine-approach
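For reference, the core of that approach looks roughly like this in C (identifier names are mine, not the article's); it is meant to be called on a periodic tick, say every 5 to 10 ms, from a timer interrupt or from a polled timer-overflow flag if interrupts are off the table:
/* Four states per key, advanced once per timer tick. raw_low should be 1
   while the pin reads low, i.e. while the key is physically down.
   Returns 1 exactly once per debounced press-and-release cycle. */
typedef enum { KEY_UP, KEY_MAYBE_DOWN, KEY_DOWN, KEY_MAYBE_UP } key_state;

int debounce_step(key_state *s, int raw_low)
{
    switch (*s) {
    case KEY_UP:                    /* idle                            */
        if (raw_low) *s = KEY_MAYBE_DOWN;
        break;
    case KEY_MAYBE_DOWN:            /* saw one tick of "down"          */
        *s = raw_low ? KEY_DOWN : KEY_UP;  /* press bounce filtered here */
        break;
    case KEY_DOWN:                  /* debounced press                 */
        if (!raw_low) *s = KEY_MAYBE_UP;
        break;
    case KEY_MAYBE_UP:              /* saw one tick of "up"            */
        if (raw_low) {
            *s = KEY_DOWN;          /* release bounce: still held      */
        } else {
            *s = KEY_UP;
            return 1;               /* clean press + release complete  */
        }
        break;
    }
    return 0;
}
Because the state only advances once per tick, any bounce shorter than one tick is filtered out, and the caller sees exactly one report per physical keystroke.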

Python Library for Simulating Mouse Position and Keyboard Presses

Please suggest a Python library to control the mouse position while playing games like CS:GO. I have used pynput and win32api; neither works while playing the game.
The way this works in the Win32 API is that input commands are handled with a chain of hooks. Simply speaking, the input device sends input to the OS, and the OS sends it on to the running applications. An application can attach itself to this hook system and choose to suppress a handled input command or pass it along the chain. Some modern games take full control of the input chain by not passing along the input commands they handle.
https://msdn.microsoft.com/en-us/library/windows/desktop/ms644960%28v=vs.85%29.aspx
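To make the chain concrete, here is a minimal C++ sketch of a low-level keyboard hook (VK_F12 is an arbitrary example key). Returning a nonzero value swallows the keystroke so nothing further along the chain ever sees it, which is essentially what such games do with the input they handle:
#include <windows.h>

LRESULT CALLBACK KeyboardProc(int nCode, WPARAM wParam, LPARAM lParam)
{
    if (nCode == HC_ACTION) {
        const KBDLLHOOKSTRUCT *kb = reinterpret_cast<KBDLLHOOKSTRUCT *>(lParam);
        if (kb->vkCode == VK_F12)
            return 1;                 // suppress: the rest of the chain never sees it
    }
    return CallNextHookEx(nullptr, nCode, wParam, lParam);  // pass it along
}

int main()
{
    HHOOK hook = SetWindowsHookExW(WH_KEYBOARD_LL, KeyboardProc,
                                   GetModuleHandleW(nullptr), 0);
    MSG msg;                          // a low-level hook needs a message loop
    while (GetMessageW(&msg, nullptr, 0, 0) > 0) {
        TranslateMessage(&msg);
        DispatchMessageW(&msg);
    }
    UnhookWindowsHookEx(hook);
    return 0;
}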

What key is Keys.Context1 tied to in Qt/QML?

I'm currently building a proof-of-concept application on Windows, testing out what I can do with QML, but the end result is going to run on an embedded Linux system (which I'll need to learn too). I've been working with key handling (Enter, Up, Down, Left, Right, etc.) and noticed that there are 4 keys marked as Context1 to Context4.
In the Qt Quick docs there is a reference to Keys.Context1..4 with associated onPressed handlers, but nothing about how they are used.
context1Pressed(KeyEvent event)
This signal is emitted when the Context1 key has been pressed. The event parameter provides information about the event.
The corresponding handler is onContext1Pressed.
How do I find out what physical keys these are bound to, or how can I specify which keys they bind to?
Qt/QML is intended for use on the desktop (Windows, Linux, Mac) and mobile platforms. I strongly suspect the contextX key events and handlers are for mobile platform use. I did some googling and you can see the intent on this Qtopia site. So you won't use these keys on Windows.
The key codes for the Keys enumeration are in CoreLib/Global/QNamespace.h; the Key_Context1..4 keys are bound to the key codes 0x01100000..3. As Tod has mentioned, the keys are designed for the mobile platform; similar mobile keys are defined close to the context keys, for example Key_Call, Key_Camera, etc.
The Qt Embedded framework apparently reads the keys directly from the tty device (on Linux) and so doesn't use the OS key mappings. There is a patch that allows you to specify bindings like { Qt::Key_A, Qt::Key_Context1, Qt::Key_Unknown, Qt::Key_Unknown }, from this page: http://llg.cubic.org/patches/qtegerman.html. I'm not sure yet if that helps.
I hoped that I'd be able to use the OS inbuilt key mapping features, but if it is reading directly from the input device I'm not sure this is possible.
I'll update this if I learn any more.
Update
The platform we're running on has an API that deals with the keyboard device so Qt doesn't access it directly; I add a listener for each button and then send the key press event to Qt using:
QKeyEvent key(QEvent::KeyPress, Qt::Key_Context1, Qt::NoModifier);
qApp->sendEvent(view, &key);  // sendEvent() delivers synchronously and does not take ownership
where view is the QML DeclarativeView (or equivalent).
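If the platform API can invoke the button listener from another thread, a safer variant is postEvent(), which is thread-safe and hands ownership of a heap-allocated event to the event loop (the callback name below is hypothetical):
void onHardwareButton()   // hypothetical vendor-API button callback
{
    QCoreApplication::postEvent(
        view, new QKeyEvent(QEvent::KeyPress, Qt::Key_Context1, Qt::NoModifier));
}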

What are the differences between event and signal in Qt

It is hard for me to understand the difference between signals and events in Qt. Could someone explain?
An event is a message encapsulated in a class (QEvent) which is processed in an event loop and dispatched to a recipient that can either accept the message or pass it along to others to process. They are usually created in response to external system events like mouse clicks.
Signals and Slots are a convenient way for QObjects to communicate with one another and are more similar to callback functions. In most circumstances, when a "signal" is emitted, any slot function connected to it is called directly. The exception is when signals and slots cross thread boundaries. In this case, the signal will essentially be converted into an event.
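A small C++ sketch of that distinction, connecting the same signal twice, once directly and once with Qt::QueuedConnection, the event-backed mode Qt uses across threads:
#include <QCoreApplication>
#include <QTimer>
#include <QDebug>

int main(int argc, char *argv[])
{
    QCoreApplication app(argc, argv);
    QTimer timer;

    // Same thread: the slot (here a lambda) runs immediately inside emit,
    // like a callback.
    QObject::connect(&timer, &QTimer::timeout,
                     [] { qDebug() << "direct call"; });

    // Forcing the cross-thread behaviour: the call is wrapped in an event
    // and delivered later by the receiver's event loop.
    QObject::connect(&timer, &QTimer::timeout, &app,
                     [] { qDebug() << "queued, via the event loop"; },
                     Qt::QueuedConnection);

    timer.start(500);
    QTimer::singleShot(600, [] { QCoreApplication::quit(); });
    return app.exec();
}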
Events are something that happened to or within an object. In general, you would treat them within the object's own class code.
Signals are emitted by an object. The object is basically notifying other objects that something happened. Other objects might do something as a result, or not, but it is not the emitter's job to deal with that.
My impression of the difference is as follows:
Say you have a server device, running an infinite loop, listening to some external client events and reacting to them by executing some code.
(It could be a CPU listening to interrupts from devices, client-side JavaScript browser code listening for user clicks, or server-side website code listening for users requesting web pages or data.)
Or it can be your Qt application, running its main loop.
I'll be explaining with the assumption that you're running Qt on Linux with an X-server used for drawing.
I can distinguish 2 main differences, although the second one is somewhat disputable:
Events represent your hardware and are a small finite set. Signals represent your Widgets-layer logic and can be arbitrarily complex and numerous.
Events are low-level messages, coming to you from the client. The set of Events is a strictly limited set (~20 different Event types), determined by hardware (e.g. mouse click/doubleclick/press/release, mouse move, keyboard key pressed/released/held etc.), and specified in the protocol of interaction (e.g. X protocol) between application and user.
E.g., at the time the X protocol was created there were no multitouch gestures; there were only a mouse and a keyboard, so the X protocol won't understand your gestures and send them to the application, it will just interpret them as mouse clicks. Thus, extensions to the X protocol are introduced over time.
X events know nothing about widgets; widgets exist only in Qt. X events know only about X windows, which are the very basic rectangles that your widgets consist of. Your Qt events are just a thin wrapper around X events/Windows events/Mac events, providing a compatibility layer between the different operating systems' native events, for the convenience of widget-level logic authors.
Widget-level logic deals with signals, because they carry the widget-level meaning of your actions. Moreover, one signal can be fired due to different events, e.g. either a mouse click on the "Save" menu button or a keyboard shortcut such as Ctrl-S.
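For instance, a single QAction emits the same triggered() signal whether the low-level event was a mouse click on the menu entry or the Ctrl-S key sequence; a minimal C++ sketch:
#include <QApplication>
#include <QMainWindow>
#include <QMenuBar>
#include <QAction>
#include <QDebug>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);
    QMainWindow win;

    // One widget-level signal, two low-level sources: the menu click and
    // the Ctrl-S key event both end up emitting triggered().
    QAction *save = win.menuBar()->addMenu("&File")->addAction("&Save");
    save->setShortcut(QKeySequence::Save);   // Ctrl-S on most platforms
    QObject::connect(save, &QAction::triggered,
                     [] { qDebug() << "save requested"; });

    win.show();
    return app.exec();
}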
Abstractly speaking (this is not exactly about Qt!), Events are asynchronous in their nature, while Signals (or hooks in other terms) are synchronous.
Say you have a function foo() that can either fire a signal or post an event.
If it fires the signal, the connected handlers are executed in the same thread of code as the function that caused them, right after the function.
On the other hand, if it posts an event, the event is sent to the main loop, and it is up to the main loop when it delivers the event to the receiving side and what happens next.
Thus two consecutive events may even get delivered in reverse order, while two consecutively fired signals remain consecutive.
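In Qt terms, the asynchronous branch looks like this (a small sketch using a custom event type):
#include <QCoreApplication>
#include <QEvent>
#include <QTimer>
#include <QDebug>

class Receiver : public QObject {
protected:
    bool event(QEvent *ev) override {
        if (ev->type() == QEvent::User) {
            qDebug() << "delivered by the event loop, some time later";
            return true;
        }
        return QObject::event(ev);
    }
};

int main(int argc, char *argv[])
{
    QCoreApplication app(argc, argv);
    Receiver r;

    // postEvent() queues the event and returns immediately; when the
    // receiver actually sees it is entirely up to the main loop.
    QCoreApplication::postEvent(&r, new QEvent(QEvent::User));
    qDebug() << "posted, but not yet delivered";

    QTimer::singleShot(0, [] { QCoreApplication::quit(); });
    return app.exec();
}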
Mind, though, that the terminology is not strict. "Signals" in Unix, as a means of interprocess communication, would better be called events, because they are asynchronous: you send a signal in one process and never know when the event loop is going to switch to the receiving process and execute the signal handler.
P.S. Please forgive me if some of my examples are not absolutely correct to the letter. They are still good in spirit.
An event is passed directly to an event handler method of a class. These handlers are available for you to override in your subclasses, so you can choose how to handle each event differently. Events also pass up the chain from child to parent until someone handles them or they fall off the end.
Signals, on the other hand, are openly emitted, and any other entity can opt to connect and listen to them. When queued they pass through the event loop and are processed like events (they are handled directly, like a function call, when sender and receiver live in the same thread).
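A sketch of the first half in C++: override the handler you care about and call the base class for everything else, so unhandled events keep climbing from child to parent:
#include <QApplication>
#include <QKeyEvent>
#include <QWidget>
#include <QDebug>

class MyWidget : public QWidget {
protected:
    void keyPressEvent(QKeyEvent *ev) override {
        if (ev->key() == Qt::Key_Space) {
            qDebug() << "handled here; propagation stops";
        } else {
            QWidget::keyPressEvent(ev);  // ignored, so it climbs to the parent
        }
    }
};

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);
    MyWidget w;
    w.show();
    return app.exec();
}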
