Is there any way to obtain the keypress info for the green tick (the green check-marked key) present in the input panel / virtual keyboard?
Our application is for Symbian touch devices. In my application I want the URL to start loading as soon as the QLineEdit has been edited, i.e. as soon as the green check-marked key in the VKB has been pressed.
I tried using the QEvent::CloseSoftwareInputPanel event to get notified when editing is completed in the QLineEdit, i.e. when the green check-marked key is pressed in the VKB, but that event occurs only after the QLineEdit loses focus (when I tap anywhere else on the screen). Is there any other way to get it done?
Thanks
The virtual keyboard is implemented as a front-end processor (FEP), and it is largely invisible to applications.
Note that some Symbian touch devices also have a physical keyboard. Not only for those devices, you should provide a Go button (or similar) and observe the returnPressed() signal.
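A minimal sketch of that approach, assuming a QLineEdit and a startLoading() slot of your own (the class and slot names here are placeholders); returnPressed() fires for a hardware Enter key and for the VKB's confirm key alike:

```cpp
#include <QLineEdit>
#include <QObject>

class UrlLoader : public QObject
{
    Q_OBJECT
public:
    explicit UrlLoader(QLineEdit *edit, QObject *parent = 0)
        : QObject(parent), m_edit(edit)
    {
        // Fires when the user confirms the input, whether via a
        // hardware Enter key or the VKB's green confirm key.
        connect(m_edit, SIGNAL(returnPressed()), this, SLOT(startLoading()));
    }

private slots:
    void startLoading()
    {
        // Kick off the page load with m_edit->text() here.
    }

private:
    QLineEdit *m_edit;
};
```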
Related
I am developing a touch application for Windows 7 with Qt/QML. The end-user device has Windows 7's native touch behavior: when you touch the screen, a pointer appears at the last-touched point, and when the physical touch ends, Windows moves that pointer to the now-touched point and fires the on-clicked event.
Compared to the behavior familiar from standard Windows mouse usage, this differs as soon as it comes to e.g. clicking a button: a mouse user expects the button to change to its pressed-down color when the mouse button goes down, and back to the default color when the mouse button goes up.
In my application, I want a customized kind of touch feedback: whatever is currently being touched should be marked by changing button colors, imitating a "mouse goes down" when the actual physical touch begins and a "mouse goes up" when the actual physical touch ends.
My application will run fullscreen, so one possibility would be to change the system's behavior on application start and restore the default on application exit.
Such a behavior would effectively be the same as the standard behavior on e.g. all Android devices I know.
I searched through the MouseArea and MultiPointTouchArea elements, trying to find a way to make the click-reaction behavior differ from the standard behavior. However, I did not even find a way to capture the beginning of the actual touch: all the things I want to happen at the beginning of the touch actually happen when the touch ends.
Edit:
It does not matter whether I use a QML Button or a MouseArea plus the MouseArea.pressed property: nothing is "pressed" before the finger leaves the screen and the onClicked() handler is called.
Possibly related:
Adobe AIR: touch screen doesn't trigger mouse down event correctly - but I did not find a way to access functions like Multitouch.inputMode (mentioned in the first reply) from a native Qt application.
How can I achieve the described behavior for my application?
The solution for this issue is to disable "Press and Hold" for the application. As a system-wide setting, this can be done via:
Control Panel -> Pen and Touch -> Touch -> Press and Hold -> Settings -> uncheck 'Enable press and hold for right-clicking'
The only solution I found to do this in native code can be found here:
http://msdn.microsoft.com/en-us/library/ms812373.aspx
I checked that this still works at least on Windows 7. To get it working for QML, I looked up the QWindow* among QQmlApplicationEngine::rootObjects() and used its winId() as an HWND. With that HWND, I called the TogglePressAndHold function from the link above, before app.exec().
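A sketch of that wiring, adapted from the MSDN sample linked above (the property/atom name and the disable value come from that article; error handling and cleanup via RemoveProp/GlobalDeleteAtom are omitted for brevity), with the QML window lookup as described:

```cpp
#include <QGuiApplication>
#include <QQmlApplicationEngine>
#include <QQuickWindow>
#include <QUrl>
#include <windows.h>

// Disable the press-and-hold (right-click) gesture for one window,
// following the MSDN TogglePressAndHold sample.
static bool disablePressAndHold(HWND hwnd)
{
    // Value of TABLET_DISABLE_PRESSANDHOLD from the MSDN article.
    const ULONG_PTR kTabletDisablePressAndHold = 0x00000001;
    // Register the tablet service atom, as the MSDN sample does.
    if (!GlobalAddAtomW(L"MicrosoftTabletPenServiceProperty"))
        return false;
    return SetPropW(hwnd, L"MicrosoftTabletPenServiceProperty",
                    reinterpret_cast<HANDLE>(kTabletDisablePressAndHold)) != 0;
}

int main(int argc, char *argv[])
{
    QGuiApplication app(argc, argv);
    QQmlApplicationEngine engine;
    engine.load(QUrl(QStringLiteral("qrc:/main.qml")));

    // The root object of a QML app is a QQuickWindow; its winId()
    // is the native HWND on Windows.
    for (QObject *root : engine.rootObjects()) {
        if (QQuickWindow *window = qobject_cast<QQuickWindow *>(root))
            disablePressAndHold(reinterpret_cast<HWND>(window->winId()));
    }
    return app.exec();
}
```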
From what I see, QApplication::mouseButtons() may return no buttons even when a button is held down. This happens when you have clicked the edge of a window to resize it. That is consistent with the docs, because mouseButtons() reflects the state from the flow of QEvent::MouseButtonPress etc. However, I just need to know whether the button is held down. Does anyone know if this is possible through the Qt API?
I think it's not possible. Mouse events outside an application's window are not passed to its event handlers. Dragging window borders is one such event; it's processed by the window system. Another example is clicking on other windows: usually an application doesn't know what the user does with other windows. You need to install a system-wide event listener or use native API features (e.g. GetAsyncKeyState on Windows) to determine that. This behavior is unusual and possibly dangerous; in most cases it's not useful, and it seems that Qt doesn't offer it.
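A minimal Windows-only sketch of the native route mentioned above; GetAsyncKeyState is a Win32 call, not part of the Qt API:

```cpp
#include <windows.h>

// True while the physical left mouse button is held down, no matter
// which window has focus or where the cursor is.
bool isLeftMouseButtonDown()
{
    // The high-order bit of the result is set while the button is down.
    return (GetAsyncKeyState(VK_LBUTTON) & 0x8000) != 0;
}
```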
I have a Symbian Qt "background" app that displays a button-like thing using CWindowDrawer (derived from CActive), visible on top of everything. I want it to be visible when a text input is active or the virtual keyboard is visible, preferably when a text input is active, because some phones have a hardware keyboard. While it is visible, I want it to receive all touch positions, so that I can determine whether it is being pressed, without it taking focus and closing the keyboard.
Basically I need a way to determine when a text input is activated, and a way to listen for all touch positions from a non-QWidget that is also not in focus.
Any help would be appreciated.
PS. This is for Symbian Belle and Symbian C++ is also supported within a Qt application if necessary.
How would I detect the pressing of one of the arrow keys in Qt? Also, would the application still detect them if it is minimized?
How would I detect the pressing of one of the arrow keys in Qt?
By handling the key press event in the top-most widget in the hierarchy. See the list of key codes; the Qt::Key_Left - Qt::Key_Down range is what you're interested in.
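For example, a minimal sketch of that approach (the widget class name is a placeholder):

```cpp
#include <QKeyEvent>
#include <QWidget>

class ArrowAwareWidget : public QWidget
{
    // No Q_OBJECT needed here: no signals or slots are declared.
protected:
    void keyPressEvent(QKeyEvent *event)
    {
        switch (event->key()) {
        case Qt::Key_Left:
        case Qt::Key_Right:
        case Qt::Key_Up:
        case Qt::Key_Down:
            // React to the arrow key here.
            break;
        default:
            QWidget::keyPressEvent(event); // let other keys propagate
        }
    }
};
```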
Also, would the application still detect them if it is minimized?
No. It would detect them only if it had keyboard focus, which is not the case while minimized. You can't set up global hotkeys in a cross-platform fashion in Qt.
I'm creating a virtual keyboard for a touchscreen Flex app, and I'm trying to simulate a key press by dispatching a KeyboardEvent. I've written a handler function to listen for the event and act accordingly. So far so good... but it's starting to get complicated, as I have to manage the focused TextInputs (easy), the cursor position in those fields (not so easy), etc.
Now, if only there were a way to dispatch a KeyboardEvent that Flex would actually interpret as a genuine key press, all those issues would be gone... Is that possible?
The TextInput does not use KeyboardEvent/TextEvent for text input; it uses internal Flash TextField objects that interact with the Flash Player / keyboard.
KeyboardEvents are dispatched only to notify listeners that a keyboard event has occurred.
To simulate a keyboard, you will need to create a class that, upon receiving a KeyboardEvent, modifies the text property of a TextInput and the cursor position accordingly.
Alex Harui has written about a similar approach in a FlexCoders post.