PlayN and multitouch?

I've made this Button class to catch the Pointer event:
public class Button implements Pointer.Listener {
  public void initLayer(Image defaultImage) {
    this.defaultImage = defaultImage;
    layer = parent.createImageLayer(this.defaultImage);
    layer.addListener(this);
  }
...
If I touch one of the instantiated buttons, I get the onPointerStart & onPointerEnd events. But if one of my buttons is already touched and I start to touch another, I don't get the onPointerStart event for the second button.
Is there a way to get these multi-touch events with PlayN?

The Pointer service is meant to abstract over either a simple touch interaction or a mouse interaction. Thus it does not support any sort of multi-touch interactions. You will not receive notifications about any touches other than the first via the Pointer service.
If you want to handle multiple touches, you have to use the Touch service, and there is currently no way to register Touch listeners directly on layers. So you'll have to register a global listener and do your own hit testing, and map touch movements to the layer that was first hit by that touch, etc.
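A rough sketch of that approach, assuming the PlayN 1.x Touch API (PlayN.touch().setListener(...)); the buttons list and the Button.contains()/onTouchStart()/onTouchMove()/onTouchEnd() helpers are hypothetical stand-ins for your own hit testing and button logic:

PlayN.touch().setListener(new Touch.Adapter() {
  // Maps each active touch id to the button that touch first hit.
  private final Map<Integer, Button> active = new HashMap<Integer, Button>();

  @Override public void onTouchStart(Touch.Event[] touches) {
    for (Touch.Event event : touches) {
      for (Button button : buttons) { // your own list of buttons
        if (button.contains(event.x(), event.y())) {
          active.put(event.id(), button);
          button.onTouchStart(event);
          break;
        }
      }
    }
  }

  @Override public void onTouchMove(Touch.Event[] touches) {
    for (Touch.Event event : touches) {
      Button button = active.get(event.id());
      if (button != null) button.onTouchMove(event);
    }
  }

  @Override public void onTouchEnd(Touch.Event[] touches) {
    for (Touch.Event event : touches) {
      Button button = active.remove(event.id());
      if (button != null) button.onTouchEnd(event);
    }
  }
});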

Related

How can I implement drag and drop in Urho 3D view?

I have added 3D view objects using UrhoSharp in my Xamarin UWP/iOS/Android project. The only event that works is the touch event, but I also want to use drag and drop so that the objects can be moved to different locations within the 3D view. Any suggestions?
I haven't used UrhoSharp yet, but here are some suggestions about using drag and drop; I'm not sure whether they help in your case.
UrhoSharp: Basic Actions
The UrhoSharp documentation explains some basic actions, but drag and drop is not among them. Maybe you can achieve it by combining actions with the drag methods on each platform, but that is something you will have to try.
UWP: reference link here
Here's an overview of what you need to do to enable drag and drop in your app:
Enable dragging on an element by setting its CanDrag property to true.
Build the data package. The system handles images and text automatically, but for other content, you'll need to handle the DragStarted and DragCompleted events and use them to construct your own data package.
Enable dropping by setting the AllowDrop property to true on all the elements that can receive dropped content.
Handle the DragOver event to let the system know what type of drag operations the element can receive.
Process the Drop event to receive the dropped content.
Code example:
<Grid AllowDrop="True" DragOver="Grid_DragOver" Drop="Grid_Drop"
      Background="LightBlue" Margin="10,10,10,353">
    <TextBlock>Drop anywhere in the blue area</TextBlock>
</Grid>

private void Grid_DragOver(object sender, DragEventArgs e)
{
    e.AcceptedOperation = DataPackageOperation.Copy;
}
iOS: reference link here
With drag and drop in iOS, users can drag items from one onscreen location to another using continuous gestures. A drag-and-drop activity can take place in a single app, or it can start in one app and end in another.
Use drag items to convey data representation promises between a source app and a destination app.
Adopt drag interaction APIs to provide items for dragging.
Adopt drop interaction APIs to selectively consume dragged content.
The following demonstrates how to enable drag and drop for a UIImageView instance. Example code:
func customEnableDragging(on view: UIView, dragInteractionDelegate: UIDragInteractionDelegate) {
    let dragInteraction = UIDragInteraction(delegate: dragInteractionDelegate)
    view.addInteraction(dragInteraction)
}

func dragInteraction(_ interaction: UIDragInteraction, itemsForBeginning session: UIDragSession) -> [UIDragItem] {
    // Cast to NSString is required for NSItemProviderWriting support.
    let stringItemProvider = NSItemProvider(object: "Hello World" as NSString)
    return [
        UIDragItem(itemProvider: stringItemProvider)
    ]
}
Here is a sample for Xamarin.iOS.
Or you can use UIPanGestureRecognizer in iOS to move the view. Here is Walkthrough: Using Touch in Xamarin.iOS. All you need to do is update view.center as the pan gesture changes.
Android: reference link here
With the Android drag/drop framework, you can allow your users to move data from one View to another using a graphical drag and drop gesture. The framework includes a drag event class, drag listeners, and helper methods and classes.
There are basically four steps or states in the drag and drop process:
Started: In response to the user's gesture to begin a drag, your application calls startDrag() to tell the system to start a drag.
Continuing: The user continues the drag.
Dropped: The user releases the drag shadow within the bounding box of a View that can accept the data.
Ended: After the user releases the drag shadow, and after the system sends out (if necessary) a drag event with action type ACTION_DROP, the system sends out a drag event with action type ACTION_DRAG_ENDED to indicate that the drag operation is over.
(The table of DragEvent action types is omitted here; see the Android drag and drop documentation.)
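To make those four steps concrete, here is a rough sketch in plain Android Java (not UrhoSharp); dragSource and dropTarget are hypothetical views from your layout:

dragSource.setOnLongClickListener(v -> {
    // Started: build the data package and tell the system to start a drag.
    ClipData data = ClipData.newPlainText("label", "payload");
    View.DragShadowBuilder shadow = new View.DragShadowBuilder(v);
    v.startDrag(data, shadow, null, 0); // startDragAndDrop() on API 24+
    return true;
});

dropTarget.setOnDragListener((v, event) -> {
    switch (event.getAction()) {
        case DragEvent.ACTION_DRAG_STARTED:
            return true; // declare interest in this drag
        case DragEvent.ACTION_DROP:
            // Dropped: consume the data released over this view.
            CharSequence dropped = event.getClipData().getItemAt(0).getText();
            return true;
        default:
            return true;
    }
});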
Alternatively, in Android you can use onTouchEvent to move the view, which requires calculating the view's position yourself. See Walkthrough - Using Touch in Android.
The main thing is to handle the press and move events by overriding onTouchEvent. The math is a simple translation: record the coordinates on ACTION_DOWN, then on ACTION_MOVE compute the offset between the current position and the position where the finger went down, and refresh the control so it redraws with its top-left corner shifted by that offset. A minimal sketch follows.
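This is a rough sketch of that logic inside a custom View subclass (plain Android Java); it assumes the parent layout sits at the screen origin, so raw screen coordinates can be used directly:

private float downX, downY;

@Override
public boolean onTouchEvent(MotionEvent event) {
    switch (event.getAction()) {
        case MotionEvent.ACTION_DOWN:
            // Remember where inside the view the finger went down.
            downX = event.getX();
            downY = event.getY();
            return true;
        case MotionEvent.ACTION_MOVE:
            // Keep the grabbed point under the finger by moving the view.
            setX(event.getRawX() - downX);
            setY(event.getRawY() - downY);
            return true;
        default:
            return super.onTouchEvent(event);
    }
}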
There is also a discussion about drag and drop in Xamarin.Forms that may be helpful.

How to catch all and only button release events in Qt?

My team is developing a UI for an apparatus with a touch screen, and we would like it to emit a sound (from a buzzer) each time the user correctly presses a button (so, using the release event). Note that I don't want to play the sound after each click on the interface, but only when the click is over a button.
We use many types of buttons: sometimes QPushButton, and most of the time customized buttons derived from QAbstractButton. In most cases these buttons get an objectName.
So I supposed that in order to do that, I would have to catch the MouseButtonRelease event, and since I'm already working with a subclass of QApplication to handle exceptions, I decided to do this in the notify() function.
I then tried some methods to recognize when the MouseButtonRelease was related to a button, but none of them were successful. The best one, verifying the receiver's objectName, was still not good enough, not only because not all buttons have an objectName (which, of course, can be handled), but especially because the event was not always caught even for buttons with names set. In other words, sometimes I would click on a button and the event would be recognized, and sometimes I would click on the same button and it would not.
I did some research, and another method I found was to set an event filter on the MainWindow, but not all widgets have the MainWindow as their parent, which means I would have to copy and paste the same code time after time, when I obviously want something more localized (i.e., in only one spot).
So why does notify() not always handle the events? And how could I do this? Any suggestion is appreciated, especially one that is lighter than handling the events globally.
As info, the other two ways I tried to catch the events, with similar or even worse results inside notify(), were receiver->inherits("...") and qobject_cast<QAbstractButton*>(receiver).

Post events without specifying target object in Qt

I need help understanding how to use QEvents in Qt; this is driving me crazy.
I am writing an application using custom events, but as with the QApplication::postEvent function, it's necessary to specify the target object.
As I understand it, it's possible to post events to Qt's event loop with
QApplication::postEvent(obj_target, my_event); // my_event is a QEvent *
This means that I'm trying to catch "my_event" in obj_target and do some stuff.
But I need to post events without specifying a target object, as QMouseEvent or QKeyEvent do.
I mean, when clicking in a QMainWindow with a lot of buttons, how is it that I can click any button and that button is pressed?
What is the target object when the click event is posted?
Is it possible to register objects to "listen" for a specific event?
I'm really confused: is it possible to post an event without specifying a target object?
Thank you very much in advance
There is no trivial way to post events "globally", as Dan has said. All of the event dispatching of native events is done by private Qt implementation code.
The important distinction is:
There are native messages/events, delivered by the operating system, usually received by a window-specific event loop.
There are QEvents.
Internally, Qt keeps track of the top-level Widgets (windows, really), so when it receives an event from the OS, it knows which window it should go to - it can match it using the platform window id, for example.
QEvent delivery makes no sense without a receiving object, since sending an event to an object really only means that QObject::event(QEvent*) method is called on that object. It's impossible to call this method without having an object instance!
If you want to synthesize a global key press or mouse click event, then you have to figure out what object the event goes to. Namely:
Identify what top-level window (widget) the event should go to. You can enumerate top level widgets via qApp->topLevelWidgets().
Identify the child widget the event should go to. If it's a keyboard event, then sending the event to currently focused widget via qApp->focusWidget() is sufficient. You need to enumerate the child widgets to find the deepest one in the tree that overlaps the mouse coordinates.
Send the correct QEvent subclass to the widget you've just identified. Events delivered to top-level widgets will be routed to the correct child widget.
When sending mouse events, you also need to synthesize relevant enter and leave events, or you risk leaving the widgets in an invalid state. The application.cpp source file should give you some ideas there.
This doesn't give you access to native graphical items, such as menus on OS X.
Please tell us exactly what you're trying to do. Why do you want to post a broadcast event? Who receives it? Since your own QObject-derived classes will receive those broadcasts, I presume, it's easy enough to use signal-slot mechanism. You'd simply connect(...) those receiver classes to some global broadcaster QObject's signal(s).
For this purpose, I have a specific singleton class which I call GuiSignalHub. It groups all the application-wide signals.
Objects that want to trigger an application-level action (such as opening context help) just connect their signal to the GuiSignalHub signal. Receivers just connect the GuiSignalHub to their slot.

Recognize a holding/long touch on a list component in Flex Mobile

I'm trying to create my own custom list component in a Flex mobile project which fires an event when the user touches a list item and holds the finger down for a given time.
Some kind of "longTouch" event, like the one implemented on native Android list items to edit an entry, for example.
I tried listening for the MOUSE_DOWN event to start a timer and dispatch an event when the timer finished. But this approach failed because I can't get the list item that was pressed by the user: the List component updates the selectedItem property only after the user lifts his finger from the list.
thanks in advance
Andre Uschmann
There is no longTouch (or longPress) event exposed through the native Flash Player APIs.
One option is to roll your own using TOUCH_BEGIN, TOUCH_END, and a timer.
Basically:
When the user starts the touch, start the timer.
When the TOUCH_END event fires, check the timer to see how long it has been running using currentCount. If it has run long enough to be considered a "long touch", dispatch your custom longPress event. If not, stop the timer and ignore it.
This could all happen inside the item renderer, so you'd know exactly which item was pressed.
I expect this would be more solid than using mouse events, which seem to be inconsistent on touch-based devices.

Qt - top level widget with keyboard and mouse event transparency?

I want an app's main window to ignore mouse and keyboard events, passing them to applications underneath it in the window manager Z-order.
I see how to make child widgets ignore keyboard or mouse events, but how about the main window?
I'm trying to make a desktop widget that always sits just over the background and is totally invisible to keyboard and mouse events. (Pass through)
Qt::X11BypassWindowManagerHint gets me keyboard pass-through (although sadly X11-specific, but fine for now), so how about mouse events?
Is there an OS-agnostic way to be transparent to keyboard events?
EDIT:
The key word here is transparency.
I don't want to EAT mouse and keyboard events; I want the window manager to know I don't want them at all. Those events should be directed to whatever application is under me in the Z-order.
For example, I want to be able to click on desktop icons that are covered by my widget and interact with them as if the widget was not there.
I found the following solution (tested on Linux; it also works on Windows according to @TheSHEEEP):
setWindowFlags(windowFlags() | Qt::WindowTransparentForInput);
It was added in a more recent Qt release (I did not find out which one); see http://doc.qt.io/qt-5/qt.html
On Windows you can set WS_EX_TRANSPARENT
To do this in Qt use the following code:
Include the header,
#if _WIN32
#include <windows.h>
#endif
and put the following code into the constructor.
#if _WIN32
HWND hwnd = (HWND) winId();
LONG styles = GetWindowLong(hwnd, GWL_EXSTYLE);
SetWindowLong(hwnd, GWL_EXSTYLE, styles | WS_EX_TRANSPARENT);
#endif
Maybe what you want is
widget->setAttribute(Qt::WA_TransparentForMouseEvents)
? That's what QRubberBand uses to let its parent handle the mouse events. As for keyboard events, a QWidget doesn't get any keyboard events unless it has set a focusPolicy() for itself.
setFocusPolicy( Qt::NoFocus );
should therefore take care of the keyboard events.
Use Qt's event filters: they will allow your application to eat whichever events you specify (i.e. keyboard and mouse events) but still process other events such as paint events.
bool FilterObject::eventFilter(QObject* object, QEvent* event)
{
    // QEvent is not a QObject, so test the event type instead of casting.
    switch (event->type()) {
    case QEvent::KeyPress:
    case QEvent::KeyRelease:
    case QEvent::MouseButtonPress:
    case QEvent::MouseButtonRelease:
    case QEvent::MouseButtonDblClick:
    case QEvent::MouseMove:
        // eat all keyboard and mouse events
        return true;
    default:
        return FilterObjectParent::eventFilter(object, event);
    }
}
Maybe I'm missing something here, but have you tried subclassing the QMainWindow class and overriding the QWidget::event() method to always return false? If you need to handle some events, you could add that intelligence here as well.
This technique should allow you to inspect the events coming in to the application and ignore them if desired without having to eat them using an event filter.
If this doesn't work you could attempt to redirect the events to the desktop by calling QCoreApplication::notify() and passing the event to the desktop widget obtained by calling QApplication::desktop(). I have no idea if this would work, but it seemed like it might be worth giving a try.
I think that overriding is supposed to work:
bool YourMainWindow::event( QEvent *event )
{
    event->accept();
    return true;
}
Here is some of what the QWidget class documentation says about the event() member function:
This function returns true if the event was recognized, otherwise it returns false. If the recognized event was accepted (see QEvent::accepted), any further processing such as event propagation to the parent widget stops.
