Getting a custom UIView to respond to a two-finger trackpad scroll gesture (or mouse scroll wheel)

I have an iPad app (written in C#) with a custom UIView that accepts input from touch and Apple Pencil stylus touches. I am trying to add support for indirect trackpad/mouse input (the cursor, or "pointer" as Apple calls it).
I got hover working using a HoverGestureRecognizer. I got right-click and control-click working using the normal touch Began/Moved/Ended/Cancelled events, checking whether the touch's .type == .indirectPointer and then whether the Control key modifier is set in event.ModifierFlags, or whether event.ButtonMask == secondary.
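For reference, a minimal sketch of that check (Xamarin.iOS; anything outside the UIKit calls, such as the HandleRightClick method, is purely illustrative):
public override void TouchesBegan(NSSet touches, UIEvent evt)
{
    base.TouchesBegan(touches, evt);
    var touch = touches.AnyObject as UITouch;
    if (touch?.Type == UITouchType.IndirectPointer)
    {
        // Control-click or the secondary (right) button counts as a right-click.
        bool controlHeld = (evt.ModifierFlags & UIKeyModifierFlags.Control) != 0;
        bool secondary = (evt.ButtonMask & UIEventButtonMask.Secondary) != 0;
        if (controlHeld || secondary)
        {
            HandleRightClick(touch.LocationInView(this));   // illustrative handler
            return;
        }
    }
    // ...normal touch/stylus handling...
}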
I have spent a lot of time searching through the documentation on the Apple Developer website, starting here and branching out:
UIApplicationSupportsIndirectInputEvents
Somehow I cannot find the API that the system calls in my code when a two-finger trackpad scroll (or a mouse scroll-wheel scroll) occurs. (On another view that is a scroll view, I do get the scroll view's scroll events from a two-finger scroll, since that is built into iPadOS 13.4+ for scroll views, but my custom view is not a scroll view; it just has some scrollable areas inside it.)
Things I tried:
UISwipeGestureRecognizer. Nothing was called for two-finger trackpad scroll gesture.
UIPanGestureRecognizer. Nothing.
Subclassing UIScrollView and adding a UIScrollViewDelegate, just to see if it would work... Nothing.
Subclassing UIGestureRecognizer and adding that, then overriding ShouldReceive(UIEvent evt), but that was never called.
What does iPadOS 13.4+ convert the trackpad two-finger scroll gesture into? Can I get it as some sort of event? The documentation linked above is disappointingly barebones: it mentions UIEvent.EventType.scroll, but not how, when, or where the system will call any of my methods with an event of that type. Pretty infuriating. They should just spell this out more clearly.
Answers in Swift or C# are welcomed.

OK, strangely, I thought I had tried UIPanGestureRecognizer, but I must have set it up wrong. Apple's example code project, Integrating Pointer Interactions into Your iPad App, had the answer (C# code):
// panRecognizer is a field so the handler can reference it.
panRecognizer = new UIPanGestureRecognizer(() => {
    Console.WriteLine("panned -- " + panRecognizer.VelocityInView(this));
});
// Without this mask, the recognizer never sees trackpad/scroll-wheel ("continuous") scrolls.
panRecognizer.AllowedScrollTypesMask = UIScrollTypeMask.Continuous;
AddGestureRecognizer(panRecognizer);
Glad I figured this out!
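In case it helps anyone else adapting this: inside the handler you can read the recognizer's state and translation to drive your own scrolling. A rough sketch (ScrollContentBy is a placeholder for whatever your view actually does):
panRecognizer = new UIPanGestureRecognizer(() => {
    if (panRecognizer.State == UIGestureRecognizerState.Changed)
    {
        // How far the scroll has moved since the last reset.
        var delta = panRecognizer.TranslationInView(this);
        ScrollContentBy(delta);                             // placeholder for your own scrolling logic
        panRecognizer.SetTranslation(CGPoint.Empty, this);  // reset so the next callback gives a fresh delta
    }
});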

Related

Gluon Mobile 5.0.0 - New FAB behaviour causes trouble

In our app we use multiple floating action buttons. With Gluon 4.4.4 we added them as layers, which had the following behaviour:
Each view had its own FAB.
When changing the view, the FAB was hiding.
When changing back to the view, the FAB was showing.
Now, with Gluon 5.0.0 (FABs as objects), the following happens:
The buttons get stacked on top of each other when they are not hidden.
When they are hidden and we change back to a view, the FAB is no longer shown.
This has led to a lot of trouble and unnecessary code. How can we improve this, or how is it intended to be used?
We could imagine using one FAB for the whole app and exchanging its content for each view; however, that ends up in an even bigger mess, since it would have to be declared public, etc.
Any help is appreciated :-)
You are looking for the new FloatingActionButton#showOn(View) method.
This method makes sure the FAB is automatically shown and hidden depending on the View's showing property, removing most of the boilerplate code required in earlier versions to achieve the same functionality.
From the Javadocs:
Makes sure that the FAB is automatically shown when the supplied view is shown. The FAB also automatically hides when the view is hidden. This allows the developer to not worry about calling show() and hide() methods explicitly.

Google VR Reticle Click on UI Button

So I am having an issue using the Google VR reticle where I cannot click a button. I have an image attached showing the hierarchy; the PlayButton is what I am trying to click. The Canvas has a Graphic Raycaster, and the button has an Event Trigger that calls the method that navigates to the next scene. The UpScrollPanel and DownScrollPanel work just fine. The EventSystem has the Gaze Input Module, as well as the Event System and Touch Input Module components.
Any ideas on how to get this working? I have watched a few videos from NurFACEGAMES, and while they helped a little, I haven't gotten the click to work yet.
Oh, and I am using Unity 5.3.4f
Sometimes things can get in the way of the button. Make sure that no other UI elements overlap it, for example text borders (which are actually larger than they appear). You can also fix this by moving the button around in the hierarchy among its siblings; I believe the last sibling is drawn on top.
Also try moving the button up the hierarchy if possible; sometimes UI elements under certain parents don't receive clicks.
The Canvas object should have a Graphic Raycaster.
I found the issue to be unrelated to anything I thought it was. The menu I was using is a prefab that I also use in another view that isn't VR. The ScrollRect was loading that prefab instead of the modified one I was using in the VR menu, and therefore the triggers I had added to the button were not being used when the app loaded.
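If anyone hits something similar: one way to make the click wiring independent of which prefab variant gets instantiated is to attach the handler from code on the menu root after it loads. A rough sketch only; the PlayButton child name comes from my hierarchy, and the scene name is illustrative:
using UnityEngine;
using UnityEngine.UI;
using UnityEngine.SceneManagement;

public class MenuWiring : MonoBehaviour
{
    void Start()
    {
        // Hook the click handler in code so it no longer matters whether the
        // prefab's Event Trigger entries survived the prefab swap.
        // Assumes PlayButton is a direct child; use a path otherwise.
        var playButton = transform.Find("PlayButton").GetComponent<Button>();
        playButton.onClick.AddListener(LoadNextScene);
    }

    void LoadNextScene()
    {
        SceneManager.LoadScene("NextScene");   // illustrative scene name
    }
}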

Find key presses from layout controls (not an entry field) in Xamarin.Forms

I need to find out which key the user pressed on the keyboard using Xamarin.Forms, ideally inside an AbsoluteLayout. I've come up with a few ways myself, but can't get any of them working.
As there isn't an event for this on the AbsoluteLayout control, I tried a little cheat: putting an Entry field (textbox) on the screen but hiding it off-screen above the visible area, so it can't be seen, and reading the result from that. However, the entry loses focus when someone presses the screen or a button on the screen. So I tried adding an event to each button that refocuses the textbox once I've handled the press, and this seemed OK at first; however, if they press anywhere else on the screen, it also loses focus.
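To show what I mean, the hidden-Entry cheat looks roughly like this (just a sketch; absoluteLayout and someButton stand for whatever controls you already have, and it still has the focus-loss problem described above):
var keyCatcher = new Entry();
// Parked above the visible area so it never shows, but can still hold focus.
AbsoluteLayout.SetLayoutBounds(keyCatcher, new Rectangle(0, -100, 200, 40));
absoluteLayout.Children.Add(keyCatcher);

keyCatcher.TextChanged += (s, e) =>
{
    if (string.IsNullOrEmpty(e.NewTextValue))
        return;                              // ignore the reset below
    var typed = e.NewTextValue;              // the key(s) the user just typed
    // ...handle the key press here...
    keyCatcher.Text = string.Empty;          // reset so the next key is easy to read
};

// Refocus after handling a button press; taps elsewhere still steal focus.
someButton.Clicked += (s, e) => keyCatcher.Focus();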
I also tried adding a TapGestureRecognizer to the screen and refocusing the entry when they press anywhere on it. However, there were two issues with this: first, it appears to fire only when something inside the layout is touched, and second, when I call the Focus method a second time (if they tap the screen twice) it un-focuses the entry field, even if I check IsFocused first (I think this is a bug).
I'm only concerned about Windows 8 and Android apps so far. iOS may come later, but for now I'm just trying to get it working for these OSes. So maybe I could code it in the Windows and Android projects (inside my shared-project solution); however, I have absolutely no idea where to even begin doing that. I mean, if this is the best way, how can I pass my AbsoluteLayout to the Windows project, have it know what it is, and convert it into a control that I can then add the event to?
Any help, advice, or ideas would be much appreciated. I can't find anything on NuGet that will help me with this. Any ideas?
Many thanks
James

Disable UIPageViewController when ModalViewController opened

I am totally new to this site, but I already like it :-)
I found it by searching for a question about the UIPageViewController.
I have a normal UIPageViewController app, in which I open a ModalViewController to configure some settings...
Now the problem: :-)
If I click the Done button on the right side of the modal view to dismiss it, the page view controller turns the page, because it thinks that click was meant for it ;-)
Can I disable the UIPageViewController's gesture recognizers as long as a modal view is open?
Is there a method to disable its recognizers and re-enable them later?
thank you for your help in advance...
cu Matze
It seems odd that your UIPageViewController would steal touches from a modal view presented over it. Unless, perhaps, you are embedding the modal view within the content of the UIPageViewController?
To answer your question -- you can easily disable the page view controller's gesture recognizers by enumerating its gestureRecognizers property (an NSArray):
// Disable every recognizer the page view controller exposes.
for (UIGestureRecognizer *gr in [self.pageViewController gestureRecognizers]) {
    [gr setEnabled:NO];
}
Re-enable them later with setEnabled:YES.
UPDATE:
In iOS 6, UIPageViewControllerTransitionStyleScroll was added. UIPageViewControllers that use this transition style return no gesture recognizers in the array returned by gestureRecognizers. Presumably page view controllers with this transition style use an underlying, private UIScrollView instance (it behaves just like a UIScrollView with paging enabled), although I haven't verified this.

Gesture Events and StageWebView

Greetings! I have a Flex 4.5 Mobile project rolling, and I've hit a pretty crazy snag. I'm using a StageWebView object to render web pages, embedded within the rest of my Spark layouts. I'm trying to add a gesture event to the component that contains the StageWebView, but since the StageWebView object doesn't belong to the Flex component hierarchy (it inherits from EventDispatcher, not UIComponent), all of my events seem to be getting eaten. No mouse-based event (click, gesture, etc.) seems to register, and I'm not sure how to get around it. The gesture events do work in the area where the browser is not rendered. How can I get the gesture event from the outer SkinnableContainer?
StageWebView Reference:
http://help.adobe.com/en_US/FlashPlatform/beta/reference/actionscript/3/flash/media/StageWebView.html
UIComponent Wrapped StageWebView:
http://soenkerohde.com/2010/11/air-mobile-stagewebview-uicomponent/
Thanks!
I guess you might have to wire up the gesture events yourself. From a quick dig through UIComponent.as, it declares these:
[Event(name="touchInteractionStarting", type="mx.events.TouchInteractionEvent")]
[Event(name="touchInteractionStart", type="mx.events.TouchInteractionEvent")]
[Event(name="touchInteractionEnd", type="mx.events.TouchInteractionEvent")]
It's not a bug. From what I understand, any mouse interaction over a StageWebView is an interaction with the HTML currently loaded inside it. You should capture the events there and relay them back to the SWF.
Surely there are some jQuery plugins or something with gesture support that could help achieve that.
It's a bit of a bummer that you can't overlay stuff over them, though.
