One-finger gesture recognizer with normal UIButtons - Xcode 4

I'm using some parts of the code from this one-finger gesture recogniser:
http://blog.mellenthin.de/archives/2012/02/13/an-one-finger-rotation-gesture-recognizer/
and I added three UIButtons with their touch actions wired up. When I implemented this in Xcode and ran my app in the simulator, the buttons only responded when I swiped across them; they don't recognise a single tap. What should I do?
Thanks,

What you need is to implement one delegate method, as below:
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer {
    return YES;
}
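For this method to be called at all, the object that implements it must adopt UIGestureRecognizerDelegate and be assigned as the recognizer's delegate. A minimal sketch, assuming the class from the linked post is named OneFingerRotationGestureRecognizer and a rotated: action method exists elsewhere:

@interface MyViewController () <UIGestureRecognizerDelegate>
@end

@implementation MyViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    OneFingerRotationGestureRecognizer *rotation =
        [[OneFingerRotationGestureRecognizer alloc] initWithTarget:self
                                                            action:@selector(rotated:)];
    // Without this assignment the delegate method is never consulted,
    // and the rotation recognizer swallows the buttons' taps.
    rotation.delegate = self;
    [self.view addGestureRecognizer:rotation];
}

@end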
For more explanation, see the UIGestureRecognizerDelegate documentation.
Happy Coding :)

Related

Getting a custom UIView to respond to a two-finger trackpad scroll gesture (or mouse scroll wheel)

I have an iPad app (in C#) with a custom UIView that accepts input via touch and Apple Pencil stylus touches. I am trying to integrate support for indirect trackpad/mouse input (the cursor, or "pointer" as Apple calls it).
I got hover working using HoverGestureRecognizer. I got right-click and control-click working using the normal touch Began/Moved/Ended/Cancelled events: checking for .type == .indirectPointer and then checking whether the control key modifier in event.ModifierFlags is set, or whether event.ButtonMask == secondary.
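(For reference, here is a sketch of that check, assuming Xamarin.iOS touch overrides on the custom view; the right-click branch body is a placeholder:)

public override void TouchesBegan(NSSet touches, UIEvent evt)
{
    base.TouchesBegan(touches, evt);
    var touch = (UITouch)touches.AnyObject;
    // Indirect-pointer touches come from the trackpad/mouse (iPadOS 13.4+).
    if (touch.Type == UITouchType.IndirectPointer)
    {
        bool controlHeld = evt.ModifierFlags.HasFlag(UIKeyModifierFlags.Control);
        bool secondary = evt.ButtonMask == UIEventButtonMask.Secondary;
        if (controlHeld || secondary)
        {
            // Treat as a right-click / context-menu trigger.
        }
    }
}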
I have spent a lot of time searching through the documentation on the Apple Developer website, starting here and branching out:
UIApplicationSupportsIndirectInputEvents
Somehow I cannot find the API that the system calls in my code when a two-finger trackpad scroll (or mouse scroll-wheel scroll) occurs. (On another view that is a scroll view, I get the scroll view's scroll event from a two-finger scroll, since that is built into iPadOS 13.4+ for scroll views, but my custom view is not a scroll view; it just has some scrollable areas inside it.)
Things I tried:
UISwipeGestureRecognizer. Nothing was called for two-finger trackpad scroll gesture.
UIPanGestureRecognizer. Nothing.
Subclassing UIScrollView and adding a UIScrollViewDelegate, just to see if it would work... Nothing.
Subclassing GestureRecognizer and adding that, then overriding ShouldReceive(UIEvent evt), but that was never called.
What does iPadOS 13.4+ convert the trackpad two-finger scroll gesture into? Can I get this as some sort of event? The documentation linked above is disappointingly bare-bones: it mentions UIEvent.EventType.scroll, but not how, when, or where the system will call any of my methods with an event of that type. Pretty infuriating; they should spell this out more clearly.
Answers in Swift or C# are welcomed.
OK, strangely, I thought I had tried PanGestureRecognizer, but I must have set it up wrong. Apple's example code project, Integrating Pointer Interactions into Your iPad App, had the answer (C# code):
panRecognizer = new UIPanGestureRecognizer(() => {
    Console.WriteLine("panned -- " + panRecognizer.VelocityInView(this));
});
panRecognizer.AllowedScrollTypesMask = UIScrollTypeMask.Continuous;
AddGestureRecognizer(panRecognizer);
Glad I figured this out!
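Since the question also welcomed Swift, here is the equivalent setup as a sketch (the handler name is illustrative):

// In a view or view controller setup method (e.g. viewDidLoad):
let panRecognizer = UIPanGestureRecognizer(target: self, action: #selector(handlePan(_:)))
// .continuous covers trackpad two-finger scrolls; use .all to also
// accept discrete mouse-wheel scrolls.
panRecognizer.allowedScrollTypesMask = .continuous
view.addGestureRecognizer(panRecognizer)

@objc func handlePan(_ recognizer: UIPanGestureRecognizer) {
    print("panned -- \(recognizer.velocity(in: view))")
}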

Responsive, single button input in A-Frame WebVR

For my A-Frame WebVR game, I need to access a single "controller" button, regardless of platform. For a phone using a magic window or Google Cardboard, any screen tap would count. For Gear VR or Daydream, any button on the controller would count. For a PC VR rig, any button on either controller would count.
Don McCurdy's universal-controls (https://github.com/donmccurdy/aframe-extras/tree/master/src/controls) would seem to be relevant, yet it's not clear how I could use it to do what I want.
I could also access the Gamepad API directly, and separately detect screen taps.
What's the best way to proceed?
Perhaps the input mapping system from Fernando Serrano could help:
https://blog.mozvr.com/input-mapping/
It turns out, if you want to treat all buttons alike, it's easier to listen for the buttonchanged event on the controls entity.
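A minimal sketch (the entity query is illustrative; A-Frame's tracked-controls emits buttonchanged with the button index and its Gamepad-API state):

var controllerEl = document.querySelector('[tracked-controls]');
controllerEl.addEventListener('buttonchanged', function (evt) {
  // evt.detail.id is the button index; evt.detail.state mirrors GamepadButton.
  if (evt.detail.state.pressed) {
    console.log('button ' + evt.detail.id + ' pressed');
  }
});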
As Noam kindly pointed out, aframe-input-mapping-component is great for general mapping of buttons to actions.
[edit] I've created aframe-button-controls to handle this.

How to make a button in Unity?

How can I make a button for mobile devices? I have a pause screen and a script that accepts touch, but it activates when I tap any part of the screen, not just the button.
How can I fix it to activate only on the button? My script is attached to a cube with an invisible material, so basically I want to activate the script only when the player presses the area around the cube.
I wasn't sure how to paste code here so I used pastebin:
http://pastebin.com/ERC39TuU
See the GUI.Button reference.
function OnGUI() {
    if (GUI.Button(Rect(10, 10, 100, 50), "Click"))
        Debug.Log("Clicked button!");
}
Newer versions of Unity include a much better GUI designer. I would look into using a Canvas; adding images and click handlers to your buttons is much easier than using OnGUI, in my opinion.
https://unity3d.com/learn/tutorials/modules/beginner/ui/ui-canvas
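A minimal C# sketch of the Canvas approach (class and field names are illustrative; create a Canvas with a UI Button in the editor and assign it in the Inspector):

using UnityEngine;
using UnityEngine.UI;

public class PauseButton : MonoBehaviour
{
    public Button pauseButton; // assign the Canvas Button in the Inspector

    void Start()
    {
        // Only clicks/taps that land on the Button fire this callback,
        // unlike a raw touch script that reacts to the whole screen.
        pauseButton.onClick.AddListener(OnPausePressed);
    }

    void OnPausePressed()
    {
        Debug.Log("Pause pressed");
        Time.timeScale = 0f; // freeze gameplay for the pause screen
    }
}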

Swipe to go back in navigation controller on iOS 7

I've been looking through the iOS 7 / UIKit framework, and although it looks quite different aesthetically, it's really the same SDK underneath from what I can see.
My question: is there any extra code that needs to be included to get the draggable behaviour between pushed table views/views?
When you push a view onto a UINavigationController you can now drag back to the previous controller from the side rather than pressing the back button.
This behavior can be seen in Mail.
How is this achieved, do I need to add any code to add it to my app?
This has nothing to do with UITableView or UITableViewController, but with UINavigationController. And yes, you get this behavior for free as long as the back button is visible.
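The gesture is exposed as the navigation controller's interactivePopGestureRecognizer property (iOS 7+), so you can verify or toggle it; a sketch (note that the delegate trick for custom back buttons is a commonly cited workaround, not documented behavior):

// Enabled by default on iOS 7+ when a system back button is shown.
self.navigationController.interactivePopGestureRecognizer.enabled = YES;

// If a custom back button disables the swipe, clearing the recognizer's
// delegate is a widely used (undocumented) workaround:
self.navigationController.interactivePopGestureRecognizer.delegate = nil;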

How do you register an event listener to the appear/disappear events of the virtual keyboard on Blackberry Playbook using Flex?

Not sure why no one has been complaining about this, but I'm having a lot of problems with the BlackBerry PlayBook virtual keyboard on the simulator.
I have a richedit component in the middle of the screen, and as soon as the virtual keyboard appears to enter text, it completely hides the text input. I'd like to move the text input up when the keyboard appears/disappears. Is there any way to do this? I don't want to muck around with the focus_in and focus_out events on the richedit; I've tried, and it's not very reliable.
Thank you in advance!
We expect the next release of the SDK (long overdue at this point but, I think, imminent) to provide much more complete support for the virtual keyboard. Until that occurs, I think it's a waste of time to attempt to do anything special with it.
I also think there's a chance it will automagically move your whole stage up when it would cover up a text input, so maybe you won't have to do anything about it anyway.
Edit: Actually I published code in January describing an undocumented way to support this, using some rudimentary PPS support. It also shows how you can programmatically control the keyboard opening and closing. I don't recommend it yet for real code...
