Responsive, single-button input in A-Frame WebVR

For my A-Frame WebVR game, I need to access a single "controller" button, regardless of platform. For a phone using a magic window or Google Cardboard, any screen tap would count. For Gear VR or Daydream, any button on the controller would count. For a PC VR rig, any button on either controller would count.
Don McCurdy's universal-controls (https://github.com/donmccurdy/aframe-extras/tree/master/src/controls) would seem to be relevant, yet it's not clear how I could use it to do what I want.
I could also access the Gamepad API directly, and separately detect screen taps.
What's the best way to proceed?

Perhaps the input mapping system from Fernando Serrano could help:
https://blog.mozvr.com/input-mapping/

It turns out that if you want to treat all buttons alike, it's easier to listen for the buttonchanged event on the controls entity.
As Noam kindly pointed out, aframe-input-mapping-component is great for general mapping of buttons to actions.
[edit] I've created aframe-button-controls to handle this.
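For reference, a minimal sketch of that approach (the component name any-button and the emitted buttonpress event are illustrative, not the actual aframe-button-controls API):

// Attach to each controller entity, e.g.:
//   <a-entity laser-controls="hand: right" any-button></a-entity>
AFRAME.registerComponent('any-button', {
  init: function () {
    var el = this.el;
    // tracked-controls emits `buttonchanged` for every controller button;
    // evt.detail.state.pressed says whether it just went down or up.
    el.addEventListener('buttonchanged', function (evt) {
      if (evt.detail.state.pressed) {
        el.emit('buttonpress');  // normalize to one app-level event
      }
    });
    // Phones (magic window / Cardboard) have no controller, so treat
    // any screen tap the same way.
    window.addEventListener('touchstart', function () {
      el.emit('buttonpress');
    });
  }
});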

Related

Should I use Button or TextBlock?

I have two options. I need 48 of a certain type of control; it needs to respond to clicks and taps (for touch devices).
I could use Buttons, using the TextButtonStyle and the Click event. Or I could use TextBlocks, with the Tapped event.
I reckon Buttons may be more "expensive" to create. On the other hand, although I believe Tapped also fires when the user clicks the component, the nomenclature makes me a little nervous.
Another difference is that a Button takes up only the width it needs, whereas a TextBlock takes all the available width; and since I want the underlying Grid to be tappable, the TextBlock is a problem that way. Is there a property that will make it more modest, like the Button?
There is design guidance for Windows Store apps on when and how to use buttons at http://msdn.microsoft.com/en-US/library/windows/apps/hh465470. Based on your description and this guidance, it sounds like buttons are the way to go. Responding to click events is what they were made for, and TextBlocks add the extra issues that you describe.

QMainWindow that ignores clicks and passes them on to background windows

I'd like to create a semi-transparent information window that doesn't get in the way of the user's other activities. Any clicks on the window should just pass through as if the window wasn't there.
How would you recommend implementing such behavior? Is there an easy way to do it or do I have to follow a clumsy workaround? I'm thinking of hiding the window, re-executing the click, then making the window visible again. But this would still screw up drag'n'drop gestures.
Take a look at the Qt::WidgetAttribute enum value Qt::WA_TransparentForMouseEvents:
When enabled, this attribute disables the delivery of mouse events to the widget and its children. Mouse events are delivered to other widgets as if the widget and its children were not present in the widget hierarchy; mouse clicks and other events effectively "pass through" them. This attribute is disabled by default.
I did a little more research into "mouse event transparency" (didn't know the exact terminology) and I found this.
I don't think there is a general and easy approach to your problem. You will probably have to dig into the native API. Once events reach an application they are not forwarded to other applications on their own.
What do you guys think? Am I doomed to work with the native APIs of each OS?

Detecting FR/FF button event in MPMoviePlayerController UI?

Basically, long-pressing the FR (fast rewind) / FF (fast forward) buttons causes directional scrubbing. But the iPod and YouTube apps detect short taps on these buttons and use them to navigate to the previous/next track.
How can I achieve this? Is it possible, or do I need a view-hierarchy hack?
I solved this with a view-hierarchy hack. This is not recommended and should be avoided as much as possible, but I note it here for future reference, to record that there is currently no supported way. This hack applies only to a specific version (4.3) of the iOS SDK.
1. Iterate over the whole view hierarchy of -[MPMoviePlayerController view].
2. Find subclasses of UIButton and add a target-action handler to each of them (you can check for subclasses of MPTransportButton).
3. In the handler, filter by tag. Only the navigation buttons are tagged: 1 = play/pause, 2 = previous, 4 = next.
Keep in mind that this is just a hack; it is not guaranteed to keep working or to pass App Store review. If you have had an app rejected for using this method, please leave a comment. It'll be much appreciated.
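A rough sketch of those steps (the helper name is mine; this pokes at the player's internal views and can break in any iOS release):

#import <MediaPlayer/MediaPlayer.h>

// Recursively walk the player's view hierarchy and attach a
// target-action handler to the tagged transport buttons.
static void AttachToTransportButtons(UIView *view, id target, SEL action) {
    for (UIView *subview in view.subviews) {
        // Tag values per the answer above: 1 = play/pause,
        // 2 = previous, 4 = next.
        if ([subview isKindOfClass:[UIButton class]] &&
            (subview.tag == 2 || subview.tag == 4)) {
            [(UIButton *)subview addTarget:target
                                    action:action
                          forControlEvents:UIControlEventTouchUpInside];
        }
        AttachToTransportButtons(subview, target, action);
    }
}

// Usage, assuming `player` is your MPMoviePlayerController:
//   AttachToTransportButtons(player.view, self, @selector(transportTapped:));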

How do you register an event listener to the appear/disappear events of the virtual keyboard on Blackberry Playbook using Flex?

Not sure why no one has been complaining about this, but I'm having a lot of problems with the BlackBerry PlayBook virtual keyboard on the simulator.
I have a richedit component in the middle of the screen, and as soon as the virtual keyboard appears for text entry, it completely hides the text input. I'd like to move the text input up when the keyboard appears and back down when it disappears. Is there any way to do this? I don't want to muck around with the focus_in and focus_out events on the richedit; I've tried that, and it's not very reliable.
Thank you in advance!
We expect the next release of the SDK (long overdue at this point but, I think, imminent) to provide much more complete support for the virtual keyboard. Until that occurs, I think it's a waste of time to attempt to do anything special with it.
I also think there's a chance it will automagically move your whole stage up when it would cover up a text input, so maybe you won't have to do anything about it anyway.
Edit: Actually I published code in January describing an undocumented way to support this, using some rudimentary PPS support. It also shows how you can programmatically control the keyboard opening and closing. I don't recommend it yet for real code...

iOS Advanced Gestures: Getting Swipe Direction Vector

Looking through the documentation, it seems that the new advanced gestures API doesn't determine the direction of a swipe beyond the basic { left, right, up, down }.
I need the start point of the swipe and the direction.
Is there any way to retrieve this other than coding my own advanced gesture library from scratch on top of the basic gestures?
And if this is my only option, could anyone point me to some open source code that does this?
Got it! Documentation is here, under 'Creating Custom Gesture Recognizers' at the bottom.
Basically, the six gesture recognisers Apple provides all derive from UIGestureRecognizer, and you can make your own recogniser in the same way.
Then, inside your view's init, you hook up your recogniser, and just the act of hooking it up automatically reroutes incoming touch events.
Actually, the default behaviour is to make your recogniser an observer of these events, which means your view gets them as it used to; in addition, if your recogniser spots a gesture, it triggers the myCustomEventHandler method inside your view (you passed its selector when you hooked up the recogniser).
But sometimes you want to prevent the original touch events from reaching the view, and you can fiddle around in your recogniser to do that, so it's a bit misleading to think of it as purely an 'observer'.
There is one other scenario, where one gesture needs to eat another. For example, you can't just send back a single tap if your view is also primed to receive double taps: you have to wait for the double-tap recogniser to report failure, and if it succeeds, you need to fail the single tap. Obviously you don't want to send both back!
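A minimal sketch of such a recogniser, capturing the start point and direction vector (the class name and the 20-point threshold are my own choices):

#import <UIKit/UIGestureRecognizerSubclass.h>
#import <math.h>

@interface SwipeVectorRecognizer : UIGestureRecognizer
@property (nonatomic) CGPoint startPoint;  // where the swipe began
@property (nonatomic) CGPoint vector;      // end point minus start point
@end

@implementation SwipeVectorRecognizer

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    self.startPoint = [[touches anyObject] locationInView:self.view];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint end = [[touches anyObject] locationInView:self.view];
    self.vector = CGPointMake(end.x - self.startPoint.x,
                              end.y - self.startPoint.y);
    // Require a minimum travel distance before calling it a swipe.
    BOOL farEnough = hypot(self.vector.x, self.vector.y) > 20.0;
    self.state = farEnough ? UIGestureRecognizerStateRecognized
                           : UIGestureRecognizerStateFailed;
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    self.state = UIGestureRecognizerStateCancelled;
}

@end

// Hooking it up in the view, as described above:
//   [self addGestureRecognizer:[[SwipeVectorRecognizer alloc]
//       initWithTarget:self action:@selector(handleSwipe:)]];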
