I want to replicate the controller functions in the AltspaceVR Oculus Go app.
How do I make it so that the Oculus Go controller trigger acts as a click, while pressing up on the trackpad triggers an arc teleport, touching left or right on the trackpad turns right or left, touching down on the trackpad moves backwards, and tapping the back button opens a menu with an option to exit WebVR mode in the Oculus Browser?
Please try the app for additional clarity about what I would like to accomplish.
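For what it's worth, this is the general shape of the event wiring involved, assuming the scene is built with A-Frame (the question doesn't say, so that is an assumption on my part). It is only a rough sketch, not a working recipe: the component name go-controls-mapping, the axis thresholds, and the console.log placeholders are all mine, and the axis sign conventions should be verified by logging evt.detail.axis.

// Rough sketch, assuming A-Frame and its oculus-go-controls component.
// The component name and the thresholds below are hypothetical placeholders.
AFRAME.registerComponent('go-controls-mapping', {
  init: function () {
    var lastAxis = [0, 0];

    // Trackpad position arrives via 'axismove'; sign conventions vary, so log and verify.
    this.el.addEventListener('axismove', function (evt) {
      lastAxis = evt.detail.axis;
    });

    // Trigger press: treat as a click on whatever the controller ray is pointing at.
    this.el.addEventListener('triggerdown', function () {
      console.log('trigger: click');
    });

    // Trackpad press: decide up / down / left / right from the last axis values.
    this.el.addEventListener('trackpaddown', function () {
      var x = lastAxis[0], y = lastAxis[1];
      if (y < -0.5)      console.log('up: arc teleport');
      else if (y > 0.5)  console.log('down: move backwards');
      else if (x > 0.5)  console.log('right: turn right');
      else if (x < -0.5) console.log('left: turn left');
    });
  }
});
// Usage: <a-entity oculus-go-controls go-controls-mapping></a-entity>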
I'm using A-Frame for a different purpose than VR. I'm using it to show a 3D model where the user can rotate the model, zoom in and out, and inspect various parts of the device on click.
I'm using the aframe-orbit-controls-component-2 component to make the camera rotate around the device model.
How do I detect mouse clicks on specific parts of the device (I already have these parts with ids, I just need to detect mouse clicks on them) without needing the camera to be focused on said part?
You can use the mouse by setting the cursor component's rayOrigin property to mouse:
<a-scene cursor="rayOrigin: mouse">
....
Check it out here - the console will log the elements which are clicked
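To round that out, here is a minimal sketch of the listening side. The part ids and geometry are made up; the only A-Frame pieces are the cursor and raycaster components and the standard click event.

<!-- Minimal sketch: the ray originates from the mouse, so it works regardless of where the camera is pointing. -->
<a-scene cursor="rayOrigin: mouse" raycaster="objects: .clickable">
  <a-box id="part-lid" class="clickable" position="-1 1.5 -3"></a-box>
  <a-sphere id="part-knob" class="clickable" position="1 1.5 -3"></a-sphere>
</a-scene>

<script>
  // Attach a click listener to each part; the console logs which element was clicked.
  document.querySelectorAll('.clickable').forEach(function (el) {
    el.addEventListener('click', function (evt) {
      console.log('clicked:', evt.target.id);
    });
  });
</script>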
I find the graphical feedback when pressing buttons (WKInterfaceButton) in watchOS 2 very weak. It is hard to see, and even Apple seems to think so: in the unlock screen, for example, they change the background of active buttons to white. The default behaviour is to dim the whole button.
How can I make a button press clearer in watchOS 2? I can, for example, change the button text color when it is tapped, but how do I easily change it back when it is no longer active?
There is currently no way to detect touch-down, or other events, on WKInterfaceButtons like you can on UIButtons on iOS. The only touch event you can detect is touch-up-inside, which calls the IBAction method.
Therefore what you wish to accomplish isn't currently possible. Something you might consider is animating the button appearance once the action has been triggered. For example, in my app, upon a button tap I animate the button's background color, then animate it back to the original color. That provides more visual confirmation to the user, so they are certain the button was tapped.
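A rough sketch of that animation approach, written in current Swift (the underlying animateWithDuration:animations: API is what shipped with watchOS 2). The outlet name, colors, and timings here are just placeholders; I'm assuming a WKInterfaceButton outlet wired to the tapped button.

import WatchKit

class InterfaceController: WKInterfaceController {
    // Hypothetical outlet; connect it to the same button that fires the IBAction.
    @IBOutlet var confirmButton: WKInterfaceButton!

    @IBAction func confirmTapped() {
        // Flash the background so the tap is clearly visible...
        animate(withDuration: 0.15) {
            self.confirmButton.setBackgroundColor(.white)
        }
        // ...then restore the original color shortly afterwards.
        DispatchQueue.main.asyncAfter(deadline: .now() + 0.3) {
            self.animate(withDuration: 0.15) {
                self.confirmButton.setBackgroundColor(.darkGray)
            }
        }
    }
}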
I'd encourage you to file an enhancement request at bugreport.apple.com if you'd like to have more control with touch events.
I'm trying to implement a feature such that users can click on the notification content area (not the app icon nor the action buttons) to open the main WatchKit app. The current official way to launch the main WatchKit app is to tap the app icon or an action button in a notification scene. But the app icon is actually quite small (and hard) to tap, and sometimes users are not even aware of this feature.
What I'm trying to do now is to put my notification content into a table row in the notification controller and make this row clickable. This gives the user a pretty big area to tap. Then, in the click event handler, I'm trying to open the main WatchKit app. But I haven't figured out a way to do that (none of the general navigation methods, such as presentControllerWithName, works from a notification controller).
Any suggestions?
In the Apple Watch's passcode setting screen, the passcode at the top updates as soon as the user presses a button, not after lifting the finger. How can one achieve that in Xcode 6.3, given that the only event that triggers the IBAction is the touch-up event?
I don't think that behavior is possible with the current version of WatchKit. Apple is likely using their own internal method to accomplish that (as they do with many/most of the default Watch apps). Hopefully we'll get more functionality in the next major update.
Looking through the documentation, it seems that the new advanced gestures API doesn't determine the direction of a swipe beyond the basic { left, right, up, down }.
I need the start point of the swipe and the direction.
Is there any way to retrieve this, other than coding my own advanced gesture library from scratch on top of the basic gestures?
And if this is my only option, could anyone point me to some open source code that does this?
Got it! Documentation is here, under 'Creating Custom Gesture Recognizers' at the bottom.
Basically the six gesture recognisers Apple provides all derive from UIGestureRecognizer, and you can make your own recogniser in the same way.
Then, inside your view's init, you hook up your recogniser, and just the act of hooking it up automatically reroutes incoming touch events.
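To make that concrete, here is a minimal sketch of a recogniser that records the start point and direction of a swipe (what the original question asked for), plus the hookup in a view's init. The class, view, and handler names are mine; the API pieces are UIGestureRecognizer, the UIGestureRecognizerSubclass header, and addGestureRecognizer.

import UIKit
import UIKit.UIGestureRecognizerSubclass  // exposes the state setter and the touches* overrides

// Hypothetical recogniser that remembers where a swipe started and which way it moved.
class DirectionalSwipeRecognizer: UIGestureRecognizer {
    private(set) var startPoint: CGPoint = .zero
    private(set) var direction: CGVector = .zero

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent) {
        startPoint = touches.first?.location(in: view) ?? .zero
        state = .began
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent) {
        guard let point = touches.first?.location(in: view) else { return }
        direction = CGVector(dx: point.x - startPoint.x, dy: point.y - startPoint.y)
        state = .changed
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent) {
        // A real recogniser would check minimum distance and velocity before recognising.
        state = .ended
    }

    override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent) {
        state = .cancelled
    }
}

// Hooking it up inside the view; just adding it reroutes touch events through the recogniser.
class SwipeView: UIView {
    override init(frame: CGRect) {
        super.init(frame: frame)
        let recogniser = DirectionalSwipeRecognizer(target: self,
                                                    action: #selector(myCustomEventHandler(_:)))
        addGestureRecognizer(recogniser)
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    @objc func myCustomEventHandler(_ sender: DirectionalSwipeRecognizer) {
        print("swipe from \(sender.startPoint), direction \(sender.direction)")
    }
}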
Actually, the default behaviour is to make your recogniser an observer of these events, which means your view gets them as it used to; in addition, if your recogniser spots a gesture, it will trigger your myCustomEventHandler method inside your view (you passed its selector when you hooked up your recogniser).
But sometimes you want to prevent the original touch events from reaching the view, and you can configure your recogniser to do that, so it's a bit misleading to think of it purely as an 'observer'.
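The relevant knobs are properties of UIGestureRecognizer itself; for example, on the hypothetical recogniser from the sketch above:

// false: the view keeps receiving touches even after the gesture is recognised
recogniser.cancelsTouchesInView = false
// true: began-phase touches are withheld from the view unless and until the recogniser fails
recogniser.delaysTouchesBegan = true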
There is one other scenario, where one gesture needs to eat another. For example, you can't just deliver a single click if your view is also primed to receive double clicks. You have to wait for the double-click recogniser to report failure, and if it succeeds you need to fail the single click -- obviously you don't want to deliver both!
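UIKit expresses that dependency with require(toFail:), so you don't have to hand-roll the bookkeeping. A quick sketch with the standard tap recognisers (the controller and handler names are made up):

import UIKit

class TapDemoViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        let singleTap = UITapGestureRecognizer(target: self, action: #selector(handleSingleTap))
        let doubleTap = UITapGestureRecognizer(target: self, action: #selector(handleDoubleTap))
        doubleTap.numberOfTapsRequired = 2

        // The single tap fires only after the double-tap recogniser reports failure,
        // so a double tap never also triggers the single-tap handler.
        singleTap.require(toFail: doubleTap)

        view.addGestureRecognizer(singleTap)
        view.addGestureRecognizer(doubleTap)
    }

    @objc func handleSingleTap() { print("single tap") }
    @objc func handleDoubleTap() { print("double tap") }
}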