The A-Frame master branch now contains a daydream-controls component, which works in my A-Frame project when I enter VR using my phone and a Daydream View HMD. However, I would like to emulate a Daydream controller while developing, particularly to inspect event data so I can write the event handlers.
What would be the best strategy to generate event data for daydream-controls component during development?
You can use the Motion Capture dev tool to record and replay controller interactions, which automates controller development: https://aframe.io/docs/0.5.0/introduction/visual-inspector-and-dev-tools.html#motion-capture
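To inspect the payloads during playback, here is a minimal sketch that logs the generic tracked-controls events that daydream-controls builds on (the event names come from the A-Frame docs; the selector and list are illustrative):

var controllerEl = document.querySelector('[daydream-controls]');
['buttondown', 'buttonup', 'buttonchanged', 'axismove'].forEach(function (name) {
  controllerEl.addEventListener(name, function (evt) {
    console.log(name, evt.detail);  // inspect button/axis data here
  });
});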
I'm using A-Frame and I'm trying to figure out how to easily support multiple types of controllers at once (Oculus Touch, HTC Vive controllers, and Windows Mixed Reality controllers), preferably with controller models rendered in the scene and with lasers that would allow the user to click on things.
How do I do this?
I figured out how to do this, so here's my solution.
In your HTML, add these two entities to create the controllers (they should be inside the a-scene element):
<a-entity laser-controls="hand: left" raycaster="showLine: true; objects: .clickable;"></a-entity>
<a-entity laser-controls="hand: right" raycaster="showLine: true; objects: .clickable;"></a-entity>
These should also render with the actual controller models in the scene, and each have a laser pointer.
[Screenshot: the Oculus Touch controller models rendered in the scene, each with a laser pointer.]
As new types of headsets come out and are supported by A-Frame (e.g., the Valve Index controllers aren't supported yet), the laser-controls component should automatically pick them up.
See the docs for a bit more information on how to use controllers in your A-Frame scene.
I still haven't figured out exactly how to make it possible to click on buttons or objects in the environment using the laser; I'll need to figure that out next.
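One approach that should work is a sketch like the following, assuming the cursor that laser-controls applies emits click events on entities the raycaster intersects (the component name is illustrative):

AFRAME.registerComponent('click-handler', {
  init: function () {
    // Fires when a laser pointing at this entity has its trigger pressed.
    this.el.addEventListener('click', function (evt) {
      console.log('clicked', evt.detail);
    });
  }
});

Attach it to anything carrying the .clickable class from the raycaster's objects selector, e.g. <a-box class="clickable" click-handler></a-box>.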
Is it possible to implement a draggable view with a Xamarin View? Is there an event I can use that gets triggered when a finger is pressed on the screen, then moved, and then released, without having to use native Android/iOS code? I don't mean a swipe event; I know that exists. I am looking for an event that would let the user drag a rectangle across the screen, for example.
I've looked for it on the internet, but can only seem to find the normal Touch, Swipe, and Tap events. (Although I found it is possible using native Android/iOS code.)
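One cross-platform approach, sketched here on the assumption that you're using Xamarin.Forms, is PanGestureRecognizer: its PanUpdated event reports Started, Running, and Completed statuses, which map to press, move, and release:

using Xamarin.Forms;

public class DragPage : ContentPage
{
    public DragPage()
    {
        var box = new BoxView { Color = Color.Blue, WidthRequest = 100, HeightRequest = 100 };
        double startX = 0, startY = 0;

        var pan = new PanGestureRecognizer();
        pan.PanUpdated += (sender, e) =>
        {
            switch (e.StatusType)
            {
                case GestureStatus.Started:    // finger pressed
                    startX = box.TranslationX;
                    startY = box.TranslationY;
                    break;
                case GestureStatus.Running:    // finger moved: follow the drag
                    box.TranslationX = startX + e.TotalX;
                    box.TranslationY = startY + e.TotalY;
                    break;
                case GestureStatus.Completed:  // finger released
                    break;
            }
        };
        box.GestureRecognizers.Add(pan);

        Content = new AbsoluteLayout { Children = { box } };
    }
}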
I have a requirement to build multiple scenes in WebVR with A-Frame. Each scene will have a button which, when clicked, will load a new scene with a different background.
What is the recommended way to build scenes in A-Frame?
1. Create just one scene and swap the entities at runtime. Each entity can correspond to a button, background, etc.
2. Create multiple pages (e.g., index.html), each having its own scene, and load each page on a click event.
Approach (1) seems to be the preferred one, as the second approach would mean the browser's 'Back' button is enabled on loading new pages, which is undesirable and hurts the user experience in VR.
Can anyone confirm that approach (1) is preferred?
Definitely a better experience if you can keep everything in a "single-page app" (approach 1) and swap out entities, especially if you are doing something as simple as swapping a background. Only a couple of browsers have implemented proper in-VR link traversal, and the link-traversal UX at the moment is non-existent.
The 360° image gallery guide shows this pattern: https://aframe.io/docs/0.6.0/guides/building-a-360-image-gallery.html
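A minimal sketch of approach (1), swapping the a-sky background when a button is clicked (the asset names, button entity, and cursor setup are illustrative):

<a-scene>
  <a-assets>
    <img id="bg1" src="bg1.jpg">
    <img id="bg2" src="bg2.jpg">
  </a-assets>
  <a-sky id="sky" src="#bg1"></a-sky>
  <a-plane id="switch-button" class="clickable" position="0 1.6 -2"></a-plane>
  <a-camera><a-cursor raycaster="objects: .clickable"></a-cursor></a-camera>
</a-scene>
<script>
  // Cycle through backgrounds without leaving the page, so the browser's
  // 'Back' button never comes into play.
  var backgrounds = ['#bg1', '#bg2'];
  var index = 0;
  document.querySelector('#switch-button').addEventListener('click', function () {
    index = (index + 1) % backgrounds.length;
    document.querySelector('#sky').setAttribute('src', backgrounds[index]);
  });
</script>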
I'm learning about view controllers and SKScenes for a SpriteKit game and have a button in the view controller for an in-app purchase. I put it there rather than programmatically in the SKScene because I can drag and create outlets and actions, which made the IAP setup easier. However, I need to disable this button while the SKScene is controlling the game so it isn't accidentally tapped. I can post code if need be, but it's basically just a UIButton outlet that I want to be able to edit from another scene file: it should be enabled in the game-over scene but disabled in the game scene.
In general, it's not a good idea to use UIKit elements in SpriteKit games.
You should create all your buttons using SKSpriteNodes directly in the scene rather than using UIButtons in your game view controller. There are plenty of tutorials to google on how to create buttons in SpriteKit.
It's a common thing people do.
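For example, a minimal SKSpriteNode button (the ButtonNode name and closure-based action are illustrative):

import SpriteKit

class ButtonNode: SKSpriteNode {
    var action: (() -> Void)?

    override init(texture: SKTexture?, color: UIColor, size: CGSize) {
        super.init(texture: texture, color: color, size: size)
        isUserInteractionEnabled = true  // let this node receive touches
    }

    required init?(coder aDecoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        action?()  // run the tap handler
    }
}

In the game-over scene you would add one with buyButton.action = { /* start the IAP flow */ }, and simply not add it to the game scene.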
If you still want to use a UIButton, then I believe this is what you are looking for:
someUIButton.isUserInteractionEnabled = false
Just call it at the correct spot
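For example, toggling it from the view controller that owns the outlet whenever you swap scenes (a sketch; the scene classes and outlet names are illustrative):

import UIKit
import SpriteKit

class GameScene: SKScene {}
class GameOverScene: SKScene {}

class GameViewController: UIViewController {
    @IBOutlet weak var iapButton: UIButton!
    @IBOutlet weak var skView: SKView!

    func startGame() {
        iapButton.isUserInteractionEnabled = false  // can't be tapped mid-game
        skView.presentScene(GameScene(size: skView.bounds.size))
    }

    func showGameOver() {
        iapButton.isUserInteractionEnabled = true   // re-enable for the IAP
        skView.presentScene(GameOverScene(size: skView.bounds.size))
    }
}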
Hope this helps
I have a web application whose UI is implemented in GXT (Ext GWT).
Now I want to switch to Flex, but the application is so large that I cannot afford to migrate the whole thing at once.
So I have decided to migrate gradually. What I want is to bring up a Flex panel on the click of a GXT button.
Basically, the idea is how to make Flex components listen to events generated by GXT components.
A Flex app is ultimately a SWF, and GWT gives you JavaScript; you can use ExternalInterface to invoke a SWF's methods from JavaScript code and vice versa. Check out the addCallback method; the linked page has some sample code in it.
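A rough sketch of both sides (the showPanel method name, the panel, and the flexApp element id are illustrative):

// ActionScript 3 (inside the Flex app): expose a method to JavaScript.
import flash.external.ExternalInterface;

if (ExternalInterface.available) {
    ExternalInterface.addCallback("showPanel", function (title:String):void {
        myPanel.title = title;    // myPanel is some Flex panel in the app
        myPanel.visible = true;
    });
}

// JavaScript (e.g. called from the GXT button's click handler via GWT's JSNI):
var swf = document.getElementById('flexApp');  // the embedded SWF's object/embed id
swf.showPanel('Hello from GXT');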