There are samples showing how to access the front camera of Android devices using the Android SDK. Is there any way to do the same using the Flex Hero SDK?
Yes, although I'm not sure which version of the AIR SDK the feature was added in. Camera.names returns an array of the available cameras; on Android the first item is typically the back camera and the second item is the front camera.
If I were to guess, I'd say that you use the Camera class. This blog post should give you more information.
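As a minimal sketch of that approach (assuming an AIR build recent enough to expose both cameras; the back/front ordering is the convention described above, not a guarantee):

    import flash.media.Camera;
    import flash.media.Video;

    // Camera.names lists the available cameras; on most Android devices
    // index 0 is the rear camera and index 1 is the front camera.
    // Note that Camera.getCamera() takes the *index as a string*, not the name.
    var frontIndex:int = (Camera.names.length > 1) ? 1 : 0;
    var camera:Camera = Camera.getCamera(String(frontIndex));

    // Attach the camera to a Video object to preview it on screen
    // (this snippet assumes it runs inside a Sprite or other container).
    var video:Video = new Video(320, 240);
    video.attachCamera(camera);
    addChild(video);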
I'm using AFRAME in an app for the Pico Goblin device.
Unsurprisingly, I'm getting the "No DPDB device match" error, and the camera moves at the wrong speed and appears upside down.
I realise this is because this device is unlikely to be in the official webvr-polyfill DPDB.json.
Is it possible to add devices to this file and use within aframe?
Thanks
This is the repo where the dpdb file lives: https://github.com/immersive-web/webvr-polyfill-dpdb
You can indeed submit PRs there. A-Frame will pick it up in the next version.
In the meantime, you can run a custom build of A-Frame that points to your fork of webvr-polyfill, which fetches your own DPDB.json.
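Depending on the polyfill version bundled with your A-Frame build, you may not even need a custom build: webvr-polyfill reads a global WebVRConfig object, and versions with the DPDB_URL option let you point it at your own file at runtime. A sketch, assuming that option is available (the JSON URL is a placeholder):

    <script>
      // Must be set before the polyfill (bundled with A-Frame) initializes.
      // DPDB_URL points the polyfill at a DPDB file that includes an entry
      // for the Pico Goblin. The URL below is a placeholder.
      window.WebVRConfig = window.WebVRConfig || {};
      window.WebVRConfig.DPDB_URL = 'https://example.com/dpdb.json';
    </script>
    <script src="aframe.min.js"></script>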
I'm working on a Daydream app using the Google VR SDK/NDK. To submit the app to Google Play, I need a 360-degree stereo photosphere. I've seen directions for creating this with Unity, but is there any way to create this without Unity?
I've taken a screenshot of the app in stereo mode, but I don't think that will satisfy the requirement.
Google doesn't provide any tools to capture in-app photospheres in non-Unity apps at this time. Some devs produce photospheres in modeling apps like Maya and Blender.
You could always cheat and make your "Daydream 360 degree stereoscopic image" in Photoshop. Just use the same image twice, once on top and once on the bottom, which gives you the over-under stereo layout.
I think others have already done this, because I have noticed a few wrong-looking previews in the store where, if I close one eye, parts of the image disappear.
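If you'd rather script it than do it in Photoshop, the same duplicate-image trick is a few lines with the Python Imaging Library (file names here are placeholders):

    from PIL import Image

    # Stack one mono equirectangular render over itself to produce the
    # over-under stereo layout: top half = left eye, bottom half = right eye.
    mono = Image.open("photosphere_mono.jpg")          # e.g. a 4096x2048 equirect
    stereo = Image.new("RGB", (mono.width, mono.height * 2))
    stereo.paste(mono, (0, 0))             # top half: left eye
    stereo.paste(mono, (0, mono.height))   # bottom half: right eye
    stereo.save("photosphere_stereo.jpg")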
If you change your mind and make one with Unity, this plugin worked nicely for me: https://www.assetstore.unity3d.com/en/#!/content/38755
My HTC Vive is set up in a different room to my developer workstation. When developing for A-Frame, I understand I can: use my desktop monitor instead of a headset; use mouse dragging instead of motion controls; use WASD instead of room-scale tracking. However, what is the preferred way to simulate the tracked controllers?
For example, how can I move the cubes in this demo from my desktop: https://aframe.io/examples/showcase/tracked-controllers
This is not yet released, but we're working on tools to record the camera and controllers, output the data to a file, and then load it up on any device to replay the camera and controller poses and events without needing a headset. The project is led by Diego:
https://github.com/dmarcos/aframe-motion-capture
http://swimminglessonsformodernlife.com/aframe-motion-capture/examples/
This will become the common way to develop for VR without having to enter and re-enter VR to test each code change, and to do it on the go.
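As a rough idea of the intended workflow (the avatar-recorder component name is taken from the project's README at the time of writing and may change; treat this as a sketch, not a stable API):

    <!-- Load A-Frame plus the motion-capture build from the repo above. -->
    <script src="https://aframe.io/releases/0.5.0/aframe.min.js"></script>
    <script src="dist/aframe-motion-capture.min.js"></script>

    <!-- Record once in the headset room, save the recording, then replay
         it at your desk without entering VR. -->
    <a-scene avatar-recorder>
      <a-entity camera look-controls></a-entity>
      <a-entity vive-controls="hand: left"></a-entity>
      <a-entity vive-controls="hand: right"></a-entity>
    </a-scene>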
I am building an iPhone game with a watch extension. On the watch, I would like the user to tap on an image, and I would like to know where on the image the user tapped. Is this possible with WatchKit?
Update: In watchOS 3.0 this is no longer an issue. See answer for details.
With watchOS 3.0, you can now use a WKTapGestureRecognizer combined with locationInObject() to get the tap location.
(There is still no equivalent of UIKit's touch events, unfortunately.)
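A minimal sketch in Swift, assuming the gesture recognizer has been added to the WKInterfaceImage in the storyboard and wired to this action:

    import WatchKit

    class InterfaceController: WKInterfaceController {

        // Connected in the storyboard: a WKTapGestureRecognizer attached to
        // the WKInterfaceImage, with this method as its action.
        @IBAction func imageTapped(_ sender: WKTapGestureRecognizer) {
            let location = sender.locationInObject()  // tap point in the object's coordinate space
            let bounds = sender.objectBounds()        // the tapped object's bounds

            // For example, work out which quadrant of the image was tapped.
            let isLeft = location.x < bounds.midX
            let isTop = location.y < bounds.midY
            print("Tapped the \(isTop ? "top" : "bottom")-\(isLeft ? "left" : "right") quadrant")
        }
    }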
This is not possible with the current version of WatchKit. The closest you could come is detecting that the user tapped an image or button. I suppose you could slice the original image into smaller images/buttons and lay them out to form the larger image, but I'm not sure how that would perform.
I am creating a PlayBook app in Adobe Flex/AIR.
There is too much content to show on one page, so I would like the content to overflow vertically and let the user swipe to scroll down, as if they were viewing a website on an iPad.
I am extending the View class to make my screens. I could place every screen in a one-element List or surround everything in a Scroller, but surely there must be a built-in way to do this on the PlayBook?
The BlackBerry documentation is limited and I'm struggling to find a solution. Any ideas?
Thanks
I'm developing my own BlackBerry PlayBook app. Here's some advice:
Take a look at the list view options available. You might be interested in using a TileList.
Look into listening for swipe events to update the position of your list (be it a TileList or a SectionList).
Check out the BlackBerry Tablet documentation. It's not that limited!
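That said, if your content is a single long page rather than list data, the Scroller you mentioned is a perfectly reasonable built-in route. A minimal Spark sketch (untested on the PlayBook specifically):

    <?xml version="1.0" encoding="utf-8"?>
    <s:View xmlns:fx="http://ns.adobe.com/mxml/2009"
            xmlns:s="library://ns.adobe.com/flex/spark"
            title="Long content">
        <!-- Wrapping the view's content in a Spark Scroller gives touch
             fling/scroll on mobile AIR, including the PlayBook. -->
        <s:Scroller width="100%" height="100%">
            <s:VGroup width="100%" paddingLeft="10" paddingRight="10" gap="10">
                <!-- Content taller than the screen goes here. -->
                <s:Label text="Lots of content..." width="100%"/>
            </s:VGroup>
        </s:Scroller>
    </s:View>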