Swap Gestures in HERE SDK - here-api

I'm trying to remove the default pan behaviour, which is easily done with:
mapView.getGestures().disableDefaultAction(GestureType.PAN);
and use the TWO_FINGER_PAN gesture instead, but I can't quite find a solution for this other than coding the entire animation myself. Is there an easier way? Maybe some source code I couldn't find?
To summarize, I want the TWO_FINGER_PAN gesture to do exactly what the PAN gesture does.

When you disable a gesture, you have to handle the gesture on your side. There is no source code exposed for this as part of the HERE SDK. But you may want to look for native Android gesture handling examples.
The Gestures example app that is shipped with the HERE SDK provides at least a starting point for custom gesture animations with the GestureAnimator class, but for a pan gesture you would usually not need any animation: a finger movement simply translates into a change of the MapView's target coordinates.
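A rough sketch of that handling could look like the following, assuming the TwoFingerPanListener callback shape used by the 4.x Gestures example (the exact listener signature and camera accessors may vary between releases):

import com.here.sdk.core.GeoCoordinates;
import com.here.sdk.core.Point2D;
import com.here.sdk.gestures.GestureState;
import com.here.sdk.gestures.GestureType;
import com.here.sdk.gestures.TwoFingerPanListener;
import com.here.sdk.mapview.MapView;

public class PanGestureSwap {

    // Disables the default one-finger pan and re-implements panning on the
    // two-finger gesture by shifting the camera target with the touch center.
    public static void swapPanGesture(final MapView mapView) {
        mapView.getGestures().disableDefaultAction(GestureType.PAN);

        mapView.getGestures().setTwoFingerPanListener(new TwoFingerPanListener() {
            private Point2D lastCenter;

            @Override
            public void onTwoFingerPan(GestureState state, Point2D origin,
                                       Point2D translation, float velocity) {
                // Current touch center = gesture origin plus accumulated translation.
                Point2D center = new Point2D(origin.x + translation.x,
                                             origin.y + translation.y);
                if (state == GestureState.BEGIN) {
                    lastCenter = center;
                } else if (state == GestureState.UPDATE && lastCenter != null) {
                    GeoCoordinates from = mapView.viewToGeoCoordinates(lastCenter);
                    GeoCoordinates to = mapView.viewToGeoCoordinates(center);
                    GeoCoordinates target = mapView.getCamera().getState().targetCoordinates;
                    if (from != null && to != null) {
                        // Move the target against the finger movement, as PAN would.
                        mapView.getCamera().lookAt(new GeoCoordinates(
                                target.latitude + (from.latitude - to.latitude),
                                target.longitude + (from.longitude - to.longitude)));
                    }
                    lastCenter = center;
                } else {
                    lastCenter = null; // END or CANCEL
                }
            }
        });
    }
}

This gives plain panning without the kinetic fling of the default PAN gesture; reproducing that effect is where the GestureAnimator approach from the example app comes in.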

Could you confirm which HERE SDK version you are using? To explain why this function is not available: the pan gesture is designed for a single finger and includes kinetic panning, while a two-finger pan behaves differently, so the best user experience is still with normal panning.
Also, two-finger gestures are already used for tilting and rotating the map, so panning with two fingers can be confusing for users.
Detailed information about the default behaviour in 4.x is explained at the link below:
https://developer.here.com/documentation/android-sdk-navigate/4.7.6.0/dev_guide/topics/gestures.html

Related

glutPassiveMotionFunc in Qt

I'm trying to make an 'fps' camera in OpenGL using Qt. I'm able to move the mouse and rotate the camera accordingly, the only problem I'm having is that I have to click and drag in order for it to happen (the mouseMoveEvent is only called when the mouse is being pressed). Now I've been searching around all day, and there is a lot of conflicting and outdated information about OpenGL on the internet. My question is really quite simple, does the QOpenGLWidget have some functionality similar to glutPassiveMotionFunc, or do I have to install the glut library to get this functionality? Other suggestions to get this functionality (other, better documented libraries for example) are also welcome. In case I have to install the glut library, it would also be amazing if someone would have documented a proper way of doing this, because I seem to find a million different ways, all equally hard to understand.
By default, Qt only delivers mouse move events while a mouse button is pressed. Tracking of all mouse movement has to be turned on via the mouseTracking property:
yourOpenGLWidget->setMouseTracking(true);
Once tracking is enabled, mouseMoveEvent is called whenever the cursor moves, even with no button held.

replace WASD keys navigation with VR tracked controllers a-frame

I've developed an A-Frame scene in a different location from where I will be able to use a headset (either Oculus or HTC).
Is the tracked controller functionality built into aframe 0.7.0?
Is there code I need to add to detect these controllers and replace the desktop WASD navigation with the tracked controllers? I don't need any hands to be visible I just need to achieve the up/down/left/right movement in space.
Thanks
Don McCurdy's aframe-extras includes a component called universal-controls that I highly recommend. Specifically, there is a gamepad-controls component that may do exactly what you're looking for right out of the box.
If not, universal-controls supports extending the main component with "custom" controllers. The ability to do so is lightly documented on the repository page, but it's pretty straightforward. I'm working on one for the GearVR controller that responds to pressing the GearVR trackpad to achieve movement. I still need to work on getting backward movement, but you can find my work so far on GitHub.
Once you've developed your own custom controller, (or decided to use mine, or whatever), you attach it to your scene's camera, like this:
<a-entity
  id="scene-camera"
  camera="userHeight: 1.6"
  position="24 1.6 14"
  universal-controls="movementControls: universal-gearvr, keyboard;"
  universal-gearvr-controls>
</a-entity>
Things to note from above: rather than the default setting (which will attempt to load all movement controls schemes that are available), I'm telling the universal-controls component to use my custom component by giving its name in the movementControls parameter. Notice that I leave off 'controls' from the name, though; that's because universal-controls adds it back later. With that said, I also attach my custom component (universal-gearvr-controls) to the camera, which must be done so that universal-controls can find and use it.
A quick note though, on enabling backward movement, if that's something you're interested in. I've already done it by hacking around with the original WASD movement script. You can take a look at what I did if you'd like to see that.

pointerHover is not being called?

In the Eclipse simulator environment, I don't seem to be receiving any pointerHover calls on my components. Is there something I need to do to arm them?
[edit/response]
No one will find it acceptable to deliver desktop applications that don't support mouse movement input.
Likewise, there are many new input devices on the horizon which will require specialized support, but will all expect mouse simulation as the baseline. For example, gesture capture by Kinect, Leap Motion, or VR headsets will need to feed standard events to applications that have not been specifically rewritten to use them.
We don't support those events since there is no pointer hover on the device. We might support it in the future for desktop builds, but even then the simulator will not send those events.
We might support this for the new Apple 3D Touch API, although I think this might be too simplistic.
This API was originally introduced for the BlackBerry 5 device, which had a "click screen" that allowed hovering and clicking.
You should use pointerDrag events instead, as they are representative of all devices.
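As a minimal illustration of the pointerDrag approach (the Form listener and the pointerDragged override are standard Codename One API; the logging is just placeholder handling):

import com.codename1.ui.Form;
import com.codename1.ui.Label;
import com.codename1.ui.layouts.BorderLayout;

public class DragDemo {

    public static void show() {
        Form form = new Form("Drag", new BorderLayout());

        // Form-level listener: fires repeatedly while the pointer is dragged.
        form.addPointerDraggedListener(evt ->
                System.out.println("Dragged to " + evt.getX() + ", " + evt.getY()));

        // Component-level hook: override pointerDragged on the component itself.
        form.add(BorderLayout.CENTER, new Label("Drag over me") {
            @Override
            public void pointerDragged(int x, int y) {
                super.pointerDragged(x, y);
                System.out.println("Label drag at " + x + ", " + y);
            }
        });

        form.show();
    }
}

Both hooks fire in the simulator and on device, which is exactly why pointerDrag is the safer baseline than pointerHover.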

Leap Motion SwipeGesture

I am looking into the Leap Motion controller. I want to integrate it into my app for the sole reason of gestures: up, down, left, right.
Will SwipeGesture be the correct method to use? Do people have trouble using it? (i.e., does it work only half the time?)
Swipe gestures are pretty robust. You can set parameters for the minimum speed and length before a swipe gesture will be recognized, to fine-tune things for your app (except in JavaScript, where this isn't supported).
The Swipe gesture is probably the easiest way, but you could also do your own recognition using the hand and finger positions the API also provides.
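For example, with the V2 Java bindings, enabling the gesture, tuning the thresholds, and classifying the four directions could look roughly like this (the Gesture.Swipe.* config keys come from the Leap Motion documentation; the threshold values here are arbitrary):

import com.leapmotion.leap.Controller;
import com.leapmotion.leap.Frame;
import com.leapmotion.leap.Gesture;
import com.leapmotion.leap.Listener;
import com.leapmotion.leap.SwipeGesture;
import com.leapmotion.leap.Vector;

public class SwipeListener extends Listener {

    @Override
    public void onConnect(Controller controller) {
        controller.enableGesture(Gesture.Type.TYPE_SWIPE);
        // Tune recognition: minimum length (mm) and speed (mm/s) for a swipe.
        controller.config().setFloat("Gesture.Swipe.MinLength", 100.0f);
        controller.config().setFloat("Gesture.Swipe.MinVelocity", 750.0f);
        controller.config().save();
    }

    @Override
    public void onFrame(Controller controller) {
        Frame frame = controller.frame();
        for (Gesture gesture : frame.gestures()) {
            if (gesture.type() == Gesture.Type.TYPE_SWIPE) {
                SwipeGesture swipe = new SwipeGesture(gesture);
                Vector dir = swipe.direction();
                // Classify by the dominant axis: x = left/right, y = up/down.
                if (Math.abs(dir.getX()) > Math.abs(dir.getY())) {
                    System.out.println(dir.getX() > 0 ? "right" : "left");
                } else {
                    System.out.println(dir.getY() > 0 ? "up" : "down");
                }
            }
        }
    }

    public static void main(String[] args) throws Exception {
        Controller controller = new Controller();
        SwipeListener listener = new SwipeListener();
        controller.addListener(listener);
        System.in.read(); // keep the process alive until Enter is pressed
        controller.removeListener(listener);
    }
}

swipe.direction() is a unit vector in Leap's coordinate system (x to the right, y up, z toward the user), so taking the dominant axis yields the up/down/left/right classification the question asks for.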

Microsoft Kinect SDK Zoom/Scroll gestures

We are writing software to enable mouse control via Kinect on Windows. Basic mouse functions are working well.
Now we want to add things like zoom and pan.
These functions are not applicable to every application, and it might be that this is only possible on an application-by-application basis, or perhaps there is a general way.
In basic terms, we would like to be able to use our Kinect software to pan and zoom in Google Maps/Earth, and in any app where there is a vertical/horizontal scrollbar.
We would also like to implement rotation, using a two-handed gesture to define the rotation axis. Again, this does not apply to all software.
Any ideas?
