Detecting Mouse Events with JavaFX

I am developing an on-screen music keyboard using JavaFX under Kotlin. The graphics and layout are in place, using Buttons with appropriate images. The buttons are added to a Group which is in a StackPane.
I can detect simple mouse press and release events to trigger the appropriate MIDI note-on/note-off messages. To this point everything is working fine, except that the keyboard is monophonic (i.e. only able to play a single note at a time).
I wish to extend the keyboard to play multiple notes. The sequence of gestures is as follows:
Mouse press on an initial key triggers the first note. If the mouse enters an adjacent key with the button down, the original note should continue while a new note is triggered. Whenever the button is released or the mouse is no longer above any key, all notes should stop.
I implemented something similar a decade ago using Swing, but the JavaFX events do not operate the same way.
I can detect the initial mouse pressed event; however, as long as the button is down, MouseEntered events are not detected when moving to another key. I have also tried MouseMoved, MouseDragEntered and DragDetected with no luck. It is as if the initial MousePressed event is blocking all other events until the button is released. Any suggestions? Thanks
Code Snippet
fun setKeyListeners(b: Button, keynumber: Int) {
    b.setOnMousePressed { _ ->
        node as VirtualKeyboard
        val velocity = 64
        node.triggerOn(keynumber, velocity)
        status("key: ${node.lastKeynumber} vel: ${node.lastVelocity}")
    }
    b.setOnMouseReleased { _ ->
        node as VirtualKeyboard
        node.triggerOff(keynumber)
    }
    // b.setOnMouseMoved { _ -> println("Moved $keynumber") }
    // b.setOnMouseDragged { _ -> println("Dragged $keynumber") }
    // b.setOnMouseEntered { _ -> println("Entered $keynumber") }
}
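For reference, JavaFX delivers every event of a simple press-drag-release gesture to the node where the press occurred, which is why adjacent keys never see MouseEntered while the button is down; the full press-drag-release gesture (startFullDrag() from a DragDetected handler, after which MouseDragEntered/MouseDragReleased are delivered to the nodes under the cursor) is the mechanism aimed at this situation. A minimal sketch, reusing node, VirtualKeyboard, triggerOn/triggerOff and keynumber from the snippet above; the exact note on/off policy is left to the application:

fun setKeyListeners(b: Button, keynumber: Int) {
    val keyboard = node as VirtualKeyboard
    val velocity = 64

    b.setOnMousePressed { _ -> keyboard.triggerOn(keynumber, velocity) }

    // Switch to the full press-drag-release gesture; without this, all mouse
    // events keep going to the key where the press started.
    b.setOnDragDetected { _ -> b.startFullDrag() }

    // Fires on an adjacent key when the still-pressed mouse slides onto it.
    b.setOnMouseDragEntered { _ -> keyboard.triggerOn(keynumber, velocity) }

    // MouseReleased is delivered to the key where the gesture started,
    // MouseDragReleased to the key under the cursor at release time.
    b.setOnMouseReleased { _ -> keyboard.triggerOff(keynumber) }
    b.setOnMouseDragReleased { _ -> keyboard.triggerOff(keynumber) }
}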

Related

JavaFX long touch: hiding the touch square effect

I am creating a JavaFX application in which I am implementing touch pressed and released events. During a long touch press, a square box appears as an effect of the long touch event, and it does not go away when I release the touch.
In the image below you can see that the square box is the effect of the touch event; when I then touch the screen somewhere else it goes away.
Below is my touch listener code:
javafx.event.EventHandler<TouchEvent> buttonStartZoomInPressed = new javafx.event.EventHandler<TouchEvent>() {
    public void handle(TouchEvent e) {
        startZoomIn();
    }
};
buttonStartZoomIn.setOnTouchPressed(buttonStartZoomInPressed);

How can I implement drag and drop in an Urho 3D view?

I have added 3D view objects using UrhoSharp to my Xamarin UWP/iOS/Android project. The only event that works is the touch event, but I also want to use drag and drop so that the objects can be moved to different locations within the 3D view. Any suggestions?
https://us.v-cdn.net/5019960/uploads/editor/ni/u16pg79v2m62.png
I haven't used UrhoSharp yet, but here are some suggestions about using drag and drop; I'm not sure whether they help in your case.
UrhoSharp: Basic Actions
The UrhoSharp documentation explains some basic actions, but drag and drop is not among them. Maybe you can achieve it by combining actions with the drag methods on each platform, but that will require some experimenting.
UWP: reference link here
Here's an overview of what you need to do to enable drag and drop in your app:
Enable dragging on an element by setting its CanDrag property to true.
Build the data package. The system handles images and text automatically, but for other content you'll need to handle the DragStarted and DragCompleted events and use them to construct your own data package.
Enable dropping by setting the AllowDrop property to true on all the elements that can receive dropped content.
Handle the DragOver event to let the system know what type of drag operations the element can receive.
Process the Drop event to receive the dropped content.
Code example:
<Grid AllowDrop="True" DragOver="Grid_DragOver" Drop="Grid_Drop"
      Background="LightBlue" Margin="10,10,10,353">
    <TextBlock>Drop anywhere in the blue area</TextBlock>
</Grid>

private void Grid_DragOver(object sender, DragEventArgs e)
{
    e.AcceptedOperation = DataPackageOperation.Copy;
}
iOS: reference link here
With drag and drop in iOS, users can drag items from one onscreen location to another using continuous gestures. A drag-and-drop activity can take place in a single app, or it can start in one app and end in another.
Use drag items to convey data representation promises between a source app and a destination app.
Adopt drag interaction APIs to provide items for dragging.
Adopt drop interaction APIs to selectively consume dragged content.
This demonstrates how to enable drag and drop for a view such as a UIImageView instance.
Example code:
func customEnableDragging(on view: UIView, dragInteractionDelegate: UIDragInteractionDelegate) {
    let dragInteraction = UIDragInteraction(delegate: dragInteractionDelegate)
    view.addInteraction(dragInteraction)
}

func dragInteraction(_ interaction: UIDragInteraction, itemsForBeginning session: UIDragSession) -> [UIDragItem] {
    // Cast to NSString is required for NSItemProviderWriting support.
    let stringItemProvider = NSItemProvider(object: "Hello World" as NSString)
    return [
        UIDragItem(itemProvider: stringItemProvider)
    ]
}
Here is a sample for Xamarin.iOS.
Alternatively, you can use a UIPanGestureRecognizer on iOS to move the view; see Walkthrough: Using Touch in Xamarin.iOS. All you need to do is update view.center as the pan gesture changes.
Android: reference link here
With the Android drag/drop framework, you can allow your users to move data from one View to another using a graphical drag and drop gesture. The framework includes a drag event class, drag listeners, and helper methods and classes.
There are basically four steps or states in the drag and drop process:
Started: In response to the user's gesture to begin a drag, your application calls startDrag() to tell the system to start a drag.
Continuing: The user continues the drag.
Dropped: The user releases the drag shadow within the bounding box of a View that can accept the data.
Ended: After the user releases the drag shadow, and after the system sends out (if necessary) a drag event with action type ACTION_DROP, the system sends out a drag event with action type ACTION_DRAG_ENDED to indicate that the drag operation is over.
(The table of DragEvent action types can be found in the linked documentation.)
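If you end up going through the native Android framework from your platform project, a minimal Kotlin sketch of those four steps might look like the following (draggableView and dropTarget are placeholder names; the payload and drag shadow are kept as simple as possible):

import android.content.ClipData
import android.view.DragEvent
import android.view.View

// Started: begin the drag from a long press on the source view.
fun enableDrag(draggableView: View) {
    draggableView.setOnLongClickListener { v ->
        val data = ClipData.newPlainText("label", "payload")
        val shadow = View.DragShadowBuilder(v)
        // startDragAndDrop replaces the deprecated startDrag on API 24+.
        v.startDragAndDrop(data, shadow, null, 0)
        true
    }
}

// Continuing / Dropped / Ended: the drop target listens for the drag events.
fun enableDrop(dropTarget: View) {
    dropTarget.setOnDragListener { _, event ->
        when (event.action) {
            DragEvent.ACTION_DRAG_STARTED -> true   // we can accept this drag
            DragEvent.ACTION_DROP -> {              // Dropped: read the payload
                val text = event.clipData.getItemAt(0).text
                true
            }
            DragEvent.ACTION_DRAG_ENDED -> true     // Ended
            else -> true
        }
    }
}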
Alternatively, on Android you can use onTouchEvent to move a view; you have to calculate the view's position yourself. See Walkthrough - Using Touch in Android.
The main thing is to handle the press and move messages by overriding onTouchEvent. The maths is a simple translation: record the coordinates at ACTION_DOWN, then at ACTION_MOVE calculate the translation from the current position and the position where the press happened, and refresh the view so it is redrawn at the offset position.
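A rough Kotlin sketch of that idea, using a hypothetical custom view and applying the offset through translationX/translationY rather than a manual redraw:

import android.content.Context
import android.view.MotionEvent
import android.view.View

// Hypothetical draggable view: remembers where the finger went down and
// translates the view by the finger's movement on every ACTION_MOVE.
class DraggableView(context: Context) : View(context) {
    private var lastX = 0f
    private var lastY = 0f

    override fun onTouchEvent(event: MotionEvent): Boolean {
        when (event.action) {
            MotionEvent.ACTION_DOWN -> {          // record the press position
                lastX = event.rawX
                lastY = event.rawY
            }
            MotionEvent.ACTION_MOVE -> {          // translation = current - last
                translationX += event.rawX - lastX
                translationY += event.rawY - lastY
                lastX = event.rawX
                lastY = event.rawY
            }
        }
        return true
    }
}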
There is also a discussion about drag and drop in Xamarin.Forms that may be helpful.

Mouse Listeners in Gluon

I've been trying to use listeners on the Gluon CharmListView for a while. It didn't work in my project, so I decided to try it on the FIFTY STATES sample app. I added the code below:
charmListView.onMouseClickedProperty().set((MouseEvent event) -> {
    Logger.getGlobal().log(Level.INFO, "Pick: {0}", new Object[]{event.getPickResult()});
});
When I launch the application, no click fires a MOUSE_CLICKED event. When I scroll down slightly so that a list header cell is fully docked,
the CharmListView fires the event only on a click on the top header cell.
INFO: Pick: PickResult [node = VBox#49f31558[styleClass=text-box], point = Point3D [x = 133.0, y = 13.0, z = 0.0], distance = 1067.366530964699
No other click anywhere else on the list fires an event.
I've tried adding the same listener to the normal ListView and a MouseEvent is always fired after a click on any area of the ListView. So now I'm stuck because I cannot set a listener to get a selected item.
The CharmListView control is mainly intended for mobile applications, where you use scroll and swipe gestures. But these gestures also generate mouse pressed and clicked events.
If the list cells contained event handlers processing those events, the only way scrolling could work is by consuming the events; otherwise, whenever you started scrolling the list, the cells' event handlers would be fired as well.
That is the reason why setOnMouseClicked() doesn't receive any event when you click on the list view.
For accessing the list view selection model, please refer to this question.
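As a sketch of what that typically looks like (written in Kotlin, like the first snippet on this page): it assumes charmListView is the CharmListView from the question and that it exposes a selectedItemProperty(), as the referenced answer describes; that property name is an assumption here, not verified against the Gluon API docs.

// Sketch only: react to selection changes instead of raw mouse clicks.
// Assumes CharmListView exposes selectedItemProperty() (see lead-in above).
charmListView.selectedItemProperty().addListener { _, _, selected ->
    if (selected != null) {
        Logger.getGlobal().log(Level.INFO, "Selected: {0}", selected)
    }
}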

How to completely disable the focus system

I'd like to disable the focus system in a JavaFX application. That means no focus traversal and no control able to receive focus, so no control would ever show the focus decoration or receive keyboard events. All keyboard events should be caught and processed by the root node (or stage).
What should I extend/rewrite to disable focus system?
To process all keyboard events in the root node, do
root.addEventFilter(KeyEvent.ANY, e -> {
    // handle keyboard event...
    e.consume();
});
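If you also want to stop controls from being reachable through keyboard traversal and from showing the focus decoration, one rough option (a sketch in Kotlin, not a complete shutdown of the focus system) is to clear focusTraversable on every node once the scene graph is built; the disableFocusFor helper name is just for illustration:

// Sketch: recursively turn off focus traversal for every node in the scene,
// so Tab/arrow traversal can no longer move focus onto any control.
// Keyboard events for controls focused by a mouse click are still intercepted
// by the root event filter above.
fun disableFocusFor(node: javafx.scene.Node) {
    node.isFocusTraversable = false
    if (node is javafx.scene.Parent) {
        node.childrenUnmodifiable.forEach { disableFocusFor(it) }
    }
}

// Usage, once the scene has been created:
disableFocusFor(scene.root)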

Qt - top level widget with keyboard and mouse event transparency?

I want an app's main window to ignore mouse and keyboard events, passing them to applications underneath it in the window manager Z-order.
I see how to make child widgets ignore keyboard or mouse events, but how about the main window?
I'm trying to make a desktop widget that always sits just over the background and is totally invisible to keyboard and mouse events. (Pass through)
Qt::X11BypassWindowManagerHint gets me keyboard pass-through (although sadly X11-specific, but fine for now), so how about mouse events?
Is there a OS-agnostic way to be transparent to keyboard events?
EDIT:
The key word here is transparency.
I don't want to EAT mouse and keyboard events, I want the window manager to know I don't want them at all. Those events should be directed to whatever application is under me in the Z-order.
For example, I want to be able to click on desktop icons that are covered by my widget and interact with them as if the widget was not there.
I found the following solution (tested on Linux, also works on Windows according to #TheSHEEEP):
setWindowFlags(windowFlags() | Qt::WindowTransparentForInput);
It was added in a more recent Qt release (I did not find exactly when); see http://doc.qt.io/qt-5/qt.html
On Windows you can set WS_EX_TRANSPARENT
To do this in Qt use the following code:
Include the header,
#if _WIN32
#include <windows.h>
#endif
and put the following code into the constructor.
#if _WIN32
HWND hwnd = (HWND) winId();
LONG styles = GetWindowLong(hwnd, GWL_EXSTYLE);
SetWindowLong(hwnd, GWL_EXSTYLE, styles | WS_EX_TRANSPARENT);
#endif
Maybe what you want is
widget->setAttribute(Qt::WA_TransparentForMouseEvents)
? That's what QRubberBand uses to let its parent handle the mouse events. As for keyboard events, a QWidget doesn't get any keyboard events unless it has set itself a focusPolicy().
setFocusPolicy( Qt::NoFocus );
should therefore take care of the keyboard events.
Use Qt's event filters: they will allow your application to eat whichever events you specify (i.e. keyboard and mouse events) but still process other events such as paint events.
bool FilterObject::eventFilter(QObject* object, QEvent* event)
{
    // QEvent is not a QObject, so qobject_cast cannot be used here; dynamic_cast works.
    QKeyEvent* pKeyEvent = dynamic_cast<QKeyEvent*>(event);
    QMouseEvent* pMouseEvent = dynamic_cast<QMouseEvent*>(event);
    if (pKeyEvent || pMouseEvent)
    {
        // eat all keyboard and mouse events
        return true;
    }
    return FilterObjectParent::eventFilter(object, event);
}
Maybe I'm missing something here, but have you tried subclassing the QMainWindow class and overriding the QWidget::event() method to always return false? If you need to handle some events, you could add that intelligence here as well.
This technique should allow you to inspect the events coming into the application and ignore them if desired, without having to eat them using an event filter.
If this doesn't work you could attempt to redirect the events to the desktop by calling QCoreApplication::notify() and passing the event to the desktop widget obtained by calling QApplication::desktop(). I have no idea if this would work, but it seemed like it might be worth giving a try.
I think that overriding is supposed to work:
bool YourMainWindow::event( QEvent *event )
{
    event->accept();
    return true;
}
that's some of what the QWidget class documentation says about the event() member function:
This function returns true if the event was recognized, otherwise it returns false. If the recognized event was accepted (see QEvent::accepted), any further processing such as event propagation to the parent widget stops.
