Is it possible to run JavaFX without an X server? - javafx

My plan is to run a JavaFX application on a Raspberry Pi. The usual way is to start the desktop GUI, which is sent to the HDMI port and shows the whole desktop on the TV.
However, I do not want to see the desktop or anything else on the TV other than the content rendered by my application. Is it possible to send the rendered output directly to the HDMI port?

Yes, you need the ARM port of Java 8, where JavaFX renders directly into the framebuffer, so no X server is required.
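For example, with the Oracle JDK 8 for ARM builds that bundle JavaFX, the application is launched straight from the Linux console (no X running) and draws via EGL/framebuffer. A rough sketch of such a launch (the jar and class names are placeholders, and the exact platform flag depends on the JDK/OpenJFX build; on newer builds the framebuffer backend is the default):
sudo java -Djavafx.platform=eglfb -cp MyApp.jar com.example.MainApp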

Related

Select Wayland output for Qt application

I am running Qt 5.11.3 on a TI embedded platform (AM5728). My display manager is Wayland 1.16 (Weston). The QPA plugin for my application is wayland-egl.
The embedded system has both an LCD panel and an HDMI output. Both are recognized and treated as independent outputs in Wayland.
My question is: how can I programmatically choose which Wayland output my Qt application starts on?
I found that if I use a mouse, the application starts on the output the mouse is currently active on, but I need the application to start on a specific output (without a mouse).
Are there any back-end QPA-related calls or settings that can be configured to allow this?
Thanks!
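On the Qt side, the generic way to target a particular output is to look up the matching QScreen and assign it to the window before showing it; whether the Wayland compositor honours the request is up to the compositor. A minimal sketch, assuming a hypothetical output name "HDMI-A-1" (check what QGuiApplication actually reports on the target):

#include <QGuiApplication>
#include <QRasterWindow>
#include <QScreen>

int main(int argc, char *argv[])
{
    QGuiApplication app(argc, argv);

    // Find the desired output by its QScreen name.
    QScreen *target = nullptr;
    for (QScreen *s : QGuiApplication::screens()) {
        qInfo("available output: %s", qPrintable(s->name()));
        if (s->name() == QLatin1String("HDMI-A-1"))   // hypothetical name, adjust for your system
            target = s;
    }

    QRasterWindow window;
    if (target)
        window.setScreen(target);   // ask Qt to place the window on that output
    window.showFullScreen();

    return app.exec();
}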

Problem emulating phone camera with a normal USB webcam

I'm developing a simple Xamarin.Forms app using VS2019 and Oreo as the emulated Android version.
I would like to use my USB Logitech Webcam 200 to simulate the phone camera.
This is basically to avoid deploying the app to the phone over USB, to save time.
I've configured the ADM, adding the hw.camera property and enabling it. Then I tried to set hw.back.camera to webcam0 and hw.front.camera to none, but when I open the camera in the emulated device it gives an error and the camera app crashes; my app doesn't work either.
I checked the log and the main error I found is: E/CamDev#3.2-impl(1393): open: cannot open camera 0!
My webcam is fine and its drivers are installed on my Windows 10 machine (I use it regularly with Skype and other apps).
I also tried with VS2017 on my PC, and with VS2019 on another PC.
Any idea?
Thank you.
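For reference, the camera mapping usually ends up in the AVD's config.ini; a sketch of the relevant lines as described above (the usual key spelling is hw.camera.back / hw.camera.front, and the values here are assumptions):

hw.camera.back=webcam0
hw.camera.front=none

The same mapping can also be passed when starting the emulator from the command line with -camera-back webcam0.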

Why does Qt not capture touch events after re-plugging a USB touch device?

Have a good day.
I have a problem with a touch device's hot-plug.
I set the environment variable as below:
export QT_QPA_EVDEV_TOUCHSCREEN_PARAMETERS=/dev/input/ts_uinput:rotate=0
The "/dev/input/ts_uinput" is created by ts library's application "ts_uinput".
The touch function can work normally before I re-plug the USB touch device.
If I re-plug the USB touch device, the touch function doesn't work.
The "/dev/input/ts_uinput" still is created after I re-plug the USB touch device.
I also monitor the data in "/dev/input/ts_uinput" and it also has data report.
Why the Qt does not get the touch event after re-plug the USB touch device?
I would boldly guess that this is because Qt (the Qt evdev input handler) opens /dev/input/ts_uinput when the app starts. When you re-plug the touch device, the file is recreated, but the file handle held by Qt has become invalid. Making it work again would require Qt to close and reopen the handle.
You could try to get more information by enabling the input debug logs: http://doc.qt.io/qt-5/embedded-linux.html#debugging-input-devices
However, I do not know whether this is a bug or a missing feature; you might want to ask on the Qt interest mailing list or report a bug.
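To illustrate the failure mode outside of Qt, here is a minimal sketch of a plain evdev reader (the device path is taken from the question; this is not Qt's actual code): a process that keeps one file descriptor open stops receiving events once the node behind it is destroyed and recreated, and has to detect the error and reopen the new node.

#include <fcntl.h>
#include <unistd.h>
#include <linux/input.h>
#include <cstdio>

int main()
{
    const char *path = "/dev/input/ts_uinput";  // device node from the question
    int fd = open(path, O_RDONLY);              // opened once at startup, as Qt does
    if (fd < 0) { perror("open"); return 1; }

    struct input_event ev;
    for (;;) {
        ssize_t n = read(fd, &ev, sizeof ev);
        if (n == (ssize_t)sizeof ev) {
            // Normal event delivery.
            printf("type=%d code=%d value=%d\n", ev.type, ev.code, ev.value);
        } else {
            // After the USB device is re-plugged the old node is gone, so read()
            // fails (typically ENODEV) even though a new /dev/input/ts_uinput exists.
            // Recovering requires closing the stale descriptor and reopening the node.
            perror("read");
            close(fd);
            fd = open(path, O_RDONLY);
            if (fd < 0) { perror("reopen"); return 1; }
        }
    }
}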

In Qt embedded 4.8.4, USB keyboard doesn't work properly

When I start my embedded Qt GUI application from a serial port or from a terminal via SSH and then type on the wireless USB keyboard, I see almost no input on the console screen. I have set
export QWS_KEYBOAD=TTY:/dev/tty1
or
export QWS_KEYBOAD=linuxinput:/dev/input/event2
for
/dev/input/by-path/platform-ehci-omap.0-usb-0:2.2:1.0-event-kbd -> ../event2, but it doesn't work either.
But if I start my application from the console screen, which is /dev/tty1, it works fine.
My issue is that my application starts automatically after reboot, and then it doesn't respond to keyboard input. Any ideas?
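For reference, the environment variable Qt/Embedded 4.8 reads to select the keyboard driver is spelled QWS_KEYBOARD, e.g. with the linuxinput driver (event2 taken from the question, adjust to the actual keyboard node):
export QWS_KEYBOARD=linuxinput:/dev/input/event2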

Using x11vnc as VNC server, but a really simple Qt-embedded app hangs after a key press

I am trying to use the TightVNC viewer to connect to my VNC server run by x11vnc. Since there is no X server on my embedded device, I start x11vnc with the following arguments:
./x11vnc -rawfb console -pipeinput UINPUT:touch,tslib_cal=/etc/pointercal,direct_abs=/dev/input/event1
I built a very simple Qt-embedded app which only has a push button and a line edit. From the TightVNC viewer I can use the mouse to click the push button and it shows a message box, as designed. However, the app hangs when I press a key on the keyboard.
The VNC connection itself is working, because when I restart the app I can still control it from the viewer.
Because the app is such a simple one, I tend to think this is a bug in Qt. If so, is there any way to avoid or work around it? Or is there another way to control a Qt-embedded app with mouse and keyboard via VNC (password protection is also necessary)?
The Qt-embedded version is 4.8.3.
Use Qt's built-in VNC screen driver with Qt 4.8.4 and the QWS platform - it works well.
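A sketch of how that is typically started (the application name is a placeholder; the QWS VNC screen driver is selected with the -display argument, and a VNC viewer then connects on the standard VNC port):
./myapp -qws -display VNC:0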
