I am trying to set the launch location of Flash Player to my secondary monitor, which is connected with a VGA cable, and I want it to always launch in full-screen mode. After a couple of searches, I found one link that relates to this topic:
http://www.flashdevelop.org/community/viewtopic.php?f=7&t=6658
But it doesn't seem to work for me. How do I set the x,y coordinates of Flash Player when it is launched as a new process?
Adobe AIR has a Screen class to handle this. To help you out, there is an open-source project on Google Code that wraps this and handles some of the basics:
http://code.google.com/p/airscreenmanageras3/
I'm developing a simple Xamarin.Forms app using VS2019 and Oreo as the emulated Android version.
I would like to use my USB Logitech Webcam 200 to simulate the phone camera.
This is basically to avoid deploying the app to the phone over USB and to save time.
I've configured the ADM, adding the hw.camera property and enabling it. Then I tried setting hw.back.camera to webcam0 and hw.front.camera to none, but when I open the camera in the emulated device it gives an error and the camera app crashes; my app doesn't work either.
I've checked the log and mainly found: E/CamDev#3.2-impl(1393): open: cannot open camera 0!
My webcam is fine and its drivers are installed on my Windows 10 machine (I use it regularly with Skype and other apps).
I also tried with VS2017 on my PC, and also tried on another PC with VS2019.
Any idea?
Thank you.
I'm creating an A-Frame VR program on Glitch.com.
I'm told that I can use my iPhone as a VR device.
How do I do that from the Show mode in Glitch.com?
Go to Show mode on your computer or whichever device you are working on. Click on the Share menu directly below it and go to Live App.
Copy that link (which will include a name of your choosing) and open it on your iPhone. You will need a Google Cardboard viewer to access it.
Have a good day.
I have a problem with hot-plugging a touch device.
I set the environment variable as below.
export QT_QPA_EVDEV_TOUCHSCREEN_PARAMETERS=/dev/input/ts_uinput:rotate=0
The "/dev/input/ts_uinput" is created by ts library's application "ts_uinput".
The touch function can work normally before I re-plug the USB touch device.
If I re-plug the USB touch device, the touch function doesn't work.
The "/dev/input/ts_uinput" still is created after I re-plug the USB touch device.
I also monitor the data in "/dev/input/ts_uinput" and it also has data report.
Why the Qt does not get the touch event after re-plug the USB touch device?
I would boldly guess that this is because Qt (the Qt evdev platform plugin) opens /dev/input/ts_uinput when the app starts. When you re-plug the touch device, the file is recreated, but the file handle held by Qt has become invalid. Making it work again would require Qt to close and reopen the handle.
You could try getting more info by enabling debug logs: http://doc.qt.io/qt-5/embedded-linux.html#debugging-input-devices
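If you prefer to switch the logging on from code rather than through the environment, here is a minimal sketch (assuming Qt 5 and the evdev input handling described at the link above); the qt.qpa.input category has to be enabled before the platform plugin initializes:

```cpp
#include <QGuiApplication>
#include <QLoggingCategory>

int main(int argc, char *argv[])
{
    // Enable QPA input logging before QGuiApplication is constructed,
    // so device discovery and touch events are traced from the start.
    QLoggingCategory::setFilterRules(QStringLiteral("qt.qpa.input=true"));

    QGuiApplication app(argc, argv);
    // ... create your UI here ...
    return app.exec();
}
```

The same effect can be achieved by setting QT_LOGGING_RULES=qt.qpa.input=true in the environment; the log should show whether the evdev code notices the device disappearing and reappearing.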
However, I do not know whether this is a bug or a missing feature; you might want to contact the Qt interest mailing list or report a bug.
I was working on a positioning app which needs GPS to be turned on, so I wanted a way to automatically turn on GPS in the background, ideally with high accuracy, or at least have a popup window come up so the user can turn on GPS right away. Sadly, I couldn't find a single way to do this in Qt; whenever there is a solution, it's always written in Java. Can I somehow do it in Qt too, or import Java code into it?
Would I also be able to keep my GPS updates running in the background? As soon as I press the home button, the updates stop showing up in the console via qDebug... or is it just qDebug that can't run when the app is not open?
Will I be able to use GPS in my Qt app?
First, check whether the target platform has GPS and/or is capable of resolving geo-coordinates. Next, check out the manual: Qt Location. Note that they refer to this functionality as 'location', so it may be partially available even without an actual GPS unit on the device if there is another provider type (I guess partially, and not on all platforms). I could only find this list of supported platforms:
Qt Location: Classes for accessing GPS and other location services and for mapping and navigation. Split off from the Qt 4 Mobility module of Qt Location. Supported on Android, BlackBerry, iOS, Linux (using GeoClue), Windows and Sailfish OS.
As for starting the GPS (the location services provider), QML definitely has an entry point for this: the start() method of PositionSource. That implies the same functionality can be found in C++ as well; see the sketch below.
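Here is a minimal C++ sketch of that equivalent, assuming the Qt Positioning module is enabled in the project (QT += positioning) and the platform provides a default position backend:

```cpp
#include <QCoreApplication>
#include <QGeoPositionInfo>
#include <QGeoPositionInfoSource>
#include <QDebug>

int main(int argc, char *argv[])
{
    QCoreApplication app(argc, argv);

    // Create the default position source for this platform (GPS, GeoClue, ...).
    QGeoPositionInfoSource *source = QGeoPositionInfoSource::createDefaultSource(&app);
    if (!source) {
        qDebug() << "No position source available on this platform";
        return 1;
    }

    QObject::connect(source, &QGeoPositionInfoSource::positionUpdated,
                     [](const QGeoPositionInfo &info) {
        qDebug() << "Position:" << info.coordinate();
    });

    source->setUpdateInterval(5000); // milliseconds between updates
    source->startUpdates();          // C++ counterpart of PositionSource.start()

    return app.exec();
}
```

Note that this only starts the provider; on Android the app still needs the location permissions in its manifest, and switching the device's location setting on for the user is a separate, platform-specific step.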
Would I also be able to keep my GPS updates running in the background? As soon as I press the home button, the updates stop showing up in the console via qDebug... or is it just qDebug that can't run when the app is not open?
Home button: does that imply Android? It's not entirely clear what you're asking, but the Android app lifecycle is a somewhat different matter. The GPS will keep working independently of your app, but whether the app still responds to messages is determined by Android.
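One way to observe this from the Qt side is to log application state transitions; a small sketch, assuming a QGuiApplication-based app:

```cpp
#include <QGuiApplication>
#include <QDebug>

int main(int argc, char *argv[])
{
    QGuiApplication app(argc, argv);

    // Log when the OS suspends/resumes the app; updates delivered to the app
    // (and its qDebug output) typically pause while it is suspended.
    QObject::connect(&app, &QGuiApplication::applicationStateChanged,
                     [](Qt::ApplicationState state) {
        qDebug() << "Application state changed, suspended:"
                 << (state == Qt::ApplicationSuspended);
    });

    // ... set up your UI and position source here ...
    return app.exec();
}
```

If the state switches to suspended when you press Home, the process is simply not running its event loop normally, which would explain the qDebug output stopping rather than qDebug itself being unable to run.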
I've searched Google for about 1.5 hours now and I can't find a good answer.
Is it possible to display the Soft Keyboard when debugging using ADL?
If I use the existing properties I still don't see it.
I want to check whether my layout looks right when the soft keyboard appears, but for some reason I don't see it come up in ADL (AIR Debug Launcher).
I don't have a tablet to test on for now, so that isn't a solution.
Am I doing something wrong that prevents the soft keyboard from showing, or does it simply not exist in ADL?
ADL is just the AIR Debug Launcher program. When I use ADL to launch my app on an Android device, there are no issues getting the soft keyboard to show up. When I use ADL to launch an app in the emulator, I have never seen the soft keyboard come up. This is not supported with the emulator included with Flex / Flash Builder.
In my experience, the availability of the soft keyboard depends on the context you're running the app in (emulator vs. device) and has nothing to do with ADL.
Does that answer your question?