How to add close and minimize buttons to a Processing program (Arduino)

I made a program with Processing and an Arduino to control an RGB LED from Processing,
then I exported a .exe application from my Processing code.
The problem is that the application runs without close and minimize buttons.
How can I add them to my application?

Make sure the "Presentation Mode" checkbox is unchecked when you export:
If it is checked, your sketch is displayed in full screen. If it is unchecked, your sketch is displayed in a normal window with close and minimize buttons.

Related

I can run my Qt program (C++) in the system tray, but how do I remove it from the taskbar?

I have successfully compiled and run a program that shows its icon in the system tray.
There is a good example explaining it here:
https://doc.qt.io/qt-5/qtwidgets-desktop-systray-example.html
Now here is my problem: when I run my program, its icon is also still visible in the taskbar.
How do I get rid of this icon in the taskbar?
Why is this important to me? My program will be an alarm clock, so it should be visible on the desktop, but it should not occupy space on the taskbar. This is why I decided to place it in the system tray. So, how do I remove it from the taskbar?
In the example you just need to toggle the corresponding checkbox, which sets the tray icon's visible property to true or false.
This is the line in the example where the checkbox connects to the property:
connect(showIconCheckBox, &QAbstractButton::toggled,
trayIcon, &QSystemTrayIcon::setVisible);
Then you can close the window. The application will still be running because of the line
QApplication::setQuitOnLastWindowClosed(false);
Here is the answer I found:
The information shown in the taskbar about a particular running program is independent of the system tray information. So to make the program's entry (and its icon) absent from the taskbar, we have to work in a different place: we need to change an attribute of the program's main window. In this case it was enough to add the following line in the constructor of the program's window:
setWindowFlags(Qt::Tool);
A tool window is a kind of window whose entry is not placed in the taskbar.
So now it works as I planned.
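Putting the pieces together, a minimal sketch might look like the following. It is only an illustration, not the example program itself: the window type and the stock icon are arbitrary choices, and a real tray app would also add a context menu for quitting.

```cpp
#include <QApplication>
#include <QMainWindow>
#include <QStyle>
#include <QSystemTrayIcon>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);
    // Keep the application alive when the window is closed;
    // only the tray icon remains.
    QApplication::setQuitOnLastWindowClosed(false);

    QMainWindow window;
    // Qt::Tool windows get no taskbar entry on most platforms.
    window.setWindowFlags(Qt::Tool);
    window.show();

    // Any QIcon works here; a stock style icon keeps the sketch self-contained.
    QSystemTrayIcon trayIcon(app.style()->standardIcon(QStyle::SP_ComputerIcon));
    trayIcon.setVisible(true);

    return app.exec();
}
```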

Swift 5 - textField.endEditing(true) crashing the app

I'm using the Number Pad keyboard, and I programmed it so that when there is no text and you press the delete button, the function .endEditing is triggered, but the app crashes with this error:
Push segues can only be used when the source controller is managed by an instance of UINavigationController.
.resignFirstResponder() also crashes with this error.
I've tried adding a delay timer, but it also crashes.
Changing to .endEditing(false) also crashes.
Any suggestions without adding a UINavigationController? I can't afford to use UINavigationController, since my app must use the whole screen and I need all the space on it.
I've added a UINavigationController and I hide it... If there are more options, feel free to share them :)

How to show an external window in a QML-based application

I have an application whose GUI is made with QML. The task is to start an external program (LibreOffice) "inside" my application. That means that when you press a button on the app's face, the external program must be shown in the same window as the main program. It must also be possible to close it with an app button drawn under the external window.
The only thing I have managed so far is to start lowriter with QProcess, following this article. But it is still shown in a separate window, and I don't know how to make a button that will close lowriter.
If anybody has any thoughts about how to do this, it would be great if you shared them. Thanks!
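For the launch-and-close half of the question, a sketch along these lines may help. It uses widgets rather than QML for brevity, and assumes `lowriter` is on the PATH; the `--norestore` flag is just one plausible choice of argument.

```cpp
#include <QApplication>
#include <QProcess>
#include <QPushButton>
#include <QVBoxLayout>
#include <QWidget>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);

    QProcess office;

    QWidget panel;
    auto *layout      = new QVBoxLayout(&panel);
    auto *openButton  = new QPushButton("Open LibreOffice Writer");
    auto *closeButton = new QPushButton("Close LibreOffice Writer");
    layout->addWidget(openButton);
    layout->addWidget(closeButton);

    QObject::connect(openButton, &QPushButton::clicked, [&office] {
        office.start("lowriter", {"--norestore"});
    });
    QObject::connect(closeButton, &QPushButton::clicked, [&office] {
        office.terminate();            // ask the process to exit politely
        if (!office.waitForFinished(3000))
            office.kill();             // force it if it ignores the request
    });

    panel.show();
    return app.exec();
}
```

Two caveats: lowriter may hand off to an already-running soffice process and exit immediately, in which case terminating the QProcess will not close the document window. And for the "same window" half of the requirement, QWindow::fromWinId() together with QWidget::createWindowContainer() can wrap a foreign native window on X11, but you would have to discover LibreOffice's window id yourself.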

Custom Touch behavior in Windows 7 with Qt/QML application

I am developing a touch application for Windows 7 with Qt/QML. The end-user device has Windows 7's native touch behavior, i.e.: when you touch the screen, a point appears at the last-touched point, and when the physical touch ends, Windows moves that point to the now-touched point and runs the on-clicked event.
Compared to the behavior known from standard Windows mouse usage, this leads to different behavior when, for example, clicking a button: a mouse user will expect the button to change to its pressed-down color when the mouse button goes down, and back to the default color when the mouse button goes up.
In my application, I want a customized kind of touch feedback: whatever is currently being touched should be marked with changed button colors, imitating a "mouse down" when the physical touch begins and a "mouse up" when the physical touch ends.
My application will run fullscreen, so one possibility would be to change the system's behavior on application start and change it back to the default on application end.
Such behavior would effectively be the same as the standard behavior on, for example, all Android devices I know.
I searched through the MouseArea and MultiPointTouchArea elements, trying to find a way to make the click-reaction behavior differ from the standard behavior. However, I did not even find a way to capture the beginning of the actual touch: everything I want to happen at the beginning of the touch actually happens when the touch ends.
Edit:
It does not matter whether I use a QML button or a MouseArea with the MouseArea.pressed property: nothing is "pressed" before the finger leaves the screen and the onClicked() event is called.
Possibly related:
Adobe AIR: touch screen doesn't trigger mouse down event correctly - but I did not find a way to access functions like Multitouch.inputMode (mentioned in the first reply) from a native Qt application.
How can I achieve the described behavior for my application?
The solution for this issue is to disable "Press and Hold" for the application. As a system-wide setting, this can be done via:
Control Panel -> Pen and Touch -> Touch -> Press and Hold -> Settings -> uncheck 'Enable press and hold for right-clicking'
The only solution I found to do this in native code is described here:
http://msdn.microsoft.com/en-us/library/ms812373.aspx
I verified that this still works for Windows 7. To get it working with QML, I searched for the QWindow* in QQmlApplicationEngine::rootObjects() and used its winId() as an HWND. With that HWND, I called the TogglePressAndHold function before app.exec().
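The steps above can be sketched as follows. This is Windows-only and assumes the property name and value from the MSDN sample linked above (they are not declared in windows.h); treat the constants and the root-object cast as assumptions to verify against your own setup.

```cpp
#include <windows.h>
#include <QGuiApplication>
#include <QQmlApplicationEngine>
#include <QQuickWindow>

// Name and value taken from the MSDN press-and-hold sample.
#define MICROSOFT_TABLETPENSERVICE_PROPERTY L"MicrosoftTabletPenServiceProperty"
#define TABLET_DISABLE_PRESSANDHOLD 0x00000001

static void disablePressAndHold(HWND hwnd)
{
    // Registering the atom keeps the property name alive, as in the sample.
    GlobalAddAtomW(MICROSOFT_TABLETPENSERVICE_PROPERTY);
    const DWORD_PTR value = TABLET_DISABLE_PRESSANDHOLD;
    SetPropW(hwnd, MICROSOFT_TABLETPENSERVICE_PROPERTY,
             reinterpret_cast<HANDLE>(value));
}

int main(int argc, char *argv[])
{
    QGuiApplication app(argc, argv);
    QQmlApplicationEngine engine;
    engine.load(QUrl(QStringLiteral("qrc:/main.qml")));

    // The top-level item of main.qml is assumed to be a Window.
    auto *window = qobject_cast<QQuickWindow *>(engine.rootObjects().first());
    if (window)
        disablePressAndHold(reinterpret_cast<HWND>(window->winId()));

    return app.exec();
}
```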

Bring a window to the front in Maemo

I've got a Maemo (Qt) app that integrates with the built-in media player via D-Bus. All the control functionality I need is complete, but I have a requirement to show my application window (which gets backgrounded when playback starts) instead of the media player when the playback window is closed (it's a stacked window).
It should go like this: the user clicks an item in my Qt application, which launches the media file in the native media player. The user watches the media file and exits by clicking the arrow on the playback window. I'd like to somehow catch this event and bring my application to the front instead of showing the media player's main window.
Is this even possible on Maemo? I'm thinking some low-level X coding might be required.
The answer was painfully obvious: I can catch a state_changed signal from D-Bus; state=0 when the window is closed.
You can also use the raise() method of Qt windows.
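Combining the two ideas, a sketch could look like this. The D-Bus service, path, and interface names below are placeholders, not the real Maemo media player names; substitute whatever the player actually exports on the session bus.

```cpp
// main.cpp — build with Qt's D-Bus module enabled (QT += widgets dbus)
#include <QApplication>
#include <QDBusConnection>
#include <QMainWindow>

class PlayerWatcher : public QObject
{
    Q_OBJECT
public:
    explicit PlayerWatcher(QMainWindow *w, QObject *parent = nullptr)
        : QObject(parent), m_window(w) {}

public slots:
    void onStateChanged(int state)
    {
        if (state == 0) {            // 0 = playback window closed
            m_window->raise();       // bring our window to the front
            m_window->activateWindow();
        }
    }

private:
    QMainWindow *m_window;
};

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);
    QMainWindow window;
    window.show();

    PlayerWatcher watcher(&window);
    // Placeholder names — replace with the media player's real D-Bus identifiers.
    QDBusConnection::sessionBus().connect(
        "com.example.mediaplayer", "/player", "com.example.mediaplayer",
        "state_changed", &watcher, SLOT(onStateChanged(int)));

    return app.exec();
}

#include "main.moc"
```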
