In our app we use multiple floating action buttons. With Gluon 4.4.4 we added them as layers, which had the following behaviour:
Each view had its own FAB.
When changing the view, the FAB was hiding.
When changing back to the view, the FAB was showing.
Now, with Gluon 5.0.0 (FABs as objects), the following happens:
The buttons get stacked on top of each other when they are not hidden.
When they are hidden and we change back to a view, the FAB is no longer showing.
This led to a lot of trouble and unnecessary lines of code. How can we improve this, or how is it intended to be used?
We could imagine using one FAB for the whole app and exchanging its content for each view; however, that ends up being an even bigger mess, since the FAB would have to be declared public, etc.
Any help is appreciated :-)
You are looking for the new FloatingActionButton#showOn(View) method.
This method makes sure to automatically show and hide the FAB depending on the View's showing property, removing most of the boilerplate code required in earlier versions to achieve the same functionality.
From the Javadocs:
Makes sure that the FAB is automatically shown when the supplied view is shown. The FAB also automatically hides when the view is hidden. This allows the developer to not worry about calling show() and hide() methods explicitly.
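For illustration, here is a minimal sketch of how that can look, assuming the Gluon Mobile 5.x Glisten API (FloatingActionButton, MaterialDesignIcon and View from com.gluonhq.charm.glisten.*); the icon and action handler are placeholders:

import com.gluonhq.charm.glisten.control.FloatingActionButton;
import com.gluonhq.charm.glisten.mvc.View;
import com.gluonhq.charm.glisten.visual.MaterialDesignIcon;

public class HomeView extends View {

    public HomeView() {
        // One FAB owned by this view; icon and action are placeholders.
        FloatingActionButton fab = new FloatingActionButton(
                MaterialDesignIcon.ADD.text,
                e -> System.out.println("FAB tapped"));

        // Ties the FAB to this view's showing property: it appears when the
        // view is shown and hides again when the view is hidden, so there is
        // no need to call show()/hide() manually.
        fab.showOn(this);
    }
}

With that in place, each view can keep its own FAB without the buttons stacking or having to be exposed publicly.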
I have an iPad app (in C#) with a custom UIView that accepts input via touch and Apple Pencil stylus touches. I am trying to integrate support for indirect trackpad/mouse input (the cursor, or "pointer" as Apple calls it).
I got hover working using HoverGestureRecognizer. I got right-click and control-click working using normal touch Began/Moved/Ended/Cancelled events and checking for .type == .indirectPointer and then checking if the control key modifier in event.ModifierFlags is set, or if event.ButtonMask == secondary.
I have spent a lot of time searching through the documentation on the Apple Developer website, starting here and branching out:
UIApplicationSupportsIndirectInputEvents
Somehow I cannot find the API that the system calls in my code when a two-finger trackpad scroll (or mouse scroll-wheel scroll) occurs. (On another view that is a scroll view, I can get the scroll view's scroll event when I do a two-finger scroll, since this is built in to iPadOS 13.4+ for scroll views, but my custom view is not a scroll view; it just has some scrollable areas inside of it.)
Things I tried:
UISwipeGestureRecognizer. Nothing was called for two-finger trackpad scroll gesture.
UIPanGestureRecognizer. Nothing.
Subclassing UIScrollView and adding a UIScrollViewDelegate, just to see if it would work... Nothing.
Subclassing GestureRecognizer and adding that, then overriding ShouldReceive(UIEvent evt) but that was never called.
What does iPadOS 13.4+ convert the trackpad two-finger scroll gesture into? Can I get this as some sort of event? The documentation linked above is disappointingly barebones; it mentions UIEvent.EventType.scroll, but not how, when, or where the system will call any of my methods with an event of that type. Pretty infuriating. They should just spell this out more clearly.
Answers in Swift or C# are welcomed.
OK, strangely, I thought I had tried UIPanGestureRecognizer, but I must have set it up wrong. The example code project by Apple, Integrating Pointer Interactions into Your iPad App, had the answer (C# code):
panRecognizer = new UIPanGestureRecognizer(() => {
    Console.WriteLine("panned -- " + panRecognizer.VelocityInView(this));
});
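// AllowedScrollTypesMask is the key part: it opts the pan recognizer in to
// indirect scroll input (trackpad two-finger scroll, mouse wheel) on iOS 13.4+.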
panRecognizer.AllowedScrollTypesMask = UIScrollTypeMask.Continuous;
AddGestureRecognizer(panRecognizer);
Glad I figured this out!
I am using "javafx.scene.control.ComboBox" on Java 8 and I noticed that whenever the combobox does not have room below and instead pops up, the bordering styling of the elements switches as if it still pops down.
How can I access the styling for that to fix it?
Managed to fix this by extending ComboBoxListViewSkin. In there, I've added a method that updates the styling: it calls super.getPopup(), reads its anchor Y, and compares it with the combo box's Y. After determining whether the popup is above or below the combo box, I set the correct styling on super.getListView().
Also, that method has to be called from the combo box's ON_SHOWN event.
I've tried several other variants but the damn thing just yields unstable behavior.
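For reference, a rough sketch of that skin-based approach, assuming the Java 8 private skin API (com.sun.javafx.scene.control.skin.ComboBoxListViewSkin); the getPopup()/getListView() calls follow the description above, and the "popup-above" pseudo-class is just an illustrative name for the CSS hook:

import com.sun.javafx.scene.control.skin.ComboBoxListViewSkin;
import javafx.css.PseudoClass;
import javafx.scene.control.ComboBox;

public class DirectionAwareComboBoxSkin<T> extends ComboBoxListViewSkin<T> {

    private static final PseudoClass ABOVE = PseudoClass.getPseudoClass("popup-above");

    public DirectionAwareComboBoxSkin(ComboBox<T> comboBox) {
        super(comboBox);
        // Has to run from ON_SHOWN, after the popup has been positioned.
        comboBox.addEventHandler(ComboBox.ON_SHOWN, e -> updatePopupStyling(comboBox));
    }

    private void updatePopupStyling(ComboBox<T> comboBox) {
        double comboY = comboBox.localToScreen(comboBox.getBoundsInLocal()).getMinY();
        double popupY = getPopup().getAnchorY();
        // Toggle a pseudo-class on the list view so the CSS can swap the
        // borders depending on whether the popup opened above or below.
        getListView().pseudoClassStateChanged(ABOVE, popupY < comboY);
    }
}

The skin is applied with comboBox.setSkin(new DirectionAwareComboBoxSkin<>(comboBox)), and the stylesheet can then target the :popup-above pseudo-class on the list view to flip the border styling.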
So I am having this issue using the Google VR reticle where I cannot click a button. I have an image attached showing the hierarchy, and the PlayButton is what I am trying to click. The Canvas has a Graphic Raycaster, and the button has an Event Trigger that calls the method to navigate to the next scene. The UpScrollPanel and DownScrollPanel work just fine. The EventSystem has the Gaze Input Module, as well as the Event System and Touch Input Module components.
Any ideas on how to get this working? I have watched a few videos from NurFACEGAMES and while they helped a little, I haven't gotten the click to work yet.
Oh, and I am using Unity 5.3.4f
Sometimes things can get in the way of the button. Make sure that no other UI elements overlap it, for example text borders (which are actually larger than they appear). You can also fix this by reordering the button among its siblings; I believe the last sibling is drawn on top.
Also try moving the button up the hierarchy if possible; sometimes UI elements with certain parents do not receive events.
The Canvas object should have a Graphic Raycaster.
I found the issue to be unrelated to anything I thought it was. The menu I was using is a prefab that I also use in another view that isn't VR. The ScrollRect was loading that prefab instead of the modified one I was using in the VR menu, and therefore the triggers I had added to the button were not being used when the app loaded.
I've been looking through the iOS 7 / UIKit framework, and although it looks quite different aesthetically, it's really the same SDK underneath from what I can see.
My question: is there any extra code that needs to be included to get the draggable behaviour between pushed table views/views?
When you push a view controller onto a UINavigationController, you can now drag back to the previous controller from the edge of the screen rather than pressing the back button.
This behavior can be seen in Mail.
How is this achieved? Do I need to add any code to get it in my app?
This has nothing to do with UITableView or UITableViewController, but with UINavigationController. And yes, you get this behavior for free as long as the back button is visible.
Short version:
"How do you get a simple UITableView drill-down, UINavigationController-styled, non-full-screen modal dialog on the iPad?"
Long version:
I have a very specific set of requirements that I can't seem to get working...
I have a functioning iPad program that needs to pop up a non-full-screen modal view. This modal needs to have a navigation controller and a simple drill-down table that displays a detail view where I can edit some values related to the selected item in the table.
Of course I am looking to have the regular "Back" and "Delete" buttons in the Navigation Bar.
I can handle the detail view. What I am having issues with:
Non-full-screen popup (mine is always full screen no matter what I try).
The navigation controller will not display the table view I tell it to, and the navigation bar does not even have the title I assigned to it in IB.
I can't seem to get any of this working. If anyone has a step-by-step example of how to do this, that would be great.
~Eric
P.S. I am not afraid of doing this 100% programmatically, but all the examples I have been trying to follow (and failing at extending to my problem) use IB.
As for the fullscreen issue, you need to set modalPresentationStyle to UIModalPresentationFormSheet or UIModalPresentationPageSheet on the controller you want to present modally. I'm not sure of a way to do this through IB.
As for your navigation controller/table view, I think more information is needed to provide an answer.