Two different custom transitions in a navigation controller - uinavigationcontroller

I'm trying to make two different custom transitions (one fade and one slide from top) within a navigation controller.
The transitions work very well the first time, but when I pop, they behave erratically.
I guess that navigationController.delegate is the key, but I can't figure it out by myself.
Any help will be greatly appreciated, thanks a lot.
******************************** iOS 14 QUESTION UPDATE ********************************
#Vlad's solution of setting the delegate works great, thanks.
But recently, in iOS 14, a stack menu appears when a long-press gesture is detected on the navigation back button (it allows the user to navigate through the navigation controller's view controller stack).
As a result, the navigation controller's delegate is set to the wrong controller when popping two or more controllers.
I am once again asking for your support.

Your starting state is in VC A.
After viewDidLoad, the navigationController?.delegate is set to VC A which uses anim1.
When you push to B, you're setting navigationController?.delegate to B, which uses anim2.
When you pop from C to B, anim2 is used as navigationController?.delegate is VC B.
When you pop from B to A, anim2 is used because navigationController?.delegate is still VC B.
When you pop B, navigationController?.delegate is set to nil because the instance of VC B is destroyed.
That is why when you try and push B again, the default animation is used.
Two important pieces of information are:
viewDidLoad is only called once, when the view has finished loading, and not again when the view reappears after a pop.
navigationController?.delegate can only point to one delegate.
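A minimal sketch of how those two facts are usually applied (this is my own illustration, not code from the question; FadeAnimator and ViewControllerA are placeholder names): each controller takes the delegate back in viewWillAppear, which runs again after a pop, so the right animator is always returned.

import UIKit

// A placeholder fade animator ("anim1" in the explanation above).
final class FadeAnimator: NSObject, UIViewControllerAnimatedTransitioning {
    func transitionDuration(using transitionContext: UIViewControllerContextTransitioning?) -> TimeInterval {
        return 0.3
    }
    func animateTransition(using transitionContext: UIViewControllerContextTransitioning) {
        guard let toView = transitionContext.view(forKey: .to) else {
            transitionContext.completeTransition(false)
            return
        }
        toView.alpha = 0
        transitionContext.containerView.addSubview(toView)
        UIView.animate(withDuration: transitionDuration(using: transitionContext),
                       animations: { toView.alpha = 1 },
                       completion: { _ in transitionContext.completeTransition(!transitionContext.transitionWasCancelled) })
    }
}

final class ViewControllerA: UIViewController, UINavigationControllerDelegate {
    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Re-claim the delegate every time A is about to appear (viewDidLoad only runs once),
        // so popping back from B restores anim1 instead of whatever B left behind.
        navigationController?.delegate = self
    }

    func navigationController(_ navigationController: UINavigationController,
                              animationControllerFor operation: UINavigationController.Operation,
                              from fromVC: UIViewController,
                              to toVC: UIViewController) -> UIViewControllerAnimatedTransitioning? {
        return FadeAnimator()
    }
}

ViewControllerB would do the same in its own viewWillAppear, returning its slide-from-top animator instead.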

Always On Top, Undecorated Navigation with Decorated Stage Below
I'd like to position Stage 1 always on top to act as a navigation bar while I open stages below, but also be able to interact with Stage 2 while Stage 1 has focus. Is this possible?
A related post suggested using setPickOnBounds(), but applying this to both stages doesn't seem right. At the moment it takes two or more clicks (one click just to switch focus) in order to interact with Stage 2.
This also seemed similar, but I'm hoping JNI code isn't required. I don't have the option to pull in another library at the moment.

How to receive hover events on QGraphicsItems during drop?

I use two different QGraphicsViews and do drag and drop between them. The drag and drop works very well so far. In one of my QGraphicsViews I have items that receive hover events so that they are highlighted when the mouse moves over them. The problem is that during drops, and also while dragging over items, the hover events don't get called. Is it possible to overcome this behaviour somehow? The hover events mark the places where a drop can occur in my view, and the items then have to be dropped at the right places accordingly (they can only be inserted at specific places and the user should get some feedback).
I hope I could describe my problem... I posted no code for now because I don't know if this is even possible.
Thanks!
I'm not too familiar with the graphics view framework yet, but you'll probably need to subclass QGraphicsView (if you haven't already) and override dragEnterEvent(), which it inherits from QWidget. Depending on how you coded your items, they might also have a dragEnterEvent() that you can override.
In either case you'd accept the QDragEnterEvent and have it trigger the same effect as the hover event. Hope that points you in the right direction.
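To make that concrete, here is a rough sketch under my own assumptions about how the items are set up (not the poster's code): a QGraphicsItem subclass that accepts drops and uses the drag enter/leave handlers to show the same highlight it normally shows on hover.

#include <QBrush>
#include <QGraphicsRectItem>
#include <QGraphicsSceneDragDropEvent>

class DropTargetItem : public QGraphicsRectItem
{
public:
    explicit DropTargetItem(const QRectF &rect, QGraphicsItem *parent = nullptr)
        : QGraphicsRectItem(rect, parent)
    {
        setAcceptDrops(true);           // required, otherwise drag events never reach the item
        setAcceptHoverEvents(true);     // keeps the existing mouse-hover highlight working
    }

protected:
    void dragEnterEvent(QGraphicsSceneDragDropEvent *event) override
    {
        event->setAccepted(true);       // accept so move/drop events keep coming
        setBrush(QBrush(Qt::yellow));   // same visual feedback as the hover highlight
    }

    void dragLeaveEvent(QGraphicsSceneDragDropEvent *event) override
    {
        Q_UNUSED(event);
        setBrush(QBrush(Qt::white));    // remove the highlight when the drag moves away
    }

    void dropEvent(QGraphicsSceneDragDropEvent *event) override
    {
        setBrush(QBrush(Qt::white));
        Q_UNUSED(event);                // handle the dropped data here, e.g. event->mimeData()
    }
};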

Combining a navigation controller and a root view controller with pages

I have a problem that has been solved on iOS6, but still appears if I use the iOS 5.1 simulator.
I have a default page based application. I added a "Main menu" view controller, which has three buttons that activate the root view controller containing the pages via a push segue, all defined in the storyboard.
I then added a navigation controller and made it the initial view controller.
If I leave the menu via a button and flip a few pages, I can click the back button and it goes back to the menu. Good.
But I don't want the nav bar, so I hide it, and on the root view controller that contains the page view controllers I add a button which performs this action:
[self.navigationController popToRootViewControllerAnimated:YES];
This button works perfectly on iOS 6: I can flip a few pages, press the button and I'm back in the menu.
With the iOS 5.1 simulator however (and on my 5.1 iPad), a page flip occurs instead! It keeps flipping pages until I'm on the last page, and only then do I go back to the menu.
I have searched for over two hours now but could not find a solution; I hope someone can help me with this.
Note: setting animated to NO does not solve the problem.
I did find the answer to my question; this is my first iOS project, which explains why I did not find it any sooner myself.
In the default page based application there are two lines in the viewDidLoad method:
// Add the page view controller's gesture recognizers to the book view controller's view so that the gestures are started more easily
self.view.gestureRecognizers = self.pageViewController.gestureRecognizers;
First, it's not really adding but assigning that happens here; second, when I commented out the second line, everything worked as expected.
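For clarity, this is roughly what the template's viewDidLoad ends up looking like with that assignment commented out (a sketch of the Xcode template, not my exact project):

- (void)viewDidLoad
{
    [super viewDidLoad];
    // ... the rest of the template's page view controller setup is unchanged ...
    // self.view.gestureRecognizers = self.pageViewController.gestureRecognizers;
}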
So I'm glad it solves my problem, however it raises some questions:
The gestures still start as easily as they did, so why was this line needed in the first place? What does it supposedly fix?
Why does it work in iOS6? It should have had the same problem, no?
Is it correct that it is an assignment and not an addition?
Any answers to these three are still appreciated.
Alex

How to make two side by side Qt Windows sticky and act like a single window?

I am trying to implement a scenario where two Qt windows will be placed side by side and will be kind of sticky to each other. When one of them is dragged, the other also gets dragged. Even when alt-tabbing they should behave like a single window.
Any help or pointer will be extremely helpful.
-Soumya
What you describe sounds like it's a good fit for a "docking" scenario. You're probably most familiar with docking from toolbars; where you can either float a toolbar on its own or stick it to any edge of an app's window. But Qt has a more generalized mechanism:
http://doc.qt.io/qt-5/qtwidgets-mainwindows-dockwidgets-example.html
http://doc.qt.io/qt-5/qdockwidget.html
It won't be a case where multiple top level windows are moved around in sync with their own title bars and such. The top-level windows will be merged into a single containing window when they need to get "sticky". But IMO this is more elegant for almost any situation, and provides the properties you seem to be seeking.
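A bare-bones sketch of that setup (widget names and contents are placeholders): both "windows" become QDockWidgets inside one QMainWindow, so they move, minimize and alt-tab as a unit.

#include <QApplication>
#include <QDockWidget>
#include <QMainWindow>
#include <QTextEdit>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);

    QMainWindow mainWindow;

    QDockWidget *left = new QDockWidget("Window One", &mainWindow);
    left->setWidget(new QTextEdit);                        // placeholder content
    mainWindow.addDockWidget(Qt::LeftDockWidgetArea, left);

    QDockWidget *right = new QDockWidget("Window Two", &mainWindow);
    right->setWidget(new QTextEdit);                       // placeholder content
    mainWindow.addDockWidget(Qt::RightDockWidgetArea, right);

    mainWindow.show();
    return app.exec();
}

Either dock widget can still be floated or re-docked by the user, which is usually what "sticky but separable" ends up meaning in practice.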
Install an event filter on the tracked window with QObject::installEventFilter() and filter on QEvent::Move.
You can then change the position of tracking window whenever your filter is called with that event type.
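A rough sketch of that idea, assuming both windows are plain QWidgets (FollowFilter and the fixed side-by-side offset are my own illustration):

#include <QEvent>
#include <QObject>
#include <QWidget>

class FollowFilter : public QObject
{
public:
    FollowFilter(QWidget *follower, QObject *parent = nullptr)
        : QObject(parent), m_follower(follower) {}

protected:
    bool eventFilter(QObject *watched, QEvent *event) override
    {
        if (event->type() == QEvent::Move) {
            QWidget *tracked = static_cast<QWidget *>(watched);
            // keep the follower glued to the tracked window's right edge
            m_follower->move(tracked->frameGeometry().topRight());
        }
        return QObject::eventFilter(watched, event);   // don't swallow the event
    }

private:
    QWidget *m_follower;
};

// usage:
//     trackedWindow->installEventFilter(new FollowFilter(followerWindow, trackedWindow));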
I found a way to keep two windows anchored: when the user moves a window, the other follows, keeping its relative position to the moved one.
It is a bit of a hack, because it assumes that QEvent::NonClientAreaMouseButtonPress is sent when the user left-clicks the title bar, holds it pressed while moving the window, and releases it at the end, at which point QEvent::NonClientAreaMouseButtonRelease is sent.
The idea is to use the QWidget::moveEvent event handler of each window to update the geometry of the other, using QWidget::setGeometry.
But the documentation states that:
Calling setGeometry() inside resizeEvent() or moveEvent() can lead to infinite recursion.
So I needed to prevent the moveEvent handler of the window that was not moved directly by the user from updating the geometry of the other.
I achieved this via QObject::installEventFilter(), intercepting the above-mentioned events.
When the user clicks on the title bar of WindowOne to start a move operation, WindowOne::eventFilter catches its QEvent::NonClientAreaMouseButtonPress and sets the public attribute WindowTwo::skipevent_two to true.
While the user is moving WindowOne, WindowTwo::moveEvent is called as a result of the setGeometry operation performed on WindowTwo from WindowOne::moveEvent.
WindowTwo::moveEvent checks WindowTwo::skipevent_two, and if it is true, returns without performing a setGeometry operation on WindowOne which would cause infinite recursion.
As soon as the user releases the left mouse button, ending the window move operation, WindowOne::eventFilter catches QEvent::NonClientAreaMouseButtonRelease and sets back the public attribute WindowTwo::skipevent_two to false.
The same actions are performed if the user clicks the title bar of WindowTwo, this time causing the WindowOne::skipevent_one attribute to be set to true and preventing WindowOne::moveEvent from performing any setGeometry operation on WindowTwo.
I believe this solution is far from being clean and usable. Some problems:
I am not sure when and why QEvent::NonClientAreaMouseButtonPress and QEvent::NonClientAreaMouseButtonRelease are dispatched, apart from the case considered above.
If one window is resized or moved without user interaction, or without the proper mouse clicks from the user, everything will probably end up in infinite recursion.
There is no guarantee that those mouse events will be dispatched the same way in the future.
Free space for more...
Proof of concept:
https://github.com/Shub77/DockedWindows
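A condensed sketch of the mechanism described above (the linked repository has the full proof of concept; this single-class version is my own simplification, with skipEvent standing in for skipevent_one/skipevent_two):

#include <QEvent>
#include <QMoveEvent>
#include <QWidget>

class AnchoredWindow : public QWidget
{
public:
    AnchoredWindow *other = nullptr;   // the window this one is glued to (set after both exist)
    bool skipEvent = false;            // true while the *other* window is the one being dragged

    explicit AnchoredWindow(QWidget *parent = nullptr) : QWidget(parent)
    {
        installEventFilter(this);      // watch our own non-client (title bar) mouse events
    }

protected:
    bool eventFilter(QObject *watched, QEvent *event) override
    {
        if (other) {
            // Title bar pressed: the user starts dragging *this* window, so the other
            // window's moveEvent must not write geometry back (that would recurse).
            if (event->type() == QEvent::NonClientAreaMouseButtonPress)
                other->skipEvent = true;
            else if (event->type() == QEvent::NonClientAreaMouseButtonRelease)
                other->skipEvent = false;
        }
        return QWidget::eventFilter(watched, event);
    }

    void moveEvent(QMoveEvent *event) override
    {
        QWidget::moveEvent(event);
        if (skipEvent || !other)
            return;                    // this move was only an echo of the other window's move
        // follow: shift the other window by the same delta, keeping the relative position
        other->setGeometry(other->geometry().translated(event->pos() - event->oldPos()));
    }
};

Both windows are created, each one is given the other through the other pointer, and then shown; the caveats listed above still apply.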

Activity Indicator display in Table View whilst row data is being fetched

I am navigating from tableview1.row to a tableview2 which has a LOT of rows being fetched. Given the load time is around 3 seconds, I want the navigation to slide into tableview2 as soon as the tableview1.row is selected, and then display a UIActivityIndicatorView above tableview2 whilst the data is fetched and then rendered in its underlying table view. Note, tableview2 is actually a subview of the parent UIView (as opposed to the parent being a UITableView).
I've seen this post: Activity indicator should be displayed when navigating from UITableView1 to UITableView2
... which gives instructions to add the activity indicator start and stopAnimating calls around the data fetch into viewDidLoad of tableview2.
Thing is, I'm not sure how the above solution could work as viewDidLoad runs and completes before tableview2 visibly slides into view.
Separately, I also tried adding an activity indicator over tableview2 in IB and added the IBOutlet indicator's start/stop animating code into viewDidAppear. What happens is the data fetch runs and I can see the indicator spinning but at the end of the fetch, the table view is empty. Seems like viewDidAppear is too late to add data to the table view as cellForRowAtIndexPath etc has already fired.
Can anyone please suggest any pointers? I could very well be missing something obvious here (it's nearly 5 am where I am and I think my brain is mush). Should I re-trigger cellForRowAtIndexPath etc. from viewDidAppear? Is the issue that my table view is a subview and not the parent view?
Thanks
I took another look at my own question and the answer is actually pretty straightforward. Very intensive processing should generally not be run on the main thread. I was attempting to do my data-intensive fetch AND the UI display of the activity indicator on the same thread, so they were being processed in sequence, one after the other.
The solution is to enter my new view controller and a) kick off the activity indicator animation on the main thread, then b) run the data-intensive fetch on another thread (using performSelectorInBackground). UI-related updates cannot be applied from the background thread; they should be handed back to the main thread.
Plenty of SO posts on this already (example here), my bad for not picking up on these from the start :)
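Sketched in code, the pattern looks roughly like this (activityIndicator, rows, tableView, loadRowsFromStore and fetchFinished: are placeholder names, not from the original project):

- (void)viewDidLoad
{
    [super viewDidLoad];
    [self.activityIndicator startAnimating];                                    // a) start the spinner on the main thread
    [self performSelectorInBackground:@selector(fetchData) withObject:nil];     // b) fetch off the main thread
}

- (void)fetchData   // runs on a background thread
{
    @autoreleasepool {
        NSArray *rows = [self loadRowsFromStore];   // placeholder for the expensive fetch
        // UI work is not allowed here; hand the result back to the main thread
        [self performSelectorOnMainThread:@selector(fetchFinished:) withObject:rows waitUntilDone:NO];
    }
}

- (void)fetchFinished:(NSArray *)rows   // back on the main thread
{
    self.rows = rows;
    [self.tableView reloadData];
    [self.activityIndicator stopAnimating];
}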
