Dynamic Programming - top-down vs bottom-up - recursion

What I have learnt is that dynamic programming (DP) is of two kinds: top-down and bottom-up.
In top-down, you use recursion along with memoization. In bottom-up, you just fill an array (a table).
Also, both methods have the same time complexity. Personally, I find the top-down approach easier and more natural to follow. Is it true that any given DP problem can be solved using either approach, or will I ever face a problem that can only be solved by one of the two methods?

Well, I believe that theoretically you should be able to solve a DP problem with either approach. However, there are cases where the bottom-up approach becomes too expensive. Consider a knapsack problem with knapsack_size = 200,000 and num_items = 2000. Filling in a two-dimensional DP table of that size, even with plain ints, is not going to be possible: you'll exhaust the main memory of an ordinary computer. Moreover, you do not need to fill in every entry of the table to reach the final answer. A recursive top-down approach, which only touches the subproblems it actually needs, is far superior in a case like this. Hope it helps.
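As an illustration of that point, here is a minimal sketch (not from the original answer) of a top-down 0/1 knapsack that memoizes only the states it actually visits in a HashMap, instead of allocating the full num_items x knapsack_size table; the names weights, values and capacity are assumptions for the example.

// Sketch: top-down 0/1 knapsack memoizing only visited (item, capacity) states.
import java.util.HashMap;
import java.util.Map;

class SparseKnapsack {
    private final int[] weights;
    private final int[] values;
    private final Map<Long, Integer> memo = new HashMap<>();

    SparseKnapsack(int[] weights, int[] values) {
        this.weights = weights;
        this.values = values;
    }

    int best(int i, int capacity) {
        if (i < 0 || capacity <= 0) {
            return 0;                               // no items left or no room left
        }
        long key = ((long) i << 32) | capacity;     // pack the state (i, capacity)
        Integer cached = memo.get(key);
        if (cached != null) {
            return cached;
        }
        int result = best(i - 1, capacity);         // option 1: skip item i
        if (weights[i] <= capacity) {               // option 2: take item i if it fits
            result = Math.max(result, values[i] + best(i - 1, capacity - weights[i]));
        }
        memo.put(key, result);
        return result;
    }

    public static void main(String[] args) {
        int[] weights = {3, 4, 2};
        int[] values  = {30, 50, 15};
        System.out.println(new SparseKnapsack(weights, values).best(2, 6)); // 65
    }
}

With num_items = 2000 the recursion depth stays at most 2000, while memory grows only with the number of distinct states actually reached rather than with the full table size.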

Bottom-up DP requires you to see precisely how the recursion builds the complete solution: what kind of subproblems are created and how they are filled in starting from the base cases. That makes bottom-up dynamic programming somewhat harder to write. In top-down DP you have to write a plain recursive (backtracking) solution first, which is still hard to think of, and then identify its states: the arguments of the recursive function that vary across the subsequent recursive calls.
Now coming to the time complexities:
Bottom-up DP tends to run faster than top-down because it doesn't involve any function calls; it works purely on the table entries. Top-down DP makes a function call per subproblem, which builds up an implicit stack. The asymptotic time complexity is the same, but the constant factors differ.
PS: to see the difference in running time between top-down and bottom-up in practice, try the ASSIGNMENTS problem on SPOJ.
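To make the "states become table indices" idea concrete, here is a small sketch using the Fibonacci recurrence (deliberately not the SPOJ ASSIGNMENTS problem) written both ways; class and method names are invented for the example.

// The argument n that varies across recursive calls becomes the table index.
class FibBothWays {
    // Top-down: write the plain recursion, then cache the result per state n.
    static long[] memo;

    static long fibTopDown(int n) {
        if (n <= 1) return n;               // base cases
        if (memo[n] != 0) return memo[n];   // already computed
        return memo[n] = fibTopDown(n - 1) + fibTopDown(n - 2);
    }

    // Bottom-up: fill the same table from the base cases upward, no function calls.
    static long fibBottomUp(int n) {
        long[] table = new long[Math.max(2, n + 1)];
        table[0] = 0;
        table[1] = 1;
        for (int i = 2; i <= n; i++) {
            table[i] = table[i - 1] + table[i - 2];
        }
        return table[n];
    }

    public static void main(String[] args) {
        int n = 40;
        memo = new long[n + 1];
        System.out.println(fibTopDown(n));   // 102334155
        System.out.println(fibBottomUp(n));  // same value, no recursion
    }
}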

When solving a dynamic programming problem you may consider two things.
In bottom-up you have to fill an entire level before moving on to the next level, but in top-down an entire subtree of subproblems can be skipped if it is not needed. In this way top-down can save a lot of extra computation. So for optimal time,
if not all sub-problems need to be solved
    top-down
else
    bottom-up
Bottom-up has an advantage in terms of space, because it avoids the stack space consumed by the recursive top-down approach. So for optimal space,
bottom-up
However, if you feel the recursion is not too deep but very wide, so that tabulation would do a lot of unnecessary work, you can go for the top-down approach with memoization.

Related

iterative or recursive to implement a binary search tree?

I'm learning Data Structures & Algorithms now.
My lecture notes implement a binary search tree using recursive methods. That is an elegant way, but my question is about real-life code: if I implement a binary search tree recursively, will it generate a lot of call stack when the tree is very deep?
I understand that recursion is a key concept behind many data structures, but would you choose to use recursion in real-life code?
A tree is recursive by nature. Each node of a tree is the root of a subtree, and each child of each node is the root of a subtree of that subtree, so recursion is the best bet, especially in practice where other people might have to edit and maintain your code.
Now, if depth becomes a problem for your call stack, then I'm afraid there are deeper problems with your data structure (either it's monstrously huge, or it's very unbalanced).
"I understand that recursive is a key concept to understand lots of
data structure, but will you choose to use recursive in real life
code?"
After first learning about recursion I felt the same way. However, having worked in the software industry for over a year now, I can say that I have used recursion to solve several problems. There are often times when recursion is cleaner, easier to understand and read, and just downright better. And to emphasize a point from the previous answer: a tree is a recursive data structure. IMO, there is no other way to traverse a BST :)
Many times the compiler can optimize your code to avoid creating a new stack frame for each recursive call (look up tail recursion, for example). Of course, it all depends on the algorithm and on your data structure. If the tree is reasonably balanced, I don't think a recursive algorithm should cause any problems.
It's true that recursion is intuitive and elegant and that it produces code that is clear and concise. It's also true that some methods, such as quicksort or DFS, are really hard to implement iteratively.
But in practice recursive implementations are almost always slower than their iterative counterparts because of all the function calls (to really understand the performance hit, look at how much bookkeeping the generated assembly has to do for a single function call).
The optimizations we talk about are not applicable to every recursive method in general, and many compilers and interpreters don't even support them.
So in summary, if you are writing something performance-critical, such as a data structure, stay away from recursion (or use it only if you are sure your compiler/interpreter has you covered).
PS: CLRS (Introduction to Algorithms, page 290, last line) suggests that the iterative search procedure for a BST is faster than the recursive one.
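For illustration, here is a minimal sketch (not from CLRS or the lecture notes) of both forms of BST search; the Node class and method names are made up for the example.

// Minimal sketch of the two BST search styles being compared above.
class Node {
    int key;
    Node left, right;
    Node(int key) { this.key = key; }
}

class BstSearch {
    // Recursive form: mirrors the shape of the tree directly.
    static Node searchRecursive(Node root, int key) {
        if (root == null || root.key == key) return root;
        return key < root.key
                ? searchRecursive(root.left, key)
                : searchRecursive(root.right, key);
    }

    // Iterative form: the same logic as a loop, with no call-stack growth.
    static Node searchIterative(Node root, int key) {
        Node current = root;
        while (current != null && current.key != key) {
            current = key < current.key ? current.left : current.right;
        }
        return current;
    }

    public static void main(String[] args) {
        Node root = new Node(8);
        root.left = new Node(3);
        root.right = new Node(10);
        System.out.println(searchIterative(root, 10).key);  // 10
        System.out.println(searchRecursive(root, 7));        // null: not present
    }
}

Both run in O(h) time for a tree of height h; the iterative version just avoids the per-call overhead and stack growth.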

Map/Reduce: any theoretical foundation beyond "howto"?

For a while I was thinking that you just need a map to a monoid, and then reduce would do the reduction according to the monoid's multiplication.
First, this is not exactly how monoids work, and second, this is not exactly how map/reduce works in practice.
Namely, take the ubiquitous "count" example. If there's nothing to count, any map/reduce engine will return an empty dataset, not a neutral element. Bummer.
Besides, in a monoid, an operation is defined for two elements. We can easily extend it to finite sequences, or, due to associativity, to finite ordered sets. But there's no way to extend it to arbitrary "collections" unless we actually have a σ-algebra.
So, what's the theory? I tried to figure it out, but I could not; I also tried to Google it but found nothing.
I think the right way to think about map-reduce is not as a computational paradigm in its own right, but rather as a control flow construct similar to a while loop. You can view while as a program constructor with two arguments, a predicate function and an arbitrary program. Similarly, the map-reduce construct has two arguments named map and reduce, each of which is a function. So, analogously to while, the useful questions to ask are about proving the correctness of constructed programs relative to given preconditions and postconditions. And as usual, those questions involve (a) termination and run-time performance and (b) maintenance of invariants.
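As a small, purely illustrative sketch of the "count" discussion (this is java.util.stream on a single machine, not a distributed map/reduce engine): when reduce is given an explicit identity element, that neutral element is exactly what comes back for an empty dataset.

// Illustrative only: map/reduce with an explicit identity (neutral element).
import java.util.List;

class CountExample {
    public static void main(String[] args) {
        List<String> words = List.of("map", "reduce", "monoid");
        int count = words.stream()
                .map(w -> 1)                 // map each element to the monoid value 1
                .reduce(0, Integer::sum);    // fold with identity 0 and associative +
        System.out.println(count);           // 3

        List<String> empty = List.of();
        System.out.println(empty.stream().map(w -> 1).reduce(0, Integer::sum)); // 0, the neutral element
    }
}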

What are the advantages and disadvantages of recursion?

With respect to using recursion over non-recursive methods in sorting algorithms, or for that matter any algorithm, what are its pros and cons?
For the most part recursion is slower, and takes up more of the stack as well. The main advantage of recursion is that for problems like tree traversal it makes the algorithm a little easier or more "elegant".
Check out some of the comparisons:
link
Recursion means a function calls itself repeatedly.
It uses the system stack to accomplish its task: the stack works in LIFO order, and when a function is called, control jumps to the address in memory where the function is defined, while the return address is pushed onto the stack.
Secondly, it can reduce the complexity of the code.
Though a bit off-topic, this is related and worth reading: Recursion vs Iteration
All algorithms can be defined recursively. That makes it much, much easier to visualize and prove.
Some algorithms (e.g., the Ackermann Function) cannot (easily) be specified iteratively.
A recursive implementation will use more memory than a loop if tail call optimization can't be performed. While iteration may use less memory than a recursive function that can't be optimized, it has some limitations in its expressive power.
Any algorithm implemented using recursion can also be implemented using iteration.
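As a tiny illustration of that last point (my own sketch, not from the answer above): a tail-recursive factorial and the loop it mechanically corresponds to. Note that Java itself does not perform tail-call optimization, so the loop is what you would actually write there.

// The same computation written tail-recursively and as the equivalent loop.
class FactorialBothWays {
    // Tail-recursive: the recursive call is the very last action.
    static long factorial(long n, long acc) {
        if (n <= 1) return acc;
        return factorial(n - 1, acc * n);
    }

    // Equivalent loop: the accumulator becomes an ordinary local variable.
    static long factorialLoop(long n) {
        long acc = 1;
        while (n > 1) {
            acc *= n;
            n--;
        }
        return acc;
    }

    public static void main(String[] args) {
        System.out.println(factorial(10, 1));   // 3628800
        System.out.println(factorialLoop(10));  // 3628800
    }
}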
Why not to use recursion
It is usually slower due to the overhead of maintaining the stack.
It usually uses more memory for the stack.
Why to use recursion
Recursion adds clarity and (sometimes) reduces the time needed to write and debug code (but doesn't necessarily reduce space requirements or speed of execution).
Often reduces the complexity of the code.
Works better for problems based on tree structures.
For example, the Tower of Hanoi problem is more easily solved using recursion as opposed to iteration.
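For reference, a short sketch of the recursive Tower of Hanoi solution mentioned above (peg names and the printed format are just illustrative):

// Move n-1 disks out of the way, move the largest disk, then move the n-1 back on top.
class Hanoi {
    static void move(int n, char from, char to, char spare) {
        if (n == 0) return;                       // nothing to move
        move(n - 1, from, spare, to);             // clear the way
        System.out.println("disk " + n + ": " + from + " -> " + to);
        move(n - 1, spare, to, from);             // put the smaller disks back on top
    }

    public static void main(String[] args) {
        move(3, 'A', 'C', 'B');                   // prints the 7 moves for 3 disks
    }
}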
I personally prefer iteration over recursive functions, especially if the function has complex/heavy logic and the number of iterations is large. This is because every recursive call grows the call stack: if the input is too large it can crash the stack, and it also slows the process down.
To start:
Pros:
It is the only way of implementing a variable number of nested loops (see the sketch after this answer), and the only elegant way of implementing a large, fixed number of nested loops.
Cons:
Recursive methods will often throw a StackOverflowException when processing big sets. Iterative loops don't have this problem.
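Here is a small sketch of the "variable number of nested loops" point from the pros above (names and the printing are my own illustration): one recursive function plays the role of depth nested loops, where depth is known only at run time.

// Enumerate every combination of `depth` digits from 0..base-1,
// i.e. the effect of `depth` nested for-loops chosen at run time.
class NestedLoops {
    static void enumerate(int[] indices, int level, int base) {
        if (level == indices.length) {            // all "loops" have chosen a value
            System.out.println(java.util.Arrays.toString(indices));
            return;
        }
        for (int i = 0; i < base; i++) {          // the loop at this nesting level
            indices[level] = i;
            enumerate(indices, level + 1, base);  // recurse = go one loop deeper
        }
    }

    public static void main(String[] args) {
        int depth = 3;                            // could come from user input
        enumerate(new int[depth], 0, 2);          // prints the 8 binary triples
    }
}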
To start:
Recursion:
A function that calls itself is called a recursive function, and this technique is called recursion.
Pros:
1. Reduces unnecessary calling of functions.
2. Some problems are easy to solve with recursion while their iterative solutions are big and complex.
3. Extremely useful when the same solution applies to successively smaller subproblems.
Cons:
1. A recursive solution, although logical, can be very difficult to trace and debug.
2. A recursive function must have a base case, an if statement somewhere that returns without making the recursive call; otherwise the function will never return.
3. Recursion uses more processor time.
Expressiveness
Most problems are naturally expressed by recursion, such as Fibonacci, merge sort and quicksort. In this respect, the code is written for humans, not machines.
Immutability
Iterative solutions often rely on mutating temporary variables, which can make the code hard to read. This can be avoided with recursion.
Performance
Recursion is not stack friendly. The stack can overflow when the recursion is not well designed or when tail-call optimization is not supported.
Situations arise where you have to abandon recursion even in a problem where recursion appears to be to your advantage: if the recursion would have to go thousands of calls deep, it will produce a stack overflow error even though the code never gets stuck in infinite recursion. Most programming languages limit the number of stack frames, so if your recursion goes beyond that limit, you should consider not using recursion.
We should use recursion in the following scenarios:
when we don't know the finite number of iterations in advance, for example when our function's exit condition is based on dynamic programming (memoization);
when we need to process the elements in reverse order, meaning we want to process the last element first and then n-1, n-2, and so on down to the first element (see the sketch below).
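As a sketch of that second scenario (my own example, with a hypothetical minimal ListNode type): printing a singly linked list in reverse by recursing to the end first.

// "Process the last element first": recurse to the end, print on the way back out.
class ListNode {
    int value;
    ListNode next;
    ListNode(int value, ListNode next) { this.value = value; this.next = next; }
}

class ReversePrint {
    static void printReversed(ListNode node) {
        if (node == null) return;         // past the last element
        printReversed(node.next);         // handle the rest of the list first
        System.out.println(node.value);   // printed only after everything after it
    }

    public static void main(String[] args) {
        ListNode list = new ListNode(1, new ListNode(2, new ListNode(3, null)));
        printReversed(list);              // prints 3, 2, 1
    }
}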
Recursion can save multiple traversals. It is also useful if we can divide the work so that the call stack never grows as deep, for example:
int N = 10;
int output = process(N) + process(N / 2);

public int process(int n) {
    // base cases: stop at the midpoint boundary (N/2 + 1) or once n drops to 1 or below
    if (n == N / 2 + 1 || n <= 1) {
        return 1;
    }
    return process(n - 1) + process(n - 2);
}
In this case the maximum recursion depth at any given time is only about N/2, because process(N) stops recursing at N/2 + 1 and process(N/2) only starts from N/2.
Recursion gets a bad rap; I'm always surprised by the number of developers who won't even touch recursion because someone told them it was evil incarnate.
I've learned through trial and error that, when done properly, recursion can be one of the fastest ways to iterate over something. It is not a steadfast rule, and each language/compiler/engine has its own quirks, so mileage will vary.
In JavaScript I can reliably speed up almost any iterative process by introducing recursion, with the added benefit of reducing side effects and making the code clearer, more concise and more reusable. Also, pro tip: it is possible to get around the stack overflow issue (and no, you don't just disable the warning).
My personal pros & cons:
Pros:
- Reduces side effects.
- Makes code more concise and easier to reason about.
- Reduces system resource usage and performs better than the traditional for loop.
Cons:
- Can lead to stack overflow.
- More complicated to set up than a traditional for loop.
Mileage will vary depending on language/compiler/engine.

What is a practical difference between a loop and recursion

I am currently working in PHP, so this example will be in PHP, but the question applies to multiple languages.
I am working on a project with a friend of mine, and as always we were held up by a big problem. We both went home without solving it. That night we both found a solution, only I used a loop to tackle the problem and he used recursion.
Now I wanted to tell him the difference between the loop and the recursion, but I couldn't come up with a case where you actually need recursion over a normal loop.
I am going to show a simplified version of both; I hope someone can explain how one is different from the other.
Please forgive me for any coding errors.
The loop:
printnumbers(1, 10);

function printnumbers($start, $stop)
{
    for ($i = $start; $i <= $stop; $i++)
    {
        echo $i;
    }
}
Now the code above just simply prints out the numbers.
Now let's do this with recursion:
printnumbers(1, 10);

function printnumbers($start, $stop)
{
    $i = $start;
    if ($i <= $stop)
    {
        echo $i;
        printnumbers($start + 1, $stop);
    }
}
The method above does the exact same thing as the loop, only with recursion.
Can anyone explain to me what is different about using one of these methods?
Loops and recursion are in many ways equivalent. There are no programs that need one or the other; in principle you can always translate from loops to recursion or vice versa.
Recursion is more powerful in the sense that translating recursion into a loop might require a stack that you have to manipulate yourself. (Try traversing a binary tree using a loop and you will feel the pain.)
On the other hand, many languages (and implementations), e.g. Java, don't implement tail recursion properly. Tail recursion is when the last thing a function does is call itself (like in your example). This kind of recursion does not have to consume any stack, but in many languages it does, which means you can't always use recursion.
Often, a problem is more easily expressed using recursion. This is especially true when you talk about tree-like data structures (e.g. directories, decision trees...).
These data structures are finite in nature, so most of the time processing them is clearer with recursion.
Since stack depth is usually limited and every function call requires a piece of stack, you will have to abandon recursion and translate it into iteration when dealing with a possibly unbounded data structure.
Functional languages in particular are good at handling deep, "infinite" recursion; imperative languages are focused on iteration-style loops.
In general, a recursive function will consume more stack space (since it is really a large set of nested function calls), while an iterative solution won't. This also means that an iterative solution, in general, will be faster, because it avoids the function-call overhead.
I am not sure how much this applies to an interpreted language like PHP, though; it is possible that the interpreter handles this better.
A loop will be faster because there's always overhead in executing an extra function call.
A problem with learning about recursion is that a lot of the examples given (say, factorials) are bad examples of using recursion.
Where possible, stick with a loop unless you need to do something different. A good example of using recursion is looping over each node in a Tree with multiple levels of child nodes.
Recursion is a bit slower (because function calls are slower than setting a variable), and uses more space on most languages' call stacks. If you tried to printnumbers(1, 1000000000), the recursive version would likely throw a PHP fatal error or even a 500 error.
There are some cases where recursion makes sense, like doing something to every part of a tree (getting all files in a directory and its subdirectories, or maybe messing with an XML document), but it has its price -- in speed, stack footprint, and the time spent to make sure it doesn't get stuck calling itself over and over til it crashes. If a loop makes more sense, it's definitely the way to go.
Well, I don't know about PHP but most languages generate a function call (at the machine level) for every recursion. So they have the potential to use a lot of stack space, unless the compiler produces tail-call optimizations (if your code allows it).
Loops are more 'efficient' in that sense because they don't grow the stack. Recursion has the advantage of being able to express some tasks more naturally though.
In this specific case, from a conceptual (rather than implementation) point of view, the two solutions are totally equivalent.
Compared to loops, a function call has its own overhead like allocating stack etc. And in most cases, loops are more understandable than their recursive counterparts.
Also, you will end up using more memory and can even run out of stack space if the difference between start and stop is high and there are too many instances of this code running simultaneously (which can happen as you get more traffic).
You don't really need recursion for a flat structure like that. The first code I ever used recursion in involved managing physical containers. Each container might contain stuff (a list of them, each with weights) and/or more containers, which have a weight. I needed the total weight of a container and all it held. (I was using it to predict the weight of large backpacks full of camping equipment without packing and weighing them.) This was easy to do with recursion and would have been a lot harder with loops. But many kinds of problems that naturally suit themselves to one approach can also be tackled with the other.
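A rough sketch of that container idea (class and field names are invented; the original code isn't shown in the post):

// A container holds loose items (weights) and possibly more containers; its
// total weight is its own weight plus everything inside, computed recursively.
import java.util.ArrayList;
import java.util.List;

class Container {
    double ownWeight;
    List<Double> itemWeights = new ArrayList<>();
    List<Container> children = new ArrayList<>();

    Container(double ownWeight) { this.ownWeight = ownWeight; }

    double totalWeight() {
        double total = ownWeight;
        for (double w : itemWeights) total += w;                 // loose items
        for (Container c : children) total += c.totalWeight();   // nested containers
        return total;
    }

    public static void main(String[] args) {
        Container backpack = new Container(2.0);
        backpack.itemWeights.add(1.5);
        Container pouch = new Container(0.3);
        pouch.itemWeights.add(0.7);
        backpack.children.add(pouch);
        System.out.println(backpack.totalWeight());              // 4.5
    }
}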
Stack overflow.
And no, I don't mean a website or something. I MEAN a "stack overflow".

Recursion Vs Loops

I am trying to do work with examples on Trees as given here: http://cslibrary.stanford.edu/110/BinaryTrees.html
These examples all solve problems via recursion. I wonder if we can provide an iterative solution for each of them. In other words, can we always be sure that a problem which can be solved by recursion will also have an iterative solution in general? If not, what example can we give of a problem that can be solved only by recursion or only by iteration?
The only difference between iteration and recursion on a computer is whether you use the built-in stack or a user-defined stack. So they are equivalent.
In my experience, most recursive solutions can indeed be rewritten iteratively.
It is also a good technique to have, as recursive solutions may have too large an overhead in memory and CPU consumption.
Since recursion uses an implicit stack on which it stores information about each call, you can always implement that stack yourself and avoid the recursive calls. So yes, every recursive solution can be transformed into an iterative one.
Read this question for a proof.
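To make the "implement the stack yourself" point concrete, here is a sketch (mine, not from the linked question) of in-order traversal written both recursively and with an explicit Deque used as a user-defined stack; TreeNode is a minimal stand-in for the node type in the Stanford examples.

// The same in-order traversal: implicit call stack vs. explicit stack.
import java.util.ArrayDeque;
import java.util.Deque;

class TreeNode {
    int data;
    TreeNode left, right;
    TreeNode(int data) { this.data = data; }
}

class InOrder {
    static void recursive(TreeNode node) {
        if (node == null) return;
        recursive(node.left);
        System.out.println(node.data);
        recursive(node.right);
    }

    static void iterative(TreeNode root) {
        Deque<TreeNode> stack = new ArrayDeque<>();   // the user-defined stack
        TreeNode current = root;
        while (current != null || !stack.isEmpty()) {
            while (current != null) {                 // go as far left as possible
                stack.push(current);
                current = current.left;
            }
            current = stack.pop();                    // visit the node
            System.out.println(current.data);
            current = current.right;                  // then its right subtree
        }
    }

    public static void main(String[] args) {
        TreeNode root = new TreeNode(2);
        root.left = new TreeNode(1);
        root.right = new TreeNode(3);
        recursive(root);   // 1 2 3
        iterative(root);   // 1 2 3
    }
}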
Recursion and iteration are two tools that, at a very fundamental level, do the same thing: execute a repeated operation over a defined set of values. They are interchangeable in that there is no problem that cannot, in some way, be solved by only one of them. That does not mean, however, that one cannot be more suited than the other.
Recursion has the advantage that it can continue to a depth that is not known in advance. A perfect example of this is a tuned and threaded quicksort.
You can't spawn additional loops, but you can spawn new threads via recursion.
As an "old guy," I fall back to my memory of learning that recursive descent parsers are easier to write, but that stack-based, iterative parsers perform better. Here's an article that seems to support that idea with metrics:
http://www.texttoolkit.com/index.php?option=com_content&view=article&catid=35%3Atechnology&id=60%3Abeyond-recursive-descent&Itemid=55
One thing to note is the author's mention of overrunning the call stack with recursive descent. An iterative, stack-based implementation can be much more efficient of resources.
