Recursion and Big O

I've been working through a recent Computer Science homework involving recursion and big-O notation. I believe I understand this pretty well (certainly not perfectly, though!), but there is one question in particular that is giving me the most trouble. The odd thing is that, looking at it, it seems to be the simplest one on the homework.
Provide the best rate of growth, using big-Oh notation, for the solution to the following recurrence:
T(1) = 2
T(n) = 2T(n - 1) + 1 for n > 1
And the choices are:
O(n log n)
O(n^2)
O(2^n)
O(n^n)
I understand that big O works as an upper bound, describing the greatest amount of calculation, or the longest running time, that a program or process will take. I feel like this particular recursion should be O(n), since, at most, the recursion only occurs once for each value of n. Since O(n) isn't available, it must be either better than that, O(n log n), or worse, meaning one of the other three options.
So, my question is: Why isn't this O(n)?

There are a couple of different ways to solve recurrences: substitution, the recurrence tree, and the Master theorem. The Master theorem won't work in this case, because the recurrence doesn't fit the Master theorem's form.
You could use the other two methods, but the easiest way for this problem is to solve it iteratively.
T(n) = 2T(n-1) + 1
T(n) = 4T(n-2) + 2 + 1
T(n) = 8T(n-3) + 4 + 2 + 1
T(n) = ...
See the pattern?
T(n) = 2^(n-1)*T(1) + 2^(n-2) + 2^(n-3) + ... + 1
T(n) = 2^(n-1)*2 + 2^(n-2) + 2^(n-3) + ... + 1
T(n) = 2^n + 2^(n-2) + 2^(n-3) + ... + 1
Therefore, the tightest bound is Θ(2^n).
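If you want to sanity-check that expansion numerically, here is a quick Python sketch (my own addition, not part of the original answer) comparing the recurrence against the sum above, which collapses to 2^n + 2^(n-1) - 1:
def T(n):
    # direct evaluation of the recurrence T(1) = 2, T(n) = 2*T(n-1) + 1
    if n == 1:
        return 2
    return 2 * T(n - 1) + 1

for n in range(1, 10):
    closed = 2**n + 2**(n - 1) - 1  # the sum derived above
    print(n, T(n), closed)  # the two columns agree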

I think you have misunderstood the question a bit. It does not ask how long it would take to solve the recurrence. It asks what the big-O (the asymptotic bound) of the solution itself is.
What you have to do is come up with a closed-form solution, i.e. the non-recursive formula for T(n), and then determine what the big-O of that expression is.

The question is asking for the big-Oh notation for the solution to the recurrence, not the cost of calculating the recurrence.
Put another way: the recurrence produces:
1 -> 2
2 -> 5
3 -> 11
4 -> 23
5 -> 47
What big-Oh notation best describes the sequence 2, 5, 11, 23, 47, ...?
The correct way to solve that is to solve the recurrence equations.
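For reference, the sequence above can be reproduced with a couple of lines of Python (my own illustration):
t = 2  # T(1)
for n in range(1, 6):
    print(n, '->', t)
    t = 2 * t + 1  # T(n+1) = 2*T(n) + 1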

I think this will be exponential. Each increment to n roughly doubles the value (ignoring the +1 term):
T(2) ≈ 2 * T(1) = 4
T(3) ≈ 2 * T(2) = 2 * 4
...
T(x) would be the running time of the following program (for example):
def fn(x):
    if x == 1:
        return  # a constant time
    # do the calculation for n - 1 twice
    fn(x - 1)
    fn(x - 1)
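To make that doubling visible, here is a variant of the program above (my own sketch, with a hypothetical global call counter added) that counts how many times fn is entered:
calls = 0  # hypothetical counter, not in the original code

def fn(x):
    global calls
    calls += 1
    if x == 1:
        return  # constant time
    fn(x - 1)
    fn(x - 1)

for x in range(1, 8):
    calls = 0
    fn(x)
    print(x, calls)  # 1, 3, 7, 15, ... = 2^x - 1, i.e. the count roughly doubles per step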

I think this will be exponential. Each increment to n brings twice as much calculation.
No, it doesn't. Quite on the contrary:
Consider that computing T(n) takes running time R. Then computing T(n + 1) takes only R plus a constant amount of extra work.
Thus, the extra work per step is constant and the overall runtime is indeed O(n).
However, I think Dima's assumption about the question is right, although his solution is overly complicated:
What you have to do is come up with a closed-form solution, i.e. the non-recursive formula for T(n), and then determine what the big-O of that expression is.
It's sufficient to examine the relative sizes of T(n) and T(n + 1) and determine the relative growth rate. The value obviously doubles with each step, which directly gives the asymptotic growth.
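A quick numeric illustration of that doubling argument (my own sketch):
prev = 2  # T(1)
for n in range(2, 10):
    cur = 2 * prev + 1
    print(n, cur / prev)  # ratio T(n)/T(n-1) approaches 2
    prev = cur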

First off, all four answers are worse than O(n)... O(n*log n) is more complex than plain old O(n). What's bigger: 8 or 8 * 3, 16 or 16 * 4, etc...
On to the actual question. The general solution can obviously be evaluated in constant time if you're not doing recursion ( T(n) = 2^(n - 1) + 2^n - 1 ), so that's not what they're asking.
And as you can see, if we write the recursive code:
int T(int N)
{
    if (N == 1) return 2;
    return 2*T(N-1) + 1;
}
It's obviously O(n).
So it appears to be a badly worded question; they are probably asking you about the growth of the function itself, not the complexity of the code that computes it. That's 2^n. Now go do the rest of your homework... and study up on O(n log n).

Computing a closed form solution to the recursion is easy.
By inspection, you guess that the solution is
T(n) = 3*2^(n-1) - 1
Then you prove by induction that this is indeed a solution. Base case:
T(1) = 3*2^0 - 1 = 3 - 1 = 2. OK.
Induction:
Suppose T(n) = 3*2^(n-1) - 1. Then
T(n+1) = 2*T(n) + 1 = 2*(3*2^(n-1) - 1) + 1 = 3*2^n - 2 + 1 = 3*2^((n+1)-1) - 1. OK.
where the first equality stems from the recurrence definition,
and the second from the inductive hypothesis. QED.
3*2^(n-1) - 1 is clearly Theta(2^n), hence the right answer is the third.
To the folks who answered O(n): I couldn't agree more with Dima. The problem does not ask for the tightest upper bound on the computational complexity of an algorithm to compute T(n) (which would now be O(1), since a closed form has been provided). The problem asks for the tightest upper bound on T(n) itself, and that is the exponential one.

Related

Quadratic Equation in Big-Oh Notation

I am reviewing for a midterm test regarding Big-Oh runtime. One of the questions I have difficulty with gives the following recurrence and asks for its Big-Oh notation.
T(n) = 2T(n/2) + (2n^2 + 3n + 5)
So, using the Master Theorem case where k > log_b(a): in this question I am thinking k is 2 from (2n^2), a is 2 from 2T, and b is 2 from (n/2). Since k > log_b(a), that is 2 > log_2(2) = 1, the Master Theorem gives T(n) = O(n^2).
Is my thinking correct? I have never seen a quadratic runtime inside T(n) but I am fairly certain it is O(n^2) in this question.
Thank you for your input!
Yes, O(n^2) is correct. There is actually a similar example in the Wikipedia article about the Master theorem. The additive function could be anything; you basically compare the depth and branching of the recursion tree with the cost of that additional function and check which one dominates the complexity.
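If it helps to see it numerically, here is a small Python sketch (my own addition, assuming a base case of T(1) = 1, which the question doesn't specify): for a Θ(n^2) solution, doubling n should roughly quadruple T(n).
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = 2*T(n/2) + (2n^2 + 3n + 5), with an assumed base case T(1) = 1
    if n <= 1:
        return 1
    return 2 * T(n // 2) + 2 * n**2 + 3 * n + 5

for k in range(5, 12):
    n = 2**k
    print(n, T(2 * n) / T(n))  # ratio tends to 4, consistent with Θ(n^2)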

getting closed form of recursion equation and compare which is faster

Get the closed forms of these equations if possible. Then determine which one grows faster than the other.
f(n) = 0.25*f(n/3) + f(n/10) + log n, f(1) = 1
g(n) = n + log(n-1)^2 + 1
For these equations, do I have to expand the recursion and try to discover a pattern? I really don't know how to compute closed forms intuitively.
Short answer: g(n) > f(n).
Long answer: g is not even recursive, so you can see immediately that g(n) = Θ(n).
You can bound f from above by f(n) <= f(n/2) + log n,
which, by the master theorem, gives O(log^2 n), far below the growth of g.

Tree method with 8T(n/2) + n^2

I'm trying to solve this problem, but I don't think I have understood how to do it correctly. The first thing I do in this type of exercise is take the biggest value in the row (in this case n^2) and divide it multiple times, so I can find what kind of relation there is between the values. After finding the relation, I try to work out its value mathematically, and then, as the final step, I multiply the result by the root. In this case the result should be n^3. How is that possible?
Unfortunately @vahidreza's solution seems false to me because it contradicts the Master theorem. In terms of the Master theorem, a = 8, b = 2, c = 2. So log_b(a) = 3, hence log_b(a) > c, and thus this is the case of a recursion dominated by the subproblems, so the answer should be T(n) = Θ(n^3) rather than the O(m^(2+1/3)) which @vahidreza has.
The main issue is probably in this statement:
Also you know that the tree has log_8 m levels. Because at each level, you divide the number by 8.
Let's try to solve it properly:
On the zeroth level you have n^2 (I prefer to start counting from 0 as it simplifies notation a bit)
on the first level you have 8 nodes of (n/2)^2 or a total of 8*(n/2)^2
on the second level you have 8 * 8 nodes of (n/(2^2))^2 or a total of 8^2*(n/(2^2))^2
on the i-th level you have 8^i nodes of (n/(2^i))^2 or a total of 8^i*(n/(2^i))^2 = n^2 * 8^i/2^(2*i) = n^2 * 2^i
At each level the value n is divided by two, so at level i the value is n/2^i, and thus you'll have log_2(n) levels. So what you need to calculate is the sum, for i from 0 to log_2(n), of n^2 * 2^i. That's a geometric progression with a ratio of 2, so its sum is
Σ (n^2 * 2^i) = n^2 * Σ(2^i) = n^2 * (2^(log_2(n)+1) - 1)
Since we are talking about Θ/O we can drop the constant factor and the -1, so we need to estimate
n^2 * 2^(log_2(n))
Obviously 2^(log_2(n)) is just n, so the answer is
T(n) = Θ(n^3)
exactly as predicted by the Master theorem.
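As a cross-check, a few lines of Python (my own sketch, assuming n is a power of two and a base case T(1) = 1) compare the recurrence against the level-by-level sum:
def T(n):
    # T(n) = 8*T(n/2) + n^2, with an assumed base case T(1) = 1
    if n <= 1:
        return 1
    return 8 * T(n // 2) + n**2

n = 2**10
levels = sum(n**2 * 2**i for i in range(0, n.bit_length()))  # sum over the tree levels, i = 0 .. log_2(n)
print(T(n), levels, T(n) / n**3)  # both are Θ(n^3); the ratio to n^3 stays bounded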

Troublesome recurrence equation: T(n) = 2*T(ceil(sqrt(n))) + 1

I have recently encountered a recurrence problem:
T(n) = 2*T(ceil(sqrt(n))) + 1
T(1) = 1
I am unable to see this function terminate at all when I draw my recurrence tree. The general node in the tree has the form n^(1/2^i), which becomes 1 only when 1/2^i becomes 0. This means i should tend to infinity.
You're right that if sqrt is the ceiling of the square root, then you'll never reach 1 by repeatedly applying square roots. I'm going to assume that you meant to use the floor, which means that you will indeed eventually hit 1 as the recurrence unwinds.
In this case, your recurrence is more properly
T(1) = 1
T(n) = 2T(⌊√n⌋) + 1
A standard technique for solving recurrences involving square roots is to make a substitution. Let's define a new value k such that n = 2^k. Notice that √n = (2^k)^(1/2) = 2^(k/2). In other words, taking the square root of n is equivalent to halving the value of k. Because of this, we can convert the above recurrence, which involves square roots, into a new recurrence that more closely matches the form used by the Master Theorem and other recurrence-solving techniques. Specifically, let's define S(k) = T(2^k). Then we get the recurrence
S(0) = 1
S(k) = 2S(⌊k / 2⌋) + 1
It's a lot easier to see how to solve this recurrence. Either by recognizing this recurrence from elsewhere or by using the Master Theorem, we get that S(k) = Θ(k). Now, we wanted to solve for T(n), so we can use the fact that S(k) = T(2^k) = T(n). Since S(k) = Θ(k), we now see that T(n) = Θ(k). Since we chose k such that 2^k = n, this means that k = lg n. Therefore, T(n) = Θ(lg n), so the recurrence works out to Θ(log n).
Hope this helps!
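If you want to check the Θ(log n) bound empirically, here is a small Python sketch (my own addition) evaluating the floor-square-root recurrence directly:
import math

def T(n):
    # T(1) = 1, T(n) = 2*T(floor(sqrt(n))) + 1
    if n <= 1:
        return 1
    return 2 * T(math.isqrt(n)) + 1

for k in (10, 100, 1000, 10**6, 10**12):
    print(k, T(k), math.log2(k))  # T(k) stays within a small constant factor of log2(k)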

Recurrence equation for a recursive power function

I am trying to perform asymptotic analysis on the following recursive function, which computes a power of a number efficiently. I am having trouble determining the recurrence equation because there are different cases for when the power is odd and when it is even, and I am unsure how to handle this. I understand that the running time is Θ(log n), so any advice on how to arrive at that result would be appreciated.
Recursive-Power(x, n):
    if n == 1
        return x
    if n is even
        y = Recursive-Power(x, n/2)
        return y*y
    else
        y = Recursive-Power(x, (n-1)/2)
        return y*y*x
In either case (n even or odd), the following recurrence holds:
T(n) = T(floor(n/2)) + Θ(1)
where floor(x) is the biggest integer not greater than x.
Since the floor has no influence on the asymptotic result, the recurrence is usually written informally as
T(n) = T(n/2) + Θ(1)
You have guessed the asymptotic bound correctly. The result can be proved using the substitution method or the Master theorem; that part is left as an exercise for you.
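For a concrete check, here is a direct Python translation of the pseudocode (my own sketch) with an added call counter; the number of recursive calls grows like log2(n):
def recursive_power(x, n, calls):
    # runnable version of Recursive-Power with a call counter (calls is a one-element list)
    calls[0] += 1
    if n == 1:
        return x
    if n % 2 == 0:
        y = recursive_power(x, n // 2, calls)
        return y * y
    y = recursive_power(x, (n - 1) // 2, calls)
    return y * y * x

for n in (2, 16, 1024, 10**6):
    calls = [0]
    recursive_power(2, n, calls)
    print(n, calls[0])  # grows like log2(n), i.e. Θ(log n)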
