Troublesome recurrence equation: T(n) = 2*T(ceil(sqrt(n))) + 1

I have recently encountered a recurrence problem:
T(n) = 2*T(ceil(sqrt(n))) + 1
T(1) = 1
I am unable to see this function terminate at all when I draw my recurrence tree. The general node in the tree has the form n^(1/2^i), which becomes 1 only when 1/2^i becomes 0. This means i should tend to infinity.

You're right that if sqrt is the ceiling of the square root, then you'll never reach 1 by repeatedly applying square roots. I'm going to assume that you meant to use the floor, which means that you will indeed eventually hit 1 as the recurrence unwinds.
In this case, your recurrence is more properly
T(1) = 1
T(n) = 2T(⌊√n⌋) + 1
A standard technique for solving recurrences involving square roots is to make a substitution. Let's define a new value k such that n = 2^k. Notice that √n = (2^k)^(1/2) = 2^(k/2). In other words, taking the square root of n is equivalent to halving the value of k. Because of this, we can convert the above recurrence, which involves square roots, into a new recurrence that more closely matches the form used by the Master Theorem and other recurrence-solving techniques. Specifically, let's define S(k) = T(2^k). Then we get the recurrence
S(0) = 1
S(k) = 2S(⌊k / 2⌋) + 1
It's a lot easier to see how to solve this recurrence. Either by recognizing it from elsewhere or by using the Master Theorem, we get that S(k) = Θ(k). Now, since S(k) = T(2^k) = T(n) and S(k) = Θ(k), we see that T(n) = Θ(k). Because we chose k such that 2^k = n, we have k = lg n. Therefore, T(n) = Θ(log n).
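If you want to sanity-check this numerically, here is a small sketch of my own (not part of the original answer) that evaluates the recurrence directly and watches T(n)/log2(n) stay bounded:

import math

def T(n):
    # T(1) = 1; T(n) = 2*T(floor(sqrt(n))) + 1
    if n == 1:
        return 1
    return 2 * T(math.isqrt(n)) + 1

# If T(n) = Theta(log n), this ratio stays bounded as n grows.
for k in [2, 4, 8, 16, 32, 64]:
    n = 2 ** k
    print(n, T(n), T(n) / math.log2(n))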
Hope this helps!

Related

Tree method with 8T(n/2) + n^2

I'm trying to solve this problem, but I don't think I've understood how to do it correctly. The first thing I do in this type of exercise is take the biggest term in the recurrence (in this case n^2) and divide it repeatedly, so I can find the relation between the values at each level. Once I've found the relation, I compute its value mathematically, and then, as the final step, I multiply the result by the root. In this case the result should be n^3. How is that possible?
Unfortunately, @vahidreza's solution seems wrong to me because it contradicts the Master theorem. In terms of the Master theorem, a = 8, b = 2, c = 2. So log_b(a) = 3, which means log_b(a) > c, and thus this is the case of a recursion dominated by the subproblems. The answer should therefore be T(n) = Ө(n^3) rather than the O(m^(2+1/3)) that @vahidreza has.
The main issue is probably in this statement:
"Also you know that the tree has log_8 m levels. Because at each level, you divide the number by 8."
Let's try to solve it properly:
On the zeroth level you have n^2 (I prefer to start counting from 0 as it simplifies notation a bit)
on the first level you have 8 nodes of (n/2)^2 or a total of 8*(n/2)^2
on the second level you have 8 * 8 nodes of (n/(2^2))^2 or a total of 8^2*(n/(2^2))^2
on the i-th level you have 8^i nodes of (n/(2^i))^2 or a total of 8^i*(n/(2^i))^2 = n^2 * 8^i/2^(2*i) = n^2 * 2^i
At each level the value n is divided by two, so at level i the value is n/2^i, and thus you'll have log_2(n) levels. So what you need to calculate is the sum, for i from 0 to log_2(n), of n^2 * 2^i. That's a geometric progression with a ratio of 2, so its sum is
Σ (n^2 * 2^i) = n^2 * Σ(2^i) = n^2 * (2^(log_2(n)+1) - 1)
Since we are talking about Ө/O we can ignore constants and so we need to estimate
n^2 * 2^log_2(n)
Obviously 2^log_2(n) is just n so the answer is
T(n) = Ө(n^3)
exactly as predicted by the Master theorem.
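As a quick numerical cross-check (my own sketch, not part of the answer), you can evaluate the recurrence at powers of two and watch T(n)/n^3 settle toward a constant:

def T(n):
    # T(1) = 1 (an arbitrary base case); T(n) = 8*T(n/2) + n^2
    if n <= 1:
        return 1
    return 8 * T(n // 2) + n ** 2

# If T(n) = Ө(n^3), this ratio converges (here to 2, since T(2^k) = 2*8^k - 4^k).
for k in range(1, 11):
    n = 2 ** k
    print(n, T(n), T(n) / n ** 3)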

Solving the recurrence T(n) = 3T(n / 2) + n

I'm studying for an exam and I don't know how to get the height of the recursion tree generated by this relation:
T(n) = 3T(n/2) + n
I know the tree will look like this and that I have to add all the terms:
            c*n
          /  |  \
         /   |   \
    c*n/2  c*n/2  c*n/2
     ...    ...    ...
Thank you!!
When you have a recurrence relation of the form
T(n) = aT(n / b) + f(n)
then the height of the recursion tree depends only on the choice of b (assuming, of course, that a > 0). The reason for this is that each node in the tree represents what happens when you expand out the above recurrence, and the only place in the recurrence where you can expand something is in the T(n / b) term. If you increase or decrease a, you will increase or decrease the branching factor of the tree (for example, 2T(n / b) means that two nodes are generated when you expand a node, and 3T(n / b) means that three nodes are generated), but the branching factor is independent of the number of levels; it only tells you how many children each expanded node has. Similarly, changing f(n) will only increase or decrease the total amount of work done at each node, which doesn't impact the shape of the recursion tree.
So how specifically does b impact the height of the tree? Well, in this recurrence relation, each time we expand out T, we end up dividing the size of the input by a factor of b. This means that at the top level of the tree we'll have problems of size n. Below that are problems of size n / b. Below that are problems of size (n / b) / b = n / b^2. Generally, at level k of the tree, the problem size will be n / b^k. The recursion stops when the problem size drops to 0 or 1, which happens when k = log_b n. In other words, the height of the recursion tree is O(log_b n).
Now, just knowing the tree height won't tell you the total amount of work done, because we also need to know the branching factor and the work done per level. There are a lot of different ways that these can interact with one another, but fortunately there's a beautiful theorem called the master theorem that lets you read off the solution pretty elegantly by just looking at a, b, and f(n). In your case, the recurrence is
T(n) = 3T(n / 2) + O(n)
Plugging this into the master theorem, we see that the solution to the recurrence is T(n) = O(n^(log_2 3)). This is approximately O(n^1.58).
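To make the bound concrete, here is a small check of my own (not from the original answer) comparing T(n) against n^(log_2 3) at powers of two:

import math

def T(n):
    # T(1) = 1 (an arbitrary base case); T(n) = 3*T(n/2) + n
    if n <= 1:
        return 1
    return 3 * T(n // 2) + n

alpha = math.log2(3)  # about 1.585, the master-theorem exponent
# If T(n) = O(n^alpha), this ratio stays bounded (it approaches 3 here).
for k in range(2, 21, 3):
    n = 2 ** k
    print(n, T(n), T(n) / n ** alpha)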

Solving recurrence T(n) = T(n/5) + T(7n/10) + Θ(n)

I want to solve this recurrence with an accuracy of Θ:
T(n) = T(n/5) + T(7n/10) + Θ(n)
I can solve typical recurrences, but I don't know what to do with this one, as it doesn't match any case of the master theorem.
Any help or hint?
You can use the Akra-Bazzi generalization of the master theorem.
If the Theta(n) term were replaced by something smaller, like Theta(1) or Theta(sqrt(n)), you could simply find the value of alpha such that n^alpha = (n/5)^alpha + (7n/10)^alpha by factoring out n^alpha. However, if you do that here, you get alpha ≈ 0.84, and n^0.84 is asymptotically smaller than Theta(n), so the contribution from the Theta(n) terms dominates if you iterate the recursion until the arguments are small. The result is that T(n) is Theta(n), though with different constants than the Theta(n) in the recursion.
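If it helps, the exponent alpha can be found numerically; this is a bisection sketch of my own, not something from the answer:

def f(a):
    # Root of (1/5)^a + (7/10)^a = 1, obtained by factoring n^a out of
    # n^a = (n/5)^a + (7n/10)^a.
    return 0.2 ** a + 0.7 ** a - 1

lo, hi = 0.0, 1.0  # f(0) = 1 > 0 and f(1) = -0.1 < 0, so the root lies between
for _ in range(60):
    mid = (lo + hi) / 2
    if f(mid) > 0:
        lo = mid
    else:
        hi = mid

print((lo + hi) / 2)  # about 0.84, as stated above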

Asymptotic analysis of a recursive power function

I am trying to perform asymptotic analysis on the following recursive function, an efficient way to raise a number to a power. I am having trouble determining the recurrence equation because there are different equations for when the power is odd and when it is even, and I am unsure how to handle this situation. I understand that the running time is Θ(log n), so any advice on how to reach this result would be appreciated.
Recursive-Power(x, n):
    if n == 1
        return x
    if n is even
        y = Recursive-Power(x, n/2)
        return y*y
    else
        y = Recursive-Power(x, (n-1)/2)
        return y*y*x
In any case, the following condition holds:
T(n) = T(floor(n/2)) + Θ(1)
where floor(x) is the biggest integer not greater than x.
Since the floor doesn't influence the asymptotic result, the equation is informally written as:
T(n) = T(n/2) + Θ(1)
You have guessed the asymptotic bound correctly. The result can be proved using the substitution method or the Master theorem. It is left as an exercise for you.
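As an exercise aid, here is the pseudocode turned into runnable Python with a call counter (my own translation; the counter argument is not in the original), showing that the number of calls is floor(log2 n) + 1:

import math

def recursive_power(x, n, counter):
    # Direct translation of Recursive-Power; counter[0] tracks the call count.
    counter[0] += 1
    if n == 1:
        return x
    y = recursive_power(x, n // 2, counter)  # n // 2 equals (n-1)/2 when n is odd
    return y * y if n % 2 == 0 else y * y * x

for n in [10, 100, 1000, 10 ** 6]:
    counter = [0]
    recursive_power(2, n, counter)
    print(n, counter[0], math.floor(math.log2(n)) + 1)  # the two counts match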

Recursion and Big O

I've been working through a recent Computer Science homework involving recursion and big-O notation. I believe I understand this pretty well (certainly not perfectly, though!), but there is one question in particular that is giving me the most trouble. The odd thing is that, at a glance, it looks to be the simplest one on the homework.
Provide the best rate of growth, using big-Oh notation, for the solution to the following recurrence:
T(1) = 2
T(n) = 2T(n - 1) + 1 for n>1
And the choices are:
O(n log n)
O(n^2)
O(2^n)
O(n^n)
I understand that big O works as an upper bound, describing the largest amount of calculation, or the longest running time, that a program or process will take. I feel like this particular recursion should be O(n), since, at most, the recursion occurs only once for each value of n. Since O(n) isn't available, it's either better than that, O(n log n), or worse, one of the other three options.
So, my question is: Why isn't this O(n)?
There are a couple of different ways to solve recurrences: substitution, recursion tree, and the master theorem. The master theorem won't work in this case, because the recurrence doesn't fit the master theorem form.
You could use the other two methods, but the easiest way for this problem is to solve it iteratively.
T(n) = 2T(n-1) + 1
T(n) = 4T(n-2) + 2 + 1
T(n) = 8T(n-3) + 4 + 2 + 1
T(n) = ...
See the pattern?
T(n) = 2^(n-1)⋅T(1) + 2^(n-2) + 2^(n-3) + ... + 1
T(n) = 2^(n-1)⋅2 + 2^(n-2) + 2^(n-3) + ... + 1
T(n) = 2^n + 2^(n-2) + 2^(n-3) + ... + 1
Therefore, the tightest bound is Θ(2^n).
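You can verify this directly (a quick check of my own): compute T(n) from the recurrence and compare it with 2^n.

def T(n):
    # T(1) = 2; T(n) = 2*T(n-1) + 1
    return 2 if n == 1 else 2 * T(n - 1) + 1

for n in range(1, 11):
    print(n, T(n), T(n) / 2 ** n)  # the ratio settles at 1.5, so T(n) = Θ(2^n)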
I think you have misunderstood the question a bit. It does not ask you how long it would take to solve the recurrence. It is asking what the big-O (the asymptotic bound) of the solution itself is.
What you have to do is to come up with a closed form solution, i. e. the non-recursive formula for T(n), and then determine what the big-O of that expression is.
The question is asking for the big-Oh notation for the solution to the recurrence, not the cost of calculating the recurrence.
Put another way: the recurrence produces:
1 -> 2
2 -> 5
3 -> 11
4 -> 23
5 -> 47
What big-Oh notation best describes the sequence 2, 5, 11, 23, 47, ...?
The correct way to solve that is to solve the recurrence equations.
I think this will be exponential. Each increment to n makes the value roughly twice as large:
T(2) = 2 * T(1) + 1 = 5
T(3) = 2 * T(2) + 1 = 11
...
T(x) would be the running time of the following program (for example):
def fn(x):
    if x == 1:
        return  # a constant time
    # do the calculation for n - 1 twice
    fn(x - 1)
    fn(x - 1)
"I think this will be exponential. Each increment to n brings twice as much calculation."
No, it doesn't. Quite on the contrary:
Consider that for n iterations, we get running time R. Then for n + 1 iterations we'll get exactly R + 1.
Thus, the growth rate is constant and the overall runtime is indeed O(n).
However, I think Dima's assumption about the question is right although his solution is overly complicated:
"What you have to do is to come up with a closed form solution, i.e. the non-recursive formula for T(n), and then determine what the big-O of that expression is."
It's sufficient to examine the relative sizes of T(n) and T(n + 1) and determine the growth rate from that. The value obviously doubles at each step, which directly gives the asymptotic growth.
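To illustrate the distinction both answers are drawing (my own sketch): computing T(n) with a simple loop takes n - 1 steps, while the value itself roughly doubles at every step.

def T_iterative(n):
    # Apply T(k) = 2*T(k-1) + 1 starting from T(1) = 2; count loop steps.
    value, steps = 2, 0
    for _ in range(n - 1):
        value = 2 * value + 1
        steps += 1
    return value, steps

for n in [5, 10, 20]:
    value, steps = T_iterative(n)
    print(n, steps, value)  # steps grow linearly, the value exponentially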
First off, all four answers are worse than O(n)... O(n*log n) is more complex than plain old O(n). What's bigger: 8 or 8 * 3, 16 or 16 * 4, etc...
On to the actual question. The recurrence obviously has a closed form that can be evaluated in constant time without any recursion (T(n) = 2^(n-1) + 2^n - 1), so that's not what they're asking.
And as you can see, if we write the recursive code:
int T( int N )
{
    if (N == 1) return 2;
    return 2*T(N-1) + 1;
}
It's obviously O(n).
So, it appears to be a badly worded question, and they are probably asking you for the growth of the function itself, not the complexity of the code. That's 2^n. Now go do the rest of your homework... and study up on O(n * log n).
Computing a closed form solution to the recursion is easy.
By inspection, you guess that the solution is
T(n) = 3*2^(n-1) - 1
Then you prove by induction that this is indeed a solution. Base case:
T(1) = 3*2^0 - 1 = 3 - 1 = 2. OK.
Induction:
Suppose T(n) = 3*2^(n-1) - 1. Then
T(n+1) = 2*T(n) + 1 = 3*2^n - 2 + 1 = 3*2^((n+1)-1) - 1. OK.
where the first equality stems from the recurrence definition,
and the second from the inductive hypothesis. QED.
3*2^(n-1) - 1 is clearly Theta(2^n), hence the right answer is the third.
To the folks that answered O(n): I couldn't agree more with Dima. The problem does not ask for the tightest upper bound on the computational complexity of an algorithm that computes T(n) (which would now be O(1), since its closed form has been provided). The problem asks for the tightest upper bound on T(n) itself, and that is the exponential one.
