Induction on recursive problems - recursion

Let 𝑇(𝑛) be defined recursively as follows: 𝑇(1) = 𝑐 and 𝑇(𝑛) = 𝑇(⌊n/2⌋) + 𝑐 for all integers 𝑛 ≥ 2, where 𝑐 is an arbitrary positive constant. Prove by induction on 𝑛 that 𝑇(𝑛) ≤ 𝑐log𝑛 + 𝑐. (Note: ⌊𝑥⌋ is the floor function, defined as rounding down 𝑥 to the closest integer that is ≤ 𝑥.)
Can anyone help me out step by step with this example?
Thanks in advance!

T(1) = 𝑐 and T(n) = T(⌊n/2⌋) + 𝑐
We need to prove that T(n) <= c * log(2, n) + c
Simplest proof
The recursion T(n) = T(⌊n/2⌋) + c needs ⌊log(2, n)⌋ steps to reach T(1), because the argument is halved at every step, and repeatedly halving n down to 1 takes ⌊log(2, n)⌋ steps.
Since we add c each time we go one level down in the recursion, the ⌊log(2, n)⌋ steps contribute c * ⌊log(2, n)⌋ in total.
So, since we have reached T(1), we have
c * ⌊log(2, n)⌋ + T(1) =
c * ⌊log(2, n)⌋ + c
and it is trivial that
c * ⌊log(2, n)⌋ + c <= c * log(2, n) + c
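As a quick sanity check on the counting argument above (not a proof), here is a small Python sketch; c = 5 is an arbitrary choice of the positive constant:

```python
import math

c = 5  # arbitrary positive constant from the problem statement

def T(n):
    """T(1) = c; T(n) = T(floor(n/2)) + c for n >= 2."""
    return c if n == 1 else T(n // 2) + c

# The claimed bound T(n) <= c*log2(n) + c holds for every n checked.
for n in range(1, 2049):
    assert T(n) <= c * math.log2(n) + c + 1e-9
```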
How to prove this with induction
Since the argument is divided by 2 at each step, it is clear that if ⌊log(2, i)⌋ = ⌊log(2, j)⌋, then T(i) = T(j): the exact argument value is not important by itself; it is the number of steps that determines the multiple of c.
And since ⌊log(2, i)⌋ = ⌊log(2, j)⌋ in our condition above, T(i) and T(j) need exactly the same number of steps, which proves that
if i = 2^p and i < j < 2^(p + 1), then
T(i) = T(j) = c * ⌊log(2, i)⌋ + c and it is true that
c * ⌊log(2, i)⌋ + c <= c * log(2, i) + c < c * log(2, j) + c
Hence, exact equality occurs when the parameter is a power of 2, while otherwise T(j) is strictly smaller than c * log(2, j) + c.
As a result, it is enough to prove that
statement(2^n) => statement(2^(n + 1))
T(2^(n + 1)) =
T(2 ^ n) + c =
c * log(2, 2^n) + c + c =
c * log(2, 2^n) + c * log(2, 2) + c =
c * (log(2, 2^n) + log(2, 2)) + c =
c * (log(2, 2 * 2^n)) + c =
c * (log(2, 2^(n + 1))) + c, which, by the principle of mathematical induction, proves our statement.
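The induction result can also be checked numerically. A short Python sketch, with c = 7 as an arbitrary positive constant, confirms that the bound is tight at powers of 2:

```python
c = 7  # arbitrary positive constant

def T(n):
    """T(1) = c; T(n) = T(floor(n/2)) + c for n >= 2."""
    return c if n == 1 else T(n // 2) + c

# At powers of two the inequality is an equality: T(2^p) = c*p + c.
for p in range(20):
    assert T(2 ** p) == c * p + c
```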

Related

Solving the recurrence relation T(n) = 2T(n/2)+1 with recursion

I'm trying to find the big O of T(n) = 2T(n/2) + 1. I figured out that it is O(n) with the Master Theorem but I'm trying to solve it with recursion and getting stuck.
My solving process is
T(n) = 2T(n/2) + 1 <= c * n
(we know that T(n/2) <= c * n/2)
2T(n/2) + 1 <= 2 * c * n/2 +1 <= c * n
Now I get that 1 <= 0.
I thought about saying that
T(n/2) <= c/2 * n/2
But is it right to do so? I don't know if I've seen it before so I'm pretty stuck.
I'm not sure what you mean by "solve it with recursion". What you can do is to unroll the equation.
So first you can think of n as a power of 2: n = 2^k.
Then you can rewrite your recurrence equation as T(2^k) = 2T(2^(k-1)) + 1.
Now it is easy to unroll this:
T(2^k) = 2 T(2^(k-1)) + 1
= 2 (2 T(2^(k-2)) + 1) + 1
= 2 (2 (2 T(2^(k-3)) + 1) + 1) + 1
= 2 (2 (2 (2 T(2^(k-4)) + 1) + 1) + 1) + 1
= ...
This goes down to k = 0 and hits the base case T(1) = a. In most cases one uses a = 0 or a = 1 for exercises, but it could be anything.
If we now get rid of the parentheses the equation looks like this:
T(n) = 2^k * a + 2^(k-1) + 2^(k-2) + ... + 2^1 + 2^0
From the beginning we know that 2^k = n, and we know 2^(k-1) + ... + 2 + 1 = 2^k - 1. See this as a binary number consisting of only 1s, e.g. 111 = 1000 - 1.
So this simplifies to
T(n) = 2^k * a + 2^k - 1
= n * a + n - 1
= n * (a + 1) - 1
And now one can see that T(n) is in O(n) as long as a is a constant.
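As a quick numerical check, assuming a hypothetical base case value a, computing T straight from the recurrence confirms the linear growth:

```python
def T(n, a=1):
    """T(1) = a; T(n) = 2*T(n/2) + 1, for n a power of two."""
    return a if n == 1 else 2 * T(n // 2, a) + 1

# The unrolled sum collapses to T(n) = n*(a + 1) - 1;
# with a = 1 that is 2n - 1, which is linear in n.
for a in (0, 1, 5):
    for k in range(14):
        n = 2 ** k
        assert T(n, a) == n * (a + 1) - 1
```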

Induction Proof $T(n) = 9T(n/3) + n^2$

How can I prove that the reccurence
T(n) = 9T(n/3) + n^2
leads to T(n) = O(n^2 log(n)) using the substitution method and a proof by induction? I'm not allowed to use the Master Theorem.
Using induction and assuming T(n) ≤ cn^2 log n, I got to the following point:
T(n) = 9 * T(n/3) + n^2
≤ 9c (n^2 / 9 + log(n/3)) + n^2
= cn^2 + 9c log(n/3) + n^2
Thank you.
I think you've made a math error in your substitution. If we assume that T(n) ≤ cn^2 log n, then we'd get
T(n) = 9T(n / 3) + n^2
≤ 9(c(n / 3)^2 log(n / 3)) + n^2
= 9((1 / 9)cn^2 log(n / 3)) + n^2
= cn^2 log(n / 3) + n^2
You're very close to having things complete at this point. As a hint, suppose that the logarithm is a base-3 logarithm. What happens if you then use properties of logarithms to simplify cn^2 log(n / 3)?
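A minimal Python check of this recurrence, assuming a hypothetical base case T(1) = a: for n = 3^k the recurrence unrolls to exactly n^2 * (log3(n) + a), consistent with the O(n^2 log n) bound.

```python
a = 1  # hypothetical base case T(1) = a

def T(n):
    """T(1) = a; T(n) = 9*T(n/3) + n^2, for n a power of three."""
    return a if n == 1 else 9 * T(n // 3) + n * n

# For n = 3^k the recurrence unrolls to T(n) = n^2 * (k + a),
# i.e. T(n) = n^2 * (log3(n) + a), which is Theta(n^2 log n).
for k in range(10):
    n = 3 ** k
    assert T(n) == n * n * (k + a)
```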

Solving the recurrence equation T(n) = 3 + m * T(n - m)

I have a Computer Science midterm tomorrow and I need help determining the complexity of the recursive function below, which is more complicated than the ones I've already worked on: it has two variables.
T(n) = 3 + mT(n-m)
In simpler cases, where m is a constant, the formula can easily be obtained by unpacking the relation; in this case, however, unpacking doesn't make life any easier (let's say T(0) = c):
T(n) = 3 + mT(n-m)
T(n-1) = 3 + mT(n-m-1)
T(n-2) = 3 + mT(n-m-2)
...
Obviously, there's no straightforward elimination from these equations, so I'm wondering whether I should use another technique for such cases.
Don't worry about m; it is just a constant parameter. However, you're unrolling your recursion incorrectly. Each step of unrolling involves three operations:
Taking the value of T at an argument that is m less
Multiplying it by m
Adding the constant 3
So, it will look like this:
T(n) = m * T(n - m) + 3 = (Step 1)
= m * (m * T(n - 2*m) + 3) + 3 = (Step 2)
= m * (m * (m * T(n - 3*m) + 3) + 3) + 3 = ... (Step 3)
and so on. Unrolling T(n) up to step k gives the following formula:
T(n) = m^k * T(n - k*m) + 3 * (1 + m + m^2 + m^3 + ... + m^(k-1))
Now you set n - k*m = 0 to use the initial condition T(0) and get:
k = n / m
Now you need to use a formula for the sum of geometric progression - and finally you'll get a closed formula for the T(n) (I'm leaving that final step to you).
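The unrolled formula can be verified in a few lines of Python, assuming a hypothetical base case T(0) = c0 and n a multiple of m; with k = n/m and m >= 2, the geometric series sums to T(n) = m^k * c0 + 3 * (m^k - 1) / (m - 1):

```python
def T(n, m, c0):
    """T(0) = c0; T(n) = 3 + m*T(n - m), for n a positive multiple of m."""
    return c0 if n == 0 else 3 + m * T(n - m, m, c0)

# Check the closed form against the recurrence for several m and k.
for m in range(2, 6):
    for k in range(8):
        n = k * m
        assert T(n, m, 4) == m ** k * 4 + 3 * (m ** k - 1) // (m - 1)
```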

computational complexity of T(n)=T(n-1)+n

I have to calculate the computational complexity and computational complexity class of T(n) = T(n-1) + n.
My problem is that I don't know any method to do so and the only one I'm familiar with is universal recursion which doesn't apply to this task.
T(0) = a
T(n) = T(n-1) + n
n    T(n)
---------
0    a
1    T(1-1) + 1 = a + 1
2    T(2-1) + 2 = a + 1 + 2
3    T(3-1) + 3 = a + 1 + 2 + 3
...
k    T(k-1) + k = a + 1 + 2 + ... + k
                = a + k(k+1)/2
Guess T(n) = O(n^2) based on the above. We can prove it by induction.
Base case: T(1) = T(0) + 1 = a + 1 <= c*1^2 provided that c >= a + 1.
Induction hypothesis: assume T(n) <= c*n^2 for all n up to and including k.
Induction step: show that T(k+1) <= c*(k+1)^2. We have
T(k+1) = T(k) + k + 1 <= c*k^2 + k + 1
<= c*k^2 + 2k + 1 // provided k >= 0
<= c*(k^2 + 2k + 1) // provided c >= 1
= c*(k+1)^2
We know k >= 0, and we can choose c to be the greater of a + 1 and 1; since T(0) = a is nonnegative, that is a + 1.
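Both the table's closed form and the induction bound can be checked with a short Python sketch; a = 2 is a hypothetical choice of the base case:

```python
a = 2  # hypothetical base case T(0) = a

def T(n):
    """T(0) = a; T(n) = T(n-1) + n."""
    return a if n == 0 else T(n - 1) + n

# Closed form from the table: T(n) = a + n*(n+1)/2, hence O(n^2).
c = a + 1  # the constant chosen in the induction proof
for n in range(200):
    assert T(n) == a + n * (n + 1) // 2
    if n >= 1:
        assert T(n) <= c * n * n
```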

Calculating complexity of recurrence

I am having trouble understanding the concept of recurrences. Given T(n) = 2T(n/2) + 1, how do you calculate the complexity of this relationship? I know that in mergesort the relationship is T(n) = 2T(n/2) + cn, and you can see that you have a tree with depth log2(n) and cn work at each level. But I am unsure how to proceed given a generic function. Are there any tutorials available that clearly explain this?
The solution to your recurrence is T(n) ∈ Θ(n).
Let's expand the formula:
T(n) = 2*T(n/2) + 1. (Given)
T(n/2) = 2*T(n/4) + 1. (Replace n with n/2)
T(n/4) = 2*T(n/8) + 1. (Replace n with n/4)
T(n) = 2*(2*T(n/4) + 1) + 1 = 4*T(n/4) + 2 + 1. (Substitute)
T(n) = 2*(2*(2*T(n/8) + 1) + 1) + 1 = 8*T(n/8) + 4 + 2 + 1. (Substitute)
And do some observations and analysis:
We can see a pattern emerge: T(n) = 2^k * T(n/2^k) + (2^k − 1).
Now, let k = log2(n). Then n = 2^k.
Substituting, we get: T(n) = n * T(n/n) + (n − 1) = n * T(1) + n − 1.
For at least one n, we need to give T(n) a concrete value. So we suppose T(1) = 1.
Therefore, T(n) = n * 1 + n − 1 = 2*n − 1, which is in Θ(n).
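The result is easy to confirm in Python for powers of two, using the assumed base case T(1) = 1:

```python
def T(n):
    """T(1) = 1; T(n) = 2*T(n/2) + 1, for n a power of two."""
    return 1 if n == 1 else 2 * T(n // 2) + 1

# The expansion gives T(n) = 2n - 1 for powers of two, i.e. Theta(n).
for k in range(20):
    n = 2 ** k
    assert T(n) == 2 * n - 1
```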
Resources:
https://www.cs.duke.edu/courses/spring05/cps100/notes/slides07-4up.pdf
http://www.cs.duke.edu/~ola/ap/recurrence.html
However, for routine work, the normal way to solve these recurrences is to use the Master theorem.