computational complexity of T(n)=T(n-1)+n - recursion

I have to calculate the computational complexity and computational complexity class of T(n) = T(n-1) + n.
My problem is that I don't know any method for doing so; the only one I'm familiar with is the universal recurrence theorem, which doesn't apply to this task.

T(0) = a
T(n) = T(n-1) + n
n  T(n)
---------
0  a
1  T(0) + 1 = a + 1
2  T(1) + 2 = a + 1 + 2
3  T(2) + 3 = a + 1 + 2 + 3
...
k  T(k-1) + k = a + 1 + 2 + ... + k
             = a + k(k+1)/2
Guess T(n) = O(n^2) based on the above. We can prove it by induction.
Base case: T(1) = T(0) + 1 = a + 1 <= c*1^2 provided that c >= a + 1.
Induction hypothesis: assume T(n) <= c*n^2 for all n up to and including k.
Induction step: show that T(k+1) <= c*(k+1)^2. We have
T(k+1) = T(k) + k + 1 <= c*k^2 + k + 1
<= c*k^2 + 2k + 1 // provided k >= 0
<= c*(k^2 + 2k + 1) // provided c >= 1
= c*(k+1)^2
We know k >= 0, and we can choose c to be the greater of a + 1 and 1; assuming T(0) = a is nonnegative, that greater value is simply a + 1.
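As a quick sanity check, a short Python sketch (with an arbitrarily chosen base value a, since the question leaves it unspecified) compares the unrolled recurrence against a + n(n+1)/2:

# Sketch: T(0) = a, T(n) = T(n-1) + n, checked against the closed form a + n*(n+1)/2.
def T(n, a):
    total = a
    for i in range(1, n + 1):  # unroll the recurrence iteratively
        total += i
    return total

a = 7  # arbitrary nonnegative base value, chosen only for illustration
for n in range(20):
    assert T(n, a) == a + n * (n + 1) // 2
print("a + n*(n+1)/2 matches the recurrence for n = 0..19")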

Related

Solving the recurrence relation T(n) = 2T(n/2)+1 with recursion

I'm trying to find the big O of T(n) = 2T(n/2) + 1. I figured out that it is O(n) with the Master Theorem but I'm trying to solve it with recursion and getting stuck.
My solving process is
T(n) = 2T(n/2) + 1 <= c * n
(we know that T(n/2) <= c * n/2)
2T(n/2) + 1 <= 2 * c * n/2 +1 <= c * n
Now I get that 1 <= 0.
I thought about saying that
T(n/2) <= c/2 * n/2
But is it right to do so? I don't know if I've seen it before so I'm pretty stuck.
I'm not sure what you mean by "solve it with recursion". What you can do is to unroll the equation.
So first you can think of n as a power of 2: n = 2^k.
Then you can rewrite your recurrence equation as T(2^k) = 2T(2^(k-1)) + 1.
Now it is easy to unroll this:
T(2^k) = 2 T(2^(k-1)) + 1
= 2 (2 T(2^(k-2)) + 1) + 1
= 2 (2 (2 T(2^(k-3)) + 1) + 1) + 1
= 2 (2 (2 (2 T(2^(k-4)) + 1) + 1) + 1) + 1
= ...
This goes down until the argument reaches 2^0 = 1 and hits the base case T(1) = a. In most exercises one uses a = 0 or a = 1, but it could be any constant.
If we now get rid of the parentheses the equation looks like this:
T(n) = 2^k * a + 2^(k-1) + 2^(k-2) + ... + 2^1 + 2^0
From the beginning we know that 2^k = n, and we know 2^(k-1) + ... + 2 + 1 = 2^k - 1. See this as a binary number consisting of only 1s, e.g. 111 = 1000 - 1.
So this simplifies to
T(n) = 2^k * a + 2^k - 1
= n * a + n - 1
= n * (a + 1) - 1
And now one can see that T(n) is in O(n) as long as a is a constant.
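The closed form is easy to check numerically; the sketch below evaluates the recurrence for powers of two with an arbitrarily chosen constant a and compares it with n * (a + 1) - 1:

# Sketch: T(1) = a, T(n) = 2*T(n/2) + 1 for n a power of two,
# checked against the closed form n*(a + 1) - 1.
def T(n, a):
    if n == 1:
        return a
    return 2 * T(n // 2, a) + 1

a = 3  # arbitrary constant base value
for k in range(11):
    n = 2 ** k
    assert T(n, a) == n * (a + 1) - 1
print("T(n) = n*(a + 1) - 1 holds for n = 2^0 .. 2^10")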

Substitution method for solving recurrences

I have recently had trouble understanding the substitution method for solving recurrences. I watched a few online lectures about it, but sadly they did not tell me much (in one of them I heard that it is based on guessing, which made me even more confused), and I am looking for some tips. My objective is to solve three different recurrence functions using the substitution method, find their time complexity, and compute their values for T(32).
Function 1 is defined as:
T(1) = 1
T(n) = T(n-1) + n for n > 1
I started off by listing first few executions:
T(2) = T(2-1)+2 = 1+2
T(3) = T(3-1)+3 = 1+2+3
T(4) = T(4-1)+4 = 1+2+3+4
T(5) = T(5-1)+5 = 1+2+3+4+5
...
T(n) = 1+2+...+(n-1)+n = n(n+1)/2
Then I proved by induction that the formula for the sum of the first n natural numbers gives T(1) = 1 in the base case, and that it also holds for n+1. It was pretty clear to me, but I am not sure whether this is the substitution method. Also, knowing the formula T(n) = n(n+1)/2, I easily calculated T(32) = 528 and worked out the time complexity, which is O(n^2).
In examples (2) and (3) I only need a solution for n = 2^k where k is a natural number, but it would be nice if you could recommend any articles showing how to get these for all n as well (though I suppose that is much harder).
Function 2 is defined as:
T(1) = 0
T(n) = T(n/2) + 1 for even n > 1
T(n) = T((n+1)/2) + 1 for odd n > 1
Since I am allowed to prove it only for n = 2^k, and based on what I have learned so far, I tried it the following way:
T(n) = T(n/2) + 1
= T(n/4) + 1 + 1 = T(n/4) + 2
= T(n/8) + 1 + 2 = T(n/8) + 3
= T(n/16) + 1 + 3 = T(n/16) + 4
= T(n/2^i) + i // where i <= k, according to tutorials
And this is the moment where I get stuck and cannot proceed further. I suppose my calculations are correct, but I am not sure how I should look for a formula that satisfies this function. Once I have the right formula, calculating T(32) or the time complexity will not be a problem.
Function 3 is defined as:
T(1) = 1
T(n) = 2T(n/2) + 1 for even n > 1
T(n) = T((n-1)/2) + T((n+1)/2) + 1 for odd n > 1
My calculations:
T(n) = 2T(n/2) + 1
= 2(2T(n/4)+1) + 1 = 4T(n/4) + 3
= 4(2T(n/8)+1) + 3 = 8T(n/8) + 7
= iT(n/2^i) + 2^i - 1
And again it comes down to a formula that I am not sure how to rewrite.
Basically, does the substitution method for solving recurrences mean finding an iterative formula?
After restudying the topic I found what I did wrong, and so as not to leave my question unanswered, I will answer it myself.
The first function was calculated correctly and the induction proof is also correct - nothing to add here.
When it comes to the second function, where I got stuck, I did not notice that I was actually using the substitution n = 2^k. This is how it should look:
T(n) = T(n/2) + 1
= T(n/4) + 1 + 1 = T(n/4) + 2
= T(n/8) + 1 + 2 = T(n/8) + 3
= T(n/16) + 1 + 3 = T(n/16) + 4
= T(n/2^k) + k
= T(1) + k
= k
Induction proof that T(2^k) = k works:
Base case: k=1, then T(2^1) = T(2) = 1. (it cannot be k=0, as 2^0 is not bigger than 1)
Inductive step: assume T(2^k) = k, we want to show T(2^(k+1)) = k+1. As 2^k=n, then 2^(k+1) = 2*2^k = 2n.
T(2n) = T(n) + 1
= T(2^k) + 1
= k + 1 // by the induction hypothesis, T(2^k) = k
Time complexity: O(log n)
T(32) = T(2^5) = 5
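As an extra check, the sketch below implements the full definition of Function 2, including the odd case, and confirms that T(2^k) = k and that T(32) = 5:

# Sketch: Function 2 -- T(1) = 0, T(n) = T(n/2) + 1 for even n > 1,
# T((n+1)/2) + 1 for odd n > 1.
def T2(n):
    if n == 1:
        return 0
    if n % 2 == 0:
        return T2(n // 2) + 1
    return T2((n + 1) // 2) + 1

for k in range(1, 11):
    assert T2(2 ** k) == k  # matches the derived formula T(2^k) = k
print("T2(32) =", T2(32))   # prints 5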
In the third function I missed that each time the function calls itself, the coefficient in front of T doubles.
T(n) = 2T(n/2) + 1
= 2(2T(n/4)+1) + 1 = 4T(n/4) + 3
= 4(2T(n/8)+1) + 3 = 8T(n/8) + 7
= 8(2T(n/16)+1) + 7 = 16T(n/16) + 15
= 16(2T(n/32)+1) + 15 = 32T(n/32) + 31
= 2^i*T(n/2^i) + 2^i - 1 // for i <= k
= 2^k*T(1) + 2^k - 1 // at i = k
= 2^k + 2^k - 1
= 2^(k+1) - 1
Induction proof that T(2^k) = 2^(k+1)-1 works:
Base case: k=1, then T(2^1) = T(2) = 3. Original function T(2) = 2T(1)+1 = 2+1 = 3, so base case is true.
Inductive step: assume T(2^k) = 2^(k+1)-1, we want to show T(2^(k+1)) = 2^(k+2)-1. Similarly, as in the second function, 2^k=n, so 2^(k+1) = 2*2^k = 2n.
T(2n) = 2T(n) + 1
= 2(2^(k+1) - 1) + 1 // by the induction hypothesis, T(n) = T(2^k) = 2^(k+1) - 1
= 2^(k+2) - 2 + 1
= 2^(k+2) - 1.
We could also take a look at the first few values of T(n), which are 1, 3, 5, 7, 9, etc., so T(n) = 2n - 1.
Time complexity: O(n)
T(32) = T(2^5) = 2^(5+1) - 1 = 64 - 1 = 63
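Similarly, implementing Function 3 exactly as defined, including the odd case, confirms both the power-of-two formula 2^(k+1) - 1 and the observed pattern T(n) = 2n - 1:

# Sketch: Function 3 -- T(1) = 1, T(n) = 2*T(n/2) + 1 for even n > 1,
# T((n-1)/2) + T((n+1)/2) + 1 for odd n > 1.
def T3(n):
    if n == 1:
        return 1
    if n % 2 == 0:
        return 2 * T3(n // 2) + 1
    return T3((n - 1) // 2) + T3((n + 1) // 2) + 1

for n in range(1, 100):
    assert T3(n) == 2 * n - 1             # pattern 1, 3, 5, 7, ...
for k in range(1, 8):
    assert T3(2 ** k) == 2 ** (k + 1) - 1
print("T3(32) =", T3(32))                 # prints 63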

Finding a Closed Form for Recursive Function [closed]

Consider the following recurrence relation.
T(n) = 5 if n <= 2
T(n) = T(n-1) + n otherwise
Closed form solution for T(n) is
I got the solution n(n+1)/2 + 7 for all values, but in my university exam the given solution was n(n+1)/2 + 2. However, this solution doesn't give 5 for values n < 2. Can somebody please explain?
Let's solve it; first let's expand in telescopic sums:
T(k) = T(k)
T(k + 1) = T(k) + (k + 1)
T(k + 2) = T(k + 1) + (k + 2) = T(k) + (k + 1) + (k + 2)
...
T(k + m) = T(k) + (k + 1) + (k + 2) + ... + (k + m) =
= T(k) + mk + (1 + 2 + ... + m) =
= T(k) + mk + (1 + m) * m / 2
...
Now we have
T(k + m) = T(k) + mk + (1 + m) * m / 2
Let k = 2:
T(m + 2) = T(2) + 2m + (1 + m) * m / 2 = 5 + 2m + (1 + m) * m / 2
Finally, let m + 2 = n or m = n - 2:
T(n) = 5 + 2 * (n - 2) + (n - 1) * (n - 2) / 2 = n * (n + 1) / 2 + 2
We have
T(n) = n * (n + 1) / 2 + 2 when n > 2
T(n) = 5 when n <= 2
As a simple non-rigorous reasoning exercise, we know that T(n) = T(n-1) + n yields the sum of the first n numbers: S(n) = n * (n + 1) / 2
Now, when n=2, S(2) = 3, so the value of 5 is actually an increment by 5 - S(2) = 2. So we could say:
T(n) = S(n) + (5 - S(2)) for n >=2
or
T(n) = S(n) + 2 for n >= 2
T(n) = 5 for n <= 2
Note that at n=2, the two relations are identical.
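To see the two pieces agree, a small sketch evaluates the recurrence directly and compares it with n(n+1)/2 + 2 for n >= 2:

# Sketch: T(n) = 5 for n <= 2, T(n) = T(n-1) + n otherwise,
# compared with the exam's closed form n*(n+1)/2 + 2 for n >= 2.
def T(n):
    if n <= 2:
        return 5
    return T(n - 1) + n

for n in range(2, 50):
    assert T(n) == n * (n + 1) // 2 + 2
print("n*(n+1)/2 + 2 matches the recurrence for n = 2..49")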

Asymptotic time complexity of recursive function (Theta)

I have been asked to analyze the asymptotic time complexity of the following recursion function:
for all k ≥ 1:
T(n) = n + T(n/2) + T(n/4) + T(n/8) + ... + T(n/2^k)
I was able to prove that:
T(n) = O(n⋅log n) and T(n) = Ω(n),
but I am looking for a tighter bound (Big Theta).
First of all:
I understand "for all k ≥ 1" this way: the sum runs from k = 1 to k = m, where 2^(m-1) ≤ n ≤ 2^m. So basically m = log₂(n) holds.
Have a look at my calculation, expanding one term of the sum at a time (take n to be a power of two):
T(n) = n + Σ_{k=1..m} T(n/2^k)
= n + T(n/2) + Σ_{k=2..m} T(n/2^k)
= n + n/2 + 2⋅Σ_{k=2..m} T(n/2^k)
= n + n/2 + 2⋅T(n/4) + 2⋅Σ_{k=3..m} T(n/2^k)
= n + n/2 + 2⋅n/4 + 4⋅Σ_{k=3..m} T(n/2^k)
= ...
= n + (m-1)⋅n/2 + 2^(m-1)⋅T(n/2^m)
Each expansion doubles the coefficient in front of the remaining sum, so every level contributes 2^(k-1)⋅n/2^k = n/2, and there are about m = log₂(n) of them; the last term 2^(m-1)⋅T(n/2^m) is Θ(n), because T(n/2^m) = T(Θ(1)) is a constant.
So T(n) = n + Θ(n⋅log n) + Θ(n), and therefore T(n) is in Θ(n⋅log n) - the O(n⋅log n) bound you already proved is tight.
Notice:
You can reach the same conclusion more quickly by subtracting the recurrence at n/2 from the recurrence at n: T(n) - T(n/2) = n - n/2 + T(n/2), i.e. T(n) = 2⋅T(n/2) + n/2, which the Master Theorem solves to Θ(n⋅log n).
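As a numerical illustration (assuming the base case T(1) = 1 and integer division in the arguments, since the question specifies neither), the memoized sketch below evaluates the recurrence for powers of two and prints T(n)/(n⋅log₂(n)); the ratio decreases toward 1/2, consistent with Θ(n⋅log n):

import math
from functools import lru_cache

# Sketch: T(1) = 1 (assumed), T(n) = n + sum of T(n // 2^k) for k = 1 .. floor(log2 n).
@lru_cache(maxsize=None)
def T(n):
    if n == 1:
        return 1
    total = n
    k = 1
    while n // (2 ** k) >= 1:
        total += T(n // (2 ** k))
        k += 1
    return total

for m in range(1, 16):
    n = 2 ** m
    print(n, T(n) / (n * math.log2(n)))  # with T(1) = 1 this ratio is exactly 1/2 + 1/log2(n)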

How can I calculate the exact worst-case running time of a function given by a recurrence?

I am trying to calculate the exact worst-case running time of a function whose runtime is given by this recurrence:
T(0) = 0
T(n) = n + T(n - 1) (if n > 0)
Does anyone have any advice on how to do this? I don't see how to solve it.
It might help to try expanding out the recurrence to see if you spot a pattern:
T(0) = 0
T(1) = 1 + T(0) = 1 + 0
T(2) = 2 + T(1) = 2 + 1 + 0
T(3) = 3 + T(2) = 3 + 2 + 1 + 0
Based on this pattern, it looks like T(n) = 0 + 1 + 2 + ... + n. This is a famous summation worked out by Gauss, and it solves to n(n+1)/2. Therefore, we could conjecture that T(n) = n(n+1)/2.
You can formalize this by proving it by induction. As a base case, T(0) = 0 = 0(0+1)/2, so everything checks out. For the inductive step, assume T(n) = n(n+1)/2. Then T(n+1) = (n+1) + T(n) = (n+1) + n(n+1)/2 = (n+1) (1 + n / 2) = (n+1)(n+2)/2 = ((n+1))((n+1) + 1) / 2, which checks out as well.
Therefore, your exact value is T(n) = n(n+1)/2.
Hope this helps!
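If you want to double-check the arithmetic, a direct recursive sketch agrees with Gauss's formula:

# Sketch: T(0) = 0, T(n) = n + T(n-1), compared with n*(n+1)/2.
def T(n):
    return 0 if n == 0 else n + T(n - 1)

for n in range(500):  # keep n modest to stay within Python's default recursion limit
    assert T(n) == n * (n + 1) // 2
print("T(n) = n*(n+1)/2 verified for n = 0..499")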

Resources