Calculating the Recurrence Relation T(n) = T(n-1) + log n

We are to solve the recurrence relation by repeated substitution:
T(n)=T(n-1)+logn
I started the substitution and got the following.
T(n)=T(n-2)+log(n)+log(n-1)
By the logarithm product rule, log(mn) = log m + log n,
T(n)=T(n-2)+log[n*(n-1)]
Continuing this, I get
T(n) = T(n-k) + log[n*(n-1)*...*(n-k+1)]
We know that the base case is T(1), so n-k = 1 -> k = n-1, and substituting this in we get
T(n)=T(1)+log[n*(n-1)*...*1]
Clearly n*(n-1)*...*1 = n! so,
T(n)=T(1)+log(n!)
I do not know how to solve beyond this point. Is the answer simply O(log(n!))? I have read other explanations saying that it is Θ(nlogn) and thus it follows that O(nlogn) and Ω(nlogn) are the upper and lower bounds respectively.

This expands out to log (n!). You can see this because
T(n) = T(n - 1) + log n
= T(n - 2) + log (n - 1) + log n
= T(n - 3) + log (n - 2) + log (n - 1) + log n
= ...
= T(0) + log 1 + log 2 + ... + log (n - 1) + log n
= T(0) + log n!
The exact answer depends on what T(0) is, but this is Θ(log n!) for any fixed constant value of T(0).
A note - using Stirling's approximation, Θ(log n!) = Θ(n log n). That might help you relate this back to existing complexity classes.
Hope this helps!

Stirling's formula is not needed to get the big-Theta bound. It's O(n log n) because it's a sum of at most n terms each at most log n. It's Omega(n log n) because it's a sum of at least n/2 terms each at least log (n/2) = log n - 1.
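A quick numerical check, as a sketch of my own (natural logs and T(1) = 0 assumed), shows the ratio against n log n:

from math import log

def T(n):
    # T(n) = T(n-1) + log n with T(1) = 0, unrolled into a sum of logs
    total = 0.0
    for i in range(2, n + 1):
        total += log(i)
    return total

for n in (10, 100, 1000, 10000):
    print(n, T(n) / (n * log(n)))   # slowly tends to 1, consistent with Theta(n log n)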

Yes, this is a linear recurrence of the first order. It can be solved exactly. If your initial value is $T(1) = 0$, you do get $T(n) = \log n!$. You can approximate $\log n!$ (see Stirling's formula):
$$
\ln n! = n \ln n - n + \frac{1}{2} \ln(2\pi n) + O(1/n)
$$


Recurrence: T(n) = T(n/2) + T(n/4) + T(n/8) + Ω(n), what is the complexity of T(n)?

How do I solve the recurrence equation T(n) = T(n/2) + T(n/4) + T(n/8) + Ω(n)?
I can solve it if instead of Ω(n) we had Θ(n), but now I can't solve it. Please help me!
It seems a little bit unusual to have a lower bound like that, but I believe we can use strong induction to give a lower bound on T(n). I start with an educated guess that the recursion will add a factor of O(lg n) to the Ω(n) bound, and use strong induction to verify this guess. Let's consider the case in which Ω(n) is minimised, to achieve a lower bound on the whole recurrence. That is, we assume that the function of n hiding in the Omega notation is actually Θ(n):
Assume the claim is true for all values of n less than k:
I.H.: T(n) < cn lg n - dn (for n < k)
Prove that it is true for k also:
T(k) = T(k/2) + T(k/4) + T(k/8) + Θ(k)
(I.H.) T(k) < c(k/2)lg(k/2) + c(k/4)lg(k/4) + c(k/8)lg(k/8) - d(k/2 + k/4 + k/8) + Θ(k)
T(k) < (7/8)ck lg k - (11/8)ck - (7/8)dk + Θ(k)
T(k) < ck lg k - dk as required,
since we can choose the constants c and d large enough to outweigh the constant hidden in the Θ-notation. Since cn lg n is Ω(n lg n), we can give this as a lower bound on the recurrence. Owing to the Ω(n) term in the original, I believe this is the tightest asymptotic bound we can give.
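If you want a feel for the actual growth, here is a quick sketch of my own (not from the question): it tabulates the recurrence assuming the function hidden in the Ω(n) is exactly n, with an arbitrary base case of T(n) = 1 for n < 8, and prints both T(n)/n and T(n)/(n lg n) so you can judge for yourself which bound is tight.

from functools import lru_cache
from math import log2

@lru_cache(maxsize=None)
def T(n):
    # assumption: the Omega(n) term is exactly n; the base case is arbitrary
    if n < 8:
        return 1
    return T(n // 2) + T(n // 4) + T(n // 8) + n

for n in (10**3, 10**4, 10**5, 10**6):
    print(n, T(n) / n, T(n) / (n * log2(n)))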

F(n) = F(n-1) - F(n-2)

I came across this sequence in a programming contest
F(n)= F(n-1)-F(n-2);
Given F0 and F1, find the nth term.
(http://codeforces.com/contest/450/problem/B) (the contest is over)
Now the solution of this problem goes like this:
The sequence takes the values f0, f1, f1-f0, -f0, -f1, f0-f1, then f0 again, and the whole sequence repeats.
I can see that the values repeat, but I could not find the reason for this cyclic order. I searched for cyclic orders and sequences but could not find material that gave a real feel for why the cycle occurs.
Apply your original formula for n-1:
F(n-1) = F(n-2) - F(n-3)
Then, replacing F(n-1) in the original expression for F(n):
F(n) = F(n-2) - F(n-3) - F(n-2) = -F(n-3)
F(n) = -F(n-3)
Since the latter is also valid if we replace n with n-3:
F(n-3) = -F(n-6)
Combining the last two:
F(n) = -(-F(n-6)) = F(n-6)
Thus the sequence is cyclic with period six.
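As a quick sanity check (a sketch of my own; the starting values 3 and 7 are arbitrary), you can generate the sequence and watch the six-term cycle:

f0, f1 = 3, 7
seq = [f0, f1]
for _ in range(10):
    seq.append(seq[-1] - seq[-2])   # F(n) = F(n-1) - F(n-2)
print(seq)   # 3, 7, 4, -3, -7, -4, 3, 7, 4, ... repeats with period six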
Another way to approach this problem. Note that F(n) = F(n-1) - F(n-2) is the same as F(n) - F(n-1) + F(n-2) = 0, which makes it a linear difference equation. Such equations have fundamental solutions a^n, where a is a root of a polynomial: suppose F(n) = a^n; then a^n - a^(n-1) + a^(n-2) = (a^2 - a + 1)*a^(n-2) = 0, so a^2 - a + 1 = 0, which has two complex roots (you can find them) with modulus 1 and arguments ±pi/3. So their powers 1, a, a^2, a^3, ... travel around the unit circle and come back to 1 after 2 pi/(pi/3) = 6 steps.
This analysis has the same defect as the corresponding one for differential equations: how do you know to look for solutions of a certain kind? I don't have an answer for that; maybe someone else does. In the meantime, whenever you see a linear difference equation, think about solutions of the form a^n.
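For what it's worth, the root picture can also be checked numerically; here is a small sketch of my own using one of the two roots, a = (1 + i*sqrt(3))/2:

import cmath

a = (1 + cmath.sqrt(-3)) / 2    # a root of a^2 - a + 1 = 0
print(abs(a))                   # modulus: 1.0
print(cmath.phase(a))           # argument: pi/3, about 1.047
print(a ** 6)                   # approximately (1+0j): back to 1 after six steps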

Big-O proof involving a sum of logs

Prove that the sum from i = 1 to n of ⌈log (n/i)⌉ is O(n).
I put the series into the summation, but I have no idea how to tackle this problem. Any help is appreciated.
There are two useful mathematical facts that can help out here. First, note that ⌈x⌉ ≤ x + 1 for any x. Therefore,
sum from i = 1 to n (⌈log (n/i)⌉) ≤ (sum from i = 1 to n log (n / i)) + n
Therefore, if we can show that the second summation is O(n), we're done.
Using properties of logs, we can rewrite
log(n/1) + log(n/2) + ... + log(n/n)
= log(n^n / n!)
Let's see if we can simplify this. Using properties of logarithms, we get that
log(n^n / n!) = log(n^n) - log(n!)
= n log n - log(n!)
Now, we can use Stirling's approximation, which says that
log (n!) = n log n - n + O(log n)
Therefore:
n log n - log (n!)
= n log n - n log n + n - O(log n)
= O(n)
So the summation is O(n), as required.
Hope this helps!
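As a numeric sanity check, here is a sketch of my own (base-2 logarithms assumed):

from math import ceil, log2

def S(n):
    # the sum from the problem: ceil(log2(n / i)) for i = 1..n
    return sum(ceil(log2(n / i)) for i in range(1, n + 1))

for n in (10, 100, 1000, 10000):
    print(n, S(n) / n)   # the ratio stays bounded, consistent with O(n)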

Simplifying Recurrence Relation c(n) = c(n/2) + n^2

I'm really confused on simplifying this recurrence relation: c(n) = c(n/2) + n^2.
So I first got:
c(n/2) = c(n/4) + n^2
so
c(n) = c(n/4) + n^2 + n^2
c(n) = c(n/4) + 2n^2
c(n/4) = c(n/8) + n^2
so
c(n) = c(n/8) + 3n^2
I do sort of notice a pattern though:
2 raised to the power of the coefficient in front of "n^2" gives the denominator under n.
I'm not sure if that would help.
I just don't understand how I would simplify this recurrence relation and then find the theta notation of it.
EDIT: Actually I just worked it out again and I got c(n) = c(n/n) + n^2 * lg n.
I think that is correct, but I'm not sure. Also, how would I find the theta notation of that? Is it just Theta(n^2 lg n)?
Firstly, make sure to substitute n/2 everywhere n appears in the original recurrence relation when placing c(n/2) on the lhs.
i.e.
c(n/2) = c(n/4) + (n/2)^2
Your intuition is correct, in that it is a very important part of the problem. How many times can you divide n by 2 before you reach 1?
Let's take 8 for an example
8/2 = 4
4/2 = 2
2/2 = 1
You see it's 3, which as it turns out is log2(8)
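In code, the halving count looks like this (a tiny illustration of my own):

n = 8
steps = 0
while n > 1:
    n //= 2      # halve until we reach 1
    steps += 1
print(steps)     # 3, i.e. log2(8)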
In order to prove the theta notation, it might be helpful to check out the master theorem. This is a very useful tool for proving complexity of a recurrence relation.
Using case 3 of the master theorem, we can see:
a = 1
b = 2
log_b(a) = 0
f(n) = n^2 = Omega(n^(0 + e)) with e = 2
regularity condition: a*f(n/b) = (n/2)^2 = (1/4)n^2 <= k*n^2 for k = 9/10 < 1
c(n) = Theta(n^2)
The intuition as to why the answer is Theta(n^2) is that you have n^2 + (n^2)/4 + (n^2)/16 + ... + (n^2)/4^(lg n), which doesn't give us lg n full copies of n^2, but instead geometrically shrinking terms whose sum is bounded by a constant times n^2.
Let's answer a more generic question for recurrences of the form:
r(n) = r(d(n)) + f(n). There are some restrictions on the functions that need further discussion, e.g. if x is a fixed point of d, then f(x) must be 0, otherwise there is no solution. In your specific case this condition is satisfied.
Rearranging the equation we get r(n) - r(d(n)) = f(n), which suggests that r(n) and r(d(n)) are both sums of some terms, with r(n) having one more term than r(d(n)); that extra term is f(n). On the other hand, r(n) and r(d(n)) have to have the same 'form', so the number of terms in the previously mentioned sum has to be infinite.
Thus we are looking for a telescopic sum, in which the terms for r(d(n)) cancel out all but one terms for r(n):
r(n) = f(n) + a_0(n) + a_1(n) + ...
- r(d(n)) = - a_0(n) - a_1(n) - ...
This latter means that
r(d(n)) = a_0(n) + a_1(n) + ...
And just by substituting d(n) into the place of n into the equation for r(n), we get:
r(d(n)) = f(d(n)) + a_0(d(n)) + a_1(d(n)) + ...
So by choosing a_0(n) = f(d(n)), a_1(n) = a_0(d(n)) = f(d(d(n))), and so on: a_k(n) = f(d(d(...d(n)...))) (with k+1 pieces of d in each other), we get a correct solution.
Thus in general, the solution is of the form r(n) = sum{i=0..infinity}(f(d[i](n))), where d[i](n) denotes the function d(d(...d(n)...)) with i number of iterations of the d function.
For your case, d(n)=n/2 and f(n)=n^2, hence you can get the solution in closed form by using identities for geometric series. The final result is r(n)=4/3*n^2.
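Here is a rough check of that closed form, as a sketch of my own (integer halving and a base case of c(1) = 0 are assumptions, so the ratio only approaches 4/3):

def c(n):
    if n < 2:
        return 0          # assumed base case
    return c(n // 2) + n * n

for n in (2**5, 2**10, 2**15):
    print(n, c(n) / (n * n))   # approaches 4/3, about 1.3333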
You can also use the advanced master theorem, which covers recurrences of the form
T(n) = aT(n/b) + n^k log^p n
where a > 0, b > 1, k >= 0 and p is a real number.
Case 1: if a > b^k, then T(n) = Θ(n^(log_b a)).
Case 2: if a = b^k, then
1. if p > -1, T(n) = Θ(n^(log_b a) log^(p+1) n)
2. if p = -1, T(n) = Θ(n^(log_b a) log log n)
3. if p < -1, T(n) = Θ(n^(log_b a))
Case 3: if a < b^k, then
1. if p >= 0, T(n) = Θ(n^k log^p n)
2. if p < 0, T(n) = Θ(n^k)
Constants are dropped because they do not change the time complexity and vary from processor to processor (i.e. n/2 = n * 1/2 is still Θ(n)).
Here a = 1, b = 2, k = 2 and p = 0, so a < b^k and p >= 0, giving c(n) = Θ(n^2 log^0 n) = Θ(n^2).

Solving a recurrence equation with multiple recursive steps

I'm looking at some algorithms, and I'm trying to ascertain how multiple recursive steps are treated when forming the equation.
So exhibit A:
It is obvious to me that the recurrence equation here is T(n) = c + 2T(n/2), which in big-O notation simplifies to O(n).
However, here we have something similar going on as well, and I get the recurrence equation T(n) = n + 2T(n/2), since we have two recursive calls not unlike the first one; by the same reasoning this would also seem to simplify to O(n), however that is not the case here. Any input as to how to get the correct recurrence equation for this second one?
Any input as to how to go about solving this would be brilliant.
You might be interested in the Master Theorem:
http://en.wikipedia.org/wiki/Master_theorem
The recurrence equation T(n) = n + 2T(n/2) is Theta(n log n), which can be derived using the theorem. To do it manually, you can also assume n = 2^k and do:
T(n) = 2T(n/2) + n
= 2(2T(n/4) + n/2) + n
= (2^2)T(n/(2^2)) + 2n
= (2^2)(2T(n/(2^3)) + n/(2^2)) + 2n
= (2^3)T(n/(2^3)) + 3n
= ...
= (2^k)T(n/(2^k)) + kn
= nT(1) + n log2 n
= Theta(n log n)
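If it helps, here is a small sketch of my own (assuming n is a power of two and T(1) = 0) that reproduces the expansion:

from math import log2

def T(n):
    if n == 1:
        return 0              # assumed base case
    return 2 * T(n // 2) + n

for k in (4, 8, 12):
    n = 2 ** k
    print(n, T(n), n * log2(n))   # T(n) matches n * log2(n) exactly here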
