Master Theorem & Recurrences

I want to find out how to solve the Master Theorem for this code:
unsigned long fac(unsigned long n) {
    if (n == 1)
        return 1;
    else
        return fac(n - 1) * n;  /* was "fact", which doesn't match the function name */
}
So, based on the fact that the function calls itself only once, a = 1. Besides that recursive call there is nothing else, so f(n) = O(1) as well. Now I am struggling with my b. Normally the general formula is:
T(n) = a*T(n/b) + f(n)
In this case I don't divide the main problem, though. The subproblem just has size n-1. What is b now? Because my recurrence would be:
T(n) = 1*T(n-1) + O(1)
How can I use the Master Theorem now, since I don't know my exact b?

You can "cheat" by using a change of variable.
Let T(n) = S(2^n). Then the recurrence says
S(2^n) = S(2^(n-1)) + O(1) = S(2^n / 2) + O(1),
which we rewrite as
S(m) = S(m/2) + O(1).
By the Master Theorem with a = 1, b = 2, the solution is logarithmic:
S(m) = O(log m),
which means
T(n) = S(2^n) = O(log 2^n) = O(n).
Anyway, the recurrence is easier to solve directly, using
T(n) = T(n-1) + O(1) = T(n-2) + O(1) + O(1) = ... = T(0) + O(1) + O(1) + ... O(1) = O(n).
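One way to convince yourself of the O(n) bound directly is to instrument the function and count its invocations (a sketch; the calls counter is added instrumentation, not part of the original code):

```c
#include <stdio.h>

static unsigned long calls = 0;  /* instrumentation: counts recursive invocations */

unsigned long fac(unsigned long n) {
    calls++;
    if (n == 1)
        return 1;
    else
        return fac(n - 1) * n;
}
```

Calling fac(10) increments calls exactly 10 times, matching the unrolled sum T(n) = T(1) + (n-1)*O(1) = O(n).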

The Master Theorem doesn't apply to this particular recurrence relation, but that's okay - it's not supposed to apply everywhere. You most commonly see the Master Theorem show up in divide-and-conquer style recurrences where you split the input apart into blocks that are a constant fraction of the original size of the input, and in this particular case that's not what's happening.
To solve this recurrence, you'll need to use another method like the iteration method or looking at the shape of the recursion tree in a different way.

Related

Complexity Recursion in For

Hi, I wanted to know how I can solve for the time complexity of this algorithm.
I solved the case with f(n/4) but not f(n/i).
void f(int n) {
    if (n < 4) return;
    for (int i = 0; i * i < n; i++)
        printf("-");
    for (int i = 2; i < 4; i++)
        f(n / i); // solved the case f(n/4) but stuck on f(n/i)
}
Note that the loop condition is i<4, so i never reaches 4. i.e. the only recursive terms are f(n/2) and f(n/3).
Recurrence relation:
T(n) = T(n/2) + T(n/3) + Θ(sqrt(n))
There are two ways to approach this problem:
Find upper and lower bounds by replacing one of the recursive terms with the other:
R(n) = 2R(n/3) + Θ(sqrt(n))
S(n) = 2S(n/2) + Θ(sqrt(n))
R(n) ≤ T(n) ≤ S(n)
You can easily solve for both bounds by substitution or applying the Master Theorem:
R(n) = O(n^[log3(2)]) = O(n^0.63...)
S(n) = O(n)
If you need an exact answer, use the Akra-Bazzi method:
a1 = a2 = 1
h1(x) = h2(x) = 0
g(x) = sqrt(x)
b1 = 1/2
b2 = 1/3
You need to solve for a power p such that (1/2)^p + (1/3)^p = 1. Do this numerically with e.g. Newton-Raphson, to obtain p = 0.78788.... Then perform the Akra-Bazzi integral:
T(n) = Θ( n^p * (1 + integral from 1 to n of sqrt(u) / u^(p+1) du) )
Since p > 1/2, the integral converges to a constant, and we obtain T(n) = O(n^0.78...), which is consistent with the bounds found before.
I think this is about O(sqrt(9/2) * sqrt(n)) time, but I'd go with O(sqrt(n)) to be safe. It's admittedly been a while since I worked with time complexity.
If n < 4, the function returns immediately, at constant time O(1)
If n >= 4, the function's for loop, for (int i=0; i*i<n; i++) performs the constant-time function printf("-"); a total number of sqrt(n) times. So far we're at O(sqrt(n)) time.
The next for loop performs two recursive calls: one for f(n/2) and one for f(n/3)
The first runs in O(sqrt(n/2)) time, the second in O(sqrt(n/4)) time, and so on - this series converges to O(sqrt(2n))
Likewise, the function f(n/3) converges to O(sqrt(3/2 n))
This doesn't factor in the fact that each recursive call also invokes a little extra time by calling both of these functions when it runs, but I believe this converges to about O(sqrt(n)) + O(sqrt(2n)) + O(sqrt(3/2 n)), which itself converges to O(sqrt(9/2) * sqrt(n))
This is likely a little bit low for an exact constant value, but I believe you can safely say this runs in O(sqrt(n)) time, with some small-ish constant out front.

Best Case / Worst Case Recurrence Relations

So, I have some pseudocode that I have to analyze for a class. I'm trying to figure out the best case and the worst case in terms of theta. I figured out the best case, but I'm having trouble with the worst case. I think the worst case is actually the same as the best case, but I'm second-guessing myself and would like some feedback on how to properly develop the recurrence for the worst case, if in fact they are not the same.
Code:
function max-element(A)
    if n = 1
        return A[1]
    val = max-element(A[2...n])
    if A[1] > val
        return A[1]
    else
        return val
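For concreteness, here is one possible C rendering of the pseudocode, using 0-based indexing (a sketch; names and types are illustrative):

```c
#include <stddef.h>

/* Recursive maximum: one comparison plus one recursive call per level,
   giving T(n) = T(n-1) + O(1). */
unsigned long max_element(const unsigned long *a, size_t n) {
    if (n == 1)
        return a[0];
    unsigned long val = max_element(a + 1, n - 1);  /* max of A[2...n] */
    if (a[0] > val)
        return a[0];
    else
        return val;
}
```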
Best Case Recurrence:
T(1) = 1
T(n) = T(n-1) + 1
T(n-1) = T(n-2) + 1
T(n) = T(n-2) + 1 + 1 = T(n-2) + 2
In general: T(n) = T(n-k) + k
Let k = n-1
T(n) = T(n-(n-1)) + n - 1
T(n) = T(1) + n -1
T(n) = 1 + n - 1
T(n) = n
The running time only depends on the number of elements of the array; in particular, it is independent of the contents of the array. So the best- and worst-case running times coincide.
A more correct way to model the time complexity is via the recurrence T(n) = T(n-1) + O(1) and T(1)=O(1) because the O(1) says that you spend some additional constant time in each recursive call. It clearly solves to T(n)=O(n) as you already noted. In fact, this is tight, i.e., we have T(n)=Theta(n).

Simplifying Recurrence Relation c(n) = c(n/2) + n^2

I'm really confused on simplifying this recurrence relation: c(n) = c(n/2) + n^2.
So I first got:
c(n/2) = c(n/4) + n^2
so
c(n) = c(n/4) + n^2 + n^2
c(n) = c(n/4) + 2n^2
c(n/4) = c(n/8) + n^2
so
c(n) = c(n/8) + 3n^2
I do sort of notice a pattern though:
2 raised to the power of whatever coefficient is in front of "n^2" gives the denominator of what n is over.
I'm not sure if that would help.
I just don't understand how I would simplify this recurrence relation and then find the theta notation of it.
EDIT: Actually I just worked it out again and I got c(n) = c(n/n) + n^2*lgn.
I think that is correct, but I'm not sure. Also, how would I find the theta notation of that? Is it just theta(n^2lgn)?
Firstly, make sure to substitute n/2 everywhere n appears in the original recurrence relation when placing c(n/2) on the lhs.
i.e.
c(n/2) = c(n/4) + (n/2)^2
Your intuition is correct, in that it is a very important part of the problem. How many times can you divide n by 2 before we reach 1?
Let's take 8 for an example
8/2 = 4
4/2 = 2
2/2 = 1
You see it's 3, which as it turns out is log2(8).
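That halving count is exactly floor(log2(n)), which a tiny loop makes concrete (a sketch):

```c
/* Count how many times n can be halved (integer division) before reaching 1.
   For n >= 1 this is floor(log2(n)). */
int halvings(unsigned long n) {
    int count = 0;
    while (n > 1) {
        n /= 2;
        count++;
    }
    return count;
}
```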
In order to prove the theta notation, it might be helpful to check out the master theorem. This is a very useful tool for proving complexity of a recurrence relation.
Using the Master Theorem case 3, we can see
a = 1
b = 2
logb(a) = 0
c = 2, so n^2 = Omega(n^(logb(a) + ε)) with ε = 2
Regularity condition: a*(n/2)^2 = (1/4)*n^2 ≤ k*n^2 for k = 9/10 < 1
c(n) = Theta(n^2)
The intuition as to why the answer is Theta(n^2) is that you have n^2 + (n^2)/4 + (n^2)/16 + ... + (n^2)/4^(log2(n)), which doesn't give us log(n) full copies of n^2, but instead increasingly smaller fractions of n^2 whose sum is bounded by a constant times n^2.
Let's answer a more generic question for recurrences of the form:
r(n) = r(d(n)) + f(n). There are some restrictions on the functions that need further discussion, e.g. if x is a fixed point of d, then f(x) should be 0, otherwise there isn't any solution. In your specific case this condition is satisfied.
Rearranging the equation we get r(n) - r(d(n)) = f(n), which gives the intuition that r(n) and r(d(n)) are both sums of some terms, but r(n) has one more term than r(d(n)), which is why f(n) appears as the difference. On the other hand, r(n) and r(d(n)) have to have the same 'form', so the number of terms in the previously mentioned sum has to be infinite.
Thus we are looking for a telescopic sum, in which the terms for r(d(n)) cancel out all but one terms for r(n):
r(n) = f(n) + a_0(n) + a_1(n) + ...
- r(d(n)) = - a_0(n) - a_1(n) - ...
This latter means that
r(d(n)) = a_0(n) + a_1(n) + ...
And just by substituting d(n) into the place of n into the equation for r(n), we get:
r(d(n)) = f(d(n)) + a_0(d(n)) + a_1(d(n)) + ...
So by choosing a_0(n) = f(d(n)), a_1(n) = a_0(d(n)) = f(d(d(n))), and so on: a_k(n) = f(d(d(...d(n)...))) (with k+1 pieces of d in each other), we get a correct solution.
Thus in general, the solution is of the form r(n) = sum{i=0..infinity}(f(d[i](n))), where d[i](n) denotes the function d(d(...d(n)...)) with i number of iterations of the d function.
For your case, d(n)=n/2 and f(n)=n^2, hence you can get the solution in closed form by using identities for geometric series. The final result is r(n)=4/3*n^2.
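As a quick sanity check on the closed form r(n) = 4/3*n^2, one can evaluate the recurrence directly for a power of two (a sketch; the base case c(1) = 1 is an assumption):

```c
/* Direct evaluation of c(n) = c(n/2) + n^2 with c(1) = 1.
   For n a power of two the sum n^2 + (n/2)^2 + ... is geometric,
   so c(n)/n^2 approaches 4/3. */
unsigned long long c(unsigned long long n) {
    if (n <= 1)
        return 1;
    return c(n / 2) + n * n;
}
```

For n = 2^20 this evaluates to (2^42 - 1)/3, i.e. about 1.3333 * n^2, matching the geometric-series constant 4/3.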
Go for the advanced Master Theorem:
T(n) = a*T(n/b) + Θ(n^k * log^p(n))
where a > 0, b > 1, k ≥ 0 and p is a real number.
Case 1: a > b^k
    T(n) = Θ(n^logb(a))   (log is in base b)
Case 2: a = b^k
    1. p > -1: T(n) = Θ(n^logb(a) * log^(p+1)(n))
    2. p = -1: T(n) = Θ(n^logb(a) * log(log(n)))
    3. p < -1: T(n) = Θ(n^logb(a))
Case 3: a < b^k
    1. p ≥ 0: T(n) = Θ(n^k * log^p(n))
    2. p < 0: T(n) = O(n^k)
Constant factors are ignored because they don't change the time complexity and only vary from processor to processor (i.e. n/2 = n * 1/2 = Θ(n)).
Here a = 1, b = 2, k = 2 and p = 0, so a < b^k and case 3 gives c(n) = Θ(n^2).

Solving the recurrence T(n) = T(n/2) + T(n/4) + T(n/8)?

I'm trying to solve a recurrence T(n) = T(n/8) + T(n/2) + T(n/4).
I thought it would be a good idea to first try a recurrence tree method, and then use that as my guess for substitution method.
For the tree, since no work is being done at the non-leaves levels, I thought we could just ignore that, so I tried to come up with an upper bound on the # of leaves since that's the only thing that's relevant here.
I considered the height of the tree taking the longest path through T(n/2), which yields a height of log2(n). I then assumed the tree is complete, with all levels filled (i.e. as if the recurrence were T(n) = 3T(n/2)), so we would have 3^i nodes at level i, and hence n^(log2(3)) leaves. T(n) would then be O(n^(log2(3))).
Unfortunately I think this is an unreasonable upper bound, I think I've made it a bit too high... Any advice on how to tackle this?
One trick you can use here is rewriting the recurrence in terms of another variable. Let's suppose that you write n = 2^k. Then the recurrence simplifies to
T(2^k) = T(2^(k-3)) + T(2^(k-2)) + T(2^(k-1)).
Let's let S(k) = T(2^k). This means that you can rewrite this recurrence as
S(k) = S(k-3) + S(k-2) + S(k-1).
Let's assume the base cases are S(0) = S(1) = S(2) = 1, just for simplicity. Given this, you can then use a variety of approaches to solve this recurrence. For example, the annihilator method (section 5 of the link) would be great here for solving this recurrence, since it's a linear recurrence. If you use the annihilator approach here, you get that
S(k) - S(k-1) - S(k-2) - S(k-3) = 0
S(k+3) - S(k+2) - S(k+1) - S(k) = 0
(E^3 - E^2 - E - 1) S(k) = 0
If you find the roots of the polynomial E^3 - E^2 - E - 1, then you can write the solution to the recurrence as a linear combination of those roots raised to the power of k. In this case, it turns out that the recurrence is similar to that for the Tribonacci numbers, and if you solve everything you'll find that the recurrence solves to something of the form O(1.83929^k).
Now, since you know that 2^k = n, we know that k = lg n. Therefore, the recurrence solves to O(1.83929^(lg n)). Let's let a = 1.83929. Then the solution has the form O(a^(lg n)) = O(a^((log_a n) / (log_a 2))) = O(n^(1 / log_a 2)). This works out to approximately O(n^0.87914...). Your initial upper bound of O(n^(lg 3)) = O(n^1.584962501...) is significantly weaker than this one.
Hope this helps!
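As a numerical sanity check on the 1.83929 growth rate, one can iterate S(k) = S(k-1) + S(k-2) + S(k-3) and watch the ratio of consecutive terms converge to the Tribonacci constant (a sketch, using the base cases S(0) = S(1) = S(2) = 1):

```c
/* Ratio S(k)/S(k-1) for S(k) = S(k-1) + S(k-2) + S(k-3),
   with S(0) = S(1) = S(2) = 1. The ratio converges to ~1.83929. */
double growth_ratio(int k) {
    double s0 = 1.0, s1 = 1.0, s2 = 1.0;
    for (int i = 3; i <= k; i++) {
        double s3 = s2 + s1 + s0;
        s0 = s1;
        s1 = s2;
        s2 = s3;
    }
    return s2 / s1;
}
```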
There is a way simpler method than the one proposed above. Apart from the Master Theorem, there is also the Akra-Bazzi method, which allows you to solve recurrences of the form:
T(x) = g(x) + a1*T(b1*x) + a2*T(b2*x) + a3*T(b3*x)
which is exactly what you have. So your g(x) = 0, a1 = a2 = a3 = 1 and b1 = 1/2, b2 = 1/4 and b3 = 1/8. So now you have to solve the equation: 1/2^p + 1/4^p + 1/8^p = 1.
Solving it, p is approximately 0.879. You do not even need to evaluate the integral, because g(x) = 0 makes it vanish. So the overall complexity is O(n^0.879).

Solving a recurrence equation with multiple recursive steps

I'm looking at some algorithms, and I'm trying to ascertain how multiple recursive steps are treated when forming the equation.
So exhibit A:
It is obvious to me that the recurrence equation here is T(n) = c + 2T(n/2), which in big-O notation simplifies to O(n).
However, here we have something similar going on as well, and I get the recurrence equation T(n) = n + 2T(n/2), since we have two recursive calls not unlike the first one; that would also seem to simplify to O(n) in big-O notation, but that is not the case here. Any input as to how to get the correct recurrence equation for this second one?
Any input as to how to go about solving this would be brilliant.
You might be interested in the Master Theorem:
http://en.wikipedia.org/wiki/Master_theorem
The recurrence equation T(n) = n + 2T(n/2) is Theta(n log n), which can be derived using the theorem. To do it manually, you can also assume n = 2^k and do:
T(n) = 2T(n/2) + n
= 2(2T(n/4) + n/2) + n
= (2^2)T(n/(2^2)) + 2n
= (2^2)(2T(n/(2^3)) + n/(2^2)) + 2n
= (2^3)T(n/(2^3)) + 3n
= ...
= (2^k)T(n/(2^k)) + kn
= nT(1) + n log2 n
= Theta(n log n)
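The unrolling above gives exactly T(n) = n*T(1) + n*log2(n) for powers of two, which a direct evaluation confirms (a sketch with the assumed base case T(1) = 1):

```c
/* Direct evaluation of T(n) = 2T(n/2) + n with T(1) = 1.
   For n = 2^k this equals n + n*log2(n). */
unsigned long long T(unsigned long long n) {
    if (n == 1)
        return 1;
    return 2 * T(n / 2) + n;
}
```

For example, T(1024) = 1024 * (1 + 10) = 11264, i.e. n(1 + log2 n) with n = 2^10.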
