Using the Master's method - math

On my midterm I had the problem:
T(n) = 8T(n/2) + n^3
and I am supposed to find its big-Theta notation using either the Master method or an alternative method. What I did was:
a = 8, b = 2, k = 3
log_2(8) = 3 = k
therefore, T(n) is Θ(n^3). I got 1/3 points, so I must be wrong. What did I do wrong?

T(n) = a*T(n/b) + f(n)
You applied the case that holds when f(n) = O(n^(log_b(a) - e)) for some e > 0.
This is the important part: it must hold for some e > 0.
For f(n) = n^3, a = 8 and b = 2,
n^3 = O(n^(3-e)) is not true for any e > 0,
so you picked the wrong case of the Master theorem.
You need the case that applies when f(n) matches the critical exponent:
if f(n) = Θ(n^(log_b(a)) * (log n)^k) for some k >= 0,
then
T(n) = Θ(n^(log_b(a)) * (log n)^(k+1)).
In your problem this case applies with k = 0, and it gives T(n) = Θ(n^3 log n).
An alternative way to solve your problem:
T(n) = 8*T(n/2) + n^3.
Let g(n) = T(n)/n^3.
Then
n^3 * g(n) = 8 * (n/2)^3 * g(n/2) + n^3 = n^3 * g(n/2) + n^3,
i.e. g(n) = g(n/2) + 1.
This implies g(n) = Θ(log n), and so T(n) = Θ(n^3 log n).
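A quick numeric sanity check of this derivation (a sketch, assuming the base case T(1) = 1): g(1) = 1 and g(n) = g(n/2) + 1 give g(2^m) = m + 1, so T(2^m) = 8^m * (m + 1).

```python
def T(n):
    # T(n) = 8*T(n/2) + n^3, with an assumed base case T(1) = 1
    if n == 1:
        return 1
    return 8 * T(n // 2) + n ** 3

# g(n) = T(n)/n^3 satisfies g(1) = 1 and g(n) = g(n/2) + 1,
# so for n = 2^m we expect T(n) = n^3 * (log2(n) + 1), i.e. Theta(n^3 log n)
for m in range(12):
    n = 2 ** m
    assert T(n) == n ** 3 * (m + 1)
```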

Related

Recurrence relations in computer science mathematics: T(n) = 6T(n/6) + 2n + 3 for n a power of 6, T(1) = 1?

Recurrence relations can be directly derived from a recursive algorithm, but
they are in a form that does not allow us to quickly determine how efficient
the algorithm is.
How can I solve this?
T(n) = 6T(n/6) + 2n + 3 for n a power of 6, with T(1) = 1.
This recurrence could be solved with rsolve from SymPy, Python's symbolic math library.
from sympy import Function, rsolve
from sympy.abc import k, n

f = Function('f')
g = Function('g')

# T(n) = 6*T(n/6) + 2*n + 3 for n a power of 6, T(1) = 1
T = f(n) - 6*f(n/6) - 2*n - 3

# Substitute n = 6**k to turn the divide-and-conquer recurrence
# into an ordinary recurrence in k
Tk = T.subs({n: 6**k, f(n): g(k), f(n/6): g(k - 1)})
s = rsolve(Tk, g(k), {g(0): 1})
print("solution for k:", s.cancel())

for i in range(11):
    print(f"k={i}, n={6**i}, T(n)={2*6**i*i + (8*6**i - 3)//5}")
This gives:
Tk(k) = 2*6**k*k + 8*6**k/5 - 3/5, i.e. Tk(k) = ((10k + 8)*6^k - 3)/5
T(n) = 2*n*log(n)/log(6) + 8*n/5 - 3/5, i.e. T(n) = (n*(10*log_6(n) + 8) - 3)/5
First 11 values:
k=0, n=1, T(n)=1
k=1, n=6, T(n)=21
k=2, n=36, T(n)=201
k=3, n=216, T(n)=1641
k=4, n=1296, T(n)=12441
k=5, n=7776, T(n)=90201
k=6, n=46656, T(n)=634521
k=7, n=279936, T(n)=4367001
k=8, n=1679616, T(n)=29561241
k=9, n=10077696, T(n)=197522841
k=10, n=60466176, T(n)=1306069401
We can check the formulas via the recursive formulation:
def recursive_t(n):
    if n == 1:
        res = 1
    else:
        t_ndiv6 = recursive_t(n // 6)
        res = 6 * t_ndiv6 + 2 * n + 3
    print(f"T({n})={res}")
    return res

recursive_t(6**10)
This prints out the same values for the same n.

Is this recurrence relation O(infinity)?

Is this recurrence relation O(infinity)?
T(n) = 49*T(n/7) + n
There are no base conditions given.
I tried solving it with the Master theorem and the answer is Θ(n^2). But when solving with a recursion tree, the solution comes out to be an infinite series, n*(7 + 7^2 + 7^3 + ...).
Can someone please help?
Let n = 7^m. The recurrence becomes
T(7^m) = 49 T(7^(m-1)) + 7^m,
or
S(m) = 49 S(m-1) + 7^m.
The homogeneous part gives
S(m) = C 49^m
and the general solution is
S(m) = C 49^m - 7^m / 6
i.e.
T(n) = C n² - n / 6 = (T(1) + 1 / 6) n² - n / 6.
If you try the recursion method:
T(n) = 7^2 T(n/7) + n = 7^2 [7^2 T(n/7^2) + n/7] + n = 7^4 T(n/7^2) + 7n + n
= ... = 7^(2i) * T(n/7^i) + n * [7^0 + 7^1 + 7^2 + ... + 7^(i-1)]
As i grows, n/7^i gets closer to 1, and as mentioned in the other answer, T(1) is a constant. So if we assume T(1) = 1, then:
T(n/7^i) = 1
n/7^i = 1 => i = log_7 (n)
So
T(n) = 7^(2*log_7(n)) * T(1) + n * [7^0 + 7^1 + 7^2 + ... + 7^(log_7(n)-1)]
=> T(n) = n^2 + n * (7^(log_7(n)) - 1)/6 = n^2 + n*(n - 1)/6 = Θ(n^2)
Usually, when no base case is provided for a recurrence relation, the assumption is that the base case is something like T(1) = 1. That way, the recursion eventually terminates.
Something to think about - you can only get an infinite series from your recursion tree if the recursion tree is infinitely deep. Although no base case was specified in the problem, you can operate under the assumption that there is one and that the recursion stops when the input gets sufficiently small for some definition of "sufficiently small." Based on that, at what point does the recursion stop? From there, you should be able to convert your infinite series into a series of finite length, which then will give you your answer.
Hope this helps!
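The exact solution derived above can be sanity-checked numerically (a sketch; T(1) = 1 is an assumption, and with it C = T(1) + 1/6 = 7/6, giving T(n) = (7n^2 - n)/6 for n a power of 7):

```python
def T(n):
    # T(n) = 49*T(n/7) + n, with the assumed base case T(1) = 1
    if n == 1:
        return 1
    return 49 * T(n // 7) + n

# With T(1) = 1: C = 7/6, so T(n) = (7*n^2 - n)/6 for n a power of 7
for m in range(8):
    n = 7 ** m
    assert T(n) == (7 * n * n - n) // 6
```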

How can I find the running time of a recurrence relation?

The running time for this recurrence relation is O(n log n). Since I am new to algorithms, how would I show that mathematically?
T(n) = 2⋅T(n/2) + O(n)
T(n) = 2 ( 2⋅T(n/4) + O(n) ) + O(n) // since T(n/2) = 2⋅T(n/4) + O(n)
So far I can see that if I suppose n to be a power of 2, like n = 2^m, then maybe I can show that, but I am not getting the clear picture. Can anyone help me?
If you use the Master theorem, you get the result you expected.
If you want to prove this by hand, you can see it easily by supposing n = 2^m is a power of 2 (as you already said). This leads you to
T(n) = 2⋅T(n/2) + O(n)
= 2⋅(2⋅T(n/4) + O(n/2)) + O(n)
= 4⋅T(n/4) + 2⋅O(n/2) + O(n)
= 4⋅(2⋅T(n/8) + O(n/4)) + 2⋅O(n/2) + O(n)
= ...
= 2^m⋅T(1) + Σ_{k=1,...,m} 2^k⋅O(n/2^k)
= O(n) + Σ_{k=1,...,m} O(n)
= (m + 1)⋅O(n)
Since m = log₂(n), you can write this as O(n log n).
In the end it doesn't matter whether n is a power of 2 or not.
To see this, think about it this way: you have an input of size n (which is not a power of 2) and you add more elements to the input until it contains n' = 2^m elements with m ∈ ℕ and log(n) ≤ m ≤ log(n) + 1, i.e. n' is the smallest power of 2 that is greater than n. Obviously T(n) ≤ T(n') holds, and we know T(n') is in
O(n'⋅log(n')) = O(c⋅n⋅log(c⋅n)) = O(n⋅log(n) + n⋅log(c)) = O(n⋅log(n))
where c is a constant between 1 and 2.
You can do the same with the greatest power of 2 that is smaller than n. This leads you to T(n) ≥ T(n''), and we know T(n'') is in
O(n''⋅log(n'')) = O(c⋅n⋅log(c⋅n)) = O(n⋅log(n))
where c is a constant between 1/2 and 1.
In total, the complexity of T(n) is bounded by the complexities of T(n'') and T(n'), which are both O(n⋅log(n)), and so T(n) is also in O(n⋅log(n)), even if n is not a power of 2.
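To make the expansion concrete, here is a small numeric check (a sketch that replaces the O(n) term with exactly n and assumes T(1) = 1): for n = 2^m the expansion gives T(n) = n⋅(log₂(n) + 1), which is Θ(n log n).

```python
def T(n):
    # T(n) = 2*T(n/2) + n, taking the O(n) term as exactly n,
    # with an assumed base case T(1) = 1
    if n == 1:
        return 1
    return 2 * T(n // 2) + n

# For n = 2^m the hand expansion predicts T(n) = n * (log2(n) + 1)
for m in range(16):
    n = 2 ** m
    assert T(n) == n * (m + 1)
```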

Can anyone help me find the big theta run-time complexity

I am new to the concept of big-Theta (Θ) run-time complexity.
I have the following recurrence relations to analyze:
T(n) = 2T(n/3) + 5n^2, and I got Θ(n^2)
T(n) = T(n/4) + n^4, and I got Θ(n^4)
Please verify my answers.
Your answers are correct.
You can solve these kinds of problems by applying the Master theorem.
Here is a link to the Master theorem:
http://en.wikipedia.org/wiki/Master_theorem#Generic_form
It applies to T(n) = a*T(n/b) + f(n) where a >= 1 and b > 1.
We need case 3 of the Master theorem:
Case 3: if f(n) = Θ(n^c) where c > log_b(a),
then T(n) = Θ(n^c). (For polynomial f(n), the regularity condition of case 3 holds automatically.)
First recurrence
T(n) = 2T(n/3) + 5n^2
a = 2, b = 3 and f(n) = 5n^2
Therefore, f(n) = Θ(n^c), where c = 2.
Now c > log_b(a), since 2 > log_3(2).
Thus T(n) = Θ(n^2), as you said.
Second recurrence
T(n) = T(n/4) + n^4
a = 1, b = 4 and f(n) = n^4
Therefore, f(n) = Θ(n^c), where c = 4.
Now c > log_b(a), since 4 > log_4(1) = 0.
Thus T(n) = Θ(n^4), as you said.
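This case analysis can be mechanized in a few lines (a sketch using the simplified polynomial form of the theorem; `master_simplified` is a made-up helper name, not a library function):

```python
import math

def master_simplified(a, b, c):
    """Classify T(n) = a*T(n/b) + Theta(n^c) via the simplified Master theorem."""
    crit = math.log(a, b)  # critical exponent log_b(a)
    if c > crit:
        return f"Theta(n^{c})"       # case 3: f(n) dominates
    if c == crit:
        return f"Theta(n^{c} log n)" # case 2: balanced
    return f"Theta(n^{crit:g})"      # case 1: recursion dominates

# The two recurrences from the question:
print(master_simplified(2, 3, 2))  # T(n) = 2T(n/3) + 5n^2
print(master_simplified(1, 4, 4))  # T(n) = T(n/4) + n^4
```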
These are both correct. Because the second term of each recurrence equation is of a much higher order than the first, it will dominate the first term (in layman's terms).

Calculating complexity of recurrence

I am having trouble understanding the concept of recurrences. Given T(n) = 2T(n/2) + 1, how do you calculate the complexity of this relationship? I know that in mergesort the relationship is T(n) = 2T(n/2) + cn, and you can see that you have a tree of depth log₂(n) with cn work at each level. But I am unsure how to proceed given a generic function. Are there any tutorials available that clearly explain this?
The solution to your recurrence is T(n) ∈ Θ(n).
Let's expand the formula:
T(n) = 2*T(n/2) + 1. (Given)
T(n/2) = 2*T(n/4) + 1. (Replace n with n/2)
T(n/4) = 2*T(n/8) + 1. (Replace n with n/4)
T(n) = 2*(2*T(n/4) + 1) + 1 = 4*T(n/4) + 2 + 1. (Substitute)
T(n) = 2*(2*(2*T(n/8) + 1) + 1) + 1 = 8*T(n/8) + 4 + 2 + 1. (Substitute)
And do some observations and analysis:
We can see a pattern emerge: T(n) = 2^k * T(n/2^k) + (2^k − 1).
Now, let k = log2 n. Then n = 2k.
Substituting, we get: T(n) = n * T(n/n) + (n − 1) = n * T(1) + n − 1.
For at least one n, we need to give T(n) a concrete value. So we suppose T(1) = 1.
Therefore, T(n) = n * 1 + n − 1 = 2*n − 1, which is in Θ(n).
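The closed form 2n − 1 is easy to confirm numerically (assuming, as above, T(1) = 1):

```python
def T(n):
    # T(n) = 2*T(n/2) + 1, with the supposed base case T(1) = 1
    if n == 1:
        return 1
    return 2 * T(n // 2) + 1

# For n = 2^m the derivation predicts T(n) = 2n - 1, i.e. Theta(n)
for m in range(12):
    n = 2 ** m
    assert T(n) == 2 * n - 1
```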
Resources:
https://www.cs.duke.edu/courses/spring05/cps100/notes/slides07-4up.pdf
http://www.cs.duke.edu/~ola/ap/recurrence.html
However, for routine work, the normal way to solve these recurrences is to use the Master theorem.
