How to solve T(n) = 2T(n/2) + 1 with the Substitution Method - recursion

Here I have a recursive function, and I want to find its time complexity using the Substitution Method (mathematical induction):
T(n) = 2T(n/2) + 1
The question says our guess should be Omega(log n). I used mathematical induction to prove that T(n) = O(n), reasoning that since n = Omega(log n), T(n) would be too, but I was not able to complete the proof correctly.
I have seen the answers posted for this recurrence before, but none of them used the Substitution Method. Could you help me prove it this way?

Your strategy is correct. We prove by induction that T(n) = Theta(n).
We assume w.l.o.g. the initial condition T(1) = 1.
Induction hypothesis: T(n) = 2n - 1.
Base case: T(1) = 1 = 2 * 1 - 1.
Inductive case:
T(n) = 2T(n/2) + 1
= 2 (2(n/2) - 1) + 1 (inductive hypothesis)
= 2n - 2 + 1
= 2n - 1 (QED)
Hence T(n) = Theta(n), and from there it follows, e.g., that T(n) = Omega(log n).
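If you want a quick sanity check of the closed form, here is a minimal Python sketch (assuming T(1) = 1 and n restricted to powers of 2, so n/2 stays an integer):

    # Verify that T(n) = 2*T(n/2) + 1 with T(1) = 1 equals 2n - 1
    # for n a power of 2. The base case T(1) = 1 is the one assumed above.
    def T(n):
        if n == 1:
            return 1
        return 2 * T(n // 2) + 1

    for k in range(11):              # n = 1, 2, 4, ..., 1024
        n = 2 ** k
        assert T(n) == 2 * n - 1, (n, T(n))
    print("T(n) == 2n - 1 for all tested powers of 2")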

Is this recurrence relation O(infinity)?

T(n) = 49*T(n/7) + n
There are no base conditions given.
I tried solving it using the Master Theorem, and the answer is Theta(n^2). But when solving with a recurrence tree, the solution comes out to be an infinite series, n*(7 + 7^2 + 7^3 + ...).
Can someone please help?
Let n = 7^m. The recurrence becomes
T(7^m) = 49 T(7^(m-1)) + 7^m,
or
S(m) = 49 S(m-1) + 7^m.
The homogeneous part gives
S(m) = C 49^m
and the general solution is
S(m) = C*49^m - (7^m)/6
i.e.
T(n) = C*n^2 - n/6 = (T(1) + 1/6)*n^2 - n/6.
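As a sanity check on this closed form, here is a small Python sketch (assuming n is a power of 7; T1 below stands for the arbitrary initial value T(1)):

    # Check T(n) = (T(1) + 1/6)*n^2 - n/6 against T(n) = 49*T(n/7) + n,
    # for n a power of 7 and a few choices of the initial value T(1).
    def T(n, T1):
        if n == 1:
            return T1
        return 49 * T(n // 7, T1) + n

    for T1 in (0, 1, 5):
        for m in range(1, 6):            # n = 7, 49, ..., 7^5
            n = 7 ** m
            closed = (T1 + 1/6) * n**2 - n/6
            assert abs(T(n, T1) - closed) < 1e-6 * n**2, (n, T1)
    print("closed form matches the recurrence")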
If you try the recursion method:
T(n) = 7^2 T(n/7) + n = 7^2 [7^2 T(n/7^2) + n/7] + n = 7^4 T(n/7^2) + 7n + n
= ... = 7^(2i) * T(n/7^i) + n * [7^0 + 7^1 + 7^2 + ... + 7^(i-1)]
As i grows, n/7^i approaches 1, and as mentioned in the other answer, T(1) is a constant. So if we assume T(1) = 1, then:
T(n/7^i) = 1
n/7^i = 1 => i = log_7 (n)
So
T(n) = 7^(2*log_7 (n)) * T(1) + n * [7^0 + 7^1 + 7^2 + ... + 7^(log_7(n)-1)]
=> T(n) = n^2 + n * [(n - 1)/6] = (7n^2 - n)/6 = Theta(n^2),
since the geometric sum 1 + 7 + 7^2 + ... + 7^(log_7(n) - 1) equals (7^(log_7(n)) - 1)/6 = (n - 1)/6.
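The corrected sum can also be checked numerically; a short sketch assuming T(1) = 1 and n a power of 7 (for such n, (7n^2 - n)/6 is an exact integer):

    # With T(1) = 1, the unrolled sum gives T(n) = (7n^2 - n)/6 exactly.
    def T(n):
        return 1 if n == 1 else 49 * T(n // 7) + n

    for m in range(1, 7):                # n = 7, 49, ..., 7^6
        n = 7 ** m
        assert T(n) == (7 * n * n - n) // 6, n
    print("T(n) == (7n^2 - n)/6 for all tested powers of 7")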
Usually, when no base case is provided for a recurrence relation, the assumption is that the base case is T(1) = 1 or something along those lines. That way, the recursion eventually terminates.
Something to think about - you can only get an infinite series from your recursion tree if the recursion tree is infinitely deep. Although no base case was specified in the problem, you can operate under the assumption that there is one and that the recursion stops when the input gets sufficiently small for some definition of "sufficiently small." Based on that, at what point does the recursion stop? From there, you should be able to convert your infinite series into a series of finite length, which then will give you your answer.
Hope this helps!

Runtime Complexity | Recursive calculation using Master's Theorem

So I've encountered a case where I have 2 recursive calls - rather than one. I do know how to solve for one recursive call, but in this case I'm not sure whether I'm right or wrong.
I have the following problem:
T(n) = T(2n/5) + T(3n/5) + n
And I need to find the worst-case complexity for this.
(FYI - It's some kind of augmented merge sort)
My instinct was to use the first case of the theorem, but I suspect something is wrong with my idea. Any explanation of how to solve problems like this would be appreciated :)
The recursion tree for the given recurrence will look like this:
Size                                   Cost
n                                      n
            /          \
        2n/5            3n/5           n
        /   \           /   \
    4n/25   6n/25   6n/25   9n/25      n
and so on, until the size of the input becomes 1.
The longest simple path from the root to a leaf is n -> (3/5)n -> (3/5)^2 n -> ... -> 1.
Therefore, if we let the height of the tree be k:
((3/5)^k) * n = 1, meaning k = log base 5/3 of n
In the worst case we expect every level to contribute a cost of n, and hence
Total Cost = n * (log base 5/3 of n)
However, we must keep in mind that our tree is not complete, and therefore some levels near the bottom will be only partially full. But in asymptotic analysis we ignore such intricate details.
Hence, worst-case cost = n * (log base 5/3 of n),
which is O(n * log n).
Now, let us verify this using the substitution method:
T(n) = O(n * log n) if T(n) <= d*n*log(n) for some d > 0 and sufficiently large n.
Assuming this to be true:
T(n) = T(2n/5) + T(3n/5) + n
<= d(2n/5)log(2n/5) + d(3n/5)log(3n/5) + n
= d(2n/5)(log n - log 5/2) + d(3n/5)(log n - log 5/3) + n
= dnlog n - d(2n/5)log 5/2 - d(3n/5)log 5/3 + n
= dnlog n - dn( (2/5)log(5/2) + (3/5)log(5/3) ) + n
<= dnlog n
as long as d >= 1/( (2/5)log(5/2) + (3/5)log(5/3) )
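If you want to see the bound empirically, here is a rough Python sketch (assumptions on my part: floor division for the subproblem sizes and a base case T(n) = 1 for n <= 1):

    from functools import lru_cache
    from math import log2

    # T(n) = T(2n/5) + T(3n/5) + n; if T(n) = Theta(n log n), the ratio
    # T(n)/(n*log2(n)) should stay roughly constant as n grows.
    @lru_cache(maxsize=None)
    def T(n):
        if n <= 1:
            return 1
        return T(2 * n // 5) + T(3 * n // 5) + n

    for n in (10**3, 10**4, 10**5, 10**6):
        print(n, T(n) / (n * log2(n)))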

How can I find the running time of a recurrence relation?

The running time for this recurrence relation is O(n log n). Since I am new to algorithms, how would I show that mathematically?
T(n) = 2⋅T(n/2) + O(n)
T(n) = 2 ( 2⋅T(n/4) + O(n) ) + O(n) // since T(n/2) = 2⋅T(n/4) + O(n)
So far I can see that if I suppose n to be a power of 2, like n = 2^m, then maybe I can show it, but I am not getting a clear picture. Can anyone help me?
If you use the master theorem, you get the result you expected.
If you want to prove this "by hand", you can see it easily by supposing n = 2^m is a power of 2 (as you already said). This leads you to
T(n) = 2⋅T(n/2) + O(n)
= 2⋅(2⋅T(n/4) + O(n/2)) + O(n)
= 4⋅T(n/4) + 2⋅O(n/2) + O(n)
= 4⋅(2⋅T(n/8) + O(n/4)) + 2⋅O(n/2) + O(n)
= ...
= Σ_{k=1,...,m} 2^k⋅O(n/2^k)
= Σ_{k=1,...,m} O(n)
= m⋅O(n)
Since m = log₂(n), you can write this as O(n log n).
In the end, it doesn't matter whether n is a power of 2 or not.
To see this, think about the following: you have an input of size n (not a power of 2), and you add more elements to the input until it contains n' = 2^m elements, with m ∈ ℕ and log(n) ≤ m ≤ log(n) + 1, i.e. n' is the smallest power of 2 that is greater than n. Obviously T(n) ≤ T(n') holds, and we know T(n') is in
O(n'⋅log(n')) = O(c⋅n⋅log(c⋅n)) = O(n⋅log(n) + n⋅log(c)) = O(n⋅log(n))
where c is a constant between 1 and 2.
You can do the same with the greatest power of 2 that is smaller than n. This leads you to T(n) ≥ T(n''), and we know T(n'') is in
O(n''⋅log(n'')) = O(c⋅n⋅log(c⋅n)) = O(n⋅log(n))
where c is a constant between 1/2 and 1.
In total, the complexity of T(n) is bounded by the complexities of T(n'') and T(n'), which are both O(n⋅log(n)), so T(n) is also in O(n⋅log(n)), even if n is not a power of 2.
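To make this concrete, here is a sketch with the O(n) term fixed to exactly n (an assumption; any linear term gives the same asymptotics). For n = 2^m and T(1) = 1, unrolling gives exactly n⋅log₂(n) + n:

    # Concrete instance of T(n) = 2*T(n/2) + O(n), with the O(n) term
    # taken to be exactly n and T(1) = 1.
    def T(n):
        return 1 if n == 1 else 2 * T(n // 2) + n

    for m in range(1, 11):               # n = 2, 4, ..., 1024
        n = 2 ** m
        assert T(n) == n * m + n         # i.e. n*log2(n) + n
    print("T(n) == n*log2(n) + n for all tested powers of 2")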

How do you pick variable substitutions in recurrence relations?

In our Data Structures class we are learning how to solve recurrence relations in 1 variable. Unfortunately some things seem to come "out of the blue".
For example, some exercises already tell you how to substitute the variable n:
Compute T(n) for n = 2^k
T(n) = a for n <= 2
T(n) = 8T(n/2) + bn^2 (a and b are > 0)
But some exercises just give you the T(n) without providing a replacement for the variable n:
T(n) = 1 for n <= 1
T(n) = 2T(n/4) + sqrt(n)
I used the iterative method and arrived at the right answer: sqrt(n) + (1/2) * sqrt(n) * log(n).
But when the professor explained it, she started by saying: "Let n = 4^k", which is what I mean by "out of the blue". Using that fact, the answer is much simpler to obtain.
But how is the student supposed to come up with that?
This is another example:
T(n) = 1 for n <= 1
T(n) = 2T( (n-1)/2 ) + n
Here I started again with the iterative method but I can't reach a definitive answer, it looks more complex that way.
After 3 iterative steps I arrived at this:
T(n) = 4T( (n-2)/4 ) + 2n - 1
T(n) = 8T( (n-3)/8 ) + 3n - 3
T(n) = 16T( (n-4)/16 ) + 4n - 6
I am inclined to say T(n) = 2^i * T( (n-i)/2^i ) + i*n - ? This last part I can't figure out; maybe I made a mistake.
However, in the answer she provides, she starts again with another substitution: let n = (2^k) - 1. I don't see where this comes from - why would I do this? What is the logic behind that?
In all of these cases, these substitutions are reasonable because they rewrite the recurrence as one of the form S(k) = aS(k - 1) + f(k). These recurrences are often easier to solve than other recurrences because they define S(k) purely in terms of S(k - 1).
Let's do some examples to see how this works. Consider this recurrence:
T(n) = 1 (if n ≤ 1)
T(n) = 2T(n/4) + sqrt(n) (otherwise)
Here, the size of the problem shrinks by a factor of four on each iteration. Therefore, if the input is a perfect power of four, then the input will shrink from size 4^k to 4^(k-1), from 4^(k-1) to 4^(k-2), etc. until the recursion bottoms out. If we make this substitution and let S(k) = T(4^k), then we get
S(0) = 1
S(k) = 2S(k - 1) + 2^k
This is now a recurrence relation where S(k) is defined in terms of S(k - 1), which can make the recurrence easier to solve.
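Here is a quick numeric sketch of that change of variable (assuming T(1) = 1 and evaluating only at exact powers of 4):

    from math import sqrt

    # T(n) = 2*T(n/4) + sqrt(n) with T(1) = 1. At n = 4^k, the
    # substitution S(k) = T(4^k) should satisfy S(k) = 2*S(k-1) + 2^k.
    def T(n):
        return 1.0 if n <= 1 else 2 * T(n // 4) + sqrt(n)

    S = lambda k: T(4 ** k)
    for k in range(1, 10):
        assert abs(S(k) - (2 * S(k - 1) + 2 ** k)) < 1e-9, k
    print("S(k) == 2*S(k-1) + 2^k holds numerically")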
Let's look at your original recurrence:
T(n) = a (for n ≤ 2)
T(n) = 8T(n/2) + bn^2
Notice that the recursive step divides n by two. If n is a perfect power of two, then the recursive step considers the power of two that comes right before n. Letting S(k) = T(2^k) gives
S(k) = a (for k ≤ 1)
S(k) = 8S(k - 1) + b⋅2^(2k)
Notice how S(k) is defined in terms of S(k - 1), which is a much easier recurrence to solve. The choice of powers of two was "natural" here because it made the recursive step talk purely about the previous value of S and not some arbitrarily smaller value of S.
Now, look at the last recurrence:
T(n) = 1 (n ≤ 1)
T(n) = 2T( (n-1)/2 ) + n
We'd like to find some substitution n = f(k) such that T(f(k)) = 2T(f(k - 1)) + f(k). The question is how to do that.
With some trial and error, we find that f(k) = 2^k - 1 fits the bill, since
(f(k) - 1) / 2 = ((2^k - 1) - 1) / 2 = (2^k - 2) / 2 = 2^(k-1) - 1 = f(k - 1)
Therefore, letting n = 2^k - 1 and setting S(k) = T(2^k - 1), we get
S(k) = 1 (if k ≤ 1)
S(k) = 2S(k - 1) + 2^k - 1
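A quick numeric sketch of this last substitution (assuming the base case T(n) = 1 for n ≤ 1 from the problem):

    # T(n) = 2*T((n-1)/2) + n. At n = 2^k - 1, the substitution
    # S(k) = T(2^k - 1) should satisfy S(k) = 2*S(k-1) + 2^k - 1.
    def T(n):
        return 1 if n <= 1 else 2 * T((n - 1) // 2) + n

    S = lambda k: T(2 ** k - 1)
    for k in range(2, 12):
        assert S(k) == 2 * S(k - 1) + 2 ** k - 1, k
    print("S(k) == 2*S(k-1) + 2^k - 1 holds for the tested k")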
Hope this helps!

Can anyone help me find the big theta run-time complexity

I am new to the concept of Big Theta (Θ) run-time complexity.
I have the following recurrence relations to analyze,
T(n) = 2T(n/3) + 5n^2, and I got Θ(n^2)
T(n) = T(n/4) + n^4, and I got Θ(n^4)
Please verify my answers.
Your answers are correct.
You can solve these kinds of problems by applying the Master Theorem. The link below describes its generic form:
http://en.wikipedia.org/wiki/Master_theorem#Generic_form
If T(n) = a⋅T(n/b) + f(n), where a >= 1 and b > 1,
we need to consider case 3 of the Master Theorem:
Case 3: if f(n) = Θ(n^c) where c > log_b(a),
then T(n) = Θ(n^c).
First recurrence:
T(n) = 2T(n/3) + 5n^2
a = 2, b = 3, and f(n) = 5n^2.
Therefore, f(n) = Θ(n^c), where c = 2.
Now c > log_b(a), since 2 > log_3(2).
Thus T(n) = Θ(n^2), as you said.
Second recurrence:
T(n) = T(n/4) + n^4
a = 1, b = 4, and f(n) = n^4.
Therefore, f(n) = Θ(n^c), where c = 4.
Now c > log_b(a), since 4 > log_4(1) = 0.
Thus T(n) = Θ(n^4), as you said.
These are both correct. Because the second term of each recurrence equation is of a much higher order than the first, it will dominate the first term (in layman's terms).
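If you want to double-check the bounds numerically, here is a small sketch (assuming a base case T(1) = 1, which the problem leaves unspecified); the printed ratios should level off at constants:

    # T1(n) = 2*T1(n/3) + 5n^2 should be Theta(n^2);
    # T2(n) = T2(n/4) + n^4 should be Theta(n^4).
    def T1(n):
        return 1 if n <= 1 else 2 * T1(n // 3) + 5 * n * n

    def T2(n):
        return 1 if n <= 1 else T2(n // 4) + n ** 4

    for m in (2, 4, 6, 8):
        print(3 ** m, T1(3 ** m) / (3 ** m) ** 2)
        print(4 ** m, T2(4 ** m) / (4 ** m) ** 4)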
