Recurrence Relation - recursion

I have a method:
int Tree(int n) {
    if (n <= 0) return 0;
    if (n == 1) return 1;
    return (n * n) + Tree(n - 3);
}
I'm trying to find the recurrence relation that captures the running time T(n) of the method Tree. So far I've got T(n) = T(n-3) + O(1). Next I need to express the running time as a series of terms, where each term denotes the number of operations at a distinct level of the recursion tree:
T(n) = T(n-3) + O(1), then T(n-1) = T(n-4) + O(1), then T(n-2) = T(n-5) + O(1)
...
But I'm unsure if this is right.

After T(n) = T(n-3) + O(1) you don't need to look at T(n-1) but at T(n-3), which is T(n-6) + O(1). Writing n = 3p + r, you get T(3p + r) = T(3(p-1) + r) + O(1), which unrolls to T(3p + r) = T(r) + O(p). Since T(r) = O(1) and O(p) = O(n), T(n) = O(n).
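One concrete way to see the O(p) term is to count the recursive calls directly. A minimal sketch, assuming a hypothetical calls counter added for illustration:

static long calls = 0;

static int Tree(int n) {
    calls++;                        // one O(1) stack frame per level of the recursion
    if (n <= 0) return 0;
    if (n == 1) return 1;
    return (n * n) + Tree(n - 3);
}

public static void main(String[] args) {
    for (int n = 10; n <= 10000; n *= 10) {
        calls = 0;
        Tree(n);
        // prints roughly n/3 + 1 calls, i.e. linear in n
        System.out.println("n=" + n + " calls=" + calls);
    }
}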

Is this recurrence relation O(infinity)?
T(n) = 49*T(n/7) + n
There are no base conditions given.
I tried solving it using the Master theorem and the answer is Theta(n^2). But when solving with a recursion tree, the solution comes out to be an infinite series, n*(7 + 7^2 + 7^3 + ...).
Can someone please help?
Let n = 7^m. The recurrence becomes
T(7^m) = 49 T(7^(m-1)) + 7^m,
or
S(m) = 49 S(m-1) + 7^m.
The homogeneous part gives
S(m) = C 49^m,
and trying a particular solution of the form A 7^m gives A = 7A + 1, i.e. A = -1/6. So the general solution is
S(m) = C 49^m - 7^m / 6,
i.e.
T(n) = C n^2 - n/6 = (T(1) + 1/6) n^2 - n/6.
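A quick numerical check of that closed form (a sketch, assuming the base case T(1) = 1, so that C = 7/6 and T(n) = (7n^2 - n)/6):

static long T(long n) {
    if (n <= 1) return 1;            // assumed base case T(1) = 1
    return 49 * T(n / 7) + n;        // the recurrence itself
}

public static void main(String[] args) {
    for (long n = 7; n <= 823543; n *= 7) {      // powers of 7 up to 7^7
        long closed = (7 * n * n - n) / 6;       // (7/6) n^2 - n/6
        System.out.println(n + ": recurrence=" + T(n) + " closed=" + closed);
    }
}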
If you try the recursion method:
T(n) = 7^2 T(n/7) + n = 7^2 [7^2 T(n/7^2) + n/7] + n = 7^4 T(n/7^2) + 7n + n
= ... = 7^(2i) * T(n/7^i) + n * [7^0 + 7^1 + 7^2 + ... + 7^(i-1)]
As i grows, n/7^i gets closer to 1, and as mentioned in the other answer, T(1) is a constant. So if we assume T(1) = 1, the recursion bottoms out at:
T(n/7^i) = T(1) = 1
n/7^i = 1 => i = log_7(n)
So
T(n) = 7^(2*log_7(n)) * T(1) + n * [7^0 + 7^1 + 7^2 + ... + 7^(log_7(n)-1)]
=> T(n) = n^2 + n * (7^(log_7(n)) - 1)/6 = n^2 + n(n-1)/6 = Theta(n^2)
Usually, when no base case is provided for a recurrence relation, the assumption is that the base case is something like T(1) = 1. That way, the recursion eventually terminates.
Something to think about - you can only get an infinite series from your recursion tree if the recursion tree is infinitely deep. Although no base case was specified in the problem, you can operate under the assumption that there is one and that the recursion stops when the input gets sufficiently small for some definition of "sufficiently small." Based on that, at what point does the recursion stop? From there, you should be able to convert your infinite series into a series of finite length, which then will give you your answer.
Hope this helps!

Calculate big O complexity of partitioning problem

My pseudo-code looks like:
solve(n)
    for i := 1 to n do
        process(i);
        solve(n-i);
where process(n) is a function with some complexity f(n). In my case f(n) = O(n^2), but I am also interested in the general case (for example, f(n) = O(n)).
So I have T(n) = f(n) + ... + f(1) + T(n-1) + ... + T(1). I cannot apply the Master theorem because the subproblems are not the same size.
How do I calculate the complexity of this recursion?
Small trick – consider solve(n-1):
solve(n):   T(n) = f(n) + f(n-1) + ... + f(1) + T(n-1) + T(n-2) + ... + T(0)
solve(n-1): T(n-1) = f(n-1) + ... + f(1) + T(n-2) + ... + T(0)
Subtract the latter from the former; every term except f(n) and one copy of T(n-1) cancels:
T(n) - T(n-1) = f(n) + T(n-1)
T(n) = 2T(n-1) + f(n)
Expand repeatedly:
T(n) = 2T(n-1) + f(n)
     = 4T(n-2) + 2f(n-1) + f(n)
     = ...
     = 2^n T(0) + [2^0 f(n) + 2^1 f(n-1) + ... + 2^(n-1) f(1)]
Substitute f into the last summation to obtain the complexity. E.g. for f(n) = O(n):
2^0 n + 2^1 (n-1) + ... + 2^(n-1) * 1 = 2^(n+1) - n - 2
so T(n) = O(2^n).
Alternative method – variable substitution: let m = 2^n and S(m) = T(log_2 m) = T(n), so that
S(m) = 2S(m/2) + f(log_2 m)
S(m) is in the correct form for the Master theorem. E.g. for f(n) = O(n) = O(log m), the driving term log m is polynomially smaller than m^(log_2 2) = m, so the case for small driving functions applies and
S(m) = Theta(m) => T(n) = Theta(2^n)
Same result, q.e.d.
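A small sanity check of the expansion (a sketch, assuming f(n) = n and the base case T(0) = 1):

public static void main(String[] args) {
    long t = 1;                                  // T(0) = 1 (assumed base case)
    for (int n = 1; n <= 30; n++) {
        t = 2 * t + n;                           // T(n) = 2T(n-1) + f(n), with f(n) = n
        long closed = (1L << n) + (1L << (n + 1)) - n - 2;  // 2^n T(0) + 2^(n+1) - n - 2
        System.out.println(n + ": recurrence=" + t + " closed=" + closed);
    }
}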

Big-O complexity recursion Vs iteration

Question 5 on Determining complexity for recursive functions (Big O notation) is:
int recursiveFun(int n)
{
    for (int i = 0; i < n; i += 2) {
        // Do something.
    }
    if (n <= 0)
        return 1;
    else
        return 1 + recursiveFun(n-5);
}
To highlight my question, I'll change the recursive parameter from n-5 to n-2:
int recursiveFun(int n)
{
    for (int i = 0; i < n; i += 2) {
        // Do something.
    }
    if (n <= 0)
        return 1;
    else
        return 1 + recursiveFun(n-2);
}
I understand the loop runs in n/2 since a standard loop runs in n and we're iterating half the number of times.
But isn't the same also happening for the recursive call? For each recursive call, n is decremented by 2. If n is 10, the call stack is:
recursiveFun(8)
recursiveFun(6)
recursiveFun(4)
recursiveFun(2)
recursiveFun(0)
...which is 5 calls (i.e. 10/2 or n/2). Yet the answer provided by Michael_19 states it runs in n-5 or, in my example, n-2. Clearly n/2 is not the same as n-2. Where have I gone wrong and why is recursion different from iteration when analyzing for Big-O?
A common way to analyze the big-O of a recursive algorithm is to find a recurrence that "counts" the number of operations done by the algorithm. It is usually denoted T(n).
In your example, the time complexity of this code can be described with the formula:
T(n) = C*n/2 + T(n-2)
where C*n/2 is the loop (assuming "Do something" is constant time) and T(n-2) is the recursive call.
Since it's pretty obvious it will be in O(n^2), let's show T(n) is in Omega(n^2) using induction:
Induction hypothesis:
T(k) >= C/8 * k^2 for 0 <= k < n
And indeed:
T(n) = C*n/2 + T(n-2) >= (i.h.) C*n/2 + C/8 * (n-2)^2
     = C/8 * (4n + n^2 - 4n + 4)
     = C/8 * (n^2 + 4)
And so:
T(n) >= C/8 * (n^2 + 4) > C/8 * n^2
Thus, T(n) is in Omega(n^2).
Showing big-O is done similarly:
Hypothesis: T(k) <= C*k^2 for all 2 <= k < n
T(n) = C*n/2 + T(n-2) <= (i.h.) C*n/2 + C*(n^2 - 4n + 4)
     = C*(n^2 - 4n + 4 + n/2) = C*(n^2 - 7n/2 + 4)
For all n >= 2, -7n/2 + 4 <= 0, so for any n >= 2:
T(n) <= C*(n^2 - 7n/2 + 4) <= C*n^2
And the hypothesis is correct - by the definition of big-O, T(n) is in O(n^2).
Since we have shown T(n) is both in O(n^2) and Omega(n^2), it is also in Theta(n^2).
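The quadratic growth can also be checked empirically; a minimal sketch, assuming each loop iteration costs one unit of work:

static long count(int n) {
    if (n <= 0) return 0;
    long ops = 0;
    for (int i = 0; i < n; i += 2) ops++;   // the ~n/2 loop iterations
    return ops + count(n - 2);              // plus the work of the recursive call
}

public static void main(String[] args) {
    for (int n = 100; n <= 1600; n *= 2)
        // for even n, count(n) = n^2/8 + n/4, i.e. Theta(n^2)
        System.out.println(n + ": " + count(n) + " ~ " + ((long) n * n) / 8);
}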
Analyzing recursion is different from analyzing iteration because:
n (and other local variables) change each time, and it might be hard to catch this behavior.
Things get way more complex when there are multiple recursive calls.

Big(O) of recursive n choose k code

Is the big(O) of the following recursive code simply O(n choose k)?
int nchoosek(int n, int k) {
    if (n == k) return 1;
    if (k == 0) return 1;
    return nchoosek(n-1, k) + nchoosek(n-1, k-1);
}
I'm not sure if the notation is standard, but you can write the recurrence for this function as
T(n, k) = T(n-1, k) + T(n-1, k-1) + O(1)
Note that both recursive calls are executed, so you cannot analyze the two branches separately and take the worse one: the recursion tree branches at every internal node. Every leaf returns 1, and the results are only ever added together, so the number of leaves equals the value the function computes, namely C(n, k). A full binary tree with C(n, k) leaves has C(n, k) - 1 internal nodes, and each call does O(1) work besides the recursion, hence
T(n, k) = Theta(n choose k)
So yes, O(n choose k) is correct, and it is tight.
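You can verify the node count empirically; a sketch, with a hypothetical calls counter added for illustration:

static long calls = 0;

static int nchoosek(int n, int k) {
    calls++;                        // one O(1) node of the recursion tree
    if (n == k) return 1;
    if (k == 0) return 1;
    return nchoosek(n-1, k) + nchoosek(n-1, k-1);
}

public static void main(String[] args) {
    int value = nchoosek(20, 10);   // C(20, 10) = 184756
    // total calls should be 2*C(20, 10) - 1 = 369511
    System.out.println("value=" + value + " calls=" + calls);
}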
By contrast, you can compute the same value much faster by reducing the problem to known ones, for example by using the definition of the choose function in terms of the factorial function:
nCk = n! / (k! (n-k)!)
The running time of the factorial is O(n), even in the recursive case.
nCk requires calculating the factorial three times:
n! => O(n)
k! => O(k) = O(n)
(n-k)! => O(n-k) = O(n)
The multiplication and division are both constant time O(1), hence the running time is:
T(n) = 3*O(n) + 2*O(1) = O(n)
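A direct translation of that approach (a sketch; note that a long overflows past n = 20, so this only illustrates the O(n) operation count):

static long factorial(int n) {
    long f = 1;
    for (int i = 2; i <= n; i++) f *= i;   // n - 1 multiplications: O(n)
    return f;
}

static long nchoosekFast(int n, int k) {
    // three O(n) factorials plus O(1) arithmetic => O(n) overall
    return factorial(n) / (factorial(k) * factorial(n - k));
}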

Computational complexity of a longest path algorithm with a recursive method

I wrote a code segment to determine the longest path in a graph; the code follows. But I don't know how to work out its computational complexity, because of the recursive method in the middle. Since finding the longest path is an NP-complete problem, I assume it's something like O(n!) or O(2^n), but how can I actually determine it?
public static int longestPath(int A) {
    int k;
    int dist2 = 0;
    int max = 0;
    visited[A] = true;
    for (k = 1; k <= V; ++k) {
        if (!visited[k]) {
            dist2 = length[A][k] + longestPath(k);
            if (dist2 > max) {
                max = dist2;
            }
        }
    }
    visited[A] = false;
    return max;
}
Your recurrence relation is T(n, m) = mT(n, m-1) + O(n), where n denotes the number of nodes and m denotes the number of unvisited nodes (because you call longestPath m times, and there is a loop which executes the visited test n times). The base case is T(n, 0) = O(n) (just the visited test).
Solve this and I believe you get T(n, n) is O(n * n!).
EDIT
Working:
T(n, n) = nT(n, n-1) + O(n)
        = n((n-1)T(n, n-2) + O(n)) + O(n) = ...
        = n(n-1)...1 * T(n, 0) + O(n)(1 + n + n(n-1) + ... + n(n-1)...2)
        = O(n)(1 + n + n(n-1) + ... + n!)
        = O(n)O(n!) (see http://oeis.org/A000522)
        = O(n * n!)
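You can check that growth rate on a complete graph; a sketch, assuming zero edge weights (only the number of calls matters here) and a hypothetical calls counter:

static int V;
static boolean[] visited;
static int[][] length;
static long calls = 0;

static int longestPath(int A) {
    calls++;
    int max = 0;
    visited[A] = true;
    for (int k = 1; k <= V; ++k) {
        if (!visited[k]) {
            int dist2 = length[A][k] + longestPath(k);
            if (dist2 > max) max = dist2;
        }
    }
    visited[A] = false;
    return max;
}

public static void main(String[] args) {
    for (V = 2; V <= 8; V++) {
        visited = new boolean[V + 1];
        length = new int[V + 1][V + 1];   // complete graph, all weights 0
        calls = 0;
        longestPath(1);
        // calls follows A000522(V-1) = sum of (V-1)!/k!, roughly e*(V-1)!
        System.out.println("V=" + V + " calls=" + calls);
    }
}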
