Computational complexity of a longest path algorithm with a recursive method

I wrote a code segment to determine the longest path in a graph; the code follows. I don't know how to derive its computational complexity because of the recursive call in the middle. Since finding the longest path is an NP-complete problem, I assume it's something like O(n!) or O(2^n), but how can I actually determine it?
public static int longestPath(int A) {
    int dist2;
    int max = 0;
    visited[A] = true;                            // mark the current node
    for (int k = 1; k <= V; ++k) {
        if (!visited[k]) {
            dist2 = length[A][k] + longestPath(k);  // extend the path through k
            if (dist2 > max) {
                max = dist2;
            }
        }
    }
    visited[A] = false;                           // unmark on the way back (backtracking)
    return max;
}

Your recurrence relation is T(n, m) = m·T(n, m-1) + O(n), where n denotes the number of nodes and m the number of unvisited nodes (you call longestPath once for each of the m unvisited nodes, and the loop executes the visited test n times). The base case is T(n, 0) = O(n) (just the n visited tests).
Solve this and I believe you get T(n, n) = O(n · n!).
EDIT
Working:
T(n, n) = n·T(n, n-1) + O(n)
        = n((n-1)·T(n, n-2) + O(n)) + O(n) = ...
        = n(n-1)···1·T(n, 0) + O(n)(1 + n + n(n-1) + ... + n(n-1)···2)
        = O(n)(1 + n + n(n-1) + ... + n!)
        = O(n)·O(n!)    (the sum is sequence http://oeis.org/A000522)
        = O(n · n!)
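
One way to sanity-check the O(n · n!) bound is to count calls empirically. Below is a small, self-contained harness (my own sketch, not from the question) that runs the method on a complete graph with unit edge lengths and prints the number of recursive calls for small V:

public class LongestPathCount {
    static int V;                 // number of vertices (1-indexed, as in the question)
    static boolean[] visited;
    static int[][] length;
    static long calls;            // counts invocations of longestPath

    static int longestPath(int a) {
        calls++;
        int max = 0;
        visited[a] = true;
        for (int k = 1; k <= V; ++k) {
            if (!visited[k]) {
                int dist2 = length[a][k] + longestPath(k);
                if (dist2 > max) max = dist2;
            }
        }
        visited[a] = false;
        return max;
    }

    public static void main(String[] args) {
        for (V = 1; V <= 8; V++) {
            visited = new boolean[V + 1];
            length = new int[V + 1][V + 1];
            for (int[] row : length) java.util.Arrays.fill(row, 1);  // complete graph, unit lengths
            calls = 0;
            longestPath(1);
            System.out.println("V=" + V + " calls=" + calls);  // grows factorially
        }
    }
}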

Related

Recurrence Relation

I have a method:
int Tree(int n) {
    if (n <= 0) return 0;
    if (n == 1) return 1;
    return (n * n) + Tree(n - 3);
}
I'm trying to find the recurrence relation that captures the running time T(n) of the method Tree. So far I have T(n) = T(n-3) + O(1); next I need to express the running time as a series of terms, where each term gives the number of operations at a distinct level of the recursion tree:
I have T(n) = T(n-3) + O(1), then T(n-1) = T(n-4) + O(1), then T(n-2) = T(n-5) + O(1)
...
But I'm unsure if this is right.
After T(n) = T(n-3) + O(1) you don't need to expand T(n-1) but T(n-3), which is T(n-6) + O(1). Writing n = 3p + r you get T(3p+r) = T(3(p-1)+r) + O(1), which gives T(3p+r) = T(r) + O(p). Since T(r) = O(1) and O(p) = O(n), T(n) = O(n).
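
To see the linear bound concretely, here is a small sketch (the counting harness is mine, not from the question) that counts how many times Tree is invoked; the count grows like n/3, i.e. O(n):

public class TreeCount {
    static long calls;   // counts invocations of tree

    static int tree(int n) {
        calls++;
        if (n <= 0) return 0;
        if (n == 1) return 1;
        return (n * n) + tree(n - 3);
    }

    public static void main(String[] args) {
        for (int n : new int[] {10, 100, 1000}) {
            calls = 0;
            tree(n);
            System.out.println("n=" + n + " calls=" + calls);  // about n/3 + 1 calls
        }
    }
}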

Time complexity of reversing a stack with recursion

I have tried to find the time complexity of the following code.
int insert_at_bottom(int x, stack st) {
    if (st is empty) {
        st.push(x)
    }
    else {
        int a = st.top()
        st.pop()
        insert_at_bottom(x, st)   // the stack argument was missing here
        st.push(a)
    }
}

void reverse(stack st) {
    if (st is not empty) {
        int x = st.top()          // was "int st = st.top()", which shadowed the stack
        st.pop()
        reverse(st)
        insert_at_bottom(x, st)
    }
}

// driver function
stack reverseStack(stack st) {
    reverse(st)
    return st
}
For each element on top of the stack, we pop the whole stack and place that top element at the bottom, which takes O(n) operations; these O(n) operations are performed once for every element in the stack, so the time complexity should be O(n^2).
However, I want to derive the time complexity mathematically. I tried to find the recurrence relation, and I got T(n) = 2T(n-1) + 1. This is probably wrong, as the time complexity of the second function call should not be counted as T(n-1).
Your reasoning is generally correct. If insertion at the bottom takes O(n) time, then the reverse function takes O(n^2) time, because it performs a linear-time operation on the stack for each element.
The recurrence relation for reverse() would look a bit different though. In each step, you do three things:
Call itself on n-1
An O(n) time operation (insert_at_bottom())
Some constant-time stuff
Thus, you can just sum these together. So I would argue it can be written as:
T(n) = T(n-1) + n + c, where c is a constant.
You will find that, due to the recursion, T(n-1) = T(n-2) + (n-1) + c. If you keep expanding the series in this fashion for n > 0, you obtain
T(n) = 1 + 2 + ... + (n-1) + n + nc
(absorbing the base case T(0) into the constants). Since 1 + 2 + ... + n = n(n+1)/2, we obtain
T(n) = n(n+1)/2 + nc = n^2/2 + n/2 + nc = O(n^2). □
The O(n) bound for insert_at_bottom() can be shown in a similar way.
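
For reference, here is one way to turn the pseudocode into runnable Java; the use of java.util.ArrayDeque and the class/method names are my choices, not part of the original question:

import java.util.ArrayDeque;
import java.util.Deque;

public class StackReverse {
    // Pushes x below all current elements: O(n) pops and pushes.
    static void insertAtBottom(int x, Deque<Integer> st) {
        if (st.isEmpty()) {
            st.push(x);
        } else {
            int a = st.pop();
            insertAtBottom(x, st);
            st.push(a);
        }
    }

    // Reverses the stack in place: T(n) = T(n-1) + O(n) = O(n^2).
    static void reverse(Deque<Integer> st) {
        if (!st.isEmpty()) {
            int x = st.pop();
            reverse(st);
            insertAtBottom(x, st);
        }
    }

    public static void main(String[] args) {
        Deque<Integer> st = new ArrayDeque<>();
        for (int i = 1; i <= 5; i++) st.push(i);
        reverse(st);
        System.out.println(st);  // prints [1, 2, 3, 4, 5]: the first element pushed is now on top
    }
}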

Big(O) of recursive n choose k code

Is the big(O) of the following recursive code simply O(n choose k)?
int nchoosek(int n, int k) {
    if (n == k) return 1;
    if (k == 0) return 1;
    return nchoosek(n-1, k) + nchoosek(n-1, k-1);
}
The recurrence for this function is
T(n, k) = T(n-1, k) + T(n-1, k-1) + O(1)
Note that both recursive calls are executed on every step; it is a sum of two subproblems, not a choice between two paths, so you cannot analyse just the deeper branch. A cleaner way to count the work is through the recursion tree: every leaf is a call that returns 1, and the function's return value is exactly the number of its leaves, so the tree has C(n, k) leaves. A binary tree with C(n, k) leaves has C(n, k) - 1 internal nodes, each doing O(1) work, hence
T(n, k) = Θ(n choose k)
So yes, the Big-O of this recursive code is O(n choose k), as you guessed.
A different algorithm computes the same value much faster, using the definition of the choose function in terms of the factorial function:
nCk = n!/(k!(n-k)!)
The running time of the factorial is O(n), even in the recursive case. Computing nCk this way requires calculating the factorial three times:
n! => O(n)
k! => O(k) = O(n)
(n-k)! => O(n-k) = O(n)
The multiplication and division are both constant time O(1), hence the running time of this alternative is
T(n) = 3*O(n) + 2*O(1) = O(n)
which is exponentially faster than the plain recursive version above.
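
If you want to keep the recursive structure but avoid the exponential blow-up, memoization brings the cost down to O(n·k), since each (n, k) pair is computed at most once. A minimal sketch, assuming a global table sized for the query (the class name, table layout, and Java specifics are mine):

public class Binomial {
    static long[][] memo;   // memo[n][k] == 0 means "not yet computed"

    static long nchoosek(int n, int k) {
        if (k == 0 || n == k) return 1;
        if (memo[n][k] != 0) return memo[n][k];               // reuse earlier result
        return memo[n][k] = nchoosek(n - 1, k) + nchoosek(n - 1, k - 1);
    }

    public static void main(String[] args) {
        int n = 30, k = 15;
        memo = new long[n + 1][k + 1];
        System.out.println(nchoosek(n, k));  // 155117520
    }
}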

Calculating the complexity of the function

I have written a function to calculate the length of the longest increasing subsequence. Here arr[] is an array of length n, and lisarr, also of length n, stores the length of the LIS ending at element i.
I am having difficulty deriving the recurrence relation to feed into the Master theorem.
int findLIS(int n) {
    if (n == 0)
        return 1;
    int res;
    for (int i = 0; i < n; i++) {
        res = findLIS(i);
        if (arr[n] > arr[i] && res + 1 > lisarr[n])
            lisarr[n] = res + 1;
    }
    return lisarr[n];
}
Please show how to derive the recurrence relation. Should it be
T(n) = O(n) + T(1)?
It is O(2^n). Let's count the exact number of calls and denote it f(n). The recurrence relation is f(n) = 1 + f(n-1) + f(n-2) + ... + f(1) + f(0), with f(1) = 2 and f(0) = 1. Subtracting the same identity written for f(n-1) gives f(n) - f(n-1) = f(n-1), i.e. f(n) = 2·f(n-1), and finally f(n) = 2^n.
Proof by induction:
Base case n = 0: only one invocation of the function.
Assume f(n) = 2^n. Then for input n+1 we have f(n+1) = 1 + f(n) + f(n-1) + ... + f(1) + f(0) = 1 + (2^n + 2^(n-1) + ... + 2 + 1) = 1 + (2^(n+1) - 1) = 2^(n+1). The 1 at the beginning accounts for the work outside the for loop, and the sum comes from the for loop: there is one recursive call for each i in {0, 1, 2, ..., n}.
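
You can confirm f(n) = 2^n with a small counter; the harness below is my own sketch, and the contents of arr do not affect the call count:

public class LisCount {
    static int[] arr, lisarr;
    static long calls;   // counts invocations of findLIS

    static int findLIS(int n) {
        calls++;
        if (n == 0) return 1;
        for (int i = 0; i < n; i++) {
            int res = findLIS(i);
            if (arr[n] > arr[i] && res + 1 > lisarr[n])
                lisarr[n] = res + 1;
        }
        return lisarr[n];
    }

    public static void main(String[] args) {
        for (int n = 0; n <= 10; n++) {
            arr = new int[n + 1];
            lisarr = new int[n + 1];
            calls = 0;
            findLIS(n);
            System.out.println("n=" + n + " calls=" + calls);  // prints exactly 2^n
        }
    }
}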

Complexity of recursive algorithm, that finds largest element in an array

How do you calculate the complexity of this recursive algorithm?
int findMax(int a[], int l, int r)
{
    if (r - l == 1)
        return a[l];
    int m = (l + r) / 2;
    int u = findMax(a, l, m);
    int v = findMax(a, m, r);
    if (u > v)
        return u;
    else
        return v;
}
From the Master Theorem:
T(n) = a * T(n/b) + f(n)
Where:
a is the number of sub-problems
f(n) is the cost of the work outside the recursion; f(n) = O(n^c)
n/b is the size of each sub-problem
The idea behind this function is that you repeat the operation on the first half of the items (T(n/2)) and on the second half (T(n/2)), then compare the two results (O(1)), so you have:
T(n) = 2 * T(n/2) + O(1)
So f(n) = O(1), which is O(n^0) in terms of n; we need that to determine c. Thus a = 2, b = 2 and c = 0. From the Master Theorem (as correctly pointed out in comments) we are in the case c < log_b(a), since log_2(2) = 1 and c = 0. In this case the complexity of the whole recursive call is O(n^(log_b a)) = O(n).
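
You can also verify the call count directly: unrolling T(n) = 2T(n/2) + O(1) shows that a range of n elements produces exactly 2n - 1 invocations, each doing O(1) work outside the recursion. A small counting harness (my own, not from the question):

public class FindMaxCount {
    static long calls;   // counts invocations of findMax

    static int findMax(int[] a, int l, int r) {
        calls++;
        if (r - l == 1)
            return a[l];
        int m = (l + r) / 2;
        int u = findMax(a, l, m);
        int v = findMax(a, m, r);
        return Math.max(u, v);
    }

    public static void main(String[] args) {
        int n = 1000;
        int[] a = new int[n];
        for (int i = 0; i < n; i++) a[i] = i * 31 % 997;  // arbitrary values
        System.out.println("max=" + findMax(a, 0, n));
        System.out.println("calls=" + calls + " (exactly 2n - 1 = " + (2 * n - 1) + ")");
    }
}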
