Summation for nested loops - math

I hope my question makes sense. I need to come up with the mathematical summation for the following piece of pseudo-code, where S1 and S2 are constant-time operations:
for i ← 1 to n do
    for j ← i to n do
        S1
        for k ← 1 to j do
            S2
I have attempted to come up with the equation which is
T(n) = ∑_{i=1}^{n} ∑_{j=i}^{n} ∑_{k=1}^{j} 1
I tried to solve the innermost sum, that is ∑_{k=1}^{j} 1, and my answer is

∑_{k=1}^{j} 1 = (j − 1 + 1) · 1 = j

which I am not sure is correct.
I am not sure how to go about completing this summation, as I am confused about what the next steps in calculating it are. Any advice is greatly appreciated.
Apologies for the equation syntax above; it did not import properly from Word.
Thank you.

Your actual runtime is the triple sum

T(n) = ∑_{i=1}^{n} ∑_{j=i}^{n} ∑_{k=1}^{j} 1

The innermost sum is ∑_{k=1}^{j} 1 = j. To evaluate ∑_{i=1}^{n} ∑_{j=i}^{n} j, swap the order of summation: for a fixed j, the index i ranges over 1, …, j, so

∑_{i=1}^{n} ∑_{j=i}^{n} j = ∑_{j=1}^{n} j · j = ∑_{j=1}^{n} j² = n(n+1)(2n+1)/6

Therefore T(n) = Θ(n³), so your problem is an element of O(n³).
Have a look at:
Big-O in Docu
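To sanity-check the count, a brute-force tally of the loop bodies can be compared against the closed form ∑_{j=1}^{n} j² = n(n+1)(2n+1)/6 (a sketch of my own, not from the original post):

```python
def count_ops(n):
    """Tally executions of S1 and S2 in the nested loops from the question."""
    s1 = s2 = 0
    for i in range(1, n + 1):
        for j in range(i, n + 1):
            s1 += 1                    # S1 runs once per (i, j) pair
            for k in range(1, j + 1):
                s2 += 1                # S2 runs j times per (i, j) pair
    return s1, s2

for n in (5, 10, 20):
    s1, s2 = count_ops(n)
    # S2 dominates: its count equals the sum of j^2, which is Theta(n^3)
    assert s2 == n * (n + 1) * (2 * n + 1) // 6
    print(n, s1, s2)
```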


Is n! in the order of Theta((n+1)!)? Can you show me a proof? [duplicate]

What about (n-1)!?
Also, if you could show me a proof, that would help me understand better.
I'm stuck on this one.
To show that (n+1)! is in O(n!), you have to show that there is a constant c such that for all big enough n (n > n0) the inequality

(n+1)! <= c · n!

holds. However, since (n+1)! = (n+1) · n!, this simplifies to

n + 1 <= c

which clearly does not hold, since c is a constant and n can be arbitrarily large.
On the other hand, (n-1)! is in O(n!). The proof is left as an exercise.
(n+1)! = n! · (n+1)
O((n+1) · n!) = O(n · n! + n!) = O(2 · n · n!) = O(n · n!), which strictly contains O(n!).
(n-1)! = n! / n
O((n-1)!) = O(n!/n), which is strictly contained in O(n!).
I wasn't formally introduced to algorithmic complexity, so take what I write with a grain of salt.
That said, we know n^3 is way worse than n, right?
Well, since (n + 1)! = (n - 1)! * n * (n + 1),
comparing (n + 1)! to (n - 1)! is like comparing n^3 to n.
Sorry, I don't have a proof, but expanding the factorial as above should lead to one.
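A quick numeric check (my own sketch, not part of the answers above) makes the ratios concrete: (n+1)!/n! = n+1 grows without bound, so no constant c can cap it, while (n-1)!/n! = 1/n ≤ 1, so (n-1)! ≤ 1 · n! always holds:

```python
from math import factorial

for n in (5, 10, 100):
    # (n+1)!/n! = n + 1: unbounded, so (n+1)! is not in O(n!)
    assert factorial(n + 1) // factorial(n) == n + 1
    # n!/(n-1)! = n, i.e. (n-1)! = n!/n <= n!, so (n-1)! is in O(n!)
    assert factorial(n) // factorial(n - 1) == n
```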

Run time of the following code

So my code is
function mystery(n, k):
    if k ≥ n:
        return foo(n)
    sum = 0
    for i = k to n:
        sum = sum + mystery(n, k + 1)
    return sum
I have created a recursion tree, and the answer I am getting is $n^2 \cdot (n-1)!$, where foo is O(n). Is it correct?
You are correct. Each call mystery(n, k) with k < n makes n − k + 1 recursive calls to mystery(n, k + 1), so the call counts multiply rather than add: the number of base-case calls is (n − k + 1) · (n − k) · … · 2 = (n − k + 1)!. Each base case runs foo, which is O(n). Multiplying, the total is O(n · (n − k + 1)!); for k = 1 that is O(n · n!) = O(n² · (n − 1)!), which matches your answer.
[ N.B. Since k is a constant for complexity purposes, it only shifts the factorial's argument. ]
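To check the call count, here is an instrumented Python version of the pseudocode (my own sketch; foo is replaced by a stand-in that just returns n). With k = 1, the number of foo calls comes out to n!:

```python
from math import factorial

def mystery(n, k, counter):
    """The recursion from the question, instrumented to count foo calls."""
    if k >= n:
        counter[0] += 1      # one call to foo(n), which costs O(n)
        return n             # stand-in for foo(n)
    total = 0
    for i in range(k, n + 1):            # n - k + 1 iterations
        total += mystery(n, k + 1, counter)
    return total

for n in (3, 5, 7):
    counter = [0]
    mystery(n, 1, counter)
    # (n - k + 1)! leaves, so k = 1 gives n! calls to foo
    assert counter[0] == factorial(n)
```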

Finding time complexity of recursive formula

I'm trying to find the time complexity (big O) of a recursive formula.
I tried to find a solution; you may see the formula and my solution below:
Like Brenner said, your last assumption is false. Here is why: Let's take the definition of big-O from the Wikipedia page (using n instead of x):

f(n) = O(g(n)) if and only if there exist constants c, n0 s.t. |f(n)| <= c · |g(n)| for all n >= n0.

We want to check if O(2^(n^2)) = O(2^n). Clearly, 2^(n^2) is in O(2^(n^2)), so let's pick f(n) = 2^(n^2) and check if this is in O(2^n). Put this into the above definition:

there exist c, n0: 2^(n^2) <= c · 2^n for all n >= n0

Let's see if we can find suitable constant values n0 and c for which the above is true, or if we can derive a contradiction to prove that it is not:
Take the log on both sides:

log(2^(n^2)) <= log(c · 2^n)

Simplify:

n^2 · log(2) <= log(c) + n · log(2)

Divide by log(2):

n^2 <= log(c)/log(2) + n

It's easy to see now that there are no c, n0 for which the above is true for all n >= n0: the left side grows quadratically while the right side grows only linearly. Thus O(2^(n^2)) = O(2^n) is not a valid assumption.
The last assumption you've specified with the question mark is false! Do not make such assumptions.
The rest of the manipulations you've supplied seem to be correct. But they actually bring you nowhere.
You should have finished this exercise in the middle of your draft:
T(n) = O(T(1)^(3^log2(n)))
And that's it. That's the solution!
You could actually claim that
3^(log2(n)) = n^(log2(3)) ≈ n^1.585
and then you get:
T(n) = O(T(1)^(n^1.585))
which is somewhat similar to the manipulations you've made in the second part of the draft.
So you can also leave it like this. But you cannot mess with the exponent. Changing the value of the exponent changes the big-O classification.
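The identity used in that last step, a^(log_b n) = n^(log_b a), follows from taking log_b of both sides; a quick numeric check (my own sketch):

```python
import math

# a^(log_b n) = n^(log_b a), checked here for a = 3, b = 2
for n in (2.0, 8.0, 100.0, 1000.0):
    lhs = 3 ** math.log2(n)
    rhs = n ** math.log2(3)        # n^(log2 3), i.e. about n^1.585
    assert math.isclose(lhs, rhs)
print(round(math.log2(3), 3))      # 1.585
```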

Recursion in Mathematica

Can anyone explain to me how to use recursion if I don't know the limit in advance? For example, I need the remainder r of the Euclidean algorithm for gcd(a, b), stopping when it equals 0. I figured out that the recursive formula I need is

r[n_] := r[n - 2] - Floor[r[n - 2]/r[n - 1]] * r[n - 1]
r[1] = a - Floor[a/b] * b;
r[2] = b - Floor[b/r[1]] * r[1];
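For illustration in Python (my own sketch, not Mathematica), the same remainder recursion can simply run until it hits 0, so no limit needs to be known in advance:

```python
def remainders(a, b):
    """Remainder sequence of the Euclidean algorithm for gcd(a, b).

    Each step computes r = prev - floor(prev/cur) * cur, i.e. prev mod cur,
    and the loop stops as soon as a remainder of 0 appears.
    """
    seq = []
    while b != 0:
        a, b = b, a - (a // b) * b   # same as a % b
        seq.append(b)
    return seq

print(remainders(48, 18))  # [12, 6, 0]; the last nonzero remainder, 6, is gcd(48, 18)
```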

Recursive Relationship - Fibonacci

I've been doing some past papers for my ComSci course and I've run into a bit of trouble understanding this question:
"Define a recursive relationship that expresses the number of calls involved in using the below function to find the nth Fibonacci number:"
def f(n):
    if n == 1 or n == 2:
        return 1
    else:
        return f(n - 1) + f(n - 2)
I understand how the function works: f(1) and f(2) require 1 call each, f(3) requires 3, f(4) requires 5, etc. However, I'm at a loss as to how to approach this question.
Thanks for reading :)
The question asks you to explain how many calls will be made to f based on n. The part that says, "Define a recursive relationship" is actually a hint about your answer.
So your answer will look something like:
Let T(x) be the function which defines the number of calls to compute f(x)
Then:
T(n) = { something using T and values less than n }
If you are trying to figure this out yourself, stop here; spoilers follow (so that your question is answered completely).
---------------------------------- Spoiler -------------------------------
n=1: T(1) = 1
n=2: T(2) = 1
n>2: T(n) = 1 + T(n - 1) + T(n - 2)
--------------------------------- End Spoiler ------------------------------
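The recurrence can be checked against an instrumented version of the function (a sketch of my own):

```python
def f_counted(n, counter):
    """The Fibonacci function from the question, counting every call."""
    counter[0] += 1
    if n == 1 or n == 2:
        return 1
    return f_counted(n - 1, counter) + f_counted(n - 2, counter)

def T(n):
    """The recurrence from the spoiler: T(1) = T(2) = 1, T(n) = 1 + T(n-1) + T(n-2)."""
    if n <= 2:
        return 1
    return 1 + T(n - 1) + T(n - 2)

for n in range(1, 12):
    counter = [0]
    f_counted(n, counter)
    assert counter[0] == T(n)
print([T(n) for n in range(1, 7)])  # [1, 1, 3, 5, 9, 15]
```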
