Big Oh with log (n) and exponents - math

So I have a few given functions and need to find Big-O for them (which I did).
n log(n) = O(n log(n))
n^2 = O(n^2)
n log(n^2) = O(n log(n))
n log(n)^2 = O(n^3)
n = O(n)
log is the natural logarithm.
I am pretty sure that 1,2,5 are correct.
For 3 I found a solution somewhere here: n log(n^2) = 2 n log (n) => O (n log n)
But I am completely unsure about 4). n^3 is definitely bigger than n log(n)^2, but is it the O of it? My other guess would be O(n^2).
A few other things:
n^2 * log(n)
n^2 * log(n)^2
What would that be?
Would be great if someone could explain it if it is wrong. Thank you!

Remember that big-O provides an asymptotic upper bound on a function, so any function that is O(n) is also O(n log n), O(n^2), O(n!), etc. Since log n = O(n), we have n log^2 n = O(n^3). It's also the case that n log^2 n = O(n log^2 n) and n log^2 n = O(n^2). In fact, n log^2 n = O(n^(1 + ε)) for any ε > 0, since log^k n = O(n^ε) for any ε > 0.
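If it helps to see that numerically, here's a quick Python sanity check (my own sketch, with arbitrary sample points) showing that (log n)^k / n^ε climbs at first but eventually falls toward 0:

    import math

    # Ratio (log n)^k / n^eps: it rises for moderate n, then tends to 0,
    # illustrating that log^k n = O(n^eps) for any fixed eps > 0.
    k, eps = 2, 0.1
    for exp in [1, 2, 5, 10, 30, 100, 300]:
        n = 10.0 ** exp  # plain floats are fine up to about 1e308
        print(f"n = 1e{exp}: (log n)^{k} / n^{eps} = {math.log(n) ** k / n ** eps:.4g}")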
The functions n^2 log n and n^2 log^2 n can't be simplified in the way that some of the other ones can. Runtimes of the form O(n^k log^r n) aren't all that uncommon. In fact, there are many algorithms that have runtime O(n^2 log n) and O(n^2 log^2 n), and these runtimes are often left as such. For example, each iteration of the Karger-Stein algorithm takes time O(n^2 log n) because this runtime comes from the Master Theorem as applied to the recurrence
T(n) = 2T(n / √2) + O(n^2)
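To spell out that Master Theorem step as I read it: the recurrence splits into a = 2 subproblems of size n / √2, so b = √2 and
n^(log_b a) = n^(log_√2 2) = n^2.
Since f(n) = Θ(n^2) matches n^(log_b a) exactly, this is case 2 of the Master Theorem, giving T(n) = Θ(n^2 log n).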
Hope this helps!

Is n! in the order of Theta((n+1)!)? Can you show me a proof? [duplicate]

What about (n-1)!?
Also if you could show me a proof that would help me understand better.
I'm stuck on this one.
To show that (n+1)! is in O(n!) you have to show that there is a constant c so that for all big enough n (n > n0) the inequality
(n+1)! < c n!
holds. However since (n+1)! = (n+1) n! this simplifies to
n+1 < c
which clearly does not hold since c is a constant and n can be arbitrarily large.
On the other hand, (n-1)! is in O(n!). The proof is left as an exercise.
(n+1)! = (n+1) * n!
O((n+1) * n!) = O(n*n! + n!) = O(n*n!), which grows strictly faster than O(n!).
(n-1)! = n!/n
O((n-1)!) = O(n!/n), which is contained in O(n!).
I wasn't formally introduced to algorithmic complexity, so take what I write with a grain of salt.
That said, we know n^3 is way worse than n, right?
Well, since (n + 1)! = (n - 1)! * n * (n + 1),
comparing (n + 1)! to (n - 1)! is like comparing n^3 to n.
Sorry, I don't have a proof, but expanding the factorial as above should lead to one.
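If a numerical sanity check helps (a quick Python sketch of my own; the sample values are arbitrary):

    import math

    # (n+1)!/n! = n + 1 grows without bound, so no constant c gives
    # (n+1)! <= c * n!. By contrast, n!/(n-1)! = n, i.e. (n-1)! = n!/n,
    # which is certainly bounded by 1 * n!.
    for n in [10, 100, 1000]:
        print(n,
              math.factorial(n + 1) // math.factorial(n),   # n + 1
              math.factorial(n) // math.factorial(n - 1))   # n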

Comparing O((log n)^const) with O(n)

I did some computation and found that if const = 2, the derivative of n is 1 as n goes to infinity, while the derivative of (log n)^2 is 2 log n / n, which tends to 0. So it seems the ratio O(n) / O((log n)^2) should diverge as n goes to infinity. But what if const > 2?
Rather than looking at the derivative, consider rewriting each expression in terms of the same base. Notice, for example, that for any logarithm base b that
n = b^(log_b n),
so in particular
n = (log n)^(log_(log n) n)
which can be rewritten using properties of logarithms as
n = (log n)^(log n / log log n)
Your question asks how (log n)^k compares against n. This means that we're comparing (log n)^k against (log n)^(log n / log log n). This should make clearer that no constant k will ever cause (log n)^k to exceed n, since the term log n / log log n will eventually exceed k for any fixed constant k.
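A rough numerical illustration of that last point (my own Python sketch; sample points arbitrary): the exponent log n / log log n that (log n)^k would have to match keeps growing past any fixed k:

    import math

    # n = (log n)^(log n / log log n), so the "exponent budget" is
    # log n / log log n, and it outgrows any fixed constant k.
    for exp in [2, 5, 10, 50, 100, 300]:
        n = 10.0 ** exp
        print(f"n = 1e{exp}: log n / log log n = {math.log(n) / math.log(math.log(n)):.2f}")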

Big-O proof involving a sum of logs

Prove that the sum from i = 1 to n of ⌈log(n/i)⌉ is O(n).
I put the series into the summation, but I have no idea how to tackle this problem. Any help is appreciated.
There are two useful mathematical facts that can help out here. First, note that ⌈x⌉ ≤ x + 1 for any x. Therefore,
sum from i = 1 to n (⌈log (n/i)⌉) ≤ (sum from i = 1 to n log (n / i)) + n
Therefore, if we can show that the second summation is O(n), we're done.
Using properties of logs, we can rewrite
log(n/1) + log(n/2) + ... + log(n/n)
= log(n^n / n!)
Let's see if we can simplify this. Using properties of logarithms, we get that
log(n^n / n!) = log(n^n) - log(n!)
= n log n - log(n!)
Now, we can use Stirling's approximation, which says that
log (n!) = n log n - n + O(log n)
Therefore:
n log n - log (n!)
= n log n - n log n + n - O(log n)
= O(n)
So the summation is O(n), as required.
Hope this helps!
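As an empirical sanity check (my own quick Python script; natural log, matching the derivation above), the ratio of the sum to n stays bounded by a small constant:

    import math

    # sum_{i=1..n} ceil(log(n/i)) should stay within a constant multiple of n.
    for n in [10, 100, 1000, 10000, 100000]:
        total = sum(math.ceil(math.log(n / i)) for i in range(1, n + 1))
        print(f"n = {n}: sum = {total}, sum / n = {total / n:.3f}")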

Complexity asymptotic relation (theta, Big O, little o, Big Omega, little omega) between functions

Let's define:
Tower(1) of n is: n.
Tower(2) of n is: n^n (= power(n,n)).
Tower(10) of n is: n^n^n^n^n^n^n^n^n^n.
And also given two functions:
f(n) = [Tower(log n) of n] = n^n^n^n^n^n^....^n (a tower of n's of height log n).
g(n) = [Tower(n) of log n] = log(n)^log(n)^....^log(n) (a tower of log(n)'s of height n).
Three questions:
How are the functions f(n) and g(n) related to each other asymptotically (n --> infinity),
in terms of: Theta, Big O, little o, Big Omega, little omega?
Please describe the exact way to the solution, not only the eventual result.
Does the base of the log (e.g.: 0.5, 2, 10, log n, or n) affect the result?
If no - why?
If yes - how?
I'd like to know whether in any real (even if hypothetical) application the complexity looks similar to f(n) or g(n) above. Please give a case description - if such exists.
P.S.
I tried to substitute: log n = a, therefore: n = 2^a or 10^a,
and got confused counting the heights of the resulting towers.
I won't provide you a solution, because you have to work on your homework, but maybe there are other people interested in some hints.
1) Mathematics:
log(a^x) = x*log(a)
this will simplify your problem
2) Mathematics:
log_x(y) = log_2(y) / log_2(x) = log_10(y) / log_10(x)
of course: if x is constant => log_2(x) and log_10(x) are constants
3) recursive + stop condition
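Not a worked solution (per the above), but if it helps, here is a quick Python check of those two identities with arbitrarily chosen values:

    import math

    a, x, y = 3.0, 5.0, 7.0

    # hint 1: log(a^x) = x * log(a) -- flattens one level of a power tower
    print(math.log(a ** x), x * math.log(a))

    # hint 2: change of base, log_x(y) = log_2(y)/log_2(x) = log_10(y)/log_10(x)
    print(math.log(y, x), math.log2(y) / math.log2(x), math.log10(y) / math.log10(x))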

Growth of inverse factorial

Consider the inverse factorial function, f(n) = k where k! is the greatest factorial <= n. I've been told that the inverse factorial function is O(log n / log log n). Is it true? Or is it just a really really good approximation to the asymptotic growth? The methods I tried all give things very close to log(n)/log log(n) (either a small factor or a small term in the denominator) but not quite.
Remember that, when we're using O(...), constant factors don't matter, and any term that grows more slowly than another term can be dropped. ~ means "is proportional to."
If k is large, then n = k! ~ k^k. So log n ~ k log k, which gives k ~ log n / log k. Substituting that back in, k ~ log n / log(log n / log k) = log n / (log log n - log log k). Because n >> k, the log log k term in the denominator can be dropped, and we get k ~ log n / log log n, so k = O(log n / log log n).
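To see how slowly k approaches log n / log log n in practice, here's a rough Python comparison (my own sketch; convergence is very slow, so the two columns differ by a noticeable constant factor even for huge n):

    import math

    def inverse_factorial(n):
        """Largest k such that k! <= n."""
        k = 1
        while math.factorial(k + 1) <= n:
            k += 1
        return k

    for exp in [6, 20, 100, 500]:
        n = 10 ** exp  # exact integer, so factorials compare exactly
        k = inverse_factorial(n)
        print(f"n = 1e{exp}: k = {k}, log n / log log n = "
              f"{math.log(n) / math.log(math.log(n)):.1f}")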
Start from Stirling's Approximation for ln(k!) and work backwards from there. Apologies for not working the whole thing out; my brain doesn't seem to be working tonight.
