Remember that square-root(n) = n^(1/2)
== 1 ==
True or false?
3 n^2 + 10 n log n = O(n lg n) FALSE
3 n^2 + 10 n log n = Omega(n^2) TRUE
3 n^2 + 10 n log n = Theta(n^2) TRUE
n log n + n/2 = O(n) FALSE
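A quick numerical sanity check (not a proof) of the first answer: if f(n) = O(g(n)), the ratio f(n)/g(n) stays bounded as n grows; if the ratio keeps growing, f is not O(g). The sketch below checks 3n^2 + 10 n log n against both n lg n and n^2.

```python
import math

def f(n):
    return 3 * n**2 + 10 * n * math.log2(n)

for n in [10**3, 10**6, 10**9]:
    # ratio vs n lg n grows without bound -> f is not O(n lg n)
    # ratio vs n^2 settles toward the constant 3 -> f = Theta(n^2)
    print(n, f(n) / (n * math.log2(n)), f(n) / n**2)
```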
== 2 ==
Sort the following functions in increasing order of asymptotic running-time.
f1 = 2^(2^5000)
f2 = 2^(4n)
f3 = n * square-root(n)
f4 = n^2 * (lg (lg (lg n)))
f1, f3, f4, f2
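A sanity check (not a proof) for the three non-constant functions, comparing log2 of each at a sample n. Note f1 = 2^(2^5000) is a constant, so it is asymptotically smaller than any growing function by definition, even though its actual value is astronomically large; it is left out of the numeric comparison for that reason.

```python
import math

def log2_f2(n):  # f2 = 2^(4n)          -> log2(f2) = 4n
    return 4 * n

def log2_f3(n):  # f3 = n * sqrt(n)     -> log2(f3) = 1.5 * lg n
    return 1.5 * math.log2(n)

def log2_f4(n):  # f4 = n^2 * lg lg lg n -> log2(f4) = 2 lg n + lg(lg lg lg n)
    lll = math.log2(math.log2(math.log2(n)))
    return 2 * math.log2(n) + math.log2(lll)

n = 2**100
print(log2_f3(n), log2_f4(n), log2_f2(n))  # increasing: f3 < f4 < f2
```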
== 3 ==
Prove that 5n + 6 = O(n^2) using the definition of O, providing a valid c and n0.
Can you find another valid c and n0 that prove it as well?
c = 1, n0 = 6
c = 2, n0 = 4
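A sanity check (not a proof) that both (c, n0) pairs satisfy the definition, i.e. 5n + 6 <= c * n^2 for all n >= n0, over a range of n:

```python
# Check 5n + 6 <= c * n^2 for every n in [n0, upto).
def holds(c, n0, upto=10_000):
    return all(5 * n + 6 <= c * n * n for n in range(n0, upto))

print(holds(1, 6))            # True: c = 1, n0 = 6
print(holds(2, 4))            # True: c = 2, n0 = 4
print(5 * 5 + 6 <= 1 * 5**2)  # False: with c = 1 the bound fails at n = 5,
                              # so n0 = 6 is the smallest n0 that works there
```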
== 4 ==
Given the recurrence equation:
T(n) = 2T(n/4) + square-root(n)
with T(1) = Theta(1),
and consider its recurrence tree (do not draw it here; just answer the following questions).
What is the mathematical formula that describes the cost of each node?
Below, i = the level of the recurrence tree, which starts from the root with level i = 0.
each node is: square-root(n/(4^i))
What is the mathematical formula that describes the sum of the costs of the nodes at each level of the recurrence tree, excluding the leaf level?
Simplify the formula as much as you can.
2^i * square-root(n/(4^i))
=
2^i * square-root(n) / square-root(4^i)
=
2^i * square-root(n) / 2^i (here we have square-root(4^i) = 2^i)
=
square-root(n)
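The simplification above can be sanity-checked numerically: at level i there are 2^i nodes, each costing square-root(n / 4^i), and their sum should come out to square-root(n) at every internal level. A minimal sketch, assuming n is a power of 4 so all divisions are exact:

```python
import math

n = 4**10  # a power of 4, so n / 4^i is exact at every level
for i in range(10):
    level_sum = 2**i * math.sqrt(n / 4**i)  # (number of nodes) * (cost per node)
    assert math.isclose(level_sum, math.sqrt(n))
print("every internal level sums to sqrt(n) =", math.sqrt(n))
```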
How many leaves does the tree have? And what is the sum of their costs?
2^(log_4 n)
The cost of each leaf is Theta(1); that is, some constant c.
The number of leaves is 2^(log_4 n) = n^(log_4 2) by logarithm property.
notice that (log_4 2) = 1/2.
therefore n^(log_4 2) = n^(1/2) = square-root(n).
the sum of their costs is: square-root(n) * c
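The identity 2^(log_4 n) = n^(log_4 2) = square-root(n) can be sanity-checked for small powers of 4:

```python
import math

# For n a power of 4, the number of leaves 2^(log_4 n) should equal sqrt(n).
for k in range(1, 8):
    n = 4**k
    log4_n = round(math.log(n, 4))  # depth of the leaf level (exact for n = 4^k)
    leaves = 2**log4_n              # 2^(log_4 n)
    assert leaves == round(math.sqrt(n))
print("2^(log_4 n) == sqrt(n) for n = 4^1 .. 4^7")
```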
What is the running-time in theta notation?
Theta(square-root(n) * log_4 n) = Theta(square-root(n) * lg n), since the base of the logarithm only changes the constant factor.
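The overall bound can be sanity-checked by evaluating the recurrence exactly and comparing it with square-root(n) * log_4(n). A minimal sketch, assuming n is a power of 4 and taking the Theta(1) base case to be the constant 1:

```python
import math

def T(n):
    # T(n) = 2 T(n/4) + sqrt(n), T(1) = 1 (taking the Theta(1) constant as 1)
    if n == 1:
        return 1
    return 2 * T(n // 4) + math.isqrt(n)  # isqrt is exact for n a power of 4

for k in [5, 10, 15]:
    n = 4**k
    # ratio T(n) / (sqrt(n) * log_4 n) should approach a constant (here, 1)
    print(n, T(n) / (math.sqrt(n) * k))
```

For n = 4^k this gives T(n) = (k + 1) * 2^k exactly: k levels each summing to sqrt(n) = 2^k, plus 2^k leaves of cost 1, so the ratio is (k + 1)/k, which tends to 1.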