In Cormen (CLRS), the discussion around Theorem 3.1 says:
For example, the best-case running time of insertion sort is Ω(n), whereas the worst-case running time of insertion sort is O(n^2). The running time of insertion sort therefore falls between Ω(n) and O(n^2).
Now, Exercise 3.1-6 asks:
Prove that the running time of an algorithm is Θ(g(n)) if and only if its worst-case running time is O(g(n)) and its best-case running time is Ω(g(n)).
- Am I the only one who sees a contradiction here?
- If we take the statement to be proved at face value, then a tight bound f(n) = Θ(g(n)) requires f(n) = Ω(g(n)) for the algorithm's best case and f(n) = O(g(n)) for its worst case, with the *same* function g(n) in both.
- But for insertion sort, the best-case time complexity is Ω(n) while the worst-case time complexity is O(n^2), so the two cases are bounded by different functions of n.
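To make the best-case/worst-case gap concrete, here is a minimal sketch (my own illustration, not from CLRS) that counts the key comparisons insertion sort performs on an already-sorted input versus a reverse-sorted one:

```python
def insertion_sort_comparisons(a):
    """Sort a copy of `a` with insertion sort; return the number of key comparisons."""
    a = list(a)
    comparisons = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            comparisons += 1          # one comparison of key against a[j]
            if a[j] > key:
                a[j + 1] = a[j]       # shift larger element right
                j -= 1
            else:
                break
        a[j + 1] = key
    return comparisons

n = 100
print(insertion_sort_comparisons(range(n)))         # sorted input: n-1 = 99 comparisons, linear
print(insertion_sort_comparisons(range(n, 0, -1)))  # reverse input: n(n-1)/2 = 4950, quadratic
```

The sorted input costs about n comparisons and the reverse-sorted input about n^2/2, which is exactly why no single g(n) makes insertion sort's running time Θ(g(n)) over all inputs.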