It is important to remember that the notation, whether O, Ω or Θ, expresses the asymptotic growth of a function; it has nothing intrinsically to do with algorithms per se. The function in question may be the "complexity" (running time) of an algorithm, whether worst-case, best-case or average-case, but the notation is independent of where the function comes from.
For example, the function f(n) = 3n² + 5 is:
- O(n²); it is also O(n² log n), O(n³), O(n⁴) and so on, but it is not O(n).
- Ω(n²); it is also Ω(n log n), Ω(n) and so on, but it is not Ω(n³).
- Θ(n²); it is not Θ(n² log n) or even Θ(n²/log n). (Explicit witnesses for these claims are sketched below.)
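To make these claims concrete, here is a minimal sketch of witnesses under the standard definitions (the particular constants c and thresholds n₀ below are my own choices; many others work equally well):

```latex
% f(n) = 3n^2 + 5 is O(n^2): exhibit c, n_0 with f(n) <= c*n^2 for all n >= n_0.
% The choice c = 4, n_0 = 3 works, because 5 <= n^2 once n >= 3:
f(n) = 3n^2 + 5 \le 3n^2 + n^2 = 4n^2 \quad \text{for all } n \ge 3.
% f(n) is Omega(n^2): the choice c = 3, n_0 = 1 works, since the +5 only helps:
f(n) = 3n^2 + 5 \ge 3n^2 \quad \text{for all } n \ge 1.
% Together these give f(n) = Theta(n^2). In contrast, f(n) is not O(n):
% for any fixed c, we have 3n^2 + 5 > c*n as soon as n > c/3.
```

The same pattern, finding constants that work or showing that none can exist, settles each of the remaining claims in the list.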
Now, usually the function considered is the worst-case complexity of an algorithm, and which of the three notations is used depends on what we want to say about it and on how carefully we do the analysis. For example, we may observe that because there are two nested loops, the worst-case running time is O(n²), without caring whether this bound is actually achieved for some input. (Usually it is obvious that it is.) Or, we may say that the worst-case running time of comparison-based sorting is Ω(n log n), because there must be some inputs on which it takes at least cn log n steps, for some constant c > 0. Or, we may look at a particular mergesort algorithm, see that it takes O(n log n) steps in the worst case and that some input makes it take on the order of n log n steps, and conclude that the worst-case running time is Θ(n log n).
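As a quick empirical check of the mergesort example, here is a minimal sketch (my own illustrative implementation and names, not any particular textbook's): it counts element comparisons and shows the count growing in proportion to n log₂ n.

```python
import math
import random

def merge_sort(a, counter):
    """Sort list a, incrementing counter[0] once per element comparison."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid], counter)
    right = merge_sort(a[mid:], counter)
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        counter[0] += 1                    # one comparison per merge step
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])                # leftovers need no comparisons
    merged.extend(right[j:])
    return merged

# The comparison count grows like n log2 n: dividing it by n * log2(n)
# should approach a constant as n increases.
for n in (2**10, 2**13, 2**16):
    counter = [0]
    merge_sort([random.random() for _ in range(n)], counter)
    print(n, counter[0], round(counter[0] / (n * math.log2(n)), 3))
```

Counting comparisons rather than wall-clock time keeps the measurement independent of hardware; the printed ratio settling near a constant is exactly the behavior an n log n operation count predicts.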
Note that in all three examples above, it was still the same (worst-case) running time being analyzed. We may analyze the best-case or average-case running time instead, but again, which of the three notations we use depends on what we want to say: whether we want to give an upper bound, a lower bound, or a tight bound on the order of growth of the same function.