
Analysing Big Oh Notation

Θ notation bounds a function from above and below. When we only need to bound a function from above (an asymptotic upper bound), we use O (Big Oh) notation. For a given function g(n), we denote this by O(g(n)), pronounced "Big Oh of g of n".

O(g(n)) = { f(n) : there exist positive constants c and k such that 0 <= f(n) <= c·g(n) for all n >= k }.
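The definition above can be sanity-checked numerically for a concrete pair of functions. The sketch below picks f(n) = 3n² + 2n and g(n) = n², with the witness constants c = 5 and k = 1 chosen for illustration (they are assumptions, not part of the definition), and verifies the inequality over a sample range; this is a demonstration, not a proof.

```python
# Numeric sanity check of the Big Oh definition (illustrative, not a proof).
# For f(n) = 3n^2 + 2n, the witnesses c = 5 and k = 1 are hypothetical
# choices that satisfy 0 <= f(n) <= c*g(n) for all n >= k.

def f(n):
    return 3 * n * n + 2 * n

def g(n):
    return n * n

c, k = 5, 1

# Check the defining inequality for every n >= k in a sample range.
for n in range(k, 1000):
    assert 0 <= f(n) <= c * g(n)

print("f(n) = O(n^2), witnessed by c =", c, "and k =", k)
```

Algebraically, 3n² + 2n <= 5n² reduces to 2n <= 2n², which holds for every n >= 1, so this choice of c and k works for all n, not just the sampled range.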

f(n) = O(g(n)) indicates that f(n) is a member of the set O(g(n)). Note that f(n) = Θ(g(n)) implies f(n) = O(g(n)), since Θ notation is stronger than O notation. Mathematically,

Θ(g(n)) ⊆ O(g(n))

We can easily show that any quadratic function an² + bn + c, where a > 0, is in Θ(n²). Since Θ(g(n)) is a subset of O(g(n)), it follows that the function is also O(n²). Big Oh notation is usually used to describe the worst-case running time. For example, insertion sort has a Θ(n²) worst-case running time, while its best case is Θ(n). Technically, it is an abuse of notation to say that insertion sort has running time O(n²), since for a given n the actual running time varies with the particular input of size n. In short, O(n²) describes the worst-case running time of insertion sort.
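The gap between insertion sort's Θ(n) best case and Θ(n²) worst case can be made concrete by counting key comparisons. The sketch below is one common way to instrument the algorithm (the function name and counting scheme are illustrative, not taken from the text): an already-sorted input triggers n − 1 comparisons, while a reverse-sorted input triggers n(n − 1)/2.

```python
# Count key comparisons in insertion sort to contrast the Θ(n) best case
# (already sorted input) with the Θ(n^2) worst case (reverse-sorted input).
# This is an illustrative sketch, not a canonical implementation.

def insertion_sort_comparisons(items):
    a = list(items)
    comparisons = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            comparisons += 1          # one comparison of key against a[j]
            if a[j] > key:
                a[j + 1] = a[j]       # shift the larger element right
                j -= 1
            else:
                break
        a[j + 1] = key                # insert key into its sorted position
    return comparisons

n = 100
best = insertion_sort_comparisons(range(n))          # sorted: n - 1 comparisons
worst = insertion_sort_comparisons(range(n, 0, -1))  # reversed: n(n-1)/2

print(best)   # → 99
print(worst)  # → 4950
```

For n = 100 the counts are 99 versus 4950: the worst case grows quadratically while the best case stays linear, which is exactly why the single statement "insertion sort is O(n²)" refers to the worst case.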