
2. Introduction to big-Oh

06/02/23

Class test - 01/03/23 - will cover content up to the Monday of that week!

Classification of Functions

  • CS needs a way to group functions by scaling behaviour, getting rid of unnecessary details.

  • Experience in CS is that this is best done with big-Oh notation and its family

  • Stirling's approximation - $\ln(n!) = n \ln n - n + O(\ln n)$ ($\ln$ is the natural log); see the numerical sketch after this list

  • Polynomial functions are $n^{O(1)}$

  • Best case of algorithm X is $O(n \log n)$
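
A minimal Python sketch (not from the lecture) to check Stirling's approximation numerically: the sample values of $n$ are arbitrary, and the point is that the leftover error $\ln(n!) - (n \ln n - n)$ grows only like $\ln n$.

```python
# Minimal sketch: compare ln(n!) with the leading terms n*ln(n) - n.
# The remaining error should grow only logarithmically in n (the O(ln n) term).
import math

for n in (10, 100, 1000, 10_000):
    exact = math.lgamma(n + 1)        # lgamma(n + 1) = ln(n!)
    approx = n * math.log(n) - n      # leading terms of Stirling's approximation
    print(f"n = {n:>6}: ln(n!) = {exact:12.3f}, "
          f"n ln n - n = {approx:12.3f}, "
          f"error = {exact - approx:7.3f}, ln(n) = {math.log(n):6.3f}")
```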

Big-Oh Notation: Motives

Big-Oh is often applied to runtime, but this is not essential

  • Definitions are just in terms of functions
  • It is not only for worst case runtime
  • Should be designed to be suitable for discussion of runtime functions
  • Usually applied to worst case
  • Can compare two runtime curves by looking at the ratio between them

Big-Oh Notation: Definition

Given positive functions $f(n)$ and $g(n)$, we say that $f(n) = O(g(n))$ if and only if there exist positive constants $c$ and $n_0$ such that $f(n) \le c\,g(n)$ for all $n > n_0$.

i.e. exists-exists-forall structure: $\exists c > 0.\ \exists n_0$ such that $\forall n > n_0.\ f(n) \le c\,g(n)$
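
For instance, to show $3n^2 + 5n = O(n^2)$ directly from the definition, one valid choice of witnesses is $c = 4$ and $n_0 = 5$: for all $n > 5$ we have $5n < n^2$, so $3n^2 + 5n \le 3n^2 + n^2 = 4n^2 = c\,g(n)$.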

  • Pick $c$ before handling $\forall n$, so $c$ is not allowed to be a function of $n$
  • Cannot pick a $c$ that depends on $n$
  • Otherwise one could usually take $c = f(n)/g(n)$ and the definition would become useless, as it could never fail (see the sketch after this list)
  • Showing the inequality holds at a single $n = n_0$ is pointless, as it says nothing about the growth rates at large $n$
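
To make the last two bullets concrete, here is a minimal Python sketch (not from the lecture); it reuses the hypothetical witnesses $f(n) = 3n^2 + 5n$, $g(n) = n^2$, $c = 4$, $n_0 = 5$ from the worked example above, and only checks the inequality over a finite sample of $n$.

```python
# Minimal sketch: why c must be a fixed constant, not a function of n.
from fractions import Fraction

def holds_with_fixed_c(f, g, c, n0, n_max=10_000):
    """Check f(n) <= c * g(n) for every sampled n with n0 < n <= n_max."""
    return all(f(n) <= c * g(n) for n in range(n0 + 1, n_max + 1))

def f(n): return 3 * n**2 + 5 * n   # example f(n)
def g(n): return n**2               # example g(n)

# Genuine use of the definition: one fixed c works for all (sampled) large n.
print(holds_with_fixed_c(f, g, c=4, n0=5))   # True

# If c were allowed to depend on n, taking c = f(n)/g(n) would make the
# inequality hold for every n, so the "definition" could never fail.
print(all(f(n) <= Fraction(f(n), g(n)) * g(n) for n in range(6, 10_000)))  # True, vacuously
```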

The definition does not mention 'worst-case runtime of an algorithm'. It does not even mention 'algorithms', only 'functions'.