\( \newcommand{\NOT}{\neg} \newcommand{\AND}{\wedge} \newcommand{\OR}{\vee} \newcommand{\XOR}{\oplus} \newcommand{\IMP}{\Rightarrow} \newcommand{\IFF}{\Leftrightarrow} \newcommand{\TRUE}{\text{True}\xspace} \newcommand{\FALSE}{\text{False}\xspace} \newcommand{\IN}{\,{\in}\,} \newcommand{\NOTIN}{\,{\notin}\,} \newcommand{\TO}{\rightarrow} \newcommand{\DIV}{\mid} \newcommand{\NDIV}{\nmid} \newcommand{\MOD}[1]{\pmod{#1}} \newcommand{\MODS}[1]{\ (\text{mod}\ #1)} \newcommand{\N}{\mathbb N} \newcommand{\Z}{\mathbb Z} \newcommand{\Q}{\mathbb Q} \newcommand{\R}{\mathbb R} \newcommand{\C}{\mathbb C} \newcommand{\cA}{\mathcal A} \newcommand{\cB}{\mathcal B} \newcommand{\cC}{\mathcal C} \newcommand{\cD}{\mathcal D} \newcommand{\cE}{\mathcal E} \newcommand{\cF}{\mathcal F} \newcommand{\cG}{\mathcal G} \newcommand{\cH}{\mathcal H} \newcommand{\cI}{\mathcal I} \newcommand{\cJ}{\mathcal J} \newcommand{\cL}{\mathcal L} \newcommand{\cK}{\mathcal K} \newcommand{\cN}{\mathcal N} \newcommand{\cO}{\mathcal O} \newcommand{\cP}{\mathcal P} \newcommand{\cQ}{\mathcal Q} \newcommand{\cS}{\mathcal S} \newcommand{\cT}{\mathcal T} \newcommand{\cV}{\mathcal V} \newcommand{\cW}{\mathcal W} \newcommand{\cZ}{\mathcal Z} \newcommand{\emp}{\emptyset} \newcommand{\bs}{\backslash} \newcommand{\floor}[1]{\left \lfloor #1 \right \rfloor} \newcommand{\ceil}[1]{\left \lceil #1 \right \rceil} \newcommand{\abs}[1]{\left | #1 \right |} \newcommand{\xspace}{} \newcommand{\proofheader}[1]{\underline{\textbf{#1}}} \)

9.4 Asymptotic Growth and Limits

[Note: this section is not part of the required course material for CSC110. It is included mainly to highlight the nice connection between Big-O notation and calculus.]

Our asymptotic notations \(\cO\), \(\Omega\), and \(\Theta\) are concerned with comparing the long-term behaviour of two functions. It turns out that the concept of “long-term behaviour” is captured by another object of mathematical study, familiar to us from calculus: the limit of a function as its input approaches infinity.

Let \(f: \N \to \R\) and \(L \in \R\). We have the following two definitions. (We’re restricting our attention here to functions with domain \(\N\) because that’s our focus in computer science.) \[ \lim_{n \to \infty} f(n) = L:~ \forall \epsilon \in \R^+,~ \exists n_0 \in \N,~ \forall n \in \N,~ n \geq n_0 \IMP |f(n) - L| < \epsilon \] \[ \lim_{n \to \infty} f(n) = \infty:~ \forall M \in \R^+,~ \exists n_0 \in \N,~ \forall n \in \N,~ n \geq n_0 \IMP f(n) > M \]
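To build some intuition for the first definition, we can play the “\(\epsilon\)–\(n_0\) game” numerically in Python. The sketch below is *not* a proof, just an illustration: for the sample function \(f(n) = 1 + \frac{1}{n}\) (our choice, with limit \(L = 1\)), the helper `find_n0` picks an \(n_0\) that works for a given \(\epsilon\), and we spot-check the inequality \(|f(n) - L| < \epsilon\) for a range of \(n \geq n_0\).

```python
import math


def f(n: int) -> float:
    """A sample function with limit 1 as n approaches infinity."""
    return 1 + 1 / n


def find_n0(epsilon: float) -> int:
    """Return a value of n_0 that works for this epsilon.

    This choice is specific to our sample f: for all n >= n_0,
    |f(n) - 1| = 1/n <= 1/n_0 < epsilon.
    """
    return math.ceil(1 / epsilon) + 1


epsilon = 0.001
n0 = find_n0(epsilon)

# Spot-check the definition for many n beyond n_0 (illustrative only --
# a real proof must handle *all* n >= n_0, not a finite sample).
assert all(abs(f(n) - 1) < epsilon for n in range(n0, n0 + 10_000))
```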

Using just these definitions and the definitions of our asymptotic symbols \(\cO\), \(\Omega\), and \(\Theta\), we can prove the following pretty remarkable results:

For all \(f, g: \N \to \R^{\geq 0}\), if \(g(n) \neq 0\) for all \(n \in \N\), then the following statements hold:

  1. If there exists \(L \in \R^+\) such that \(\displaystyle{\lim_{n \to \infty} \frac{f(n)}{g(n)} = L}\), then \(g \in \Theta(f)\).
  2. If \(\displaystyle{\lim_{n \to \infty} \frac{f(n)}{g(n)} = 0}\), then \(f \in \cO(g)\) and \(f \notin \Theta(g)\).
  3. If \(\displaystyle{\lim_{n \to \infty} \frac{f(n)}{g(n)} = \infty}\), then \(f \in \Omega(g)\) and \(f \notin \cO(g)\).
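While none of this substitutes for a proof, sampling the ratio \(\frac{f(n)}{g(n)}\) at large inputs can hint at which of the three cases applies. Here is a small Python sketch using one example pair per case (the specific functions are our choices for illustration):

```python
def ratio(f, g, n: int) -> float:
    """Return the ratio f(n) / g(n)."""
    return f(n) / g(n)


f1 = lambda n: 3 * n ** 2 + n   # f1/g1 -> 3, suggesting g1 in Theta(f1)
g1 = lambda n: n ** 2
f2 = lambda n: n                # f2/g2 -> 0, suggesting f2 in O(g2) but not Theta(g2)
g2 = lambda n: n ** 2
f3 = lambda n: n ** 2           # f3/g3 -> infinity, suggesting f3 in Omega(g3) but not O(g3)
g3 = lambda n: n

for n in [10, 1000, 100_000]:
    print(n, ratio(f1, g1, n), ratio(f2, g2, n), ratio(f3, g3, n))
```

As \(n\) grows, the first ratio settles near \(3\), the second shrinks toward \(0\), and the third grows without bound, matching cases 1, 2, and 3 respectively.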

Proving this theorem is actually a very good (lengthy) exercise. The proof involves keeping track of variables and manipulating inequalities, two key skills you’re developing in this course! And these results do tend to be useful in practice (although again, not in this course) for proving asymptotic bounds like \(n^2 \in \cO(1.01^n)\).
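We can at least see numerically why \(n^2 \in \cO(1.01^n)\) is plausible: the ratio \(\frac{n^2}{1.01^n}\) tends to \(0\) (which is case 2 of the theorem, with the roles of \(f\) and \(g\) given by \(n^2\) and \(1.01^n\)). A cautionary note this sketch makes visible: the ratio *grows* for small \(n\) before eventually collapsing, which is exactly why we reason about limits rather than small inputs.

```python
def poly_over_exp(n: int) -> float:
    """Return the ratio n^2 / 1.01^n (illustrative only)."""
    return n ** 2 / 1.01 ** n


# The ratio rises at first, then eventually heads to 0.
for n in [10, 100, 1000, 5000]:
    print(n, poly_over_exp(n))
```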

But note that the converses of these statements are not true; for example, it is possible (and another nice exercise) to find functions \(f\) and \(g\) such that \(g \in \Theta(f)\), but \(\displaystyle{\lim_{n \to \infty} \frac{f(n)}{g(n)}}\) does not exist.
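If you would like a spoiler for that exercise (skip this if you want to try it yourself!), here is one possible pair, our own choice: \(f(n) = (2 + (-1)^n) \cdot n\) and \(g(n) = n\). Since \(n \leq f(n) \leq 3n\) for all \(n\), we get \(g \in \Theta(f)\), yet the ratio \(\frac{f(n)}{g(n)}\) alternates between \(1\) and \(3\) forever, so its limit does not exist.

```python
def f(n: int) -> int:
    """Oscillates between n (odd inputs) and 3n (even inputs)."""
    return (2 + (-1) ** n) * n


def g(n: int) -> int:
    return n


# The ratio f(n)/g(n) never settles down: 1.0, 3.0, 1.0, 3.0, ...
ratios = [f(n) / g(n) for n in range(1, 11)]
print(ratios)
```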