Algorithm Complexity — Big-O Notation
Big-O notation describes how an algorithm's running time or space grows as the input size $n$ increases.
$f(n) = O(g(n))$ if there exist constants $c > 0$ and $n_0$ such that:
$$f(n) \le c \cdot g(n) \quad \text{for all } n \ge n_0$$
Big-O gives an upper bound on the growth rate, ignoring constant factors and lower-order terms.
| Big-O | Name | Example |
|---|---|---|
| $O(1)$ | Constant | Array index access |
| $O(\log n)$ | Logarithmic | Binary search |
| $O(n)$ | Linear | Simple loop |
| $O(n \log n)$ | Linearithmic | Merge sort |
| $O(n^2)$ | Quadratic | Nested loops (bubble sort) |
| $O(2^n)$ | Exponential | Recursive subset generation |
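To make the first few rows concrete, here is a small sketch (function names are illustrative, not from the table) with one representative per growth class; the quadratic version counts its own iterations:

```python
def constant(arr):
    """O(1): array index access, one operation regardless of size."""
    return arr[0]

def linear(arr):
    """O(n): a simple loop touches each element once."""
    total = 0
    for x in arr:
        total += x
    return total

def quadratic(arr):
    """O(n^2): nested loops, as in bubble sort's comparisons."""
    count = 0
    for _ in range(len(arr)):
        for _ in range(len(arr)):
            count += 1
    return count

arr = list(range(8))
print(constant(arr))   # 0
print(linear(arr))     # 28
print(quadratic(arr))  # 64 = 8^2
```

Doubling the input roughly doubles `linear`'s work but quadruples `quadratic`'s, which is the practical meaning of the table.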
What is the Big-O of $f(n) = 3n^2 + 5n + 100$?
Drop constants and lower-order terms: $O(n^2)$.
As $n \to \infty$, the $n^2$ term dominates.
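This dominance is easy to check numerically, and we can also exhibit witnesses for the formal definition; the choice $c = 4$, $n_0 = 13$ below is one valid pair among many (any $c > 3$ works for large enough $n_0$):

```python
def f(n):
    return 3 * n**2 + 5 * n + 100

# As n grows, f(n) / n^2 approaches the leading constant 3,
# so the n^2 term dominates.
for n in (10, 1_000, 100_000):
    print(n, f(n) / n**2)

# Witnesses for the definition: f(n) <= 4 * n^2 for all n >= 13.
assert all(f(n) <= 4 * n**2 for n in range(13, 10_000))
```

The ratio shrinks from 4.5 at $n = 10$ toward 3, confirming that constants and lower-order terms become irrelevant.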
A loop runs $n$ times, and inside it another loop runs $n$ times. Within the inner loop, we do $O(1)$ work. What's the total complexity?
$$O(n \times n \times 1) = O(n^2)$$
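A direct operation count confirms the multiplication; this sketch tallies the $O(1)$ work done inside the nested loops:

```python
def count_ops(n):
    ops = 0
    for _ in range(n):       # outer loop: n iterations
        for _ in range(n):   # inner loop: n iterations each
            ops += 1         # O(1) work per inner iteration
    return ops

print(count_ops(50))  # 2500 = 50^2
```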
Binary search halves the search space each step. If $n = 1{,}024$, how many steps at most?
$$\log_2 1{,}024 = 10 \text{ steps}$$
Complexity: $O(\log n)$.
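A minimal binary search with a step counter illustrates the halving; note that counting loop iterations (one comparison each) gives at most $\lfloor \log_2 n \rfloor + 1$ iterations, i.e. 10 halvings plus the final hit for $n = 1{,}024$:

```python
def binary_search(arr, target):
    """Search a sorted list; return (index or -1, loop iterations used)."""
    lo, hi = 0, len(arr) - 1
    steps = 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid, steps
        elif arr[mid] < target:
            lo = mid + 1   # discard the left half
        else:
            hi = mid - 1   # discard the right half
    return -1, steps

arr = list(range(1024))
idx, steps = binary_search(arr, 1023)  # worst case: rightmost element
print(idx, steps)  # steps <= floor(log2(1024)) + 1 = 11
```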
Practice Problems
Answer Key
1. $O(n)$ — linear term dominates
2. $T(n) = 2T(n/2) + O(1)$ → $O(n)$ by Master theorem (case 1) — similar to traversing a binary tree
3. $O(n)$ — linear scan
4. $T(n) = 2T(n/2) + O(n)$ → $O(n \log n)$
5. $(10{,}000/1{,}000)^2 = 100\times$ slower → 100 seconds
6. $O(1) < O(\log n) < O(n) < O(n\log n) < O(n^2) < O(2^n) < O(n!)$
7. $O(1)$ — direct access with index arithmetic
8. $O(n)$ — when all elements hash to the same bucket
9. $n(n-1)/2 = O(n^2)$
10. $n = 2^{20} = 1{,}048{,}576$
11. $O(n \log n)$ — Master theorem case 2
12. Total $= n + n/2 + n/4 + \cdots = 2n$; $O(n)$
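Answer 12's halving sum can be checked numerically; this sketch (with integer halving, so the total is slightly under the $2n$ bound of the exact geometric series) confirms the $O(n)$ total:

```python
def halving_total(n):
    """Sum n + n/2 + n/4 + ... down to 1, with integer halving."""
    total = 0
    while n >= 1:
        total += n
        n //= 2
    return total

# Bounded above by the geometric series n + n/2 + n/4 + ... = 2n.
print(halving_total(1024))  # 2047, just under 2 * 1024
```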