---
date: 02.09.2024
type: theoretical
---

## Complexity

- **Time Complexity:** the amount of time an algorithm takes to complete as a function of the size of its input.
- **Space Complexity:** the amount of memory an algorithm uses as a function of the size of its input.

### Big O

- **Formal Definition:** Big O notation, $\mathcal{O}(f(n))$, describes an upper bound on the time or space complexity of an algorithm in terms of the input size $n$. It characterizes the worst-case growth rate of an algorithm. Formally, a function $T(n)$ is said to be $\mathcal{O}(f(n))$ if there exist positive constants $c$ and $n_0$ such that for all $n \geq n_0$:

$$
0 \leq T(n) \leq c \cdot f(n).
$$

This means that $T(n)$ grows at most as fast as $f(n)$, up to a constant multiple $c$, for sufficiently large $n$.

![Big O cheatsheet](Big%20o.png)

### Informally

Big-O notation provides an upper limit on:

- **Time:** how long an algorithm will take.
- **Space:** how much memory it will require.

It helps compare algorithms by focusing on their growth rates and ignoring constant factors.

### Other Notations for Growth Rates

1. **Big-O ($\mathcal{O}(f(n))$)**:
   - Describes the **upper bound** (worst case) of $T(n)$.
   - Informally: $T(n)$ grows no faster than $f(n)$.
   - Formal Definition:
     $$
     T(n) \text{ is } \mathcal{O}(f(n)) \iff \exists c > 0, n_0 > 0 \text{ such that } T(n) \leq c \cdot f(n) \text{ for all } n \geq n_0.
     $$
2. **Big-Omega ($\Omega(f(n))$)**:
   - Describes the **lower bound** of $T(n)$ (often used for best-case analysis).
   - Informally: $T(n)$ grows at least as fast as $f(n)$.
   - Formal Definition:
     $$
     T(n) \text{ is } \Omega(f(n)) \iff \exists c > 0, n_0 > 0 \text{ such that } T(n) \geq c \cdot f(n) \text{ for all } n \geq n_0.
     $$
3. **Big-Theta ($\Theta(f(n))$)**:
   - Describes a **tight bound** on $T(n)$.
   - Informally: $T(n)$ grows exactly as fast as $f(n)$.
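The formal definition above can be sanity-checked numerically. A minimal sketch, assuming the hypothetical example $T(n) = 3n^2 + 5n$, which is $\mathcal{O}(n^2)$ with hand-picked witness constants $c = 4$ and $n_0 = 5$ (since $5n \leq n^2$ once $n \geq 5$):

```python
def T(n):
    # Hypothetical running-time function: T(n) = 3n^2 + 5n
    return 3 * n**2 + 5 * n


def f(n):
    # Candidate bounding function: f(n) = n^2
    return n**2


def satisfies_big_o(T, f, c, n0, n_max=10_000):
    """Check 0 <= T(n) <= c * f(n) for all sampled n in [n0, n_max)."""
    return all(0 <= T(n) <= c * f(n) for n in range(n0, n_max))


# The bound holds for every sampled n >= n_0 = 5 with c = 4 ...
print(satisfies_big_o(T, f, c=4, n0=5))  # True

# ... but not below n_0 (e.g. T(1) = 8 > 4 * f(1) = 4), which is exactly
# why the definition only requires the inequality for sufficiently large n.
print(all(T(n) <= 4 * f(n) for n in range(1, 5)))  # False
```

A finite check like this cannot *prove* an asymptotic bound, but it is a quick way to test candidate constants $c$ and $n_0$ before writing the algebraic argument.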
   - Formal Definition:
     $$
     T(n) \text{ is } \Theta(f(n)) \iff T(n) \text{ is } \mathcal{O}(f(n)) \text{ and } T(n) \text{ is } \Omega(f(n)).
     $$
4. **Little-O ($o(f(n))$)**:
   - Describes a **strict upper bound** on $T(n)$.
   - Informally: $T(n)$ grows strictly slower than $f(n)$.
   - Formal Definition:
     $$
     T(n) \text{ is } o(f(n)) \iff \forall c > 0, \exists n_0 > 0 \text{ such that } T(n) < c \cdot f(n) \text{ for all } n \geq n_0.
     $$

Think of growth rates as vehicles on a highway:

- $\mathcal{O}(f(n))$: $T(n)$ cannot go faster than $f(n)$.
- $\Omega(f(n))$: $T(n)$ cannot go slower than $f(n)$.
- $\Theta(f(n))$: $T(n)$ moves at the same speed as $f(n)$.
- $o(f(n))$: $T(n)$ moves strictly slower than $f(n)$ and never catches up.
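The $\Theta$ and little-$o$ definitions can be sketched the same way. A minimal illustration, assuming the hypothetical example $T(n) = 2n + 3$: it is $\Theta(n)$, sandwiched as $2n \leq T(n) \leq 3n$ for $n \geq 3$ (the witnesses $c_1 = 2$, $c_2 = 3$, $n_0 = 3$ are hand-picked, not derived), and it is $o(n^2)$ because the strict inequality holds for *every* $c > 0$ once $n$ is large enough:

```python
def T(n):
    # Hypothetical running-time function: T(n) = 2n + 3
    return 2 * n + 3


def satisfies_theta(T, f, c1, c2, n0, n_max=10_000):
    """Check c1 * f(n) <= T(n) <= c2 * f(n) for all sampled n in [n0, n_max)."""
    return all(c1 * f(n) <= T(n) <= c2 * f(n) for n in range(n0, n_max))


# Theta(n): T(n) is sandwiched between 2n and 3n from n_0 = 3 onward.
print(satisfies_theta(T, lambda n: n, c1=2, c2=3, n0=3))  # True

# Little-o(n^2): unlike Big-O, the bound must hold for EVERY c > 0.
# Even a tiny c = 0.01 works past n_0 = 1000, since 2n + 3 < 0.01 * n^2 there.
print(all(T(n) < 0.01 * n**2 for n in range(1_000, 10_000)))  # True
```

The key contrast shown here: Big-O needs *some* constant $c$ that works, while little-o demands that *every* $c$, however small, eventually works.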