Computational complexity is used to compare two or more algorithms independently of computer hardware capabilities and programming language.

Asymptotic Notation

There are three ways of measuring complexity asymptotically:

  1. Big-O Notation: an upper bound on growth; measures the worst-case complexity, the ceiling of the algorithm
  2. Big-Theta: a tight bound (bounded above and below); often used to describe the average-case complexity, the area between worst case and best case
  3. Big-Omega: a lower bound on growth; measures the best-case complexity, the floor of the algorithm

Example: f(n) = n² + n + 1 is O(n²), because n² increases the most as n approaches infinity
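A quick numerical check of the dominant-term idea, using the hypothetical growth function f(n) = n² + n + 1: as n grows, the lower-order terms contribute a vanishing fraction of the total, which is why big-O keeps only the fastest-growing term.

```python
def f(n):
    # Hypothetical growth function for illustration: n^2 dominates n and 1.
    return n**2 + n + 1

for n in (10, 1_000, 100_000):
    # Ratio of f(n) to its dominant term approaches 1 as n grows.
    print(n, f(n) / n**2)
```
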

Big-O Notation

This is used to measure the worst-case complexity of an algorithm/function/operation. It is considered the ceiling for the growth function.

Big-Theta Notation

This is used to measure the average-case complexity of an algorithm/function/operation.

Big-Omega Notation

This is used to measure the best-case complexity of an algorithm/function/operation. It is considered the floor for the growth function.

Example: In a linear search over an array of size n, the best case is that the target is at the first index, 0. This means only 1 operation is needed. Therefore the big-omega is Ω(1).

Growth Functions

Most algorithms’ time complexities are represented by a growth function g(n). For example, Mergesort has a growth function of n log n, i.e. Θ(n log n) (note that the growth function itself does NOT contain big-O).
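A sketch of Mergesort showing where the n log n growth comes from: the recursion splits the array into log n levels, and merging does O(n) work at each level.

```python
def mergesort(arr):
    # Divide: halving gives log n levels of recursion.
    # Merge: each level does O(n) total work => Theta(n log n) overall.
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left, right = mergesort(arr[:mid]), mergesort(arr[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]

print(mergesort([5, 2, 8, 1]))
```
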

Time Complexity

Examples

| Command | Time complexity |
| --- | --- |
| Statements | constant |
| Conditional statements | constant |
| Loops | linear |
| Nested loops | polynomial |

[[Divide-and-Conquer]]
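The table's rows can be sketched as small functions; each comment names the row it illustrates:

```python
def first_item(arr):
    # Statement: constant, O(1) — one operation regardless of n.
    return arr[0]

def sum_items(arr):
    # Loop: linear, O(n) — one pass over n items.
    total = 0
    for x in arr:
        total += x
    return total

def count_pairs(arr):
    # Nested loops: polynomial, O(n^2) — n iterations per outer iteration.
    pairs = 0
    for _ in arr:
        for _ in arr:
            pairs += 1
    return pairs
```
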

Space Complexity

Examples

| Command | Space complexity |
| --- | --- |
| Initializing, assigning a variable | constant |
| Storing an array/list/stack/queue | linear |
| Iterative function calls | constant |
| Recursive function calls | linear |
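A sketch of the iterative-vs-recursive rows above: the iterative sum reuses one accumulator (constant extra space), while the recursive version adds one stack frame per element (linear extra space).

```python
def iterative_sum(arr):
    # O(1) extra space: a single accumulator reused across iterations.
    total = 0
    for x in arr:
        total += x
    return total

def recursive_sum(arr):
    # O(n) extra space: one stack frame per element until the base case.
    if not arr:
        return 0
    return arr[0] + recursive_sum(arr[1:])
```
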