Time complexity is a way to compare two or more algorithms by how their running time grows with input size. This comparison is independent of language, hardware, etc. and is instead based on the structure of the code itself.

To analyse the time complexity of an algorithm with input size n, assume n approaches infinity. As n gets bigger, some terms of the cost formula matter far more than others.
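As a rough sketch, consider counting the basic operations of a hypothetical function (the name and per-line counts are illustrative):

```python
def sum_list(values):
    total = 0              # 1 operation
    for value in values:   # the body below runs n times
        total += value     # 1 operation per iteration
    return total           # 1 operation
```

The total cost is roughly f(n) = n + 2; as n → ∞, the n term from the loop dwarfs the fixed setup and return cost, which is exactly the intuition asymptotic notation captures.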

Time complexity uses asymptotic notation, which simplifies a formula using two rules:

  • Drop all constants
  • Only keep the ‘highest’ term, i.e. the term that increases the most as n grows
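For example, applying both rules to the hypothetical cost formula f(n) = 3n² + 5n + 12:

  • 5n and 12 are dropped, since 3n² increases the most
  • the constant factor 3 is dropped

leaving O(n²).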

Arithmetic on asymptotic notation keeps only the biggest factor across all operands. E.g. O(n²) + O(n) = O(n²), because as n → ∞, only the n² term matters.

Examples

| Command | Time complexity |
| --- | --- |
| Statements | constant |
| Conditional statements | constant |
| Loops | linear |
| Nested loops | polynomial |
[[Divide-and-Conquer]]
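A minimal Python sketch of each construct in the table above (the function name and bodies are illustrative):

```python
def construct_examples(items):
    # Statement: constant time, O(1)
    count = 0

    # Conditional statement: constant time, O(1),
    # not counting the cost of the branches themselves
    if len(items) > 0:
        count += 1

    # Loop: linear time, O(n)
    for item in items:
        count += 1

    # Nested loops: polynomial time, O(n^2)
    for a in items:
        for b in items:
            count += 1

    return count

print(construct_examples([1, 2, 3]))  # 13: 1 from the conditional, 3 from the loop, 9 from the nested loops
```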