# Asymptotic Notation Cheatsheet

Asymptotic notation is a way of describing the behavior of an algorithm as the input size grows to infinity. It allows us to compare the efficiency of different algorithms, regardless of the specific input size or hardware being used. There are three common asymptotic notations:

1. Big O notation: This describes an upper bound on an algorithm's running time, or how the running time grows at most. It is denoted as O(f(n)), where f(n) is a function of the input size n. For example, an algorithm with a running time of O(n) takes at most time proportional to n for large inputs, up to a constant factor.
2. Big Ω notation: This describes a lower bound on an algorithm's running time, or how the running time grows at least. It is denoted as Ω(f(n)). For example, an algorithm with a running time of Ω(n) takes at least time proportional to n for large inputs, up to a constant factor.
3. Big Θ notation: This describes a tight bound on an algorithm's running time: the running time is bounded both above and below by the same function, up to constant factors. It is denoted as Θ(f(n)). For example, an algorithm with a running time of Θ(n) takes time proportional to n for large inputs, neither faster nor slower by more than a constant factor.
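The three notations above can be illustrated with a single familiar algorithm. The sketch below (in Python, chosen here just for illustration) is a standard linear search; the comments show which bound applies to which case:

```python
def linear_search(items, target):
    """Scan items left to right; return the index of target, or -1 if absent.

    Best case (target is first): constant work, so the running time is Ω(1).
    Worst case (target is absent): n comparisons, so the running time is O(n).
    The worst case specifically is Θ(n): it is bounded both above and
    below by a constant multiple of n.
    """
    for i, item in enumerate(items):
        if item == target:
            return i
    return -1
```

Note that O, Ω, and Θ here bound different cases of the same algorithm: saying "linear search is O(n)" refers to its worst case, while its best case is constant time.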

Some common asymptotic functions include:

• Constant time: O(1)
• Logarithmic time: O(log n)
• Linear time: O(n)
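Each of these growth rates corresponds to a familiar operation. A minimal sketch in Python (the function names are ours, chosen for illustration):

```python
def get_first(items):
    # O(1): a single index operation, regardless of len(items)
    return items[0]

def total(items):
    # O(n): visits each of the n elements exactly once
    result = 0
    for x in items:
        result += x
    return result

def binary_search(sorted_items, target):
    # O(log n): halves the search range on every iteration
    # (requires the input to be sorted)
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1
```

Doubling the input size leaves `get_first` unchanged, doubles the work in `total`, and adds only one extra iteration to `binary_search`.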