
BIG O Breakdown

What is Big O Notation?

Big O notation is a mathematical way to describe the time or space complexity of an algorithm. It represents the worst-case growth rate of an algorithm as the input size n increases, helping us understand how scalable an algorithm is.


Key Characteristics of Big O:

  1. Focus on Growth: Big O focuses on how the number of operations grows with the size of the input (n).

    • Example: An algorithm that performs 2n + 5 operations is O(n) because, as n grows, the constant 5 and the coefficient 2 become negligible (a counting sketch follows this list).
  2. Worst-Case Analysis: It assumes the largest number of operations the algorithm might perform for the input size n.

  3. Ignore Constants and Lower-Order Terms:

    • Constants don’t matter in Big O because they don’t scale with n.
    • Example: O(2n) = O(n), O(n + 10) = O(n).
  4. Only the Dominant Term Counts:

    • If an algorithm has multiple terms like O(n^2 + n), the term with the fastest growth rate dominates. So, O(n^2 + n) = O(n^2).
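
For point 1 above, a rough counting sketch (the function name and the exact step counts are illustrative assumptions, not taken from any library):

def sum_twice(items):
    # Hypothetical routine doing roughly 2n + 5 elementary steps, where n = len(items).
    n = len(items)     # 1 step
    total = 0          # 1 step
    for x in items:    # n iterations, ~1 step each
        total += x
    for x in items:    # another n iterations
        total += x
    total //= 2        # 1 step
    print(total)       # 1 step
    return total       # 1 step

Roughly 2n + 5 steps in total; dropping the constant 5 and the coefficient 2 leaves O(n).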

Common Big O Complexities

Complexity   Name           Description
O(1)         Constant       Takes the same amount of time regardless of input size.
O(log n)     Logarithmic    Reduces the problem size by half at every step.
O(n)         Linear         Time grows directly proportional to the input size.
O(n log n)   Linearithmic   Common in efficient sorting algorithms like Merge Sort.
O(n^2)       Quadratic      Common in nested loops; grows rapidly with input size.
O(2^n)       Exponential    Doubles operations for each increment in input size.
O(n!)        Factorial      Infeasible for even moderately large input sizes.
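
As a loose illustration of a few of these rows (the function names below are made up for this sketch, not standard library calls):

def first_item(items):            # O(1): a single lookup, independent of input size
    return items[0]

def contains(items, target):      # O(n): may scan every element once
    for x in items:
        if x == target:
            return True
    return False

def count_pairs(items):           # O(n^2): a nested loop over the same input
    pairs = 0
    for a in items:
        for b in items:
            pairs += 1
    return pairs

def naive_fib(k):                 # O(2^k) upper bound: each call branches into two more calls
    if k < 2:
        return k
    return naive_fib(k - 1) + naive_fib(k - 2)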

How to Calculate Big O

To calculate Big O, analyze the algorithm step by step, focusing on loops, function calls, and recursive depth.


1. Loops

  • A loop that runs n times is O(n).
  • Nested loops multiply their complexities.
    • Example:
      for i in range(n):        # O(n)
          for j in range(n):    # O(n)
              print(i, j)       # O(1)
      
      Total = O(n) ⋅ O(n) = O(n^2).
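  • As an aside, a loop that halves the remaining range each iteration repeats only about log2(n) times, matching the Logarithmic row in the table above (a minimal sketch, assuming n is already defined):
      remaining = n
      while remaining > 1:    # repeats about log2(n) times
          remaining //= 2     # the problem size is halved each step
      
      Total = O(log n).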

2. Sequential Steps

  • If an algorithm has multiple parts, add their complexities.

    • Example:
      for i in range(n):    # O(n)
          print(i)          # O(1)
      
      for j in range(m):    # O(m)
          print(j)          # O(1)
      
      Total = O(n) + O(m).
  • If n = m, the total is O(n) + O(n) = O(2n) = O(n).
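  • For the n = m case, a tiny sketch over a single list data of length n (the variable name is arbitrary):
      for x in data:    # O(n), first pass
          print(x)
      
      for x in data:    # O(n), second pass
          print(x)
      
      Total = 2n operations = O(2n) = O(n); the constant factor is dropped.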


3. Function Calls

  • If a function is called recursively, analyze how many times it is called and the work done in each call.
    • Example (Binary Search):
      def binary_search(arr, target, left, right):
          if left > right:
              return -1
          mid = (left + right) // 2
          if arr[mid] == target:
              return mid
          elif arr[mid] < target:
              return binary_search(arr, target, mid + 1, right)
          else:
              return binary_search(arr, target, left, mid - 1)
      
      • Binary search splits the array into halves, so its complexity is O(log n).
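      • A quick usage sketch of the function above (the list literal is only an illustration; the input must already be sorted):
        arr = [1, 3, 5, 7, 9, 11]
        print(binary_search(arr, 7, 0, len(arr) - 1))  # prints 3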

4. Recurrence Relations

  • Recursive algorithms often have recurrence relations.
    • Example (Merge Sort):
      • Merge Sort divides the array into halves and merges them back.
      • Relation: T(n) = 2T(n/2) + O(n) (two recursive calls + merge operation).
      • Complexity: O(n log n).
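      • A minimal Merge Sort sketch (one of many possible implementations), shown only to make visible where the 2T(n/2) and O(n) terms come from:
        def merge_sort(arr):
            if len(arr) <= 1:                # base case: already sorted
                return arr
            mid = len(arr) // 2
            left = merge_sort(arr[:mid])     # T(n/2)
            right = merge_sort(arr[mid:])    # T(n/2)
            merged = []                      # the merge below is the O(n) term
            i = j = 0
            while i < len(left) and j < len(right):
                if left[i] <= right[j]:
                    merged.append(left[i])
                    i += 1
                else:
                    merged.append(right[j])
                    j += 1
            merged.extend(left[i:])
            merged.extend(right[j:])
            return merged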

Practical Examples

Example 1: Single Loop

for i in range(n):
    print(i)  # O(1)
  • Total = O(n).

Example 2: Nested Loop

for i in range(n):
    for j in range(n):
        print(i, j)  # O(1)
  • Total = O(n) ⋅ O(n) = O(n^2).

Example 3: Sorting and Traversal

arr.sort()  # O(n log n)
for i in arr:
    print(i)  # O(n)
  • Total = O(n log n) + O(n) = O(n log n).

Example 4: Binary Search

def binary_search(arr, target, left, right):
    if left > right:
        return -1
    mid = (left + right) // 2
    if arr[mid] == target:
        return mid
    elif arr[mid] < target:
        return binary_search(arr, target, mid + 1, right)
    else:
        return binary_search(arr, target, left, mid - 1)
  • Complexity: O(log n), because the array size is halved in each recursive call.

Tips for Calculating Big O

  1. Focus on Loops: Count how many times each loop runs.
  2. Break Down Steps: Analyze each segment of the algorithm separately.
  3. Remove Constants: Ignore constants and lower-order terms.
  4. Understand Recursive Depth: For recursive algorithms, determine how the input size shrinks with each call.
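
For tip 4, a small sketch (the helper name is made up) showing that a recursion which halves its input reaches a depth of about log2(n):

def halving_depth(n):
    # Counts how many times n can be halved before reaching 1,
    # i.e. the depth of a recursion that halves its input on each call.
    if n <= 1:
        return 0
    return 1 + halving_depth(n // 2)

print(halving_depth(16))  # 4, since 16 -> 8 -> 4 -> 2 -> 1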

With practice, determining Big O becomes intuitive!