
Time & Space Complexity

By Atif Alam

Time complexity describes how an algorithm's runtime grows with input size (e.g. what happens when the input doubles).

Space complexity is the extra memory (beyond the input) the algorithm uses.

We use Big O notation to describe how cost grows as the input gets large; constants and lower-order terms are dropped.
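To make the growth rates concrete, here is a minimal sketch (the function names are my own) that counts basic operations for a single loop versus nested loops. Doubling the input roughly doubles the linear count but quadruples the quadratic one, which is exactly the behavior Big O captures once constants are dropped.

```python
def count_linear_ops(n):
    """One pass over n items: about n basic operations -> O(n)."""
    ops = 0
    for _ in range(n):
        ops += 1
    return ops

def count_quadratic_ops(n):
    """Nested loops over n items: about n*n operations -> O(n^2)."""
    ops = 0
    for _ in range(n):
        for _ in range(n):
            ops += 1
    return ops

# Doubling n doubles the linear work but quadruples the quadratic work.
print(count_linear_ops(1000), count_linear_ops(2000))        # 1000 2000
print(count_quadratic_ops(1000), count_quadratic_ops(2000))  # 1000000 4000000
```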

From fastest to slowest (for large n):

  • O(1) — Constant: same cost regardless of input size (e.g. hash lookup, array index).
  • O(log n) — Logarithmic: cost grows slowly (e.g. binary search, balanced tree operations).
  • O(n) — Linear: one pass over the input (e.g. single loop, BFS/DFS over n nodes).
  • O(n log n) — Linearithmic: typical for efficient sorting (e.g. merge sort, quicksort average).
  • O(n²) — Quadratic: nested loops over the same input (e.g. insertion sort, some DP).
  • O(2ⁿ) — Exponential: cost at least doubles with each additional input element (e.g. naive recursion, some backtracking).
  • O(V + E) — Graph input: linear in vertices plus edges (e.g. BFS, DFS on a graph).
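As a quick illustration of the O(log n) class above, here is a standard binary search sketch: each step halves the search range, so a sorted array of n elements needs about log₂ n comparisons, and only a few index variables are kept, so extra space is O(1).

```python
def binary_search(arr, target):
    """Return the index of target in sorted arr, or -1 if absent.

    Time O(log n): the candidate range halves on every iteration.
    Space O(1): only the lo/hi/mid index variables are stored.
    """
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            lo = mid + 1   # target can only be in the right half
        else:
            hi = mid - 1   # target can only be in the left half
    return -1

print(binary_search([1, 3, 5, 7, 9], 7))  # 3
print(binary_search([1, 3, 5], 4))        # -1
```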

Actual complexity depends on the problem variant; the table below is a typical summary for the topics we cover.

| Algorithm            | Time                        | Space                                  |
|----------------------|-----------------------------|----------------------------------------|
| Two Pointers         | O(n)                        | O(1)                                   |
| Sliding Window       | O(n)                        | O(1) or O(k)                           |
| Prefix Sum           | Build O(n), query O(1)      | O(n)                                   |
| Breadth-First Search | O(V + E) or O(n)            | O(n)                                   |
| Depth-First Search   | O(V + E) graph; O(n) tree   | O(h) recursion stack (height or depth) |
| Binary Search        | O(log n)                    | O(1)                                   |
| Heap                 | O(log n) per insert/extract | O(n)                                   |
| Backtracking         | Often exponential           | O(depth)                               |
| Dynamic Programming  | Often O(n²) or O(n·W)       | Proportional to state                  |
| Divide and Conquer   | e.g. O(n log n)             | O(log n)                               |
| Trie                 | O(m) per string (length m)  | O(total characters)                    |
| Union Find           | O(α(n)) per operation       | O(n)                                   |
| Greedy               | Often O(n log n) + O(n)     | O(1) or O(n)                           |
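To show how a "build once, query cheaply" trade-off from the table works in practice, here is a minimal prefix-sum sketch (function names are my own): the build pass is O(n) time and O(n) extra space, after which any range sum is answered in O(1) with two lookups.

```python
def build_prefix(nums):
    """Build prefix sums: prefix[i] = sum of nums[:i].

    One pass over the input -> O(n) time, O(n) extra space.
    """
    prefix = [0]
    for x in nums:
        prefix.append(prefix[-1] + x)
    return prefix

def range_sum(prefix, i, j):
    """Sum of nums[i:j] in O(1): two lookups and one subtraction."""
    return prefix[j] - prefix[i]

prefix = build_prefix([2, 4, 6, 8])
print(range_sum(prefix, 1, 3))  # 4 + 6 = 10
print(range_sum(prefix, 0, 4))  # 2 + 4 + 6 + 8 = 20
```

Without the precomputed array, each range query would re-scan the elements and cost O(n) instead.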

Each topic page has a Time / space line with more detail and context for that algorithm.