
Recursion

4 min read

Tags: algorithms · recursion · cs-fundamentals · interview-prep

A function that calls itself with a smaller subproblem until it hits a base case. Every recursive solution has an equivalent iterative one.

Core Structure

Two mandatory parts — missing either causes infinite recursion or wrong output.

  • Base case: Condition that stops recursion
  • Recursive case: Calls itself with a reduced input
solve(n):
  if n == baseCase: return result   // base case
  return solve(smallerN)            // recursive case
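The skeleton above can be made concrete in Python (the post's pseudocode is language-neutral; summing the first n integers is an assumed example):

```python
def solve(n):
    if n == 0:                 # base case: nothing left to add
        return 0
    return n + solve(n - 1)    # recursive case: smaller input

print(solve(5))  # 15
```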

Call Stack

Each recursive call is pushed onto the call stack. Stack unwinds as calls return.

  • Deep recursion → StackOverflowError (Java) or RecursionError (Python)
  • Max depth depends on the runtime and its stack size: Python caps at 1,000 frames by default; the JVM typically allows on the order of 10,000
factorial(n):
  if n == 0: return 1         // base case
  return n * factorial(n-1)   // recursive case

// Call stack for factorial(4):
// factorial(4) → factorial(3) → factorial(2) → factorial(1) → factorial(0)
// Returns: 1 → 1 → 2 → 6 → 24
Time:  O(n)
Space: O(n) call stack
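A runnable sketch of both behaviors in Python (the depth limit demonstrated is Python's default of ~1,000 frames, not the JVM's):

```python
def factorial(n):
    if n == 0:                     # base case
        return 1
    return n * factorial(n - 1)    # recursive case

print(factorial(4))  # 24

# Deep recursion exhausts the interpreter's stack:
try:
    factorial(100_000)             # far beyond the default recursion limit
except RecursionError:
    print("stack limit hit")
```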

Fibonacci

Classic tree recursion — each call spawns two more. Naïve version is exponential.

fib(n):
  if n <= 1: return n
  return fib(n-1) + fib(n-2)
Version        Time    Space
Naïve          O(2ⁿ)   O(n)
Memoized       O(n)    O(n)
Bottom-up DP   O(n)    O(1)

Naïve fib recomputes the same subproblems exponentially. Always memoize or convert to DP.
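A quick Python sketch of the naïve version next to a memoized one (functools.lru_cache is one convenient way to memoize):

```python
from functools import lru_cache

def fib_naive(n):
    if n <= 1:
        return n
    return fib_naive(n - 1) + fib_naive(n - 2)   # recomputes subproblems

@lru_cache(maxsize=None)                         # cache turns O(2ⁿ) into O(n)
def fib_memo(n):
    if n <= 1:
        return n
    return fib_memo(n - 1) + fib_memo(n - 2)

print(fib_naive(10))  # 55
print(fib_memo(90))   # instant; the naïve version would take ages
```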


Tail Recursion

Recursive call is the last operation in the function. Compilers can optimize into a loop (TCO), eliminating stack growth.

  • Java does not natively support TCO — still risks stack overflow
  • Kotlin's tailrec modifier and Scala's @tailrec annotation enforce the optimization at compile time
// NOT tail recursive — multiplication happens after return
factorial(n): return n * factorial(n-1)

// Tail recursive — accumulator carries state
factorial(n, acc = 1):
  if n == 0: return acc
  return factorial(n-1, n * acc)   // last op is the call
Time:  O(n)
Space: O(1) with TCO
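Since Python (like Java) performs no TCO, the accumulator version can be rewritten mechanically as a loop; a sketch:

```python
def factorial_tail(n, acc=1):
    if n == 0:
        return acc                          # accumulator holds the answer
    return factorial_tail(n - 1, n * acc)   # call is the last operation

# The same computation as a loop: roughly what TCO would produce
def factorial_iter(n):
    acc = 1
    while n > 0:
        acc, n = n * acc, n - 1
    return acc

print(factorial_tail(5), factorial_iter(5))  # 120 120
```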

Divide and Conquer

Split problem into independent subproblems, solve recursively, combine results.

solve(arr):
  if len(arr) == 1: return arr
  left  = solve(arr[0..mid])
  right = solve(arr[mid..n])
  return combine(left, right)
Pattern              Example                      Time
Split in half        Merge Sort, Binary Search    O(n log n) / O(log n)
Subtract one         Factorial                    O(n)
Split into k parts   Tree traversal               O(n)
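The split-in-half pattern, sketched as merge sort in Python:

```python
def merge_sort(arr):
    if len(arr) <= 1:                  # base case: trivially sorted
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])       # solve each half independently
    right = merge_sort(arr[mid:])
    merged, i, j = [], 0, 0            # combine: merge two sorted lists
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([5, 2, 4, 1, 3]))  # [1, 2, 3, 4, 5]
```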

Tree Recursion

Each call spawns more than one recursive call. Common in tree/graph traversal and combinatorics.

// Binary tree traversal (inorder)
inorder(node):
  if node == null: return
  inorder(node.left)
  visit(node)
  inorder(node.right)
Time:  O(n)
Space: O(h), where h = tree height

Balanced tree: O(log n) space. Skewed tree: O(n) space.
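The inorder traversal above as runnable Python, with a minimal Node class (an assumed structure, since the post's pseudocode doesn't define one):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    val: int
    left: Optional["Node"] = None
    right: Optional["Node"] = None

def inorder(node, visit):
    if node is None:            # base case: empty subtree
        return
    inorder(node.left, visit)   # left subtree first
    visit(node.val)             # then the node itself
    inorder(node.right, visit)  # then the right subtree

root = Node(2, Node(1), Node(3))   # tree:  1 <- 2 -> 3
out = []
inorder(root, out.append)
print(out)  # [1, 2, 3]
```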


Backtracking

Explore all possibilities by building a solution incrementally and undoing a choice if it leads to a dead end.

  • Pattern: Choose → Recurse → Unchoose
backtrack(state, choices):
  if isGoal(state): record solution; return
  for each choice in choices:
    if isValid(choice):
      make(choice)
      backtrack(state, remaining choices)
      undo(choice)               // backtrack
Problem Type                  Example
Permutations / Combinations   Subsets, N-Queens
Constraint satisfaction       Sudoku
Path finding                  Maze, Word search

Time:  O(b^d)
Space: O(d)

b = branching factor, d = depth. Prune early to cut down the search tree drastically.
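The choose → recurse → unchoose loop, sketched in Python for generating all subsets:

```python
def subsets(nums):
    result, path = [], []

    def backtrack(start):
        result.append(path.copy())     # every partial path is a valid subset
        for i in range(start, len(nums)):
            path.append(nums[i])       # choose
            backtrack(i + 1)           # recurse on the remaining choices
            path.pop()                 # unchoose (backtrack)

    backtrack(0)
    return result

print(subsets([1, 2, 3]))
# [[], [1], [1, 2], [1, 2, 3], [1, 3], [2], [2, 3], [3]]
```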


Memoization

Cache recursive results to avoid redundant computation. Top-down DP.

memo = {}

solve(n):
  if n in memo: return memo[n]     // cache hit
  if isBaseCase(n): return base
  result = solve(n-1) + solve(n-2)
  memo[n] = result                 // store before return
  return result
Problem          Without Memo   With Memo
Fibonacci        O(2ⁿ)          O(n)
Coin Change      O(2ⁿ)          O(n×m)
Space overhead   O(n) stack     O(n) stack + cache
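The pseudocode above maps directly to Python with a plain dict as the cache:

```python
memo = {}

def fib(n):
    if n in memo:                        # cache hit
        return memo[n]
    if n <= 1:                           # base case
        return n
    memo[n] = fib(n - 1) + fib(n - 2)    # store before returning
    return memo[n]

print(fib(50))  # 12586269025
```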

Quick Reference

Pattern              Structure           Time          Space
Linear recursion     1 call per level    O(n)          O(n)
Tail recursion       1 call, last op     O(n)          O(1) with TCO
Binary recursion     2 calls per level   O(2ⁿ) naïve   O(n)
Divide & Conquer     Split + combine     O(n log n)    O(log n)
Backtracking         Choose/unchoose     O(b^d)        O(d)
Memoized recursion   Cache subproblems   O(n)          O(n)