
Algorithm Patterns

10 core patterns that unlock hundreds of LeetCode problems. Learn the shape of each pattern, understand why it works, and see it applied to real problems.

// Quick Reference
01 Sliding Window · 02 Two Pointers · 03 Fast & Slow Ptrs · 04 Binary Search · 05 DFS · 06 BFS · 07 Doubly Linked List · 08 Dynamic Programming · 09 Heap / Priority Queue · 10 Merge Sort
01
Sliding Window
Maintain a dynamic subarray / substring window that grows and shrinks as you move through the input.
Time: O(n) · Space: O(1) or O(k) · Easy → Hard

How it works

Instead of recomputing a subarray from scratch every time (O(n²)), a sliding window keeps two pointers — left and right — that define the current window. You expand right, and shrink left only when the window violates a constraint.

There are two flavours:

  • Fixed size — window is always k elements wide.
  • Variable size — window grows until it breaks a rule, then left pointer catches up.

Spot it when: the problem asks for a subarray/substring that is "longest", "shortest", or "maximum sum" under some constraint on contiguous elements.

Classic problems

Best Time to Buy Stock · Longest Substring Without Repeating Characters · Minimum Size Subarray Sum · Minimum Window Substring
Problem — Longest substring without repeating chars · Python
def length_of_longest_substring(s: str) -> int:
    seen = {}          # char → last seen index
    left = 0
    best = 0

    for right, ch in enumerate(s):
        # If char is already in window, shrink left
        if ch in seen and seen[ch] >= left:
            left = seen[ch] + 1

        seen[ch] = right
        best = max(best, right - left + 1)

    return best

# "abcabcbb" → 3  ("abc")
# "pwwkew"  → 3  ("wke")
Pattern — Fixed window (max sum of k elements) · Python
def max_sum_subarray(nums: list, k: int) -> int:
    window = sum(nums[:k])
    best   = window

    for i in range(k, len(nums)):
        # Slide: add right element, drop left element
        window += nums[i] - nums[i - k]
        best    = max(best, window)

    return best

# [2,1,5,1,3,2], k=3 → 9  (5+1+3)
02
Two Pointers
Place one pointer at each end (or both near the same end) and march them toward each other based on comparison logic.
Time: O(n) · Space: O(1) · Easy → Medium

How it works

With a sorted array you can eliminate possibilities cheaply. Start left = 0, right = n-1. Check the pair sum:

  • Sum too small → move left rightward (increases the sum)
  • Sum too large → move right leftward (decreases the sum)
  • Match → record it and move both pointers

Also used on same-direction problems where a slow pointer tracks a "write head" while a fast pointer scans ahead (e.g. remove duplicates in-place).
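The same-direction variant can be sketched on Remove Duplicates from Sorted Array: slow marks the end of the deduplicated prefix while fast scans ahead.

```python
def remove_duplicates(nums: list) -> int:
    if not nums:
        return 0

    slow = 0                          # write head: last unique element
    for fast in range(1, len(nums)):  # fast scans ahead
        if nums[fast] != nums[slow]:
            slow += 1
            nums[slow] = nums[fast]   # overwrite in place

    return slow + 1  # length of the deduplicated prefix

# [1,1,2,2,3] → 3  (nums now starts [1,2,3])
```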

Spot it when: the input is sorted (or sortable) and you're searching for a pair/triplet that meets a condition, or when you need an in-place partition.

Classic problems

Remove Duplicates from Sorted Array · Two Sum II · 3Sum · Container With Most Water
Problem — Two Sum II (sorted input) · Python
def two_sum(numbers: list, target: int) -> list:
    left, right = 0, len(numbers) - 1

    while left < right:
        s = numbers[left] + numbers[right]

        if   s == target: return [left+1, right+1]
        elif s  < target: left  += 1
        else:             right -= 1

# [2,7,11,15], target=9 → [1,2]
Problem — Container With Most Water · Python
def max_area(height: list) -> int:
    left, right = 0, len(height) - 1
    best = 0

    while left < right:
        h    = min(height[left], height[right])
        best = max(best, h * (right - left))
        # Move the shorter side inward
        if height[left] < height[right]: left  += 1
        else:                             right -= 1

    return best
03
Fast & Slow Pointers
Floyd's tortoise-and-hare: two pointers at different speeds reveal cycles and midpoints in linked structures.
Time: O(n) · Space: O(1) · Easy → Medium

How it works

Slow moves one node at a time. Fast moves two nodes at a time. If there is a cycle, fast will eventually lap slow and they will meet. If there is no cycle, fast reaches None first.

  • Cycle detection — do slow & fast ever point to the same node?
  • Middle of list — when fast hits the end, slow is at the midpoint.
  • Cycle start — after meeting, reset one pointer to head; both move at speed 1. They meet at the cycle entrance.

Spot it when: the problem involves a linked list and asks about cycles, midpoints, or the k-th element from the end.
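The cycle-start bullet is Linked List Cycle II. A sketch, assuming the usual nodes with a `.next` pointer:

```python
def detect_cycle(head):
    # Phase 1: tortoise-and-hare until the pointers meet
    slow = fast = head
    while fast and fast.next:
        slow = slow.next
        fast = fast.next.next
        if slow is fast:
            break
    else:
        return None  # fast fell off the end: no cycle

    # Phase 2: reset one pointer to head; both advance at speed 1.
    # They meet exactly at the cycle entrance.
    slow = head
    while slow is not fast:
        slow = slow.next
        fast = fast.next
    return slow
```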

Classic problems

Linked List Cycle · Middle of Linked List · Linked List Cycle II · Happy Number
Problem — Detect cycle in linked list · Python
def has_cycle(head) -> bool:
    slow = fast = head

    while fast and fast.next:
        slow = slow.next
        fast = fast.next.next

        if slow is fast:
            return True

    return False
Problem — Find middle of linked list · Python
def middle_node(head):
    slow = fast = head

    while fast and fast.next:
        slow = slow.next
        fast = fast.next.next
    # For even-length lists, slow lands on
    # the second of the two middle nodes.
    return slow

# 1→2→3→4→5  → returns node 3
# 1→2→3→4    → returns node 3
05
Depth-First Search (DFS)
Explore a path all the way to its end before backtracking. Naturally recursive — or use an explicit stack.
Time: O(V + E) · Space: O(h) recursion depth · Easy → Hard

How it works

DFS dives deep along one branch before exploring siblings. The call stack (or an explicit stack) tracks where to backtrack to. On trees it naturally produces pre-order, in-order, and post-order traversals.

  • Tree DFS — check a node, recurse left, recurse right.
  • Graph DFS — use a visited set to avoid revisiting nodes.
  • Backtracking — DFS + undoing the last choice (permutations, subsets, n-queens).

Spot it when: exploring all paths, counting connected components, checking if a path exists, or generating all combinations/permutations.
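The backtracking bullet (DFS plus undoing the last choice) can be sketched on Subsets: at each index, branch on excluding or including the element, then pop to undo.

```python
def subsets(nums: list) -> list:
    result, path = [], []

    def dfs(i):
        if i == len(nums):
            result.append(path[:])  # snapshot the current choices
            return
        dfs(i + 1)                  # branch 1: exclude nums[i]
        path.append(nums[i])        # branch 2: include nums[i]
        dfs(i + 1)
        path.pop()                  # undo the choice (backtrack)

    dfs(0)
    return result

# subsets([1,2,3]) → 8 subsets (2**3)
```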

Classic problems

Maximum Depth of Binary Tree · Number of Islands · Path Sum II · Word Search
Problem — Number of Islands (grid DFS) · Python
def num_islands(grid: list) -> int:
    rows, cols = len(grid), len(grid[0])
    count = 0

    def dfs(r, c):
        if r < 0 or r >= rows or c < 0 \
           or c >= cols or grid[r][c] != "1":
            return
        grid[r][c] = "0"  # mark visited (sink it)
        dfs(r+1,c); dfs(r-1,c)
        dfs(r,c+1); dfs(r,c-1)

    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == "1":
                dfs(r, c)
                count += 1
    return count
Tree DFS — max depth · Python
def max_depth(root) -> int:
    if not root:
        return 0
    return 1 + max(
        max_depth(root.left),
        max_depth(root.right)
    )
06
Breadth-First Search (BFS)
Explore all neighbours at the current depth before going deeper. Guarantees the shortest path in unweighted graphs.
Time: O(V + E) · Space: O(w) max width · Easy → Hard

How it works

BFS uses a queue (FIFO). Start by enqueuing the source node. At each step dequeue one node, process it, then enqueue all unvisited neighbours. Nodes at distance 1 are fully processed before distance 2, and so on.

  • Level-order tree traversal — all nodes at depth d before depth d+1.
  • Shortest path — the first time BFS reaches a node is via the shortest route.
  • Multi-source BFS — seed the queue with multiple starting nodes simultaneously (e.g. "rotting oranges").

Spot it when: the problem asks for shortest path, minimum steps, or level-by-level processing in an unweighted graph or grid.
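The level-order bullet is the standard BFS template: freeze the queue length so the inner loop consumes exactly one depth per pass. A sketch, assuming nodes with `.val` / `.left` / `.right`:

```python
from collections import deque

def level_order(root) -> list:
    if not root:
        return []

    result, q = [], deque([root])
    while q:
        level = []
        for _ in range(len(q)):   # drain exactly one depth
            node = q.popleft()
            level.append(node.val)
            if node.left:  q.append(node.left)
            if node.right: q.append(node.right)
        result.append(level)
    return result
```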

Classic problems

Binary Tree Level Order Traversal · Rotting Oranges · Word Ladder · Shortest Path in Binary Matrix
Problem — Rotting Oranges (multi-source BFS) · Python
from collections import deque

def oranges_rotting(grid: list) -> int:
    q = deque()
    fresh = 0

    for r, row in enumerate(grid):
        for c, val in enumerate(row):
            if val == 2: q.append((r, c, 0))
            if val == 1: fresh += 1

    minutes = 0
    while q:
        r, c, t = q.popleft()
        for dr, dc in [(1,0),(-1,0),(0,1),(0,-1)]:
            nr, nc = r+dr, c+dc
            if 0 <= nr < len(grid) and \
               0 <= nc < len(grid[0]) and \
               grid[nr][nc] == 1:
                grid[nr][nc] = 2
                fresh -= 1
                minutes = t + 1
                q.append((nr, nc, t+1))

    return minutes if fresh == 0 else -1
07
Doubly Linked List
Each node holds a value plus pointers to both the previous and next node — enabling O(1) insertion and deletion at any known position.
Insert/Delete: O(1) · Space: O(n) · Medium → Hard

How it works

A doubly linked list (DLL) solves the singly-linked-list problem of needing to know the previous node to delete. With node.prev and node.next you can splice out a node in O(1).

The killer interview application is the LRU Cache: combine a DLL (ordered by recency) with a HashMap (O(1) lookup) to get O(1) get and put.

  • Most-recently-used sits at the tail.
  • Least-recently-used sits at the head — evict it when over capacity.
  • On every access: remove the node from its current position, re-insert at tail.

Spot it when: you need both O(1) access by key AND O(1) ordered eviction — that's always a hashmap + DLL.

Classic problems

LRU Cache · LFU Cache · Design Browser History
Problem — LRU Cache (hashmap + DLL) · Python
class Node:
    def __init__(self, key=0, val=0):
        self.key = key; self.val = val
        self.prev = self.next = None

class LRUCache:
    def __init__(self, capacity: int):
        self.cap  = capacity
        self.map  = {}            # key → node
        # Sentinel head / tail (dummy nodes)
        self.head = Node()
        self.tail = Node()
        self.head.next = self.tail
        self.tail.prev = self.head

    def _remove(self, node):
        node.prev.next = node.next
        node.next.prev = node.prev

    def _insert_tail(self, node):
        node.prev = self.tail.prev
        node.next = self.tail
        self.tail.prev.next = node
        self.tail.prev      = node

    def get(self, key: int) -> int:
        if key not in self.map: return -1
        node = self.map[key]
        self._remove(node)
        self._insert_tail(node)
        return node.val

    def put(self, key: int, value: int):
        if key in self.map:
            self._remove(self.map[key])
        node = Node(key, value)
        self._insert_tail(node)
        self.map[key] = node
        if len(self.map) > self.cap:
            lru = self.head.next  # evict LRU
            self._remove(lru)
            del self.map[lru.key]
08
Dynamic Programming
Break a problem into overlapping subproblems, solve each once, and store the results to avoid redundant recomputation.
Time: O(n·states) · Space: O(states) · Medium → Hard

How it works

DP requires two properties: optimal substructure (the optimal solution contains optimal sub-solutions) and overlapping subproblems (the same sub-problem is solved repeatedly in a naive recursion).

Two implementation styles:

  • Top-down (memoization) — recursive function + a cache. Write the natural recursion, then add @lru_cache or a dict.
  • Bottom-up (tabulation) — fill a table iteratively from base cases up. Usually better space and avoids stack overflow.

Spot it when: the problem says "how many ways", "minimum cost", "maximum value", or you notice the same recursive call appearing with the same arguments more than once.
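The top-down style is just the naive recursion with a cache bolted on. Climbing Stairs shows how little changes:

```python
from functools import lru_cache

def climb_stairs(n: int) -> int:
    @lru_cache(maxsize=None)   # the cache is the entire DP
    def ways(i):
        if i <= 1:
            return 1           # one way to stand on step 0 or 1
        return ways(i - 1) + ways(i - 2)
    return ways(n)

# climb_stairs(5) → 8
```

Without the decorator this recursion is O(2^n); with it, each subproblem is computed once, so it drops to O(n).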

Classic problems

Climbing Stairs · Coin Change · Longest Common Subsequence · Edit Distance
Problem — Coin Change (min coins to make amount) · Python
def coin_change(coins: list, amount: int) -> int:
    # dp[i] = min coins to make amount i
    dp = [float("inf")] * (amount + 1)
    dp[0] = 0          # base case

    for i in range(1, amount + 1):
        for coin in coins:
            if coin <= i:
                dp[i] = min(dp[i], dp[i - coin] + 1)

    return dp[amount] if dp[amount] != float("inf") else -1

# coins=[1,5,6,9], amount=11 → 2  (5+6)
Problem — Longest Common Subsequence · Python
def lcs(text1: str, text2: str) -> int:
    m, n = len(text1), len(text2)
    dp = [[0]*(n+1) for _ in range(m+1)]

    for i in range(1, m+1):
        for j in range(1, n+1):
            if text1[i-1] == text2[j-1]:
                dp[i][j] = dp[i-1][j-1] + 1
            else:
                dp[i][j] = max(dp[i-1][j], dp[i][j-1])
    return dp[m][n]

# "abcde", "ace" → 3
09
Heap / Priority Queue
A complete binary tree where every parent is ≤ its children (min-heap). Gives you the smallest (or largest) element in O(1), with O(log n) insert and remove.
Push/Pop: O(log n) · Space: O(n) · Medium → Hard

How it works

Python's heapq module is a min-heap. Push elements and the smallest always floats to the top. For a max-heap, negate the values when pushing.

  • K largest elements — maintain a min-heap of size k. If new element > heap top, replace top.
  • K smallest elements — maintain a max-heap of size k.
  • Merge K sorted lists — push (val, list_index, element_index) into heap; always pop the smallest and push its successor.
  • Median streaming — two heaps: max-heap for lower half, min-heap for upper half.

Spot it when: you need repeated access to the "current minimum/maximum" as new elements arrive, or the problem asks for the "top K" of anything.
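The merge-K bullet, sketched over plain Python lists for self-containment (the linked-list version is identical except for how the successor is fetched):

```python
import heapq

def merge_k_sorted(lists: list) -> list:
    heap, result = [], []
    for li, lst in enumerate(lists):       # seed with each list's head
        if lst:
            heapq.heappush(heap, (lst[0], li, 0))

    while heap:
        val, li, ei = heapq.heappop(heap)  # smallest of all current heads
        result.append(val)
        if ei + 1 < len(lists[li]):        # push the popped element's successor
            heapq.heappush(heap, (lists[li][ei + 1], li, ei + 1))
    return result

# [[1,4,5],[1,3,4],[2,6]] → [1,1,2,3,4,4,5,6]
```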

Classic problems

Kth Largest Element in a Stream · K Closest Points to Origin · Merge K Sorted Lists · Find Median from Data Stream
Problem — K Closest Points to Origin · Python
import heapq

def k_closest(points: list, k: int) -> list:
    # Max-heap of size k (negate dist for max-heap)
    heap = []

    for x, y in points:
        dist = x*x + y*y        # skip sqrt — monotone
        heapq.heappush(heap, (-dist, x, y))
        if len(heap) > k:
            heapq.heappop(heap) # evict the farthest

    return [[x, y] for _, x, y in heap]
Problem — Find Median from Data Stream · Python
import heapq

class MedianFinder:
    def __init__(self):
        self.lo = []   # max-heap (lower half, negated)
        self.hi = []   # min-heap (upper half)

    def addNum(self, num: int):
        heapq.heappush(self.lo, -num)
        # Balance: lo top ≤ hi top
        if self.lo and self.hi and \
           -self.lo[0] > self.hi[0]:
            heapq.heappush(self.hi, -heapq.heappop(self.lo))
        # Keep sizes equal or lo has one extra
        if len(self.lo) > len(self.hi) + 1:
            heapq.heappush(self.hi, -heapq.heappop(self.lo))
        elif len(self.hi) > len(self.lo):
            heapq.heappush(self.lo, -heapq.heappop(self.hi))

    def findMedian(self) -> float:
        if len(self.lo) > len(self.hi):
            return -self.lo[0]
        return (-self.lo[0] + self.hi[0]) / 2
10
Merge Sort / Divide & Conquer
Recursively split the input in half, solve each half, then merge the two sorted halves back together. The merge step is where the work happens.
Time: O(n log n) · Space: O(n) · Medium → Hard

How it works

The divide-and-conquer template is always three steps:

  • Divide — split the problem at the midpoint.
  • Conquer — recurse on both halves (base case: single element).
  • Combine — merge (or otherwise combine) the two solved halves.

Because the tree of recursion has log n levels and each level does O(n) work during the merge, the total is O(n log n).

The same pattern drives counting inversions, sorting linked lists in O(n log n) without the O(n) scratch array that array merge sort needs (the recursive version still uses O(log n) stack; a bottom-up iterative variant achieves true O(1) extra space), and external sort for data too large for RAM.

Spot it when: you need a stable O(n log n) sort, or the problem involves counting something across pairs that a split-and-merge traversal can accumulate naturally.
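Counting inversions is a one-line change to the merge step: whenever an element from the right half wins, every element still waiting in the left half forms an inversion with it. A sketch:

```python
def count_inversions(arr: list) -> int:
    def sort_count(a):
        if len(a) <= 1:
            return a, 0
        mid = len(a) // 2
        left,  inv_l = sort_count(a[:mid])
        right, inv_r = sort_count(a[mid:])

        merged, i, j, cross = [], 0, 0, 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                merged.append(left[i]); i += 1
            else:
                cross += len(left) - i   # all remaining left[i:] > right[j]
                merged.append(right[j]); j += 1
        merged += left[i:] + right[j:]
        return merged, inv_l + inv_r + cross

    return sort_count(arr)[1]

# [2,4,1,3,5] → 3  ((2,1), (4,1), (4,3))
```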

Classic problems

Merge Sorted Array · Sort List (linked list) · Count of Smaller Numbers After Self · Reverse Pairs
Classic — Merge Sort on an array · Python
def merge_sort(arr: list) -> list:
    if len(arr) <= 1:
        return arr

    mid   = len(arr) // 2
    left  = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    return _merge(left, right)

def _merge(a: list, b: list) -> list:
    result = []
    i = j = 0
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:
            result.append(a[i]); i += 1
        else:
            result.append(b[j]); j += 1
    return result + a[i:] + b[j:]
Problem — Sort a Linked List in O(n log n) · Python
def sort_list(head):
    if not head or not head.next:
        return head

    # Split in half using fast & slow pointers
    slow, fast = head, head.next
    while fast and fast.next:
        slow = slow.next
        fast = fast.next.next
    mid        = slow.next
    slow.next  = None   # cut the list

    left  = sort_list(head)
    right = sort_list(mid)
    return merge_lists(left, right)

def merge_lists(l1, l2):
    dummy = cur = ListNode()
    while l1 and l2:
        if l1.val <= l2.val: cur.next = l1; l1 = l1.next
        else:                cur.next = l2; l2 = l2.next
        cur = cur.next
    cur.next = l1 or l2
    return dummy.next