Two Pointer Technique: Solving Pair and Partition Problems in O(n)
Collapse O(n²) pair-search problems to O(n) by converging two index boundaries on a sorted array.
Abstract Algorithms
TLDR: Place one pointer at the start and one at the end of a sorted array. Move them toward each other based on a comparison condition. Every classic pair/partition problem that naively runs in O(n²) collapses to O(n) with this one idea — and mastering it unlocks Three Sum, Container With Most Water, and dozens of interview problems in one mental model.
📖 The O(n²) Trap: Why Brute-Force Pair Searches Break at Scale
You have a sorted array and need to find two numbers that sum to a target. The first instinct is a nested loop: for every element, scan the rest of the array for a match. That works, but it runs in O(n²) time. For an array of 10,000 elements that is 100 million comparisons. For 100,000 elements it's 10 billion.
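As a baseline, the nested loop looks something like this (a minimal Java sketch; the class and method names are illustrative):

```java
// Brute-force pair search: check every (i, j) pair — O(n^2) time.
public class TwoSumBrute {
    // Returns indices of the first pair summing to target, or null if none exists.
    public static int[] findPair(int[] arr, int target) {
        for (int i = 0; i < arr.length; i++) {
            for (int j = i + 1; j < arr.length; j++) { // scan the rest of the array
                if (arr[i] + arr[j] == target) {
                    return new int[]{i, j};
                }
            }
        }
        return null; // no pair found
    }
}
```

Every pair is visited once, which is exactly the quadratic blowup the rest of this post eliminates.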
The nested-loop trap appears in dozens of interview problems: pair sums, palindrome checks, container area maximization, triplet sums. All of them share the same underlying structure — you're comparing or combining two positions in an array — and all of them have the same escape hatch.
The key insight is this: if the array is sorted, you already know the relationship between any two elements the moment you read their values. You don't need to check every pair. You can make a directional decision — move left or move right — and each decision permanently eliminates a column of candidates. Two pointers exploit exactly this property.
| Approach | Time Complexity | Space Complexity | Requires Sorting? |
| --- | --- | --- | --- |
| Nested loop (brute force) | O(n²) | O(1) | No |
| Hash set lookup | O(n) | O(n) | No |
| Two pointer | O(n) | O(1) | Yes |
The two-pointer technique is the only approach that achieves O(n) time and O(1) space — no hash map, no extra array. The trade-off is that the input must be sorted (or sortable at acceptable cost).
🔍 Sorted Arrays and the Convergence Property
The technique only works because sorting gives you a total order you can exploit. When left and right are two indices in a sorted array, the following invariants hold at all times:
- `arr[left]` is the smallest unconsidered value on the left; `arr[right]` is the largest unconsidered value on the right.
- If `arr[left] + arr[right] > target`, then no element to the right of `left` can pair with `arr[right]` to produce a smaller sum — so you must move `right` leftward.
- If `arr[left] + arr[right] < target`, then no element to the left of `right` can pair with `arr[left]` to produce a larger sum — so you must move `left` rightward.
This is the convergence property. Each pointer movement provably eliminates every remaining pair involving the index it leaves behind, so the two pointers exhaust all candidate pairs in at most n - 1 moves. No pair is re-examined.
The same logic generalizes beyond pair sums. In Container With Most Water, moving the shorter wall inward is the only way to possibly increase area. In palindrome checks, if the outer characters match, you can safely move both inward. In all cases, sorting (or the natural ordering) provides the decision rule.
⚙️ How Two Pointers Converge: The Decision Loop
The canonical template is compact. Set left = 0 and right = n - 1. In each loop iteration, compute the condition using both values and move the appropriate pointer.
```java
// Generic two-pointer template
int left = 0, right = arr.length - 1;
while (left < right) {
    int value = compute(arr[left], arr[right]); // e.g. sum, area, comparison
    if (value == target) {
        // Found solution — record it, then decide whether to keep searching
        left++;
        right--;
    } else if (value < target) {
        left++;  // Need a larger value — move left inward
    } else {
        right--; // Need a smaller value — move right inward
    }
}
```
The loop runs while left < right; when the pointers meet, every candidate pair has been considered exactly once. The loop body executes at most n - 1 times, giving O(n) time.
Three decision shapes:
- Sum too small → move `left` rightward (`arr[left]` increases, so the sum increases).
- Sum too large → move `right` leftward (`arr[right]` decreases, so the sum decreases).
- Sum equals target → record and move both (or just one, depending on the problem).
This is the full algorithm. Every two-pointer problem is a variation of this loop with a problem-specific condition substituted in.
🧠 Deep Dive: Why the Loop Stays Correct and Linear
The Internals: Pointer Movement as Candidate Elimination
Think of the sorted array as a two-dimensional grid of pairs (i, j) where i < j. The brute-force approach visits every cell of this grid. Two pointers walk the perimeter: starting from the top-right corner (0, n-1), each move either eliminates an entire row (move left pointer) or an entire column (move right pointer).
When you move left from index i to i+1 because the sum was too small, you're asserting: arr[i] paired with any remaining index j ≤ right yields arr[i] + arr[j] ≤ arr[i] + arr[right] < target, so no pair involving i can still reach the target. That assertion is correct because the array is sorted. The entire row i of the grid is pruned in one step.
This is why the technique only works on sorted data. Without sorted order, moving `left` rightward doesn't guarantee anything about the remaining candidates — you'd need to check them all.
Where the pointer rule comes from for each problem:
- Two Sum II (pair sum): move the pointer whose value makes the sum too far from target.
- Container With Most Water: move the pointer with the shorter height (longer height can only stay the same or increase by moving the short side).
- Three Sum: fix one element with an outer loop; run two pointers on the remaining subarray.
Performance Analysis: Time and Space for Each Variant
| Problem | Outer Work | Inner Work | Total Time | Space |
| --- | --- | --- | --- | --- |
| Two Sum II | O(1) | O(n) two-pointer scan | O(n) | O(1) |
| Container With Most Water | O(1) | O(n) scan | O(n) | O(1) |
| Three Sum | O(n) outer loop | O(n) inner scan | O(n²) | O(1) extra (in-place sort) |
Two Sum II and Container With Most Water are pure O(n). Three Sum adds one outer loop, making it O(n²) overall — but this is still optimal for the three-sum problem because the output alone can be O(n²) pairs in the worst case. The two-pointer inner scan keeps the constant factor small.
The space cost is O(1) for all variants: no auxiliary hash map, no extra array. The only space usage is the two integer indices and the output list.
Sorting cost: If the input isn't pre-sorted, you pay O(n log n) for sorting before running the O(n) scan. The combined complexity is still O(n log n), which is far better than the O(n²) brute force.
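When the input arrives unsorted and only the pair's values (not original positions) are needed, the sort-then-scan combination can be sketched as follows (illustrative names; the array is cloned to avoid mutating the caller's input):

```java
import java.util.Arrays;

// Sort once (O(n log n)), then run the O(n) two-pointer scan.
// Note: sorting destroys the original indices, so this returns values, not positions.
public class SortThenScan {
    public static int[] findPairValues(int[] arr, int target) {
        int[] sorted = arr.clone();
        Arrays.sort(sorted);
        int left = 0, right = sorted.length - 1;
        while (left < right) {
            int sum = sorted[left] + sorted[right];
            if (sum == target) return new int[]{sorted[left], sorted[right]};
            if (sum < target) left++;
            else right--;
        }
        return null; // no such pair
    }
}
```

If the caller needs original indices, a hash-map approach is the usual alternative, since sorting reorders the array.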
📊 Visualizing Two Pointers Converging on a Sorted Array
```mermaid
flowchart TD
    A(["Start: left=0, right=n-1"]) --> B{"left < right?"}
    B -- No --> Z(["Done"])
    B -- Yes --> C["Compute sum = arr[left] + arr[right]"]
    C --> D{"sum vs target"}
    D -- equal --> E["Record pair<br/>left++ and right--"]
    D -- too small --> F["left++<br/>Need larger value"]
    D -- too large --> G["right--<br/>Need smaller value"]
    E --> B
    F --> B
    G --> B
```
Each branch eliminates at least one candidate, guaranteeing the loop terminates in at most n steps.
For Three Sum, picture a fixed outer pointer scanning left to right. For each fixed value, the two pointers independently converge on the remaining subarray to find pairs that complement it.
```mermaid
flowchart LR
    A(["Fix arr[i]"]) --> B(["left = i+1<br/>right = n-1"])
    B --> C{"left < right"}
    C -- Yes --> D{"triplet sum vs 0"}
    D -- zero --> E["Record triplet<br/>Skip duplicates<br/>left++ right--"]
    D -- negative --> F["left++"]
    D -- positive --> G["right--"]
    E --> C
    F --> C
    G --> C
    C -- No --> H(["Next i"])
```
🌍 Where Two Pointers Show Up: Arrays, Strings, and Intervals
The two-pointer pattern appears in production code and systems far beyond competitive programming:
In-place array partitioning: Java's Arrays.sort for primitives uses dual-pivot quicksort (introduced in Java 7), whose partitioning step relies on converging pointers to split subarrays around the pivots without allocating extra memory.
String palindrome validation: Every palindrome checker — from input validation libraries to DNA sequence analyzers — uses the same convergence loop on the character array. Check s[left] == s[right], move inward.
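That convergence loop is short enough to show in full (a minimal sketch; the class name is illustrative):

```java
public class PalindromeCheck {
    // Classic opposite-end loop: compare outer characters, move both inward.
    public static boolean isPalindrome(String s) {
        int left = 0, right = s.length() - 1;
        while (left < right) {
            if (s.charAt(left) != s.charAt(right)) return false; // mismatch: not a palindrome
            left++;
            right--;
        }
        return true; // pointers met: all outer pairs matched
    }
}
```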
Merge step in merge sort: The merge of two sorted halves walks two pointers forward through each half, picking the smaller element at each step. This is the forward-only variant of two pointers.
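A minimal sketch of that merge step (illustrative names):

```java
public class MergeStep {
    // Merge two sorted arrays by walking one pointer forward through each — O(n + m).
    public static int[] merge(int[] a, int[] b) {
        int[] out = new int[a.length + b.length];
        int i = 0, j = 0, k = 0;
        while (i < a.length && j < b.length) {
            out[k++] = (a[i] <= b[j]) ? a[i++] : b[j++]; // pick the smaller head
        }
        while (i < a.length) out[k++] = a[i++]; // drain whichever half remains
        while (j < b.length) out[k++] = b[j++];
        return out;
    }
}
```

Unlike the opposite-end variant, both pointers only move forward, and neither ever passes the other's array.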
Sliding window initialization: Many variable-window problems (longest substring, smallest subarray with sum ≥ k) start as a two-pointer framework and then layer on a running aggregate. The pointer logic is the same; the condition changes.
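As one example of layering a running aggregate onto the pointer logic, here is a minimal "smallest subarray with sum ≥ k" sketch (assumes strictly positive elements; names are illustrative):

```java
public class MinSubarray {
    // Sliding window: right expands the window, left shrinks it while the constraint holds.
    // Assumes all elements are positive, so the window sum grows/shrinks monotonically.
    public static int minLengthWithSumAtLeast(int[] arr, int k) {
        int best = Integer.MAX_VALUE;
        int left = 0;
        long sum = 0; // the running aggregate layered onto the two-pointer frame
        for (int right = 0; right < arr.length; right++) {
            sum += arr[right];
            while (sum >= k) { // shrink from the left while the constraint is satisfied
                best = Math.min(best, right - left + 1);
                sum -= arr[left++];
            }
        }
        return best == Integer.MAX_VALUE ? 0 : best; // 0 = no qualifying subarray
    }
}
```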
⚖️ When Sorting Costs More Than It Saves
Two pointers are not universally applicable. The main failure modes:
Unsorted data with expensive sort: If you can't sort the input (it's a stream, or the sort cost is prohibitive), two pointers don't apply directly. A hash-set approach at O(n) time, O(n) space is better for a single query. However, if you need to answer many queries against the same array, paying O(n log n) once to sort and then running O(n) per query is usually worth it.
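For reference, the single-query hash-set alternative can be sketched as (illustrative names):

```java
import java.util.HashSet;
import java.util.Set;

public class HashSetPair {
    // O(n) time, O(n) space: one pass, checking each value's complement against what's seen.
    public static boolean hasPair(int[] arr, int target) {
        Set<Integer> seen = new HashSet<>();
        for (int value : arr) {
            if (seen.contains(target - value)) return true; // complement already seen
            seen.add(value);
        }
        return false;
    }
}
```

No sort is required, which is exactly why this wins for a single query on unsorted data.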
Non-linear data structures: Two pointers work on arrays and strings — structures with indexed access. They don't directly apply to trees, graphs, or hash maps. The fast/slow pointer variant (Floyd's algorithm) adapts the idea to linked lists specifically.
Multiple valid answers needing all solutions: Three Sum asks for all unique triplets. The duplicate-skipping step after recording a triplet — `while (left < right && nums[left] == nums[left + 1]) left++` (and the mirror for `right`) — is easy to forget, and omitting it produces duplicate triplets that fail assertions. Every two-pointer solution that iterates for multiple answers needs explicit deduplication.
🧭 Choosing Between Two Pointers and Hash-Set Lookup
| Situation | Recommendation |
| --- | --- |
| Sorted array, find one pair | Two pointers — O(n) time, O(1) space |
| Unsorted array, single query | Hash set — O(n) time, O(n) space; skip the sort |
| Unsorted array, many queries | Sort once, then two pointers per query |
| Need all pairs / triplets | Two pointers with duplicate skipping |
| Non-linear structure (tree, graph) | Use DFS/BFS; two pointers don't apply |
| Palindrome / in-place partition | Two pointers — canonical choice |
The decisive question: Is the data sorted, or can it be sorted at acceptable cost? If yes, the two-pointer approach wins on space. If the data can't be sorted and space isn't a constraint, reach for a hash set.
🧪 Three Classic Problems: Full Java Solutions
Problem 1: Two Sum II — Sorted Array
Given a 1-indexed sorted array numbers and a target, return the indices of the two numbers that add to the target. Exactly one solution is guaranteed.
```java
public class TwoSumII {
    /**
     * LeetCode 167 — Two Sum II (Input Array Is Sorted)
     * Time: O(n) — single left/right convergence
     * Space: O(1) — two index variables only
     */
    public int[] twoSum(int[] numbers, int target) {
        int left = 0;
        int right = numbers.length - 1;
        while (left < right) {
            int sum = numbers[left] + numbers[right];
            if (sum == target) {
                return new int[]{left + 1, right + 1}; // 1-indexed output
            } else if (sum < target) {
                left++;
            } else {
                right--;
            }
        }
        throw new IllegalArgumentException("No solution exists");
    }

    public static void main(String[] args) {
        TwoSumII solver = new TwoSumII();
        int[] result = solver.twoSum(new int[]{2, 7, 11, 15}, 9);
        System.out.println(result[0] + ", " + result[1]); // 1, 2
    }
}
```
Trace on [2, 7, 11, 15], target=9:
- left=0 (2), right=3 (15): sum=17 > 9 → right--
- left=0 (2), right=2 (11): sum=13 > 9 → right--
- left=0 (2), right=1 (7): sum=9 == 9 → return [1, 2]
Problem 2: Container With Most Water
Given heights of vertical lines, find two lines that together with the x-axis form a container that holds the most water.
```java
public class ContainerWithMostWater {
    /**
     * LeetCode 11 — Container With Most Water
     * Key insight: always move the shorter line — moving the taller line can only
     * decrease or maintain the limiting factor (the shorter line).
     * Time: O(n)
     * Space: O(1)
     */
    public int maxArea(int[] height) {
        int left = 0;
        int right = height.length - 1;
        int maxWater = 0;
        while (left < right) {
            int width = right - left;
            int h = Math.min(height[left], height[right]);
            maxWater = Math.max(maxWater, width * h);
            // Move the pointer with the shorter height
            if (height[left] <= height[right]) {
                left++;
            } else {
                right--;
            }
        }
        return maxWater;
    }

    public static void main(String[] args) {
        ContainerWithMostWater solver = new ContainerWithMostWater();
        System.out.println(solver.maxArea(new int[]{1, 8, 6, 2, 5, 4, 8, 3, 7})); // 49
    }
}
```
Why move the shorter height? The area is width × min(height[left], height[right]). As you move inward, width decreases. To possibly increase area, you need min(height[left], height[right]) to increase. The only way that can happen is if the limiting (shorter) height gets replaced by something taller. Moving the taller pointer can never increase the minimum.
Problem 3: Three Sum — All Unique Triplets
Find all unique triplets in an unsorted array that sum to zero.
```java
import java.util.*;

public class ThreeSum {
    /**
     * LeetCode 15 — 3Sum
     * Strategy: sort, then fix one element and use two pointers on the rest.
     * Time: O(n²) — O(n log n) sort + O(n) per element × O(n) elements
     * Space: O(1) — ignoring output list; sort is in-place
     */
    public List<List<Integer>> threeSum(int[] nums) {
        Arrays.sort(nums);
        List<List<Integer>> result = new ArrayList<>();
        for (int i = 0; i < nums.length - 2; i++) {
            // Skip duplicate values for the fixed element
            if (i > 0 && nums[i] == nums[i - 1]) continue;
            // Early termination: smallest possible triplet > 0
            if (nums[i] > 0) break;
            int left = i + 1;
            int right = nums.length - 1;
            int target = -nums[i];
            while (left < right) {
                int sum = nums[left] + nums[right];
                if (sum == target) {
                    result.add(Arrays.asList(nums[i], nums[left], nums[right]));
                    // Skip duplicates for left and right
                    while (left < right && nums[left] == nums[left + 1]) left++;
                    while (left < right && nums[right] == nums[right - 1]) right--;
                    left++;
                    right--;
                } else if (sum < target) {
                    left++;
                } else {
                    right--;
                }
            }
        }
        return result;
    }

    public static void main(String[] args) {
        ThreeSum solver = new ThreeSum();
        System.out.println(solver.threeSum(new int[]{-1, 0, 1, 2, -1, -4}));
        // [[-1, -1, 2], [-1, 0, 1]]
    }
}
```
The duplicate-skipping logic is critical. Without the while (nums[left] == nums[left+1]) left++ lines, you'd record the same triplet multiple times when the array contains repeated values.
🛠️ How LeetCode Pattern Libraries Categorize the Two-Pointer Family
The open-source community has converged on a taxonomy of two-pointer variants. Understanding this taxonomy helps you recognize which sub-pattern applies to a new problem:
1. Opposite-end two pointers (this post): Both pointers start at the ends of a sorted array and converge inward. Applied to: pair sums, container problems, palindrome checks, sorted merge.
2. Same-direction two pointers (slow/fast): Both pointers start at the left and move right at different speeds. Applied to: removing duplicates in-place, finding the n-th node from the end of a linked list, cycle detection (Fast and Slow Pointer pattern).
3. Sliding window: One pointer anchors the left boundary; the other expands the right boundary. A shrink condition collapses the window. Applied to: longest substring problems, minimum subarray with constraint, fixed-size window aggregates.
LeetCode's public Explore card for Two Pointers groups problems by these categories. The open-source repo neetcode/roadmap organizes them as a progression from one-pass scanning to multi-pointer and sliding window.
```java
// Canonical "remove duplicates in-place" — same-direction two pointers
public int removeDuplicates(int[] nums) {
    if (nums.length == 0) return 0;
    int slow = 0;
    for (int fast = 1; fast < nums.length; fast++) {
        if (nums[fast] != nums[slow]) {
            slow++;
            nums[slow] = nums[fast];
        }
    }
    return slow + 1; // length of deduplicated prefix
}
```
For a deep dive on the same-direction variant with cycle detection, see the companion post on the Fast and Slow Pointer pattern.
📚 Lessons Learned: Boundary Bugs and Duplicate Blindspots
1. The off-by-one on loop termination. The condition is left < right, not left <= right. When they're equal, they both point to the same element — there's no pair to form. Using <= causes you to consider the same element twice, producing a spurious result.
2. Forgetting to sort before applying the pattern. On an unsorted array, moving `left` rightward because the sum is too small is not a valid deduction. Always verify the sorted precondition. When sorting is needed, factor O(n log n) into your complexity answer.
3. Skipping duplicate deduplication in Three Sum. This is the most common interview bug. After recording a valid triplet, both pointers must skip over repeated values before advancing. A clean way to remember: skip after recording, before moving.
4. Choosing two pointers when a hash set is simpler. Two pointers require a sort step that a hash set doesn't. For a single-query unsorted array, the hash set approach is conceptually simpler. Use two pointers when the space advantage (O(1) vs O(n)) is a stated constraint, or when you need to handle all pairs/triplets.
5. Applying the pattern to unsorted streams. Real-time data arriving in a stream can't be sorted. Two pointers don't apply to raw streams; sliding window with a monotonic deque is the right tool there.
📌 Summary and Key Takeaways
- The core idea is to place pointers at both ends of a sorted array and move them inward based on a comparison condition. Each move eliminates a column of candidates.
- Time complexity is O(n) for single-pass problems (Two Sum II, Container With Most Water) and O(n²) for three-element problems (Three Sum), with O(1) space in both cases.
- The sorting prerequisite costs O(n log n) when the input isn't pre-sorted, but the combined O(n log n) still beats O(n²) brute force.
- Duplicate skipping is mandatory in Three Sum. Forget it and you'll return duplicate triplets.
- The pattern has three flavors: opposite-end (this post), same-direction (fast/slow), and sliding window. Each applies to a distinct class of problems.
- Decisive question: "Is the structure sorted and do I need to compare elements from both ends?" If yes, two pointers is your O(1)-space solution.
📝 Practice Quiz
Given sorted array `[-4, -1, -1, 0, 1, 2]` and target sum `0`, how many total pointer movements does the two-pointer algorithm make before finding the pair `(-1, 1)`?
- A) 1
- B) 2
- C) 3

Correct Answer: B — left starts at index 0 (-4), right at index 5 (2). Sum = -2 < 0 → left++. Now left=1 (-1), right=5 (2). Sum = 1 > 0 → right--. Now left=1 (-1), right=4 (1). Sum = 0 → found. Two movements before the match.
Container With Most Water uses the two-pointer technique. Why do you always move the pointer with the shorter height rather than the taller one?
- A) Moving the shorter pointer maintains the loop invariant `left < right`.
- B) Moving the shorter pointer is the only way to possibly increase the min-height factor, which is the bottleneck on area.
- C) Moving the taller pointer would cause integer overflow.

Correct Answer: B — Area = width × min(height[left], height[right]). Moving the taller pointer can only keep min the same or reduce it, because the shorter pointer still limits min. Moving the shorter pointer gives a chance of finding a taller one.
In Three Sum, after recording a valid triplet `[nums[i], nums[left], nums[right]]`, why must you skip duplicate values before advancing the pointers?
- A) To maintain the sorted invariant.
- B) To avoid recording the same triplet multiple times when the array contains repeated values.
- C) To prevent the left pointer from overtaking the right pointer.

Correct Answer: B — Without skipping, if nums[left+1] == nums[left], advancing left by one would immediately find the same triplet again.
Open-ended challenge: You are given a sorted array and asked to find all unique quadruplets that sum to a target value. How would you extend the Three Sum two-pointer approach to Four Sum? What is the time complexity of your approach, and is it optimal? When would a hash-based approach outperform it?
🔗 Related Posts
- Fast and Slow Pointer (Floyd's Cycle Detection) — the same-direction variant of two pointers, specialized for linked-list cycles and midpoint finding.
- Sliding Window Technique — extends the two-pointer idea with a running aggregate for subarray and substring problems.
- Backtracking Explained — a complementary search technique for when exhaustive exploration with pruning is needed instead of linear scanning.

Written by
Abstract Algorithms
@abstractalgorithms