
Two Pointers Technique in DSA | Patterns & Examples
What is the Two Pointers Technique?
The Two Pointers Technique is used to solve problems on linear data structures (like arrays or linked lists). The idea is to use two indices (pointers) to traverse the structure from different directions or speeds to find a desired condition or optimize a process.
This technique is especially effective when dealing with:
- Sorted arrays
- Linked lists
- Problems involving subarrays, pairs, or merging
When to Use Two Pointers?
- You need to search for a pair or subarray with a given property (e.g., sum = target)
- You need to perform in-place operations (like reversing or removing elements)
- You need to compare elements from both ends (e.g., palindrome checks)
Common Types of Two Pointers
- Opposite Direction: One pointer starts from the beginning, another from the end. Used for pair finding, reversal, etc.
- Same Direction: Both pointers move from the start. Often used in sliding window problems.
Example 1: Check if Array Has a Pair with Given Sum
Problem Statement: You are given a sorted array of integers and a target sum. Your task is to check whether there exists a pair of numbers in the array whose sum is equal to the target.
Why Use Two Pointers?
Since the array is sorted, we can avoid using nested loops (which would take O(n²) time) by using the two pointers technique to reduce the time complexity to O(n).
The idea is to place one pointer at the start of the array and the other at the end. We move the pointers inward based on the current sum:
- If the sum is too small, move the `left` pointer to the right to increase the sum.
- If the sum is too large, move the `right` pointer to the left to decrease the sum.
- If the sum equals the target, we've found our pair!
Pseudocode
// Input: sorted array and target sum
left = 0
right = array.length - 1
while (left < right):
    sum = array[left] + array[right]
    if (sum == target):
        return true
    else if (sum < target):
        left += 1
    else:
        right -= 1
return false
Dry Run with Sample Input
Input:
array = [1, 2, 3, 4, 6, 8, 11]
target = 10

| left | right | array[left] | array[right] | sum | Action |
|---|---|---|---|---|---|
| 0 | 6 | 1 | 11 | 12 | Sum > 10 → Decrease right |
| 0 | 5 | 1 | 8 | 9 | Sum < 10 → Increase left |
| 1 | 5 | 2 | 8 | 10 | Sum = 10 → Found! |
Result: True (Pair found: 2 and 8)
Step-by-Step Explanation
- Start with `left = 0` and `right = 6` (pointing to 1 and 11).
- Sum = 1 + 11 = 12. Too big → move `right` left (to 8).
- Now sum = 1 + 8 = 9. Too small → move `left` right (to 2).
- Now sum = 2 + 8 = 10 → Target found!
Time and Space Complexity
- Time Complexity: O(n) — the two pointers only move inward, so together they make at most n moves.
- Space Complexity: O(1) — Only a few variables used, no extra data structures.
Why This Works Only on Sorted Arrays?
Two pointers rely on the ability to determine whether to increase or decrease the sum by moving a pointer. That only works when the array is sorted — otherwise, increasing or decreasing a pointer won't have predictable effects on the sum.
The two pointers technique is a simple yet powerful way to avoid nested loops and get a linear time solution. It takes advantage of the sorted property to move intelligently and find the answer efficiently.
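The pseudocode above translates directly to Python. A minimal sketch (the function name `has_pair_with_sum` is just for illustration):

```python
def has_pair_with_sum(arr, target):
    """Return True if some pair in the sorted array arr sums to target."""
    left, right = 0, len(arr) - 1
    while left < right:
        current = arr[left] + arr[right]
        if current == target:
            return True
        elif current < target:
            left += 1   # sum too small: move left rightward to increase it
        else:
            right -= 1  # sum too large: move right leftward to decrease it
    return False

# Matches the dry run above: the pair (2, 8) sums to 10.
print(has_pair_with_sum([1, 2, 3, 4, 6, 8, 11], 10))  # True
```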
Example 2: Reverse an Array In-Place
Problem: Given an array, reverse its elements in-place (without using any extra array).
Why Two Pointers?
The Two Pointers Technique is perfect for this problem because we need to swap elements from both ends of the array until we meet in the middle. We don't need to scan the array multiple times — just once, with two pointers approaching each other from opposite sides.
- The `left` pointer starts at the beginning (index 0).
- The `right` pointer starts at the end (last index).
- We swap the elements at these two positions.
- Then move the pointers towards each other: `left += 1`, `right -= 1`.
- Repeat until `left >= right`.
Pseudocode
left = 0
right = array.length - 1
while (left < right):
    swap(array[left], array[right])
    left += 1
    right -= 1
Sample Input
array = [10, 20, 30, 40, 50]
Dry Run (Step-by-Step)
| Step | Left Pointer | Right Pointer | Swap | Array State |
|---|---|---|---|---|
| Initial | 0 (10) | 4 (50) | 10 <--> 50 | [50, 20, 30, 40, 10] |
| After 1st swap | 1 (20) | 3 (40) | 20 <--> 40 | [50, 40, 30, 20, 10] |
| After 2nd swap | 2 (30) | 2 (30) | Stop (left == right) | [50, 40, 30, 20, 10] |
Final Output
[50, 40, 30, 20, 10]
Time and Space Complexity
- Time Complexity: O(n), where n is the number of elements in the array. Each element is visited once.
- Space Complexity: O(1), since we reverse the array in-place using constant extra space.
This is a classic example where the Two Pointers Technique simplifies the problem:
- No need to create a new array.
- No nested loops — just a single loop with two moving pointers.
- Efficient and clean solution for beginners to understand array manipulation.
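The swap loop above can be sketched in Python as follows (`reverse_in_place` is an illustrative name; Python's tuple assignment performs the swap without a temporary variable):

```python
def reverse_in_place(arr):
    """Reverse the list in-place by swapping symmetric elements."""
    left, right = 0, len(arr) - 1
    while left < right:
        arr[left], arr[right] = arr[right], arr[left]  # swap the ends
        left += 1
        right -= 1
    return arr

# Matches the dry run above.
print(reverse_in_place([10, 20, 30, 40, 50]))  # [50, 40, 30, 20, 10]
```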
Example 3: Remove Duplicates from Sorted Array
Problem: Given a sorted array, remove the duplicates in-place so that each element appears only once and return the new length of the array. The relative order of the elements should be kept the same, and you should not use extra space (O(1) space).
Why Two Pointers?
This is a classic case for the Two Pointers Technique. The idea is simple but very efficient:
- Use one pointer (`i`) to keep track of the last unique element (also called the "slow" pointer).
- Use a second pointer (`j`) to scan the array for the next unique element (also called the "fast" pointer).
Since the array is already sorted, duplicates will always be adjacent. So, we can move the fast pointer and compare each element with the last unique value.
When we find a new unique element, we increment the slow pointer and overwrite the duplicate position with the new unique value.
Pseudocode
if array is empty:
    return 0
i = 0  // slow pointer
for j from 1 to array.length - 1:
    if array[j] != array[i]:
        i += 1
        array[i] = array[j]
return i + 1
Step-by-Step Dry Run
Let's walk through an example:
Input: [10, 10, 20, 20, 30, 40, 40]
| j (fast) | i (slow) | array[j] | array[i] | Action | Array State |
|---|---|---|---|---|---|
| 1 | 0 | 10 | 10 | Duplicate → skip | [10, 10, 20, 20, 30, 40, 40] |
| 2 | 0 | 20 | 10 | New value → i++ → array[1] = 20 | [10, 20, 20, 20, 30, 40, 40] |
| 3 | 1 | 20 | 20 | Duplicate → skip | [10, 20, 20, 20, 30, 40, 40] |
| 4 | 1 | 30 | 20 | New value → i++ → array[2] = 30 | [10, 20, 30, 20, 30, 40, 40] |
| 5 | 2 | 40 | 30 | New value → i++ → array[3] = 40 | [10, 20, 30, 40, 30, 40, 40] |
| 6 | 3 | 40 | 40 | Duplicate → skip | [10, 20, 30, 40, 30, 40, 40] |
Final value of i: 3
New length: i + 1 = 4
Modified Array: [10, 20, 30, 40] (first 4 elements)
Explanation
- `i` always points to the last unique value found.
- `j` explores the array to find the next non-duplicate value.
- When a new unique value is found, we increment `i` and copy `array[j]` to `array[i]`.
Time and Space Complexity
- Time Complexity: O(n) — one pass through the array.
- Space Complexity: O(1) — in-place update, no extra storage used.
Why Two Pointers Works Here
By separating the roles of the two pointers:
- The slow pointer tracks the position of the next unique value to be placed.
- The fast pointer scans for upcoming unique values.
This method ensures no duplicate is placed in the first part of the array, and the array is modified in-place with minimal code.
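The slow/fast pointer scheme above can be sketched in Python (`remove_duplicates` is an illustrative name; it mutates the list and returns the count of unique elements, mirroring the pseudocode):

```python
def remove_duplicates(arr):
    """Remove duplicates from a sorted list in-place; return the new length."""
    if not arr:
        return 0
    i = 0  # slow pointer: index of the last unique element kept
    for j in range(1, len(arr)):  # fast pointer scans ahead
        if arr[j] != arr[i]:
            i += 1
            arr[i] = arr[j]  # overwrite the next slot with the new unique value
    return i + 1

# Matches the dry run above.
arr = [10, 10, 20, 20, 30, 40, 40]
n = remove_duplicates(arr)
print(n, arr[:n])  # 4 [10, 20, 30, 40]
```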
Advantages of Two Pointers
- Efficient: Reduces time complexity from O(n²) to O(n) for many problems.
- In-Place: Often allows modifying arrays without extra space.
- Intuitive: Easy to implement once the pattern is understood.
Limitations
- Works best on sorted arrays or data structures where order can be exploited.
- Requires careful index management to avoid bugs.
- Not suitable for problems that require comparing arbitrary, non-adjacent pairs of elements, since each pointer can only move monotonically.
Conclusion
The Two Pointers Technique is a simple yet powerful strategy in DSA. It offers elegant solutions to a variety of problems with linear time complexity. Once you learn to spot scenarios where two pointers can be applied, you'll find your problem-solving speed and accuracy improving significantly.