Two Pointers Technique in DSA | Patterns & Examples

What is the Two Pointers Technique?

The Two Pointers Technique is used to solve problems on linear data structures (like arrays or linked lists). The idea is to use two indices (pointers) that traverse the structure from different directions or at different speeds to find a desired condition or optimize a process.

This technique is especially effective when dealing with:

  • Sorted arrays
  • Linked lists
  • Problems involving subarrays, pairs, or merging

When to Use Two Pointers?

  • You need to search for a pair or subarray with a given property (e.g., sum = target)
  • You need to perform in-place operations (like reversing or removing elements)
  • You need to compare elements from both ends (e.g., palindrome checks)

Common Types of Two Pointers

  • Opposite Direction: One pointer starts at the beginning and the other at the end; they move toward each other. Used for pair finding, reversal, palindrome checks, etc.
  • Same Direction: Both pointers move forward from the start, often at different speeds. Common in sliding window and fast/slow-pointer problems. Both patterns are sketched below.
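
To make the two patterns concrete, here is a minimal Python sketch. The function names (is_palindrome, max_window_sum) and the sample inputs are our own, chosen purely for illustration: the first uses opposite-direction pointers, the second moves both pointers in the same direction as a fixed-size sliding window.

def is_palindrome(s):
    # Opposite direction: compare characters from both ends, moving inward.
    left, right = 0, len(s) - 1
    while left < right:
        if s[left] != s[right]:
            return False
        left += 1
        right -= 1
    return True

def max_window_sum(nums, k):
    # Same direction: left trails right by exactly k, forming a sliding window.
    window_sum = sum(nums[:k])
    best = window_sum
    for right in range(k, len(nums)):
        left = right - k
        window_sum += nums[right] - nums[left]
        best = max(best, window_sum)
    return best

print(is_palindrome("level"))                  # True
print(max_window_sum([1, 3, 2, 6, 1, 4], 2))   # 8 → best window is [2, 6]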

Example 1: Check if Array Has a Pair with Given Sum

Problem Statement: You are given a sorted array of integers and a target sum. Your task is to check whether there exists a pair of numbers in the array whose sum is equal to the target.

Why Use Two Pointers?

Since the array is sorted, we can avoid using nested loops (which would take O(n²) time) by using the two pointers technique to reduce the time complexity to O(n).

The idea is to place one pointer at the start of the array and the other at the end. We move the pointers inward based on the current sum:

  • If the sum is too small, move the left pointer to the right to increase the sum.
  • If the sum is too large, move the right pointer to the left to decrease the sum.
  • If the sum equals the target, we've found our pair!

Pseudocode

// Input: sorted array and target sum
left = 0
right = array.length - 1

while (left < right):
    sum = array[left] + array[right]
    if (sum == target):
        return true
    else if (sum < target):
        left += 1
    else:
        right -= 1

return false
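
For reference, here is a runnable Python version of the same idea (the function name has_pair_with_sum is our own, chosen for this sketch):

def has_pair_with_sum(array, target):
    # Return True if some pair in the sorted array sums to target.
    left, right = 0, len(array) - 1
    while left < right:
        current = array[left] + array[right]
        if current == target:
            return True
        elif current < target:
            left += 1      # need a larger sum: advance the left pointer
        else:
            right -= 1     # need a smaller sum: pull back the right pointer
    return False

print(has_pair_with_sum([1, 2, 3, 4, 6, 8, 11], 10))  # True (2 + 8)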

Dry Run with Sample Input

Input:

array = [1, 2, 3, 4, 6, 8, 11]
target = 10
left   right   array[left]   array[right]   sum   Action
0      6       1             11             12    Sum > 10 → Decrease right
0      5       1             8              9     Sum < 10 → Increase left
1      5       2             8              10    Sum = 10 → Found!

Result: True (Pair found: 2 and 8)

Step-by-Step Explanation

  1. Start with left = 0 and right = 6 (pointing to 1 and 11).
  2. Sum = 1 + 11 = 12. Too big → move the right pointer left (now pointing to 8).
  3. Sum = 1 + 8 = 9. Too small → move the left pointer right (now pointing to 2).
  4. Sum = 2 + 8 = 10 → Target found!

Time and Space Complexity

  • Time Complexity: O(n) — Each pointer moves at most n times.
  • Space Complexity: O(1) — Only a few variables used, no extra data structures.

Why Does This Work Only on Sorted Arrays?

Two pointers rely on the ability to determine whether to increase or decrease the sum by moving a pointer. That only works when the array is sorted — otherwise, increasing or decreasing a pointer won't have predictable effects on the sum.

The two pointers technique is a simple yet powerful way to avoid nested loops and get a linear time solution. It takes advantage of the sorted property to move intelligently and find the answer efficiently.

Example 2: Reverse an Array In-Place

Problem: Given an array, reverse its elements in-place (without using any extra array).

Why Two Pointers?

The Two Pointers Technique is perfect for this problem because we need to swap elements from both ends of the array until we meet in the middle. We don't need to scan the array multiple times — just once, with two pointers approaching each other from opposite sides.

  • left pointer starts at the beginning (index 0).
  • right pointer starts at the end (last index).
  • We swap the elements at these two positions.
  • Then move the pointers towards each other: left += 1, right -= 1.
  • Repeat until left >= right.

Pseudocode

left = 0
right = array.length - 1

while (left < right):
    swap(array[left], array[right])
    left += 1
    right -= 1
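
A runnable Python version of this sketch might look like the following (the function name reverse_in_place is our own):

def reverse_in_place(array):
    # Reverse the array in-place by swapping elements from both ends.
    left, right = 0, len(array) - 1
    while left < right:
        array[left], array[right] = array[right], array[left]
        left += 1
        right -= 1
    return array

print(reverse_in_place([10, 20, 30, 40, 50]))  # [50, 40, 30, 20, 10]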

Sample Input

array = [10, 20, 30, 40, 50]

Dry Run (Step-by-Step)

Step             Left Pointer   Right Pointer   Swap                   Array State (after step)
Initial          0 (10)         4 (50)          10 <--> 50             [50, 20, 30, 40, 10]
After 1st swap   1 (20)         3 (40)          20 <--> 40             [50, 40, 30, 20, 10]
After 2nd swap   2 (30)         2 (30)          Stop (left == right)   [50, 40, 30, 20, 10]

Final Output

[50, 40, 30, 20, 10]

Time and Space Complexity

  • Time Complexity: O(n), where n is the number of elements in the array. Each element is visited once.
  • Space Complexity: O(1), since we reverse the array in-place using constant extra space.

This is a classic example where the Two Pointers Technique simplifies the problem:

  • No need to create a new array.
  • No nested loops — just a single loop with two moving pointers.
  • Efficient and clean solution for beginners to understand array manipulation.

Example 3: Remove Duplicates from Sorted Array

Problem: Given a sorted array, remove the duplicates in-place so that each element appears only once and return the new length of the array. The relative order of the elements should be kept the same, and you should not use extra space (O(1) space).

Why Two Pointers?

This is a classic case for the Two Pointers Technique. The idea is simple but very efficient:

  • Use one pointer (i) to keep track of the last unique element (also called the "slow" pointer).
  • Use the second pointer (j) to scan the array for the next unique element (also called the "fast" pointer).

Since the array is already sorted, duplicates will always be adjacent. So, we can move the fast pointer and compare each element with the last unique value.

When we find a new unique element, we increment the slow pointer and overwrite the duplicate position with the new unique value.

Pseudocode

if array is empty:
    return 0

i = 0  // slow pointer

for j from 1 to array.length - 1:
    if array[j] != array[i]:
        i += 1
        array[i] = array[j]

return i + 1
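
In runnable Python, the same approach could be written roughly as follows (the function name remove_duplicates is our own):

def remove_duplicates(array):
    # Remove duplicates from a sorted array in-place; return the new length.
    if not array:
        return 0
    i = 0                              # slow pointer: last unique element written
    for j in range(1, len(array)):     # fast pointer: scans ahead
        if array[j] != array[i]:
            i += 1
            array[i] = array[j]
    return i + 1

nums = [10, 10, 20, 20, 30, 40, 40]
k = remove_duplicates(nums)
print(k, nums[:k])  # 4 [10, 20, 30, 40]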

Step-by-Step Dry Run

Let's walk through an example:

Input: [10, 10, 20, 20, 30, 40, 40]

j (fast)   i (slow)   array[j]   array[i]   Action                            Array State
1          0          10         10         Duplicate → skip                  [10, 10, 20, 20, 30, 40, 40]
2          0          20         10         New value → i++ → array[1] = 20   [10, 20, 20, 20, 30, 40, 40]
3          1          20         20         Duplicate → skip                  [10, 20, 20, 20, 30, 40, 40]
4          1          30         20         New value → i++ → array[2] = 30   [10, 20, 30, 20, 30, 40, 40]
5          2          40         30         New value → i++ → array[3] = 40   [10, 20, 30, 40, 30, 40, 40]
6          3          40         40         Duplicate → skip                  [10, 20, 30, 40, 30, 40, 40]

Final value of i: 3

New length: i + 1 = 4

Modified Array: [10, 20, 30, 40] (first 4 elements)

Explanation

  • i always points to the last unique value found.
  • j explores the array to find the next non-duplicate value.
  • When a new unique value is found, we increment i and copy array[j] to array[i].

Time and Space Complexity

  • Time Complexity: O(n) — one pass through the array.
  • Space Complexity: O(1) — in-place update, no extra storage used.

Why Two Pointers Works Here

By separating the roles of the two pointers:

  • The slow pointer tracks the position of the next unique value to be placed.
  • The fast pointer scans for upcoming unique values.

This method ensures no duplicate is placed in the first part of the array, and the array is modified in-place with minimal code.

Advantages of Two Pointers

  • Efficient: Reduces time complexity from O(n²) to O(n) for many problems.
  • In-Place: Often allows modifying arrays without extra space.
  • Intuitive: Easy to implement once the pattern is understood.

Limitations

  • Works best on sorted arrays or data structures where order can be exploited.
  • Requires careful index management to avoid bugs.
  • Not suitable for problems where the answer depends on arbitrary, non-adjacent combinations of elements, since moving a pointer must give reliable information about which direction to search.

Conclusion

The Two Pointers Technique is a simple yet powerful strategy in DSA. It offers elegant solutions to a variety of problems with linear time complexity. Once you learn to spot scenarios where two pointers can be applied, you'll find your problem-solving speed and accuracy improving significantly.