Time Complexity of Array Operations


When working with arrays, it is essential to understand the time complexity of common operations. This helps in writing efficient code, especially when dealing with large datasets. Time complexity represents how the runtime of an operation grows as the size of the array (N) increases.

Summary Table

Operation               | Best Case | Average Case | Worst Case
------------------------|-----------|--------------|-----------
Insertion (by index)    | O(1)      | O(N)         | O(N)
Deletion (by index)     | O(1)      | O(N)         | O(N)
Traversal               | O(N)      | O(N)         | O(N)
Update (by index)       | O(1)      | O(1)         | O(1)
Search (linear)         | O(1)      | O(N)         | O(N)
Search (binary, sorted) | O(1)      | O(log N)     | O(log N)

1. Insertion

Insertion adds a new element to the array. The time complexity depends on where the element is inserted: appending at the end of a dynamic array takes O(1) amortized time, while inserting at the beginning or middle requires shifting every subsequent element one position to the right, which takes O(N).
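The contrast can be sketched with a Python list, which is backed by a dynamic array (the values here are illustrative only):

```python
arr = [10, 20, 30, 40]

# Appending at the end is O(1) amortized: no existing element moves.
arr.append(50)       # arr is now [10, 20, 30, 40, 50]

# Inserting at index 0 is O(N): every existing element shifts right.
arr.insert(0, 5)     # arr is now [5, 10, 20, 30, 40, 50]
```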

2. Deletion

Deleting an element removes it and shifts every subsequent element one position to the left to fill the gap. Deleting the last element requires no shifting and takes O(1); deleting from the beginning or middle takes O(N). This holds for both static and dynamic arrays.
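A minimal sketch of both cases, again using a Python list as the dynamic array (sample values are assumptions):

```python
arr = [5, 10, 20, 30, 40, 50]

# Removing the last element is O(1): nothing needs to shift.
arr.pop()            # removes 50

# Removing at index 0 is O(N): all remaining elements shift left.
arr.pop(0)           # removes 5; arr is now [10, 20, 30, 40]
```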

3. Traversal

Traversal means visiting every element of the array, often using a loop. Regardless of data values or order, the number of steps required is equal to the number of elements.
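For example, summing an array visits each element exactly once, so the work grows linearly with N (the sample array is an assumption):

```python
arr = [10, 20, 30, 40]

total = 0
for value in arr:    # exactly N iterations, regardless of the values -> O(N)
    total += value
```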

4. Update

Updating an element means replacing the value at a known index. Since arrays support direct indexing, this operation is done in constant time.
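A one-line sketch: assigning to a known index touches a single slot, so the cost does not depend on N (sample values are assumptions):

```python
arr = [10, 20, 30, 40]

arr[2] = 99          # direct indexing: O(1), independent of array size
```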

5. Search (Linear)

Linear search scans each element one-by-one from left to right until the target is found or the array ends.
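A minimal implementation of this scan (the function name and return convention of -1 for "not found" are assumptions, not part of the original text):

```python
def linear_search(arr, target):
    """Return the index of target in arr, or -1 if absent.

    Best case O(1) (target at index 0), worst case O(N).
    """
    for i, value in enumerate(arr):
        if value == target:
            return i
    return -1
```

For example, `linear_search([10, 20, 30], 20)` returns 1, while a missing target falls through the loop and returns -1.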

6. Search (Binary, Sorted Array)

Binary search only works on sorted arrays. It repeatedly halves the search space by comparing the middle element, achieving logarithmic performance.

Note: Binary search requires the array to be sorted beforehand. If sorting is needed, that cost (usually O(N log N)) must be considered separately.
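The halving strategy above can be sketched as an iterative function (the name and -1 return convention are assumptions; the input must already be sorted, per the note):

```python
def binary_search(arr, target):
    """Return the index of target in a sorted arr, or -1 if absent.

    Each iteration halves the search space -> O(log N) worst case.
    """
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            lo = mid + 1     # target can only lie in the right half
        else:
            hi = mid - 1     # target can only lie in the left half
    return -1
```

For example, `binary_search([1, 3, 5, 7, 9], 7)` inspects the middle element twice before landing on index 3, versus up to five comparisons for a linear scan.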