Big O Notation#
Big O notation describes how algorithm performance scales with input size. It focuses on growth rate rather than exact execution times, ignoring constants and lower-order terms to reveal fundamental scalability characteristics.
For example, an algorithm taking 3n² + 5n + 2 operations is classified as O(n²) because the n² term dominates as n grows large: at n = 1,000, the 3n² term contributes 3,000,000 operations while 5n + 2 contributes only 5,002.
Core Concepts#
graph TD
A[Big O Analysis] --> B[Growth Rate Focus]
A --> C[Algorithm Comparison]
A --> D[Scalability Prediction]
B --> E[Asymptotic Behavior]
B --> F[Ignore Constants]
C --> G[Time Complexity]
C --> H[Space Complexity]
style A fill:#e3f2fd
style B fill:#c8e6c9
style D fill:#fff3e0
Why Big O Matters:
- Performance Prediction: Predict behavior with large datasets before implementation
- Algorithm Comparison: Standardized framework for comparing approaches
- Scalability Analysis: Determine if solutions will work as data scales
- Resource Planning: Estimate computational and memory requirements
Common Complexity Classes#
- O(1) - Constant Time: Performance remains constant regardless of input size.
- O(log n) - Logarithmic Time: Performance grows very slowly; doubling the input size adds only a constant amount of extra work.
- O(n) - Linear Time: Performance scales directly with input size.
- O(n log n) - Linearithmic Time: Common in efficient sorting algorithms.
- O(n²) - Quadratic Time: Performance grows with the square of input size.
- O(2ⁿ), O(n!) - Exponential/Factorial Time: Performance explodes with input size.
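To make these classes concrete, here is a minimal Java sketch with illustrative method names (not from any particular library); the loop structure of each method determines its class:
static int firstElement(int[] a) { // O(1): one operation, independent of length
    return a[0];
}

static long sum(int[] a) { // O(n): a single pass over the array
    long total = 0;
    for (int x : a) total += x;
    return total;
}

static int countPairsWithSum(int[] a, int target) { // O(n²): nested loops over all pairs
    int count = 0;
    for (int i = 0; i < a.length; i++)
        for (int j = i + 1; j < a.length; j++)
            if (a[i] + a[j] == target) count++;
    return count;
}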
Performance Comparison#
graph TD
A["Input Size: 1,000"] --> B1["O(1): 1 unit of work"]
A --> B2["O(n): 1,000 units of work"]
A --> B3["O(n²): 1,000,000 units of work"]
style B1 fill:#c8e6c9
style B2 fill:#fff3e0
style B3 fill:#ffcdd2
graph TD
C["Input Size: 1,000,000"] --> D1["O(1): 1 unit of work"]
C --> D2["O(n): 1,000,000 units of work"]
C --> D3["O(n²): 1,000,000,000,000 units of work"]
style D1 fill:#c8e6c9
style D2 fill:#fff3e0
style D3 fill:#ffcdd2
Algorithm Examples by Complexity#
graph TD
A[Algorithm Complexity] --> B["O(1) - Constant"]
A --> C["O(log n) - Logarithmic"]
A --> D["O(n) - Linear"]
A --> E["O(n²) - Quadratic"]
B --> F["Array Access<br/>Hash Table Lookup"]
C --> G["Binary Search<br/>Tree Operations"]
D --> H["Linear Search<br/>Array Traversal"]
E --> I["Bubble Sort<br/>Nested Loops"]
style B fill:#c8e6c9
style C fill:#e3f2fd
style D fill:#fff3e0
style E fill:#ffcdd2
Time vs Space Complexity#
Time complexity refers to how the execution time of an algorithm scales as the size of its input increases, while space complexity describes how the memory usage grows with input size. Often, there are trade-offs between the two: an algorithm may use more memory (space) to achieve faster execution (reducing time complexity), or it may use less memory at the expense of longer execution times.
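A common illustration of this trade-off, sketched here in Java with illustrative method names, is memoizing a recursive computation: caching intermediate results costs O(n) extra memory but reduces an exponential running time to linear.
import java.util.HashMap;
import java.util.Map;

static long fibNaive(int n) { // exponential time: the call tree roughly doubles as n grows
    if (n < 2) return n;
    return fibNaive(n - 1) + fibNaive(n - 2);
}

static long fibMemo(int n, Map<Integer, Long> cache) { // O(n) time, O(n) extra space
    if (n < 2) return n;
    Long cached = cache.get(n);
    if (cached != null) return cached; // reuse a previously computed value
    long result = fibMemo(n - 1, cache) + fibMemo(n - 2, cache);
    cache.put(n, result);
    return result;
}
Calling fibMemo(90, new HashMap<>()) returns almost instantly, while fibNaive(90) would require an astronomically large number of recursive calls.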
Practical Examples#
O(1) - Constant Time Access: Array element access takes the same time regardless of array size.
Array access is O(1) because the memory address is calculated directly: base_address + (index × element_size).
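A minimal Java sketch (the method name is illustrative):
static int elementAt(int[] arr, int index) { // O(1): pure address arithmetic, no iteration
    return arr[index]; // same cost whether arr holds 10 or 10 million elements
}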
O(n) - Linear Time: Finding the maximum element requires examining every element once.
Linear time complexity: examine each element once. Double the array size, double the execution time.
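A minimal Java sketch (the method name is illustrative):
static int findMax(int[] arr) { // O(n): every element is examined exactly once
    int max = arr[0];
    for (int i = 1; i < arr.length; i++) {
        if (arr[i] > max) max = arr[i];
    }
    return max;
}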
O(log n) - Logarithmic Time: Binary search eliminates half the search space in each step.
public static int binarySearch(int[] arr, int target) {
int left = 0, right = arr.length - 1;
while (left <= right) { // O(log n)
int mid = left + (right - left) / 2; // Prevent overflow
if (arr[mid] == target) return mid;
if (arr[mid] < target) left = mid + 1;
else right = mid - 1;
}
return -1; // Not found
}
function binarySearch(arr: number[], target: number): number {
let left = 0, right = arr.length - 1;
while (left <= right) { // O(log n)
const mid = Math.floor(left + (right - left) / 2);
if (arr[mid] === target) return mid;
if (arr[mid] < target) left = mid + 1;
else right = mid - 1;
}
return -1; // Not found
}
For 1,000,000 elements, binary search takes at most 20 comparisons (log₂(1,000,000) ≈ 20).
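A quick usage sketch of the Java method above (the sorted values are arbitrary, chosen only for illustration):
int[] sorted = new int[1_000_000];
for (int i = 0; i < sorted.length; i++) sorted[i] = i * 2; // ascending even numbers
int index = binarySearch(sorted, 999_998); // returns 499_999 after at most ~20 halvings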
Practical Guidelines#
- Avoid O(n²) algorithms for large datasets (n > 1000)
- Use O(log n) when possible - binary search on sorted data
- Trade space for time - hash tables for O(1) lookups (see the sketch after this list)
- Measure real performance - Big O ignores constants that matter for small inputs
- Consider average vs worst case - a linear search may find the target on the first comparison or only after examining every element
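As a sketch of the "trade space for time" and "avoid O(n²)" guidelines (method names are illustrative): detecting a duplicate with nested loops is quadratic, while a HashSet brings it down to linear time at the cost of O(n) extra memory.
import java.util.HashSet;
import java.util.Set;

static boolean hasDuplicateQuadratic(int[] a) { // O(n²): compares every pair
    for (int i = 0; i < a.length; i++)
        for (int j = i + 1; j < a.length; j++)
            if (a[i] == a[j]) return true;
    return false;
}

static boolean hasDuplicateLinear(int[] a) { // O(n) time, O(n) space: average O(1) set lookups
    Set<Integer> seen = new HashSet<>();
    for (int x : a) {
        if (!seen.add(x)) return true; // add() returns false if the value was already present
    }
    return false;
}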
Use Big O to compare approaches and predict scalability, but always measure actual performance for your specific use case.