Time Complexity
Time complexity measures how many steps an algorithm takes as a function of input size.
Simple Analogy
Time complexity is like:
- Recipe cooking time: As ingredients increase, how does time change?
- Travel time: How long to visit n cities?
Formal Definition
TM M has time complexity T(n) if:
- On every input of length n
- M halts within T(n) steps
Running time: Maximum steps over all inputs of size n
Measuring Time on TM
Single-Tape TM
One step = one transition
Count: Every time δ is applied
Example:
q₀abc ⊢ Xq₁bc (1 step)
⊢ XYq₂c (2 steps total)
⊢ XYZq_accept (3 steps total)
Time complexity: T(3) = 3
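The counting convention can be made concrete with a minimal single-tape TM simulator (a sketch; the transition table is a hypothetical machine matching the trace above, not a standard construction):

```python
def run_tm(tape, delta, start, accept):
    """Simulate a single-tape TM, counting one step per application of delta."""
    tape = list(tape) + ["B"]          # blank-padded tape
    state, head, steps = start, 0, 0
    while state != accept:
        state, write, move = delta[(state, tape[head])]
        tape[head] = write
        head += 1 if move == "R" else -1
        steps += 1
    return steps

# Hypothetical machine for the trace: q0 abc |- X q1 bc |- XY q2 c |- XYZ q_accept
delta = {
    ("q0", "a"): ("q1", "X", "R"),
    ("q1", "b"): ("q2", "Y", "R"),
    ("q2", "c"): ("q_acc", "Z", "R"),
}
print(run_tm("abc", delta, "q0", "q_acc"))  # 3, so T(3) = 3
```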
Multi-Tape TM
One step = one parallel move of all heads
Advantage: Can be faster!
Example: Palindrome check
- Single-tape: O(n²) (shuttling)
- Two-tape: O(n) (copy then compare)
Model Independence
Polynomial equivalence: Different TM models differ by polynomial factor
Example: k-tape TM with time T(n) → single-tape TM with time O(T(n)²)
Consequence: P is model-independent!
Big-O Notation
Definition
f(n) = O(g(n)) if there exist constants c, n₀ such that:
f(n) ≤ c·g(n) for all n ≥ n₀
Meaning: f grows no faster than g (up to constant factor)
Examples
3n² + 5n + 10 = O(n²)
- Choose c = 4, n₀ = 10
- For n ≥ 10: 3n² + 5n + 10 ≤ 4n²
5n log n = O(n log n)
- Choose c = 5, n₀ = 1
2ⁿ ≠ O(n¹⁰⁰)
- Exponential not polynomial!
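These claims are easy to spot-check numerically (a finite-range sanity check, not a proof):

```python
# Verify the c = 4, n0 = 10 witness: 3n^2 + 5n + 10 <= 4n^2 for n >= 10
assert all(3*n*n + 5*n + 10 <= 4*n*n for n in range(10, 1000))

# 2^n is not O(n^100): at n = 10000, 2^n ~ 10^3010 dwarfs n^100 = 10^400
print(2**10_000 > 10_000**100)  # True
```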
Rules
Addition: O(f) + O(g) = O(max(f, g))
- O(n²) + O(n) = O(n²)
Multiplication: O(f) × O(g) = O(f·g)
- O(n) × O(log n) = O(n log n)
Transitivity: If f = O(g) and g = O(h), then f = O(h)
- Big-O bounds chain together
Time Complexity Examples
Example 1: Linear Search
Problem: Find element in array of n elements
Algorithm:
1. For i = 1 to n:
2. If array[i] = target: return i
3. Return "not found"
Time:
- Best case: O(1) (first element)
- Worst case: O(n) (not found or last)
- Average case: ~n/2 comparisons = O(n)
Time complexity: T(n) = O(n)
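The pseudocode above translates directly (a sketch using 0-based indexing):

```python
def linear_search(arr, target):
    """Scan left to right: O(n) worst-case time, O(1) extra space."""
    for i, x in enumerate(arr):
        if x == target:
            return i          # best case: target at index 0, O(1)
    return None               # worst case: n comparisons, target absent

print(linear_search([7, 3, 9, 1], 9))   # 2
print(linear_search([7, 3, 9, 1], 5))   # None
```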
Example 2: Binary Search
Problem: Find element in sorted array of n elements
Algorithm:
1. low = 1, high = n
2. While low ≤ high:
3. mid = ⌊(low + high) / 2⌋
4. If array[mid] = target: return mid
5. If array[mid] < target: low = mid + 1
6. Else: high = mid - 1
7. Return "not found"
Time:
- Each iteration: O(1)
- Number of iterations: log n (halving each time)
Time complexity: T(n) = O(log n)
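A sketch of the same loop in Python (0-based indices, integer midpoint):

```python
def binary_search(arr, target):
    """Halve the search interval each iteration: O(log n) time on a sorted array."""
    low, high = 0, len(arr) - 1
    while low <= high:
        mid = (low + high) // 2   # floor of the midpoint
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            low = mid + 1         # discard the left half
        else:
            high = mid - 1        # discard the right half
    return None

print(binary_search([1, 3, 5, 7, 9, 11], 7))  # 3
```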
Example 3: Bubble Sort
Problem: Sort n elements
Algorithm:
1. For i = 1 to n:
2. For j = 1 to n-i:
3. If array[j] > array[j+1]:
4. Swap array[j] and array[j+1]
Time:
- Outer loop: n iterations
- Inner loop: n − i comparisons (about n/2 on average)
- Total: ≈ n²/2 = O(n²)
Time complexity: T(n) = O(n²)
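The nested loops above, sketched in Python (working on a copy so the input is untouched):

```python
def bubble_sort(arr):
    """Pass i does n-1-i comparisons; total ~ n^2/2 = O(n^2) time."""
    a = list(arr)                 # copy; swaps below are O(1) extra space
    n = len(a)
    for i in range(n):
        for j in range(n - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]  # swap adjacent pair in place
    return a

print(bubble_sort([5, 2, 8, 1]))  # [1, 2, 5, 8]
```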
Example 4: Merge Sort
Problem: Sort n elements
Algorithm (divide and conquer):
1. If n = 1: return
2. Split array into two halves
3. Recursively sort each half
4. Merge sorted halves
Time:
- Split: O(1)
- Merge: O(n)
- Recurrence: T(n) = 2T(n/2) + O(n)
- Solution: T(n) = O(n log n)
Time complexity: T(n) = O(n log n)
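The divide-and-conquer steps above can be sketched as follows; the auxiliary list built during the merge is the O(n) work space discussed later:

```python
def merge_sort(arr):
    """T(n) = 2T(n/2) + O(n) for the merge, solving to O(n log n) time."""
    if len(arr) <= 1:
        return list(arr)          # base case: n = 1
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])  # recursively sort each half
    right = merge_sort(arr[mid:])
    merged, i, j = [], 0, 0       # O(n) auxiliary array for the merge
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([5, 2, 8, 1, 9]))  # [1, 2, 5, 8, 9]
```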
Example 5: Matrix Multiplication
Problem: Multiply two n×n matrices
Algorithm (standard):
1. For i = 1 to n:
2. For j = 1 to n:
3. For k = 1 to n:
4. C[i,j] += A[i,k] * B[k,j]
Time: Three nested loops, each n iterations
Time complexity: T(n) = O(n³)
Note: Strassen’s algorithm achieves O(n^2.807)
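The standard triple loop, sketched in Python (Strassen's recursive scheme is not shown):

```python
def mat_mul(A, B):
    """Standard n x n product: three nested loops of n iterations each = O(n^3)."""
    n = len(A)
    C = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            for k in range(n):
                C[i][j] += A[i][k] * B[k][j]   # inner product of row i and column j
    return C

identity = [[1, 0], [0, 1]]
M = [[2, 3], [4, 5]]
print(mat_mul(identity, M))  # [[2, 3], [4, 5]]
```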
Example 6: All Subsets
Problem: Generate all subsets of n elements
Algorithm:
Recursively generate:
- Subsets without element i
- Subsets with element i
Time: 2ⁿ subsets to generate
Time complexity: T(n) = O(2ⁿ)
Exponential!
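The with/without recursion above can be sketched directly; the output size alone forces exponential time:

```python
def subsets(items):
    """Each element is either in or out: 2^n subsets, so O(2^n) just to list them."""
    if not items:
        return [[]]               # the empty set has one subset
    rest = subsets(items[1:])     # subsets without items[0]
    # plus the same subsets with items[0] added
    return rest + [[items[0]] + s for s in rest]

result = subsets([1, 2, 3])
print(len(result))  # 8 = 2^3
```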
Space Complexity
Space complexity measures how much memory an algorithm uses as a function of input size.
Simple Analogy
Space complexity is like:
- Desk space: How much workspace needed?
- Storage: How many boxes to store data?
Formal Definition
TM M has space complexity S(n) if:
- On every input of length n
- M uses at most S(n) tape cells
Space: Maximum cells used over all inputs of size n
Measuring Space on TM
Single-Tape TM
Count cells: From leftmost to rightmost visited
Example:
Tape: [a][b][c][X][Y][B][B][B]...
Leftmost visited cell: [a]; rightmost visited cell: [Y]
Space used: 5 cells
Input vs Work Space
Total space: Input + work space
Work space: Additional space beyond input
Often measure: Work space only
Multi-Tape TM
Count: Sum of cells used on all tapes
Or: Maximum cells on any tape
Convention: Depends on model
Big-O for Space
Same notation: S(n) = O(f(n))
Meaning: Space grows no faster than f(n)
Example: S(n) = 3n + 10 = O(n)
Space Complexity Examples
Example 1: In-Place Sort
Problem: Sort array of n elements
Algorithm: Bubble sort (swap in place)
Space:
- Input: n elements
- Work space: O(1) (few variables)
Space complexity: S(n) = O(1) work space
Example 2: Merge Sort
Problem: Sort array of n elements
Algorithm: Merge sort with auxiliary array
Space:
- Input: n elements
- Work space: O(n) (merge array)
Space complexity: S(n) = O(n) work space
Example 3: Recursive Fibonacci
Problem: Compute nth Fibonacci number
Algorithm:
fib(n):
  if n ≤ 1: return n
  return fib(n-1) + fib(n-2)
Space:
- Maximum recursion depth: n
- Stack space: O(n)
Space complexity: S(n) = O(n)
Example 4: Iterative Fibonacci
Problem: Compute nth Fibonacci number
Algorithm:
fib(n):
  a = 0, b = 1
  for i = 2 to n:
    c = a + b
    a = b, b = c
  return b
Space: Few variables (a, b, c)
Space complexity: S(n) = O(1)
Much better!
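Both versions above, sketched in Python so the space difference is visible: the recursive form keeps an O(n)-deep chain of stack frames alive, while the iterative form carries only a rolling pair of variables:

```python
def fib_recursive(n):
    """O(n) stack depth (the fib(n-1) chain); O(2^n) time without memoization."""
    if n <= 1:
        return n
    return fib_recursive(n - 1) + fib_recursive(n - 2)

def fib_iterative(n):
    """Same values with O(1) space: only the pair (a, b) is stored."""
    if n <= 1:
        return n
    a, b = 0, 1
    for _ in range(n - 1):
        a, b = b, a + b
    return b

print([fib_iterative(n) for n in range(8)])  # [0, 1, 1, 2, 3, 5, 8, 13]
```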
Example 5: BFS vs DFS
Problem: Traverse graph with n vertices
BFS (Breadth-First Search):
- Queue: O(n) vertices
- Space: O(n)
DFS (Depth-First Search):
- Stack: O(h) where h = maximum recursion depth
- Space: O(h) = O(n) worst case
Both: O(n) in worst case
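A sketch of BFS with an explicit queue (the adjacency dict is a made-up example graph); the `visited` set and the queue are each bounded by n vertices, giving the O(n) space:

```python
from collections import deque

def bfs(adj, start):
    """Breadth-first order: queue and visited set each hold at most n vertices."""
    visited, queue, order = {start}, deque([start]), []
    while queue:
        v = queue.popleft()       # FIFO: explore level by level
        order.append(v)
        for w in adj[v]:
            if w not in visited:
                visited.add(w)
                queue.append(w)
    return order

adj = {0: [1, 2], 1: [3], 2: [3], 3: []}
print(bfs(adj, 0))  # [0, 1, 2, 3]
```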
Time vs Space Trade-offs
Memoization
Without memoization (Fibonacci):
- Time: O(2ⁿ)
- Space: O(n)
With memoization:
- Time: O(n)
- Space: O(n)
Trade: More space for less time
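The trade can be sketched with `functools.lru_cache`, which stores each of the n distinct subproblem answers once:

```python
from functools import lru_cache

@lru_cache(maxsize=None)          # O(n) cache entries buy O(n) total time
def fib_memo(n):
    """Each subproblem fib(0)..fib(n) is computed exactly once."""
    if n <= 1:
        return n
    return fib_memo(n - 1) + fib_memo(n - 2)

print(fib_memo(50))  # 12586269025 -- infeasible for the O(2^n) naive version
```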
Compression
Uncompressed:
- Time: O(1) access
- Space: Full storage
Compressed:
- Time: O(n) decompression
- Space: Less storage
Trade: More time for less space
Complexity Classes
TIME Classes
DTIME(f(n)): Decidable in deterministic time O(f(n))
NTIME(f(n)): Decidable in non-deterministic time O(f(n))
Examples:
- DTIME(n): Linear time
- DTIME(n²): Quadratic time
- DTIME(2ⁿ): Exponential time
SPACE Classes
DSPACE(f(n)): Decidable using deterministic space O(f(n))
NSPACE(f(n)): Decidable using non-deterministic space O(f(n))
Examples:
- DSPACE(log n): Logarithmic space (L)
- DSPACE(n): Linear space (context-sensitive)
- DSPACE(n²): Quadratic space
Important Theorems
Theorem 1: Space Reuse
Theorem: Space can be reused, time cannot.
Meaning:
- Can overwrite tape cells (reuse space)
- Can’t undo time steps
Consequence: Space often less than time
Theorem 2: Linear Speedup
Theorem: If a TM decides L in time f(n) with f(n) ≥ n, then for any ε > 0 there is an equivalent TM running in time ε·f(n) + O(n).
Meaning: Can speed up by constant factor (compress states)
Practical: Not usually important (constants hidden anyway)
Theorem 3: Tape Compression
Theorem: If a TM decides L in space f(n), then for any constant c ≥ 1 there is an equivalent TM using space ⌈f(n)/c⌉.
Meaning: Can compress tape by constant factor
Method: Use larger alphabet (k symbols per cell)
Theorem 4: Multi-Tape Simulation
Theorem: k-tape TM with time T(n) can be simulated by 1-tape TM in time O(T(n)²).
Proof idea:
- Mark k head positions on single tape
- One step of the k-tape machine costs O(T(n)) steps on the 1-tape machine (each head stays within T(n) cells)
- Total: T(n) steps × O(T(n)) each = O(T(n)²)
Space: Only O(T(n)) (space reused)
Theorem 5: Savitch’s Theorem
Theorem: NSPACE(f(n)) ⊆ DSPACE(f(n)²)
Meaning: Can simulate NTM using quadratic space
Proof idea: Recursively check reachability
Consequence: PSPACE = NPSPACE
Complexity Relationships
Time vs Space
DTIME(f(n)) ⊆ DSPACE(f(n))
Reason: Can’t use more space than time (at most one cell per step)
DSPACE(f(n)) ⊆ DTIME(2^O(f(n)))
Reason: At most 2^O(f(n)) configurations possible
Deterministic vs Non-Deterministic
DTIME(f(n)) ⊆ NTIME(f(n))
Obvious: DTM is special case of NTM
NTIME(f(n)) ⊆ DTIME(2^O(f(n)))
Reason: Exponential blowup in simulation
NSPACE(f(n)) ⊆ DSPACE(f(n)²) (Savitch)
Much better: Only quadratic!
Context-Sensitive Languages
CSL = DSPACE(n)
Definition: Languages decidable in linear space
Example: {aⁿbⁿcⁿ}
TM:
1. Mark one 'a', one 'b', one 'c'
2. Repeat using same cells (reuse space!)
3. Check all matched
Space: O(n) (just mark on input)
Not CFL: A PDA’s single stack cannot track two independent counts
But decidable in linear space: TM can!
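The marking idea can be sketched in Python, using a mutable copy of the input as the “tape” so all work fits in the input’s own n cells (reused on every pass):

```python
def is_anbncn(s):
    """Decide {a^n b^n c^n} by repeatedly marking one 'a', one 'b', one 'c'.
    The only storage is a list the size of the input: O(n) space, reused."""
    # reject if symbols are out of order (a-block, then b-block, then c-block)
    if s != "a" * s.count("a") + "b" * s.count("b") + "c" * s.count("c"):
        return False
    tape = list(s)
    while "a" in tape:                    # one marking pass per remaining 'a'
        for ch in "abc":
            if ch not in tape:
                return False              # ran out of 'b's or 'c's
            tape[tape.index(ch)] = "X"    # mark the leftmost unmarked ch
    return all(cell == "X" for cell in tape)   # leftover 'b'/'c' => counts differ

print(is_anbncn("aabbcc"), is_anbncn("aabbbcc"))  # True False
```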
Logarithmic Space (L)
L = DSPACE(log n)
Surprisingly powerful!
Examples:
- Path existence in undirected graphs (Reingold’s theorem; directed reachability is NL-complete and not known to be in L)
- Sorting (with output on separate tape)
- Arithmetic operations
Convention: Input tape read-only, work tape O(log n)
Why Logarithmic?
Can store:
- Counters up to n (log n bits)
- Pointers to input positions (log n bits)
- Small number of variables
Can’t store: Entire input (would need n space)
Polynomial Time (P)
P = ⋃ₖ DTIME(nᵏ)
Definition: Languages decidable in polynomial time
Examples:
- Sorting: O(n log n)
- Shortest path: O(n²)
- Primality testing: polynomial time (AKS algorithm)
- Linear programming: Polynomial
Philosophy: P = “efficiently computable”
Why Polynomial?
Robust: Independent of computational model (within polynomial)
Closed: Under composition, polynomial remains polynomial
Practical: Often corresponds to feasible algorithms
Exponential Time (EXPTIME)
EXPTIME = ⋃ₖ DTIME(2^(nᵏ))
Definition: Languages decidable in exponential time
Examples:
- Generalized chess (n×n board)
- Generalized Go
- Some verification problems
Known: P ⊊ EXPTIME (strict!)
Proof: Time hierarchy theorem
Common Time Complexities
| Class | Time | Example |
|---|---|---|
| O(1) | Constant | Array access |
| O(log n) | Logarithmic | Binary search |
| O(n) | Linear | Linear search |
| O(n log n) | Linearithmic | Merge sort |
| O(n²) | Quadratic | Bubble sort |
| O(n³) | Cubic | Matrix multiply |
| O(2ⁿ) | Exponential | Subsets |
| O(n!) | Factorial | Permutations |
Common Space Complexities
| Class | Space | Example |
|---|---|---|
| O(1) | Constant | Iterative Fibonacci |
| O(log n) | Logarithmic | Recursion on a balanced tree |
| O(n) | Linear | Recursive Fibonacci |
| O(n²) | Quadratic | 2D DP table |
Important Exam Points
- ✓ Time complexity: Number of steps as function of n
- ✓ Space complexity: Number of cells used as function of n
- ✓ Big-O: Upper bound f(n) = O(g(n))
- ✓ P: Polynomial time ⋃ₖ DTIME(nᵏ)
- ✓ L: Logarithmic space DSPACE(log n)
- ✓ CSL: Linear space DSPACE(n)
- ✓ Time vs space: DTIME(f) ⊆ DSPACE(f)
- ✓ Savitch: NSPACE(f) ⊆ DSPACE(f²)
- ✓ Multi-tape: Can be simulated in O(T²) time
- ✓ Trade-offs: More space can reduce time (memoization)
Common Mistakes to Avoid
✗ Mistake 1: Confusing time with space
✓ Correct: Time = steps, space = memory cells
✗ Mistake 2: Reporting the best case as the complexity
✓ Correct: Time complexity is worst-case by definition
✗ Mistake 3: Including input in space
✓ Correct: Often measure work space only
✗ Mistake 4: Ignoring recursion stack
✓ Correct: Stack depth counts toward space
✗ Mistake 5: Thinking O(n) always beats O(n²)
✓ Correct: For small n or large constants, not necessarily
✗ Mistake 6: Confusing L with DSPACE(1)
✓ Correct: L = DSPACE(log n), not constant
Practice Questions
Q1: What is time complexity?
Answer: Number of steps algorithm takes as function of input size.
Q2: What is space complexity?
Answer: Amount of memory used as function of input size.
Q3: Time complexity of binary search?
Answer: O(log n) - halve search space each step.
Q4: Space complexity of merge sort?
Answer: O(n) work space for merge array.
Q5: What is P?
Answer: Languages decidable in polynomial time.
Q6: What is L?
Answer: Languages decidable in logarithmic space.
Q7: Can space be less than time?
Answer: Yes! Space can be reused, time cannot.
Q8: What is Savitch’s theorem?
Answer: NSPACE(f(n)) ⊆ DSPACE(f(n)²).
Q9: Time for all subsets of n elements?
Answer: O(2ⁿ) - exponential (2ⁿ subsets).
Q10: CSL equals what space class?
Answer: DSPACE(n) - linear space.
Summary
- Time complexity: Steps as function of input size
- Space complexity: Memory as function of input size
- Big-O notation: Asymptotic upper bound
- Linear search: O(n) time
- Binary search: O(log n) time
- Sorting: O(n log n) best (comparison-based)
- P: Polynomial time (efficient)
- L: Logarithmic space
- CSL: Linear space DSPACE(n)
- Time vs space: DTIME(f) ⊆ DSPACE(f) ⊆ DTIME(2^O(f))
- Savitch: Non-deterministic space only quadratic blowup
- Trade-offs: Can often trade time for space or vice versa
Time and space complexity provide quantitative measures of algorithm efficiency!