Definition of Time Complexity
Time complexity is a measure used in computer science to describe how the running time of an algorithm grows as a function of the size of its input. It lets us compare the efficiency of different algorithms, especially as the input size grows.
Time complexity is typically expressed using Big O notation, such as $O(1)$, $O(n)$, $O(\log n)$, or $O(n^2)$, where $n$ is the size of the input.
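To make these growth rates concrete, the following sketch tabulates how many "steps" each class implies for a few input sizes. The constant factors are arbitrary; only the growth rates matter.

```python
import math

# Illustrative step counts for common complexity classes.
# O(1) is constant, O(log n) grows very slowly, O(n) linearly,
# and O(n^2) quadratically.
for n in [10, 100, 1000]:
    print(f"n={n:>4}  O(1)=1  O(log n)~{math.ceil(math.log2(n))}  "
          f"O(n)={n}  O(n^2)={n**2}")
```

Even at $n = 1000$, an $O(\log n)$ algorithm needs only about 10 steps, while an $O(n^2)$ one needs a million.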
Worked Example: Linear Search
Suppose we have an array of $n$ elements and we want to find if a value $x$ exists in the array using linear search.
Algorithm Steps:
- Start at the first element.
- Compare each element to $x$.
- Stop if $x$ is found or the end of the array is reached.
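The steps above can be sketched in Python (the function name and the convention of returning $-1$ when $x$ is absent are our choices for illustration):

```python
def linear_search(arr, x):
    """Return the index of x in arr, or -1 if x is not present."""
    for i, element in enumerate(arr):  # start at the first element
        if element == x:               # compare each element to x
            return i                   # stop if x is found
    return -1                          # end of the array reached


print(linear_search([4, 2, 7, 5], 7))   # found at index 2
print(linear_search([4, 2, 7, 5], 9))   # not present: -1
```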
Step-by-Step Analysis:
- In the worst case, $x$ is not in the array, so we check all $n$ elements.
- Each comparison is a constant-time operation.
The total number of operations in the worst case is $n$.
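This worst-case count can be checked with a small instrumented version of the search (the helper name is ours, added for illustration):

```python
def linear_search_counted(arr, x):
    """Linear search that also reports how many comparisons were made."""
    comparisons = 0
    for element in arr:
        comparisons += 1
        if element == x:
            return True, comparisons
    return False, comparisons


# Worst case: x is absent, so all n elements are compared.
found, ops = linear_search_counted(list(range(5)), -1)
print(found, ops)  # False 5  -- exactly n = 5 comparisons
```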
Time Complexity Calculation:
$$ T(n) = n \cdot c $$
where $c$ is the time for one comparison.
Using Big O notation:
$$ T(n) = O(n) $$