
The big O notation

(C) Gustaf Alhäll, released under CC BY 4.0

Original document can be found at hanicef.me

License can be found at creativecommons.org

Definition

Time complexity is a measure of the time an algorithm takes to run on a computer. Its definition is:

f(x) ∈ O(g(x))

where f is the function and g is the complexity of f. The O here is called the big O notation, which describes a function's asymptotic behavior: roughly, how fast the function's computation time grows or declines. It represents how f grows in time as x gets larger, which becomes clearer when plotted on a graph.

To further explain what this means, here's an example:

a + b ∈ O(1)

The above represents adding two values together, which has a complexity of O(1). Since addition always requires the same number of steps, the complexity of the function is 1. This is apparent when plotting the time of evaluating a + b on a graph, as it shows a flat, horizontal line.
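As a minimal sketch in C (my own illustration, not from the original document), a constant-time operation looks like this:

/* Adding two integers takes the same number of steps no matter
   how large the operands are, so this function runs in O(1). */
int add(int a, int b) {
    return a + b;
}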

Another example, which is significantly more complex, is:

 n
 Σ i ∈ O(n)
i=1

This calculates 1 + 2 + 3 + ... + n, where n is the number of values to add together. This function has a complexity of O(n): its running time increases linearly as n increases. In other words, if n is doubled, the function takes twice as long to complete.
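As a rough C sketch (the function name sum_to is my own, hypothetical choice), the sum can be computed with a loop whose body runs exactly n times:

/* Computes 1 + 2 + ... + n. The loop body executes n times,
   so the running time grows linearly with n: O(n). */
unsigned long sum_to(unsigned long n) {
    unsigned long sum = 0;
    for (unsigned long i = 1; i <= n; i++)
        sum += i;
    return sum;
}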

Time complexity is important as it allows us to reason about performance without having to worry about technical details, such as programming language and hardware.

Best-case and worst-case

Algorithms are not always consistent when it comes to time complexity. Because of this, time complexities are often given as a best case and a worst case. The best-case time complexity is the lowest possible complexity the algorithm can achieve, and the worst-case is the highest possible complexity.

To illustrate, we can use the insertion sort algorithm:

void insertion_sort(int a[], size_t n) {
    size_t i, j;
    int t;
    for (i = 0; i < n; i++) {
        /* Swap a[i] backwards until every value before it is in order. */
        for (j = i; j > 0 && a[j-1] > a[j]; j--) {
            t = a[j];
            a[j] = a[j-1];
            a[j-1] = t;
        }
    }
}

To explain how this works, imagine a set of books in a random order. Now, take out the second book from the left and insert it so the first two books are sorted alphabetically. Then, take the third book from the left and place it so the first three books are sorted alphabetically. Repeat this process until all books have been placed in alphabetical order. This is effectively what the algorithm is doing: taking each value from left to right and placing it so that all preceding values are in order.

This gives the algorithm a best-case complexity of O(n) and a worst-case complexity of O(n²). In the best-case scenario, the list is already in order, so none of the values in the array have to be moved. In the worst-case scenario, the list is also sorted, but in reverse order. This means every value needs to be moved all the way to the opposite side to be ordered correctly, which causes the inner loop to cycle through the maximum number of times before the algorithm completes.
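To see both cases side by side, here is a small hypothetical driver (my own addition, assuming the insertion_sort function above is in scope):

#include <stdio.h>

void insertion_sort(int a[], size_t n); /* defined above */

int main(void) {
    int best[]  = {1, 2, 3, 4, 5}; /* already sorted: inner loop never swaps, O(n) */
    int worst[] = {5, 4, 3, 2, 1}; /* reverse sorted: maximum swaps, O(n²) */

    insertion_sort(best, 5);
    insertion_sort(worst, 5);

    for (size_t i = 0; i < 5; i++)
        printf("%d ", worst[i]); /* prints: 1 2 3 4 5 */
    printf("\n");
    return 0;
}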