Asymptotic notation
Big-Θ (Big-Theta) notation
Logarithms grow more slowly than polynomials. That is, Θ(log₂ n) grows more slowly than Θ(nᵃ) for any positive constant a. But since the value of log₂ n increases as n increases, Θ(log₂ n) grows faster than Θ(1).
The following graph compares the growth of 1, log₂ n, and n:
Here's a list of functions in asymptotic notation that we often encounter when analyzing algorithms, ordered from slowest to fastest growing:

Θ(1)
Θ(log₂ n)
Θ(n)
Θ(n log₂ n)
Θ(n²)
Θ(n² log₂ n)
Θ(n³)
Θ(2ⁿ)
Θ(n!)
Note that an exponential function aⁿ, where a > 1, grows faster than any polynomial function nᵇ, where b is any constant.
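A quick numeric sketch of the claim above: even a modest exponential, 2ⁿ, eventually overtakes a large polynomial such as n³ and stays ahead. (The specific functions compared here are illustrative choices, not from the text.)

```python
def last_polynomial_win(poly_exp=3, base=2, limit=1000):
    """Largest n below `limit` where n**poly_exp is still >= base**n."""
    return max(n for n in range(1, limit) if n ** poly_exp >= base ** n)

# 9**3 = 729 still beats 2**9 = 512, but 2**10 = 1024 passes 10**3 = 1000,
# and the exponential never falls behind again.
print(last_polynomial_win())  # prints 9
```

The constant in front of the polynomial would only delay the crossover point, not prevent it.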
A function has "constant" growth if its output does not change based on the size of its input, n. The easy way to identify constant functions is to find those that have no n in their expression anywhere, or that contain only n⁰.
A function has "linear" growth if its output increases linearly with the size of its input. The way to identify linear functions is to find those where n is never raised to a power (although n¹ is fine) or used as a power.
A function has "polynomial" growth if its output increases according to a polynomial expression. The way to identify polynomial functions is to find those where n is raised to some constant power greater than 1.
A function has "exponential" growth if its output increases according to an exponential expression. The way to identify exponential functions is to find those where a constant is raised to some expression involving n.
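The growth types above can be compared side by side numerically. This sketch uses one illustrative function of each type (the constant 5, linear 3n, polynomial n², and exponential 2ⁿ are my own example choices, not from the exercise):

```python
# One illustrative function per growth type: constant, linear,
# polynomial, exponential. Doubling n leaves the constant unchanged,
# doubles the linear term, quadruples the square, and squares 2**n.
rows = [(n, 5, 3 * n, n ** 2, 2 ** n) for n in (1, 2, 4, 8, 16)]

print(f"{'n':>3} {'5':>6} {'3n':>6} {'n^2':>6} {'2^n':>6}")
for n, c, lin, poly, exp in rows:
    print(f"{n:>3} {c:>6} {lin:>6} {poly:>6} {exp:>6}")
```

Even at n = 16, the exponential column (65536) already dwarfs the polynomial one (256).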
We have several different types of functions here, so we start by thinking about the general properties of those function types and how their rates of growth compare. Here's a reminder of the different function types, in order of growth:
Constant functions
Logarithmic functions
Linear functions
Linearithmic functions
Polynomial functions
Exponential functions
Which type is each of the presented options?
Constant functions:
Logarithmic functions:
Linear functions:
Linearithmic functions:
Polynomial functions:
Exponential functions:
Within the logarithmic functions, those with smaller bases grow more quickly than those with larger bases, so, for example, log₂ n grows more quickly than log₈ n. You can see that in the graph below:
Within the polynomial functions, a function with a smaller exponent grows more slowly than one with a larger exponent. We don't even have to look at the constant factors in front, since the exponent is more significant.
To conclude, the correct order of the functions runs from the slowest-growing type to the fastest: constant, then logarithmic, linear, linearithmic, polynomial, and exponential, with ties within a type broken by the rules above.
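Both tie-breaking rules can be checked numerically. The functions compared below (log₂ n versus log₈ n, and 100n² versus n³) are illustrative assumptions:

```python
import math

# Rule 1: different log bases differ only by a constant factor
# (change of base), so log2(n) is exactly 3x log8(n) for every n,
# because log2(8) = 3.
ratio = math.log2(10 ** 6) / math.log(10 ** 6, 8)
print(ratio)  # ~3.0, and the same ratio for any n

# Rule 2: a smaller exponent loses to a larger one no matter the
# constant in front: 100 * n**2 falls below n**3 once n > 100.
print(100 * 1000 ** 2 < 1000 ** 3)  # True
```

So within a type, constant factors shift where curves cross but never change the eventual ordering.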
Big-O notation
If a running time is O(f(n)), then for large enough n, the running time is at most k · f(n) for some constant k. Here's how to think of a running time that is O(f(n)):
We use big-O notation for asymptotic upper bounds, since it bounds the growth of the running time from above for large enough input sizes.
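As a sketch of what the upper bound means in practice, we can count the guesses binary search makes on a sorted list of n values and confirm the worst case never exceeds log₂ n + 1, i.e. it stays below k · f(n) for f(n) = log₂ n and a small constant k. (The step-counting helper here is my own illustration, not from the text.)

```python
import math

def binary_search_steps(n, target):
    """Guesses binary search needs to find `target` in [0, 1, ..., n-1]."""
    lo, hi, steps = 0, n - 1, 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if mid == target:
            break
        elif mid < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return steps

for n in (10, 1000, 100000):
    worst = max(binary_search_steps(n, t) for t in range(n))
    # The worst case over all targets stays under the log2(n) + 1 bound.
    print(n, worst, math.log2(n) + 1)
```

The bound holds for every n tried, which is exactly the "for large enough n" guarantee big-O promises.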
Big-Ω (Big-Omega) notation
Sometimes, we want to say that an algorithm takes at least a certain amount of time, without providing an upper bound. We use big-Ω notation; that's the Greek letter "omega."
If a running time is Ω(f(n)), then for large enough n, the running time is at least k · f(n) for some constant k. Here's how to think of a running time that is Ω(f(n)):
We say that the running time is "big-Ω of f(n)." We use big-Ω notation for asymptotic lower bounds, since it bounds the growth of the running time from below for large enough input sizes.
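Here's a sketch of a lower bound in code (the linear-search example is my own illustration, not from the text): when the target is absent, linear search must examine every element, so its worst-case running time is Ω(n), taking at least k · n steps with k = 1.

```python
def linear_search_steps(arr, target):
    """Number of elements examined before linear search stops."""
    steps = 0
    for value in arr:
        steps += 1
        if value == target:
            break
    return steps

n = 1000
# An absent target forces a full scan: exactly n elements examined.
print(linear_search_steps(list(range(n)), -1))  # prints 1000
```

Note this lower bound says nothing about the best case: a target at index 0 is found in one step, which is why Ω describes a bound, not a single running time.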
Just as Θ(f(n)) automatically implies O(f(n)), it also automatically implies Ω(f(n)). So we can say that the worst-case running time of binary search is Ω(log₂ n).
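The two-sided nature of Θ can be sketched as a sandwich check: a Θ(log₂ n) worst case sits between constant multiples of log₂ n from below and above. The constants 0.5 and 2 below are illustrative choices, as is the fact (checkable with the step counter idea above) that binary search on 10⁶ elements needs 20 guesses in the worst case.

```python
import math

def within_theta_bounds(steps, n, c_low=0.5, c_high=2.0):
    """True if `steps` lies between c_low * log2(n) and c_high * log2(n)."""
    return c_low * math.log2(n) <= steps <= c_high * math.log2(n)

# 20 guesses is sandwiched: 0.5 * log2(10**6) ~= 9.97 and
# 2 * log2(10**6) ~= 39.86.
print(within_theta_bounds(20, 10 ** 6))  # True
```

A linear number of steps would fail the same check, which is how Θ rules out both over- and under-estimates.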