Asymptotic notation

Big-Θ (Big-Theta) notation

Logarithms grow more slowly than polynomials. That is, $\Theta(\log_2 n)$ grows more slowly than $\Theta(n^a)$ for any positive constant $a$. But since the value of $\log_2 n$ increases as $n$ increases, $\Theta(\log_2 n)$ grows faster than $\Theta(1)$.

The following comparison shows the growth of $1$, $n$, and $\log_2 n$:
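
Since the graph itself isn't reproduced here, a minimal Python sketch (my illustration, not from the original article) can stand in for it by tabulating the three functions at a few values of $n$:

```python
import math

# Tabulate 1, log2(n), and n for growing n: log2(n) increases without
# bound, but far more slowly than n itself.
for n in [2, 8, 64, 1024, 2**20]:
    print(f"n = {n:>9,}   1 = 1   log2(n) = {math.log2(n):>4.0f}   n = {n:,}")
```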

Here's a list of functions in asymptotic notation that we often encounter when analyzing algorithms, ordered from slowest to fastest growing (a small numeric sketch follows the list):

  1. $\Theta(1)$

  2. $\Theta(\log_2 n)$

  3. $\Theta(n)$

  4. $\Theta(n \log_2 n)$

  5. $\Theta(n^2)$

  6. $\Theta(n^2 \log_2 n)$

  7. $\Theta(n^3)$

  8. $\Theta(2^n)$

  9. $\Theta(n!)$
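
As mentioned above, a small numeric sketch can make the ordering concrete. This Python snippet (an illustration, not part of the original list) evaluates one representative of each class at a single modest input; even at $n = 20$, each function dwarfs the one before it:

```python
import math

# One representative of each Theta class from the list above,
# slowest to fastest growing.
funcs = [
    ("1",          lambda n: 1),
    ("log2 n",     lambda n: math.log2(n)),
    ("n",          lambda n: n),
    ("n log2 n",   lambda n: n * math.log2(n)),
    ("n^2",        lambda n: n**2),
    ("n^2 log2 n", lambda n: n**2 * math.log2(n)),
    ("n^3",        lambda n: n**3),
    ("2^n",        lambda n: 2**n),
    ("n!",         lambda n: math.factorial(n)),
]

n = 20
for name, f in funcs:
    print(f"{name:>11}: {f(n):,.0f}")
```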

Note that an exponential function $a^n$, where $a > 1$, grows faster than any polynomial function $n^b$, where $b$ is any constant.
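
As a quick illustration (the degree $10$ is an arbitrary choice of mine), this sketch finds where $2^n$ overtakes $n^{10}$ for good:

```python
# Even a high-degree polynomial is eventually overtaken by an exponential.
# Find the smallest n (beyond the trivial n = 1) where 2^n exceeds n^10.
n = 2
while 2**n <= n**10:
    n += 1
print(n)  # prints 59: from here on, 2^n stays ahead of n^10 for good
```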

A function has "constant" growth if its output does not change based on the input, the nn. The easy way to identify constant functions is find those that have no nn in their expression anywhere, or have n0n^0, In this case, 11 and 10001000 are constant.

A function has "linear" growth if its output increases linearly with the size of its input. The way to identify linear functions is find those where nn is never raised to a power (although n1n^1is OK) or used as a power. In this case, 3n\bf3n and (3/2)n\bf(3/2)n are linear.

A function has "polynomial" growth if its output increases according to a polynomial expression. The way to identify polynomial functions is to find those where nn is raised to some constant power. In this case, 2n3\bf2n^3 and 3n2\bf3n^2 are polynomial.

A function has "exponential" growth if its output increases according to an exponential expression. The way to identify exponential functions is to find those where a constant is raised to some expression involving nn. In this case, 2n\bf2^n and (3/2)n\bf(3/2)^n are exponential.

We have several different types of functions here, so we start by thinking about the general properties of those function types and how their rates of growth compare. Here's a reminder of the different function types, in order of their growth:

  1. Constant functions

  2. Logarithmic functions

  3. Linear functions

  4. Linearithmic functions

  5. Polynomial functions

  6. Exponential functions

Which type is each of the presented options?

  1. Constant functions:

    • $64$

  2. Logarithmic functions:

    • $\log_8 n$, $\log_2 n$

  3. Linear functions:

    • $4n$

  4. Linearithmic functions:

    • $n \log_2 n$, $n \log_8 n$

  5. Polynomial functions:

    • $8n^2$, $6n^3$

  6. Exponential functions:

    • $8^{2n}$

Within the logarithmic functions, smaller bases grow more quickly than larger bases, so $\log_2 n$ will grow more quickly than $\log_8 n$. You can see that in the sketch below:
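
A minimal numeric check (my addition): by the change-of-base rule, $\log_8 n = \log_2 n / \log_2 8 = \log_2 n / 3$, so $\log_8 n$ is always a third of $\log_2 n$:

```python
import math

# Change of base: log8(n) = log2(n) / log2(8) = log2(n) / 3,
# so log8(n) grows three times more slowly than log2(n).
for n in [8, 64, 512, 4096]:
    print(f"n = {n:>4}: log2(n) = {math.log2(n):>4.1f}, "
          f"log8(n) = {math.log(n, 8):>3.1f}")
```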

Within the polynomial functions, $8n^2$ will grow more slowly than $6n^3$, since it has a smaller exponent. We don't even have to look at the constants in front, since the exponent is more significant.
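
A quick sketch makes this concrete: $6n^3$ exceeds $8n^2$ as soon as $6n > 8$, that is, for every $n \ge 2$:

```python
# The constants only matter for tiny n: 6n^3 > 8n^2 as soon as 6n > 8.
for n in [1, 2, 10, 100]:
    print(f"n = {n:>3}: 8n^2 = {8 * n**2:>9,}   6n^3 = {6 * n**3:>9,}")
```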

To conclude, the correct order of the functions, from slowest to fastest growing, is:

  • $64$

  • $\log_8 n$

  • $\log_2 n$

  • $4n$

  • $n \log_8 n$

  • $n \log_2 n$

  • $8n^2$, $6n^3$

  • $8^{2n}$

Big-O notation

If a running time is $O(f(n))$, then for large enough $n$, the running time is at most $k \cdot f(n)$ for some constant $k$.

We use big-O notation for asymptotic upper bounds, since it bounds the growth of the running time from above for large enough input sizes.
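
As a sketch of what the definition means in practice, take a made-up running time $T(n) = 3n^2 + 10n + 5$ and exhibit witnesses $k$ and $n_0$ showing that $T(n)$ is $O(n^2)$ (the specific function and witnesses are my own illustrative choices):

```python
# A hypothetical running time (made up for illustration):
def T(n):
    return 3 * n**2 + 10 * n + 5

# To show T(n) is O(n^2), exhibit witnesses k and n0 with
# T(n) <= k * n^2 for all n >= n0. Here k = 4 and n0 = 11 work,
# since 3n^2 + 10n + 5 <= 4n^2 once n^2 >= 10n + 5.
k, n0 = 4, 11
assert all(T(n) <= k * n**2 for n in range(n0, 10_000))
print(f"T(n) <= {k} * n^2 holds for every checked n >= {n0}")
```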

Big-Ω (Big-Omega) notation

Sometimes, we want to say that an algorithm takes at least a certain amount of time, without providing an upper bound. We use big-Ω notation; that's the Greek letter "omega."

If a running time is $\Omega(f(n))$, then for large enough $n$, the running time is at least $k \cdot f(n)$ for some constant $k$.

We say that the running time is "big-Ω of $f(n)$." We use big-Ω notation for asymptotic lower bounds, since it bounds the growth of the running time from below for large enough input sizes.

Just as $\Theta(f(n))$ automatically implies $O(f(n))$, it also automatically implies $\Omega(f(n))$. So we can say that the worst-case running time of binary search is $\Omega(\log_2 n)$.
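
For reference, here is a standard iterative binary search in Python (a sketch, not code from the original article). Each iteration halves the search range, which is where the worst-case $\Theta(\log_2 n)$, and hence both the $O(\log_2 n)$ and $\Omega(\log_2 n)$ bounds, come from:

```python
def binary_search(arr, target):
    """Return the index of target in sorted arr, or -1 if absent.

    Each iteration halves the search range, so the worst case makes
    about log2(n) + 1 comparisons: Theta(log2 n), and therefore both
    O(log2 n) and Omega(log2 n) in the worst case.
    """
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

print(binary_search(list(range(0, 100, 2)), 34))  # 17
print(binary_search(list(range(0, 100, 2)), 35))  # -1
```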
