All software developers know about Big O. For instance, contemporary comparison-based sorting algorithms usually have complexity $O(n \log n)$. Something like $O(n \log n)$ looks like a function of a function, but it's really a classification system.
We start by defining the complexity of an algorithm to be an estimate of the number of steps required to solve a problem. This count is going to vary with the size of the problem, and so the complexity must be a function of the problem size.
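To make that concrete, here's a small sketch (mine, not from the original post) that instruments selection sort with a step counter. The comparison count depends only on the problem size $n$:

```python
def selection_sort_steps(items):
    """Sort a list with selection sort, returning (sorted_list, comparison_count)."""
    data = list(items)
    comparisons = 0
    for i in range(len(data)):
        smallest = i
        for j in range(i + 1, len(data)):
            comparisons += 1          # each comparison counts as one "step"
            if data[j] < data[smallest]:
                smallest = j
        data[i], data[smallest] = data[smallest], data[i]
    return data, comparisons

# Selection sort always performs n*(n-1)/2 comparisons, regardless of
# the input's initial order -- the step count is a function of n alone.
for n in (10, 20, 40):
    _, steps = selection_sort_steps(range(n, 0, -1))
    print(n, steps)
```

Doubling $n$ roughly quadruples the step count, which is exactly the kind of growth behavior the notation below classifies.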
Sometimes this can result in nasty looking functions like $3x^2 + 5x$ or $x^2 + 2x + 7$. We say that both of these functions are in the category $O(x^2)$ because they have similar growth properties. This is called Bachmann-Landau notation.
In mathematics, this notation describes the asymptotic behavior of a function. If $f(x)$ is $O(g(x))$, then we have a guarantee that after a certain point, the graph of $f(x)$ will fall below the graph of $g(x)$, if you're willing to rotate $g$ a bit (that is, scale it by a constant). In computer science, we use this to understand how the performance of an algorithm will change as the size of the problem or input grows. There are actually several Bachmann-Landau notations, each of which gives us some kind of indication of the limits of the growth of a function.
| Name         | Notation              | Analogy  |
|--------------|-----------------------|----------|
| Big O        | $f(x)$ is $O(g(x))$      | $\leq$   |
| Little o     | $f(x)$ is $o(g(x))$      | $<$      |
| Big Omega    | $f(x)$ is $\Omega(g(x))$ | $\geq$   |
| Little omega | $f(x)$ is $\omega(g(x))$ | $>$      |
| Theta        | $f(x)$ is $\Theta(g(x))$ | $=$      |
Credit goes to MIT for the excellent table and analogies. The definitions of each can be found in the referenced material.
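To connect the $\leq$ analogy back to actual functions, here's a hedged brute-force sketch. The functions and the witness constants $C$ and $k$ are my own choices for illustration, not from the referenced material:

```python
# f is O(g) when some constants C and k make |f(x)| <= C*|g(x)| for all
# x > k. We can't check infinitely many x, but we can verify a candidate
# (C, k) pair over a finite range to build confidence.

def f(x):
    return 3 * x**2 + 5 * x      # a "nasty" function...

def g(x):
    return x**2                  # ...that is O(x^2)

# Candidate witnesses: for x > 5, 3x^2 + 5x <= 4x^2, because 5x <= x^2
# whenever x >= 5.
C, k = 4, 5
assert all(abs(f(x)) <= C * abs(g(x)) for x in range(k + 1, 10_000))
print(f"f(x) <= {C}*g(x) held for every tested x > {k}")
```

The "rotate a bit" intuition from earlier is exactly the constant $C$: we scale $g$ up by a factor of 4 so that $f$ stays underneath it.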
Additionally, some of these definitions can be expressed using calculus. We start with little o: the definition in the material says that $f(x)$ is $o(g(x))$ if:

$$\forall C > 0 \;\; \exists k \;\; \forall x > k : \; |f(x)| \leq C \cdot |g(x)|$$

That's a symbolic mouthful. It means, "For all positive values of $C$, there exists some $k$ such that, if $x > k$, the absolute value of $f(x)$ is less than or equal to the absolute value of $g(x)$ multiplied by $C$."
Separately, any calculus text will give this definition for a limit at infinity. I like Paul's Online Calculus Notes. It says that $\lim_{x \to \infty} f(x) = L$ if:

$$\forall \varepsilon > 0 \;\; \exists N \;\; \forall x > N : \; |f(x) - L| < \varepsilon$$

Again in plain English, "For any positive value of $\varepsilon$, there must exist some value for $N$ such that for any value of $x$ that is greater than $N$, this inequality holds."
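You can play this "$\varepsilon$–$N$ game" with a concrete function. A hedged sketch, using $f(x) = 1/x$ (my choice, not from the notes), whose limit at infinity is $0$:

```python
# For f(x) = 1/x and L = 0, the choice N = 1/epsilon always works:
# if x > 1/epsilon, then |1/x - 0| = 1/x < epsilon.

def N_for(epsilon):
    """Return an N such that x > N implies |1/x - 0| < epsilon."""
    return 1.0 / epsilon

for epsilon in (0.1, 0.01, 0.001):
    N = N_for(epsilon)
    x = 2 * N                      # any x greater than N will do
    assert abs(1.0 / x - 0.0) < epsilon
    print(f"epsilon={epsilon}: N={N} works (checked at x={x})")
```

The point of the game: no matter how small an $\varepsilon$ your opponent picks, you can always answer with an $N$ that pins $f(x)$ within $\varepsilon$ of $L$.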
Let's make some substitutions of variable names in the above equation. If we swap in $f(x)/g(x)$ for $f(x)$, $0$ for $L$, $C$ for $\varepsilon$, and $k$ for $N$:

$$\forall C > 0 \;\; \exists k \;\; \forall x > k : \; \left| \frac{f(x)}{g(x)} - 0 \right| < C$$

or, multiplying both sides by $|g(x)|$,

$$\forall C > 0 \;\; \exists k \;\; \forall x > k : \; |f(x)| < C \cdot |g(x)|$$
This boils down to an alternative definition of little o: $f(x)$ is $o(g(x))$ if:

$$\lim_{x \to \infty} \frac{f(x)}{g(x)} = 0$$

(Since the inequality must hold for every positive $C$, the difference between $<$ and $\leq$ in the two formulations doesn't matter.)
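One way to build intuition for this limit is to watch the ratio shrink numerically. A hedged sketch, with $f(x) = \log x$ and $g(x) = x$ chosen purely for illustration:

```python
import math

# If f(x)/g(x) -> 0 as x -> infinity, then f(x) is o(g(x)).
# Here f(x) = log(x) and g(x) = x.
for x in (10, 1_000, 100_000, 10_000_000):
    print(x, math.log(x) / x)

# The ratio keeps shrinking toward 0, which is evidence (not proof!)
# that log(x) is o(x).
```

A numeric table never proves a limit, but it's a quick sanity check before reaching for the analytical tools below.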
This allows us to use other mathematical tricks, like L'Hôpital's Rule, to figure out whether two given functions follow this relationship.
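For example (a standard worked case, not taken from the original post), L'Hôpital's Rule confirms that $\log x$ is $o(x)$. The ratio has the indeterminate form $\infty / \infty$, so we may differentiate the numerator and denominator separately:

$$\lim_{x \to \infty} \frac{\log x}{x} = \lim_{x \to \infty} \frac{1/x}{1} = 0$$

Since the limit is $0$, $\log x$ is $o(x)$, and therefore also $O(x)$.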