In computing, there are usually many ways to tackle a problem, but coming up with an efficient and effective algorithm can be tricky. One of the ways we describe how good our algorithms are is by expressing them as RUNTIMES.
As our problem size (N) increases, the number of operations we need to do grows according to our runtime. A runtime of N operations is preferred over a runtime of N^2, because as N grows, N^2 grows much faster. A runtime of lg(N) (log base 2 of N) is even better. That means that as the problem size grows, the number of operations we need to do only grows logarithmically, which is extremely slowly: even when N is a million, lg(N) is only about 20.
A good exercise is to pick two very different values for N and work out how many operations each runtime requires, then compare how big the gaps get.
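A quick sketch of that exercise, comparing the operation counts for two problem sizes (the sizes 1,000 and 1,000,000 are just example values):

```python
import math

# Compare roughly how many operations each runtime class needs
# for two very different problem sizes.
for n in (1_000, 1_000_000):
    print(f"N = {n:,}")
    print(f"  lg(N) runtime: about {math.log2(n):,.0f} operations")
    print(f"  N runtime:     about {n:,} operations")
    print(f"  N^2 runtime:   about {n**2:,} operations")
```

Going from N = 1,000 to N = 1,000,000, the lg(N) runtime only doubles (about 10 to about 20 operations), while the N^2 runtime explodes from a million operations to a trillion.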