Now, that brings up the idea of Moore's Law, and
I want to talk more carefully about algorithms that scale.
It's been true ever since the 70s, when we first had integrated circuits:
Gordon Moore, a co-founder of Intel, postulated that about every two
years, the number of transistors in an integrated circuit doubles.
So that means we get twice as much memory every two years,
because our memories are implemented on integrated circuits, and
our processor speed doubles every two years as well.
Those are really important implications.
So one thing that I've noticed during my career is what that means:
if you buy a new computer and
write a program that touches every word of its memory, whatever it does,
it's going to take a few seconds. One of the first computers I used had just
tens of thousands of words of memory and could do about 10,000 instructions per second,
and I'll talk about that type of computer later on.
It took a few seconds to access every word in that computer.
Then we got to millions of words, then tens of millions.
And nowadays, you have billions of words of memory.
But you can also execute billions of instructions per second.
So to get to everywhere in the computer, it's going to take a couple of seconds.
It doesn't matter what kind of computer it is, roughly.
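To make that concrete, here is the back-of-the-envelope arithmetic, using round numbers in the spirit of the machines just described (the exact figures are illustrative, not measurements):

\[
\frac{3 \times 10^4 \ \text{words}}{10^4 \ \text{instructions/sec}} \approx 3 \ \text{seconds},
\qquad
\frac{3 \times 10^9 \ \text{words}}{10^9 \ \text{instructions/sec}} \approx 3 \ \text{seconds}.
\]

Because memory size and processor speed double together, the ratio, and therefore the time to touch everything, stays roughly constant.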
So what does that mean in terms of programs that we run on computers?
That's the idea of scalability.
So what you need is an algorithm whose running time doubles when
the problem size doubles.
The idea is that when you get a new computer, you can fit a problem that's twice as big.
So your algorithm had better be able to
handle that, at least at scale.
So for example, if the order of growth is N and the
problem size doubles, then the running time doubles.
N log N is also pretty close, but N squared doesn't cut it.
We already saw, with our doubling test,
that if the running time is quadratic, then when the problem size doubles,
the running time goes up by a factor of four, and it's even worse for N cubed.
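Here's a minimal sketch of that doubling test in Java, with a hypothetical quadratic workload standing in for a real algorithm (the names and sizes here are illustrative, not from the course code):

```java
public class DoublingTest {

    // Hypothetical stand-in with quadratic order of growth:
    // it examines all ~N^2/2 pairs (i, j) with i < j.
    static long quadraticWork(int n) {
        long count = 0;
        for (int i = 0; i < n; i++)
            for (int j = i + 1; j < n; j++)
                count++;
        return count;
    }

    // Time one run of the workload for problem size n, in seconds.
    static double timeTrial(int n) {
        long start = System.nanoTime();
        quadraticWork(n);
        return (System.nanoTime() - start) / 1e9;
    }

    public static void main(String[] args) {
        double prev = timeTrial(512);
        for (int n = 1024; n <= 32768; n += n) {
            double time = timeTrial(n);
            // For order of growth N^2 the ratio approaches 4;
            // for N it would approach 2, and for N log N just over 2.
            System.out.printf("%6d  %8.3f  ratio %5.2f%n", n, time, time / prev);
            prev = time;
        }
    }
}
```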
So the thing is, if you keep solving the problem you're solving now and
you get a faster computer, then you can do it in half the time.
But better than that, if you take on a problem twice as big,
you can solve it in the same time
it took you before to solve the smaller problem, and that's real progress,
if you have an algorithm that scales.
If you have a quadratic algorithm or worse, an algorithm that doesn't scale,
then you can still solve the problems you're solving now in half the time.
Your computer is twice as fast, after all.
But if you take on a problem twice as big, because you have twice as much memory,
it's going to take you twice as long to solve it, and
that immediately leads to frustration.
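To spell out that arithmetic with the factor-of-two figures from Moore's Law (running time is work divided by speed, and the speed has doubled):

\[
\text{linear: } \frac{2N}{2} = N \ \text{(same time as before)},
\qquad
\text{quadratic: } \frac{(2N)^2}{2} = 2N^2 \ \text{(twice the time)}.
\]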
And so you have to have algorithms that scale, if you want to keep pace with
Moore's Law, and that comes up in application after application.
Sorting is the first example.
So what we're looking for is a sorting algorithm that scales,
one that runs in time proportional to N log N, not quadratic time like insertion sort.
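For reference, here is a minimal insertion sort in Java, a standard textbook version rather than necessarily this course's exact code, so you can see where the quadratic behavior comes from:

```java
import java.util.Arrays;

public class Insertion {

    // Insert each a[i] into place among a[0..i-1] by sliding it left.
    // On random input the inner loop runs ~N^2/4 times in total,
    // so doubling N quadruples the running time.
    public static void sort(double[] a) {
        for (int i = 1; i < a.length; i++)
            for (int j = i; j > 0 && a[j] < a[j - 1]; j--) {
                double t = a[j];
                a[j] = a[j - 1];
                a[j - 1] = t;
            }
    }

    public static void main(String[] args) {
        double[] a = { 3.0, 1.0, 4.0, 1.5, 9.0, 2.6 };
        sort(a);
        System.out.println(Arrays.toString(a)); // [1.0, 1.5, 2.6, 3.0, 4.0, 9.0]
    }
}
```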