0:00

To complete talking about how science works, let's talk for

a while about information.

What I really want to do since we live in the information age is to

connect information with knowledge, which is what science is all about.

Let me use a very simple example.

Suppose you put a dot on a piece of paper, and

you want to be able to describe where it is with increasing levels of accuracy.

How would you do that?

You'd say the dot is on the piece of paper.

That doesn't localize it or describe it in any detail.

However, you could fold the piece of paper in half and

say the dot is above the line or below the line.

You've localized the dot to within a factor of 2.

If you then folded the piece of paper the other way,

you'd have divided it into quarters, and the dot would be in one of the quarters.

So you'd localize it to 1 part in 4, 25%.

And you can follow this logic and see that each time you fold the paper in half and

then half again, the dot will clearly lie in one of the small rectangles.

And you will have localized it with increasing accuracy.
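The folding argument can be sketched in a few lines of Python (an illustration, not part of the lecture): each fold halves the region that could contain the dot, so n folds pin it down to 1/2^n of the page, one bit of information per fold.

```python
# Each fold halves the region that could contain the dot,
# so n folds localize it to 1/(2**n) of the page --
# one bit of information per fold.
def region_fraction(n_folds):
    """Fraction of the page the dot is known to lie in after n folds."""
    return 1 / 2 ** n_folds

for n in [1, 2, 10]:
    print(n, "folds ->", region_fraction(n), "of the page")
```

After two folds the dot is known to within a quarter of the page, matching the 1-part-in-4 figure above.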

1:42

We could extend this into three dimensions and imagine a room where there

was a particular molecule that happened to be in one place in the room.

We could divide the room in half and half again,

along its length, its width, and its height.

And gradually, we would localize that particular molecule.

Each question asked, is it on this side or

that side, is a bit of information, one or zero, yes or no.

And so that's a direct connection between a bit, a binary digit,

which is how computers work, and knowledge, or increasing knowledge.

2:45

Or, you could have a very fine thermometer,

graduated into one hundred one-degree segments from 0 to 100 degrees C.

And you've characterized its reading by more bits.

So there's this direct relationship, whether you're measuring sound or

light or anything, between a measurement and bits of information.

Let me give you another example.

Suppose I asked you to think of a number between 1 and 1000, and

don't tell me what it is.

And if I told you I could guess that number in only ten guesses,

you'd probably think that implausible.

Just by guessing numbers surely it would take me 500 guesses just to

have a 50% chance of getting it right.

But that's not true, that's not the most powerful use of information.

If instead my questions were, is the number above or below 500,

the answer is either yes or no, and I've localized it to half the number line.

If I then ask, is it above or below 250, I've localized it by another factor of 2.

And as you can see, following this logic, each question I ask,

each answer you give, one new bit of information divides the line in two,

in two, in two, and in two again.

And so, with only 10 bits of information, 2 to the power 10,

I can characterize any number from 1 to 1,000.
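This halving strategy is just binary search. A small Python sketch (again an illustration, not from the lecture) confirms that ten yes/no questions always suffice to pin down a number between 1 and 1,000.

```python
# Binary search on the number line: each yes/no question halves
# the remaining interval, so any secret from 1 to 1,000 is found
# in at most 10 questions (2**10 = 1024 >= 1000).
def guess(secret, low=1, high=1000):
    """Return (found_number, questions_asked) for the halving strategy."""
    questions = 0
    while low < high:
        mid = (low + high) // 2
        questions += 1          # one bit of information per question
        if secret > mid:        # "is it above mid?" -- yes
            low = mid + 1
        else:                   # -- no
            high = mid
    return low, questions

print(guess(729))
```

Running this over every possible secret shows the worst case is exactly 10 questions.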

This is the power of information.

It's a very efficient way of packing information into questions asked about

the world.

We live in an amniotic fluid of information, in increasing volumes.

And we sort of take it for granted, the power of our computers,

the amount of information we're subjected to and

have available to us is increasing exponentially,

which means doubling roughly every year or two.

This is Moore's Law, it applies not only to the speed of computers, but

also the bandwidth of the internet and the amount of information available online.

Maybe you're not aware of it, but when you watch a silly cat video on YouTube,

you're watching a stream of ones and zeros, binary digits of information,

billions of them, changing many times a second,

all produced by this power of information technology.

It's the same process that leads to the localization of that dot on a piece of

paper, by dividing a region of space into smaller and smaller pieces.

What level of information are we talking about in modern culture

after several decades of the computer and internet revolution?

You probably know the amount of data you have access to.

Some gigabytes, I suspect.

The amount of information in the modern world is growing at a phenomenal rate.

It is indeed growing exponentially, doubling every year.

The amount of new information created last year was something like 50 exabytes,

which is close to a 1 with 20 zeros after it, in bytes.

A simply unimaginable amount of information.

So it's not just scientists who have to deal with and parse information and

know how to characterize it; our everyday lives are awash with information, and

we need to understand it at some level.

This power of information, the packing of it, is so

efficient that we can characterize a large number of

items with a very small number of bits.

As we saw, it takes only ten decisions, or

questions, to characterize a number line from 1 to 1,000.

Which means 1,000 items are characterized by 10 bits.

6:22

30 bits would be enough to characterize 1 billion items and so on.
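The pattern above is simply that N items need about log2(N) bits, rounded up. A one-line Python check (an illustration, not from the lecture) reproduces both figures.

```python
import math

# The number of yes/no questions (bits) needed to single out
# one of n items is ceil(log2(n)).
def bits_needed(n_items):
    return math.ceil(math.log2(n_items))

print(bits_needed(1_000))          # -> 10 bits for 1 to 1,000
print(bits_needed(1_000_000_000))  # -> 30 bits for a billion items
```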

If we extrapolate this, it means that only about 80 bits of information will be

needed to characterize the position of any atom in the universe to within the size of

an atom, which is an extraordinary concept.

James Watson, co-discoverer of the structure of DNA,

said in the mid-1950s that life is digital information.

Quite an extraordinary statement from someone speaking before the era of

the personal computer.

What he meant was that the base-pair sequence of biological material encodes

the information of the genome, governing the functioning of every organism.

If we run with this metaphor and

look at the history of information in this world over the billions

of years since the Earth formed, we can characterize different phases.

Life codifies information in biological molecules, and

as life has grown more complex, the information content has increased.

The human genome contains about 3 billion base pairs that characterize all

the information about a human being.
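A back-of-the-envelope calculation (my illustration, assuming the textbook figure of 2 bits per base pair, since each is one of four letters A, C, G, T, and ignoring any redundancy) puts that in familiar computing units.

```python
# Rough information content of the human genome:
# ~3 billion base pairs, each one of 4 letters (A, C, G, T),
# so 2 bits per base pair.
base_pairs = 3_000_000_000
bits = base_pairs * 2            # 2 bits per base pair
megabytes = bits / 8 / 1_000_000  # 8 bits per byte, 10**6 bytes per MB

print(bits, "bits, about", megabytes, "megabytes")
```

So the genome's raw digital content is on the order of a single DVD, a few hundred megabytes.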

That's the biological rate of information growth on this planet.

When human culture started, and especially with the invention of the printing press

by Gutenberg, we elevated our rate of information gathering, retrieval, and

storage with books.

That corresponds to the uptick in this graph, where the rate of

information growth started growing billions of times faster than it

had by simple biological evolution.

And then in the final phase of this progression,

at an even more rapid exponential rate, starting only a few decades ago, computers

allowed us to increase information storage, transmittal, and retrieval

by another factor of a million, to a phenomenal rate, the one I just mentioned.

So in terms of the age of information, we're in an unparalleled situation

compared to biological evolution, which may occur elsewhere in the universe.

8:22

But let me finish talking about science not with bits and

bytes, but with the imagination that's at the heart of scientific progress.

We might suppose that our brains are able to encompass the things of the universe and

our observations of them.

But are we really able to imagine everything that happens in the universe?

Just imagine the universe divides into things that are and

things that aren't, things that do happen and things that don't.

I think our imaginations are good enough to imagine most of

the things that actually happen in the universe, but perhaps not all.

We're eternally surprised in cosmology and

astronomy with things that we never anticipated.

However the power of our brains,

of course, is that we can imagine lots of things that don't happen.

That's the basis of science fiction, of poetry and art.

So while I think we can imagine most of the things that actually happen in

the universe, there are perhaps things in the universe that do happen that

we're not smart enough, clever enough or imaginative enough to conceive of.

However, our brains are supple and

we do imagine things that don't happen in the universe.

The power of imagination puts all of this within the landscape of our heads.