One thing that's nice is that a lot of the interactions we tend to have between individuals will have more structure, and so the games will be nice ones. They won't be the worst-case games that are computationally complex; they're going to be ones where we can actually say something meaningful about their structure.

So, what we're going to start with is a canonical special case.

So, it's a very simple version of the game but

one that is going to be fairly widely applicable.

So, we're looking at a situation where person i is going to take an action. Let's call that x_i, and we'll start with the case where it's just a binary action, either zero or one,

so I either buy this book or I don't buy the book.

Or I invest in a new technology, I don't.

I learn a language, I don't learn a language.

I end up going to a movie, or I don't go to the movie.

The payoff is going to depend on how many neighbors choose each action.

So, how many neighbors choose action zero? How many neighbors choose action one? How many neighbors do I have? My payoff is going to depend on those things.

So, we've got each person choosing an action in zero, one.

We're going to consider a situation where your payoffs depend on your action.

So, person i's payoff depends on their own action, and it's also going to depend on the number of neighbors of i that choose one.

So, how many of my neighbors chose one?

It'll depend on my degree,

how many neighbors I have.

So, if I have 100 neighbors and two of them are choosing action one, that might be different than if I have three neighbors and two of them are choosing action one.

Two out of three is different than two out of 100,

so I might care differently depending on how many neighbors I have.
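To make that dependence concrete, here's a minimal sketch in Python of what such a payoff function looks like. The function name and the particular form for action one are just illustrative assumptions, not the example used below; the point is the signature: payoff depends only on my own action, the count of neighbors choosing one, and my degree, not on which neighbors they are.

```python
def payoff(x_i, m_i, d_i):
    """Illustrative payoff for player i.

    x_i : i's own action, 0 or 1
    m_i : number of i's neighbors choosing action 1
    d_i : i's degree (total number of neighbors)
    """
    if x_i == 0:
        return 0.0
    # Placeholder form: the value of adopting grows with the *share* of
    # adopting neighbors, so two out of three counts more than two out of 100.
    return m_i / d_i
```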

So, what are the main simplifying assumptions in this setting? The simplifying assumptions are that we've got just the zero-one action, so we either take the action or we don't.

I only care about the number of friends taking the action,

not the identities of them.

So I don't distinguish between best friends and lesser friends; I treat friends equally in terms of who's taking the action.

It also depends just on my degree, how many friends I have; I don't have a different preference than somebody else with the same degree.

So we can enrich these models later to allow for people to

have different preferences and weight things differently.

But for now, let's think of a world where everybody treats

their friends equally and it only matters how many friends they have,

not who their friends are.

So, let's look at an example of a simple game of complements: I'm willing to choose this new technology if and only if at least t neighbors do.

So, in this game, suppose I'm learning to play bridge, a card game. I have to have at least three friends who play bridge before I'm going to want to learn to play bridge.

So, my payoff to playing action zero, if I don't learn it, is just zero.

One example of this would be that I get a payoff from playing action one which looks like minus this threshold plus how many friends play it. So, if the threshold was three, then I get minus three plus the number of my friends who play it.

So, for instance, if exactly three of my friends play it, I get a payoff of zero. If four of my friends play it, I get a payoff of one. If five of my friends play it, I get a payoff of two, and so forth.

So, this would be a very simple example,

where I'm going to be willing to choose action

one if and only if at least three of my neighbors do.

But you could write down all kinds of different payoff matrices,

this is just one example.
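As a rough sketch of that one example, here it is in Python. The threshold T = 3 matches the bridge story, and the tie at exactly T adopting friends is resolved in favor of adopting, since the payoff there equals the zero from not adopting.

```python
T = 3  # threshold: adopting friends needed, as in the bridge example

def payoff(x_i, m_i):
    """Action 0 pays 0; action 1 pays -T + m_i,
    where m_i is the number of my neighbors choosing 1."""
    return 0 if x_i == 0 else -T + m_i

def willing_to_adopt(m_i):
    """Adopt iff at least T neighbors do (indifferent at exactly T)."""
    return m_i >= T

# Exactly three friends play: payoff 0; four: payoff 1; five: payoff 2.
assert [payoff(1, m) for m in (3, 4, 5)] == [0, 1, 2]
```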

So, let's look at a network now, where we've got a bunch of different people, and a person is willing to take action one if and only if at least two neighbors do.

So, this is a game where once at least two of my friends have bought this new technology, I'm willing to do it too; otherwise I don't.

So, what do we know first of all?

Well, if we look at this network,

all these blue people,

they're going to take action zero because they only have one friend.

Actually, sorry, this person has two friends, so that one shouldn't be coded as a zero.

So, these three individuals only have one friend.

So, they're definitely going to have to take action zero,

there's no way they're going to have at least two neighbors do it.

But what we can do is ask: what about this player?

Well, their action is going to depend on what their other friends do.

One possibility is that we set, for instance, these three individuals all to playing action one. So, if these two individuals are doing it, then this person's willing to as well; they're all willing to, because now they each have at least two friends doing it.

So, one possibility would be to stay where we were before, where nobody takes the action because nobody else does, and so the technology never gets off the ground.

So, it's possible that if it's a technology that needs people to want to communicate with other people, and to have other people do it before they do it, there's a possibility of it never getting seeded: it never gets off the ground.

Another possibility is yes,

these three people all adopt it because they each have two friends who do it,

and so, that's also an equilibrium.

Now, if these are the only people adopting,

then nobody else actually wants to do it because

all the other individuals still have at most one friend who did it,

so nobody else is above their threshold,

and indeed it's still an equilibrium

for these three people to do it and nobody else to do it.

So, nobody else wants to take the action because none of the other people have two neighbors who do it.
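To check the two equilibria described above, here's a small sketch. The exact network from the slide isn't reproduced here, so this uses a hypothetical stand-in with the same features: three players forming a triangle, plus three degree-one players hanging off it who can never reach the threshold of two.

```python
# Hypothetical stand-in network: 0, 1, 2 form a triangle; 3, 4, 5 each
# have a single link, so they can never have two adopting neighbors.
neighbors = {0: [1, 2, 3], 1: [0, 2, 4], 2: [0, 1, 5],
             3: [0], 4: [1], 5: [2]}
T = 2  # willing to take action 1 iff at least two neighbors do

def is_equilibrium(profile):
    """Every player best-responds: playing 1 needs at least T adopting
    neighbors; playing 0 is fine unless strictly more than T neighbors
    adopt (a player with exactly T adopting neighbors is indifferent)."""
    for i, nbrs in neighbors.items():
        m_i = sum(profile[j] for j in nbrs)
        if profile[i] == 1 and m_i < T:
            return False
        if profile[i] == 0 and m_i > T:
            return False
    return True

everyone_out = {i: 0 for i in neighbors}
triangle_in = {0: 1, 1: 1, 2: 1, 3: 0, 4: 0, 5: 0}
print(is_equilibrium(everyone_out))  # True: technology never gets off the ground
print(is_equilibrium(triangle_in))   # True: the three adopt, nobody else does
```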