Hello, bonjour, servus, vee-tie-yu, welcome to the eighth tutorial of Statistical Mechanics: Algorithms and Computations from the Physics Department of École normale supérieure. This week we started our discussion of classical spin models, and in particular of the Ising model, one of the handful of models in physics that really count. As discussed in the lecture, this model describes spins on a lattice with a rudimentary nearest-neighbor interaction: the energy of a pair of neighboring sites is -1 if the spins are aligned and +1 if they are of opposite sign. The energy of the system is the sum of the pair energies, and the statistical weight of a configuration is given by the Boltzmann factor exp(-beta E). The Ising model is one of the places where many different fields of physics meet; it is here that our original understanding of phase transitions was acquired. It was also an early meeting place for the basic simulation methods, such as the Metropolis algorithm that we discussed in the lecture, the closely related heat-bath algorithm, and, later on, the cluster simulation methods. Each time a great new idea comes up, it is first tried out on the Ising model. This is the story that Michael and Alberto will develop around the heat-bath algorithm, and these developments have led to the perfect-sampling approach first presented by Propp and Wilson in 1996. So let us now discuss an alternative to the Metropolis Monte Carlo method: the heat-bath algorithm. Rather than flipping a spin at a random site, we now thermalize it with respect to its local environment. As Werner discussed in the context of a-priori probabilities in this week's lecture, the detailed-balance condition that has accompanied us since week one can be written more generally as shown here, where A(a->b) is the a-priori probability of proposing a move from a to b, which together with the acceptance probability P_accept(a->b) gives the total probability to move from a to b.
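The pair energies and the Boltzmann weight just described can be written down in a few lines of Python. This is a minimal sketch of ours, not one of the course programs; the lattice size, the periodic boundary conditions, and the dictionary representation of the spins are our own choices for illustration:

```python
import math
import random

def ising_energy(sigma, L):
    """Total energy E = - sum over nearest-neighbor pairs <i, j> of sigma_i * sigma_j,
    on an L x L square lattice with periodic boundary conditions."""
    E = 0
    for x in range(L):
        for y in range(L):
            s = sigma[(x, y)]
            # count each pair exactly once: right neighbor and down neighbor
            E -= s * sigma[((x + 1) % L, y)]
            E -= s * sigma[(x, (y + 1) % L)]
    return E

L = 4
random.seed(1)
sigma = {(x, y): random.choice([1, -1]) for x in range(L) for y in range(L)}
beta = 1.0
E = ising_energy(sigma, L)
weight = math.exp(-beta * E)  # unnormalized Boltzmann weight of this configuration
```

Aligned neighbors contribute -1 and anti-aligned neighbors +1 to E, as in the lecture; for the all-up configuration on a 4x4 periodic lattice the energy is -2 * L * L = -32.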
In this general detailed-balance condition, the Metropolis strategy corresponds to a symmetric choice A(a->b) = A(b->a), so that the a-priori probabilities drop out of the picture; this is exactly why we never spoke about them until this week. In contrast, in the general detailed-balance condition for the heat-bath algorithm, we have A(a->b) proportional to P(b), so that now the acceptance probabilities drop out of the picture. So how, in the heat-bath algorithm, do we actually thermalize a subsystem (that is, a single spin) at temperature T? For this, we simply pull the spin out of the system and measure the field h at the now empty position. In this example, h = 2. If we returned the spin in the state +1, it would have the energy E = -h; if instead we returned it in the state -1, it would have the energy E = h. This leads to the normalized probabilities pi_plus and pi_minus. So if a random number Upsilon between 0 and 1 is smaller than pi_plus, we return the spin in the state +1; if on the other hand Upsilon is larger than pi_plus, we return it in the state -1. In Python, this gives the very short program heatbath_ising.py. In this program, we keep track of and update the energy E, so that we can easily obtain the specific heat c_V of the system as a function of temperature, and we see its characteristic peak, which indicates the ferromagnetic-paramagnetic phase transition taking place at a temperature Tc that we know exactly. Note, however, that in order to study this phase transition in very large systems, it is actually better to use the cluster algorithm presented by Werner in this week's lecture. So now let us look at the configurations generated by this algorithm. Here we have the randomly sampled initial configuration, and then, one after the other, we sample the sites we want to update as well as the new values sigma_k. See, we update the spins here and here...
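The thermalization step just described can be sketched as follows. From E = -h for spin +1 and E = +h for spin -1, the normalized probability of returning the spin as +1 is pi_plus(h) = 1 / (1 + exp(-2 beta h)). This is a minimal sketch in the spirit of heatbath_ising.py, not the course program itself; the neighbor-table layout and site numbering are our own choices:

```python
import math
import random

def heatbath_update(sigma, nbr, beta, k, Upsilon):
    """Heat-bath update at site k: measure the local field h from the neighbors,
    then return the spin as +1 with probability pi_plus(h) = 1 / (1 + exp(-2 beta h)),
    using the uniform random number Upsilon in [0, 1)."""
    h = sum(sigma[j] for j in nbr[k])
    pi_plus = 1.0 / (1.0 + math.exp(-2.0 * beta * h))
    sigma[k] = 1 if Upsilon < pi_plus else -1

L = 4
N = L * L
# neighbor table for an L x L periodic square lattice, sites numbered 0 .. N-1 row by row
nbr = {k: ((k // L) * L + (k + 1) % L, (k + L) % N,
           (k // L) * L + (k - 1) % L, (k - L) % N) for k in range(N)}
random.seed(0)
sigma = {k: random.choice([1, -1]) for k in range(N)}
beta = 2.0
for t in range(1000):
    heatbath_update(sigma, nbr, beta, random.randrange(N), random.random())
```

Note that the new spin value depends only on the field h and on Upsilon, not on the spin's previous value; this is what distinguishes the heat-bath rule from a Metropolis flip.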
in a beautiful illustration of what constitutes a Markov chain. Here we arrive at the final state, and you can see that the program has run so long that it is really well thermalized. Now let us run the algorithm again with a new initial configuration, clearly different from the first one. Again, we randomly sample the sites k and the new spins sigma_k, and again the system is brought to equilibrium. But hold on: we ran the simulation two times with clearly different initial conditions, so what a coincidence that we ended up with the same final configuration! Let us run the algorithm a third time, with a third initial condition. Again, we end up with the same final configuration, and we could continue this even more often. So we have a random Monte Carlo algorithm, but we suspect - and Alberto will soon prove this - that all configurations merge towards the same final configuration. This is because, although we used different initial configurations, we used the same sequence of site updates as well as the same sequence of random numbers to make our updates. Something very strange is happening here. The heat-bath algorithm implements a true Markov chain that does approach equilibrium, but using the same random seed you can be sure that any other initial configuration will, after some time, lead to the same sequence of configurations. So we can continue our two simulations here to see that they really stay the same for all times. This simply means that the state of our Markov chain depends on the sequence of sites k(t) and random numbers Upsilon(t), but not on the initial configuration. So let us now rewind our two simulations to see that initially they really were different, but that they merged at a given moment, shown here.
So we understand that Markov chains couple: they forget their initial configurations, they forget where they're coming from. As we discussed in Tutorial 1, this forgetting takes place on a time scale given by the correlation time (mathematicians speak of the mixing time), and it follows that the coupling time must be larger than the correlation time. Coupling of Markov chains was first formulated by the mathematician Wolfgang Doeblin in the 1930s, and it was a truly ingenious observation. In a few moments, Alberto will show you how we can really prove that all the 2^N initial configurations couple, even if N equals 100 or 1000 or 1000000... This will lead to the perfect-sampling idea, but before listening to him, please take a moment to download, run, and modify the program heatbath_ising.py, and understand the influence of our innocent little line with random.seed(). Michael has made us observe that the heat-bath algorithm produces identical output for the Ising model even if we start from different initial configurations. Let us formalize this observation in terms of an algorithm, heatbath_ising_random_map.py. This algorithm has no starting configuration, but it has a seed, which produces the deterministic sequence of the sites k for the spin updates and of the random numbers Upsilon for the thermalization in the heat-bath algorithm. This program is really a random map, an alternative way of viewing Markov-chain algorithms. Michael just observed that after a long simulation time the output of a random map is independent of the input, that is, of the initial configuration. But it does depend on the seed and on the random numbers triggered by it. This property is called coupling. The coupling, the property that all the 2^N possible initial configurations of the Ising model finally merge together after a given number of iterations called the coupling time, seems too good to be true. And even if it were true, how could we check it?
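The random-map point of view can be sketched as a function that takes an initial configuration and a seed: the seed fixes the whole sequence of sites k(t) and random numbers Upsilon(t), and the initial configuration is just the input. This is a sketch of ours in the spirit of heatbath_ising_random_map.py, not the course program itself; the lattice size, temperature, and run length are our own choices:

```python
import math
import random

def random_map(sigma0, nbr, beta, seed, n_steps):
    """Heat-bath random map: the seed determines the sequence of sites k(t) and
    random numbers Upsilon(t); the initial configuration sigma0 is only the input."""
    rng = random.Random(seed)
    sigma = dict(sigma0)
    N = len(sigma)
    for t in range(n_steps):
        k = rng.randrange(N)
        Upsilon = rng.random()
        h = sum(sigma[j] for j in nbr[k])
        sigma[k] = 1 if Upsilon < 1.0 / (1.0 + math.exp(-2.0 * beta * h)) else -1
    return sigma

L = 4
N = L * L
nbr = {k: ((k // L) * L + (k + 1) % L, (k + L) % N,
           (k // L) * L + (k - 1) % L, (k - L) % N) for k in range(N)}
beta = 0.3  # high temperature, where coupling is fast
up = {k: 1 for k in range(N)}
down = {k: -1 for k in range(N)}
out_up = random_map(up, nbr, beta, seed=42, n_steps=20000)
out_down = random_map(down, nbr, beta, seed=42, n_steps=20000)
# with the same seed, the two outputs have very likely coupled at this temperature:
print(out_up == out_down)
```

Whatever the run length, the output started from all down can never exceed the output started from all up at any site, because the same (k, Upsilon) sequence drives both copies.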
For example, for a system of 100x100 spins, how can we check that all the 2^10000 configurations finally merge at a given time? And wouldn't this time be too large? So, to understand how coupling works, let us run again the program heatbath_ising_random_map.py, starting from two configurations: the all-up configuration on the right and a random initial configuration on the left. As already observed by Michael, the two copies merge and couple after a while; just remember that at each time step we have updated the same spin and used the same random number Upsilon in both copies. Starting from these two configurations, we immediately scroll to the last stage of the simulation, where the coupling occurs. We observe that the configuration that started as all up has a few sites where the spin is up while in the left configuration the spin is down. It is never the other way round: it never happens that the spin is up on the left and down on the right. This is a remarkable property, and it holds also at the very initial stages of the simulation; in fact, it works at all times. Let us suppose, as here, that the right configuration is larger than the left configuration at a given time t; this means that it can never happen that a spin is down on the right and up on the left. Let us now analyze what happens at time t+1. We see that the field h at every site on the right is always larger than or equal to the field h on the left. Going back to the program heatbath_ising.py, we see that the function pi_plus(h) is an increasing function of h. This means that if we update the spin to up on the left, we also update the spin to up on the right.
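The whole argument hinges on pi_plus(h) being an increasing function of h, which takes one line to check (a quick illustration of ours; the value of beta is arbitrary):

```python
import math

def pi_plus(h, beta):
    """Probability of returning the spin as +1 in a local field h."""
    return 1.0 / (1.0 + math.exp(-2.0 * beta * h))

beta = 0.5
fields = [-4, -2, 0, 2, 4]  # the possible local fields on the square lattice
probs = [pi_plus(h, beta) for h in fields]
# pi_plus is increasing in h, so with a single shared Upsilon:
# Upsilon < pi_plus(h_left) implies Upsilon < pi_plus(h_right) whenever h_left <= h_right,
# i.e. if the left copy goes up, the right copy goes up too.
print(probs == sorted(probs))  # → True
```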
The order at time t is conserved at time t+1. This means that the left configuration remains smaller than the right configuration at all times, up to the time of the coupling. Let us play the same game with the all-down configuration: on the left you have the all-down configuration, and on the right some random initial condition. Once again, you see that the configurations on the left are smaller than the configurations on the right until the coupling time. The same reasoning leads to the same conclusions, and we can also conclude that the all-up configuration is always larger than the all-down configuration until the coupling time. The relation which is preserved by the heat-bath algorithm is called a half-order: the all-up configuration is larger than any other configuration, and the all-down configuration is smaller than any other configuration, but two arbitrary configurations do not necessarily have a relationship between them. So now let us run a large system: on the left the all-down configuration, on the right the all-up configuration. Let us evolve them and see where they differ. After a while they converge; they merge into the same coupled configuration, together with an astronomical number of possible initial configurations, like a galaxy-sized flock of sheep herded in by two shepherd dogs.
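The two-shepherd-dogs argument can be sketched directly: evolve the all-down and all-up copies with the same sites and random numbers, check the half-order at every step, and record the first time the two copies agree. This is a sketch of ours (small lattice, arbitrary seed and temperature), not the course program:

```python
import math
import random

def coupled_step(sig_lo, sig_hi, nbr, beta, k, Upsilon):
    """Update the same site k in both copies with the same random number Upsilon."""
    for sig in (sig_lo, sig_hi):
        h = sum(sig[j] for j in nbr[k])
        sig[k] = 1 if Upsilon < 1.0 / (1.0 + math.exp(-2.0 * beta * h)) else -1

L = 4
N = L * L
nbr = {k: ((k // L) * L + (k + 1) % L, (k + L) % N,
           (k // L) * L + (k - 1) % L, (k - L) % N) for k in range(N)}
beta = 0.4
rng = random.Random(7)
sig_lo = {k: -1 for k in range(N)}  # all down: below every configuration
sig_hi = {k: 1 for k in range(N)}   # all up: above every configuration
t = 0
while sig_lo != sig_hi and t < 10**6:
    coupled_step(sig_lo, sig_hi, nbr, beta, rng.randrange(N), rng.random())
    # the half-order is preserved at every single step:
    assert all(sig_lo[j] <= sig_hi[j] for j in range(N))
    t += 1
print('coupling time for this seed:', t)
```

Once the two extremal copies agree, every intermediate initial configuration has coupled with them too, since it is squeezed between the two at all times.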
The unique configuration you see here is the configuration at which the 2^N initial configurations couple. It is clearly independent of the initial configuration; however, it is not really drawn from the Boltzmann distribution, because the coupling occurs at a configuration which interpolates between the all-up and all-down configurations. The coupling time is a stochastic variable: it depends on the seed and on the choice of k and Upsilon, and its mean value is a little bit longer than the correlation time. We will deepen our understanding of the coupling phenomenon in this week's homework, using a seminal idea by Propp and Wilson from 1996. You will even be able to generate, using this Markov-chain heat-bath algorithm, Boltzmann configurations which are perfectly independent, like the pebbles thrown by the children on the Monte Carlo beach. Before going on, please download, run, and modify the program heatbath_ising_random_map.py that we discussed in this section. Curiously, it is a few lines shorter than the program heatbath_ising.py, and its graphics version allows you to follow two simulations at a time, so you can observe the half-order and the coupling phenomenon.
So, in conclusion, we have discussed in this tutorial the heat-bath algorithm for the Ising model. At first sight it was just another local Markov-chain Monte Carlo algorithm, but then we noticed that, beyond our expectations, it respects a curious half-order principle, and this allowed us to make contact with coupling. Coupling really is the concept that unites the two worlds of Monte Carlo, the children's world of direct sampling and the adults' world of Markov-chain Monte Carlo, as it outputs configurations that have zero correlation with the initial state. Unfortunately, it is impossible to use coupling in a general context; coming up with a new coupling algorithm is the coolest trend in Monte Carlo, just as it is to devise a new slick and shiny cluster algorithm like the one we discussed in the lecture. In this week's homework you will be brought into contact with both of these concepts, and we wish you a lot of fun with Homework Session 8 of Statistical Mechanics: Algorithms and Computations. So, see you next week for a discussion of faster-than-the-clock algorithms in the Ising model.