Monte Carlo method in statistical mechanics

The general motivation for using the Monte Carlo method in statistical physics is to evaluate a multivariable integral.

The typical problem begins with a system whose Hamiltonian is known, which is at a given temperature, and which follows Boltzmann statistics.

To obtain the mean value of some macroscopic variable, say A, the general approach is to compute, over all the phase space, PS for simplicity, the mean value of A using the Boltzmann distribution:

$\langle A \rangle = \int_{PS} A_{\vec{r}}\, \frac{e^{-\beta E_{\vec{r}}}}{Z}\, d\vec{r}$

where $E_{\vec{r}} = E(\vec{r})$ is the energy of the system for the state defined by $\vec{r}$ - a vector with all the degrees of freedom (for instance, for a mechanical system, $\vec{r} = (\vec{q}, \vec{p})$), $\beta \equiv 1/(k_B T)$, and

$Z = \int_{PS} e^{-\beta E_{\vec{r}}}\, d\vec{r}$

is the partition function.

One possible approach to evaluating this multivariable integral is to exactly enumerate all possible configurations of the system and calculate averages at will.

In realistic systems, on the other hand, an exact enumeration can be difficult or impossible to implement.

For those systems, Monte Carlo integration is generally employed (not to be confused with the Monte Carlo method, which is used to simulate molecular chains).

The main motivation for its use is the fact that, with Monte Carlo integration, the error goes as $1/\sqrt{N}$, independently of the dimension of the integral, where N is the number of sampling points.

In the following sections, the general implementation of Monte Carlo integration for solving this kind of problem is discussed.

Under Monte Carlo integration, the mean value defined above can be estimated as

$\langle A \rangle \approx \frac{\sum_{i=1}^{N} A_{\vec{r}_i}\, e^{-\beta E_{\vec{r}_i}}}{\sum_{i=1}^{N} e^{-\beta E_{\vec{r}_i}}}$

where the points $\vec{r}_i$ are uniformly obtained from all the phase space (PS) and N is the number of sampling points (or function evaluations).
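As a toy illustration of this estimator (not part of the original text), the following Python sketch estimates $\langle E \rangle$ for a single harmonic degree of freedom, $E(x) = x^2/2$, by sampling points uniformly from a finite interval and weighting them with the Boltzmann factor; the interval width, $\beta$ and N are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def energy(x):
    # Toy system: a single harmonic degree of freedom, E(x) = x^2 / 2.
    return 0.5 * x**2

beta = 1.0          # inverse temperature (arbitrary choice, k_B = 1)
N = 100_000         # number of uniform sampling points
half_width = 10.0   # the sampled interval must cover the thermally relevant region

# Uniform sampling of the "phase space" [-half_width, half_width].
x = rng.uniform(-half_width, half_width, size=N)
w = np.exp(-beta * energy(x))                 # Boltzmann weights

# Ratio estimator: <A> ~ sum(A * w) / sum(w), here with A = E.
print(np.sum(energy(x) * w) / np.sum(w))      # exact value is 1/(2*beta) = 0.5
```

The estimate approaches the exact value $1/(2\beta)$ as N grows, but only because this one-dimensional "phase space" is trivially small; the following paragraphs explain why uniform sampling is wasteful in realistic systems.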

From all the phase space, some zones are generally more important to the mean of the variable A than others; in particular, the states whose Boltzmann factor $e^{-\beta E_{\vec{r}}}$ is sufficiently high when compared to the rest of the energy spectrum are the most relevant for the integral.

Using this fact, the natural question to ask is: is it possible to choose, with more frequency, the states that are known to be more relevant to the integral?

The answer is yes, and this is the idea of importance sampling. Let $p(\vec{r})$ be a distribution that preferentially generates the states known to be relevant to the integral. The mean value of A can then be rewritten as

$\langle A \rangle = \int_{PS} A'_{\vec{r}}\; p(\vec{r})\, d\vec{r}, \qquad \text{with} \qquad A'_{\vec{r}} = A_{\vec{r}}\, \frac{e^{-\beta E_{\vec{r}}}}{Z\, p(\vec{r})},$

and estimated by

$\langle A \rangle \approx \frac{1}{N} \sum_{i=1}^{N} A'_{\vec{r}_i}$

where $A'_{\vec{r}_i}$ are the sampled values taking into account the importance probability $p(\vec{r})$, and the points $\vec{r}_i$ are now generated according to the distribution $p(\vec{r})$.
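Continuing the same toy problem, the sketch below applies this reweighting with a Gaussian importance distribution $p(x)$. Note that it uses the exact partition function Z, which is only available here because the example is deliberately trivial; in realistic systems this is precisely what the canonical/Metropolis route described next avoids.

```python
import numpy as np

rng = np.random.default_rng(1)

def energy(x):
    # Same toy system as above: E(x) = x^2 / 2.
    return 0.5 * x**2

beta = 1.0
Z = np.sqrt(2.0 * np.pi / beta)    # partition function, known only because this is a toy problem
N = 100_000

# Importance distribution p(x): a zero-mean Gaussian of standard deviation sigma (assumed).
sigma = 2.0
x = rng.normal(0.0, sigma, size=N)
p = np.exp(-x**2 / (2.0 * sigma**2)) / (sigma * np.sqrt(2.0 * np.pi))

# Reweighted values A' = A * exp(-beta * E) / (Z * p); their plain average estimates <A>.
a_prime = energy(x) * np.exp(-beta * energy(x)) / (Z * p)
print(a_prime.mean())              # again approaches 1/(2*beta) = 0.5
```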

Since, most of the time, it is not easy to find a way of directly generating states with a given distribution, the Metropolis algorithm must be used.

Because the most probable states are those that maximize the Boltzmann factor, a natural choice for the importance distribution is the canonical distribution itself, $p(\vec{r}) = e^{-\beta E_{\vec{r}}}/Z$. Substituting it in the previous sum gives

$\langle A \rangle \approx \frac{1}{N} \sum_{i=1}^{N} A_{\vec{r}_i}.$

So, the procedure to obtain the mean value of a given variable with the canonical distribution is to use the Metropolis algorithm to generate states distributed according to $p(\vec{r})$ and then perform a simple average over the values $A_{\vec{r}_i}$.
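A minimal sketch of this procedure for the same toy energy function: the Metropolis algorithm generates states approximately distributed as $e^{-\beta E}/Z$, and $\langle E \rangle$ is then a plain average of the sampled energies. The proposal width, number of steps and discarded fraction are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(2)

def energy(x):
    # Same toy system: E(x) = x^2 / 2.
    return 0.5 * x**2

beta = 1.0
steps = 200_000
step_size = 1.0                    # proposal width (assumed, tuned by hand)

x = 0.0
e = energy(x)
energies = []

for _ in range(steps):
    # Propose a symmetric random displacement ...
    x_new = x + rng.uniform(-step_size, step_size)
    e_new = energy(x_new)
    # ... and accept it with probability min(1, exp(-beta * (E_new - E_old))).
    if rng.random() < np.exp(-beta * (e_new - e)):
        x, e = x_new, e_new
    energies.append(e)

# States are now (approximately) canonically distributed, so <E> is a plain average.
print(np.mean(energies[steps // 10:]))   # discard the first 10% as thermalization
```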

One important issue must be considered when using the Metropolis algorithm with the canonical distribution: when performing a given measurement, i.e. a realization of $\vec{r}_i$, one must ensure that this realization is not correlated with the previous state of the system (otherwise the states are not being "randomly" generated).

On systems with relevant energy gaps, this is the major drawback of using the canonical distribution, because the time needed for the system to de-correlate from the previous state can tend to infinity.
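In practice, the decorrelation time can be estimated from the simulation itself, for example from the integrated autocorrelation time of the measured observable. The sketch below is one crude way of doing this (the simple summation, the `max_lag` value and the use of the `energies` list from the previous sketch are assumptions, not part of the original text); measurements separated by several times the resulting $\tau$ can be treated as effectively independent.

```python
import numpy as np

def autocorrelation(series, max_lag):
    """Normalized autocorrelation C(t)/C(0) of a series of measurements, for lags 0..max_lag."""
    s = np.asarray(series, dtype=float)
    s = s - s.mean()
    var = s.var()
    return np.array([np.mean(s[: len(s) - t] * s[t:]) / var for t in range(max_lag + 1)])

def integrated_autocorrelation_time(series, max_lag=1000):
    """Crude estimate of tau_int = 1/2 + sum_t C(t)/C(0); max_lag must be much smaller than len(series)."""
    c = autocorrelation(series, max_lag)
    return 0.5 + c[1:].sum()

# Example usage with the `energies` list produced by the previous sketch:
# tau = integrated_autocorrelation_time(energies)
# print(tau)   # keep only one measurement every few times tau Metropolis steps
```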

As stated before, the canonical approach has a major drawback, which becomes relevant in most of the systems that use Monte Carlo integration.

For those systems with "rough energy landscapes", the multicanonical approach can be used. It corresponds to a different choice for the importance distribution, $p(\vec{r}) = 1/\Omega(E_{\vec{r}})$, where $\Omega(E)$ is the density of states (DOS). Its main advantage is that the histogram of visited energies is flat, so the simulation does not get trapped by the rough energy landscape; its main drawback is that, for most systems, $\Omega(E)$ is not known beforehand. To overcome this, the Wang and Landau algorithm is normally used to obtain the DOS during the simulation.
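A compact sketch of the Wang and Landau algorithm for the DOS of a small two-dimensional Ising model is given below. The lattice size, the flatness criterion (80%), the number of attempts between flatness checks and the final modification factor are illustrative choices; production runs use larger lattices, tighter tolerances and compiled code, since this pure-Python loop is slow.

```python
import numpy as np

rng = np.random.default_rng(3)

L = 4                              # lattice side (illustrative); N = L * L spins
N = L * L
J = 1.0

spins = rng.choice([-1, 1], size=(L, L))

def total_energy(s):
    # Periodic boundaries; each bond counted exactly once.
    return -J * np.sum(s * (np.roll(s, 1, axis=0) + np.roll(s, 1, axis=1)))

def energy_index(E):
    # Possible energies are E = -2N, -2N + 4, ..., +2N (in units of J).
    return int(round((E + 2 * N) / 4))

n_bins = N + 1
ln_g = np.zeros(n_bins)            # running estimate of ln(Omega(E))
hist = np.zeros(n_bins)            # visit histogram of the current stage
visited = np.zeros(n_bins, dtype=bool)

E = total_energy(spins)
ln_f = 1.0                         # ln of the modification factor, halved at every stage

while ln_f > 1e-3:
    for _ in range(10_000):
        i, j = rng.integers(L), rng.integers(L)
        nb = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
              + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        dE = 2.0 * J * spins[i, j] * nb
        k_old, k_new = energy_index(E), energy_index(E + dE)
        # Wang-Landau acceptance: min(1, Omega(E_old) / Omega(E_new)).
        if rng.random() < np.exp(min(0.0, ln_g[k_old] - ln_g[k_new])):
            spins[i, j] *= -1
            E += dE
        k = energy_index(E)
        ln_g[k] += ln_f            # penalize the energy just visited ...
        hist[k] += 1               # ... and record the visit
        visited[k] = True
    h = hist[visited]
    if h.min() > 0.8 * h.mean():   # "flat enough" histogram over the reachable energies
        hist[:] = 0.0
        ln_f /= 2.0                # refine the modification factor and start a new stage

# Fix the overall constant using the known ground-state degeneracy, Omega(-2N) = 2.
ln_g += np.log(2.0) - ln_g[energy_index(-2 * N)]
print(ln_g[visited])
```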

Note that, after the DOS is known, the mean values of every variable can be calculated for every temperature, since the generation of states does not depend on $\beta$.
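Concretely, once $\ln \Omega(E)$ is available, canonical averages of energy-dependent quantities follow from $\langle A \rangle(\beta) = \sum_E A(E)\, \Omega(E)\, e^{-\beta E} / \sum_E \Omega(E)\, e^{-\beta E}$ for any $\beta$. The sketch below, which assumes the `ln_g`, `visited` and `N` values from the previous sketch, evaluates $\langle E \rangle$ and the heat capacity, shifting the exponents to avoid numerical overflow.

```python
import numpy as np

def thermal_averages(ln_g, energies, beta):
    """Canonical <E> and heat capacity (k_B = 1) from a density of states ln(Omega(E))."""
    log_w = ln_g - beta * energies
    w = np.exp(log_w - log_w.max())          # shift exponents to avoid overflow
    Z = w.sum()
    E_mean = np.sum(energies * w) / Z
    E2_mean = np.sum(energies**2 * w) / Z
    return E_mean, beta**2 * (E2_mean - E_mean**2)

# Example usage with the arrays produced by the Wang-Landau sketch above:
# energies = np.arange(-2 * N, 2 * N + 1, 4, dtype=float)
# for T in (1.0, 2.27, 4.0):                 # any temperatures, from a single simulation
#     print(T, thermal_averages(ln_g[visited], energies[visited], beta=1.0 / T))
```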

As a concrete implementation, consider the two-dimensional Ising model: a square lattice with $N = L^2$ spins $\sigma_i \in \{-1, +1\}$ and energy $E_{\vec{r}} = -J \sum_{\langle i,j \rangle} \sigma_i \sigma_j$, where the sum runs over pairs of neighboring lattice sites and J is the coupling constant. The problem is to compute $\langle M \rangle$ and $\langle M^2 \rangle$, where $M_{\vec{r}} = \sum_{i=1}^{N} \sigma_i$ is the magnetization (for instance, to obtain the magnetic susceptibility of the system), since it is straightforward to generalize to other observables.

First, the system is thermalized using the Metropolis algorithm with the canonical distribution:

step 1.1: perform TT times the following iteration:
step 1.1.1: pick a lattice site at random (with probability 1/N), which will be called i, with spin $\sigma_i$;
step 1.1.2: pick a random number $\alpha \in [0, 1]$;
step 1.1.3: calculate the energy change of trying to flip the spin i: $\Delta E = 2 J \sigma_i \sum_{j \in \mathrm{neigh}(i)} \sigma_j$;
step 1.1.4: flip the spin if $\alpha < \min\left(1, e^{-\beta \Delta E}\right)$;
step 1.1.5: update the several macroscopic variables in case the spin flipped: $E \to E + \Delta E$, $M \to M - 2\sigma_i$.

After TT iterations, the system is considered to be uncorrelated from its previous state, which means that, at this moment, the probability of the system being in a given state follows the Boltzmann distribution, which is the objective of this method. The measurement of M (and $M^2$) can then be performed, and the whole cycle repeated to accumulate the averages.
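A minimal sketch of the whole procedure (thermalization with steps 1.1.1 to 1.1.5, followed by repeated decorrelation and measurement of M and $M^2$) is shown below for a small periodic lattice. The values of L, $\beta$, TT and the number of measurements are illustrative choices, not values from the text; the susceptibility is obtained from the fluctuation relation $\chi = \beta\left(\langle M^2\rangle - \langle M\rangle^2\right)/N$ with $k_B = 1$.

```python
import numpy as np

rng = np.random.default_rng(4)

L = 16                        # lattice side (illustrative); N = L * L spins
N = L * L
J = 1.0
beta = 0.3                    # inverse temperature (illustrative, k_B = 1)
TT = 100 * N                  # spin-flip attempts used to (de)correlate (illustrative)
n_measure = 200               # number of measurements (illustrative)

spins = rng.choice([-1, 1], size=(L, L))
M = spins.sum()

def metropolis_steps(spins, M, n_steps):
    """Perform n_steps iterations of steps 1.1.1 - 1.1.5 and return the updated magnetization."""
    for _ in range(n_steps):
        i, j = rng.integers(L), rng.integers(L)                  # step 1.1.1: random site
        alpha = rng.random()                                     # step 1.1.2: random number
        nb = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
              + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        dE = 2.0 * J * spins[i, j] * nb                          # step 1.1.3: energy change
        if alpha < np.exp(-beta * dE):                           # step 1.1.4: accept/reject
            M -= 2 * spins[i, j]                                 # step 1.1.5: update M ...
            spins[i, j] *= -1                                    # ... and flip the spin
    return M

M = metropolis_steps(spins, M, TT)        # step 1.1: thermalization
m = []
for _ in range(n_measure):
    M = metropolis_steps(spins, M, TT)    # decorrelate between successive measurements
    m.append(M)

m = np.array(m, dtype=float)
print("<M>/N   =", m.mean() / N)
print("<M^2>/N =", (m**2).mean() / N)
print("chi     =", beta * ((m**2).mean() - m.mean()**2) / N)   # magnetic susceptibility
```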

A major drawback of this method with the single-spin-flip choice in systems like the Ising model is that the tunneling time scales as a power law of the number of spins N.

The method thus neglects dynamics, which can be a major drawback or a great advantage.

An additional advantage is that some systems, such as the Ising model, lack a dynamical description and are only defined by an energy prescription; for these the Monte Carlo approach is the only one feasible.

The great success of this method in statistical mechanics has led to various generalizations such as the method of simulated annealing for optimization, in which a fictitious temperature is introduced and then gradually lowered.
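As a simple illustration of that generalization (not part of the original text), the sketch below uses Metropolis moves with a gradually lowered fictitious temperature to minimize a one-dimensional function with several local minima; the cost function, cooling schedule and starting point are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(5)

def cost(x):
    # Toy objective with several local minima (arbitrary choice for illustration).
    return x**2 + 10.0 * np.sin(3.0 * x)

x, c = 5.0, cost(5.0)
T = 10.0                        # initial fictitious temperature (assumed)
cooling = 0.999                 # geometric cooling schedule (assumed)

while T > 1e-3:
    # Same Metropolis move as before, at the current fictitious temperature:
    # always accept downhill moves, accept uphill moves with Boltzmann probability.
    x_new = x + rng.normal(0.0, 0.5)
    c_new = cost(x_new)
    if c_new < c or rng.random() < np.exp(-(c_new - c) / T):
        x, c = x_new, c_new
    T *= cooling                # gradually lower the temperature

print("approximate minimum near x =", x, "with cost", c)
```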