SPIN GLASSES AND OPTIMIZATION IN COMPLEX SYSTEMS

Spin glasses are useless. Even the most imaginative physicists, under pressure to justify their grants, could not find applications for these materials. Yet their study, triggered by pure intellectual interest, has created a formidable new branch of statistical physics, distinguished this year by the Nobel prize awarded to Giorgio Parisi.


Spin glass as an archetypical optimization problem
About fifty years ago, the attention of a few physicists was drawn to the anomalous magnetic response of some special magnetic alloys like Cu-Mn, in which the magnetic moments of the Mn atoms interact in pairs through random exchange couplings which can be ferromagnetic or antiferromagnetic, depending on their distance. Specifically, two magnetic moments s_i and s_j, described as Ising spins taking values ±1, have an interaction energy -J_ij s_i s_j. If the coupling J_ij is positive, the low-energy configurations are those with parallel spins (ferromagnetic situation); if it is negative, the energy is lower when they are antiparallel (antiferromagnetic). In a spin glass, where both types of interactions are present, finding the lowest-energy configuration, the "ground state", among the 2^N configurations of N spins is very hard. It is in fact an example of a so-called NP-hard problem: there is no known algorithm that can find the ground state in a computer time growing like a power of N; all known algorithms are exponential (and all algorithms are exponential if the famous conjecture P≠NP is correct). Physically, the relaxation time of spin glasses increases very rapidly when lowering the temperature, and in the spin-glass phase one cannot reach equilibrium.
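To make the exponential search space concrete, here is a minimal brute-force sketch (my own illustration, not from the article): it draws Gaussian couplings J_ij, as in the ensembles discussed later, and enumerates all 2^N configurations of a small system to find the ground state. All names and parameters are illustrative.

```python
import itertools
import random

def ground_state(J, N):
    """Exhaustively search all 2^N Ising configurations for the minimum of
    E(s) = -sum_{i<j} J[i][j] * s[i] * s[j]."""
    best_E, best_s = float("inf"), None
    for s in itertools.product([-1, +1], repeat=N):
        E = -sum(J[i][j] * s[i] * s[j]
                 for i in range(N) for j in range(i + 1, N))
        if E < best_E:
            best_E, best_s = E, s
    return best_E, best_s

random.seed(0)
N = 12                       # already 4096 configurations: the 2^N wall
J = [[random.gauss(0, 1) for _ in range(N)] for _ in range(N)]
E, s = ground_state(J, N)
```

Already at N = 12 the enumeration visits 4096 configurations; each additional spin doubles the work, which is why this approach fails in practice beyond a few tens of spins.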
Searching for the ground state is only one of the many challenges of spin glasses. One also wants to understand the properties of spin configurations when the system is at equilibrium at a finite temperature, what type of random order sets up in the low-temperature spin-glass phase, and what is the nature of the phase transition. At low temperature, the local magnetization m_i of spin i, the expectation value of s_i, is non-zero. But the magnetizations m_i are all distinct. One can thus aim at finding all the m_i. At the mean-field level one can write self-consistent equations that relate all these magnetizations, called "TAP equations" [3]. If they can be solved efficiently, this provides an algorithm.
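A minimal numerical sketch of such self-consistent equations (my own illustration, using the naive mean-field form and omitting the Onsager reaction term that distinguishes the true TAP equations; all parameters are illustrative):

```python
import math
import random

def mean_field_magnetizations(J, beta, iters=200, damping=0.5):
    """Damped fixed-point iteration of the naive mean-field equations
    m_i = tanh(beta * sum_j J[i][j] * m_j).  The full TAP equations add an
    Onsager reaction term inside the tanh, omitted here for brevity."""
    N = len(J)
    m = [random.uniform(-0.1, 0.1) for _ in range(N)]
    for _ in range(iters):
        new = [math.tanh(beta * sum(J[i][j] * m[j] for j in range(N) if j != i))
               for i in range(N)]
        m = [damping * old + (1 - damping) * x for old, x in zip(m, new)]
    return m

random.seed(1)
N, beta = 30, 0.3        # high temperature: paramagnetic phase, m_i ≈ 0
J = [[random.gauss(0, 1) / math.sqrt(N) for _ in range(N)] for _ in range(N)]
for i in range(N):       # symmetrize the couplings: J_ij = J_ji
    for j in range(i):
        J[i][j] = J[j][i]
m = mean_field_magnetizations(J, beta)
```

At this high temperature the iteration converges to the trivial paramagnetic solution m_i ≈ 0; at low temperature many non-trivial fixed points appear, which is precisely the multiplicity of states discussed below.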

Disordered systems: a new chapter of statistical physics
This spin-glass mystery was solved at the beginning of the eighties [1,2]. Its solution required three major conceptual developments of statistical physics.
The first one is taming the disorder. In order to describe a given sample of a spin glass, one should give the values of the interaction couplings between all pairs of spins. If the interactions are short range, the number of such couplings is proportional to the number of spins, which is of the order of Avogadro's number N. In more general mathematical versions with long-range interactions, the number of couplings grows like N^2. In both cases the detailed description of a sample is impossible. Fortunately, the experimental behaviour of spin glasses, like their magnetic response to an external field or their specific heat, does not depend on all these details: all samples of Cu-Mn with 1% of Mn atoms behave the same, provided they are well prepared. Mathematically, one introduces ensembles of spin glasses, like the one where all couplings are sampled independently from a Gaussian distribution, and one proves that in the large-size limit all samples have the same thermodynamic behaviour. Yet all samples are microscopically distinct, and each has a distinct ground state, which is NP-hard to find, which means impossible in practice even for systems of moderate size with a few thousand spins.
The second challenge was to identify the right order parameters for describing the spin-glass order. At low temperature each spin tends to point in a favorite direction, and the local magnetizations m_i introduced above provide the natural microscopic order parameters.
The third main challenge is the multiplicity of states: there actually exist many ways in which the spins can freeze. A more macroscopic order parameter considers all the possible spontaneous orders of the spins (all solutions of the TAP equations) and provides a statistical description of how they differ. This geometry of the space of solutions is the one that is encoded in the famous replica solution of Parisi. The precise link between this macroscopic order parameter and the microscopic magnetizations of TAP was understood through the cavity method [2], which also opened the way to rigorous mathematical proofs of the validity of the Parisi solution [4].

Optimizing with spin glass methods
It was soon realised that the new statistical mechanics of disordered systems developed for spin glasses might have applications in many different fields [2]. In fact, large-size systems of many "atoms" (in the Greek sense) interacting with disordered potentials are ubiquitous in science, including the social sciences. In the theory of optimisation problems, phase transitions have now become an important chapter, and the theoretical methods invented in spin glass theory, mostly the replica and cavity methods, provide very useful tools for their analysis.
In parallel, the development of appropriate mean-field equations for glassy systems with short-range interactions [2,5] has opened the way to powerful new types of message passing algorithms, which have become important in information theory and signal processing. They provide a fast and distributed way of estimating the marginal probability of each variable. In the general case they rely on a mean-field type of approximation that neglects some correlations, but in some well-designed problems like those appearing in information theory, they can become exact in the relevant limit of large-size systems.

The three phases of constraint satisfaction problems
One remarkable example of fruitful interactions concerns constraint satisfaction problems [5,6]. Deciding the satisfiability of a Boolean formula is an NP-hard optimization problem; actually it is at the root of the theory of NP-hardness. In random satisfiability problems, where the clauses are generated randomly, one finds a phase transition in the thermodynamic limit N,M→∞, keeping the ratio of clauses to variables, α = M/N, fixed. Spin-glass based methods [7] made it possible to locate precisely this phase transition, which separates a regime of low density of constraints, α < α_s, where almost all problems have a solution, from a regime α > α_s where almost all problems have no solution. But the most surprising result obtained from the spin-glass analysis is the existence of another phase transition at a value α_c < α_s, due to a major change in the geometry of the space of solutions. In a generic random satisfiability formula with a density of constraints α < α_c, all the solutions (exponentially many in absolute number, but exponentially rare with respect to the full space) build a connected cluster, and one can jump from one solution to the next by changing the assignment of one well-chosen variable. Instead, in the intermediate range α_c < α < α_s, the space of solutions is shattered into many disconnected clusters, which are well separated from each other.
Importantly, this clustering transition in the geometry of solution space is correlated with the practical difficulty of finding fast algorithms for solving random satisfiability formulas. When α < α_c the space of solutions is connected and there exist algorithms (for instance those based on mean-field equations) that are able to find a solution in polynomial time: the generic problem is easy. In the intermediate regime α_c < α < α_s we know that solutions exist, but the algorithms able to find them (like the enumeration of all 2^N possible assignments of the variables) take an exponential time: the generic problem is solvable in principle, but it is hard from the computational point of view: in practice we have no efficient algorithms. When α > α_s there are no solutions.
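On very small instances one can watch the α-dependence of the solution count directly by exhaustive enumeration. The sketch below is my own toy illustration (the (i, sign) literal encoding and all parameters are assumptions, not the article's code):

```python
import itertools
import random

def random_ksat(N, M, K, rng):
    """M random K-clauses over N Boolean variables.  A literal is a pair
    (i, sign): the clause is satisfied if variable i takes the value sign."""
    return [[(i, rng.choice([False, True])) for i in rng.sample(range(N), K)]
            for _ in range(M)]

def count_solutions(N, formula):
    """Exhaustive count of satisfying assignments: O(2^N), feasible only for
    small N -- exactly the exponential enumeration discussed in the text."""
    return sum(
        all(any(bits[i] == sign for i, sign in clause) for clause in formula)
        for bits in itertools.product([False, True], repeat=N))

rng = random.Random(2)
N = 15
counts = {alpha: count_solutions(N, random_ksat(N, int(alpha * N), 3, rng))
          for alpha in (1.0, 3.0, 6.0)}   # alpha_s ≈ 4.27 for random 3-SAT
```

Typically the count collapses from exponentially many solutions at α = 1 to zero above the SAT-UNSAT threshold, here probed at α = 6.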

Towards a physical theory of algorithmic complexity?
This pattern with three phases, in which the solution of a constraint satisfaction problem is easy at low constraint density, then becomes algorithmically hard in an intermediate regime, and impossible in the high-constraint-density phase, has been found in many optimization problems. It is generally associated with a sudden shattering transition of the space of solutions. In physics language, the spin glass models which have this property are the ones with a discontinuous glass transition (sometimes called a "one step replica symmetry breaking" transition). The existence of well-separated clusters of solutions is now seen as a possible reason for algorithmic hardness, and opens interesting routes for new approaches to algorithmic intractability [8]. In contrast to the standard classification of problems as P versus NP, which is based on a worst-case analysis, this new approach deals with "typical case" complexity, namely what happens in almost all instances generated from some given distribution.

A lesson
From the original magnetic anomaly in some "useless" alloys to its numerous applications in optimization and in so many other fields that I could not describe here, it has been a long way. This story shows once again that interesting research initially driven by pure intellectual interest can find fascinating developments in totally unexpected areas.

BOX 1: MESSAGE PASSING ALGORITHMS
Mean field methods go back to more than a century ago, with the work of Pierre Weiss to understand the basic mechanism of ferromagnetism. In the last two decades it has been found how to write mean field equations, called in this context belief propagation or BP equations, for a very broad class of constraint satisfaction problems. These are problems in which N variables interact by groups of K: the joint probability distribution of the N variables is expressed as a product of factors, each involving K variables. The correlation structure is best understood in terms of a factor graph (see Fig. 1). In the thermodynamic limit N→∞ at fixed K, the mean field equations can be written as messages passed between the vertices of the graph, from variable to factor and from factor to variable. Solving them iteratively provides a new class of powerful algorithms.
FIG. 1: Left: an example of a factor graph: the probability law of the four variables is written as a product of 4 factors: P(x_1,x_2,x_3,x_4) = Ψ_a(x_1,x_2,x_4) Ψ_b(x_2,x_3,x_4) Ψ_c(x_1,x_2,x_3) Ψ_d(x_1,x_3,x_4). Middle: the message m_{1→a} is the probability of x_1 if the factor a is absent. Right: the message m_{c→1} is the probability of x_1 if it is connected only to c. The BP equations relate these various messages, the message going out from a node being computed from the incoming messages.
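A minimal sum-product sketch of such BP equations, for binary variables on a small factor graph (my own toy implementation, exact when the graph is a tree; all names and the example factors are illustrative):

```python
import itertools
from collections import defaultdict

def belief_propagation(factors, n_vars, iters=30):
    """Sum-product message passing for binary variables.  `factors` is a list
    of (vars, table): `vars` a tuple of variable indices, `table` a dict from
    assignments of those variables to non-negative weights.  Returns the
    approximate marginals P(x_i = 1); exact when the factor graph is a tree."""
    msg = defaultdict(lambda: [1.0, 1.0])   # keys: ('v', i, a) and ('f', a, i)
    for _ in range(iters):
        new = {}
        for a, (vs, table) in enumerate(factors):
            for i in vs:
                # factor a -> variable i: sum the factor over the other
                # variables, weighted by incoming variable -> factor messages
                out = [0.0, 0.0]
                for assign, w in table.items():
                    for j, xj in zip(vs, assign):
                        if j != i:
                            w *= msg[('v', j, a)][xj]
                    out[assign[vs.index(i)]] += w
                z = sum(out)
                new[('f', a, i)] = [o / z for o in out]
        for a, (vs, _) in enumerate(factors):
            for i in vs:
                # variable i -> factor a: product of messages from other factors
                out = [1.0, 1.0]
                for b, (ws, _) in enumerate(factors):
                    if b != a and i in ws:
                        out = [o * m for o, m in zip(out, msg[('f', b, i)])]
                z = sum(out)
                new[('v', i, a)] = [o / z for o in out]
        msg.update(new)
    marginals = []
    for i in range(n_vars):
        belief = [1.0, 1.0]
        for a, (vs, _) in enumerate(factors):
            if i in vs:
                belief = [x * m for x, m in zip(belief, msg[('f', a, i)])]
        marginals.append(belief[1] / sum(belief))
    return marginals

# A small chain: a bias on x0 plus two "agreement" factors -- a tree, so exact.
agree = {(0, 0): 2.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 2.0}
factors = [((0,), {(0,): 1.0, (1,): 3.0}), ((0, 1), agree), ((1, 2), agree)]
marginals = belief_propagation(factors, 3)
```

Note the locality: each message is computed only from the messages arriving at the same node, which is what makes the scheme fast and distributed.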

BOX 2: K-SATISFIABILITY
A K-SAT formula is a conjunction (an AND function) Φ = C_1 ∧ C_2 ∧ … ∧ C_M of M clauses, where each clause is a disjunction (an OR) of K of the Boolean variables x_1,…,x_N or their negations. For instance, the clause C = x_1 ∨ x_2 is TRUE unless x_1 and x_2 are both FALSE. The formula Φ is satisfiable if and only if there exists an assignment of the variables that satisfies all the clauses; such an assignment is then called a solution of the satisfiability problem. In random K-SAT, one wants to satisfy M = αN clauses, each involving K randomly chosen variables or their negations.
FIG. 2: The phase diagram of random K-SAT. When the density of constraints α increases, one goes from a low-constraint "EASY" phase where the space of solutions is connected (one can move from one solution to the next by flipping one variable at a time), to an intermediate "HARD" phase where the solution space is shattered into many pieces very far away from each other. At high constraint density, there are no solutions. α_s is the "SAT-UNSAT" phase transition, while α_c is a geometrical phase transition where efficient algorithms get stuck.
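As an illustration of the kind of local-search algorithm that "gets stuck" at high α, here is a WalkSAT-style sketch (my own toy illustration, not an algorithm from the article; the (i, sign) literal convention and all parameters are assumptions):

```python
import random

def walksat(formula, n_vars, rng, max_flips=10000, p=0.5):
    """WalkSAT-style local search.  A literal is (i, sign): variable i must
    equal sign.  Start from a random assignment; while some clause is unsat,
    pick one such clause and flip one of its variables: a random one with
    probability p, otherwise the flip leaving the fewest unsatisfied clauses."""
    assign = [rng.random() < 0.5 for _ in range(n_vars)]

    def n_unsat():
        return sum(1 for c in formula
                   if not any(assign[i] == sign for i, sign in c))

    for _ in range(max_flips):
        unsat = [c for c in formula
                 if not any(assign[i] == sign for i, sign in c)]
        if not unsat:
            return assign                      # found a solution
        clause = rng.choice(unsat)
        if rng.random() < p:
            var = rng.choice(clause)[0]        # random walk move
        else:
            def cost(i):                       # unsat clauses after flipping i
                assign[i] = not assign[i]
                c = n_unsat()
                assign[i] = not assign[i]
                return c
            var = min((i for i, _ in clause), key=cost)   # greedy move
        assign[var] = not assign[var]
    return None                                # gave up: no solution found

# (x0 OR x1) AND (NOT x0 OR x1): satisfied by any assignment with x1 = True
formula = [[(0, True), (1, True)], [(0, False), (1, True)]]
solution = walksat(formula, 2, random.Random(3))
```

On easy instances such a search finds a solution in a handful of flips; in the clustered "HARD" phase it typically wanders without ever satisfying all the clauses, which is the behaviour sketched in Fig. 2.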