Genius on the edge of chaos

The brain's ability to assimilate new information depends on its being in a state of 'self-organised criticality'. Researchers believe this allows the brain to adapt to new situations.
Have you ever experienced that eerie feeling of a thought popping into your head as if from nowhere, with no clue as to why you had that particular idea at that particular time? You may think that such fleeting thoughts, however random they seem, must be the product of predictable and rational processes. After all, the brain cannot be random, can it? Surely it processes information using ordered, logical operations, like a powerful computer?
Actually, no. In reality, your brain operates on the edge of chaos. Though much of the time it runs in an orderly and stable way, every now and again it suddenly and unpredictably lurches into a blizzard of noise. Neuroscientists have long suspected as much. Only recently, however, have they come up with proof that brains work this way. Now they are trying to work out why. Some believe that near-chaotic states may be crucial to memory, and could explain why some people are smarter than others.
In technical terms, systems on the edge of chaos are said to be in a state of "self-organised criticality". These systems are right on the boundary between stable, orderly behaviour - such as a swinging pendulum - and the unpredictable world of chaos, as exemplified by turbulence. The quintessential example of self-organised criticality is a growing sand pile. As grains build up, the pile grows in a predictable way until, suddenly and without warning, it hits a critical point and collapses. These "sand avalanches" occur spontaneously and are almost impossible to predict, so the system is said to be both critical and self-organising.
Self-organised criticality has another defining feature: even though individual avalanches are impossible to predict, their overall distribution is regular. The avalanches are "scale invariant", which means that avalanches of all possible sizes occur. They also follow a "power law" distribution, which means bigger avalanches happen less often than smaller avalanches, according to a strict mathematical ratio.
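The sand-pile behaviour described above is easy to simulate. The sketch below is purely illustrative (a standard Bak-Tang-Wiesenfeld-style model, not code from any study mentioned in this article, and all the names in it are my own): grains are dropped at random, any cell holding four or more grains topples onto its neighbours, and the number of topples triggered by each drop is recorded as one avalanche. Counting how often each size occurs shows the pattern the article describes: smaller avalanches are far more common than larger ones.

```python
import random

def simulate_sandpile(size=20, grains=20000, seed=1):
    """Drop grains one at a time onto a grid. A cell holding 4+ grains
    topples, sending one grain to each of its four neighbours; grains
    falling off the edge are lost. The total number of topples caused
    by each drop is recorded as one 'avalanche'."""
    random.seed(seed)
    grid = [[0] * size for _ in range(size)]
    sizes = []
    for _ in range(grains):
        x, y = random.randrange(size), random.randrange(size)
        grid[y][x] += 1
        topples = 0
        unstable = [(x, y)] if grid[y][x] >= 4 else []
        while unstable:
            cx, cy = unstable.pop()
            if grid[cy][cx] < 4:
                continue  # already relaxed by an earlier topple
            grid[cy][cx] -= 4
            topples += 1
            for nx, ny in ((cx+1, cy), (cx-1, cy), (cx, cy+1), (cx, cy-1)):
                if 0 <= nx < size and 0 <= ny < size:
                    grid[ny][nx] += 1
                    if grid[ny][nx] >= 4:
                        unstable.append((nx, ny))
        sizes.append(topples)
    return sizes

sizes = simulate_sandpile()
c1, c10, c100 = sizes.count(1), sizes.count(10), sizes.count(100)
print(c1, c10, c100)  # counts fall off steeply as avalanche size grows
```

Individual avalanches remain unpredictable - the same code with a different seed produces a different sequence - but the overall shape of the size distribution is stable, which is exactly the statistical regularity the passage describes.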
The brain has features in common with such systems. Networks of brain cells - neurons - alternate between periods of calm and periods of instability - "avalanches" of electrical activity that cascade through the neurons. Like real avalanches, exactly how these cascades occur and the resulting state of the brain are unpredictable. "Lying at the critical point allows the brain to rapidly adapt to new circumstances," says Andreas Meyer-Lindenberg from the Central Institute of Mental Health in Mannheim, Germany.
In 2003, John Beggs of Indiana University in Bloomington began investigating spontaneous electrical activity in slices of rat brain tissue. He found that these neural avalanches are scale invariant and that their size obeys a power law. Importantly, the ratio of large to small avalanches fitted the predictions of the computational models that had first suggested that the brain might be in a state of self-organised criticality.
More recently, it has become clear that brain activity also shows signs of self-organised criticality on a larger scale. As it processes information, the brain often synchronises large groups of neurons to fire at the same frequency, a process called "phase-locking". Like broadcasting different radio stations at different frequencies, this allows different "task forces" of neurons to communicate among themselves without interference from others.
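Phase-locking of this kind is often illustrated with the Kuramoto model of coupled oscillators - a textbook idealisation, not the brain-imaging work described in this article, and the code below is an illustrative sketch with invented names. A group of oscillators with similar natural frequencies pulls itself into synchrony when the coupling between them is strong enough, and stays incoherent without coupling; the order parameter r measures how tightly the group is locked (r near 1 means synchronised, r near 0 means incoherent).

```python
import math
import random

def kuramoto_group(n=20, coupling=2.0, base_freq=1.0,
                   steps=2000, dt=0.01, seed=0):
    """Mean-field Kuramoto model for one 'task force' of n oscillators.

    Each oscillator has a natural frequency near base_freq and is
    nudged toward the group's mean phase with the given coupling
    strength. Returns the final order parameter r."""
    random.seed(seed)
    freqs = [base_freq + random.uniform(-0.1, 0.1) for _ in range(n)]
    phases = [random.uniform(0, 2 * math.pi) for _ in range(n)]
    for _ in range(steps):
        mean_sin = sum(math.sin(p) for p in phases) / n
        mean_cos = sum(math.cos(p) for p in phases) / n
        mean_phase = math.atan2(mean_sin, mean_cos)
        r = math.hypot(mean_cos, mean_sin)
        # dtheta/dt = w + K * r * sin(mean_phase - theta)
        phases = [p + dt * (w + coupling * r * math.sin(mean_phase - p))
                  for p, w in zip(phases, freqs)]
    mean_sin = sum(math.sin(p) for p in phases) / n
    mean_cos = sum(math.cos(p) for p in phases) / n
    return math.hypot(mean_cos, mean_sin)

r_locked = kuramoto_group(coupling=2.0)
r_loose = kuramoto_group(coupling=0.0)
print(round(r_locked, 2), round(r_loose, 2))  # strong coupling drives r toward 1
```

Two such groups running at different base frequencies would stay internally synchronised yet out of step with each other - the "different radio stations" picture in the passage.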
The brain also constantly reorganises its task forces, so the stable periods of phase-locking are interspersed with unstable periods in which the neurons fire out of sync in a blizzard of activity. This, again, is reminiscent of a sand pile. Could it be another example of self-organised criticality? In 2006, Prof Meyer-Lindenberg and his team made the first stab at answering that question. They used brain scans to map the connections between regions of the human brain and discovered that they form a "small-world network" - exactly the right architecture to support self-organised criticality.
Small-world networks lie somewhere between regular networks, where each node is connected to its nearest neighbours, and random networks, which have no regular structure but many long-distance connections between nodes at opposite sides of the network. For the brain, it is the perfect compromise. One of the characteristics of small-world networks is that you can get from any node to any other part of the network through just a few intermediate nodes - the "six degrees of separation" reputed to link any two people in the world. In the brain, the number is 13.
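The regular-versus-small-world contrast can be made concrete with a Watts-Strogatz-style sketch. This is an illustrative toy model with invented function names, not the brain-imaging analysis described here (the 13-degrees figure comes from that data, not from this network): starting with a ring lattice and rewiring a fraction of its edges into random long-range shortcuts sharply cuts the average number of steps between nodes.

```python
import random
from collections import deque

def ring_lattice(n, k):
    """Regular network: each node links to its k nearest neighbours on each side."""
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for j in range(1, k + 1):
            adj[i].add((i + j) % n)
            adj[(i + j) % n].add(i)
    return adj

def rewire(adj, p, seed=0):
    """With probability p, replace an edge with a random long-range shortcut."""
    random.seed(seed)
    n = len(adj)
    edges = [(a, b) for a in adj for b in adj[a] if a < b]
    for a, b in edges:
        if random.random() < p:
            c = random.randrange(n)
            if c != a and c not in adj[a]:
                adj[a].discard(b); adj[b].discard(a)
                adj[a].add(c); adj[c].add(a)
    return adj

def avg_path_length(adj):
    """Mean shortest-path length over reachable node pairs (BFS from each node)."""
    total, pairs = 0, 0
    for src in adj:
        dist = {src: 0}
        queue = deque([src])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        total += sum(dist.values())
        pairs += len(dist) - 1
    return total / pairs

regular = avg_path_length(ring_lattice(200, 3))
small_world = avg_path_length(rewire(ring_lattice(200, 3), p=0.1))
print(round(regular, 1), round(small_world, 1))  # shortcuts shrink the path length
```

Rewiring only a tenth of the edges is enough: most connections stay local, yet the few shortcuts let signals reach distant parts of the network in a handful of hops, which is what makes this architecture such a good compromise.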
Prof Meyer-Lindenberg created a computer simulation of a small-world network with 13 degrees of separation. The results confirmed that the brain has just the right architecture for its activity to sit on the tipping point between order and disorder. The clinching evidence arrived earlier this year, when Ed Bullmore of the University of Cambridge and his team used brain scanners to record neural activity in 19 human volunteers. When the team then tried to reproduce the activity they saw in the volunteers' brains in computer models, they found that they could do so only if the models were in a state of self-organised criticality.
The work of Prof Bullmore's team is compelling evidence that self-organised criticality is an essential property of brain activity, says David Liley, a neuroscientist at Swinburne University of Technology in Melbourne, Australia. But why should that be? Perhaps because self-organised criticality is the perfect starting point for many of the brain's functions. The neuronal avalanches that Prof Beggs investigated, for example, are perfect for transmitting information across the brain. "One of the advantages of self-organised criticality is that the avalanches can propagate over many links," he says.
Self-organised criticality also appears to allow the brain to adapt to new situations. "The closer we get to the boundary of instability, the more quickly a particular stimulus will send the brain into a new state," says Prof Liley. It may also play a role in memory. Prof Beggs's team noticed that certain chains of neurons would fire repeatedly in avalanches, sometimes over several hours. Because an entire chain can be triggered by the firing of one neuron, these chains could be the stuff of memory, argues Prof Beggs: memories may come to mind unexpectedly when a single neuron fires at random, or be triggered unpredictably by a neuronal avalanche.
Hovering on the edge of chaos provides brains with their amazing capacity to process information and rapidly adapt to our ever-changing environment, but what happens if we stray either side of the boundary? The most obvious assumption would be that all of us are a step away from mental illness. Prof Meyer-Lindenberg suggests that schizophrenia may be caused by parts of the brain straying away from the critical point. However, for now that is purely speculative.
"They say it's a fine line between genius and madness," says Prof Liley. "Maybe we're finally beginning to understand the wisdom of this statement." www.newscientist.com