In the brain, function follows form

Physics 15, 93

By interpreting magnetic resonance images in the context of network control theory, researchers try to explain the dynamics of the brain in terms of structure, information content and energy.

Figure 1: A diffusion tensor image (DTI) showing some of the brain’s key connections. (Credit: USC Mark and Mary Stevens Neuroimaging and Informatics Institute)

Developing a physics for the brain is a daunting task that has occupied scientists since large-scale measurements of brain activity became available half a century ago. Advances in such measurement techniques continue to drive rapid progress in the field. Magnetic resonance imaging (MRI) in particular provides two extremely valuable types of data. First, a variant of MRI called diffusion tensor imaging (DTI) provides a way to map the brain’s key connections, the brain’s physical “wiring” (Fig. 1). Second, functional MRI (fMRI) can measure where in the brain activity has just occurred by detecting what is called the blood-oxygen-level-dependent (BOLD) signal. Leon Weninger of the University of Pennsylvania and colleagues have now combined these methods with tools from network control theory to describe brain dynamics in terms of the information content of specific patterns of brain activity, or “brain states,” and of the energy costs of transitions between such states [1]. The work offers exciting new strategies, derived from physics, for interpreting brain structure and function.

The structure and activity measures provided by DTI and BOLD fMRI allow scientists to analyze two basic properties of the brain. First, to function properly, the brain probably needs to observe itself: parts of the brain must be able to estimate the state of other parts in order to reconstruct what is happening both inside and outside the brain. In the language of control engineering, parts of the brain must be observable. The brain also needs to control parts of itself in order to read this article, generate speech and movement, and deliberately recall memories [2]. Rudolf Kalman first defined the concepts of observability and controllability for linear systems in 1960 [3]. In 1974, Ching-Tai Lin extended Kalman’s theory to determine which network topologies are structurally controllable, asking how the absence of connections in parts of a network would render it uncontrollable in Kalman’s sense [4]. But brains are floridly nonlinear, and establishing observability and controllability for complex nonlinear networks is much harder than for linear systems [5]. Analyzing nonlinear observability and controllability requires more advanced mathematics, such as Lie derivatives and Lie brackets, as well as group-theoretic concepts, since symmetries can destroy observability and controllability in fascinating ways [6].
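Kalman’s criterion can be illustrated in a few lines. The sketch below (the three-node chain and the choice of driven node are hypothetical, not from the study) applies the rank test to show that *where* a control input enters a network determines whether the network is controllable:

```python
import numpy as np

# Kalman rank test: the linear network dx/dt = A x + B u is controllable
# iff the controllability matrix [B, AB, ..., A^(n-1) B] has full rank n.
# Toy directed 3-node chain, with signals flowing 2 -> 1 -> 0.
A = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [0.0, 0.0, 0.0]])

def ctrb_rank(A, B):
    """Rank of the Kalman controllability matrix [B, AB, ..., A^(n-1) B]."""
    n = A.shape[0]
    C = np.hstack([np.linalg.matrix_power(A, k) @ B for k in range(n)])
    return np.linalg.matrix_rank(C)

sink = np.array([[1.0], [0.0], [0.0]])    # drive node 0, the chain's sink
source = np.array([[0.0], [0.0], [1.0]])  # drive node 2, the chain's source

print(ctrb_rank(A, sink))    # 1: driving the sink cannot steer the chain
print(ctrb_rank(A, source))  # 3: driving the source controls every node
```

The same network is controllable from one node and not from another, which is exactly the topology-dependent behavior Lin’s structural controllability addresses.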

Weninger and his colleagues investigate the controllability of brain activity by asking how the brain’s connectivity constrains transitions from one brain state to another. They characterize a given state through a measure of information (how likely the state is to be observed across a set of brain regions), and they use a fundamental result derived from Kalman’s work to determine whether the connectivity of a network makes it controllable. In particular, they use a controllability Gramian, a matrix that combines the network topology with the control input. Such controllability places critical constraints on the brain’s state transitions.
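As a rough illustration of the Gramian-based idea (a minimal sketch, not the authors’ actual pipeline; the connectivity weights below are made up), one can compute the controllability Gramian of a small stabilized network and read off summary statistics such as the trace, a common proxy for “average controllability”:

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# For a stable system dx/dt = A x + B u, the controllability Gramian
# W = ∫ e^{At} B B^T e^{A^T t} dt solves the Lyapunov equation
# A W + W A^T + B B^T = 0. Its trace summarizes average controllability;
# its smallest eigenvalue flags the hardest-to-reach directions.
adj = np.array([[0.0, 0.5, 0.2],
                [0.5, 0.0, 0.4],
                [0.2, 0.4, 0.0]])   # hypothetical structural connectivity
A = adj - (np.max(np.abs(np.linalg.eigvals(adj))) + 1.0) * np.eye(3)  # stabilize
B = np.array([[1.0], [0.0], [0.0]])  # control input injected at region 0 only

W = solve_continuous_lyapunov(A, -B @ B.T)
print("average controllability (trace):", np.trace(W))
print("worst-case direction (min eigenvalue):", np.linalg.eigvalsh(W).min())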

The scale and ambition of this project are mind-boggling. The team applies their approach to a massive dataset from the Human Connectome Project [7]. They divide the fMRI scans into brain-region parcels and quantify the Shannon information of an activity pattern as the logarithm of the inverse probability of reaching that state while resting versus while performing a series of cognitive tasks. States of high information content are those that are statistically rare at rest. The researchers then calculate the energy needed to transition from one state to another. They report several main findings: (1) the information content depends on the cognitive context (compared with motor tasks, social tasks are very demanding!); (2) the energy required to switch to rare, high-information states is greater than that required to switch between common, low-information states; and (3) the state transitions show that the brain’s wiring is optimized to make this dynamical system efficient: average controllability was found to correlate with the ease of transition to information-rich states.
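The two core quantities, state information and transition energy, can be sketched with toy numbers. Everything below (the state probabilities, the dynamics matrix, and the target state) is hypothetical and only illustrates the form of the calculation:

```python
import numpy as np
from scipy.linalg import expm

# (1) Shannon information of a brain state: -log2 of its occurrence
# probability at rest, so statistically rare states carry more bits.
p_rest = {"default": 0.55, "motor": 0.20, "social": 0.01}  # hypothetical
info = {state: -np.log2(p) for state, p in p_rest.items()}
print(info)  # the rare "social" state has the highest information content

# (2) Minimum control energy to steer x0 -> xf over time T, using the
# finite-horizon Gramian W(T) = ∫_0^T e^{At} B B^T e^{A^T t} dt
# (approximated here by a Riemann sum).
A = np.array([[-1.0, 0.5],
              [0.3, -1.2]])   # toy 2-region dynamics matrix
B = np.eye(2)                 # every region receives control input
T, steps = 1.0, 2000
dt = T / steps
W = sum(expm(A * t) @ B @ B.T @ expm(A.T * t)
        for t in np.linspace(0.0, T, steps)) * dt

x0 = np.zeros(2)              # resting baseline
xf = np.array([1.0, -0.5])    # target activity pattern
v = xf - expm(A * T) @ x0
energy = v @ np.linalg.solve(W, v)  # minimum-energy cost of the switch
print(energy)
```

In this framing, a harder-to-reach (more improbable) target state corresponds to a direction poorly served by the Gramian, and the quadratic form above grows accordingly.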

Many questions beyond the scope of the study are worth asking for those who want to think further about this characterization of thought. For example, do the high-energy transitions to high-information states reflect cognitive effort? Certainly, as I write this article, my brain is engaged in tasks more complex than we would expect from a statistical ensemble of elements in a Boltzmann-like distribution of energy states. But can such an information- and control-theoretic description of brain activity and mental effort shed new light on cognitive dysfunction and mental health? An information-based dynamic biomarker for cognitive impairment would be very useful if it emerged from such a framework.

However, some limitations of the study should be kept in mind. One relates to the detailed mechanics behind the brain’s energy balance. The brain is an open system that consumes about 20% of the body’s resting metabolic energy. Most of that energy is spent restoring ion gradients across nerve cell membranes and repackaging neurotransmitters after activity [8]. During activity, that stored energy is dissipated almost immediately (in milliseconds); the brain then recharges its energy stores more slowly (in seconds). The BOLD fMRI signal reflects this slower replenishment, not the rapid dissipation that occurs during brain activity. It is not yet known how the metabolic energy required to reach a high-information state relates to the improbability of reaching that state through stochastic, resting activity.

Another limitation arises from the maximal spatial resolution currently achievable in MRI: both the BOLD signal and the DTI pathways measure local regions of interest much larger than individual neurons. There is therefore a huge amount of subgrid physics going on in the brain that these measures do not capture. In addition, much of the brain’s connectivity is one-way along a particular nerve fiber or bundle: there is no reversibility or detailed balance in these ensemble dynamics. And an important characteristic of cognitive function is synchronization [9], which is typically measured electrically; yet synchrony implies symmetries in networks, and such symmetry can destroy controllability [10].

The work of Weninger and colleagues provides exciting new strategies for investigating brain dynamics and cognitive states, and it is sure to plant further fascinating questions in our minds about these most complex of organs.


  1. L. Weninger et al., “The information content of brain states is explained by structural constraints on state energetics,” Phys. Rev. E 106, 014401 (2022)
  2. S. J. Schiff, Neural Control Engineering: The Emerging Intersection Between Control Theory and Neuroscience (MIT Press, Cambridge, 2012)
  3. R. E. Kalman, “A new approach to linear filtering and prediction problems,” J. Basic Eng. 82, 35 (1960)
  4. C.-T. Lin, “Structural controllability,” IEEE Trans. Autom. Control 19, 201 (1974)
  5. Y.-Y. Liu et al., “Controllability of complex networks,” Nature 473, 167 (2011)
  6. A. J. Whalen et al., “Observability and controllability of nonlinear networks: The role of symmetry,” Phys. Rev. X 5, 011005 (2015)
  7. D. C. Van Essen et al., “The WU-Minn Human Connectome Project: An overview,” NeuroImage 80, 62 (2013)
  8. P. Lennie, “The cost of cortical computation,” Curr. Biol. 13, 493 (2003)
  9. P. J. Uhlhaas et al., “Neural synchrony in cortical networks: History, concept and current status,” Front. Integr. Neurosci. 3 (2009)
  10. L. M. Pecora et al., “Cluster synchronization and isolated desynchronization in complex networks with symmetries,” Nat. Commun. 5 (2014)

About the author


Steven Schiff is a pediatric neurosurgeon with interests in neural control engineering, sustainable health technology, and global health. He founded the Center for Neural Engineering at Penn State University, wrote the first book on Neural Control Engineering (MIT Press, 2012), and is now developing the Center for Global Neurosurgery at Yale University. He received the NIH Director’s Pioneer and Transformative Awards in 2015 and 2018 respectively, enabling him to pursue his interests in the sustainable fight against childhood infections in developing countries. This work has evolved into an exploration of what Schiff calls Predictive Personalized Public Health (P3H).

Areas of expertise: Biological Physics, Complex Systems

