
REVIEW article

Front. Syst. Neurosci., 15 November 2016
Volume 10 - 2016 | https://doi.org/10.3389/fnsys.2016.00090

Neurobiology as Information Physics

Sterling Street*

  • Department of Cellular Biology, Franklin College of Arts and Sciences, University of Georgia, Athens, GA, USA

This article reviews thermodynamic relationships in the brain in an attempt to consolidate current research in systems neuroscience. The present synthesis supports proposals that thermodynamic information in the brain can be quantified to an appreciable degree of objectivity, that many qualitative properties of information in systems of the brain can be inferred by observing changes in thermodynamic quantities, and that many features of the brain’s anatomy and architecture illustrate relatively simple information-energy relationships. The brain may provide a unique window into the relationship between energy and information.

Introduction

That information is physical has been suggested by evidence since the founding of classical thermodynamics (Lloyd, 2006; Gleick, 2011). In recent years, Landauer’s principle (Landauer, 1996; Bennett, 2003), which relates information-theoretic entropy to thermodynamic information, has been confirmed (Parrondo et al., 2015), and the experimental demonstration of a form of information-energy equivalence (Alfonso-Faus, 2013) has verified that Maxwell’s demon cannot violate any known laws of thermodynamics (Maruyama et al., 2009). The theoretical finding that entropy is conserved as event horizon area is leading to the resolution of the black hole information paradox (Davies, 2010; Moskowitz, 2015), and there is a fundamental relationship between information and the geometry of spacetime itself (Bousso, 2002; Eling et al., 2006). Current formulations of quantum theory are revealing properties of physical information (Wheeler, 1986; Brukner and Zeilinger, 2003; Lloyd, 2006; Vedral, 2010), and information-interpretive attempts to show that gravity is quantized (Smolin, 2001; Lee et al., 2013) could even lead to the unification of quantum mechanics and the theories of relativity. Although similar approaches are increasingly influential in biology (Schneider and Sagan, 2005; England, 2013; Flack, 2014), “a formalization of the relationship between information and energy is currently lacking in neuroscience” (Collell and Fauquet, 2015). The purpose of this article is to explore a few different sides of this relationship and, along the way, to suggest that many hypotheses and theories in neuroscience can be unified by the physics of information.

Information Bounds

“How can the events in space and time which take place within the spatial boundary of a living organism be accounted for by physics and chemistry?” – (Schrödinger, 1944, from Friston, 2013).

As a fundamental physical entity (Lloyd, 2015), information is not fully understood, and there is significant disagreement over definitions of information and entropy in the literature (Poirier, 2014; Ben-Naim, 2015). In thermodynamics, however, information can be defined as the negation of thermodynamic entropy (Beck, 2009):

$$I \equiv -S$$

A bit of thermodynamic entropy represents the distinction between two alternative states in a physical system (Stone, 2015). As a result, the total thermodynamic entropy of a system is proportional to the total number of distinguishable states contained in the system (Bekenstein, 2001, 2007). Because thermodynamic entropy is potential information relative to an observer (Lloyd, 2006), and an observer in a physical system is a component of the system itself, the total thermodynamic entropy of a system includes the portion of entropy that is accessible to the observer as relative thermodynamic information (Wheeler, 1989; Collell and Fauquet, 2015):

$$I_{relative} = S_{total} - S_{relative}$$

Since entropy in any physical system is finite (Lloyd, 2006; Rovelli, 2015), the total thermodynamic entropy of any system of the brain can be quantified by applying the traditional form of the universal (Bekenstein, 1981, 1984, 2001, 2004, 2007) information-entropy bound:

$$S_{sys} = \zeta \frac{AEk}{\hbar c}$$

where A is area, E is total energy (including mass-energy), ℏ is the reduced Planck constant, c is the speed of light, k is Boltzmann’s constant, and ζ is a factor such that 0 ≤ ζ ≤ 1.
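To give a sense of scale, the sketch below evaluates the more common radius-based statement of the bound, S ≤ 2πkRE/(ℏc), for a brain-sized system. The radius and mass used are assumed, order-of-magnitude illustrative values, not measurements.

```python
# Order-of-magnitude evaluation of the Bekenstein bound for a brain-sized
# system. The radius and mass below are illustrative assumptions.
import math

k_B  = 1.380649e-23   # Boltzmann constant (J/K)
hbar = 1.054572e-34   # reduced Planck constant (J*s)
c    = 2.997925e8     # speed of light (m/s)

R = 0.1               # assumed effective radius of the brain (m)
m = 1.4               # assumed brain mass (kg)
E = m * c**2          # total energy, including mass-energy (J)

# Traditional Bekenstein bound: S <= 2*pi*k*R*E / (hbar*c)
S_max = 2 * math.pi * k_B * R * E / (hbar * c)

# Convert thermodynamic entropy to bits: one bit = k*ln(2) of entropy
bits_max = S_max / (k_B * math.log(2))

print(f"S_max ~ {S_max:.2e} J/K")
print(f"I_max ~ {bits_max:.2e} bits")   # on the order of 10^42 bits
```

The resulting figure, on the order of 10^42 bits, is an extreme upper bound; it constrains what is physically possible rather than what the brain actually stores.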

Setting ζ to 1 quantifies the total thermodynamic entropy of a system at a given level of structure. Thermodynamic information can then be quantified by partitioning the factor into a relative information component (ζ_I = 1 − ζ_s) and a relative entropy component (ζ_s = 1 − ζ_I):

$$I_{sys} = \zeta_I \frac{AEk}{\hbar c} = (1 - \zeta_s)\frac{AEk}{\hbar c}$$

Because a maximal level of energy corresponds to a maximal level of thermodynamic information, and a minimal level of energy corresponds to a minimal level of thermodynamic information (Duncan and Semura, 2004), any transitions between energy levels occur as transitions between informational extrema. So, in the event that information enters a system of the brain,

$$\Delta I_{sys} = \frac{\Delta E_{sys}}{kT} = \Delta\zeta_I$$

where T is temperature. And, in the case that information exits a system,

$$\Delta I_{sys} = -\frac{\Delta E_{surr}}{kT} = -\Delta\zeta_s$$

Various forms of these relationships, including information-entropy bounds, have been applied in neuroscience (Friston, 2010; Sengupta et al., 2013a,c, 2016; Collell and Fauquet, 2015; Sterling and Laughlin, 2015). The contribution of this review is simply to show that these relationships can be united into a common theoretical framework.

Neurobiology

“…classical thermodynamics…is the only physical theory of universal content concerning which I am convinced that, within the framework of applicability of its basic concepts, it will never be overthrown.” – (Einstein, 1949, from Bekenstein, 2001).

This section reviews thermodynamic relationships in systems neuroscience with a focus on information and energy. Beginning with neurons, moving to neural networks, and concluding at the level of the brain as a whole, I discuss the energetics of processes such as learning and memory, excitation and inhibition, and the production of noise in neurobiological systems.

The central role of energy in determining the activity of neurons exposes the close connection between information and thermodynamics at the level of the cell. For instance, the process of depolarization, which occurs as a transition from a resting state E_min to E_max, clearly shows that cellular information content is correlated with energy levels. In this respect, the resemblance between ion concentration gradients in neurons and temperature gradients in thermodynamic demons (i.e., agents that use information from their surroundings to decrease their thermodynamic entropy) is not a coincidence – in order to acquire information, neurons must expend energy to establish proper membrane potentials. Recall that Landauer’s principle (Plenio and Vitelli, 2001; Parrondo et al., 2015) places a lower bound on the quantity of energy released into the surroundings when information is removed from a system. Thus, reestablishing membrane potentials after depolarization – the neuronal equivalent of resetting a demon’s memory – dissipates energy. Although Landauer’s principle applies to all levels of structure, and neurons operate several orders of magnitude above the nominal limit, these cells process such large quantities of information that they still use energy remarkably efficiently. Parameters including membrane area, spiking frequency, and axon length have all been optimized over the course of evolution to allow neurons to process information efficiently (Sterling and Laughlin, 2015). Examining the energetics of information processing in neurons reinforces the notion that, while it is often convenient to imagine the neuron as a simple binary element, these cells are intricate computational structures that process more than one bit of information.
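To see how far neurons sit above the Landauer limit, consider a back-of-the-envelope comparison. The ATP count per spike below is an assumed order-of-magnitude value used only for illustration, broadly in the range discussed in the energy-budget literature.

```python
# Back-of-the-envelope comparison of the Landauer limit with the energy
# scale of neuronal signaling. ATP figures are assumed order-of-magnitude
# values, not measurements.
import math

k_B = 1.380649e-23           # Boltzmann constant (J/K)
T   = 310.0                  # body temperature (K)

landauer = k_B * T * math.log(2)   # minimum cost of erasing one bit (J)

atp_energy    = 5e-20        # approx. free energy per ATP hydrolysis (J)
atp_per_spike = 1e9          # assumed order-of-magnitude ATP cost per spike

spike_cost = atp_energy * atp_per_spike

print(f"Landauer limit at 310 K: {landauer:.2e} J/bit")   # ~3e-21 J
print(f"Assumed cost per spike : {spike_cost:.2e} J")     # ~5e-11 J
print(f"Ratio                  : {spike_cost / landauer:.1e}")
```

Even granting that a spike may carry several bits, the gap spans many orders of magnitude, which is consistent with the paragraph above.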

Relationships between information and energy can also be seen at the level of neural networks. Attractor networks naturally stabilize by seeking energy minima, and the relative positions of basins of attraction define the geometry of an energy landscape (Amit, 1992). As a result, the transition into an active attractor state occurs as a transition into an information-energy maximum. These transitions correspond to the generation of informational entities such as memories, decisions, and perceptual events (Rolls, 2012). In this way, the energy basins of attractor networks may be analogous to lower-level cellular and molecular energy gradients; a transition between any number of distinguishable energy levels follows the passage of a finite quantity of information. Since processing information requires the expenditure of energy, competitive network features also underscore the need to minimize unnecessary information processing. Lateral inhibition at this level may optimize thermodynamic efficiency by damping how robustly networks respond to incoming signals, reducing the associated metabolic expense. Another interesting thermodynamic property of networks concerns macrostates: the functional states of large-scale neural networks rest emergently on the states of neuronal assemblies (Yuste, 2015). As a result, new computational properties may arise with the addition of new layers of network structure. Finally, the energetic cost of information has shaped network connectivity by imposing selective pressure to save energy through minimizing path length between network nodes (Bullmore and Sporns, 2009).
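The energy-descent dynamics described above can be made concrete with a minimal Hopfield-style sketch (a textbook attractor model, not a simulation of any specific network discussed here): asynchronous updates never increase the network energy, so a corrupted state relaxes into the basin of a stored pattern.

```python
# Minimal Hopfield-style attractor network: asynchronous updates can only
# lower (or preserve) the energy E(s) = -1/2 * s^T W s, so the state
# descends into a basin of attraction. Illustrative sketch only.
import numpy as np

rng = np.random.default_rng(0)

def energy(W, s):
    return -0.5 * s @ W @ s

# Store one random pattern with a Hebbian outer-product rule
n = 64
pattern = rng.choice([-1, 1], size=n)
W = np.outer(pattern, pattern).astype(float)
np.fill_diagonal(W, 0.0)

# Start from a corrupted version of the stored pattern
s = pattern.copy()
flip = rng.choice(n, size=n // 4, replace=False)
s[flip] *= -1

print("initial energy:", energy(W, s))
for _ in range(5):                      # a few asynchronous sweeps
    for i in rng.permutation(n):
        s[i] = 1 if W[i] @ s >= 0 else -1
print("final energy  :", energy(W, s))  # lower: an attractor was reached
print("pattern recovered:", np.array_equal(s, pattern))
```

The settled state sits at an energy minimum, which is the "basin of attraction" picture invoked in the text.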

Again, in accordance with Landauer’s principle, the displacement of information from any system releases energy into the surroundings (Plenio and Vitelli, 2001; Duncan and Semura, 2004). This principle can be understood by imagining an idealized memory device, such as the brain of a thermodynamic demon. Since information is conserved (Susskind and Hrabovsky, 2014), and clearing a memory erases information, the thermodynamic entropy of the surroundings must increase when a demon refreshes its memory to update information. This fundamental connection between information, entropy, and energy appears in many areas of the neurobiology of learning. For example, adjusting a firing threshold in order to change the probability that a system will respond to a conditioned stimulus (Takeuchi et al., 2014; Choe, 2015) optimizes engram fitness by minimizing the quantity of energy needed for its activation (Still et al., 2012). Recurrent collateral connections further increase engram efficiency by enabling a minimal nodal stimulus to elicit its full energetic activation (Rolls, 2012). Experimental evidence also shows that restricting synaptic energy supply impairs the formation of stable engrams (Harris et al., 2012). Because the formation and disassembly of engrams during learning and forgetting optimize the growth and pruning of networks in response to external conditions, the process of learning is itself a mechanism for minimizing entropy in the brain (Friston, 2003).

As another example of a multiscale process integrated across many levels by thermodynamics, consider the active balance between excitation and inhibition in neurobiological systems. Maintaining proper membrane potentials and adequate concentrations of signaling molecules requires the expenditure of energy, so it is advantageous for systems of the brain to minimize the processing of unnecessary information – to “send only what is needed” (Sterling and Laughlin, 2015). Balancing excitation and inhibition is therefore a crucial mechanism for saving energy. Theoretical evidence that this balancing maximizes the thermodynamic efficiency of processing Shannon information (Sengupta et al., 2013b) is consistent with experimental findings in several areas of research on inhibition. For instance, constant inhibitory modulation is needed to stabilize internal states, and hyperexcitation (e.g., in epilepsy, intoxication syndromes, or trauma) can decrease relative information by reducing levels of consciousness (Haider et al., 2006; Lehmann et al., 2012). Likewise, selective attention is mediated by the activation of inhibitory interneurons (Houghton and Tipper, 1996), and sensory inhibition appears to sharpen internal perceptual states (Isaacson and Scanziani, 2011). The need to balance excitation and inhibition at all levels of structure highlights the energetic cost of information.

A final example worth discussing is the relationship between thermodynamics and the production of noise in neurobiological systems. Noise is present in every system of the brain, and influences all aspects of the organ’s function (Faisal et al., 2008; Rolls and Deco, 2010; Destexhe and Rudolph-Lilith, 2012). Even in the absence of any potential forms of classical stochastic resonance, the noise-driven exploration of different states may optimize thermodynamic efficiency by allowing a system to randomly sample different accessible configurations. Indeed, theoretical arguments suggest that noise enables neural networks to respond more quickly to detected signals (Rolls, 2012), and empirical evidence implicates noise as a beneficial means of optimizing the performance of diverse neurobiological processes (McDonnell and Ward, 2011). For example, noise in the form of neuronal DNA breakage (Guo et al., 2011; Herrup et al., 2013; Tognini et al., 2015) could enhance plasticity, since any stochastically optimized configuration would be more likely to survive over time as, in this case, a strengthened connection in a modifiable network. Because noise is a form of relative entropy, optimizing the signal-to-noise ratio in any neurobiological system promotes the efficient use of energy.
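The benefit of moderate noise is easy to demonstrate with a toy threshold detector: a subthreshold periodic signal is transmitted best at an intermediate noise amplitude. This is a generic stochastic-resonance sketch under assumed parameters, not a model of any particular neural system.

```python
# Toy stochastic-resonance demo: a subthreshold periodic signal plus noise
# drives a simple threshold detector. The correlation between the driving
# signal and the detector output peaks at an intermediate noise level.
import numpy as np

rng = np.random.default_rng(1)

t = np.linspace(0.0, 20.0, 4000)
signal = 0.8 * np.sin(2 * np.pi * t)          # peak 0.8: below threshold
threshold = 1.0

for sigma in [0.1, 0.4, 2.0]:                 # low, intermediate, high noise
    noisy = signal + sigma * rng.standard_normal(t.size)
    out = (noisy > threshold).astype(float)   # threshold crossings
    if out.std() == 0.0:                      # no crossings at all
        score = 0.0
    else:
        score = np.corrcoef(out, signal)[0, 1]
    print(f"noise sigma {sigma:3.1f} -> signal/output correlation {score:.2f}")
```

At low noise the signal rarely crosses threshold, and at high noise crossings occur indiscriminately; transmission is best in between.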

At the level of the brain as a whole, the connection between information and thermodynamics is readily apparent in the organ’s functional reliance on energy (Magistretti and Allaman, 2015), its seemingly disproportionate consumption of oxygen and energy substrates (e.g., ATP, glucose, ketones; Raichle and Gusnard, 2002; Herculano-Houzel, 2011), its vulnerability to hypoxic-ischemic damage (Lutz et al., 2003; Dreier et al., 2013), and the reduction of consciousness that often follows the onset of energy restrictions (Shulman et al., 2009; Stender et al., 2016). All fMRI, PET, and EEG interpretation rests on the foundational assumption that changes in the information content of neurobiological systems can be inferred by observing energy changes (Attwell and Iadecola, 2002; Collell and Fauquet, 2015), and it is well known that the information processing capacities of neurobiological systems are limited by energy supply (Howarth et al., 2012; Fox, 2015). Overall, these relationships are consistent with the form of information-energy equivalence predicted by Landauer’s principle and information-entropy bounds. The living brain appears to maintain a state of thermodynamic optimization.

Consciousness and Free Will

“…science appears completely to lose from sight the large and general questions; but all the more splendid is the success when, groping in the thicket of special questions, we suddenly find a small opening that allows a hitherto undreamt of outlook on the whole.” – (Boltzmann, 1892, from Von Baeyer, 1999).

Although neuroscience has yet to explain consciousness or free will at any satisfactory level of detail, relationships between information and energy seem to be recognizable even at this level of analysis. This section reviews attempts to conceptualize major properties of consciousness (unity, continuity, complexity, and self-awareness) as features of information processing in the brain, and concludes with a discussion of free will.

At any given moment, awareness is experienced as a unified whole. Physical information is the substrate of consciousness (Annila, 2016), and the law of conservation of information requires any minimal unit of information to be transferred into a thermodynamic system as a temporally unitary quantity. As a result, it is possible that the passage of perceptual time itself occurs secondarily to the transfer of information, and that the information present in any integrated system of the brain at any observed time is necessarily cohesive and temporally unified. In this framework, the passage of time would vary in proportion to a system’s rate of energy dissipation. Although it is possible that physical systems in general exchange information in temporally unitary quantities, it is likely that many of the familiar features of the perceptual unity of consciousness require the structure and activity of neural networks in the brain. The biological basis of this unity may be the active temporal consolidation of observed events by integrated higher-order networks (Revonsuo, 1999; Varela et al., 2001; Greenfield and Collins, 2005; Dehaene and Changeux, 2011). An informational structure generated by the claustrum has been speculated to contribute to this experiential unity (Crick and Koch, 2005; Koubeissi et al., 2014), but it has also been reported that complete unilateral resection of the system performed in patients with neoplastic lesions of the region produces no externally observable changes in subjective awareness (Duffau et al., 2007). Overall, it appears unlikely that the presence of information in any isolated or compartmentalized network of the brain is responsible for generating the unified nature of conscious experience.

While perceptual time is likely the product of a collection of related informational processes rather than a single, globalized function mediated by any one specific system of the brain, some of the perceptual continuity of consciousness may result from the effectively continuous flow of thermodynamic information into and out of integrated systems of the brain. In this framework, the quantum (Prokopenko and Lizier, 2014) of perceptual time would be the minimal acquisition of information, and the entrance of information into neurobiological systems would occur alongside the entrance of energy. This relationship is implicit in the simple observation that the transition of a large-scale attractor network is progressively less discrete and smoother in time than the activation of a small-scale engram, the propagation of a cellular potential, the docking of a vesicle, the release of an ion, and so forth. Likewise, electroencephalography shows that the summation of a large number of discrete cellular potentials can accumulate into an effectively continuous wave as a network field potential (Nunez and Srinivasan, 2006), disruptions of which are often correlated with decreases in levels of consciousness (Blumenfeld and Taylor, 2003). It is also well known that higher frequency network oscillations tend to indicate states of wakefulness and active awareness, while lower frequency oscillations tend to be associated with internal states in which less perceptual time passes, such as dreamless sleep or unconsciousness. The possibility that the experiential arrow of time and the thermodynamic arrow of time share a common origin in the flow of information is supported both by general models of time in neuroscience and by the physical interpretation of time as an entropy gradient (Stoica, 2008; Mlodinow and Brun, 2014).
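The summation argument above can be illustrated directly: many discrete, weakly phase-locked spike trains, filtered by a slow membrane-like kernel, sum into an effectively continuous oscillation. The sketch below uses assumed toy parameters and is not an EEG model.

```python
# Many discrete, weakly phase-locked spike trains, filtered by a slow
# membrane-like kernel, sum into an effectively continuous oscillation.
# Toy parameters throughout; this is not an EEG model.
import numpy as np

rng = np.random.default_rng(2)

fs, dur, f = 1000, 2.0, 10.0                  # sample rate (Hz), s, rhythm (Hz)
t = np.arange(0.0, dur, 1.0 / fs)
field = np.zeros_like(t)

for _ in range(500):                          # 500 cells
    # one spike per cycle, jittered ~20 ms around the shared rhythm
    spikes = np.arange(0.0, dur, 1.0 / f) + rng.normal(0.0, 0.02, int(dur * f))
    idx = np.clip(np.round(spikes * fs).astype(int), 0, t.size - 1)
    field[idx] += 1.0

# slow exponential kernel: a crude stand-in for postsynaptic filtering
kernel = np.exp(-np.arange(0.0, 0.05, 1.0 / fs) / 0.01)
lfp = np.convolve(field, kernel)[: t.size]

spectrum = np.abs(np.fft.rfft(lfp - lfp.mean()))
freqs = np.fft.rfftfreq(lfp.size, 1.0 / fs)
print(f"dominant frequency of summed signal: {freqs[spectrum.argmax()]:.1f} Hz")
```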

The subjective complexity of consciousness may show that extensive network integration is needed for maximizing the mutual thermodynamic information and internal energy content of systems of the brain (Torday and Miller, 2016). An exemplary structure enabling such experience, likely one of many that together account for the subjective complexity of consciousness, is the thalamocortical complex (Calabrò et al., 2015; Hannawi et al., 2015). The functional architecture of such a network may show that, at any given moment in the internal model of a living brain, a wide range of integrated systems are sharing mutual sources of thermodynamic information. This pattern of structure may reveal that the perceptual depth and complexity of conscious experience is a direct product of recognizable features of the physical brain. However, it also seems that extensive local cortical processing of information is necessary for producing a refined and coherent sensorium within a system, and that both the thalamocortical complex and the brain stem are involved in generating the subjective complexity of consciousness (Edelman et al., 2011; Ward, 2011). The dynamics of attractor networks at higher levels of network structure may show that quantities of complex internal information can be observed as changes in cortical energy landscapes (Rolls, 2012), with a transition between attractor states following the transfer of information. The degree of subjective complexity of information enclosed by such a transition would be proportional to the degree of structural integration of underlying networks.

Self-awareness likely arose as a survival necessity rather than as an accident of evolution (Fabbro et al., 2015), and rudimentary forms of self-awareness likely began to appear early in the course of brain evolution as various forms of perceptual self-environment separation. As a simple example, consider the tickle response (Linden, 2007), which requires the ability to differentiate self-produced tactile sensations from those produced by external systems. The early need to distinguish between self-produced tactile states and those produced by more threatening non-self sources may be reflected by the observation that this recognition process is mediated to a great extent by the cerebellum (Blakemore et al., 2000). While it is possible that other similar developments began occurring very early on, the evolutionary acquisition of the refined syntactical and conceptual self present in the modern brain likely required the merging of pre-existing self networks with higher-level cortical systems. The eventual integration of language and self-awareness would have been advantageous for coordinating social groups (Graziano, 2013), since experiencing self-referential thought as inner speech facilitates verbal communication. Likewise, the coupling of self-awareness to internal sensory, cognitive, and motor states (Metzinger, 2004; Northoff et al., 2006) may be advantageous for maximizing information between systems within an individual brain. Neuropsychological conditions involving different forms of agnosia, neglect, and self-awareness deficits do show that a reduced awareness of self-ownership of motor skills, body parts, or perceptual states can result in significant disability (Parton et al., 2004; Morin, 2006; Orfei et al., 2007; Prigatano, 2009; Tsakiris, 2010; Overgaard, 2011; Fabbro et al., 2015; Chokron et al., 2016). Since experiencing self-awareness optimizes levels of mutual information between the external world and the brain’s internal model (Apps and Tsakiris, 2014), and this activity decreases thermodynamic entropy (Torday and Miller, 2016), self-awareness may be a mechanism for optimizing the brain’s consumption of energy.

Thermodynamic information is also interesting to consider in the context of free will. The brain is predictable within reason, and the performance of an action can be predicted before a decision is reported to have been made (Haggard, 2008). Entities such as ideas, feelings, and beliefs seem to exist as effectively deterministic evaluations of information processed in the brain. Whether or not the flow of information is subject to the brain’s volitional alteration, neuroscience also shows that information can be internally real to a system of the brain, even if this information is inconsistent with an external reality. That the brain can generate an externally inconsistent internal reality is demonstrated by phenomena such as confabulation, agnosia, blindsight, neglect, commissurotomy and hemispherectomy effects, placebo and nocebo effects, reality monitoring deficits, hallucinations, prediction errors, the suspension of disbelief during dreaming, the function of communication in minimizing divergence between internal realities, the quality of many kinds of realistic drug-induced experiences, and the effects of many neuropsychological conditions. The apparent fact that subjective reality is an active construction of the physical brain has even led to the proposal of model-dependent realism (Hawking and Mlodinow, 2011) as a philosophical paradigm in the search for a unified theory of physics. In any case, it is likely that beliefs, including beliefs in free will, exist as information, and that their internal reality is a restatement of the frequently observer-dependent nature of information.

Empirical Outlook

Before concluding, it is worth reviewing a few notable experiments in greater detail. While considerable advances have been made in discovering how neurobiological systems operate according to principles of thermodynamic efficiency (Sterling and Laughlin, 2015), relationships between information and energy in the brain are only beginning to be understood. The following studies are examples of elegant and insightful experiments that should inspire future research.

Several recent brain imaging studies support the proposal (Annila, 2016) that thermodynamics is able to explain a number of mysteries involving consciousness. For example, Stender et al. (2016) used PET to measure global resting state energy consumption in 131 brain injury patients with impairments of consciousness as defined by the revised Coma Recovery Scale (CRS-R). The preservation of consciousness was found to require a minimal global metabolic rate of ≈ 40% of the average rate of controls; global energy consumption above this level was reported to predict the presence or recovery of consciousness with over 90% sensitivity. These results must be replicated and studied in closer detail before their specific theoretical implications are clear, but it is now established that levels of consciousness are correlated with energetic metrics of brain activity. To what extent there exists a well-defined “minimal energetic requirement for the presence of conscious awareness” (Stender et al., 2016) remains an open question. However, the empirical confirmation of a connection between consciousness and thermodynamics introduces the possibility of developing new experimental methods in consciousness research.

Neurobiological systems, and biological systems in general (Von Baeyer, 1999; Schneider and Sagan, 2005), can be considered thermodynamic demons in the sense that they are agents using information to decrease their thermodynamic entropy. Landauer’s principle requires that, in order not to violate any known laws of thermodynamics, such agents dissipate heat when erasing information from their memory storage devices. In an experimental test of this principle, reviewed along with similar experiments by Parrondo et al. (2015), Bérut et al. (2012) studied heat dissipation in a simple memory device created by placing a glass bead in an optical double-well potential. Intuitively, this memory stored a bit of information by retaining the bead on one side of the potential rather than on the other. By manipulating the height of the optical barrier between wells, the researchers moved the bead to one side of the memory without determining its previous location in the potential. This process was therefore logically irreversible, requiring the erasure of prior information from the memory device. Landauer’s principle predicts that, since information is conserved, the entropy of the memory’s surroundings must increase when this occurs. Bérut et al. (2012) verified that energy is emitted when a memory is cleared. As noted by the authors, “this limit is independent of the actual device, circuit or material used to implement the irreversible operation.” It would be interesting to study the erasure principle in the context of neuroscience.
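The entropy bookkeeping behind this experiment fits in a few lines. The sketch below is generic Landauer arithmetic at an assumed room temperature, not a simulation of the optical trap itself: erasure takes the bead’s position from one full bit of Shannon entropy to zero, so at least kT ln 2 of heat must be dissipated.

```python
# Entropy bookkeeping for one-bit erasure (the logic behind the
# double-well memory of Berut et al., 2012). Not a trap simulation.
import math

k_B = 1.380649e-23
T   = 300.0                          # assumed approximate room temperature (K)

def shannon_bits(p):
    """Shannon entropy (bits) of a binary memory holding 1 with prob p."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

H_before = shannon_bits(0.5)         # unknown bit: left/right equally likely
H_after  = shannon_bits(1.0)         # reset: bead always on one side

erased_bits = H_before - H_after
Q_min = erased_bits * k_B * T * math.log(2)   # Landauer's lower bound

print(f"bits erased            : {erased_bits:.1f}")
print(f"minimum heat at {T:.0f} K : {Q_min:.2e} J")   # ~2.9e-21 J
```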

Experimental applications of information theory in cell biology have already led to the discovery of general principles of brain organization related to thermodynamics (Sterling and Laughlin, 2015). In one particularly interesting study, Niven et al. (2007) measured the energetic efficiency of information coding in retinal neurons. Intracellular recordings of membrane potential and input resistance were used to calculate rates of ATP consumption in response to different background light intensities. These rates of energy consumption were then compared with rates of Shannon information transmission in order to determine metabolic performance. It was found that metabolic demands increase non-linearly with respect to increases in information processing rate: thermodynamics appears to impose a “law of diminishing returns” on systems of the brain. The authors interpret these results as evidence that nature has selected for neurons that minimize unnecessary information processing. Studying how thermodynamics has influenced cellular parameters over the course of evolution is likely to raise many new empirically addressable questions.
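A simple information-theoretic analogy (not the model used by Niven et al.) shows where such diminishing returns can come from: for a Gaussian channel, capacity grows only logarithmically with signal power, so each additional bit costs progressively more energy.

```python
# Diminishing returns in a Gaussian channel: capacity grows like
# log2(1 + SNR), so information per unit signal power keeps falling.
# An information-theoretic analogy, not the model of Niven et al. (2007).
import math

def capacity(snr: float) -> float:
    """Shannon capacity of a Gaussian channel, bits per channel use."""
    return math.log2(1.0 + snr)

for snr in [1, 2, 4, 8, 16, 32]:
    c = capacity(snr)
    print(f"SNR {snr:2d}: capacity {c:.2f} bits, "
          f"efficiency {c / snr:.2f} bits per unit power")
```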

Conclusion

This article has reviewed information-energy relationships in the hope that they may eventually provide a general framework for uniting theory and experiment in neuroscience. The physical nature of information and its status as a finite, measurable resource are emphasized to connect neurobiology and thermodynamics. As a scientific paradigm, the information movement currently underway in physics promises profound advances in our understanding of the relationship between energy, information, and the physical brain.

Author Contributions

The author confirms being the sole contributor of this work and approved it for publication.

Conflict of Interest Statement

The author declares that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Acknowledgments

I am grateful to Baroness Susan Greenfield, Dr. Francesco Fermani, Dr. Karl Friston, Dr. Biswa Sengupta, Dr. Roy Frieden, Dr. Bernard Baars, Dr. Brett Clementz, Dr. Cristi Stoica, Dr. Satoru Suzuki, Dr. Paul King, Guillem Collell, Dr. Jordi Fauquet, and others who have helped me improve these ideas. I am also grateful to Dr. Shanta Dhar and her team for introducing me to academic research, to Jim Reid for introducing me to biology, to Alex Tisch for introducing me to physics, and to those affiliated with the Department of Neurosurgery at the University of Virginia Medical Center for introducing me to neuroscience.

References

Alfonso-Faus, A. (2013). “Fundamental principle of information-to-energy conversion,” in Proceedings of the 7th European Computing Conference, Dubrovnik.

Amit, D. J. (1992). Modeling Brain Function. Cambridge: Cambridge University Press.

Annila, A. (2016). On the character of consciousness. Front. Syst. Neurosci. 10:27. doi: 10.3389/fnsys.2016.00027

Apps, M. A., and Tsakiris, M. (2014). The free-energy self: a predictive coding account of self-recognition. Neurosci. Biobehav. Rev. 41, 85–97. doi: 10.1016/j.neubiorev.2013.01.029

Attwell, D., and Iadecola, C. (2002). The neural basis of functional brain imaging signals. Trends Neurosci. 25, 621–625. doi: 10.1016/S0166-2236(02)02264-6

Beck, C. (2009). Generalised information and entropy measures in physics. Contemp. Phys. 50, 495–510. doi: 10.1080/00107510902823517

Bekenstein, J. D. (1981). Universal upper bound on the entropy-to-energy ratio for bounded systems. Phys. Rev. D 23, 287. doi: 10.1103/PhysRevD.23.287

Bekenstein, J. D. (1984). Entropy content and information flow in systems with limited energy. Phys. Rev. D 30, 1669. doi: 10.1103/PhysRevD.30.1669

Bekenstein, J. D. (2001). The limits of information. Stud. Hist. Philos. Mod. Phys. 32, 511–524. doi: 10.1016/S1355-2198(01)00020-X

Bekenstein, J. D. (2004). Black holes and information theory. Contemp. Phys. 45, 31–43. doi: 10.1080/00107510310001632523

Bekenstein, J. D. (2007). Information in the holographic universe. Sci. Am. 17, 66–73. doi: 10.1038/scientificamerican0407-66sp

Ben-Naim, A. (2015). Information, Entropy, Life and the Universe. Singapore: World Scientific, 4–5.

Bennett, C. H. (2003). Notes on Landauer’s principle, reversible computation, and Maxwell’s Demon. Stud. Hist. Philos. Sci. B 34, 501–510. doi: 10.1016/S1355-2198(03)00039-X

Bérut, A., Arakelyan, A., Petrosyan, A., Ciliberto, S., Dillenschneider, R., and Lutz, E. (2012). Experimental verification of Landauer’s principle linking information and thermodynamics. Nature 483, 187–189. doi: 10.1038/nature10872

Blakemore, S. J., Wolpert, D., and Frith, C. (2000). Why can’t you tickle yourself? Neuroreport 11, R11–R16. doi: 10.1097/00001756-200008030-00002

Blumenfeld, H., and Taylor, J. (2003). Why do seizures cause loss of consciousness? Neuroscientist 9, 301–310. doi: 10.1177/1073858403255624

Boltzmann, L. (1892). “On the methods of theoretical physics,” in Theoretical Physics and Philosophical Problems, ed. B. McGuinness (Dordrecht: Springer Netherlands), 5–12.

Bousso, R. (2002). The holographic principle. Rev. Mod. Phys. 74, 825. doi: 10.1103/RevModPhys.74.825

Brukner, Č., and Zeilinger, A. (2003). “Information and fundamental elements of the structure of quantum theory,” in Time, Quantum and Information, eds L. Castell and O. Ischebeck (Heidelberg: Springer), 323–354.

Bullmore, E., and Sporns, O. (2009). Complex brain networks: graph theoretical analysis of structural and functional systems. Nat. Rev. Neurosci. 10, 186–198. doi: 10.1038/nrn2575

Calabrò, R. S., Cacciola, A., Bramanti, P., and Milardi, D. (2015). Neural correlates of consciousness: what we know and what we have to learn! Neurol. Sci. 36, 505–513. doi: 10.1007/s10072-015-2072-x

Choe, Y. (2015). “Hebbian learning,” in Encyclopedia of Computational Neuroscience, eds D. Jaeger and R. Jung (New York, NY: Springer), 1305–1309. doi: 10.1007/978-1-4614-7320-6_672-1

Chokron, S., Perez, C., and Peyrin, C. (2016). Behavioral consequences and cortical reorganization in homonymous hemianopia. Front. Syst. Neurosci. 10:57. doi: 10.3389/fnsys.2016.00057

Collell, G., and Fauquet, J. (2015). Brain activity and cognition: a connection from thermodynamics and information theory. Front. Psychol. 6:818. doi: 10.3389/fpsyg.2015.00818

Crick, F. C., and Koch, C. (2005). What is the function of the claustrum? Philos. Trans. R. Soc. Lond. B Biol. Sci. 360, 1271–1279. doi: 10.1098/rstb.2005.1661

Davies, P. (2010). “Universe from bit,” in Information and the Nature of Reality, eds P. Davies and N. H. Gregersen (Cambridge: Cambridge University Press), 83–117.

Dehaene, S., and Changeux, J. P. (2011). Experimental and theoretical approaches to conscious processing. Neuron 70, 200–227. doi: 10.1016/j.neuron.2011.03.018

Destexhe, A., and Rudolph-Lilith, M. (2012). Neuronal Noise. New York, NY: Springer.

Dreier, J. P., Isele, T., Reiffurth, C., Offenhauser, N., Kirov, S. A., Dahlem, M. A., et al. (2013). Is spreading depolarization characterized by an abrupt, massive release of Gibbs free energy from the human brain cortex? Neuroscientist 19, 25–42. doi: 10.1177/1073858412453340

Duffau, H., Mandonnet, E., Gatignol, P., and Capelle, L. (2007). Functional compensation of the claustrum: lessons from low-grade glioma surgery. J. Neurooncol. 81, 327–329. doi: 10.1007/s11060-006-9236-8

Duncan, T. L., and Semura, J. S. (2004). The deep physics behind the second law: information and energy as independent forms of bookkeeping. Entropy 6, 21–29. doi: 10.3390/e6010021

Edelman, G. M., Gally, J. A., and Baars, B. J. (2011). Biology of consciousness. Front. Psychol. 2:4. doi: 10.3389/fpsyg.2011.00004

Einstein, A. (1949). “Autobiographical notes,” in Albert Einstein, ed. P. A. Schilpp (La Salle: Open Court), 33.

Eling, C., Guedens, R., and Jacobson, T. (2006). Nonequilibrium thermodynamics of spacetime. Phys. Rev. Lett. 96, 121301. doi: 10.1103/PhysRevLett.96.121301

England, J. L. (2013). Statistical physics of self-replication. J. Chem. Phys. 139, 121923. doi: 10.1063/1.4818538

Fabbro, F., Aglioti, S. M., Bergamasco, M., Clarici, A., and Panksepp, J. (2015). Evolutionary aspects of self- and world consciousness in vertebrates. Front. Hum. Neurosci. 9:157. doi: 10.3389/fnhum.2015.00157

Faisal, A. A., Selen, L. P., and Wolpert, D. M. (2008). Noise in the nervous system. Nat. Rev. Neurosci. 9, 292–303. doi: 10.1038/nrn2258

Flack, J. C. (2014). Life’s information hierarchy. Santa Fe Inst. Bull. 28, 13–24.

Fox, D. (2015). The limits of intelligence. Sci. Am. 305, 36–43.

Friston, K. J. (2003). Learning and inference in the brain. Neural Netw. 16, 1325–1352. doi: 10.1016/j.neunet.2003.06.005

Friston, K. J. (2010). The free-energy principle: a unified brain theory? Nat. Rev. Neurosci. 11, 127–138. doi: 10.1038/nrn2787

Friston, K. J. (2013). Life as we know it. J. R. Soc. Interface 10:20130475. doi: 10.1098/rsif.2013.0475

Gleick, J. (2011). The Information. New York, NY: Random House, 269–270.

Graziano, M. S. (2013). Consciousness and the Social Brain. Oxford: Oxford University Press.

Greenfield, S. A., and Collins, T. F. T. (2005). A neuroscientific approach to consciousness. Prog. Brain Res. 150, 11–23. doi: 10.1016/S0079-6123(05)50002-5

Guo, J. U., Ma, D. K., Mo, H., Ball, M. P., Jang, M. H., Bonaguidi, M. A., et al. (2011). Neuronal activity modifies the DNA methylation landscape in the adult brain. Nat. Neurosci. 14, 1345–1351. doi: 10.1038/nn.2900

Haggard, P. (2008). Human volition: towards a neuroscience of will. Nat. Rev. Neurosci. 9, 934–946. doi: 10.1038/nrn2497

Haider, B., Duque, A., Hasenstaub, A. R., and McCormick, D. A. (2006). Neocortical network activity in vivo is generated through a dynamic balance of excitation and inhibition. J. Neurosci. 26, 4535–4545. doi: 10.1523/JNEUROSCI.5297-05.2006

Hannawi, Y., Lindquist, M. A., Caffo, B. S., Sair, H. I., and Stevens, R. D. (2015). Resting brain activity in disorders of consciousness. Neurology 84, 1272–1280. doi: 10.1212/WNL.0000000000001404

Harris, J. J., Jolivet, R., and Attwell, D. (2012). Synaptic energy use and supply. Neuron 75, 762–777. doi: 10.1016/j.neuron.2012.08.019

Hawking, S. W., and Mlodinow, L. (2011). The Grand Design. New York, NY: Random House, 7, 46.

Herculano-Houzel, S. (2011). Scaling of brain metabolism with a fixed energy budget per neuron: implications for neuronal activity, plasticity and evolution. PLoS ONE 6:e17514. doi: 10.1371/journal.pone.0017514

Herrup, K., Chen, J., and Li, J. (2013). Breaking news: thinking may be bad for DNA. Nat. Neurosci. 16, 518–519. doi: 10.1038/nn.3384

Houghton, G., and Tipper, S. P. (1996). Inhibitory mechanisms of neural and cognitive control: applications to selective attention and sequential action. Brain Cogn. 30, 20–43. doi: 10.1006/brcg.1996.0003

Howarth, C., Gleeson, P., and Attwell, D. (2012). Updated energy budgets for neural computation in the neocortex and cerebellum. J. Cereb. Blood Flow Metab. 32, 1222–1232. doi: 10.1038/jcbfm.2012.35

Isaacson, J. S., and Scanziani, M. (2011). How inhibition shapes cortical activity. Neuron 72, 231–243. doi: 10.1016/j.neuron.2011.09.027

Koubeissi, M. Z., Bartolomei, F., Beltagy, A., and Picard, F. (2014). Electrical stimulation of a small brain area reversibly disrupts consciousness. Epilepsy Behav. 37, 32–35. doi: 10.1016/j.yebeh.2014.05.027

Landauer, R. (1996). The physical nature of information. Phys. Lett. A 217, 188–193. doi: 10.1016/0375-9601(96)00453-7

Lee, J. W., Kim, H. C., and Lee, J. (2013). Gravity from quantum information. J. Korean Phys. Soc. 63, 1094–1098. doi: 10.3938/jkps.63.1094

Lehmann, K., Steinecke, A., and Bolz, J. (2012). GABA through the ages: regulation of cortical function and plasticity by inhibitory interneurons. Neural Plast. 2012:892784. doi: 10.1155/2012/892784

Linden, D. J. (2007). The Accidental Mind. Cambridge: Harvard University Press, 9–12.

Lloyd, S. (2006). Programming the Universe. New York, NY: Random House.

Lloyd, S. (2015). Interview in Closer to Truth: Is Information Fundamental? Available at: https://www.closertotruth.com/series/information-fundamental#video-2621

Lutz, P. L., Nilsson, G. E., and Prentice, H. M. (2003). The Brain Without Oxygen. New York, NY: Kluwer.

Magistretti, P. J., and Allaman, I. (2015). A cellular perspective on brain energy metabolism and functional imaging. Neuron 86, 883–901. doi: 10.1016/j.neuron.2015.03.035

Maruyama, K., Nori, F., and Vedral, V. (2009). Colloquium: the physics of Maxwell’s demon and information. Rev. Mod. Phys. 81, 1. doi: 10.1103/RevModPhys.81.1

McDonnell, M. D., and Ward, L. M. (2011). The benefits of noise in neural systems: bridging theory and experiment. Nat. Rev. Neurosci. 12, 415–426. doi: 10.1038/nrn3061

Metzinger, T. (2004). Being No One. Cambridge: MIT Press.

Mlodinow, L., and Brun, T. A. (2014). Relation between the psychological and thermodynamic arrows of time. Phys. Rev. E 89:052102. doi: 10.1103/PhysRevE.89.052102

Morin, A. (2006). Levels of consciousness and self-awareness: a comparison and integration of various neurocognitive views. Conscious. Cogn. 15, 358–371. doi: 10.1016/j.concog.2005.09.006

Moskowitz, C. (2015). Stephen Hawking hasn’t solved the black hole paradox just yet. Sci. Am. 27.

Niven, J. E., Anderson, J. C., and Laughlin, S. B. (2007). Fly photoreceptors demonstrate energy-information trade-offs in neural coding. PLoS Biol. 5:e116. doi: 10.1371/journal.pbio.0050116

Northoff, G., Heinzel, A., De Greck, M., Bermpohl, F., Dobrowolny, H., and Panksepp, J. (2006). Self-referential processing in our brain–a meta-analysis of imaging studies on the self. Neuroimage 31, 440–457. doi: 10.1016/j.neuroimage.2005.12.002

Nunez, P. L., and Srinivasan, R. (2006). Electric Fields of the Brain. New York, NY: Oxford University Press.

Orfei, M. D., Robinson, R. G., Prigatano, G. P., Starkstein, S., Rüsch, N., Bria, P., et al. (2007). Anosognosia for hemiplegia after stroke is a multifaceted phenomenon: a systematic review of the literature. Brain 130, 3075–3090. doi: 10.1093/brain/awm106

Overgaard, M. (2011). Visual experience and blindsight: a methodological review. Exp. Brain Res. 209, 473–479. doi: 10.1007/s00221-011-2578-2

Parrondo, J. M. R., Horowitz, J. M., and Sagawa, T. (2015). Thermodynamics of information. Nat. Phys. 11, 131–139. doi: 10.1038/nphys3230

Parton, A., Malhotra, P., and Husain, M. (2004). Hemispatial neglect. J. Neurol. Neurosurg. Psychiatry 75, 13–21.

Plenio, M. B., and Vitelli, V. (2001). The physics of forgetting: Landauer’s erasure principle and information theory. Contemp. Phys. 42, 25–60. doi: 10.1080/00107510010018916

Poirier, B. (2014). A Conceptual Guide to Thermodynamics. Chichester: Wiley, 77–78.

Prigatano, G. P. (2009). Anosognosia: clinical and ethical considerations. Curr. Opin. Neurol. 22, 606–611. doi: 10.1097/WCO.0b013e328332a1e7

Prokopenko, M., and Lizier, J. T. (2014). Transfer entropy and transient limits of computation. Sci. Rep. 4:5394. doi: 10.1038/srep05394

Raichle, M. E., and Gusnard, D. A. (2002). Appraising the brain’s energy budget. Proc. Natl. Acad. Sci. U.S.A. 99, 10237–10239. doi: 10.1073/pnas.172399499

Revonsuo, A. (1999). Binding and the phenomenal unity of consciousness. Conscious. Cogn. 8, 173–185. doi: 10.1006/ccog.1999.0384

Rolls, E. T. (2012). Neuroculture. Oxford: Oxford University Press.

Rolls, E. T., and Deco, G. (2010). The Noisy Brain. Oxford: Oxford University Press.

Rovelli, C. (2015). “Relative information at the foundation of physics,” in It From Bit or Bit From It?, eds A. Aguirre, B. Foster, and Z. Merali (Cham: Springer). doi: 10.1007/978-3-319-12946-4_7

Schneider, E. D., and Sagan, D. (2005). Into the Cool. Chicago: University of Chicago Press.

Schrödinger, E. (1944). What is Life? Cambridge: Cambridge University Press, 3.

Sengupta, B., Faisal, A. A., Laughlin, S. B., and Niven, J. E. (2013a). The effect of cell size and channel density on neuronal information encoding and energy efficiency. J. Cereb. Blood Flow Metab. 33, 1465–1473. doi: 10.1038/jcbfm.2013.103

Sengupta, B., Laughlin, S. B., and Niven, J. E. (2013b). Balanced excitatory and inhibitory synaptic currents promote efficient coding and metabolic efficiency. PLoS Comput. Biol. 9:e1003263. doi: 10.1371/journal.pcbi.1003263

Sengupta, B., Stemmler, M. B., and Friston, K. J. (2013c). Information and efficiency in the nervous system–a synthesis. PLoS Comput. Biol. 9:e1003157. doi: 10.1371/journal.pcbi.1003157

Sengupta, B., Tozzi, A., Cooray, G. K., Douglas, P. K., and Friston, K. J. (2016). Towards a neuronal gauge theory. PLoS Biol. 14:e1002400. doi: 10.1371/journal.pbio.1002400

Shulman, R. G., Hyder, F., and Rothman, D. L. (2009). Baseline brain energy supports the state of consciousness. Proc. Natl. Acad. Sci. U.S.A. 106, 11096–11101. doi: 10.1073/pnas.0903941106

Smolin, L. (2001). Three Roads to Quantum Gravity. New York, NY: Basic Books, 103, 169–178.

Stender, J., Mortensen, K. N., Thibaut, A., Darkner, S., Laureys, S., Gjedde, A., et al. (2016). The minimal energetic requirement of sustained awareness after brain injury. Curr. Biol. 26, 1494–1499. doi: 10.1016/j.cub.2016.04.024

Sterling, P., and Laughlin, S. (2015). Principles of Neural Design. Cambridge: MIT Press.

Still, S., Sivak, D. A., Bell, A. J., and Crooks, G. E. (2012). Thermodynamics of prediction. Phys. Rev. Lett. 109:120604. doi: 10.1103/PhysRevLett.109.120604

Stoica, O. C. (2008). Flowing with a Frozen River. FQXi, The Nature of Time essay contest. Available at: http://fqxi.org/community/essay/winners/2008.1#Stoica

Stone, J. V. (2015). Information Theory. Sheffield: Sebtel Press, 171.

Susskind, L., and Hrabovsky, G. (2014). The Theoretical Minimum, Vol. 9. New York, NY: Basic Books, 170.

Takeuchi, T., Duszkiewicz, A. J., and Morris, R. G. (2014). The synaptic plasticity and memory hypothesis: encoding, storage and persistence. Philos. Trans. R. Soc. B 369:20130288. doi: 10.1098/rstb.2013.0288

Tognini, P., Napoli, D., and Pizzorusso, T. (2015). Dynamic DNA methylation in the brain: a new epigenetic mark for experience-dependent plasticity. Front. Cell. Neurosci. 9:331. doi: 10.3389/fncel.2015.00331

Torday, J. S., and Miller, W. B. Jr. (2016). On the evolution of the mammalian brain. Front. Syst. Neurosci. 10:31. doi: 10.3389/fnsys.2016.00031

Tsakiris, M. (2010). My body in the brain: a neurocognitive model of body-ownership. Neuropsychologia 48, 703–712. doi: 10.1016/j.neuropsychologia.2009.09.034

Varela, F., Lachaux, J. P., Rodriguez, E., and Martinerie, J. (2001). The brainweb: phase synchronization and large-scale integration. Nat. Rev. Neurosci. 2, 229–239. doi: 10.1038/35067550

Vedral, V. (2010). Decoding Reality. Oxford: Oxford University Press.

Von Baeyer, H. C. (1999). Maxwell’s Demon. New York, NY: Random House, 100–101.

Ward, L. M. (2011). The thalamic dynamic core theory of conscious experience. Conscious. Cogn. 20, 464–486. doi: 10.1016/j.concog.2011.01.007

Wheeler, J. (1986). “John Wheeler,” in The Ghost in the Atom, eds P. Davies and J. Brown (Cambridge: Cambridge University Press), 62.

Wheeler, J. A. (1989). “Information, physics, quantum: the search for links,” in Proceedings of the Third International Symposium on Foundations of Quantum Mechanics, Tokyo, 354–368.

Yuste, R. (2015). From the neuron doctrine to neural networks. Nat. Rev. Neurosci. 16, 487–497. doi: 10.1038/nrn3962

Keywords: information thermodynamics, Landauer limit, free energy principle, optimization, Bekenstein bound

Citation: Street S (2016) Neurobiology as Information Physics. Front. Syst. Neurosci. 10:90. doi: 10.3389/fnsys.2016.00090

Received: 10 August 2016; Accepted: 28 October 2016;
Published: 15 November 2016.

Edited by:

Yan Mark Yufik, Virtual Structures Research, Inc., USA

Reviewed by:

Guillem Collell, Katholieke Universiteit Leuven, Belgium
Arto Annila, University of Helsinki, Finland

Copyright © 2016 Street. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Sterling Street, sterling.street@uga.edu
