In real-world applications, observations are often constrained to a small fraction of a system. Such spatial subsampling can be caused by the inaccessibility or the sheer size of the system, and cannot be overcome by longer sampling. Spatial subsampling can strongly bias inferences about a system’s aggregated properties. To overcome the bias, we derive analytically a subsampling scaling framework that is applicable to different observables, including the distributions of neuronal avalanches, of the number of people infected during an epidemic outbreak, and of node degrees. We demonstrate how to infer the correct distributions of the underlying full system, how to apply the framework to distinguish critical from subcritical systems, and how to disentangle subsampling and finite-size effects. Lastly, we apply subsampling scaling to neuronal avalanche models and to recordings from developing neural networks. We show that only mature, but not young, networks follow power-law scaling, indicating self-organization to criticality during development.
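To make the subsampling bias concrete, the sketch below thins one of the observables named in the abstract, the node-degree distribution: only a fraction of nodes is observed, and an observed node's degree counts only neighbours that are themselves observed. Network size, degree distribution and sampling fraction are illustrative assumptions, and the snippet demonstrates the bias only; it does not implement the paper's scaling framework.

```python
# Minimal sketch (not the paper's scaling framework): illustrate the subsampling
# bias for the node-degree distribution.  Only a fraction p of nodes is observed,
# so an observed node's degree counts only neighbours that are also observed.
# Network size, degree distribution and sampling fraction are illustrative.
import numpy as np

rng = np.random.default_rng(0)
N, p = 10_000, 0.1                         # full system size, observed fraction
true_degree = rng.poisson(20, size=N)      # full-system degrees (Poisson, mean 20)

observed = rng.random(N) < p               # which nodes we can see
# each neighbour of an observed node is itself observed with probability ~p,
# so the observed degree is (approximately) a binomial thinning of the true degree
sub_degree = rng.binomial(true_degree[observed], p)

print("mean degree, full system:", true_degree.mean())
print("mean degree, subsampled :", sub_degree.mean())   # biased downward by ~p
```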
The avalanche transformation as a model for avalanches in neural dynamics was introduced in my dissertation in 2008. Here we discuss this transformation in terms of group actions, random dynamics and skew products with a finite invariant measure. Some open problems are mentioned.
Structural inhomogeneities in synaptic efficacies have a strong impact on the population response dynamics of cortical networks and are believed to play an important role in their functioning. However, little is known about how such inhomogeneities could evolve by means of synaptic plasticity. Here we present an adaptive model of a balanced neuronal network that combines two different types of plasticity, spike-timing-dependent plasticity (STDP) and synaptic scaling. The plasticity rules yield long-tailed distributions of both synaptic weights and firing rates. Simultaneously, a highly connected subnetwork of driver neurons with strong synapses emerges. Coincident spiking activity of several driver cells can evoke population bursts, and driver cells have dynamical properties similar to those of leader neurons found experimentally. Our model allows us to observe the delicate interplay between the structural and dynamical properties of the emergent inhomogeneities. It is simple, robust to parameter changes, and able to explain a multitude of different experimental findings within one basic network.
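As a rough illustration of combining the two plasticity rules named above, the sketch below applies a pair-based STDP update together with multiplicative synaptic scaling toward a target firing rate. The functional forms, parameter values and function names (stdp_update, synaptic_scaling) are illustrative assumptions, not the exact rules of the model in the paper.

```python
# Minimal sketch: pair-based STDP (potentiate if pre fires before post, depress
# otherwise) combined with multiplicative synaptic scaling toward a target rate.
# All parameters and update forms are illustrative assumptions.
import numpy as np

A_plus, A_minus, tau = 0.01, 0.012, 20.0    # STDP amplitudes and time constant (ms)
target_rate, beta = 5.0, 1e-3               # scaling target (Hz) and scaling rate

def stdp_update(w, dt_pre_post):
    """Weight change for one pre/post spike pair, dt = t_post - t_pre (ms)."""
    if dt_pre_post > 0:                       # pre before post -> potentiation
        return w + A_plus * np.exp(-dt_pre_post / tau)
    else:                                     # post before pre -> depression
        return w - A_minus * np.exp(dt_pre_post / tau)

def synaptic_scaling(w_in, rate):
    """Multiplicatively scale all incoming weights of a neuron toward its target rate."""
    return w_in * (1.0 + beta * (target_rate - rate))

w = 0.5
w = stdp_update(w, +3.0)                            # causal pairing strengthens the synapse
w = stdp_update(w, -7.0)                            # anti-causal pairing weakens it
w_in = synaptic_scaling(np.full(10, w), rate=8.0)   # too-active neuron is scaled down
print(w, w_in[:3])
```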
Critical behavior in neural networks is characterized by scale-free avalanche size distributions and can be explained by self-regulatory mechanisms. Theoretical and experimental evidence indicates that information storage capacity reaches its maximum in the critical regime. We study the effect of the structural connectivity formed by Hebbian learning on the criticality of network dynamics. A network endowed with Hebbian learning alone does not allow for simultaneous information storage and criticality. However, the critical regime can be stabilized by short-term synaptic dynamics in the form of synaptic depression and facilitation or, alternatively, by homeostatic adaptation of the synaptic weights. We show that a heterogeneous distribution of maximal synaptic strengths does not preclude criticality if Hebbian learning is alternated with periods of critical dynamics recovery. We discuss the relevance of these findings for the flexibility of memory in aging and with respect to the recent theory of synaptic plasticity.
How a complex network is connected crucially impacts its dynamics and function. Percolation, the transition to extensive connectedness on gradual addition of links, was long believed to be continuous, but recent numerical evidence of ‘explosive percolation’ suggests that it might also be discontinuous if links compete for addition. Here we analyse the microscopic mechanisms underlying discontinuous percolation processes and reveal a strong impact of single-link additions. We show that in generic competitive percolation processes, including those showing explosive percolation, single links do not induce a discontinuous gap in the largest cluster size in the thermodynamic limit. Nevertheless, our results highlight that for large finite systems single links may still induce substantial gaps, because gap sizes scale weakly algebraically with system size. Several essentially macroscopic clusters coexist immediately before the transition, announcing discontinuous percolation. These results explain how single links may drastically change macroscopic connectivity in networks where links add competitively.
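A standard example of the competitive link addition discussed above is the Achlioptas product rule: at each step two candidate edges are drawn and the one joining clusters with the smaller product of sizes is added. The sketch below, using a union-find structure and an illustrative system size, shows how such a process can be simulated; it is a generic example of competitive percolation, not the specific processes analysed in the paper.

```python
# Minimal sketch of competitive ("explosive") percolation with the Achlioptas
# product rule; system size and number of added links are illustrative.
import random

N = 10_000
parent = list(range(N))
size = [1] * N

def find(x):
    """Union-find root lookup with path halving."""
    while parent[x] != x:
        parent[x] = parent[parent[x]]
        x = parent[x]
    return x

giant = 1
for step in range(N):
    e1 = random.sample(range(N), 2)
    e2 = random.sample(range(N), 2)
    # product rule: keep the candidate edge whose endpoint clusters have
    # the smaller product of sizes
    cost1 = size[find(e1[0])] * size[find(e1[1])]
    cost2 = size[find(e2[0])] * size[find(e2[1])]
    a, b = e1 if cost1 <= cost2 else e2
    ra, rb = find(a), find(b)
    if ra != rb:                              # merge the two clusters
        if size[ra] < size[rb]:
            ra, rb = rb, ra
        parent[rb] = ra
        size[ra] += size[rb]
        giant = max(giant, size[ra])

print("relative size of largest cluster after N links:", giant / N)
```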
We analytically describe a transition scenario to self-organized criticality (SOC) that is new for physics as well as for neuroscience: it combines the criticality of first- and second-order phase transitions with a SOC phase. We consider a network of pulse-coupled neurons interacting via dynamical synapses, which exhibit depression and facilitation as found in experiments. We analytically show the coexistence of a SOC phase and a subcritical phase connected by a cusp bifurcation. Switching between the two phases can be triggered by varying the intensity of noisy inputs.
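The depression and facilitation referred to above are commonly modelled with a resource variable x and a utilization variable u in the style of Tsodyks and Markram. The sketch below is a generic discrete-time version of such synapse dynamics with illustrative time constants; it is meant only to show the qualitative behaviour, not to reproduce the paper's equations.

```python
# Minimal sketch of depressing and facilitating synapse dynamics
# (Tsodyks-Markram-style resource x and utilization u).
# Time constants and the update form are illustrative assumptions.
import numpy as np

tau_rec, tau_fac, U = 200.0, 600.0, 0.2     # recovery / facilitation constants (ms), baseline use
dt = 1.0                                     # integration step (ms)

def step(x, u, spike):
    """Advance resources x and utilization u by one time step."""
    x += dt * (1.0 - x) / tau_rec            # resources recover toward 1
    u += dt * (U - u) / tau_fac              # utilization decays toward baseline U
    psc = 0.0
    if spike:
        u += U * (1.0 - u)                    # facilitation: utilization jumps
        psc = u * x                           # transmitted amplitude
        x -= u * x                            # depression: resources are consumed
    return x, u, psc

x, u = 1.0, U
amps = []
for t in range(500):
    spike = (t % 50 == 0)                     # 20 Hz presynaptic spike train
    x, u, psc = step(x, u, spike)
    if spike:
        amps.append(round(psc, 3))
print("successive transmitted amplitudes:", amps)   # facilitation, then depression
```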
We show that a network of spiking neurons exhibits robust self-organized criticality if the synaptic efficacies follow realistic dynamics. Deriving analytical expressions for the average coupling strengths and inter-spike intervals, we demonstrate that networks with dynamical synapses exhibit critical avalanche dynamics for a wide range of interaction parameters. We prove that in the thermodynamic limit the network becomes critical for all sufficiently large coupling parameters. We thereby explain experimental observations in which cortical neurons show avalanche activity with the total intensity of firing events distributed as a power law.
In many studies of self-organized criticality (SOC), branching processes were used to model the dynamics of the system's activity during avalanches. This mathematical simplification has also been adopted when investigating systems with complicated connection topologies, including recurrent and subthreshold interactions. However, none of these studies analyzed whether this convenient approximation is indeed applicable. In the present paper we study the correspondence between avalanches generated by branching processes and by a fully connected neural network. The benefit of this analysis is not only a justification of the correspondence but also a simple learning rule that allows self-organization of the network towards a critical state, as recently observed in slice experiments.
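For reference, the branching-process picture of avalanches invoked above can be simulated in a few lines: a Galton-Watson process with Poisson offspring, whose avalanche-size distribution approaches the characteristic s^(-3/2) power law at branching ratio one. Offspring distribution, sample sizes and the size cap are illustrative assumptions.

```python
# Minimal sketch of the branching-process picture of avalanches: a Galton-Watson
# process with Poisson offspring; at branching ratio sigma = 1 the avalanche-size
# distribution approaches the s^(-3/2) power law.  Parameters are illustrative.
import numpy as np

rng = np.random.default_rng(1)

def avalanche_size(sigma, cap=10_000):
    """Total number of events in one avalanche started by a single seed."""
    size, active = 0, 1
    while active > 0 and size < cap:
        size += active
        active = rng.poisson(sigma * active)   # each active unit triggers ~sigma offspring
    return size

for sigma in (0.8, 1.0):                        # subcritical vs critical branching ratio
    sizes = np.array([avalanche_size(sigma) for _ in range(20_000)])
    print(f"sigma={sigma}: mean size={sizes.mean():.1f}, "
          f"P(size>100)={np.mean(sizes > 100):.4f}")
```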