Fig. 2: The evolution of reservoir computing.

From: Connectome-based reservoir computing with the conn2res toolbox

a Generic recurrent neural network (RNN) model. In classic RNNs, the recurrent connections are learned via backpropagation-through-time [146]. The network topology that emerges from training does not necessarily result in biologically plausible connectivity patterns. b The conventional reservoir computing architecture consists of an RNN with randomly assigned weights. The connections of the reservoir remain fixed during training; learning occurs only at the connections between the recurrent network and the readout module. Examples include classic liquid state machines [32] and echo-state networks [36]. c Thanks to advances in imaging technologies, it is now possible to implement reservoirs whose network architecture is informed by empirical structural connectivity networks, or connectomes. This makes it possible to explore the link between structure and function in biological brain networks from a computational point of view.
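To make the architecture in panel b concrete, here is a minimal echo-state-network sketch in NumPy: the reservoir weights W and input weights W_in are fixed and random, and only the linear readout W_out is trained, via ridge regression. All names and parameters (n_reservoir, spectral_radius, alpha) are illustrative, not the conn2res API. For a connectome-based reservoir as in panel c, W would instead be an empirical structural connectivity matrix, rescaled the same way.

```python
# Minimal echo-state-network sketch (panel b). Assumes a 1-D input
# stream and a ridge-regression readout; not the conn2res API.
import numpy as np

rng = np.random.default_rng(0)

n_reservoir = 100       # number of recurrent units
spectral_radius = 0.9   # rescale W so the reservoir has fading memory

# Fixed random reservoir weights (panel b). For a connectome-based
# reservoir (panel c), replace this random W with an empirical
# structural connectivity matrix, rescaled the same way.
W = rng.normal(size=(n_reservoir, n_reservoir))
W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-1, 1, size=n_reservoir)

def run_reservoir(u):
    """Drive the fixed reservoir with input sequence u; collect states."""
    x = np.zeros(n_reservoir)
    states = np.empty((len(u), n_reservoir))
    for t, u_t in enumerate(u):
        x = np.tanh(W @ x + W_in * u_t)  # recurrent weights stay fixed
        states[t] = x
    return states

# Toy task: predict the next sample of a sine wave.
u = np.sin(np.linspace(0, 8 * np.pi, 500))
X = run_reservoir(u[:-1])
y = u[1:]

# Only the readout is trained (ridge regression); W and W_in are untouched.
alpha = 1e-6
W_out = np.linalg.solve(X.T @ X + alpha * np.eye(n_reservoir), X.T @ y)
print("train MSE:", np.mean((X @ W_out - y) ** 2))
```

Keeping W fixed is what separates reservoir computing (panel b) from the fully trained RNN in panel a: training reduces to a linear least-squares problem at the readout, which is why an arbitrary weight matrix, including a biologically measured connectome, can serve as the reservoir.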
