Preprint Article Version 1 Preserved in Portico This version is not peer-reviewed

Exploiting Signal Propagation Delays to match Task Memory Requirements in Reservoir Computing

Version 1 : Received: 24 March 2024 / Approved: 25 March 2024 / Online: 25 March 2024 (15:04:44 CET)

How to cite: Iacob, S.; Dambre, J. Exploiting Signal Propagation Delays to match Task Memory Requirements in Reservoir Computing. Preprints 2024, 2024031489. https://doi.org/10.20944/preprints202403.1489.v1

Abstract

Recurrent neural networks (RNNs) transmit information over time through recurrent connections. In contrast, biological neural networks employ many additional temporal processing mechanisms. One of these mechanisms is inter-neuron delay, caused by varying axon properties. Recently, this feature was implemented in echo state networks (ESNs), a type of RNN, by assigning spatial locations to neurons and introducing distance-dependent inter-neuron delays. These delays were shown to significantly improve ESN task performance. However, it has so far remained unclear why distance-based delay networks (DDNs) perform better than ESNs. In this paper, we show that by optimizing inter-node delays, the memory capacity of the network matches the memory requirements of the task. As such, networks concentrate their memory capacity at the points in the past that contain the most information for the task at hand. Moreover, we show that DDNs have more total linear memory capacity, with the same amount of non-linear processing power.
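The core mechanism described above can be illustrated with a minimal sketch. The code below is not the authors' implementation; it assumes random 2-D neuron positions, a hypothetical propagation speed `v` that converts Euclidean distance into an integer delay (in time steps, minimum 1), and a ring buffer of past reservoir states so that each connection reads its presynaptic neuron at its own delay:

```python
import numpy as np

rng = np.random.default_rng(0)

n = 50                                         # reservoir size (illustrative)
positions = rng.uniform(0, 1, size=(n, 2))     # random 2-D neuron locations

# Pairwise Euclidean distances -> connection-specific integer delays.
dist = np.linalg.norm(positions[:, None] - positions[None, :], axis=-1)
v = 0.2                                        # assumed propagation speed
delays = np.clip((dist / v).astype(int) + 1, 1, 9)  # >= 1 step, capped

# Sparse recurrent weights, rescaled to spectral radius 0.9 (standard ESN practice).
W = rng.normal(0, 1, (n, n)) * (rng.random((n, n)) < 0.1)
W *= 0.9 / np.abs(np.linalg.eigvals(W)).max()
W_in = rng.uniform(-0.5, 0.5, n)               # input weights

max_delay = delays.max() + 1
history = np.zeros((max_delay, n))             # ring buffer of past states

def step(u, t):
    """One DDN update: neuron j sees neuron i's state from delays[j, i] steps ago."""
    # Fancy indexing: delayed[j, i] = history[(t - delays[j, i]) % max_delay, i]
    delayed = history[(t - delays) % max_delay, np.arange(n)]
    x = np.tanh((W * delayed).sum(axis=1) + W_in * u)
    history[t % max_delay] = x
    return x

states = np.array([step(np.sin(0.1 * t), t) for t in range(200)])
```

With all delays fixed at 1 this reduces to an ordinary ESN update; optimizing `v` (or the positions) shifts where in the past the reservoir concentrates its memory.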

Keywords

distance-based delays; inter-neuron delays; echo state networks; recurrent neural networks; reservoir computing; memory capacity

Subject

Computer Science and Mathematics, Artificial Intelligence and Machine Learning
