Preprint Article, Version 1 (not peer-reviewed). Preserved in Portico.

Fast Learning in Complex Domains

Version 1 : Received: 25 June 2021 / Approved: 28 June 2021 / Online: 28 June 2021 (10:06:07 CEST)

How to cite: Worden, R. Fast Learning in Complex Domains. Preprints 2021, 2021060635. https://doi.org/10.20944/preprints202106.0635.v1

Abstract

Bayesian formulations of learning imply that whenever the evidence for a correlation between events in an animal’s habitat is sufficient, the correlation is learned. Regularities can therefore be learned rapidly, from small numbers of examples. Learning at this speed confers the maximum possible fitness; no faster learning is possible. There is evidence in many domains that animals and people learn at near-Bayesian-optimal speeds. These domains include associative conditioning, and the more complex domains of navigation and language. There are computational models of learning which learn at near-Bayesian speeds in complex domains, and which scale well, learning thousands of pieces of knowledge (i.e., relations and associations). These are not neural net models. They can be defined in computational terms, as algorithms and data structures at David Marr’s [1] Level Two. Their key data structures are composite feature structures, which are graphs of multiple linked nodes. This leads to the hypothesis that animal learning results not from deep neural nets (which typically require thousands of training examples), but from neural implementations of the Level Two models of fast learning; and that neurons provide the facilities needed to implement those models at Marr’s Level Three. The required facilities include feature structures, dynamic binding, one-shot memory for many feature structures, pattern-based associative retrieval, and unification and generalization of feature structures. These may be supported by multiplexing of data and metadata in the same neural fibres.
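
To illustrate the claimed speed, here is a standard Bayesian odds calculation; the numbers (prior odds of 1:10, likelihood ratio of 5) are illustrative assumptions, not figures from the paper. Suppose an animal weighs a hypothesis H, "event A predicts event B", against its negation, and each observed co-occurrence is five times more likely under H than under not-H. The posterior odds after n independent examples are then

\[
\frac{P(H \mid E_1,\ldots,E_n)}{P(\neg H \mid E_1,\ldots,E_n)}
  = \frac{P(H)}{P(\neg H)} \prod_{i=1}^{n} \frac{P(E_i \mid H)}{P(E_i \mid \neg H)}
  = \frac{1}{10} \times 5^{n},
\]

which already favour H by 12.5:1 after only n = 3 examples. Whenever the per-example likelihood ratio is large, a handful of examples suffices; this is the sense in which Bayesian learning is fast and no learner can systematically do better.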
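The two key operations named in the abstract, unification and generalization of feature structures, can be sketched in a few lines of Python. This is a minimal illustration under simplifying assumptions, not the paper's model: feature structures are represented as nested dicts (trees, rather than the general linked graphs the paper describes, and without dynamic binding), unification merges all compatible information, and generalization keeps only the information two structures share.

    # Minimal sketch (an illustrative assumption, not the paper's implementation):
    # feature structures as nested dicts whose leaves are atomic values.

    def unify(a, b):
        # Most general structure carrying all information in both a and b,
        # or None if they conflict on some atomic feature value.
        if isinstance(a, dict) and isinstance(b, dict):
            out = dict(a)
            for key, b_val in b.items():
                if key in out:
                    merged = unify(out[key], b_val)
                    if merged is None:
                        return None  # conflicting values: unification fails
                    out[key] = merged
                else:
                    out[key] = b_val
            return out
        return a if a == b else None  # atoms unify only if equal

    def generalize(a, b):
        # Most specific structure that both a and b instantiate: keep a
        # feature only where the two structures agree.
        if isinstance(a, dict) and isinstance(b, dict):
            out = {}
            for key in a.keys() & b.keys():
                g = generalize(a[key], b[key])
                if g is not None:
                    out[key] = g
            return out
        return a if a == b else None  # differing atoms carry no shared information

    # Two observed scenes generalize to the regularity they share:
    scene1 = {"agent": {"kind": "bee"}, "action": "visits",
              "patient": {"kind": "flower", "colour": "blue"}}
    scene2 = {"agent": {"kind": "bee"}, "action": "visits",
              "patient": {"kind": "flower", "colour": "red"}}
    print(generalize(scene1, scene2))
    # prints the shared structure: agent bee, action visits, patient flower
    # (the incidental colour feature is dropped)

Generalization is what extracts a regularity from few examples: two scenes differing only in incidental features yield their common structure, here "a bee visits a flower", after a single pair of observations.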

Keywords

Fast learning; Bayes’ theorem; navigation; language; spatial cognition; feature structures; unification; generalization; dynamic binding; metadata multiplexing.

Subject

Social Sciences, Psychology
