Article
Version 2
Preserved in Portico. This version is not peer-reviewed.
Symmetric Encoding-decoding Framework without Backpropagation
Version 1: Received: 29 November 2022 / Approved: 29 November 2022 / Online: 29 November 2022 (07:07:56 CET)
Version 2 : Received: 29 November 2022 / Approved: 29 November 2022 / Online: 29 November 2022 (08:13:27 CET)
How to cite: Zhai, P. Symmetric Encoding-decoding Framework without Backpropagation. Preprints 2022, 2022110537. https://doi.org/10.20944/preprints202211.0537.v2
Abstract
We propose a forward-only multi-layer encoding-decoding framework based on the principle of Maximal Coding Rate Reduction (MCR$^2$), an information-theoretic metric that measures a statistical distance between two sets of feature vectors up to their second moments. The encoder transforms the data vectors themselves via gradient ascent, maximizing the MCR$^2$ distance between different classes in feature space and yielding class-wise mutually orthogonal subspace representations. The decoder mirrors the encoder: it transforms the subspace feature vectors via gradient descent to minimize the MCR$^2$ distance between the reconstructed data and the original data. We show that the encoder maps the data to linear discriminative representations without destroying their higher-order manifold structure, and that the decoder reconstructs the data with high fidelity.
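To make the objective concrete, the following is a minimal PyTorch sketch of the MCR$^2$ coding rate, the rate-reduction objective, and one forward-only encoder step. The function names, the step size eta, the quantization parameter eps, and the use of autograd to obtain the gradient with respect to the features are illustrative assumptions, not the authors' implementation.

    import torch

    def coding_rate(Z, eps=0.5):
        # R(Z) = 1/2 * logdet(I + d/(n*eps^2) * Z @ Z.T), with Z of shape
        # d x n (feature vectors as columns).
        d, n = Z.shape
        return 0.5 * torch.logdet(torch.eye(d) + (d / (n * eps ** 2)) * Z @ Z.T)

    def rate_reduction(Z, labels, eps=0.5):
        # MCR^2 objective: Delta R = R(Z) - sum_j (n_j / n) * R(Z_j),
        # where Z_j collects the columns of class j.
        n = Z.shape[1]
        delta = coding_rate(Z, eps)
        for j in labels.unique():
            Zj = Z[:, labels == j]
            delta = delta - (Zj.shape[1] / n) * coding_rate(Zj, eps)
        return delta

    def encoder_layer(Z, labels, eta=0.1, eps=0.5):
        # One forward-only layer: a gradient *ascent* step on the feature
        # vectors themselves (no weight updates, hence no backpropagation
        # through a network), followed by re-normalization to the unit sphere.
        Z = Z.detach().requires_grad_(True)
        grad, = torch.autograd.grad(rate_reduction(Z, labels, eps), Z)
        Z_new = Z.detach() + eta * grad
        return Z_new / Z_new.norm(dim=0, keepdim=True)

Under these assumptions, a decoder layer would follow the symmetric recipe: a gradient descent step that shrinks the MCR$^2$ distance between the reconstructed vectors and the originals.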
Keywords
Backpropagation-free Network, Information Theory, Maximal Coding Rate Reduction
Subject
Computer Science and Mathematics, Computer Science
Copyright: This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.