Preprint Article · Version 2 · Preserved in Portico · This version is not peer-reviewed

Symmetric Encoding-decoding Framework without Backpropagation

Version 1 : Received: 29 November 2022 / Approved: 29 November 2022 / Online: 29 November 2022 (07:07:56 CET)
Version 2 : Received: 29 November 2022 / Approved: 29 November 2022 / Online: 29 November 2022 (08:13:27 CET)

How to cite: Zhai, P. Symmetric Encoding-decoding Framework without Backpropagation. Preprints 2022, 2022110537. https://doi.org/10.20944/preprints202211.0537.v2

Abstract

We propose a forward-only multi-layer encoding-decoding framework based on the principle of Maximal Coding Rate Reduction (MCR$^2$), an information-theoretic metric that measures a statistical distance between two sets of feature vectors up to the second moment. The encoder directly transforms data vectors themselves via gradient ascent to maximize the MCR$^2$ distance between different classes in the feature space, resulting in class-wise mutually orthogonal subspace representations. The decoder follows a process symmetric to the encoder, and transforms the subspace feature vectors via gradient descent to minimize the MCR$^2$ distance between the reconstructed data and the original data. We show that the encoder transforms data to linear discriminative representations without breaking the higher-order manifolds, and the decoder reconstructs the data with high fidelity.
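To make the abstract's objective concrete, here is a minimal NumPy sketch of the MCR² quantity and one encoder-style gradient-ascent step on the features. All function names, the hyperparameters (`eps`, `lr`), and the omission of the paper's sphere-normalization step are illustrative assumptions, not the authors' implementation; the analytic gradient of the log-det terms is standard.

```python
import numpy as np

def coding_rate(Z, eps=0.5):
    """R(Z) = 1/2 logdet(I + d/(n eps^2) Z Z^T) for a d x n feature matrix Z."""
    d, n = Z.shape
    alpha = d / (n * eps ** 2)
    return 0.5 * np.linalg.slogdet(np.eye(d) + alpha * Z @ Z.T)[1]

def mcr2(Z, labels, eps=0.5):
    """Coding rate reduction: rate of the whole set minus the
    class-size-weighted rates of the per-class subsets."""
    n = Z.shape[1]
    delta = coding_rate(Z, eps)
    for j in np.unique(labels):
        Zj = Z[:, labels == j]
        delta -= (Zj.shape[1] / n) * coding_rate(Zj, eps)
    return delta

def mcr2_ascent_step(Z, labels, lr=0.01, eps=0.5):
    """One gradient-ascent step on the features themselves (forward-only:
    no backpropagation through any network weights).
    Uses grad_Z of 1/2 logdet(I + a Z Z^T) = a (I + a Z Z^T)^{-1} Z."""
    d, n = Z.shape
    alpha = d / (n * eps ** 2)
    G = alpha * np.linalg.inv(np.eye(d) + alpha * Z @ Z.T) @ Z
    for j in np.unique(labels):
        idx = labels == j
        Zj = Z[:, idx]
        nj = Zj.shape[1]
        aj = d / (nj * eps ** 2)
        # subtract the gradient of the class-wise compression term
        G[:, idx] -= (nj / n) * aj * np.linalg.inv(np.eye(d) + aj * Zj @ Zj.T) @ Zj
        # (in practice features would also be re-normalized to the unit
        #  sphere after each step; omitted here for brevity)
    return Z + lr * G
```

Iterating `mcr2_ascent_step` pushes features of different classes toward mutually orthogonal subspaces; the decoder described above would run the symmetric procedure with gradient descent on the reconstruction's MCR² distance to the original data.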

Keywords

Backpropagation-free Network, Information Theory, Maximal Coding Rate Reduction

Subject

Computer Science and Mathematics, Computer Science

Comments (1)

Comment 1
Received: 29 November 2022
Commenter: Pengyuan Zhai
Commenter's Conflict of Interests: Author
Comment: Removed the line numbers.


