Preprint Article Version 1 Preserved in Portico This version is not peer-reviewed

An Efficient Gaussian Mixture Model and Its Application to Neural Network

Version 1 : Received: 15 February 2023 / Approved: 16 February 2023 / Online: 16 February 2023 (06:59:43 CET)
Version 2 : Received: 10 March 2023 / Approved: 16 March 2023 / Online: 16 March 2023 (02:21:19 CET)

How to cite: Lu, W.; Ding, D.; Wu, F.; Yuan, G. An Efficient Gaussian Mixture Model and Its Application to Neural Network. Preprints 2023, 2023020275. https://doi.org/10.20944/preprints202302.0275.v1

Abstract

When modeling real-world data, uncertainties in both the data (aleatoric) and the model (epistemic) are not necessarily Gaussian. These uncertainties can be multimodal or take other particular forms that need to be captured. Gaussian mixture models (GMMs) are powerful tools for capturing such features of an unknown density with a peculiar shape. Inspired by Fourier expansion, we propose a GMM structure that decomposes arbitrary unknown densities and prove its convergence property. We introduce a simple method for learning GMMs from sampled datasets, and apply it to two classic neural network applications. The first learns the encoder output of an autoencoder as a GMM for generating handwritten-digit images; the second learns Gram matrices as GMMs for a style-transfer algorithm. Compared with the classic expectation-maximization (EM) algorithm, our method avoids complex formulations while achieving applicable accuracy.
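The authors' own learning method is not detailed in the abstract. As a point of reference for the baseline it is compared against, the sketch below fits a one-dimensional GMM to bimodal samples using plain EM; the data, component count, and initialization are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic bimodal sample: mixture of N(-2, 0.5^2) and N(3, 1.0^2)
data = np.concatenate([rng.normal(-2, 0.5, 500), rng.normal(3, 1.0, 500)])

def fit_gmm_1d(x, k=2, iters=100):
    """Fit a 1-D Gaussian mixture with the classic EM algorithm."""
    mu = np.quantile(x, np.linspace(0.1, 0.9, k))   # spread initial means
    var = np.full(k, x.var())
    pi = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibilities of each component for each point
        d = x[:, None] - mu[None, :]
        logp = -0.5 * d**2 / var - 0.5 * np.log(2 * np.pi * var) + np.log(pi)
        logp -= logp.max(axis=1, keepdims=True)      # numerical stability
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, variances
        nk = r.sum(axis=0)
        pi = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu)**2).sum(axis=0) / nk
    return pi, mu, var

pi, mu, var = fit_gmm_1d(data)
```

With enough samples, the recovered means land near the true modes (-2 and 3), illustrating how a GMM captures multimodal densities that a single Gaussian cannot.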

Keywords

Neural Network; Uncertainties; GMM; Density approximation

Subject

Computer Science and Mathematics, Computer Science

