Preprint Article · Version 1 · Preserved in Portico · This version is not peer-reviewed

Computational Information Geometry For Binary Classification of High-Dimensional Random Tensors

Version 1: Received: 1 February 2018 / Approved: 1 February 2018 / Online: 1 February 2018 (16:32:04 CET)

How to cite: Pham, G.; Boyer, R.; Nielsen, F. Computational Information Geometry For Binary Classification of High-Dimensional Random Tensors. Preprints 2018, 2018020008. https://doi.org/10.20944/preprints201802.0008.v1

Abstract

The performance, in terms of minimal Bayes' error probability, for the detection of a high-dimensional random tensor is a fundamental but under-studied and difficult problem. In this work, we consider two Signal-to-Noise-Ratio (SNR)-based detection problems of interest. Under the alternative hypothesis, i.e., for a non-zero SNR, the observed signal is either a noisy rank-R tensor admitting a Q-order Canonical Polyadic Decomposition (CPD) with large factors of size N_q × R, for 1 ≤ q ≤ Q, where R, N_q → ∞ with R^{1/q}/N_q converging towards a finite constant, or a noisy tensor admitting a Tucker Decomposition (TKD) of multilinear (M_1, ..., M_Q)-rank with large factors of size N_q × M_q, for 1 ≤ q ≤ Q, where N_q, M_q → ∞ with M_q/N_q converging towards a finite constant. The detection of the random entries (coefficients) of the core tensor in the CPD/TKD is hard to study, since the exact derivation of the error probability is mathematically intractable. To circumvent this technical difficulty, the Chernoff Upper Bound (CUB) for larger SNRs and the Fisher information at low SNR are derived and studied, based on information geometry theory. The tightest CUB is reached for the value of s minimizing the error exponent, denoted by s⋆. In general, due to the asymmetry of the s-divergence, the Bhattacharyya Upper Bound (BUB), that is, the Chernoff information calculated at s = 1/2, cannot solve this problem effectively. As a consequence, we rely on a costly numerical optimization strategy to find s⋆. However, thanks to powerful random matrix theory tools, a simple analytical expression of s⋆ is provided with respect to the SNR in the two schemes considered. A main conclusion of this work is that the BUB is the tightest bound at low SNRs; this property, however, no longer holds at higher SNRs.
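
To make the role of s⋆ concrete, the minimal sketch below (not the authors' code) illustrates, for a generic zero-mean Gaussian detection problem H0: x ~ N(0, Σ0) versus H1: x ~ N(0, Σ1), how the Chernoff s-divergence can be maximized numerically over s ∈ (0, 1) to obtain the tightest CUB, with s = 1/2 recovering the BUB. The covariances, the low-rank factor H, and the values of N, R, and the SNR are illustrative placeholders; they do not reproduce the tensor-structured (CPD/TKD) covariances or the closed-form s⋆ derived in the paper.

```python
# Sketch: tightest Chernoff Upper Bound (CUB) for a zero-mean Gaussian
# binary detection problem, assuming equal priors.  Illustrative only.
import numpy as np
from scipy.optimize import minimize_scalar

def chernoff_s_divergence(s, Sigma0, Sigma1):
    """-log of the Chernoff coefficient c(s) = int p0(x)^s p1(x)^(1-s) dx
    for the zero-mean Gaussians p0 = N(0, Sigma0) and p1 = N(0, Sigma1)."""
    _, logdet_mix = np.linalg.slogdet(s * Sigma1 + (1.0 - s) * Sigma0)
    _, logdet0 = np.linalg.slogdet(Sigma0)
    _, logdet1 = np.linalg.slogdet(Sigma1)
    return 0.5 * (logdet_mix - (1.0 - s) * logdet0 - s * logdet1)

# Hypothetical covariances: white noise under H0, a random rank-R signal
# plus noise under H1 (NOT the CPD/TKD-structured covariances of the paper).
rng = np.random.default_rng(0)
N, R, snr = 20, 3, 1.0
H = rng.standard_normal((N, R))
Sigma0 = np.eye(N)
Sigma1 = np.eye(N) + (snr / R) * (H @ H.T)

# Tightest CUB: maximize the s-divergence (i.e., minimize the Chernoff
# coefficient) over s in (0, 1); s = 1/2 gives the Bhattacharyya bound (BUB).
res = minimize_scalar(lambda s: -chernoff_s_divergence(s, Sigma0, Sigma1),
                      bounds=(1e-6, 1.0 - 1e-6), method="bounded")
s_star = res.x
cub = 0.5 * np.exp(-chernoff_s_divergence(s_star, Sigma0, Sigma1))  # equal priors
bub = 0.5 * np.exp(-chernoff_s_divergence(0.5, Sigma0, Sigma1))
print(f"s* = {s_star:.3f}, CUB on Pe = {cub:.4e}, BUB on Pe = {bub:.4e}")
```

In the paper itself, this one-dimensional numerical search is replaced, in the large-dimensional regime, by an analytical expression for s⋆ obtained through random matrix theory; the sketch above only shows the generic CUB/BUB mechanics that the analysis builds on.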

Keywords

Optimal Bayesian detection, information geometry, minimal error probability, Chernoff/Bhattacharyya upper bound, large random tensor, Fisher information, large random sensing matrix

Subject

Computer Science and Mathematics, Probability and Statistics
