Preprint Article, Version 1 (preserved in Portico). This version is not peer-reviewed.

Minimax Lower Bounds for High-Dimensional Multi-Response Errors-in-Variables Regression

Version 1 : Received: 11 October 2022 / Approved: 12 October 2022 / Online: 12 October 2022 (09:57:25 CEST)

How to cite: Li, X.; Wu, D. Minimax Lower Bounds for High-Dimensional Multi-Response Errors-in-Variables Regression. Preprints 2022, 2022100169. https://doi.org/10.20944/preprints202210.0169.v1

Abstract

Noisy data are frequently encountered in real applications such as bioinformatics, neuroimaging, and remote sensing. Existing methods mainly consider linear or generalized linear errors-in-variables regression, while relatively little attention has been paid to the multivariate response case, and how to evaluate estimation performance under perturbed covariates remains an open question. In this paper, we consider the information-theoretic limitations of estimating a low-rank matrix in the multi-response errors-in-variables regression model. By applying information-theoretic tools and concentration inequalities, we derive a minimax lower bound in terms of the squared Frobenius loss, which recovers the rate established in the previous literature under the clean-covariate assumption. Hence, our result indicates that, even in the more realistic errors-in-variables setting, no additional samples are required to achieve rate-optimal estimation.
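
For orientation, the sketch below records, in illustrative notation not taken from the paper (sample size n, p covariates, q responses, rank r, noise level sigma), the model being studied and the standard clean-covariate minimax rate for low-rank estimation in squared Frobenius loss, which is the rate the abstract says is recovered; the paper's exact statement, constants, and conditions may differ.

% Multi-response errors-in-variables regression (illustrative formulation):
%   responses Y in R^{n x q}, true covariates X in R^{n x p},
%   observed covariates Z = X + W (additive measurement error),
%   coefficient matrix Theta^* with rank at most r.
\[
  Y = X\Theta^{*} + E, \qquad Z = X + W, \qquad \operatorname{rank}(\Theta^{*}) \le r .
\]
% Under clean covariates (W = 0), the classical minimax lower bound for
% low-rank estimation in squared Frobenius loss is of order
\[
  \inf_{\widehat{\Theta}} \ \sup_{\operatorname{rank}(\Theta^{*}) \le r}
  \mathbb{E}\, \bigl\| \widehat{\Theta} - \Theta^{*} \bigr\|_{F}^{2}
  \ \gtrsim\ \frac{\sigma^{2}\, r\, (p + q)}{n} ,
\]
% and the abstract asserts that the same order holds when only the perturbed
% covariates Z are observed, i.e. measurement error does not increase the
% sample size needed for rate-optimal estimation.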

Keywords

low-rank matrices; errors-in-variables models; lower bounds; Kullback-Leibler divergence; information-theoretic limitations

Subject

Computer Science and Mathematics, Probability and Statistics
