ARTICLE | doi:10.20944/preprints202210.0169.v1
Subject: Computer Science and Mathematics, Probability and Statistics
Keywords: low-rank matrices; errors-in-variables models; lower bounds; Kullback-Leibler divergence; information-theoretic limitations
Online: 12 October 2022 (09:57:25 CEST)
Noisy data are routinely encountered in real applications such as bioinformatics, neuroimaging, and remote sensing. Existing methods mainly consider linear or generalized linear errors-in-variables regression, while relatively little attention has been paid to the multivariate-response case, and how to evaluate estimation performance under perturbed covariates remains an open question. In this paper, we study the information-theoretic limitations of estimating a low-rank matrix in the multi-response errors-in-variables regression model. By applying information-theoretic arguments together with statistical techniques based on concentration inequalities, we derive a minimax lower bound in terms of the squared Frobenius loss, which recaptures the rate established under the clean-covariate assumption in the previous literature. Our result therefore indicates that, even in the more realistic errors-in-variables setting, no additional samples are required to achieve a rate-optimal estimate.
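For context, a minimax lower bound of the kind described here typically takes the following form in low-rank multi-response regression with clean covariates; the display below is a standard-rate sketch for illustration only (the paper's exact statement, constants, and noise scaling may differ), where $\Theta^* \in \mathbb{R}^{d_1 \times d_2}$ is the unknown coefficient matrix of rank at most $r$ and $n$ is the sample size:

\[
\inf_{\widehat{\Theta}} \; \sup_{\substack{\Theta^* \in \mathbb{R}^{d_1 \times d_2} \\ \operatorname{rank}(\Theta^*) \le r}}
\mathbb{E}\,\bigl\| \widehat{\Theta} - \Theta^* \bigr\|_F^2
\;\gtrsim\; \frac{r\,(d_1 + d_2)}{n},
\]

where the infimum ranges over all estimators and $\gtrsim$ hides constants and noise-level factors. The abstract's claim is that a bound of this order continues to hold when the covariates are observed with error, so the errors-in-variables setting incurs no loss in the minimax rate.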