Preprint
Article

This version is not peer-reviewed.

Geometric Information Bounds for CSI Manifolds in Hybrid 6G Networks

Submitted: 02 March 2026
Posted: 03 March 2026


Abstract
This paper proposes a novel information-theoretic upper bound on the mutual information between the physical position of a user and the observed MIMO channel state information (CSI). Unlike classical Cramér–Rao bounds or I-MMSE relations, our bound explicitly incorporates the spatial variability of the channel via the Jacobian of the channel with respect to position. We provide a derivation for both local linearized models and global nonlinear bounds, highlighting the dependence on array geometry and multipath structure. The results offer new insight into the intrinsic information available for position estimation and semantic localization in wireless networks.
Keywords: 

1. Introduction

Localization in wireless networks is a fundamental task with applications in 5G/6G systems, beam management, and semantic inference. Classical approaches often rely on time-of-arrival, angle-of-arrival, or received-signal-strength measurements to derive Cramér–Rao bounds (CRB) on estimation performance [1]. While useful, these bounds focus on estimator variance and do not provide global information-theoretic limits on the mutual information between user position and the channel state.

Recently, the mutual information between latent variables and observed channel measurements has gained attention as a fundamental metric for understanding the limits of semantic localization [2,3]. However, existing results mostly treat linear channels or additive SNR parameters, relying on I-MMSE relations [2] or De Bruijn identities [4] for Gaussian perturbations. These formulations do not account for the explicit nonlinear dependence of MIMO CSI on spatial position, array geometry, or multipath components.

In this paper, we derive a new upper bound on $I(X;Y)$, the mutual information between the position $X$ and the observed CSI $Y$, for a MIMO system with an antenna array. The bound uses the Jacobian of the channel function with respect to position and integrates local contributions along trajectories in space, providing a global nonlinear bound that captures the intrinsic limitations of channel-based localization.

2. Related Work

Classical localization bounds include CRB and Ziv-Zakai bounds for time-of-arrival and angle-of-arrival methods [5,6]. Information-theoretic approaches have been studied for linear Gaussian channels, using the I-MMSE relationship to relate mutual information derivatives to estimation error [2]. De Bruijn identities have been applied to entropy growth under Gaussian noise addition [4]. None of these works provide explicit bounds for nonlinear MIMO CSI functions dependent on spatial position.

3. MIMO Channel Model and Gaussian Conditioning

We consider a single user located at position $X \in \mathbb{R}^2$ and a base station equipped with an $M$-element uniform linear array (ULA). The narrowband MIMO channel can be represented as a superposition of $K$ propagation paths:
$$h(X) = \sum_{k=1}^{K} \alpha_k(X)\, a\big(\theta_k(X)\big),$$
where $\alpha_k(X) \in \mathbb{C}$ is the complex path gain for path $k$, $a(\theta_k(X)) \in \mathbb{C}^M$ is the array steering vector associated with the angle of arrival $\theta_k(X)$, and $K$ is the total number of multipath components. The steering vector for a uniform linear array is often expressed as [7]:
$$a(\theta) = \frac{1}{\sqrt{M}} \left[\, 1,\; e^{j 2\pi \frac{d}{\lambda} \sin\theta},\; \ldots,\; e^{j 2\pi \frac{d}{\lambda} (M-1)\sin\theta} \,\right]^{T},$$
where d is the inter-element spacing and λ is the carrier wavelength. This formulation ensures that the array response captures the phase differences induced by signals arriving from different directions.
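As a concrete reference, the steering vector above is straightforward to evaluate numerically. The following minimal Python sketch (function name and the half-wavelength default are our own choices, not taken from the paper's repository) implements $a(\theta)$ with the $1/\sqrt{M}$ normalization:

```python
import numpy as np

def ula_steering(theta, M, d_over_lambda=0.5):
    """Unit-norm steering vector of an M-element ULA for angle of arrival theta (rad).

    Element m contributes a phase shift 2*pi*(d/lambda)*m*sin(theta), matching the
    expression above; the 1/sqrt(M) factor normalizes ||a(theta)|| = 1.
    """
    m = np.arange(M)
    return np.exp(1j * 2 * np.pi * d_over_lambda * m * np.sin(theta)) / np.sqrt(M)

# Example: a broadside arrival (theta = 0) gives an all-equal-phase vector.
a = ula_steering(0.0, M=8)
```

The normalization makes the steering vector independent of $M$ in energy, which keeps later Jacobian comparisons across array sizes meaningful.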
The measured channel state information (CSI) is corrupted by additive complex Gaussian noise:
$$Y = h(X) + N, \qquad N \sim \mathcal{CN}(0, \sigma^2 I_M),$$
where $N$ is circularly symmetric complex Gaussian (CSCG) noise, modeling thermal noise and receiver imperfections [8]. Note that the observation model directly gives $Y - h(X) = N$; in the conditional density $p(Y\,|\,X)$ this term is the Gaussian residual induced by the additive noise. The CSCG assumption ensures that the real and imaginary components are independent and identically distributed with variance $\sigma^2/2$ each.
Given this model, the conditional probability density function (PDF) of the observed CSI $Y$, conditioned on the user position $X$, is a multivariate complex Gaussian:
$$p(Y\,|\,X) = \frac{1}{(\pi \sigma^2)^M} \exp\!\left( -\frac{1}{\sigma^2}\, \lVert Y - h(X) \rVert^2 \right),$$
where $\lVert \cdot \rVert$ denotes the Euclidean ($\ell_2$) norm. This Gaussian form arises because the additive noise is Gaussian and the channel $h(X)$ is treated as a deterministic function of $X$ for the purpose of conditioning. This allows us to leverage well-known results in estimation theory and information theory for Gaussian conditional models [6,9].
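For numerical work it is safer to evaluate this conditional density in log form. A minimal sketch (illustrative names; the dimensionality and noise level below are arbitrary):

```python
import numpy as np

def log_p_y_given_x(Y, hX, sigma2):
    """Log of the complex Gaussian conditional PDF above:
    log p(Y|X) = -M*log(pi*sigma^2) - ||Y - h(X)||^2 / sigma^2."""
    M = len(Y)
    r = Y - hX
    return -M * np.log(np.pi * sigma2) - np.vdot(r, r).real / sigma2

# Draw one noisy CSI observation Y = h(X) + N for a toy channel vector.
rng = np.random.default_rng(0)
M, sigma2 = 4, 0.1
hX = rng.standard_normal(M) + 1j * rng.standard_normal(M)
N = np.sqrt(sigma2 / 2) * (rng.standard_normal(M) + 1j * rng.standard_normal(M))
Y = hX + N
```

Consistent with the quadratic exponent, the log-density is maximized when the candidate mean equals the observation.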
The model described above forms the foundation for deriving bounds on the mutual information between the user position X and the observed CSI Y , as it explicitly captures the dependence of the channel on spatial position, the array geometry, and the impact of noise.

4. Mutual Information Structure

We analyze the mutual information between the user position $X \in \mathbb{R}^2$ and the observed channel state information (CSI) vector $Y \in \mathbb{C}^M$. By definition, the mutual information is given by
$$I(X;Y) = H(Y) - H(Y\,|\,X),$$
where $H(\cdot)$ denotes differential entropy [10].

Notation

To avoid ambiguity with the channel function $h(X)$, we denote differential entropy by $H(\cdot)$ throughout the manuscript.

Conditional Entropy Term

Conditioned on $X$, the observation model is
$$Y\,|\,X = h(X) + N,$$
with $N \sim \mathcal{CN}(0, \sigma^2 I_M)$. Since $h(X)$ is deterministic under conditioning, $Y\,|\,X$ remains circularly symmetric complex Gaussian with covariance $\sigma^2 I_M$. The differential entropy of a complex Gaussian random vector $Z \sim \mathcal{CN}(0, C)$ is [8]
$$H(Z) = \log \det(\pi e\, C).$$
Applying this result with $C = \sigma^2 I_M$ yields
$$H(Y\,|\,X) = \log \det(\pi e \sigma^2 I_M) = M \log(\pi e \sigma^2).$$
Importantly, this term does not depend on X, reflecting the fact that the uncertainty under conditioning is purely due to additive noise.

Marginal Entropy Term

The marginal distribution of $Y$ is obtained by averaging over the distribution of $X$:
$$p(Y) = \int p(Y\,|\,X)\, p(X)\, dX.$$
This is generally a Gaussian mixture, since $h(X)$ is nonlinear in $X$; therefore $Y$ is not Gaussian in general. Its differential entropy is
$$H(Y) = -\int p(Y) \log p(Y)\, dY.$$
No closed-form expression exists unless $h(X)$ is affine in $X$ and $X$ is Gaussian.

Equivalent Representation

An alternative expression for the mutual information, useful for bounding arguments, is
$$I(X;Y) = \mathbb{E}_X\!\left[ D_{\mathrm{KL}}\big( p(Y\,|\,X) \,\|\, p(Y) \big) \right],$$
where $D_{\mathrm{KL}}(\cdot \,\|\, \cdot)$ denotes the Kullback–Leibler divergence [10]. This formulation emphasizes that mutual information quantifies the average distinguishability between the conditional output distributions corresponding to different spatial positions.

Connection to Gaussian Bounds

Since, among all random vectors with a fixed covariance matrix $\Sigma_Y$, the Gaussian distribution maximizes differential entropy [10], we obtain the upper bound
$$H(Y) \le \log \det(\pi e\, \Sigma_Y),$$
where
$$\Sigma_Y = \mathbb{E}\!\left[ (Y - \mathbb{E}[Y])(Y - \mathbb{E}[Y])^H \right].$$
Consequently,
$$I(X;Y) \le \log \det(\pi e\, \Sigma_Y) - M \log(\pi e \sigma^2).$$
Bounding $I(X;Y)$ therefore reduces to characterizing or bounding the covariance structure induced by the spatial variability of $h(X)$. This highlights that the informative content of CSI about position arises entirely from the geometry-induced variability of the channel manifold embedded in $\mathbb{C}^M$.
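The covariance-based bound can be evaluated by Monte Carlo: sample positions, form the empirical covariance of $h(X)$, and compute $\log\det(I_M + \Sigma_h/\sigma^2)$. A small sketch under an illustrative scalar-position channel (our own toy model, not the paper's exact simulation):

```python
import numpy as np

def mi_gaussian_upper_bound(h_samples, sigma2):
    """Evaluate log det(pi*e*Sigma_Y) - M*log(pi*e*sigma^2) = log det(I + Sigma_h/sigma^2),
    with Sigma_h estimated from samples of h(X)."""
    Hc = h_samples - h_samples.mean(axis=0)
    Sigma_h = (Hc.conj().T @ Hc) / len(h_samples)
    M = h_samples.shape[1]
    _, logdet = np.linalg.slogdet(np.eye(M) + Sigma_h / sigma2)
    return float(logdet)

# Illustrative scalar-position channel: a phase ramp driven by sin(X).
rng = np.random.default_rng(1)
M = 8
X = rng.uniform(-1.0, 1.0, size=2000)
Hs = np.exp(1j * np.outer(np.sin(X), np.arange(M))) / np.sqrt(M)
bound = mi_gaussian_upper_bound(Hs, sigma2=0.01)
```

A constant channel yields $\Sigma_h = 0$ and hence a zero bound, while decreasing the noise power increases the bound, as the expression predicts.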

5. Covariance Structure, Fisher Information, and Relation to I–MMSE

Covariance Induced by Spatial Variability

Recall that
$$Y = h(X) + N, \qquad N \sim \mathcal{CN}(0, \sigma^2 I_M).$$
Let $\mu_Y = \mathbb{E}[Y]$. Since $N$ is zero-mean and independent of $X$,
$$\mu_Y = \mathbb{E}[h(X)].$$
The covariance of $Y$ is therefore
$$\Sigma_Y = \mathbb{E}\!\left[ (Y - \mu_Y)(Y - \mu_Y)^H \right] = \mathbb{E}\!\left[ \big(h(X) - \mathbb{E}[h(X)]\big)\big(h(X) - \mathbb{E}[h(X)]\big)^H \right] + \sigma^2 I_M.$$
Thus,
$$\Sigma_Y = \Sigma_h + \sigma^2 I_M, \qquad \Sigma_h = \mathrm{Cov}\big(h(X)\big).$$
The entire informative structure is therefore encoded in the covariance induced by the spatial variability of the channel manifold.
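The decomposition $\Sigma_Y = \Sigma_h + \sigma^2 I_M$ is easy to verify numerically. A minimal Monte Carlo sketch (toy scalar-position channel and illustrative parameters of our own choosing):

```python
import numpy as np

rng = np.random.default_rng(2)
M, sigma2, n = 3, 0.5, 200_000

# Illustrative position-dependent channel h(X) for a scalar position X.
X = rng.uniform(0.0, 1.0, n)
H = np.exp(1j * np.outer(X, np.arange(1, M + 1)))          # samples of h(X), shape (n, M)
N = np.sqrt(sigma2 / 2) * (rng.standard_normal((n, M)) + 1j * rng.standard_normal((n, M)))
Y = H + N

def cov(Z):
    """Sample covariance E[(Z - mu)(Z - mu)^H]."""
    Zc = Z - Z.mean(axis=0)
    return (Zc.conj().T @ Zc) / len(Z)

Sigma_Y = cov(Y)
Sigma_h = cov(H)
```

With enough samples, the empirical $\Sigma_Y$ matches $\Sigma_h + \sigma^2 I_M$ up to Monte Carlo error, since the noise is zero-mean and independent of the channel.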

6. Local Linearization and Explicit Jacobian Form

Assume that $X$ is concentrated around $X_0$ with covariance $\Sigma_X$. Using a first-order Taylor expansion,
$$h(X) \approx h(X_0) + J(X_0)(X - X_0),$$
where
$$J(X_0) = \left. \frac{\partial h(X)}{\partial X} \right|_{X = X_0} \in \mathbb{C}^{M \times 2}.$$
Under this approximation,
$$\Sigma_h \approx J(X_0)\, \Sigma_X\, J(X_0)^H.$$
Hence,
$$\Sigma_Y \approx J(X_0)\, \Sigma_X\, J(X_0)^H + \sigma^2 I_M.$$
Substituting into the Gaussian entropy upper bound [10],
$$I(X;Y) \le \log \det\!\left( I_M + \frac{1}{\sigma^2}\, J(X_0)\, \Sigma_X\, J(X_0)^H \right).$$
This expression provides a clear interpretation:
  • The Jacobian $J(X_0)$ directly reflects the local geometry of the channel manifold embedded in $\mathbb{C}^M$. Positions where the manifold is more sensitive (larger singular values of $J(X_0)$) allow higher distinguishability between neighboring channel realizations.
  • The bound explicitly links the mutual information to the array geometry and the spatial resolution capabilities of the MIMO system.
  • Unlike the classical I–MMSE relation [2], this bound is not restricted to linear fixed channels: it captures local nonlinearities through the Jacobian, providing a first-order approximation of the information content in a general propagation environment.
In combination with Figure 1 and Figure 2, this formulation illustrates both conceptually and numerically how the structure of the channel manifold controls the achievable information for spatial localization and predictive beamforming.
This shows explicitly that the mutual information scales with the singular values of the channel Jacobian, revealing a direct geometric dependence on the array manifold curvature.
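A numerical sketch of the local bound: differentiate a LoS ULA channel with respect to a 2D position by central differences and evaluate $\log\det(I_M + J\Sigma_X J^H/\sigma^2)$. The channel map, geometry, and parameters below are illustrative assumptions, not the paper's exact setup:

```python
import numpy as np

def h_los(pos, M=16, d_over_lambda=0.5):
    """LoS ULA channel at 2D position pos = (x, y): the steering vector for the
    angle of arrival theta = atan2(y, x) seen from an array at the origin
    (an illustrative model)."""
    theta = np.arctan2(pos[1], pos[0])
    m = np.arange(M)
    return np.exp(1j * 2 * np.pi * d_over_lambda * m * np.sin(theta)) / np.sqrt(M)

def jacobian(f, x0, eps=1e-6):
    """Central-difference Jacobian J in C^{M x 2}."""
    x0 = np.asarray(x0, dtype=float)
    cols = []
    for i in range(len(x0)):
        e = np.zeros_like(x0)
        e[i] = eps
        cols.append((f(x0 + e) - f(x0 - e)) / (2 * eps))
    return np.stack(cols, axis=1)

def local_bound(x0, Sigma_X, sigma2, f=h_los):
    """log det(I_M + J Sigma_X J^H / sigma^2) evaluated at position x0."""
    J = jacobian(f, x0)
    M = J.shape[0]
    _, logdet = np.linalg.slogdet(np.eye(M) + (J @ Sigma_X @ J.conj().T) / sigma2)
    return float(logdet)

b = local_bound([5.0, 2.0], Sigma_X=0.01 * np.eye(2), sigma2=1e-3)
```

As expected, the bound vanishes when the spatial uncertainty $\Sigma_X$ is zero and grows monotonically with it.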

Connection to Fisher Information

For the Gaussian model
$$p(Y\,|\,X) = \mathcal{CN}\big(h(X),\, \sigma^2 I_M\big),$$
the Fisher information matrix (FIM) with respect to the real-valued parameter $X$ is [6]
$$J_F(X) = \frac{2}{\sigma^2}\, \mathrm{Re}\big\{ J(X)^H J(X) \big\}.$$
Under small perturbations, the mutual information admits the local quadratic approximation
$$I(X;Y) \approx \frac{1}{2}\, \mathrm{Tr}\big( J_F(X_0)\, \Sigma_X \big),$$
which coincides with classical small-signal information expansions [10].
This establishes a precise bridge between:
  • the global information-theoretic quantity $I(X;Y)$,
  • the local estimation-theoretic object $J_F(X)$,
  • and the differential geometry of the channel manifold through $J(X)$.
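This bridge can be checked numerically: for small $\Sigma_X$, the quadratic form $\tfrac{1}{2}\mathrm{Tr}(J_F \Sigma_X)$ should match the log-det bound to first order. A sketch reusing an illustrative LoS ULA map of a 2D position (toy model of our own; the real part in the FIM applies because the position parameter is real-valued):

```python
import numpy as np

def h_los(pos, M=16, dl=0.5):
    """Illustrative LoS ULA map of a 2D position."""
    theta = np.arctan2(pos[1], pos[0])
    m = np.arange(M)
    return np.exp(1j * 2 * np.pi * dl * m * np.sin(theta)) / np.sqrt(M)

def jacobian(f, x0, eps=1e-6):
    """Central-difference Jacobian in C^{M x 2}."""
    x0 = np.asarray(x0, dtype=float)
    return np.stack([(f(x0 + eps * e) - f(x0 - eps * e)) / (2 * eps)
                     for e in np.eye(len(x0))], axis=1)

sigma2 = 1e-3
J = jacobian(h_los, [5.0, 2.0])
J_F = (2.0 / sigma2) * (J.conj().T @ J).real       # FIM for a complex Gaussian mean, real X

Sigma_X = 1e-6 * np.eye(2)                         # small spatial uncertainty
quad = 0.5 * np.trace(J_F @ Sigma_X)               # (1/2) Tr(J_F Sigma_X)
_, logdet = np.linalg.slogdet(np.eye(J.shape[0]) + (J @ Sigma_X @ J.conj().T) / sigma2)
logdet = float(logdet)
```

Since $\log\det(I + A) = \mathrm{Tr}(A) + O(\lVert A\rVert^2)$, the two quantities agree closely in the small-uncertainty regime.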

Relation to I–MMSE and Novelty

The I–MMSE relation [2] states that, for linear Gaussian channels,
$$\frac{d}{d\,\mathrm{SNR}}\, I(X;Y) = \frac{1}{2}\, \mathrm{mmse}(\mathrm{SNR}).$$
However, that result assumes a linear channel of the form
$$Y = \sqrt{\mathrm{SNR}}\, A X + N,$$
with a fixed matrix $A$. In contrast, in the present setting,
$$Y = h(X) + N,$$
the channel mapping itself is nonlinear and geometry-dependent. The informative structure arises from the curvature of the embedded manifold
$$\mathcal{M} = \big\{ h(X) \in \mathbb{C}^M : X \in \mathbb{R}^2 \big\}.$$
The bound derived above depends explicitly on the Jacobian spectrum
$$\mathrm{spec}\big( J(X)^H J(X) \big),$$
which encodes array geometry and propagation physics.
Therefore:
  • The result is not a direct corollary of the classical I–MMSE identity.
  • It generalizes the linear Gaussian case to nonlinear channel manifolds.
  • It reveals that the mutual information is controlled by the intrinsic Riemannian metric induced by $J(X)$.
This geometric characterization of CSI-based spatial information does not appear explicitly in standard treatments of Gaussian channel information theory [2,8], and constitutes a structural extension beyond classical linear models.
The observed low-dimensional embedding (Figure 1) confirms that spatial variability induces a structured manifold rather than isotropic Gaussian variability.
Figure 2 illustrates the distribution of the local mutual information bounds obtained via the Jacobian-based approximation. Each point represents a small neighborhood in the 2D spatial grid of user positions, and the corresponding bound quantifies how distinguishable nearby channel realizations are in the presence of Gaussian noise. The variability in the histogram reflects differences in the local geometry of the channel manifold, showing that positions where the manifold is more curved or sensitive yield higher mutual information. This numerical validation confirms the theoretical predictions derived above, and provides an intuitive visualization of how array geometry and propagation physics jointly shape the information content of CSI for spatial localization and predictive beamforming.

6.1. Relation to Fisher Information Bounds

Classical localization theory relies on the Fisher Information Matrix (FIM), which characterizes local sensitivity of the likelihood function around a fixed parameter value. Such bounds are inherently local and depend on second-order derivatives of the log-likelihood.
In contrast, the proposed Jacobian-based bound captures the global nonlinear variability of the channel mapping $h(X)$ across the entire spatial support of $X$. Rather than linearizing the likelihood, it integrates the channel differential geometry along spatial trajectories.
Therefore, while Fisher bounds quantify local estimation accuracy, the present bound quantifies global information embedding of spatial uncertainty within the CSI manifold.

6.2. Model Assumptions

The analysis relies on the following assumptions:
  • The channel mapping $h(X)$ is continuously differentiable over the spatial domain of interest.
  • The additive noise $N$ is circularly symmetric complex Gaussian with covariance $\sigma^2 I$.
  • The spatial variable $X$ is supported over a bounded region; the asymptotic scaling analysis of Section 8.4 further specializes to a scalar $X$.
  • Line-of-sight (LoS) propagation dominates the geometric structure of the channel manifold.
These assumptions are standard in geometric channel modeling and enable analytical tractability of the information-theoretic bounds.

7. Global Nonlinear Bound

To extend the local Jacobian bound to a global characterization of the mutual information, we leverage the relationship between mutual information and Kullback–Leibler (KL) divergence [2,4,6,10]. Recall that, for any reference distribution $p(Y\,|\,\bar{X})$, the mutual information satisfies
$$I(X;Y) = \mathbb{E}_X\!\left[ \mathrm{KL}\big( p(Y\,|\,X) \,\|\, p(Y\,|\,\bar{X}) \big) \right] - \mathrm{KL}\big( p(Y) \,\|\, p(Y\,|\,\bar{X}) \big) \le \mathbb{E}_X\!\left[ \mathrm{KL}\big( p(Y\,|\,X) \,\|\, p(Y\,|\,\bar{X}) \big) \right],$$
where the inequality follows from the non-negativity of the KL divergence [10].
For the Gaussian conditional distributions of our channel model,
$$p(Y\,|\,X) = \frac{1}{(\pi \sigma^2)^M} \exp\!\left( -\frac{1}{\sigma^2}\, \lVert Y - h(X) \rVert^2 \right),$$
the KL divergence between two such distributions with identical covariance $\sigma^2 I_M$ reduces to a simple squared distance between the means [4,6]:
$$\mathrm{KL}\big( p(Y\,|\,X) \,\|\, p(Y\,|\,\bar{X}) \big) = \frac{1}{\sigma^2}\, \big\lVert h(X) - h(\bar{X}) \big\rVert^2.$$
The remaining challenge is to relate the difference $h(X) - h(\bar{X})$ to the Jacobian along a trajectory connecting $\bar{X}$ and $X$ in the spatial domain. Using the path-integral representation [11],
$$h(X) - h(\bar{X}) = \int_{\bar{X}}^{X} J(s)\, ds,$$
where $J(s) = \partial h(s)/\partial s$ is the local Jacobian evaluated along the trajectory $s(t)$ connecting $\bar{X}$ to $X$.
Substituting into the KL expression, we obtain a global nonlinear bound on the mutual information:
$$I(X;Y) \le \frac{1}{\sigma^2}\, \mathbb{E}_X\!\left[ \left\lVert \int_{\bar{X}}^{X} J(s)\, ds \right\rVert^2 \right].$$
Several important remarks follow:
  • This bound generalizes the local linear bound derived in Section 6 by accumulating channel variability along potentially nonlinear trajectories in the spatial domain.
  • The integral of the Jacobian captures the total channel displacement between two positions, highlighting the role of manifold geometry in controlling the mutual information.
  • Unlike the classical I–MMSE relation [2], which is inherently local, this bound is applicable to nonlinear MIMO channel functions and quantifies information over extended spatial domains.
  • Numerically, this integral can be approximated along discrete grids or Monte Carlo samples of spatial trajectories, allowing validation against the local bounds and the PCA-based manifold visualizations shown in Figure 1 and Figure 2.
In summary, the global bound provides a rigorous upper limit on the information content of CSI across the entire environment, linking array geometry, channel nonlinearity, and spatial uncertainty in a single framework [2,4,6,8,10].
This result has several practical implications in modern wireless systems:
  • Predictive Beamforming: By quantifying the maximum information that CSI can provide about user positions, the bound can guide the design of predictive beam-steering algorithms in mmWave or massive MIMO systems, ensuring robust tracking even under high mobility [12,13].
  • Semantic Localization: In environments where precise GPS is unavailable or unreliable, the bound provides a theoretical limit on the accuracy achievable through CSI-based localization methods, informing system designers of the expected performance under various array configurations [1,5].
  • Adaptive Link Adaptation and Handover: The global bound can help determine regions where channel variability limits the usefulness of CSI, allowing adaptive modulation, coding, and handover strategies to optimize throughput and reliability [3,4].
  • Channel Charting and Trajectory Forecasting: For self-supervised learning approaches in channel charting, the bound provides a principled metric to evaluate the separability of user positions and the predictive power of trajectory forecasting models based on CSI [15,16].
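The path-integral representation underlying the global bound can be validated numerically: integrating the Jacobian along a trajectory reproduces $h(X) - h(\bar{X})$, and the resulting squared norm divided by $\sigma^2$ equals the per-pair KL term. A sketch for a scalar-position LoS model (an illustrative toy with $\theta(x) = x$; all parameters are our own assumptions):

```python
import numpy as np

def h_los(x, M=8, dl=0.5):
    """Scalar-position LoS ULA channel with theta(x) = x (toy assumption)."""
    m = np.arange(M)
    return np.exp(1j * 2 * np.pi * dl * m * np.sin(x)) / np.sqrt(M)

def dh_dx(x, M=8, dl=0.5):
    """Analytic Jacobian J(x) = dh/dx of the toy channel above."""
    m = np.arange(M)
    return h_los(x, M, dl) * 1j * 2 * np.pi * dl * m * np.cos(x)

x_bar, x = 0.1, 0.6
s = np.linspace(x_bar, x, 2001)
Js = np.array([dh_dx(si) for si in s])

# Trapezoidal path integral of the Jacobian: int_{x_bar}^{x} J(s) ds.
delta_h = np.sum(0.5 * (Js[1:] + Js[:-1]) * np.diff(s)[:, None], axis=0)

sigma2 = 0.05
kl_term = np.linalg.norm(h_los(x) - h_los(x_bar)) ** 2 / sigma2   # KL(p(Y|x) || p(Y|x_bar))
bound_term = np.linalg.norm(delta_h) ** 2 / sigma2
```

Up to discretization error of the trapezoidal rule, the integrated Jacobian matches the channel displacement, so the KL term and the Jacobian-based term coincide.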

7.1. Geometric Information Scaling in Multi-Layer Hybrid 6G Networks

Consider a hybrid 6G architecture in which the user is simultaneously observed by $S$ independent layers (e.g., terrestrial base stations, LEO satellites, HAPS platforms). The composite channel is modeled as
$$h(X) = \sum_{i=1}^{S} h_i(X),$$
where each $h_i(X)$ corresponds to the channel contribution of layer $i$. The total Jacobian becomes
$$J(X) = \sum_{i=1}^{S} J_i(X), \qquad J_i(X) = \nabla_X h_i(X).$$
The squared Jacobian norm expands as
$$\lVert J(X) \rVert^2 = \sum_{i=1}^{S} \lVert J_i(X) \rVert^2 + \sum_{i \neq j} \big\langle J_i(X),\, J_j(X) \big\rangle.$$
If the different layers provide sufficiently distinct angular perspectives, the cross-terms satisfy
$$\mathbb{E}\big[ \langle J_i(X),\, J_j(X) \rangle \big] \approx 0, \qquad i \neq j,$$
due to low spatial correlation between steering directions.
Under this angular-diversity assumption, the Jacobian norm scales approximately additively:
$$\lVert J(X) \rVert^2 \approx \sum_{i=1}^{S} \lVert J_i(X) \rVert^2.$$
If each layer contributes comparable geometric sensitivity, i.e., $\lVert J_i(X) \rVert^2 \approx C$, then
$$\lVert J(X) \rVert^2 \approx S\, C,$$
implying the linear scaling $\lVert J(X) \rVert^2 = O(S)$. Since the global mutual information bound is proportional to $\lVert J(X) \rVert^2 / \sigma^2$, we obtain
$$I(X;Y) = O(S).$$
This result reveals a fundamental principle for hybrid 6G architectures: the intrinsic geometric information content of CSI scales with the number of independent spatial observation layers.
Unlike classical capacity arguments, which justify multi-layer networks in terms of throughput or coverage extension, the present bound demonstrates that hybrid terrestrial–non-terrestrial architectures inherently expand the geometric dimension of the CSI manifold.
This provides an information-theoretic foundation for improved localization, predictive beamforming stability, and channel charting robustness in integrated 6G systems.
Figure 3 numerically validates the predicted linear scaling with the number of spatial observation layers.
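The $O(S)$ scaling can also be illustrated with synthetic per-layer Jacobians: modeling each layer as an independent draw with unit expected energy (a stand-in for distinct angular perspectives; all parameters here are our own assumptions, not the paper's simulation), the averaged squared norm of the summed Jacobian grows linearly in $S$:

```python
import numpy as np

rng = np.random.default_rng(3)
M, trials = 64, 500

def avg_sq_norm(S):
    """Average ||sum_i J_i||^2 over random draws. Each layer Jacobian is an i.i.d.
    complex vector with E||J_i||^2 = 1, so cross-terms vanish in expectation."""
    total = 0.0
    for _ in range(trials):
        Js = (rng.standard_normal((S, M)) + 1j * rng.standard_normal((S, M))) / np.sqrt(2 * M)
        total += np.linalg.norm(Js.sum(axis=0)) ** 2
    return total / trials

n1, n4, n8 = avg_sq_norm(1), avg_sq_norm(4), avg_sq_norm(8)
```

With uncorrelated layers the averages come out near $1$, $4$, and $8$, matching the additive-scaling argument; correlated (aligned) layers would inflate or deflate these values through the cross-terms.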

Tightness

The bound becomes asymptotically tight in the high-SNR regime and when the channel mapping h ( X ) is locally linear over the support of X. In such cases, the integral representation reduces to a first-order approximation and the KL divergence matches the quadratic form induced by the Jacobian norm.

8. Results

This work establishes a unified geometric–information-theoretic framework to quantify the information content of channel state information (CSI) with respect to user spatial position.
We first derived a local Jacobian-based bound by linearizing the nonlinear MIMO channel mapping $h(X)$ around a nominal position $X_0$, yielding
$$I(X;Y) \le \log \det\!\left( I_M + \frac{1}{\sigma^2}\, J(X_0)\, \Sigma_X\, J(X_0)^H \right),$$
which explicitly links mutual information to the singular values of the Jacobian, i.e., to the local curvature of the channel manifold. Numerical simulations with a ULA model confirm that regions with larger Jacobian norms yield higher mutual information, validating the theoretical derivation.
We then extended this approach to a global nonlinear bound using the KL-divergence representation of mutual information. By expressing channel variations as a path integral of the Jacobian,
$$h(X) - h(\bar{X}) = \int_{\bar{X}}^{X} J(s)\, ds,$$
the bound accumulates spatial channel variability along arbitrary trajectories, generalizing the local linear approximation to nonlinear propagation environments.
These results show that the information content of CSI is governed by the interplay of three factors: the array geometry, the nonlinearity of the channel manifold, and the spatial uncertainty of the user position.
The proposed bounds provide quantitative insights for several applications:
  • Predictive beamforming and mobility tracking, leveraging the Jacobian to assess spatial resolvability;
  • CSI-based localization, characterizing the maximal discrimination between nearby positions;
  • Channel charting and representation learning, where manifold separability governs embedding quality;
  • Adaptive link optimization, identifying regions of low information to guide modulation and handover strategies.
Overall, the framework offers a geometric perspective on CSI, bridging information theory and spatial channel modeling, and provides a principled way to assess the limits of position-dependent information in MIMO systems.

8.1. Implications for Networked Wireless Systems

The derived global Jacobian bound has direct implications for large-scale wireless networks, particularly in scenarios involving massive MIMO, cell-free architectures, and integrated sensing and communication systems. Specifically:
(i) It provides a theoretical upper bound on the CSI entropy that must be conveyed through feedback links.
(ii) It characterizes how array geometry impacts the dimensionality of channel state representations in distributed networks.
(iii) It establishes a fundamental scalability constraint: as spatial uncertainty increases, CSI dimensionality grows according to the manifold expansion rate.
These insights are directly relevant to CSI compression, pilot design, and cooperative multi-node estimation in 6G networks.

8.2. Relation to Fisher Information Bounds

The Fisher information matrix (FIM) associated with the model
$$Y = h(X) + N, \qquad N \sim \mathcal{CN}(0, \sigma^2 I),$$
is given by [6]
$$J_F(X) = \frac{2}{\sigma^2}\, \mathrm{Re}\big\{ \nabla_X h(X)^H\, \nabla_X h(X) \big\}.$$
The FIM characterizes local estimation sensitivity and yields the Cramér–Rao bound:
$$\mathrm{Cov}(\hat{X}) \succeq J_F(X)^{-1}.$$
However, Fisher information is inherently local, as it relies on infinitesimal perturbations around a fixed $X$. In contrast, the proposed global Jacobian bound evaluates
$$I(X;Y) \le \frac{1}{\sigma^2}\, \mathbb{E}_X\!\left[ \left\lVert \int_{\bar{X}}^{X} \nabla_s h(s)\, ds \right\rVert^2 \right],$$
which accumulates channel sensitivity along spatial trajectories.
Therefore:
  • Fisher Information quantifies local curvature of the likelihood.
  • The proposed bound quantifies global manifold expansion.
  • FIM is estimation-oriented.
  • The proposed bound is information-theoretic and prior-dependent.
To the best of our knowledge, this constitutes the first global geometric information bound on CSI manifolds in multi-antenna wireless systems.

8.3. Difference from Classical Capacity Bounds

Classical MIMO capacity results [8] consider
$$C = \log \det\!\left( I_M + \frac{\rho}{M}\, H H^H \right),$$
which quantifies the maximum reliable communication rate between transmitted symbols and received signals, where $(\cdot)^H$ denotes the Hermitian (conjugate-transpose) operator.
In contrast, the present work does not analyze symbol-level communication. Instead, it studies the mutual information $I(X;Y)$, where $X$ represents the spatial position and $Y$ corresponds to the observed CSI.
Hence:
  • Capacity bounds measure communication throughput.
  • The proposed bound measures geometric information content of CSI.
  • Capacity depends on input covariance design.
  • The proposed bound depends on spatial manifold curvature.

8.4. Asymptotic Scaling with the Number of Antennas

We analyze the asymptotic behavior of the Jacobian-based information bound as the number of antennas M increases. Consider a line-of-sight (LoS) ULA model:
Figure 4. Network scenario considered in this work. A base station equipped with an $M$-element ULA observes the CSI vector $h(X)$ generated by a user located at spatial coordinate $X$. The nonlinear channel manifold induced by spatial variations is embedded in $\mathbb{C}^M$. The proposed Jacobian-based global bound quantifies the maximum mutual information between spatial uncertainty and observed CSI across the entire coverage region.
$$h_m(X) = \exp\!\left( j\, 2\pi \frac{d}{\lambda}\, m \sin\theta(X) \right), \qquad m = 0, \ldots, M-1,$$
where $d$ is the inter-element spacing and $\lambda$ is the wavelength.
Define the phase term
$$\phi_m(X) = 2\pi \frac{d}{\lambda}\, m \sin\theta(X),$$
so that $h_m(X) = e^{j \phi_m(X)}$. Taking the derivative with respect to $X$ and applying the chain rule,
$$\frac{\partial h_m}{\partial X} = e^{j \phi_m(X)}\, j\, \frac{\partial \phi_m(X)}{\partial X}, \qquad \frac{\partial \phi_m(X)}{\partial X} = 2\pi \frac{d}{\lambda}\, m \cos\theta(X)\, \frac{d\theta}{dX}.$$
Hence,
$$\frac{\partial h_m}{\partial X} = h_m(X)\, j\, 2\pi \frac{d}{\lambda}\, m \cos\theta(X)\, \frac{d\theta}{dX}.$$
Importantly, the exponential factor $h_m(X)$ remains multiplicative in the derivative; when computing the Jacobian norm, however, its magnitude satisfies $|h_m(X)|^2 = 1$, since it is a unit-modulus complex exponential. Therefore, the squared Jacobian norm becomes
$$\lVert J(X) \rVert^2 = \sum_{m=0}^{M-1} \left( 2\pi \frac{d}{\lambda}\, m \cos\theta(X)\, \frac{d\theta}{dX} \right)^{2}.$$
This yields
$$\lVert J(X) \rVert^2 \propto \sum_{m=0}^{M-1} m^2 = \frac{(M-1)M(2M-1)}{6} = O(M^3).$$

Case 1: Fixed Inter-Element Spacing (d Constant)

If $d$ is fixed (e.g., $d = \lambda/2$), the physical aperture grows linearly with $M$. In this case, the Jacobian norm scales as
$$\lVert J(X) \rVert^2 = O(M^3),$$
implying a superlinear growth of local sensitivity due to increasing aperture.

Case 2: Fixed Physical Aperture

If instead the total array length $L = M d$ is kept constant, then $d = L/M$. Substituting into the Jacobian,
$$\lVert J(X) \rVert^2 \propto \left( \frac{L}{M} \right)^{2} \sum_{m=0}^{M-1} m^2 = \left( \frac{L}{M} \right)^{2} O(M^3) = O(M).$$
Thus, under fixed aperture scaling, the Jacobian norm grows linearly with M.

Implications for the Information Bound

Since the global mutual information bound is proportional to $\lVert J(X) \rVert^2 / \sigma^2$, we obtain
$$I(X;Y) = \begin{cases} O(M^3), & \text{fixed spacing}, \\ O(M), & \text{fixed aperture}. \end{cases}$$
The linear scaling under fixed aperture is physically more meaningful in large-scale network deployments, where the array size is constrained. This result shows that the information content of CSI grows proportionally to the spatial degrees of freedom introduced by additional antennas, rather than superlinearly.
Figure 5 validates the predicted asymptotic scaling of the Jacobian norm. The cubic growth under fixed inter-element spacing and linear growth under fixed aperture confirm the theoretical analysis. The log–log representation reveals linear trends, confirming polynomial scaling. Numerical slope estimation yields approximately 3 under fixed inter-element spacing and 1 under fixed aperture, in full agreement with the theoretical asymptotic analysis.
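The two scaling regimes can be reproduced directly from the closed-form Jacobian norm; fitting log–log slopes recovers exponents near 3 and 1. The constants below (such as $\lambda = 1$ and a unit $\cos\theta\, d\theta/dX$ factor) are illustrative simplifications of our own:

```python
import numpy as np

def jac_sq_norm(M, d):
    """||J||^2 = sum_m (2*pi*(d/lambda)*m*cos(theta)*dtheta/dX)^2 for the LoS ULA
    model, with lambda = 1 and cos(theta)*dtheta/dX = 1 as illustrative constants."""
    m = np.arange(M)
    return float(np.sum((2 * np.pi * d * m) ** 2))

Ms = np.array([16, 32, 64, 128, 256])
fixed_spacing = np.array([jac_sq_norm(M, 0.5) for M in Ms])        # d = lambda / 2
fixed_aperture = np.array([jac_sq_norm(M, 10.0 / M) for M in Ms])  # aperture L = M d = 10

# Log-log slopes estimate the polynomial order of growth.
slope_spacing = np.polyfit(np.log(Ms), np.log(fixed_spacing), 1)[0]
slope_aperture = np.polyfit(np.log(Ms), np.log(fixed_aperture), 1)[0]
```

The fitted slopes land close to 3 (fixed spacing) and 1 (fixed aperture), mirroring the slope estimates reported for Figure 5.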

9. Discussion

The results presented in this work provide a geometric reinterpretation of the information content of CSI, complementing classical statistical analyses of wireless channels [8,10]. While traditional information-theoretic treatments focus on input–output relationships under fixed channel models, our formulation emphasizes the nonlinear mapping $X \mapsto h(X)$ as the fundamental object governing spatial information transfer.
The local Jacobian bound reveals that mutual information is directly controlled by the singular values of $J(X)$, which encode the local stretching of the spatial manifold into the high-dimensional channel space. This establishes a direct connection between spatial resolvability and manifold curvature, providing a geometric interpretation of Fisher information in the context of MIMO propagation [6]. In particular, regions where the manifold is locally flat yield limited discriminability, even under high SNR, whereas highly curved regions enhance distinguishability between nearby spatial points.
The global nonlinear bound extends this interpretation beyond infinitesimal perturbations. By integrating the Jacobian along spatial trajectories, the bound captures accumulated channel displacement over extended regions. This demonstrates that the informative content of CSI is not merely a function of instantaneous sensitivity, but of the cumulative geometric evolution of the channel response across space.
Importantly, the framework is not reducible to the classical I–MMSE relation [2]. While I–MMSE provides a differential relationship between mutual information and estimation error under additive Gaussian channels, it does not explicitly account for nonlinear spatial embeddings induced by array geometry and propagation physics. Here, the geometry of the channel manifold itself becomes the central quantity.
From a system design perspective, this geometric viewpoint suggests that array topology, aperture size, and propagation richness should be evaluated not only in terms of capacity, but also in terms of induced manifold curvature and spatial separability. This opens a new interpretation of CSI as a structured geometric signal rather than merely a random vector realization. The linear scaling with S holds under sufficient angular diversity and low inter-layer correlation. Highly aligned observation geometries may reduce the effective gain due to constructive or destructive Jacobian interactions.

9.1. Implications for NTN and Integrated Sensing and Communication (ISAC)

The demonstrated linear scaling of geometric information with the number of independent spatial observation layers has direct implications for Non-Terrestrial Networks (NTN) and Integrated Sensing and Communication (ISAC) paradigms in 6G systems.
In hybrid terrestrial–non-terrestrial architectures, users may be simultaneously observed by:
(i) Ground-based massive MIMO base stations,
(ii) Low Earth Orbit (LEO) satellite platforms,
(iii) High-Altitude Platform Stations (HAPS),
(iv) Reconfigurable intelligent surfaces (RIS).
Each layer provides an independent angular perspective of the user location, effectively increasing the dimensionality of the CSI manifold.
Unlike classical throughput-based justifications for NTN integration, the present analysis reveals a fundamentally different benefit: multi-layer architectures expand the intrinsic geometric information dimension of the channel representation.
This has several important consequences:
  • Improved localization accuracy due to increased Fisher information accumulation across layers.
  • Enhanced robustness of channel charting and manifold learning techniques under heterogeneous observation geometries.
  • Increased stability of predictive beamforming, as multi-layer diversity reduces local manifold degeneracies.
  • Natural support for joint sensing and communication, where spatial inference and data transmission share the same CSI structure.
From an information-geometric standpoint, hybrid 6G networks do not merely extend coverage or capacity; they reshape the channel manifold itself, increasing its curvature diversity and global observability.
Importantly, the benefit of hybrid terrestrial–non-terrestrial architectures is not limited to coverage extension. From an information-geometric standpoint, each additional observation layer increases the intrinsic dimensional richness of the CSI manifold, thereby enhancing spatial observability and robustness of learning-based channel inference methods.

10. Conclusions

This paper introduced a unified geometric–information-theoretic framework to quantify the information content of channel state information with respect to user spatial position.
By deriving a local Jacobian-based bound and extending it to a global nonlinear formulation, we showed that mutual information between position and received signal is fundamentally governed by the geometry of the channel manifold embedded in ℂ^M. The proposed bounds explicitly connect array geometry, propagation nonlinearity, and spatial uncertainty within a single analytical structure.
Numerical simulations based on a ULA model confirmed the theoretical predictions, illustrating how regions of high manifold curvature correspond to increased information content. The results provide theoretical support for applications such as predictive beamforming, CSI-based localization, and channel charting.
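The Jacobian-based local bound for a ULA can be sketched as follows. This is a minimal illustration under assumed parameters (M = 8 half-wavelength elements, a far-field single-path LoS steering vector, and hypothetical noise and prior covariances σ² and Σ_X), not the authors' released ULA.py: it computes the numerical Jacobian of the channel with respect to position and evaluates log det(I_M + J Σ_X J^H / σ²).

```python
import numpy as np

M, wavelength = 8, 0.1
d = wavelength / 2
elem = d * (np.arange(M) - (M - 1) / 2)   # element positions along the array axis

def h(x):
    """Single-path LoS channel: unit-amplitude far-field steering vector."""
    theta = np.arctan2(x[1], x[0])         # angle of arrival from position x
    return np.exp(1j * 2 * np.pi / wavelength * elem * np.sin(theta))

def jacobian(x, eps=1e-6):
    """Numerical M x 2 Jacobian dh/dx via central differences."""
    J = np.zeros((M, 2), dtype=complex)
    for k in range(2):
        e = np.zeros(2)
        e[k] = eps
        J[:, k] = (h(x + e) - h(x - e)) / (2 * eps)
    return J

def local_bound(x, sigma2=1e-2, Sigma_x=1e-4 * np.eye(2)):
    """Local bound log det(I_M + J Sigma_X J^H / sigma^2), in nats."""
    J = jacobian(x)
    G = np.eye(M) + (J @ Sigma_x @ J.conj().T) / sigma2
    return float(np.log(np.linalg.det(G).real))

print(local_bound(np.array([5.0, 1.0])))   # near broadside: high angular sensitivity
print(local_bound(np.array([0.0, 5.0])))   # endfire: degenerate, bound near zero
```

The near-broadside position yields a strictly larger bound than the endfire one, reproducing the qualitative curvature-information correspondence reported in the simulations: where the manifold degenerates (endfire), the Jacobian collapses and the local information bound vanishes.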
Future research directions include extending the framework to multipath environments with stochastic scattering, incorporating temporal dynamics, and developing learning-based estimators constrained by the derived geometric bounds. More broadly, the proposed approach suggests a shift from purely stochastic channel modeling toward geometry-aware information analysis in next-generation wireless networks.

Author Contributions

Conceptualization, A.P. and M.T.; methodology, A.P.; software, A.P.; validation, M.T.; formal analysis, A.P.; writing—original draft preparation, A.P.; writing—review and editing, M.T.

Data Availability Statement

The source code used to generate the numerical simulations and figures presented in this study is publicly available at: https://github.com/apirodd/CSI-Geometric-Bounds. The repository includes the scripts ULA.py and Manifold.py, which reproduce the ULA channel simulation, manifold visualization, and Jacobian-based validation results. No external datasets were used.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
CSI: Channel State Information
MIMO: Multiple-Input Multiple-Output
ULA: Uniform Linear Array
SNR: Signal-to-Noise Ratio
KL: Kullback–Leibler (divergence)
MMSE: Minimum Mean Square Error
PDF: Probability Density Function
PCA: Principal Component Analysis
CSCG: Circularly Symmetric Complex Gaussian

References

  1. Patwari, N.; Ash, J.; Kyperountas, S.; Hero, A.O.; Moses, R.L.; Correal, N.S. Locating the nodes: Cooperative localization in wireless sensor networks. IEEE Signal Process. Mag. 2005, 22, 54–69. [Google Scholar] [CrossRef]
  2. Guo, D.; Shamai, S.; Verdú, S. Mutual information and minimum mean-square error in Gaussian channels. IEEE Trans. Inf. Theory 2005, 51, 1261–1282. [Google Scholar] [CrossRef]
  3. Shamai, S.; Verdú, S. Capacity of channels with uncoded side information. Eur. Trans. Telecommun. 1995, 6, 587–600. [Google Scholar] [CrossRef]
  4. Palomar, D.P.; Verdú, S. Gradient of mutual information in linear vector Gaussian channels. IEEE Trans. Inf. Theory 2006, 52, 141–154. [Google Scholar] [CrossRef]
  5. Gezici, S.; Tian, Z.; Giannakis, G.B.; Kobayashi, H.; Molisch, A.F.; Poor, H.V.; Sahinoglu, Z. Localization via ultra-wideband radios: A look at positioning aspects for future sensor networks. IEEE Signal Process. Mag. 2005, 22, 70–84. [Google Scholar] [CrossRef]
  6. Kay, S.M. Fundamentals of Statistical Signal Processing: Estimation Theory; Prentice Hall: Upper Saddle River, NJ, USA, 1993. [Google Scholar]
  7. Balanis, C.A. Antenna Theory: Analysis and Design, 4th ed.; Wiley: Hoboken, NJ, USA, 2016. [Google Scholar]
  8. Tse, D.; Viswanath, P. Fundamentals of Wireless Communication; Cambridge University Press: Cambridge, UK, 2005. [Google Scholar]
  9. Telatar, I.E. Capacity of multi-antenna Gaussian channels. Eur. Trans. Telecommun. 1999, 10, 585–595. [Google Scholar] [CrossRef]
  10. Cover, T.M.; Thomas, J.A. Elements of Information Theory, 2nd ed.; Wiley: Hoboken, NJ, USA, 2006. [Google Scholar]
  11. Lee, J.M. Introduction to Smooth Manifolds, 2nd ed.; Springer: New York, NY, USA, 2012. [Google Scholar]
  12. Heath, R.W.; González-Prelcic, N.; Rangan, S.; Roh, W.; Sayeed, A.M. An overview of signal processing techniques for millimeter wave MIMO systems. IEEE J. Sel. Topics Signal Process. 2016, 10, 436–453. [Google Scholar] [CrossRef]
  13. Rappaport, T.S.; Sun, S.; Mayzus, R.; Zhao, H.; Azar, Y.; Wang, K.; Wong, G.N.; Schulz, J.K.; Samimi, M.; Gutierrez, F. Millimeter wave mobile communications for 5G cellular: It will work! IEEE Access 2013, 1, 335–349. [Google Scholar] [CrossRef]
  14. Studer, C.; Medjkouh, S.; Gönültaş, E.; Goldstein, T.; Tirkkonen, O. Channel charting: Locating users within the radio environment using channel state information. IEEE Access 2018, 6, 47682–47698. [Google Scholar] [CrossRef]
  15. Zhao, L.; Yang, Y.; Xiong, Q.; Wang, H.; Yu, B.; Sun, F.; Sun, C. A signature based approach towards global channel charting with ultra low complexity. arXiv 2024, 2403.20091. [Google Scholar] [CrossRef]
  16. Taner, S.; Palhares, V.; Studer, C. Channel charting in real-world coordinates with distributed MIMO. arXiv 2024, 2406.13722. [Google Scholar] [CrossRef]
Figure 1. PCA projection of the ULA channel manifold under single-path LoS propagation.
Figure 2. Distribution of the local mutual information bounds computed from the Jacobian of the channel manifold. Each point in the spatial grid corresponds to a small neighborhood around position X, with covariance Σ_X. The local bound is given by I(X; Y) ≤ log det(I_M + (1/σ²) J(X) Σ_X J(X)^H), highlighting how the spatial sensitivity of the channel affects the informative content of CSI. The histogram provides a global overview of variability in the mutual information across the environment.
Figure 3. Scaling of the Jacobian norm with the number of independent network layers S. Linear growth confirms the predicted geometric information scaling under angular diversity.
Figure 5. Scaling of the Jacobian norm with the number of antenna elements M.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.