Detecting distributional drift is central to reliable inference in evolving systems, yet existing approaches treat it as a static discrepancy measured by fixed divergence functionals. We introduce an information-geometric approach grounded in the Kerimov–Alekberli (KA) framework, in which drift is modeled as a non-equilibrium trajectory on a curved statistical manifold. We make three contributions. First, we establish a formal impossibility result: no fixed divergence can achieve uniform optimality under non-stationary dynamics (Theorem 3.1). Second, we connect drift detection to entropy production, linking statistical inference with physical irreversibility (Theorem 4.1). Third, we introduce an asymmetric entropy operator A(θ) into the KA drift equation, a directional term that amplifies early entropy signals through a skew-symmetric perturbation of the gradient flow. In 50 Monte Carlo runs on non-stationary Gaussian processes, the adaptive divergence achieves mean detection at step 113 ± 11 versus 117 ± 13 for fixed KL (reference onset at step 100), demonstrating no-regret behaviour: the method is never significantly worse than the best fixed metric, while outperforming scalar baselines (ADWIN, Page-Hinkley) by wide margins (p < 10⁻⁴). These results motivate the principle of adaptive divergence, whereby the notion of distance between distributions must itself evolve in response to system dynamics.
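As an illustrative sketch of the third contribution, and assuming for exposition that the KA drift equation takes a natural-gradient-flow form (the precise equation and the form of A(θ) are not specified in this abstract), a skew-symmetric perturbation of the flow can be written as

\[
\dot{\theta}_t \;=\; -\bigl(G(\theta_t)^{-1} + A(\theta_t)\bigr)\,\nabla_{\theta} D\!\left(p_{\theta_t}\,\Vert\,q_t\right),
\qquad A(\theta)^{\top} = -A(\theta),
\]

where G(θ) is the Fisher information metric and q_t is the evolving data distribution. Because A is skew-symmetric, the quadratic form it induces on the gradient vanishes, so the perturbation contributes nothing to the instantaneous decay of D; it only redirects the trajectory, which is one way a directional term can amplify early entropy signals without adding dissipation.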