Physics-informed neural networks (PINNs) have emerged as powerful tools for solving partial differential equations, but their training remains challenging due to ill-conditioned loss landscapes. While adaptive methods like Adam dominate deep learning, they exhibit instability on stiff PDEs, and second-order methods are computationally prohibitive. We present EPANG-Gen, an optimizer that combines memory-efficient eigen-decomposition with lightweight Bayesian uncertainty quantification. EPANG-Gen introduces three elements: (1) a randomized eigenspace estimator that approximates Hessian curvature with O(dk) memory (k ≪ d), (2) Bayesian R-LayerNorm for per-activation uncertainty estimation, and (3) adaptive rank selection (PASA) that dynamically adjusts to problem difficulty. We evaluate EPANG-Gen on four benchmark PDEs—Poisson 1D, Burgers’ equation, Darcy flow, and Helmholtz 2D—and on the Taylor-Green vortex at Re = 100,000, a canonical 3D turbulence problem. All experiments were conducted under computational constraints (Kaggle, NVIDIA P100 GPU, limited epochs). Results show that EPANG-Gen achieves performance comparable to Adam on the toughest turbulent regime while eliminating the 25% catastrophic failure rate of ADOPT across 72 runs. Ablation studies confirm that eigen-preconditioning contributes to performance improvements of 11–35%. The built-in uncertainty estimates provide confidence metrics at negligible cost. This work represents an initial exploration of curvature-aware optimization for PINNs; further validation with larger compute resources is needed. Code is available at https://github.com/EPANG-Gen/EPANG-Gen.
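To make element (1) concrete, the abstract's randomized eigenspace estimator can be illustrated with generic randomized subspace iteration driven by Hessian-vector products. This is a minimal sketch of the general technique, not EPANG-Gen's implementation; the function `topk_eigenspace` and its parameters are hypothetical names, and the toy diagonal "Hessian" stands in for the PDE loss curvature. The key property matches the stated complexity: only a d × k basis is ever stored, never the d × d Hessian.

```python
import numpy as np

def topk_eigenspace(hvp, d, k, iters=20, seed=0):
    """Estimate the top-k eigenpairs of a symmetric operator via
    randomized subspace iteration, using only Hessian-vector products.

    hvp : callable v -> H @ v for an implicit symmetric d x d operator H.
    Memory is O(d*k): only the d x k orthonormal basis Q is kept.
    """
    rng = np.random.default_rng(seed)
    # Random orthonormal starting basis (d x k).
    Q, _ = np.linalg.qr(rng.standard_normal((d, k)))
    for _ in range(iters):
        # Power step: push the basis toward the dominant eigenspace.
        Z = np.column_stack([hvp(Q[:, j]) for j in range(k)])
        Q, _ = np.linalg.qr(Z)
    # Rayleigh-Ritz: solve a small k x k eigenproblem for the estimates.
    T = Q.T @ np.column_stack([hvp(Q[:, j]) for j in range(k)])
    evals, W = np.linalg.eigh(T)
    return Q @ W, evals  # eigenvector estimates (d x k), eigenvalues (ascending)

# Toy check: implicit diagonal Hessian with well-separated top modes.
H_diag = np.array([100.0, 10.0, 1.0] + [0.1] * 97)
hvp = lambda v: H_diag * v          # H @ v without materializing H
V, evals = topk_eigenspace(hvp, d=100, k=3)
```

In a PINN setting, `hvp` would be supplied by reverse-over-forward automatic differentiation of the residual loss, so the d × d Hessian is never formed; the resulting basis can then serve as a low-rank preconditioner for the gradient step.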