Large language models (LLMs) and foundation models (FMs) are reshaping petroleum engineering at a pace no previous wave of artificial intelligence has matched. Between 2022 and 2026 the field went from zero petroleum-specific LLMs to eighteen domain-specialized models, more than a dozen subsurface foundation models, and more than twenty commercial industry platforms, while annual publication counts grew more than five-fold from 2020 to 2024. This survey integrates those developments into a single framework. We analyze 296 verified references spanning 2003–2026 across 14 thematic areas and six petroleum sub-disciplines plus one cross-cutting category (geophysics, drilling, reservoir, production, petrophysics, completions, and cross-cutting), tracing the field from classical natural-language-processing baselines through today's vision–language models, retrieval-augmented generation stacks, and autonomous agents. Our organizing contributions include (i) a positioning matrix against 25 prior surveys, (ii) a bubble-plot taxonomy of sub-disciplines against AI paradigms, (iii) seven application-category tables, six additional thematic tables, and a dedicated maturity-model table (fourteen tables in total), (iv) a catalog of public petroleum AI systems and their enabling substrate, and (v) the PetroLLM Maturity Model, a five-level scaffold (L1 Conversational Q&A, L2 Document Intelligence and Retrieval, L3 Domain-Specialized LLMs, L4 Autonomous Agents and Copilots, L5 Self-Improving Foundation-Model Ecosystems) that situates every surveyed system on a common ladder. The paper closes with a bibliometric snapshot (trends, sub-discipline distribution, method distribution, institutional footprint) and an open research agenda spanning data, benchmarks, physics integration, safety, multilinguality, and standards.
Our headline findings: geophysics leads in adoption, reservoir and production lag, petroleum-specific benchmarks are scarce, industry deployments outpace academic publication, and L5 self-improving ecosystems remain aspirational but within a realistic 2030 horizon.