This tutorial presents a first-principles framework for rigorous scientific inference, grounded in a minimal set of explicit, falsifiable methodological principles. These principles enforce transparency of priors and strictly forbid post-hoc modification. We argue that the century-long stagnation in Voynich Manuscript (VMS) research is not a failure of scholarly effort or data acquisition, but a systemic artifact of model selection. Specifically, the field has been constrained by the Patching Fallacy: the introduction of unconstrained auxiliary parameters to salvage a hypothesis already contradicted by the evidence. By adopting a strict Zero-Patch Standard rooted in information theory [2] and Bayesian probability [5], we demonstrate how to deduce a prior directly from the topological invariants of the data. Applied to the VMS, this discipline shows that common linguistic and cryptographic models are strongly disfavored under the Zero-Patch Standard. Instead, it supports a Structured Reference System (e.g., a Relational Database or Inventory) as the leading hypothesis consistent with the documented corpus invariants. This assignment is offered not as a settled historical claim, but as the information-theoretically minimal explanation, under the Zero-Patch constraint, of the entropy, morphological, and serialization invariants documented in the evidence.
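To make the notion of a documented corpus invariant concrete, the sketch below estimates one such invariant, the conditional character entropy H(X_{n+1} | X_n), from a plain-text transliteration using a standard plug-in bigram estimate. This is a minimal illustrative sketch, not the framework's full pipeline: the filename voynich_eva.txt is a hypothetical stand-in for an EVA transliteration, and the comparison figures quoted in the comments (VMS conditional entropy commonly reported near 2 bits per character, versus roughly 3-4 bits for most European languages) are values from the broader literature, not results established in this section.

```python
import math
from collections import Counter

def conditional_entropy(text: str) -> float:
    """Plug-in estimate of H(X_{n+1} | X_n) in bits from character bigrams."""
    bigrams = Counter(zip(text, text[1:]))   # joint counts n(a, b)
    contexts = Counter(text[:-1])            # context counts n(a)
    total = sum(bigrams.values())            # total number of bigrams
    h = 0.0
    for (a, b), n_ab in bigrams.items():
        p_ab = n_ab / total                  # joint P(a, b)
        p_b_given_a = n_ab / contexts[a]     # conditional P(b | a)
        h -= p_ab * math.log2(p_b_given_a)
    return h

if __name__ == "__main__":
    # Hypothetical usage: "voynich_eva.txt" stands in for any EVA
    # transliteration of the VMS; it is not bundled with this tutorial.
    # text = open("voynich_eva.txt", encoding="utf-8").read().replace("\n", "")
    sample = "qokeedy qokeedy okeedy qokedy okedy"  # toy stand-in string
    print(f"conditional entropy: {conditional_entropy(sample):.3f} bits/char")
```

A plug-in estimate of this kind is biased low on short texts, so any cross-corpus comparison should hold text length and alphabet granularity fixed. The point here is only that the invariants the prior conditions on are directly computable from the transliteration, with no free parameters available for patching.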