Video games have evolved into sophisticated media capable of eliciting complex affective states, yet traditional Dynamic Difficulty Adjustment (DDA) systems rely primarily on performance metrics rather than emotional feedback. This research proposes a novel closed-loop architecture for Affective Game Computing on mobile platforms, designed to infer player emotions directly from gameplay inputs and actively steer emotional transitions. A complete experimental platform, including a custom mobile game, was developed to collect gameplay telemetry and device sensor data. The proposed framework employs a sequence-to-sequence Transformer-based neural network to predict future game states and emotional responses without continuous camera monitoring, using facial expression analysis only as a ground-truth proxy during training. Crucially, to address the "cold-start" problem inherent in such optimization systems, where no historical data is available at the onset of a session, a secondary neural network is introduced. This component directly predicts optimal initial game parameters to elicit a specific target emotion, enabling immediate affective steering before sufficient gameplay history has accumulated. Experimental evaluation demonstrates that the model effectively interprets sparse emotional signals as discrete micro-affective events and that the optimization routine can shift the predicted emotional distribution toward a desired profile by manipulating game parameters. While the study identifies challenges regarding computational latency on consumer hardware and the reliance on proxy emotional labels, this work establishes a transparent, reproducible proof-of-concept and provides a scalable, non-intrusive baseline for future research on emotion-aware adaptation in entertainment and therapeutic serious games.
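
To make the cold-start component more concrete, the sketch below outlines one way such a secondary network could map a target emotion distribution to initial game parameters before any gameplay history exists. This is a minimal illustration under assumed conventions: the class name `ColdStartParameterPredictor`, the layer sizes, the number of emotion categories, and the normalization of game parameters to [0, 1] are all assumptions for illustration, not the architecture reported in this work.

```python
# Minimal PyTorch sketch of a cold-start predictor: target emotion
# distribution -> initial game parameters. All names, dimensions, and
# emotion categories are illustrative assumptions.
import torch
import torch.nn as nn


class ColdStartParameterPredictor(nn.Module):
    def __init__(self, num_emotions: int = 7, num_game_params: int = 8):
        super().__init__()
        # Small MLP mapping a desired emotion profile to a vector of
        # game parameters (assumed normalized to [0, 1]).
        self.net = nn.Sequential(
            nn.Linear(num_emotions, 64),
            nn.ReLU(),
            nn.Linear(64, 64),
            nn.ReLU(),
            nn.Linear(64, num_game_params),
            nn.Sigmoid(),
        )

    def forward(self, target_emotion: torch.Tensor) -> torch.Tensor:
        return self.net(target_emotion)


if __name__ == "__main__":
    predictor = ColdStartParameterPredictor()
    # Hypothetical target: steer the session toward one dominant emotion.
    target = torch.tensor([[0.05, 0.05, 0.05, 0.70, 0.05, 0.05, 0.05]])
    initial_params = predictor(target)
    print(initial_params.shape)  # torch.Size([1, 8])
```

In a deployed system, such a predictor would only bootstrap the session; once enough gameplay telemetry has accumulated, the sequence-to-sequence model and the optimization routine described above would take over the affective steering loop.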