Wearable systems for rehabilitation monitoring often rely on complex sensor configurations and produce outputs that are difficult to interpret, which limits their practical use. This study investigates whether movement quality can be assessed accurately and transparently from a reduced set of signals. Using wearable sensor data from lower-limb rehabilitation tasks performed under correct and intentionally erroneous conditions, we extracted a small set of rotation-based features and evaluated them within a supervised machine learning framework. The results show that these features reliably distinguish correct from incorrect movement, with classification accuracy around 0.70, while preserving clear biomechanical interpretability. Reduced sensor configurations retained, and in some cases improved, performance, with balanced accuracy reaching 0.947 and 0.917 in the examined tasks. A proof-of-concept real-time formulation further showed that movement deviations can be detected early within a repetition, while limiting false feedback on correct executions to approximately 9%. Overall, the findings demonstrate that movement-quality assessment is feasible with minimal sensing while also supporting early error detection and practical feedback. These properties are relevant to wearable rehabilitation systems, including IoT applications that depend on efficient sensing, interpretable analysis, and timely feedback.
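To make the described pipeline concrete, the following is a minimal sketch, not the authors' actual implementation: it extracts simple rotation-based features (range of motion, a mean angular-speed proxy, peak rotation) from per-repetition joint-angle traces and evaluates a supervised classifier with cross-validated balanced accuracy via scikit-learn. The synthetic data, feature choices, and classifier are illustrative assumptions.

```python
# Hypothetical sketch of rotation-feature classification; the data,
# feature set, and model are assumptions, not the study's pipeline.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def rotation_features(angle_deg):
    """Simple rotation-based features from one repetition's joint-angle trace."""
    return [angle_deg.max() - angle_deg.min(),   # range of motion
            np.abs(np.diff(angle_deg)).mean(),   # mean angular-speed proxy
            angle_deg.max()]                     # peak rotation

def synth_rep(correct):
    """Synthetic repetition: correct reps reach full amplitude, erroneous ones fall short."""
    t = np.linspace(0, np.pi, 100)
    amp = 60.0 if correct else 40.0 + 10.0 * rng.standard_normal()
    return amp * np.sin(t) + rng.normal(0.0, 2.0, t.size)

# 50 correct and 50 intentionally erroneous repetitions
X = np.array([rotation_features(synth_rep(c)) for c in [True] * 50 + [False] * 50])
y = np.array([1] * 50 + [0] * 50)

# Cross-validated balanced accuracy of a linear classifier on these features
scores = cross_val_score(LogisticRegression(), X, y, cv=5,
                         scoring="balanced_accuracy")
print(round(scores.mean(), 3))
```

A linear model is used deliberately: its coefficients map directly onto the named features, preserving the biomechanical interpretability the abstract emphasizes.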