The institutional drive to deploy digital assistive technologies—from IoT monitoring to AI companions—as a solution to the ageing care crisis functions as an ethical double-edged sword. This article argues that, beyond isolated risks, these technologies introduce a systemic tension in which gains in safety and efficiency often come at the cost of autonomy, human connection, and equity. We propose a critical framework that diagnoses four interconnected dimensions of this tension: (1) the erosion of privacy and autonomy through pervasive surveillance; (2) the risk of dehumanisation in high-tech, low-touch interactions; (3) the "digital grey divide" as a social determinant of health; and (4) the perpetuation of "coded ageism" through algorithmic bias. To bridge the gap between ethical principle and practice, the framework translates this diagnosis into a practical roadmap for "Dignity-by-Design." It operationalises person-centred care through three actionable shifts: moving from compliance to commitment, replacing static consent with dynamic engagement, and establishing the lived experience of older adults and caregivers as a core design standard via participatory action research. Ultimately, this work provides a critical tool for researchers, developers, and policymakers to guide the ethically aligned implementation of technologies that genuinely enhance autonomy, foster trust, and uphold dignity in geriatric care.