Facial expressions are crucial for conveying emotions and engaging in social interactions. The activation of facial muscles and their patterns of movement during emotional states are similar across all humans; hence, facial expressions are considered a behavioral phenotype. Facial features related to the expression of various emotions change under different health impairments, including cognitive decline and the experience of pain. Therefore, evaluating deviations in facial expressions relative to healthy baseline conditions can aid in the early detection of health impairments. Recent advances in machine learning and computer vision have introduced a multitude of tools for extracting human facial features, and researchers have explored the application of these tools to the early screening and detection of different health conditions. Progress in this area can be especially valuable for telemedicine and remote patient monitoring, and could potentially reduce the current excessive demand on the healthcare system. In addition, once developed, these technologies can assist healthcare professionals in emergency room triage, early diagnosis, and treatment. The aim of the present review is to discuss the available tools that can objectively measure facial features and to survey the studies that apply these tools to various health assessments.