ARTICLE | doi:10.20944/preprints202203.0403.v1
Subject: Computer Science And Mathematics, Artificial Intelligence And Machine Learning Keywords: behavioral change prediction; learned features; deep feature learning; handcrafted features; bidirectional long short-term memory; autoencoders; temporal convolutional neural network; clinical decision support system; multisensory stimulation therapy; physiological signals.
Online: 31 March 2022 (08:38:58 CEST)
Predicting change from multivariate time series has relevant applications ranging from medicine to engineering. Multisensory stimulation therapy for patients with dementia aims to change the patient's behavioral state; for example, patients who exhibit a baseline of agitation may be paced toward a relaxed state. This study aims to predict changes in behavioral state from physiological and neurovegetative parameters, in order to support the therapist during the stimulation session. To extract valuable indicators for predicting changes, both handcrafted and learned features were evaluated and compared. The handcrafted features were derived from the CATCH22 feature collection, while the learned ones were extracted using a Temporal Convolutional Network operating jointly with a Bidirectional Long Short-Term Memory autoencoder, which predicted the behavioral state. Compared with the state of the art, the learned-feature approach exhibits superior performance, with accuracy up to 99.42% for a 70-second time window and up to 98.44% for a 10-second window.
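The handcrafted branch of the pipeline can be illustrated with a minimal sketch: physiological signals are split into sliding windows, and each window is summarized by a feature vector. The window length, step, and the four statistics below are illustrative stand-ins chosen for this sketch; the study itself uses the 22 canonical features of the CATCH22 collection.

```python
import numpy as np

def sliding_windows(signal, win_len, step):
    """Split a 1-D physiological signal into overlapping windows."""
    return np.array([signal[i:i + win_len]
                     for i in range(0, len(signal) - win_len + 1, step)])

def handcrafted_features(window):
    """A few simple summary statistics standing in for the CATCH22 set."""
    diffs = np.diff(window)
    return np.array([
        window.mean(),                       # central tendency
        window.std(),                        # dispersion
        np.abs(diffs).mean(),                # mean absolute change
        (diffs[:-1] * diffs[1:] < 0).mean()  # rate of direction reversals
    ])

rng = np.random.default_rng(0)
signal = rng.normal(size=700)              # e.g. 70 s sampled at 10 Hz (assumed rate)
wins = sliding_windows(signal, win_len=100, step=50)
X = np.vstack([handcrafted_features(w) for w in wins])
print(X.shape)                             # (13, 4): one feature row per window
```

In the paper's learned-feature branch, this hand-coded summarization step is replaced by a Temporal Convolutional Network that learns the per-window representation directly from the raw signal.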
ARTICLE | doi:10.20944/preprints202108.0272.v1
Subject: Engineering, Industrial And Manufacturing Engineering Keywords: Remaining Useful Life; Deep Neural Network; Convolutional Neural Network; Genetic Optimization; Neural Network Optimization; Support Vector Regression; Depth Maps; Normal Maps; 3D Point Clouds.
Online: 12 August 2021 (10:40:23 CEST)
In the current industrial landscape, increasingly pervaded by technological innovation, the adoption of optimized strategies for asset management is becoming a critical success factor. Among the available strategies, "Prognostics and Health Management" supports maintenance management decisions more accurately, through continuous monitoring of equipment health and "Remaining Useful Life" forecasting. In the present study, Convolutional Neural Network-based Deep Neural Network techniques are investigated for the Remaining Useful Life prediction of a punch tool, whose degradation is caused by working-surface deformations during the machining process. Surface deformation is measured with a 3D scanning sensor that returns point clouds with micrometric accuracy during the operation of the punching machine, avoiding both downtime and human intervention. The 3D point clouds thus obtained are transformed into bidimensional image-type maps, i.e., maps of depths and normal vectors, to fully exploit the potential of convolutional neural networks for feature extraction. These maps are then processed by comparing 15 genetically optimized architectures against the transfer learning of 19 pre-trained models, using a classic machine learning approach, Support Vector Regression, as a benchmark. The results clearly show that, in this specific case, the genetically optimized architectures (MAPE=0.058) far outperform transfer learning (MAPE=0.416), which in turn achieves only a moderately better error than Support Vector Regression (MAPE=0.857).
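The conversion of a 3D point cloud into an image-type depth map, the step that makes the data amenable to convolutional feature extraction, can be sketched as a rasterization: each (x, y) cell of a regular grid keeps the height of the highest point falling into it. The grid size, the max-z aggregation rule, and the synthetic cloud below are assumptions of this sketch, not details taken from the paper.

```python
import numpy as np

def point_cloud_to_depth_map(points, grid=(64, 64)):
    """Rasterize an (N, 3) point cloud into a 2-D depth map by keeping,
    for every (x, y) grid cell, the maximum z value (surface height)."""
    xy = points[:, :2]
    # Normalize x, y coordinates into integer grid-cell indices.
    mins, maxs = xy.min(axis=0), xy.max(axis=0)
    idx = ((xy - mins) / (maxs - mins + 1e-12) * (np.array(grid) - 1)).astype(int)
    depth = np.full(grid, np.nan)
    for (ix, iy), z in zip(idx, points[:, 2]):
        if np.isnan(depth[ix, iy]) or z > depth[ix, iy]:
            depth[ix, iy] = z
    return np.nan_to_num(depth, nan=0.0)   # empty cells -> 0

rng = np.random.default_rng(1)
cloud = rng.uniform(size=(5000, 3))        # synthetic stand-in for scanner output
dmap = point_cloud_to_depth_map(cloud)
print(dmap.shape)                          # (64, 64) image-type map, CNN-ready
```

A normal-vector map, the second representation used in the study, would analogously store per-cell surface normals (e.g. estimated from local depth gradients) in the image channels.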
ARTICLE | doi:10.20944/preprints201710.0115.v1
Subject: Engineering, Electrical And Electronic Engineering Keywords: fall detection; vital signs monitoring; ultra-wideband radar; micro-Doppler
Online: 17 October 2017 (11:45:13 CEST)
Continuous in-home monitoring of older adults living alone aims to improve their quality of life and independence by detecting early signs of illness, functional decline, or emergency conditions. To meet the requirements for technology acceptance by seniors (unobtrusiveness, non-intrusiveness, privacy preservation), this study presents and discusses a new smart sensor system for the detection of abnormalities during daily activities, based on ultra-wideband radar providing rich, non-privacy-sensitive information useful for sensing both cardiorespiratory activity and body movements, regardless of ambient lighting conditions and physical obstructions (through-wall sensing). Radar sensing is a very promising technology, enabling the measurement of vital signs and body movements at a distance, and thus meeting both requirements of unobtrusiveness and accuracy. In particular, impulse-radio ultra-wideband radar has attracted considerable attention in recent years thanks to many properties that make it useful for assisted living purposes. The proposed sensing system, evaluated in meaningful assisted living scenarios involving 30 participants, exhibited the ability to detect vital signs, to discriminate between dangerous situations and activities of daily living, and to accommodate individual physical characteristics and habits. The reported results show that vital signs can be detected even while carrying out daily activities or after a fall event (post-fall phase), with accuracy varying according to the level of movement, reaching up to 95% and 91% in detecting respiration and heart rates, respectively. Similarly, good results were achieved in fall detection by using the micro-motion signature and unsupervised learning, with sensitivity and specificity greater than 97% and 90%, respectively.
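The core idea behind contactless vital-sign estimation from radar, recovering respiration and heart rates as spectral peaks in the measured chest displacement, can be sketched on synthetic data. The sampling rate, the 0.1-0.5 Hz respiration band, the 0.8-2.0 Hz cardiac band, and the synthetic displacement signal are all assumptions of this sketch; the abstract does not specify the paper's actual processing chain.

```python
import numpy as np

fs = 20.0                        # slow-time sampling rate (Hz), assumed
t = np.arange(0, 60, 1 / fs)     # 60 s observation
# Synthetic chest displacement: respiration (~0.3 Hz) + heartbeat (~1.2 Hz) + noise
disp = (4.0 * np.sin(2 * np.pi * 0.3 * t)
        + 0.5 * np.sin(2 * np.pi * 1.2 * t)
        + 0.1 * np.random.default_rng(2).normal(size=t.size))

spec = np.abs(np.fft.rfft(disp - disp.mean()))
freqs = np.fft.rfftfreq(t.size, 1 / fs)

def band_peak(lo, hi):
    """Return the frequency of the strongest spectral peak in [lo, hi] Hz."""
    band = (freqs >= lo) & (freqs <= hi)
    return freqs[band][np.argmax(spec[band])]

resp_bpm = band_peak(0.1, 0.5) * 60    # respiration band
heart_bpm = band_peak(0.8, 2.0) * 60   # cardiac band
print(round(resp_bpm), round(heart_bpm))   # 18 72
```

The accuracy figures reported in the abstract (up to 95% for respiration, 91% for heart rate) reflect how body movement during daily activities corrupts exactly this kind of spectral estimate, since motion energy can mask the much weaker cardiorespiratory components.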
REVIEW | doi:10.20944/preprints202305.0105.v1
Subject: Computer Science And Mathematics, Artificial Intelligence And Machine Learning Keywords: Review; Human action recognition; Smart living; Multimodality; Real-time processing; Interoperability; Resource-constrained processing; Sensing technology; Machine learning; Deep learning; Signal processing; Smart home; Smart environment; Smart city; Smart Community; Ambient Assisted Living
Online: 3 May 2023 (06:54:40 CEST)
Smart living, a concept that has gained increasing attention in recent years, revolves around integrating advanced technologies in homes and cities to enhance the quality of life for citizens. Sensing and human action recognition are crucial aspects of this concept. Smart living applications span various domains, such as energy consumption, healthcare, transportation, and education, which greatly benefit from effective human action recognition. This field, originating from computer vision, seeks to recognize human actions and activities using not only visual data but also many other sensor modalities. This paper comprehensively reviews the literature on human action recognition in smart living environments, synthesizing the main contributions, challenges, and future research directions. This review selects five key domains: Sensing Technology, Multimodality, Real-time Processing, Interoperability, and Resource-Constrained Processing, as they encompass the critical aspects required for successfully deploying human action recognition in smart living. These domains highlight the essential role that sensing and human action recognition play in successfully developing and implementing smart living solutions. This paper serves as a valuable resource for researchers and practitioners seeking to explore further and advance the field of human action recognition in smart living.
REVIEW | doi:10.20944/preprints202306.0672.v1
Subject: Computer Science And Mathematics, Artificial Intelligence And Machine Learning Keywords: Review; Human action recognition; Smart living; Services; Applications; Context Awareness; Data Availability; Personalization; Privacy; Sensing technology; Machine learning; Deep learning; Signal processing; Smart home; Smart environment; Smart city; Smart Community; Ambient Assisted Living
Online: 9 June 2023 (05:34:18 CEST)
Smart Living, an increasingly prominent concept, entails incorporating sophisticated technologies in homes and urban environments to elevate the quality of life for citizens. A critical success factor for Smart Living services and applications, from energy management to healthcare and transportation, is the efficacy of human action recognition (HAR). HAR, rooted in computer vision, seeks to identify human actions and activities using visual data and various sensor modalities. This paper extensively reviews the literature on HAR in Smart Living services and applications, amalgamating key contributions and challenges while providing insights into future research directions. The review delves into the essential aspects of Smart Living, the state of the art in HAR, and the potential societal implications of this technology. Moreover, the paper meticulously examines the primary application sectors in Smart Living that stand to gain from HAR, such as smart homes, smart healthcare, and smart cities. By underscoring the significance of the four dimensions of Context Awareness, Data Availability, Personalization, and Privacy in HAR, this paper serves as a valuable resource for researchers and practitioners striving to advance Smart Living services and applications.