ARTICLE | doi:10.20944/preprints202007.0101.v1
Keywords: Term deposit subscription; Neural network; GRU; Convolutional layers; DT; MLP; k-NN
Online: 6 July 2020 (09:13:34 CEST)
Banks normally offer two kinds of deposit accounts: demand deposits such as current/savings accounts, and term deposits such as fixed or recurring deposits. From both the bank's and the customer's perspective, term deposits can maximize profit and accelerate growth in the financial sector. This paper focuses on the likelihood of customers subscribing to a term deposit. Bank campaign efforts and analysis of customer details can influence the chances of a term deposit subscription. This paper presents an automated system that predicts the possibility of a term deposit investment in advance. We propose a deep learning based hybrid model that stacks convolutional layers and Recurrent Neural Network (RNN) layers as the predictive model. For the RNN, a Gated Recurrent Unit (GRU) is employed. The proposed predictive model is then compared with benchmark classifiers such as the k-Nearest Neighbor (k-NN), Decision Tree (DT), and Multi-Layer Perceptron (MLP) classifiers. The experimental study concludes that the proposed model attains an accuracy of 89.59% and an MSE of 0.1041, outperforming the other baseline models.
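The gating mechanism of a GRU layer can be illustrated with a minimal scalar forward pass in plain Python; the weights below are illustrative placeholders, not the paper's trained model:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def gru_step(h_prev, x, w):
    """One scalar GRU step: w holds update (z), reset (r) and candidate weights."""
    z = sigmoid(w["wz"] * x + w["uz"] * h_prev)                 # update gate
    r = sigmoid(w["wr"] * x + w["ur"] * h_prev)                 # reset gate
    h_tilde = math.tanh(w["wh"] * x + w["uh"] * (r * h_prev))   # candidate state
    return (1.0 - z) * h_prev + z * h_tilde                     # gated interpolation

# Hypothetical weights, for demonstration only
w = {"wz": 0.5, "uz": 0.1, "wr": 0.8, "ur": 0.2, "wh": 1.0, "uh": 0.4}
h = 0.0
for x in [0.2, -0.1, 0.7]:   # a toy input sequence
    h = gru_step(h, x, w)
```

Because the new state is a convex combination of the previous state and a tanh-bounded candidate, the hidden state stays in (−1, 1), which is what lets GRU layers track long input sequences stably.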
ARTICLE | doi:10.20944/preprints202006.0360.v1
Keywords: Term deposit subscription; 10-fold stratified cross-validation; Neural network; DT; MLP; k-NN
Online: 30 June 2020 (08:22:58 CEST)
From both the bank's and the customer's perspective, term deposits can maximize profit and accelerate growth in the financial sector. This paper focuses on the likelihood of customers subscribing to a term deposit. Bank campaign efforts and customer details are influential when considering the possibility of a term deposit subscription. This paper presents an automated system that predicts the possibility of a term deposit investment in advance. A neural network (NN) with stratified 10-fold cross-validation is proposed as the predictive model and is then compared with benchmark classifiers such as the k-Nearest Neighbor (k-NN), Decision Tree (DT), and Multi-Layer Perceptron (MLP) classifiers. The experimental study concluded that the proposed model provides significantly better prediction results than the other baseline models, with an accuracy of 88.32% and a Mean Squared Error (MSE) of 0.1168.
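Stratified k-fold splitting preserves the class ratio (subscribed vs. not) in every fold, which matters for imbalanced campaign data; a simplified plain-Python sketch, not the authors' implementation:

```python
import random
from collections import defaultdict

def stratified_kfold(labels, k=10, seed=0):
    """Yield (train_idx, test_idx) pairs with class ratios preserved per fold."""
    rng = random.Random(seed)
    by_class = defaultdict(list)
    for i, y in enumerate(labels):
        by_class[y].append(i)
    folds = [[] for _ in range(k)]
    for idxs in by_class.values():
        rng.shuffle(idxs)
        for j, i in enumerate(idxs):      # deal each class round-robin into folds
            folds[j % k].append(i)
    for f in range(k):
        test = sorted(folds[f])
        train = sorted(i for g in range(k) if g != f for i in folds[g])
        yield train, test

# Toy labels: 80 "no" vs. 20 "yes", mimicking class imbalance in campaign data
labels = ["no"] * 80 + ["yes"] * 20
splits = list(stratified_kfold(labels, k=10))
```

Each of the 10 test folds then contains exactly 8 "no" and 2 "yes" examples, so per-fold accuracy estimates are comparable.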
ARTICLE | doi:10.20944/preprints202012.0310.v1
Subject: Life Sciences, Biochemistry Keywords: Variola major; phylogeographical analysis; long-term calibrations; short-term calibrations
Online: 14 December 2020 (09:21:34 CET)
In order to reconstruct the origin and dispersion pathways of variola virus (VARV), we analyzed 47 VARV isolates available in public databases and their SNPs. The mean substitution rate of the whole genomes was 9.41×10⁻⁶ (95% HPD: 8.5–11.3×10⁻⁶) substitutions/site/year. The time of the tree root was estimated at a mean of 68 years (95% HPD: 60.5–75.9). The phylogeographical analysis showed that the Far East and India were the most probable locations of the tree root and of the inner nodes, respectively, whereas for the outer nodes it corresponded to the sampling locations. The Bayesian Skyline plot showed that the effective number of infections started to grow exponentially in 1915–1920, peaked in the 1940s, and then decreased to zero. Our results suggest that the VARV major strains circulating between the 1940s and 1970s probably shared a common ancestor that originated in the Far East and subsequently moved to India, which became the center of its dispersion to eastern and southern Africa, and then to central Africa and the Middle East, probably following the movements of people between south-eastern Asia and other places with a common colonial history. These findings may help to explain the controversial reconstructions of the history of VARV obtained using long- and short-term calibrations.
ARTICLE | doi:10.20944/preprints202212.0201.v1
Online: 12 December 2022 (12:11:00 CET)
Conclusive evidence that specific long-term memory formation relies on dendritic growth and structural synaptic changes has proven elusive. Connectionist models of memory based on this hypothesis are confronted with the so-called plasticity–stability dilemma, or catastrophic interference. Other fundamental limitations of these models are the feature binding problem, the speed of learning, the capacity of the memory, the localisation of an event in time, and the problem of spatio-temporal pattern generation. This paper suggests that the generalisation and long-term memory mechanisms are not correlated. Only the development and improvement of the feature extractors in the cortex involve structural synaptic changes. We suggest that long-term memory has a separate mechanism which involves protein synthesis to encode the information into the structure of these proteins. A model of memory should be capable of explaining the difference between memorisation and learning. In our approach, learning has two different mechanisms. Generalisation in the brain is handled by the proper development of the links between neurons via synapses; the Hebbian learning rule could be applied only to this part of learning. Storing an internal firing pattern involves, in our approach, a new mechanism which puts the information regarding this firing pattern into the structure of special proteins in such a way that it can be retrieved later. The hypotheses introduced in this article include a physiological assumption which has not yet been verified because it is not currently experimentally accessible. Keywords: Waves, Protein Synthesis, Resonance, Long Term Memory. Preprint submitted to Neural Networks
ARTICLE | doi:10.20944/preprints202210.0054.v1
Subject: Social Sciences, Geography Keywords: Cameroon; rainfall; long-term variability; trend tests
Online: 6 October 2022 (08:17:50 CEST)
The study of rainfall over the long term is essential for understanding climate change and for socioeconomic development. The main goal of this study is to explore the spatial and temporal variations of precipitation at different time scales (seasonal and annual) in Cameroon. The Mann–Kendall and Pettitt tests were applied to analyze precipitation variability. Temporally, the different regions of Cameroon have recorded significant drops in annual rainfall, which Pettitt's test generally situates around the 1970s. The decreases observed for the northern regions of Cameroon are between −5.4% (Adamawa) and −7.4% (Far North). Those of the western regions oscillate between −7.5% (South-West) and −12.5% (West). The southern regions of Cameroon recorded decreases varying between −4.3% (East) and −5.9% (Center). Spatially, the divisions of the northern, western and southern parts of Cameroon respectively recorded, after the 1970s (a pivotal period in the temporal evolution of precipitation), a precipitation decrease towards the South, the South-West and the West. This study's findings could be helpful for planning and managing water resources in Cameroon.
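The Mann–Kendall test is built on the signs of all pairwise differences in the series; a minimal sketch of its S statistic, using toy rainfall values rather than the study's data:

```python
def mann_kendall_s(series):
    """Mann-Kendall S statistic: positive for an upward trend, negative for downward."""
    n = len(series)
    s = 0
    for i in range(n - 1):
        for j in range(i + 1, n):
            diff = series[j] - series[i]
            s += (diff > 0) - (diff < 0)   # sign of each pairwise difference
    return s

# Toy annual rainfall series (mm) with a step decrease around the middle,
# loosely echoing the 1970s break the Pettitt test locates
rainfall = [1200, 1180, 1210, 1190, 1100, 1080, 1090, 1070]
s = mann_kendall_s(rainfall)
```

A strongly negative S, compared against its null variance, is what yields the significant downward trends reported for the Cameroonian regions.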
ARTICLE | doi:10.20944/preprints201904.0058.v1
Subject: Social Sciences, Econometrics & Statistics Keywords: load forecast; short term; probabilistic; Gaussian processes
Online: 4 April 2019 (16:01:54 CEST)
We provide a comprehensive framework for forecasting five-minute load using Gaussian processes with a positive definite kernel specifically designed for load forecasts. Gaussian processes are probabilistic, enabling us to draw samples from a posterior distribution and provide rigorous uncertainty estimates to complement the point forecast, an important benefit for forecast consumers. As part of the modeling process, we discuss various methods for dimension reduction and explore their use in effectively incorporating weather data into the load forecast. We provide guidance for every step of the modeling process, from model construction through optimization and model combination. We provide results on data from the PJM ISO for various periods in 2018. The process is transparent, mathematically motivated, and reproducible. The resulting model provides a probability density of five-minute forecasts for 24 hours.
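The core of a GP point forecast is the kernel-weighted posterior mean; a minimal two-point sketch with a generic squared-exponential kernel (hypothetical load values, not the paper's PJM data or its custom load kernel):

```python
import math

def rbf(x1, x2, ell=1.0, sf=1.0):
    """Squared-exponential kernel; ell is the length scale, sf the signal std."""
    return sf**2 * math.exp(-0.5 * ((x1 - x2) / ell) ** 2)

def gp_posterior_mean(X, y, x_star, noise=1e-6):
    """Posterior mean at x_star for a zero-mean GP with two training points."""
    k11 = rbf(X[0], X[0]) + noise
    k22 = rbf(X[1], X[1]) + noise
    k12 = rbf(X[0], X[1])
    det = k11 * k22 - k12 * k12
    # alpha = K^{-1} y via the closed-form 2x2 inverse
    a0 = ( k22 * y[0] - k12 * y[1]) / det
    a1 = (-k12 * y[0] + k11 * y[1]) / det
    return rbf(x_star, X[0]) * a0 + rbf(x_star, X[1]) * a1

# Two hypothetical load observations (time in hours, load in GW)
X, y = [0.0, 1.0], [30.0, 34.0]
m = gp_posterior_mean(X, y, 0.5)
```

With near-zero noise the posterior mean interpolates the training points exactly; the posterior variance (not shown) is what supplies the uncertainty bands the abstract emphasizes.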
CASE REPORT | doi:10.20944/preprints201809.0410.v1
Subject: Behavioral Sciences, Social Psychology Keywords: long-term care, technology, therapy, virtual reality
Online: 20 September 2018 (13:34:02 CEST)
In this study, 6 residents of a long-term care facility were asked to try on Virtual Reality glasses and report their first experiences with Virtual Reality. The results show that Virtual Reality is of great interest to elderly residents of in-patient long-term care facilities. The wearing period was longer than expected and no symptoms of cybersickness occurred. For the residents it was exciting to explore the virtual environments. Austrian destinations, nature scenes in the mountains and forests, but also trips to the zoo, the museum, churches, or even densely populated areas like shopping streets or train stations, would be places the residents would like to explore virtually. Far-off destinations such as Rio de Janeiro or the Caribbean are more of an exception. Biographically relevant places such as the parental home or the location of their wedding were not named. Concerning usability, an adjustment of the VR glasses is in any case necessary for longer-term use.
Subject: Life Sciences, Biochemistry Keywords: CA3-CA1 synapses; NMDA; AMPA; systems biology; multiscale modeling; synaptic plasticity; long term potentiation; long term depression; hippocampus
Online: 8 January 2021 (13:17:31 CET)
Inside hippocampal circuits, the neuroplasticity events that individual cells may undergo during synaptic transmission occur in the form of Long Term Potentiation (LTP) and Long Term Depression (LTD). The high density of NMDA receptors expressed on the surface of dendritic CA1 spines confers on hippocampal CA3-CA1 synapses the ability to easily undergo NMDA-mediated LTP and LTD, which is essential for some forms of explicit learning in mammals. Providing a comprehensive kinetic model that can be used for running computer simulations of the synaptic transmission process is currently a major challenge. Here, we propose a compartmentalized kinetic model for CA3-CA1 synaptic transmission. Our major goal was to tune the model so as to predict the functional impact caused by disease-associated variants of NMDA receptors related to severe cognitive impairment. Indeed, for the variants Glu413Gly and Cys461Phe, our model predicts negative shifts in glutamate affinity and changes in kinetic behavior, consistent with experimental data. These results point to the predictive power of this multiscale viewpoint, which aims to integrate the quantitative kinetic description of large interaction networks, typical of systems biology approaches, with a focus on the quality of a few key molecular interactions, typical of structural biology ones.
HYPOTHESIS | doi:10.20944/preprints202104.0060.v1
Subject: Behavioral Sciences, Applied Psychology Keywords: Human Memory; Long-term Memory; Episodic; Implicit; Explicit
Online: 2 April 2021 (12:02:21 CEST)
Memory is probably one of the most complex cognitive functions in humans, and over many years, thousands of studies have helped us to better understand this brain function. One of the reference textbooks in neuroscience that elaborates on memory function was written by Prof. Kandel and his colleagues. In this book, I encountered a number of ambiguities in its explanation of the memory system. Here, I share those points, either to find an answer for them or to let them serve as suggestions for future work. Prof. Kandel has spent most of his meritorious lifetime studying the memory system; however, the brain is extremely complex, and as a result, we still have many years to go before we comprehensively understand the neural mechanisms of brain functions.
ARTICLE | doi:10.20944/preprints202102.0185.v1
Subject: Earth Sciences, Atmospheric Science Keywords: atmosphere; aerosol; background; particle size; long term; Mediterranean
Online: 8 February 2021 (10:56:35 CET)
The Eastern Mediterranean is a highly populated area with air quality problems, where climate change is already noticeable in higher temperatures and changing precipitation patterns. Anthropogenic aerosol affects health, and changing concentrations and properties of the atmospheric aerosol affect the radiation balance and clouds. Continuous long-term observations are essential in assessing the influence of anthropogenic aerosols on climate and health. We present 6 years of observations from the Navarino Environmental Observatory (NEO), a new station located at the south-west tip of the Peloponnese, Greece. The two sites at NEO were evaluated to show the influence of local meteorology, and also to assess the general background aerosol where possible. It was found that the background aerosol originated from aged European aerosols and was strongly influenced by biomass burning, fossil fuel combustion, and industry. When subsiding into the boundary layer, local sources contributed to the air masses moving south. Mesoscale meteorology determined the diurnal variation of aerosol properties such as mass and number by means of a typical sea breeze circulation, giving rise to pronounced morning and evening peaks in pollutant levels, while synoptic-scale meteorology, mainly large-scale air mass transport and precipitation, strongly influenced the seasonality of the aerosol properties.
ARTICLE | doi:10.20944/preprints202007.0640.v1
Subject: Engineering, Energy & Fuel Technology Keywords: long-term energy storage; fossil fuels; energy transition
Online: 26 July 2020 (16:38:35 CEST)
Great Britain’s stocks of coal, natural gas, and petroleum have seen major changes to the levels of stored energy over the years 2005 to 2019, a reduction of 200 TWh (35%) from 570 TWh to 370 TWh. The transformation of its electrical system over this timeframe saw a reduction in coal generation, leading to a corresponding reduction of the levels of stockpiled coal of 85 TWh (68%), partially offset by an increase in the stocks of biomass for electrical generation. The reduction in natural gas storage of 24 TWh (44%) was primarily due to the closure of Britain’s only long-term seasonal natural gas storage facility in January 2018. This was partially offset by the construction of medium-term natural gas storage facilities and the use of LNG storage in the years preceding its closure. For stocks of crude oil and oil products the reduction was 35 TWh (21%), linked to the overall reduction in demand.
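The headline figures are internally consistent; a quick arithmetic check using the values quoted in the abstract:

```python
# Reported total stored-energy levels (TWh) at the start and end of 2005-2019
start, end = 570, 370
drop = start - end                  # net reduction in TWh
pct = round(100 * drop / start)     # net reduction as a percentage

# Component reductions (TWh) quoted for coal, natural gas, and oil stocks;
# these need not sum to the net drop because of offsets such as biomass stocks
components = {"coal": 85, "gas": 24, "oil": 35}
```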
ARTICLE | doi:10.20944/preprints201910.0180.v1
Subject: Medicine & Pharmacology, Oncology & Oncogenics Keywords: long term survival; Glioblastoma; IDH; EGFR; Ki67; p53
Online: 16 October 2019 (08:30:25 CEST)
Background: Glioblastoma (GBM) is, to date, generally burdened by a dismal prognosis, although Long Term Survivors have a relatively significant incidence. Our specific aim was to determine the exact impact of several surgery-, patient- and tumor-related variables on survival parameters. Methods: The surgical, radiological and clinical outcomes of the patients were retrospectively reviewed for the present study. All patients were operated on in our Institution and classified according to their Overall Survival as LTS (Long Term Survivors) or STS (Short Term Survivors). A thorough review of our surgical series was conducted to compare the oncologic results of the patients with regard to 1. surgical, 2. molecular, and 3. treatment-related features. Results: A total of 177 patients were included in the final cohort. Extensive statistical analysis by means of univariate, multivariate and survival analyses disclosed a survival advantage for patients presenting a younger age, a smaller lesion and a better functional status at presentation. From the histochemical point of view, Ki67 (%) was the strongest predictor of better oncologic outcomes. A stepwise analysis of variance outlined the existence of 8 prognostic subgroups according to the molecular patterns of Ki67 overexpression and EGFR, p53 and IDH mutations. Conclusions: On the grounds of our statistical analyses we can affirm that the following factors were significant predictors of a survival advantage: KPS, age, volume of the lesion, motor disorder at presentation, and Ki67 overexpression. Fine molecular profiling makes it feasible to precisely stratify the prognosis of GBM patients.
Subject: Engineering, Electrical & Electronic Engineering Keywords: wind power forecasting; short-term prediction; hybrid deep learning; wind farm; long short term memory; gated recurrent network and convolutional layers
Online: 22 September 2020 (03:45:59 CEST)
Accurate forecasting of wind power generation plays a key role in improving the operation and management of a power system network and thereby its reliability and security. However, predicting wind power is complex due to the existence of high non-linearity in wind speed that eventually relies on prevailing weather conditions. In this paper, a novel hybrid deep learning model is proposed to improve the prediction accuracy of very short-term wind power generation for the Bodangora Wind Farm located in New South Wales, Australia. The hybrid model consists of convolutional layers, gated recurrent unit (GRU) layers and a fully connected neural network. The convolutional layers have the ability to automatically learn complex features from raw data while the GRU layers are capable of directly learning multiple parallel sequences of input data. The data sets of five-minute intervals from the wind farm are used in case studies to demonstrate the effectiveness of the proposed model against other advanced existing models, including long short-term memory (LSTM), GRU, autoregressive integrated moving average (ARIMA) and support vector machine (SVM), which are tuned to optimise outcome. It is observed that the hybrid deep learning model exhibits superior performance over other forecasting models to improve the accuracy of wind power forecasting, numerically, up to 1.59 per cent in mean absolute error, 3.73 per cent in root mean square error and 8.13 per cent in mean absolute percentage error.
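The three reported error measures are standard; a minimal sketch of their definitions, using toy forecast values rather than the Bodangora data:

```python
import math

def mae(y_true, y_pred):
    """Mean absolute error."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    """Root mean square error; penalizes large errors more heavily than MAE."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

def mape(y_true, y_pred):
    """Mean absolute percentage error; assumes no zero targets in the window."""
    return 100 * sum(abs((t - p) / t) for t, p in zip(y_true, y_pred)) / len(y_true)

# Toy five-minute wind power observations vs. forecasts (MW) -- hypothetical numbers
y_true = [10.0, 12.0, 8.0, 11.0]
y_pred = [9.5, 12.5, 8.5, 10.0]
```

RMSE is always at least as large as MAE, so the two together indicate whether the error budget is dominated by a few large misses or spread evenly.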
ARTICLE | doi:10.20944/preprints202107.0122.v1
Subject: Medicine & Pharmacology, Allergology Keywords: peri-implantitis; electrolytic cleaning; air abrasive; augmentation; long term
Online: 6 July 2021 (08:06:56 CEST)
Background: this RCT assesses the 18-month clinical outcomes after regenerative therapy of peri-implantitis lesions using either an electrolytic method (EC) to remove biofilms or a combination of powder spray and electrolytic method (PEC). Materials and Methods: Twenty-four patients (24 implants) suffering from peri-implantitis were randomly treated by EC or PEC, followed by augmentation and submerged healing. Probing pocket depth (PPD), bleeding on probing (BoP), suppuration and standardized radiographs were assessed before surgery (T0), 6 months after augmentation (T1), and 6 (T2) and 12 (T3) months after replacement of the restoration. Results: Mean PPD changed from 5.8 ± 1.6 mm (T0) to 3.1 ± 1.4 mm (T3). While BoP and suppuration at T0 were 100%, BoP decreased to 36.8% at T2 and 35.3% at T3. Suppuration was found in 10.6% at T2 and 11.8% at T3. The radiologic bone level, measured from the implant shoulder to the first visible bone-to-implant contact, was 4.9 ± 1.9 mm at mesial and 4.4 ± 2.2 mm at distal sites (T0), and 1.7 ± 1.7 mm and 1.5 ± 17 mm at T3. Conclusions: Significant radiographic bone fill and improvement of clinical parameters were demonstrated 18 months after therapy.
ARTICLE | doi:10.20944/preprints202101.0134.v1
Subject: Medicine & Pharmacology, Allergology Keywords: Cardiac arrest; normothermia; EEG; SSEP; GWR; long term predictors
Online: 8 January 2021 (10:26:27 CET)
Introduction: Early prediction of long-term outcomes in patients resuscitated after cardiac arrest (CA) is still challenging. Guidelines suggest a multimodal approach combining multiple predictors. We evaluated whether the combination of electroencephalography (EEG) reactivity, the somatosensory evoked potential (SSEP) cortical complex, and the gray-to-white matter ratio (GWR) on brain computed tomography (CT) at different temperatures could predict survival and good outcome at hospital discharge and after six months. Methods: We performed a retrospective cohort study including consecutive adult, non-traumatic patients resuscitated from out-of-hospital CA who remained comatose on admission to our intensive care unit from 2013 to 2017. We acquired SSEPs and EEGs during treatment at 36°C and after rewarming at 37°C; the GWR was calculated on the brain CT scan performed within six hours of hospital admission. We primarily hypothesized that SSEP was associated with favorable long-term functional outcome, and secondarily that SSEP provides information independent of EEG and CT. Outcomes were evaluated using the Cerebral Performance Category (CPC) scale at six months from discharge. Results: Of 171 resuscitated patients, 75 were excluded due to missing data or uninterpretable neurophysiological findings. EEG reactivity at 37°C was shown to be the best single predictor of good outcome (AUC 0.803), while N20P25 was the best single predictor of survival at each time point (AUC 0.775 at discharge and AUC 0.747 at six-month follow-up). The predictive value of a model including EEG reactivity, average GWR, and SSEP N20P25 amplitude was superior (AUC 0.841 for survival and 0.920 for good outcome) to any combination of two tests or any single test.
Conclusion: Our study, in which life-sustaining treatments were never suspended, suggests that the SSEP cortical complex N20P25, after normothermia and off sedation, is a reliable predictor of survival at any time. When the SSEP cortical complex N20P25 is added to a model with average GWR and EEG reactivity, the predictive value for good outcome and long-term survival is superior to that of each single test alone.
REVIEW | doi:10.20944/preprints202012.0779.v1
Subject: Medicine & Pharmacology, Allergology Keywords: Social isolation; risk factors; older adults; long-term care
Online: 31 December 2020 (09:24:17 CET)
Objectives: A wealth of literature has established risk factors for social isolation among older people; however, much of this research has focused on community-dwelling populations. Relatively little is known about how the risk of social isolation is experienced among those living in long-term care (LTC) homes. We conducted a scoping review to identify possible risk factors for social isolation among older adults living in LTC homes. Methods: A systematic search of five online databases retrieved 1535 unique articles. Eight studies met the inclusion criteria. Results: Thematic analyses revealed that possible risk factors exist at three levels: individual (e.g., communication barriers), systems (e.g., location of the LTC facility), and structural (e.g., discrimination). Discussion: Our review identified several risk factors for social isolation that have been previously documented in the literature, in addition to several risks that may be unique to those living in LTC homes. The results highlight several scholarly and practical implications.
ARTICLE | doi:10.20944/preprints202012.0688.v1
Subject: Mathematics & Computer Science, Algebra & Number Theory Keywords: optimal decay; general decay; swelling porous problem; memory term
Online: 28 December 2020 (11:26:20 CET)
The present work studies a swelling porous-elastic system with viscoelastic damping. We establish a general and optimal decay estimate which generalizes some recent results in the literature. Our result is established without imposing the usual equal-wave-speed condition associated with similar problems in literature.
ARTICLE | doi:10.20944/preprints202007.0509.v1
Subject: Medicine & Pharmacology, Ophthalmology Keywords: laser excimer; myopia surgery; long term; Femto-LASIK; PRK
Online: 22 July 2020 (09:53:46 CEST)
Refractive surgery is an increasingly popular procedure to decrease spectacle or contact lens dependency. The two most commonly used surgical techniques to correct myopia are photorefractive keratectomy (PRK) and femtosecond LASIK (FS-LASIK). Few publications gather such a long-term follow-up of both surgical techniques. (2) Methods: A retrospective non-randomized study was performed; 509 PRK eyes and 310 FS-LASIK eyes treated for myopia and compound myopic astigmatism were followed for 10 years. Patients were followed up at three months, one year, 2 years, 5 years, and 10 years. The safety index of both procedures was defined as the quotient of the postoperative BCVA (Best Corrected Visual Acuity) and the preoperative BCVA. Predictability was calculated as the difference between the expected spherical equivalent and the achieved spherical equivalent. The efficacy index was calculated as the quotient of the postoperative UCVA (Uncorrected Visual Acuity) divided by the preoperative BCVA. (3) Results: The results were a safety index higher than 100% (109%) and an efficacy index of 82.4% after 10 years of PRK surgery in both groups. FS-LASIK was the safest surgery after 10 years and the most efficacious technique, although in this case there were no statistically significant differences. (4) Conclusions: All these data demonstrated better indexes for FS-LASIK.
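The safety and efficacy indices are simple quotients of visual acuities; a minimal sketch with hypothetical decimal acuities chosen to match the reported 109% safety index:

```python
def safety_index(bcva_post, bcva_pre):
    """Safety index: postoperative BCVA divided by preoperative BCVA, as a percentage."""
    return 100 * bcva_post / bcva_pre

def efficacy_index(ucva_post, bcva_pre):
    """Efficacy index: postoperative UCVA divided by preoperative BCVA, as a percentage."""
    return 100 * ucva_post / bcva_pre

# Hypothetical decimal acuities: preop BCVA 1.0, postop BCVA 1.09, postop UCVA 0.82
si = safety_index(1.09, 1.0)
ei = efficacy_index(0.82, 1.0)
```

An index above 100% means the eye sees better after surgery than it did with best correction before, which is why a 109% safety index is a strong result.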
ARTICLE | doi:10.20944/preprints202006.0280.v1
Subject: Medicine & Pharmacology, Obstetrics & Gynaecology Keywords: antenatal stress; hair cortisol; term-placentae; RT-qPCR; human
Online: 21 June 2020 (16:30:08 CEST)
Anxiety, chronic stress and depression during pregnancy are considered to affect the offspring, presumably through placental dysregulation. We studied the term placentae of pregnancies clinically monitored with the Beck Anxiety Inventory (BAI) and the Edinburgh Postnatal Depression Scale (EPDS). A cutoff threshold of 10 for BAI/EPDS classed patients into an Index group (>10, n=23) and a Control group (<10, n=23). Cortisol concentrations in hair (HCC) were monitored periodically throughout pregnancy and delivery. Expression differences of the main glucocorticoid pathway genes, i.e. corticotropin-releasing hormone (CRH), 11β-hydroxysteroid dehydrogenase (HSD11B2) and the glucocorticoid receptor (NR3C1), as well as other key stress biomarkers (arginine vasopressin, AVP, and O-GlcNAc transferase, OGT), were explored in medial placentae using real-time qPCR and western blotting. Moreover, gene expression changes were considered for their association with HCC, offspring gender and birthweight. A significant dysregulation of gene expression for the CRH, AVP and HSD11B2 genes was seen in the Index group compared to controls, while OGT and NR3C1 expression remained similar between groups. Placental gene expression of the stress-modulating enzyme 11β-hydroxysteroid dehydrogenase (HSD11B2) was related to both hair cortisol levels (Rho = 0.54; p<0.01) and the sex of the newborn in pregnancies perceived as stressful (Index, p<0.05). Gene expression of CRH correlated with both AVP (Rho = 0.79; p<0.001) and HSD11B2 (Rho = 0.45; p<0.03), and AVP correlated with both HSD11B2 (Rho = 0.6; p<0.005) and NR3C1 (Rho = 0.56; p<0.03) in the Control group but not in the Index group, suggesting a possible loss of interaction in the mechanisms of action of these genes under stressful circumstances during pregnancy.
ARTICLE | doi:10.20944/preprints202211.0437.v3
Subject: Engineering, Civil Engineering Keywords: deep neural network; long short-term memory; suspended sediment; discharge
Online: 16 December 2022 (08:08:08 CET)
The dynamics of suspended sediment involves inherent non-linearity and complexity as a result of the presence of both spatial variability of the basin characteristics and temporal climatic patterns. As a result of this complexity, the conventional sediment rating curve (SRC) and other empirical methods produce inaccurate predictions. Deep neural networks (DNNs) have emerged as one of the advanced modeling techniques capable of addressing inherent non-linearity in hydrological processes over the last few decades. DNN algorithms are used to perform predictive analysis and investigate the interdependencies among the most pivotal water quantity and quality parameters i.e., discharge, suspended sediment concentration (SSC), and turbidity. In this study, the Long short-term memory (LSTM) algorithm of DNNs is used to model the discharge-suspended sediment relationship for the Stony Clove Creek. The simulations were run using primary data on discharge, SSC and turbidity. For the development of the DNN models and examining the effects of input vectors, combinations of different input vectors (namely discharge, and SSC) for the current and previous days are considered. Furthermore, a suitable modelling approach with an appropriate model input structure is suggested based on model performance indices for the training and testing phases. The performance of developed models is assessed using statistical indices such as root mean square error (RMSE), mean absolute error (MAE), and coefficient of determination (R2). Statistically, the performance of DNN-based models in simulating the daily SSC performed well with observed sediment concentration series data. The study demonstrates the suitability of the DNN approach for simulation and estimation of daily SSC, opening up new research avenues for applying hybrid soft computing models in hydrology.
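The sediment rating curve (SRC) baseline that the abstract compares against is a power law, SSC = a·Q^b, conventionally fit by least squares on log-transformed data; a minimal sketch with synthetic values, not the Stony Clove Creek data:

```python
import math

def fit_rating_curve(Q, SSC):
    """Fit SSC = a * Q**b by ordinary least squares on log-transformed data."""
    lx = [math.log(q) for q in Q]
    ly = [math.log(s) for s in SSC]
    n = len(Q)
    mx, my = sum(lx) / n, sum(ly) / n
    # Slope in log space is the exponent b; the intercept recovers log(a)
    b = sum((x - mx) * (y - my) for x, y in zip(lx, ly)) / sum((x - mx) ** 2 for x in lx)
    a = math.exp(my - b * mx)
    return a, b

# Synthetic discharge (m^3/s) and SSC (mg/L) generated from an exact power law
Q = [1.0, 2.0, 4.0, 8.0]
SSC = [5.0 * q ** 1.5 for q in Q]
a, b = fit_rating_curve(Q, SSC)
```

Real discharge–SSC data scatter widely around such a curve (hysteresis, seasonal supply effects), which is precisely the non-linearity the LSTM is brought in to capture.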
ARTICLE | doi:10.20944/preprints202211.0278.v1
Subject: Mathematics & Computer Science, Other Keywords: Long Short-Term Memory; time series forecasting; commodities; technical analysis
Online: 15 November 2022 (07:00:55 CET)
This article presents the implementation of a model to estimate the future price of commodities in the Brazilian market from time series of short-term technical analysis. For this, data from two databases were used: one for the foreign market (opening, maximum, minimum, closing, adjusted closing and volume) and the other for the Brazilian market (the price of the day), considering the commodities sugar, cotton, corn, soybean and wheat. Subsequently, technical indicators were calculated with the TA-Lib technical analysis library. Pearson's correlation coefficient was applied, records with low correlation were removed, and the database was then consolidated. From the pre-processed data, Long Short-Term Memory (LSTM) recurrent neural networks were used to predict prices at one- and three-day intervals. These models were evaluated using the mean squared error (MSE), obtaining results between 0.00010 and 0.00037 on test data one day ahead, and from 0.00017 to 0.00042 three days ahead. Based on the results obtained, it was observed that the developed model achieved promising forecasting performance for all the commodities evaluated. The main contribution is the consolidation of databases that can be used in future scientific research. Furthermore, based on its interpretation, the model can assist in buy and sell decisions for commodities to increase financial gains.
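Pearson's correlation coefficient, used here to drop weakly related indicators before training, can be sketched in plain Python (toy series, not the article's market data):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Toy series: daily closing price vs. a hypothetical technical indicator that tracks it
close = [10.0, 10.5, 10.2, 11.0, 11.4]
indicator = [9.8, 10.4, 10.1, 10.9, 11.5]
r = pearson_r(close, indicator)
```

Indicators whose |r| against the target falls below a chosen threshold would be removed before the LSTM is trained, shrinking the input dimension.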
ARTICLE | doi:10.20944/preprints202208.0170.v1
Subject: Mathematics & Computer Science, Applied Mathematics Keywords: neuron; astrocyte; network; short-term memory; spatial frequency; computational biology
Online: 9 August 2022 (04:04:31 CEST)
Working memory refers to the capability of the nervous system to selectively retain short-term memories in an active state. The long-standing viewpoint is that neurons play an indispensable role and that working memory is encoded by synaptic plasticity. Furthermore, some recent studies have shown that calcium signaling assists memory processes and that working memory might be affected by astrocyte density. Over the last few decades, growing evidence has also revealed that astrocytes exhibit diverse coverage of synapses, which is considered to participate in neuronal activities. However, very little effort has yet been made to shed light on the potential correlations between these observations. Hence, in this article we leverage a computational neuron–astrocyte model to study short-term memory performance under varying astrocytic coverage, and we demonstrate that short-term memory is susceptible to this factor. Our model may also provide plausible hypotheses for the various sizes of calcium events, as they are reckoned to be correlated with astrocytic coverage.
ARTICLE | doi:10.20944/preprints202202.0051.v1
Subject: Life Sciences, Other Keywords: Glioblastoma; survival prediction; Machine Learning; biomarkers; HumanPSDTM; Long-term survivor
Online: 3 February 2022 (12:00:23 CET)
Glioblastoma (GBM) is a very aggressive malignant brain tumor, with the vast majority of patients surviving less than 12 months (short-term survivors [STS]). Only around 2% of patients survive more than 36 months (long-term survivors [LTS]). Studying these extreme survival groups might help in better understanding GBM biology. This work explores the application of machine learning methods in predicting survival groups (STS, LTS). We used age and gene expression profiles belonging to 249 samples from publicly available datasets. Ten machine learning methods were implemented and compared for their performance. A hyperparameter-tuned random forest model performed best, with an accuracy of 80% (AUC of 74% and F1 score of 85%). The performance of this model was validated on external test data of 16 samples; the model predicted the true survival group for 15 samples, achieving an accuracy of 93.75%. This classification model is deployed as a web tool, GlioSurvML. The top 1500 features, which retained classification efficiency (accuracy of 80%, AUC of 74%), were studied for enriched pathways and disease-causal biomarker associations using the HumanPSDTM database. We identified 199 genes as possible biomarkers of GBM and/or similar diseases (such as glioma, astrocytoma, and others); 57 of these genes are shown to be differentially expressed across survival groups and/or to affect survival. This work demonstrates the application of machine learning methods in predicting survival groups of GBM.
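The external-validation figure (15 of 16 correct, i.e. 93.75%) follows directly from the standard accuracy and F1 definitions. A minimal sketch, using fabricated label vectors rather than the study's actual predictions:

```python
# Plain-Python accuracy and F1; the vectors below are invented to reproduce
# a 15/16-correct scenario and are NOT the study's data.

def accuracy(y_true, y_pred):
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def f1_score(y_true, y_pred, positive=1):
    tp = sum(t == p == positive for t, p in zip(y_true, y_pred))
    fp = sum(p == positive and t != positive for t, p in zip(y_true, y_pred))
    fn = sum(t == positive and p != positive for t, p in zip(y_true, y_pred))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0

y_true = [1] * 10 + [0] * 6          # 1 = LTS, 0 = STS (hypothetical labels)
y_pred = [1] * 10 + [0] * 5 + [1]    # one misclassified sample
print(accuracy(y_true, y_pred))      # 0.9375, i.e. 15/16 correct
```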
ARTICLE | doi:10.20944/preprints202109.0026.v1
Subject: Life Sciences, Microbiology Keywords: Cefotaxime; S. haemolyticus; neonates; sub-MIC; biofilms; short-term evolution
Online: 1 September 2021 (14:39:54 CEST)
Critical care of neonates involves substantial usage of antibiotics and exposure to multidrug resistant (MDR) nosocomial pathogens. These pathogens are often exposed to sub-MIC doses of antibiotics, which might result in a range of physiological effects. Therefore, to understand the outcome of sub-inhibitory dosage of antibiotics on Staphylococcus populations, nasal swab specimens were collected from 34 neonates admitted to the Sick Newborn Care Unit between 2017 and 2018; a total of 41 non-repetitive isolates were included in this study. Staphylococcus haemolyticus was the prevalent species (58.54%), with high non-susceptibility to cefotaxime (CTX) (79.16%), gentamicin (87.50%), and meropenem (54.17%). Biofilm-forming abilities of S. haemolyticus isolates in the presence of sub-optimal CTX (30 μg/mL), the predominantly prescribed β-lactam antibiotic, were then determined by crystal violet assays and extracellular DNA (eDNA) quantitation. CTX was found to significantly enhance biofilm production among the non-susceptible isolates (Wilcoxon test, p = 0.000008), with an increase in eDNA levels (Wilcoxon test, p = 0.000004). Additionally, no changes in non-susceptibility were observed among populations of two MDR isolates, JNM56C1 and JNM60C2, after >500 generations of growth in the absence of antibiotic selection in vitro. These findings demonstrate that sub-MIC concentrations of CTX induce biofilm formation and that short-term non-exposure to antibiotics does not alter non-susceptibility among S. haemolyticus isolates.
ARTICLE | doi:10.20944/preprints202107.0252.v1
Online: 12 July 2021 (12:03:06 CEST)
Deep neural networks (DNNs) have made a huge impact in the field of machine learning by providing human-like performance in solving real-world problems such as image processing and natural language processing (NLP). The convolutional neural network (CNN) and the recurrent neural network (RNN) are two typical architectures widely used to solve such problems. Time-sequence-dependent problems are generally very challenging, and RNN architectures have brought enormous improvement to a wide range of machine learning problems involving sequential input. In this paper, different types of RNN architectures are compared, with special focus on two well-known gated RNNs: the Long Short-Term Memory (LSTM) and the Gated Recurrent Unit (GRU). We evaluated these models on the task of force estimation in pouring. In this study, four different models (multi-layer LSTM, multi-layer GRU, single-layer LSTM, and single-layer GRU) were created and trained. The results suggest that the multi-layer GRU outperformed the other three models.
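For readers unfamiliar with the gating that distinguishes a GRU from an LSTM, a single GRU step can be written out in a few lines. This is a generic NumPy sketch with random weights, not one of the paper's trained models:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h, W, U, b):
    """One GRU step. W: (3, hidden, input), U: (3, hidden, hidden), b: (3, hidden)."""
    z = sigmoid(W[0] @ x + U[0] @ h + b[0])              # update gate
    r = sigmoid(W[1] @ x + U[1] @ h + b[1])              # reset gate
    h_tilde = np.tanh(W[2] @ x + U[2] @ (r * h) + b[2])  # candidate state
    return (1.0 - z) * h + z * h_tilde                   # blend old and new state

rng = np.random.default_rng(1)
n_in, n_hid = 4, 8                                   # arbitrary demo sizes
W = rng.normal(0, 0.1, (3, n_hid, n_in))
U = rng.normal(0, 0.1, (3, n_hid, n_hid))
b = np.zeros((3, n_hid))

h = np.zeros(n_hid)
for x in rng.normal(0, 1.0, (10, n_in)):             # a 10-step input sequence
    h = gru_step(x, h, W, U, b)
print(h.shape)                                       # (8,)
```

The GRU's two gates (versus the LSTM's three, plus a separate cell state) are why GRUs train with fewer parameters, one practical reason the comparison above is of interest.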
ARTICLE | doi:10.20944/preprints202002.0177.v3
Subject: Mathematics & Computer Science, Algebra & Number Theory Keywords: bias; simulation; long-term monitoring; Delta Smelt; San Francisco Estuary
Online: 23 June 2021 (11:50:11 CEST)
In fisheries monitoring, catch is assumed to be a product of fishing intensity, catchability, and availability, where availability is defined as the number or biomass of fish present and catchability refers to the relationship between catch rate and the true population. Ecological monitoring programs use catch per unit of effort (CPUE) to standardize catch and monitor changes in fish populations; however, CPUE is proportional to the portion of the population that is vulnerable to the type of gear that is used in sampling, which is not necessarily the entire population. Programs often deal with this problem by assuming that catchability is constant, but if catchability is not constant, it is not possible to separate the effects of catchability and population size using monitoring data alone. This study uses individual-based simulation to separate the effects of changing environmental conditions on catchability and availability in environmental monitoring data. The simulation combines a module for sampling conditions with a module for individual fish behavior to estimate the proportion of available fish that would escape from the sample. The method is applied to the case study of the well-monitored fish species Delta Smelt (Hypomesus transpacificus) in the San Francisco Estuary, where it has been hypothesized that changing water clarity may affect catchability for long-term monitoring studies. Results of this study indicate that given constraints on Delta Smelt swimming ability, it is unlikely that the apparent declines in Delta Smelt abundance are due to an effect of changing water clarity on catchability.
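The escapement idea in the fish-behavior module can be illustrated with a toy individual-based simulation. Every parameter value below (detection distance, swim speed, capture half-width) is invented for the demo and does not come from the study:

```python
import random

def escape_probability(clarity, n_fish=10_000, tow_speed=1.5, seed=42):
    """Fraction of encountered fish that escape: higher clarity lets fish
    detect the gear earlier, giving more time to swim clear (toy model)."""
    rng = random.Random(seed)
    detection_distance = 0.5 + 2.0 * clarity        # m, grows with clarity (assumed)
    half_width = 0.5                                # m, capture-zone half-width (assumed)
    escapes = 0
    for _ in range(n_fish):
        swim_speed = max(0.0, rng.gauss(0.3, 0.1))  # m/s, constrained swimming ability
        time_to_contact = detection_distance / tow_speed
        # the fish escapes if it can clear the capture zone before contact
        if swim_speed * time_to_contact > half_width:
            escapes += 1
    return escapes / n_fish

low = escape_probability(clarity=0.1)
high = escape_probability(clarity=0.9)
print(low < high)    # clearer water -> more escapes in this toy model
```

In the actual study the question is the reverse of this toy's framing: given realistic constraints on Delta Smelt swimming ability, how much can clarity-driven escapement plausibly change catchability at all.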
ARTICLE | doi:10.20944/preprints202106.0011.v1
Subject: Life Sciences, Biochemistry Keywords: long covid; symptom cluster; persistent symptoms; long-term; Mexico; survey
Online: 1 June 2021 (09:44:47 CEST)
Recently, several reports have emerged describing the long-term consequences of COVID-19 that may affect multiple systems, suggesting its chronicity. As further research is needed, we conducted a longitudinal observational study to report the prevalence and associated risk factors of long-term health consequences of COVID-19 by symptom clusters in patients discharged from the Temporary COVID-19 Hospital (TCH) in Mexico City. Self-reported clinical symptom data were collected via telephone calls over 90 days post-discharge. Among 4670 patients discharged from the TCH, we identified 45 symptoms across eight symptom clusters (neurological; mood disorders; systemic; respiratory; musculoskeletal; ear, nose, and throat; dermatological; and gastrointestinal). We observed that the neurological, dermatological, and mood disorder symptom clusters persisted in >30% of patients at 90 days post-discharge. Although most symptoms decreased in frequency between day 30 and day 90, alopecia and the dermatological symptom cluster significantly increased (p < 0.00001). Women were more prone than men to develop long-term symptoms, and invasive mechanical ventilation also increased the frequency of symptoms at 30 days post-discharge. Overall, we observed that symptoms often persisted regardless of disease severity. We hope these findings will help promote public health strategies that ensure equity in the access to solutions focused on the long-term consequences of COVID-19.
ARTICLE | doi:10.20944/preprints202105.0722.v1
Subject: Medicine & Pharmacology, Allergology Keywords: Dementia; multicomponent training; long-term care home; social ethical approach
Online: 31 May 2021 (09:45:37 CEST)
Multicomponent training is recommended for people with dementia living in long-term care homes. Nevertheless, evidence is limited, and people with severe dementia are often excluded from trials. Hence, the aim of this study was to investigate (1) the feasibility of and (2) the requirements for multicomponent training for people with moderate to severe dementia. The study was conducted as an uncontrolled single-arm pilot study with a mixed-methods approach. Fifteen nursing home residents with a mean age of 82 years (range: 75-90 years; female: 64%) and moderate to severe dementia received 16 weeks of multicomponent training. Feasibility and requirements of the training were assessed by a standardized observation protocol. Eleven participants regularly attended the intervention. The highest active participation was observed during gait exercises (64%), the lowest during strength exercises (33%). Participation was supported when exercises were task-specific or related to everyday life. This study confirms that multicomponent training for the target group is (1) feasible and well accepted. To enhance active participation, (2) individual instructions and the implementation of exercises related to everyday life are required. The effectiveness of the adapted training should be tested in future randomized controlled trials.
ARTICLE | doi:10.20944/preprints202102.0325.v1
Subject: Biology, Anatomy & Morphology Keywords: Hazel Grouse; Bohemian Forest; Long-Term Monitoring; Population Trend; TRIM.
Online: 16 February 2021 (13:33:25 CET)
The population dynamics of Hazel Grouse was studied by presence/absence recording at stationary sites along fixed routes (110 km) during 1972-2019 in the central part of the Bohemian Forest (Šumava, Czech Republic). The 100-km² study area covered altitudes between 600 m (Rejstejn) and 1,253 m a.s.l. (Mount Sokol). Our database contained indices of Hazel Grouse occupancy (positive sites/controlled sites) for a yearly increasing number of Hazel Grouse occurrence sites (N = 134) over 48 years. We used a loglinear Poisson-regression method to analyze the long-term population trend for Hazel Grouse in the study area. In the period 1972 to 2006 we found a stable Hazel Grouse population (p = 0.83). From 2006-2007 to 2019, the population index dropped (-3.8% per year, p < 0.05) over the last 13 years. This decline is assumed to be influenced by habitat loss due to succession resulting in older, more open forest stands, and by strongly increasing forestry and the windstorm “Kyrill” followed by clear cutting, bark-beetle damage, and removal of pioneer trees in spruce plantations, which diminished buds and catkins, the dominant winter food. The influence of disturbance by increasing touristic activities and/or predation is discussed. Our results could help to optimize conservation efforts for Hazel Grouse in the Bohemian Forest.
Subject: Biology, Anatomy & Morphology Keywords: zebrafish; Danio rerio; sperm motility; fertilization; short-term storage; extender
Online: 27 November 2020 (10:11:39 CET)
The zebrafish Danio rerio is a suitable model organism for studying gametes. More than 70% of zebrafish spermatozoa were activated because they were contaminated with urine or excrement. The movement of a spermatozoon in water propagated along the flagellum at 16 s after sperm activation, then damped from the end of the flagellum by 35 s, and fully disappeared at 61 s after activation. For artificial fertilization, milt must be added to an extender, which stops the movement of sperm and keeps it motionless until fertilization. E400 was shown to be the most suitable extender, as it allows sperm to be stored for fertilization for 6 to 12 h at 0-2 °C. Sperm motility decreased only to 36% at 12 h post stripping (HPS) with the E400 extender, versus 19% with the Kurokura extender. To achieve an optimal level of fertilization and hatching, a test tube with a well-defined amount of 6,000,000 spermatozoa in E400 extender per 100 eggs and 100 µl of activation solution proved more successful than a Petri dish. The highest fertilization and hatching rates reached 80% and 40-60%, respectively, with milt stored for 1.5 h in E400 extender at 0-2 °C.
CASE REPORT | doi:10.20944/preprints201908.0278.v1
Subject: Medicine & Pharmacology, Oncology & Oncogenics Keywords: FOLFIRINOX; pancreatic ductal adenocarcinoma; surgery; liver metastases; long term survival
Online: 27 August 2019 (05:16:03 CEST)
Metastatic pancreatic ductal adenocarcinoma (PDAC) is characterized by poor prognosis and short survival. Today, the use of new polytherapeutic regimens improves the clinical outcome of these patients, opening new clinical scenarios. A crucial issue with the actual improvement achieved by these new regimens is the occasional possibility of observing a radiological complete response of metastatic lesions in patients with a synchronous primary tumor. What could be the best therapeutic management of these patients? Could surgery be indicated? Herein we report a case of a patient with a PDAC of the head with multiple liver metastases, who underwent first-line chemotherapy with mFOLFIRINOX. After 10 cycles, he achieved a complete radiological response of the liver metastases and a partial response of the pancreatic lesion. A duodenocephalopancreasectomy was performed. Due to liver and lung metastases 8 months after surgery, a second-line therapy was started, with a disease-free survival and overall survival of 8 months and 45 months, respectively. Improvement in the molecular characterization of PDAC could help in the selection of patients suitable for multimodal treatments.
ARTICLE | doi:10.20944/preprints201905.0034.v1
Subject: Biology, Entomology Keywords: long-term; sex ratio; action threshold; pest management; insecticide use
Online: 6 May 2019 (08:19:10 CEST)
A long-term investigation of D. suzukii dynamics in wild blueberry fields from 2012 to 2018 demonstrates that relative abundance is still increasing seven years after the initial invasion. Relative abundance is determined by the physiological date of first detection and air temperatures the previous winter. Date of first detection of flies does not determine the date of fruit infestation. The level of fruit infestation is determined by year, fly pressure, and insecticide application frequency. Frequency of insecticide application is determined by the production system. Non-crop wild fruit and predation influence fly pressure; increased wild fruit abundance results in increased fly pressure. Increased predation rate reduces fly pressure, but only at high abundance of flies or when high levels of wild fruit are present along field edges. The male sex ratio might be declining over the seven years. Action thresholds were developed from samples of 92 fields from 2012 to 2017 that related cumulative adult male trap capture to the following week's likelihood of fruit infestation. A two-parameter gamma density function describing this probability was used to develop a risk-based gradient action threshold system. The action thresholds were validated from 2016 to 2018 in 35 fields and were shown to work well in two of three years (2016 and 2017).
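The two-parameter gamma density underlying the action thresholds has a standard closed form. A stdlib-only sketch with placeholder shape and scale parameters, not the fitted values from the study:

```python
import math

def gamma_pdf(x, shape, scale):
    """Two-parameter gamma probability density: x^(k-1) e^(-x/theta) / (Gamma(k) theta^k)."""
    if x <= 0:
        return 0.0
    return (x ** (shape - 1) * math.exp(-x / scale)
            / (math.gamma(shape) * scale ** shape))

# A fitted curve of this family maps cumulative male trap capture x to
# infestation risk; shape=2, scale=5 here are hypothetical placeholders.
total = sum(gamma_pdf(x, 2.0, 5.0) * 0.01
            for x in (i * 0.01 for i in range(1, 20000)))  # numeric integral over (0, 200)
print(round(total, 3))   # close to 1.0: the density integrates to one
```

A risk-based gradient of thresholds then amounts to reading off the capture levels at which the fitted curve crosses chosen risk levels.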
ARTICLE | doi:10.20944/preprints201808.0105.v1
Subject: Life Sciences, Other Keywords: depression; total protein; elder people; physical function; long-term care
Online: 6 August 2018 (09:41:35 CEST)
Due to its devastating consequences, late-life depression is an important public health problem. The aim of the study was to analyze variables that may potentially influence the risk of depression (GDS-SF), and to study the possible mediating effect of given variables on the relationship between total protein concentration and risk of depression in older adults with chronic diseases and physical function impairment. The research sample included a total of 132 older adults with chronic conditions and physical function impairments remaining under long-term care in a residential environment. A negative linear correlation was observed between patients’ physical functionality, total protein concentration, HDL cholesterol concentration, arm circumference, and the risk of depression. A considerably stronger relationship was observed between total protein concentration and GDS-SF in the elderly suffering from sensory dysfunction (b = −6.42, 95% CI = −11.27; −1.58). The relationship between depression risk and serum total protein concentration appears to be mediated, probably by physical function impairment and low levels of 25(OH)D vitamin. Cohort control research is suggested to confirm this hypothesis.
ARTICLE | doi:10.20944/preprints201806.0295.v1
Subject: Behavioral Sciences, Other Keywords: long-term care, elderly people, behavior assessment, factor analysis, independence
Online: 19 June 2018 (10:59:03 CEST)
The rapid growth of the elderly population is a serious current issue in most countries, affecting them economically through needed medical treatment and healthcare planning. The priority concern is how to reduce the number of elderly people requiring long-term healthcare and raise the number who are able to live independently. This study carried out a behavior assessment of elderly persons' self-reported use of electric scooters and analyzed their degree of acceptance of these assisted-living tools, partly through a factor analysis of our survey instrument. We used this questionnaire survey as our research method, applying SPSS22 software for a factor analysis that revealed five survey facets.
ARTICLE | doi:10.20944/preprints202106.0104.v1
Subject: Earth Sciences, Atmospheric Science Keywords: hydrological research basin; precipitation; temperature; long-term trends; climate change; evapotranspiration
Online: 3 June 2021 (11:35:58 CEST)
While the ongoing climate change is well documented, the impacts exhibit substantial variability, both in direction and magnitude, visible even at regional and local scales. However, knowledge of regional impacts is crucial for the design of mitigation and adaptation measures, particularly when changes in the hydrological cycle are concerned. In this paper we present hydro-meteorological trends based on observations from a hydrological research basin in Eastern Austria between 1979 and 2019. The analysed state variables include air temperature, precipitation, and catchment runoff. Additionally, trends for the catchment evapotranspiration were derived. The analysis shows that while the mean annual temperature was decreasing and annual temperature minima remained constant, the annual maxima were rising. The long-term trends indicate a shift of precipitation to the summer, with minor variations observed for the remaining seasons and at an annual scale. Observed precipitation intensities mainly increased in spring and summer between 1979 and 2019. The catchment evapotranspiration, computed from catchment precipitation and outflow, showed an increasing trend over the observed period.
ARTICLE | doi:10.20944/preprints202102.0401.v1
Subject: Engineering, Automotive Engineering Keywords: Machine learning; Ultrasonic measurements; Long Short-Term Memory; Industrial Digital technologies
Online: 18 February 2021 (09:31:43 CET)
Beer fermentation is typically monitored by periodic sampling and off-line analysis. In-line sensors would remove the need for time-consuming manual operation and provide real-time evaluation of the fermenting media. This work uses a low-cost ultrasonic sensor combined with machine learning to predict the alcohol concentration during beer fermentation. The highest accuracy model (R2=0.952, MAE=0.265, MSE=0.136) used a transmission-based ultrasonic sensing technique along with the measured temperature. However, the second most accurate model (R2=0.948, MAE=0.283, MSE=0.146) used a reflection-based technique without the temperature. Both the reflection-based technique and the omission of the temperature data are novel to this research and demonstrate the potential for a non-invasive sensor to monitor beer fermentation.
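The goodness-of-fit numbers quoted above (R², MAE, MSE) are all functions of the prediction residuals. A minimal sketch on fabricated data, not fermentation measurements:

```python
import numpy as np

def regression_metrics(y_true, y_pred):
    """R^2, MAE, and MSE for predicted vs. measured values."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    residuals = y_true - y_pred
    mse = float(np.mean(residuals ** 2))
    mae = float(np.mean(np.abs(residuals)))
    ss_res = float(np.sum(residuals ** 2))
    ss_tot = float(np.sum((y_true - y_true.mean()) ** 2))
    return {"r2": 1.0 - ss_res / ss_tot, "mae": mae, "mse": mse}

y_true = [0.0, 1.0, 2.0, 3.0, 4.0]     # toy % ABV readings over a fermentation
y_pred = [0.1, 0.9, 2.2, 2.8, 4.1]     # toy model outputs
m = regression_metrics(y_true, y_pred)
print(m)
```

Note that MAE and MSE depend on the units of the target, so the values quoted in the abstract are only comparable between models predicting the same quantity.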
ARTICLE | doi:10.20944/preprints202101.0512.v1
Subject: Biology, Anatomy & Morphology Keywords: fish; functional data analysis; long-term monitoring; habitat; occupancy; modeling; California
Online: 25 January 2021 (15:11:52 CET)
Abundance of estuarine fish species has declined globally. In the San Francisco Estuary (SFE), long-term monitoring documented declines of many species, including the anadromous Longfin Smelt (Spirinchus thaleichthys). To improve management and recovery planning, we identified patterns in the timing, seasonal occupancy, and distribution of Longfin Smelt in a monitoring study (San Francisco Bay Study) for five regions of the SFE using a generalized additive model. We then investigated the year-to-year variability in the shape of the seasonal relationships using functional data analysis (FDA). FDA separated the variability due to population size from variability due to differences in occupancy timing. We found that Longfin Smelt have a consistent seasonal distribution pattern, that two trawl types were needed to accurately describe the pattern, and that the pattern is largely consistent with the hypothesized conceptual model. After accounting for variability in occupancy due to year-class strength, the timing of occupancy has shifted in three regions. The most variable period was age-0 summer for the upstream regions Suisun Bay and Confluence, and age-0 late fall for the downstream region Central Bay. This manifested as a recent delay in the typical fall re-occupation of upstream regions, reducing Longfin Smelt abundance as calculated by another monitoring study (Fall Midwater Trawl); thus, a portion of recent reductions in Fall Midwater Trawl abundance of Longfin Smelt results from changes in behavior rather than a decline in abundance. The presence of multiple monitoring surveys allowed analysis of distribution from one data set to interpret patterns in abundance of another.
Future investigations will examine environmental conditions as covariates during these periods and could improve our understanding of what conditions contribute to the shifting occupancy timing of Longfin Smelt, and possibly provide insight into the long-term quality of the San Francisco Estuary as habitat.
REVIEW | doi:10.20944/preprints202010.0406.v1
Subject: Biology, Anatomy & Morphology Keywords: DHA; Brain; MFSD2a; SPM; Fetus; Placenta; infants; Neurogenesis; Pregnancy; Pre-term
Online: 20 October 2020 (08:37:41 CEST)
Dietary components are important for the structural and functional development of the brain. Among these, docosahexaenoic acid, 22:6n-3 (DHA), is critically required for the structure and development of the growing fetal brain in utero. DHA is the major n-3 long-chain fatty acid in brain gray matter, representing about 15% of all fatty acids in the human frontal cortex. DHA affects neurogenesis, neurotransmitters, synaptic plasticity and transmission, and signal transduction in the brain. Studies in animals and humans show that adequate levels of DHA in neural membranes are important for cortical astrocyte maturation and vascular coupling, and help cortical glucose uptake and metabolism. In addition, specific metabolites of DHA are bioactive molecules that protect brain tissue from oxidative injury and stress. A low DHA level in the brain results in behavioral changes and is associated with learning problems and memory deficits. In humans, the third-trimester placental supply of maternal DHA to the growing fetus is critically important, as the growing brain has an obligatory requirement for DHA during this window period. DHA is also involved in the early placentation process, essential for placental development. This underscores the critical importance of maternal DHA intake for the structural and functional development of the brain. This review describes DHA's multiple roles during gestation and lactation, and the consequences of its lower intake during pregnancy and postnatally for children's brain development and function.
ARTICLE | doi:10.20944/preprints202008.0629.v1
Subject: Medicine & Pharmacology, Nutrition Keywords: Community Health Survey; CHS; PM10 long-term effect; young adults; BMI
Online: 28 August 2020 (09:26:19 CEST)
Background: The associations between long-term exposure to particulate matter (PM) in the residential environment and obesity are comparatively less elucidated among young adults. Methods: Using 2017 Community Health Survey data for participants aged 19–29 in 25 communities in Seoul, the relationship between obesity and long-term PM10 levels of the living district was examined. We defined obesity as overweight (25 ≤ BMI < 30) or obese (30 ≤ BMI) using Body Mass Index (BMI) from self-reported anthropometric information. Analysis was conducted with sampling-weighted logistic regression models, fitting municipal PM10 levels according to individual residence periods of 10 years or more in the current municipality. Socio-demographic factors were adjusted for in all models, and age-specific effects were explored among those aged 19–24 and 25–29. Results: The total study population was 3,655 individuals [1,680 men (46.0%); 1,933 aged 19–24 (52.9%)]. Among the communities with greater levels of PM10 in 2001–2005, associations with obesity increased with residence period: ≥10 years [odds ratio, OR 1.071, 95% confidence interval (CI) 0.969–1.185], ≥15 years [OR 1.118, 95% CI 1.004–1.245], and ≥20 years [OR 1.156, 95% CI 1.032–1.294]. However, decreased associations were detected for PM10 in 2006–2010, and age-specific effects were modified according to the residence period. Conclusions: Although PM10 levels are currently decreasing, higher levels of PM10 exposure in the residential area during earlier life may contribute to increasing obesity among young adults.
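Odds ratios with Wald confidence intervals, as reported above, are obtained by exponentiating a logistic-regression coefficient and the endpoints of its interval. The coefficient and standard error below are hypothetical values chosen for illustration, not the study's estimates:

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """OR = exp(beta); Wald 95% CI = exp(beta +/- 1.96 * se)."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

beta, se = 0.145, 0.058            # hypothetical log-odds and SE per exposure group
or_, lo, hi = odds_ratio_ci(beta, se)
print(round(or_, 3), round(lo, 3), round(hi, 3))
```

A CI whose lower bound crosses 1.0 (as in the ≥10-year group above) indicates the association is not statistically significant at the 5% level, whereas the ≥15- and ≥20-year intervals exclude 1.0.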
REVIEW | doi:10.20944/preprints202008.0346.v1
Subject: Medicine & Pharmacology, Nutrition Keywords: Integrative review; Short-term Calorie Reduction; Fasting; Cancer; Chemotherapy; Calorie Restriction
Online: 15 August 2020 (09:41:11 CEST)
Recent preclinical studies have shown the potential benefits of short-term calorie reduction (SCR) in cancer treatment. In this integrative review, we aimed to identify and synthesize current evidence regarding the feasibility, process, and effects of SCR in cancer patients receiving chemotherapy. PubMed, Cumulative Index to Nursing and Allied Health Literature, Ovid Medline, PsychINFO, and Embase were searched for original research articles using various combinations of Medical Subject Heading terms. Among the 311 articles identified, seven studies met the inclusion criteria. The majority of the reviewed studies were small randomized controlled trials or cohort studies of fair quality. The results suggest that SCR is safe and feasible. SCR is typically arranged around chemotherapy, with a duration ranging from 24 to 96 hours. Most studies examined the protective effects of SCR on normal cells during chemotherapy. The evidence supports that SCR has the potential to enhance both the physical and psychological wellbeing of patients during chemotherapy. SCR is a cost-effective intervention with great potential. Future well-controlled studies with sufficient sample sizes are needed to examine the full and long-term effects of SCR and its mechanism of action.
ARTICLE | doi:10.20944/preprints202007.0719.v1
Subject: Biology, Other Keywords: SARS-CoV-2; long-term; neutralization antibody; lymphocyte functionality; viral pathogenicity.
Online: 30 July 2020 (12:16:21 CEST)
COVID-19 patients can recover with a median SARS-CoV-2 clearance of 20 days post initial symptoms (PIS). However, we observed some COVID-19 patients in whom SARS-CoV-2 persisted for more than 50 days PIS. This study aimed to investigate the cause of the delay in viral clearance and the infectivity in these patients. Demographic data and clinical characteristics of 22 long-term COVID-19 patients were collected. SARS-CoV-2 nucleic acid, peripheral lymphocyte count, and functionality were assessed. SARS-CoV-2-specific and neutralizing antibodies were detected, followed by virus isolation and genome sequencing. The median age of the studied cohort was 59.83±12.94 years. All patients were clinically cured after long-term SARS-CoV-2 infection ranging from 53 to 112 days PIS. Peripheral lymphocyte counts were normal. Interferon gamma (IFN-γ)-generating CD4+ and CD8+ cells were normal, at 24.68±9.60% and 66.41±14.87%, respectively. However, the number of IFN-γ-generating NK cells was diminished (58.03±11.78%). All patients presented detectable IgG, which positively correlated with mild neutralizing activity (ID50=157.2, P=0.05). SARS-CoV-2 was not isolated, and a cytopathic effect was lacking. Only three synonymous variants were identified in spike-protein coding regions. In conclusion, decreased IFN-γ production by NK cells and low neutralizing antibodies might favor long-term SARS-CoV-2 persistence. Further, low viral load and weak viral pathogenicity were observed in COVID-19 patients with long-term SARS-CoV-2 infection.
BRIEF REPORT | doi:10.20944/preprints201912.0009.v1
Subject: Social Sciences, Economics Keywords: Great Recession; health care expenditures; long-term; convergence analysis; Phillips-Sul
Online: 2 December 2019 (10:15:40 CET)
This paper examines whether the Great Recession has altered the disparities of the US regional health care expenditures. We test the null hypothesis of convergence for the US real per capita health expenditure for the period 1980-2014. Our results indicate that the null hypothesis of convergence is clearly rejected for the total sample as well as for the pre-Great Recession period. Thus, no changes are found in this regard. However, we find that the Great Recession has modified the composition of the estimated convergence clubs, offering a much more concentrated picture in 2014 than in 2008, with most of the states included in a big club, and only 5 (Nevada, Utah, Arizona, Colorado and Georgia) exhibiting a different pattern of behavior. These two estimated clubs diverge and, consequently, the disparities in the regional health sector have increased.
ARTICLE | doi:10.20944/preprints201908.0155.v2
Subject: Engineering, Control & Systems Engineering Keywords: Long short-term memory; Brain dynamics; Data-driven modeling; Complex systems
Online: 18 September 2019 (13:05:22 CEST)
Modeling brain dynamics to better understand and control the complex behaviors underlying various cognitive brain functions has been of interest to engineers, mathematicians, and physicists over the last several decades. Motivated by developing computationally efficient models of brain dynamics for use in designing control-theoretic neurostimulation strategies, we have developed a novel data-driven approach in a long short-term memory (LSTM) neural network architecture to predict the temporal dynamics of complex systems over an extended time horizon into the future. In contrast to recent LSTM-based dynamical modeling approaches that make use of multi-layer perceptrons or linear combination layers as output layers, our architecture uses a single fully connected output layer and reversed-order sequence-to-sequence mapping to improve short time-horizon prediction accuracy and to make multi-timestep predictions of dynamical behaviors. We demonstrate the efficacy of our approach in reconstructing the regular-spiking-to-bursting dynamics exhibited by an experimentally validated 9-dimensional Hodgkin-Huxley model of hippocampal CA1 pyramidal neurons. Through simulations, we show that our LSTM neural network can predict the multi-time-scale temporal dynamics underlying various spiking patterns with reasonable accuracy. Moreover, our results show that the predictions improve with increasing predictive time horizon in the multi-timestep deep LSTM neural network.
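The reversed-order sequence-to-sequence mapping mentioned above can be illustrated at the data-preparation level: sliding input windows mapped to multi-timestep target windows, with the input sequence reversed (Sutskever-style) so matching input and output steps sit closer together. This is a generic sketch with arbitrary window lengths, not the paper's implementation:

```python
import numpy as np

def make_windows(series, n_in=8, n_out=4, reverse_inputs=True):
    """Return (inputs, targets) arrays for multi-timestep sequence prediction.
    Each input window is reversed so its most recent sample comes first."""
    X, Y = [], []
    for i in range(len(series) - n_in - n_out + 1):
        x = series[i:i + n_in]
        y = series[i + n_in:i + n_in + n_out]
        X.append(x[::-1] if reverse_inputs else x)
        Y.append(y)
    return np.array(X), np.array(Y)

t = np.linspace(0, 4 * np.pi, 100)
voltage = np.sin(t)                   # toy stand-in for a membrane-voltage trace
X, Y = make_windows(voltage)
print(X.shape, Y.shape)               # (89, 8) (89, 4)
```

Each (X, Y) pair would then feed an LSTM whose single fully connected output layer emits all n_out future steps at once, matching the multi-timestep prediction scheme described above.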
ARTICLE | doi:10.3390/sci1010007.v1
Subject: Keywords: quantum mechanics; EEG; short term memory; astrocytes; neocortical dynamics; vector potential
Online: 11 December 2018 (00:00:00 CET)
Background: Previous papers have developed a statistical mechanics of neocortical interactions (SMNI) fit to short-term memory and EEG data. Adaptive Simulated Annealing (ASA) has been developed to perform fits to such nonlinear stochastic systems. An N-dimensional path-integral algorithm for quantum systems, qPATHINT, has been developed from classical PATHINT. Both fold short-time propagators (distributions or wave functions) over long times. Previous papers applied qPATHINT to two systems, in neocortical interactions and financial options. Objective: In this paper the quantum path-integral for Calcium ions is used to derive a closed-form analytic solution at arbitrary time, which is used to calculate interactions with classical-physics SMNI interactions among scales. Using fits of this SMNI model to EEG data, including these effects, will help determine if this is a reasonable approach. Method: Methods of mathematical physics for optimization and for path integrals in classical and quantum spaces are used for this project. Studies using supercomputer resources tested various dimensions for their scaling limits. Results: The mathematical-physics and computer parts of the study are successful, in that there is modest improvement of cost/objective functions used to fit EEG data using these models. Conclusions: This project points to directions for more detailed calculations using more EEG data and qPATHINT at each time slice to propagate quantum calcium waves, synchronized with PATHINT propagation of classical SMNI.
ARTICLE | doi:10.20944/preprints202301.0502.v1
Subject: Mathematics & Computer Science, Artificial Intelligence & Robotics Keywords: working memory; STDP; short-term plasticity; spiking neural network; flexible cluster formation
Online: 27 January 2023 (10:32:08 CET)
Working memory (WM) is a brain system for short-term storage and manipulation of information and plays an important role in complex cognitive tasks. In the synaptic theory of WM, memorized elements are stored in the form of short-term potentiated connections within a sample population of neurons. In this paper, we show that such populations can be formed through spike-timing-dependent plasticity (STDP), whose phase dependence is associated with the relative timing of the pulses of the interacting neurons. We propose a WM model incorporating two types of plasticity: short-term plasticity and STDP. We show that neuronal clusters encoding items in the WM model can be formed by external stimulation of a group of neurons via STDP mechanisms, and then held and reactivated by short-term plasticity mechanisms. The dynamic formation of neuronal clusters, instead of pre-formed clusters, gives the model additional flexibility.
ARTICLE | doi:10.20944/preprints202211.0361.v1
Subject: Medicine & Pharmacology, Cardiology Keywords: Heart Rate Variability; Inflammatory markers; Long-term Covid-19; Autonomic nervous system.
Online: 21 November 2022 (01:21:37 CET)
Background: Heart rate variability is a non-invasive, measurable, and established test of the autonomic nervous system. Long-term COVID-19 sequelae are unclear; however, acute symptoms have been studied. Objectives: To determine autonomic cardiac differences between long COVID-19 patients and healthy controls, and to evaluate associations among symptoms, comorbidities, and laboratory findings. Methods: This single-center study included long COVID-19 patients and healthy controls. Heart rate variability (HRV), a quantitative marker of autonomic activity, was monitored for 24 h using an ambulatory electrocardiogram system. HRV indices were compared between the case and control groups. Symptom frequency and inflammatory markers were evaluated. A significance level of 5% (p < 0.05) was adopted. Results: A total of 47 long COVID-19 patients were compared to 42 healthy controls. Patients averaged 43.8 (SD 14.8) years of age, and 60.3% were female. In total, 52.5% of patients had had moderate illness. Post-exercise dyspnea was the most common symptom (71.6%), and 53.2% lacked comorbidities. COVID-19 patients had four times the prevalence of dyslipidemia. CNP, D-dimer, and CRP levels were elevated (p = 0.0098, 0.0023, and 0.0015, respectively). The control group had greater SDNN24 and SDANNI (OR = 0.98 (0.97 to 0.99); p = 0.01). Increased low-frequency (LF) indices in COVID-19 patients (OR = 1.002 (1.0001 to 1.004); p = 0.030) and high-frequency (HF) indices in the control group (OR = 0.987 (0.98 to 0.995); p = 0.001) were also associated. Conclusions: Patients with long COVID-19 had lower HF values than healthy individuals. These variations are associated with increased parasympathetic activity, which may be related to long COVID-19 symptoms and inflammatory laboratory findings.
ARTICLE | doi:10.20944/preprints202210.0043.v1
Subject: Mathematics & Computer Science, Information Technology & Data Management Keywords: Commodities; Long Short-Term Memory; Machine Learning; Neural Networks; Prediction; Technical analysis
Online: 5 October 2022 (13:39:16 CEST)
This paper presents the development and implementation of a machine learning model to estimate future commodity prices in the Brazilian market from technical analysis indicators. Two databases were obtained for the commodities sugar, cotton, corn, soybean and wheat, and were submitted to data cleaning, pre-processing and subdivision. From the pre-processed data, recurrent neural networks of the long short-term memory type were used to predict prices 1 and 3 days ahead. These models were evaluated using mean squared error, yielding errors between 0.00010 and 0.00037 on the test data for 1 day ahead and between 0.00015 and 0.00041 for 3 days ahead. Based on these results, the developed model achieved good prediction performance for all commodities evaluated.
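A common way to set up such 1-day- and 3-days-ahead predictions is a sliding window over the price series. The sketch below (plain numpy; the window size, toy series, and the `make_windows`/`mse` helper names are illustrative assumptions, not the paper's actual pre-processing) builds the supervised pairs and evaluates a naive persistence baseline with the same mean-squared-error metric.

```python
import numpy as np

def make_windows(series, window, horizon):
    """Supervised pairs: `window` past values -> the value `horizon` steps ahead."""
    X, y = [], []
    for t in range(len(series) - window - horizon + 1):
        X.append(series[t:t + window])
        y.append(series[t + window + horizon - 1])
    return np.array(X), np.array(y)

def mse(pred, true):
    """Mean squared error, the evaluation metric used above."""
    return float(np.mean((np.asarray(pred) - np.asarray(true)) ** 2))

prices = np.linspace(10.0, 12.0, 30)                 # toy normalized price series
X1, y1 = make_windows(prices, window=5, horizon=1)   # 1 day ahead
X3, y3 = make_windows(prices, window=5, horizon=3)   # 3 days ahead

# Naive persistence baseline: predict the last value seen in each window.
baseline_err = mse(X1[:, -1], y1)
```

An LSTM would replace the persistence prediction with a learned mapping from each window to the target, but the window construction and MSE evaluation stay the same.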
REVIEW | doi:10.20944/preprints202208.0239.v1
Subject: Medicine & Pharmacology, Nursing & Health Studies Keywords: long-term care; healthcare workers; mental health; moral distress; resilience; COVID-19
Online: 12 August 2022 (12:43:46 CEST)
Healthcare workers (HCWs) in long-term care (LTC) faced and continue to experience significant emotional and psychological distress throughout the pandemic. Despite this, little is known about the unique experiences of LTC workers. This scoping review synthesizes existing research on the experiences of HCWs in LTC during the COVID-19 pandemic. Following Arksey and O’Malley’s framework, data were extracted from six databases from inception of the pandemic to June 2022. Among 3,808 articles screened, 40 articles were included in the final analysis. Analyses revealed three interrelated themes: carrying the load (moral distress); building pressure and burning out (emotional exhaustion); and working through it (a sense of duty to care). Given the impacts of the pandemic on both HCW wellbeing and patient care, every effort must be made to address the LTC workforce crisis and evaluate best practices for supporting HCWs experiencing mental health concerns during and post-COVID-19.
ARTICLE | doi:10.20944/preprints202107.0308.v1
Subject: Social Sciences, Accounting Keywords: Relational benefits; calculative and affective commitment; long-term orientation; multi-channel agency
Online: 13 July 2021 (12:21:34 CEST)
Our study provides guidelines on how to build long-term customer relationship in the non-contract mechanism context. More specifically, the findings show that special, social, and core benefits influence calculative commitment, and operational and special benefits influence affective commitment. This study also supports that calculative and affective commitment play a crucial role in understanding multi-channel agencies’ loyalty. In sum, this study revealed that calculative and affective commitment can be considered as partial or full mediators in the relationship between RBs (relational benefits) and loyalty. This study not only contributed to the existing SET (social exchange theory) and RBs paradigm but also provided practical implications for food distribution management.
ARTICLE | doi:10.20944/preprints202012.0809.v1
Subject: Life Sciences, Biochemistry Keywords: Long-term care; care homes; nursing homes; dementia; quality improvement; palliative care
Online: 31 December 2020 (13:16:03 CET)
Important policy developments in dementia and palliative care in nursing homes between 2010 and 2015 in Flanders, Belgium might have influenced which people die in nursing homes and how they die. We aimed to examine differences between 2010 and 2015 in the prevalence and characteristics of residents with dementia in nursing homes in Flanders, and in their palliative care service use and comfort in the last week of life. We used two retrospective epidemiological studies, including 198 residents in 2010 and 183 in 2015, who died with dementia in representative samples of nursing homes in Flanders. We found a 23 percentage-point increase in dementia prevalence (P<0.001), with an 11 percentage-point decrease in severe to very severe cognitive impairment (P=0.04). Controlling for this difference in resident characteristics, in the last week of life there were increases in the use of pain assessment (+20 percentage points; P<0.001) and assistance with eating and drinking (+10 percentage points; P=0.02), but no change in total comfort. The higher prevalence of dementia in nursing homes, with no improvement in residents' total comfort while dying, emphasizes an urgent need to better support nursing homes in improving their capacity to provide timely, high-quality palliative care services to more residents dying with dementia.
REVIEW | doi:10.20944/preprints202010.0597.v1
Subject: Keywords: monarch butterflies; Danaus plexippus; population status; conservation; long-term studies; milkweed limitation
Online: 28 October 2020 (15:32:14 CET)
There are a large number of wildlife and insect species in trouble on this planet, and most believe that monarch butterflies in eastern North America are among them, because of the well-publicized declines of their winter colonies in central Mexico over the last 25 years. A small number of studies over the last decade have cast doubt on this claim by showing that declines are not evident at other stages of the annual cycle. To determine how extensive this pattern is, I conducted an exhaustive review of peer-reviewed and grey literature on (eastern) monarch population censuses and studies conducted across all seasons, and extracted data from these sources to evaluate how monarch abundance has or has not changed over time. I identified 20 collections of data, including butterfly club reports, compilations of citizen-science observations, migration roost censuses, long-term studies of isotopic signatures, and even museum records. These datasets range in duration from 15 years to over 100 years, and I endeavored to update each with information from the most recent years. I also re-examined the winter colony data after incorporating historical records of colony measurements dating back to 1976. This represents the most complete and up-to-date synthesis of information regarding this population. When I examined the long-term trajectory within each dataset, a distinct pattern emerged. Modest declines are evident within the winter colonies (over the full 45-year dataset) and within three censuses conducted during the spring recolonization. Meanwhile, 16 completely separate monitoring studies conducted during the summer and fall (and from varying locations) revealed either no trend at all or, in fact, an increase in abundance. While each of these long-term studies has inherent limitations, the fact that all 16 sources of data show the same pattern is undeniable. Moreover, this evidence is consistent with recently conducted genetic work that shows a lack of decline.
Collectively, these results indicate that despite diminishing winter colonies and spring migrations, monarchs in eastern North America are capable of rebounding fully each year, implying that milkweed is not limiting within their collective range. Moreover, there is no indication from these data that the summer population was ever truly diminished by changing agricultural practices in the Midwest that reduced milkweed in crop fields within that region. It is possible that the larger population is not as dependent on Midwestern agricultural milkweed as once thought, and/or that monarchs are adapting to increasingly human-altered landscapes. These results are timely and should bear on the upcoming USFWS decision on whether the monarch requires federal protection in the United States. Importantly, they argue that despite losses of many insects globally, the eastern North American monarch population is not in the same situation.
Subject: Earth Sciences, Geoinformatics Keywords: precipitation downscaling; convolutional neural networks; long short term memory networks; hydrological simulation
Online: 2 April 2019 (12:37:11 CEST)
Precipitation downscaling is widely employed to enhance the resolution and accuracy of precipitation products from general circulation models (GCMs). In this study, we propose a novel statistical downscaling method to improve the resolution and accuracy of GCM precipitation prediction for monsoon regions. We develop a deep neural network composed of convolutional and Long Short Term Memory (LSTM) recurrent modules to estimate precipitation from well-resolved atmospheric dynamical fields. The proposed model is compared against the GCM precipitation product and classical downscaling methods in the Xiangjiang River Basin in South China. Results show considerable improvement over the ECMWF ERA-Interim reanalysis precipitation. The model also outperforms benchmark downscaling approaches, including (1) quantile mapping, (2) support vector machines, and (3) convolutional neural networks. To test the robustness of the model and its applicability in practical forecasting, we apply the trained network to precipitation prediction forced by retrospective forecasts from the ECMWF model. Compared to the ECMWF precipitation forecast, our model makes better use of the resolved dynamical fields for more accurate precipitation prediction at lead times from 1 day up to 2 weeks. This superiority decreases with forecast lead time, as the GCM's skill in predicting atmospheric dynamics is diminished by chaotic effects. Finally, we build a distributed hydrological model and force it with different sources of precipitation inputs. Hydrological simulation forced with the neural network precipitation estimate shows a significant advantage over simulation forced with the original ERA-Interim precipitation (the NSE value increases from 0.06 to 0.64), and its performance is only slightly worse than simulation forced with observed precipitation (NSE = 0.82). This further proves the value of the proposed downscaling method and suggests its potential for hydrological forecasts.
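Of the benchmark methods mentioned, quantile mapping is simple enough to sketch directly. The following is a generic empirical quantile-mapping routine in numpy (the synthetic gamma-distributed "precipitation" samples, the 101 quantile knots, and the `quantile_map` name are illustrative assumptions, not the study's configuration): each model value is mapped to the observed value at the same empirical quantile of the training climatology.

```python
import numpy as np

def quantile_map(model_values, obs_train, model_train):
    """Empirical quantile mapping: map each model value to the observed value
    at the same empirical quantile (a classical downscaling benchmark)."""
    qs = np.linspace(0.0, 1.0, 101)
    model_q = np.quantile(model_train, qs)   # model climatology quantiles
    obs_q = np.quantile(obs_train, qs)       # observed climatology quantiles
    # Locate each value within the model quantiles, read off the observed value.
    return np.interp(model_values, model_q, obs_q)

rng = np.random.default_rng(1)
model_train = rng.gamma(2.0, 2.0, 1000)      # biased "model" precipitation
obs_train = rng.gamma(2.0, 3.0, 1000)        # wetter "observed" climatology
corrected = quantile_map(model_train, obs_train, model_train)
```

By construction the corrected values inherit the observed distribution's quantiles, which is exactly the bias-correction property that makes quantile mapping a standard baseline for learned downscaling models like the CNN+LSTM above.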
ARTICLE | doi:10.20944/preprints201811.0126.v1
Subject: Engineering, Electrical & Electronic Engineering Keywords: Speech/Music Classification; Enhanced Voice Service; Long Short-Term Memory; Big Data
Online: 5 November 2018 (17:02:36 CET)
Speech/music classification that facilitates optimized signal processing from classification results has been extensively adapted as an essential part of various electronics applications, such as multi-rate audio codecs, automatic speech recognition, and multimedia document indexing. In this paper, a new technique to improve the robustness of speech/music classifier for 3GPP enhanced voice service (EVS) using long short-term memory (LSTM) is proposed. For effective speech/music classification, feature vectors implemented with the LSTM are chosen from the features of the EVS. Experiments show that LSTM-based speech/music classification produces better results than conventional EVS under a variety of conditions and types of speech/music data.
ARTICLE | doi:10.20944/preprints201808.0075.v2
Subject: Biology, Physiology Keywords: Long Term Evolution (LTE); 4G; mobile phone; nociception; pain; thermal pain threshold
Online: 20 August 2018 (15:01:09 CEST)
Although the majority of mobile phone (MP) users do not attribute adverse effects on health or well-being to MP-emitted radiofrequency (RF) electromagnetic fields (EMFs), the exponential increase in the number of RF devices necessitates continuing research aimed at the objective investigation of such concerns. In this work, we investigate the effects of acute exposure from Long Term Evolution (LTE) MP EMFs on thermal pain threshold in healthy young adults. We use a protocol that was validated in a previous study in a capsaicin-induced hyperalgesia model, and was also successfully used to show that exposure from an RF source mimicking a Universal Mobile Telecommunications System (UMTS) MP led to mildly stronger desensitization to repeated noxious thermal stimulation relative to the sham condition. Using the same experimental design, we did not find any effects of LTE exposure on thermal pain threshold. The present results are in accordance with previous evidence suggesting that effects are likely to be placebo/nocebo effects and are unrelated to the brief acute LTE EMF exposure itself. The fact that this is dissimilar to our previous results on UMTS exposure implies that RF modulations might differentially affect pain perception, and points to the necessity of further research in the topic.
ARTICLE | doi:10.20944/preprints201804.0248.v1
Subject: Medicine & Pharmacology, Ophthalmology Keywords: Ab interno trabeculectomy; Trabectome surgery; long-term outcomes; microinvasive glaucoma surgeries; MIGS
Online: 19 April 2018 (09:41:02 CEST)
Purpose: To analyze the five-year results of Trabectome ab interno trabeculectomy at a single glaucoma center. Methods: In this retrospective interventional single-center case series, data from 93 patients undergoing ab interno trabeculotomy between September 2010 and December 2012 were included. Kaplan-Meier analysis was performed using success criteria defined as postoperative intraocular pressure (IOP) ≤21 mm Hg or >20% reduction from preoperative IOP, and no need for further glaucoma surgery. Risk factors for failure were identified using Cox proportional hazards ratios (HR). Results: The retention rate at five years of follow-up was 66%. The cumulative probability of success at 1, 2, 3, 4 and 5 years was 82.6%, 76.7%, 73.9%, 72.3%, and 67.5%, respectively. Risk factors for failure were lower baseline IOP (HR=0.27, P=0.001), younger age (HR=0.25, P=0.02), and higher central corneal thickness (HR=0.18, P=0.01). Pseudoexfoliation was associated with a higher success rate (HR=0.39, P=0.02). IOP decreased significantly from 20.0±5.6 mmHg at baseline to 15.6±4.6 mmHg at 5-year follow-up (P=0.001). The baseline number of glaucoma medications was 1.8±1.2, which decreased to 1.0±1.2 medications at 5 years. Conclusion: Trabectome surgery was associated with a good long-term efficacy and safety profile in this single-center case series with a high retention rate. A higher baseline IOP, older age, thinner cornea, and pseudoexfoliation glaucoma were associated with a higher success rate.
BRIEF REPORT | doi:10.20944/preprints202203.0370.v1
Subject: Medicine & Pharmacology, Nursing & Health Studies Keywords: COVID-19; visitation restrictions; psychological distress; cognitive disfunction; long-term care; rehabilitation ward
Online: 28 March 2022 (15:13:31 CEST)
This report is a narrative of a certified nurse working on a long-term rehabilitation ward for patients with dementia in Japan during the early phase of the COVID-19 pandemic. During this time, visitation restrictions had been implemented to prevent the spread of COVID-19, causing psychological distress for patients and their families which nurses had to cope with. The nurse was interviewed twice, in September–October 2020. The recordings were transcribed verbatim and analysed thematically. Three themes were identified relating to changes in care in response to the pandemic which nurses had to adapt to: the risk of collapse of family members' roles; anxiety caused by patients forgetting family members and family memories; and increased disorientation. During the pandemic, nursing care needs to adapt, ensuring that family attachments and ties continue and minimizing the disruption caused by the pandemic, while ensuring that everyone remains Covid-safe.
ARTICLE | doi:10.20944/preprints202203.0140.v1
Subject: Mathematics & Computer Science, Artificial Intelligence & Robotics Keywords: Left ventricular ejection fraction; Left ventricle segmentation; Convolutional long short-term memory; Echocardiography
Online: 10 March 2022 (04:19:30 CET)
Cardiovascular disease is the leading cause of death worldwide. A key factor in assessing the risk of cardiovascular disease is evaluation of left ventricular function. Left ventricular (LV) systolic function is evaluated by measuring the left ventricular ejection fraction (LVEF) using echocardiography data. Quick and accurate left ventricle segmentation is therefore important for estimating the LVEF. However, it is difficult to segment the left ventricle accurately because its shape and area change over the cardiac cycle. In this study, we propose a framework that accounts for these changes by applying the convolutional long short-term memory (CLSTM) approach. We evaluated the left ventricular segmentation and multidimensional quantification of the proposed system against manual and automated segmentation methods. To assess the validity of CLSTM, the values of multi-dimensional quantification metrics were also compared and analyzed frame by frame using graphs and Bland–Altman plots. We demonstrate that the CLSTM method effectively segments the left ventricle by considering LV activity, and conclude that LV segmentation based on our framework may be used to estimate LVEF values accurately.
ARTICLE | doi:10.20944/preprints202110.0182.v1
Subject: Engineering, Energy & Fuel Technology Keywords: Electricity peak load; Taoussa’s energy sources; Long-term electricity demand planning; Scenarios simulation
Online: 12 October 2021 (12:53:37 CEST)
A long-term forecast study of the electricity demand of the Taoussa area of Mali is conducted in this paper under various scenarios of socioeconomic and technological conditions. The analysis tool applied in the scenario simulations is the Model for Analysis of Energy Demand from the International Atomic Energy Agency. The analysis yields annual electricity demand and peak load forecasts for electrification over the period 2020 to 2035. Over the planning period, the results show that electricity demand will increase to 49.40 MW (332.57 GWh) in the low scenario (LS), 66.46 MW (472.61 GWh) in the reference scenario (RS), and 89.47 MW (635 GWh) in the high scenario (HS). In addition, total electricity demand across all sectors grows at an average rate of 8.13% in the LS, 10.31% in the RS and 12.56% in the HS. Electricity peak demand is expected to grow at 7.92%, 10.53% and 12.91% in the three scenarios respectively; the system peak demand in 2035 will thus increase to 64.88 MW in the LS, 92.2 MW in the RS and 126.22 MW in the HS, with the days of peak load falling between 17 and 23 May. The industry sector will be the biggest electricity consumer of the Taoussa area.
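As a worked example of how the scenario figures relate, compound growth at the reported reference-scenario peak-load rate can be inverted to recover the implied base-year load. The 2020 value is not stated in the abstract, so this is only a consistency sketch using the reported 10.53% annual rate and 92.2 MW 2035 peak:

```python
# Compound-growth cross-check of the reference-scenario peak-load figures.
years = 2035 - 2020        # 15-year planning horizon
rate = 0.1053              # reported average annual peak-demand growth (RS)
end_peak = 92.2            # MW, reference-scenario peak demand in 2035

# Invert end = start * (1 + rate)^years to get the implied 2020 starting load.
start_peak = end_peak / (1 + rate) ** years
print(round(start_peak, 2))   # implied base-year peak load in MW (derived, not from the abstract)
```

The same two-line inversion applies to the LS and HS figures with their respective growth rates.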
ARTICLE | doi:10.20944/preprints202109.0316.v1
Subject: Biology, Physiology Keywords: temporal lobe epilepsy; hippocampus; 4-aminopyridine; epilepsy model; long-term potentiation; AMPA receptor.
Online: 17 September 2021 (12:45:31 CEST)
Even brief epileptic seizures can lead to activity-dependent structural remodeling of neural circuitry. Animal models show that the functional plasticity of synapses and changes in the intrinsic excitability of neurons can be crucial for epileptogenesis. However, the exact mechanisms underlying epileptogenesis remain unclear. We induced epileptiform activity in rat hippocampal slices for 15 min using a 4-aminopyridine (4-AP) in vitro model and observed hippocampal hyperexcitability for at least 1 hour. We tested several possible mechanisms of this hyperexcitability, including changes in intrinsic membrane properties of neurons, presynaptic and postsynaptic alterations. Neither input resistance nor other essential biophysical properties of hippocampal CA1 pyramidal neurons were affected by epileptiform activity. The glutamate release probability also remained unchanged, as the frequency of miniature EPSCs and the paired amplitude ratio of evoked responses did not change after epileptiform activity. However, we found an increase in the AMPA/NMDA ratio, suggesting alterations in the properties of postsynaptic glutamatergic receptors. Thus, the increase in excitability of hippocampal neural networks is realized through postsynaptic mechanisms. In contrast, the intrinsic membrane properties of neurons and the probability of glutamate release from presynaptic terminals are not affected in a 4-AP model.
Subject: Social Sciences, Accounting Keywords: Short-term trading; mean reversion; VIX; SPY; linear stochastic process; MACD; Bollinger Bands
Online: 29 July 2021 (16:24:34 CEST)
One of the key challenges of stock trading is that stock prices follow a random walk, a special case of a stochastic process, and are highly sensitive to new information; a random walk process is difficult to predict in the short term. Many linear process models used to predict financial time series are structural models that provide an important decision boundary, albeit without adequately considering the correlation or causal effect of market sentiment on stock prices. This research seeks to increase the predictive capability of linear process models using the SPDR S&P 500 ETF (SPY) and the CBOE Volatility Index (VIX) as a proxy for market sentiment. Three econometric models are considered for forecasting SPY prices: (i) Auto-Regressive Integrated Moving Average (ARIMA), (ii) Generalized Auto-Regressive Conditional Heteroskedasticity (GARCH), and (iii) Vector Autoregression (VAR). These models are integrated with two technical indicators, Bollinger Bands and Moving Average Convergence Divergence (MACD), with a focus on forecast performance. The profitability of various algorithmic trading strategies based on combinations of these two indicators is compared. This research finds that linear process models that incorporate the VIX Index do not improve the performance of algorithmic trading strategies.
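The two technical indicators used in the study are standard and easy to state precisely. The sketch below implements textbook Bollinger Bands (20-period SMA plus/minus 2 rolling standard deviations) and MACD (12/26-period EMAs with a 9-period signal line) in plain numpy; the random-walk price series and all helper names are illustrative, and this is not the paper's trading code.

```python
import numpy as np

def sma(x, n):
    """Simple moving average over a trailing window of n points."""
    return np.convolve(x, np.ones(n) / n, mode="valid")

def bollinger(x, n=20, k=2.0):
    """Middle band = n-period SMA; upper/lower = middle +/- k rolling stddevs."""
    mid = sma(x, n)
    std = np.array([x[i:i + n].std() for i in range(len(x) - n + 1)])
    return mid - k * std, mid, mid + k * std

def ema(x, n):
    """Exponential moving average with smoothing factor 2/(n+1)."""
    a, out = 2 / (n + 1), [x[0]]
    for v in x[1:]:
        out.append(a * v + (1 - a) * out[-1])
    return np.array(out)

def macd(x, fast=12, slow=26, signal=9):
    """MACD line = EMA(fast) - EMA(slow); signal line = EMA of the MACD line."""
    line = ema(x, fast) - ema(x, slow)
    return line, ema(line, signal)

# Toy random-walk price path standing in for SPY closes.
prices = np.cumsum(np.random.default_rng(2).normal(0, 1, 200)) + 100
lo, mid, hi = bollinger(prices)
line, sig = macd(prices)
```

A mean-reversion rule of the kind studied here would, for example, buy when the price crosses below the lower band and the MACD line crosses above its signal; the paper's finding is that feeding VIX-augmented linear forecasts into such rules did not improve profitability.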
ARTICLE | doi:10.20944/preprints202104.0269.v1
Subject: Keywords: Travel Time Prediction; Deep Learning; Long Short Term Memory Networks; transit; temporal correlation
Online: 9 April 2021 (15:04:06 CEST)
This study introduces a comparative analysis of two deep learning models, multilayer perceptron neural networks (MLP-NN) and long short-term memory networks (LSTMN), for transit travel time prediction. The two models were trained and tested using one year's worth of data for a bus route in Blacksburg, Virginia. Travel time was predicted between each pair of successive stations, to allow the model to be extended to include bus dwell times. Additionally, two further models were developed for each category (MLP or LSTM): one for segments including controlled intersections (controlled segments) and another for segments with no control devices along them (uncontrolled segments). The results show that the LSTM models outperform the MLP models, with an RMSE of 17.69 sec compared to 18.81 sec. When splitting the data into controlled and uncontrolled segments, the RMSE values under the LSTM model fell to 17.33 sec for the controlled segments and 4.28 sec for the uncontrolled segments, compared with 19.39 sec and 4.67 sec, respectively, under the MLP model. These results demonstrate that the uncertainty in traffic conditions introduced by traffic control devices has a significant impact on travel time predictions. Nonetheless, they show that the LSTMN is a promising tool with the ability to account for the temporal correlation within the data. The developed models are thus promising tools for reasonable travel time predictions in transit applications.
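The RMSE figures reported above follow the usual formula, shown here on hypothetical segment travel times. The `rmse` helper and every number below are made up for illustration; `model_a` and `model_b` merely stand in for LSTM-like and MLP-like predictions, not the study's actual outputs.

```python
import numpy as np

def rmse(pred, true):
    """Root mean squared error, in the units of the target (here, seconds)."""
    return float(np.sqrt(np.mean((np.asarray(pred) - np.asarray(true)) ** 2)))

# Hypothetical segment-level travel times in seconds.
true_times = [62, 75, 58, 90, 71]
model_a = [60, 78, 55, 93, 70]        # e.g. LSTM-style predictions
model_b = [55, 82, 50, 97, 66]        # e.g. MLP-style predictions

print(rmse(model_a, true_times), rmse(model_b, true_times))
```

Splitting the evaluation by controlled versus uncontrolled segments, as the study does, simply means computing this same statistic separately over the two subsets of segments.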
ARTICLE | doi:10.20944/preprints202010.0046.v3
Subject: Life Sciences, Biochemistry Keywords: Glioblastoma; master regulators; upstream analysis; IGFBP2; FRA-1; FOSL1; short term survivors; transcription factors
Online: 17 February 2021 (12:58:18 CET)
Only two percent of glioblastoma multiforme (GBM) patients respond to standard care and survive beyond 36 months (long-term survivors, LTS), while the majority survive less than 12 months (short-term survivors, STS). To understand the mechanisms leading to poor survival, we analyzed publicly available datasets of 113 STS and 58 LTS. This analysis revealed 198 differentially expressed genes (DEGs) that characterize aggressive tumor growth and may be responsible for the poor prognosis. These genes belong largely to the GO categories "epithelial to mesenchymal transition" and "response to hypoxia". In this paper we applied an upstream analysis approach that involves state-of-the-art promoter analysis and network analysis of the dysregulated genes potentially responsible for short survival in GBM. Binding sites for transcription factors associated with GBM pathology, such as NANOG, NF-κB, REST, FRA-1, PPARG and seven others, were found to be enriched in the promoters of the dysregulated genes. We reconstructed the gene regulatory network, with several positive feedback loops controlled by the master regulators IGFBP2, VEGFA, VEGF165, PDGFA, AEBP1 and OSMR, which can be proposed as biomarkers and as therapeutic targets for improving GBM prognosis. Critical analysis of this gene regulatory network gives insights into the mechanism of gene regulation by IGFBP2 via several transcription factors, including FRA-1, a key molecule in GBM tumor invasiveness and progression. All observations are validated in independent cohorts, and their impact on overall survival is studied.
ARTICLE | doi:10.20944/preprints201908.0211.v2
Subject: Keywords: Rehabilitation, Stroke, Long-term care, Quality of life, Post-stroke checklist, Unmet needs
Online: 26 August 2019 (12:21:22 CEST)
Background: This study investigated the prevalence of worsening problems using the Post-Stroke Checklist (PSC) at 3, 6, and 12 months post-stroke and their associations with health-related quality of life. Methods: In stroke patients admitted between June 2014 and December 2015, the PSC and EuroQol-5D three-level (EQ-5D-3L) were assessed at post-stroke 3 (n=181), 6 (n=175), and 12 months (n=89). The prevalence of worsening problems and its association with EQ-5D-3L at post-stroke 3 and 6 months were analyzed. Results: An average of 0.59 (range 0-12), 1.47 (range 0-12), and 1.00 (range 0-10) worsening problems per patient was identified at 3, 6, and 12 months after stroke, respectively. The most frequently and continuously identified worsening problems were mood disturbances (reported by 8.8%, 16.0% and 13.5% of patients at 3, 6, and 12 months post-stroke, respectively). Worsening mobility was significantly associated with a worse EQ-5D index at post-stroke 3 months (β, -0.583; 95% CI, -1.045 to -0.120). Worsening of mobility and communication was significantly associated with a worse EQ-5D index at post-stroke 6 months (mobility: β, -0.170; 95% CI, -0.305 to -0.034; communication: β, -0.164; 95% CI, -0.309 to -0.020). Conclusions: The PSC may be useful for the detection of various subjective worsening problems during serial clinical follow-up after stroke. Appropriate rehabilitation and management strategies to solve the identified problems could improve the quality of life of stroke survivors.
Subject: Earth Sciences, Environmental Sciences Keywords: riparian restoration; water quality; vegetation; geomorphological condition assessment; long-term monitoring; aerial imagery
Online: 25 April 2019 (12:50:44 CEST)
Riparian restoration is an important objective for landscape managers seeking to redress the widespread degradation of riparian areas and the ecosystem services they provide. This study investigated the long-term outcomes of ‘one-off’ restoration activities undertaken in the Upper Murrumbidgee Catchment, NSW, Australia. The objective of the restoration was to protect and enhance riparian vegetation and control erosion, and consequently reduce sediment and nutrient delivery into the Murrumbidgee River. To evaluate the outcomes 10 years after restoration, rapid riparian vegetation and geomorphological assessments were undertaken at 29 sites spanning the four different restoration methods used (at least five replicates per treatment), as well as at nine comparable untreated sites. We also trialed the use of aerial imagery to compare width of riparian canopy vegetation and projective foliage cover prior to restoration with that observed after 10 years. Aerial imagery demonstrated the width of riparian canopy vegetation and projective foliage cover increased in all restored sites, especially those with native plantings. The rapid assessment process indicated that 10 years after riparian restoration, the riparian vegetation was in a better condition at treated sites compared to untreated sites. Width of riparian canopy vegetation, native mid-storey cover, native canopy cover and seedling recruitment were significantly greater in treated sites compared to untreated sites. Geomorphological condition of treated sites was significantly better than untreated sites, demonstrating the importance of livestock exclusion to improve bank and channel condition. Our findings illustrate the value of ‘one-off’ restoration activities in achieving long-term benefits for riparian health. 
We have demonstrated that rapid assessments of the vegetation and geomorphological condition can be undertaken post-hoc to determine the long-term outcomes, especially when supported with analysis of historical aerial imagery.
ARTICLE | doi:10.20944/preprints201703.0058.v1
Subject: Mathematics & Computer Science, Other Keywords: Smartphone sensing; mobile-social integration; automatic recognition; social data; long-term health monitoring
Online: 10 March 2017 (17:32:31 CET)
Over the past decades, overweight and obesity have become a global epidemic and a leading risk factor for death. To avoid serious risks, an overweight or obese individual must apply a long-term weight-management strategy to control food intake and physical activity, which is, however, not easy. Recently, with advances in information technology, more and more people can use wearable devices and smartphones to obtain physical activity information, while they can also access various health-related information from Internet online social networks (OSNs). Nevertheless, there is a lack of an integrated approach that combines these two methods in an efficient way. In this paper, we address this issue and propose a novel mobile-social framework for health recognition and recommendation, namely H-Rec2. The main ideas of H-Rec2 are (1) to recognize the individual's health status using the smartphone as a general platform, and (2) to recommend physical activity and food intake based on personal health information, life-science principles, and health-related information obtained from OSNs. To demonstrate the potential of the H-Rec2 framework, we developed a prototype that consists of four components: (1) an activity recognition module that senses physical activity using the accelerometer, (2) a health status modeling module that applies a novel algorithm to generate a personalized health status index, (3) a restaurant information collection module that collects relevant information from OSNs, and (4) a restaurant recommendation module that provides personalized and context-aware recommendations. To evaluate the prototype, we conducted both objective and subjective experiments, which confirm the performance and effectiveness of the proposed system.
ARTICLE | doi:10.20944/preprints201702.0057.v2
Subject: Mathematics & Computer Science, Analysis Keywords: suspension bridges; fourth order wave equation; nonlinear damping; source term; existence; blow up
Online: 24 February 2017 (09:06:57 CET)
In this paper, we consider a fourth-order suspension bridge equation with nonlinear damping term |u_t|^{m-2}u_t and source term |u|^{p-2}u. We give necessary and sufficient conditions for global existence and energy decay results without assuming any relation between m and p. Moreover, when p > m, we give a sufficient condition for finite-time blow-up of solutions. A lower bound on the blow-up time T_max is also established. It is worth mentioning that our results extend the recent results of Wang (J. Math. Anal. Appl., 2014) to the nonlinear damping case.
ARTICLE | doi:10.20944/preprints202201.0107.v1
Subject: Engineering, Energy & Fuel Technology Keywords: Very short term load forecasting; VSTLF; Short term load forecasting; STLF; deep learning; RNN; LSTM; GRU; machine learning; SVR; random forest; extreme gradient boosting; energy consumption; ARIMA; time series prediction.
Online: 10 January 2022 (12:17:35 CET)
Commercial buildings are significant consumers of energy worldwide. Logistics facilities, and specifically warehouses, are a common building type yet are under-researched in the demand-side energy forecasting literature. Warehouses have an idiosyncratic profile compared to other commercial and industrial buildings, with a significant reliance on a small number of energy systems. As such, warehouse owners and operators are increasingly entering into energy performance contracts with energy service companies (ESCOs) to minimise environmental impact, reduce costs, and improve competitiveness. ESCOs and warehouse owners and operators require accurate forecasts of their energy consumption so that precautionary and mitigation measures can be taken. This paper explores the performance of three machine learning models (Support Vector Regression (SVR), Random Forest, and Extreme Gradient Boosting (XGBoost)), three deep learning models (Recurrent Neural Networks (RNN), Long Short-Term Memory (LSTM), and Gated Recurrent Unit (GRU)), and a classical time series model, Autoregressive Integrated Moving Average (ARIMA), for predicting daily energy consumption. The dataset comprises 8,040 records generated over an 11-month period from January to November 2020 from a non-refrigerated logistics facility located in Ireland. The grid search method was used to identify the best configuration for each model. The proposed XGBoost models outperform the other models for both very short-term load forecasting (VSTLF) and short-term load forecasting (STLF); the ARIMA model performed the worst.
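As a rough illustration of the grid search method mentioned above, an exhaustive search over hyperparameter combinations can be sketched as follows; the parameter grid and scoring function here are invented for illustration and are not the paper's actual configuration:

```python
from itertools import product

def grid_search(param_grid, score_fn):
    """Exhaustively evaluate every parameter combination and keep the best.

    param_grid: dict mapping parameter name -> list of candidate values.
    score_fn:   callable(params) -> validation error to minimise (e.g. MSE
                on a held-out split).
    """
    names = sorted(param_grid)
    best_params, best_score = None, float("inf")
    for values in product(*(param_grid[n] for n in names)):
        params = dict(zip(names, values))
        score = score_fn(params)
        if score < best_score:
            best_params, best_score = params, score
    return best_params, best_score

# Toy example: a fake validation error minimised at max_depth=6, learning_rate=0.1.
grid = {"max_depth": [3, 6, 9], "learning_rate": [0.01, 0.1, 0.3]}
best, err = grid_search(grid, lambda p: (p["max_depth"] - 6) ** 2
                                        + (p["learning_rate"] - 0.1) ** 2)
```

In practice the scoring function would train one of the candidate models (XGBoost, LSTM, etc.) with the given parameters and return its validation error.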
ARTICLE | doi:10.20944/preprints202211.0037.v1
Subject: Arts & Humanities, Linguistics Keywords: non-native speech learning; talker variability; phonetically-irrelevant variability; long-term retention; cognitive abilities
Online: 2 November 2022 (03:05:23 CET)
Talker variability has been reported to facilitate generalization and retention of speech learning, but is also shown to place demands on cognitive resources. Our recent study provided evidence that phonetically-irrelevant acoustic variability in single-talker (ST) speech is sufficient to induce equivalent amounts of learning to the use of multiple-talker (MT) training. This study is a follow-up contrasting MT versus ST training with varying degrees of temporal exaggeration to examine how cognitive measures of individual learners may influence the role of input variability in immediate learning and long-term retention. Native Chinese-speaking adults were trained on the English /i/-/ɪ/ contrast. We assessed the trainees’ working memory and selective attention before training. Trained participants showed retention of more native-like cue weighting in both perception and production regardless of talker variability condition. The ST training group showed long-term benefit in word identification, whereas the MT training group did not retain the improvement. The results demonstrate the role of phonetically-irrelevant variability in robust speech learning and modulatory functions of nonlinguistic working memory and selective attention, highlighting the necessity to consider the interaction between input characteristics, task difficulty, and individual differences in cognitive abilities in assessing learning outcomes.
ARTICLE | doi:10.20944/preprints202210.0112.v1
Subject: Engineering, Electrical & Electronic Engineering Keywords: ARIMA; convolutional neural network; Kalman filter; passenger flow; transportation; short-term prediction; stochastic model
Online: 10 October 2022 (03:05:34 CEST)
Passenger flow prediction is very significant for transportation sustainability, given the traffic congestion encountered by road users travelling to offices, schools, or markets early in the day and during closing periods. This problem is peculiar to the transportation system of the Federal University of Technology, Minna, Nigeria. However, the prevailing technique of passenger flow estimation is non-parametric, depends on fixed planning, and is easily affected by noise. In this research, we propose a hybrid intelligent passenger frequency prediction model using the Auto-Regressive Integrated Moving Average (ARIMA) linear model, a Convolutional Neural Network (CNN), and the Kalman Filter Algorithm (KFA). The frequency of passenger arrivals at the bus terminals is obtained and enumerated through closed-circuit television (CCTV) and modelled using the Markovian Queueing Systems Model (MQSM). The ARIMA model was used for learning and prediction, and its results were compared with the combined CNN-KFA technique. The autocorrelation function (ACF) and partial autocorrelation function (PACF) were used to examine the stationarity of data with different features. The performance of the models in describing the short-term passenger flow frequency at each terminal was analyzed and evaluated using Mean Absolute Percentage Error (MAPE) and Mean Squared Error (MSE) values. The CNN-Kalman filter model was fitted to the short-term series, and its MAPE values are below 10%. The MSE shows that the CNN-Kalman filter model has the overall best performance, outperforming the ARIMA model 83.33% of the time, and provides high accuracy in forecasting.
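The ACF used above summarises how strongly a series correlates with lagged copies of itself, which is how stationarity is inspected before fitting ARIMA. A minimal sketch of the sample autocorrelation computation (not the authors' implementation) is:

```python
def acf(series, max_lag):
    """Sample autocorrelation function of a univariate time series.

    Returns a list r[0..max_lag]; r[0] == 1 by construction, and values
    decaying slowly with the lag suggest a non-stationary series.
    """
    n = len(series)
    mean = sum(series) / n
    var = sum((x - mean) ** 2 for x in series)
    out = []
    for lag in range(max_lag + 1):
        cov = sum((series[t] - mean) * (series[t + lag] - mean)
                  for t in range(n - lag))
        out.append(cov / var)
    return out

# Illustrative hourly arrival counts at a terminal (invented numbers).
r = acf([1, 2, 3, 4, 5, 4, 3, 2, 1, 2], max_lag=3)
```

The PACF is computed analogously but with the intermediate lags partialled out; both are provided ready-made by standard time-series libraries.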
ARTICLE | doi:10.20944/preprints202210.0004.v1
Subject: Engineering, Electrical & Electronic Engineering Keywords: Electrical Power Grids; Fault Forecasting; Long Short-Term Memory; Time Series Forecasting; Wavelet Transform
Online: 3 October 2022 (10:36:14 CEST)
The electric power distribution utility is responsible for providing energy to consumers in a continuous and stable way; failures in the electrical power system reduce the reliability indexes of the grid, directly harming its performance. For this reason, failure prediction is needed so that power can be re-established in the shortest possible time. Considering an evaluation of the number of failures over time, this paper performs failure prediction for the first year of the pandemic in Brazil (2020) to verify the feasibility of using time series forecasting models for fault prediction. The Long Short-Term Memory (LSTM) model is evaluated to obtain forecast results that the electric power utility can use to organize its maintenance teams. The Wavelet transform proved promising in improving the predictive ability of the LSTM, making the Wavelet-LSTM model suitable for the study at hand. The results show that the proposed approach yields better results in the evaluation of prediction error and is robust under statistical analysis.
ARTICLE | doi:10.20944/preprints202206.0279.v1
Subject: Mathematics & Computer Science, Artificial Intelligence & Robotics Keywords: Long Term Evolution; Radio Resource Management; Packet Scheduling; Cognitive Radio; Multi agent Qlearning; Matlab
Online: 21 June 2022 (03:57:22 CEST)
In this paper, we propose, implement, and test two novel downlink LTE scheduling algorithms. The algorithms were implemented and tested in Matlab, and they are based on Reinforcement Learning, specifically the Q-learning technique, for scheduling two types of users. The first algorithm is called the Collaborative scheduling algorithm, and the second the Competitive scheduling algorithm. The first type of scheduled user is the Primary User: a licensed subscriber who pays for the service. The second type is the Secondary User: an unlicensed subscriber who does not pay for the service, a device-to-device communication, or a sensor. Each user, whether primary or secondary, is treated as an agent. In the Collaborative scheduling algorithm, the primary user agents collaborate to make a joint scheduling decision about allocating the resource blocks among them, and then the secondary user agents compete among themselves for the remaining resource blocks. In the Competitive scheduling algorithm, the primary user agents compete among themselves over the available resources, and then the secondary user agents compete among themselves over the remaining resources. Experimental results show that both scheduling algorithms converged to almost ninety percent utilization of the spectrum and provided fair shares of the spectrum among users.
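At the core of both algorithms is the tabular Q-learning value update. The sketch below, with invented states and resource-block actions, illustrates one such step for a single agent; the paper's actual Matlab implementation and state design will differ:

```python
import random

def q_update(Q, state, action, reward, next_state, alpha=0.1, gamma=0.9):
    """One tabular Q-learning step: move Q(s, a) toward the TD target
    reward + gamma * max_a' Q(s', a')."""
    best_next = max(Q[next_state].values()) if Q.get(next_state) else 0.0
    Q[state][action] += alpha * (reward + gamma * best_next - Q[state][action])

def epsilon_greedy(Q, state, epsilon=0.1):
    """Pick a random action with probability epsilon, else the current best."""
    if random.random() < epsilon:
        return random.choice(list(Q[state]))
    return max(Q[state], key=Q[state].get)

# Hypothetical agent: actions are candidate resource blocks ("rb0", "rb1").
Q = {"idle": {"rb0": 0.0, "rb1": 0.0}, "served": {"stay": 0.0}}
q_update(Q, "idle", "rb1", reward=1.0, next_state="served")
action = epsilon_greedy(Q, "idle", epsilon=0.0)  # pure exploitation
```

Repeated play of such updates by the primary and secondary user agents, with rewards reflecting successful resource-block use, is what drives the reported convergence toward high spectrum utilization.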
ARTICLE | doi:10.20944/preprints202108.0569.v1
Subject: Behavioral Sciences, Cognitive & Experimental Psychology Keywords: visual short-term memory; repetitive transcranial magnetic stimulation; visual memory precision; serial memory effects
Online: 31 August 2021 (11:43:33 CEST)
We investigated the role of the human middle temporal complex (hMT+) in the memory encoding and storage of a sequence of four coherently moving random-dot kinematograms (RDKs) by applying repetitive transcranial magnetic stimulation (rTMS) during an early or late phase of the retention interval. Moreover, in a second experiment we tested whether disrupting the functional integrity of hMT+ during the early phase impaired the precision of the encoded motion directions. Overall, the results showed that both recognition accuracy and precision were worse at middle serial positions, suggesting the occurrence of primacy and recency effects. We found that rTMS delivered during the early (but not the late) phase of the retention interval impaired not only recognition of RDKs but also the precision of the retained motion direction. However, such impairment occurred only for RDKs presented at middle positions in the sequence, where performance was already closer to chance level. Altogether, these findings suggest an involvement of hMT+ in the memory encoding of visual motion direction. Given that both sequence position and rTMS modulated not only recognition but also the precision of the stored information, these findings support a model of visual short-term memory with a variable resolution for each stored item, consistent with the assigned amount of memory resources, in which such item-specific memory resolution is supported by the functional integrity of area hMT+.
Subject: Engineering, Automotive Engineering Keywords: Computational fluid dynamic; Long short term memory; Vortex bladeless wind turbine; Prediction; Correlation matrix.
Online: 9 June 2021 (07:38:21 CEST)
Energy harvesting from wind turbines has been explored by researchers for more than a century, from conventional turbines up to the latest bladeless turbines. Among these bladeless turbines, the vortex bladeless wind turbine (VBT) harvests energy from the oscillation of the turbine body. Given the novelty of this field and the widespread research around the world, one of the most important issues is to optimize and predict the produced power. To enhance the output electrical power of the VBT, the fluid-solid interactions (FSI) were analyzed to collect a dataset for the prediction procedure. The long short-term memory (LSTM) method was used to predict the produced power of the VBT from the collected data. LSTM was chosen from among the various artificial neural network methods because the parameters of the VBT study are all time-dependent and LSTM is one of the most suitable algorithms for predicting time series data. To find the relationships between the parameters and variables used in this research, a correlation matrix is presented. With a root mean square error (RMSE) of 0.3, a comparative analysis between the simulation results and their prediction shows that the LSTM method is very accurate for this type of research. Furthermore, the LSTM method significantly reduced the computation time: the prediction time for the desired values was reduced from an average of two and a half hours to two minutes. One of the most important achievements of this study is a suggested mathematical relation for VBT output power, which helps extend the results to different sizes of VBT over a wide range of parameter variations.
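Because LSTM-style models consume fixed-length input windows, time-dependent data such as the VBT parameters must first be framed as supervised (window, next value) pairs. A generic sketch of this framing step, with invented numbers and not the study's actual code, is:

```python
def make_windows(series, n_lags):
    """Turn a univariate series into (input window, next value) pairs,
    the supervised framing on which an LSTM-style model is trained."""
    X, y = [], []
    for t in range(len(series) - n_lags):
        X.append(series[t:t + n_lags])   # the last n_lags observations
        y.append(series[t + n_lags])     # the value to predict
    return X, y

# Illustrative power readings over consecutive time steps.
X, y = make_windows([10, 11, 13, 12, 14, 15], n_lags=3)
```

Each pair (X[i], y[i]) then becomes one training sample; at prediction time the model receives the most recent window and emits the next value.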
ARTICLE | doi:10.20944/preprints202103.0302.v2
Subject: Mathematics & Computer Science, Algebra & Number Theory Keywords: Searaser; Flow-3D; Prediction; Long short term memory; deep neural network; Root mean square error.
Online: 13 April 2021 (09:51:25 CEST)
Accurate forecasts of ocean wave energy can not only reduce investment costs but are also essential for the management and operation of electrical power. This paper presents an innovative approach based on Long Short-Term Memory (LSTM) to predict the power generation of an economical wave energy converter named “Searaser”. The data for analysis were obtained by combining experimental data collected in another study with data extracted from a numerical simulation of Searaser. The simulation was done with the Flow-3D software, which has high capability in analyzing fluid-solid interactions. The lack of a relation between wind speed and output power in previous studies needed to be investigated; therefore, in this study the wind speed and output power are related using an LSTM method. Moreover, it can be inferred that the LSTM network is able to predict power as a function of wave height more accurately and faster than the numerical solution. The network output figures show good agreement, and the mean root mean square error of 0.49 reflects the accuracy of the LSTM method. Furthermore, a mathematical relation between the generated power and wave height was introduced by fitting a power function to the results of the LSTM method.
ARTICLE | doi:10.20944/preprints202104.0302.v1
Subject: Medicine & Pharmacology, Allergology Keywords: carbon-ion radiotherapy; uterine cervical cancer; adenocarcinoma; long-term follow-up; cisplatin; concurrent chemoradiotherapy
Online: 12 April 2021 (12:51:36 CEST)
The clinical significance of carbon-ion radiotherapy (CIRT) for adenocarcinoma (AC) of the uterine cervix has been assessed in several single-institutional studies. To validate the significance, we conducted a multi-institutional survey of CIRT for locally advanced AC (LAAC) of the uterine cervix. We retrospectively analyzed the clinical outcomes of patients with stage IIB–IVA LAAC of the uterine cervix who underwent chemo-CIRT or CIRT alone between April 2010 and April 2016. Patients received 74.4 Gy (relative biological effectiveness [RBE]) in 20 fractions of CIRT, or 55.2 Gy (RBE) in 16 fractions of CIRT plus 3 sessions of brachytherapy. Patients aged ≤70 years with adequate bone marrow and organ function were administered cisplatin weekly (40 mg/m2 per week for up to 5 weeks). Fifty-five patients were enrolled in this study. The median follow-up period was 67.5 months. The 5-year overall survival (OS) and local control (LC) rates were 68.6% and 65.2%, respectively. Multivariate analysis showed that the initial tumor response within 6 months was significantly associated with LC and OS. The present study demonstrates promising outcomes of CIRT or chemo-CIRT for LAAC of the uterine cervix, especially in cases showing initial rapid regression of the tumor.
ARTICLE | doi:10.20944/preprints202011.0205.v1
Subject: Engineering, Other Keywords: Neural Networks; Long-Short Term Model; Water demand; Forecasting; Sustainable development goals; Water Goal.
Online: 5 November 2020 (10:17:30 CET)
Climate change has become the greatest threat to the survival of the world and its ecosystems. With irreversible impacts on ecosystems, problems such as sea level rise, food insecurity, natural resource scarcity, and seasonal disorders have increased over the past few years. Among these problems, water scarcity due to the lack of water resources and global warming has plagued several nations. Owing to the rising concerns over water scarcity, the United Nations (UN) has acknowledged water as a primary resource for the development of societies under the ‘Water Goal’ of the Sustainable Development Goals. The changing climate and the intermittent availability of water resources pose major challenges to demand forecasting, especially in countries like the United Arab Emirates (UAE), which has one of the highest per capita residential water consumption rates in the world. Therefore, the aim of this study is to propose an accurate water demand forecasting technique that incorporates all significant factors to predict the future water demand of the UAE. The forecasting model used is the Long Short-Term Memory (LSTM) network, with the factors considered being mean temperature, mean rainfall, relative humidity, Gross Domestic Product (GDP), Consumer Price Index (CPI), and population growth. The LSTM model predicts that the future water demand of the UAE will decrease from 1821 million m3 in 2018 to 1809.9 million m3 in 2027.
ARTICLE | doi:10.20944/preprints202009.0176.v1
Subject: Biology, Agricultural Sciences & Agronomy Keywords: soil health; soil organic matter; greenhouse gases; climatic change scenarios; Chernozems; long-term experiment
Online: 8 September 2020 (06:11:53 CEST)
Organic carbon (OC) accumulation in soil mitigates greenhouse gas emissions and improves soil health. We aimed to quantify the dynamics of the OC stock in soils and to identify technologies that allow increasing the OC stock in the arable soil layer by 4‰ annually. We based the study on a field experiment established in 1936 in a 9-field crop rotation with a fallow on Chernozem in European Russia. RothC version 26.3 was used for reproducing and forecasting OC dynamics. Under all fertilizer applications on the FYM background, there was a decrease in the OC stock with a preferential loss of active OC, except for the period 1964–71, with a 2–5‰ annual OC increase. The model estimated the annual C input into the arable soil layer at 1,900 kg·ha-1. To increase OC stocks by 4‰ per year, the input should be raised to 2,400 kg·ha-1. Simulations were made for 2016–2090 using the climate scenarios RCP4.5 and RCP8.5. Crop rotation without fallowing provided an initial increase of 3‰ and 6‰ of stocks under the RCP8.5 and RCP4.5 scenarios, respectively, followed by a loss of the accumulated OC. The simulations demonstrate the difficulty of increasing OC concentration in Chernozems under intensive farming and the potential capacity to raise the OC stock through yield management.
ARTICLE | doi:10.20944/preprints202001.0295.v1
Subject: Life Sciences, Virology Keywords: Hepatitis B virus; hepatocyte nuclear factor 4 alpha; long-term infection; ERK signaling pathway
Online: 25 January 2020 (15:25:57 CET)
Hepatitis B virus (HBV) infection is a major factor in the development of various liver diseases such as hepatocellular carcinoma (HCC). Among HBV-encoded proteins, the HBV X protein (HBx) is known to play a key role in the development of HCC. Hepatocyte nuclear factor 4α (HNF4α) is a nuclear transcription factor critical for hepatocyte differentiation. However, its expression level as well as its regulatory mechanism in HBV infection have yet to be clarified. Here, we observed suppression of HNF4α in cells stably expressing the HBV whole genome or the HBx protein alone, while transient transfection of an HBV replicon or HBx plasmid had no effect on the HNF4α level. Importantly, in the stable HBV- or HBx-expressing hepatocytes, the downregulated level of HNF4α was restored by inhibiting the ERK signaling pathway. Our data showed that HNF4α was suppressed during long-term HBV infection in cultured HepG2-NTCP cells as well as in mice following hydrodynamic injection of pAAV-HBV or intravenous infection with rAAV-HBV. Importantly, HNF4α downregulation increased cell proliferation, which contributed to tumor formation and development in xenograft nude mice. The data presented here provide several lines of evidence for the effect of HBV infection in manipulating the HNF4α regulatory pathway in HCC development.
ARTICLE | doi:10.20944/preprints201807.0019.v1
Subject: Mathematics & Computer Science, Applied Mathematics Keywords: Clustering; Forecasting; Hierarchical Time-Series; Individual Electrical Consumers; Scalable; Short Term; Smart Meters; Wavelets
Online: 2 July 2018 (17:43:29 CEST)
Smart grids require flexible data-driven forecasting methods. We propose clustering tools for bottom-up short-term load forecasting. We focus on individual consumption data analysis, which plays a major role in energy management and electricity load forecasting. The first two sections are dedicated to the industrial context and a review of individual electrical data analysis. We are interested in hierarchical time series for bottom-up forecasting. The idea is to disaggregate the signal in such a way that the sum of disaggregated forecasts improves on the direct prediction. The 3-step strategy defines numerous super-consumers by curve clustering, builds a hierarchy of partitions, and selects the best one by minimizing a forecast criterion. Using a nonparametric model for forecasting, and wavelets to define various notions of similarity between load curves, this disaggregation strategy applied to French individual consumers leads to a gain of 16% in forecast accuracy. We then explore the upscaling capacity of this strategy for massive data and implement proposals using R, the free software environment for statistical computing. The proposed solutions to make the algorithm scalable combine data storage, parallel computing, and a double clustering step to define the super-consumers.
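One simple wavelet-based notion of similarity between load curves can be built from Haar coefficients. The sketch below (in Python rather than the paper's R, and using the unnormalised averaging variant of a single-level Haar transform, so it is illustrative only) shows the decomposition step:

```python
def haar_dwt(signal):
    """One level of the Haar discrete wavelet transform (averaging variant).

    Splits a load curve of even length into approximation coefficients
    (pairwise means, the smooth trend) and detail coefficients (pairwise
    half-differences, the local fluctuations). Distances between detail
    coefficients give one notion of similarity between consumers' curves.
    """
    half = len(signal) // 2
    approx = [(signal[2 * i] + signal[2 * i + 1]) / 2 for i in range(half)]
    detail = [(signal[2 * i] - signal[2 * i + 1]) / 2 for i in range(half)]
    return approx, detail

# Illustrative 4-point load curve.
a, d = haar_dwt([4, 2, 5, 7])
```

Applying the transform recursively to the approximation coefficients yields a multi-resolution view of each curve, on which clustering into super-consumers can operate.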
ARTICLE | doi:10.20944/preprints201801.0097.v1
Subject: Engineering, Electrical & Electronic Engineering Keywords: deep learning; automatic modulation classification; classifier fusion; convolutional neural network; long short-term memory
Online: 11 January 2018 (04:47:00 CET)
Deep learning has recently attracted much attention due to its excellent performance in processing audio, image, and video data. However, few studies have been devoted to the field of automatic modulation classification (AMC). It is one of the most well-known research topics in communication signal recognition and remains challenging for traditional methods due to complex disturbance from other sources. This paper proposes a heterogeneous deep model fusion (HDMF) method to solve the problem in a unified framework. The contributions include: 1) the convolutional neural network (CNN) and long short-term memory (LSTM) are combined in two different ways without prior knowledge involved; 2) a large database, including eleven types of single-carrier modulation signals with various noises as well as a fading channel, is collected at various signal-to-noise ratios (SNRs) based on a real geographical environment; and 3) experimental results demonstrate that HDMF is highly capable of coping with the AMC problem and achieves much better performance than the independent networks. The source code and the database will be publicly available.
ARTICLE | doi:10.20944/preprints202208.0124.v1
Subject: Social Sciences, Education Studies Keywords: Digital Systems; Educational Systems; State-space Models; Optimal Control; Long-term learning prediction; Learning Analytics
Online: 5 August 2022 (14:37:36 CEST)
Every month, teachers face the dilemma of which exercises their students should practice and what the consequences are for long-term learning. Since teachers prefer to pose their own exercises, this generates a large number of questions, each one attempted by a small number of students. Thus, we could not use models based on big data, such as deep learning. Instead, we developed a simple-to-understand state-space model that predicts end-of-year national test scores. We used 2,386 online fourth-grade mathematics questions designed by teachers, each attempted by some of the 500 students in 24 low-socioeconomic-status schools. We found that the state-space model predictions improved month by month and that in most months they outperformed linear regression models. Moreover, the state-space estimator provides, for each month, a direct mechanism to simulate different practice strategies and compute their impact on the end-of-year standardized national test. We built iso-impact curves based on two critical variables: the number of questions solved correctly on the first attempt and the total number of exercises attempted. This allows the teacher to visualize the trade-off between asking students to do exercises more carefully or to do more exercises. To the best of our knowledge, this model is the first of its kind in education. It is a novel tool that helps teachers drive whole classes to achieve long-term learning targets.
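A state-space estimator of this kind refines a latent estimate as each month's data arrives. The scalar Kalman-style sketch below, with invented numbers and identity dynamics rather than the paper's actual model, shows the predict/update cycle:

```python
def kalman_step(x, P, z, R, Q=0.0):
    """One predict/update cycle of a scalar Kalman filter with identity
    dynamics: the state estimate x (e.g. a student's latent skill) is
    refined by each new noisy observation z (e.g. a monthly score)."""
    P = P + Q             # predict: variance grows by the process noise
    K = P / (P + R)       # Kalman gain: how much to trust the observation
    x = x + K * (z - x)   # update: move the estimate toward the observation
    P = (1 - K) * P       # posterior variance shrinks after the update
    return x, P

x, P = 0.5, 1.0                 # vague prior about the latent skill
for z in [0.7, 0.8, 0.75]:      # three months of noisy observations
    x, P = kalman_step(x, P, z, R=0.5)
```

With each month the variance P shrinks, which mirrors the reported month-by-month improvement of the predictions; simulating a practice strategy amounts to feeding the filter hypothetical future observations.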
ARTICLE | doi:10.20944/preprints202110.0237.v1
Subject: Mathematics & Computer Science, Information Technology & Data Management Keywords: Software reliability; deep learning; long short-term memory; project similarity and clustering; cross-project prediction
Online: 18 October 2021 (10:33:39 CEST)
Software reliability is an important characteristic for ensuring the quality of software products. Predicting the potential number of bugs from the beginning of a development project allows practitioners to make appropriate decisions regarding testing activities. In the initial development phases, applying traditional software reliability growth models (SRGMs) with limited past data does not always provide reliable prediction results for decision making. To overcome this, we propose a new software reliability modeling method called the deep cross-project software reliability growth model (DC-SRGM). DC-SRGM is a cross-project prediction method that uses features of previous projects' data through project similarity. Specifically, the proposed method applies cluster-based project selection for the training data source and modeling by a deep learning method. Experiments involving 15 real datasets from a company and 11 open source software datasets show that DC-SRGM describes the reliability of ongoing development projects more precisely than existing traditional SRGMs and the LSTM model.
ARTICLE | doi:10.20944/preprints202102.0011.v1
Subject: Medicine & Pharmacology, Allergology Keywords: Elderly with dementia; needs; utilization; essential care service package; long-term care system; health policy
Online: 1 February 2021 (11:24:37 CET)
Alzheimer’s disease and related dementias (ADRD) remain a public health challenge in developing countries. We developed a needs-based essential care service package (ECSP) for care planning for persons living with dementia (PLWD) using a cross-sectional survey of PLWD in institutions in six cities in China (n = 1,299). Face-to-face interviews were conducted with caregivers of PLWD by trained staff between 2018 and 2019. Care service needs and utilization by level of cognitive impairment were summarized. The average age of PLWD was 80.7 years; 76% of participants had severe cognitive impairment. A needs-based ECSP with 30 service items would be sufficient to support care services for PLWD in China, of which seven items are core care. The selection plan for the ECSP at different levels is designed as “General Care Services + Selective Care Services”, in which the service items for low-, mid-, and high-level care for PLWD are 7+3, 7+6, and 7+10, respectively. The findings provide the first large-scale data on the service needs and utilization of PLWD in mainland China. The ECSP for PLWD advanced in this paper is a practicable and effective quantitative management tool and deserves application at a large scale.
Subject: Social Sciences, Accounting Keywords: generational responsibility; sustainable consumption; economic crises; long-term orientation; collectivism; corporate social responsibility; competitive strategies
Online: 24 December 2020 (14:23:51 CET)
The rise of Asian and the stagnation of Western middle classes over the last thirty years have resulted in a gradual convergence of income across large parts of the world’s population. Recent global crises, the Great Recession and the COVID-19 pandemic, have led to a decline in income and an increase in income uncertainty. The resulting rise in consumption of lower-quality goods of shorter durability, together with an overall decline in demand and economic activity, has challenged the global economy. In this paper, we argue that generational responsibility in consumption can be an environmentally sustainable response to crises, one that enables economies to overcome the crisis of confidence and reaffirms community ties. As an element of long-term orientation in consumption, generational responsibility is a cultural phenomenon dependent on solidarity within the family and the wider community. It is characterized by consideration of the consequences of consumption choices for the environment, and by the abundance of savings and the usability of goods to be inherited by future generations. For companies willing to revisit their traditional business models and incorporate principles of sustainability into their competitive strategies, the promotion of generational responsibility can become a new source of competitive advantage and a driver of economic recovery.
ARTICLE | doi:10.20944/preprints202012.0527.v1
Subject: Mathematics & Computer Science, Algebra & Number Theory Keywords: epilepsy; seizure detection; electroencephalography; classification with a deferral option; home monitoring; long-term monitoring; wearables
Online: 21 December 2020 (13:40:43 CET)
Wearable technology will soon allow prolonged electroencephalography (EEG) monitoring in the home environment of patients with epilepsy. Neurologists analyse the EEG visually and annotate all seizures, which patients often underreport. Visual analysis of a 24-hour EEG recording typically takes one to two hours. Reliable automated seizure detection algorithms will be crucial to reduce this analysis time. We study a dataset of behind-the-ear EEG measurements. Our first aim was to develop a methodology to reduce the EEG dataset by classifying part of the data automatically, while retaining 100% detection sensitivity (DS). Prediction confidences are determined by temperature scaling of the classification model outputs and by trust scores. A DS of approximately 90% (99%) can be achieved when automatically classifying around 90% (60%) of the data; perfect DS can be achieved when automatically classifying 50% of the data. Our second contribution demonstrates that a common modelling strategy, in which predictions from several short EEG segments are combined into a final prediction, can be improved by filtering out untrustworthy segments with low trust scores. The false detection rate shows a relative decrease of between 21% and 43%, while the DS shows only a small increase or decrease.
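The deferral scheme described combines temperature scaling with trust scores. A minimal sketch of the confidence-threshold idea using temperature-scaled softmax alone follows; the logits, temperature, and threshold below are hypothetical, not the paper's calibrated values:

```python
import math

def softmax_with_temperature(logits, T):
    """Temperature-scaled softmax; T > 1 softens overconfident outputs."""
    exps = [math.exp(z / T) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def classify_or_defer(logits, T=2.0, threshold=0.9):
    """Classify a segment automatically only when the top calibrated
    probability clears the threshold; otherwise defer it for visual
    review by the neurologist."""
    probs = softmax_with_temperature(logits, T)
    p_max = max(probs)
    if p_max >= threshold:
        return probs.index(p_max)   # automatic label (0 or 1)
    return None                     # deferred segment

# Hypothetical per-segment logits (non-seizure vs. seizure):
segments = [[4.0, -2.0], [0.3, 0.1], [-3.5, 3.0]]
decisions = [classify_or_defer(z) for z in segments]
```

Only the middle, ambiguous segment is deferred; raising the threshold shifts the trade-off toward reviewing more data but keeping detection sensitivity high.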
ARTICLE | doi:10.20944/preprints202012.0133.v1
Subject: Biology, Anatomy & Morphology Keywords: soil organic carbon; long-term experiments; RothC model; climate change; "4 per 1000" initiative; Retisols
Online: 7 December 2020 (09:36:42 CET)
Soil organic carbon (SOC) sequestration in arable soils is a challenging goal for soil management. Multiple factors should be considered when predicting the soil’s capacity to fix atmospheric carbon. In this study, we focused on the effect of crop rotation and previous land use on future carbon sequestration in two experimental fields with identical soils (Retisols) and identical inputs of organic fertilizers. We analyzed the SOC dynamics and used the RothC model to forecast SOC changes under the RCP4.5 and RCP8.5 scenarios. Our experimental and modelling results indicated a consistent increase in SOC stocks and in the stable fractions of soil organic matter (SOM). The increase in SOC was higher in the experiment with the crop-grassland rotation than in the experiment with a rotation of row crops and barley. With similar total SOC stocks, the efficiency of soil management differed, as reflected by the contrasting composition of SOM: fields with a long cultivation history showed higher SOM stability. The goal of a 4‰ annual increase in SOC stocks may be reached under crop-grassland rotation in 2020-40 and 2080-90 when applying a mineral or organic fertilizer system under scenario RCP4.5, and under a mineral fertilizer system in 2080-2090 for scenario RCP8.5.
ARTICLE | doi:10.20944/preprints202105.0783.v1
Subject: Medicine & Pharmacology, Nursing & Health Studies Keywords: cancer, cancer survivor, exercise, athletes, competition, long-term effects, late effects, living with and beyond cancer
Online: 31 May 2021 (14:02:13 CEST)
Athletes living with and beyond cancer can continue to train and, in some cases, compete during treatment. Following cancer treatment, athletes can return to competitive sport but need to learn to adapt their physical strength and training to the lingering effects of cancer. It is critical for oncology healthcare providers to assess and advise these patients and to refer them to exercise oncology programs appropriate for the individual. Managing the side effects of treatment is key to being able to train during and immediately following cancer treatment. Keen attention to fatigue is important at any point in the cancer spectrum to avoid overtraining and to optimize the effects of training.
ARTICLE | doi:10.20944/preprints202012.0315.v1
Subject: Mathematics & Computer Science, Artificial Intelligence & Robotics Keywords: Intrusion Detection Systems; Anomaly detection; Sequential analysis; Random Forest; Multi-Layer Perceptron; Long-Short Term Memory
Online: 14 December 2020 (09:36:58 CET)
With the latest advances in information and communication technologies, ever greater amounts of sensitive user and corporate information are shared across networks, making them susceptible to attacks that can compromise data confidentiality, integrity, and availability. Intrusion Detection Systems (IDS) are important security mechanisms that can perform timely detection of malicious events through the inspection of network traffic or host-based logs. Over the years, many machine learning techniques have proven successful at anomaly detection, but only a few have considered the sequential nature of the data. This work proposes a sequential approach and evaluates the performance of a Random Forest (RF), a Multi-Layer Perceptron (MLP), and a Long Short-Term Memory (LSTM) network on the CIDDS-001 dataset. The resulting performance measures are compared with those obtained from a more traditional approach that considers only individual flow information, in order to determine which methodology best suits the scenario. The experimental outcomes suggest that anomaly detection is better addressed from a sequential perspective and that the LSTM is a very reliable model for capturing sequential patterns in network traffic data, achieving an accuracy of 99.94% and an F1-score of 91.66%.
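A sequential approach like the one described requires consecutive flow records to be grouped into fixed-length windows before a model such as an LSTM can see temporal context instead of isolated flows. A minimal, library-free sketch of that preprocessing step; the flow tuples are toy values, not actual CIDDS-001 fields:

```python
def make_sequences(flows, window=5):
    """Group consecutive flow-feature vectors into overlapping
    fixed-length windows, the input shape a sequential model expects."""
    return [flows[i:i + window] for i in range(len(flows) - window + 1)]

# Toy flow records as (duration_s, bytes, packets) tuples (illustrative):
flows = [(0.1, 500, 4), (0.2, 1500, 10), (0.1, 300, 2),
         (5.0, 90000, 120), (0.1, 400, 3), (0.3, 700, 6)]
windows = make_sequences(flows, window=3)
```

Each window would then be labeled (e.g. anomalous if any flow inside it is anomalous) and fed to the sequential classifier, whereas the traditional approach scores each tuple on its own.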
Subject: Engineering, Electrical & Electronic Engineering Keywords: arm motion recognition; micro-doppler signature; time series analysis; dynamic time warping; long short-term memory
Online: 16 December 2019 (11:42:44 CET)
Hand and arm gesture recognition using the radio frequency (RF) sensing modality proves valuable for man-machine interfaces and smart environments. In this paper, we use a time series analysis method to accurately measure the similarity of the micro-Doppler (MD) signatures between training and test data, thus providing improved gesture classification. We characterize the MD signatures by the maximum instantaneous Doppler frequencies depicted in the spectrograms. In particular, we apply the dynamic time warping (DTW) method and compare its performance with that of the long short-term memory (LSTM) network. Both methods take into account the values as well as the temporal evolution and trends of time series data. It is shown that the DTW method achieves high gesture classification rates and is robust to time misalignment.
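The DTW method referenced above can be sketched with the classic dynamic-programming recurrence; the Doppler-frequency profiles below are toy values, not measured spectrograms:

```python
def dtw_distance(a, b):
    """Classic dynamic time warping: minimal cumulative alignment cost
    between two sequences, allowing local stretching in time."""
    n, m = len(a), len(b)
    INF = float("inf")
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j],      # insertion
                                 D[i][j - 1],      # deletion
                                 D[i - 1][j - 1])  # match
    return D[n][m]

# Toy maximum-instantaneous-Doppler profiles; the test gesture is the
# same shape as the training one but shifted in time:
train = [0, 1, 2, 3, 2, 1, 0]
test_shifted = [0, 0, 1, 2, 3, 2, 1, 0]
score = dtw_distance(train, test_shifted)
```

Because DTW can absorb the time shift by repeating the initial sample, `score` is zero here, which illustrates the robustness to time misalignment the abstract reports.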
ARTICLE | doi:10.20944/preprints202210.0139.v1
Subject: Engineering, Construction Keywords: concrete dams; prediction model; empirical modal decomposition method; wavelet threshold; sparrow search algorithm; long short-term memory
Online: 11 October 2022 (04:32:08 CEST)
The deformation monitoring data of concrete dams contain high-frequency components that are strongly nonlinear, which reduces the accuracy of dam deformation prediction. To solve this problem, this paper proposes a concrete dam deformation monitoring model based on empirical mode decomposition (EMD) combined with wavelet-threshold noise reduction and sparrow search algorithm (SSA) optimization of a long short-term memory (LSTM) network. The model uses EMD combined with wavelet thresholding to decompose and denoise the measured deformation data. On this basis, the SSA-optimized LSTM model is used to mine the nonlinear functional relationship between the reconstructed monitoring data and the various influencing factors. An example analysis shows that the model has good calculation speed and good fitting and prediction accuracy, can effectively mine the characteristics inherent in the measured deformation data, and reduces the influence of noise components on the modeling accuracy.
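The wavelet-threshold denoising step can be illustrated with the standard soft-thresholding rule applied to detail coefficients; the coefficients and threshold below are hypothetical, and the paper's EMD decomposition is not reproduced here:

```python
def soft_threshold(x, lam):
    """Wavelet-style soft thresholding: shrink a detail coefficient x
    toward zero by lam, zeroing small (noise-dominated) coefficients."""
    if x > lam:
        return x - lam
    if x < -lam:
        return x + lam
    return 0.0

def denoise(coeffs, lam):
    return [soft_threshold(c, lam) for c in coeffs]

# Hypothetical high-frequency detail coefficients obtained after
# decomposing a dam-deformation series:
details = [0.05, -0.8, 0.02, 1.3, -0.04]
cleaned = denoise(details, lam=0.1)
```

Small coefficients are set to zero while large (signal-bearing) ones are shrunk slightly; the denoised details would then be recombined with the low-frequency components before training the LSTM.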
ARTICLE | doi:10.20944/preprints202206.0238.v1
Subject: Earth Sciences, Atmospheric Science Keywords: neural networks; satellite images; class imbalance; feature attribution; lightning prediction; nowcasting; short-term forecasts; machine learning; meteorology
Online: 16 June 2022 (10:48:59 CEST)
While thunderstorms can pose severe risks to property and life, forecasting remains challenging, even at short lead times, as thunderstorms often arise under meta-stable atmospheric conditions. In this paper, we examine how well short-term (up to 180 min) forecasts can be produced using exclusively multi-spectral satellite images and past lightning events as data. We employ representation learning based on deep convolutional neural networks in an “end-to-end” fashion. Here, a crucial problem is handling the imbalance between the positive and negative classes appropriately in order to obtain predictive results (something not addressed by many previous machine-learning-based approaches). The resulting network outperforms previous methods based on physically derived features and optical flow methods (similar to operational prediction models) and generalizes across different years. A closer examination of classifier performance over time and under masking of input data indicates that the learned model actually draws most of its information from structures in the visible spectrum, with infrared imaging sustaining some classification performance during the night.
ARTICLE | doi:10.20944/preprints202202.0143.v1
Subject: Engineering, Electrical & Electronic Engineering Keywords: photovoltaic (PV) power forecast; multiple PV forecasting; short-term PV forecasting; motion estimation; optical flow; smart grid
Online: 10 February 2022 (02:22:32 CET)
The power-generation capacity of grid-connected photovoltaic (PV) power systems is increasing. As output-power forecasting is required by electricity market participants and utility operators for the stable operation of power systems, several methods have been proposed using physical and statistical approaches over various time ranges. A short-term (30 min ahead) forecasting method was previously proposed by our laboratory for geographically distributed PV systems using motion estimation. This study focuses on a parameter that is important to the proposed motion estimation and on its optimization. The parameter matters because it governs the smoothness of the vector field produced by the motion estimation and thereby influences the forecasting accuracy. The evaluation was conducted, for periods with drastic power-output changes, on 101 PV systems located within a circle of 15 km radius in the Kanto region of Japan. The results indicate that the mean absolute error of the proposed method with the optimized parameter is 10.3%, whereas that of the persistence prediction method is 23.7%. The proposed method is therefore effective for forecasting periods in which PV output changes drastically over a short time.
ARTICLE | doi:10.20944/preprints202112.0264.v1
Subject: Medicine & Pharmacology, Clinical Neurology Keywords: concussion; mild traumatic brain injury; working memory; long-term cognitive outcome; support vector machine classifier; personalized prediction
Online: 16 December 2021 (10:24:08 CET)
Concussion, also known as mild traumatic brain injury (mTBI), commonly causes transient neurocognitive symptoms, but in some cases it causes cognitive impairment, including working memory (WM) deficits, which can be long-lasting and impede a patient’s return to work. The predictors of long-term cognitive outcomes following mTBI remain unclear because abnormality is often absent from structural imaging findings. The purpose of this study was to determine whether machine-learning-based models using functional magnetic resonance imaging (fMRI) biomarkers and demographic or neuropsychological measures at baseline could effectively predict 1-year cognitive outcomes of concussion. We conducted a prospective, observational study of patients with mTBI compared with demographically matched healthy controls, enrolled between September 2015 and August 2020. Baseline assessments were collected within the first week of injury, and follow-ups were conducted at 6 weeks, 3 months, 6 months, and 1 year. Potential demographic, neuropsychological, and fMRI features were selected according to the significance of their correlation with the estimated changes in WM ability. A support vector machine classifier was trained using these potential features and the estimated changes in WM between the predefined time periods. Patients demonstrated significant cognitive recovery at the third month, followed by worsened performance after 6 months, which persisted until 1 year after concussion. Approximately half of the patients experienced prolonged cognitive impairment at the 1-year follow-up. Satisfactory predictions were achieved for patients whose WM function did not recover at 3 months (accuracy = 87.5%), 6 months (accuracy = 83.3%), and 1 year (accuracy = 83.3%), and for those who performed worse at the 1-year follow-up than at baseline (accuracy = 83.3%).
This study demonstrated the feasibility of personalized prediction for long-term postconcussive WM outcomes based on baseline fMRI and demographic features, opening a new avenue for early rehabilitation intervention in selected individuals with possible poor long-term cognitive outcomes.
ARTICLE | doi:10.20944/preprints202110.0049.v2
Subject: Mathematics & Computer Science, Probability And Statistics Keywords: long short-term memory; minimum message length; time series; neural network; deep learning; Bayesian statistics; probabilistic modeling
Online: 12 October 2021 (11:41:30 CEST)
We investigate the power of time series analysis based on a variety of information-theoretic approaches from statistics (AIC, BIC) and machine learning (Minimum Message Length), and we compare their efficacy with traditional time series models and with hybrids involving deep learning. More specifically, we develop AIC, BIC, and Minimum Message Length (MML) ARMA (autoregressive moving average) time series models; this Bayesian information-theoretic MML ARMA modelling is itself new work. We then study deep-learning-based algorithms for time series forecasting using Long Short-Term Memory (LSTM), and combine this with the ARMA modelling to produce a hybrid ARMA-LSTM prediction. Part of the purpose of the LSTM is to capture any hidden information in the residuals left by the traditional ARMA model. We show not only that MML outperforms earlier statistical approaches to ARMA modelling, but further that the hybrid MML ARMA-LSTM models outperform both ARMA models and LSTM models.
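The AIC and BIC criteria the study compares are simple penalized-likelihood scores. A minimal sketch of model-order selection with them; the log-likelihoods and parameter counts below are hypothetical, not the paper's fitted values:

```python
import math

def aic(log_likelihood, k):
    """Akaike information criterion: 2k - 2 ln L (lower is better)."""
    return 2 * k - 2 * log_likelihood

def bic(log_likelihood, k, n):
    """Bayesian information criterion: k ln n - 2 ln L (lower is better).
    Penalizes parameters more heavily than AIC for large n."""
    return k * math.log(n) - 2 * log_likelihood

# Hypothetical fits of two candidate ARMA orders to n = 200 observations,
# given as (log-likelihood, number of free parameters):
candidates = {"ARMA(1,1)": (-310.2, 3), "ARMA(2,2)": (-308.9, 5)}
n = 200
best_by_bic = min(candidates,
                  key=lambda m: bic(candidates[m][0], candidates[m][1], n))
```

Here the small likelihood gain of ARMA(2,2) does not justify its two extra parameters under BIC's `k ln n` penalty, so the simpler model wins; MML plays the same model-selection role but with a message-length rather than a penalized-likelihood criterion.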
ARTICLE | doi:10.20944/preprints202106.0255.v1
Subject: Social Sciences, Accounting Keywords: SARS-CoV-2; Care home; Long-term care; Social care; Preparedness; Contingency plan; Safety culture; Workforce; Survey
Online: 9 June 2021 (10:52:34 CEST)
(1) Background: Nursing homes’ preparedness for managing a public health crisis has been fragile, with effects on safety culture. The objective of this study was to assess nursing homes’ COVID-19 preparedness in Southern Portugal, including personnel’s work experiences during the pandemic. (2) Methods: We used a COVID-19 preparedness checklist completed by management teams, followed by follow-up calls to nursing homes. Thereafter, a survey was administered to personnel. Data analysis included descriptive statistics, exploratory factor analysis, and thematic analysis of open-ended questions. (3) Results: In total, 71% (138/195) of eligible nursing homes returned the preparedness checklist. We conducted 83 follow-up calls and received 720 replies to the personnel survey. On average, 25% of nursing homes did not have an adequate decision-making structure for responding to the pandemic. Outbreak capacity and training were fragile areas among nursing homes’ contingency plans. We identified compliance with procedures and nonpunitive response to mistakes as fragile areas of safety culture, and teamwork as a strong safety area. (4) Conclusions: To strengthen how nursing homes cope with upcoming phases of the COVID-19 pandemic or future public health emergencies, nursing homes’ preparedness and safety culture should be fostered and closely monitored.
ARTICLE | doi:10.20944/preprints202106.0043.v1
Subject: Medicine & Pharmacology, Pediatrics Keywords: newborn screening; research; long-term follow-up; NBSTRN; LPDR; RUSP
Online: 1 June 2021 (15:10:29 CEST)
The goal of newborn screening (NBS) is to improve health outcomes by identifying and treating affected newborns. This manuscript provides an overview of a data tool that facilitates the longitudinal collection of health information on newborns diagnosed with a condition through NBS. The Newborn Screening Translational Research Network (NBSTRN) developed the Longitudinal Pediatric Data Resource (LPDR) to capture, store, analyze, visualize, and share genomic and phenotypic data over the lifespan of NBS-identified newborns, to facilitate understanding of genetic disease, and to assess the impact of early identification and treatment. NBSTRN developed a consensus-based process using clinical care experts to create, maintain, and evolve question-and-answer sets organized into common data elements (CDEs). The LPDR contains 24,172 core and disease-specific CDEs for 118 rare genetic diseases, and the CDEs are being made available through the NIH CDE Repository. The number of CDEs per condition averages 2,200, ranging from 69 to 7,944. The LPDR is used by state NBS programs, clinical researchers, and community-based organizations. Case-level, de-identified data sets are available for secondary research and data mining. The development of the LPDR for longitudinal data gathering, sharing, and analysis supports research and facilitates the translation of new discoveries into clinical practice.
ARTICLE | doi:10.20944/preprints202103.0625.v1
Subject: Life Sciences, Biochemistry Keywords: three population mathematical model; CAR-T lymphocytes; memory CAR-T cells; long-term immunity; tumor-induced immunosuppression
Online: 25 March 2021 (14:39:02 CET)
Immunotherapy has gained great momentum with chimeric antigen receptor T cell (CAR-T) therapy, in which a patient’s T lymphocytes are genetically manipulated to recognize tumor-specific antigens, increasing tumor elimination efficiency. In recent years, CAR-T cell immunotherapy for hematological malignancies has achieved great response rates in patients and is a very promising therapy for several other malignancies. Each new CAR design requires a preclinical proof-of-concept experiment using immunodeficient mouse models. The absence of a functional immune system in these mice makes them simple and suitable to model mathematically. In this work, we developed a three-population mathematical model to describe tumor response to CAR-T cell immunotherapy in immunodeficient mouse models, encompassing interactions between a non-solid tumor and CAR-T cells (effector and long-term memory). We account for several phenomena, such as tumor-induced immunosuppression, memory pool formation, and conversion of memory into effector CAR-T cells in the presence of new tumor cells. Individual donor and tumor specificities were treated as uncertainties in the model parameters. Our model is able to reproduce several CAR-T cell immunotherapy scenarios, with different CAR receptors and tumor targets reported in the literature. We found that therapy effectiveness depends mostly on a few specific parameters, such as the differentiation of effector to memory CAR-T cells, CAR-T cytotoxic capacity, tumor growth rate, and tumor-induced immunosuppression. In summary, our model can help reduce and optimize the number of in vivo experiments through in silico tests that select specific scenarios for experimental research. Such an in silico laboratory is available in a Shiny R-based platform called CARTmath.
It is an open-source, easy-to-run simulator, available at github.com/tmglncc/CARTmath or directly on the webpage cartmath.lncc.br, containing this manuscript’s results as examples and documentation. The developed model, together with the CARTmath platform, has potential use for assessing different CAR-T cell immunotherapy protocols and their associated efficacy, becoming an accessory towards in silico trials.
ARTICLE | doi:10.20944/preprints202012.0208.v1
Subject: Biology, Anatomy & Morphology Keywords: soil organic carbon; soil health; long-term experiments; RothC model; climate change; "4 per 1000" initiative; Podzols
Online: 8 December 2020 (17:30:04 CET)
Soil organic carbon (SOC) is an essential condition for soil health and a potential sink for greenhouse gases. SOC dynamics in a long-term field experiment with mineral and organic fertilization on a loamy sand Podzol in Vladimir Region, Russia, were traced with the dynamic carbon model RothC from 1968 to the present. During this period, the C stock increased by 21% compared with the initial level in the treatment applying manure at an average annual rate of 10 t·ha-1. The model was also used to forecast SOC changes until 2090 for the two contrasting climatic scenarios RCP4.5 and RCP8.5. Until 2090, steady growth of SOC stocks is expected in all compared treatments under both climate scenarios. For RCP4.5, this rate of growth is highest until 2040, decreases in 2040-2070, and increases again in 2070-2090. The highest annual gain in the 0-20 cm soil layer in 2020-2040 was within 21-27‰ under RCP4.5 and 16-21‰ under RCP8.5. The expected accumulation of C would increase the current C stock by a factor of 1.6-1.7 under the RCP4.5 scenario and 2.0-2.2 under RCP8.5. Modelling demonstrated potentially more favourable conditions for SOC stability in arable Podzols than in Retisols in Central Russia in the 21st century.
ARTICLE | doi:10.20944/preprints201911.0259.v1
Subject: Engineering, Civil Engineering Keywords: lightweight aggregate concrete; reinforced concrete; flexural elements; curvature; short-term loading; tension stiffening; constitutive model; numerical modelling.
Online: 22 November 2019 (08:28:04 CET)
In the present trend of constructing taller and longer structures, the application of lightweight aggregate concrete is becoming an increasingly important solution in the modern construction industry. In engineering practice, the analysis of lightweight concrete elements is performed using the same algorithms as for normal concrete elements. As an alternative to traditional engineering methods, nonlinear numerical algorithms based on constitutive material models may be used. This paper presents a comparative analysis of curvature calculations for flexural lightweight concrete elements, incorporating the analytical code methods EN 1992-1 and ACI 318-14 as well as a numerical analysis using the constitutive model of cracked tensile lightweight concrete recently proposed by the authors. To evaluate the adequacy of the theoretical predictions, experimental data on 51 lightweight concrete beams tested in five different programmes were collected. A comparison of theoretical and experimental results showed that the most accurate predictions are obtained using the numerical analysis with the constitutive model proposed by the authors. In the future, the latter algorithm can serve as a reliable tool for improving standard design methods and for the numerical modelling of lightweight concrete elements subjected to short-term loading.
ARTICLE | doi:10.20944/preprints201811.0505.v1
Subject: Arts & Humanities, Linguistics Keywords: Italian; readability; GULPEASE; literature; statistics; characters; words; sentences; punctuation marks; short-term memory; word interval; time interval
Online: 20 November 2018 (15:32:11 CET)
Statistics of languages are calculated by counting characters, words, sentences, and word rankings. Some of these random variables are also the main “ingredients” of classical readability formulae. Revisiting the readability formula for Italian, known as GULPEASE, shows that of the two terms that determine the readability index G – the semantic index G_C, proportional to the number of characters per word, and the syntactic index G_F, proportional to the reciprocal of the number of words per sentence – G_F is dominant, because G_C is, in practice, constant for any author throughout seven centuries of Italian literature. Each author can modulate the length of sentences more freely than the length of words, and in different ways from author to author. For any author, any couple of text variables can be modelled by a linear relationship y = mx, but with a different slope m from author to author, except for the relationship between characters and words, which is the same for all. The most important relationship found in the paper is, in the author’s opinion, that between short-term memory capacity, described by Miller’s “7±2 law”, and the word interval, a new random variable defined as the average number of words between two successive punctuation marks. The word interval can be converted into a time interval through the average reading speed. The word interval is spread over the same range as Miller’s law, and the time interval is spread over the same range as short-term memory response times. The connection between the word interval (and time interval) and short-term memory appears, at least empirically, justified and natural, and should be further investigated. Technical and scientific writings (papers, essays, etc.) demand more of their readers. A preliminary investigation of these texts shows clear differences: words are on average longer, the readability index G is lower, and word and time intervals are longer.
Future work on ancient languages, such as Greek or Latin, could give us a flavour of the short-term memory features of those ancient readers.
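The GULPEASE index discussed above has a standard published form, G = 89 + (300·sentences − 10·letters)/words. A minimal sketch of it together with the word-interval variable the abstract introduces; the counts below are toy values, not figures from the corpus studied:

```python
def gulpease(letters, words, sentences):
    """GULPEASE readability index for Italian text:
    G = 89 + (300 * sentences - 10 * letters) / words.
    Higher G means easier text (roughly on a 0-100 scale)."""
    return 89 + (300 * sentences - 10 * letters) / words

def word_interval(words, punctuation_marks):
    """Average number of words between two successive punctuation
    marks, the variable the paper relates to Miller's 7±2 law."""
    return words / punctuation_marks

# Toy counts for a short passage (illustrative only):
G = gulpease(letters=100, words=20, sentences=2)
Ip = word_interval(words=20, punctuation_marks=4)
```

Note how G splits exactly into the two terms the abstract names: the semantic part −10·letters/words (characters per word) and the syntactic part 300·sentences/words (the reciprocal of words per sentence).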