ARTICLE | doi:10.20944/preprints202206.0031.v1
Subject: Life Sciences, Immunology Keywords: ERU; IL8; PMN; granulocyte; latent activation; extravasation
Online: 2 June 2022 (10:54:38 CEST)
In the pathophysiology of autoimmune-mediated uveitis, granulocytes have emerged as possible disease mediators and were shown to be latently activated in equine recurrent uveitis (ERU), a spontaneous disease model. We therefore used granulocytes from ERU horses to identify early molecular mechanisms involved in this dysregulated innate immune response. Primary granulocytes from healthy and ERU horses were stimulated with IL8, and the cellular response was analyzed with differential proteomics, which revealed significant differences in the abundance of 170 proteins in ERU. Subsequent Ingenuity Pathway Analysis identified three activated canonical pathways: “PKA signaling”, “PTEN signaling” and “leukocyte extravasation”. Clustered within the leukocyte extravasation pathway, we found the membrane-type GPI-anchored protease MMP25, which was increased in IL8-stimulated ERU granulocytes. These findings point to MMP25 as a possible regulator of granulocyte extravasation in uveitis and suggest a role for this molecule in the impaired integrity of the blood-retina barrier. In conclusion, our analyses show a clearly divergent reaction profile of latently activated granulocytes upon IL8 stimulation and provide basic information for further in-depth studies on early granulocyte activation in non-infectious ocular diseases. This may be of interest for the development of new approaches in uveitis diagnostics and therapy. Raw data are available via ProteomeXchange with identifier PXD013648.
REVIEW | doi:10.20944/preprints202205.0316.v1
Subject: Life Sciences, Virology Keywords: HIV; protease; CARD8; NNRTI; Inflammasome; Latent reservoir
Online: 24 May 2022 (03:54:52 CEST)
HIV-1 protease (PR) is a viral enzyme that cleaves viral polyprotein precursors to convert them into functional forms, a process essential to generate infectious viral particles. Due to its broad substrate specificity, HIV-1 PR can also cleave certain host cell proteins. Several studies have identified host cell substrates of HIV-1 PR and described the potential impact of their cleavage on HIV-1-infected cells. Of particular interest is the interaction between PR and the caspase recruitment domain-containing protein 8 (CARD8) inflammasome. While PR typically has low levels of intracellular activity prior to viral budding, induction of premature PR activation to trigger CARD8-mediated cell killing may help eliminate latent reservoirs in people living with HIV. In this review, we discuss the viral and host substrates of HIV-1 protease and highlight potential applications and advantages of targeting CARD8 sensing of HIV-1 PR.
CASE REPORT | doi:10.20944/preprints202202.0291.v1
Subject: Medicine & Pharmacology, Dentistry Keywords: granulomatous cheilitis; latent tuberculosis; IGRA; antibiotic treatment
Online: 23 February 2022 (12:08:19 CET)
Granulomatous cheilitis (GC) comprises a heterogeneous group of disorders characterised by a granulomatous inflammation of the lips in reaction to various stimuli. Numerous etiologies have been proposed, including genetic, immunologic, allergic and infectious. Among the secondary causes of GC, a distant infection by Mycobacterium tuberculosis should be considered. GC can be the clinical presentation of a tuberculide resulting from a hypersensitivity reaction to an underlying focus of active or latent tuberculosis infection (LTBI). This communication describes a woman diagnosed with GC related to LTBI who responded well to antituberculosis treatment.
COMMUNICATION | doi:10.20944/preprints202004.0305.v1
Subject: Medicine & Pharmacology, Clinical Neurology Keywords: multiple sclerosis; tuberculosis; immunosuppressive therapy; Latent tuberculosis
Online: 17 April 2020 (15:34:17 CEST)
Tuberculosis (TB) is a contagious infectious disease caused by M. tuberculosis (Koch’s bacillus). About one-quarter of the world’s population is infected with this bacillus and at risk of developing TB disease. Latent tuberculosis refers to people who have been infected by TB bacteria but are not (yet) ill. The populations most vulnerable to TB reactivation include HIV-infected patients, drug users and patients with autoimmune diseases. Multiple sclerosis (MS) is a chronic autoimmune neurological disease caused by lymphocytic infiltration, with a worldwide prevalence of 22.2 million cases. TB and MS are related in two ways: immunomodulatory or immunosuppressive treatment of MS may reactivate a latent infection, and the intense inflammatory response preceding infection with the bacillus may increase susceptibility to the development of autoimmune diseases. Screening for TB includes a complete patient history, physical exam, chest radiography, and a Tuberculin Skin Test or IGRA (Interferon Gamma Release Assay). This investigation is suggested when MS drugs (immunomodulatory and immunosuppressant medications) are prescribed. If a patient tests positive, treatment for MS should not be delayed until TB treatment is completed. In this paper, considering the high prevalence of tuberculosis, we recommend that TB screening also be performed at the moment of multiple sclerosis diagnosis.
ARTICLE | doi:10.20944/preprints201810.0338.v1
Subject: Mathematics & Computer Science, Information Technology & Data Management Keywords: text classification; topic modelling; latent semantic analysis; latent dirichlet allocation; hierarchical sentiment dictionary; contextually-oriented hierarchical corpus; text tonality; evaluation
Online: 16 October 2018 (07:55:35 CEST)
This research presents a methodology for improving the accuracy of text classification by modelling latent semantic relations (LSR). The aim of this methodology is to eliminate the limitations of discriminant and probabilistic methods for revealing LSR and to customize the text classification process for more accurate recognition of text tonality. This aim is achieved by using knowledge about a text’s hierarchical semantic context in the form of a corpora-based hierarchical sentiment dictionary. The main scientific contribution of this research is the following set of approaches for improving the qualitative characteristics of the text classification process: combining the discriminant and probabilistic methods to decrease the influence of their respective limitations on the LSR-revealing process; considering each document as a complex structure, so that documents can be estimated integrally via separate classification of topically complete textual components (paragraphs); and taking into account the features of argumentative documents (reviews), which allows the author’s subjective evaluation of text tonality to be used in developing the text classification methodology. Tonality expressed by a review’s author has a significant, but not critical, effect on the qualitative indicators of sentiment recognition.
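As a minimal sketch of the latent-semantic side of such a pipeline, assuming scikit-learn and a placeholder corpus (none of this is the authors' code): latent semantic analysis can be expressed as TF-IDF weighting followed by a truncated SVD of the term-document matrix.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.pipeline import make_pipeline

# Toy stand-in corpus (placeholder documents, one per review)
docs = [
    "the concert was wonderful and moving",
    "terrible sound quality ruined the show",
    "an excellent and memorable performance",
    "the venue was awful and overcrowded",
]

# LSA: TF-IDF weighting followed by a rank-k SVD of the term-document matrix
lsa = make_pipeline(TfidfVectorizer(), TruncatedSVD(n_components=2, random_state=0))
embeddings = lsa.fit_transform(docs)
print(embeddings.shape)  # one 2-dimensional latent vector per document
```

The latent vectors could then feed a downstream tonality classifier; the probabilistic counterpart (e.g. LDA) would replace the SVD step.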
REVIEW | doi:10.20944/preprints202111.0089.v1
Subject: Engineering, Industrial & Manufacturing Engineering Keywords: Latent Dirichlet Allocation; Natural Language Processing; Condition based maintenance
Online: 3 November 2021 (14:59:02 CET)
In the field of industrial process monitoring, increasing interest is being shown in specific process categories. These include time-varying processes, that is, processes in which the output response of the system depends on when the input signal is sent into it. There are many reasons for this process variability, and such contexts are not always analyzed with this operational characteristic at their core. At the same time, interest in certain categories of techniques is also becoming more prominent to meet specific application needs. Among these, clustering and unsupervised techniques in general are gaining ground, largely due to the difficulty of finding fault data with which to train, for example, supervised models. The clustering technique, on which this contribution focuses, also makes it possible to compensate for incomplete knowledge of the structure of the process itself. With these two considerations in mind, this contribution proposes a literature review on clustering applied in time-varying contexts in the maintenance field. The aim is to present an overview of the main fields of study, the role of clustering in this context, and the main clustering techniques used.
ARTICLE | doi:10.20944/preprints201810.0758.v1
Subject: Biology, Plant Sciences Keywords: latent PPO; optimisation; peripheral membrane POD; total phenol content
Online: 1 November 2018 (18:00:20 CET)
The present protocol describes the extraction of active polyphenol oxidase and peroxidase from B. myrtifolia, a plant rich in phenolics and chlorophylls, in the context of its post-harvest browning syndrome. Initially, general optimisation using conventional enzyme extraction was performed. However, along with membrane-bound proteins, chlorophylls and phenols were also released with Triton X (TTX). With a view to obtaining high enzymatic activity, removal of the released chlorophylls and phenols by formation of TTX-114 micelles in the detergent-rich phase after high-temperature-induced phase separation was tested.
ARTICLE | doi:10.20944/preprints201911.0015.v1
Subject: Engineering, Civil Engineering Keywords: phase change materials (pcms); metals; container; latent heat storage; corrosion
Online: 3 November 2019 (15:06:53 CET)
Phase change materials (PCMs) are latent heat storage media with high potential for integration into building structures and technical systems. Their solid-liquid transition is commonly utilized for thermal energy storage in building applications, which means that some kind of encapsulation is necessary. This is often solved with metal containers, which also offer high thermal conductivity and resistance to mechanical damage, enhancing the performance of these so-called latent heat thermal energy storage (LHTES) systems. However, the selection of a suitable metal is rather challenging. It depends, among other things, on eliminating undesirable interactions between the storage medium and the surrounding metal. The heat storage medium must be reliably sealed in the metal container, especially when the storage system is integrated into systems like domestic hot water storage tanks, where PCM leaks can negatively affect human health. The aim of this study was the evaluation of interactions between selected commercially available organic and inorganic PCMs and metals. The evaluation is based on the calculation of corrosion rates and the use of the gravimetric method to determine the weight variations of the metal samples. Results show that aluminium is the most suitable container material, showing the lowest mass loss and only minimal visual changes on the surface after prolonged exposure to PCMs.
ARTICLE | doi:10.20944/preprints201812.0052.v1
Subject: Social Sciences, Business And Administrative Sciences Keywords: Festival; guitar; culture; Spain; SEM; AMOS; latent variables; observed variables.
Online: 4 December 2018 (09:47:34 CET)
The Cordoba Guitar Festival is one of the most important cultural events in Spain. This article analyses the musical preferences, satisfaction, attitudinal loyalty and behavioural loyalty of spectators who attended the 36th festival held in July 2016, as well as the festival’s economic impact on the city. To achieve this aim, a structural equation model (SEM) was used. The results show the goodness-of-fit of the model and indicate that the observed data fit the expected dataset.
ARTICLE | doi:10.20944/preprints201805.0266.v1
Subject: Earth Sciences, Atmospheric Science Keywords: rainfall; lidar; disdrometer; evaporation; meteorology; climate change; latent heat; precipitation
Online: 21 May 2018 (11:09:01 CEST)
In this paper we illustrate a new, simple and complementary ground-based methodology to retrieve the vertically resolved atmospheric precipitation intensity through a synergy between measurements from the National Aeronautics and Space Administration (NASA) Micropulse Lidar Network (MPLNET), an analytical model solution and ground-based disdrometer measurements. The presented results are obtained at two mid-latitude MPLNET permanent observational sites, located respectively at the NASA Goddard Space Flight Center, USA, and at the Universitat Politècnica de Catalunya, Barcelona, Spain. The methodology is suitable for application to existing and/or future lidar/ceilometer networks, with the main objective of providing near-real-time (3 h latency) rainfall intensity measurements and/or validating satellite missions, especially for critical light precipitation (<3 mm hr−1).
ARTICLE | doi:10.20944/preprints202101.0332.v1
Subject: Mathematics & Computer Science, Probability And Statistics Keywords: flexible count models; balanced gamma distribution; Jensen–Shannon divergence; latent equidispersion
Online: 18 January 2021 (12:23:53 CET)
Most existing flexible count regression models allow only approximate inference. This work proposes a new framework to provide an exact and flexible alternative for modeling and simulating count data with various types of dispersion (equi-, under- and overdispersion). The new method, referred to as “balanced discretization”, consists in discretizing continuous probability distributions while preserving expectations. It is easy to generate pseudo-random variates from the resulting balanced discrete distribution, since it has a simple stochastic representation in terms of the continuous distribution. For illustrative purposes, we have developed the family of balanced discrete gamma distributions, which can model equi-, under- and overdispersed count data. This family of count distributions is appropriate for building flexible count regression models because the expectation of the distribution has a simple expression in terms of the parameters of the distribution. Using the Jensen–Shannon divergence measure, we have shown that under the equidispersion restriction, the family of balanced discrete gamma distributions is similar to the Poisson distribution. Based on this, we conjecture that while covering all types of dispersion, a count regression model based on the balanced discrete gamma distribution will allow recovering a near-Poisson model fit when the data are Poisson distributed.
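One way to read the "simple stochastic representation" mentioned above (our interpretation of balanced discretization, not the authors' code): if Y is a continuous variate, then X = floor(Y) + Bernoulli(Y - floor(Y)) is integer-valued with E[X] = E[Y], since the conditional mean of the Bernoulli term equals the fractional part of Y.

```python
import math
import random

def balanced_draw(continuous_sampler, rng=random):
    """One draw from the balanced discretization of a continuous distribution.

    X = floor(Y) + Bernoulli(frac(Y)) has the same expectation as Y, because
    E[Bernoulli(frac(Y)) | Y] = frac(Y).
    """
    y = continuous_sampler()
    k = math.floor(y)
    return k + (1 if rng.random() < y - k else 0)

rng = random.Random(0)
# Balanced discrete gamma with shape 3, scale 1.5 (continuous mean 4.5)
draws = [balanced_draw(lambda: rng.gammavariate(3.0, 1.5), rng) for _ in range(100_000)]
mean = sum(draws) / len(draws)
print(round(mean, 2))  # close to the continuous mean of 4.5
```

Varying the gamma shape parameter then moves the discrete family between under- and overdispersion while the mean stays exactly parameterized.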
ARTICLE | doi:10.20944/preprints201904.0170.v1
Subject: Medicine & Pharmacology, Other Keywords: topic modelling; latent dirichlet allocation; text mining; assisted reproduction; ART; IVF
Online: 15 April 2019 (12:25:12 CEST)
Study question: What are the current trends of research in Human Assisted Reproduction around the world? Summary answer: USA is the leading country, followed by the UK, China, France and Italy. The largest research area is “laboratory techniques”, although other areas such as “public health”, “quality, ethics and law” and “female factor” are gaining ground worldwide. What is known already: Scientific research, especially in health and medical sciences, aims at addressing specific needs that society (and, especially, patients) perceives as pressing. One of the main challenges for policymakers and research funders alike is therefore to align research priorities to societal needs. We can thus think of research agendas in terms of a demand side (societal needs) and a supply side (research outputs). Research output in Human Assisted Reproduction has expanded in the past years, as indicated by the increasing number of scientific publications in indexed journals in this area. Nevertheless, no map of research related to assisted reproduction has been produced so far, hindering the identification of potential areas of improvement and need. Study design, size, duration: 26,000+ scientific publications (articles, letters, and reviews) on Human Assisted Reproduction produced worldwide between 2005 and 2016 were analyzed. These publications were indexed in PubMed or obtained from the reference lists of indexed publications included in the analysis. Participants/materials, setting, methods: The corpus of publications was obtained by combining the MeSH terms: “Reproductive techniques”, “Reproductive medicine”, “Reproductive health”, “Fertility”, “Infertility”, and “Germ cells”. It was then analyzed by means of text mining algorithms (Topic Modeling (TM) based on Latent Dirichlet Allocation (LDA)) in order to obtain the main topics of interest. Finally, these categories were analyzed across world regions and time.
Main results and the role of chance: We identified 44 main topics, which were further grouped into 11 macro-categories, from largest to smallest: “laboratory techniques”, “male factor”, “quality, ethics and law”, “female factor”, “public health and infectious diseases”, “basic research and genetics”, “pregnancy complications and risks”, “general infertility and ART”, “psychosocial aspects”, “cancer”, and “research methodology”. The USA was the leading country in number of publications, followed by the UK, China, France and Italy. Interestingly, research content in high-income countries is fairly homogeneous across macro-categories, and it is dominated by “laboratory techniques” in Western and Southern Europe and by “quality, ethics and law” in North America, Australia and New Zealand. In middle-income countries we observe that research is mainly performed on “male factor” and noticeably less on “female factor”. Finally, research on “public health and infectious diseases” predominates in low-income countries. Regarding the temporal evolution of research, “laboratory techniques” is the most abundant topic on a yearly basis and relatively constant over time. However, since production in most of the other categories is increasing, the relative contribution of this research category is actually decreasing. Publication output is especially increasing in “public health and infectious diseases” (in all world regions, but especially in low-income countries), “quality, ethics and law” (high-income countries), and “female factor” (middle-income countries). Limitations, reasons for caution: Three main factors might limit the robustness of our work: the textual corpus analyzed is based on abstracts and titles; the stochastic algorithms applied may produce slightly differing results at each run, affecting reproducibility; and the interpretation of the topics obtained.
Wider implications of the findings: This study should prove beneficial in the design of research strategies and policies that foster the alignment between supply (assisted reproduction research) and demand (society). Study funding/competing interest(s): PTQ-14-06718 of the Spanish MINECO Torres Quevedo programme (FAM).
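A minimal LDA topic-modelling sketch of the kind described above, assuming scikit-learn and a toy corpus standing in for the 26,000+ publications (the documents and topic count are placeholders, not the study's data):

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Toy stand-ins for publication titles/abstracts (not the study corpus)
abstracts = [
    "embryo culture laboratory technique blastocyst vitrification",
    "sperm motility male factor semen analysis",
    "ethics law regulation donor consent policy",
    "embryo transfer laboratory incubator culture media",
    "male infertility sperm count azoospermia",
    "legal framework ethics surrogacy regulation",
]

counts = CountVectorizer().fit_transform(abstracts)
lda = LatentDirichletAllocation(n_components=3, random_state=0).fit(counts)
doc_topics = lda.transform(counts)  # per-document topic proportions, rows sum to 1
print(doc_topics.shape)
```

Grouping topics into macro-categories and tracking them across regions and years would then be post-processing on the fitted topic proportions.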
ARTICLE | doi:10.20944/preprints202208.0406.v1
Subject: Social Sciences, Other Keywords: Smartphone; App Usage; Transport Mode Usage; Latent Class Cluster Analysis; Multimodality; Environment
Online: 24 August 2022 (03:59:57 CEST)
Smartphone-based mobility apps enable users to make informed transportation decisions by offering instant access to transport-related information. This development has created a smartphone-enabled ecosystem of mobility services in developed countries, and it is slowly picking up pace in the global south, where it can contribute towards the decarbonization of urban transport. Work on this has already started in India, and there is considerable evidence of the profound impact of these apps on the perceived utility and usage of transport modes, with far-reaching implications for the sustainable development goals (SDGs). However, for most users the use of smartphone apps is a novel trend, and knowledge of how the usage of existing apps affects the transport mode usage patterns of various user groups is essential for positioning new consolidated app-based services in the near future. Against this backdrop, the present study uses latent class cluster analysis to empirically investigate the impacts of mobility apps on transport mode usage patterns in Delhi by classifying users into latent classes based on socioeconomic characteristics, attitudes/preferences, smartphone app usage, and mode usage pattern. The characteristics of the latent classes and the factors affecting an individual’s probability of being assigned to these clusters are discussed, along with measures to encourage app-based mobility for each cluster.
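As a hedged illustration of model-based clustering of this kind, a Gaussian mixture can serve as a continuous analogue of latent class cluster analysis; the indicators and data below are synthetic placeholders, not the Delhi survey.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Synthetic stand-ins for standardized indicators: app-usage frequency,
# transit-use frequency, and a pro-environment attitude score
users = np.vstack([
    rng.normal([2.0, 2.0, 1.0], 0.3, size=(100, 3)),    # frequent app + transit users
    rng.normal([-1.0, 0.0, -1.0], 0.3, size=(100, 3)),  # infrequent users
])

# Model-based clustering: each user gets a posterior probability of
# belonging to each latent class, rather than a hard assignment
model = GaussianMixture(n_components=2, random_state=0).fit(users)
posterior = model.predict_proba(users)
print(posterior.shape)  # (200, 2)
```

In a full latent class analysis the indicators would typically be categorical and the class-conditional model multinomial, but the posterior-membership logic is the same.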
ARTICLE | doi:10.20944/preprints202103.0360.v1
Subject: Mathematics & Computer Science, Artificial Intelligence & Robotics Keywords: music production, latent space, live system, recurrent autoencoder, dynamic time warping, compression
Online: 15 March 2021 (08:04:00 CET)
The onset of coronavirus disease 2019 (COVID-19), an infectious disease caused by severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2), has sparked unprecedented change. Due to the public health guidelines imposed during the COVID-19 pandemic, there is no longer sufficient street traffic for remaining buskers to generate sufficient revenue, leading a majority of street musicians to pursue remote music production. However, real-time music production is notoriously difficult due to the excessively high latencies that current video call platforms such as Zoom and Google Meet harbor. In this paper, we propose an architecture for a platform with end-to-end, near-lossless audio transmission tailored specifically to online joint music production, called Latent Space. We discuss the usage of a recurrent autoencoder with sequence-aware encoding (RAES) and a 1D convolutional layer for audio compression, which we dub ClefNet, as well as propose a new evaluation metric for naive autoencoders (AEs), MSE-DTW loss, which combines the traditional mean square error (MSE) loss function with dynamic time warping (DTW) to prevent an increase in loss when the target sequence predicted by the AE is strictly a temporal variation of the source sequence. Moreover, we detail the logistics of a live system implementation which uses the Web Audio API to extract raw audio samples in real-time to feed into our client-side model before relaying the traffic using peer-to-peer WebRTC technology. The Latent Space platform can be accessed at https://latent-space.tech, and the code and data can be found under the MIT License at https://github.com/rvignav/ClefNet.
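The role of dynamic time warping in the proposed MSE-DTW loss can be illustrated with a dependency-free sketch (our own toy implementation, not the paper's code): a signal and its time-shifted copy have a nonzero MSE but a zero DTW cost, which is exactly the case the combined loss is designed not to penalize.

```python
def dtw_distance(a, b):
    """Classic O(len(a)*len(b)) dynamic time warping with squared-error cost."""
    inf = float("inf")
    n, m = len(a), len(b)
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = (a[i - 1] - b[j - 1]) ** 2
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]

def mse(a, b):
    """Plain mean squared error between two equal-length sequences."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

# A temporally shifted copy of a ramp signal: MSE penalizes the shift,
# DTW aligns the sequences and reports zero cost
source = [0.0, 1.0, 2.0, 3.0, 3.0]
shifted = [0.0, 0.0, 1.0, 2.0, 3.0]
print(mse(source, shifted), dtw_distance(source, shifted))
```

A combined loss that mixes the two terms would therefore keep penalizing genuine amplitude errors while forgiving pure temporal variations of the target.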
ARTICLE | doi:10.20944/preprints202007.0269.v1
Subject: Keywords: regularized latent class analysis; regularization; fused regularization; fused grouped regularization; distractor analysis
Online: 12 July 2020 (16:59:18 CEST)
The last series of the Raven’s Standard Progressive Matrices (SPM-LS) test was studied with respect to its psychometric properties in a series of recent papers. In this paper, the SPM-LS dataset is analyzed with regularized latent class models (RLCMs). For dichotomous item response data, an alternative estimation approach for RLCMs is proposed. For polytomous item responses, different alternatives for performing regularized latent class analysis are proposed. The usefulness of the proposed methods is demonstrated in a simulated data illustration and for the SPM-LS dataset. For the SPM-LS dataset, it turned out that the regularized latent class model resulted in five partially ordered latent classes.
ARTICLE | doi:10.20944/preprints201712.0185.v1
Subject: Medicine & Pharmacology, Pediatrics Keywords: SLC25A13; amino acid ratio; citrullinemia; latent liver dysfunction; mitochondrial aspartate-glutamate carrier
Online: 26 December 2017 (10:18:45 CET)
Citrullinemia is the earliest identifiable biochemical abnormality in neonates with intrahepatic cholestasis due to citrin deficiency (NICCD), and it has been included in newborn screening panels using tandem mass spectrometry. However, only one neonate was positive among 600,000 infants born in Sapporo city and Hokkaido, Japan between 2006 and 2017. We investigated 12 neonates with NICCD who were initially considered normal in newborn mass screening (NBS) by tandem mass spectrometry but were later diagnosed with NICCD by DNA tests. Using their initial NBS data, we examined citrulline concentrations and the ratios of citrulline to total amino acids. Although their citrulline values exceeded the mean of the normal neonates, and 80% of them surpassed +3 SD, all were below the cutoff of 40 nmol/mL. The ratios of citrulline to total amino acids were significantly elevated in patients with NICCD compared to the controls. By evaluating the two indicators simultaneously, we could identify about 80% of the patients with missed NICCD. Introducing an estimated index comprising citrulline values and citrulline-to-total-amino-acid ratios could assure NICCD detection by NBS.
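The dual-indicator idea can be sketched as follows; the function name and every cutoff value here are hypothetical placeholders, not the study's actual index.

```python
def flag_niccd(citrulline, total_amino_acids, cit_mean, cit_sd, ratio_cutoff):
    """Flag a newborn-screening sample when EITHER indicator is abnormal:
    the citrulline value exceeds mean + 3 SD of normal neonates, or the
    citrulline-to-total-amino-acid ratio exceeds a chosen cutoff.
    (Hypothetical sketch; all cutoffs are placeholders.)"""
    ratio = citrulline / total_amino_acids
    return citrulline > cit_mean + 3 * cit_sd or ratio > ratio_cutoff

# Placeholder values in nmol/mL: a sample below an absolute citrulline
# cutoff can still be flagged through an elevated ratio
print(flag_niccd(citrulline=32, total_amino_acids=1600,
                 cit_mean=17, cit_sd=5, ratio_cutoff=0.015))
```

The point of the OR-combination is exactly the paper's observation: missed NICCD cases sat below the absolute cutoff but stood out on the ratio.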
ARTICLE | doi:10.20944/preprints202105.0699.v1
Subject: Social Sciences, Accounting Keywords: latent D-scoring model; logistic item response model; identifiability; item parameter estimation; PISA
Online: 28 May 2021 (12:00:49 CEST)
This article shows that the recently proposed latent D-scoring model of Dimitrov is statistically equivalent to the two-parameter logistic item response model. An analytical derivation and a numerical illustration are employed for demonstrating this finding. Hence, estimation techniques for the two-parameter logistic model can be used for estimating the latent D-scoring model. In an empirical example using PISA data, differences of country ranks are investigated when using different metrics for the latent trait. In the example, the choice of the latent trait metric matters for the ranking of countries. Finally, it is argued that an item response model with bounded latent trait values like the latent D-scoring model might have advantages for reporting results in terms of interpretation.
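The claimed equivalence can be sketched as follows (our notation, which may differ from Dimitrov's exact parameterization). The two-parameter logistic model writes the probability of a correct response to item $j$ by person $i$ as

```latex
P(X_{ij} = 1 \mid \theta_i) \;=\; \frac{1}{1 + \exp\{-a_j(\theta_i - b_j)\}}
```

Any strictly monotone transformation $D_i = g(\theta_i)$ of the latent trait, with the item response function adjusted accordingly, yields identical likelihoods for every response pattern; the two models then differ only in the metric of the latent trait. This is why a bounded trait scale, as in latent D-scoring, can change the reporting and interpretation of scores (and of country rankings) without changing model fit.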
ARTICLE | doi:10.20944/preprints202104.0592.v1
Subject: Mathematics & Computer Science, Probability And Statistics Keywords: Flexible count regression; balanced discrete gamma distribution; deviance statistic; latent equidispersion; likelihood ratio
Online: 22 April 2021 (08:55:29 CEST)
Most existing flexible count regression models allow only approximate inference. Balanced discretization is a simple method for producing a mean-parametrizable flexible count distribution starting from a continuous probability distribution. This makes it easy to define flexible count regression models that allow exact inference under various types of dispersion (equi-, under- and overdispersion). This study describes maximum likelihood (ML) estimation and inference in count regression based on the balanced discrete gamma (BDG) distribution and introduces a likelihood-ratio-based latent equidispersion (LE) test to identify the parsimonious dispersion model for a particular dataset. A series of Monte Carlo experiments was carried out to assess the performance of the ML estimates and the LE test in the BDG regression model, as compared to the popular Conway–Maxwell–Poisson (CMP) model. The results show that the two evaluated models recover population effects even under misspecification of dispersion-related covariates, with coverage rates of the asymptotic 95% confidence intervals approaching the nominal level as the sample size increases. The BDG regression approach nevertheless outperforms CMP regression in very small samples (n = 15–30), mostly in overdispersed data. The LE test proves appropriate for detecting latent equidispersion, with rejection rates converging to the nominal level as the sample size increases. Two applications to real data are given to illustrate the use of the proposed approach to count regression analysis.
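The likelihood-ratio construction behind such an LE test can be sketched generically with the standard library only; the log-likelihood values below are hypothetical, not figures from the paper.

```python
import math

def likelihood_ratio_test(loglik_restricted, loglik_full):
    """Generic LR test with one restricted parameter: the statistic
    2*(ll_full - ll_restricted) is asymptotically chi-squared with 1 df
    under the null (here, the latent-equidispersion restriction).
    The chi-squared(1) survival function is erfc(sqrt(stat / 2))."""
    stat = 2.0 * (loglik_full - loglik_restricted)
    p = math.erfc(math.sqrt(stat / 2.0))
    return stat, p

# Hypothetical fitted log-likelihoods of the restricted (equidispersed)
# and full BDG models on the same data
stat, p = likelihood_ratio_test(loglik_restricted=-250.3, loglik_full=-246.1)
print(round(stat, 1), p)
```

A small p-value would reject the equidispersion restriction and favor the full dispersion model; a large one would justify the parsimonious fit.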
ARTICLE | doi:10.20944/preprints201808.0076.v1
Subject: Engineering, Energy & Fuel Technology Keywords: Phase Change Materials; PCM; Thermal Energy Storage; Latent Heat Storage; Wood Stove; Stovepipe
Online: 3 August 2018 (15:53:56 CEST)
Batch combustion in wood log stoves is a promising application for latent heat storage (LHS), due to the transient heat production with high peak effects. The current study aimed at designing a compact, passive and durable LHS system storing a substantial part of the heat release during batch combustion and effectively releasing the stored heat to the room for 6 to 10 hours after the last batch. The LHS system consists of a coaxial cylinder located at the top of the wood stove, replacing part of the regular stovepipe. Internal metallic fins were applied as heat transfer enhancement to homogenize the temperature distribution inside the PCM. The effect of radial fin lengths was numerically investigated through a parametric study using five different fin lengths within the PCM. Using 35-mm fins in the 70-mm PCM layer yielded the best trade-off for the application. This configuration enabled achieving a slow but close to complete melting of the PCM within a realistic combustion duration, while avoiding overheating the PCM above its degradation temperature. Thereafter, the discharge allowed releasing the stored latent heat for 6 hours. The exhaust gas inlet temperature proved to have a strong influence on the PCM thermal performance.
ARTICLE | doi:10.20944/preprints202301.0284.v1
Subject: Engineering, Civil Engineering Keywords: Cycling behaviour; Structural equation modelling (SEM); Latent variable; Individuals travel survey; Cyclists behavioural analysis
Online: 16 January 2023 (09:58:39 CET)
Studying Active Transportation (AT) is widespread in American and European research, and studies that include latent variables (LVs) are growing in an effort to identify exactly what can be done to increase the utility of AT. LVs help make research more accurate. LVs are defined as psychological factors, such as feeling safe while riding at night; although hard to measure, they are very important for increasing the utility of AT modes. In this study, most of the previous studies on bicycling have been reviewed. Different variables, including subjective variables and LVs, are considered for maximizing bicycle utility and are introduced to give future researchers better insight into modeling AT mode choice. Results highlight the importance of having the will to use a bicycle, especially in difficult situations, and the cultural barriers that affect cycling among women.
ARTICLE | doi:10.20944/preprints202212.0095.v1
Subject: Engineering, Energy & Fuel Technology Keywords: hybrid DHW system; water heater; PCM; melting temperature; latent heat storage; produced water cost
Online: 6 December 2022 (08:51:05 CET)
The solar water heater must be integrated into future residential buildings as the main energy source, which will subsequently reduce the energy cost of water heating. An original configuration for efficient Domestic Hot Water "DHW" storage tank is developed and experimentally evaluated under Saharan climate. This novel DHW configuration includes a hybrid (solar and electric) energy system with a flat plate solar collector coupled with an electric heater. Additionally, phase change material "PCM" mixture that is composed of paraffin wax and animal fat with a melting temperature between 35.58°C and 62.58°C and latent heat between 180 and 210 kJ/kg is integrated into this novel tank configuration. The experimental results indicated that hot water production by using latent heat storage could be economically attractive. In this proposed configuration, one liter of hot water may cost around 0.1362 DZD/liter (i.e., 0.00096 US$/liter) compared to 0.4431 DZD/liter for the conventional water heater, an average energy cost savings of 69.26%. On a yearly basis, the average energy cost savings may reach up to 80.25% if optimal tilt for the solar collector is adopted on a monthly basis. The flat plate collector may be vulnerable to convective heat transfer, and therefore, other solar collectors such as vacuum tube collectors may provide enhanced energy performance.
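The quoted savings figure follows directly from the two per-liter costs, as a quick check shows:

```python
conventional = 0.4431  # DZD per liter, conventional water heater
hybrid = 0.1362        # DZD per liter, hybrid PCM configuration

# Relative energy cost savings of the hybrid configuration
savings = (conventional - hybrid) / conventional
print(f"{savings:.2%}")  # about 69.26%, matching the reported figure
```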
ARTICLE | doi:10.20944/preprints201701.0107.v1
Subject: Behavioral Sciences, Developmental Psychology Keywords: intelligence; development of intelligence; cognitive development; network models; factor models; psychometrics; latent variable models
Online: 25 January 2017 (03:14:34 CET)
Cronbach’s (1957) famous division of scientific psychology into two disciplines remains relevant for the fields of cognition (general mechanisms) and intelligence (dimensionality of individual differences). The welcome integration of the two fields requires the construction of mechanistic models of cognition and cognitive development that explain key phenomena in individual differences research. In this paper we argue that network modeling is a promising approach for integrating the processes of cognitive development and (developing) intelligence into one unified theory. Network models are defined mathematically, describe mechanisms at the level of the individual, and are able to explain positive correlations among intelligence subtest scores, the empirical basis for the well-known g-factor, as well as more complex factorial structures. Links between network modeling, factor modeling and item response theory allow for a common metric, encompassing both discrete and continuous characteristics, for cognitive development and intelligence.
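The claim that positively coupled processes reproduce the positive manifold behind the g-factor can be illustrated with a toy simulation (a hypothetical sketch under the assumption of uniform positive couplings, not the authors' actual network model): all pairwise correlations among simulated subtest scores come out positive, and a dominant first eigenvalue emerges.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy network: 6 cognitive processes with weak uniform positive couplings.
n = 6
W = np.full((n, n), 0.3)   # positive pairwise coupling (assumed value)
np.fill_diagonal(W, 1.0)   # unit variance per process

# Sample "subtest scores" from the implied covariance structure.
scores = rng.multivariate_normal(np.zeros(n), W, size=5000)
R = np.corrcoef(scores, rowvar=False)

# Positive manifold: all off-diagonal correlations are positive.
off_diag = R[~np.eye(n, dtype=bool)]
print("all correlations positive:", bool((off_diag > 0).all()))

# A dominant first eigenvalue mimics the g-factor.
eigvals = np.linalg.eigvalsh(R)[::-1]
print("first eigenvalue share:", eigvals[0] / eigvals.sum())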
ARTICLE | doi:10.20944/preprints202101.0066.v1
Subject: Mathematics & Computer Science, Algebra & Number Theory Keywords: predictive modelling; latent information extraction; machine learning; forward model; backward model; ambulance calls; attendance; conveyance
Online: 4 January 2021 (16:44:11 CET)
A novel machine learning approach is presented in this paper, based on extracting latent information and using it to assist decision making on ambulance attendance and conveyance to a hospital. The approach includes two steps: in the first, a forward model analyzes the clinical and, possibly, non-clinical factors (explanatory variables), predicting whether a positive decision (response variable) should be given to the ambulance call or not; in the second, a backward model analyzes the latent variables extracted from the forward model to infer the decision making procedure. The forward model is implemented through a machine or deep learning technique, whilst the backward model is implemented through unsupervised learning. An experimental study is presented, which illustrates the obtained results, by investigating emergency ambulance calls to people in nursing and residential care homes, over a one-year period, using an anonymized data set provided by East Midlands Ambulance Service in the United Kingdom.
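The two-step scheme described above can be sketched as follows (a hypothetical minimal example on synthetic data, not the paper's actual pipeline; scikit-learn is assumed as the implementation): a supervised forward model predicts the decision, its hidden-layer activations serve as the latent variables, and an unsupervised backward model clusters them.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Synthetic "ambulance call" features and binary conveyance decisions.
X = rng.normal(size=(400, 5))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

# Forward model: predict the decision from the explanatory variables.
forward = MLPClassifier(hidden_layer_sizes=(8,), activation="relu",
                        max_iter=2000, random_state=0).fit(X, y)

# Latent variables: hidden-layer activations of the forward model.
latent = np.maximum(0, X @ forward.coefs_[0] + forward.intercepts_[0])

# Backward model: unsupervised clustering of the latent representation.
backward = KMeans(n_clusters=2, n_init=10, random_state=0).fit(latent)
print("cluster sizes:", np.bincount(backward.labels_))
```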
ARTICLE | doi:10.20944/preprints201807.0628.v1
Subject: Physical Sciences, Applied Physics Keywords: Binding energy in lasers; Latent Binding energy in white light; Harnessing binding energy in sunlight
Online: 31 July 2018 (15:22:46 CEST)
The physics behind the collimated, highly directional nature of lasers, and the factors that keep the seven coloured waves forming white light together during their journey from the Sun to the Earth in the face of natural disruptive forces, are not fully understood. Energy levels were measured, in terms of alterations in induced current and voltage, in beams from a red laser, a white LED light and sunlight, before and during their disruption by diffusers (frosted glass for the laser) and diffractors (a diffraction grating for the white light), using a photovoltaic solar cell panel attached to a digital multimeter. The results show that disruption of the beams releases extra energy, named here ‘Latent Light Binding Energy’. It is hypothesized that this ‘binding’ energy keeps laser waves firmly bound together both end-on and side-on, enabling laser beams to travel long distances in a collimated manner. Likewise, the seven coloured waves that constitute white light are kept together, probably side-on, in their journey from the Sun to the Earth. The observation that diffraction of a sunbeam is associated with increased power generation provides a new lead for improving the harnessing of solar energy, where the focus is currently mainly on improving the efficiency of photovoltaic cells.
ARTICLE | doi:10.20944/preprints201908.0250.v1
Subject: Earth Sciences, Atmospheric Science Keywords: surface heat fluxes; latent heat flux; sensible heat flux; tropics; extratropics; air-sea exchanges; lower atmosphere variables
Online: 25 August 2019 (14:39:40 CEST)
Ocean surface heat fluxes play a significant role in the genesis and evolution of various marine-based atmospheric phenomena, from the synoptic scale down to the microscale. While in-situ measurements from buoys and flux towers will continue to be the standard for surface heat flux estimates, they commonly have significant gaps in temporal and spatial coverage. Previous and current satellite missions have filled these gaps; though they may not observe the fluxes directly, they can measure the variables needed (wind speed, temperature, and humidity) to estimate latent and sensible heat fluxes. However, current remote sensing instruments have their own limitations, such as infrequent coverage, signals attenuated by precipitation, or both. The Cyclone Global Navigation Satellite System (CYGNSS) mission overcomes these limitations over the tropical and subtropical oceans by providing improved coverage in nearly all weather conditions. While CYGNSS (Level 2) primarily estimates surface winds, when coupled with observations or estimates of temperature and humidity from reanalysis data, it can provide estimates of latent and sensible heat fluxes along its orbit. This paper describes the development of the Surface Heat Flux Product for the CYGNSS mission, its current results, and expected improvements and changes in future releases.
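Estimating latent and sensible heat fluxes from wind, temperature and humidity is conventionally done with bulk aerodynamic formulas; the sketch below uses constant exchange coefficients and made-up input values for illustration (operational products such as the CYGNSS flux product rely on more elaborate bulk algorithms):

```python
# Simplified bulk aerodynamic estimates of latent (LHF) and sensible (SHF)
# heat fluxes over the ocean. Constants and inputs are illustrative only.
rho = 1.2         # air density, kg/m^3
Lv = 2.5e6        # latent heat of vaporization, J/kg
cp = 1004.0       # specific heat of air, J/(kg K)
Ce = Ch = 1.2e-3  # bulk exchange coefficients (assumed constant)

U = 8.0                   # 10-m wind speed, m/s (e.g., from CYGNSS)
q_s, q_a = 0.020, 0.015   # sea-surface and air specific humidity, kg/kg
T_s, T_a = 28.0, 26.5     # sea-surface and air temperature, deg C

LHF = rho * Lv * Ce * U * (q_s - q_a)  # W/m^2
SHF = rho * cp * Ch * U * (T_s - T_a)  # W/m^2
print(f"LHF = {LHF:.1f} W/m^2, SHF = {SHF:.1f} W/m^2")
```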
ARTICLE | doi:10.20944/preprints202102.0494.v1
Subject: Life Sciences, Biochemistry Keywords: Latent membrane protein 1; Epstein-Barr virus; Herpesvirus; Proteomics; Mass spectrometry; interactions; signaling; extracellular vesicles; exosomes; CD63; Tetraspanin
Online: 22 February 2021 (16:27:27 CET)
Tetraspanin CD63 is a cell surface protein with four transmembrane domains that associates with tetraspanin-enriched microdomains and typically localizes to late endosomes and lysosomes. CD63 plays an important role in the cellular trafficking of different proteins, EV cargo sorting and vesicle formation. We have previously shown that CD63 is important in LMP1 trafficking to EVs, and that this also affects LMP1-mediated intracellular signaling, including MAPK/ERK, NF-κB and mTOR activation. Using BioID combined with mass spectrometry, we sought to define the broad CD63 interactome and how LMP1 modulates this network of interacting proteins. We identified a total of 1600 proteins as the proximal interacting network of CD63. Biological process enrichment analysis revealed significant involvement in signal transduction, cell communication, protein metabolism and transport. The CD63-only interactome was enriched in Rab GTPases, SNARE proteins and sorting nexins, while adding LMP1 to the interactome increased the presence of signaling and ribosomal proteins. Our results showed that LMP1 alters the CD63 interactome, shifting the enrichment of the protein network from protein localization and vesicle-mediated transport to metabolic processes and translation. We also show that LMP1 interacts with mTOR, Nedd4L and PP2A, indicating the formation of a multiprotein complex with CD63 and thereby potentially regulating LMP1-dependent mTOR signaling. Collectively, this comprehensive analysis of CD63 proximal interacting proteins provides insights into the network of partners required for endocytic trafficking and for extracellular vesicle cargo sorting, formation and secretion.
ARTICLE | doi:10.20944/preprints202110.0107.v1
Subject: Keywords: missing item responses; multiple imputation; item response model; PISA; country comparisons; Mislevy-Wu model; latent ignorability; nonignorable item responses
Online: 6 October 2021 (12:52:48 CEST)
Missing item responses are prevalent in educational large-scale assessment studies like the Programme for International Student Assessment (PISA). The current operational practice scores missing item responses as wrong, but several psychometricians have advocated a model-based treatment based on the latent ignorability assumption. In this approach, item responses and response indicators are jointly modeled conditional on a latent ability and a latent response propensity variable. Alternatively, imputation-based approaches can be used. The latent ignorability assumption is weakened in the Mislevy-Wu model, which characterizes a nonignorable missingness mechanism and allows the missingness of an item to depend on the item itself. The scoring of missing item responses as wrong and the latent ignorable model are submodels of the Mislevy-Wu model. This article uses the PISA 2018 mathematics dataset to investigate the consequences of different missing data treatments on country means. The obtained country means can differ substantially across scaling models. In contrast to previous statements in the literature, the scoring of missing item responses as incorrect provided a better model fit than a latent ignorable model for most countries. Furthermore, the dependence of the missingness of an item on the item itself, after conditioning on the latent response propensity, was much more pronounced for constructed-response items than for multiple-choice items. As a consequence, scaling models that presuppose latent ignorability should be rejected for two reasons. First, the Mislevy-Wu model is preferred over the latent ignorable model for reasons of model fit. Second, we argue that model fit should only play a minor role in choosing psychometric models in large-scale assessment studies because validity aspects are most relevant. Missing data treatments that countries (and, hence, their students) can simply manipulate result in unfair country comparisons.
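The nesting of the three treatments described above can be written schematically (notation ours; a simplified one-parameter sketch rather than the paper's full specification). With $\theta$ the latent ability, $\xi$ the latent response propensity, $x_i$ the response to item $i$ and $r_i$ its response indicator:

```latex
P(x_i, r_i \mid \theta, \xi)
  \;=\; P(x_i \mid \theta)\, P(r_i \mid \xi, x_i),
\qquad
\operatorname{logit} P(r_i = 1 \mid \xi, x_i)
  \;=\; \xi + \beta_i + \delta_i x_i .
```

Setting $\delta_i = 0$ for all items recovers the latent ignorable model (missingness independent of the response itself), while a freely estimated $\delta_i$ lets the missingness of an item depend on the item response, as in the Mislevy-Wu model; scoring missing responses as wrong corresponds to a further boundary case of this parameterization.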
ARTICLE | doi:10.20944/preprints202109.0389.v1
Subject: Engineering, Other Keywords: Deep learning; Variational Autoencoders (VAEs); data representation learning; generative models; unsupervised learning; few shot learning; latent space; transfer learning
Online: 22 September 2021 (16:04:22 CEST)
Despite the importance of few-shot learning, the lack of labeled training data in the real world makes it extremely challenging for existing machine learning methods, as such a limited data set does not represent the data variance well. In this research, we suggest employing a generative approach using variational autoencoders (VAEs), which can be used specifically to optimize few-shot learning tasks by generating new samples with more intra-class variations. The purpose of our research is to increase the size of the training data set using various methods in order to improve the accuracy and robustness of few-shot face recognition. Specifically, we employ the VAE generator to increase the size of the training data set, including both the base and the novel sets, while utilizing transfer learning as the backend. Based on extensive experimental research, we analyze various data augmentation methods to observe how each method affects the accuracy of face recognition. We conclude that the face generation method we propose can effectively improve the recognition accuracy rate to 96.47% using both the base and the novel sets.
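The augmentation idea, drawing new intra-class samples from the latent space, can be sketched without a trained network (a hypothetical numpy illustration of the reparameterization step only; the paper's actual VAE encodes and decodes face images, and the values below are made up):

```python
import numpy as np

rng = np.random.default_rng(0)

def augment_from_latent(mu, logvar, n_new, rng):
    """Draw new latent codes z = mu + sigma * eps (reparameterization trick)
    around one encoded example, to be passed through the VAE decoder."""
    sigma = np.exp(0.5 * logvar)
    eps = rng.standard_normal(size=(n_new, mu.size))
    return mu + sigma * eps

# Assume the encoder mapped one novel-class face to this latent posterior
# (hypothetical mean and log-variance).
mu = np.array([0.5, -1.0, 0.2, 0.8])
logvar = np.array([-2.0, -2.0, -1.5, -2.5])

z_new = augment_from_latent(mu, logvar, n_new=10, rng=rng)
print(z_new.shape)  # (10, 4): ten new latent codes for the decoder
```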
ARTICLE | doi:10.20944/preprints201805.0102.v2
Subject: Mathematics & Computer Science, Artificial Intelligence & Robotics Keywords: machine learning; algorithms; natural language processing, deep learning, vector space models, semantic similarity, distributional semantics, latent semantic analysis, word2vec
Online: 10 May 2018 (05:56:56 CEST)
“You shall know a word by the company it keeps!” is one of the most famous slogans, attributed to John Rupert Firth (1957). It ignited a whole school of linguistic research known as British empiricist contextualism. Sixty years later, many un- or semi-supervised machine learning algorithms have been successfully designed and implemented with the aim of extracting word meaning from the context of a text corpus. These algorithms treat words, more or less, as vectors of real numbers representing frequencies of word occurrences within context, and word meaning as positions of words in a high-dimensional vector space model. Word associations, in turn, are treated as calculated distances between them. With the rise of Deep Learning (DL) and other architectures based on artificial neural networks, learning the positioning of words and extracting word associations as measured by their distances has further improved. In this paper, however, we revisit the mainstream of algorithmic approaches and set the stage for a partly cross-disciplinary evaluation framework to judge the nature of the word associations extracted by state-of-the-art machine learning algorithms. Our preliminary results are based on word associations extracted by applying a DL framework to a Google News text corpus, as well as on comparisons with human-created word association lists such as word collocation dictionaries and psycholinguistic experiments. The results and conclusions provide some insights into the inherent limitations in interpreting the type of word associations and the underpinning relations between words, with inevitable consequences in other areas, such as extraction of knowledge graphs or image understanding.
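The core operation these algorithms rely on, measuring word association as distance in the embedding space, reduces to cosine similarity between vectors (the toy vectors below are made up for illustration, not trained embeddings):

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity between two word vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Toy 4-dimensional "embeddings" (illustrative values only).
vectors = {
    "king":  np.array([0.9, 0.8, 0.1, 0.0]),
    "queen": np.array([0.8, 0.9, 0.1, 0.1]),
    "apple": np.array([0.0, 0.1, 0.9, 0.8]),
}

print(cosine(vectors["king"], vectors["queen"]))  # high: related words
print(cosine(vectors["king"], vectors["apple"]))  # low: unrelated words
```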
ARTICLE | doi:10.20944/preprints202107.0282.v1
Subject: Physical Sciences, Radiation & Radiography Keywords: PADC; poly(allyl diglycol carbonate); latent track; track core radius; G value; layered structure; REFIT; NISE; detection threshold; chemical criterion
Online: 13 July 2021 (09:34:32 CEST)
The modified structure along latent tracks and the track formation process have been investigated in poly(allyl diglycol carbonate), PADC, which is well recognized as a sensitive etched track detector. This knowledge is essential to develop novel detectors with improved track registration properties. The track structures of protons and heavy ions (He, C, Ne, Ar, Fe, Kr and Xe) have been examined by means of FT-IR spectrometry, covering the stopping power region between 1.2 and 12,000 eV/nm. Through a set of experiments on low-LET radiations, such as gamma rays, a multi-step damage process by electron hits was confirmed in the radiation-sensitive parts of the PADC repeat unit. From this result, we unveiled for the first time the layered structure of tracks, in relation to the number of secondary electrons. We also proved that an etch pit is formed when at least two repeat units are destroyed along the track radial direction. To evaluate the number of secondary electrons around tracks, a series of numerical simulations was performed with Geant4-DNA. On this basis, we propose new physical criteria to describe the detection thresholds. Furthermore, we raise a present issue with the definition of the detection threshold for semi-relativistic C ions. As a possible chemical criterion, the formation density of hydroxyl groups is suggested to express the response of PADC.
ARTICLE | doi:10.20944/preprints202011.0056.v1
Subject: Mathematics & Computer Science, Algebra & Number Theory Keywords: COVID-19; Deep Learning; Natural Language Processing; Topic Modelling; Text Classification; Latent Dirichlet allocation (LDA); Non-negative matrix factorization (NMF)
Online: 2 November 2020 (15:24:20 CET)
The ongoing COVID-19 pandemic has resulted in massive damage to various sectors of the global economy, disrupting human livelihoods. Natural Language Processing has been used extensively in different organizations to categorize sentiments, perform recommendation, summarize information and model topics. This research aims to understand the non-medical impact of COVID-19 on the global economy by leveraging natural language processing methodology. The methodology comprises text classification, including topic modelling, on an unstructured COVID-19 media articles dataset provided by Anacode. Latent Dirichlet allocation (LDA) and Non-negative matrix factorization (NMF) are applied to classify the media articles dataset in order to analyze the impact of the COVID-19 pandemic on different sectors of the global economy. Model accuracy was examined using the coherence and perplexity scores, which came out to 0.51 and -10.90 for the LDA algorithm. Both the LDA and NMF algorithms identified similar prevalent topics impacted by the COVID-19 pandemic across multiple sectors of the economy. From the intertopic distance map visualization produced by the LDA algorithm, it can be inferred that general everyday life, including children's schooling, parental care, and family gatherings, had the major impact, followed by the business sector and the financial industry.
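A minimal sketch of the two topic-modelling algorithms on a toy corpus (scikit-learn is assumed here as the implementation, and the documents are invented; the paper's corpus is the Anacode media-article dataset):

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation, NMF

# Tiny illustrative corpus mimicking two themes (schooling vs. finance).
docs = [
    "schools closed and families stayed home during lockdown",
    "parents balanced childcare and remote schooling at home",
    "stock markets fell as businesses closed during the pandemic",
    "banks and financial markets faced pandemic uncertainty",
]

X = CountVectorizer(stop_words="english").fit_transform(docs)

# LDA: probabilistic topic model; perplexity is available for model checking.
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
print("LDA perplexity:", lda.perplexity(X))

# NMF: factorizes the term-document matrix into topics and weights.
nmf = NMF(n_components=2, init="nndsvda", random_state=0).fit(X)
print("NMF topic-term matrix shape:", nmf.components_.shape)
```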
ARTICLE | doi:10.20944/preprints202111.0576.v1
Subject: Mathematics & Computer Science, Applied Mathematics Keywords: vegetation cover; latent heat of fusion; climate model; climate change; energy balance models; finite volume method; WENO reconstruction; Runge-Kutta TVD
Online: 30 November 2021 (17:35:17 CET)
The aim of this work is to introduce a mathematical model representing the evolution of the temperature in a vegetation cover and the ground underneath it. Vegetation, and its interaction with soil, plays a very important role in protecting the soil surface from the action of sun and precipitation. A reduction in the vegetated mass increases the risk of desertification, soil erosion or surface runoff, which can give rise to soil loss and sediment retention. These processes can favour climate change and global warming, which are major concerns nowadays. The mathematical model presented takes into account the main processes involved in vegetation cover and its interaction with the soil. Among these are the Leaf Area Index, a dimensionless quantity defined as the one-sided green leaf area per unit ground surface area, and the albedo and co-albedo, which are clearly influenced by the vegetation. A nonlinear heat capacity in the soil is also considered, incorporating the latent heat of fusion when the phase change takes place. The numerical technique used to solve the mathematical model is based on a finite volume scheme with a Weighted Essentially Non-Oscillatory technique for spatial reconstruction, and the third-order Runge-Kutta Total Variation Diminishing scheme is used for time integration. Some numerical examples are solved to obtain the distribution of temperature both in the vegetation cover and in the soil.
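A nonlinear heat capacity incorporating latent heat of fusion is commonly handled with an apparent-heat-capacity formulation, smearing the latent heat over a small temperature interval around the phase-change point. The sketch below is a generic illustration with made-up parameter values, not the paper's calibrated soil model:

```python
import numpy as np

def apparent_heat_capacity(T, c_solid, c_liquid, L, T_m, dT):
    """Effective volumetric heat capacity c(T): the latent heat L is
    smeared as a Gaussian pulse of width dT around the melting point T_m."""
    base = np.where(T < T_m, c_solid, c_liquid)
    pulse = L / (dT * np.sqrt(np.pi)) * np.exp(-((T - T_m) / dT) ** 2)
    return base + pulse

T = np.linspace(-10.0, 10.0, 2001)  # temperature grid, deg C
c = apparent_heat_capacity(T, c_solid=2.0e6, c_liquid=4.2e6,
                           L=3.34e8, T_m=0.0, dT=0.5)

# Integrating the pulse over temperature recovers (approximately) L.
base = np.where(T < 0.0, 2.0e6, 4.2e6)
latent_recovered = float(np.sum(c - base) * (T[1] - T[0]))
print(f"recovered latent heat: {latent_recovered:.3e} J/m^3")  # ~3.34e8
```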
ARTICLE | doi:10.20944/preprints201901.0083.v1
Subject: Earth Sciences, Geoinformatics Keywords: earthquake; anomaly detection; Google Earth Engine; outliers; interquartile range (IQR); multiparameter; brightness temperature (BT); latent heat flux (LE); land surface temperature; wind speed
Online: 9 January 2019 (11:59:14 CET)
One of the most destructive natural disasters is the earthquake, which brings enormous risks to humankind. The objective of the current study was to determine the spatiotemporal anomalies of multiple remotely sensed parameters (i.e., land surface temperature (LST), air temperature, specific humidity, precipitation and wind speed) for many earthquake samples that occurred during 2018 around the world. In this research, 11 earthquakes (M > 6.0) were studied (4 samples on land under transparent-sky conditions, 3 samples on land under cloudy conditions and 4 marine earthquakes). The interquartile range (IQR) and mean ± 2σ methods were utilized to improve the efficiency of detecting anomalous differences. Based on the IQR method, a negative anomaly before the event was detected during the daytime in Mexico and during the nighttime in Afghanistan. In addition, a negative outlier of brightness temperature (BT) was detected in Alaska before, during and after the event. In contrast, based on the IQR and mean ± 2σ methods, a positive anomaly in precipitation was detected before and after the event in all investigated examples. According to the mean ± 2σ method, negative anomalies in LST, specific humidity, sea surface temperature (SST_100) and wind were detected in most of the examined earthquake samples. In contrast, a positive SST_0 anomaly was observed in Fiji and Honduras after the earthquake. Our results suggest that, for forecasting marine earthquakes, a prior negative anomaly in wind speed can be merged with one in SST_100. For inland cloudy-sky earthquakes, the merged anomaly parameters could be a prior negative anomaly in BT and skin temperature together with a positive anomaly in precipitation; for inland transparent-sky earthquakes, they are usually prior negative anomalies in air temperature, specific humidity and LST.
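The IQR and mean ± 2σ outlier rules used above are standard; a minimal sketch on a synthetic time series (made-up values, not the study's satellite data):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic daily land-surface-temperature series with one injected outlier.
lst = rng.normal(loc=30.0, scale=1.0, size=60)
lst[45] = 24.0  # injected negative anomaly (e.g., a pre-seismic signal)

# IQR rule: flag values outside [Q1 - 1.5*IQR, Q3 + 1.5*IQR].
q1, q3 = np.percentile(lst, [25, 75])
iqr = q3 - q1
iqr_outliers = np.flatnonzero((lst < q1 - 1.5 * iqr) | (lst > q3 + 1.5 * iqr))

# mean ± 2σ rule: flag values more than two standard deviations from the mean.
mu, sigma = lst.mean(), lst.std()
sigma_outliers = np.flatnonzero(np.abs(lst - mu) > 2 * sigma)

print("IQR outliers at days:", iqr_outliers)       # includes day 45
print("mean±2σ outliers at days:", sigma_outliers)  # includes day 45
```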
ARTICLE | doi:10.20944/preprints201708.0006.v1
Subject: Keywords: Water Framework Directive; ecological and microbiological water quality; choice experiment; willingness to pay for river water quality; conditional logit; latent class analysis; nonmarket benefits
Online: 3 August 2017 (05:52:45 CEST)
One important motivation for the implementation of the Water Framework Directive is the creation of non-market environmental benefits such as improved ecological quality, or greater opportunities for open-access river recreation via microbial pollution remediation. Pollution sources impacting on ecological or recreational water quality can be uncorrelated but non-market benefits arising from riverine improvements are typically conflated within benefit valuation studies. Using stated preference choice experiments, we seek to disaggregate these sources of value for different river users, thereby allowing decision makers to understand the consequences of adopting alternative investment strategies. Our results suggest anglers derive greater value from improvements to the ecological quality of river water, in contrast to swimmers and rowers for whom greater value is gained from improvements to recreational quality. We also find three distinct groups of respondents: a majority preferring ecological over recreational improvements, a substantial minority holding opposing preference orderings and a small proportion expressing relatively low values for either form of river quality enhancement. As such, this research demonstrates that the non-market benefits which may accrue from different types of water quality improvements are nuanced in terms of their potential beneficiaries and, by inference, their overall value and policy implications.
REVIEW | doi:10.20944/preprints202105.0254.v1
Subject: Mathematics & Computer Science, Artificial Intelligence & Robotics Keywords: deep learning; deep neural networks; recurrent LSTM models; attention layers; latent variable extraction; domain adaptation; yield and growth prediction in greenhouses; energy optimization in retail refrigerator systems; verification and recognition of expiry date in retail food packaging.
Online: 11 May 2021 (15:46:14 CEST)
This paper provides a review of an emerging field in the food processing sector, referring to efficient and safe food supply chains, 'from farm to fork', as enabled by Artificial Intelligence (AI). Recent advances in machine and deep learning are used for effective food production, energy management and food labeling. Appropriate deep neural architectures are adopted and used for this purpose, including Fully Convolutional Networks, Long Short-Term Memories and Recurrent Neural Networks, Auto-Encoders and Attention mechanisms, Latent Variable extraction and clustering, as well as Domain Adaptation. Three experimental studies are presented, illustrating the ability of these AI methodologies to produce state-of-the-art performance across the whole food supply chain. In particular, these concern: (i) predicting plant growth and tomato yield in greenhouses, thus matching food production to market needs and reducing food waste or food unavailability; (ii) optimizing energy consumption across large networks of food retail refrigeration systems, through optimal selection of systems that can be shut down and through prediction of the respective food de-freezing times during peaks of power demand load; (iii) optical recognition and verification of food expiry dates in automatic inspection of retail packaged food, thus ensuring food safety and people's health.