ARTICLE | doi:10.20944/preprints202212.0123.v2
Online: 9 December 2022 (10:10:08 CET)
(1) Background: This study aims to validate the use of Bloom's revised taxonomy as an instrument for the design of assessment tests; (2) Methods: The instrument was validated by external judges, as well as by teachers and students, using Aiken's V coefficient; (3) Results: Judges, teachers and students consider Bloom's revised taxonomy an effective tool for the design of assessment tests; (4) Conclusions: Using Bloom's revised taxonomy as a model for designing assessment tests promotes learning.
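The Aiken's V coefficient used in the study above can be sketched as follows; the formula is the standard one, but the judges' ratings and the 1-5 scale in the example are hypothetical, not taken from the study.

```python
# Hedged sketch of Aiken's V content-validity coefficient;
# the ratings and 1-5 scale below are hypothetical.
def aikens_v(ratings, lo=1, hi=5):
    """Aiken's V = sum(r - lo) / (n * (hi - lo)) for n judges'
    ratings r on a closed scale [lo, hi]; ranges from 0 to 1."""
    n = len(ratings)
    s = sum(r - lo for r in ratings)
    return s / (n * (hi - lo))

# Five hypothetical judges rate one item on a 1-5 scale:
print(aikens_v([5, 4, 5, 4, 5]))  # 0.9
```

Values near 1 indicate strong agreement that the item is valid; thresholds for acceptance vary by study.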
ARTICLE | doi:10.20944/preprints201807.0181.v1
Online: 10 July 2018 (13:41:28 CEST)
Data from NASA’s Soil Moisture Active Passive (SMAP) mission and from the California Cooperative Oceanic Fisheries Investigations (CalCOFI) were used to examine the freshening that occurred during 2015-2016 in the Southern California Current System. Overall, the freshening was found to be related to the 2014-2016 Northeast Pacific Warm Anomaly. The primary goal was to determine the feasibility of using SMAP data to observe the surface salinity signal associated with the warming. As a first step, direct comparisons were made with salinity from the CalCOFI data at one-meter depth. During 2015, SMAP was saltier than CalCOFI by 0.5 PSU, but biases were reduced to < 0.1 PSU during 2016. South of 33°N, and within 100 km of the coast, SMAP was fresher in 2015 by almost 0.2 PSU, while CalCOFI showed freshening of 0.1 PSU. North of 33°N, both datasets saw significant freshening in 2016, SMAP by 0.4 PSU and CalCOFI by 0.2 PSU. Differences between SMAP and CalCOFI are consistent with the increased stratification in 2015 and changes in the mixed layer depth.
TECHNICAL NOTE | doi:10.20944/preprints202206.0097.v1
Online: 7 June 2022 (08:12:53 CEST)
InSAR and associated analytic methods can measure surface deformation from low Earth orbit with a claimed accuracy of centimeters to millimeters. The realized accuracy depends on the area being measured and on the choice of analytic method, suggesting one choose a method in response to the area being measured. Here we consider a specific fixed analytic method and compare the results it produces to measurements gathered by other means in a variety of settings. In particular, we compare Sentinel-1 InSAR with GPS at the Kilauea volcano around the 2018 eruption, with GPS in the city of Arica, Chile, and with public survey data at a decommissioned tailings mine. In addition, we compare two independent Sentinel-1 InSAR analyses for a railway station in Oslo, Norway. Our goal is to estimate the accuracy of a fully automated Sentinel-1 InSAR pipeline in various settings. We conclude that centimeter-level accuracy is a reasonable claim in many, but not all, settings, and that accuracy is typically not lost by using an automated pipeline instead of hand-selecting and tuning parameters.
CONCEPT PAPER | doi:10.20944/preprints202006.0294.v1
Online: 24 June 2020 (09:48:04 CEST)
Ethics is a research field that is receiving increasing attention in Computer Science due to the proliferation of artificial intelligence software, machine learning algorithms, robotic agents (such as chatbots), and so on. Indeed, ethics research has so far produced a set of guidelines, such as ethical codes, to be followed by people involved in Computer Science. However, little effort has been devoted to producing formal requirements to be included in the design process of software intended to behave ethically toward users. In this paper, we investigate the issues that make a software product ethical and propose a set of metrics for quantitatively evaluating whether a software product can be considered ethical.
ARTICLE | doi:10.20944/preprints201704.0159.v1
Online: 25 April 2017 (11:19:25 CEST)
YG-13A represents the highest level of Chinese SAR satellites to date. In this paper, we report on experiments conducted to improve and validate the ranging accuracy of YG-13A. We analyze the error sources in the YG-13A ranging system, such as atmospheric path delay and transceiver channel delay. A real-time atmospheric delay correction model is established to calculate the atmospheric path delay, considering both the troposphere and ionosphere delays. Six corner reflectors (CRs) were set up to ensure the accuracy of the validation methods. Pixel location accuracies of up to 0.479-m standard deviation can be achieved after a complete calibration. We further demonstrate that the adjustment of the CRs can cause a marginal loss of ranging precision; after eliminating this error, the ranging accuracy is improved to 0.237 m. YG-13A uses a single-frequency GPS receiver with a nominal orbital accuracy of 0.3 m, which is the biggest factor restricting its ranging accuracy. Our results show that the ranging accuracy of YG-13A reaches the decimeter level, below the centimeter-level accuracy of TerraSAR-X, which carries a dual-frequency GPS receiver. YG-13A nevertheless offers convenient access to control points and target locations without depending on ground equipment.
ARTICLE | doi:10.20944/preprints202212.0262.v1
Subject: Behavioral Sciences, Clinical Psychology Keywords: temperament; measurement; mood disorder; validation; Korean
Online: 15 December 2022 (03:05:48 CET)
Background and Objectives: The Temperament Evaluation of Memphis, Pisa, Paris and San Diego Autoquestionnaire (TEMPS-A) is designed to assess affective temperaments. The short version of TEMPS-A (TEMPS-A-SV) has been translated into diverse languages for broad application in research and clinical settings. However, no study has validated the Korean version of the TEMPS-A-SV among patients with mood disorders. The purpose of this study is to examine the reliability and validity of the TEMPS-A-SV in Korean patients with mood disorders. Materials and Methods: In this cross-sectional retrospective study, a total of 715 patients (267 with major depressive disorder, 94 with bipolar I disorder, and 354 with bipolar II disorder) completed the Korean TEMPS-A-SV. Cronbach's alpha and McDonald’s omega were used to assess reliability. Exploratory factor analysis (EFA) was also performed. Spearman's correlation coefficient was used to examine associations between the five temperaments. Differences in the five temperament scores between gender and diagnosis groups were analyzed, and correlations between the temperament scores and age were tested. Results: The Korean TEMPS-A-SV displayed good internal consistency (α = 0.65–0.88, ω = 0.66–0.90) and significant correlations between the subscales, except between hyperthymic and anxious. EFA produced a two-factor structure: Factor I (cyclothymic, depressive, irritable, and anxious) and Factor II (hyperthymic). The cyclothymic temperament score differed by gender, and the anxious temperament score was significantly correlated with age. All the temperaments except the irritable temperament showed significant differences between diagnosis groups. Conclusions: Overall, our findings indicate that the TEMPS-A-SV is a valid and reliable measure for estimating affective temperaments among Koreans.
Our results confirm the validity of the TEMPS-A-SV among Korean patients with mood disorders. However, further study of affective temperaments and their associated characteristics in people with mood disorders is required.
COMMUNICATION | doi:10.20944/preprints202210.0427.v1
Subject: Physical Sciences, Fluids & Plasmas Keywords: Turbulence model; Reynolds stresses; RANS; validation rule
Online: 27 October 2022 (08:39:32 CEST)
In this paper, a self-closed turbulence model without any adjustable parameters is formulated for the Reynolds-averaged Navier-Stokes (RANS) equations. A validation rule for self-closed turbulence models is rigorously derived from the Reynolds-averaged Navier-Stokes equation. The rule is not affected by how the Reynolds stresses are modelled.
REVIEW | doi:10.20944/preprints202206.0047.v1
Subject: Medicine & Pharmacology, Nursing & Health Studies Keywords: Rehabilitation; new technology; validation; study design; methods
Online: 3 June 2022 (11:12:44 CEST)
Important current limitations of the implementation of Evidence-Based Practice (EBP) in the rehabilitation field are related to the validation process for new technologies and interventions. Indeed, most of the strict guidelines developed for the validation of new drugs (i.e., double or triple blinding, strict control of dose and intensity) cannot, or can only partially, be applied in rehabilitation. Well-powered, high-quality randomized controlled trials are more difficult to organize in rehabilitation (e.g., interventions last longer and are harder to standardize than in drug validation studies, and funding is limited since trials are not sponsored by large pharmaceutical companies), which reduces the possibility of conducting systematic reviews and meta-analyses, as high-level evidence is currently sparse. This paper presents the current limitations of EBP in rehabilitation and suggests innovative solutions to tackle them and increase the quality of rehabilitation research, including technology-supported rehabilitation systems, continuous assessment, pragmatic trials, rehabilitation treatment specification systems, and advanced statistical methods. The development and implementation of new technologies should increase the quality of research and the level of evidence supporting rehabilitation, provided our research methodology is adapted accordingly.
Subject: Earth Sciences, Geophysics Keywords: InSAR; InSAR calibration/validation; atmosphere/troposphere variations
Online: 21 December 2020 (12:34:10 CET)
Atmospheric propagation phase variations are the dominant source of error for InSAR time-series analysis, generally exceeding uncertainties from poor SNR or low signal correlation. The spatial properties of these errors have been well studied, but their temporal dependence and correction have received much less attention to date. We present here an evaluation of the magnitude of tropospheric artifacts in derived time series after compensation using an algorithm that requires only the InSAR data themselves. The level of artifact reduction equals or exceeds that from many weather-model-based methods, while avoiding the need to access fine-scale atmosphere parameters globally at all times. Our method consists of identifying all points in an InSAR stack with consistently high correlation, then computing and removing a fit of the phase at each of these points with respect to elevation. Comparison with GPS truth yields a reduction by a factor of 3, from an rms misfit of 5-6 cm to ~2 cm over time. This algorithm can be readily incorporated into InSAR processing flows without the need for outside information.
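The elevation-dependent correction described above can be sketched as a linear fit removal. This is a minimal sketch: the array names, and the choice of a first-order fit per epoch, are assumptions for illustration, since the abstract does not specify the fit form.

```python
import numpy as np

# Sketch, assuming `phase` and `elev` hold the unwrapped phase and the
# elevation at the persistently high-correlation points of one epoch.
def remove_tropo_ramp(phase, elev):
    """Fit phase linearly against elevation and subtract the fit,
    removing a first-order stratified tropospheric delay."""
    slope, intercept = np.polyfit(elev, phase, 1)
    return phase - (slope * elev + intercept)
```

In a full pipeline this fit would be applied per acquisition across the stack; a higher-order polynomial could be substituted where the phase-elevation relation is nonlinear.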
ARTICLE | doi:10.20944/preprints201810.0089.v2
Subject: Life Sciences, Endocrinology & Metabolomics Keywords: Metabolomics; Benchtop NMR; Biomarkers; Biomolecules; Validation; Protocol
Online: 5 December 2018 (16:14:52 CET)
Novel sensing technologies for liquid biopsies offer a promising prospect for the early detection of metabolic conditions through -omics techniques. Indeed, high-field NMR facilities are routinely used for metabolomics investigations on a range of biofluids in order to rapidly recognize unusual metabolic patterns in patients suffering from a range of diseases. However, these techniques are restricted by the prohibitively large size and cost of such facilities, suggesting a possible role for smaller, low-field NMR instruments in biofluid analysis. Herein we describe selected biomolecule validation on a low-field benchtop NMR spectrometer (60 MHz) and present an associated protocol for the analysis of biofluids on compact NMR instruments. We successfully detect common markers of diabetic control at low-to-medium concentrations through optimized experiments, including glucose (≤ 2.6 mmol/L) and acetone (25 μmol/L), in readily accessible biofluids. We present a combined protocol for the analysis of these biofluids with low-field NMR spectrometers for metabolomics, and offer a perspective on the future of this technique for point-of-care applications.
ARTICLE | doi:10.20944/preprints201810.0632.v1
Subject: Engineering, Electrical & Electronic Engineering Keywords: Modelica; heat pump; HiL; model validation; testbed
Online: 26 October 2018 (12:11:57 CEST)
Heating systems such as heat pumps and combined heat and power (CHP) systems represent key components of the future smart grid. Their capability to couple the electricity and heat sectors promises massive potential for the energy transition. Hence, these systems are continuously studied numerically and experimentally to quantify their potential and develop optimal control methods. Although numerical simulations provide a time- and cost-effective solution for system development and optimization, they are exposed to several uncertainties. A hardware-in-the-loop (HiL) system enables system validation and evaluation under different real-life dynamic constraints and boundary conditions. In this paper, a HiL heat pump testbed is presented and used for two case studies. In the first case, the conventional heat pump testbed operation method is compared to the HiL operation method; energetic and dynamic analyses are performed to quantify the added value of the HiL approach and its necessity for dynamic analysis. In the second case, the HiL testbed is used to validate the operation of a heat pump in a single-family house participating in a local energy market. It enables not only the dynamics of the heat pump and the space heating circuit to be validated, but also the building room temperature. The energetic analysis indicated deviations of 2% and 5% for heat generation and electricity consumption of the heat pump, respectively. The dynamic analysis emphasized the model's capability to represent the dynamics of a real system with a temporal distortion of 3%.
REVIEW | doi:10.20944/preprints201806.0191.v1
Subject: Life Sciences, Genetics Keywords: rare disease; functional genomics; genetic variant validation
Online: 12 June 2018 (12:36:08 CEST)
Many insights into human disease have been built on experimental results in Drosophila, and research in fruit flies is often justified on the basis of its predictive value for questions related to human health. Additionally, there is now a growing recognition of the value of Drosophila for the study of rare human genetic diseases, either as a means of validating the causative nature of a candidate genetic variant found in patients, or as a means of obtaining functional information about a novel disease-linked gene when there is little known about it. For these reasons, funders in the US, Europe, and Canada have launched targeted programs to link human geneticists working on discovering new rare disease loci with researchers who work on the counterpart genes in Drosophila and other model organisms. Several of these initiatives are described here, as are a number of output publications that validate this new approach.
SHORT NOTE | doi:10.20944/preprints201801.0030.v1
Subject: Earth Sciences, Space Science Keywords: tri-stereo; DSM; validation; urban surface morphology
Online: 5 January 2018 (05:18:21 CET)
A very high-resolution DSM covering an area of 400 km2 over the Athens Metropolitan Area has been produced using Pleiades 1B 0.5-m panchromatic tri-stereo images. Remote sensing and photogrammetry tools were applied, resulting in a 1x1-m DSM over the study area. DSM accuracy has been evaluated by comparison with elevations measured with D-GPS and with a reference DSM provided by the National Cadaster & Mapping Agency S.A. In addition, different combinations of stereo images have been prepared to further explore the quality of DSMs produced from stereo vs. tri-stereo images. Results show that the DSM produced from the tri-stereo images has an RMSE of 1.17 m in elevation (z), which is among the best reported in the relevant literature; stereo-based DSMs from the same sensor performed worse. Satellite Remote Sensing (SRS) based DSMs over urban areas provide the most cost-effective approach in comparison to airborne-based datasets due to high spatial coverage, lower cost and high temporal coverage. Pleiades-based high-quality DSM products can serve the domains of urban planning/climate, hydrological modelling and natural hazards as major input for simulation models and morphological analysis at the local scale.
ARTICLE | doi:10.20944/preprints201706.0118.v1
Subject: Medicine & Pharmacology, Nutrition Keywords: dietary assessment; FFQ; recall; nutritional biomarker; validation
Online: 27 June 2017 (04:58:14 CEST)
The development of reliable Food Frequency Questionnaires (FFQs) requires detailed information about the level and variation of dietary food intake of the target population. However, these data are often limited. To facilitate the development of new high-quality FFQs and the validation of existing FFQs, we developed a comprehensive National Dietary Assessment Reference Database (NDARD) detailing the level and variation in dietary food intake of people aged 20-70 years in the general Dutch population. This paper describes the methods and characteristics of the population included in the NDARD database. A total of 1063 men and 985 women agreed to participate in this research. Dietary intake data were collected using different FFQs, web-based and telephone-based 24-hour recalls, as well as blood- and urine-based biomarkers. The baseline FFQ was completed by 1647 participants whose mean BMI was 26±4 kg/m2; 1117 participants completed telephone-based recalls and 1781 participants completed web-based recalls. According to the baseline FFQ, the mean energy intake was 2051±605 kcal/day. The percentage of total energy intake from protein was 15±2 En%, from carbohydrates 43±6 En%, and from fat 36±5 En%. This database will enable researchers to validate existing FFQs and to develop new high-quality dietary assessment methods.
ARTICLE | doi:10.20944/preprints202301.0563.v1
Subject: Medicine & Pharmacology, Nutrition Keywords: Dietary fiber; food frequency questionnaire; questionnaire screening; validation
Online: 31 January 2023 (02:22:50 CET)
Dietary fiber has been associated with health benefits; therefore, the availability of validated tools to assess the consumption of high-fiber foods would allow the quantification of the intake of this functional nutrient, the identification of risk groups and target populations, and the development of public policies and/or programs aimed at improving the health of the population. In this study, a fiber intake short food frequency questionnaire (FFQ) was translated into Spanish and its content validity was determined by a group of experts; a pilot test was subsequently conducted with 198 subjects aged 36±12.5 years residing in Chile (46 men and 150 women), with the purpose of quantifying dietary fiber intake. The global assessment of the FFQ revealed a validity coefficient of 0.98±0.02. After the pilot, the mean dietary fiber intake of adult Chilean residents was 13 g per day, similar to the results of the National Food Consumption Survey 2010 (12.5 g per day in men, and 11.5 g in women). The FFQ is a quick and valid tool to classify people on the basis of their habitual dietary fiber intake.
ARTICLE | doi:10.20944/preprints202301.0451.v1
Subject: Chemistry, Analytical Chemistry Keywords: meloxicam; nimesulide; UV-spectrophotometric determination; cleaning validation samples
Online: 25 January 2023 (09:41:22 CET)
The spectrophotometric methods for the determination of the active pharmaceutical ingredients meloxicam and nimesulide were reviewed, and a simple UV-spectrophotometric method for the determination of these active pharmaceutical ingredients in industrial equipment cleaning validation samples was proposed. The method is based on extraction of residual quantities of meloxicam and nimesulide from the manufacturing equipment surface with concentrated sodium carbonate solution, followed by UV-spectrophotometric determination of the basic forms of the drugs at a wavelength of 362 nm for meloxicam and 397 nm for nimesulide. The calibration graphs are linear in the range from 5 to 25 mg/L for both drugs; the molar attenuation coefficients are 6100 m2/mol for nimesulide and 9100 m2/mol for meloxicam; the limits of detection are 0.8 mg/L for nimesulide and 1.9 mg/L for meloxicam; and the limits of quantification are 2.5 mg/L for nimesulide and 5.8 mg/L for meloxicam. The methods are selective with respect to common excipients, show good accuracy (relative uncertainty not exceeding 4%) and precision (relative standard deviation not exceeding 5%), do not require lengthy sample preparation or sophisticated laboratory equipment, and are suitable for the routine analysis of cleaning validation samples.
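Limits of detection and quantification like those reported above are commonly derived from the calibration line. The sketch below uses one common convention (ICH Q2: LOD = 3.3σ/S, LOQ = 10σ/S, with σ the residual standard deviation and S the slope); whether the authors used this exact convention is not stated in the abstract, and the calibration data in the test are hypothetical.

```python
import numpy as np

# Sketch of the ICH Q2 convention for LOD/LOQ from a calibration line:
# LOD = 3.3*sigma/S, LOQ = 10*sigma/S, where sigma is the residual
# standard deviation of the fitted line and S is its slope.
def lod_loq(conc, signal):
    slope, intercept = np.polyfit(conc, signal, 1)
    resid = signal - (slope * conc + intercept)
    sigma = np.sqrt(np.sum(resid ** 2) / (len(conc) - 2))
    return 3.3 * sigma / slope, 10 * sigma / slope
```

Note that with this convention the LOQ/LOD ratio is fixed at 10/3.3 regardless of the data, since σ and S cancel.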
REVIEW | doi:10.20944/preprints202210.0034.v1
Subject: Biology, Agricultural Sciences & Agronomy Keywords: insect; genome; biopesticide; silencing; topical; gene target; validation
Online: 5 October 2022 (10:57:47 CEST)
Global crop yields are estimated to be reduced by 30–40% per year on account of plant pests and pathogens. Agricultural insect pests raise concerns about constraining global food security, with climate change contributing to rising infestation. Current management relies on plant breeding, associated or not with transgenes, and on chemical pesticides. Both approaches face serious technology obsolescence in the field due to resistance breakdown or the development of insecticide resistance. The need for new Modes of Action (MoA) in managing crop health grows each year, driven by market demands to reduce economic losses and by phytosanitary requirements to meet consumer expectations. Disabling pest genes by sequence-specific expression silencing is considered a promising tool in the development of environment- and health-respectful biopesticides. The specificity conferred by long dsRNA-based solutions helps minimize effects on off-target genes in the insect pest genome and on the target gene in non-target organisms (NTOs). In this review, we summarize the current status of gene silencing by RNA interference (RNAi) for agricultural pest control. More specifically, we focus on the engineering, development and application of gene silencing to control Lepidoptera through non-transformative dsRNA technologies. Despite some delivery and stability drawbacks of topical applications, we review works showing convincing proof-of-concept results that point to imminent innovative solutions. Considerations about the regulation of ongoing research on dsRNA-based pesticides intended for commercialized products for exogenous application are discussed. Academic and industry initiatives reveal a worthy effort toward controlling Lepidoptera pests with this new mode of action to provide more sustainable and reliable technologies for field management. New genomic data on this taxon encourage the expansion of a customized target-gene portfolio.
As a case study, we illustrate how dsRNA and associated methodologies could be applied to control an important lepidopteran coffee pest.
ARTICLE | doi:10.20944/preprints202208.0424.v1
Subject: Behavioral Sciences, Clinical Psychology Keywords: Aging; Attitudes; Subjective Well-being; Ageism; Psychometric Validation
Online: 25 August 2022 (03:17:06 CEST)
Scientific literature shows increased interest in the aged and the aging phenomenon. The Aging Attitudes Questionnaire (AAQ) was validated for the Portuguese population to understand the importance of attitudes towards old age and their impact on the subjective well-being of the elderly. A sample of 400 subjects (aged 18 to 93 years) answered a socio-demographic questionnaire and the AAQ, which is composed of three subscales (psychosocial losses, physical change, and psychological growth). The CFA confirmed the tri-factorial structure with very good adjustment of the model to the data, with a Cronbach's alpha of .84 for the total scale, ranging from .65 to .77 for each factor. A total of 9 items were omitted for poor factor loadings (<0.50); notwithstanding, 3 items below the criterion were retained, as they conceptually fit their factor. Of the final 15 AAQp items, 5 belong to the Psychosocial Loss factor, 6 to Physical Change, and 4 to Psychosocial Growth. This three-factor model explained 50.1% of the total variance. In conclusion, this study supports that the AAQ has acceptable validity, confirming the composite reliability and the discriminant validity, but not the convergent validity. Through multi-group analysis, the invariance of the scale was confirmed. This validation is of pivotal importance since it allows measuring attitudes towards aging, thus facilitating the promotion of well-being across the lifespan.
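The internal-consistency index reported above, Cronbach's alpha, can be computed with a few lines of code; the respondents-by-items matrix in the example is hypothetical, not the AAQ sample.

```python
import numpy as np

# Minimal sketch of Cronbach's alpha for a respondents-by-items
# score matrix; the data used in examples are made up.
def cronbach_alpha(X):
    """alpha = k/(k-1) * (1 - sum of item variances / variance of totals),
    for k items, using sample (ddof=1) variances."""
    X = np.asarray(X, dtype=float)
    k = X.shape[1]
    item_vars = X.var(axis=0, ddof=1).sum()
    total_var = X.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)
```

Perfectly correlated items give alpha = 1; values around .65-.88, as in the studies above, are typically read as acceptable-to-good subscale reliability.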
ARTICLE | doi:10.20944/preprints202206.0126.v1
Online: 8 June 2022 (11:03:04 CEST)
The aim of this study was to develop and validate an HPLC-DAD-UV method for the determination of verteporfin in real samples (simulated body fluid, simulated tears, 0.9% isotonic sodium chloride solution, Lactated Ringer IV solution for infusion, 5% dextrose IV solution for infusion, lemon juice and drinking water). Method validation parameters for verteporfin, such as specificity, linearity, precision, accuracy, robustness, limit of detection (LOD) and limit of quantitation (LOQ), were assessed according to the International Conference on Harmonisation (ICH) Q2(R1) guidelines. The LOD and LOQ for verteporfin were found to be 0.06 µg/L and 0.2 µg/L, respectively. The recovery values from the optimization and validation for verteporfin were in the range of 97.5-100.7%, and the relative standard deviations (RSD) were <1%. The developed method was successfully applied to the real samples with high accuracy; the recoveries (%) were 99.9, 100, 98.2, 99.2, 99.4, 98.8 and 99.4, respectively.
ARTICLE | doi:10.20944/preprints202203.0350.v1
Subject: Biology, Entomology Keywords: insecticide resistance; resistance monitoring; method validation; WHO tube
Online: 25 March 2022 (15:40:56 CET)
Accurately monitoring insecticide resistance in target mosquito populations is important to combating malaria and other vector-borne diseases, and robust methods are key. The "WHO susceptibility bioassay" has been used for more than 60 years: mosquitoes of known physiological status are exposed to a discriminating concentration of insecticide. Several changes to the test procedures have been made over time which may seem minor but could impact bioassay results. The published test procedures and literature for this method were reviewed for methodological details. Areas where there was room for interpretation in the test procedures, or where the test procedures were not being followed, were assessed experimentally for their impact on bioassay results: covering or uncovering the tube end during exposure, the number of mosquitoes per test unit, and mosquito age. Many publications do not cite the most recent test procedures; methodological details are reported which contradict the referenced test procedures, or are not fully reported, so the precise methodology is often unclear. Experimental testing showed that using fewer than the recommended 15-30 mosquitoes per test unit significantly reduced mortality, covering the exposure tube had no effect, and using mosquitoes older than 2-5 days increased mortality, particularly in the resistant strain. Recommendations are made for better reporting of experimental parameters.
ARTICLE | doi:10.20944/preprints202105.0771.v1
Subject: Physical Sciences, Acoustics Keywords: Validation; Kinematic; Inertial measurement units; motion analysis; gait
Online: 31 May 2021 (12:47:51 CEST)
Gait analysis has historically been implemented in laboratory settings with expensive instruments; recently, however, wearable sensors have allowed its integration into clinical applications and use in daily activities. Previous studies have shown poor validity of ankle joint measurements using inertial measurement units (IMUs), especially for small movement ranges. The purpose of this study was to validate the ability of commercially available IMUs to accurately measure ankle joint angles during running. Ten healthy subjects participated in the study. Validation was performed by comparing the ankle joint angles measured using the wearable device with those obtained using a gold-standard motion capture system during running. Reliability was evaluated using the intraclass correlation coefficient and the standard error of measurement, whereas validity was evaluated using the Pearson correlation coefficient. Day-to-day reliability was excellent in both planes for the ankle joint, and validity was good in both the sagittal and frontal planes. The results suggest that the device might be used as an alternative to a 3D motion capture system.
ARTICLE | doi:10.20944/preprints202005.0431.v2
Subject: Medicine & Pharmacology, Other Keywords: Hip fracture; Casemix; Validation; Discrimination; Risk score; Calibration
Online: 9 July 2020 (17:23:04 CEST)
Objectives: Independent validation of risk scores after hip fracture is uncommon, particularly for evaluation of outcomes other than death. We aimed to assess the Nottingham Hip Fracture Score (NHFS) for prediction of mortality, physical function, length of stay and postoperative complications. Design: Analysis of routinely collected prospective data, partly collected by follow-up interviews. Setting and Participants: Consecutive hip fracture patients were identified from the Northumbria hip fracture database between 2014 and 2018. Patients were excluded if they were not surgically managed or if scores for predictive variables were missing. Methods: C-statistics were calculated to test the discriminant ability of the NHFS, Abbreviated Mental Test Score (AMTS), and ASA grade for in-hospital, 30-day and 120-day mortality; functional independence at discharge, 30 days and 120 days; length of stay; and postoperative complications. Results: We analysed data from 3208 individuals, mean age 82.6 (SD 8.6), of whom 2192 (70.9%) were female. 194 (6.3%) died during the first 30 days, 1686 (54.5%) were discharged to their own home, 211 (6.8%) had no mobility at 120 days, and 141 (4.6%) experienced a postoperative complication. The median length of stay was 18 days (IQR 8-28). For mortality, c-statistics for the NHFS ranged from 0.68-0.69, similar to ASA and AMTS. For postoperative mobility, c-statistics for the NHFS ranged from 0.74-0.83, similar to AMTS (0.61-0.82) and better than the ASA grade (0.68-0.71). Length of stay was significantly correlated with each score (p<0.001 by Jonckheere-Terpstra test); NHFS and AMTS showed inverted U-shaped relationships with length of stay. For postoperative complications, c-statistics for the NHFS (0.54-0.59) were similar to the ASA grade (0.53-0.61) and AMTS (0.50-0.58). Conclusions and Implications: The NHFS performed consistently well in predicting functional outcomes, moderately in predicting mortality, but less well in predicting length of stay and complications.
There remains room for improvement by adding further predictors, such as measures of physical performance, in future analyses.
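The c-statistic used throughout the study above is the concordance probability (equal to the ROC AUC for a binary outcome). A minimal sketch by exhaustive pairwise comparison follows; the scores and outcomes in the example are illustrative, not the study's data.

```python
# Sketch of the c-statistic (concordance / ROC AUC) for a binary
# outcome, computed by exhaustive pairwise comparison.
def c_statistic(scores, outcomes):
    """Fraction of (event, non-event) pairs in which the event case
    has the higher score; tied scores count one half."""
    pos = [s for s, y in zip(scores, outcomes) if y == 1]
    neg = [s for s, y in zip(scores, outcomes) if y == 0]
    concordant = ties = 0
    for p in pos:
        for n in neg:
            if p > n:
                concordant += 1
            elif p == n:
                ties += 1
    return (concordant + 0.5 * ties) / (len(pos) * len(neg))

# A score that perfectly separates outcomes gives c = 1.0:
print(c_statistic([1, 2, 3, 4], [0, 0, 1, 1]))  # 1.0
```

A value of 0.5 indicates no discrimination; the 0.54-0.83 range reported above spans poor to good discrimination. Rank-based implementations scale better for large cohorts than this O(n²) sketch.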
ARTICLE | doi:10.20944/preprints202003.0210.v1
Subject: Medicine & Pharmacology, Other Keywords: telemedicine; Questionnaires and Surveys; validation studies; health personnel
Online: 12 March 2020 (09:58:12 CET)
Background: Telemedicine is effective, can provide efficient care at lower costs, and enjoys a high acceptance rate among users. The Technology Acceptance Model is based on the two main concepts of ease of use and perceived usefulness and comprises three dimensions: the individual context, the technological context, and the implementation or organizational context. There is no short, validated questionnaire to assess the acceptance of telemedicine services amongst healthcare professionals using a technology acceptance model. Objective: To translate and validate a telemedicine acceptance questionnaire based on the technology acceptance model. Methods: The study included the following phases: adaptation and translation of the questionnaire into Catalan, and psychometric validation, which included construct validity (exploratory factor analysis), internal consistency (Cronbach's alpha) and stability (test-retest). Factor analysis was used to describe variability amongst the observed variables. Results: After removing incomplete responses, 144 responses were considered for analysis. Internal consistency was good, with a Cronbach's alpha coefficient of 0.84 (95% CI: 0.79-0.84). The intraclass correlation coefficient was 0.93 (95% CI: 0.852-0.964). The Kaiser-Meyer-Olkin measure of sampling adequacy was adequate (KMO = 0.818) and the Bartlett test of sphericity was significant (chi-square 424.188; df = 28; P < .001), indicating that the items were appropriate for a factor analysis. Conclusions: The questionnaire validated in this study has robust statistical features that make it a good predictive model of professionals' satisfaction with telemedicine programs.
ARTICLE | doi:10.20944/preprints201912.0349.v1
Subject: Engineering, Mechanical Engineering Keywords: verification and validation; computational thermal analysis; computational physics
Online: 26 December 2019 (02:35:47 CET)
In the power plant industry, the turbine inlet temperature (TIT) plays a key role in the efficiency of the gas turbine and, therefore, the overall—in most cases combined—thermal power cycle efficiency. Gas turbine efficiency increases by increasing TIT. However, an increase of TIT would raise the turbine component temperature, which can be critical (e.g., hot gas attack). Thermal barrier coatings (TBCs)—porous media coatings—can prevent this and protect the surface of the turbine blade. The combination of TBC and film cooling produces better cooling performance than conventional cooling processes. The effective thermal conductivity of this composite is highly important in design and other thermal/structural assessments. In this article, the effective thermal conductivity of a simplified model of TBC is evaluated. This work details a numerical study on the steady-state thermal response of two-phase porous media in two dimensions using a personal finite element analysis (FEA) code. Specifically, the system response quantity (SRQ) under investigation is the dimensionless effective thermal conductivity of the domain. A thermally conductive matrix domain is modeled with a thermally conductive circular pore arranged in a uniform packing configuration. Both the pore size and the pore thermal conductivity are varied over a range of values to investigate their relative effects on the SRQ. In this investigation, an emphasis is placed on using code and solution verification techniques to evaluate the obtained results. The method of manufactured solutions (MMS) was used to perform code verification for the study, showing the FEA code to be second-order accurate. Solution verification was performed using the grid convergence index (GCI) approach with the global deviation uncertainty estimator on a series of five systematically refined meshes for each porosity and thermal conductivity model configuration.
A comparison of the SRQs across all domain configurations is made, including uncertainty derived through the GCI analysis.
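The grid convergence index used above follows Roache's standard procedure: estimate the observed order of accuracy from three systematically refined solutions, then band the fine-grid error with a safety factor. A minimal sketch; the three solution values and the refinement ratio are made up, not taken from the paper:

```python
import math

def gci_fine(f1, f2, f3, r, Fs=1.25):
    """Grid convergence index on the finest grid from three solutions
    f1 (fine), f2 (medium), f3 (coarse) at constant refinement ratio r."""
    p = math.log(abs((f3 - f2) / (f2 - f1))) / math.log(r)  # observed order
    e21 = abs((f2 - f1) / f1)                               # relative error, fine pair
    return Fs * e21 / (r**p - 1.0), p

# Hypothetical dimensionless effective conductivities on three meshes, r = 2
gci, p = gci_fine(0.9713, 0.9700, 0.9648, r=2.0)
print(f"observed order ~ {p:.2f}, GCI ~ {100 * gci:.3f}%")
```

With these invented values the observed order comes out at 2, matching the second-order accuracy the MMS verification would establish for the code.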
Subject: Medicine & Pharmacology, Other Keywords: Validation; communication; questionnaire; healthcare attention; patient satisfaction; nursing.
Online: 25 March 2019 (10:40:45 CET)
Background: Healthcare attention is sometimes considered purely technical, but communication has proven to be closely related to clinical results and patient satisfaction. Therefore, evaluation of communication in the scope of healthcare is a priority. The purpose of this study was to validate and adapt, if necessary, the Spanish version of the Communication Styles Inventory (CSI) in a sample of nursing professionals. (2) Methods: The sample was made up of 2313 nursing professionals selected at random from various medical centers in Spain, and is therefore a sample actively employed at the time the data were acquired. We started from the Communication Styles Inventory, a questionnaire for evaluating the predominance of certain individual communication behaviors on six scales (expressiveness, preciseness, verbal aggressiveness, questioningness, emotionality and impression manipulativeness). (3) Results: Confirmatory factor analysis of the proposed model showed good fit indices. The reliability of the model, shown by a Cronbach's alpha of α = .81, was adequate, and so was single-level and aggregate consistency. Finally, in the analysis of invariance by type of contract, configural, metric and scalar invariance were acceptable, but not strict invariance. (4) Conclusions: This instrument progresses in measuring non-technical attributes, such as communication styles, in nursing personnel.
ARTICLE | doi:10.20944/preprints201903.0018.v1
Subject: Physical Sciences, Optics Keywords: Fast Forward Model, Infrared, Emissivity Spectrum, Satellite, Validation
Online: 4 March 2019 (08:42:15 CET)
Timely processing of observations from hyper-spectral imagers, such as SEVIRI (Spinning Enhanced Visible and Infrared Imager), largely depends on fast radiative transfer calculations. This paper mostly concerns the development and implementation of a new forward model for SEVIRI to be applied to real time processing of infrared radiances for the physical retrieval of surface temperature and emissivity. The new radiative transfer model improves computational time by a factor of ≈ 7 compared to the previous versions and makes it possible to process SEVIRI data in near real time. The new forward model has been applied for the simultaneous retrieval of surface temperature and emissivity in three infrared channels (8.7, 10.8, 12 μm). The inverse scheme relies on a Kalman filter approach, which allows us to exploit a sequential processing of SEVIRI observations. Based on the new forward model, the paper also presents a retrieval validation performed with in situ observations acquired during a field experiment carried out in 2017 at the Gobabeb (Namib desert) validation station. Furthermore, a comparison with IASI (Infrared Atmospheric Sounder Interferometer) emissivity retrievals has been performed as well. It has been found that the retrieved emissivities are in good agreement with each other and with in situ observations, i.e., average differences are generally well below 0.01.
ARTICLE | doi:10.20944/preprints201809.0389.v1
Subject: Engineering, Mechanical Engineering Keywords: Extrapolative Predictions, Model Validation, Bayesian Inference, Structural Dynamics
Online: 19 September 2018 (16:05:50 CEST)
The creation of computer models is often driven by the need to make predictions in regions where there is no data (i.e. extrapolations). This makes validation challenging, as it is difficult to ensure that a model will be suitable when it is applied in a region where there are no observations of the system of interest. The current paper proposes a method that can reveal flaws in a model which may be difficult to identify using traditional approaches for model calibration and validation. The method specifically targets the situation where one is attempting to model a dynamical system that is believed to possess time-invariant calibration parameters. The proposed approach allows these parameters to vary with time, even though it is believed that they are time-invariant. The aim of such an analysis is to identify key discrepancies - indications that a model has inherent flaws and, as a result, should not be used to influence decisions in regions where there is no data. The proposed method is not necessarily a predictor of extrapolation performance; rather, it is a stringent test that, the authors believe, should be applied before extrapolation is attempted. The approach could therefore form a useful part of wider validation frameworks in the future.
ARTICLE | doi:10.20944/preprints201805.0435.v1
Subject: Engineering, Biomedical & Chemical Engineering Keywords: Kinect; validation; assessment; functional evaluation; shoulder; markerless system
Online: 30 May 2018 (05:59:51 CEST)
Optoelectronic devices are the gold standard for 3D evaluation in clinics, but owing to the complexity of such hardware and its limited accessibility to patients, affordable, transportable and easy-to-use systems must be developed for wide use in daily clinical practice. The Kinect™ sensor has several advantages over optoelectronic devices (price, transportability) but also some limitations: the (in)accuracy of skeleton detection and tracking, and the limited number of available points, which makes 3D evaluation impossible. To overcome these limitations, a novel method has been developed to perform 3D evaluation of the upper limbs. This system is coupled with rehabilitation exercises, allowing functional evaluation while performing physical rehabilitation. To validate this new approach, a two-step method was used. The first step was a laboratory validation in which the results obtained with the Kinect™ were compared with those obtained with an optoelectronic device; 40 healthy young adults participated in this first part. The second step was to determine the clinical relevance of such measurements. Results of the healthy subjects were compared with a group of 22 elderly adults and a group of 10 chronic stroke patients to determine whether different patterns could be observed. The new methodology and the different steps of the validation are presented in this paper.
COMMUNICATION | doi:10.20944/preprints201610.0106.v1
Subject: Chemistry, Analytical Chemistry Keywords: vanillyl butyl ether; HPLC; method validation; cosmetic product
Online: 25 October 2016 (09:43:18 CEST)
A specific HPLC method has been developed and validated for the determination of vanillyl butyl ether in cosmetic products. The extraction procedure, with a 1:1 isopropanol-water mixture, is described. The method uses an RP-C18 column with isocratic elution and a UV detector. The mobile phase consists of a mixture of acetonitrile and buffer (20 mM Na2HPO4 in water) (30:70 v/v) at a variable flow rate. The method was validated with respect to accuracy, precision (repeatability and reproducibility), specificity and linearity. The procedure described here is simple, selective and reliable for routine quality control analysis and stability tests of commercially available cosmetic products.
ARTICLE | doi:10.20944/preprints201610.0078.v1
Subject: Earth Sciences, Environmental Sciences Keywords: calibration; validation; optical; instrument; processing; imagery; spatial; operational
Online: 19 October 2016 (10:59:29 CEST)
As part of the Copernicus programme of the European Union (EU), the European Space Agency (ESA) has developed and is currently operating the Sentinel-2 mission, which is acquiring high spatial resolution optical imagery. This paper provides a description of the calibration activities and the current status of the mission product validation activities. Measured performances from the validation activities cover both Top-Of-Atmosphere (TOA) and Bottom-Of-Atmosphere (BOA) products. Results presented in this paper show the good quality of the mission products, both in terms of radiometry and geometry, and provide an overview of the next mission steps related to data quality aspects.
COMMUNICATION | doi:10.20944/preprints202212.0351.v1
Subject: Social Sciences, Other Keywords: uncertainty principle; limits of mathematics; validation models; holistic approach
Online: 20 December 2022 (03:33:52 CET)
Science evolves over a gentle arc spanning centuries, with scientists building upon and extending the hypotheses and discoveries of their forebears while nurturing their own work from ideation through crystallization to implementation. However, evidence suggests several limitations of our modern academic pursuits, including considerable inertia and epistemological biases against implementing even major advancements. For instance, the transformative uncertainty principles of quantum mechanics have yet to be satisfactorily integrated into modern analyses and publications, almost a century after Heisenberg received the Nobel prize for them. Another example is the ever-expanding reliance on mathematics to validate the hypotheses of physics and to dismiss opinions to the contrary. In addition, modern science limits itself to the post-fifteenth-century era and hastily rejects premodern achievements despite glaring examples. This reluctance and inertia to capitalize on existing knowledge is a challenge that imperils our intellectual pursuits. A salient facet of science is "the willingness to admit ignorance"; only on this foundational principle can science meaningfully evolve. It is time we took a step back to evaluate widely accepted and foundational premises of modern science and instituted structured processes to implement the treasure trove of knowledge amassed by our predecessors. This essay highlights some of the opportunities that can and should be seized by capitalizing on recent developments in computational and analytical capabilities along with artificial intelligence.
BRIEF REPORT | doi:10.20944/preprints202010.0082.v1
Subject: Engineering, Automotive Engineering Keywords: exploratory analysis; model selection; MLR; K fold cross validation
Online: 5 October 2020 (12:16:38 CEST)
In this project, we use multiple linear regression to study the impact of eight predictors (relative compactness, surface area, wall area, roof area, overall height, orientation, glazing area, glazing area distribution) on the cooling load (CL), a measure of the energy efficiency of residential buildings. We analyze and visualize the effect of each predictor on the response variable using classical statistical tools for describing linear models, so as to identify the most strongly related predictor variables. We first apply model selection by stepwise regression, compare the AIC of the candidate models, and identify the best among them. We then fit a classical linear regression; simulations on 768 diverse residential buildings show that we can predict CL with low mean absolute error. Using ANOVA we determine the variation in the residuals, and we apply a non-constant variance test to verify it. Furthermore, we check for leverage points, influence points and outliers, and compute Cook's distance for influential points. Applying a Box-Cox transformation and weights, we also use a WLS technique to fit the model for better results. Finally, we use 5-fold cross validation to verify our model.
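The 5-fold cross-validation of a multiple linear regression can be sketched with plain NumPy least squares. The data below are a synthetic stand-in (768 samples, 8 predictors, invented coefficients), not the building dataset:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the building data: 768 samples, 8 predictors
X = rng.normal(size=(768, 8))
beta_true = np.array([3.0, -2.0, 1.5, 0.5, 4.0, 0.1, 2.0, -1.0])
y = X @ beta_true + rng.normal(scale=0.5, size=768)

def kfold_mae(X, y, k=5):
    """Mean absolute error of an OLS fit under k-fold cross validation."""
    n = len(y)
    folds = np.array_split(rng.permutation(n), k)
    maes = []
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        A = np.column_stack([np.ones(len(train)), X[train]])   # add intercept
        beta, *_ = np.linalg.lstsq(A, y[train], rcond=None)
        pred = np.column_stack([np.ones(len(test)), X[test]]) @ beta
        maes.append(np.mean(np.abs(y[test] - pred)))
    return float(np.mean(maes))

print(f"5-fold CV mean absolute error ~ {kfold_mae(X, y):.3f}")
```

Because each fold is held out exactly once, the averaged MAE estimates out-of-sample error rather than in-sample fit, which is why the abstract uses it as the final model check.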
Subject: Earth Sciences, Geophysics Keywords: solar radiation; meteosat second generation; validation; land surface modelling
Online: 27 October 2019 (04:25:31 CET)
High frequency knowledge of the spatio-temporal distribution of the Downwelling Surface Shortwave Flux (DSSF) and its diffuse fraction (fd) at the surface is nowadays essential for understanding climate processes at the surface-atmosphere interface, plant photosynthesis and the carbon cycle, and for the solar energy sector. The EUMETSAT Satellite Application Facility for Land Surface Analysis has operationally delivered the MDSSFTD (Downwelling Surface Short-wave radiation Fluxes – Total and Diffuse fraction) product since 2019. The retrieval method was presented in the companion paper. This part 2 focuses on the evaluation of the MDSSFTD algorithm and presents the comparison of the corresponding outputs, i.e. the total DSSF and diffuse fraction (fd) components, against in-situ measurements acquired at four BSRN stations over a seven-month period. The validation is performed on an instantaneous basis. We show that the satellite estimates of DSSF and fd meet the target requirements defined by the user community for all-sky (clear and cloudy) conditions. For DSSF, the requirements are 20 Wm-2 for DSSF < 200 Wm-2, and 10% for DSSF >= 200 Wm-2; the MBE and rMBE relative to the ground measurements are 3.618 Wm-2 and 0.252%, respectively. For fd, the requirements are 0.1 for fd < 0.5, and 20% for fd >= 0.5; the MBE and rMBE relative to the ground measurements are -0.044 and -17.699%, respectively. The study also provides a separate analysis of the product performances for clear-sky and cloudy-sky conditions. The importance of representing the cloud-aerosol radiative coupling in the MDSSFTD method is discussed. Finally, it is concluded that the quality of the currently available Aerosol Optical Depth (AOD) forecasts is accurate enough to obtain reliable diffuse solar flux estimates; this was still a limitation a few years ago.
ARTICLE | doi:10.20944/preprints201910.0096.v1
Online: 9 October 2019 (10:16:32 CEST)
In this work a modified version of the well-known Simple Water Balance (SWB) model, comprising three parameters instead of one, was used. Although simple, the model was tested in large-scale river basins in east-central Greece, upstream of two hydrometric stations. The available historic runoff records comprised 19 hydrologic years each, on a monthly basis. Thirteen of them were used for calibrating the model, and the six subsequent years for validating it. Two different efficiency criteria were used as measures of performance of the modified model. Their values, calculated for both the calibration and validation stages, were close and relatively high. Thus, keeping in mind both the size and the complexity of the river basins studied, one can conclude that the modified model, despite its simplistic concept and lumped form, fits the historic runoff series satisfactorily.
Subject: Behavioral Sciences, Cognitive & Experimental Psychology Keywords: Validation; Questionnaire Design; Self-Perception; Diabetes Mellitus; Self Care.
Online: 25 March 2019 (10:00:07 CET)
Background: Level of perceived competence, as a basic psychological need, can trigger achievement of diabetes self-management goals. Due to the lack of a specific data collection tool to measure the level of self-competence among Persian-speaking patients with diabetes, this study was conducted for cross-cultural adaptation and psychometric assessment of the Persian version of the Perceived Competence Scale for Diabetes (PCSD-P). Methods: A standard translation/back-translation procedure was carried out to prepare a preliminary draft of the PCSD-P. Content and face validity of the early draft were checked by an expert panel of 15 scholars in the fields of health education and promotion as well as nursing education, with experience of working and research on diabetes. The final drafted questionnaire was completed by 177 randomly selected patients with type 2 diabetes. Based on the collected data, the structural validity of the contrived version was appraised using exploratory and confirmatory factor analysis (EFA, CFA). Cronbach's alpha and intraclass correlation coefficients (ICC) were used to check the scale's reliability and internal consistency. Results: The estimated measures of the Content Validity Index (CVI = 0.95) and Content Validity Ratio (CVR = 0.8) were within the recommended acceptable limits. The EFA results demonstrated a single-factor solution according to the items' loadings on the component. The model fit indices, i.e. RMSEA = 0.000, CFI = 1, TLI = 1, GFI = 0.998, NFI = 0.999, RFI = 0.995, confirmed the consistency of the hypothesized one-factor solution. Values of the internal consistency and reliability coefficients were also within the acceptable range (α = 0.892, ICC = 0.886, P = 0.001). Conclusions: The study findings revealed good internal validity and applicability of the PCSD-P for measuring the degree of self-competence among Persian-speaking type 2 diabetes patients in managing the chronic disease.
Due to the unrepresentativeness of the study sample, future cross-cultural testing of the PCSD-P on more diverse and broader Persian-speaking populations is recommended.
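The Content Validity Ratio reported above follows Lawshe's formula: for each item, CVR = (ne - N/2) / (N/2), where ne experts out of N rate the item "essential". A minimal sketch with a hypothetical 15-expert panel (the vote counts are invented):

```python
def content_validity_ratio(n_essential: int, n_experts: int) -> float:
    """Lawshe's CVR for one item: n_essential of n_experts rated it 'essential'."""
    half = n_experts / 2.0
    return (n_essential - half) / half

# Hypothetical panel of 15 experts rating four items
essential_votes = [15, 14, 13, 14]
cvrs = [content_validity_ratio(v, 15) for v in essential_votes]
print([round(c, 2) for c in cvrs])
```

CVR ranges from -1 (no one rates the item essential) through 0 (exactly half do) to +1 (everyone does); panel-size-dependent critical values then decide which items are retained.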
ARTICLE | doi:10.20944/preprints201806.0055.v1
Subject: Keywords: quality control; validation; reconstruction of missing data; temperature; precipitation
Online: 5 June 2018 (08:42:40 CEST)
This study provides a unique procedure for validating and reconstructing temperature and precipitation data. Although developed from data in Middle Italy, the validation method is intended to be universal, subject to appropriate calibration according to the climate zones analysed. This research is an attempt to create shared applicative procedures that are most of the time only theorized or included in some software without a clear definition of the methods. The purpose is to detect most types of errors according to the procedures for data validation prescribed by the World Meteorological Organization, defining practical operations for each of the five types of data controls: gross error checking, internal consistency check, tolerance test, temporal consistency, and spatial consistency. Temperature and precipitation data over the period 1931-2014 were investigated. The outcomes of this process have led to the removal of 375 records (0.02%) of temperature data from 40 weather stations and 1286 records (1.67%) of precipitation data from 118 weather stations, and 171 data points reconstructed. In conclusion, this work contributes to the development of standardized methodologies to validate climate data and provides an innovative procedure to reconstruct missing data in the absence of reliable reference time series.
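The first of the five controls listed above, gross error checking, reduces to a plausibility-range test on each record. A minimal sketch; the bounds and records below are invented, and real limits would have to be calibrated to the climate zone as the abstract notes:

```python
import math

# Hypothetical plausibility limits for one climate zone
T_MIN, T_MAX = -30.0, 50.0   # air temperature bounds, degrees C
P_MIN, P_MAX = 0.0, 500.0    # daily precipitation bounds, mm

def gross_error_check(temps, precs):
    """Return the indices of records failing the gross error check."""
    bad = []
    for i, (t, p) in enumerate(zip(temps, precs)):
        if math.isnan(t) or not (T_MIN <= t <= T_MAX):
            bad.append(i)                      # implausible or missing temperature
        elif not (P_MIN <= p <= P_MAX):
            bad.append(i)                      # implausible precipitation
    return bad

temps = [12.3, 99.9, 18.4, float("nan"), 21.0]
precs = [0.0, 3.2, -5.0, 1.1, 12.7]
print(gross_error_check(temps, precs))
```

Records flagged here would then pass to the remaining controls (internal consistency, tolerance, temporal and spatial consistency) before removal or reconstruction.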
ARTICLE | doi:10.20944/preprints201702.0026.v1
Subject: Engineering, Marine Engineering Keywords: wave energy; system identification; model validation; wave tank testing
Online: 8 February 2017 (17:00:08 CET)
Empirically based modeling is an essential aspect of design for a wave energy converter. These models are used in structural, mechanical and control design processes, as well as for performance prediction. The design of experiments and the methods used to produce models from collected data have a strong impact on the quality of the model. This study considers the system identification and model validation process based on data collected from a wave tank test of a model-scale wave energy converter. Experimental design and data processing techniques based on general system identification procedures are discussed and compared with the practices often followed for wave tank testing. The general system identification processes are shown to have a number of advantages. The experimental data are then used to produce multiple models for the dynamics of the device. These models are validated and their performance is compared against one another. While most models of wave energy converters use a formulation with wave elevation as an input, this study shows that a model using a hull pressure sensor to incorporate the wave excitation phenomenon has better accuracy.
ARTICLE | doi:10.20944/preprints202211.0460.v1
Subject: Chemistry, Analytical Chemistry Keywords: Fundamental variability; homogeneity; variance; method validation; proficiency testing; measurement uncertainty
Online: 24 November 2022 (14:33:00 CET)
The question of whether a given set of test items can be considered “identical” is often addressed in terms of the homogeneity of the test material from which said items were taken. However, for some types of matrices – in particular, matrices consisting of minute separate particles, only some of which carry the analyte under consideration – even in the case of homogeneous test material, an irreducible source of variability between test items may remain: the fundamental variability. In this paper, the concept of fundamental variability is explained, and procedures for reducing and characterizing it are described.
ARTICLE | doi:10.20944/preprints202202.0260.v1
Subject: Earth Sciences, Oceanography Keywords: sea surface salinity; sampling mismatch; sub footprint variability; uncertainty; validation
Online: 22 February 2022 (02:44:05 CET)
Validation of satellite sea surface salinity (SSS) products is typically based on comparisons with in-situ measurements at a few meters depth, which are mostly made at a single location and time. The difference in spatio-temporal resolution between the in-situ near-surface salinity and the two-dimensional satellite SSS results in a sampling mismatch uncertainty. The Climate Change Initiative (CCI) project has merged SSS from three satellite missions. Using an optimal interpolation, weekly and monthly SSS and their uncertainties are estimated at a 50 km spatial resolution over the global ocean. Over the 2016-2018 period the mean uncertainty on weekly CCI SSS is 0.13, whereas the standard deviation of weekly CCI minus in-situ Argo salinities is 0.24. Using high resolution SSS simulations, we estimate the expected uncertainty due to the CCI versus Argo sampling mismatch. Most of the largest spatial variability of the satellite minus Argo salinity is observed in regions with large mismatch. A quantitative validation is performed by considering the statistical distribution of the CCI minus Argo salinity normalized by the sampling and retrieval uncertainties. This quantity should follow a Gaussian distribution with a standard deviation of 1 if all uncertainty contributions are properly considered. We find that 1) the sampling mismatch can explain most of the observed differences between Argo and CCI data, especially for monthly products and in dynamical regions (river plumes, fronts), and 2) overall, the uncertainties are well estimated in CCI version 3, a clear improvement over CCI version 2. A few dynamical regions remain where discrepancies persist, and where the satellite SSS, their associated uncertainties and the sampling mismatch estimates should be further validated.
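The normalized-difference test described above is easy to illustrate: if the per-match uncertainty budget is right, each difference divided by its total (root-sum-square) uncertainty should have unit standard deviation. A sketch with entirely synthetic numbers, not CCI or Argo data:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic satellite-minus-in-situ salinity differences with two
# independent uncertainty sources: retrieval and sampling mismatch
n = 10_000
sigma_retrieval = rng.uniform(0.08, 0.20, n)
sigma_sampling = rng.uniform(0.05, 0.25, n)
sigma_total = np.hypot(sigma_retrieval, sigma_sampling)  # quadrature sum
diff = rng.normal(0.0, sigma_total)                      # simulated differences

z = diff / sigma_total                                   # normalized differences
print(f"std of normalized differences ~ {z.std(ddof=1):.3f}")
```

A standard deviation well above 1 would indicate underestimated uncertainties (or an unmodeled error source such as the sampling mismatch); well below 1, overestimated ones.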
ARTICLE | doi:10.20944/preprints202106.0419.v1
Subject: Social Sciences, Accounting Keywords: Aesthetic literacy, Informational literacy, Promotional literacy, Rhetorical literacy, Students, Validation.
Online: 15 June 2021 (16:13:16 CEST)
Although advertising literacy fosters critical thinking in the face of advertising, no instrument for measuring this type of literacy has so far been developed in Iran. Our review showed that although much research has been done on advertising, the lack of an appropriate tool for measuring the level of advertising literacy is clearly evident. This research therefore provides a valid tool for measuring advertising literacy from students' perspective. Drawing on the dimensions of advertising literacy proposed by Malmelin (2010) and the views of related professors, a questionnaire was developed, and confirmatory factor analysis was used to establish construct validity. The statistical population was high school students, chosen in view of the impact of advertising in this age range. Finally, a tool with four dimensions (informational literacy, aesthetic literacy, rhetorical literacy and promotional literacy) was obtained. Given its confirmation in the present study, this tool can be used to examine the status of the items, their order and their prioritization from the perspective of the mentioned population.
ARTICLE | doi:10.20944/preprints202012.0268.v1
Subject: Chemistry, Analytical Chemistry Keywords: Oenothera biennis; standardization; quercetin 3-glucuronide; ellagic acid; method validation
Online: 10 December 2020 (16:43:01 CET)
Toward the standardization of O. biennis sprout extract (OBS-E), we aimed to establish indicator compounds using a validated method. HPLC-UVD allowed simultaneous quantification of the indicator compounds quercetin 3-glucuronide and ellagic acid. The method was validated in terms of specificity, linearity, precision, accuracy, and limit of detection/limit of quantification (LOD/LOQ). High specificity and linearity were demonstrated, with correlation coefficients of 1.0000 for quercetin 3-glucuronide and 0.9998 for ellagic acid. The LOD/LOQ values were 0.486/1.472 μg/mL for quercetin 3-glucuronide and 1.003/3.039 μg/mL for ellagic acid. Intra-day and inter-day variability tests produced relative standard deviations for each compound of <2%, a generally accepted precision criterion. High recovery rates were also obtained, confirming accuracy. OBS-E samples prepared using various concentrations of ethanol were then analyzed. The 50% ethanol extract had the highest content of quercetin 3-glucuronide, whereas the 70% ethanol extract had the lowest. However, the ellagic acid content was highest in the 70% ethanol extract and lowest in the 90% ethanol extract. Thus, quercetin 3-glucuronide and ellagic acid can be used industrially as indicator compounds for O. biennis sprout products, and our validated method can be used to establish indicator compounds for other natural products.
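LOD and LOQ figures like those above are commonly derived from the calibration curve using the ICH Q2(R1) estimates LOD = 3.3σ/S and LOQ = 10σ/S, where σ is the residual standard deviation and S the slope. A minimal sketch with invented calibration data, not the OBS-E measurements:

```python
import numpy as np

# Hypothetical calibration data: concentration (ug/mL) vs. peak area
conc = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 50.0])
area = np.array([10.2, 19.8, 50.5, 99.7, 201.1, 499.0])

slope, intercept = np.polyfit(conc, area, 1)
residuals = area - (slope * conc + intercept)
sigma = residuals.std(ddof=2)            # residual std dev of the regression

lod = 3.3 * sigma / slope                # ICH Q2(R1) calibration-curve estimates
loq = 10.0 * sigma / slope
r = np.corrcoef(conc, area)[0, 1]
print(f"r = {r:.4f}, LOD = {lod:.3f} ug/mL, LOQ = {loq:.3f} ug/mL")
```

The fixed ratio between the two formulas is why LOQ is always roughly three times LOD, as in the abstract's reported value pairs.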
ARTICLE | doi:10.20944/preprints202008.0568.v1
Subject: Earth Sciences, Environmental Sciences Keywords: pedotransfer functions; inverse methods; gravity irrigation; model validation; experimental data
Online: 26 August 2020 (09:01:37 CEST)
In the present work, we evaluate the prediction capability of six pedotransfer functions (PTFs) reported in the literature for estimating saturated hydraulic conductivity (Ks). We used a database of 900 measured samples obtained from Irrigation District 023, in San Juan del Rio, Queretaro, Mexico. Additionally, six new PTFs for Ks were constructed from clay percentage, bulk density and saturation water content data. Among the evaluated models, one overestimates Ks for values above 0.5 cm h-1, three underestimate it for Ks > 1.0 cm h-1, and two show good correlation (R2 > 0.98) but require three to four parameters for the optimization. On the other hand, the models proposed in this work achieve similar correlation with fewer parameters: the fit is much better than with the existing ones, reaching R2 = 0.9822 with only one variable and R2 = 0.9901 with two.
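A one-variable PTF of the kind described can be sketched as a least-squares fit of log(Ks) against clay content. The data below are synthetic (an invented exponential decay with noise), not the Irrigation District 023 database:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in: Ks (cm/h) decreasing with clay percentage
clay = rng.uniform(5, 60, 300)                        # clay content, %
ks = 5.0 * np.exp(-0.07 * clay) * rng.lognormal(0.0, 0.1, 300)

# One-variable PTF: fit log(Ks) = a + b * clay by least squares
b, a = np.polyfit(clay, np.log(ks), 1)
pred = a + b * clay
ss_res = np.sum((np.log(ks) - pred) ** 2)
ss_tot = np.sum((np.log(ks) - np.log(ks).mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
print(f"one-variable PTF R2 = {r2:.4f}")
```

Fitting in log space keeps the model linear while respecting the multiplicative, always-positive character of hydraulic conductivity; extra predictors (bulk density, saturation water content) would enter as additional columns in the same least-squares problem.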
ARTICLE | doi:10.20944/preprints202005.0217.v1
Subject: Mathematics & Computer Science, Applied Mathematics Keywords: SEAIQR model; stability analysis; COVID-19; optimal control; model validation
Online: 12 May 2020 (13:02:12 CEST)
Background: The outbreak of Covid-19 is an ongoing global health emergency. At the end of December 2019, the first infection was reported in Wuhan; the world did not pay attention to this extremely contagious disease and was slow to react. The world is now in a vulnerable state, facing a great loss of lives as well as severe socio-economic consequences. That is why we have proposed a potential mathematical model, with data analysis, to predict and control the outcome of this pandemic. Methods: The model presents the epidemic dynamics of multiple compartments. We collected available online data for Spain. In a first step, we estimated the parameters using either data analysis or reference papers. We then performed a data fitting analysis against the outcome of our mathematical results. The behaviour of the system depends not only on the parameters but also on social consciousness. Results: Disease progression in this model is determined by the basic reproductive ratio $R_0$, together with the actual epidemic $R_0$ and the effective $R(t)$ of each day. If $R_0>1$, the number of latently infected individuals grows exponentially and the endemic solution is stable, while the infection rate decays if $R_0<1$. Optimal control theory shows that vaccination and treatment strategies are highly effective for reducing both the susceptible and infected populations and for raising the recovery rate. In Spain, after the state of alarm (quarantine) declared on 14 March 2020, reported cases increased for 13 days only; from the 14th day, daily reported cases started to decline, albeit with small fluctuations. Our proposed model estimates that the disease in Spain could be fully under control after July 2020. Conclusion: The outbreak will remain within the capacity of the health care system, the death rate will be reduced, and socio-economic stability will be ensured.
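The threshold role of $R_0$ can be illustrated with a plain SIR sketch, a deliberate simplification of the paper's multi-compartment SEAIQR model; all parameter values below are invented for illustration:

```python
import numpy as np

def simulate_sir(r0, gamma=0.1, days=160, i0=1e-4, dt=0.1):
    """Forward-Euler SIR integration with beta = r0 * gamma.
    Returns the infected fraction over time."""
    beta = r0 * gamma
    s, i, r = 1.0 - i0, i0, 0.0
    infected = [i]
    for _ in range(int(days / dt)):
        new_inf = beta * s * i * dt   # S -> I transitions this step
        new_rec = gamma * i * dt      # I -> R transitions this step
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
        infected.append(i)
    return np.array(infected)

growing = simulate_sir(r0=2.5)   # supercritical: epidemic takes off
fading = simulate_sir(r0=0.8)    # subcritical: infection decays away
print(growing.max(), fading.max())
```

With $R_0 = 2.5$ the infected fraction rises to a visible peak before herd effects pull it down; with $R_0 = 0.8$ it never exceeds its tiny initial seed, which is the dichotomy the abstract states.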
ARTICLE | doi:10.20944/preprints202003.0384.v1
Subject: Social Sciences, Organizational Economics & Management Keywords: quality; just culture; patient safety; nurses; hospital; measuring instrument validation
Online: 26 March 2020 (07:24:55 CET)
Purpose: "Just culture" is an element of safety culture and, in a broader sense, a part of quality culture. It is the subject of studies, especially in healthcare. The phenomenon is almost unknown in Polish medical facilities. For this reason, the aim of the article is to present the essence and significance of "just culture" in healthcare. A further aim is to present the results of the validation of a "just culture" assessment instrument used to recognize the "just culture" maturity level and to evaluate nurses' beliefs and behaviours in the light of "just culture" criteria. Methodology/Approach: The verified questionnaire consisted of 28 statements on which respondents expressed their opinion on a 5-point Likert scale. The questionnaire was distributed among nurses in one of the largest hospitals in Pomorskie Voivodeship, Poland. The results, based on 68 responses, were statistically processed with Statistica 13.1 software. Findings: The obtained results allowed us to confirm the reliability of the assessment tool, to recognize the level of "just culture" as wisdom (68%), and to indicate strengths and weaknesses of the observed beliefs and behaviours. On this basis, improvement actions were proposed. Originality/Value: We used an original questionnaire of our own design. This is the first study on "just culture" in healthcare in Poland. There are only a few studies devoted to patient safety culture in Poland and no research addressing the "just culture" phenomenon, either in Poland or elsewhere in Central Europe. The results allow us to recommend the assessment tool for other hospitals and should help in understanding the essence of "just culture" implementation.
ARTICLE | doi:10.20944/preprints201908.0294.v1
Subject: Physical Sciences, Mathematical Physics Keywords: time series; Colorado River; water supply; cross-validation; decadal prediction
Online: 28 August 2019 (11:32:10 CEST)
The future of the Colorado River water supply (WS) affects millions of people and the U.S. economy. A recent study suggested a cross-basin correlation between the Colorado River and its neighboring Great Salt Lake (GSL). Following that study, we tested the feasibility of using the previously developed multi-year prediction of the GSL water level to forecast the Colorado River WS. Time-series models were developed to predict changes in the WS out to 10 years, using regressive methods and the GSL water level data to depict the decadal variability of the Colorado River WS. Various time-series models suggest a decline in the 10-year-averaged WS since 2013, before it starts to increase around 2020. A comparison between this WS prediction and the WS projection published in a 2012 government report (derived from climate models) reveals a widened imbalance between supply and demand by 2020. Further research to update similar multi-year predictions of the Colorado River WS is needed; such information could aid management decision-making in the face of future water shortages.
ARTICLE | doi:10.20944/preprints201901.0241.v1
Subject: Engineering, Civil Engineering Keywords: energy piles; validation; floor slab heat loss; energy; computer simulations
Online: 23 January 2019 (14:04:51 CET)
As the energy efficiency demands for future buildings become increasingly stringent, preliminary assessments of energy consumption are mandatory. These are possible only through numerical simulations, whose reliability crucially depends on boundary conditions. We therefore investigate their role in numerical estimates of geothermal energy usage, performing annual simulations of transient heat transfer for a building employing a geothermal heat pump plant and energy piles. Starting from actual measurements, we solve the heat equations in 2D and 3D using COMSOL Multiphysics and IDA-ICE, and find a negligible impact of the multiregional ground surface boundary conditions. Moreover, we verify that the thermal mass of the soil medium induces a small vertical temperature gradient on the pile surfaces. We also find a roughly constant temperature on each horizontal cross-section, with nearly identical values whether the average temperature is integrated over the full plane or evaluated at a single point. Calculating the yearly heating need for an entire building, we then show that the chosen upper boundary condition affects the energy balance dramatically: directly using the pipes' outlet temperature induces a 54% overestimation of the heat flux, while using the exact ground surface temperature above the piles reduces the error to 0.03%.
ARTICLE | doi:10.20944/preprints201901.0071.v1
Subject: Behavioral Sciences, Other Keywords: Facebook; Facebook intrusion; couple relationships; conflicts; jealousy; psychometric properties; validation
Online: 8 January 2019 (15:15:56 CET)
The present study evaluates the psychometric properties of the Conflicts in Romantic Relationships Over Facebook Use Scale with a sample of Puerto Rican adults. A total of 577 Puerto Ricans participated in this confirmatory and psychometric study. The results confirmed that the scale has a multidimensional structure with three dimensions: Partner Facebook intrusion, Conflict over Facebook use, and Jealousy over Facebook use. A total of 18 items met the discrimination criteria and presented appropriate factor loadings (6 items per dimension). The Cronbach's alpha indices of the dimensions ranged between .87 and .95, and the omega coefficients ranged between .88 and .95. In summary, the instrument has the psychometric properties required both for further validation studies and for implementation in various theoretical and applied settings.
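The reliability indices quoted above follow the standard Cronbach's alpha formula, α = k/(k−1) · (1 − Σσ²ᵢ/σ²ₜ), over a respondents-by-items score matrix. A minimal sketch of the computation, using invented 5-point Likert responses rather than the study's data:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the total scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical 5-point Likert responses for one 6-item dimension:
# a latent trait plus item-level noise, rounded and clipped to 1..5
rng = np.random.default_rng(0)
latent = rng.normal(3, 1, size=(100, 1))
scores = np.clip(np.rint(latent + rng.normal(0, 0.5, size=(100, 6))), 1, 5)
print(f"alpha = {cronbach_alpha(scores):.2f}")
```

Items driven by a common latent trait yield a high alpha, as in the .87–.95 range reported; fully redundant items give exactly 1.0.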
ARTICLE | doi:10.20944/preprints201808.0325.v1
Subject: Engineering, Energy & Fuel Technology Keywords: modelling; lead-acid battery; parameter identification; genetic algorithms; experimental validation
Online: 18 August 2018 (06:14:37 CEST)
Accurate and efficient battery modeling is essential to maximize the performance of isolated energy systems and to extend battery lifetime. This paper proposes a battery model that represents the charging and discharging process of a lead-acid battery bank. The model is validated against real measurements taken from a battery bank installed in a research center located at "El Chocó", Colombia. To fit the model, three optimization algorithms (Particle Swarm Optimization, Cuckoo Search, and Particle Swarm Optimization + Perturbation) are implemented and compared, the last of which is a new proposal. This research shows that the model with the proposed algorithm is able to estimate and track real battery characteristics such as state of charge (SOC) and charging/discharging voltage. The comparison between simulations and real measurements shows that the model is able to absorb reading problems, signal delays, and scaling errors. The approach we present can be applied to other types of batteries, especially those used in stand-alone systems.
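Parameter identification of the kind described above can be illustrated with a bare-bones Particle Swarm Optimization fitting two parameters of a toy discharge-voltage curve. Both the model form V(soc) = E0 − K/soc and all parameter values are invented for illustration; they are not the paper's lead-acid model or its tuned PSO settings.

```python
import numpy as np

# Synthetic "measurements" from a toy discharge model V(soc) = E0 - K/soc
rng = np.random.default_rng(0)
soc = np.linspace(0.2, 1.0, 40)
true = np.array([12.6, 0.15])                      # hypothetical E0 (V), K
v_meas = true[0] - true[1] / soc + rng.normal(0, 0.01, soc.size)

def cost(p):
    """Mean squared error between model and measured voltage."""
    return np.mean((p[0] - p[1] / soc - v_meas) ** 2)

# Plain PSO with standard inertia/acceleration coefficients
n, iters = 30, 200
pos = rng.uniform([10, 0], [14, 1], size=(n, 2))   # particle positions (E0, K)
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_cost = np.array([cost(p) for p in pos])
gbest = pbest[pbest_cost.argmin()].copy()

for _ in range(iters):
    r1, r2 = rng.random((2, n, 1))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos += vel
    c = np.array([cost(p) for p in pos])
    improved = c < pbest_cost
    pbest[improved], pbest_cost[improved] = pos[improved], c[improved]
    gbest = pbest[pbest_cost.argmin()].copy()

print(f"fitted E0 = {gbest[0]:.2f} V, K = {gbest[1]:.3f}")
```

The paper's PSO+Perturbation variant would additionally perturb particles to escape stagnation; the sketch above is only the baseline swarm.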
ARTICLE | doi:10.20944/preprints201808.0243.v1
Subject: Social Sciences, Sociology Keywords: alcoholism; health professionals’ attitudes; social perception; Seaman-Mannello scale; validation
Online: 14 August 2018 (05:42:32 CEST)
Objective: The goal of this study was to analyse the attitudes and perceptions of emergency and mental health nurses towards alcoholics and other drug-dependent patients, through validation of the SM-GIBED scale in specialised care in Spain. Design and Setting: This cross-sectional study used the Spanish hospital version of the Seaman-Mannello scale, denominated the SM-GIBED scale. Participants: 170 emergency and mental health nurses from five Spanish hospitals. Intervention: A self-administered questionnaire to analyse perceptions of, and attitudes towards, drug-addicted and alcoholic patients. Primary and Secondary Outcome Measures: A descriptive and inferential analysis of the study variables was carried out, together with a psychometric analysis to validate the scale. Results: A total of 170 questionnaires were collected from 257 healthcare workers. Overall, 99.1% of the participants had had contact with drug-dependent patients during their professional experience, and nearly 75% had difficulties in treating them. The psychometric analysis of the SM-GIBED scale in the Spanish context yielded KMO = 0.655 and a significant Bartlett's test (p < 0.001). A Cronbach's alpha of 0.738 was obtained from the reliability analysis, and no individual SM-GIBED question had an alpha lower than 0.71. In conclusion, positive aspects include an ingratiating attitude and subject-to-subject communication when nurses self-define as empathic and non-paternalistic. Among the negative aspects, there is a lack of communication skills and assertiveness with these patients, highlighting a certain degree of resignation and dissatisfaction when working with drug addicts.
ARTICLE | doi:10.20944/preprints202301.0212.v1
Subject: Medicine & Pharmacology, Nursing & Health Studies Keywords: aphasia; surveys and questionnaires; standardised nursing terminology; nursing assessment; validation studies
Online: 12 January 2023 (06:34:25 CET)
(1) Background: The CEECCA questionnaire assesses the ability to communicate among individuals with aphasia. It was designed using the NANDA-I and NOC standardised nursing languages (SNLs), reaching high content validity and representativeness index values. The questionnaire was pilot-tested, demonstrating its feasibility for use by nurses in any healthcare setting. This study aims to identify the psychometric properties of the instrument. (2) Methods: 47 individuals with aphasia were recruited from primary and specialist care facilities. The instrument was tested for construct and criterion validity, reliability, internal consistency, and responsiveness. The NANDA-I and NOC SNLs and the Boston test were used for criterion validity testing. (3) Results: Five language dimensions explain 78.6% of the total variance. Convergent criterion validity tests showed concordances of up to 94% (Cohen's κ: 0.9; p<0.001) using the Boston test, up to 81% using the defining characteristics (DCs) of NANDA-I diagnoses (Cohen's κ: 0.6; p<0.001), and up to 96% (Cohen's κ: 0.9; p<0.001) using NOC indicators. Internal consistency (Cronbach's alpha) was 0.98. Reliability tests revealed test-retest concordances of 76%-100% (p<0.001). (4) Conclusions: The CEECCA is an easy-to-use, valid, reliable instrument to assess the ability to communicate among individuals with aphasia.
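The abstract pairs raw percentage agreement with Cohen's κ, which corrects agreement for chance. A small sketch of the computation, using invented binary ratings rather than the study's data:

```python
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa for two equal-length sequences of categorical ratings."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    labels = set(rater_a) | set(rater_b)
    # Chance agreement from the raters' marginal label frequencies
    expected = sum(freq_a[label] * freq_b[label] for label in labels) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical binary classifications by two instruments
a = [1, 1, 1, 0, 1, 0, 1, 1, 0, 0]
b = [1, 1, 1, 0, 1, 1, 1, 1, 0, 0]
agreement = sum(x == y for x, y in zip(a, b)) / len(a)
print(f"agreement = {agreement:.0%}, kappa = {cohen_kappa(a, b):.2f}")
```

Note how 90% raw agreement here deflates to a lower κ once chance agreement is removed, which is why the abstract reports both figures.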
ARTICLE | doi:10.20944/preprints202212.0316.v1
Subject: Earth Sciences, Oceanography Keywords: sea surface salinity; ocean reanalysis; moored buoy; in situ measurements; validation
Online: 19 December 2022 (03:45:59 CET)
Sea surface salinity (SSS) is one of the Essential Climate Variables (ECVs) defined by the Global Climate Observing System (GCOS). Acquiring high-quality SSS datasets with high spatial-temporal resolution is crucial for research on the hydrological cycle and Earth's climate. This study assessed the quality of SSS data provided by five high-resolution ocean reanalysis products: the Hybrid Coordinate Ocean Model (HYCOM) 1/12° global reanalysis, the Copernicus Global 1/12° Oceanic and Sea Ice GLORYS12 reanalysis, the Simple Ocean Data Assimilation (SODA) reanalysis, the ECMWF Oceanic Reanalysis System 5 (ORAS5) product, and the Estimating the Circulation and Climate of the Ocean Phase II (ECCO2) reanalysis. A regional comparison in the Mediterranean Sea shows that the reanalyses largely depict the accurate spatial SSS structure away from river mouths and coastal areas but slightly underestimate the mean SSS values. Better SSS reanalysis performance is found in the Levantine Sea, while larger SSS uncertainties are found in the Adriatic Sea and the Aegean Sea. The global comparison with CMEMS level-4 (L4) SSS shows generally consistent large-scale structures. The mean ΔSSS between monthly gridded reanalysis data and in situ analyzed data is -0.1 PSU in the open seas between 40°S and 40°N, with the mean Root Mean Square Deviation (RMSD) generally smaller than 0.3 PSU and the majority of correlation coefficients higher than 0.5. Comparison with collocated buoy salinity shows that the reanalysis products capture the SSS variations well at the locations of the tropical moored buoy arrays at the weekly scale. Among the products, HYCOM reanalysis SSS has the highest data quality in marginal seas, while GLORYS12 performs best in the global ocean, especially in tropical regions. By contrast, ECCO2 shows the overall worst performance in reproducing SSS states and variations, with the largest discrepancies relative to CMEMS L4 SSS.
ARTICLE | doi:10.20944/preprints202012.0080.v1
Subject: Medicine & Pharmacology, Allergology Keywords: multivariate linear method; validation; diagnosis; discriminative; signatures of disease; schizophrenia; depression
Online: 3 December 2020 (10:38:31 CET)
To address this problem, our group designed a novel machine learning technique, the multivariate linear method (MLM), which can capture convergent data from voxel-based morphometry, functional resting-state and task-related neuroimaging, and the relevant clinical measures. In this paper we report results from convergent cross-validation of biological signatures of disease in a sample of patients with schizophrenia as compared to depression. Our model provides evidence that combining neuroimaging and clinical data in an MLM analysis can inform the differential diagnosis, in terms of incremental validity, reaching 90% prediction accuracy.
ARTICLE | doi:10.20944/preprints202008.0713.v1
Subject: Chemistry, Food Chemistry Keywords: NMR; alcoholic beverages; ethanol; methanol; acetaldehyde; screening; validation; food control; PULCON
Online: 31 August 2020 (06:21:35 CEST)
Due to legal regulations, the rise of globalised (online) commerce and the need for public health protection, the analysis of spirits (alcoholic beverages > 15 % vol) is a task with growing importance for governmental and commercial laboratories. In this article a newly developed method using nuclear magnetic resonance (NMR) spectroscopy for the simultaneous determination of 15 substances relevant for the quality and authenticity assessment of spirits is described. The new method starts with a simple and rapid sample preparation and does not need an internal standard. For each sample a group of 1H-NMR spectra is recorded, among them a 2D spectrum for analyte identification and 1D spectra with suppression of solvent signals for quantification. Using the Pulse Length Based Concentration Determination (PULCON) method, concentrations are calculated from curve fits of the characteristic signals for each analyte. The optimisation of the spectra, their evaluation and the transfer of the results are done fully automatically. Glucose, fructose, sucrose, acetic acid, citric acid, formic acid, ethyl acetate, ethyl lactate, acetaldehyde, ethanol, methanol, n-propanol, isobutanol, isopentanol, 2-phenylethanol and 5-(hydroxymethyl)furfural (HMF) can be quantified with an overall accuracy better than 8 %. This new NMR-based targeted quantification method enables the simultaneous and efficient quantification of relevant spirits ingredients in their typical concentration ranges in one process with good accuracy. It has proven to be a reliable method for all kinds of spirits in routine food control.
ARTICLE | doi:10.20944/preprints202008.0376.v1
Subject: Behavioral Sciences, General Psychology Keywords: organizational skills; test development; general population; goal-directed behaviors; psychometric validation
Online: 18 August 2020 (05:12:27 CEST)
Organizational skills are a set of cognitive abilities responsible for goal-directed behaviors. While they are moderately studied in clinical settings, the assessment of organizational skills in the general population remains under-studied. This paper presents the new Durand Organizational Skills Questionnaire (DOSQ), which was developed to examine the factors associated with organizational abilities in the general population. Exploratory factor analysis, validated by a confirmatory factor analysis, suggests eight factors: Work Organization, Communication Clarity, Punctuality, Goal-Oriented Behavior, Assiduity, Workspace Organization, Strategies, and Attentiveness. Three studies using samples from the general population provided evidence for the reliability and validity of the DOSQ’s scores. Overall, the results suggest that the DOSQ offers a valid approach to measuring organizational skills in the general population.
ARTICLE | doi:10.20944/preprints202006.0351.v1
Subject: Keywords: Brain Tumor; Machine Learning; Ensemble techniques; AdaBoost; Cross-Validation; Stratified technique
Online: 29 June 2020 (07:27:38 CEST)
Brain tumors are among the most severe diseases, and their occurrence threatens human life. Detecting a brain tumor early can protect patients from avoidable harm: well-timed, swift detection and a prompt treatment strategy can lead to improved quality of life for these patients. This paper applies machine-learning-based ensemble approaches to recognising patients with brain tumors. An ensemble AdaBoost classifier combined with 10-fold stratified cross-validation, assembled in a single platform, is proposed for the prediction of brain tumors. This prediction is compared against three baseline classifiers: Gradient Boosting, Random Forest, and Extra Trees. Experimental results indicate the superiority of this model, with an accuracy of 98.97%, an F1-score of 0.99, a kappa statistic of 0.95, and an MSE of 0.0103.
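The evaluation protocol described above, AdaBoost scored under 10-fold stratified cross-validation, can be sketched with scikit-learn. The synthetic dataset and the hyperparameters below are placeholders, not the paper's brain-tumor data or tuned settings.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import StratifiedKFold, cross_val_score

# Synthetic, class-imbalanced stand-in for the (unavailable) tumor dataset
X, y = make_classification(n_samples=500, n_features=20, weights=[0.7, 0.3],
                           random_state=42)

# AdaBoost evaluated with 10-fold stratified CV: each fold preserves the
# class proportions of the full dataset, as in the paper's protocol.
clf = AdaBoostClassifier(n_estimators=100, random_state=42)
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=42)
scores = cross_val_score(clf, X, y, cv=cv, scoring="accuracy")
print(f"accuracy = {scores.mean():.3f} +/- {scores.std():.3f}")
```

Stratification matters precisely when classes are imbalanced, as medical datasets usually are; plain k-fold splits can otherwise produce folds with too few positive cases.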
ARTICLE | doi:10.20944/preprints202006.0210.v1
Subject: Mathematics & Computer Science, Applied Mathematics Keywords: SIHR model; COVID-19; Basic reproduction number; stability analysis; model validation
Online: 17 June 2020 (08:16:59 CEST)
The COVID-19 epidemic is an emerging infectious disease of the viral zoonosis type caused by the coronavirus strain SARS-CoV-2; it is classified as a human-to-human communicable disease and is currently a worldwide pandemic. In this paper, we propose a conceptual mathematical model of the epidemic dynamics with four compartments. We collected data from the Djibouti health ministry. We establish positivity and boundedness of solutions and derive the basic reproduction number. We then study local and global stability and carry out a bifurcation analysis of the equilibria to examine their epidemiological relevance. Finally, to validate the model, estimate the important model parameters, and make predictions about the disease, we fit the model to the real cases of Djibouti from 23 March to 10 June 2020.
ARTICLE | doi:10.20944/preprints202001.0385.v1
Subject: Earth Sciences, Geoinformatics Keywords: wildfires; susceptibility mapping; machine learning; random forest; model validation; Liguria region
Online: 31 January 2020 (11:40:30 CET)
Wildfire susceptibility maps display the probability of wildfire occurrence, ranked from low to high, under a given environmental context. Current studies in this field often rely on expert knowledge, with or without statistical models to assess the cause-effect correlation. Machine learning (ML) algorithms can perform very well and be more generalizable thanks to their capability of learning from, and making predictions on, data. Italy is highly affected by wildfires due to the high heterogeneity of its territory and to predisposing meteorological conditions. The main objective of the present study is to elaborate a wildfire susceptibility map for the Liguria region (Italy) by applying Random Forest, an ensemble ML algorithm based on decision trees. Susceptibility was assessed by evaluating the probability of an area burning in the future, considering where wildfires occurred in the past and which geo-environmental factors favor their spread. Different models were compared, with and without the neighboring vegetation, and with an increasing number of folds for the spatial cross-validation. Susceptibility maps for the two fire seasons were finally elaborated and validated, and the results are critically discussed, highlighting the capacity of the proposed approach to capture the efficiency of firefighting activities.
ARTICLE | doi:10.20944/preprints201910.0039.v1
Subject: Earth Sciences, Environmental Sciences Keywords: tree species; forest; biodiversity; time series; spatial autocorrelation; cross-validation; accuracy
Online: 3 October 2019 (13:56:18 CEST)
Mapping forest composition using multiseasonal optical time series is still challenging. Highly contrasting results are reported from one study to another, suggesting that the drivers of classification errors remain under-explored. We evaluated the performance of single-year Formosat-2 time series in discriminating tree species in temperate forests in France and investigated how predictions vary statistically and spatially across multiple years. Our objective was to better estimate the impact of spatial autocorrelation in the validation data on accuracy measurements and to understand which drivers in the time series are responsible for classification errors. The experiments were based on ten Formosat-2 image time series acquired irregularly over the seasonal vegetation cycle from 2006 to 2014. Because of extensive cloud cover in 2006, an alternative 2006 time series using only cloud-free images was added. Thirteen tree species were classified in each single-year dataset using the SVM algorithm. Performance was assessed using a spatial leave-one-out cross-validation (SLOO-CV) strategy, thereby guaranteeing full independence of the validation samples, and compared with standard non-spatial leave-one-out cross-validation (LOO-CV). The results show relatively similar statistical performance from one year to the next despite the differences between the annual time series. Good agreement between years was observed in monospecific tree plantations of broadleaf species, versus high disparity in other forests composed of different species. A strong positive bias in the accuracy assessment (up to 0.4 in overall accuracy) was also found when spatial dependence in the validation data was not removed. Using the SLOO-CV approach, the average OA values per year ranged from 0.48 for 2006 to 0.60 for 2013, which satisfactorily represents the spatial instability of species prediction between years.
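The optimistic bias the abstract describes comes from validation samples lying spatially close to training samples. SLOO-CV removes it by excluding all training samples within a buffer radius of each held-out sample. A toy sketch with a 1-nearest-neighbour classifier; the coordinates, features, and buffer radius are all invented for illustration.

```python
import numpy as np

def sloo_cv_accuracy(coords, X, y, radius):
    """Spatial leave-one-out CV: for each held-out sample, drop all training
    samples within `radius` of it, then classify by 1-nearest-neighbour in
    feature space. Returns overall accuracy. radius=0 reduces to plain LOO-CV.
    """
    correct = 0
    for i in range(len(y)):
        dists = np.linalg.norm(coords - coords[i], axis=1)
        train = dists > radius          # spatial buffer around the test point
        train[i] = False                # never train on the test sample itself
        feat_d = np.linalg.norm(X[train] - X[i], axis=1)
        pred = y[train][np.argmin(feat_d)]
        correct += (pred == y[i])
    return correct / len(y)

rng = np.random.default_rng(1)
coords = rng.uniform(0, 100, size=(200, 2))          # hypothetical plot locations
y = (coords[:, 0] > 50).astype(int)                  # spatially clustered classes
X = y[:, None] + rng.normal(0, 0.8, size=(200, 3))   # noisy spectral-like features

for r in (0.0, 20.0):
    print(f"buffer {r:>4.0f}: accuracy = {sloo_cv_accuracy(coords, X, y, r):.2f}")
```

Because the classes are spatially clustered, the zero-buffer score typically benefits from spatially adjacent (and hence correlated) training samples; enlarging the buffer exposes that inflation, mirroring the up-to-0.4 OA bias reported above.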
ARTICLE | doi:10.20944/preprints201808.0363.v1
Subject: Physical Sciences, Applied Physics Keywords: quality control; BSRN; solar radiation; satellite-retrieved irradiance; ground stations; validation
Online: 21 August 2018 (04:23:52 CEST)
Quality control (QC) can be a lengthy and tedious process; as a result, most users take data from meteorological services without performing quality checks. The South African Weather Service (SAWS) re-established the national solar radiometric network, comprising 13 new stations within the six climatic zones of the country. This study reports on the performance of the Baseline Surface Radiation Network (BSRN) QC procedures applied to solar radiation data within the SAWS radiometric network. The overall percentage performance of the SAWS solar radiation network based on the BSRN QC methodology is 97.79%, 93.64%, 91.6% and 92.23% for Long-Wave Downward irradiance (LWD), Global Horizontal Irradiance (GHI), Diffuse Horizontal Irradiance (DHI) and Direct Normal Irradiance (DNI), respectively, with operational problems largely dominating the percentage of bad data. The overall average performance of the Surface Solar Radiation Dataset – Heliosat (SARAH) records for GHI estimation across all stations showed a Mean Bias Deviation (MBD) of -8.28 W m-2, a Mean Absolute Deviation (MAD) of 9.06 W m-2 and a Root Mean Square Deviation (RMSD) of 11.02 W m-2. The correlation (quantified by R2) between the ground-based and SARAH-derived GHI time series was ~0.98. The established network has the potential to provide high-quality one-minute solar radiation datasets (GHI, DHI, DNI and LWD) and auxiliary hourly meteorological parameters vital for scientific and practical applications in renewable energy technologies in South Africa.
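The three validation statistics quoted above are simple functions of the paired differences between the satellite estimate and the ground reference. A short sketch with invented GHI samples (the numbers below are not the SAWS/SARAH data):

```python
import numpy as np

def validation_stats(estimate, reference):
    """Mean Bias Deviation, Mean Absolute Deviation and Root Mean Square
    Deviation between an estimated series and a reference series."""
    d = np.asarray(estimate, dtype=float) - np.asarray(reference, dtype=float)
    mbd = d.mean()                    # signed average error (bias)
    mad = np.abs(d).mean()            # average error magnitude
    rmsd = np.sqrt((d ** 2).mean())   # penalises large errors more heavily
    return mbd, mad, rmsd

# Invented GHI samples (W m-2): ground-based reference vs satellite retrieval
ground = np.array([820.0, 640.0, 450.0, 910.0, 300.0])
sarah = np.array([812.0, 633.0, 455.0, 897.0, 295.0])

mbd, mad, rmsd = validation_stats(sarah, ground)
print(f"MBD = {mbd:+.2f} W m-2, MAD = {mad:.2f} W m-2, RMSD = {rmsd:.2f} W m-2")
```

A negative MBD, as in the study's -8.28 W m-2, indicates the satellite product systematically underestimates the ground measurements; RMSD ≥ MAD always holds, with the gap growing as errors become more variable.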
ARTICLE | doi:10.20944/preprints201805.0150.v1
Subject: Earth Sciences, Geoinformatics Keywords: Quantitative Precipitation Estimates; Validation; PERSIANN-CCS; meteorological radar; Satellite Rainfall Estimates
Online: 9 May 2018 (15:37:29 CEST)
QPEs (Quantitative Precipitation Estimates) obtained from remote sensing or ground-based radars could complement, or even be an alternative to, rain gauge readings. However, before they can be used in operational applications, a validation process has to be carried out, usually by comparing their estimates with those of a rain gauge network. In this paper, the accuracy of two QPEs is evaluated for three extreme precipitation events of the last decade in the southeast of the Iberian Peninsula. The first QPE is PERSIANN-CCS, a satellite-based product; the second is a C-band meteorological radar with Doppler capabilities. Pixel-to-point comparisons are made between the values offered by the QPEs and those obtained by two networks of rain gauges. The results indicate that both QPEs were well below the rain gauge values, especially in extreme rainfall time slots. There seems to be a weak linear association between the magnitude of the discrepancies and the precipitation value of the QPEs. Radar does not appear to be more accurate than PERSIANN-CCS, despite its finer spatial resolution and its commonly higher effectiveness. The main conclusion is that neither PERSIANN-CCS nor radar, without empirical calibration, is an acceptable QPE for the real-time monitoring of meteorological extremes in the southeast of the Iberian Peninsula.
ARTICLE | doi:10.20944/preprints202208.0504.v1
Subject: Medicine & Pharmacology, Pharmacology & Toxicology Keywords: Amphetamine-related drugs; Forensic Toxicology; blood; UPLC-qTOF-MS; MMSPE; Validation; SWGTOX
Online: 30 August 2022 (04:09:57 CEST)
Abuse of amphetamine-related drugs (ARDs) causes traffic accidents, violence, and overdoses. In forensic toxicology, analysis of ARDs in biological samples can help identify those driving or performing other tasks under the influence of drugs, clarify causes of death, and identify recent drug users. In this study, we validated a pseudo-isocratic UPLC-qTOF-MS method, following mixed-mode cation-exchange solid-phase extraction (MMSPE), for the analysis of ARDs in blood. The procedure requires 250 μL of blood to achieve a limit of quantification (LOQ) and limit of detection (LOD) of 20 ng/mL for all analytes. In aged animal blood samples, extraction recoveries of 63-90% and matrix effects of 9-21% were observed. Precision and accuracy for all analytes were within 20% and 89–118%, respectively. The method was developed and validated in accordance with the Scientific Working Group for Forensic Toxicology (SWGTOX) standard and has acceptable accuracy and precision for use in doping control and forensic toxicology.
ARTICLE | doi:10.20944/preprints202208.0179.v1
Subject: Life Sciences, Other Keywords: In-house validation study; reproducibility precision; measurement uncertainty; prediction interval; uncertainty interval
Online: 9 August 2022 (10:56:40 CEST)
Measurement uncertainty is typically expressed in terms of a symmetric interval $y \pm U$, where $y$ denotes the measurement result and $U$ the expanded uncertainty. However, in the case of heteroscedasticity, symmetric uncertainty intervals can be misleading. In this paper, a different approach to the calculation of uncertainty intervals is introduced. This approach is applicable when a validation study has been conducted with samples of known concentrations. It is shown how, under certain circumstances, asymmetric uncertainty intervals arise quite naturally and lead to more reliable uncertainty intervals.
ARTICLE | doi:10.20944/preprints202208.0059.v1
Subject: Biology, Other Keywords: Staphylococcus epidermidis; metabolic network validation; minimal cut sets; knock-outs; systems biology
Online: 2 August 2022 (09:33:09 CEST)
Increasingly, systems biology is gaining relevance in basic and applied research. The combination of computational biology with wet-lab work produces a synergy that results in an exponential increase in knowledge of biological systems. Studying microorganisms such as Staphylococcus epidermidis RP62A enables researchers to better understand its metabolic network, which allows the design of effective strategies to treat infections caused by this species and others. S. epidermidis is the second most common cause of infection in patients with joint implants, so controlling its proliferation is vital for public health. There are different approaches to the analysis of metabolic networks; Flux Balance Analysis (FBA) is one of the most widespread. It allows the study of large metabolic networks and their structural properties, the optimization of metabolic flux, and the search for intervention strategies to modify the state of the network. This work presents the validation of the Staphylococcus epidermidis RP62A metabolic network model elaborated by Díaz-Calvo et al. We then identify and classify the essential reactions of the network and, finally, introduce some proposals to intervene in the network and design knock-outs.
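FBA reduces to a linear program: maximise an objective flux (typically biomass) subject to steady-state mass balance S·v = 0 and per-reaction flux bounds. The three-reaction toy network below is invented to illustrate the principle, including a minimal "cut set" knock-out; it is not the S. epidermidis RP62A model.

```python
import numpy as np
from scipy.optimize import linprog

# Toy network: A_ext -> A (uptake, v1), A -> B (v2), B -> biomass (v3).
# Internal metabolites A and B must be at steady state: S @ v = 0.
S = np.array([
    [1, -1,  0],   # metabolite A: produced by v1, consumed by v2
    [0,  1, -1],   # metabolite B: produced by v2, consumed by v3
])
bounds = [(0, 10), (0, None), (0, None)]   # uptake capped at 10 units

# linprog minimises, so negate the biomass flux v3 to maximise it
res = linprog(c=[0, 0, -1], A_eq=S, b_eq=np.zeros(2), bounds=bounds,
              method="highs")
fluxes = res.x
print("optimal fluxes:", np.round(fluxes, 3))   # expect [10, 10, 10]

# Knocking out v2 (a cut set of size one) forces the biomass flux to zero
ko_bounds = [(0, 10), (0, 0), (0, None)]
ko = linprog(c=[0, 0, -1], A_eq=S, b_eq=np.zeros(2), bounds=ko_bounds,
             method="highs")
print("biomass after knock-out:", round(ko.x[2], 3))
```

Minimal cut set analysis, as used in the paper, generalises the knock-out step: it searches for the smallest sets of reactions whose removal makes the target flux infeasible.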
ARTICLE | doi:10.20944/preprints202202.0164.v1
Subject: Life Sciences, Genetics Keywords: rare variants; genome-wide association study; validation test; SNP chip; genomic selection
Online: 11 February 2022 (15:59:26 CET)
The experiments described in this research article were designed to test the effect of including rare variants in genomic prediction in dairy cattle. Common polymorphisms explain only a small proportion of the underlying genetic variation of complex phenotypes, while variants representing functional mutations with large effects on complex phenotypes are expected to be rare due to natural (in humans) or artificial (in livestock) selection pressure. It is therefore important to check whether the use of rare variants could increase the accuracy of ranking animals, by providing a tool for more precise differentiation among bulls of high additive genetic merit. The goal of our study was to verify whether including rare variants in a genomic selection model allows for a more accurate description of the additive genetic background of traits under selection in dairy cattle. We used a linear mixed model to compare SNP estimates for Holstein-Friesian cattle between two data sets: a set containing only single nucleotide polymorphisms with minor allele frequency ≥ 0.01, which is routinely used in the Polish genomic evaluation system (46,216 SNPs), and a set containing SNPs selected based only on the call rate (54,378 SNPs). Based on the SNP estimates we also calculated direct genomic values (DGV) and genomic estimated breeding values (GEBV) and compared them between the data sets. All analyses covered production, fertility, conformation, and udder health traits. We also assessed the time required for the two most computationally demanding components of genomic selection, preparing genotype data and estimating SNP effects, for the two data sets. The results indicated that including rare variants changed the individual ranking of the top 100 male and female candidates but had no effect on the quality of EBV prediction as expressed by the Interbull validation test.
ARTICLE | doi:10.20944/preprints202105.0333.v1
Subject: Life Sciences, Biochemistry Keywords: genome editing; CRISPR; protoplast; targeted mutagenesis; TREX2; construct validation; transient expression
Online: 14 May 2021 (13:44:26 CEST)
Cas endonuclease-mediated genome editing provides a long-awaited molecular biological approach to the modification of predefined genomic target sequences in living organisms. Although cas9/guide (g)RNA constructs are straightforward to assemble and can be customized to target virtually any site in the plant genome, implementing this technology can be cumbersome, especially in species like Triticale that are difficult to transform, for which only limited genome information is available, and/or which carry comparatively large genomes. To cope with these challenges, we have pre-validated cas9/gRNA constructs (1) by frameshift restitution of a reporter gene co-introduced by ballistic DNA transfer into barley epidermis cells, and (2) via transfection in Triticale protoplasts followed by either a T7E1-based cleavage assay or deep sequencing of target-specific PCR amplicons. For exemplification, we addressed the Triticale ABA 8’-hydroxylase 1 gene, one of the putative determinants of pre-harvest sprouting of grains. We further show that indel induction frequency in Triticale can be increased by TREX2 nuclease activity, which holds true for both well- and poorly performing gRNAs. The presented results constitute a sound basis for the targeted induction of heritable modifications in Triticale genes.
ARTICLE | doi:10.20944/preprints202006.0333.v1
Subject: Keywords: Lung Cancer Prediction; Neural Network; Cross-validation; Gradient Boosting Classifier; Automated tool
Online: 28 June 2020 (09:56:30 CEST)
Lung cancer, also known as lung carcinoma, is a malignant tumor leading to uncontrolled cell growth in lung tissue. It is one of the most prominent causes of death worldwide. Early detection of this disease can help medical care units and physicians provide countermeasures to patients. The objective of this paper is to develop an automated tool that takes influential causes of lung cancer as input and detects patients with a higher probability of being affected by the disease. A neural network classifier accompanied by a cross-validation technique is proposed as the predictive tool. The proposed method is then compared with a baseline Gradient Boosting classifier to assess its prediction performance.
ARTICLE | doi:10.20944/preprints202002.0178.v1
Subject: Medicine & Pharmacology, Pharmacology & Toxicology Keywords: DILIrank; DILI; drug hepatotoxicity; QSAR; nested cross-validation; virtual screening; in silico
Online: 14 February 2020 (02:24:04 CET)
Drug-induced liver injury (DILI) remains one of the challenges in the safety profile of both authorized and candidate drugs, and predicting hepatotoxicity from the chemical structure of a substance remains a challenge worth pursuing, one that is also coherent with the current tendency to replace non-clinical tests with in vitro or in silico alternatives. In 2016, a group of researchers from the FDA published an improved annotated list of drugs with respect to their DILI risk, constituting “the largest reference drug list ranked by the risk for developing drug-induced liver injury in humans”: DILIrank. This paper is one of the few attempting to predict liver toxicity using the DILIrank dataset. Molecular descriptors were computed with the Dragon 7.0 software, and a variety of feature selection and machine learning algorithms were implemented in the R computing environment. Nested (double) cross-validation was used to externally validate the selected models. A total of 78 models with reasonable performance were selected and stacked through several approaches, including the building of multiple meta-models. The performance of the stacked models was slightly superior to that of other published models. The models were applied in a virtual screening exercise on over 100,000 compounds from the ZINC database, and about 20% of them were predicted to be non-hepatotoxic.
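Nested (double) cross-validation separates hyper-parameter selection from performance estimation. The study worked in R with Dragon descriptors; the sketch below merely illustrates the structure in Python with synthetic data and an assumed classifier and grid:

```python
# Illustrative nested cross-validation: the inner loop tunes hyper-parameters,
# the outer loop gives an externally validated performance estimate.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, cross_val_score

X, y = make_classification(n_samples=300, n_features=20, random_state=1)

inner = GridSearchCV(LogisticRegression(max_iter=5000),
                     param_grid={"C": [0.01, 0.1, 1, 10]}, cv=3)  # inner loop
outer_scores = cross_val_score(inner, X, y, cv=5)                 # outer loop
print(f"nested-CV accuracy: {outer_scores.mean():.3f} ± {outer_scores.std():.3f}")
```

Because the outer folds never influence the tuning, the outer score is an unbiased estimate of how the tuned model generalizes.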
REVIEW | doi:10.20944/preprints202210.0390.v1
Subject: Life Sciences, Other Keywords: aflatoxins; APEDA; EU-ML; FSSAI-ML; mycotoxins; ochratoxin A; patulin; regulation; method validation.
Online: 25 October 2022 (12:30:35 CEST)
Mycotoxins are deleterious fungal secondary metabolites that contaminate food and feed, thereby creating concerns regarding food safety. Common fungal genera proliferate easily in Indian tropical and sub-tropical conditions, and scientific attention is warranted to curb their growth. To address this, two nodal governmental agencies, namely the Agricultural and Processed Food Products Export Development Authority (APEDA) and the Food Safety and Standards Authority of India (FSSAI), have over the past two decades developed and implemented analytical methods and quality control procedures to monitor mycotoxin levels in a range of food matrices and assess risks to human health. However, comprehensive information on such advancements in mycotoxin testing, and on the issues in implementing these regulations, is inadequately covered in recent literature. The aim of this review is thus to present a systematic picture of the role played by the FSSAI and APEDA in mycotoxin control at the domestic level and in the promotion of international trade, along with certain challenges in mycotoxin monitoring. Additionally, it unfolds various regulatory concerns regarding mycotoxin mitigation in India. Overall, it provides valuable insights to the Indian farming community, food supply chain stakeholders, and researchers about India’s success in arresting mycotoxins throughout the food supply chain.
ARTICLE | doi:10.20944/preprints201912.0093.v1
Subject: Life Sciences, Genetics Keywords: mixed linear model; genotyping-by-sequencing; functional validation; RT-qPCR; resistance genes; GWAS
Online: 7 December 2019 (12:41:39 CET)
Meloidogyne javanica, a root-knot nematode, is an important problem in soybean-growing areas, leading to substantial yield losses. Some accessions have been identified that carry resistance loci to this nematode species. In this study, a set of 317 soybean accessions was characterized for resistance to M. javanica. A genome-wide association study (GWAS) was performed using SNPs from genotyping-by-sequencing (GBS), and a region of 29.2 kbp on chromosome 13 was identified. Haplotype analysis showed that the SNPs were able to discriminate susceptible from resistant accessions, with 25 accessions sharing the resistance locus. Furthermore, 5 accessions may be new sources of M. javanica resistance. Screening of the SNPs in the USDA soybean germplasm showed that several accessions previously reported as resistant to other nematodes also carry the resistance haplotype on chromosome 13. High levels of concordance between the phenotypes of Brazilian cultivars and the SNPs on chromosome 13 were observed. An in silico analysis of the mapped region of the soybean genome revealed the presence of 5 genes with structural similarity to major resistance genes. The expression levels of the candidate genes in the interval indicated a potential pseudogene, and two other model genes were up-regulated in the resistance source after pathogen infection. The SNPs associated with the resistance-conferring region are an important tool for introgression of the resistance by marker-assisted selection in soybean breeding programs.
ARTICLE | doi:10.20944/preprints201906.0036.v1
Subject: Earth Sciences, Geoinformatics Keywords: digital elevation models; multi-source fusion; multi-scale fusion; global evaluation; accuracy validation.
Online: 5 June 2019 (10:26:30 CEST)
The quality of digital elevation models (DEMs) is inevitably affected by the limitations of the imaging modes and the generation methods. One effective way to address this problem is to merge the available datasets through data fusion. In this paper, a fusion-based global DEM dataset (82°S-82°N) is introduced, which we refer to as GSDEM-30. This is a 30-m DEM reconstructed mainly from the unfilled SRTM1, AW3D30, and ASTER GDEM v2 datasets by combining multi-source and multi-scale fusion techniques. A comprehensive evaluation of the GSDEM-30 data, as well as of the 30-m ASTER GDEM v2 and AW3D30 DEM, is presented. Global ICESat GLAS data and the local National Elevation Dataset (NED) were used as the reference for the vertical accuracy validation, while GlobeLand30 was introduced for the landscape analysis. Furthermore, we employed the maximum slope approach to detect potential artefacts in the DEMs. The results show that the GDEM data are seriously affected by noise and artefacts. With the advantage of multiple datasets and refined post-processing, GSDEM-30 is contaminated with fewer anomalies than both ASTER GDEM and AW3D30. The fusion techniques used can also be applied to the reconstruction of other fused DEM datasets.
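The maximum slope approach mentioned above flags cells whose steepest gradient to a neighbour exceeds what plausible terrain allows. A minimal sketch, with an invented grid, cell size, and threshold (the paper's actual parameters are not reproduced here):

```python
# Flag potential DEM artefacts: cells whose maximum slope to any of the 8
# neighbours exceeds a plausible terrain slope.
import numpy as np

def max_slope(dem, cellsize=30.0):
    """Maximum absolute slope (rise/run) of each cell w.r.t. its 8 neighbours."""
    pad = np.pad(dem, 1, mode="edge")
    slopes = []
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            if dy == 0 and dx == 0:
                continue
            run = cellsize * np.hypot(dy, dx)
            shifted = pad[1 + dy:pad.shape[0] - 1 + dy,
                          1 + dx:pad.shape[1] - 1 + dx]
            slopes.append(np.abs(shifted - dem) / run)
    return np.max(slopes, axis=0)

dem = np.zeros((5, 5))
dem[2, 2] = 500.0                        # a spike-like artefact
artefacts = max_slope(dem) > 1.0         # flag slopes steeper than 45 degrees
print(artefacts[2, 2])                   # the spike cell is flagged
```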
ARTICLE | doi:10.20944/preprints201905.0309.v1
Subject: Earth Sciences, Oceanography Keywords: MODIS; oceanography; remote sensing; Saildrone; sea surface salinity; sea surface temperature; SMAP; validation
Online: 27 May 2019 (10:19:17 CEST)
Traditional ways of validating satellite-derived sea surface temperature (SST) and sea surface salinity (SSS) products, using comparisons with buoy measurements, do not allow for evaluating the impact of mesoscale to submesoscale variability. Here we present the validation of remotely-sensed SST and SSS data against the unmanned surface vehicle (USV) – Saildrone – measurements from the Spring 2018 Baja deployment. More specifically, biases and root mean square differences (RMSD) were calculated between USV-derived SST and SSS values, and six satellite-derived SST (MUR, OSTIA, CMC, K10, REMSS, and DMI) and three SSS (JPLSMAP, RSS40, RSS70) products. Biases between the USV SST and OSTIA/CMC/DMI were approximately zero while MUR showed a bias of 0.2 °C. OSTIA showed the smallest RMSD of 0.36 °C while DMI had the largest RMSD of 0.5 °C. An RMSD of 0.4 °C between Saildrone SST and the satellite-derived products could be explained by the daily variability in USV SST which currently cannot be resolved by remote sensing measurements. For SSS, values from the JPLSMAP product showed salty biases of 0.2 PSU, while RSS40 and RSS70 showed fresh biases of 0.3 PSU. An RMSD of 0.4 PSU could not be explained solely by the daily variability of the USV-derived SSS. Coherences were significant at the longer wavelengths, with a local maximum at 100 km that is most likely associated with the mesoscale turbulence in the California Current System.
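The bias and RMSD statistics quoted above reduce to two simple formulas over collocated pairs. A sketch with invented values (not mission data):

```python
# Bias and RMSD between collocated in-situ (USV) and satellite values.
import numpy as np

def bias_rmsd(usv, sat):
    diff = sat - usv
    return diff.mean(), np.sqrt((diff ** 2).mean())

rng = np.random.default_rng(0)
usv_sst = 18.0 + rng.normal(0.0, 0.5, 200)           # USV SST (deg C), synthetic
sat_sst = usv_sst + 0.2 + rng.normal(0.0, 0.3, 200)  # satellite, warm-biased

b, r = bias_rmsd(usv_sst, sat_sst)
print(f"bias {b:+.2f} C, RMSD {r:.2f} C")
```

Note that RMSD folds together the bias and the random scatter, which is why a product can have near-zero bias yet a sizeable RMSD.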
COMMUNICATION | doi:10.20944/preprints202112.0420.v3
Subject: Chemistry, Analytical Chemistry Keywords: Non-targeted methods; method validation; food fraud; food authenticity; mass spectrometry; spectroscopy; NGS; NMR
Online: 23 May 2022 (11:10:00 CEST)
As their name suggests, non-targeted methods (NTMs) do not aim at a predefined "needle in the haystack"; instead, they exploit all the constituents of the haystack. This form of analytical method is increasingly finding applications in food and feed testing. However, the concepts, terms, and considerations related to this burgeoning field of analytical testing need to be propagated for the benefit of those working in academic research, commercial development, and official control. This paper addresses frequently asked questions around the notations and terminologies surrounding NTMs. The widespread development and adoption of these methods also necessitates approaches to NTM validation, i.e., evaluating the performance characteristics of a method to determine whether it is fit for purpose. This work aims to provide a roadmap for approaching NTM validation. In doing so, the paper deliberates on the different considerations that influence the approach to validation and provides suggestions thereof.
ARTICLE | doi:10.20944/preprints202101.0545.v3
Subject: Engineering, Energy & Fuel Technology Keywords: Solar radiation; Satellite-derived irradiance; Global Horizontal Irradiance; Clear sky model; ground stations; validation
Online: 18 March 2021 (14:30:42 CET)
Access to reliable, clean, modern cooking enhances life chances. One option is photovoltaic cooking systems. Accurate solar data are needed to ascertain to what extent these can satisfy the needs of local people. This paper investigates how to choose the most accurate satellite-derived solar irradiance database for use in Africa. This is necessary because there is a general shortage of ground measurements for Africa. The solar data are needed to model the output of solar cooking systems, for instance a system comprising a solar panel, battery, and electric pressure cooker. Four easily accessible satellite databases are validated against ground measurements using a range of statistical tests. Results demonstrate the impact of the mathematical measure used and the phenomenon of balancing errors. Fitting the satellite model to the appropriate climate zone and/or nearby measurements improves accuracy, as does higher spatial and temporal resolution of the input parameters. That said, all four databases reviewed were found to be suitable for simulating PV yield in East Africa.
ARTICLE | doi:10.20944/preprints202102.0600.v1
Subject: Engineering, Automotive Engineering Keywords: Tunnel boring machine (TBM); correlation models; mechanical and operational parameters; performance prediction; model validation
Online: 26 February 2021 (09:31:43 CET)
The study takes into account different classes of tunnel boring machines (TBMs), with the aim of identifying correlation models meant to estimate, at a preliminary design phase, the construction time of a tunnel and to evaluate the mechanical and operational parameters of the TBMs, starting from knowledge of the tunnel length and/or the excavation diameter. To achieve this goal, a database was first created by collecting the most meaningful technical parameters from a large number of tunnels; it was then statistically analysed in Microsoft Excel. In a first phase, forecasting models were identified for the three types of machines investigated, separately for compact rocks (open TBM) and fractured rocks (single and double shield TBM). Then, the mechanical parameters collected in the database were analysed, with the aim of obtaining models that take into account, in addition to the type of TBM, the geological context and the type of rock characterising the rock mass. Finally, the study was validated on a real case, the Moncenisio base tunnel, a work included in the new Turin–Lyon connection line. The estimated values were compared with the real ones in order to verify the accuracy of the experimental models identified.
ARTICLE | doi:10.20944/preprints202008.0693.v1
Subject: Engineering, Industrial & Manufacturing Engineering Keywords: Industry 4.0; Product Data Management; Product Life Cycle Management; Concurrent Engineering; Validation of Design
Online: 31 August 2020 (04:17:05 CEST)
All departments in a business work separately, but toward the same purpose. In this article, a system has been developed that gives not only the mechanical design department but also the manufacturing, storage, process planning, quality control, electrical design, and purchasing departments access to the required information. Initially, current manufacturing result information was collected from the project attendees. Secondly, a workflow was designed based on the current data flow. All the project stakeholders were introduced to the product data management system and began to use it. In the absence of such a system, losses of time, scrap, and wasted engineering effort would have occurred. The system allowed the company owners to be sure that no faulty design revision would be produced after it started. In addition, automating bill-of-materials generation provided the purchasing department with correct and up-to-date information about outsourced parts. Allowing different engineering disciplines to work together provided a more suitable environment. Gradually, these conditions allowed all the departments to work faster and to bring new products to market much faster than before the system. Tracing the workflows for management purposes is handled by the system. A ‘Validation of Design’ process was modelled for the company.
ARTICLE | doi:10.20944/preprints202006.0360.v1
Keywords: Term deposit subscription; 10-fold stratified cross-validation; Neural network; DT; MLP; k-NN
Online: 30 June 2020 (08:22:58 CEST)
Term deposits can enhance profit from both the bank and the customer perspective, accelerating the finance sector. This paper focuses on the likelihood of customers taking a term deposit subscription. Bank campaign efforts and customer details are influential when considering the possibility of a term deposit subscription. This paper provides an automated system that predicts term deposit investment possibilities in advance. A neural network (NN) with stratified 10-fold cross-validation is proposed as the predictive model and is compared with other benchmark classifiers such as the k-Nearest Neighbor (k-NN), Decision Tree (DT), and Multi-Layer Perceptron (MLP) classifiers. The experimental study concluded that the proposed model provides significantly better prediction results than the baseline models, with an accuracy of 88.32% and a Mean Squared Error (MSE) of 0.1168.
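The evaluation protocol described above (stratified 10-fold cross-validation over several classifiers) can be sketched as follows. Data, feature count, and model settings are invented for illustration; the bank-campaign dataset itself is not reproduced:

```python
# Stratified 10-fold cross-validation comparison of several classifiers.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=600, n_features=12, random_state=2)
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=2)  # keeps class ratios per fold

models = {
    "NN": MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=2),
    "k-NN": KNeighborsClassifier(),
    "DT": DecisionTreeClassifier(random_state=2),
}
results = {name: cross_val_score(m, X, y, cv=cv).mean() for name, m in models.items()}
print(results)
```

Stratification matters here because subscription datasets are typically imbalanced; plain k-fold splits could leave some folds with very few positive cases.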
ARTICLE | doi:10.20944/preprints201911.0390.v1
Subject: Chemistry, Food Chemistry Keywords: caffeine; 16-O-methylcafestol; kahweol; furfuryl alcohol; tetramethylsilane (TMS); magnetic resonance spectroscopy; validation studies
Online: 30 November 2019 (10:20:17 CET)
Monitoring coffee quality as a means of detecting and preventing economically motivated fraud is an important aspect of international commerce today. Therefore, there is a compelling need for rapid high throughput validated analytical techniques such as quantitative proton NMR spectroscopy for screening and authenticity testing. For this reason, we sought to validate an NMR spectroscopic method for routine screening of coffee for quality and authenticity. A factorial experimental design was used to investigate the influence of NMR device, extraction time and nature of coffee on the content of caffeine, 16-O-methylcafestol (OMC), kahweol, furfuryl alcohol and 5-hydroxymethylfurfural (HMF) in coffee. The method was successfully validated for specificity, selectivity, sensitivity and linearity of detector response. The proposed method produced satisfactory precision for all analytes in roasted coffee, except for kahweol in canephora (robusta) coffee. The proposed validated method may be used for routine screening of roasted coffee for quality and authenticity control, as its applicability was demonstrated during the recent OPSON VIII Europol-Interpol operation on coffee fraud control.
Subject: Life Sciences, Biochemistry Keywords: enzyme kinetics; Jupyter notebook; kinetic modelling; matplotlib; NMR spectroscopy; optimisation; parametrisation; PySCeS; SciPy; validation
Online: 11 June 2019 (11:15:01 CEST)
Bottom-up systems biology entails the construction of kinetic models of cellular pathways by collecting kinetic information on the pathway components (e.g. enzymes) and collating this into a kinetic model, based for example on ordinary differential equations. This requires integration and data transfer between a variety of tools, ranging from data acquisition in kinetics experiments, to fitting and parameter estimation, to model construction, evaluation and validation. Here, we present a workflow that uses the Python programming language, specifically the modules from the SciPy stack, to facilitate this task. Starting from raw kinetics data, acquired either from spectrophotometric assays with microtitre plates or from NMR spectroscopy time courses, we demonstrate the fitting and construction of a kinetic model using scientific Python tools. The analysis takes place in a Jupyter notebook, which keeps all information related to a particular experiment together in one place and thus serves as an e-labbook, enhancing reproducibility and traceability. The Python programming language serves as an ideal foundation for this framework because it is powerful yet relatively easy to learn for the non-programmer, has a large library of scientific routines and active user community, is open-source and extensible, and many computational systems biology software tools are written in Python or have a Python API. Our workflow thus enables investigators to focus on the scientific problem at hand rather than worrying about data integration between disparate platforms.
ARTICLE | doi:10.20944/preprints202208.0361.v1
Subject: Engineering, Mechanical Engineering Keywords: Blower wind tunnel design; CFD; Boundary layer controller; Turbulent intensity Validation; Power spectral density (PSD)
Online: 19 August 2022 (08:31:04 CEST)
A new subsonic blower wind tunnel design has been studied both numerically and experimentally. This paper initially addresses each sequential stage of the wind tunnel design process. Rather than applying the standard method of modelling solely the flow in the test section, a large-scale CFD model of the whole wind tunnel was employed. The loss of every constituent element was calculated, and the losses were summed to determine the power needed for wind tunnel operation, which was used as the “intake fan” boundary condition in the CFD model. Flow uniformity and turbulence intensity measurements in an empty test section, using a Pitot-static tube and a hot-wire anemometer (HWA), were then introduced to validate the CFD results. The results showed that flow quality was significantly affected by the boundary layer controllers (honeycomb and mesh screens) in the settling chamber and wide-angle diffuser. Investigations were also conducted to evaluate the flow deficit in the wake behind a convex hump model using HWA; these additional experimental tests were carried out to validate the suitability of the wind tunnel for aerodynamic research.
ARTICLE | doi:10.20944/preprints202201.0227.v1
Subject: Behavioral Sciences, Other Keywords: Bayesian inference; race and ethnicity imputation; All Payer Claims Database; vital statistics death records; validation
Online: 17 January 2022 (12:40:15 CET)
Background: All Payer Claims Databases (APCDs) are a rich source of health information; however, race and ethnicity (R&E) data are largely missing. Bayesian Improved Surname Geocoding (BISG) is a common R&E imputation method, yet validation of BISG in APCDs is lacking. We used BISG to impute missing R&E in the Oregon APCD. Methods: BISG-imputed R&E for Asian Pacific Islanders (API), Blacks, Hispanics, and Whites was contrasted with the gold standard (vital statistics), and improvements in sensitivity and specificity were assessed. Logistic regression examined whether missing R&E was random across patient characteristics. Results: Among 85,857 individuals in the study, 32.1% (n=27,594) had missing R&E. Missing R&E was not randomly distributed: there were higher odds of missingness among males, Whites, those aged 65 and older, and commercially insured individuals. Differences in the percentage missing were also found by co-morbid conditions and causes of mortality. Imputing the missing R&E with the BISG method improved the sensitivity to identify White, Black, API, and Hispanic individuals. Conclusions: APCDs can benefit from enhancing missing R&E with BISG imputation to perform more robust population-health-level analyses and identify inequities according to R&E without losing power or dropping non-random records with missing R&E data.
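At its core, BISG is a single Bayesian update: surname-based race/ethnicity probabilities are combined with the demographic composition of the person's geography. The toy sketch below shows only that update; all numbers are invented, whereas real BISG draws on Census surname lists and block-group demographics:

```python
# Toy BISG-style update: P(race | surname, geo) is proportional to
# P(race | surname) * P(geo | race). All probabilities are invented.
p_race_given_surname = {"White": 0.60, "Black": 0.10, "API": 0.25, "Hispanic": 0.05}
# Share of each group living in this person's (hypothetical) census tract:
p_geo_given_race = {"White": 0.001, "Black": 0.004, "API": 0.010, "Hispanic": 0.002}

unnorm = {r: p_race_given_surname[r] * p_geo_given_race[r]
          for r in p_race_given_surname}
total = sum(unnorm.values())
posterior = {r: v / total for r, v in unnorm.items()}  # normalized posterior
print(max(posterior, key=posterior.get))               # most probable category
```

In this invented tract, the geographic evidence overturns the surname prior: the surname alone favors "White", but the posterior favors "API".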
ARTICLE | doi:10.20944/preprints201809.0200.v1
Subject: Mathematics & Computer Science, Information Technology & Data Management Keywords: objective clustering; biclustering; gene regulatory networks; reconstruction; validation; gene expression profiles; noise component; systems stability
Online: 11 September 2018 (13:48:12 CEST)
ARTICLE | doi:10.20944/preprints201608.0149.v1
Subject: Earth Sciences, Environmental Sciences Keywords: landsat 8 OLI; Nalban Lake; East Kolkata Wetland; chlorophyll-a prediction; study points; validation points
Online: 15 August 2016 (13:51:19 CEST)
Landsat Operational Land Imager (OLI) data and consequent laboratory measurements were used to predict chlorophyll-a (Chl-a) concentration and the trophic state of an inland lake within the East Kolkata Wetland, India. The most suitable band ratio was identified by performing Pearson correlation analysis between Chl-a concentrations and candidate OLI bands and band ratios at the study points. The results showed the highest correlation coefficient for the band ratio OLI5/OLI4, with an R value of 0.85. The prediction model was then developed by applying regression analysis between the band ratio OLI5/OLI4 and the Chl-a concentration at the study points. The reflectance ratios of the validation points were given as input to the prediction model, and the model output was taken as the predicted Chl-a values of the validation points to check the efficiency of the model. The regression between laboratory-derived and model-fitted Chl-a values at the validation points revealed a high correlation, with an R2 value of 0.78. The Trophic State Index (TSI) of the lake was also calculated from the laboratory-derived and model-fitted Chl-a values of the validation points. The study found a high correlation between the TSI determined from predicted data and the TSI from laboratory reference data (R = 0.88). The TSI values of the lake ranged from 65 to 75, indicating that the lake is in a eutrophic to hypereutrophic condition. This empirical study showed that Landsat 8 OLI imagery can be effectively applied to estimate Chl-a levels and trophic states for inland lakes.
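The modelling step above is a simple linear regression of Chl-a on the band ratio, fitted at study points and applied at validation points. A sketch with invented ratios and concentrations (not the study's measurements):

```python
# Band-ratio regression for Chl-a: fit at study points, predict at validation
# points. Data are illustrative only.
import numpy as np

ratio_study = np.array([1.1, 1.3, 1.5, 1.8, 2.0, 2.4])       # OLI5/OLI4
chla_study = np.array([12.0, 18.0, 25.0, 34.0, 40.0, 52.0])  # lab Chl-a (ug/L)

slope, intercept = np.polyfit(ratio_study, chla_study, 1)    # linear model

ratio_valid = np.array([1.2, 1.9, 2.2])                      # validation ratios
chla_pred = slope * ratio_valid + intercept                  # predicted Chl-a
print(np.round(chla_pred, 1))
```

Comparing `chla_pred` against laboratory Chl-a at the same validation points yields the R2 used to judge the model.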
ARTICLE | doi:10.20944/preprints201809.0056.v1
Subject: Engineering, Electrical & Electronic Engineering Keywords: mmWave; 5G heterogeneous network; meshed backhaul; outdoor dynamic crowd; SDN; dynamic construction; testbed; numerical analysis; experimental validation
Online: 4 September 2018 (05:51:50 CEST)
5G heterogeneous networks overlaid by millimeter-wave (mmWave) access employ mmWave meshed backhauling as a promising cost-efficient backhaul architecture. Because the mobile traffic distribution in practice is both time-variant and spatially non-uniform, dynamic construction of the mmWave meshed backhaul is a prerequisite to support the varying traffic distribution. Focusing on such a scenario of outdoor dynamic crowd (ODC), this paper proposes a novel method to control the mmWave meshed backhaul for efficient operation of the mmWave overlay 5G HetNet through Software-Defined Network (SDN) technology. Our algorithm features two functionalities, i.e., backhauling route multiplexing for overloaded mmWave small cell base stations (SC-BSs) and mmWave SC-BS ON/OFF status switching for underloaded spots. In this paper, the effectiveness of the proposed meshed network is confirmed by both numerical analyses and experimental results. Simulations are conducted over a practical user distribution modeled from measured data in realistic environments. Numerical results show that the proposed algorithm can cope with locally intensive traffic and reduce energy consumption. Furthermore, a WiGig (Wireless Gigabit Alliance certified) device based testbed is developed for Proof-of-Concept (PoC), and preliminary measurement results confirm the efficiency of the proposed dynamic meshed network formation.
ARTICLE | doi:10.20944/preprints202301.0480.v1
Subject: Life Sciences, Virology Keywords: avian influenza; highly pathogenic avian influenza; next generation sequencing; whole genome sequencing; nanopore technology; methods comparison; clinical validation
Online: 26 January 2023 (15:19:53 CET)
As exemplified by the global response to the SARS-CoV-2 pandemic, whole genome sequencing played an important role in monitoring the evolution of novel viral variants and provided guidance on potential antiviral treatments. The recent rapid and extensive introduction and spread of highly pathogenic avian influenza virus in Europe, North America and elsewhere raises the need for similarly rapid sequencing to aid in appropriate response and mitigation activities. To facilitate this objective, we investigated a next generation sequencing platform that uses a portable nanopore sequencing device to generate and present data in real time. This platform offers the potential to extend in-house sequencing capacities to laboratories that may otherwise lack the resources to adopt sequencing technologies requiring large benchtop instruments. We evaluated this platform for routine use in a diagnostic laboratory, testing different primer sets for the whole genome amplification of influenza A virus and comparing five different library preparation approaches for sequencing on the nanopore platform using the MinION flow cell. A limited amplification procedure and a rapid procedure were found to be the best among the approaches taken.
ARTICLE | doi:10.20944/preprints202211.0367.v1
Subject: Social Sciences, Education Studies Keywords: Sustainable gender equality; self-efficacy; gender mainstreaming; STEM higher education; STEM student teachers’ perceptions; scale validation; Spain; Greece
Online: 21 November 2022 (03:38:01 CET)
In the context of the Education-2030 Framework for Action, an important goal for initial STEM teacher education is to provide professional development on equality and gender awareness. This study explored whether prospective STEM secondary teachers are prepared to implement a sustainable gender-sensitive practice upon graduation. To this end, we cross-culturally validated the TEGEP (Teacher Self-Efficacy for Gender Equality Practice) scale and compared STEM student teachers’ perceptions of self-efficacy by country and sex. Participants were 205 STEM (science, technology, engineering, and mathematics) secondary school student teachers (136 Greek and 69 Spanish) drawn from seven public universities (six Greek, one Spanish). Statistical analysis confirmed the structure and factor invariance of the TEGEP across countries and between sexes. The results show that gender equality self-efficacy is only moderate, that perceived competence in gender knowledge was significantly higher in Greek than in Spanish STEM student teachers, and that the latter felt more competent than the Greek participants in developing values and attitudes with regard to gender. The study provides a cross-validated instrument to measure gender equality self-efficacy in STEM teacher education and to evaluate sustainable changes after planned interventions.
ARTICLE | doi:10.20944/preprints202103.0589.v1
Subject: Materials Science, Biomaterials Keywords: LLDPE; quasi-static and dynamic experimental tests; impact energy absorption; material parameter identification; constitutive material model; validation; simulation
Online: 24 March 2021 (13:38:40 CET)
Current industrial trends bring new challenges in energy absorbing systems. Polymer materials, as traditional packaging materials, seem promising due to their low weight, structure, and production price. Based on the review, linear low-density polyethylene was identified as the most promising material for absorbing impact energy. The current paper addresses the identification of the material parameters and the development of a constitutive material model to be used in future design by virtual prototyping. The paper deals with the experimental measurement of the stress-strain relations of linear low-density polyethylene under static and dynamic loading. The quasi-static measurement is realized in two perpendicular principal directions and is supplemented by a test measurement in the 45° direction, i.e. exactly between the principal directions. The quasi-static stress-strain curves are analyzed as an initial step toward strain-rate-dependent dynamic material behavior. The dynamic response is tested in a drop tower using a spherical impactor hitting the flat multi-layered material specimen at two different energy levels. The strain-rate-dependent material model is identified by optimizing the static material response obtained in the dynamic experiments. The material model is validated by virtual reconstruction of the experiments and by comparing the numerical results to the experimental ones.
ARTICLE | doi:10.20944/preprints202002.0134.v1
Subject: Life Sciences, Biochemistry Keywords: autophagy; autophagonizer; target identification of label-free compound; target validation; autophagic flux; autophagy inhibition; lysosomal integrity function of Hsp70
Online: 11 February 2020 (09:03:51 CET)
Manipulating autophagy is a promising strategy for treating cancer, as several autophagy inhibitors have been shown to induce autophagic cell death. One of these, autophagonizer (APZ), induces apoptosis-independent cell death by binding an unknown target via an unknown mechanism. To identify APZ targets, we used a label-free drug affinity responsive target stability (DARTS) approach with a liquid chromatography/tandem mass spectrometry (LC-MS/MS) readout. Of 35 protein interactors, we identified Hsp70 as a key target protein of unmodified APZ in autophagy. Either APZ treatment or Hsp70 inhibition attenuates the integrity of lysosomes, which leads to autophagic cell death and exhibits excellent synergism with the clinical drug temozolomide in vitro, in vivo, and in an orthotopic glioma xenograft model. These findings demonstrate the potential of APZ to induce autophagic cell death and to be developed into a combination chemotherapeutic agent for glioma treatment. Collectively, our study demonstrates that APZ, a new autophagy inhibitor, can be used as a potent antitumor drug candidate against intractable glioma, and reveals a novel function of Hsp70 in the lysosomal integrity regulation of autophagy.
ARTICLE | doi:10.20944/preprints201801.0217.v1
Subject: Earth Sciences, Atmospheric Science Keywords: data assimilation; statistical diagnostics of analysis residuals; estimation of analysis error; air quality model diagnostics; Desroziers method; cross-validation
Online: 23 January 2018 (16:23:25 CET)
We examine how observations can be used to evaluate an air quality analysis by verifying against passive observations (i.e. cross-validation), which are not used to create the analysis, and we compare these verifications to those made against the same set of (active) observations that were used to generate the analysis. The results show that both active and passive observations can be used to evaluate first-moment metrics (e.g. bias), but only passive observations are useful for evaluating second-moment metrics such as the variance of observed-minus-analysis residuals and the correlation between observations and analysis. We derive a set of diagnostics based on passive observation-minus-analysis residuals and show that the true analysis error variance can be estimated without relying on any statistical optimality assumption. This diagnostic is used to obtain near-optimal analyses that are then used to evaluate the analysis error using several different methods. We compare the estimates according to the method of Hollingsworth-Lönnberg, Desroziers, a diagnostic we introduce, and the perceived analysis error computed from the analysis scheme, and conclude that, as long as the analysis is optimal, all estimates agree within a certain error margin. The analysis error variance at passive observation sites is also obtained.
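The passive-residual diagnostic can be sketched numerically: at sites withheld from the analysis, observation errors are uncorrelated with analysis errors, so the variance of observed-minus-analysis residuals is the sum of the two error variances, and the analysis error variance follows by subtracting an assumed observation error variance. A minimal illustration on synthetic data (the variable names and numbers are ours, not from the paper):

```python
import random

def analysis_error_variance(residuals, obs_error_var):
    """Estimate analysis error variance at passive observation sites.

    At passive sites, observation errors are uncorrelated with
    analysis errors, so var(O - A) = sigma_a^2 + sigma_o^2; the
    analysis error variance is the excess over the (assumed known)
    observation error variance.
    """
    n = len(residuals)
    mean = sum(residuals) / n
    var = sum((r - mean) ** 2 for r in residuals) / (n - 1)
    return max(var - obs_error_var, 0.0)

# Synthetic check: analysis error std 2.0, observation error std 1.0,
# so the diagnostic should recover sigma_a^2 close to 4.0.
random.seed(0)
res = [random.gauss(0, 2.0) + random.gauss(0, 1.0) for _ in range(100000)]
est = analysis_error_variance(res, obs_error_var=1.0)
```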
ARTICLE | doi:10.20944/preprints202209.0134.v1
Subject: Medicine & Pharmacology, General Medical Research Keywords: nasal function; validation; software; nasal resistance; rhinomanometry; acoustic rhinometry; peak nasal inspiratory flow meter; practice patterns; objective measurement outcomes; parameters
Online: 9 September 2022 (09:41:14 CEST)
Background: The Davidson Airway Function & Nasal Evaluation (DAFNE) Scoring System was developed as an intuitive, research-based scoring system that could be validated through beta testing and easily introduced to healthcare providers of the several subspecialties that treat nasal obstruction and breathing disorders (MDs, PAs, PTs, APRNs, DDSs, and DCs). This scoring system was shown to increase knowledge of airway function, nasal measurement parameters, and the identification of proper treatment options for sleep and breathing disorders. The basis for the DAFNE score was developed from a systematic review of nasal measurement data. Methods: Electronic searches of PubMed, MEDLINE, EMBASE, the Cochrane Library, and Scopus for publications between 1988 and 2022 were used to identify studies validating nasal function measurement parameters to create the algorithm for the DAFNE Score™. The systematic review was accomplished using the 2020 ‘Preferred Reporting Items for Systematic Reviews’ (PRISMA) guidelines. Results: Twenty studies met the inclusion criteria for the systematic review. Primary outcome measurements demonstrated the reliability, repeatability, and validity of the DAFNE measurement technologies, data, and output. Conclusions: The data analysis and systematic review uncovered both the need and a framework to develop and validate a web-based software algorithm, with global access, to improve the interpretation of data from three nasal measurement technologies. DAFNE Scoring should be used as an adjunct tool in routine clinical practice and research to further the understanding of the technology's data output and of how to collaborate with other healthcare providers to improve patient outcomes.
Subject: Medicine & Pharmacology, Other Keywords: Chikungunya virus; alphavirus; antiviral therapy; direct-acting antivirals; host-directed antivirals; in silico screening; in vivo validation; antiviral drug development
Online: 10 June 2021 (09:15:45 CEST)
Chikungunya virus (CHIKV) is a mosquito-transmitted alphavirus that has re-emerged in recent decades, causing large-scale epidemics in many parts of the world. CHIKV infection leads to a febrile disease known as chikungunya fever (CHIKF), which is characterised by severe joint pain and myalgia. As many patients develop a painful chronic stage, and neither antiviral drugs nor vaccines are available, the development of a potent CHIKV-inhibiting drug is crucial for CHIKF treatment. A comprehensive summary of current antiviral research on, and development of, small-molecule inhibitors against CHIKV is presented in this review. We highlight the different approaches used for the identification of such compounds and further discuss the identification and application of promising viral and host targets.
REVIEW | doi:10.20944/preprints201901.0285.v1
Subject: Engineering, Control & Systems Engineering Keywords: cyber physical systems; industry 4.0; MDE; lifetime verification & validation; dependability; correctness; flexibility; real-time self-adaptation; self-management; self-healing
Online: 29 January 2019 (04:45:47 CET)
Cyber Physical Systems (CPS) have been a popular research area in the last decade. The dependability of CPS is still a critical issue, yet few surveys have been published in this domain. CPS are dynamic, complex systems involving various multidisciplinary technologies. To avoid human error and to simplify management, self-management CPS (SCPS) are a wise choice, and to achieve dependable self-management, a systematic solution is necessary to verify the design, guarantee the safety of self-adaptation decisions, and maintain the health of the SCPS. This survey first recalls the concepts of dependability, proposes a generic environment-in-loop processing flow for self-management CPS, and then analyzes the error sources and challenges of self-management through the formal feedback flow. Focusing on reducing complexity, we first survey self-adaptive architecture approaches and applied dependability means; we then introduce a hybrid multi-role self-adaptive architecture and discuss the supporting technologies for dependable self-management at the architecture level. Focusing on dependable environment-centered adaptation, we investigate the verification and validation (V&V) methods for making safe self-adaptation decisions and the solutions for processing decisions dependably. For system-centered adaptation, comprehensive self-healing methods are summarized. Finally, we analyze the missing pieces of the technology puzzle and the future directions. In this survey, the technical trends for dependable CPS design and maintenance are discussed, and an all-in-one solution is proposed to integrate these technologies and build a dependable organic SCPS. To the best of our knowledge, this is the first comprehensive survey on dependable SCPS building and evaluation.
ARTICLE | doi:10.20944/preprints202301.0371.v1
Subject: Medicine & Pharmacology, Pharmacology & Toxicology Keywords: Network pharmacology; GO enrichment analysis; Key target validation; Hyperlipidemia; Hepatic steatosis; herbal combination; combinational effect; Arum ternata; Poria cocos; Zingiber officinale
Online: 20 January 2023 (06:41:44 CET)
The network pharmacology (NP) approach is a valuable novel methodology for understanding the complex pharmacological mechanisms of medicinal herbs. In addition, various in silico analysis techniques combined with NP can improve the understanding of various issues in natural product research. This study assessed the therapeutic effects of Arum ternata (AT), Poria cocos (PC), and Zingiber officinale (ZO) on hyperlipidemia after network pharmacologic analysis. A protein–protein interaction (PPI) network of forty-one key targets was analyzed to discover core functional clusters of the herbal compounds. KEGG pathway and gene ontology (GO) term enrichment analysis identified significant categories of hypolipidemic mechanisms. The STITCH database indicated a strong connection with several statin drugs, deduced from the similarity in targets. AT, PC, and ZO regulated genes related to energy metabolism and lipogenesis in HepG2 cells loaded with free fatty acids (FFAs). Furthermore, a combinational effect of the three-herb mixture was found: the herbal combination exerted superior efficacy compared to any single herb, particularly in regulating acetyl-CoA carboxylase (ACC) and carnitine palmitoyltransferase 1 (CPT-1). In conclusion, the network pharmacologic approach was used to assess potential targets of the herbal combination for treatment. Experimental data from FFA-induced HepG2 cells suggested that the combination of AT, PC, and ZO might attenuate hyperlipidemia and its associated hepatic steatosis.
Subject: Keywords: bioprocess models; model validation; model calibration; Quality by Design; mechanistical and statistical models; hybrid models; chemometric models; Biopharmaceutical engineering; regulatory guidance
Online: 10 May 2021 (09:57:09 CEST)
In bioprocess engineering, the Quality by Design (QbD) initiative encourages the use of models to define design spaces. However, clear guidance on how models for QbD are validated is still missing. In this review we provide a comprehensive overview of the validation methods, mathematical approaches, and metrics currently applied in bioprocess modeling. The methods cover analytics for the data used for modeling, model training and selection, and measures of predictiveness and model uncertainty. We point out general issues in model validation and calibration for different types of models and put this into the context of existing health authority recommendations. This review provides the starting point for developing guidance on model validation approaches. There is no one-size-fits-all approach, but this review shall help to identify the best-fitting validation method, or combination of methods, for the specific task and type of bioprocess model being developed.
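As a minimal illustration of the predictiveness measures such a validation might report, held-out predictions can be scored with RMSE and R². The metric choice and the data below are illustrative, not prescriptions from any guideline:

```python
def rmse(y_true, y_pred):
    """Root-mean-square error of predictions against held-out data."""
    n = len(y_true)
    return (sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n) ** 0.5

def r_squared(y_true, y_pred):
    """Coefficient of determination: fraction of variance explained."""
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot

# Held-out measurements vs. model predictions (invented values)
observed  = [1.2, 2.3, 3.1, 4.0, 5.2]
predicted = [1.0, 2.5, 3.0, 4.2, 5.0]
error = rmse(observed, predicted)
fit = r_squared(observed, predicted)
```

In a QbD setting these scores would be computed on data never used for calibration, mirroring the active/passive split discussed for other validation problems.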
ARTICLE | doi:10.20944/preprints201711.0047.v1
Subject: Earth Sciences, Atmospheric Science Keywords: data assimilation; statistical diagnostics of analysis residuals; estimation of analysis error; air quality model diagnostics; Desroziers et al. method; cross-validation
Online: 7 November 2017 (10:09:42 CET)
We present a general theory of estimation of analysis error covariances based on cross-validation as well as a geometric interpretation of the method. In particular we use the variance of passive observation–minus-analysis residuals and show that the true analysis error variance can be estimated, without relying on the optimality assumption. This approach is used to obtain near optimal analyses that are then used to evaluate the air quality analysis error using several different methods at active and passive observation sites. We compare the estimates according to the method of Hollingsworth-Lönnberg, Desroziers et al., a new diagnostic we developed, and the perceived analysis error computed from the analysis scheme, to conclude that, as long as the analysis is near optimal, all estimates agree within a certain error margin.
ARTICLE | doi:10.20944/preprints202301.0131.v1
Subject: Medicine & Pharmacology, Pathology & Pathobiology Keywords: BCG; bladder cancer; non-muscle-invasive bladder cancer; prospective validation; pT1 high-grade bladder cancer; risk stratification; ROL; substaging; TURBT; urothelial carcinoma.
Online: 9 January 2023 (01:17:29 CET)
Patients with pT1 high-grade (HG) bladder cancer (BC) and a very high risk of progression might benefit from immediate radical cystectomy (RC), but this option remains controversial. Validation of a standardized method to evaluate the extent of lamina propria (LP) invasion (which has recognized prognostic value) in transurethral resection (TURBT) specimens is still needed. The Rete Oncologica Lombarda (ROL) system showed a high predictive value for progression after TURBT in recent retrospective studies. Our aim was to validate the ROL system on a large mono-institutional prospective series of primary urothelial carcinomas. From 2016 to 2020, we adopted ROL for all patients with pT1 HG BC on TURBT. We employed a 1.0-mm threshold to stratify tumors into ROL1 and ROL2. A total of 222 pT1 HG BC cases were analyzed. Median age was 74 years, with male predominance (73.8%). ROL was feasible in all cases: 91 cases were ROL1 (41%) and 131 were ROL2 (59%). At a median follow-up of 26.9 months (IQR 13.8-40.6), we registered 80 recurrences and 40 progressions. ROL was a significant predictor of tumor progression in both univariable (HR 3.53; 95% CI 1.56-7.99; p<0.01) and multivariable (HR 2.90; 95% CI 1.25-6.75; p=0.01) Cox regression analyses. In Kaplan-Meier estimates, ROL showed correlation with both PFS (p=0.0012) and RFS (p=0.0167). Our results confirm the strong predictive value of ROL for progression in a large prospective series. We encourage the application of ROL for reporting the extent of LP invasion, substaging pT1 HG BC, and improving risk tables for urological decision making.
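The Kaplan-Meier estimates used above can be illustrated with a minimal product-limit sketch; the follow-up data below are invented for demonstration and are not from the study:

```python
def kaplan_meier(times, events):
    """Minimal Kaplan-Meier (product-limit) estimator.

    times: follow-up in months; events: 1 = progression, 0 = censored.
    Returns (time, survival) pairs at each time with at least one event.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = at_t = 0
        # group all subjects sharing this follow-up time
        while i < len(data) and data[i][0] == t:
            at_t += 1
            d += data[i][1]
            i += 1
        if d:
            surv *= 1.0 - d / n_at_risk
            curve.append((t, surv))
        n_at_risk -= at_t
    return curve

# Illustrative follow-up data (months, progression flag)
curve = kaplan_meier([6, 12, 12, 18, 24, 30], [1, 0, 1, 1, 0, 0])
```

Stratifying such curves by ROL1/ROL2 and comparing them (e.g. with a log-rank test) is the standard route to the p-values reported above.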
TECHNICAL NOTE | doi:10.20944/preprints202112.0250.v1
Subject: Earth Sciences, Oceanography Keywords: regional sea level; satellite altimetry; tide gauge; validation; mission bias; North Sea; Sentinel-3A; Jason-1; Jason-2; Jason-3; Envisat; Saral
Online: 15 December 2021 (09:25:54 CET)
Consistent calibration and monitoring are basic prerequisites for providing reliable time series of global and regional sea level variations from altimetry. The precision of sea level measurements and the regional biases of six altimeter missions (Jason-1/2/3, Envisat, Saral, Sentinel-3A) are assessed at eleven GNSS-controlled tide gauge stations in the German Bight (SE North Sea) for the period 2002 to 2019. The gauges are located partly in open water and partly at the coast close to mudflats. The altimetry is extracted at virtual stations at distances of 2 to 24 km from the gauges. The processing is optimized for the region and adjusted for comparison with instantaneous tide gauge readings. An empirical correction is developed to account for mean height gradients and slight differences in the tidal dynamics between gauge and altimetry, which improves the agreement between the two data sets by 15-75%. The precision of the altimeters depends on location and mission and is shown to be at least 1.8 to 3.7 cm, based on an assumed precision of 2 cm for the gauges. The accuracy of the regional mission biases depends strongly on the mean sea surface heights near the stations. The most consistent biases are obtained with the CLS2011 model, with mission-dependent accuracies from 1.3 to 3.4 cm. Hence, the GNSS-controlled tide gauges routinely operated by WSV might complement the calibration and monitoring activities at dedicated CalVal stations.
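The bias and precision figures quoted above follow from simple statistics of collocated altimeter-minus-gauge differences; a sketch under the stated assumption of independent errors and a known 2 cm gauge precision (variable names and the synthetic data are illustrative):

```python
import random

def bias_and_precision(alt, gauge, gauge_sigma=0.02):
    """Regional mission bias and altimeter precision from collocated pairs.

    bias      = mean(altimeter - gauge)            [m]
    precision = sqrt(var(differences) - gauge_sigma**2), assuming
                independent errors and a known gauge precision.
    """
    diffs = [a - g for a, g in zip(alt, gauge)]
    n = len(diffs)
    bias = sum(diffs) / n
    var = sum((d - bias) ** 2 for d in diffs) / (n - 1)
    return bias, max(var - gauge_sigma ** 2, 0.0) ** 0.5

# Synthetic sea level series: a 1.3 cm bias plus 3 cm combined noise
random.seed(1)
gauge = [random.gauss(0.0, 0.5) for _ in range(50000)]
alt = [g + 0.013 + random.gauss(0.0, 0.03) for g in gauge]
b, p = bias_and_precision(alt, gauge)
```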
ARTICLE | doi:10.20944/preprints202110.0122.v2
Subject: Earth Sciences, Geophysics Keywords: ICESat-2; Laser Altimetry; Kinematic GPS Experiments; Glaciology; Surge Glaciers; Svalbard; Density Dimension Algorithm for Ice Surfaces; Airborne Validation of Satellite Data
Online: 13 October 2021 (10:45:21 CEST)
The topic of this paper is the airborne evaluation of ICESat-2 Advanced Topographic Laser Altimeter System (ATLAS) measurement capabilities and surface-height determination over crevassed glacial terrain, with a focus on the geodetic accuracy of geophysical data collected from a helicopter. To obtain surface heights over crevassed and otherwise complex ice surfaces, ICESat-2 data are analyzed using the density-dimension algorithm for ice surfaces (DDA-ice), which yields surface heights at the nominal 0.7 m along-track spacing of ATLAS data. As the result of an ongoing surge, Negribreen, Svalbard, provided an ideal situation for the validation objectives in 2018 and 2019, because many different crevasse types and morphologically complex ice surfaces existed in close proximity. Airborne geophysical data, including laser altimeter data (profilometer data at 905 nm wavelength), differential Global Positioning System (GPS) data, Inertial Measurement Unit (IMU) data, on-board time-lapse imagery, and photographs, were collected during two campaigns in the summers of 2018 and 2019. The airborne experiment setup, geodetic corrections, and data processing steps are described here. To date, there is relatively little knowledge of the geodetic accuracy that can be obtained from kinematic data collection from a helicopter. Our study finds that (1) kinematic GPS data collection with correction in post-processing yields higher accuracies than Real-Time-Kinematic (RTK) data collection. (2) Processing of only the rover data using the Natural Resources Canada Spatial Reference System Precise Point Positioning (CSRS-PPP) software is sufficiently accurate for the sub-satellite validation purpose.
(3) Distances between ICESat-2 ground tracks and airborne ground tracks were generally better than 25 m, while the distance between predicted and actual ICESat-2 ground tracks was on the order of 9 m, which allows direct comparison of ice-surface heights and spatial statistical characteristics of crevasses from the satellite and airborne measurements. (4) The Lasertech Universal Laser System (ULS), operated at up to 300 m above ground level, yields full return frequency (400 Hz) and 0.06-0.08 m on-ice along-track spacing of height measurements. (5) Cross-over differences of airborne laser altimeter data are 0.1918 ± 2.385 m along straight paths over generally crevassed terrain, which implies a precision of approximately 2.4 m for ICESat-2 validation experiments. (6) In summary, the comparatively light-weight experiment setup of a suite of small survey equipment mounted on a Eurocopter (Helicopter AS-350), with kinematic GPS data analyzed in post-processing using CSRS-PPP, leads to high-accuracy repeats of the ICESat-2 tracks. The technical results (1)-(6) indicate that direct comparison of ice-surface heights and crevasse depths from the ICESat-2 and airborne laser altimeter data is warranted. The final result of the validation is that ICESat-2 ATLAS data, analyzed with the DDA-ice, facilitate surface-height determination over crevassed terrain, in good agreement with airborne data, including spatial characteristics such as surface roughness, crevasse spacing, and depth, which are key indicators of the deformation and dynamics of a glacier during surge.
ARTICLE | doi:10.20944/preprints201901.0294.v2
Subject: Mathematics & Computer Science, Analysis Keywords: Data preprocessing; data validation; recommendation engine; E-commerce; Click-through rate; Buy-through rate; online customer behavior; non-parametric outlier removal; personalization
Online: 1 February 2019 (10:22:37 CET)
E-commerce businesses employ recommender models to assist in identifying a personalized set of products for each visitor. To accurately assess the recommendations' influence on customer clicks and buys, three target areas (customer behavior, data collection, and the user interface) are explored for possible sources of erroneous data. Varied customer behavior misrepresents the recommendations' true influence on a customer due to the presence of B2B interactions and outlier customers. Non-parametric statistical procedures for outlier removal are delineated, and other strategies are investigated to account for the effect of a large percentage of new customers or high bounce rates. Subsequently, in data collection, we identify probably misleading interactions in the raw data, propose a robust method of tracking unique visitors, and accurately attribute the buy influence for combo products. Lastly, under user-interface issues we discuss the possible problems caused by the recommendation widget's positioning on the e-commerce website and the stringent conditions that should be imposed when utilizing data from the product listing page. This collective methodology results in an exact and valid estimation of the customer interactions influenced by the recommendation model in the context of standard industry metrics such as click-through rate, buy-through rate, and conversion revenue.
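A non-parametric outlier filter of the kind described, together with the standard metrics, can be sketched as follows. The MAD-based modified z-score is one common choice; the threshold, data, and function names here are illustrative, not taken from the paper:

```python
def mad_outlier_filter(values, k=3.5):
    """Non-parametric outlier removal via the median absolute deviation.

    Keeps values whose modified z-score |0.6745*(x - median)/MAD| <= k.
    """
    s = sorted(values)
    n = len(s)
    med = (s[n // 2] + s[(n - 1) // 2]) / 2
    abs_dev = sorted(abs(x - med) for x in values)
    mad = (abs_dev[n // 2] + abs_dev[(n - 1) // 2]) / 2
    if mad == 0:
        return list(values)
    return [x for x in values if abs(0.6745 * (x - med) / mad) <= k]

# Item views per visitor session, with one B2B-like bulk outlier
views = [3, 5, 4, 6, 5, 4, 7, 5, 400]
kept = mad_outlier_filter(views)

# Standard industry metrics on the cleaned data
def ctr(clicks, impressions):
    """Click-through rate of recommended items."""
    return clicks / impressions

def btr(buys, clicks):
    """Buy-through rate: buys attributed per recommendation click."""
    return buys / clicks
```

Unlike a mean/standard-deviation rule, the median and MAD are barely moved by the single 400-view session, so the bulk buyer is removed without distorting the threshold.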
CONCEPT PAPER | doi:10.20944/preprints202204.0104.v1
Subject: Life Sciences, Other Keywords: Method validation; droplet digital PCR; orthogonal factorial design; variance components; Poisson assumption; cloglog model; target DNA copies per droplet; Monte Carlo; prediction interval
Online: 12 April 2022 (08:46:49 CEST)
For the in-house validation of a droplet digital PCR method, a factorial experimental design was implemented. This design serves different purposes. On the one hand, it is an efficient design in relation to the workload involved in achieving a desirable level of reliability of variance estimates. On the other hand, it allows a partitioning of total variance into different components, thus providing information regarding the dominant sources of random variation. The statistical modelling reflects the actual measurement mechanism, establishing relationships between nominal target DNA copies per well, the range of variation of copy numbers per droplet, probability of detection values, and estimated numbers of copies.
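The Poisson assumption behind the cloglog model can be made concrete: a droplet is negative only if it contains zero copies, so the mean number of copies per droplet is λ = −ln(negative fraction), and the cloglog link log(−log(1 − p)) is linear in log λ. A minimal sketch (the droplet volume is an illustrative assumption, not a value from the paper):

```python
import math

def copies_per_droplet(positive, total):
    """Mean target copies per droplet under the Poisson assumption.

    A droplet is negative iff it holds zero copies, P(0) = exp(-lam),
    so lam = -ln(negative fraction). The cloglog link in the
    statistical model is log(lam) = log(-log(1 - p)), with p the
    probability of a positive droplet.
    """
    p = positive / total
    return -math.log(1.0 - p)

def concentration(positive, total, droplet_volume_nl=0.85):
    """Copies per microlitre of reaction (droplet volume illustrative)."""
    lam = copies_per_droplet(positive, total)
    return lam / (droplet_volume_nl * 1e-3)

# Illustrative well: 4,000 positive droplets out of 20,000
lam = copies_per_droplet(positive=4000, total=20000)
```

Propagating the Poisson and partition-count variability of λ through this relation (e.g. by Monte Carlo) is what yields the prediction intervals the design is meant to validate.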
ARTICLE | doi:10.20944/preprints201707.0044.v1
Subject: Engineering, Control & Systems Engineering Keywords: cyber physical systems; industry 4.0; MDE; hardware and software co-design; lifetime verification & validation; dependability; correctness; flexibility; self-management; self-adapting; self-healing
Online: 17 July 2017 (10:27:33 CEST)
Though Cyber Physical Systems (CPS) have become very popular in the last decade, the dependability of CPS is still a critical issue, and related surveys are rare. We try to spell out the jigsaw of technologies and identify the technical trends of dependable self-managing CPS. This survey first recalls the motivation and similar concepts. By analyzing four generic architectures, we summarize the common characteristics and related assurance technologies, and propose a more generic environment-in-loop processing flow for CPS and a formal interaction flow between physical space and cyber space. Further, the similarity between correctness and dependability is formally analyzed, and five new research questions for dependable self-managing CPS are presented. We then review the critical technologies, the related correctness verification & validation (V&V) methods, and the architectures for dependable self-managing CPS. Furthermore, the detailed dependability management and V&V technologies are surveyed, covering run-time fault management methods, whole-life-cycle V&V technologies, maintenance, and available tool sets. For holistic CPS development, modeling techniques and MDE (model-driven engineering) based V&V methods are analyzed in detail. We then complete the jigsaw of technologies and identify the missing part. Further, we propose the technical challenges and future directions. To the best of our knowledge, this is the first comprehensive survey on dependable self-managing CPS development and evaluation.