ARTICLE | doi:10.20944/preprints201805.0026.v1
Subject: Mathematics & Computer Science, Probability And Statistics Keywords: weighted distribution; Poisson-Lindley distribution; discrete distribution; weighted negative binomial Poisson-Lindley distribution
Online: 2 May 2018 (11:47:51 CEST)
This study introduces a new discrete distribution which is a weighted version of the Poisson-Lindley distribution. The weighted distribution is obtained using a negative binomial weight function and can be fitted to count data with over-dispersion. The p.m.f., p.g.f. and a simulation procedure for the new weighted distribution, namely the weighted negative binomial Poisson-Lindley (WNBPL) distribution, are provided. The maximum likelihood method for parameter estimation is also presented. The WNBPL distribution is fitted to several insurance datasets and compared to the Poisson and negative binomial distributions in terms of several statistical tests.
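The abstract does not reproduce the WNBPL p.m.f., but the base (unweighted) Poisson-Lindley distribution it weights has a standard closed-form p.m.f. A minimal numerical sanity check, assuming the classical one-parameter form:

```python
def poisson_lindley_pmf(x: int, theta: float) -> float:
    """P(X = x) = theta^2 * (x + theta + 2) / (theta + 1)^(x + 3), x = 0, 1, 2, ..."""
    return theta ** 2 * (x + theta + 2) / (theta + 1) ** (x + 3)

theta = 1.5
# Truncated sums: the tail decays geometrically, so 500 terms suffice.
total = sum(poisson_lindley_pmf(x, theta) for x in range(500))
mean = sum(x * poisson_lindley_pmf(x, theta) for x in range(500))
# The mean should match the closed form (theta + 2) / (theta * (theta + 1)).
```

This checks only the base distribution; the negative binomial weighting of the paper is not reproduced here.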
ARTICLE | doi:10.20944/preprints202106.0543.v1
Subject: Medicine & Pharmacology, Allergology Keywords: radiomics; diffusion-weighted; MRI; cervical cancer
Online: 22 June 2021 (14:21:15 CEST)
Objectives: To explore the potential of radiomics alone and in combination with the diffusion-weighted imaging derived quantitative parameter, the apparent diffusion coefficient (ADC), using supervised classification algorithms in predicting outcomes and prognosis. Materials and Methods: Imaging was retrospectively evaluated for a study cohort of uterine cervical cancer patients who were candidates for radical treatment with chemoradiation. ADC values were calculated from the darkest part of the tumor, both before (labeled preADC) and after treatment (labeled postADC) with chemoradiation. After extraction of 851 radiomics features and feature selection by taking the union of the features with Pearson correlation >0.35 for recurrence, >0.49 for lymph node involvement and >0.40 for metastasis, analysis was done to predict clinical outcomes. Results: The study enrolled 52 patients with variable FIGO stages, aged 28–79 years (median = 53 years), with a median follow-up of 26.5 months (range, 7–76 months). Disease recurrence occurred in 12 patients (23%); metastasis occurred in 15 patients (28%). A model generated from 24 radiomics features and preADC using a monotone multi-layer perceptron neural network to predict recurrence yielded an AUC of 0.80 and a kappa value of 0.55, showing that the addition of radiomics features to ADC values improves the statistical metrics by approximately 40% for AUC and approximately 223% for kappa. Similarly, the neural network model for prediction of metastasis returned an AUC of 0.84 and a kappa value of 0.65, outperforming by approximately 25% for AUC and 140% for kappa. GLSZM features (SALGLE and LGLZE) and GLDM features (SDLGLE and DE) showed significant correlation with the clinical outcomes of recurrence and metastasis. Conclusions: The study is an effort to bridge the unmet need for translational predictive biomarkers in the stratification of uterine cervical cancer patients based on prognosis.
BRIEF REPORT | doi:10.20944/preprints202004.0455.v1
Subject: Mathematics & Computer Science, Probability And Statistics Keywords: statistics; mean; weighted mean; average; mathematical thinking
Online: 25 April 2020 (02:50:55 CEST)
This study explores students’ understanding of one measure of central tendency, the mean. A teaching experiment was conducted to understand how sixth-grade students made sense of this concept. Findings suggest that the students can solve mathematical problems involving the mean through procedural understanding but lack conceptual understanding.
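For readers outside mathematics education, the procedural computations in question can be made concrete with a toy example (the numbers below are hypothetical, not from the study):

```python
def mean(values):
    """Arithmetic mean: total divided by count."""
    return sum(values) / len(values)

def weighted_mean(values, weights):
    """Each value contributes in proportion to its weight."""
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

# Hypothetical class averages 70, 80, 90 with class sizes 10, 20, 30:
# the unweighted mean is 80, while the weighted mean is pulled toward
# the larger classes.
simple = mean([70, 80, 90])
weighted = weighted_mean([70, 80, 90], [10, 20, 30])
```

Conceptual understanding is knowing when the second formula, not the first, answers the question.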
ARTICLE | doi:10.20944/preprints202301.0385.v1
Subject: Mathematics & Computer Science, Geometry & Topology Keywords: Complex Network; Weighted Network; Node Importance; Dynamic Programming
Online: 23 January 2023 (01:45:54 CET)
A heterogeneous structure implies that a few nodes may be crucial to maintaining a network's structural and functional properties. Identifying these crucial nodes correctly and quickly is a primary issue, given the mushrooming of large-scale datasets. Moreover, the 'weight issue', namely that edge weights may play a positive or negative role in contributing to node importance in different weighted networks, is often ignored in this field. This paper provides a novel algorithm, the Weighted Expectation Algorithm (WEA), which aims to ensure both accuracy and speed of computation by taking advantage of dynamic programming to better handle large-scale networks. Additionally, the weight issue is addressed by a simple quantitative definition. Two standard experiments show that WEA maintains network connectivity well (with the lowest average robustness, 0.192) and identifies node importance better in the spreading function test of spreading dynamics (with the highest average Kendall's tau-b, 0.678). In addition, the time complexities of different algorithms are evaluated and their running times tested, showing that WEA consumes a relatively short time.
ARTICLE | doi:10.20944/preprints202007.0682.v1
Subject: Social Sciences, Economics Keywords: Business fluctuations; financial stability; output gap; weighted maturity
Online: 28 July 2020 (12:34:23 CEST)
Many countries across the globe have been facing the problem of bank insolvency, and asset deterioration is one of its main causes. The objective of the paper is to ascertain the determinants of nonperforming loans (NPLs) in the banking sector of Pakistan over the period 2006-16. Beyond the bank-specific and macro variables proposed by the literature, the roles of weighted maturity and the output gap are examined for the first time. We find a significant impact of the output gap on NPLs, whereas weighted maturity plays an insignificant role in shaping future NPLs. Bank-specific drivers of NPLs include bank size and the capital adequacy ratio.
ARTICLE | doi:10.20944/preprints201803.0124.v1
Subject: Engineering, Control & Systems Engineering Keywords: weighted centroid; signal intensity; attenuation model; combined model
Online: 16 March 2018 (04:23:19 CET)
Aiming at the defects of low precision and cumulative time error, an inertial positioning method aided by an external wireless-signal weighted centroid localization algorithm is designed in this paper. According to the signal strength received from each anchor node at the test point, the distance between each anchor node and the test point is obtained using the attenuation model of the wireless signal; three anchor nodes are used to measure these distances. The area to be measured is obtained according to the actual situation, the position of the measured point is obtained by the weighted centroid localization algorithm, and a combined model of a wireless-signal-aided inertial navigation system is established. The simulation results show that the method can greatly improve positioning accuracy and restrain the divergence of the longitude and latitude errors.
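The pipeline described above (attenuation model to estimate distances, then inverse-distance weighting of anchor positions) can be sketched as follows; the model constants a and n are placeholder values, not taken from the paper:

```python
def rssi_to_distance(rssi, a=-40.0, n=2.0):
    """Log-distance attenuation model rssi = a - 10*n*log10(d), solved for d.
    a (RSSI at 1 m) and n (path-loss exponent) are illustrative assumptions."""
    return 10.0 ** ((a - rssi) / (10.0 * n))

def weighted_centroid(anchors, rssis):
    """Weight each anchor position by the inverse of its estimated distance."""
    weights = [1.0 / rssi_to_distance(r) for r in rssis]
    s = sum(weights)
    x = sum(w * ax for w, (ax, ay) in zip(weights, anchors)) / s
    y = sum(w * ay for w, (ax, ay) in zip(weights, anchors)) / s
    return x, y

# Equal signal strengths from three anchors put the estimate at their centroid.
estimate = weighted_centroid([(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)],
                             [-60.0, -60.0, -60.0])
```

Inverse-distance weights are one common choice; the paper's exact weighting is not specified in the abstract.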
ARTICLE | doi:10.20944/preprints202109.0231.v1
Subject: Engineering, General Engineering Keywords: Sound insulation; partition walls; perforated studs; acoustic model; sound reduction index; finite element analysis; A-weighted pink noise; A-weighted urban noise
Online: 14 September 2021 (10:07:36 CEST)
Steel studs are an inevitable part of drywall construction, as they are lightweight and offer the required structural stability. However, the studs act as sound bridges between the plasterboards, reducing the overall sound insulation of the wall. Overcoming this often calls for wider cavity walls and complex stud decoupling fixtures that increase the installation cost while reducing the floor area. As an alternative approach, this research reveals the potential of perforated studs to improve the acoustic insulation of drywall partitions. The acoustic and structural performance is characterized using a validated finite element model that acted as a prediction tool, reducing the number of physical tests required. The results established that an acoustic numerical model featuring fluid-structure interaction can predict the weighted sound reduction index of a stud wall assembly to an accuracy of ±1 dB. The model was used to analyze six perforated stud designs, which were found to outperform non-perforated drywall partitions in sound insulation by reducing sound bridging. Overall, the best-performing perforated stud design offered improvements in acoustic insulation of up to 4 dB while remaining structurally compliant.
ARTICLE | doi:10.20944/preprints202201.0028.v1
Subject: Mathematics & Computer Science, Artificial Intelligence & Robotics Keywords: Ordered Weighted Averaging (OWA); decision making under uncertainty; deep learning
Online: 5 January 2022 (10:29:33 CET)
Among the many research areas to which Ron Yager contributed are decision making under uncertainty (in particular, under interval and fuzzy uncertainty) and aggregation, where he proposed, analyzed, and utilized Ordered Weighted Averaging (OWA). The OWA operator itself provides only a specific type of data aggregation. However, it turns out that if we allow several OWA stages one after another, we get a scheme with a universal approximation property; moreover, a scheme which is perfectly equivalent to deep neural networks. In this sense, Ron Yager can be viewed as a (grand)father of deep learning. We also show that the existing schemes for decision making under uncertainty are naturally interpretable in OWA terms.
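For reference, the OWA operator mentioned above is straightforward to state in code: sort the inputs, then take a weighted sum, so the weight vector interpolates between the maximum, the minimum, and the arithmetic mean:

```python
def owa(values, weights):
    """Ordered Weighted Averaging: sort the inputs in descending order,
    then take the dot product with the (normalized) weight vector."""
    ordered = sorted(values, reverse=True)
    return sum(w * v for w, v in zip(weights, ordered))

# Special cases: weights (1,0,0) give the maximum, (0,0,1) the minimum,
# and uniform weights the arithmetic mean.
```

Stacking several such stages, with learned weight vectors, is what the paper relates to deep networks.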
ARTICLE | doi:10.20944/preprints202108.0455.v1
Subject: Engineering, Electrical & Electronic Engineering Keywords: matrix-weighted graphs; multi-agent systems; clustered consensus; global consensus
Online: 23 August 2021 (14:48:25 CEST)
This paper extends the concept of weighted graphs to matrix-weighted graphs. Consensus algorithms dictate that all agents reach consensus when a weighted graph is connected; however, this is not always the case for matrix-weighted graphs. The conditions leading to different types of consensus have been extensively analysed based on the properties of matrix-weighted Laplacians and graph-theoretic methods. In practice, however, there is concern about how to pick matrix weights to achieve a desired consensus, and about how changes to the elements of the matrix weights affect the consensus algorithm. By selecting the elements of the matrix weights, different clusters may be made possible. In this paper, we map the roles of the elements of the matrix weights in the system's consensus algorithm. We explore the choice of matrix weights to achieve different types of consensus and clustering. Our results are demonstrated on a network of three agents, where each agent has three states.
Subject: Mathematics & Computer Science, Algebra & Number Theory Keywords: Central tendency; Weighted geometric mean; means; variance; non-parametric statistics
Online: 6 April 2021 (14:02:04 CEST)
Various means (the arithmetic mean, the geometric mean, the harmonic mean, the power means) are often used as central tendency statistics. A new statistic of this type, the gamma-weighted geometric mean, is offered for a sample from a distribution on the positive semi-axis. This statistic is a certain weighted geometric mean with adaptive weights.
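The abstract does not specify the adaptive gamma-based weights, but the generic weighted geometric mean underlying the proposed statistic is simple to compute:

```python
import math

def weighted_geometric_mean(values, weights):
    """exp( sum(w_i * ln(x_i)) / sum(w_i) ), defined for x_i > 0."""
    s = sum(weights)
    return math.exp(sum(w * math.log(x) for x, w in zip(values, weights)) / s)

# With equal weights this reduces to the ordinary geometric mean;
# unequal weights pull the result toward the heavier observations.
```

The paper's contribution is the adaptive, data-driven choice of the weights, which is not reproduced here.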
ARTICLE | doi:10.20944/preprints201806.0080.v1
Subject: Social Sciences, Economics Keywords: data envelopment analysis; biennial Luenberger index; geographically weighted regression; EEP
Online: 6 June 2018 (09:59:47 CEST)
This paper proposes a new non-radial biennial Luenberger energy and environmental performance index (EEPI) to measure EEP change in various Chinese cities. The sources of EEP change, in terms of technical efficiency change and technological change, are examined through the Luenberger EEPI. The contributions of specific undesirable outputs and energy inputs to the EEP change are identified by means of the non-radial efficiency measure. The proposed approach is applied to evaluate the EEP of the industrial sector in 283 Chinese cities over 2010-2014. Factors influencing the emission abatement potential are investigated by employing a geographically weighted regression (GWR) model. We find that 1) changes in EEP can be attributed to technological progress, but technological progress slows down across the study period; 2) soot emission performance shows a downtrend among the four specific sub-performances, consistent with the fact that severe haze occurred frequently in China; 3) the best performers begin to move from coastal to inland cities with lower resource consumption and higher ecological equality; 4) the cities with the strongest positive effect of pollution intensity on emission abatement potential are located in the areas around the Bohai Gulf, where air pollution is particularly severe.
ARTICLE | doi:10.20944/preprints202108.0268.v1
Subject: Keywords: Fuzzy collaborative intelligence; Dynamic random access memory; Fuzzy weighted intersection; Forecasting
Online: 11 August 2021 (18:08:46 CEST)
In a collaborative forecasting task, experts may have unequal authority levels. However, this has rarely been adequately considered in existing fuzzy collaborative forecasting methods. In addition, experts may be unwilling to discriminate among their authority levels. To address these issues, an auto-weighting fuzzy weighted intersection (FWI) fuzzy collaborative intelligence approach is proposed in this study. In the proposed approach, experts' authority levels are automatically and reasonably assigned based on their past forecasting performance. Subsequently, the auto-weighting FWI mechanism is established to aggregate the experts' fuzzy forecasts. The theoretical properties of the auto-weighting FWI mechanism are discussed and compared with those of existing fuzzy aggregation operators. Applying the approach to a case from the literature, forecasting the yield of a DRAM product, clearly illustrated its advantages over several existing methods.
ARTICLE | doi:10.20944/preprints202108.0206.v1
Subject: Social Sciences, Geography Keywords: Remote sensing; GIS; AHP; Groundwater potential zone; Weighted overlay analysis; Kilinochchi
Online: 9 August 2021 (16:56:29 CEST)
The scarcity of surface water resources during the dry season in the Kilinochchi district increases the demand for freshwater; the existing groundwater resources should therefore be managed to overcome the situation. Several authors worldwide have published studies on the delineation of potential groundwater zones, but only a few have addressed the Kilinochchi district. This study aims to delineate potential groundwater zones in Kilinochchi, Sri Lanka using integrated Remote Sensing, Geographic Information Systems, and Analytic Hierarchy Process techniques. Groundwater potential zones are demarcated for the Kilinochchi district by overlaying thematic layers: geology, geomorphology, land use/land cover, soil types, drainage density, slope, lineament, and rainfall. Saaty's scale was applied to assign weights to the chosen thematic layers and their features. The thematic layers were integrated in a Geographic Information System, and a weighted overlay analysis was carried out to delineate groundwater zones. The resultant map was categorized into five potential zones: very low, low, moderate, high, and very high. The very high groundwater potential zone is mainly found in the north-eastern part of the study area, covering 111.26 km2. The upper north-western, middle, and eastern parts of the study area fall within the high groundwater potential zone, covering about 507.74 km2. The moderate groundwater potential zones (309.89 km2) occur mainly in the western part, and the extreme west of the study area falls under the low (207.78 km2) and very low (59.12 km2) zones. The groundwater potential map was validated against seventy-nine existing wells, which indicated a good prediction accuracy of 81.8%. This research will help policymakers better manage the Kilinochchi district's groundwater resources and gives scope for further research into groundwater exploration in the area.
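The AHP-plus-weighted-overlay workflow can be sketched in miniature; the geometric-mean row approximation is used here as a common stand-in for Saaty's principal-eigenvector weights, and the toy raster layers are illustrative only:

```python
import math

def ahp_weights(pairwise):
    """Approximate the AHP priority vector by the geometric mean of each
    row of the pairwise comparison matrix, normalized to sum to 1."""
    gm = [math.prod(row) ** (1.0 / len(row)) for row in pairwise]
    s = sum(gm)
    return [g / s for g in gm]

def weighted_overlay(layers, weights):
    """Cell-wise weighted sum of equally sized raster layers."""
    rows, cols = len(layers[0]), len(layers[0][0])
    return [[sum(w * layer[r][c] for w, layer in zip(weights, layers))
             for c in range(cols)] for r in range(rows)]

# One criterion judged 3x as important as the other on Saaty's scale:
w = ahp_weights([[1.0, 3.0], [1.0 / 3.0, 1.0]])
overlay = weighted_overlay([[[1.0, 2.0]], [[3.0, 4.0]]], [0.5, 0.5])
```

The resulting per-cell scores would then be binned into the five potential classes described above.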
ARTICLE | doi:10.20944/preprints202008.0218.v1
Subject: Medicine & Pharmacology, Oncology & Oncogenics Keywords: hepatocellular carcinoma; diffusion-weighted imaging; magnetic resonance; hepatic arterial infusion chemotherapy
Online: 9 August 2020 (15:34:00 CEST)
This study aimed to identify the utility of diffusion-weighted magnetic resonance (MR) imaging with an apparent diffusion coefficient (ADC) map as a predictor of the intrahepatic response of hepatocellular carcinoma (HCC) to cisplatin-based hepatic arterial infusion chemotherapy (HAIC). We retrospectively evaluated 113 consecutive patients with HCC who underwent gadoxetic acid-enhanced and diffusion-weighted MR imaging. The appropriate cutoff for the tumor-to-liver ADC ratio was determined to be 0.741. Of the 113 patients, 51 (45%) presented with a tumor-to-liver ADC ratio < 0.741. Evaluation of the intrahepatic treatment response after 2-3 cycles of HAIC in these 51 patients revealed that 20 patients (39%) experienced an objective response to HAIC. On the other hand, only 10 of the 62 patients with a tumor-to-liver ADC ratio ≥ 0.741 (16%) experienced an objective response. Thus, the objective response rate was significantly higher in patients with a tumor-to-liver ADC ratio < 0.741 than in those with a tumor-to-liver ADC ratio ≥ 0.741 (P = 0.006). Multivariate logistic regression analysis using parameters including perfusion alteration, percentage of a non-enhancing portion, and tumor-to-liver ADC ratio revealed that a tumor-to-liver ADC ratio < 0.741 (odds ratio 3.03; P = 0.015) is the sole predictor of an objective response to HAIC. Overall survival rates were significantly higher in patients with objective responses to HAIC than in those without objective responses (P = 0.001 by log-rank test). In conclusion, patients with unresectable HCC with a tumor-to-liver ADC ratio < 0.741 showed a favorable intrahepatic response to HAIC. Therefore, diffusion-weighted MR imaging can play a critical role as a predictor of response to cisplatin-based HAIC in unresectable HCC.
ARTICLE | doi:10.20944/preprints202005.0354.v1
Subject: Mathematics & Computer Science, Other Keywords: ensemble learning; machine learning; Python; spatial distance; statistical distance; weighted ensemble
Online: 23 May 2020 (04:54:39 CEST)
In this paper, we introduce deboost, a Python library devoted to weighted distance ensembling of predictions for regression and classification tasks. Its backbone is the scikit-learn library, which supplies the default models and data preprocessing functions. It offers flexible choices of models for the ensemble as long as they implement the predict method, like the models available from scikit-learn. deboost is released under the MIT open-source license and can be downloaded from the Python Package Index (PyPI) at https://pypi.org/project/deboost. The source scripts are also available in a GitHub repository at https://github.com/weihao94/DEBoost.
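The library's actual API is documented at the PyPI and GitHub links above; as a hedged illustration of the general idea only (weights inversely proportional to validation error, which is one common weighting scheme and not necessarily deboost's):

```python
class Constant:
    """Stand-in model exposing the predict method the ensemble expects."""
    def __init__(self, c):
        self.c = c
    def predict(self, X):
        return [self.c] * len(X)

class WeightedEnsemble:
    """Minimal weighted-ensemble sketch: each model's weight is inversely
    proportional to its mean squared error on a validation set."""
    def __init__(self, models):
        self.models = models
        self.weights = None

    def fit_weights(self, X_val, y_val):
        errors = []
        for m in self.models:
            preds = m.predict(X_val)
            errors.append(sum((p - y) ** 2 for p, y in zip(preds, y_val)) / len(y_val))
        inv = [1.0 / (e + 1e-12) for e in errors]  # guard against zero error
        s = sum(inv)
        self.weights = [v / s for v in inv]

    def predict(self, X):
        all_preds = [m.predict(X) for m in self.models]
        return [sum(w * p[i] for w, p in zip(self.weights, all_preds))
                for i in range(len(X))]

ens = WeightedEnsemble([Constant(0.0), Constant(10.0)])
ens.fit_weights([[0]] * 5, [10.0] * 5)  # second model is exactly right
```

Any scikit-learn estimator could replace the Constant stand-ins, since only predict is required.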
ARTICLE | doi:10.20944/preprints201902.0007.v1
Subject: Biology, Agricultural Sciences & Agronomy Keywords: Zea mays L.; nitrogen; chessboard design; geographically weighted regression; yield response functions
Online: 1 February 2019 (09:22:35 CET)
There is a large body of research on determining the impact of soil field variability on crop yields. In contrast, site-specific information about crop responses to agronomic treatments is less frequent. On-Farm Precision Experimentation (OFPE) brings important information for understanding the spatial variation of crop response to agronomic practices and thus for improving agronomic decisions. The objective of this work was to investigate the spatial variability of corn yield responses to nitrogen and seed rates using OFPE in four fields in the US Midwest. Geographically weighted regression was applied to generate local regression coefficients, which were used to delineate response zones in each field. The results showed great potential to adjust the rates of these inputs according to the response of each zone identified by the proposed method. The results of this study can be applied to reevaluate expectations for variable rate prescriptions guided largely by soil variability.
ARTICLE | doi:10.20944/preprints202210.0129.v1
Subject: Mathematics & Computer Science, Information Technology & Data Management Keywords: PageRank; Time-Weighted PageRank; collective subjects; citation intensity; scientific research; research productivity; scientometrics
Online: 10 October 2022 (14:04:52 CEST)
This study aims to estimate the scientific productivity of collective subjects. The objective is to build a method for evaluating scientific productivity that allows calculating productivity even for new collective subjects with a small citation network; the paper proposes the Time-Weighted PageRank method with citation intensity (TWPR-CI). The Citation Network Dataset (ver. 13), which includes more than 5 million scientific publications and 48 million citations, was analyzed to verify the method. Four classes of collective subjects were allocated (more than 27,000 collective subjects in total). For each class, scientific productivity estimates from 2000 to 2021 were calculated using the PageRank, Time-Weighted PageRank, and TWPR-CI methods. It is shown that the advantage of the TWPR-CI method is the higher sensitivity of its scientific productivity estimates for new collective subjects, on average during the first ten years of observation, while its assessment of scientific productivity for other collective subjects remains stable. The small citation network of a new collective subject nevertheless does not allow an adequate assessment of its scientific productivity during its first years of operation. The TWPR-CI method can therefore be used to assess the scientific productivity of collective subjects, in particular the productivity of new ones.
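The exact TWPR-CI formulation is not reproduced in the abstract; the sketch below shows plain weighted PageRank, where a time-weighted variant would set each citation edge's weight to decay with the citation's age (e.g. w = exp(-decay * age_in_years)):

```python
def weighted_pagerank(edges, n, d=0.85, iters=100):
    """PageRank where each out-link's share of a node's rank is
    proportional to its edge weight. edges: dict (u, v) -> weight,
    nodes numbered 0..n-1. Dangling rank is spread uniformly."""
    out_weight = [0.0] * n
    for (u, v), w in edges.items():
        out_weight[u] += w
    pr = [1.0 / n] * n
    for _ in range(iters):
        nxt = [(1.0 - d) / n] * n
        dangling = sum(pr[u] for u in range(n) if out_weight[u] == 0.0)
        for (u, v), w in edges.items():
            nxt[v] += d * pr[u] * w / out_weight[u]
        pr = [x + d * dangling / n for x in nxt]
    return pr

# A symmetric two-node cycle gives equal ranks.
pr = weighted_pagerank({(0, 1): 1.0, (1, 0): 1.0}, 2)
```

The citation-intensity adjustment of TWPR-CI is an additional layer on top of this and is not modeled here.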
ARTICLE | doi:10.20944/preprints202101.0322.v1
Subject: Materials Science, Biomaterials Keywords: Nanomaterial Selection; Pertinent attributes; MADM; TOPSIS; Normalization; Weighted normalization; Coefficient of similarity; Ranking
Online: 18 January 2021 (11:51:50 CET)
The paper presents an attribute-based characterization method for nanomaterials, enabling computer storage and retrieval as a knowledgebase. The knowledgebase permits in-depth understanding of, and comparison between, the nanomaterials available to scientists and product developers to satisfy their research and development (R&D) needs. The technique for order preference by similarity to ideal solution (TOPSIS) is proposed to evaluate nanomaterials in the presence of multiple attributes. The method normalizes attributes to nullify the effect of different units, mapping their values into the range 0 to 1. The relative importance of different attributes for different applications is considered; the weight vector is derived using an eigenvalue formulation. Positive and negative benchmark solutions are derived, and the Euclidean distances of the alternatives from these best and worst solutions lead to a proximity/goodness/suitability index for ranking. The final decision is taken by decision makers through SWOT analysis and the short- and long-term strategies of the organisation. The methodology is illustrated with an example and a step-by-step procedure. The results, discussion, and conclusion highlight its importance and practical application.
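The TOPSIS steps listed above (normalization, weighting, benchmark solutions, Euclidean distances, suitability index) translate directly into code; this is the textbook method, not the paper's specific nanomaterial data:

```python
import math

def topsis(matrix, weights, benefit):
    """matrix[i][j]: value of alternative i on attribute j; benefit[j] is
    True when larger is better. Returns the closeness index per alternative."""
    m, n = len(matrix), len(matrix[0])
    # Vector-normalize each column, then apply the attribute weights.
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(m))) for j in range(n)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)] for i in range(m)]
    # Positive (best) and negative (worst) benchmark solutions per attribute.
    best = [max(v[i][j] for i in range(m)) if benefit[j]
            else min(v[i][j] for i in range(m)) for j in range(n)]
    worst = [min(v[i][j] for i in range(m)) if benefit[j]
             else max(v[i][j] for i in range(m)) for j in range(n)]
    # Euclidean distances to both benchmarks, then the closeness index.
    d_best = [math.sqrt(sum((v[i][j] - best[j]) ** 2 for j in range(n))) for i in range(m)]
    d_worst = [math.sqrt(sum((v[i][j] - worst[j]) ** 2 for j in range(n))) for i in range(m)]
    return [dw / (db + dw) for db, dw in zip(d_best, d_worst)]

# The dominating alternative scores 1.0, the dominated one 0.0.
scores = topsis([[1.0, 1.0], [2.0, 2.0]], [0.5, 0.5], [True, True])
```

Deriving the weight vector itself (the eigenvalue formulation) is a separate step; fixed weights are assumed here.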
REVIEW | doi:10.20944/preprints202001.0388.v1
Subject: Mathematics & Computer Science, Information Technology & Data Management Keywords: aggregation operators; composite aggregation operators; weighted aggregation operators; transformation; duality; group decision making
Online: 31 January 2020 (13:57:17 CET)
Aggregating data is the main line of any discipline dealing with the fusion of information, from knowledge-based systems to decision-making. The purpose of aggregation methods is to convert a list of objects, all belonging to a given set, into a single representative object of the same set, usually by an n-ary function, the so-called aggregation operator. Since the aggregation functions useful for modeling real-life problems are limited, the basic problem is to construct a proper aggregation operator for each situation. During the last decades, a number of construction methods for aggregation functions have been developed to build new classes based on well-known operators. This paper reviews some of these construction methods, which are based on transformation, composition, and weighted rules.
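One classical construction the review's transformation family includes, duality, is compact enough to state directly: on [0, 1], the dual of an aggregation operator A is A^d(x1,...,xn) = 1 - A(1-x1,...,1-xn). A small sketch:

```python
def dual(agg):
    """Dual construction on [0, 1]: A^d(x1,...,xn) = 1 - A(1-x1,...,1-xn)."""
    return lambda *xs: 1.0 - agg(*(1.0 - x for x in xs))

def arithmetic_mean(*xs):
    return sum(xs) / len(xs)

# The dual of min is max, and the arithmetic mean is self-dual.
d_min = dual(min)
```

Composition and weighting are applied analogously, producing new operators from existing ones.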
ARTICLE | doi:10.20944/preprints202012.0799.v1
Subject: Social Sciences, Accounting Keywords: Airport ground handling services; equipment purchase decision; AHP weighted; membership function; fuzzy linear programming
Online: 31 December 2020 (12:29:35 CET)
The airport ground handling services (AGHS) equipment supplier selection problem involves a safety guarantee on the part of the AGHS company that carries out the daily work. An AGHS company can prevent aircraft damage and delays in airline schedules, and ensure reliable, high-quality ground handling service. In our research, we developed an AGHS equipment supplier selection model based on the analytic hierarchy process (AHP) and an AHP-weighted fuzzy linear programming approach, and we solved the AGHS equipment supplier selection problem. The main objective of this article is to create AHP and AHP-FLP decision models to help AGHS company authorities select the best AGHS equipment supplier. The practical application to AGHS equipment supplier selection decisions demonstrates that the proposed model provides knowledge and practical value for the AGHS industry.
ARTICLE | doi:10.20944/preprints201906.0004.v1
Subject: Engineering, Other Keywords: weighted dissimilarity measure; feature-based indoor positioning; signals of opportunity; location-dependent standard deviation
Online: 3 June 2019 (08:37:55 CEST)
We propose an iterative scheme for feature-based positioning using a new weighted dissimilarity measure with the goal of reducing the impact of large errors among the measured or modeled features. The weights are computed from the location-dependent standard deviations of the features and stored as part of the reference fingerprint map (RFM). Spatial filtering and kernel smoothing of the kinematically collected raw data allow efficiently estimating the standard deviations during RFM generation. In the positioning stage, the weights control the contribution of each feature to the dissimilarity measure, which in turn quantifies the difference between the set of online measured features and the fingerprints stored in the RFM. Features with little variability contribute more to the estimated position than features with high variability. Iterations are necessary because the variability depends on the location, and the location is initially unknown when estimating the position. Using real WiFi signal strength data from extended test measurements with ground truth in an office building, we show that the standard deviations of these features vary considerably within the region of interest and are neither simple functions of the signal strength nor of the distances from the corresponding access points. This is the motivation to include the empirical standard deviations in the RFM. We then analyze the deviations of the estimated positions with and without the location-dependent weighting. In the present example the maximum radial positioning error from ground truth is reduced by 40% compared to kNN without the weighted dissimilarity measure.
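The weighting idea can be illustrated with a minimal sketch; this simplifies the paper's iterative scheme to a single nearest-fingerprint lookup with variance-based weights:

```python
def weighted_dissimilarity(online, fingerprint, sigmas):
    """Squared feature differences, each scaled by the location-dependent
    standard deviation stored with the fingerprint in the RFM."""
    return sum(((o - f) / s) ** 2 for o, f, s in zip(online, fingerprint, sigmas))

def locate(online, rfm):
    """rfm: list of (position, features, sigmas); returns the position of
    the fingerprint with the smallest weighted dissimilarity."""
    return min(rfm, key=lambda e: weighted_dissimilarity(online, e[1], e[2]))[0]

# A high-variability feature (sigma = 10) barely counts; a stable one
# (sigma = 0.1) dominates, which can flip the unweighted-Euclidean choice.
rfm = [((0.0, 0.0), [3.0, 0.0], [10.0, 1.0]),
       ((5.0, 5.0), [0.0, 1.0], [1.0, 0.1])]
position = locate([0.0, 0.0], rfm)
```

In the paper the weighting and the position estimate are refined iteratively, since the sigmas themselves depend on the (initially unknown) location.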
ARTICLE | doi:10.20944/preprints201809.0362.v1
Subject: Chemistry, Analytical Chemistry Keywords: solid-phase microextraction; air sampling; air analysis; volatile organic compounds; COMSOL; time-weighted average
Online: 19 September 2018 (04:08:14 CEST)
Determination of time-weighted average (TWA) concentrations of volatile organic compounds (VOCs) in air using solid-phase microextraction (SPME) is advantageous over other sampling techniques but is often characterized by insufficient accuracy, particularly at longer sampling times. Experimental investigation of this issue and disclosure of the origin of the problem is often not practically feasible due to high uncertainties. This research is aimed at developing a model of the TWA extraction process and optimizing TWA air sampling by SPME using finite element analysis software (COMSOL Multiphysics). It was established that sampling by porous SPME coatings with high affinity to analytes is affected by slow diffusion of analytes inside the coating, an increase of analyte concentrations in the air near the fiber tip due to equilibration, and an eventually lower sampling rate. Increasing the fiber retraction depth (Z) resulted in better recoveries. Sampling of the studied VOCs using a 23-ga Car/PDMS assembly at the maximum possible Z (40 mm) was proven to provide more accurate results. An alternative sampling configuration based on a 78.5 x 0.75 mm i.d. SPME liner was proven to provide similar accuracy at improved detection limits; modifying it with a decreased internal diameter on the sampling side should provide even better recoveries. The developed model offers new insight into the optimization of air and gas sampling using SPME.
ARTICLE | doi:10.20944/preprints201806.0411.v1
Subject: Mathematics & Computer Science, Analysis Keywords: Ostrowski's inequality; Čebyšev inequality; Lupaş inequality; weighted integrals; probability density functions; cumulative probability function
Online: 26 June 2018 (11:35:58 CEST)
In this paper we establish some weighted integral inequalities of Ostrowski, Čebyšev and Lupaş type. Applications for continuous probability density functions supported on infinite intervals with two examples are also given.
ARTICLE | doi:10.20944/preprints201712.0057.v1
Subject: Earth Sciences, Other Keywords: dimension reduction; feature extraction; hyperspectral image; weighted feature space; low rank representation; spectral clustering
Online: 11 December 2017 (06:55:22 CET)
Containing hundreds of spectral bands (features), hyperspectral images (HSIs) have a high ability to discriminate land cover classes. Traditional HSI data processing methods assign the same importance to all bands in the original feature space (OFS), while different spectral bands play different roles in identifying samples of different classes. In order to explore the relative importance of each feature, we learn a weighting matrix and obtain the relative weighted feature space (RWFS) as an enriched feature space for HSI data analysis in this paper. To overcome the difficulty of limited labeled samples, a common situation in HSI data analysis, we extend our method to a semisupervised framework. To transfer available knowledge to unlabeled samples, we employ graph-based clustering, where low-rank representation (LRR) is used to define the similarity function for the graph. After constructing the RWFS, any dimension reduction method and classification algorithm can be employed in it. The experimental results on two well-known HSI data sets show that some dimension reduction algorithms perform better in the new weighted feature space.
ARTICLE | doi:10.20944/preprints202202.0322.v1
Subject: Mathematics & Computer Science, Probability And Statistics Keywords: cumulative entropy; cumulative residual entropy; extropy; gini mean difference; tsallis entropy; weighted cumulative residual entropy
Online: 25 February 2022 (04:44:39 CET)
In this work, we introduce a generalized measure of cumulative residual entropy and study its properties. We show that several existing measures of entropy, such as cumulative residual entropy, weighted cumulative residual entropy and cumulative residual Tsallis entropy, are all special cases of the generalized cumulative residual entropy. We also propose a measure of generalized cumulative entropy, which includes cumulative entropy, weighted cumulative entropy and cumulative Tsallis entropy as special cases. We discuss a generating function approach from which the different entropy measures can be derived. Finally, using the newly introduced entropy measures, we establish some relationships between entropy and extropy measures.
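The cumulative residual entropy that the paper generalizes is defined from the survival function, CRE(X) = -∫ S(x) log S(x) dx. A minimal empirical sketch follows; the function name and the estimator (a Riemann sum over the empirical survival function) are illustrative, not the paper's generalized measure:

```python
import numpy as np

def cumulative_residual_entropy(sample):
    """Empirical CRE(X) = -int S(x) log S(x) dx, with S the survival
    function, approximated from the empirical survival function."""
    x = np.sort(np.asarray(sample, dtype=float))
    n = len(x)
    # Empirical S on each interval [x_i, x_{i+1}); S = 1 before x_1
    # and 0 after x_n contribute nothing to the integral.
    s = 1.0 - np.arange(1, n) / n
    return float(np.sum(-s * np.log(s) * np.diff(x)))

# For an Exp(1) distribution the theoretical CRE equals 1.
rng = np.random.default_rng(0)
est = cumulative_residual_entropy(rng.exponential(1.0, 200_000))
```

With a large exponential sample the estimate lands close to the theoretical value of 1, which makes the Exp(1) case a convenient sanity check.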
ARTICLE | doi:10.20944/preprints201808.0001.v1
Subject: Social Sciences, Library & Information Science Keywords: normalized indicators; correlation analysis; Source Normalized Impact per Paper; SNIP; Field-Weighted Citation Impact; FWCI
Online: 1 August 2018 (05:57:55 CEST)
Recently, more and more countries are entering the global race for university competitiveness. On the one hand, global rankings are a convenient tool for quantitative analysis; on the other hand, their indicators are often difficult to calculate quickly and often contradict each other. We therefore considered using widely available indicators for a quick analysis of a university's publication strategy. We opted for the normalized citation indicators available in the SciVal analytical tool, i.e. Source Normalized Impact per Paper (SNIP) and Field-Weighted Citation Impact (FWCI). We demonstrate the possibility of applying correlation analysis to the impact indicators of a document and a journal on a sample of the social and humanitarian fields at Peoples' Friendship University of Russia. Particular attention is paid to applying the results in practice.
ARTICLE | doi:10.20944/preprints202206.0420.v1
Subject: Medicine & Pharmacology, Oncology & Oncogenics Keywords: anterior mediastinal lesions; diffusion weighted imaging; diffusion magnetic resonance imaging; gadolinium; thymoma; lymphoma; thymus neoplasms; biopsy
Online: 30 June 2022 (08:19:29 CEST)
Background. To describe the characteristics of anterior mediastinal masses on conventional magnetic resonance imaging (MRI) and to assess the role of the apparent diffusion coefficient (ADC) value in distinguishing benign from malignant mediastinal lesions. Methods. We conducted a retrospective cross-sectional study of 55 patients with an anterior mediastinal mass who underwent MRI before treatment; biopsy and histopathological assessment were performed afterwards. A radiologist evaluated the changes in signal intensity on the following sequences: T1-weighted VIBE DIXON pre- and post-contrast with gadolinium, T2 HASTE, T2 TIRM, and DWI/ADC, to determine the size and margin of each lesion and the presence of fat or cystic components within it. ADC values were calculated from ADC maps constructed from b = 0 and b = 2000. Results. The study comprised 55 patients, with 5 benign and 50 malignant lesions. The ADCmean, ADCmedian, ADC10, and ADC90 in the histogram-based approach and the hot-spot-ROI-based mean ADC were significantly lower for malignant lesions than for benign lesions (p < 0.05). The hot-spot-ROI-based mean ADC performed best in differentiating benign from malignant mediastinal lesions, and group A (benign lesions, thymoma A, AB, B1) from group B (thymoma B2, B3 and other malignant lesions). The cut-off ADC value differentiating malignant from benign mediastinal lesions was 1.17×10⁻³ mm²/s, with a sensitivity of 80% and a specificity of 80%. The cut-off ADC value differentiating group A from group B was 0.99×10⁻³ mm²/s, with a sensitivity of 78.4% and a specificity of 88.9%. The cut-off ADC value differentiating lymphoma from other malignant lesions was 0.91×10⁻³ mm²/s, with a sensitivity of 100% and a specificity of 60.5%. Conclusion.
Diffusion-weighted MRI and measurement of the ADC value, both in the histogram-based approach and as the hot-spot-ROI-based mean ADC, are very helpful in differentiating benign from malignant anterior mediastinal lesions.
ARTICLE | doi:10.20944/preprints202108.0318.v1
Subject: Social Sciences, Education Studies Keywords: Inter-rater reliability; preservice teacher performance assessment; PACT; edTPA; weighted kappa; cognitive task analysis; qualitative; quantitative
Online: 16 August 2021 (10:51:52 CEST)
The Performance Assessment for California Teachers (PACT) is a high-stakes summative assessment designed to measure pre-service teacher readiness. We examined the inter-rater reliability (IRR) of trained PACT evaluators who rated 19 candidates. As measured by Cohen’s weighted kappa, the overall IRR estimate was .17 (poor strength of agreement). IRR estimates ranged from -.29 (worse than expected by chance) to .54 (moderate strength of agreement); all were below the standard of .70 for consensus agreement. Follow-up interviews of 10 evaluators revealed possible reasons for the low IRR we observed, such as departures from the established PACT scoring protocol and a lack of, or inconsistent, use of a scoring aid document. Evaluators reported difficulties scoring the materials that candidates submitted, particularly regarding the use of Academic Language. Cognitive Task Analysis (CTA) is suggested as a method to improve IRR in the PACT and in other teacher performance assessments such as the edTPA.
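Cohen's weighted kappa used in the study above can be computed directly from two raters' ordinal scores. A minimal sketch, with linear or quadratic disagreement weights; the function and its interface are illustrative, not the study's actual analysis code:

```python
import numpy as np

def weighted_kappa(r1, r2, categories, weights="linear"):
    """Cohen's weighted kappa for two raters on an ordinal scale.
    weights: 'linear' (|i-j|) or 'quadratic' ((i-j)^2) disagreement."""
    k = len(categories)
    idx = {c: i for i, c in enumerate(categories)}
    # Observed joint distribution of the two raters' scores
    obs = np.zeros((k, k))
    for a, b in zip(r1, r2):
        obs[idx[a], idx[b]] += 1
    obs /= obs.sum()
    # Expected joint distribution under independence (product of marginals)
    exp = np.outer(obs.sum(axis=1), obs.sum(axis=0))
    i, j = np.indices((k, k))
    w = np.abs(i - j) if weights == "linear" else (i - j) ** 2
    return 1.0 - (w * obs).sum() / (w * exp).sum()
```

Perfect agreement yields kappa = 1, while agreement no better than chance yields 0; negative values, like the -.29 reported above, indicate agreement worse than chance.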
Subject: Mathematics & Computer Science, Artificial Intelligence & Robotics Keywords: soft constraints; Ordered Weighted Averaging Operators; Volunteered Geographic Information; standing water area mapping; decision attitude modeling
Online: 29 December 2019 (08:24:59 CET)
The paper proposes a human-explainable artificial intelligence approach for mapping the status of environmental phenomena from multisource geodata. It is both knowledge and data driven: it exploits remote sensing experts' knowledge to define the contributing factors from which partial evidence of the environmental status can be computed, and it aggregates the partial evidence to compute a map of the environmental status, adapting to a region of interest through a learning mechanism that exploits Volunteered Geographic Information (VGI) from both in situ observations and photointerpretation. The approach can capture the specificities of the local context as well as cope with the subjectivity and incompleteness of expert knowledge. The proposal is exemplified by mapping the status of standing water areas (i.e. water bodies and rivers, and human-driven or natural-hazard flooding) from satellite data and geotagged observations. Validation experiments were performed in three areas of Northern Italy characterized by distinct ecosystems. The proposed methodological framework showed better performance than traditional approaches based on thresholding single spectral indices. The use of expert knowledge, possibly imprecise, uncertain and incomplete, the need for only a few ground truth data for learning, and the explainability of the learned rules are the distinguishing characteristics of the proposal with respect to traditional machine learning methods.
ARTICLE | doi:10.20944/preprints201608.0146.v1
Subject: Mathematics & Computer Science, Applied Mathematics Keywords: exponentially weighted moving average control chart (EWMA); autoregressive integrated moving average (ARIMA); average run length (ARL)
Online: 15 August 2016 (10:43:14 CEST)
In this paper we propose explicit formulas for the Average Run Length (ARL) of an Exponentially Weighted Moving Average (EWMA) control chart for an Autoregressive Integrated Moving Average ARIMA(p,d,q)(P,D,Q)L process with exponential white noise. To check their accuracy, the ARL results were compared with numerical integral equations based on the Gauss-Legendre rule; there was excellent agreement between the explicit formulas and the numerical solutions. Additionally, we compared the computational time of our explicit formulas for the ARL with that of the Gauss-Legendre numerical scheme: the explicit formulas took approximately one second, much less than the numerical approximations. The explicit analytical formulas for evaluating ARL0 and ARL1 can produce a set of optimal parameters, depending on the smoothing parameter (λ) and the width of the control limit (H), for designing an EWMA chart with a minimum ARL1.
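The ARL quantity at the heart of the abstract is simply the expected number of observations until the EWMA statistic crosses its control limit. The paper's contribution is the exact analytical formula; the sketch below instead estimates the in-control ARL by brute-force simulation, purely to make the definition concrete (all parameter defaults are mine):

```python
import random

def ewma_arl(lam, h, mean=1.0, n_runs=300, max_steps=5000, seed=1):
    """Monte Carlo estimate of the in-control ARL of a one-sided EWMA chart
    Z_t = (1 - lam) * Z_{t-1} + lam * X_t with Exp(mean) observations,
    signalling when Z_t > h. Illustrative only -- the paper's explicit
    analytical formulas are exact and far faster."""
    rng = random.Random(seed)
    total = 0
    for _ in range(n_runs):
        z, t = mean, 0                 # start the statistic at the process mean
        while t < max_steps:
            t += 1
            z = (1 - lam) * z + lam * rng.expovariate(1.0 / mean)
            if z > h:                  # out-of-control signal
                break
        total += t
    return total / n_runs
```

Widening the control limit H lengthens the average run, which is the trade-off the optimal (λ, H) design in the abstract balances against ARL1.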
ARTICLE | doi:10.20944/preprints202111.0462.v1
Subject: Social Sciences, Econometrics & Statistics Keywords: cooperative game theory; power indices; weighted voting games; dynamic programming; precoalitions; Shapley value; Owen value; Banzhaf index.
Online: 25 November 2021 (08:15:34 CET)
We study the efficient computation of power indices for weighted voting games with precoalitions amongst subsets of players (reflecting, e.g., ideological proximity) using the paradigm of dynamic programming. Starting from the state-of-the-art algorithms for computing the Banzhaf and Shapley-Shubik indices for weighted voting games we present a framework for fast algorithms for the three most common power indices with precoalitions, i.e. the Owen index, the Banzhaf-Owen index and the Symmetric Coalitional Banzhaf index, and point out why our new algorithms are applicable for large numbers of players. We discuss implementations of our algorithms for the three power indices with precoalitions in C++ and review computing times as well as storage requirements.
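The dynamic-programming paradigm the framework above builds on can be seen in the classical algorithm for the plain Banzhaf index (the precoalition variants refine this idea). A minimal sketch, counting coalitions by total weight; function name and representation are illustrative:

```python
def banzhaf(weights, quota):
    """Raw Banzhaf (swing) counts for a weighted voting game via dynamic
    programming. For each player i, c[s] counts coalitions of the OTHER
    players with total weight s; player i is critical exactly for
    coalitions with quota - w_i <= s < quota."""
    counts = []
    for i, wi in enumerate(weights):
        c = [0] * quota
        c[0] = 1                       # the empty coalition
        for j, wj in enumerate(weights):
            if j == i:
                continue
            # Standard knapsack-style update, iterating weights downwards
            for s in range(quota - 1, wj - 1, -1):
                c[s] += c[s - wj]
        counts.append(sum(c[max(quota - wi, 0):quota]))
    return counts
```

For the game with weights (2, 1, 1) and quota 3, the large player is critical in 3 coalitions and each small player in 1, giving Banzhaf shares 3/5, 1/5, 1/5. Because the state is a weight total rather than a player subset, the cost grows with the quota instead of exponentially in the number of players, which is why such algorithms scale to many players.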
REVIEW | doi:10.20944/preprints201809.0236.v1
Subject: Life Sciences, Biophysics Keywords: molecular dynamics simulation; rare event; string method; multiscale enhanced sampling; weighted ensemble; multidrug transporter; Onsager-Machlup action
Online: 13 September 2018 (12:01:39 CEST)
To understand the functions of biomolecules such as proteins, not only their structures but also their conformational changes and kinetics must be characterized, yet the atomistic details are hard to obtain both experimentally and computationally. We review our recent computational studies using novel enhanced sampling techniques for conformational sampling of biomolecules and calculation of their kinetics. For efficiently characterizing the free energy landscape of a biomolecule, we introduce the multiscale enhanced sampling method, which uses a combined system of atomistic and coarse-grained models; based on the idea of Hamiltonian replica exchange, we can recover the statistical properties of the atomistic model without any bias. We next introduce the string method as a path search method that calculates minimum free energy pathways along a multidimensional curve in high dimensional space. Finally, we introduce novel methods to calculate the kinetics of biomolecules based on the ideas of path sampling: one is the Onsager-Machlup action method, and the other is the weighted ensemble method. Applications of the above methods to biomolecular systems are also discussed and illustrated.
ARTICLE | doi:10.20944/preprints202008.0048.v1
Subject: Medicine & Pharmacology, Oncology & Oncogenics Keywords: 8505C cell line; apoptosis; BCPAP cell line; CFLAR; DDX19B; IL6; oxidative phosphorylation; SPINT2; thyroid hormone synthesis; weighted pathway regulation
Online: 2 August 2020 (17:32:47 CEST)
Publicly available (own) transcriptomic data were re-analyzed to quantify the alteration of functional pathways in thyroid cancer, establish the gene hierarchy, identify potential gene targets and predict the effects of their manipulation. The expression data were generated from one case of papillary thyroid carcinoma (PTC) and from genetically manipulated BCPAP (papillary) and 8505C (anaplastic) human thyroid cancer cell lines. The study used the genomic fabric perspective, which treats the transcriptome as a multi-dimensional mathematical object based on three independent characteristics that can be derived for each gene from the expression data. We found remarkable remodeling of the thyroid hormone synthesis, cell cycle, oxidative phosphorylation and apoptosis pathways. Serine peptidase inhibitor, Kunitz type, 2 (SPINT2) was identified as the Gene Master Regulator of the investigated PTC. The substantial increase in the expression synergism of SPINT2 with apoptosis genes in the cancer nodule with respect to the surrounding normal tissue (NOR) suggests that its experimental overexpression may force PTC cells into apoptosis with negligible effect on NOR cells. The predictive value of expression coordination for expression regulation was validated with data from 8505C and BCPAP cells before and after lentiviral transfection with DDX19B.
ARTICLE | doi:10.20944/preprints202301.0484.v1
Subject: Mathematics & Computer Science, Analysis Keywords: Ekeland variational principle; Takahashi minimization principle; Caristi fixed point theorem; weighted graph; partially ordered metric space; completeness; the OSC property
Online: 26 January 2023 (17:12:51 CET)
We prove a version of the Ekeland Variational Principle (EkVP) in a weighted graph $G$ and its equivalence to the Caristi fixed point theorem and the Takahashi minimization principle. The usual completeness and topological notions are replaced with weaker versions expressed in terms of the graph $G$. The main tool used in the proofs is the OSC property for sequences in a graph. Converse results, concerning the completeness of graphs for which one of these principles holds, are also considered.
Subject: Mathematics & Computer Science, Numerical Analysis & Optimization Keywords: equipment vendor selection; fuzzy TOPSIS; fuzzy weighted average left and right score; multi-choice goal programming; multi-aspiration goal programming
Online: 30 May 2019 (08:42:27 CEST)
The airport ground handling service (AGHS) equipment vendor selection (AGHSEVS) problem is critical for ramp work safety management, because AGHS equipment malfunctions affect airport ramp safety. Appropriate vendor selection can prevent aircraft damage and delays in airline schedules, and ensure reliable, high-quality ground handling service. AGHSEVS is a time-consuming and complex process that requires professional knowledge and experience to make judgments; specifically, it is a multi-criteria decision-making (MCDM) problem. Previous research has seldom integrated MCDM methods with linear and goal programming to solve the AGHSEVS problem. The objective of this study was to develop a new system evaluation model for AGHSEVS that considers both qualitative and quantitative methods. We test the proposed approach on an AGHS company in Taiwan.
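The TOPSIS family of methods used above ranks alternatives by their closeness to an ideal solution. A sketch of the classical crisp version; the paper's fuzzy variant with left/right scores builds on the same closeness-coefficient idea, and this function is an illustration of that idea, not the paper's model:

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Classical crisp TOPSIS. matrix: alternatives x criteria;
    benefit[j] is True when larger values of criterion j are better.
    Returns the closeness coefficient of each alternative (higher = better)."""
    m = np.asarray(matrix, dtype=float)
    norm = m / np.linalg.norm(m, axis=0)            # vector normalization
    v = norm * np.asarray(weights, dtype=float)     # weighted normalized matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)       # distance to ideal
    d_neg = np.linalg.norm(v - anti, axis=1)        # distance to anti-ideal
    return d_neg / (d_pos + d_neg)
```

An alternative that dominates every other on all benefit criteria gets a closeness coefficient of 1, the worst gets 0, and the rest fall in between, giving the ranking a vendor-selection model can feed into goal programming.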
ARTICLE | doi:10.20944/preprints202007.0638.v1
Subject: Social Sciences, Econometrics & Statistics Keywords: municipal waste in Poland; geographically weighted regression with spatial error term; analysis of regionally divergent determinants; spatial processes; communes; sustainable development
Online: 26 July 2020 (15:40:19 CEST)
This article attempts to identify factors affecting the quantity of municipal waste in the 2478 Polish communes (LAU-2), taking into account how the influence of particular determinants varies with their regional diversification. The analysis covers the years 2005-2018. The dependent variable is the volume of municipal waste in kg per capita, and the determinants include: economic and human development, uncontrolled dumping sites, population density, working-age population, migration, tourism, urbanization, dwellings and housing, retail sales, business entities, education and investments in waste management. Geographically weighted regression with a spatial error term (GWR-SEM) was employed in this study. The model enabled not only the specification of the waste production determinants, but also analysis of the variability in the strength and direction of the dependencies between the examined variables in individual communes. The results showed that the higher the level of education, the less waste is generated (in north-central Poland); that business entities and the working-age population are crucial for waste quantity in the communes of eastern Poland; that the factors with the widest regional range of influence on waste quantity are urban and business development; and that the strongest are higher education and the share of working-age individuals.
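The core of geographically weighted regression is a separate weighted least-squares fit at each location, with weights decaying with distance. A minimal sketch of plain GWR with a Gaussian kernel; the paper's GWR-SEM additionally includes a spatial error term, which this illustration omits:

```python
import numpy as np

def gwr_coefficients(coords, X, y, bandwidth):
    """Local coefficient estimates of a basic geographically weighted
    regression: at each observation location, a weighted least-squares fit
    with Gaussian kernel weights w_i = exp(-(d_i / bandwidth)^2)."""
    n = len(y)
    Xd = np.column_stack([np.ones(n), X])        # add an intercept column
    betas = np.empty((n, Xd.shape[1]))
    for i in range(n):
        d = np.linalg.norm(coords - coords[i], axis=1)
        w = np.exp(-(d / bandwidth) ** 2)        # Gaussian spatial kernel
        WX = Xd * w[:, None]
        # Weighted least squares: (X' W X)^{-1} X' W y
        betas[i] = np.linalg.solve(Xd.T @ WX, WX.T @ y)
    return betas

rng = np.random.default_rng(0)
coords = rng.uniform(0, 10, (30, 2))
x = rng.normal(size=30)
betas = gwr_coefficients(coords, x, 2.0 + 3.0 * x, bandwidth=2.0)
```

When the true relationship is the same everywhere, as in the synthetic data above, every local fit recovers the same coefficients; spatially varying coefficients, as reported for the Polish communes, show up as location-to-location variation in the rows of `betas`.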
ARTICLE | doi:10.20944/preprints201910.0342.v1
Subject: Mathematics & Computer Science, Probability And Statistics Keywords: Tuberculosis (TB); Poisson Autoregressive (PAR); Poisson Exponentially Weighted Moving Average Model (PEWMA); Hepatitis; Human Immunodeficiency Virus (HIV); Acquired Immune Deficiency Syndrome (AIDS)
Online: 29 October 2019 (15:51:16 CET)
This study examined the trend of HIV/AIDS, tuberculosis (TB) and hepatitis in Plateau State. Annual data from 2003 to 2018 were collected from the department of biostatistics at Plateau State Specialist Hospital (PSSH), Jos. The methods of analysis were the Poisson Autoregressive model (PAR(1)) and the Poisson Exponentially Weighted Moving Average model (PEWMA). The results revealed significant annual decreases of 23.9% and 4% in tuberculosis and HIV/AIDS respectively, and a significant annual increase of 46% in hepatitis. The PEWMA model revealed that TB increased by 0.02% when HIV increased, though not significantly, while hepatitis significantly aggravates TB by at least 0.24%. There was also a significant rise of 0.85% in HIV when TB increased, but hepatitis had no such effect on HIV. Lastly, the PEWMA model indicated a rise of 0.5% in hepatitis cases when TB increased, but a surge in HIV had no such effect on hepatitis cases in Jos. The study recommends intensifying the fight against TB, since TB cases significantly affect both HIV and hepatitis in Jos, Nigeria.
ARTICLE | doi:10.20944/preprints201711.0124.v1
Subject: Mathematics & Computer Science, Applied Mathematics Keywords: neutrosophic number; neutrosophic number harmonic mean operator (NNHMO); neutrosophic number weighted harmonic mean operator (NNWHMO); cosine function, score function; multi criteria group decision making
Online: 20 November 2017 (09:53:31 CET)
The concept of a neutrosophic number is a significant mathematical tool for dealing with real scientific problems, because it can handle the indeterminate and incomplete information that generally exists in real problems. In this article, we use neutrosophic numbers (a + bI), where a and bI denote the determinate and indeterminate components respectively. We explore situations in which the input information needs to be expressed in terms of neutrosophic numbers. We define score functions and accuracy functions for ranking neutrosophic numbers, and a cosine function to determine unknown criteria weights. We define neutrosophic number harmonic mean operators and prove their basic properties. We then develop two novel MCGDM strategies using the proposed aggregation operators, and solve a numerical example to demonstrate their feasibility and effectiveness. A sensitivity analysis varying “I” is performed to demonstrate how the preference ranking order of alternatives responds to changes in “I”. The efficiency of the developed strategies is ascertained by comparing their results with those of existing strategies in the literature.
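The crisp skeleton behind the NNWHMO aggregation operator is the classical weighted harmonic mean; the paper extends it to neutrosophic numbers a + bI. A minimal sketch of the crisp case only, with illustrative naming:

```python
def weighted_harmonic_mean(values, weights):
    """Classical weighted harmonic mean WHM(x; w) = 1 / sum_i(w_i / x_i),
    for positive values and weights summing to 1. This is the crisp
    special case underlying the neutrosophic operator (set I = 0)."""
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    assert all(v > 0 for v in values), "values must be positive"
    return 1.0 / sum(w / v for w, v in zip(weights, values))
```

The harmonic mean emphasizes small inputs, which is why harmonic aggregation is a natural choice when a poor score on any single criterion should pull an alternative's overall rating down sharply.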
ARTICLE | doi:10.20944/preprints201703.0119.v1
Subject: Mathematics & Computer Science, Analysis Keywords: Lévy–Khintchine representation; integral representation; Bernstein function; Stieltjes function; Toader–Qi mean; weighted geometric mean; Bessel function of the first kind; probabilistic interpretation; application in engineering; inequality
Online: 16 March 2017 (11:31:31 CET)
In the paper, by virtue of a Lévy–Khintchine representation and an alternative integral representation of the weighted geometric mean, the authors establish a Lévy–Khintchine representation and an alternative integral representation for the Toader–Qi mean. Moreover, the authors collect a probabilistic interpretation and engineering applications of the Toader–Qi mean.
ARTICLE | doi:10.20944/preprints202301.0086.v1
Subject: Medicine & Pharmacology, General Medical Research Keywords: diffusion-weighted whole body imaging with background suppression (DWIBS); magnetic resonance imaging (MRI); lung cancer; response evaluation criteria in solid tumors (RECIST); apparent diffusion coefficient (ADC); BD Score
Online: 5 January 2023 (01:43:25 CET)
Chemotherapy for lung cancer has made remarkable progress, and its selection has been subdivided by genetic analysis. Although response evaluation criteria in solid tumors (RECIST) is a simple and highly objective evaluation method, it has the disadvantage of not being able to accurately evaluate the therapeutic effect of molecularly targeted drugs and immune checkpoint inhibitors. The purpose of this study was to determine whether quantitative evaluation of DWIBS is useful in assessing the effectiveness of treatment for lung cancer. There were 31 patients with lung cancer, from whom 56 evaluation patterns were obtained. First, treatment effect (PD, SD, PR) was determined on CT according to RECIST. Second, tDV and ADC (median) were measured using the BD score from DWI images taken at the same time, and their rates of change were calculated. We then examined the correlation between RECIST and the rate of change in tDV, finding correlations in both PD and PR cases. Compared with RECIST on CT, DWIBS using the BD score would be more accurate for evaluating treatment response. tDV tended to decrease as ADC values increased, but some cases showed a dissociated response, with both ADC values and tDV increasing. Since the color display of ADC values allows their contents to be inferred, it would be useful for evaluating cases such as dissociated response and pseudoprogression, which are suspected atypical responses that are difficult to evaluate with RECIST and other methods.
ARTICLE | doi:10.20944/preprints201802.0165.v1
Subject: Medicine & Pharmacology, Clinical Neurology Keywords: Cerebral autosomal dominant arteriopathy with subcortical infarcts and leukoencephalopathy (CADASIL); Carotid Endarterectomy (CEA); Modified Rankin Scale (MRS); Computed Tomography (CT); Tissue plasminogen activator (tPA); Diffusion weighted Imaging (DWI); Recognition of Stroke in the Emergency Room (ROSIER) scale; Magnetic resonance Imaging (MRI); Internal Carotid Artery (ICA)
Online: 26 February 2018 (11:46:47 CET)
In the developed world, stroke is one of the leading disabling causes of death; it can be managed with thrombolysis if the patient presents early, despite the additional risk of intracerebral haemorrhage. Secondary prevention is an important objective in ischaemic stroke, where the rate of recurrence with subsequent stroke is very high. Carotid endarterectomy plays a definitive and effective role in secondary stroke prevention for both symptomatic and asymptomatic carotid stenosis in selected cases. Thrombolysis is a potential primary management for certain groups, whereas carotid surgery serves as a secondary preventative measure in a specified ischaemic stroke group.