ARTICLE | doi:10.20944/preprints201810.0347.v1
Subject: Engineering, Industrial And Manufacturing Engineering Keywords: lean manufacturing; PDCA; defects; Pareto chart; flowchart
Online: 16 October 2018 (09:09:54 CEST)
Defects are considered one of the wastes in manufacturing systems that negatively affect delivery times, cost, and product quality, leading manufacturing companies into critical situations with customers and into non-compliance with the IPC-A-610E standard for the acceptability of electronic components. This is the case of a manufacturing company located in Tijuana, Mexico. Due to increasing demand for the products manufactured by this company, several defects have been detected in the welding process of electronic boards, as well as in the components named Thru-Holes. For this reason, this paper presents a lean manufacturing application case study. The objective of this research is to reduce the defects generated during the welding process by at least 20%. In addition, it is intended to increase by 20% the capacity of three double production lines where electronic boards are processed. As the method, the PDCA cycle is applied, with Pareto charts and a flowchart used as support tools. As a result, defects decreased by 65%, 79%, and 77% in the three analyzed product models. In conclusion, the PDCA cycle, Pareto charts, and flowcharts are excellent quality tools that help decrease the number of defective components.
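As an illustration of the Pareto-chart analysis used in studies like this one, defect categories can be ranked by frequency and cumulative percentages computed to expose the "vital few" causes. The categories and counts below are hypothetical, not the company's data:

```python
# Sketch of the Pareto-chart computation used to rank defect causes.
# Defect categories and counts are hypothetical, for illustration only.
defects = {"cold joint": 120, "bridging": 75, "insufficient solder": 40,
           "tombstoning": 15, "missing component": 10}

total = sum(defects.values())
ranked = sorted(defects.items(), key=lambda kv: kv[1], reverse=True)

cumulative = 0.0
pareto = []  # (category, cumulative % of all defects), descending frequency
for name, count in ranked:
    cumulative += 100.0 * count / total
    pareto.append((name, round(cumulative, 1)))
```

The first categories whose cumulative share reaches roughly 80% are the ones a PDCA improvement cycle would target first.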
ARTICLE | doi:10.20944/preprints202310.0137.v1
Subject: Physical Sciences, Mathematical Physics Keywords: Public goods; Repeated game; Strong reciprocity; Quantum entanglement; Pareto
Online: 3 October 2023 (09:32:35 CEST)
We construct a repeated quantum game with public goods using quantum entanglement and strong reciprocity. Using the paradigm of quantum game analysis, we find that both quantum entanglement and strong reciprocity help intensify cooperation and achieve Pareto optimality. In addition, through an example from the greenhouse planting industry, we verify that quantum entanglement and strong reciprocity play a positive role in cooperation.
ARTICLE | doi:10.20944/preprints202001.0007.v1
Subject: Social Sciences, Language And Linguistics Keywords: syntax; Pareto-optimality; bottleneck method; phase transitions; statistical mechanics
Online: 2 January 2020 (04:21:01 CET)
What are relevant levels of description when investigating human language? How are these levels connected to each other? Does one description blend smoothly into the next, such that different models lie naturally along a hierarchy, each containing the previous? Or, instead, are there sharp transitions between one description and the next, such that gaining a little more accuracy requires changing our framework radically? Do different levels describe the same linguistic aspects with increasing (or decreasing) accuracy? Historically, answers to these questions were guided by intuition and resulted in subfields of study, from phonetics to syntax and semantics. The need for research at each level is acknowledged, but seldom are these different aspects brought together (with notable exceptions). Here we propose a methodology to inspect empirical corpora systematically and to extract from them, blindly, relevant phenomenological scales and the interactions between them. Our methodology is rigorously grounded in information theory, multi-objective optimization, and statistical physics. Salient levels of linguistic description are readily interpretable in terms of energies, entropies, phase transitions, or criticality. Our results suggest a critical point in the description of human language, indicating that several complementary models are simultaneously necessary (and unavoidable) to describe it.
Subject: Computer Science And Mathematics, Data Structures, Algorithms And Complexity Keywords: chemometric data; sparse autoencoder; Gaussian process regressor; Pareto optimization
Online: 9 May 2019 (11:31:46 CEST)
We propose a deep-learning-based chemometric data analysis technique. We train an L2-regularized sparse autoencoder end-to-end to reduce the size of the feature vector, addressing the classic curse-of-dimensionality problem in chemometric data analysis. We introduce a novel technique for automatically selecting the number of nodes in the hidden layer of an autoencoder through Pareto optimization. Moreover, linear regression, ϵ-SVR, and a Gaussian process regressor are applied to the reduced feature vector for regression. We evaluate our technique on orange juice and wine datasets, and the results are compared against state-of-the-art methods. Quantitative results, reported as Normalized Mean Square Error (NMSE), show considerable improvement over the state of the art.
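The node-selection step can be illustrated as standard non-dominated filtering over candidate hidden-layer sizes, trading model size against validation error. The (nodes, error) pairs below are hypothetical, and the paper's actual Pareto optimization may differ:

```python
# Sketch: Pareto-optimal selection of hidden-layer sizes (hypothetical data).
# Each candidate is (number_of_nodes, validation_error); both are minimized.
candidates = [(8, 0.30), (16, 0.22), (32, 0.15), (64, 0.14), (128, 0.14)]

def pareto_front(points):
    """Keep points not dominated by any other point (minimization)."""
    front = []
    for p in points:
        dominated = any(q[0] <= p[0] and q[1] <= p[1] and q != p
                        for q in points)
        if not dominated:
            front.append(p)
    return front

front = pareto_front(candidates)
```

Here (128, 0.14) is dominated by (64, 0.14), since the larger layer buys no error reduction, so it drops out of the front.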
ARTICLE | doi:10.20944/preprints201612.0088.v1
Subject: Computer Science And Mathematics, Data Structures, Algorithms And Complexity Keywords: multiple criteria analysis; algorithms performance; Pareto optimality; quality indicator
Online: 16 December 2016 (08:31:29 CET)
In multi-objective optimization problems, the optimization target is to obtain a set of non-dominated solutions. Comparing solution sets is crucial in evaluating the performance of different optimization algorithms. The use of performance indicators is common in comparing those sets and, subsequently, optimization algorithms. A good solution set must be close to the Pareto-optimal front, well-distributed, maximally extended and fully filled. Therefore, an effective performance indicator must encompass these features as a whole and must be Pareto dominance compliant. Unfortunately, some of the known indicators often fail to properly reflect the quality of a solution set or are costly to compute. This paper demonstrates that the Degree of Approximation (DOA) quality indicator is a weakly Pareto compliant unary indicator that gives a good estimation of the match between the approximated front and the Pareto-optimal front. Moreover, DOA computation is easy and fast.
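DOA's exact definition is given in the paper; as a simpler, related illustration of measuring how closely an approximated front matches a reference Pareto-optimal front, a generational-distance-style computation can be sketched (the 2-objective points are hypothetical, and this is not the DOA indicator itself):

```python
import math

# Sketch: a generational-distance-style closeness measure between an
# approximated front A and a reference Pareto-optimal front P.
# This is NOT the DOA indicator from the paper, only a simple
# illustration of front-to-front closeness; points are hypothetical.
def generational_distance(approx, reference):
    """Mean distance from each approximated point to its nearest
    reference point; 0 means the approximation lies on the reference."""
    dists = [min(math.dist(a, r) for r in reference) for a in approx]
    return sum(dists) / len(dists)

A = [(0.1, 0.9), (0.5, 0.5), (0.9, 0.1)]   # approximated front
P = [(0.0, 1.0), (0.5, 0.5), (1.0, 0.0)]   # reference Pareto front
```

A weakly Pareto compliant indicator such as DOA must additionally never rank a dominated set strictly better than the set dominating it, a property plain generational distance does not guarantee.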
ARTICLE | doi:10.20944/preprints202102.0100.v1
Subject: Business, Economics And Management, Accounting And Taxation Keywords: game theory; quantum games; Nash equilibrium; Pareto-efficiency; correlated equilibria
Online: 3 February 2021 (09:51:34 CET)
The aim of the paper is to investigate Nash equilibria and correlated equilibria of classical and quantum games in the context of their Pareto optimality. We study four games: the prisoner's dilemma, battle of the sexes and two versions of the game of chicken. The correlated equilibria usually improve Nash equilibria of games but require a trusted correlation device. We analyze the quantum extension of these games in the Eisert-Wilkens-Lewenstein formalism with the full SU(2) space of players’ strategy parameters. It has been shown that the Nash equilibria of these games in quantum mixed Pauli strategies are closer to Pareto optimal results than their classical counterparts. The relationship of mixed Pauli strategies equilibria and correlated equilibria is also analyzed.
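The classical baseline that the quantum analysis improves on can be sketched with a standard prisoner's dilemma payoff table (a common hypothetical instance, not necessarily the paper's exact payoffs): the unique Nash equilibrium (D, D) is Pareto-dominated by (C, C):

```python
# Sketch: classical prisoner's dilemma (hypothetical standard payoffs),
# showing the Nash equilibrium (D, D) is Pareto-dominated by (C, C).
payoff = {  # (row, col) strategies -> (row payoff, col payoff)
    ("C", "C"): (3, 3), ("C", "D"): (0, 5),
    ("D", "C"): (5, 0), ("D", "D"): (1, 1),
}

def is_nash(profile):
    """Neither player gains by unilaterally switching strategy."""
    r, c = profile
    best_r = all(payoff[(r, c)][0] >= payoff[(alt, c)][0] for alt in "CD")
    best_c = all(payoff[(r, c)][1] >= payoff[(r, alt)][1] for alt in "CD")
    return best_r and best_c

def pareto_dominates(p, q):
    """Profile p is at least as good for both players and better for one."""
    a, b = payoff[p]
    x, y = payoff[q]
    return a >= x and b >= y and (a, b) != (x, y)
```

The quantum EWL extension studied in the paper enlarges the strategy space so that equilibria can move closer to the Pareto-optimal (C, C) outcome.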
ARTICLE | doi:10.20944/preprints202011.0031.v1
Subject: Engineering, Automotive Engineering Keywords: BIM; Insulation Design; Building Envelope; Multi-objective; Optimisation; Pareto-front
Online: 2 November 2020 (11:26:02 CET)
Insulation systems for the floor, roof and external walls play a prominent role in providing a thermal barrier for the building envelope. Design decisions made for the insulation material type and thickness can alleviate potential impacts on the embodied energy and improve the building's thermal performance. This design problem is often addressed using a BIM-integrated optimisation approach. However, one major weakness of current studies is that BIM is merely used as the source of design parameter input. This study proposes a BIM-based envelope insulation optimisation design framework using the widely adopted software Revit to find the trade-off between the total embodied energy of the insulation system and the thermal performance of the envelope, considering material type and thickness. In addition, the framework permits data visualisation in a BIM environment, subsequent material library mapping, and instantiation of the optimal insulation designs. The framework is tested on a case study based in Sydney, Australia. By analysing sample designs from the Pareto front, it is found that a slight improvement in thermal performance (from 1.3399 to 1.2112 GJ/m2) would cause the embodied energy to increase by more than 50 times.
ARTICLE | doi:10.20944/preprints202106.0089.v1
Subject: Engineering, Energy And Fuel Technology Keywords: TSO-DSO coordination; Pareto front; Bi-level optimisation; Optimal power flow
Online: 2 June 2021 (15:53:22 CEST)
The incorporation of renewable energy into power systems poses serious challenges to the transmission and distribution system operators (TSOs and DSOs). To fully leverage these resources, there is a need for a new market design with improved coordination between TSOs and DSOs. In this paper we propose two coordination schemes between TSOs and DSOs, one centralised and one decentralised, that facilitate the integration of distributed generation, minimise operational cost, relieve congestion, and promote a sustainable system. To this end, we approximate the power equations with linearised equations so that the resulting optimal power flows (OPFs) in both the TSO and DSO become convex optimisation problems. In the resulting decentralised scheme, the TSO and DSO collaborate to optimally allocate all resources in the system. In particular, we propose an iterative bi-level optimisation technique where the upper level is the TSO, which solves its own OPF and determines the locational marginal prices at substations. We demonstrate numerically that the algorithm converges to a near-optimal solution. We study the interaction of TSOs and DSOs and the existence of any conflicting objectives with the centralised scheme. More specifically, we approximate the Pareto front of the multi-objective optimal power flow problem where the entire system, i.e., transmission and distribution, is modelled. The proposed ideas are illustrated through a five-bus transmission system connected with distribution systems represented by the IEEE 33- and 69-bus feeders.
ARTICLE | doi:10.20944/preprints202305.1198.v1
Subject: Computer Science And Mathematics, Probability And Statistics Keywords: heavy-tailed distributions; Pareto law; Lotka law; Zipf law; probability generating function.
Online: 17 May 2023 (05:41:04 CEST)
We give two examples of the appearance of heavy-tailed distributions in applications to social sciences. Among these distributions are the laws of Pareto, Lotka, and some new ones. The examples are illustrated by constructing suitable toy models.
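A minimal toy model in this spirit: samples from a Pareto law with survival function P(X > x) = (x_m/x)^alpha can be drawn by inverse-transform sampling. The parameter values below are arbitrary illustrations, not from the paper:

```python
import random

# Sketch: inverse-transform sampling from a Pareto law with
# P(X > x) = (x_m / x)**alpha for x >= x_m (illustrative parameters).
def pareto_sample(alpha, x_m, rng):
    u = rng.random()                      # uniform on [0, 1)
    return x_m / (1.0 - u) ** (1.0 / alpha)  # inverts F(x) = 1 - (x_m/x)**alpha

rng = random.Random(42)
samples = [pareto_sample(2.5, 1.0, rng) for _ in range(100_000)]
mean = sum(samples) / len(samples)        # theory: alpha*x_m/(alpha-1) = 5/3
```

For alpha = 2.5 the tail probability P(X > 10) = 10**-2.5, roughly 0.3%, is orders of magnitude larger than a normal distribution with the same mean would allow, which is the heavy-tail effect the examples in the paper exploit.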
ARTICLE | doi:10.20944/preprints202107.0499.v1
Subject: Business, Economics And Management, Accounting And Taxation Keywords: Generalized Pareto distribution; Exponential income distribution; Technology factor; Information stock; Decentralized decisions
Online: 21 July 2021 (15:20:38 CEST)
This paper attempts to formalize Hayek's theory of knowledge. It is theoretically shown that an exponential income distribution is a spontaneous order of a well-functioning market economy. We show that this theoretical result is supported by empirical evidence from the United Kingdom and China. In particular, we empirically show how the income structure of China evolved towards an exponential distribution after the market-oriented economic reform. Furthermore, we strictly prove that, if the income structure of an economy obeys an exponential distribution, summing income over all households leads to an aggregate production function with Hicks-neutral-like technical progress, in which the technology factor is exactly equal to society's information stock resulting from combining all decentralized decisions.
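A toy check of the exponential-income claim: under an exponential distribution, the share of households earning below the mean income is 1 - exp(-1), about 63.2%, independent of the mean. The mean income used below is hypothetical:

```python
import math
import random

# Sketch: toy check of an exponential income distribution (hypothetical
# mean income m). Under Exp(mean m), the share of households earning
# below the mean is 1 - exp(-1) ~ 63.2%, regardless of m.
rng = random.Random(0)
m = 30_000.0
incomes = [rng.expovariate(1.0 / m) for _ in range(200_000)]
share_below_mean = sum(x < m for x in incomes) / len(incomes)
```

Such scale-free structural properties are one way an exponential hypothesis can be tested against empirical income data like the UK and China series discussed above.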
ARTICLE | doi:10.20944/preprints201908.0014.v1
Subject: Engineering, Industrial And Manufacturing Engineering Keywords: tungsten; zinc; tailings re-processing; multi-criteria optimization; modelling regression; Pareto optimal
Online: 1 August 2019 (11:43:59 CEST)
The growth of demand for metallic minerals has been accompanied by the need for new techniques and improved technologies across all mine life-cycle operations. Nowadays, the exploitation of old tailings and mine waste facilities could provide a solution to this demand, with economic and environmental advantages. The W-Sn Panasqueira Mine has been operating for more than 100 years. Its first processing plant, "Rio", was located near the Zêzere river, with the mineral processing residues deposited on the hillside above the margin of this river in the Cabeço do Pião tailings dam. The lack of maintenance and monitoring of this enormous structure over the last twenty years poses high risks to the environment and the population of the surrounding region. Re-mining of the tailings by hydrometallurgical methods was considered in order to satisfy the two stated conditions, metal demand and environmental risk, with the aim of using the sale of the metal to pay for the environmental intervention. A field sampling campaign allowed the collection of data, and the results of laboratory tests led to the use of regression optimization. The re-mining solution was studied taking into account technical, economic, social, and environmental aspects.
ARTICLE | doi:10.20944/preprints202312.0596.v1
Subject: Computer Science And Mathematics, Mathematics Keywords: Pareto optimal element; Bi-multi-objective optimization; Interval order; Ambiguity; Markowitz portfolio selection
Online: 8 December 2023 (09:17:23 CET)
We introduce and discuss a generalization of classical multi-objective optimization to pairs of functions. This procedure is referred to as bi-multi-objective optimization. A justification of this general optimization procedure is presented, related both to multi-objective optimization under ambiguity concerning individual preferences and to Pareto optimality for a family of preferences with nontransitive indifference. Incidentally, the binary relation naturally associated with a bi-multi-objective optimization problem is represented by a finite bi-multi-utility, which generalizes the classical finite multi-utility representation to the nontransitive case. An important application to Markowitz portfolio selection under ambiguity concerning both the vector of returns and the covariance matrix is presented.
ARTICLE | doi:10.20944/preprints202305.0125.v1
Subject: Biology And Life Sciences, Biophysics Keywords: trade-off; dialectic; TRIZ; Inventive Principle; wood wasp ovipositor; intracranial endoscope; Pareto curve
Online: 3 May 2023 (09:44:40 CEST)
Our knowledge of physics and chemistry is relatively well defined. Results from that knowledge are predictable, as, largely, are those of their technical offspring such as electrical, chemical, mechanical and civil engineering. By contrast, biology is relatively unconstrained and unpredictable. A factor common to all areas is the trade-off, which provides a means of defining and quantifying a problem and, ideally, its solution. In order to understand the anatomy of the trade-off and how to handle it, its development (as the dialectic) is tracked from Hegel and Marx to its implementation as dialectical materialism in Russian philosophy and TRIZ, the Theory of Inventive Problem Solving. With the ready availability of mathematical techniques, such as multi-objective analysis and the Pareto set, the trade-off is well adapted to bridging the gaps between the quantified and the unquantifiable, allowing modelling and the transfer of concepts by analogy. It is thus an ideal tool for biomimetics. An intracranial endoscope can be derived with little change from the egg-laying tube of a wood wasp. More complex transfers become available as the technique is developed. Most important, as more trade-offs are analyzed, their results are stored to be used again in the solution of problems. There is no other system in biomimetics which can do this.
ARTICLE | doi:10.20944/preprints202112.0506.v1
Subject: Computer Science And Mathematics, Data Structures, Algorithms And Complexity Keywords: multi-objective; evolutionary algorithms; Pareto optimality; Wasserstein distance; network vulnerability; resilience; sensor placement.
Online: 31 December 2021 (11:01:51 CET)
This paper focuses on two highly relevant topics in water distribution networks (WDNs): vulnerability assessment and the optimal placement of water quality sensors. The main novelty of this paper is to represent the data of the problem, in this case all objects in the graph underlying a water distribution network, as discrete probability distributions. For vulnerability (and the related issue of resilience), the metrics from network theory, widely studied and largely adopted in the water research community, reflect connectivity expressed as closeness centrality or betweenness centrality based on the average values of shortest paths between all pairs of nodes. Network efficiency and the related vulnerability measures are likewise based on averages of inverse distances. In this paper we propose a different approach based on the discrete probability distribution, for each node, of the node-to-node distances. For optimal sensor placement, the elements represented as discrete probability distributions are sub-graphs given by the locations of water quality sensors. The objective functions, detection time and its variance as a proxy of risk, are accordingly represented as discrete probability distributions over contamination events. This problem is usually addressed with evolutionary algorithms (EAs). We show that a probabilistic distance, specifically the Wasserstein (WST) distance, naturally allows an effective formulation of genetic operators. In the optimal sensor placement literature, each node is usually associated with a scalar, the average detection time, but in many applications node labels are more naturally expressed as histograms or probability distributions: the water demand at each node is naturally seen as a histogram over the 24-hour cycle. The main aim of this paper is twofold: first, to show how different problems in WDNs can take advantage of the representational flexibility inherent in WST spaces; second, to show how this flexibility translates into computational procedures.
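For 1-D empirical distributions such as node-to-node distance histograms, the Wasserstein distance has a simple closed form. A sketch under the assumption of equal-size samples (the values are hypothetical, not from a real network):

```python
# Sketch: 1-D Wasserstein (earth mover's) distance between two equal-size
# empirical samples, e.g. node-to-node distance distributions at two nodes
# (hypothetical values). For 1-D empirical measures with equally many
# points, it reduces to the mean absolute difference of sorted samples.
def wasserstein_1d(xs, ys):
    xs, ys = sorted(xs), sorted(ys)
    return sum(abs(a - b) for a, b in zip(xs, ys)) / len(xs)

d1 = [1, 2, 3, 4]   # distances from node 1 to the other nodes
d2 = [2, 3, 4, 5]   # distances from node 2 to the other nodes
```

Because the distance is computed between whole distributions rather than between scalar averages, two nodes with equal mean distance but very different spread are no longer treated as equivalent, which is the representational gain the paper argues for.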
ARTICLE | doi:10.20944/preprints201712.0181.v2
Subject: Business, Economics And Management, Economics Keywords: dynamic game; feedback Nash equilibrium; Pareto solution; monetary union; macroeconomics; public debt; coalitions
Online: 22 January 2018 (16:12:48 CET)
In this paper we present an application of the dynamic tracking games framework to a monetary union. We use a small stylized nonlinear three-country macroeconomic model of a monetary union to analyse the interactions between fiscal (governments) and monetary (common central bank) policy makers, assuming different objective functions of these decision makers. Using the OPTGAME algorithm we calculate solutions for several games: a noncooperative solution where each government and the central bank play against each other (a feedback Nash Equilibrium solution), a fully cooperative solution with all players following a joint course of action (a Pareto optimal solution), and three solutions where various coalitions (subsets of the players) play against coalitions of the other players in a noncooperative way. It turns out that the fully cooperative solution yields the best results, the noncooperative solution fares worst, and the coalition games lie in between, with a broad coalition of the fiscally more responsible countries and the central bank against the less thrifty country coming closest to the Pareto optimum.
ARTICLE | doi:10.20944/preprints202311.0563.v1
Subject: Computer Science And Mathematics, Probability And Statistics Keywords: stopping times; expectation; finite sequences; non-asymptotic; optimal allocation; Kelly criterion; stable pareto; fat tails; generalized hyperbolic
Online: 8 November 2023 (14:35:34 CET)
Financial time series and other human-driven, non-natural processes are known to exhibit fat-tailed outcome distributions. That is, such processes show a greater tendency toward extreme outcomes than the normal distribution or other natural distributional processes would predict. We examine the mathematical expectation, or simply "expectation," traditionally the probability-weighted outcome, regarded since the seventeenth century as the mathematical definition of "expectation." However, when considering the "expectation" of an individual confronted with a finite sequence of outcomes, particularly existential outcomes (e.g., a trader with limited time to perform or lose his position in a trading operation), we find that this individual "expects" the median terminal outcome over those finite trials, with the classical seventeenth-century definition being the asymptotic limit of this as the number of trials increases. Since such finite-sequence "expectations" often differ in value from the classic one, so do the optimal allocations (e.g., growth-optimal ones). We examine these for fat-tailed distributions. The focus is on implementation, and the techniques described can be applied to all distributional forms.
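The gap between the probability-weighted expectation and the median terminal outcome can be reproduced with a small hypothetical gamble (the multipliers 1.5 and 0.6 are illustrative, not from the paper): the expectation grows with each trial, yet the median terminal wealth over a finite sequence shrinks.

```python
import statistics
from itertools import product

# Sketch: expected vs. median terminal wealth over n trials of a
# hypothetical gamble (wealth multiplied by 1.5 or 0.6, each w.p. 1/2).
# Per-trial expected multiple = 1.05 > 1, but the median growth factor
# is sqrt(1.5 * 0.6) = sqrt(0.9) < 1.
n = 10
outcomes = []
for path in product([1.5, 0.6], repeat=n):   # all 2**n equally likely paths
    w = 1.0
    for m in path:
        w *= m
    outcomes.append(w)

expected = sum(outcomes) / len(outcomes)     # = 1.05 ** n  (grows)
median = statistics.median(outcomes)         # = 0.9 ** (n / 2)  (shrinks)
```

An individual with only these n trials available "expects" the median, below 0.6, while the classical expectation, above 1.6, is only reached asymptotically across many such individuals or sequences.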
ARTICLE | doi:10.20944/preprints202309.1112.v1
Subject: Engineering, Architecture, Building And Construction Keywords: pedestrian refuges; refuge islands; speed variation; reduce speed; horizontal deflection; free view; Pareto chart; cause and effect diagram
Online: 18 September 2023 (04:48:02 CEST)
The ever-increasing use of motor vehicles causes a number of traffic safety and community issues, which are particularly severe in cities, accompanied by scarcity of parking spaces and challenges encountered in road layout alteration projects. Commonly applied solutions include the designation of through streets and the implementation of on-street parking on residential streets and retrofitted traffic calming measures (TCMs). This article presents the results of a study conducted on a two-way street where a Metered Parking System (MPS) was implemented together with diagonal and parallel parking spaces, refuge islands, horizontal deflection, and lane narrowing by a single-sided chicane. The aim of this study was to identify those TCMs that effectively helped to reduce the island approach speed. A heuristic method was applied to assess the effect of the respective TCMs on reducing the island approach speed, and the key speed reduction determinants were identified using a cause and effect diagram and a Pareto chart. Comparative analyses were carried out to rate the respective TCMs as effective, moderately effective or ineffective. Section 1 of this article presents the output of the literature review on urban parking analyses and TCM efficacy. Section 2 presents the study site and the applied heuristic method. The study results are presented in Section 3. Section 4 defines the determinants on the cause and effect diagram and analyses them using the Pareto chart. The final conclusions and comments are given in Section 5. Although the study was limited to a single street in Poland, the findings may hold true in other countries where similar TCMs are used.
ARTICLE | doi:10.20944/preprints202306.2178.v1
Subject: Engineering, Telecommunications Keywords: delay; dimensionality reduction; LTE; VoIP; Neural Networks; Support Vector Machines; k-Nearest Neighbors; Feature Selection; Pareto 80/20 rule
Online: 30 June 2023 (07:38:35 CEST)
Delay in data transmission is one of the key performance indicators (KPIs) of a network. Planning and projecting delay in network management is of crucial importance for the optimal allocation of network resources and their performance targets. To create optimal solutions, predictive models, currently most often based on machine learning (ML), are used. This paper investigates the training, testing and selection of the best predictive delay model for a VoIP service in a Long Term Evolution (LTE) network using three ML techniques: Neural Networks (NN), Support Vector Machines (SVM) and k-Nearest Neighbors (k-NN). The space of model input variables is optimized by dimensionality reduction techniques: the RReliefF algorithm, backward selection via the recursive feature elimination algorithm, and the Pareto 80/20 rule. A three-segment road in the geo-space between the cities of Banja Luka (BL) and Doboj (Db) in the Republic of Srpska (RS), Bosnia and Herzegovina (BiH), covered by the cellular network (LTE) of the M:tel BL operator, was chosen for the case study. The results show that, in all three optimization approaches, the k-NN model is selected as the best solution. For the RReliefF algorithm, the best model has 6 inputs and a minimum relative error (RE) of 0.109; for backward selection via recursive feature elimination, the best model has 4 inputs and RE = 0.041; and for the Pareto 80/20 rule, the best model has 11 inputs and RE = 0.049. The comparative analysis concludes that, according to the observed criteria for selecting the final model, the best solution is the approach based on backward selection via the recursive feature elimination algorithm.
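The Pareto 80/20 screening step can be sketched as keeping the smallest set of ranked predictors that covers 80% of the total importance. The feature names and importance scores below are hypothetical, not the study's actual variables:

```python
# Sketch: Pareto 80/20 predictor screening (hypothetical feature names
# and importance scores, not the study's data). Features are ranked by
# importance and the smallest prefix covering 80% of the total is kept.
importances = {"rsrp": 0.30, "sinr": 0.22, "speed": 0.15, "rsrq": 0.12,
               "load": 0.08, "distance": 0.06, "hour": 0.04, "weather": 0.03}

ranked = sorted(importances.items(), key=lambda kv: kv[1], reverse=True)
total = sum(importances.values())

selected, covered = [], 0.0
for name, score in ranked:
    if covered >= 0.8 * total:
        break
    selected.append(name)
    covered += score
```

The selected subset would then be fed to the candidate models (NN, SVM, k-NN) in place of the full input space, exactly as any of the three reduction techniques supplies a reduced predictor set.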
ARTICLE | doi:10.20944/preprints202303.0495.v1
Subject: Environmental And Earth Sciences, Water Science And Technology Keywords: floods; frequency analysis; extreme value statistics; Pareto; Wakeby; estimation parameters; approximate form; method of ordinary moments; method of linear moments
Online: 29 March 2023 (02:29:25 CEST)
This article analyzes six probability distributions from the Generalized Pareto family, with 3, 4 and 5 parameters, with the main purpose of identifying other distributions from this family applicable to flood frequency analysis beyond those already used in the literature, such as the Generalized Pareto Type II and Wakeby distributions. This analysis is part of larger, more complex research carried out in the Faculty of Hydrotechnics regarding the elaboration of a norm for flood frequency analysis using the method of linear moments. In Romania, the standard method of parameter estimation is the method of ordinary moments, so the transition from this method to the method of linear moments is desired. All the elements necessary for using these distributions are presented: the probability density functions, the complementary cumulative distribution functions, the quantile functions, and the exact and approximate relations for estimating parameters with both estimation methods. All these elements are necessary for a proper transition between the two methods, especially since the use of the method of ordinary moments requires choosing the skewness of the observed data depending on the origin of the maximum flows. A flood frequency analysis case study, using annual maximum and annual exceedance series, was carried out for the Prigor river to numerically illustrate the analyzed distributions. The performance of these distributions is evaluated using the relative mean error, the relative absolute error, and the linear moments diagram.
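One of the elements the article catalogues, the quantile function, can be sketched for a simple Generalized Pareto case in one common parametrization; the paper's notation and exact forms may differ:

```python
import math

# Sketch: quantile function of the Generalized Pareto distribution in a
# common parametrization F(x) = 1 - (1 + k*(x - mu)/sigma)**(-1/k),
# with location mu, scale sigma > 0, shape k. Illustrative only; the
# article's 3-, 4- and 5-parameter forms may use different notation.
def gpd_quantile(p, mu, sigma, k):
    if abs(k) < 1e-12:                       # exponential limit as k -> 0
        return mu - sigma * math.log(1.0 - p)
    return mu + (sigma / k) * ((1.0 - p) ** (-k) - 1.0)
```

Inverting the CDF gives the return-level estimate directly: for an annual series, the T-year flood is gpd_quantile(1 - 1/T, mu, sigma, k) once the parameters have been estimated by ordinary or linear moments.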
ARTICLE | doi:10.20944/preprints201805.0349.v1
Subject: Environmental And Earth Sciences, Environmental Science Keywords: decision support; multi-criteria decision analysis; multiple criteria pareto frontier methods; criterium decision plus; net weaver developer; SADfLOR; ecosystem management decision support system
Online: 24 May 2018 (10:35:18 CEST)
This study examines the potential of combining decision support approaches to identify optimal bundles of ecosystem services. A forested landscape, Zona de Intervenção Florestal of Paiva and Entre-Douro e Sousa (Portugal), is used to test and demonstrate this potential. The landscape extends over 14,000 ha, representing 1,976 stands. The property is fragmented into 376 holdings. The overall analysis was performed in three steps. First, we selected six alternative solutions (A to F) on a Pareto frontier generated by a multiple criteria method within a decision support system (SADfLOR) for subsequent analysis. Next, an aspatial strategic multi-criteria decision analysis (MCDA) was performed with the Criterium DecisionPlus (CDP) component of another decision support system (EMDS) to assess the aggregate performance of solutions A to F for the entire forested landscape with respect to their utility for delivery of ecosystem services. For the CDP analysis, SADfLOR data inputs were grouped into two sets of primary criteria: Wood Harvested and Other Ecosystem Services. Finally, a spatial logic-based assessment of solutions A to F for individual stands of the study area was performed with the NetWeaver component of EMDS. The NetWeaver model was structurally and computationally equivalent to the CDP model, but the key NetWeaver metric is a measure of the strength of evidence that solutions for specific land stands were optimal for the unit. Solutions D and B performed best in the aspatial strategic MCDA, and a composite of the maps generated by NetWeaver demonstrated the spatial basis for the performance of solutions D and B in individual land stands. We conclude with a discussion of how the combination of decision support approaches encapsulated in the two systems could be further automated.
ARTICLE | doi:10.20944/preprints202307.0187.v1
Subject: Business, Economics And Management, Econometrics And Statistics Keywords: Bitcoin; cryptocurrency; extreme value models (EVM); GPD-Normal-GPD (GNG); generalised Pareto distribution (GPD); Kernel density estimator (KDE); Normal distribution; Value-at-Risk (VaR)
Online: 4 July 2023 (11:29:36 CEST)
Cryptocurrencies have obtained a crucial position in the international financial landscape. The cryptocurrency market has been perceived as highly volatile since the inception of Bitcoin. This study investigates the relative performance of extreme value models (EVM) in estimating the Value-at-Risk (VaR) of Bitcoin and Ethereum returns. The extreme value mixture models, GPD-Normal-GPD (GNG) and GPD-KDE-GPD, are fitted to the returns of Bitcoin and Ethereum, and the Kupiec likelihood backtesting procedure is performed on the VaR estimates to assess the fits. Both models represented the observed data much better than the Normal distribution. The backtesting results showed that the GPD-KDE-GPD model's fit was superior to that of the GPD-Normal-GPD for both sets of returns at all VaR risk levels except the 99% level. The results of this study may assist with understanding the dynamics and risks associated with cryptocurrencies and can serve as a beneficial tool for decision-making and risk management for investors, traders, financial institutions and many other participants in the cryptocurrency ecosystem.
ARTICLE | doi:10.20944/preprints202306.0710.v1
Subject: Computer Science And Mathematics, Mathematics Keywords: Multi-objective optimization problems; Dynamic non-linear programming; Fuzzy set; Piecewise quadratic fuzzy numbers; Close interval approximation; α-Pareto optimal solution; Decision-making; Stability
Online: 9 June 2023 (11:28:09 CEST)
In real-life scenarios, there are many mathematical tools for handling incomplete and imprecise data. One of them is the fuzzy approach. This article aims to contribute to the literature on fuzzy multi-objective dynamic programming problems involving fuzzy objective functions. Piecewise quadratic fuzzy numbers characterize these fuzzy parameters. Some basic notions of the problem under the α-Pareto optimal solution concept are redefined and analyzed to study the stability of the problem. Furthermore, a technique, named the enhancing decomposition approach, is presented for obtaining a subset of the parametric space that yields the same α-Pareto optimal solution. For a better understanding of the suggested concept, a numerical example is provided.