Preprint (Article). This version is not peer-reviewed; a peer-reviewed article of this preprint also exists.

Tsallis Entropy in Consecutive r-out-of-n:G Systems: Bounds, Characterization, and Testing for Exponentiality

Submitted: 12 July 2025. Posted: 15 July 2025.
Abstract
This paper investigates the role of Tsallis entropy as a generalized measure of uncertainty in the reliability analysis of consecutive r-out-of-n:G systems, a class of models widely used in engineering applications. We derive new analytical expressions and meaningful bounds for the Tsallis entropy under various lifetime distributions, offering fresh insight into the structural behavior of system-level uncertainty. The approach establishes theoretical connections with classical entropy measures, such as Shannon and Rényi entropies, and provides a foundation for comparing systems under different stochastic orders. A nonparametric estimator is proposed to estimate the Tsallis entropy in this setting, and its performance is evaluated through Monte Carlo simulations. In addition, we develop a new entropy-based test for exponentiality, building on the distinctive properties of system lifetimes. Although Tsallis entropy presents certain limitations, such as non-positivity and lack of additivity, our results support its value as a flexible tool in both reliability characterization and statistical inference.

1. Introduction

Over the past three decades, consecutive r-out-of-n systems and their variants have attracted significant attention due to their broad applicability across engineering and industrial domains. These systems effectively model real-world configurations such as microwave relay stations in telecommunications, oil pipeline segments, and vacuum assemblies in particle accelerators. Depending on the arrangement and operational logic of the components, several types of consecutive systems can be defined. In particular, a linear consecutive r-out-of-n:G system comprises $n$ components arranged in sequence, with independent and identically distributed (i.i.d.) lifetimes, and the system remains functional if at least $r$ consecutive components are operational. Notably, classical series and parallel systems arise as special cases: the series system corresponds to the n-out-of-n:G configuration, while the parallel system corresponds to the 1-out-of-n:G case. The reliability properties of these systems have been extensively examined under a range of assumptions. Seminal contributions in this area include the works of Jung and Kim [1], Shen and Zuo [2], Kuo and Zuo [3], Chang et al. [4], Boland and Samaniego [5], and Eryılmaz [6,7], among others. Among the various configurations, linear systems satisfying the condition $2r \ge n$ are of particular interest, as they strike a balance between mathematical tractability and practical applicability in reliability analysis. In these systems, the lifetime of the $i$-th component is denoted by $T_i$, $i=1,2,\ldots,n$. Each component lifetime is assumed to follow a continuous distribution with probability density function (pdf) $f(t)$, cumulative distribution function (cdf) $F(t)$, and survival function $S(t) = P(T>t)$. The system lifetime is denoted by $T_{r|n:G}$. Notably, Eryilmaz [8] showed that when $2r \ge n$, the reliability function of the consecutive r-out-of-n:G system is given by:
$$S_{r|n:G}(t) = (n-r+1)S^{r}(t) - (n-r)S^{r+1}(t), \qquad t>0. \tag{1}$$
In information theory, a central objective is to quantify the uncertainty inherent in probability distributions. Within this framework, we focus on the application of Tsallis entropy to consecutive r-out-of-n:G systems, where component lifetimes follow continuous probability distributions. Let $T$ be a nonnegative random variable with cdf $F(t)$ and pdf $f(t)$. The Tsallis entropy of order $\beta$, introduced by Tsallis [9], serves as a valuable measure of uncertainty and is defined as:
$$H_\beta(T) = \frac{1}{1-\beta}\left(\int_0^\infty f^{\beta}(t)\,dt - 1\right) = \frac{1}{1-\beta}\left(\int_0^1 f^{\beta-1}\!\left(F^{-1}(u)\right)du - 1\right), \quad \text{for all } \beta\in\Theta = (0,1)\cup(1,\infty), \tag{2}$$
where $F^{-1}(u) = \inf\{t : F(t)\ge u\}$, for $u\in[0,1]$, represents the quantile function of $F(t)$.
In particular, the Shannon differential entropy, a foundational concept in information theory introduced by Shannon [10], is obtained as the limiting case of the Tsallis entropy as $\beta\to1$. Specifically, it is defined by:
$$H(T) = \lim_{\beta\to1} H_\beta(T) = -\int_0^\infty f(t)\log f(t)\,dt. \tag{3}$$
An alternative and insightful representation of the Tsallis entropy can be derived from Equation (2) by expressing it in terms of the hazard rate function. Specifically, it takes the form:
$$H_\beta(T) = \frac{1}{1-\beta}\left(\frac{1}{\beta}\,E\!\left[\lambda^{\beta-1}(T_\beta)\right] - 1\right), \tag{4}$$
where $\lambda(t) = f(t)/S(t)$ represents the hazard rate function, $E(\cdot)$ denotes expectation, and $T_\beta$ follows the pdf
$$f_\beta(t) = \beta f(t)\,S^{\beta-1}(t), \quad t>0, \quad \text{for all } \beta>0. \tag{5}$$
It is important to note that Tsallis entropy does not necessarily yield positive values for all $\beta>0$. However, $H_\beta(T)$ remains invariant under location transformations, though it is sensitive to changes in scale. Unlike Shannon entropy, which is additive for independent variables, Tsallis entropy exhibits non-additive behavior. Specifically, for independent random variables $T_1$ and $T_2$, we have:
$$H_\beta(T_1,T_2) = H_\beta(T_1) + H_\beta(T_2) + (1-\beta)\,H_\beta(T_1)\,H_\beta(T_2). \tag{6}$$
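For independent exponential lifetimes, the non-additivity identity (6) can be checked in closed form, since $\int_0^\infty f^\beta(t)\,dt = \lambda^{\beta-1}/\beta$ for an $\mathrm{Exp}(\lambda)$ density and the joint integral factorizes. A minimal sketch (the helper names are ours, not from the paper):

```python
def tsallis_exp(lam, beta):
    # Tsallis entropy of Exp(lam): the integral of f^beta equals lam**(beta-1)/beta
    return (lam**(beta - 1) / beta - 1.0) / (1.0 - beta)

def tsallis_joint_exp(lam1, lam2, beta):
    # For independent T1, T2 the double integral of (f1*f2)**beta factorizes
    I = (lam1**(beta - 1) / beta) * (lam2**(beta - 1) / beta)
    return (I - 1.0) / (1.0 - beta)

for beta in (0.5, 2.0):
    h1, h2 = tsallis_exp(1.0, beta), tsallis_exp(3.0, beta)
    lhs = tsallis_joint_exp(1.0, 3.0, beta)
    rhs = h1 + h2 + (1 - beta) * h1 * h2
    assert abs(lhs - rhs) < 1e-12   # identity (6) holds exactly
```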
The investigation of information-theoretic properties in reliability systems and order statistics has garnered significant scholarly attention. Foundational contributions in this domain include the works of Wong and Chen [11], Park [12], Ebrahimi et al. [13], Zarezadeh and Asadi [14], Toomaj and Doostparast [15], Toomaj [16] and Mesfioui et al. [17], among others. More recently, Alomani and Kayid [18] explored additional aspects of Tsallis entropy, focusing on coherent and mixed systems under an i.i.d. assumption. Baratpour and Khammar [19] examined its behavior in relation to order statistics and record values, while Kumar [20] studied its role in the context of r-records. Further advancements were made by Kayid and Alshehri [21], who derived an explicit expression for the lifetime entropy of consecutive r-out-of-n:G systems, established a characterization result, proposed practical bounds, and introduced a nonparametric estimator. Complementing this research, Kayid and Shrahili [22] investigated the fractional generalized cumulative residual entropy for similar systems, providing a computational formulation, preservation properties, and two nonparametric estimators supported by empirical validation.
Building upon the existing body of research, this paper aims to further elucidate the role of Tsallis entropy in the analysis of consecutive r-out-of-n:G systems. By extending earlier results, we offer deeper insights into its structural characteristics, propose refined bounding techniques, and develop estimation methods specifically adapted to these reliability models.
The remainder of this paper is organized as follows. Section 2 presents a novel representation for the Tsallis entropy of consecutive r-out-of-n:G systems, denoted by $T_{r|n:G}$, where component lifetimes are drawn from an arbitrary continuous distribution function $F$. This representation is derived using the case of the uniform distribution as a baseline. Given the inherent difficulty in deriving exact expressions for Tsallis entropy in complex reliability structures, we address this challenge by establishing several analytical bounds, illustrated through numerical examples. In Section 3, we explore characterization results for Tsallis entropy in the context of consecutive systems, establishing key theoretical properties. Section 4 provides computational evidence supporting our results and introduces a nonparametric estimator tailored for system-level Tsallis entropy. The estimator's performance is demonstrated using both simulated and real-world data. Finally, Section 5 concludes with a summary of the main findings and their implications.

2. Tsallis Entropy of the Consecutive r-out-of-n:G System

This section is divided into three main parts. First, we provide a concise review of key results concerning Tsallis entropy and its connections to Rényi and Shannon differential entropies. In the second part, we derive an explicit expression for the Tsallis entropy of a consecutive r-out-of-n:G system and examine its behavior under various stochastic orders. Finally, we present a set of analytical bounds that shed additional light on the entropy structure of these systems.

2.1. Results on Tsallis Entropy

Differential entropy serves as a measure of the average uncertainty associated with a continuous random variable. Broadly speaking, the more a probability density function f ( t ) deviates from the uniform distribution, the higher its differential entropy. In essence, this quantity reflects the expected amount of information gained from observing a specific realization of the random variable. Rényi entropy offers a more flexible framework for quantifying uncertainty compared to differential entropy. For a non-negative random variable T with probability density function f ( t ) , Rényi entropy introduces a parameter β that enables the exploration of different aspects of uncertainty depending on the weight assigned to various probability outcomes. It is defined as:
$$R_\beta(T) = \frac{1}{1-\beta}\log\int_0^\infty f^{\beta}(t)\,dt, \quad \text{for all } \beta>0,\ \beta\neq1. \tag{7}$$
The Tsallis and Rényi entropies are also measures of the disparity of the pdf $f(t)$ from the uniform distribution, and both may take values in $[-\infty,\infty]$. It is known that for an absolutely continuous nonnegative random variable $T$, $H_\beta(T)\ge H(T)$ for all $0<\beta<1$, and $H_\beta(T)\le H(T)$ for all $\beta>1$. Moreover, the connection between the Rényi and Tsallis entropies is given by $H_\beta(T)\ge R_\beta(T)$ for all $0<\beta<1$ and $H_\beta(T)\le R_\beta(T)$ for all $\beta>1$, where $R_\beta(T)$ is defined in (7); both relations follow from the elementary inequality $\log x \le x-1$. In the next theorem, we investigate the relationship between the Tsallis entropy of $T$ and the Shannon differential entropy of the proportional hazard rates model $T_\beta$ defined by (5).
Theorem 2.1.
For an absolutely continuous nonnegative random variable $T$, we have
$$H_\beta(T) \ \ge\ H(T_\beta) - 1 - \frac{\beta\log\beta}{1-\beta}, \ \ \text{for all } 0<\beta<1, \qquad H_\beta(T) \ \le\ H(T_\beta) - 1 - \frac{\beta\log\beta}{1-\beta}, \ \ \text{for all } \beta>1. \tag{8}$$
Proof. By the log-sum inequality and the fact that $\log x \le x-1$, we have
$$\int_0^\infty f_\beta(t)\log\frac{f_\beta(t)}{f^{\beta}(t)}\,dt \ \ge\ \left(\int_0^\infty f_\beta(t)\,dt\right)\log\frac{\int_0^\infty f_\beta(t)\,dt}{\int_0^\infty f^{\beta}(t)\,dt} = -\log\int_0^\infty f^{\beta}(t)\,dt \ \ge\ 1-\int_0^\infty f^{\beta}(t)\,dt.$$
Since $f_\beta(t)/f^{\beta}(t) = \beta\lambda^{1-\beta}(t) = \beta^{\beta}\lambda_\beta^{1-\beta}(t)$, where $\lambda_\beta(t)=\beta\lambda(t)$ denotes the hazard rate function of $T_\beta$, the left-hand side equals
$$\beta\log\beta + (1-\beta)\int_0^\infty f_\beta(t)\log\lambda_\beta(t)\,dt.$$
By noting that
$$\int_0^\infty f_\beta(t)\log\lambda_\beta(t)\,dt = 1 - H(T_\beta),$$
which follows from $f_\beta(t)=\lambda_\beta(t)S^{\beta}(t)$ and $E[\log S^{\beta}(T_\beta)]=-1$, and that $1-\int_0^\infty f^{\beta}(t)\,dt = -(1-\beta)H_\beta(T)$ by (2), we obtain
$$(1-\beta)\left(H_\beta(T) + 1 - H(T_\beta)\right) \ \ge\ -\beta\log\beta.$$
Dividing by $1-\beta>0$ ($1-\beta<0$) yields the result for $0<\beta<1$ ($\beta>1$), and hence the theorem.
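Since the statement of Theorem 2.1 had to be reconstructed, a numerical sanity check is worthwhile. For $\mathrm{Exp}(\lambda)$ lifetimes, $H_\beta(T)$, $T_\beta\sim\mathrm{Exp}(\beta\lambda)$ and $H(T_\beta)=1-\log(\beta\lambda)$ are all available in closed form, so the bound $H_\beta(T)\ \ge(\le)\ H(T_\beta)-1-\beta\log\beta/(1-\beta)$ can be tested directly (a sketch under these assumptions):

```python
import math

def tsallis_exp(lam, beta):
    # Tsallis entropy of Exp(lam)
    return (lam**(beta - 1) / beta - 1.0) / (1.0 - beta)

def rhs_bound(lam, beta):
    # H(T_beta) - 1 - beta*log(beta)/(1 - beta), with T_beta ~ Exp(beta*lam)
    shannon_T_beta = 1.0 - math.log(beta * lam)
    return shannon_T_beta - 1.0 - beta * math.log(beta) / (1.0 - beta)

for lam in (0.5, 1.0, 10.0):
    assert tsallis_exp(lam, 0.5) >= rhs_bound(lam, 0.5)   # 0 < beta < 1
    assert tsallis_exp(lam, 2.0) <= rhs_bound(lam, 2.0)   # beta > 1
```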

2.2. Expression and Stochastic Orders

To obtain the Tsallis entropy expression for the consecutive r-out-of-n:G system, we apply the probability integral transformation $W_{r|n:G} = F(T_{r|n:G})$, where $F$ is the common component cdf; the distribution of $W_{r|n:G}$ is free of $F$ and coincides with the system lifetime distribution when the components are uniform on $(0,1)$. Using this transformation, we proceed to obtain an explicit expression for the Tsallis entropy of the system lifetime $T_{r|n:G}$, assuming that the component lifetimes are independently drawn from a continuous distribution function $F$. Based on Equation (1), the pdf of $T_{r|n:G}$ is given by:
$$f_{r|n:G}(t) = f(t)\left[r(n-r+1)S^{r-1}(t) - (r+1)(n-r)S^{r}(t)\right], \quad t>0. \tag{9}$$
Furthermore, when $2r\ge n$, the pdf of $W_{r|n:G}$ can be represented as follows:
$$k_{r|n:G}(w) = r(n-r+1)(1-w)^{r-1} - (r+1)(n-r)(1-w)^{r}, \quad \text{for all } 0<w<1. \tag{10}$$
We are now ready to present the following result based on the previous analysis.
Proposition 2.1.
For $2r\ge n$, the Tsallis entropy of $T_{r|n:G}$ can be expressed as follows:
$$H_\beta\!\left(T_{r|n:G}\right) = \frac{1}{1-\beta}\left(\int_0^1 k^{\beta}_{r|n:G}(w)\, f^{\beta-1}\!\left(F^{-1}(w)\right)dw - 1\right), \quad \text{for all } \beta\in\Theta, \tag{11}$$
where $k_{r|n:G}(w)$ is given by (10).
Proof. 
Applying the transformation $w=F(t)$ and referencing Equations (2) and (9), we obtain:
$$\begin{aligned}
H_\beta\!\left(T_{r|n:G}\right) &= \frac{1}{1-\beta}\left(\int_0^\infty f^{\beta}_{r|n:G}(t)\,dt - 1\right)\\
&= \frac{1}{1-\beta}\left(\int_0^\infty \left[r(n-r+1)S^{r-1}(t) - (r+1)(n-r)S^{r}(t)\right]^{\beta} f^{\beta}(t)\,dt - 1\right)\\
&= \frac{1}{1-\beta}\left(\int_0^1 \left[r(n-r+1)(1-w)^{r-1} - (r+1)(n-r)(1-w)^{r}\right]^{\beta} f^{\beta-1}\!\left(F^{-1}(w)\right)dw - 1\right)\\
&= \frac{1}{1-\beta}\left(\int_0^1 k^{\beta}_{r|n:G}(w)\, f^{\beta-1}\!\left(F^{-1}(w)\right)dw - 1\right), \quad \text{for all } \beta\in\Theta,
\end{aligned}$$
and this completes the proof.
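Proposition 2.1 can be checked numerically by computing $H_\beta(T_{r|n:G})$ both directly in the time domain via (9) and through the quantile form (11). The sketch below does this for standard exponential components and the 2-out-of-3:G system, where $f(F^{-1}(w)) = \lambda(1-w)$ (the function names and midpoint-rule integration are ours):

```python
import math

def k(w, r, n):
    # pdf of W_{r|n:G}, eq. (10); valid when 2r >= n
    return r*(n - r + 1)*(1 - w)**(r - 1) - (r + 1)*(n - r)*(1 - w)**r

def tsallis_system_quantile(lam, beta, r, n, m=20000):
    # eq. (11) with Exp(lam) components: f(F^{-1}(w)) = lam*(1 - w)
    s = sum(k((i + 0.5)/m, r, n)**beta * (lam*(1 - (i + 0.5)/m))**(beta - 1)
            for i in range(m)) / m
    return (s - 1.0) / (1.0 - beta)

def tsallis_system_direct(lam, beta, r, n, m=20000, tmax=40.0):
    # direct integration of f_{r|n:G}^beta over t, using eq. (9)
    s = 0.0
    for i in range(m):
        t = (i + 0.5) * tmax / m
        S = math.exp(-lam * t)
        f_sys = lam * S * (r*(n - r + 1)*S**(r - 1) - (r + 1)*(n - r)*S**r)
        s += f_sys**beta
    return (s * tmax / m - 1.0) / (1.0 - beta)

a = tsallis_system_quantile(1.0, 2.0, 2, 3)
b = tsallis_system_direct(1.0, 2.0, 2, 3)
assert abs(a - b) < 1e-3 and abs(a - 0.3) < 1e-3   # exact value is 1 - 0.7*lam
```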
In the subsequent theorem, we present an alternative representation for $H_\beta\!\left(T_{r|n:G}\right)$ by employing Newton's generalized binomial theorem and Proposition 2.1.
Theorem 2.2.
Under the conditions of Proposition 2.1, we get
$$H_\beta\!\left(T_{r|n:G}\right) = \frac{1}{1-\beta}\left(\sum_{i=0}^{\infty}\binom{\beta}{i}(-1)^{i}\left[r(n-r+1)\right]^{\beta-i}\left[(r+1)(n-r)\right]^{i}\frac{E\!\left[f^{\beta-1}\!\left(F^{-1}(1-Z_{i,r,\beta})\right)\right]}{i+\beta(r-1)+1} - 1\right), \tag{12}$$
where $Z_{i,r,\beta}\sim\mathrm{Beta}\!\left(i+\beta(r-1)+1,\,1\right)$ and $\binom{\beta}{i} = \frac{\beta(\beta-1)\cdots(\beta-i+1)}{i!}$, for all $\beta\in\Theta$.
Proof. 
By defining $A = r(n-r+1)$ and $B = (r+1)(n-r)$, and referring to (10) and (11), we find that
$$\begin{aligned}
\int_0^1 k^{\beta}_{r|n:G}(w)\, f^{\beta-1}\!\left(F^{-1}(w)\right)dw &= \int_0^1 (1-w)^{\beta(r-1)}\left(A - B(1-w)\right)^{\beta} f^{\beta-1}\!\left(F^{-1}(w)\right)dw\\
&= A^{\beta}\int_0^1 z^{\beta(r-1)}\left(1-\tfrac{B}{A}z\right)^{\beta} f^{\beta-1}\!\left(F^{-1}(1-z)\right)dz \qquad (\text{taking } z=1-w)\\
&= A^{\beta}\sum_{i=0}^{\infty}\binom{\beta}{i}\left(\tfrac{B}{A}\right)^{i}(-1)^{i}\int_0^1 z^{\,i+\beta(r-1)} f^{\beta-1}\!\left(F^{-1}(1-z)\right)dz\\
&= \sum_{i=0}^{\infty}\binom{\beta}{i}(-1)^{i} A^{\beta-i}B^{i}\,\frac{E\!\left[f^{\beta-1}\!\left(F^{-1}(1-Z_{i,r,\beta})\right)\right]}{i+\beta(r-1)+1},
\end{aligned}$$
where the third equality follows directly from Newton's generalized binomial series $(1-x)^{\beta} = \sum_{i=0}^{\infty}\binom{\beta}{i}(-1)^{i}x^{i}$; note that $0\le B/A\le1$ precisely when $2r\ge n$, which guarantees convergence. This result, in conjunction with Eq. (11), completes the proof.
We now illustrate the utility of the representation in (11) through the following example.
Example 2.1.
Consider a linear consecutive 2-out-of-4:G system with lifetime
$$T_{2|4:G} = \max\left\{\min(T_1,T_2),\ \min(T_2,T_3),\ \min(T_3,T_4)\right\}.$$
Let us assume that the component lifetimes are i.i.d. and follow the log-logistic distribution (known as the Fisk distribution in economics), whose pdf is
$$f(t) = \frac{\gamma\, t^{\gamma-1}}{\left(1+t^{\gamma}\right)^{2}}, \quad t>0,\ \gamma>0.$$
After appropriate algebraic manipulation, the following identity is obtained:
$$f\!\left(F^{-1}(w)\right) = \gamma\, w^{\frac{\gamma-1}{\gamma}}\,(1-w)^{\frac{\gamma+1}{\gamma}}, \quad \text{for } 0<w<1.$$
Substituting into (11), we obtain the following expression for the Tsallis entropy:
$$H_\beta\!\left(T_{2|4:G}\right) = \frac{1}{1-\beta}\left(\gamma^{\beta-1}\int_0^1 k^{\beta}_{2|4:G}(w)\, w^{\frac{(\gamma-1)(\beta-1)}{\gamma}}\,(1-w)^{\frac{(\gamma+1)(\beta-1)}{\gamma}}\,dw - 1\right), \quad \text{for all } \beta\in\Theta. \tag{13}$$
Due to the difficulty of obtaining a closed-form expression, numerical methods are employed to examine the relationship between $H_\beta\!\left(T_{2|4:G}\right)$ and the parameters $\beta$ and $\gamma$. The analysis focuses on the consecutive 2-out-of-4:G system and is carried out for values $\beta>1$ and $\gamma>1$, as the integral diverges for $0<\beta<1$ and $0<\gamma<1$.
As illustrated in Figure 1, the Tsallis entropy decreases as the parameters β and γ increase. This trend reflects the sensitivity of the entropy measure to these parameters and underscores their substantial impact on the system's information-theoretic characteristics.
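The monotonicity visible in Figure 1 can be reproduced numerically from expression (13); the sketch below evaluates the integral by a midpoint rule (the implementation details are ours):

```python
def k24(w):
    # k_{2|4:G}(w) from eq. (10) with r = 2, n = 4
    return 6*(1 - w) - 6*(1 - w)**2

def H_24_loglogistic(beta, gamma, m=20000):
    # Tsallis entropy of T_{2|4:G} with log-logistic components
    s = 0.0
    for i in range(m):
        w = (i + 0.5) / m
        fq = gamma * w**((gamma - 1)/gamma) * (1 - w)**((gamma + 1)/gamma)
        s += k24(w)**beta * fq**(beta - 1)
    return (s/m - 1.0) / (1.0 - beta)

# Tsallis entropy decreases as beta and gamma increase (cf. Figure 1)
assert H_24_loglogistic(2.0, 2.0) > H_24_loglogistic(3.0, 2.0)
assert H_24_loglogistic(2.0, 2.0) > H_24_loglogistic(2.0, 3.0)
```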
Definition 2.1.
Let $T_1$ and $T_2$ be two absolutely continuous nonnegative random variables with pdfs $f_1$ and $f_2$, cdfs $F_1$ and $F_2$, and survival functions $S_1$ and $S_2$, respectively. Then: (i) $T_1$ is smaller than or equal to $T_2$ in the dispersive order, denoted by $T_1 \le_{disp} T_2$, if and only if $f_1\!\left(F_1^{-1}(w)\right) \ge f_2\!\left(F_2^{-1}(w)\right)$ for all $0<w<1$; (ii) $T_1$ is smaller than $T_2$ in the hazard rate order, denoted by $T_1 \le_{hr} T_2$, if $S_2(t)/S_1(t)$ is increasing in $t>0$; (iii) $T_1$ is said to exhibit the decreasing failure rate (DFR) property if $f_1(t)/S_1(t)$ is decreasing in $t>0$.
Additional details on these ordering concepts can be found in the comprehensive work of Shaked and Shanthikumar [23]. The following theorem is a direct consequence of the expression given in Equation (11).
Theorem 2.3.
Let $T^{1}_{r|n:G}$ and $T^{2}_{r|n:G}$ be the lifetimes of two consecutive r-out-of-n:G systems, each consisting of $n$ i.i.d. components having cdfs $F_1$ and $F_2$, respectively. If $T_1 \le_{disp} T_2$, then
$$H_\beta\!\left(T^{1}_{r|n:G}\right) \le H_\beta\!\left(T^{2}_{r|n:G}\right), \quad \text{for all } \beta\in\Theta.$$
Proof. 
If $T_1 \le_{disp} T_2$, then $f_1\!\left(F_1^{-1}(w)\right) \ge f_2\!\left(F_2^{-1}(w)\right)$ for all $0<w<1$, and hence $f_1^{\beta-1}\!\left(F_1^{-1}(w)\right) \le (\ge)\ f_2^{\beta-1}\!\left(F_2^{-1}(w)\right)$ for $0<\beta<1$ ($\beta>1$). Therefore,
$$(1-\beta)H_\beta\!\left(T^{1}_{r|n:G}\right) = \int_0^1 k^{\beta}_{r|n:G}(w)\, f_1^{\beta-1}\!\left(F_1^{-1}(w)\right)dw - 1 \ \le\ (\ge)\ \int_0^1 k^{\beta}_{r|n:G}(w)\, f_2^{\beta-1}\!\left(F_2^{-1}(w)\right)dw - 1 = (1-\beta)H_\beta\!\left(T^{2}_{r|n:G}\right).$$
Dividing both sides by $1-\beta$, which is positive (negative), yields $H_\beta\!\left(T^{1}_{r|n:G}\right) \le H_\beta\!\left(T^{2}_{r|n:G}\right)$ for all $\beta\in\Theta$, and this completes the proof.
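As an illustration of Theorem 2.3: for exponential components, a larger rate gives a distribution that is smaller in the dispersive order, since $f(F^{-1}(w)) = \lambda(1-w)$ is increasing in $\lambda$. The system entropies must then be ordered accordingly. A quick check for the 2-out-of-3:G system (the helper name is ours):

```python
def H_sys_exp(lam, beta=2.0, m=20000):
    # Tsallis entropy of T_{2|3:G} with Exp(lam) components, via eq. (11)
    s = 0.0
    for i in range(m):
        w = (i + 0.5) / m
        kw = 4*(1 - w) - 3*(1 - w)**2     # k_{2|3:G}(w)
        s += kw**beta * (lam*(1 - w))**(beta - 1)
    return (s/m - 1.0) / (1.0 - beta)

# Exp(2) <=_disp Exp(1), hence the entropy ordering predicted by Theorem 2.3
assert H_sys_exp(2.0) <= H_sys_exp(1.0)
```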
The following result rigorously demonstrates that, within the class of consecutive r-out-of-n:G systems composed of components with the DFR property, the series system yields the minimum Tsallis entropy.
Proposition 2.2.
Consider the lifetime $T_{r|n:G}$ of a consecutive r-out-of-n:G system comprising $n$ i.i.d. components that exhibit the DFR property. Then, for $2r\ge n$ and for all $\beta\in\Theta$:
  • (i) it holds that $H_\beta\!\left(T_{1:n}\right) \le H_\beta\!\left(T_{r|n:G}\right)$;
  • (ii) it holds that $H_\beta\!\left(T_{1:r}\right) \le H_\beta\!\left(T_{r|n:G}\right)$.
Proof. 
(i) It is easy to see that $T_{1:n} \le_{hr} T_{r|n:G}$. Furthermore, if $T$ exhibits the DFR property, then $T_{1:n}$ also possesses the DFR property. Due to Bagai and Kochar [24], it can be concluded that $T_{1:n} \le_{disp} T_{r|n:G}$, which immediately yields $H_\beta\!\left(T_{1:n}\right) \le H_\beta\!\left(T_{r|n:G}\right)$ by the quantile representation (2).
(ii) Based on Proposition 3.2 of Navarro and Eryilmaz [25], it can be inferred that $T_{1:r} \le_{hr} T_{r|n:G}$. Consequently, reasoning analogous to Part (i) yields the result.
A notable application of Equation (11) lies in comparing the Tsallis entropy of consecutive r-out-of-n:G systems whose components are independent but follow different lifetime distributions, as formalized in the following result.
Proposition 2.3.
Under the conditions of Theorem 2.3, if $H_\beta(T_1) \ge H_\beta(T_2)$ for all $\beta>0$, and $\inf_{w\in A_1} k_{r|n:G}(w) \ge \sup_{w\in A_2} k_{r|n:G}(w)$, where
$$A_1 = \left\{w\in[0,1] : f_1\!\left(F_1^{-1}(w)\right) \le f_2\!\left(F_2^{-1}(w)\right)\right\}, \qquad A_2 = \left\{w\in[0,1] : f_1\!\left(F_1^{-1}(w)\right) > f_2\!\left(F_2^{-1}(w)\right)\right\},$$
then $H_\beta\!\left(T^{1}_{r|n:G}\right) \ge H_\beta\!\left(T^{2}_{r|n:G}\right)$ for all $2r\ge n$ and $\beta\in\Theta$.
Proof. 
Given that $H_\beta(T_1) \ge H_\beta(T_2)$ for all $\beta>0$, Equation (2) implies
$$H_\beta(T_2) - H_\beta(T_1) = \frac{1}{1-\beta}\int_0^1 \zeta_\beta(w)\,dw \ \le\ 0, \tag{14}$$
where $\zeta_\beta(w) = f_2^{\beta-1}\!\left(F_2^{-1}(w)\right) - f_1^{\beta-1}\!\left(F_1^{-1}(w)\right)$, $0<w<1$. Assuming $\beta>1$, based on Equation (11), we have
$$\begin{aligned}
H_\beta\!\left(T^{2}_{r|n:G}\right) - H_\beta\!\left(T^{1}_{r|n:G}\right) &= \frac{1}{1-\beta}\int_0^1 k^{\beta}_{r|n:G}(w)\,\zeta_\beta(w)\,dw\\
&= \frac{1}{1-\beta}\int_{A_1} k^{\beta}_{r|n:G}(w)\,\zeta_\beta(w)\,dw + \frac{1}{1-\beta}\int_{A_2} k^{\beta}_{r|n:G}(w)\,\zeta_\beta(w)\,dw \qquad \left(\text{since } A_1\cup A_2 = [0,1]\right)\\
&\le \left(\inf_{w\in A_1} k_{r|n:G}(w)\right)^{\beta}\frac{1}{1-\beta}\int_{A_1}\zeta_\beta(w)\,dw + \left(\sup_{w\in A_2} k_{r|n:G}(w)\right)^{\beta}\frac{1}{1-\beta}\int_{A_2}\zeta_\beta(w)\,dw\\
&\le \left(\sup_{w\in A_2} k_{r|n:G}(w)\right)^{\beta}\frac{1}{1-\beta}\int_0^1 \zeta_\beta(w)\,dw \ \le\ 0.
\end{aligned}$$
The first inequality holds because $\zeta_\beta(w)\ge0$ on $A_1$ and $\zeta_\beta(w)<0$ on $A_2$ when $\beta>1$, while $1/(1-\beta)<0$; the second uses the condition $\inf_{A_1}k_{r|n:G} \ge \sup_{A_2}k_{r|n:G}$ together with $\int_{A_1}\zeta_\beta(w)\,dw\ge0$. The last inequality follows directly from (14). Consequently, $H_\beta\!\left(T^{1}_{r|n:G}\right) \ge H_\beta\!\left(T^{2}_{r|n:G}\right)$ for $2r\ge n$, which completes the proof for the case $\beta>1$. The proof for the case $0<\beta<1$ follows a similar argument.
The following example is provided to illustrate the application of the preceding proposition.
Example 2.2.
Assume two consecutive 2-out-of-3:G systems with lifetimes $T^{1}_{2|3:G} = \max\left\{\min(T_1,T_2),\ \min(T_2,T_3)\right\}$ and $T^{2}_{2|3:G} = \max\left\{\min(Z_1,Z_2),\ \min(Z_2,Z_3)\right\}$, where $T_1,T_2,T_3$ are i.i.d. component lifetimes with common cdf $F_1(t)=1-e^{-2t}$, $t>0$, and $Z_1,Z_2,Z_3$ are i.i.d. component lifetimes with common cdf $F_2(t)=1-e^{-6t}$, $t>0$. Since the exponential rate of the $Z_i$ exceeds that of the $T_i$, one can verify directly from (2) that $H_\beta(T)\ge H_\beta(Z)$ for all $\beta\in\Theta$. Additionally, since $f_1\!\left(F_1^{-1}(w)\right) = 2(1-w) \le 6(1-w) = f_2\!\left(F_2^{-1}(w)\right)$, the set $A_2$ is at most the single point $\{1\}$, so $\inf_{A_1}k_{2|3:G}(w) = \sup_{A_2}k_{2|3:G}(w) = 0$ and the condition of Proposition 2.3 holds. Thus Proposition 2.3 implies that $H_\beta\!\left(T^{1}_{2|3:G}\right) \ge H_\beta\!\left(T^{2}_{2|3:G}\right)$.

2.3. Some Bounds

In the absence of closed-form expressions for the Tsallis entropy of consecutive systems, particularly when dealing with diverse lifetime distributions or systems comprising a large number of components, bounding techniques become a practical necessity for approximating entropy behavior over the system's lifetime. To address this challenge, the present study examines the role of analytical bounds in characterizing the Tsallis entropy of consecutive r-out-of-n:G systems. Toward this goal, we introduce the following theorem, which provides a lower bound on the Tsallis entropy for such systems. This bound yields valuable insights into entropy dynamics in realistic settings and contributes to a deeper understanding of the underlying system structure.
Lemma 2.1.
Consider a nonnegative continuous random variable $T$ with pdf $f$ and cdf $F$, and suppose the mode $m$ of $f$ satisfies $M = f(m) < \infty$, so that $f(t)\le M$ for all $t\ge0$. Then, for $2r\ge n$, we have
$$H_\beta\!\left(T_{r|n:G}\right) \ \ge\ M^{\beta-1}\, H_\beta\!\left(W_{r|n:G}\right) + \frac{1}{1-\beta}\left(M^{\beta-1}-1\right), \quad \text{for all } \beta\in\Theta.$$
Proof. 
By noting that $f\!\left(F^{-1}(w)\right) \le M$ for $0<w<1$, we have $f^{\beta-1}\!\left(F^{-1}(w)\right) \ge M^{\beta-1}$ for $0<\beta<1$. Since $1-\beta>0$, this implies that
$$\begin{aligned}
H_\beta\!\left(T_{r|n:G}\right) &= \frac{1}{1-\beta}\left(\int_0^1 k^{\beta}_{r|n:G}(w)\, f^{\beta-1}\!\left(F^{-1}(w)\right)dw - 1\right)\\
&\ge \frac{1}{1-\beta}\left(M^{\beta-1}\int_0^1 k^{\beta}_{r|n:G}(w)\,dw - 1\right)\\
&= \frac{M^{\beta-1}}{1-\beta}\left(\int_0^1 k^{\beta}_{r|n:G}(w)\,dw - 1\right) + \frac{M^{\beta-1}-1}{1-\beta}\\
&= M^{\beta-1}\, H_\beta\!\left(W_{r|n:G}\right) + \frac{1}{1-\beta}\left(M^{\beta-1}-1\right),
\end{aligned}$$
and hence the result. When $\beta>1$, we have $f^{\beta-1}\!\left(F^{-1}(w)\right) \le M^{\beta-1}$ for $0<w<1$; since $1-\beta<0$, similar arguments give the same conclusion.
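Lemma 2.1 is easy to exercise numerically for a distribution whose density is bounded. Below, the 2-out-of-3:G system with $\mathrm{Exp}(\lambda)$ components is used, for which $M=\lambda$ (attained at $t=0$); the sketch compares the exact entropy with the lower bound (a check under these assumptions, with our own helper names):

```python
def k23(w):
    return 4*(1 - w) - 3*(1 - w)**2       # k_{2|3:G}(w), eq. (10)

m, lam, beta = 20000, 2.0, 2.0
ws = [(i + 0.5)/m for i in range(m)]
I_T = sum(k23(w)**beta * (lam*(1 - w))**(beta - 1) for w in ws) / m
I_W = sum(k23(w)**beta for w in ws) / m
H_T = (I_T - 1.0)/(1.0 - beta)            # system entropy, Exp(lam) components
H_W = (I_W - 1.0)/(1.0 - beta)            # system entropy under the uniform baseline
M = lam                                   # maximal density value of Exp(lam)
lower = M**(beta - 1)*H_W + (M**(beta - 1) - 1.0)/(1.0 - beta)
assert H_T >= lower                       # Lemma 2.1
```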
The lemma above provides a lower bound for the Tsallis entropy of $T_{r|n:G}$ that depends only on the Tsallis entropy of the consecutive r-out-of-n:G system under the uniform distribution, $W_{r|n:G}$, and on the mode value $M$ of the underlying component lifetime distribution. The following example illustrates its use.
Example 2.3.
Assume a linear consecutive r-out-of-n:G system with lifetime
$$T_{r|n:G} = \max\left\{T_{[1:r]},\ T_{[2:r+1]},\ \ldots,\ T_{[n-r+1:n]}\right\},$$
where $T_{[j:m]} = \min\{T_j,\ldots,T_m\}$ for $1\le j<m\le n$. Let further the lifetimes of the components be i.i.d. with a common mixture of two Pareto distributions with parameters $\beta_1$ and $\beta_2$, having pdf
$$f(t) = \theta\beta_1 t^{-\beta_1-1} + (1-\theta)\beta_2 t^{-\beta_2-1}, \quad t\ge1,\ 0\le\theta\le1,\ \beta_1>\beta_2>0.$$
Given that this density is decreasing with mode $m=1$, the mode value is $M = f(1) = \theta\beta_1 + (1-\theta)\beta_2$. Consequently, from Lemma 2.1, we get
$$H_\beta\!\left(T_{r|n:G}\right) \ \ge\ \left[\theta\beta_1+(1-\theta)\beta_2\right]^{\beta-1} H_\beta\!\left(W_{r|n:G}\right) + \frac{1}{1-\beta}\left(\left[\theta\beta_1+(1-\theta)\beta_2\right]^{\beta-1}-1\right), \quad \text{for all } \beta\in\Theta.$$
In the following theorem, we derive bounds for the Tsallis entropy of consecutive r-out-of-n:G systems by leveraging the Tsallis entropy of the individual components.
Theorem 2.4.
When $\beta>1$ ($0<\beta<1$), we have
$$H_\beta\!\left(T_{r|n:G}\right) \ \ge\ (\le)\ \ k^{\beta}_{r|n:G}(w^{*})\, H_\beta(T) + \frac{1}{1-\beta}\left(k^{\beta}_{r|n:G}(w^{*})-1\right),$$
where $w^{*} = \frac{2n-3r+1}{(r+1)(n-r)}$ is the mode of $k_{r|n:G}(w)$.
Proof. 
The mode of $k_{r|n:G}(w)$ is attained at $w^{*} = \frac{2n-3r+1}{(r+1)(n-r)}$, so that $k_{r|n:G}(w) \le k_{r|n:G}(w^{*})$ for $0<w<1$. Therefore, for $\beta>1$ ($0<\beta<1$), noting that $\int_0^1 f^{\beta-1}\!\left(F^{-1}(w)\right)dw = \int_0^\infty f^{\beta}(t)\,dt$, we can conclude that:
$$\begin{aligned}
H_\beta\!\left(T_{r|n:G}\right) &= \frac{1}{1-\beta}\left(\int_0^1 k^{\beta}_{r|n:G}(w)\, f^{\beta-1}\!\left(F^{-1}(w)\right)dw - 1\right)\\
&\ge\ (\le)\ \frac{1}{1-\beta}\left(k^{\beta}_{r|n:G}(w^{*})\int_0^1 f^{\beta-1}\!\left(F^{-1}(w)\right)dw - 1\right)\\
&= \frac{k^{\beta}_{r|n:G}(w^{*})}{1-\beta}\left(\int_0^1 f^{\beta-1}\!\left(F^{-1}(w)\right)dw - 1\right) + \frac{k^{\beta}_{r|n:G}(w^{*})-1}{1-\beta}\\
&= k^{\beta}_{r|n:G}(w^{*})\, H_\beta(T) + \frac{1}{1-\beta}\left(k^{\beta}_{r|n:G}(w^{*})-1\right),
\end{aligned}$$
and hence the theorem.
and hence the theorem.
The following example illustrates the bound presented in Theorem 2.4 by applying it to a consecutive r-out-of-n:G system.
Example 2.4.
Let us consider a linear consecutive 10-out-of-18:G system with lifetime $T_{10|18:G} = \max\left\{T_{[1:10]}, T_{[2:11]}, \ldots, T_{[9:18]}\right\}$, where $T_{[j:m]} = \min\{T_j,\ldots,T_m\}$ for $1\le j<m\le 18$. To conduct the analysis, we assume that the lifetimes of the individual components are i.i.d. according to a common standard exponential distribution. A simple verification shows that the optimal value is $w^{*} = 7/88 \approx 0.08$, with corresponding value $k_{10|18:G}(w^{*}) \approx 4.268$. Utilizing Theorem 2.4, we can write
$$H_\beta\!\left(T_{10|18:G}\right) \ \ge\ (\le)\ \ (4.268)^{\beta}\, H_\beta(T) + \frac{(4.268)^{\beta}-1}{1-\beta}, \quad \text{for all } \beta>1\ (0<\beta<1).$$
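The quantities in Example 2.4 are straightforward to reproduce. The sketch below recomputes $w^{*}$ and $k_{10|18:G}(w^{*})$ and verifies the resulting lower bound at $\beta=2$ against a direct numerical evaluation of (11) (the midpoint-rule implementation is ours):

```python
r, n, beta = 10, 18, 2.0
A, B = r*(n - r + 1), (r + 1)*(n - r)

def k(w):
    # k_{10|18:G}(w) from eq. (10)
    return A*(1 - w)**(r - 1) - B*(1 - w)**r

w_star = (2*n - 3*r + 1) / ((r + 1)*(n - r))
k_star = k(w_star)

m = 20000
I = sum(k((i + 0.5)/m)**beta * (1 - (i + 0.5)/m)**(beta - 1) for i in range(m)) / m
H_sys = (I - 1.0) / (1.0 - beta)          # system entropy, standard exponential components

H_comp = (1.0/beta - 1.0) / (1.0 - beta)  # Tsallis entropy of Exp(1)
lower = k_star**beta * H_comp + (k_star**beta - 1.0)/(1.0 - beta)

assert abs(w_star - 0.0795) < 1e-3 and abs(k_star - 4.268) < 1e-2
assert H_sys >= lower                     # Theorem 2.4 lower bound, beta > 1
```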
In the next result, we provide bounds for consecutive r-out-of-n:G systems in terms of the hazard rate function of the component lifetimes.
Proposition 2.4.
Let $T_i$, $i=1,2,\ldots,n$, be the i.i.d. component lifetimes of a consecutive r-out-of-n:G system with lifetime $T_{r|n:G}$, and let $\lambda(t)$ denote the common hazard rate function of the components. If $2r\ge n$ and $\beta\in\Theta$, then
$$\frac{1}{1-\beta}\left(\frac{r^{\beta-1}}{\beta}\,E\!\left[\lambda^{\beta-1}\!\left(\widetilde{T}_{r|n:G,\beta}\right)\right]-1\right) \ \le\ H_\beta\!\left(T_{r|n:G}\right) \ \le\ \frac{1}{1-\beta}\left(\frac{(2r-n)^{\beta-1}}{\beta}\,E\!\left[\lambda^{\beta-1}\!\left(\widetilde{T}_{r|n:G,\beta}\right)\right]-1\right),$$
where $\widetilde{T}_{r|n:G,\beta}$ has the pdf $f_{r|n:G,\beta}(t) = \beta\, f_{r|n:G}(t)\, S^{\beta-1}_{r|n:G}(t)$, for $t>0$.
Proof. 
It is easy to see that the hazard rate function of $T_{r|n:G}$ can be expressed as $\lambda_{r|n:G}(t) = \psi_{r,n}(S(t))\,\lambda(t)$, where
$$\psi_{r,n}(z) = \frac{r(n-r+1) - (r+1)(n-r)z}{(n-r+1) - (n-r)z}, \quad 0<z<1.$$
Since $\psi'_{r,n}(z)<0$ for $2r\ge n$ and $0<z<1$, it follows that $\psi_{r,n}(z)$ is a monotonically decreasing function of $z$. Given that $\psi_{r,n}(0)=r$ and $\psi_{r,n}(1)=2r-n$, we have $2r-n \le \psi_{r,n}(S(t)) \le r$, which implies that $(2r-n)\lambda(t) \le \lambda_{r|n:G}(t) \le r\lambda(t)$ for $t>0$. Applying the representation (4) to $T_{r|n:G}$ and substituting these pointwise bounds completes the proof; the direction of the intermediate inequalities for $\beta>1$ and $0<\beta<1$ is reversed by the factor $1/(1-\beta)$, yielding the same two-sided bound in both cases.
Let us consider an illustrative example that demonstrates the application of the preceding proposition.
Example 2.5.
Consider a linear consecutive 2-out-of-3:G system with lifetime $T_{2|3:G} = \max\left\{\min(T_1,T_2),\ \min(T_2,T_3)\right\}$, where the component lifetimes $T_i$ are i.i.d. exponential with cdf $F(t)=1-e^{-\lambda t}$, $t>0$, and take $\beta=2$. The exponential distribution has a constant hazard rate, $\lambda(t)=\lambda$, so $E\!\left[\lambda^{\beta-1}\!\left(\widetilde{T}_{2|3:G,2}\right)\right] = \lambda$. Applying Proposition 2.4 yields the bounds $1-\lambda \le H_2\!\left(T_{2|3:G}\right) \le 1-0.5\lambda$. Based on (11), one can compute the exact value as $H_2\!\left(T_{2|3:G}\right) = 1-0.7\lambda$, which falls within the bounds.
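A short numerical check of this setting with $\beta=2$ and $\lambda=1$ (the midpoint-rule implementation is ours): the two bounds of Proposition 2.4 evaluate to $1-\lambda$ and $1-0.5\lambda$, and the exact entropy to $1-0.7\lambda$.

```python
m, lam, beta = 20000, 1.0, 2.0
I = sum((4*z - 3*z**2)**beta * (lam*z)**(beta - 1)
        for z in ((i + 0.5)/m for i in range(m))) / m   # z plays the role of 1 - w
H = (I - 1.0) / (1.0 - beta)
lower = (1.0/(1.0 - beta)) * (2**(beta - 1)/beta * lam**(beta - 1) - 1.0)   # r = 2
upper = (1.0/(1.0 - beta)) * (1**(beta - 1)/beta * lam**(beta - 1) - 1.0)   # 2r - n = 1
assert lower <= H <= upper
assert abs(H - (1.0 - 0.7*lam)) < 1e-3
```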
The following theorem holds under the assumption that $E\!\left[\lambda^{2(\beta-1)}(T)\right]$ is finite.
Theorem 2.5.
Under the conditions of Proposition 2.4, and assuming that $E\!\left[\lambda^{2(\beta-1)}(T)\right]<\infty$, for $2r\ge n$ and $\beta>1$ ($0<\beta<1$) it holds that
$$H_\beta\!\left(T_{r|n:G}\right) \ \ge\ (\le)\ \frac{1}{1-\beta}\left(\sqrt{\Omega_{r,n,\beta}\; E\!\left[\lambda^{2(\beta-1)}(T)\right]}\ -1\right), \qquad \Omega_{r,n,\beta} = \int_0^1 (1-w)^{2(\beta-1)}\, k^{2\beta}_{r|n:G}(w)\,dw.$$
Proof. 
The pdf of $T_{r|n:G}$ can be rewritten as $f_{r|n:G}(t) = f(t)\,k_{r|n:G}(F(t))$, while its failure rate function is given by
$$\lambda_{r|n:G}(t) = \frac{\lambda(t)\,S(t)\,k_{r|n:G}(F(t))}{S_{r|n:G}(t)}, \quad \text{for } t>0.$$
Consequently, by (4) and the Cauchy-Schwarz inequality, we obtain
$$\begin{aligned}
\int_0^\infty \lambda^{\beta-1}_{r|n:G}(t)\, f_{r|n:G}(t)\, S^{\beta-1}_{r|n:G}(t)\,dt &= \int_0^\infty \lambda^{\beta-1}(t)\, S^{\beta-1}(t)\, k^{\beta}_{r|n:G}(F(t))\, f(t)\,dt\\
&\le \left(\int_0^\infty \lambda^{2(\beta-1)}(t)\, f(t)\,dt\right)^{1/2}\left(\int_0^\infty S^{2(\beta-1)}(t)\, k^{2\beta}_{r|n:G}(F(t))\, f(t)\,dt\right)^{1/2}\\
&= \left(E\!\left[\lambda^{2(\beta-1)}(T)\right]\right)^{1/2}\left(\int_0^1 (1-w)^{2(\beta-1)}\, k^{2\beta}_{r|n:G}(w)\,dw\right)^{1/2}.
\end{aligned}$$
The last equality follows from the change of variable $w=F(t)$. Since $\int_0^\infty \lambda^{\beta-1}_{r|n:G}(t)\, f_{r|n:G}(t)\, S^{\beta-1}_{r|n:G}(t)\,dt = \int_0^\infty f^{\beta}_{r|n:G}(t)\,dt = 1+(1-\beta)H_\beta\!\left(T_{r|n:G}\right)$, dividing by $1-\beta$ completes the proof.
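Since the factor $\Omega_{r,n,\beta}$ above was reconstructed from the proof, a numerical spot check is prudent. For the 2-out-of-3:G system with standard exponential components and $\beta=2$ (so that $E[\lambda^{2(\beta-1)}(T)]=1$), the sketch below confirms the lower bound (implementation details are ours):

```python
m, beta = 20000, 2.0
k = lambda w: 4*(1 - w) - 3*(1 - w)**2    # k_{2|3:G}(w)
ws = [(i + 0.5)/m for i in range(m)]
Omega = sum((1 - w)**(2*(beta - 1)) * k(w)**(2*beta) for w in ws) / m
H = (sum(k(w)**beta * (1 - w)**(beta - 1) for w in ws)/m - 1.0)/(1.0 - beta)
bound = ((Omega * 1.0)**0.5 - 1.0) / (1.0 - beta)   # E[lambda^{2(beta-1)}(T)] = 1 for Exp(1)
assert H >= bound                          # Theorem 2.5, beta > 1
```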

3. Characterization Results

In this section, we investigate characterization results for consecutive systems using Tsallis entropy. To illustrate these results, we focus on a specific system type: a linear consecutive (n-i)-out-of-n:G system under the condition $n\ge 2i$, where $i = 0,1,\ldots,\lfloor n/2\rfloor$. To this end, we begin by revisiting a lemma derived from the Müntz-Szász Theorem (as referenced in Kamps [26]), which is commonly used in moment-based characterization theorems.
Lemma 3.1.
For an integrable function $\psi(x)$ on the finite interval $(a,b)$, if $\int_a^b x^{n_j}\,\psi(x)\,dx = 0$ for all $j\ge1$, then $\psi(x)=0$ for almost all $x\in(a,b)$, where $\{n_j,\ j\ge1\}$ is a strictly increasing sequence of positive integers satisfying $\sum_{j=1}^{\infty} 1/n_j = \infty$.
It is worth pointing out that Lemma 3.1 rests on a well-established fact of functional analysis: the set $\{x^{n_1}, x^{n_2}, \ldots\}$, with $1\le n_1<n_2<\cdots$, constitutes a complete sequence. Notably, Hwang and Lin [27] expanded the scope of the Müntz-Szász theorem to the functions $\phi^{n_j}(x)$, $n_j\ge1$, where $\phi(x)$ is both absolutely continuous and monotonic over the interval $(a,b)$.
Theorem 3.1.
Consider two consecutive (n-i)-out-of-n:G systems with lifetimes $T^{1}_{n-i|n:G}$ and $T^{2}_{n-i|n:G}$, each composed of $n$ i.i.d. components with cdfs $F_1$ and $F_2$ and pdfs $f_1$ and $f_2$, respectively. Then $F_1$ and $F_2$ belong to the same family of distributions, but for a change in location, if and only if, for a fixed $i\ge0$,
$$H_\beta\!\left(T^{1}_{n-i|n:G}\right) = H_\beta\!\left(T^{2}_{n-i|n:G}\right), \quad \text{for all } n\ge 2i \text{ and } \beta\in\Theta.$$
Proof. 
For the necessity part, since $F_1$ and $F_2$ belong to the same family of distributions, but for a change in location, we have $F_2(y) = F_1(y-a)$ for all $y\ge a$ and some $a\in\mathbb{R}$. Then, it is clear that
$$H_\beta\!\left(T^{2}_{n-i|n:G}\right) = \frac{1}{1-\beta}\left(\int_a^\infty f^{\beta}_{2,n-i|n:G}(y)\,dy - 1\right) = \frac{1}{1-\beta}\left(\int_a^\infty f^{\beta}_{1,n-i|n:G}(y-a)\,dy - 1\right) = \frac{1}{1-\beta}\left(\int_0^\infty f^{\beta}_{1,n-i|n:G}(x)\,dx - 1\right) = H_\beta\!\left(T^{1}_{n-i|n:G}\right),$$
taking $x=y-a$.
To establish the sufficiency part, we first observe that for a consecutive (n-i)-out-of-n:G system, Equation (10) gives
$$k_{n-i|n:G}(w) = (n-i)(i+1)(1-w)^{n-i-1} - i(n-i+1)(1-w)^{n-i}, \quad 0<w<1, \tag{18}$$
where $n\ge 2i$ and $i$ ranges from $0$ to $\lfloor n/2\rfloor$. Given the assumption that $H_\beta\!\left(T^{1}_{n-i|n:G}\right) = H_\beta\!\left(T^{2}_{n-i|n:G}\right)$, we can write
$$\frac{1}{1-\beta}\left(\int_0^1 k^{\beta}_{n-i|n:G}(w)\, f_1^{\beta-1}\!\left(F_1^{-1}(w)\right)dw - 1\right) = \frac{1}{1-\beta}\left(\int_0^1 k^{\beta}_{n-i|n:G}(w)\, f_2^{\beta-1}\!\left(F_2^{-1}(w)\right)dw - 1\right),$$
or equivalently
$$\int_0^1 (1-w)^{(n-2i)\beta}\,\phi_{i,n,\beta}(w)\left[f_1^{\beta-1}\!\left(F_1^{-1}(w)\right) - f_2^{\beta-1}\!\left(F_2^{-1}(w)\right)\right]dw = 0,$$
where
$$\phi_{i,n,\beta}(w) = (1-w)^{\beta(i-1)}\left[(n-i)(i+1) - i(n-i+1)(1-w)\right]^{\beta}, \quad \text{for } 0<w<1.$$
By applying Lemma 3.1 (in the extended form of Hwang and Lin [27]) with the function
$$\psi(w) = \phi_{i,n,\beta}(w)\left[f_1^{\beta-1}\!\left(F_1^{-1}(w)\right) - f_2^{\beta-1}\!\left(F_2^{-1}(w)\right)\right],$$
and considering the complete sequence $\left\{(1-w)^{(n-2i)\beta},\ n\ge 2i\right\}$, one can conclude that
$$f_1^{\beta-1}\!\left(F_1^{-1}(w)\right) = f_2^{\beta-1}\!\left(F_2^{-1}(w)\right), \quad \text{a.e. } w\in(0,1),$$
that is, $f_1\!\left(F_1^{-1}(w)\right) = f_2\!\left(F_2^{-1}(w)\right)$ for almost all $0<w<1$.
Consequently, $F_1$ and $F_2$ are part of the same distribution family, differing only in a location shift.
Recognizing that a consecutive n-out-of-n:G system reduces to a series system, the following corollary characterizes its Tsallis entropy.
Corollary 3.1.
Let $T^{1}_{n|n:G}$ and $T^{2}_{n|n:G}$ be two series systems whose components have pdfs $f_1(t)$ and $f_2(t)$ and cdfs $F_1(t)$ and $F_2(t)$, respectively. Then $F_1$ and $F_2$ belong to the same family of distributions, but for a change in location, if and only if
$$H_\beta\!\left(T^{1}_{n|n:G}\right) = H_\beta\!\left(T^{2}_{n|n:G}\right), \quad \text{for all } n\ge1 \text{ and } \beta\in\Theta.$$
Another helpful characterization is provided in the following theorem.
Theorem 3.2.
Under the conditions of Theorem 3.1, $F_1$ and $F_2$ belong to the same family of distributions, but for a change in location and scale, if and only if, for a fixed $i$,
$$\frac{H_\beta\!\left(T^{1}_{n-i|n:G}\right)}{H_\beta(T_1)} = \frac{H_\beta\!\left(T^{2}_{n-i|n:G}\right)}{H_\beta(T_2)}, \quad \text{for all } n\ge 2i \text{ and } \beta\in\Theta. \tag{22}$$
Proof. 
The necessity is straightforward, so we establish sufficiency. Leveraging Eqs. (6) and (18), we can derive
$$\frac{H_\beta\big(T_{n-i|n:G}^{(1)}\big)}{H_\beta(T_1)}=\frac{1}{(1-\beta)H_\beta(T_1)}\left[\int_0^1 k_{n-i|n:G}^{\beta}(w)\,f_1^{\beta-1}\big(F_1^{-1}(w)\big)\,dw-1\right].$$
An analogous expression holds for $H_\beta\big(T_{n-i|n:G}^{(2)}\big)/H_\beta(T_2)$. If relation (22) holds for two cdfs $F_1$ and $F_2$, then we can infer from Equation (23) that
$$\int_0^1 k_{n-i|n:G}^{\beta}(w)\,\frac{f_1^{\beta-1}\big(F_1^{-1}(w)\big)}{H_\beta(T_1)}\,dw=\int_0^1 k_{n-i|n:G}^{\beta}(w)\,\frac{f_2^{\beta-1}\big(F_2^{-1}(w)\big)}{H_\beta(T_2)}\,dw.$$
Setting $c=H_\beta(T_2)/H_\beta(T_1)$ and using arguments similar to those in the proof of Theorem 3.1, we can write
$$\int_0^1 (1-w)^{(n-2i)\beta}\,\phi_{i,n,\beta}(w)\left[c\,f_1^{\beta-1}\big(F_1^{-1}(w)\big)-f_2^{\beta-1}\big(F_2^{-1}(w)\big)\right]dw=0.$$
The proof is then completed by the same reasoning as in Theorem 3.1.
By applying Theorem 3.2, we derive the following corollary.
Corollary 3.2.
Under the assumptions of Corollary 3.1, $F_1$ and $F_2$ belong to the same family of distributions, up to a change in location and scale, if and only if
$$\frac{H_\beta\big(T_{n|n:G}^{(1)}\big)}{H_\beta(T_1)}=\frac{H_\beta\big(T_{n|n:G}^{(2)}\big)}{H_\beta(T_2)},\quad \text{for all } n\ge 1 \text{ and } \beta\in\Theta.$$
The exponential distribution is characterized in the following theorem through the Tsallis entropy of consecutive r-out-of-n:G systems. This result serves as the theoretical basis for a newly developed goodness-of-fit test for the exponential distribution, applicable to a wide range of datasets. To establish it, we first recall the lower incomplete beta function, defined as
$$B(t;a,b)=\int_0^t x^{a-1}(1-x)^{b-1}\,dx,\qquad 0<t\le 1,$$
where $a$ and $b$ are positive real numbers. When $t=1$, this expression reduces to the complete beta function $B(a,b)$. We now proceed to present the main result.
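For readers who want to sanity-check this definition numerically, the following sketch (our own midpoint-rule helper, not from the paper) evaluates $B(t;a,b)$ and confirms that $t=1$ recovers the complete beta function $B(a,b)=\Gamma(a)\Gamma(b)/\Gamma(a+b)$.

```python
import math

def lower_inc_beta(t, a, b, steps=100_000):
    # B(t; a, b) = integral of x^(a-1) (1-x)^(b-1) over [0, t], midpoint rule
    # (the midpoint rule sidesteps the integrable endpoint singularities
    #  that appear when a < 1 or b < 1)
    h = t / steps
    return sum(((j + 0.5) * h) ** (a - 1) * (1 - (j + 0.5) * h) ** (b - 1)
               for j in range(steps)) * h

a, b = 4.0, 3.0
complete = math.gamma(a) * math.gamma(b) / math.gamma(a + b)  # B(4, 3) = 1/60
print(round(lower_inc_beta(1.0, a, b), 8), round(complete, 8))  # 0.01666667 0.01666667
```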
Theorem 3.3.
Let $T_{n-i|n:G}$ be the lifetime of a consecutive $(n-i)$-out-of-$n$:G system with $n$ i.i.d. component lifetimes having pdf $f$ and cdf $F$. Then $T$ has an exponential distribution with parameter $\lambda$ if and only if, for a fixed $i\ge 0$,
$$H_\beta\big(T_{n-i|n:G}\big)=\frac{[(n-i)(i+1)]^{\beta(n-i+1)}}{[i(n-i+1)]^{\beta(n-i)}}\,B\!\left(\frac{i(n-i+1)}{(n-i)(i+1)};\,\beta(n-i),\,\beta+1\right)\left[\beta H_\beta(T)+\frac{\beta}{1-\beta}\right]-\frac{1}{1-\beta},$$
for all $n\ge 2i$ and $\beta\in\Theta$.
Proof. 
Given an exponentially distributed random variable $T$, its Tsallis entropy, computed directly from (6), is $H_\beta(T)=\dfrac{\lambda^{\beta-1}-\beta}{\beta(1-\beta)}$. Furthermore, since $f\big(F^{-1}(w)\big)=\lambda(1-w)$, an application of Eq. (11) yields
$$H_\beta\big(T_{n-i|n:G}\big)=\frac{1}{1-\beta}\left[\int_0^1 k_{n-i|n:G}^{\beta}(w)\,f^{\beta-1}\big(F^{-1}(w)\big)\,dw-1\right]=\frac{\lambda^{\beta-1}}{1-\beta}\int_0^1 k_{n-i|n:G}^{\beta}(w)(1-w)^{\beta-1}\,dw-\frac{1}{1-\beta}=\left[\beta H_\beta(T)+\frac{\beta}{1-\beta}\right]\int_0^1 k_{n-i|n:G}^{\beta}(w)(1-w)^{\beta-1}\,dw-\frac{1}{1-\beta},$$
for $\beta>0$. To evaluate the remaining integral, set $A=(n-i)(i+1)$ and $B=i(n-i+1)$. Upon recalling (10), it holds that
$$\int_0^1 k_{n-i|n:G}^{\beta}(w)(1-w)^{\beta-1}\,dw=\int_0^1 (1-w)^{\beta(n-i)-1}\big[A-B(1-w)\big]^{\beta}\,dw=A^{\beta}\int_0^1 z^{\beta(n-i)-1}\left(1-\frac{B}{A}z\right)^{\beta}dz\quad(\text{taking } z=1-w)$$
$$=\frac{A^{\beta(n-i+1)}}{B^{\beta(n-i)}}\int_0^{B/A} u^{\beta(n-i)-1}(1-u)^{\beta}\,du\quad\Big(\text{taking } u=\tfrac{B}{A}z\Big)=\frac{A^{\beta(n-i+1)}}{B^{\beta(n-i)}}\,B\!\left(\frac{B}{A};\,\beta(n-i),\,\beta+1\right),
$$
which establishes necessity. To prove sufficiency, assume that Equation (25) holds for a fixed value of $i$, and set $D=\beta H_\beta(T)+\frac{\beta}{1-\beta}$. Following the proof of Theorem 3.1 and utilizing the result in Equation (26), we obtain the relation
$$\frac{1}{1-\beta}\int_0^1 k_{n-i|n:G}^{\beta}(w)\,f^{\beta-1}\big(F^{-1}(w)\big)\,dw=D\int_0^1 k_{n-i|n:G}^{\beta}(w)(1-w)^{\beta-1}\,dw,$$
which is equivalent to
$$\int_0^1 k_{n-i|n:G}^{\beta}(w)\left[f^{\beta-1}\big(F^{-1}(w)\big)-(1-\beta)D(1-w)^{\beta-1}\right]dw=0,$$
where $k_{n-i|n:G}(w)$ is defined in (18). Thus, it holds that
$$\int_0^1 (1-w)^{(n-2i)\beta}\,\phi_{i,n,\beta}(w)\left[f^{\beta-1}\big(F^{-1}(w)\big)-(1-\beta)D(1-w)^{\beta-1}\right]dw=0,$$
where $\phi_{i,n,\beta}(w)$ is defined in (21). Applying Lemma 3.1 to the function
$$\psi(w)=\phi_{i,n,\beta}(w)\left[f^{\beta-1}\big(F^{-1}(w)\big)-(1-\beta)D(1-w)^{\beta-1}\right],$$
and utilizing the complete sequence $\big\{(1-w)^{(n-2i)\beta},\ n\ge 2i\big\}$, we can deduce that
$$f\big(F^{-1}(w)\big)=\big[(1-\beta)D\big]^{\frac{1}{\beta-1}}(1-w),\quad \text{a.e. } w\in(0,1).$$
This implies that
$$\frac{dF^{-1}(w)}{dw}=\frac{1}{f\big(F^{-1}(w)\big)}=\frac{1}{\big[(1-\beta)D\big]^{\frac{1}{\beta-1}}(1-w)}.$$
Solving this equation yields $F^{-1}(w)=-\log(1-w)\big/\big[(1-\beta)D\big]^{\frac{1}{\beta-1}}+d$, where $d$ is an arbitrary constant. The boundary condition $\lim_{w\to 0}F^{-1}(w)=0$ forces $d=0$, leading to $F^{-1}(w)=-\log(1-w)\big/\big[(1-\beta)D\big]^{\frac{1}{\beta-1}}$ for $0<w<1$. Inverting gives the cdf
$$F(t)=1-e^{-\left[\beta(1-\beta)H_\beta(T)+\beta\right]^{\frac{1}{\beta-1}}t},\qquad t>0,$$
confirming that $T$ follows an exponential distribution with rate parameter $\big[\beta(1-\beta)H_\beta(T)+\beta\big]^{\frac{1}{\beta-1}}$. This establishes the theorem.
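The identity in Theorem 3.3 is easy to verify numerically. The sketch below uses our own quadrature helper, and the configuration $n=3$, $i=1$, $\beta=2$, $\lambda=1$ is an illustrative choice; in our computation both sides reduce to 0.3.

```python
def quad(g, lo, hi, steps=50_000):
    # midpoint-rule quadrature
    h = (hi - lo) / steps
    return sum(g(lo + (j + 0.5) * h) for j in range(steps)) * h

n, i, beta, lam = 3, 1, 2.0, 1.0
A, B = (n - i) * (i + 1), i * (n - i + 1)          # A = 4, B = 3
k = lambda w: A * (1 - w) ** (n - i - 1) - B * (1 - w) ** (n - i)

# Left side: H_beta(T_{n-i|n:G}) via the w-domain integral, using
# f(F^{-1}(w)) = lam * (1 - w) for Exp(lam)
lhs = (quad(lambda w: k(w) ** beta * (lam * (1 - w)) ** (beta - 1), 0.0, 1.0) - 1) / (1 - beta)

# Right side: eta * [beta * H_beta(T) + beta/(1-beta)] - 1/(1-beta)
H = (lam ** (beta - 1) - beta) / (beta * (1 - beta))   # Tsallis entropy of Exp(lam)
inc = quad(lambda u: u ** (beta * (n - i) - 1) * (1 - u) ** beta, 0.0, B / A)
eta = A ** (beta * (n - i + 1)) / B ** (beta * (n - i)) * inc
rhs = eta * (beta * H + beta / (1 - beta)) - 1 / (1 - beta)

print(round(lhs, 6), round(rhs, 6))  # 0.3 0.3
```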

4. Entropy-Based Exponentiality Testing

In this section, we introduce a nonparametric method for estimating the Tsallis entropy of consecutive r-out-of-n:G systems and use it to test for exponentiality. The broad applicability of the exponential distribution has led to the development of numerous test statistics for assessing exponentiality, often grounded in foundational principles of statistical theory. The central objective here is to determine whether the distribution of a random variable $T$ is exponential. Let $F_0$ denote the cdf of the exponential distribution, defined as $F_0(t)=1-e^{-\lambda t}$ for $t>0$. The hypothesis under investigation is stated as follows:
$$H_0:\ F(t)=F_0(t)\quad \text{vs.}\quad H_1:\ F(t)\neq F_0(t).$$
Extropy has recently gained considerable attention as a useful metric for goodness-of-fit testing. Qiu and Jia [28] pioneered two consistent extropy estimators based on the concept of spacings and constructed a goodness-of-fit test for the uniform distribution using the more efficient of the two. In a related contribution, Xiong et al. [29] exploited the properties of classical record values to derive a characterization of the exponential distribution, leading to a novel exponentiality test; their work carefully outlined the test statistic and emphasized its strong performance, particularly for small sample sizes. Building on this foundation, Jose and Sathar [30] proposed a new exponentiality test based on a characterization involving the extropy of lower k-record values.
The present section extends these developments by exploring the Tsallis entropy of consecutive r-out-of-n:G systems. As established in Theorem 3.3, the exponential distribution can be uniquely characterized through the Tsallis entropy associated with such systems. Leveraging Equation (25) and simplifying, we propose a new test statistic for exponentiality, denoted by $TS_{i,n,\beta}$ and defined, for $n\ge 2i$, as follows:
$$TS_{i,n,\beta}=H_\beta\big(T_{n-i|n:G}\big)-\eta_{i,n,\beta}\left[\beta H_\beta(T)+\frac{\beta}{1-\beta}\right]+\frac{1}{1-\beta},$$
where
$$\eta_{i,n,\beta}=\frac{[(n-i)(i+1)]^{\beta(n-i+1)}}{[i(n-i+1)]^{\beta(n-i)}}\,B\!\left(\frac{i(n-i+1)}{(n-i)(i+1)};\,\beta(n-i),\,\beta+1\right),\qquad \beta\in\Theta.$$
If $n\ge 2i$, then Theorem 3.3 directly implies that $TS_{i,n,\beta}=0$ if and only if $T$ is exponentially distributed. This fundamental property establishes $TS_{i,n,\beta}$ as a viable measure of departure from exponentiality and a suitable candidate for a test statistic. Given a random sample $T_1,T_2,\ldots,T_N$ drawn from an absolutely continuous distribution $F$, an estimator $\widehat{TS}_{i,n,\beta}$ of $TS_{i,n,\beta}$ can be used as a test statistic: significant deviations of $\widehat{TS}_{i,n,\beta}$ from its expected value under the null hypothesis (the assumption of an exponential distribution) indicate non-exponentiality and prompt rejection of $H_0$. Let $T_{1:N}\le T_{2:N}\le\cdots\le T_{N:N}$ denote the corresponding order statistics. To estimate the test statistic, we adopt the estimator proposed by Vasicek [31] for $\frac{dF^{-1}(w)}{dw}=1/f\big(F^{-1}(w)\big)$, namely
$$\frac{dF^{-1}(w)}{dw}\approx \frac{N\big(T_{l+m:N}-T_{l-m:N}\big)}{2m},\qquad l=1,2,\ldots,N,$$
where $m$, a positive integer smaller than $N/2$, is the window size, and the boundary conventions $T_{l-m:N}=T_{1:N}$ for $l\le m$ and $T_{l+m:N}=T_{N:N}$ for $l\ge N-m$ are used. A reasonable estimator $\widehat{TS}_{i,n,\beta}$ can then be derived using Equation (30) as follows:
$$\widehat{TS}_{i,n,\beta}=\frac{1}{N(1-\beta)}\sum_{l=1}^{N}\left[\frac{2m}{N\big(T_{l+m:N}-T_{l-m:N}\big)}\right]^{\beta-1}\left[k_{n-i|n:G}^{\beta}\!\left(\frac{l}{N+1}\right)-\beta\,\eta_{i,n,\beta}\right].$$
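As a concrete illustration, the following sketch implements the estimator displayed above. The function names, the quadrature used to evaluate $\eta_{i,n,\beta}$, and the demo parameters are our own choices, so treat this as a minimal sketch rather than a reference implementation.

```python
import random

def ts_hat(sample, i, n, beta, m):
    # Estimator TS-hat_{i,n,beta} built from spacings of the order statistics.
    N = len(sample)
    T = sorted(sample)
    def k(w):  # system density kernel k_{n-i|n:G}(w)
        return (n - i) * (i + 1) * (1 - w) ** (n - i - 1) \
             - i * (n - i + 1) * (1 - w) ** (n - i)
    # eta_{i,n,beta}: incomplete-beta form evaluated by midpoint quadrature
    A, B = (n - i) * (i + 1), i * (n - i + 1)
    steps = 20_000
    h = (B / A) / steps
    inc = sum(((j + 0.5) * h) ** (beta * (n - i) - 1) * (1 - (j + 0.5) * h) ** beta
              for j in range(steps)) * h
    eta = A ** (beta * (n - i + 1)) / B ** (beta * (n - i)) * inc
    total = 0.0
    for l in range(1, N + 1):
        lo = T[max(l - m, 1) - 1]        # T_{l-m:N}, clipped to T_{1:N}
        hi = T[min(l + m, N) - 1]        # T_{l+m:N}, clipped to T_{N:N}
        fhat = 2 * m / (N * (hi - lo))   # spacing estimate of f(T_{l:N})
        total += fhat ** (beta - 1) * (k(l / (N + 1)) ** beta - beta * eta)
    return total / (N * (1 - beta))

random.seed(1)
data = [random.expovariate(1.0) for _ in range(50)]
stat = ts_hat(data, i=1, n=3, beta=2.0, m=9)  # m is a tuning choice
print(stat)  # small under the exponential null
```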
Establishing the consistency of an estimator is essential when evaluating estimators of parametric functions. The following theorem establishes the consistency of the proposed estimator. The proof adopts an approach similar to that of Theorem 1 of Vasicek [31]; notably, Park [32] and Xiong et al. [29] also used Vasicek's consistency technique to validate their respective test statistics.
Theorem 4.1.
Assume that $T_1,T_2,\ldots,T_N$ is a random sample of size $N$ from a population with pdf $f$ and cdf $F$, and let the random variable have finite variance. Then $\widehat{TS}_{i,n,\beta}\ \xrightarrow{p}\ TS_{i,n,\beta}$ as $N\to\infty$, $m\to\infty$ and $m/N\to 0$, where $\xrightarrow{p}$ denotes convergence in probability, for all $\beta\in\Theta$.
Proof. 
To establish the consistency of $\widehat{TS}_{i,n,\beta}$, we employ the approach of Noughabi and Arghami [33]. As both $m$ and $N$ tend to infinity with $m/N\to 0$, the density can be approximated as
$$\frac{2m}{N\big(T_{l+m:N}-T_{l-m:N}\big)}=\frac{F_N\big(T_{l+m:N}\big)-F_N\big(T_{l-m:N}\big)}{T_{l+m:N}-T_{l-m:N}}\approx \frac{F\big(T_{l+m:N}\big)-F\big(T_{l-m:N}\big)}{T_{l+m:N}-T_{l-m:N}}\approx f\big(T_{l:N}\big),$$
where $F_N$ denotes the empirical distribution function. Furthermore, since $\frac{l}{N+1}\approx F_N\big(T_{l:N}\big)$, we can express
$$\widehat{TS}_{i,n,\beta}=\frac{1}{N(1-\beta)}\sum_{l=1}^{N}\left[\frac{2m}{N\big(T_{l+m:N}-T_{l-m:N}\big)}\right]^{\beta-1}\left[k_{n-i|n:G}^{\beta}\!\left(\frac{l}{N+1}\right)-\beta\eta_{i,n,\beta}\right]\approx \frac{1}{N(1-\beta)}\sum_{l=1}^{N}f^{\beta-1}\big(T_{l:N}\big)\left[k_{n-i|n:G}^{\beta}\big(F_N(T_{l:N})\big)-\beta\eta_{i,n,\beta}\right]\approx \frac{1}{N(1-\beta)}\sum_{l=1}^{N}f^{\beta-1}(T_{l})\left[k_{n-i|n:G}^{\beta}\big(F(T_{l})\big)-\beta\eta_{i,n,\beta}\right],$$
where the last step uses the almost sure convergence of the empirical distribution function, $F_N(t)\ \xrightarrow{a.s.}\ F(t)$ as $N\to\infty$, together with the fact that reordering the summands leaves the sum unchanged. Applying the Strong Law of Large Numbers, we have
$$\frac{1}{N(1-\beta)}\sum_{l=1}^{N}f^{\beta-1}(T_l)\left[k_{n-i|n:G}^{\beta}\big(F(T_l)\big)-\beta\eta_{i,n,\beta}\right]\ \xrightarrow{a.s.}\ \mathbb{E}\!\left[\frac{f^{\beta-1}(T)}{1-\beta}\Big(k_{n-i|n:G}^{\beta}\big(F(T)\big)-\beta\eta_{i,n,\beta}\Big)\right]=\frac{1}{1-\beta}\int_0^{\infty}f^{\beta}(t)\left[k_{n-i|n:G}^{\beta}\big(F(t)\big)-\beta\eta_{i,n,\beta}\right]dt=H_\beta\big(T_{n-i|n:G}\big)-\eta_{i,n,\beta}\left[\beta H_\beta(T)+\frac{\beta}{1-\beta}\right]+\frac{1}{1-\beta}=TS_{i,n,\beta}.$$
This convergence demonstrates that $\widehat{TS}_{i,n,\beta}$ is a consistent estimator of $TS_{i,n,\beta}$, completing the proof for all $\beta\in\Theta$.
The following theorem shows how $\widehat{TS}_{i,n,\beta}$ behaves under affine transformations of the data: its root mean square error (RMSE) is invariant under location shifts of the random variable $T$, but not under scale transformations. The proof adapts the arguments presented by Ebrahimi et al. [34].
Theorem 4.2.
Assume that $T_1,T_2,\ldots,T_N$ is a random sample of size $N$ from a population with pdf $f$ and cdf $F$, and let $Y_j=aT_j+b$ with $a>0$ and $b\in\mathbb{R}$. Denote the estimators of $TS_{i,n,\beta}$ based on the $T_j$ and the $Y_j$ by $\widehat{TS}_{i,n,\beta}^{\,T}$ and $\widehat{TS}_{i,n,\beta}^{\,Y}$, respectively. Then, for all $\beta\in\Theta$:
1. $\mathbb{E}\big[\widehat{TS}_{i,n,\beta}^{\,Y}\big]=a^{1-\beta}\,\mathbb{E}\big[\widehat{TS}_{i,n,\beta}^{\,T}\big]$;
2. $\mathrm{Var}\big[\widehat{TS}_{i,n,\beta}^{\,Y}\big]=a^{2(1-\beta)}\,\mathrm{Var}\big[\widehat{TS}_{i,n,\beta}^{\,T}\big]$;
3. $\mathrm{RMSE}\big[\widehat{TS}_{i,n,\beta}^{\,Y}\big]=a^{1-\beta}\,\mathrm{RMSE}\big[\widehat{TS}_{i,n,\beta}^{\,T}\big]$.
For $\beta=2$, these factors reduce to $1/a$, $1/a^{2}$, and $1/a$, respectively.
Proof. 
It is not hard to see, from the definition of the estimator, that
$$\widehat{TS}_{i,n,\beta}^{\,Y}=\frac{1}{N(1-\beta)}\sum_{l=1}^{N}\left[\frac{2m}{N\big(Y_{l+m:N}-Y_{l-m:N}\big)}\right]^{\beta-1}\left[k_{n-i|n:G}^{\beta}\!\left(\frac{l}{N+1}\right)-\beta\eta_{i,n,\beta}\right]=\frac{1}{N(1-\beta)}\sum_{l=1}^{N}\left[\frac{2m}{Na\big(T_{l+m:N}-T_{l-m:N}\big)}\right]^{\beta-1}\left[k_{n-i|n:G}^{\beta}\!\left(\frac{l}{N+1}\right)-\beta\eta_{i,n,\beta}\right]=a^{1-\beta}\,\widehat{TS}_{i,n,\beta}^{\,T},$$
since the location shift $b$ cancels in the spacings. The proof is completed by applying the elementary properties of the mean, variance, and RMSE to $\widehat{TS}_{i,n,\beta}^{\,Y}=a^{1-\beta}\widehat{TS}_{i,n,\beta}^{\,T}$.
Various combinations of $n$ and $i$ can be used to construct a range of test statistics. For computational purposes we choose $n=3$, $i=1$, and $\beta=2$, which simplifies the expression in Equation (30). Under the null hypothesis $H_0$, the statistic $\widehat{TS}_{1,3,2}$ converges to zero as the sample size $N$ approaches infinity, whereas under an alternative with an absolutely continuous cdf $F$ it converges to a positive value as $N\to\infty$. Based on these asymptotic properties, we reject the null hypothesis at significance level $\alpha$, for a finite sample size $N$, if the observed value of $\widehat{TS}_{1,3,2}$ exceeds the critical value $\widehat{TS}_{1,3,2}(1-\alpha)$. The finite-sample distribution of $\widehat{TS}_{1,3,2}$ is intricate and analytically intractable due to its dependence on both the sample size $N$ and the window parameter $m$.
To overcome this, we employed a Monte Carlo approach. Specifically, we generated 10,000 samples of sizes $N=5,10,20,30,40,50,100$ from the standard exponential distribution under the null hypothesis. For each sample size, we determined the $(1-\alpha)$-th quantile of the simulated $\widehat{TS}_{1,3,2}$ values to establish the critical value at significance levels $\alpha=0.05$ and $\alpha=0.01$, while varying the window size $m$ from 2 to 30. Table 1 and Table 2 present the resulting critical values for the respective sample sizes and significance levels.

Power Comparisons

The power of the $\widehat{TS}_{1,3,2}$ test was assessed through a Monte Carlo simulation involving the alternative probability distributions specified in Table 3. For each sample size, 10,000 replicates of size $N$ were drawn from each alternative distribution, and the $\widehat{TS}_{1,3,2}$ statistic was computed for each. The power at significance level $\alpha$ was then estimated as the empirical rejection rate, i.e., the proportion of the 10,000 samples whose $\widehat{TS}_{1,3,2}$ statistic exceeded the critical threshold.
The simulation setup, including the selection of alternative distributions and their parameters (Table 3), closely follows the methodology of Jose and Sathar [30]. To assess the efficiency of the newly proposed test based on the $\widehat{TS}_{1,3,2}$ statistic, its performance is compared with several established tests for exponentiality reported in the literature, as summarized in Table 4.
The performance of the $\widehat{TS}_{1,3,2}$ statistic depends on the window size $m$, necessitating the prior determination of an appropriate value to ensure adequate statistical power. Simulations across varying sample sizes led to the heuristic choice $m=\lfloor 0.4N\rfloor$, where $\lfloor x\rfloor$ denotes the floor function. This formula provides a practical guideline for selecting $m$ and aims to ensure robust power performance across different alternatives.
To comprehensively assess the performance of the proposed $\widehat{TS}_{1,3,2}$ test, we selected eleven established tests for exponentiality and evaluated their power against a diverse set of alternative distributions. Notably, Xiong et al. [29] introduced a test statistic based on the extropy of classical record values, while Jose and Sathar [30] characterized the exponential distribution using extropy derived from lower k-records and developed a corresponding test statistic.
These two tests, designated $D_{10}$ and $D_{11}$ in Table 4, are included in our comparative analysis due to their foundation in information-theoretic principles; the original authors provided detailed discussions supporting their relevance and applicability for testing exponentiality. To estimate the power of each test, we simulated 10,000 independent samples of each size $N\in\{10,20,50\}$ from each alternative distribution detailed in Table 3, and computed the power of the $\widehat{TS}_{1,3,2}$ test and the eleven competing tests at the 5 percent significance level. These comparative results are summarized in Table 5.
In general, the T S ^ 1,3 , 2 test statistic demonstrates strong performance in detecting deviations from exponentiality towards the gamma distribution. However, its performance against other alternative distributions, including Weibull, uniform, half-normal, and log-normal, is moderate, exhibiting neither exceptional strength nor significant weakness.

5. Conclusions

This study explores the role of Tsallis entropy within the framework of consecutive r-out-of-n:G systems. A key contribution lies in identifying a meaningful connection between the Tsallis entropy of systems with general continuous lifetime distributions and those governed by the uniform distribution. Given the challenges of deriving closed-form entropy expressions, especially in systems with large n or complex component distributions, we propose a set of informative bounds that offer practical approximations. These bounds enhance both the theoretical understanding and empirical analysis of Tsallis entropy in reliability contexts. Furthermore, we develop a nonparametric estimator specifically tailored for consecutive r-out-of-n:G systems and demonstrate its effectiveness using simulated and real data. The estimator provides valuable insights into system-level uncertainty and reliability, with potential applications in fields such as image processing, where entropy-based measures support data-driven decision-making.
In summary, this work advances the application of Tsallis entropy in reliability modeling by (i) establishing theoretical relationships, (ii) deriving practical bounds, and (iii) introducing a robust estimation technique. These contributions lay the foundation for future research into entropy-based methodologies across a wide range of statistical and engineering domains.

Data Availability

The datasets are available within the manuscript.

Acknowledgments

Princess Nourah bint Abdulrahman University Researchers Supporting Project number (PNURSP2025R226), Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia.

Conflicts of interest

The authors declare that they have no conflicts of interest to report regarding the present study.

References

  1. Jung, K.-H.; Kim, H. Linear consecutive-k-out-of-n:F system reliability with common-mode forced outages. Reliab. Eng. Syst. Saf. 1993, 41, 49–55.
  2. Shen, J.; Zuo, M.J. Optimal design of series consecutive-k-out-of-n:G systems. Reliab. Eng. Syst. Saf. 1994, 45, 277–283.
  3. Kuo, W.; Zuo, M.J. Optimal Reliability Modeling: Principles and Applications; John Wiley & Sons, 2003.
  4. Chang, G.J.; Cui, L.; Hwang, F.K. Reliabilities of Consecutive-k Systems; Springer, 2013; Vol. 4.
  5. Boland, P.J.; Samaniego, F.J. Stochastic ordering results for consecutive k-out-of-n:F systems. IEEE Trans. Reliab. 2004, 53, 7–10.
  6. Eryılmaz, S. Mixture representations for the reliability of consecutive-k systems. Math. Comput. Model. 2010, 51, 405–412.
  7. Eryılmaz, S. Conditional lifetimes of consecutive k-out-of-n systems. IEEE Trans. Reliab. 2010, 59, 178–182.
  8. Eryılmaz, S. Reliability properties of consecutive k-out-of-n systems of arbitrarily dependent components. Reliab. Eng. Syst. Saf. 2009, 94, 350–356.
  9. Tsallis, C. Possible generalization of Boltzmann-Gibbs statistics. J. Stat. Phys. 1988, 52, 479–487.
  10. Shannon, C.E. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423.
  11. Wong, K.M.; Chen, S. The entropy of ordered sequences and order statistics. IEEE Trans. Inf. Theory 1990, 36, 276–284.
  12. Park, S. The entropy of consecutive order statistics. IEEE Trans. Inf. Theory 1995, 41, 2003–2007.
  13. Ebrahimi, N.; Soofi, E.S.; Soyer, R. Information measures in perspective. Int. Stat. Rev. 2010, 78, 383–412.
  14. Zarezadeh, S.; Asadi, M. Results on residual Rényi entropy of order statistics and record values. Inf. Sci. 2010, 180, 4195–4206.
  15. Toomaj, A.; Doostparast, M. A note on signature-based expressions for the entropy of mixed r-out-of-n systems. Naval Res. Logist. 2014, 61, 202–206.
  16. Toomaj, A. Rényi entropy properties of mixed systems. Commun. Stat. Theory Methods 2017, 46, 906–916.
  17. Mesfioui, M.; Kayid, M.; Shrahili, M. Rényi entropy of the residual lifetime of a reliability system at the system level. Axioms 2023, 12, 320.
  18. Alomani, G.; Kayid, M. Further properties of Tsallis entropy and its application. Entropy 2023, 25, 199.
  19. Baratpour, S.; Khammar, A.H. Results on Tsallis entropy of order statistics and record values. Istatistik J. Turk. Stat. Assoc. 2016, 8, 60–73.
  20. Kumar, V. Some results on Tsallis entropy measure and k-record values. Physica A 2016, 462, 667–673.
  21. Kayid, M.; Alshehri, M.A. Shannon differential entropy properties of consecutive k-out-of-n:G systems. Oper. Res. Lett. 2024, 57, 107190.
  22. Kayid, M.; Shrahili, M. Information properties of consecutive systems using fractional generalized cumulative residual entropy. Fractal Fract. 2024, 8, 568.
  23. Shaked, M.; Shanthikumar, J.G. Stochastic Orders; Springer, 2007.
  24. Bagai, I.; Kochar, S.C. On tail-ordering and comparison of failure rates. Commun. Stat. Theory Methods 1986, 15, 1377–1388.
  25. Navarro, J.; Eryilmaz, S. Mean residual lifetimes of consecutive-k-out-of-n systems. J. Appl. Probab. 2007, 44, 82–98.
  26. Kamps, U. Characterizations of distributions by recurrence relations and identities for moments of order statistics. In Handbook of Statistics; 1998; Volume 16, pp. 291–311.
  27. Hwang, J.S.; Lin, G.D. On a generalized moment problem. II. Proc. Amer. Math. Soc. 1984, 91, 577–580.
  28. Qiu, G.; Jia, K. Extropy estimators with applications in testing uniformity. J. Nonparametr. Stat. 2018, 30, 182–196.
  29. Xiong, P.; Zhuang, W.; Qiu, G. Testing exponentiality based on the extropy of record values. J. Appl. Stat. 2022, 49, 782–802.
  30. Jose, J.; Sathar, E.I.A. Characterization of exponential distribution using extropy based on lower k-records and its application in testing exponentiality. J. Comput. Appl. Math. 2022, 402, 113816.
  31. Vasicek, O. A test for normality based on sample entropy. J. R. Stat. Soc. Ser. B Stat. Methodol. 1976, 38, 54–59.
  32. Park, S. A goodness-of-fit test for normality based on the sample entropy of order statistics. Stat. Probab. Lett. 1999, 44, 359–363.
  33. Noughabi, H.A.; Arghami, N.R. Testing exponentiality based on characterizations of the exponential distribution. J. Stat. Comput. Simul. 2011, 81, 1641–1651.
  34. Ebrahimi, N.; Pflughoeft, K.; Soofi, E.S. Two measures of sample entropy. Stat. Probab. Lett. 1994, 20, 225–234.
  35. Fortiana, J.; Grané, A. A scale-free goodness-of-fit statistic for the exponential distribution based on maximum correlations. J. Stat. Plann. Inference 2002, 108, 85–97.
  36. Choi, B.; Kim, K.; Song, S.H. Goodness-of-fit test for exponentiality based on Kullback-Leibler information. Commun. Stat. Simul. Comput. 2004, 33, 525–536.
  37. Mimoto, N.; Zitikis, R. The Atkinson index, the Moran statistic, and testing exponentiality. J. Jpn. Stat. Soc. 2008, 38, 187–205.
  38. Volkova, K.Y. On asymptotic efficiency of exponentiality tests based on Rossberg's characterization. J. Math. Sci. 2010, 167, 489–493.
  39. Zamanzade, E.; Arghami, N.R. Goodness-of-fit test based on correcting moments of modified entropy estimator. J. Stat. Comput. Simul. 2011, 81, 2077–2093.
  40. Baratpour, S.; Rad, A.H. Testing goodness-of-fit for exponential distribution based on cumulative residual entropy. Commun. Stat. Theory Methods 2012, 41, 1387–1396.
  41. Noughabi, H.A.; Arghami, N.R. Goodness-of-fit tests based on correcting moments of entropy estimators. Commun. Stat. Simul. Comput. 2013, 42, 499–513.
  42. Volkova, K.Y.; Nikitin, Y.Y. Exponentiality tests based on Ahsanullah’s characterization and their efficiency. J. Math. Sci. 2015, 204, 42–54.
  43. Torabi, H.; Montazeri, N.H.; Grané, A. A wide review on exponentiality tests and two competitive proposals with application on reliability. J. Stat. Comput. Simul. 2018, 88, 108–139.
Figure 1. The plot of $H_\beta\big(T_{2|4:G}\big)$ with respect to $\beta$ and $\gamma$, as demonstrated in Example 2.1.
Table 1. Critical values of the $\widehat{TS}_{1,3,2}$ statistic at significance level $\alpha=0.05$.
m    N=5       N=10      N=20      N=30      N=40      N=50      N=100
2    3.314111  4.885299  5.31386   5.669357  5.665471  5.879007  5.969947
3    -         2.958639  3.575905  3.652558  3.735460  3.929436  3.934263
4    -         2.091046  2.584478  2.745547  2.981776  3.072066  3.135100
5    -         -         2.209270  2.462253  2.569272  2.619691  2.657987
6    -         -         1.960857  2.129584  2.283690  2.277793  2.414754
7    -         -         1.734072  1.992837  2.066112  2.116888  2.232844
8    -         -         1.609901  1.819573  1.922531  1.975798  2.081751
9    -         -         1.455605  1.673973  1.802262  1.858107  1.945239
10   -         -         -         1.587775  1.675985  1.782838  1.877756
11   -         -         -         1.488481  1.620426  1.688339  1.848637
12   -         -         -         1.416203  1.538009  1.614312  1.764357
13   -         -         -         1.342485  1.472354  1.577332  1.722066
14   -         -         -         1.285730  1.406798  1.500436  1.675081
15   -         -         -         -         1.369318  1.455467  1.629516
16   -         -         -         -         1.322761  1.420537  1.607517
17   -         -         -         -         1.271932  1.372767  1.574213
18   -         -         -         -         1.233146  1.339279  1.523170
19   -         -         -         -         1.195628  1.308735  1.499460
20   -         -         -         -         -         1.259018  1.476491
21   -         -         -         -         -         1.226252  1.436125
22   -         -         -         -         -         1.203902  1.419521
23   -         -         -         -         -         1.166646  1.396562
24   -         -         -         -         -         1.137062  1.382412
25   -         -         -         -         -         -         1.367450
26   -         -         -         -         -         -         1.326762
27   -         -         -         -         -         -         1.313035
28   -         -         -         -         -         -         1.301748
29   -         -         -         -         -         -         1.274347
30   -         -         -         -         -         -         1.268577
(A dash marks window sizes with m ≥ N/2, which are not admissible.)
Table 2. Critical values of the $\widehat{TS}_{1,3,2}$ statistic at significance level $\alpha=0.01$.
m    N=5       N=10       N=20       N=30       N=40       N=50       N=100
2    7.741372  11.020993  13.130953  14.345684  14.343766  14.219676  13.504338
3    -         5.330965   6.404442   7.433630   6.808109   6.722751   7.335729
4    -         3.388231   4.296331   4.723610   4.846727   4.829120   5.240127
5    -         -          3.316372   3.695920   3.789253   3.926883   4.006127
6    -         -          2.804852   3.208528   3.263101   3.443553   3.478685
7    -         -          2.548659   2.771682   2.937497   2.903739   3.083877
8    -         -          2.211149   2.461236   2.679584   2.677176   2.844005
9    -         -          1.992974   2.214899   2.379238   2.401170   2.585907
10   -         -          -          2.071670   2.276676   2.272026   2.477653
11   -         -          -          1.905570   2.009089   2.231709   2.328932
12   -         -          -          1.827860   2.002682   2.037605   2.226634
13   -         -          -          1.681670   1.860946   1.978671   2.171389
14   -         -          -          1.594050   1.808541   1.853613   2.038638
15   -         -          -          -          1.700383   1.812322   1.999786
16   -         -          -          -          1.642237   1.733522   1.956928
17   -         -          -          -          1.541187   1.671427   1.873484
18   -         -          -          -          1.485475   1.589422   1.864797
19   -         -          -          -          1.439565   1.549233   1.800001
20   -         -          -          -          -          1.520203   1.777315
21   -         -          -          -          -          1.477773   1.733554
22   -         -          -          -          -          1.435841   1.678109
23   -         -          -          -          -          1.380899   1.637635
24   -         -          -          -          -          1.341584   1.646366
25   -         -          -          -          -          -          1.574026
26   -         -          -          -          -          -          1.563578
27   -         -          -          -          -          -          1.526702
28   -         -          -          -          -          -          1.512871
29   -         -          -          -          -          -          1.481510
30   -         -          -          -          -          -          1.462325
(A dash marks window sizes with m ≥ N/2, which are not admissible.)
Table 3. Alternative probability distributions for evaluating the power of the test statistic.
Distribution  |  Probability Density Function  |  Support  |  Notation
Weibull  |  $f(t)=\frac{\alpha}{\beta}\big(\frac{t}{\beta}\big)^{\alpha-1}e^{-(t/\beta)^{\alpha}}$  |  $t>0,\ \alpha,\beta>0$  |  $W(\alpha,\beta)$
Gamma  |  $f(t)=\frac{1}{\beta^{\alpha}\Gamma(\alpha)}t^{\alpha-1}e^{-t/\beta}$  |  $t>0,\ \alpha,\beta>0$  |  $G(\alpha,\beta)$
Uniform  |  $f(t)=\frac{1}{\beta-\alpha}$  |  $\alpha\le t\le\beta$  |  $U(\alpha,\beta)$
Half-Normal  |  $f(t)=\frac{\sqrt{2}}{\lambda\sqrt{\pi}}e^{-t^{2}/(2\lambda^{2})}$  |  $t>0,\ \lambda>0$  |  $HN(\lambda)$
Log-Normal  |  $f(t)=\frac{1}{t\lambda\sqrt{2\pi}}e^{-(\ln t-\mu)^{2}/(2\lambda^{2})}$  |  $t>0,\ \lambda>0,\ \mu\in\mathbb{R}$  |  $LN(\mu,\lambda)$
Table 4. Competing tests for exponentiality.
Test Reference Notation
1 Fortiana and Grané [35] D 1
2 Choi et al. [36] D 2
3 Mimoto and Zitikis [37] D 3
4 Volkova [38] D 4
5 Zamanzade and Arghami [39] D 5
6 Baratpour and Rad [40] D 6
7 Noughabi and Arghami [41] D 7
8 Volkova and Nikitin [42] D 8
9 Torabi et al. [43] D 9
10 Xiong et al. [29] D 10
11 Jose and Sathar [30] D 11
Table 5. Power comparisons of the tests at significance level $\alpha=0.05$.
N    H1          D1   D2   D3   D4   D5   D6   D7   D8   D9   D10  D11  TS-hat(1,3,2)
10   G(1,1)      5    5    5    5    5    5    5    5    5    5    5    5
     G(0.4,1)    34   11   50   46   29   7    0    50   65   60   83   75
     W(1.4,1)    15   17   16   13   16   23   29   16   1    15   17   3
     HN(1)       11   10   10   8    10   8    20   10   1    18   8    5
     U(0,1)      42   28   31   15   33   60   51   24   2    93   100  10
     LN(0,0.8)   12   24   16   21   23   17   26   19   2    5    5    2
     LN(0,1.4)   36   19   40   6    29   15   1    9    47   3    2    3
20   G(1,1)      5    5    5    5    5    5    5    5    6    5    5    5
     G(0.4,1)    56   31   77   80   66   25   0    82   89   95   99   96
     W(1.4,1)    32   27   34   27   17   33   47   29   6    29   8    2
     HN(1)       23   14   19   12   7    30   23   14   2    37   5    4
     U(0,1)      86   52   63   28   54   92   80   39   18   100  100  12
     LN(0,0.8)   18   52   26   48   42   18   49   45   8    4    4    0
     LN(0,1.4)   61   45   67   11   64   43   0    16   71   0    3    2
50   G(1,1)      5    5    5    5    5    5    5    5    5    5    5    5
     G(0.4,1)    89   78   99   99   95   68   0    99   100  100  100  100
     W(1.4,1)    73   55   79   65   5    62   80   67   37   56   7    0
     HN(1)       59   23   52   25   1    54   48   28   13   73   2    2
     U(0,1)      100  91   98   62   62   100  99   72   78   100  100  24
     LN(0,0.8)   26   93   44   93   55   24   84   92   47   3    3    0
     LN(0,1.4)   93   85   95   25   93   85   0    29   95   0    2    0
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.