Preprint
Article

This version is not peer-reviewed.

Randomly Stopped Sums with Generalized Subexponential Distribution

A peer-reviewed version of this preprint was published in:
Axioms 2023, 12(7), 641. https://doi.org/10.3390/axioms12070641

Submitted: 25 May 2023
Posted: 26 May 2023


Abstract
Let $\{\xi_1, \xi_2, \ldots\}$ be a sequence of independent, possibly differently distributed, random variables defined on a probability space $(\Omega, \mathcal F, P)$ with distribution functions $\{F_{\xi_1}, F_{\xi_2}, \ldots\}$. Let $\eta$ be a counting random variable independent of the sequence $\{\xi_1, \xi_2, \ldots\}$. In this paper, we find conditions under which the distribution function of the randomly stopped sum $S_\eta = \xi_1 + \xi_2 + \ldots + \xi_\eta$ belongs to the class of generalized subexponential distributions.

1. Introduction

Let $\{\xi_1, \xi_2, \ldots\}$ be a sequence of independent random variables (r.v.s) with distribution functions (d.f.s) $\{F_{\xi_1}, F_{\xi_2}, \ldots\}$, and let $\eta$ be a counting random variable, that is, a nonnegative, nondegenerate at 0, and integer-valued r.v. In addition, we suppose that the r.v. $\eta$ and the sequence $\{\xi_1, \xi_2, \ldots\}$ are independent.
Let $S_0 := 0$, $S_n := \xi_1 + \cdots + \xi_n$ for $n \in \mathbb N$, and let
$$ S_\eta = \sum_{k=1}^{\eta} \xi_k $$
be the randomly stopped sum of the r.v.s $\xi_1, \xi_2, \ldots$.
By $F_{S_\eta}$ we denote the d.f. of $S_\eta$, and by $\overline F$ we denote the tail function (t.f.) of a d.f. $F$, that is, $\overline F(x) = 1 - F(x)$ for $x \in \mathbb R$. It is obvious that the following equalities hold for positive $x$:
$$ F_{S_\eta}(x) = P(\eta = 0) + \sum_{n=1}^{\infty} P(\eta = n)\, P(S_n \le x), \qquad \overline F_{S_\eta}(x) = \sum_{n=1}^{\infty} P(\eta = n)\, P(S_n > x). $$
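The second equality above can be checked numerically. The following Monte Carlo sketch uses illustrative choices that are not taken from the paper ($\eta$ uniform on $\{1,2\}$ and $\xi_k \sim \mathrm{Exp}(1)$, so that $P(S_1 > x) = e^{-x}$ and $P(S_2 > x) = e^{-x}(1+x)$ are available in closed form) and compares a direct simulation of $P(S_\eta > x)$ with the right-hand side:

```python
# Monte Carlo check of  bar F_{S_eta}(x) = sum_n P(eta=n) P(S_n > x).
# Illustrative setup (not from the paper): P(eta=1)=P(eta=2)=1/2, xi_k ~ Exp(1),
# so S_1 ~ Exp(1) and S_2 ~ Gamma(2,1) with P(S_2 > x) = e^{-x}(1+x).
import math
import random

random.seed(42)

def mc_tail(x, n_samples=200_000):
    """Direct simulation of P(S_eta > x)."""
    hits = 0
    for _ in range(n_samples):
        eta = random.choice([1, 2])
        s = sum(random.expovariate(1.0) for _ in range(eta))
        hits += (s > x)
    return hits / n_samples

x = 1.0
# Right-hand side: sum over n of P(eta = n) * P(S_n > x), computed exactly.
rhs = 0.5 * math.exp(-x) + 0.5 * math.exp(-x) * (1.0 + x)
lhs = mc_tail(x)
print(abs(lhs - rhs) < 0.01)
```

With a sample size of this order, the simulated and exact tails agree to within ordinary Monte Carlo error.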
In this paper, we consider a sequence $\{\xi_1, \xi_2, \ldots\}$ of independent and possibly nonidentically distributed r.v.s. We suppose that some of the d.f.s of these r.v.s belong to the class of generalized subexponential distributions $\mathcal{OS}$, and we find conditions under which the d.f. $F_{S_\eta}$ remains in this class.
We use the following three notations for the asymptotic relations of arbitrary positive functions $f$ and $g$: $f(x) = o(g(x))$ means that $\lim_{x\to\infty} f(x)/g(x) = 0$; $f(x) \sim c\,g(x)$, $c > 0$, means that $\lim_{x\to\infty} f(x)/g(x) = c$; and $f(x) \asymp g(x)$ means that
$$ 0 < \liminf_{x\to\infty} \frac{f(x)}{g(x)} \le \limsup_{x\to\infty} \frac{f(x)}{g(x)} < \infty. $$
The rest of the paper is organized as follows. In Section 2, we describe the class of generalized subexponential distributions. The main results of the paper are formulated in Section 3. Section 4 contains some results on closure under randomly stopped sums for regularity classes related to generalized subexponential distributions. Auxiliary lemmas are collected in Section 5, and the proofs of the main results are given in Section 6.

2. Generalized subexponentiality

Let $\xi$ be an r.v. defined on a probability space $(\Omega, \mathcal F, P)$ with d.f. $F_\xi$.
A d.f. $F_\xi$ of a real-valued r.v. is said to be generalized subexponential, denoted $F_\xi \in \mathcal{OS}$, if
$$ \limsup_{x\to\infty} \frac{\overline{F_\xi * F_\xi}(x)}{\overline F_\xi(x)} < \infty, $$
where $F_\xi * F_\xi$ denotes the convolution of the d.f. $F_\xi$ with itself, i.e.,
$$ F_\xi * F_\xi(x) = F_\xi^{*2}(x) := \int_{-\infty}^{\infty} F_\xi(x - y)\, dF_\xi(y), \quad x \in \mathbb R. $$
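The defining ratio can be evaluated numerically for a concrete heavy-tailed example. The sketch below (an illustration only, not part of the paper's argument) takes the Pareto-type d.f. $F(x) = 1 - (1+x)^{-2}$ on $[0,\infty)$, a standard subexponential distribution and hence a member of $\mathcal{OS}$, and approximates $\overline{F * F}(x)/\overline F(x)$ by a trapezoidal quadrature of $\overline F(x) + \int_0^x \overline F(x-y)\,dF(y)$:

```python
# Numerical illustration of the generalized-subexponentiality ratio
#   bar{F*F}(x) / bar{F}(x)
# for F(x) = 1 - (1+x)^(-2) on [0, inf).  The grid step and the test
# point x are illustrative choices.
def tail(u, alpha=2.0):
    # bar F(u) = (1+u)^(-alpha) for u >= 0
    return (1.0 + u) ** (-alpha) if u >= 0 else 1.0

def density(u, alpha=2.0):
    return alpha * (1.0 + u) ** (-alpha - 1.0)

def convolution_tail(x, alpha=2.0, step=0.01):
    """bar{F*F}(x) = bar F(x) + int_0^x bar F(x-y) f(y) dy (trapezoid rule)."""
    n = int(x / step)
    ys = [k * step for k in range(n + 1)]
    vals = [tail(x - y, alpha) * density(y, alpha) for y in ys]
    integral = step * (sum(vals) - 0.5 * (vals[0] + vals[-1]))
    return tail(x, alpha) + integral

x = 50.0
ratio = convolution_tail(x) / tail(x)
# The ratio stays bounded; for a subexponential F the limit is 2.
print(ratio)
```

For larger $x$ the computed ratio approaches 2, in line with the fact that subexponential distributions form a subclass of $\mathcal{OS}$ with limit exactly 2.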
For distributions of nonnegative r.v.s, the class $\mathcal{OS}$ was introduced by Klüppelberg [1]; later, for real-valued r.v.s, it was studied by Shimura and Watanabe [2], Baltrūnas et al. [3], Watanabe and Yamamuro [4], Yu and Wang [5], Cheng and Wang [6], Lin and Wang [7], Konstantinides et al. [8], and Mikutavičius and Šiaulys [9], among others.
In [2], the class of distributions $\mathcal{OS}$ is considered together with other distribution regularity classes. In that paper, several closedness properties of the class $\mathcal{OS}$ were proved. For example, it is shown that the class $\mathcal{OS}$ is not closed under convolution roots. This means that there exists an r.v. $\xi$ such that the $n$-fold convolution $F_\xi^{*n} \in \mathcal{OS}$ for all $n \ge 2$, but $F_\xi \notin \mathcal{OS}$. In [3], simple conditions are provided under which a d.f. of the special form
$$ F_\xi(x) = 1 - \exp\left\{ -\int_0^x q(u)\, du \right\} $$
belongs to the class $\mathcal{OS}$, where $q$ is some integrable hazard rate function. For distributions of the class $\mathcal{OS}$, the closure under tail-equivalence and the closure under convolution are established in [4]. The detailed proofs of these closures are presented for nonnegative r.v.s in [1] and for real-valued r.v.s in [5]. The closure under convolution means that, in the case of independent r.v.s $\xi_1, \xi_2$, the conditions $F_{\xi_1} \in \mathcal{OS}$, $F_{\xi_2} \in \mathcal{OS}$ imply that $F_{\xi_1} * F_{\xi_2} = F_{\xi_1 + \xi_2} \in \mathcal{OS}$. The closure under tail-equivalence means that the conditions $F_{\xi_1} \in \mathcal{OS}$, $\overline F_{\xi_1}(x) \asymp \overline F_{\xi_2}(x)$ imply $F_{\xi_2} \in \mathcal{OS}$.
A counterexample showing that $F_{\xi_1}, F_{\xi_2} \in \mathcal{OS}$ for independent r.v.s $\xi_1, \xi_2$ does not imply $F_{\xi_1 \wedge \xi_2} \in \mathcal{OS}$ can be found in [7]. Moreover, in that paper the closure under maximum is established, which means that $F_{\xi_1}, F_{\xi_2} \in \mathcal{OS}$ for independent r.v.s $\xi_1, \xi_2$ implies $F_{\xi_1 \vee \xi_2} \in \mathcal{OS}$. The authors of articles [8] and [9] consider when the distribution of the product of two independent random variables $\xi, \theta$ belongs to the class $\mathcal{OS}$. For instance, in [9] it is proved that the d.f. $F_{\xi\theta}$ is generalized subexponential if $F_\xi \in \mathcal{OS}$ and $\theta$ is nonnegative and nondegenerate at zero.

3. Main results

In this section, we formulate two theorems, which are the main assertions of this paper. The first theorem deals with the case when the counting r.v. has a finite support.
Theorem 1.
Let $\{\xi_1, \xi_2, \ldots\}$ be a sequence of independent r.v.s, and let $\eta$ be a counting r.v. independent of $\{\xi_1, \xi_2, \ldots\}$. If $\eta$ is bounded, $F_{\xi_1} \in \mathcal{OS}$, and for the other indices $k \neq 1$ either $F_{\xi_k} \in \mathcal{OS}$ or $\overline F_{\xi_k}(x) = O\big(\overline F_{\xi_1}(x)\big)$, then the d.f. of the randomly stopped sum $F_{S_\eta}$ belongs to the class $\mathcal{OS}$.
The case of a counting r.v. with unbounded support is considered in the second theorem. In such a case, for $F_{S_\eta}$ to belong to $\mathcal{OS}$, we need the counting random variable to have a light tail.
Theorem 2.
Let $\{\eta, \xi_1, \xi_2, \ldots\}$ be independent random variables, where the counting r.v. $\eta$ is such that $E e^{\lambda\eta} < \infty$ for all $\lambda > 0$. Then $F_{S_\eta} \in \mathcal{OS}$ if $F_{\xi_1} \in \mathcal{OS}$ and one of the conditions below is satisfied:
$$ \text{(i)} \quad P(\eta = 1) > 0 \ \text{ and } \ \limsup_{x\to\infty}\, \sup_{k \ge 1} \frac{\overline F_{\xi_k}(x)}{\overline F_{\xi_1}(x)} < \infty; $$
$$ \text{(ii)} \quad 0 < \liminf_{x\to\infty}\, \inf_{k \ge 1} \frac{\overline F_{\xi_k}(x)}{\overline F_{\xi_1}(x)} \le \limsup_{x\to\infty}\, \sup_{k \ge 1} \frac{\overline F_{\xi_k}(x)}{\overline F_{\xi_1}(x)} < \infty. $$
We present the proofs of both theorems in Section 6. According to the statements of these theorems, many random variables with generalized subexponential distributions can be constructed. We will demonstrate such constructions in Section ??
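One simple way to meet the hypotheses of Theorem 2 is the i.i.d. case: condition (ii) holds trivially when all $\xi_k$ share one d.f. from $\mathcal{OS}$, and a Poisson counting variable satisfies $E e^{\lambda\eta} < \infty$ for all $\lambda > 0$. The simulation sketch below (all distributional choices are illustrative, not taken from the paper) estimates the tail ratio $\overline F_{S_\eta}(x)/\overline F_{\xi_1}(x)$ at one point, where the theorem and its proof imply it is bounded away from $0$ and $\infty$:

```python
# Illustration of Theorem 2 (illustrative choices, not from the paper):
# eta ~ Poisson(2), xi_k i.i.d. with Pareto-type tail bar F(x) = (1+x)^(-2),
# a member of class OS.  Estimate bar F_{S_eta}(x) / bar F_{xi_1}(x).
import math
import random

random.seed(7)

def pareto():
    # inverse-transform sample from F(x) = 1 - (1+x)^(-2), x >= 0
    u = random.random()
    return u ** (-0.5) - 1.0

def poisson(lam):
    # Knuth's multiplication method; adequate for small lam
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

x, lam, n_samples = 30.0, 2.0, 200_000
hits = 0
for _ in range(n_samples):
    eta = poisson(lam)
    s = sum(pareto() for _ in range(eta))   # S_0 = 0 when eta = 0
    hits += (s > x)

ratio = (hits / n_samples) / (1.0 + x) ** (-2)
print("bounded tail ratio:", ratio)
```

In this i.i.d. subexponential setting the ratio hovers near $E\eta$, which is consistent with $\overline F_{S_\eta}(x) \asymp \overline F_{\xi_1}(x)$.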

5. Auxiliary lemmas

In this section, we present and prove some auxiliary lemmas that will be applied in the derivations of the main Theorems 1 and 2.
Lemma 1.
Let $X$ and $Y$ be two real-valued r.v.s with corresponding d.f.s $F_X$ and $F_Y$. The following statements hold:
(i) $F_X \in \mathcal{OS}$ if and only if $\sup_{x \in \mathbb R} \dfrac{\overline{F_X * F_X}(x)}{\overline F_X(x)} < \infty$.
(ii) If $F_X \in \mathcal{OS}$ and $\overline F_Y(x) \asymp \overline F_X(x)$, then $F_Y \in \mathcal{OS}$.
(iii) If $F_X \in \mathcal{OS}$ and $F_Y \in \mathcal{OS}$, then $F_X * F_Y \in \mathcal{OS}$.
(iv) If $F_X \in \mathcal{OS}$, then $F_X \in \mathcal{OL}$, i.e., $\limsup_{x\to\infty} \dfrac{\overline F_X(x-1)}{\overline F_X(x)} < \infty$.
(v) If $F_X \in \mathcal{OS}$ and $\overline F_Y(x) = O\big(\overline F_X(x)\big)$, then $F_X * F_Y \in \mathcal{OS}$ and $\overline{F_X * F_Y}(x) \asymp \overline F_X(x)$.
Proof. A large part of the properties of the class $\mathcal{OS}$ listed in Lemma 1 can be found, for instance, in [1,2,4,5]. However, for the sake of completeness of exposition, we present the full proof of the formulated lemma.
Part (i). If $F_X \in \mathcal{OS}$, then
$$ \limsup_{x\to\infty} \frac{\overline{F_X * F_X}(x)}{\overline F_X(x)} < \infty \qquad (1) $$
according to the definition. This estimate implies that $\overline F_X(x) > 0$ for each $x \in \mathbb R$. In addition, inequality (1) gives that
$$ \frac{\overline{F_X * F_X}(x)}{\overline F_X(x)} \le M $$
if $x \ge x_M$ for some $M$ and $x_M$.
If $x < x_M$, then, obviously, $\overline F_X(x) \ge \overline F_X(x_M)$ and $\overline{F_X * F_X}(x) \le 1$.
Therefore, for each $x \in \mathbb R$ we get that
$$ \frac{\overline{F_X * F_X}(x)}{\overline F_X(x)} \le \max\left\{ M, \frac{1}{\overline F_X(x_M)} \right\} < \infty $$
because $\overline F_X(x_M) > 0$. The last estimate finishes the proof of part (i) because the condition
$$ \sup_{x \in \mathbb R} \frac{\overline{F_X * F_X}(x)}{\overline F_X(x)} < \infty $$
obviously implies (1).
Part (ii). The condition $\overline F_Y(x) \asymp \overline F_X(x)$ implies
$$ \liminf_{x\to\infty} \frac{\overline F_Y(x)}{\overline F_X(x)} > 0 \quad \text{and} \quad \limsup_{x\to\infty} \frac{\overline F_Y(x)}{\overline F_X(x)} < \infty. \qquad (2) $$
It follows from this that
$$ \frac{\overline F_Y(x)}{\overline F_X(x)} \le M, \quad x \ge x_M, $$
for some $M$ and $x_M$. If $x < x_M$, then
$$ \frac{\overline F_Y(x)}{\overline F_X(x)} \le \frac{1}{\overline F_X(x_M)} < \infty $$
because $F_X \in \mathcal{OS}$ implies $\overline F_X(x_M) > 0$. According to the derived estimates,
$$ \sup_{x \in \mathbb R} \frac{\overline F_Y(x)}{\overline F_X(x)} \le \max\left\{ M, \frac{1}{\overline F_X(x_M)} \right\} =: C < \infty. $$
Therefore, for each $x \in \mathbb R$,
$$ \overline{F_Y * F_Y}(x) = \int_{-\infty}^{\infty} \frac{\overline F_Y(x-y)}{\overline F_X(x-y)}\, \overline F_X(x-y)\, dF_Y(y) \le C \int_{-\infty}^{\infty} \overline F_X(x-y)\, dF_Y(y) = C \int_{-\infty}^{\infty} \overline F_Y(x-y)\, dF_X(y) $$
$$ = C \int_{-\infty}^{\infty} \frac{\overline F_Y(x-y)}{\overline F_X(x-y)}\, \overline F_X(x-y)\, dF_X(y) \le C^2 \int_{-\infty}^{\infty} \overline F_X(x-y)\, dF_X(y) = C^2\, \overline{F_X * F_X}(x). $$
This estimate implies that
$$ \limsup_{x\to\infty} \frac{\overline{F_Y * F_Y}(x)}{\overline F_Y(x)} \le C^2 \limsup_{x\to\infty} \frac{\overline{F_X * F_X}(x)}{\overline F_Y(x)} \le C^2 \limsup_{x\to\infty} \frac{\overline{F_X * F_X}(x)}{\overline F_X(x)} \left( \liminf_{x\to\infty} \frac{\overline F_Y(x)}{\overline F_X(x)} \right)^{-1} < \infty $$
due to the assumption $F_X \in \mathcal{OS}$ and the first inequality in (2). The last estimate gives that the d.f. $F_Y$ belongs to the class $\mathcal{OS}$. Part (ii) of the lemma is proved.
Part (iii). According to part (i), we have that
$$ \sup_{x \in \mathbb R} \frac{\overline{F_X * F_X}(x)}{\overline F_X(x)} = C_1 < \infty \quad \text{and} \quad \sup_{x \in \mathbb R} \frac{\overline{F_Y * F_Y}(x)}{\overline F_Y(x)} = C_2 < \infty. $$
Let $X_1, X_2, Y_1, Y_2$ be independent r.v.s. Suppose that $X_1, X_2$ are distributed according to the d.f. $F_X$, and $Y_1, Y_2$ are distributed according to the d.f. $F_Y$. For each $x \in \mathbb R$ we get
$$ \overline{(F_X * F_Y)^{*2}}(x) = \overline{(F_X * F_Y) * (F_X * F_Y)}(x) = P(X_1 + Y_1 + X_2 + Y_2 > x) = P(X_1 + X_2 + Y_1 + Y_2 > x) $$
$$ = \int_{-\infty}^{\infty} P(X_1 + X_2 > x - y)\, dP(Y_1 + Y_2 \le y) = \int_{-\infty}^{\infty} \frac{\overline{F_X * F_X}(x-y)}{\overline F_X(x-y)}\, \overline F_X(x-y)\, dP(Y_1 + Y_2 \le y) $$
$$ \le C_1 \int_{-\infty}^{\infty} \overline F_X(x-y)\, dP(Y_1 + Y_2 \le y) = C_1\, P(X_1 + Y_1 + Y_2 > x) = C_1 \int_{-\infty}^{\infty} \frac{\overline{F_Y * F_Y}(x-y)}{\overline F_Y(x-y)}\, \overline F_Y(x-y)\, dP(X_1 \le y) $$
$$ \le C_1 C_2 \int_{-\infty}^{\infty} \overline F_Y(x-y)\, dF_X(y) = C_1 C_2\, \overline{F_X * F_Y}(x). $$
Hence
$$ \sup_{x \in \mathbb R} \frac{\overline{(F_X * F_Y)^{*2}}(x)}{\overline{F_X * F_Y}(x)} \le C_1 C_2, $$
implying that $F_X * F_Y \in \mathcal{OS}$ by part (i). Part (iii) of the lemma is proved.
Part (iv). Due to part (i),
$$ \sup_{x \in \mathbb R} \frac{\overline{F_X * F_X}(x)}{\overline F_X(x)} = C_3 < \infty. $$
In addition, for $x > 2$, we obtain
$$ \overline{F_X * F_X}(x) = \int_{-\infty}^{\infty} \overline F_X(x-t)\, dF_X(t) \ge \int_{(1,\,x]} \overline F_X(x-t)\, dF_X(t) \ge \overline F_X(x-1)\big(F_X(x) - F_X(1)\big). $$
When $x$ is large enough, we have $F_X(x) - F_X(1) > 0$, and, therefore,
$$ \frac{\overline F_X(x-1)}{\overline F_X(x)} \le \frac{\overline{F_X * F_X}(x)}{\overline F_X(x)} \cdot \frac{1}{F_X(x) - F_X(1)}. $$
Hence
$$ \limsup_{x\to\infty} \frac{\overline F_X(x-1)}{\overline F_X(x)} \le \frac{C_3}{\overline F_X(1)} < \infty, $$
and part (iv) of the lemma is proved.
Part (v). Since $\overline F_Y(x) = O\big(\overline F_X(x)\big)$, we have
$$ \frac{\overline F_Y(x)}{\overline F_X(x)} \le M, \quad x > x_M, $$
with certain constants $M$ and $x_M$. If $x \le x_M$, then
$$ \frac{\overline F_Y(x)}{\overline F_X(x)} \le \frac{1}{\overline F_X(x_M)} < \infty $$
because $F_X \in \mathcal{OS}$ implies $\overline F_X(x_M) > 0$. From both of the above inequalities it follows that
$$ \sup_{x \in \mathbb R} \frac{\overline F_Y(x)}{\overline F_X(x)} \le \max\left\{ M, \frac{1}{\overline F_X(x_M)} \right\} =: C_4. $$
Consequently, for $x \in \mathbb R$ we get
$$ \overline{F_X * F_Y}(x) = \int_{-\infty}^{\infty} \overline F_Y(x-y)\, dF_X(y) \le C_4 \int_{-\infty}^{\infty} \overline F_X(x-y)\, dF_X(y) = C_4\, \overline{F_X * F_X}(x) \le C_5\, \overline F_X(x) \qquad (3) $$
with some positive constant $C_5$, where the last step in the above derivation follows from part (i) of the lemma.
On the other hand, there exists a real $b \in \mathbb R$ for which
$$ \overline F_Y(b) = 1 - F_Y(b) \ge \frac{1}{2}. $$
For this $b$, we get
$$ \overline{F_X * F_Y}(x) \ge \int_{(b,\,\infty)} \overline F_X(x-y)\, dF_Y(y) \ge \overline F_X(x-b) \int_{(b,\,\infty)} dF_Y(y) = \overline F_X(x-b)\, \overline F_Y(b) \ge \frac{1}{2}\, \overline F_X(x)\, \frac{\overline F_X(x-b)}{\overline F_X(x)}. $$
Hence,
$$ \liminf_{x\to\infty} \frac{\overline{F_X * F_Y}(x)}{\overline F_X(x)} \ge \frac{1}{2} \liminf_{x\to\infty} \frac{\overline F_X(x-b)}{\overline F_X(x)}. \qquad (4) $$
In part (iv) of the lemma we proved that $F_X \in \mathcal{OL}$. It is easy to verify that
$$ F_X \in \mathcal{OL} \iff \limsup_{x\to\infty} \frac{\overline F_X(x-1)}{\overline F_X(x)} < \infty \implies \overline F_X(x-b) \asymp \overline F_X(x) \ \text{ for each } b \in \mathbb R. $$
Therefore, estimate (4) implies that
$$ \liminf_{x\to\infty} \frac{\overline{F_X * F_Y}(x)}{\overline F_X(x)} > 0. \qquad (5) $$
From inequalities (3) and (5) it follows that $\overline{F_X * F_Y}(x) \asymp \overline F_X(x)$. Moreover, by part (ii) of the lemma, $F_X * F_Y \in \mathcal{OS}$. This finishes the proof of the last part of the lemma. □
Lemma 2.
Let $\{\xi_1, \xi_2, \ldots\}$ be a sequence of independent r.v.s for which $F_{\xi_1} \in \mathcal{OS}$, and for the other indices $k \ge 2$ either $F_{\xi_k} \in \mathcal{OS}$ or $\overline F_{\xi_k}(x) = O\big(\overline F_{\xi_1}(x)\big)$. Then $F_{S_n} \in \mathcal{OS}$ for all $n \in \mathbb N$.
Proof. If $n = 1$, then the statement is obvious because $S_1 = \xi_1$. If $n = 2$, then two options are possible: $F_{\xi_2} \in \mathcal{OS}$ or $\overline F_{\xi_2}(x) = O\big(\overline F_{\xi_1}(x)\big)$. In the first case, $F_{S_2} = F_{\xi_1} * F_{\xi_2} \in \mathcal{OS}$ according to part (iii) of Lemma 1. In the second case, $F_{S_2} \in \mathcal{OS}$ by part (v) of the same lemma.
Let now $n > 2$. Denote
$$ \mathcal K = \big\{ k \in \{2, \ldots, n\} : \overline F_{\xi_k}(x) = O\big(\overline F_{\xi_1}(x)\big) \big\}. $$
Initially assume that the set $\mathcal K$ is empty. In such a case, $F_{\xi_k} \in \mathcal{OS}$ for all indices $k \in \mathcal K^c = \{1, 2, 3, \ldots, n\}$. By part (iii) of Lemma 1 we get that $F_{S_n} \in \mathcal{OS}$.
Let now the index set $\mathcal K = \{k_1, k_2, \ldots, k_r\} \subset \{2, \ldots, n\}$ be nonempty. Since
$$ \overline F_{\xi_{k_1}}(x) = O\big(\overline F_{\xi_1}(x)\big), $$
part (v) of Lemma 1 implies that
$$ F_{\xi_1} * F_{\xi_{k_1}} \in \mathcal{OS} \qquad (6) $$
and
$$ \overline{F_{\xi_1} * F_{\xi_{k_1}}}(x) \asymp \overline F_{\xi_1}(x). \qquad (7) $$
According to relation (7),
$$ \limsup_{x\to\infty} \frac{\overline F_{\xi_{k_2}}(x)}{\overline{F_{\xi_1} * F_{\xi_{k_1}}}(x)} \le \limsup_{x\to\infty} \frac{\overline F_{\xi_{k_2}}(x)}{\overline F_{\xi_1}(x)} \left( \liminf_{x\to\infty} \frac{\overline{F_{\xi_1} * F_{\xi_{k_1}}}(x)}{\overline F_{\xi_1}(x)} \right)^{-1} < \infty $$
because $\overline F_{\xi_{k_2}}(x) = O\big(\overline F_{\xi_1}(x)\big)$. This means that
$$ \overline F_{\xi_{k_2}}(x) = O\Big(\overline{F_{\xi_1} * F_{\xi_{k_1}}}(x)\Big). $$
Hence, according to (6) and part (v) of Lemma 1, we get
$$ F_{\xi_1} * F_{\xi_{k_1}} * F_{\xi_{k_2}} = \big(F_{\xi_1} * F_{\xi_{k_1}}\big) * F_{\xi_{k_2}} \in \mathcal{OS} $$
and
$$ \overline{F_{\xi_1} * F_{\xi_{k_1}} * F_{\xi_{k_2}}}(x) \asymp \overline{F_{\xi_1} * F_{\xi_{k_1}}}(x). $$
Continuing the process, we obtain
$$ F_{\mathcal K} := F_{\xi_1} * F_{\xi_{k_1}} * F_{\xi_{k_2}} * \cdots * F_{\xi_{k_r}} \in \mathcal{OS} $$
and
$$ \overline{F_{\xi_1} * F_{\xi_{k_1}} * \cdots * F_{\xi_{k_r}}}(x) \asymp \overline{F_{\xi_1} * F_{\xi_{k_1}} * \cdots * F_{\xi_{k_{r-1}}}}(x). $$
For the remaining indices $k \in \mathcal K^c = \{2, 3, \ldots, n\} \setminus \{k_1, k_2, \ldots, k_r\}$, the d.f. $F_{\xi_k} \in \mathcal{OS}$. By part (iii) of Lemma 1 we get
$$ F_{\mathcal K^c} := \mathop{*}_{k \in \mathcal K^c} F_{\xi_k} \in \mathcal{OS}. $$
Using part (iii) of Lemma 1 again, we derive that
$$ F_{S_n} = F_{\mathcal K} * F_{\mathcal K^c} \in \mathcal{OS}. $$
This finishes the proof of Lemma 2. □
Lemma 3.
Let $\{\xi_1, \xi_2, \ldots\}$ be a sequence of independent random variables for which $F_{\xi_1} \in \mathcal{OS}$ and
$$ \limsup_{x\to\infty}\, \sup_{k \ge 1} \frac{\overline F_{\xi_k}(x)}{\overline F_{\xi_1}(x)} < \infty. \qquad (8) $$
Then there exists a constant $\hat C$ for which
$$ \overline F_{S_n}(x) \le \hat C^{\,n-1}\, \overline F_{\xi_1}(x) \qquad (9) $$
for all $x \in \mathbb R$ and all $n \in \mathbb N$.
Proof. Condition (8) implies that
$$ \sup_{k \ge 1} \frac{\overline F_{\xi_k}(x)}{\overline F_{\xi_1}(x)} \le C_6 $$
for all $x \ge A$ with some positive constants $C_6$ and $A$. If $x < A$, then
$$ \sup_{k \ge 1} \frac{\overline F_{\xi_k}(x)}{\overline F_{\xi_1}(x)} \le \frac{1}{\overline F_{\xi_1}(x)} \le \frac{1}{\overline F_{\xi_1}(A)} < \infty. $$
Therefore, for each $x \in \mathbb R$,
$$ \sup_{k \ge 1} \frac{\overline F_{\xi_k}(x)}{\overline F_{\xi_1}(x)} \le \max\left\{ C_6, \frac{1}{\overline F_{\xi_1}(A)} \right\} =: C_7. \qquad (10) $$
In addition, part (i) of Lemma 1 gives that
$$ \overline{F_{\xi_1} * F_{\xi_1}}(x) \le C_8\, \overline F_{\xi_1}(x) \qquad (11) $$
for all $x \in \mathbb R$ with some positive constant $C_8$.
We will prove inequality (9) with the constant $\hat C = C_7 C_8$. If $n = 1$, inequality (9) holds evidently because $\overline F_{S_1}(x) = \overline F_{\xi_1}(x)$. If $n = 2$, then by (10) and (11), for $x \in \mathbb R$ we have
$$ \overline F_{S_2}(x) = \int_{-\infty}^{\infty} \overline F_{\xi_2}(x-y)\, dF_{\xi_1}(y) \le C_7 \int_{-\infty}^{\infty} \overline F_{\xi_1}(x-y)\, dF_{\xi_1}(y) = C_7\, \overline{F_{\xi_1} * F_{\xi_1}}(x) \le \hat C\, \overline F_{\xi_1}(x). $$
Suppose now that inequality (9) holds for $n = m \ge 2$, i.e.,
$$ \overline F_{S_m}(x) \le \hat C^{\,m-1}\, \overline F_{\xi_1}(x), \quad x \in \mathbb R. $$
Choosing $n = m + 1$, from this assumption and from (10), (11) we get
$$ \overline F_{S_{m+1}}(x) = \int_{-\infty}^{\infty} \overline F_{S_m}(x-y)\, dF_{\xi_{m+1}}(y) \le \hat C^{\,m-1} \int_{-\infty}^{\infty} \overline F_{\xi_1}(x-y)\, dF_{\xi_{m+1}}(y) = \hat C^{\,m-1} \int_{-\infty}^{\infty} \overline F_{\xi_{m+1}}(x-y)\, dF_{\xi_1}(y) $$
$$ \le \hat C^{\,m-1} C_7 \int_{-\infty}^{\infty} \overline F_{\xi_1}(x-y)\, dF_{\xi_1}(y) = \hat C^{\,m-1} C_7\, \overline{F_{\xi_1} * F_{\xi_1}}(x) \le \hat C^{\,m}\, \overline F_{\xi_1}(x), \quad x \in \mathbb R. $$
According to the induction principle, inequality (9) holds for all $n \in \mathbb N$. Lemma 3 is proved. □
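The geometric bound of Lemma 3 can be sanity-checked by simulation. In the i.i.d. Pareto sketch below, all choices are illustrative: in particular, the value $\hat C = 3$ is only an ad-hoc empirical constant for this distribution and this $x$, not the constant $C_7 C_8$ constructed in the proof.

```python
# Empirical look at Lemma 3:  bar F_{S_n}(x) <= Chat^(n-1) * bar F_{xi_1}(x).
# Illustrative setup: xi_k i.i.d. with bar F(x) = (1+x)^(-2); Chat = 3 is an
# ad-hoc empirical constant for this example, not the C7*C8 of the proof.
import random

random.seed(1)

def pareto():
    # inverse-transform sample with bar F(x) = (1+x)^(-2), x >= 0
    return random.random() ** (-0.5) - 1.0

x, chat, n_samples = 30.0, 3.0, 200_000
tail_xi1 = (1.0 + x) ** (-2)

ratios = {}
for n in (2, 3, 4):
    hits = sum(1 for _ in range(n_samples) if sum(pareto() for _ in range(n)) > x)
    ratios[n] = (hits / n_samples) / tail_xi1

print(all(ratios[n] <= chat ** (n - 1) for n in ratios))
```

For this subexponential example the estimated ratios grow roughly linearly in $n$, so they sit comfortably below the geometric envelope $\hat C^{\,n-1}$.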

6. Proofs of the main results

In this section, we present the proofs of the main results of the paper.
Proof of Theorem 1. Suppose that $P(\eta \in \{0, 1, \ldots, L\}) = 1$ for some $L \in \mathbb N$. We have
$$ \overline F_{S_\eta}(x) = \sum_{n=1}^{L} P(\eta = n)\, \overline F_{S_n}(x), \quad x \in \mathbb R. $$
Let
$$ L^* = \max\{ k \in \{1, 2, \ldots, L\} : P(\eta = k) > 0 \}. $$
For each $x \in \mathbb R$,
$$ \frac{\overline F_{S_\eta}(x)}{\overline F_{S_{L^*}}(x)} \ge \frac{P(\eta = L^*)\, \overline F_{S_{L^*}}(x)}{\overline F_{S_{L^*}}(x)} = P(\eta = L^*) > 0. \qquad (12) $$
On the other hand,
$$ \overline F_{S_\eta}(x) = \sum_{k=0}^{L^*-1} P(\eta = L^* - k)\, P(S_{L^*-k} > x). \qquad (13) $$
For any random variable $\xi_k$, $k \in \{1, 2, \ldots, L^*\}$, there exists a negative number $a_k$ for which $P(\xi_k \ge a_k) \ge 1/2$. We have
$$ P(S_{L^*-1} > x) = P\big(S_{L^*-1} + a_{L^*} > x + a_{L^*},\ \xi_{L^*} \ge a_{L^*}\big) + P\big(S_{L^*-1} > x,\ \xi_{L^*} < a_{L^*}\big) \le P(S_{L^*} > x + a_{L^*}) + P(S_{L^*-1} > x)\, P(\xi_{L^*} < a_{L^*}). $$
Since $P(\xi_{L^*} < a_{L^*}) \le 1/2$, from this we derive that
$$ P(S_{L^*-1} > x) \le 2\, P(S_{L^*} > x + a_{L^*}) $$
for each $x \in \mathbb R$. Similarly,
$$ P(S_{L^*-2} > x) \le 2\, P(S_{L^*-1} > x + a_{L^*-1}) \le 4\, P(S_{L^*} > x + a_{L^*-1} + a_{L^*}), $$
also for each real number $x$. Continuing the process, we obtain
$$ P(S_{L^*-k} > x) \le 2^k\, P\Big( S_{L^*} > x + \sum_{j=0}^{k-1} a_{L^*-j} \Big) $$
for all $x \in \mathbb R$ and all $k = 1, 2, \ldots, L^* - 1$. After inserting the obtained estimates into equality (13), we get that
$$ \overline F_{S_\eta}(x) \le \sum_{k=0}^{L^*-1} P(\eta = L^* - k)\, 2^k\, P\Big( S_{L^*} > x + \sum_{j=0}^{k-1} a_{L^*-j} \Big) \le P(S_{L^*} > x + a) \sum_{k=0}^{L^*-1} 2^k\, P(\eta = L^* - k) = C\, \overline F_{S_{L^*}}(x + a), $$
where
$$ C = \sum_{k=0}^{L^*-1} 2^k\, P(\eta = L^* - k) \quad \text{and} \quad a = \sum_{j=2}^{L^*} a_j. $$
Consequently, for all $x$,
$$ \frac{\overline F_{S_\eta}(x)}{\overline F_{S_{L^*}}(x)} \le C\, \frac{\overline F_{S_{L^*}}(x + a)}{\overline F_{S_{L^*}}(x)}. $$
By Lemma 2 and part (iv) of Lemma 1, we have that $F_{S_{L^*}} \in \mathcal{OS} \subset \mathcal{OL}$. Therefore,
$$ \limsup_{x\to\infty} \frac{\overline F_{S_\eta}(x)}{\overline F_{S_{L^*}}(x)} < \infty. \qquad (14) $$
By (12) and (14) we have that
$$ \overline F_{S_\eta}(x) \asymp \overline F_{S_{L^*}}(x). $$
Therefore, $F_{S_\eta} \in \mathcal{OS}$ together with $F_{S_{L^*}}$ by part (ii) of Lemma 1. Theorem 1 is proved. □
Proof of Theorem 2. Part (i). Since
$$ \overline F_{S_\eta}(x) = \sum_{n=1}^{\infty} P(\eta = n)\, \overline F_{S_n}(x), $$
by Lemma 3, for all real numbers $x$ we obtain
$$ \frac{\overline F_{S_\eta}(x)}{\overline F_{\xi_1}(x)} \le \frac{\overline F_{\xi_1}(x) \sum_{n=1}^{\infty} \hat C^{\,n-1}\, P(\eta = n)}{\overline F_{\xi_1}(x)} \le E e^{\hat C \eta} < \infty, \qquad (15) $$
where $\hat C$ is the positive constant from Lemma 3.
On the other hand,
$$ \overline F_{S_\eta}(x) \ge P(\eta = 1)\, \overline F_{\xi_1}(x). $$
Hence, under the conditions of part (i), we have that $\overline F_{S_\eta}(x) \asymp \overline F_{\xi_1}(x)$. Therefore, $F_{S_\eta} \in \mathcal{OS}$ according to part (ii) of Lemma 1. Part (i) of Theorem 2 is proved.
Part (ii). If $P(\eta = 1) > 0$, then the assertion of this part follows from the already proved part (i), so we may suppose that $P(\eta = 1) = 0$. Since $E e^{\lambda\eta} < \infty$ for each $\lambda > 0$, inequality (15) implies that
$$ \limsup_{x\to\infty} \frac{\overline F_{S_\eta}(x)}{\overline F_{\xi_1}(x)} < \infty. \qquad (16) $$
In addition, the conditions of part (ii) of the theorem give that
$$ \inf_{k \ge 1} \frac{\overline F_{\xi_k}(x)}{\overline F_{\xi_1}(x)} \ge \Delta $$
for all $x \ge x_\Delta$ and some positive $\Delta$. If $x < x_\Delta$, then
$$ \inf_{k \ge 1} \frac{\overline F_{\xi_k}(x)}{\overline F_{\xi_1}(x)} \ge \inf_{k \ge 1} \frac{\overline F_{\xi_k}(x_\Delta)}{\overline F_{\xi_1}(x_\Delta)}\, \overline F_{\xi_1}(x_\Delta) \ge \Delta\, \overline F_{\xi_1}(x_\Delta) =: \tilde C > 0 $$
due to the assumption $F_{\xi_1} \in \mathcal{OS}$, because $\overline F_{\xi_k}(x) \ge \overline F_{\xi_k}(x_\Delta)$ and $\overline F_{\xi_1}(x) \le 1$. The derived inequalities imply that
$$ \overline F_{\xi_k}(x) \ge \tilde C\, \overline F_{\xi_1}(x) $$
for the positive constant $\tilde C$, all $x \in \mathbb R$, and all $k \in \{1, 2, \ldots\}$.
Using the last estimate, we get
$$ \overline F_{S_2}(x) = \int_{-\infty}^{\infty} \frac{\overline F_{\xi_2}(x-y)}{\overline F_{\xi_1}(x-y)}\, \overline F_{\xi_1}(x-y)\, dF_{\xi_1}(y) \ge \tilde C\, \overline{F_{\xi_1} * F_{\xi_1}}(x) \ge \tilde C\, \overline F_{\xi_1}(0)\, \overline F_{\xi_1}(x), \quad x \in \mathbb R. $$
Similarly,
$$ \overline F_{S_3}(x) = \int_{-\infty}^{\infty} \frac{\overline F_{\xi_3}(x-y)}{\overline F_{\xi_1}(x-y)}\, \overline F_{\xi_1}(x-y)\, dF_{S_2}(y) \ge \tilde C \int_{-\infty}^{\infty} \overline F_{S_2}(x-y)\, dF_{\xi_1}(y) \ge \tilde C^{\,2}\, \overline F_{\xi_1}(0)\, \overline{F_{\xi_1} * F_{\xi_1}}(x) \ge \big( \tilde C\, \overline F_{\xi_1}(0) \big)^2\, \overline F_{\xi_1}(x), \quad x \in \mathbb R. $$
Continuing the process, we obtain
$$ \overline F_{S_n}(x) \ge \big( \tilde C\, \overline F_{\xi_1}(0) \big)^{n-1}\, \overline F_{\xi_1}(x) $$
for all $x \in \mathbb R$ and $n \in \{2, 3, \ldots\}$.
Therefore,
$$ \liminf_{x\to\infty} \frac{\overline F_{S_\eta}(x)}{\overline F_{\xi_1}(x)} \ge \liminf_{x\to\infty} \frac{P(\eta = \tilde L)\, \overline F_{S_{\tilde L}}(x)}{\overline F_{\xi_1}(x)} \ge P(\eta = \tilde L)\, \big( \tilde C\, \overline F_{\xi_1}(0) \big)^{\tilde L - 1} > 0, \qquad (17) $$
where $\tilde L = \min\{ n \ge 2 : P(\eta = n) > 0 \}$.
The derived inequalities (16) and (17) imply $\overline F_{S_\eta}(x) \asymp \overline F_{\xi_1}(x)$. By part (ii) of Lemma 1, we get $F_{S_\eta} \in \mathcal{OS}$. Theorem 2 is proved. □
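A mixed-tail configuration of the kind covered by Theorem 1 can also be explored numerically. In the sketch below (illustrative choices only, not from the paper), $\eta$ is uniform on $\{1, 2\}$ (bounded), $\xi_1$ has the Pareto-type tail $\overline F_{\xi_1}(x) = (1+x)^{-2}$, which lies in $\mathcal{OS}$, and $\xi_2 \sim \mathrm{Exp}(1)$, so that $\overline F_{\xi_2}(x) = e^{-x} = O\big(\overline F_{\xi_1}(x)\big)$; the simulated ratio $\overline F_{S_\eta}(x)/\overline F_{\xi_1}(x)$ should stay bounded, as the theorem and Lemma 1(v) suggest:

```python
# Simulation sketch for a Theorem 1 configuration (illustrative choices):
# eta uniform on {1,2}; xi_1 with bar F(x) = (1+x)^(-2), a member of OS;
# xi_2 ~ Exp(1), whose tail is O(bar F_{xi_1}(x)).
import random

random.seed(3)

def pareto():
    # inverse-transform sample with bar F(x) = (1+x)^(-2), x >= 0
    return random.random() ** (-0.5) - 1.0

x, n_samples = 30.0, 200_000
hits = 0
for _ in range(n_samples):
    if random.random() < 0.5:                    # eta = 1
        s = pareto()
    else:                                        # eta = 2
        s = pareto() + random.expovariate(1.0)
    hits += (s > x)

ratio = (hits / n_samples) / (1.0 + x) ** (-2)
print(0.3 < ratio < 5.0)
```

The heavy Pareto summand dominates the exponential one, so the randomly stopped sum inherits the $\mathcal{OS}$ tail of $\xi_1$, matching the qualitative statement of Theorem 1.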

Author Contributions

Conceptualization, J.Š.; methodology, J.K. and J.Š.; software, J.K.; validation, J.Š.; formal analysis, J.K.; investigation, J.K. and J.Š.; writing-original draft preparation, J.K.; writing-review and editing, J.Š.; visualization, J.Š.; supervision, J.Š.; project administration, J.Š.; funding acquisition, J.K. All authors have read and agreed to the published version of the manuscript.

Institutional Review Board Statement

Not applicable

Data Availability Statement

Not applicable

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Klüppelberg, C. Asymptotic ordering of distribution functions and convolution semigroups. Semigr. Forum 1990, 40, 77–92. [Google Scholar] [CrossRef]
  2. Shimura, T.; Watanabe, T. Infinite divisibility and generalized subexponentiality. Bernoulli 2005, 11, 445–469. [Google Scholar] [CrossRef]
  3. Baltrūnas, A.; Omey, E.; Van Gulck, S. Hazard rates and subexponential distributions. Publ. de l’Institut Math. 2006, 80, 29–46. [Google Scholar] [CrossRef]
  4. Watanabe, T.; Yamamuro, K. Ratio of the tail of an infinitely divisible distribution on the line to that of its Lévy measure. Electron. J. Probab. 2010, 15, 44–74. [Google Scholar] [CrossRef]
  5. Yu, C.; Wang, Y. Tail behavior of supremum of a random walk when Cramér condition fails. Front. Math. China 2014, 9, 431–453. [Google Scholar] [CrossRef]
  6. Cheng, D.; Wang, Y. Asymptotic behavior of the ratio of tail probabilities of sum and maximum of independent random variables. Lith. Math. J. 2012, 52, 29–39. [Google Scholar] [CrossRef]
  7. Lin, J.; Wang, Y. New examples of heavy tailed O-subexponential distributions and related closure properties. Stat. Probab. Lett. 2012, 82, 427–432. [Google Scholar] [CrossRef]
  8. Konstantinides, D.; Leipus, R.; Šiaulys, J. A note on product-convolution for generalized subexponential distributions. Nonlinear Anal.: Model. Control 2022, 27, 1054–1067. [Google Scholar] [CrossRef]
  9. Mikutavičius, G.; Šiaulys, J. Product convolution of generalized subexponential distributions. Mathematics 2023, 11, 248. [Google Scholar] [CrossRef]
  10. Chistyakov, V.P. A theorem on sums of independent, positive random variables and its applications to branching processes. Theor Probab. Appl. 1964, 9, 640–648. [Google Scholar] [CrossRef]
  11. Athreya, K.B.; Ney, P.E. Branching Processes; Springer-Verlag: New York, 1972. [Google Scholar]
  12. Chover, J.; Ney, P.; Waigner, S. Degeneracy properties of subcritical branching processes. Ann. Probab. 1973, 1, 663–673. [Google Scholar] [CrossRef]
  13. Chover, J.; Ney, P.; Waigner, S. Functions of probability measures. J. d’Analyse Math. 1973, 26, 255–302. [Google Scholar] [CrossRef]
  14. Embrechts, P.; Goldie, C.M. On convolution tails. Stoch. Process. their Appl. 1982, 13, 263–278. [Google Scholar] [CrossRef]
  15. Embrechts, P.; Omey, E. A property of long tailed distributions. J. Appl. Probab. 1984, 21, 80–87. [Google Scholar] [CrossRef]
  16. Cline, D.B.H. Intermediate regular and Π variation. Proc. London Math. Soc. 1994, 68, 594–611. [Google Scholar] [CrossRef]
  17. Cline, D.B.H.; Samorodnitsky, G. Subexponentiality of the product of independent random variables. Stoch. Process. their Appl. 1994, 49, 75–98. [Google Scholar] [CrossRef]
  18. Klüppelberg, C. Subexponential distributions and characterization of related classes. Probab. Theory Relat. Fields 1989, 82, 259–269. [Google Scholar] [CrossRef]
  19. Pakes, A.G. Convolution equivalence and infinite divisibility. J. Appl. Probab. 2004, 41, 407–424. [Google Scholar] [CrossRef]
  20. Rogozin, B.A. On the constant in the definition of subexponential distributions. Theory Probab. Appl. 2000, 44, 409–412. [Google Scholar] [CrossRef]
  21. Foss, S.; Korshunov, D. Lower limits and equivalences for convolution tails. Ann. Probab. 2007, 35, 366–383. [Google Scholar] [CrossRef]
  22. Cline, D.B.H. Convolution tails, product tails and domain of attraction. Probab. Theory Relat. Fields 1986, 72, 529–557. [Google Scholar] [CrossRef]
  23. Watanabe, T. The Wiener condition and the conjectures of Embrechts and Goldie. Ann. Probab. 2019, 47, 1221–1239. [Google Scholar] [CrossRef]
  24. Foss, S.; Korshunov, D.; Zachary, S. An Introduction to Heavy-Tailed and Subexponential Distributions, 2nd ed.; Springer: New York, 2013. [Google Scholar]
  25. Athreya, K.B.; Ney, P.E. Branching Processes; Springer: New York, 1972. [Google Scholar]
  26. Embrechts, P.; Klüppelberg, C.; Mikosch, T. Modelling Extremal Events for Insurance and Finance; Springer: Berlin, 1997. [Google Scholar]
  27. Asmussen, S. Applied Probability and Queues, 2nd ed.; Springer: New York, 2003. [Google Scholar]
  28. Denisov, D.; Foss, S.; Korshunov, D. Asymptotics of randomly stopped sums in the presence of heavy tails. Bernoulli 2010, 16, 971–994. [Google Scholar] [CrossRef]
  29. Watanabe, T. Convolution equivalence and distribution of random sums. Probab. Theory Relat. Fields 2008, 142, 367–397. [Google Scholar] [CrossRef]
  30. Schmidli, H. Compound sums and subexponentiality. Bernoulli 1999, 5, 999–1012. [Google Scholar] [CrossRef]
  31. Pakes, A.G. Convolution equivalence and infinite divisibility: corrections and corollaries. J. Appl. Probab. 2007, 44, 295–305. [Google Scholar] [CrossRef]
  32. Wang, Y.; Yang, Y.; Wang, K.; Cheng, D. Some new equivalent conditions on asymptotics and local asymptotics for random sums and their applications. Insur. Math. Econ. 2007, 40, 256–266. [Google Scholar] [CrossRef]
  33. Feller, W. One-sided analogues of Karamata’s regular variation. Enseign. Math. 1969, 15, 107–121. [Google Scholar]
  34. Seneta, E. Regularly Varying Functions. Lecture Notes in Mathematics, volume 508; Springer-Verlag: Berlin, 1976. [Google Scholar]
  35. Bingham, N.H.; Goldie, C.M.; Teugels, J.L. Regular Variation; Cambridge University Press: Cambridge, 1987. [Google Scholar]
  36. Tang, Q.; Yan, J. A sharp inequality for the tail probabilities of i.i.d. r.v.’s with dominatedly varying tails. Sci. China Ser. A 2002, 45, 1006–1011. [Google Scholar] [CrossRef]
  37. Tang, Q.; Tsitsiashvili, G. Precise estimates for the ruin probability in the finite horizon in a discrete-time risk model with heavy-tailed insurance and financial risks. Stoch. Processes Appl. 2003, 108, 299–325. [Google Scholar] [CrossRef]
  38. Cai, J.; Tang, Q. On max-type equivalence and convolution closure of heavy-tailed distributions and their applications. J. Appl. Probab. 2004, 41, 117–130. [Google Scholar] [CrossRef]
  39. Konstantinides, D. A class of heavy tailed distributions. J. Numer. Appl. Math. 2008, 96, 127–138. [Google Scholar]
  40. Embrechts, P.; Goldie, C.M. On closure and factorization properties of subexponential and related distributions. J. Aust. Math. Soc. Ser. A 1980, 29, 243–256. [Google Scholar] [CrossRef]
  41. Foss, S.; Korshunov, D.; Zachary, S. Convolution of long-tailed and subexponential distributions. J. Appl. Probab. 2009, 46, 756–767. [Google Scholar] [CrossRef]
  42. Leipus, R.; Šiaulys, J. Closure of some heavy-tailed distribution classes under random convolution. Lith. math. J. 2012, 52, 249–258. [Google Scholar] [CrossRef]
  43. Danilenko, S.; Šiaulys, J. Randomly stopped sums of not identically distributed heavy tailed random variables. Stat. Probab. Lett. 2016, 113, 84–93. [Google Scholar] [CrossRef]
  44. Danilenko, S.; Markevičiūtė, J.; Šiaulys, J. Randomly stopped sums with exponential-type distributions. Nonlinear Anal.: Model. Control 2017, 22, 793–807. [Google Scholar] [CrossRef]
