The distributions of sums for Shanker, Akash, Ishita, Pranav, Rani, and Ram Awadh

In statistics and probability theory, one of the most important statistics is the sum of random variables. After a probability distribution is introduced, determining the distribution of the sum of n independent and identically distributed random variables is a natural follow-up question. This paper presents the probability density functions of the sum of n independent and identically distributed random variables following the Shanker, Akash, Ishita, Pranav, Rani, and Ram Awadh distributions. All of these distributions are derived with the change-of-variables technique.


Introduction
Sums of random variables have widely known applications in areas such as wireless communications (e.g., satellite communications, equal-gain receivers, and radar) and insurance (Karagiannidis et al. (2005), Van Khuong and Kong (2006), Nadarajah (2008), Ramsay (2008), and Nadarajah et al. (2012)). Moreover, in the field of systems reliability, the lifespan of a 1-out-of-n cold standby spare system can also be expressed as a sum of random variables: if T represents the lifetime of the whole system and X_i denotes the lifetime of the ith component, then T = X_1 + X_2 + ... + X_n.
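As a quick numerical illustration of this reliability view (a sketch of ours, not from the paper), the lifetime of a cold-standby system can be simulated as a sum of component lifetimes; the exponential lifetime model, component count, and rate below are arbitrary choices:

```python
import numpy as np

# Cold-standby sketch (ours): spares are used one after another, so the
# system lifetime T equals the sum of the component lifetimes X_i.
# Exponential component lifetimes and the rate value are arbitrary choices.
rng = np.random.default_rng(0)
n, rate = 3, 0.5
lifetimes = rng.exponential(1.0 / rate, size=(100_000, n))
system_life = lifetimes.sum(axis=1)    # T = X_1 + ... + X_n
print(system_life.mean())              # close to n / rate = 6
```

The sample mean of the simulated system lifetimes agrees with the sum of the component means, as expected for a sum of independent lifetimes.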
Recently, a collection of one-parameter continuous probability density functions (PDFs) has been introduced by Shukla and Shanker (2019). Like the Lindley distribution, each of these functions is a mixture of an exponential distribution with scale parameter θ (exp(θ)) and a gamma distribution with shape parameter n and scale parameter θ (gamma(n, θ)). For instance, n = 2 yields the Shanker distribution and n = 3 the Akash distribution. Using the data of Table 1, the PDFs presented by Shukla and Shanker (2019) are given in Table 2. The authors compared the distributions of Table 2 with the Lindley distribution, and the results showed that they perform better than the Lindley distribution.
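This mixture structure can be checked empirically. Below is a minimal sketch (the helper name and parameter values are our own, not from Shukla and Shanker (2019)) that samples the Shanker distribution via its exp(θ)/gamma(2, θ) mixture representation and compares the sample mean with the known theoretical mean of the Shanker distribution:

```python
import numpy as np

def sample_shanker(theta, size, seed=None):
    """Draw samples from the Shanker distribution via its mixture form:
    exp(theta) with weight theta^2 / (theta^2 + 1), gamma(2, theta) otherwise.
    (Helper name and seeding are our own illustrative choices.)"""
    rng = np.random.default_rng(seed)
    p_exp = theta**2 / (theta**2 + 1.0)
    is_exp = rng.random(size) < p_exp
    exp_draws = rng.exponential(1.0 / theta, size)   # exp(theta) component
    gam_draws = rng.gamma(2.0, 1.0 / theta, size)    # gamma(2, theta) component
    return np.where(is_exp, exp_draws, gam_draws)

theta = 1.5
x = sample_shanker(theta, 200_000, seed=42)
mean_theory = (theta**2 + 2) / (theta * (theta**2 + 1))  # known Shanker mean
print(x.mean(), mean_theory)
```

The two printed values agree to within Monte Carlo error, confirming the mixture representation.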
The Lindley distribution was introduced by Lindley (1958) and is used extensively in reliability theory. Ghitany et al. (2008) examined the Lindley distribution in detail and showed that it is in many ways more flexible than the exponential distribution. The introducers of the PDFs of Table 2 have discussed features such as descriptive statistics (mean, variance, coefficient of variation, skewness, and kurtosis), reliability measures (hazard function, mean residual life, and stress-strength reliability models), stochastic ordering, mean deviations, Lorenz curves, entropies, and a simulation study. However, another important aspect, the distribution of the sum of n independent and identically distributed (IID) random variables for these functions, has not been studied. Therefore, here we calculate the distribution of sums for the aforementioned PDFs.
The prevalent and well-known methods for obtaining the distribution of the sum of n IID random variables are moment generating functions, Laplace transforms, the convolution method, and the change-of-variables technique. These methods work well for distributions with mathematically simple PDFs (e.g., exponential, gamma, Lindley, and Pareto). For density functions with more complex mathematical forms, such as the Weibull and Burr distributions, the distribution of sums is instead obtained via generalized hypergeometric functions such as Fox's H-function and Meijer's G-function.
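As a concrete illustration of the convolution method (our own sketch; the grid spacing and parameter value are arbitrary choices), the density of the sum of two IID Lindley variables can be approximated by a discrete convolution of the Lindley PDF with itself:

```python
import numpy as np

# Convolution-method sketch (ours): density of X1 + X2 for IID Lindley(theta)
# variables, approximated by a discrete convolution on a uniform grid.
# Grid spacing and theta are illustrative choices.
theta = 1.0
dx = 0.01
x = np.arange(0.0, 40.0, dx)
lindley = theta**2 / (theta + 1.0) * (1.0 + x) * np.exp(-theta * x)

# (f * f)(y) = integral of f(t) f(y - t) dt, discretized as a convolution
sum_pdf = np.convolve(lindley, lindley)[: x.size] * dx

print(sum_pdf.sum() * dx)        # normalization: close to 1
print((x * sum_pdf).sum() * dx)  # mean: close to 2*(theta+2)/(theta*(theta+1)) = 3
```

The numerical density integrates to approximately one and has mean twice the Lindley mean, as the convolution method requires.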
There is an extended history for the sums of independent random variables. Many authors have examined the distribution of sums of the random variables with a statistical distribution. Some of the works devoted to the sum of independent random variables can be seen briefly in Table 3.

The distribution of sums
Suppose X_1, X_2, …, X_n denote n IID random variables distributed according to one of the PDFs of Table 2, and let Y = X_1 + X_2 + ⋯ + X_n. By applying the change-of-variables technique, one can obtain the density function of Y for the Shanker, Akash, Ishita, Pranav, Rani, and Ram Awadh distributions. The results are summarized in Table 4.
With the transformation Y_1 = X_1 + X_2 + ⋯ + X_n, Y_2 = X_2, …, Y_n = X_n, the joint PDF of (Y_1, …, Y_n) is
f(y_1, …, y_n) = f_X(y_1 − y_2 − ⋯ − y_n, y_2, …, y_n) |J|, (1)
where |J| is the absolute value of the Jacobian of the transformation (here |J| = 1). Since the X_i are independent, Eq. (1) becomes
f(y_1, …, y_n) = f(y_1 − y_2 − ⋯ − y_n) f(y_2) ⋯ f(y_n). (2)
The marginal PDF of Y = Y_1 is then obtained by integrating out y_2, …, y_n:
f_Y(y) = ∫⋯∫ f(y − y_2 − ⋯ − y_n) f(y_2) ⋯ f(y_n) dy_2 ⋯ dy_n. (3)
Setting n = 2 and n = 3 in Eq. (3), the functions f_Y(y) for y > 0 are obtained explicitly as Eqs. (4) and (5). For the Ishita distribution, the normalizing factors of Eqs. (4) and (5) contain (θ³ + 2)² and (θ³ + 2)³, so the corresponding factor for the sum of n Ishita random variables is (θ³ + 2)ⁿ. Accordingly, from Eqs. (4) and (5), the explicit form for the sum of n IID Ishita random variables follows. By analogous reasoning, all the PDFs of Y reported in Table 4 can be obtained.
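The pattern behind these closed forms can be reproduced numerically from the mixture representation: a sum of n exponential/gamma mixture variables is a binomial mixture of gamma densities. The sketch below (our own construction, using the Shanker distribution as the example; it should agree with the change-of-variables result but is not the paper's Table 4) builds the density of the sum of n Shanker variables this way and checks its normalization and mean:

```python
import numpy as np
from math import comb, factorial

def shanker_sum_pdf(y, n, theta):
    """Density of the sum of n IID Shanker(theta) variables, built from the
    mixture view: each summand is gamma(1, theta) with weight
    p = theta^2/(theta^2 + 1), else gamma(2, theta), so the sum is a binomial
    mixture of gamma(2n - k, theta) densities. (Our construction.)"""
    p = theta**2 / (theta**2 + 1.0)
    total = np.zeros_like(y, dtype=float)
    for k in range(n + 1):             # k exponential summands out of n
        shape = 2 * n - k              # gamma shape of this mixture component
        weight = comb(n, k) * p**k * (1 - p)**(n - k)
        total += weight * theta**shape * y**(shape - 1) * np.exp(-theta * y) / factorial(shape - 1)
    return total

theta, n, dy = 1.5, 3, 0.01
y = np.arange(0.0, 60.0, dy) + dy / 2  # midpoint grid
pdf = shanker_sum_pdf(y, n, theta)
print(pdf.sum() * dy)                  # normalization: close to 1
print((y * pdf).sum() * dy)            # close to n*(theta^2+2)/(theta*(theta^2+1))
```

The density integrates to one and its mean equals n times the Shanker mean, consistent with the closed forms obtained by the change-of-variables technique.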

The mth Moments
Another important measure for a statistical distribution is its mth moment, from which the principal descriptive quantities follow. The mth moment of an arbitrary PDF f_Y can be expressed as
μ′_m = E[Y^m] = ∫ y^m f_Y(y) dy. (7)
For particular values of m, descriptive statistical indicators such as the mean, variance, coefficient of variation, skewness, and kurtosis can be defined. Table 5 presents closed forms for the mth moments of the PDFs of Table 4.
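For instance (a sketch of ours, using the standard closed-form raw moments of the Shanker distribution rather than Table 5 directly), the descriptive indicators follow from the first four raw moments:

```python
from math import factorial

def shanker_raw_moment(m, theta):
    """m-th raw moment of the Shanker distribution, E[X^m] =
    m! (theta^2 + m + 1) / (theta^m (theta^2 + 1)), obtained from its
    exponential/gamma mixture form. (Standard result, derived by us.)"""
    return factorial(m) * (theta**2 + m + 1) / (theta**m * (theta**2 + 1))

theta = 2.0
mu1, mu2, mu3, mu4 = (shanker_raw_moment(m, theta) for m in (1, 2, 3, 4))
var = mu2 - mu1**2
cv = var**0.5 / mu1                                    # coefficient of variation
skew = (mu3 - 3*mu1*mu2 + 2*mu1**3) / var**1.5         # third standardized moment
kurt = (mu4 - 4*mu1*mu3 + 6*mu1**2*mu2 - 3*mu1**4) / var**2
print(mu1, var, cv, skew, kurt)
```

Substituting m = 1, 2, 3, 4 into Eq. (7) in this way yields the mean, variance, coefficient of variation, skewness, and kurtosis for any of the distributions in Table 5.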

Conclusion
In this paper, the probability density functions of the sum of n independent and identically distributed random variables following the Shanker, Akash, Ishita, Pranav, Rani, and Ram Awadh distributions have been derived explicitly using the change-of-variables technique.
Moreover, their mth moments were also calculated in closed form.