Submitted: 11 May 2025
Posted: 13 May 2025
Abstract

Keywords:
1. Introduction
1.1. From Probability Theory to Information Theory
1.2. The Challenge of Accurate Entropy Estimation
1.3. Related Works and Previous Contributions
1.4. Motivation and Contribution of This Paper
2. Overview of Materials and Methods of Information Theory
2.1. Transforming a Discrete-State Stochastic Process into a Probability Distribution
2.2. Integer-Order Rényi α-Entropies as Synthetic Indices for the Characterization of PDs
2.3. Rényi Entropy Rates
2.4. Relationship Between Specific Rényi Entropy Rate and Specific Rényi Entropy
2.5. Rényi Generalized Dimensions
2.6. Rényi Generalized Dimension Rates
2.7. Rényi Mutual Information
2.8. Measures of Entropic Distance Between Distributions
2.9. Two Fundamental Families of Structured Probability Distributions
1. Mono Dominant with Uniform Residues (MD_UR): This family generates distributions in which, starting from the uniform case, a single symbol progressively absorbs more probability mass by subtracting it uniformly from the other symbols. The family can be defined as $p_1 = \frac{1}{q} + \varepsilon$ and $p_i = \frac{1}{q} - \frac{\varepsilon}{q-1}$ for $i = 2, \dots, q$, where the parameter $\varepsilon \in \left[0, \frac{q-1}{q}\right]$ controls the strength of the dominant symbol.
2. Uniform Block Dominant with Mono Residue (UBD_MR): This family defines structured, biased distributions starting from a uniform base. Probability mass is progressively removed from the last non-zero event and equally redistributed among all preceding ones. The resulting distribution exhibits a dominant block of equal, high-probability values at the front, followed by a single lower residual value, with all subsequent entries set to zero. This construction induces a non-trivial truncation of the support and does not admit a simple closed-form expression. Both constructions are illustrated in the sketch after this list.
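As a concrete illustration, the following Python sketch generates members of both families consistently with the verbal definitions above. The parameter names (`eps`, `t`, `step`) and the small-step iterative redistribution used for UBD_MR are our own assumptions, since the paper gives no closed form for that family:

```python
import numpy as np

def md_ur(q: int, eps: float) -> np.ndarray:
    """MD_UR: symbol 0 absorbs probability mass eps, subtracted
    uniformly from the other symbols.  eps = 0 gives the uniform PD;
    eps = (q-1)/q gives a degenerate PD."""
    p = np.full(q, 1.0 / q)
    p[0] += eps
    p[1:] -= eps / (q - 1)
    return p

def ubd_mr(q: int, t: float, step: float = 1e-4) -> np.ndarray:
    """UBD_MR: repeatedly move a small amount of mass from the last
    non-zero entry onto all preceding entries, until a total of t has
    been moved.  Yields a uniform high-probability block, one lower
    residual value, and trailing zeros."""
    p = np.full(q, 1.0 / q)
    moved = 0.0
    while moved < t:
        last = np.nonzero(p)[0][-1]      # index of the last non-zero entry
        if last == 0:                    # all mass collapsed onto p[0]
            break
        d = min(step, p[last], t - moved)
        p[last] -= d
        p[:last] += d / last             # equal redistribution to preceding entries
        moved += d
    return p

print(md_ur(6, 0.5))    # [0.667, 0.067, 0.067, 0.067, 0.067, 0.067]
print(ubd_mr(6, 0.25))  # uniform front block, one smaller residue, trailing zeros
```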
2.10. Ambiguity Reduction in Entropy-Based Measures
2.11. Interpreting Rényi Entropies as the Negative Logarithm of Generalized Means
3. Practical Implementation of Methods of Information Theory
3.1. Converting a Realization into a Relative Frequency Distribution
3.2. Rényi Mean and Specific Rényi Entropy of a Relative Frequency Distribution
3.3. Apparent vs. Actual Specific Entropies in Controlled Probabilistic Processes
3.4. Determination of the Relationship Between Actual and Apparent Rényi Mean
- $A$ is the scaling factor;
- $B$ is the translation offset;
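In symbols, and assuming that the notation of footnote 1 carries over to this section (the original formula is not recoverable here, so this display is a reconstruction), the determined relationship is an affine map between the apparent (empirical) and actual Rényi means:

$$
M_{\alpha}^{\mathrm{actual}} \;\approx\; A \, M_{\alpha}^{\mathrm{apparent}} + B .
$$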
3.5. Toward Accurate Estimation of Rényi Entropies via Affine Transformations
- Min-max normalization of actual and empirical Rényi means, to ensure comparability;
- A linear transformation between the two domains;
- The logarithmic relationship between Rényi entropy and Rényi mean.
1. Change of paradigm: the data do not need correction; rather, they have to be rescaled from an empirical to a theoretical context.
2. Consistency: as the sample size $L \to \infty$, the empirical Rényi mean converges to the actual one and the affine map tends to the identity ($A \to 1$, $B \to 0$), ensuring that the estimate converges to the true entropy.
3. Bias and variance control: the estimator compensates for the known bias in empirical frequencies by incorporating a prior empirical mean, resulting in a lower mean squared error compared to classical estimators.
4. Computational efficiency: the formula requires only a few arithmetic operations and, optionally, a numerically precomputed lookup table for the transformation coefficients, making it suitable for large-scale inference.
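To make the pipeline of § 3.5 concrete, here is a schematic Python sketch. It assumes the affine coefficients A and B have already been fitted (e.g., against the reference curves of the MD_UR and UBD_MR families); the function names, the example frequencies, and the base-2 logarithm are our own choices, not the paper's:

```python
import numpy as np

def renyi_mean(p: np.ndarray, alpha: float) -> float:
    """Rényi mean of order alpha (footnote 2), for alpha != 1:
    M_alpha = (sum_i p_i^alpha)^(1/(alpha-1)).
    The Shannon case is the alpha -> 1 limit."""
    p = p[p > 0]
    return float(np.sum(p ** alpha) ** (1.0 / (alpha - 1.0)))

def affine_entropy_estimate(freqs: np.ndarray, alpha: float,
                            A: float, B: float) -> float:
    """First-order affine estimate of the Rényi entropy, in bits:
    rescale the apparent (empirical) Rényi mean into the theoretical
    context, then apply the logarithmic relationship H = -log M."""
    m_apparent = renyi_mean(freqs, alpha)
    m_actual = A * m_apparent + B      # rescaling, not correction
    return -np.log2(m_actual)

# Usage with illustrative, pre-fitted coefficients:
freqs = np.array([0.41, 0.22, 0.15, 0.12, 0.06, 0.04])
print(affine_entropy_estimate(freqs, alpha=2.0, A=0.93, B=0.01))
```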
3.6. A Second Transformation for More Accurate Estimation of Entropies: Orthogonal Stretch
3.7. A Practical Example of Applying the First Order Affine Transformation
1. Choice of the stochastic process: In this case, without loss of generality, and following the foundational approach adopted by early pioneers of probability theory (e.g., Cardano, Galileo, Pascal, Fermat, Huygens, Bernoulli, de Moivre, Newton), we consider a memoryless stochastic process generated by repeated rolls of a six-sided die subject to a specific probability distribution. The key advantage of using memoryless processes is that their entropic state remains invariant under changes in the dimensionality of the sample space used for statistical analysis.
2. Choice of the theoretical PD T: We select a distribution whose entropic characteristics lie far from the regions typically covered by empirical distributions derived from small samples, thereby making the recovery task more challenging.
3. Choice of the composition of the entropic space: We consider a three-dimensional entropic space whose coordinates are the specific Shannon entropy ($h_1$), the specific collision entropy ($h_2$), and the specific min-entropy ($h_\infty$). A generic three-dimensional point is projected onto two two-dimensional points: one located above the diagonal in the collision entropy-Shannon entropy plane (blue zone), and one located beneath the diagonal in the collision entropy-min-entropy plane (red zone). The value of $h_2$ is shared between the two projections.
4. Visualization of theoretical, empirical, and stretched contexts: To clearly distinguish the regions concerning the states of theoretical probability distributions, of their derived empirical distributions, and of the states resulting after the application of the first-order affine transformation, the diagram includes the curves generated by the elaborations over the MD_UR and UBD_MR families, along with the curves relative to the corresponding realizations and their stretched versions.
5. Determination of the entropic state points of the theoretical distribution T: These points describe the entropic state of the initial distribution and represent the target of the estimation process.
6. Generation of realizations: A set of realizations is generated from the process by applying a random number generator to the previously defined probability distribution T. A large number of realizations helps reduce the deviation of the final estimate from the target value. Each realization consists of a sequence of $L$ samples.
7. Choice of the parameters of the sample space Ω: To significantly reduce the density of the data, we select a sample space with the same alphabet as the process and dimension $d$; the total number of elementary events is then $q^d$.
8. Embedding data into the sample space to derive relative frequencies: The number of elementary events of Ω that occur in a realization is bounded by the number of its $d$-grams, which is far smaller than $q^d$; this leads to a very sparse data regime with a correspondingly low data density.
9. Calculation of their center of gravity S: For each realization, we compute the Rényi means and their logarithmic mapping into the entropy plane; the results are then averaged.
10. Calculation of their center of gravity E: The affine transformation is applied to the Rényi means of the empirical data, the entropic state points of the transformed Rényi means are derived, and the results are then averaged.
11. Evaluation of the distance between E and T: The estimation method is satisfactory when point E coincides with, or lies very near, point T. A compact end-to-end sketch of these steps follows this list.
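The sketch below strings steps 1-11 together, reusing `md_ur` and `renyi_mean` from the earlier sketches (assumed in scope). The numerical choices here — the theoretical PD T, the dimension d = 4, the realization count and length, and the use of α = 64 as a stand-in for the min-entropy order — are illustrative assumptions, not the values used in the paper:

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(0)

q, d = 6, 4                      # die alphabet; sample-space dimension (assumed)
L, n_real = 2000, 50             # samples per realization; realizations (assumed)
T = md_ur(q, 0.5)                # step 2: an illustrative theoretical PD
alphas = [1.001, 2.0, 64.0]      # ~Shannon, collision, ~min-entropy orders

states = []
for _ in range(n_real):
    r = rng.choice(q, size=L, p=T)                          # step 6: one realization
    # step 8: embed into Omega (overlapping d-grams assumed)
    dgrams = [tuple(r[i:i + d]) for i in range(L - d + 1)]
    counts = np.array(list(Counter(dgrams).values()), dtype=float)
    f = counts / counts.sum()                               # relative frequencies
    # step 9: specific entropies of the RFD, taken here as H_alpha / d
    states.append([-np.log2(renyi_mean(f, a)) / d for a in alphas])

S = np.mean(states, axis=0)      # center of gravity of the empirical states
print("empirical entropic state S (h1, h2, h_inf):", S)
# Step 10 would now map S through the fitted affine transformation to
# obtain E, and step 11 would compare E against the target state of T.
```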
4. Discussion
4.1. Considerations on the Example Presented in § 3.7
4.2. Considerations Concerning Algorithms for Generating the MD_UR and UBD_MR Families of PDs
4.3. Data Contextualization vs. Data Correction in Entropy Estimation Methods
5. Conclusion
6. Future Work
- Improved transformation models: developing secondary orthogonal transformations to refine entropy estimation, especially min-entropy, through adaptive corrections.
- Analytical characterization and simplification: investigating analytical approximations or tight bounds for empirical Rényi means, reducing computational complexity.
- Variance control and confidence estimation: enhancing robustness by exploring variance control methods, including smoothing kernels, shrinkage techniques, or analytically derived confidence intervals.
- Extension to real-world data: validating the estimator on real datasets, particularly in cybersecurity, neuroscience, and computational biology, to test its practical effectiveness.
- Generalization to other information functionals: extending the affine estimation framework to broader information-theoretic measures, expanding its theoretical and practical scope.
- Integration into statistical and machine learning workflows: exploring applications of entropy estimation within machine learning as feature transformations, loss functions, or regularizers to innovate data-driven modeling.
Funding
Conflicts of Interest
Abbreviations
| Symbol | Meaning |
| --- | --- |
|  | Set of q ordered symbols (alphabet) |
|  | Stationary, infinite-length, discrete-state stochastic process whose samples belong to the alphabet |
| r | Physical realization (data sequence of finite length) taken from a process |
| L | Number of samples constituting r |
|  | Set of physical realizations |
|  | Arithmetic mean of a quantity calculated over the set of realizations |
| Ω | Sample space resulting from the d-fold Cartesian product of the alphabet |
| d | Dimension of Ω |
|  | Cardinality of Ω |
|  | Number of occurrences of elementary events of Ω observed during the evolution of r |
|  | Data density in the sample space |
| PD | Discrete probability distribution |
| MD_UR | Mono Dominant (with Uniform Residues) family of PDs |
| UBD_MR | Uniform Block Dominant (with Mono Residue) family of PDs |
| RFD | Relative frequency distribution |
|  | RFD obtained from a PD whose infinite d-grams are considered coordinates of occurred elementary events of Ω |
|  | RFD obtained from a realization r whose finite d-grams are considered coordinates of occurred elementary events of Ω |
| α | Order of a Rényi mean or of a Rényi entropy |
|  | α-Rényi mean of a PD |
|  | α-Rényi mean of an RFD |
|  | Estimated α-Rényi mean of a PD |
|  | α-Rényi entropy of a PD |
|  | α-Rényi entropy of an RFD |
|  | Specific α-Rényi entropy of a PD |
|  | Specific α-Rényi entropy of an RFD |
|  | Estimated specific α-Rényi entropy of a PD |
|  | Specific α-Rényi entropy rate of a process |
|  | Specific α-Rényi entropy rate of r |
|  | Estimated specific α-Rényi entropy rate of a process |
|  | Dynamical system |
|  | Partition of the state space into d-dimensional cells of a given linear size |
|  | Probability that a trajectory of the dynamical system crosses the i-th cell |
|  | Rényi generalized dimension of order α |
|  | Rényi generalized dimension rate of order α of the dynamical system |
|  | α-Rényi mutual information |
|  | Specific α-Rényi mutual information |
Notes
1. An affine transformation is a mapping of the form $x \mapsto Ax + B$, where A is a linear operator (e.g., a matrix) and B is a fixed vector. It preserves points, straight lines, and planes, and includes operations such as scaling, rotation, translation, and shear.
2. The Rényi mean of order α of a probability distribution $P = (p_1, \dots, p_q)$ is a generalized averaging operator defined as $M_\alpha(P) = \left(\sum_i p_i\, p_i^{\alpha-1}\right)^{1/(\alpha-1)}$. It forms the basis of Rényi entropy, which is defined as $H_\alpha(P) = -\log M_\alpha(P) = \frac{1}{1-\alpha}\log \sum_i p_i^{\alpha}$.
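A quick numerical check of the identity in note 2, using an arbitrary distribution; reading the Rényi mean as a self-weighted Hölder mean of order α − 1 matches the interpretation of § 2.11:

```python
import numpy as np

p = np.array([0.5, 0.25, 0.125, 0.125])
alpha = 2.0

# Rényi mean as a self-weighted Hölder mean of order alpha - 1
M = np.sum(p * p ** (alpha - 1)) ** (1.0 / (alpha - 1))
# Standard Rényi entropy of order alpha (here in bits)
H = np.log2(np.sum(p ** alpha)) / (1 - alpha)
assert np.isclose(H, -np.log2(M))   # H_alpha = -log M_alpha
print(M, H)                          # 0.34375  1.5406...
```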
References
- Cardano, G. Liber de Ludo Aleae (The Book on Games of Chance); written c. 1564, published posthumously in 1663.
- Pascal, B.; de Fermat, P. Correspondence on the Problem of Points; Original correspondence, Toulouse and Paris, 1654. Reprinted and translated in various historical anthologies on the foundations of probability theory.
- Kolmogorov, A. Foundations of the theory of probability, 1950 ed.; Chelsea Publishing Co.: New York, 1933. [Google Scholar]
- Shannon, C. A mathematical theory of communication. The Bell System Technical Journal 1948, 27, 379–423. [Google Scholar] [CrossRef]
- Rényi, A. On measures of entropy and information. In Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability, Volume 1: Contributions to the Theory of Statistics, Berkeley, CA, USA, 1961; pp. 547–561.
- Arikan, E. An inequality on guessing and its application to sequential decoding. In Proceedings of the 1995 IEEE International Symposium on Information Theory, 1995; p. 322. [CrossRef]
- Csiszár, I. Generalized cutoff rates and Rényi’s information measures. IEEE Transactions on information theory 1995, 41, 26–34. [Google Scholar] [CrossRef]
- Beck, C. Generalised information and entropy measures in physics. Contemporary Physics 2009, 50, 495–510. [Google Scholar] [CrossRef]
- Fuentes, J.; Gonçalves, J. Rényi Entropy in Statistical Mechanics. Entropy 2022, 24. [Google Scholar] [CrossRef]
- Cachin, C. Smooth entropy and Rényi entropy. In Proceedings of the International Conference on the Theory and Applications of Cryptographic Techniques. Springer; 1997; pp. 193–208. [Google Scholar]
- Boztas, S. On Rényi entropies and their applications to guessing attacks in cryptography. IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences 2014, 97, 2542–2548. [Google Scholar] [CrossRef]
- Badhe, S.S.; Shirbahadurkar, S.D.; Gulhane, S.R. Renyi entropy and deep learning-based approach for accent classification. Multimedia Tools and Applications 2022, pp. 1–33.
- Sepúlveda-Fontaine, S.A.; Amigó, J.M. Applications of Entropy in Data Analysis and Machine Learning: A Review. Entropy 2024, 26. [Google Scholar] [CrossRef] [PubMed]
- Pál, D.; Póczos, B.; Szepesvári, C. Estimation of Rényi entropy and mutual information based on generalized nearest-neighbor graphs. Advances in neural information processing systems 2010, 23. [Google Scholar]
- Acharya, J.; Orlitsky, A.; Suresh, A.; Tyagi, H. The Complexity of Estimating Rényi Entropy. In Proceedings of the Twenty-Sixth Annual ACM-SIAM Symposium on Discrete Algorithms (SODA 2015), San Diego, CA, USA, 4–6 January 2015; SIAM, 2015; pp. 1855–1869. [CrossRef]
- Skorski, M. Practical Estimation of Renyi Entropy, 2020. arXiv:cs.DS/2002.09264.
- Miller, G.A. Note on the bias of information estimates. Information Theory in Psychology, 1955; II-B, 95–100. [Google Scholar]
- Vasicek, O.A. A Test for Normality Based on Sample Entropy. Journal of the Royal Statistical Society: Series B (Methodological) 1976, 38, 54–59. [Google Scholar] [CrossRef]
- Basharin, G.P. On a statistical estimate for the entropy of a sequence of independent random variables. Theory Probab. Appl. 1959, 4, 333–336. [Google Scholar] [CrossRef]
- Good, I. Maximum entropy for hypothesis formulation, especially for multidimensional contingency tables. Ann. Math. Statist. 1963, 34, 911–934. [Google Scholar] [CrossRef]
- Harris, B. The statistical estimation of entropy in the non-parametric case. Mathematics Research Center – Technical Summary Report 1975.
- Grassberger, P.; Procaccia, I. Estimation of the Kolmogorov entropy from a chaotic signal. Phys. Rev. A 1983, 28, 2591–2593. [Google Scholar] [CrossRef]
- Akaike, H. Prediction and entropy. In Selected Papers of Hirotugu Akaike; Springer, 1985; pp. 387–410.
- Kozachenko, L.F.; Leonenko, N.N. Sample Estimate of the Entropy of a Random Vector. Problemy Peredachi Informatsii 1987, 23, 95–101.
- Grassberger, P. Finite sample corrections to entropy and dimension estimates. Physics Letters A 1988, 128, 369–373. [Google Scholar] [CrossRef]
- Joe, H. Estimation of Entropy and Other Functionals of a Multivariate Density. Annals of the Institute of Statistical Mathematics 1989, 41, 683–697. [Google Scholar] [CrossRef]
- Hall, P.; Morton, S. On the estimation of entropy. Annals of the Institute of Statistical Mathematics 1993, 45, 69–88. [Google Scholar] [CrossRef]
- Wolpert, D.; Wolf, D. Estimating functions of probability distributions from a finite set of samples. Phys. Rev. E 1995, 52, 6841–6854. [Google Scholar] [CrossRef] [PubMed]
- Pöschel, T.; Ebeling, W.; Rosé, H. Guessing probability distributions from small samples. Journal of Statistical Physics 1995, 80, 1443–1452. [Google Scholar] [CrossRef]
- Schürmann, T.; Grassberger, P. Entropy estimation of symbol sequences. Chaos: An Interdisciplinary Journal of Nonlinear Science 1996, 6, 414–427. [Google Scholar] [CrossRef]
- Paluš, M. Coarse-grained entropy rates for characterization of complex time series. Physica D: Nonlinear Phenomena 1996, 93, 64–77. [Google Scholar] [CrossRef]
- Panzeri, S.; Treves, A. Analytical estimates of limited sampling biases in different information measures. Network: Computation in Neural Systems 1996, 7, 87–107. [Google Scholar] [CrossRef]
- Beirlant, J.; Dudewicz, E.; Györfi, L.; Denes, I. Nonparametric entropy estimation: An overview. International Journal of Mathematical and Statistical Sciences 1997, 6, 17–39.
- Schmitt, A.; Herzel, H. Estimating the Entropy of DNA Sequences. Journal of theoretical biology 1997, 188, 369–77. [Google Scholar] [CrossRef]
- Strong, S.P.; Koberle, R.; de Ruyter van Steveninck, R.R.; Bialek, W. Entropy and Information in Neural Spike Trains. Phys. Rev. Lett. 1998, 80, 197–200. [Google Scholar] [CrossRef]
- Porta, A.; Baselli, G.; Liberati, D.; Montano, N.; Cogliati, C.; Gnecchi-Ruscone, T.; Malliani, A.; Cerutti, S. Measuring regularity by means of a corrected conditional entropy in sympathetic outflow. Biological cybernetics 1998, 78, 71–8. [Google Scholar] [CrossRef] [PubMed]
- Holste, D.; Große, I.; Herzel, H. Bayes’ estimators of generalized entropies. Journal of Physics A: Mathematical and General 1998, 31, 2551–2566. [Google Scholar] [CrossRef]
- Chen, S.; Goodman, J. An empirical study of smoothing techniques for language modeling. Computer Speech & Language 1999, 13, 359–394. [Google Scholar]
- de Wit, T.D. When do finite sample effects significantly affect entropy estimates? The European Physical Journal B - Condensed Matter and Complex Systems 1999, 11, 513–516. [Google Scholar] [CrossRef]
- Rached, Z.; Alajaji, F.; Campbell, L.L. Rényi’s entropy rate for discrete Markov sources. In Proceedings of the CISS, 1999; Vol. 99, pp. 17–19.
- Antos, A.; Kontoyiannis, I. Convergence properties of functional estimates for discrete distributions. Random Structures & Algorithms 2001, 19, 163–193. [Google Scholar] [CrossRef]
- Nemenman, I.; Shafee, F.; Bialek, W. Entropy and inference, revisited. In Advances in Neural Information Processing Systems 14; Dietterich, T.G., Becker, S., Ghahramani, Z., Eds.; 2002; pp. 471–478. [Google Scholar]
- Chao, A.; Shen, T.J. Nonparametric estimation of Shannon’s index of diversity when there are unseen species. Environ. Ecol. Stat. 2003, 10, 429–443. [Google Scholar] [CrossRef]
- Grassberger, P. Entropy estimates from insufficient samples. arXiv 2003, physics/0307138v2.
- Paninski, L. Estimation of entropy and mutual information. Neural Computation 2003, 15, 1191–1253. [Google Scholar] [CrossRef]
- Wyner, A.J.; Foster, D. On the lower limits of entropy estimation. IEEE Transactions on Information Theory 2003, submitted for publication.
- Amigó, J.M.; Szczepański, J.; Wajnryb, E.; Sanchez-Vives, M.V. Estimating the Entropy Rate of Spike Trains via Lempel-Ziv Complexity. Neural Computation 2004, 16, 717–736. [Google Scholar] [CrossRef]
- Kraskov, A.; Stögbauer, H.; Grassberger, P. Estimating mutual information. Phys. Rev. E 2004, 69, 066138. [Google Scholar] [CrossRef] [PubMed]
- Paninski, L. Estimating entropy on m bins given fewer than m samples. IEEE Transactions on Information Theory 2004, 50, 2200–2203. [Google Scholar] [CrossRef]
- Pöschel, T.; Ebeling, W.; Frömmel, C.; Ramirez, R. Correction algorithm for finite sample statistics. The European physical journal. E, Soft matter 2004, 12, 531–41. [Google Scholar] [CrossRef] [PubMed]
- Schürmann, T. Bias analysis in entropy estimation. Journal of Physics A: Mathematical and General 2004, 37, L295. [Google Scholar] [CrossRef]
- Ciuperca, G.; Girardin, V. On the estimation of the entropy rate of finite Markov chains. In Proceedings of the International Symposium on Applied Stochastic Models and Data Analysis, 2005; pp. 1109–1117.
- Szczepański, J.; Wajnryb, E.; Amigó, J.M. Variance Estimators for the Lempel-Ziv Entropy Rate Estimator. Chaos: An Interdisciplinary Journal of Nonlinear Science 2006, 16, 043102. [Google Scholar] [CrossRef]
- Hlaváčková-Schindler, K.; Paluš, M.; Vejmelka, M.; Bhattacharya, J. Causality detection based on information-theoretic approaches in time series analysis. Physics Reports 2007, 441, 1–46. [Google Scholar] [CrossRef]
- Kybic, J. High-Dimensional Entropy Estimation for Finite Accuracy Data: R-NN Entropy Estimator. In Proceedings of the Information Processing in Medical Imaging; Karssemeijer, N.; Lelieveldt, B., Eds., Berlin, Heidelberg; 2007; pp. 569–580. [Google Scholar]
- Vu, V.; Yu, B.; Kass, R. Coverage-adjusted entropy estimation. Statistics in Medicine 2007, 26, 4039–4060. [Google Scholar] [CrossRef]
- Bonachela, J.; Hinrichsen, H.; Muñoz, M. Entropy estimates of small data sets. Journal of Physics A: Mathematical and Theoretical 2008, 41, 9. [Google Scholar] [CrossRef]
- Grassberger, P. Entropy Estimates from Insufficient Samplings, 2008. arXiv:physics/0307138.
- Hausser, J.; Strimmer, K. Entropy inference and the James-Stein estimator, with application to nonlinear gene association networks. J. Mach. Learn. Res. 2009, 10, 1469–1484. [Google Scholar]
- Lesne, A.; Blanc, J.; Pezard, L. Entropy estimation of very short symbolic sequences. Physical Review E 2009, 79, 046208. [Google Scholar] [CrossRef]
- Golshani, L.; Pasha, E.; Yari, G. Some properties of Rényi entropy and Rényi entropy rate. Information Sciences 2009, 179, 2426–2433. [Google Scholar] [CrossRef]
- Xu, D.; Erdogmus, D. Renyi’s Entropy, Divergence and Their Nonparametric Estimators. In Information Theoretic Learning: Renyi’s Entropy and Kernel Perspectives; Springer New York: New York, NY, 2010; pp. 47–102. [Google Scholar] [CrossRef]
- Källberg, D.; Leonenko, N.; Seleznjev, O. Statistical Inference for Rényi Entropy Functionals. arXiv preprint arXiv:1103.4977 2011.
- Nemenman, I. Coincidences and Estimation of Entropies of Random Variables with Large Cardinalities. Entropy 2011, 13, 2013–2023. [Google Scholar] [CrossRef]
- Vinck, M.; Battaglia, F.; Balakirsky, V.; Vinck, A.; Pennartz, C. Estimation of the entropy based on its polynomial representation. Phys. Rev. E 2012, 85, 051139. [Google Scholar] [CrossRef] [PubMed]
- Boucheron, S.; Lugosi, G.; Massart, P. Concentration Inequalities: A Nonasymptotic Theory of Independence; Oxford Series in Probability and Statistics, Vol. 14; Oxford University Press: Oxford, UK, 2013.
- Zhang, Z.; Grabchak, M. Bias Adjustment for a Nonparametric Entropy Estimator. Entropy 2013, 15, 1999–2011. [Google Scholar] [CrossRef]
- Valiant, G.; Valiant, P. Estimating the Unseen: Improved Estimators for Entropy and Other Properties. J. ACM 2017, 64. [Google Scholar] [CrossRef]
- Li, L.; Titov, I.; Sporleder, C. Improved estimation of entropy for evaluation of word sense induction. Computational Linguistics 2014, 40, 671–685. [Google Scholar] [CrossRef]
- Archer, E.; Park, I.; Pillow, J. Bayesian entropy estimation for countable discrete distributions. The Journal of Machine Learning Research 2014, 15, 2833–2868. [Google Scholar]
- Schürmann, T. A Note on Entropy Estimation. Neural Comput. 2015, 27, 2097–2106. [Google Scholar] [CrossRef]
- Kamath, S.; Verdú, S. Estimation of entropy rate and Rényi entropy rate for Markov chains. In Proceedings of the 2016 IEEE International Symposium on Information Theory (ISIT); 2016; pp. 685–689. [Google Scholar] [CrossRef]
- Skorski, M. Improved estimation of collision entropy in high and low-entropy regimes and applications to anomaly detection. Cryptology ePrint Archive, Paper 2016/1035, 2016.
- Acharya, J.; Orlitsky, A.; Suresh, A.; Tyagi, H. Estimating Rényi entropy of discrete distributions. IEEE Transactions on Information Theory 2017, 63, 38–56. [Google Scholar] [CrossRef]
- de Oliveira, H.; Ospina, R. A Note on the Shannon Entropy of Short Sequences, 2018. [CrossRef]
- Berrett, T.; Samworth, R.; Yuan, M. Efficient multivariate entropy estimation via k-nearest neighbour distances. The Annals of Statistics 2019, 47, 288–318. [Google Scholar] [CrossRef]
- Verdú, S. Empirical estimation of information measures: a literature guide. Entropy 2019, 21, 720. [Google Scholar] [CrossRef] [PubMed]
- Goldfeld, Z.; Greenewald, K.; Niles-Weed, J.; Polyanskiy, Y. Convergence of smoothed empirical measures with applications to entropy estimation. IEEE Transactions on Information Theory 2020, 66, 4368–4391. [Google Scholar] [CrossRef]
- Kim, Y.; Guyot, C.; Kim, Y. On the efficient estimation of Min-entropy. IEEE Transactions on Information Forensics and Security 2021, 16, 3013–3025. [Google Scholar] [CrossRef]
- Contreras Rodríguez, L.; Madarro-Capó, E.; Legón-Pérez, C.; Rojas, O.; Sosa-Gómez, G. Selecting an effective entropy estimator for short sequences of bits and bytes with maximum entropy. Entropy 2021, 23. [Google Scholar] [CrossRef]
- Ribeiro, M.; Henriques, T.; Castro, L.; Souto, A.; Antunes, L.; Costa-Santos, C.; Teixeira, A. The entropy universe. Entropy 2021, 23. [Google Scholar] [CrossRef] [PubMed]
- Subramanian, S.; Hsieh, M.H. Quantum algorithm for estimating alpha-Rényi entropies of quantum states. Physical Review A 2021, 104, 022428. [Google Scholar] [CrossRef]
- Grassberger, P. On Generalized Schürmann Entropy Estimators. Entropy 2022, 24. [Google Scholar] [CrossRef] [PubMed]
- Skorski, M. Towards More Efficient Rényi Entropy Estimation. Entropy 2023, 25, 185. [Google Scholar] [CrossRef]
- Gecchele, A. Collision Entropy Estimation in a One-Line Formula. Cryptology ePrint Archive, Paper 2023/927 2023.
- Al-Labadi, L.; Chu, Z.; Xu, Y. Advancements in Rényi entropy and divergence estimation for model assessment. Computational Statistics 2024, pp. 1–18.
- Álvarez Chaves, M.; Gupta, H.V.; Ehret, U.; Guthke, A. On the Accurate Estimation of Information-Theoretic Quantities from Multi-Dimensional Sample Data. Entropy 2024, 26, 387. [Google Scholar] [CrossRef]
- Pinchas, A.; Ben-Gal, I.; Painsky, A. A Comparative Analysis of Discrete Entropy Estimators for Large-Alphabet Problems. Entropy 2024, 26. [Google Scholar] [CrossRef]
- De Gregorio, J.; Sánchez, D.; Toral, R. Entropy Estimators for Markovian Sequences: A Comparative Analysis. Entropy 2024, 26. [Google Scholar] [CrossRef] [PubMed]
- Onicescu, O. Énergie informationnelle. Comptes Rendus de l’Académie des Sciences 1966, 263. [Google Scholar]
- Pincus, S. Approximate entropy (ApEn) as a complexity measure. Chaos: An Interdisciplinary Journal of Nonlinear Science 1995, 5, 110–117. [Google Scholar] [CrossRef]
- Richman, J.S.; Moorman, J.R. Physiological time-series analysis using approximate entropy and sample entropy. American Journal of Physiology-Heart and Circulatory Physiology 2000, 278, H2039–H2049. [Google Scholar] [CrossRef]
- Costa, M.; Goldberger, A.L.; Peng, C.K. Multiscale Entropy Analysis of Complex Physiologic Time Series. Physical Review Letters 2002, 89, 068102. [Google Scholar] [CrossRef]
- Bandt, C.; Pompe, B. Permutation entropy: a natural complexity measure for time series. Phys. Rev. Lett. 2002, 88, 174102. [Google Scholar] [CrossRef] [PubMed]
- Manis, G.; Aktaruzzaman, M.; Sassi, R. Bubble entropy: An entropy almost free of parameters. IEEE Transactions on Biomedical Engineering 2017, 64, 2711–2718. [Google Scholar] [CrossRef]
- Tsallis, C. Possible generalization of Boltzmann-Gibbs statistics. Journal of Statistical Physics 1988, 52, 479–487. [Google Scholar] [CrossRef]
- Golshani, L.; Pasha, E. Rényi entropy rate for Gaussian processes. Information Sciences 2010, 180, 1486–1491. [Google Scholar] [CrossRef]
- Teixeira, A.; Matos, A.; Antunes, L. Conditional Rényi Entropies. IEEE Transactions on Information Theory 2012, 58, 4273–4277. [Google Scholar] [CrossRef]
- Fehr, S.; Berens, S. On the Conditional Rényi Entropy. IEEE Transactions on Information Theory 2014, 60, 6801–6810. [Google Scholar] [CrossRef]
- Grassberger, P. Generalized dimensions of strange attractors. Physics Letters A 1983, 97, 227–230. [Google Scholar] [CrossRef]
- Van Erven, T.; Harremos, P. Rényi divergence and Kullback-Leibler divergence. IEEE Transactions on Information Theory 2014, 60, 3797–3820. [Google Scholar] [CrossRef]
- Kullback, S.; Leibler, R. On Information and Sufficiency. The Annals of Mathematical Statistics 1951, 22, 79–86. [Google Scholar] [CrossRef]
- Hölder, O. Ueber einen Mittelwertsatz. Mathematische Annalen 1889, 34, 511–518. [Google Scholar]







