Article
Version 1
Preserved in Portico. This version is not peer-reviewed.
Entropy of Difference
Received: 24 November 2023 / Approved: 14 December 2023 / Online: 14 December 2023 (08:55:02 CET)
A peer-reviewed article of this Preprint also exists.
Nardone, P.; Sonnino, G. Entropy of Difference: A New Tool for Measuring Complexity. Axioms 2024, 13, 130.
Abstract
Here, we propose a new tool to estimate the complexity of a time series: the entropy of difference (ED). The method is based solely on the sign of the difference between neighbouring values in a time series. This makes it possible to describe the signal as efficiently as previously proposed parameters such as permutation entropy (PE) or modified permutation entropy (mPE), but (1) it reduces the sample size needed to estimate the parameter value, and (2) it enables the use of the Kullback-Leibler divergence to estimate the "distance" between the time series data and random signals.
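The construction described above can be sketched in a few lines: encode the sign of successive differences as bits, group them into words, and compute the Shannon entropy of the empirical word distribution. This is a minimal illustration of the idea only; the word length m, the binary up/down encoding, and the i.i.d. Gaussian reference signal are assumptions made here for the sketch, not necessarily the authors' exact construction.

```python
import numpy as np

def word_probs(x, m=3):
    """Empirical distribution of length-m binary words built from the
    sign of successive differences (1 = increase, 0 = non-increase)."""
    s = (np.diff(np.asarray(x, dtype=float)) > 0).astype(int)
    n = len(s) - m + 1
    codes = np.zeros(n, dtype=int)
    for j in range(m):  # encode each word as an integer in 0 .. 2**m - 1
        codes = codes * 2 + s[j:j + n]
    return np.bincount(codes, minlength=2 ** m) / n

def entropy_of_difference(x, m=3):
    """Shannon entropy (bits) of the sign-of-difference word distribution."""
    p = word_probs(x, m)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def kl_to_random(x, m=3, n_ref=200_000, seed=0):
    """Kullback-Leibler divergence (bits) between the signal's word
    distribution and that of an i.i.d. Gaussian reference, with the
    reference distribution estimated by simulation (an assumption
    for this sketch)."""
    rng = np.random.default_rng(seed)
    p = word_probs(x, m)
    q = word_probs(rng.standard_normal(n_ref), m)
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))
```

For example, a strictly monotone series produces a single sign word, so its ED is zero, while a strictly alternating series with m = 2 splits evenly between the two words "up-down" and "down-up", giving exactly one bit.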
Keywords
entropy; complexity measure; random signal
Subject
Computer Science and Mathematics, Applied Mathematics
Copyright: This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.