Preprint Article, Version 1 (not peer-reviewed); preserved in Portico.

Entropy of Difference

Version 1: Received: 24 November 2023 / Approved: 14 December 2023 / Online: 14 December 2023 (08:55:02 CET)

A peer-reviewed article of this Preprint also exists.

Nardone, P.; Sonnino, G. Entropy of Difference: A New Tool for Measuring Complexity. Axioms 2024, 13, 130.

Abstract

Here, we propose a new tool to estimate the complexity of a time series: the entropy of difference (ED). The method is based solely on the sign of the difference between neighbouring values in a time series. This makes it possible to describe the signal as efficiently as previously proposed measures such as permutation entropy (PE) or modified permutation entropy (mPE), but (1) it reduces the sample size needed to estimate the parameter value, and (2) it enables the use of the Kullback-Leibler divergence to estimate the "distance" between the time-series data and random signals.
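
To make the idea concrete, the sketch below illustrates one plausible reading of the abstract: the series is reduced to binary up/down words formed from the signs of successive differences, the Shannon entropy of the empirical word distribution serves as the complexity estimate, and the Kullback-Leibler divergence is taken against the exact word distribution of an i.i.d. continuous random signal. The word length m = 3, the function names (ed_entropy, kl_to_random, random_word_prob), and the use of base-2 logarithms are illustrative assumptions, not definitions taken from the paper.

```python
import math
import random
from collections import Counter

def sign_words(x, m=3):
    """Binary up/down words: 1 where the series increases, 0 otherwise
    (ties, negligible for continuous data, are counted as 0)."""
    signs = [1 if b > a else 0 for a, b in zip(x, x[1:])]
    return [tuple(signs[i:i + m]) for i in range(len(signs) - m + 1)]

def ed_entropy(x, m=3):
    """Shannon entropy (bits) of the empirical sign-word distribution."""
    counts = Counter(sign_words(x, m))
    n = sum(counts.values())
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def random_word_prob(word):
    """Exact probability of an up/down word under an i.i.d. continuous
    signal, via the classic rank-counting dynamic programme: f[j] counts
    orderings in which the most recent value has rank j among those seen."""
    f = [1]
    for step in word:
        f = [sum(f[:j]) if step == 1 else sum(f[j:]) for j in range(len(f) + 1)]
    return sum(f) / math.factorial(len(word) + 1)

def kl_to_random(x, m=3):
    """Kullback-Leibler divergence (bits) from the empirical sign-word
    distribution of x to the i.i.d.-random reference distribution."""
    counts = Counter(sign_words(x, m))
    n = sum(counts.values())
    return sum(c / n * math.log2((c / n) / random_word_prob(w))
               for w, c in counts.items())

if __name__ == "__main__":
    noise = [random.random() for _ in range(50_000)]
    sine = [math.sin(0.01 * i) for i in range(50_000)]
    print(f"noise: H = {ed_entropy(noise):.3f} bits, KL = {kl_to_random(noise):.4f}")
    print(f"sine : H = {ed_entropy(sine):.3f} bits, KL = {kl_to_random(sine):.4f}")
```

In this sketch, white noise should give a divergence close to zero, while a strongly structured signal such as a sine wave concentrates its sign words on a few monotone patterns, yielding a low entropy and a large divergence.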

Keywords

entropy; complexity measure; random signal

Subject

Computer Science and Mathematics, Applied Mathematics
