Review

A Review of Fractional Order Entropies

by António M. Lopes 1,*,† and José A. Tenreiro Machado 2,†

1 LAETA/INEGI, Faculty of Engineering, University of Porto, Rua Dr. Roberto Frias, 4200-465 Porto, Portugal
2 Department of Electrical Engineering, Institute of Engineering, Polytechnic of Porto, Rua Dr. António Bernardino de Almeida, 431, 4249-015 Porto, Portugal
* Author to whom correspondence should be addressed.
† These authors contributed equally to this work.
Entropy 2020, 22(12), 1374; https://doi.org/10.3390/e22121374
Submission received: 11 November 2020 / Revised: 26 November 2020 / Accepted: 2 December 2020 / Published: 5 December 2020
(This article belongs to the Special Issue Review Papers for Entropy)

Abstract

Fractional calculus (FC) is the area of calculus that generalizes the operations of differentiation and integration. FC operators are non-local and capture the history of dynamical effects present in many natural and artificial phenomena. Entropy is a measure of uncertainty, diversity and randomness, often adopted for characterizing complex dynamical systems. Stemming from the synergies between the two areas, this paper reviews the concept of entropy in the framework of FC. Several new entropy definitions have been proposed in recent decades, expanding the scope of applicability of this seminal tool. However, FC is not yet well disseminated in the entropy community. Therefore, new definitions based on FC can generalize both concepts from the theoretical and applied points of view. The time to come will prove to what extent the new formulations will be useful.

1. Introduction

In recent decades, the generalization of the concepts of differentiation [1,2,3,4] and entropy [5,6,7,8] has received considerable attention. In the first case we may cite fractional calculus (FC) [9,10]. FC was introduced by Leibniz in the scope of mathematics by the end of the 17th century, but only recently found application in biology [11,12], physics [13,14] and engineering [15,16], among other fields [17,18]. The concept of entropy was introduced by Clausius [19] and Boltzmann [20] in the field of thermodynamics. Later, entropy was also explored by Shannon [21] and Jaynes [22] in the context of information theory. Meanwhile, both topics evolved considerably, motivating the formulation of fractional operators [23,24] and entropy indices [25,26,27,28,29,30,31,32,33,34,35,36,37,38]. These generalizations extend the application of the two mathematical tools and highlight certain characteristics, such as power-law behavior, non-locality and long-range memory [39,40].
This paper reviews the concept of entropy in the framework of FC. In fact, FC is not yet well disseminated among the entropy community and, therefore, new definitions based on FC may expand the scope of this powerful tool. To the authors' best knowledge, new entropy definitions are welcomed by the scientific community, somewhat contrary to what happens with recent fractional operators. Consequently, the manuscript does not intend to assess the pros and cons of the distinct formulations for a given problem. In a similar line of thought, the analysis of entropy-based indices proposed in the literature for comparing or characterizing phenomena or probability distributions is outside the focus of this paper. Interested readers can obtain further information on divergence measures [41] and mutual information [42], as well as on sample [43], approximate [44], permutation [45], spectral [46], and fuzzy [47] entropies, among others [48]. Indeed, the main idea of this paper is to review the concept of fractional entropy and to present the current state of its development.
The paper is organized as follows. Section 2 presents the fundamental concepts of FC. Section 3 introduces different entropies with one, two and three parameters. Section 4 reviews the fractional-order entropy formulations. Section 5 compares the different formulations for four well-known distributions. Section 6 assesses the impact of the fractional entropies and analyses their main areas of application. Finally, Section 7 outlines the main conclusions.

2. Fractional-Order Derivatives and Integrals

FC models capture non-local effects, useful in the study of phenomena with long range correlations in time or space.
Let us consider the finite interval $[a, b]$, with $a, b \in \mathbb{R}$ and $a < b$, and let $n - 1 < q < n$, with $n \in \mathbb{N}$. Euler's gamma function is denoted by $\Gamma(\cdot)$ and the operator $[\cdot]$ returns the integer part of its argument. Several definitions of fractional derivatives have been formulated [24,49,50]. A small set is presented in the follow-up, which includes both historically relevant and widely used definitions:
  • The left-side and the right-side Caputo derivatives,
$$ {}^{C}D_{a+}^{q} f(x) = \frac{1}{\Gamma(n-q)} \int_{a}^{x} \frac{1}{(x-\tau)^{q-n+1}}\, \frac{d^{n}}{d\tau^{n}} f(\tau)\, d\tau, \quad x \geq a, \qquad (1) $$

$$ {}^{C}D_{b-}^{q} f(x) = \frac{(-1)^{n}}{\Gamma(n-q)} \int_{x}^{b} \frac{1}{(\tau-x)^{q-n+1}}\, \frac{d^{n}}{d\tau^{n}} f(\tau)\, d\tau, \quad x \leq b, \qquad (2) $$
  • The left-side and the right-side Grünwald-Letnikov derivatives,
$$ {}^{GL}D_{a+}^{q} f(x) = \lim_{h \to 0} h^{-q} \sum_{m=0}^{\left[\frac{x-a}{h}\right]} (-1)^{m} \binom{q}{m} f(x - mh), \quad x \geq a, \qquad (3) $$

$$ {}^{GL}D_{b-}^{q} f(x) = \lim_{h \to 0} h^{-q} \sum_{m=0}^{\left[\frac{b-x}{h}\right]} (-1)^{m} \binom{q}{m} f(x + mh), \quad x \leq b, \qquad (4) $$
  • The Hadamard derivative,
$$ {}^{Ha}D_{+}^{q} f(x) = \frac{q}{\Gamma(1-q)} \int_{0}^{x} \frac{f(x) - f(\tau)}{\left[\log(x/\tau)\right]^{q+1}}\, \frac{d\tau}{\tau}, \qquad (5) $$
  • The left-side and right-side Hilfer derivatives of type $0 \leq \beta \leq 1$,

$$ {}^{H}D_{a+}^{q} f(x) = \left({}^{RL}I_{a+}^{\gamma-q}\, \frac{d^{n}}{dx^{n}}\, {}^{RL}I_{a+}^{(1-\beta)(n-q)} f\right)(x), \qquad (6) $$

$$ {}^{H}D_{b-}^{q} f(x) = \left({}^{RL}I_{b-}^{\gamma-q}\, (-1)^{n} \frac{d^{n}}{dx^{n}}\, {}^{RL}I_{b-}^{(1-\beta)(n-q)} f\right)(x), \qquad (7) $$

    where $\gamma = q + \beta(n-q)$, and ${}^{RL}I_{a+}^{q}$ and ${}^{RL}I_{b-}^{q}$ denote the left-side and right-side Riemann-Liouville fractional integrals of order $q > 0$, respectively, defined by:

$$ {}^{RL}I_{a+}^{q} f(x) = \frac{1}{\Gamma(q)} \int_{a}^{x} \frac{f(\tau)}{(x-\tau)^{1-q}}\, d\tau, \quad x \geq a, \qquad (8) $$

$$ {}^{RL}I_{b-}^{q} f(x) = \frac{1}{\Gamma(q)} \int_{x}^{b} \frac{f(\tau)}{(\tau-x)^{1-q}}\, d\tau, \quad x \leq b, \qquad (9) $$
  • The Karcı derivative
$$ {}^{K}D^{q} f(x) = \lim_{h \to 0} \frac{\dfrac{d}{dh}\left\{[f(x+h)]^{q} - [f(x)]^{q}\right\}}{\dfrac{d}{dh}\left[(x+h)^{q} - x^{q}\right]} = \frac{d f(x)}{dx} \cdot \frac{[f(x)]^{q-1}}{x^{q-1}}. \qquad (10) $$
  • The Liouville, the left-side and the right-side Liouville derivatives,
$$ {}^{L}D^{q} f(x) = \frac{1}{\Gamma(1-q)} \frac{d}{dx} \int_{-\infty}^{x} \frac{f(\tau)}{(x-\tau)^{q}}\, d\tau, \quad -\infty < x < +\infty, \qquad (11) $$

$$ {}^{L}D_{0+}^{q} f(x) = \frac{1}{\Gamma(n-q)} \frac{d^{n}}{dx^{n}} \int_{0}^{x} \frac{f(\tau)}{(x-\tau)^{q-n+1}}\, d\tau, \quad x > 0, \qquad (12) $$

$$ {}^{L}D_{0-}^{q} f(x) = \frac{(-1)^{n}}{\Gamma(n-q)} \frac{d^{n}}{dx^{n}} \int_{x}^{+\infty} \frac{f(\tau)}{(\tau-x)^{q-n+1}}\, d\tau, \quad x < +\infty, \qquad (13) $$
  • The Marchaud, the left-side and the right-side Marchaud derivatives,
$$ {}^{M}D^{q} f(x) = \frac{q}{\Gamma(1-q)} \int_{-\infty}^{x} \frac{f(x) - f(\tau)}{(x-\tau)^{q+1}}\, d\tau, \qquad (14) $$

$$ {}^{M}D_{+}^{q} f(x) = \frac{q}{\Gamma(1-q)} \int_{0}^{+\infty} \frac{f(x) - f(x-\tau)}{\tau^{q+1}}\, d\tau, \qquad (15) $$

$$ {}^{M}D_{-}^{q} f(x) = \frac{q}{\Gamma(1-q)} \int_{0}^{+\infty} \frac{f(x) - f(x+\tau)}{\tau^{q+1}}\, d\tau, \qquad (16) $$
  • The left-side and the right-side Riemann-Liouville derivatives,
$$ {}^{RL}D_{a+}^{q} f(x) = \frac{1}{\Gamma(n-q)} \frac{d^{n}}{dx^{n}} \int_{a}^{x} \frac{f(\tau)}{(x-\tau)^{q-n+1}}\, d\tau, \quad x \geq a, \qquad (17) $$

$$ {}^{RL}D_{b-}^{q} f(x) = \frac{(-1)^{n}}{\Gamma(n-q)} \frac{d^{n}}{dx^{n}} \int_{x}^{b} \frac{f(\tau)}{(\tau-x)^{q-n+1}}\, d\tau, \quad x \leq b, \qquad (18) $$
  • The Riesz derivative,
$$ {}^{R}D_{x}^{q} f(x) = -\frac{1}{2\cos(q\pi/2)}\, \frac{1}{\Gamma(n-q)}\, \frac{d^{n}}{dx^{n}} \left[ \int_{-\infty}^{x} \frac{f(\tau)}{(x-\tau)^{q-n+1}}\, d\tau + \int_{x}^{+\infty} \frac{f(\tau)}{(\tau-x)^{q-n+1}}\, d\tau \right], \qquad (19) $$
  • The local Yang derivative,
$$ {}^{Y}D^{q} f(x)\Big|_{x = x_0} = \lim_{x \to x_0} \frac{\Delta^{q}\left[f(x) - f(x_0)\right]}{(x - x_0)^{q}}. \qquad (20) $$
Often, the Caputo formulation is applied in physics and numerical integration, the Riemann-Liouville in calculus, and the Grünwald-Letnikov in engineering, signal processing and control. These classical definitions are the most frequently used by researchers. In what concerns the mathematical pros and cons of the Karcı and the Yang derivatives, readers may consult [23,51] and references therein. In fact, it should be noted that some formulations need careful reflection and are a matter of some controversy, since many authors do not consider them as fractional operators [23,52,53]. Nevertheless, the debate about what the term 'fractional derivative' really means is still ongoing among contemporary mathematicians [51].
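Of the definitions above, the Grünwald-Letnikov form (3) is the most direct one to discretize, since it requires only samples of f. The following minimal numerical sketch (an illustration added to this review, not taken from the cited references) truncates the limit in (3) at a finite step h and compares the result with the closed form $\frac{\Gamma(3)}{\Gamma(3-q)}\, x^{2-q}$, valid for $f(x) = x^2$ and $a = 0$, and shared by the Grünwald-Letnikov operator for this function:

```python
import numpy as np
from scipy.special import binom, gamma

def gl_derivative(f, x, q, h=1e-3):
    """Left-side Grunwald-Letnikov derivative of order q at x, with a = 0.

    The limit h -> 0 in (3) is truncated at a finite step, so the result
    is an approximation whose accuracy improves as h decreases.
    """
    m = np.arange(int(x / h) + 1)
    weights = (-1.0) ** m * binom(q, m)   # generalized binomial coefficients
    return h ** (-q) * np.dot(weights, f(x - m * h))

q, x = 0.5, 1.0
numeric = gl_derivative(lambda t: t ** 2, x, q)
exact = gamma(3) / gamma(3 - q) * x ** (2 - q)   # closed form for f(t) = t^2
print(numeric, exact)   # the two values agree closely for small h
```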

3. The Concept of Entropy

Let us consider a discrete probability distribution $P = \{p_1, p_2, \ldots, p_N\}$, with $\sum_i p_i = 1$ and $p_i \geq 0$. The Shannon entropy, $S^{(S)}$, of the distribution $P$ is defined as:

$$ S^{(S)} = \sum_i p_i\, I(p_i) = -\sum_i p_i \ln p_i, \qquad (21) $$

and represents the expected value of the information content given by $I(p_i) = -\ln p_i$. Therefore, for the uniform probability distribution we have $p_i = N^{-1}$, $N \in \mathbb{N}$, and the Shannon entropy takes its maximum value $S = \ln N$, yielding the Boltzmann formula up to a multiplicative factor $k$, which denotes the Boltzmann constant.
The Rényi and Tsallis entropies are one-parameter generalizations of (21) given by, respectively:

$$ S_q^{(R)} = \frac{1}{1-q} \ln \sum_i p_i^{q}, \quad q > 0,\; q \neq 1, \qquad (22) $$

$$ S_q^{(T)} = \frac{1}{q-1} \left(1 - \sum_i p_i^{q}\right), \quad q \in \mathbb{R}. \qquad (23) $$

The entropies $S_q^{(R)}$ and $S_q^{(T)}$ reduce to the Shannon formulation $S^{(S)}$ when $q \to 1$. The Rényi entropy has an inverse power law equilibrium distribution [54], satisfying the zero-th law of thermodynamics [55]. It is important in statistics and ecology for quantifying diversity, in quantum information for measuring entanglement, and in computer science for randomness extraction. The Tsallis entropy was proposed in the scope of nonextensive statistical mechanics and has found application in the field of complex dynamics, namely in diffusion equations [56] and Fokker-Planck systems [57].
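As a quick numerical illustration (a minimal sketch added to this review), the snippet below evaluates (21)-(23) for a small distribution and confirms that the Rényi and Tsallis entropies approach the Shannon value as $q \to 1$:

```python
import numpy as np

def shannon(p):
    return -np.sum(p * np.log(p))

def renyi(p, q):
    return np.log(np.sum(p ** q)) / (1.0 - q)

def tsallis(p, q):
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

p = np.array([0.5, 0.25, 0.15, 0.1])
for q in (0.5, 0.999, 2.0):
    print(q, renyi(p, q), tsallis(p, q))
print("Shannon:", shannon(p))   # matched by both entropies for q near 1
```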
Other one-parameter entropies are the Landsberg-Vedral and Abe formulations [26,58]:

$$ S_q^{(L)} = \frac{1}{q-1} \left(\frac{1}{\sum_i p_i^{q}} - 1\right), \qquad (24) $$

$$ S_q^{(A)} = -\sum_i \frac{p_i^{q} - p_i^{q^{-1}}}{q - q^{-1}}, \quad q \in\, ]0, 1]. \qquad (25) $$

Expression (24) is related to the Tsallis entropy by $S_q^{(L)} = S_q^{(T)} / \sum_i p_i^{q}$, and is often known as the normalized Tsallis entropy. Expression (25) is a symmetric modification of the Tsallis entropy, which is invariant under the exchange $q \leftrightarrow q^{-1}$, and we have $S_q^{(A)} = \left[(q-1)\, S_q^{(T)} - (q^{-1}-1)\, S_{q^{-1}}^{(T)}\right] / \left(q - q^{-1}\right)$.
The two-parameter Sharma-Mittal entropy [32] is a generalization of the Shannon, Tsallis and Rényi entropies, and is defined as follows:

$$ S_{r,q}^{(SM)} = \frac{1}{1-r} \left[\left(\sum_i p_i^{q}\right)^{\frac{1-r}{1-q}} - 1\right], \quad q > 0,\; q \neq 1,\; r \neq 1. \qquad (26) $$

The Sharma-Mittal entropy reduces to the Rényi, Tsallis and Shannon formulations in the limits $r \to 1$, $r \to q$ and $\{r, q\} \to \{1, 1\}$, respectively.
Examples of three-parameter formulations are the gamma and the Kaniadakis entropies, $S_{d,c_1,c_2}^{(G)}$ and $S_{\kappa,\tau,\zeta}^{(K)}$, respectively. The gamma entropy is given by [35]:

$$ S_{d,c_1,c_2}^{(G)} = \sum_i \frac{e}{c_2 - c_1}\, \Gamma\!\left(d+1,\, 1 - c_1 \ln p_i,\, 1 - c_2 \ln p_i\right), \qquad (27) $$

where $e$ denotes the Napier constant and $\Gamma(a, z_1, z_2)$ represents the generalized incomplete gamma function, defined by:

$$ \Gamma(a, z_1, z_2) = \Gamma(a, z_1) - \Gamma(a, z_2) = \int_{z_1}^{z_2} t^{a-1} e^{-t}\, dt, \qquad (28) $$

where $\Gamma(x, y) = \int_{y}^{\infty} t^{x-1} e^{-t}\, dt$ is the upper incomplete gamma function.
The entropy $S_{d,c_1,c_2}^{(G)}$ follows the first three Khinchin axioms [35,59,60] within the parameter regions defined by (29) and (30):

$$ c_2 > 1 > c_1 > 0, \quad 1 - \frac{1}{c_1} < d < 1 - \frac{1}{c_2}, \qquad (29) $$

$$ c_1 > 1 > c_2 > 0, \quad 1 - \frac{1}{c_2} < d < 1 - \frac{1}{c_1}. \qquad (30) $$

Different combinations of the parameters yield distinct entropy formulations [35]. For example, if we set $\{d, c_1, c_2\} = \{0, 1, q\}$, then we recover the Tsallis entropy, while for $\{d, c_1, c_2\} = \{0, 1 \pm \epsilon, 1 \mp \epsilon\}$, $\epsilon \to 0$, we obtain the Shannon entropy.
The Kaniadakis entropy belongs to a class of trace-form entropies given by [38]:

$$ S = -\sum_i p_i\, \Lambda(p_i), \qquad (31) $$

where $\Lambda(x)$ is a strictly increasing function defined for positive values of the argument, with $\Lambda(x \to 0^{+}) = -\infty$. The function $\Lambda(x)$ can be viewed as a generalization of the ordinary logarithm [38] that, in the three-parameter case, yields:

$$ \Lambda(x) = \ln_{\kappa,\tau,\zeta}(x) = \frac{\zeta^{\kappa} x^{\tau+\kappa} - \zeta^{-\kappa} x^{\tau-\kappa} - \zeta^{\kappa} + \zeta^{-\kappa}}{(\kappa+\tau)\, \zeta^{\kappa} + (\kappa-\tau)\, \zeta^{-\kappa}}. \qquad (32) $$

Therefore, the Kaniadakis entropy, $S_{\kappa,\tau,\zeta}^{(K)}$, can be expressed as:

$$ S_{\kappa,\tau,\zeta}^{(K)} = -\sum_i p_i \ln_{\kappa,\tau,\zeta}(p_i), \quad \kappa, \zeta \in \mathbb{R}, \quad |\kappa| - 1 < \tau \leq |\kappa|. \qquad (33) $$

The entropy $S_{\kappa,\tau,\zeta}^{(K)}$ is Lesche [61] and thermodynamically [62] stable for $-|\kappa| \leq \tau \leq |\kappa|$. Distinct combinations of the parameters yield several entropy formulations [38]. For example, if we set $\kappa = \tau = (q-1)/2$ or $\kappa \to 0$, $\tau = 0$, then expression (33) yields the Tsallis and the Shannon entropies, respectively.
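The reductions of (33) just mentioned can be checked numerically. The sketch below (an illustration added to this review, implementing the reconstruction of (32) given above) verifies the Tsallis case $\kappa = \tau = (q-1)/2$ and the Shannon limit $\kappa \to 0$, $\tau = 0$, for an arbitrary $\zeta > 0$:

```python
import numpy as np

def gen_log(x, kappa, tau, zeta):
    """Three-parameter generalized logarithm, as reconstructed in (32)."""
    num = (zeta ** kappa * x ** (tau + kappa)
           - zeta ** -kappa * x ** (tau - kappa)
           - zeta ** kappa + zeta ** -kappa)
    den = (kappa + tau) * zeta ** kappa + (kappa - tau) * zeta ** -kappa
    return num / den

def kaniadakis(p, kappa, tau, zeta):
    return -np.sum(p * gen_log(p, kappa, tau, zeta))

p = np.array([0.5, 0.3, 0.2])
q = 1.7
print(kaniadakis(p, (q - 1) / 2, (q - 1) / 2, 2.0))   # Tsallis case
print((1 - np.sum(p ** q)) / (q - 1))
print(kaniadakis(p, 1e-6, 0.0, 2.0))                  # Shannon limit
print(-np.sum(p * np.log(p)))
```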
Other entropies can be found in the literature [63,64], but a thorough review of all proposed formulations is out of the scope of this paper.

4. Fractional Generalizations of Entropy

It was noted [65] that the Shannon and Tsallis entropies have the same generating function, $\sum_i p_i^{x}$, and that the difference between Formulas (21) and (23) is just due to the adopted differentiation operator. In fact, using the standard first-order derivative, $\frac{d}{dx}$, we obtain the Shannon entropy:

$$ S^{(S)} = -\lim_{x \to 1} \frac{d}{dx} \sum_i p_i^{x}, \qquad (34) $$

while adopting the Jackson q-derivative [66], $D_q f(x) = \frac{f(qx) - f(x)}{qx - x}$, $0 < q < 1$, yields the Tsallis entropy [28]:

$$ S_q^{(T)} = -\lim_{x \to 1} D_q \sum_i p_i^{x}. \qquad (35) $$
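The generating-function mechanism of (34) and (35) is easy to reproduce numerically. In the sketch below (an illustration added to this review), the ordinary derivative is approximated by a central finite difference, while the Jackson q-derivative is evaluated exactly at $x = 1$:

```python
import numpy as np

p = np.array([0.4, 0.35, 0.25])
G = lambda x: np.sum(p ** x)   # generating function used in (34) and (35)

# Shannon entropy via -dG/dx at x = 1 (central finite difference).
h = 1e-6
print(-(G(1 + h) - G(1 - h)) / (2 * h), -np.sum(p * np.log(p)))

# Tsallis entropy via the Jackson q-derivative -D_q G at x = 1.
q = 0.6
print(-(G(q) - G(1.0)) / (q - 1.0), (1 - np.sum(p ** q)) / (q - 1))
```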
Other expressions for entropy can be obtained by adopting additional differentiation operators.
In 2001, Akimoto and Suzuki [67] proposed the one-parameter fractional entropy, $S_\alpha^{(AS)}$, given by:

$$ S_\alpha^{(AS)} = -\lim_{x \to 1} \sum_i \frac{d^{\alpha}}{dx^{\alpha}}\, e^{x \ln p_i}, \qquad (36) $$

where $\frac{d^{\alpha}}{dx^{\alpha}} = {}^{RL}D_{a+}^{\alpha}$ is the Riemann-Liouville operator (17), with $a = 0$.

The expressions (36) and (17) yield:

$$ S_\alpha^{(AS)} = \sum_i \frac{\alpha - 1}{\Gamma(2-\alpha)}\, {}_1F_1(1;\, 1-\alpha;\, \ln p_i), \quad 0 < \alpha < 1, \qquad (37) $$

where ${}_1F_1(a; b; x)$ denotes the confluent hypergeometric function of the first kind [68]:

$$ {}_1F_1(a; b; x) = 1 + \frac{a}{b}\frac{x}{1!} + \frac{a(a+1)}{b(b+1)}\frac{x^{2}}{2!} + \frac{a(a+1)(a+2)}{b(b+1)(b+2)}\frac{x^{3}}{3!} + \cdots. \qquad (38) $$
It can be shown [67] that $S_\alpha^{(AS)}$ has the concavity and non-extensivity properties. In the limit $\alpha \to 1$, it obeys positivity and yields the Shannon entropy, $S^{(S)}$.
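A minimal sketch of (37), using SciPy's confluent hypergeometric function (an illustration added to this review; the Shannon value is only approached, since $\alpha = 1$ lies outside the validity range):

```python
import numpy as np
from scipy.special import gamma, hyp1f1

def akimoto_suzuki(p, alpha):
    """Evaluates (37); valid for 0 < alpha < 1."""
    terms = (alpha - 1) / gamma(2 - alpha) * hyp1f1(1.0, 1.0 - alpha, np.log(p))
    return np.sum(terms)

p = np.array([0.5, 0.3, 0.2])
for alpha in (0.25, 0.5, 0.99):
    print(alpha, akimoto_suzuki(p, alpha))
print("Shannon:", -np.sum(p * np.log(p)))   # approached as alpha -> 1
```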
In 2009, Ubriaco introduced a one-parameter fractional entropy, $S_\alpha^{(U)}$, given by [69]:

$$ S_\alpha^{(U)} = -\lim_{x \to 1} \frac{d}{dx}\, {}^{RL}D^{\alpha-1} \sum_i e^{x \ln p_i}, \qquad (39) $$

where ${}^{RL}D^{\alpha-1}$ is the Riemann-Liouville left-side derivative (17) with $a \to -\infty$.

Therefore, we obtain:

$$ S_\alpha^{(U)} = -\lim_{x \to 1} \frac{d}{dx}\, \frac{1}{\Gamma(1-\alpha)} \sum_i \int_{-\infty}^{x} \frac{e^{\tau \ln p_i}}{(x-\tau)^{\alpha}}\, d\tau. \qquad (40) $$

Performing the integration and taking the limit $x \to 1$, it yields:

$$ S_\alpha^{(U)} = \sum_i p_i\, (-\ln p_i)^{\alpha}, \quad 0 \leq \alpha \leq 1. \qquad (41) $$
The Ubriaco entropy (41) is thermodynamically stable and obeys the same properties as the Shannon entropy, with the exception of additivity. When $\alpha \to 1$, we recover the Shannon entropy.
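Expression (41) is straightforward to evaluate; the sketch below (an illustration added to this review) confirms the Shannon limit at $\alpha = 1$:

```python
import numpy as np

def ubriaco(p, alpha):
    """Ubriaco entropy (41): the information content raised to the power alpha."""
    return np.sum(p * (-np.log(p)) ** alpha)

p = np.array([0.5, 0.3, 0.2])
print(ubriaco(p, 0.5))
print(ubriaco(p, 1.0), -np.sum(p * np.log(p)))   # alpha = 1 gives Shannon
```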
In 2012, Yu et al. [70] formulated a one-parameter fractional entropy by means of the simple expression:

$$ S_\alpha^{(Y)} = {}^{RL}I_{0}^{\alpha}\left(-p_i \ln p_i\right), \quad \alpha \in \mathbb{R}^{+}, \qquad (42) $$

where the operator ${}^{RL}I_{0}^{\alpha}$ is the left-side Riemann-Liouville integral (8), with $a = 0$. Expression (42) obeys the concavity property and is an extension and generalization of the Shannon entropy.
Another fractional entropy was derived in 2014 by Radhakrishnan et al. [71], being given by:

$$ S_{q,\alpha}^{(RCJ)} = \sum_i p_i^{q}\, (-\ln p_i)^{\alpha}, \quad q, \alpha > 0. \qquad (43) $$

The two-parameter expression (43) was inspired by (41) and by the entropy (44) derived by Wang in the context of the incomplete information theory [72]:

$$ S_q^{(W)} = \left\langle \ln \frac{1}{p} \right\rangle_q = \sum_i p_i^{q}\, (-\ln p_i), \qquad (44) $$

where $\sum_i p_i^{q} = 1$ and $\langle O \rangle_q = \sum_i p_i^{q}\, O_i$ denotes the q-expectation that characterizes the incomplete normalization.
The entropy (43) is considered a fractional entropy in a fractal phase space, in which the parameters q and $\alpha$ are associated with fractality and fractionality, respectively. In the limit, when (i) $q \to 1$, Equation (43) reduces to (41); (ii) $\alpha \to 1$ recovers $S_q^{(W)}$; and (iii) $\{q, \alpha\} = \{1, 1\}$, expression (43) yields the standard Shannon formula (21).
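These limit cases are immediate to verify numerically, as in the sketch below (an illustration added to this review; for an arbitrary distribution, the comparison with (44) is purely formal, since the incomplete normalization $\sum_i p_i^q = 1$ does not hold in general):

```python
import numpy as np

def rcj(p, q, alpha):
    """Two-parameter fractional entropy (43) of Radhakrishnan et al."""
    return np.sum(p ** q * (-np.log(p)) ** alpha)

p = np.array([0.5, 0.3, 0.2])
print(rcj(p, 1.0, 0.7), np.sum(p * (-np.log(p)) ** 0.7))   # q = 1: Ubriaco (41)
print(rcj(p, 1.4, 1.0), np.sum(p ** 1.4 * -np.log(p)))     # alpha = 1: form of (44)
print(rcj(p, 1.0, 1.0), -np.sum(p * np.log(p)))            # Shannon (21)
```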
In 2014, Machado followed a different line of thought [73], interpreting the Shannon information $I(p_i) = -\ln p_i$ as a function of order zero lying between the integer-order cases $D^{-1} I(p_i) = p_i\,(1 - \ln p_i)$ and $D^{1} I(p_i) = -\frac{1}{p_i}$. In the perspective of FC, this observation motivated the formulation of information and entropy of order $\alpha \in \mathbb{R}$ as [24]:

$$ I_\alpha(p_i) = D^{\alpha} I(p_i) = -\frac{p_i^{-\alpha}}{\Gamma(\alpha+1)} \left[\ln p_i + \tilde{\psi}\right], \qquad (45) $$

$$ S_\alpha^{(M)} = \sum_i \left\{-\frac{p_i^{-\alpha}}{\Gamma(\alpha+1)} \left[\ln p_i + \tilde{\psi}\right]\right\} p_i, \qquad (46) $$

where $D^{\alpha}$ denotes a fractional derivative operator, $\tilde{\psi} = \psi(1) - \psi(1-\alpha)$, and $\psi(\cdot)$ represents the digamma function.
The one-parameter fractional entropy (46) fails to obey some of the Khinchin axioms, with the exception of the case $\alpha = 0$, which leads to the Shannon entropy [74]. This behavior is in line with what occurs in FC, where fractional derivatives fail to obey some of the properties of integer-order operators [1].
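A minimal sketch of (46) (an illustration added to this review), using SciPy's digamma function for $\tilde{\psi}$ and confirming that $\alpha \to 0$ recovers the Shannon entropy:

```python
import numpy as np
from scipy.special import digamma, gamma

def machado(p, alpha):
    """Fractional entropy (46), with psi_tilde = psi(1) - psi(1 - alpha)."""
    psi_tilde = digamma(1.0) - digamma(1.0 - alpha)
    terms = -p ** -alpha / gamma(alpha + 1) * (np.log(p) + psi_tilde)
    return np.sum(terms * p)

p = np.array([0.5, 0.3, 0.2])
print(machado(p, 1e-9), -np.sum(p * np.log(p)))   # alpha -> 0: Shannon
print(machado(p, 0.6))                            # an intermediate order
```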
Expression (46) was generalized by Jalab et al. [75] in the framework of local FC [76]. Adopting (20), the following expression was proposed:

$$ S_\alpha^{(J)} = \sum_i \left\{-\frac{p_i^{-i\alpha}}{\Gamma(i\alpha+1)} \left[\ln p_i + \tilde{\psi}\right]\right\} p_i. \qquad (47) $$

Equation (47) decreases from 1 to $1-\alpha$, for $\alpha \in\, ]0,1[$. Therefore, we have:

$$ S_\alpha^{(J)} \approx \sum_i \left\{-\frac{p_i^{-i\alpha}}{\Gamma(i\alpha+1)} \left[\ln p_i + 1 - \alpha\right]\right\} p_i. \qquad (48) $$
In 2016, Karcı [77] proposed the fractional derivative (10), based on the concept of indefinite limit and on l'Hôpital's rule. Adopting $f(x) = \sum_i p_i^{x}$, and using (10) in (34), he derived the following expression for the fractional entropy [78]:

$$ S_\alpha^{(K)} = -\lim_{x \to 1} {}^{K}D^{\alpha} \sum_i p_i^{x} = -\lim_{x \to 1} \sum_i \frac{\left(p_i^{x}\right)^{\alpha-1}}{x^{\alpha-1}}\, p_i^{x} \ln p_i = -\sum_i p_i^{\alpha} \ln p_i. \qquad (49) $$
In 2019, Ferreira and Machado [79] presented a new formula for the entropy based on the works of Abe [65] and Ubriaco [69]. They start from the definition of the left-side Liouville fractional derivative of a function f with respect to another function g, with $g' > 0$, given by:

$$ {}^{L}D_g^{\alpha} f(x) = \frac{1}{\Gamma(1-\alpha)\, g'(x)}\, \frac{d}{dx} \int_{-\infty}^{x} \left[g(x) - g(s)\right]^{-\alpha} g'(s)\, f(s)\, ds, \quad 0 < \alpha \leq 1. \qquad (50) $$

Choosing $f(x) = p_i^{x}$ and $g(x) = e^{x+1}$, expression (50) leads to:

$$ {}^{L}D_g^{\alpha} f(x) = \frac{1}{\Gamma(1-\alpha)\, e^{x+1}}\, \frac{d}{dx} \int_{-\infty}^{x} \left[e^{x+1} - e^{s+1}\right]^{-\alpha} e^{s+1}\, e^{s \ln(p_i)}\, ds \qquad (51) $$

$$ = \left[1 - \alpha - \ln(p_i)\right]\, e^{(\alpha-1)(x+1) + x\left[1 - \ln(p_i)\right] + 1}\, \frac{\Gamma(1 - \ln(p_i))}{\Gamma(2 - \alpha - \ln(p_i))}. \qquad (52) $$

Therefore, we have:

$$ {}^{L}D_g^{\alpha} f(1) = \left[1 - \alpha - \ln(p_i)\right]\, p_i\, \frac{\Gamma(1 - \ln(p_i))}{\Gamma(2 - \alpha - \ln(p_i))}, \qquad (53) $$

which, applying $\Gamma(x+1) = x\, \Gamma(x)$, for $x > 0$, results in:

$$ {}^{L}D_g^{\alpha} f(1) = p_i\, \frac{\Gamma(1 - \ln(p_i))}{\Gamma(1 - \alpha - \ln(p_i))}. \qquad (54) $$

Using (54) in (34) gives:

$$ S_\alpha^{(FM)} = \sum_i p_i\, \frac{\Gamma(1 - \ln(p_i))}{\Gamma(1 - \alpha - \ln(p_i))}, \quad 0 < \alpha \leq 1. \qquad (55) $$
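Since the Gamma function arguments in (55) are positive, the expression can be evaluated stably through the log-Gamma function. The sketch below (an illustration added to this review) confirms the Shannon limit at $\alpha = 1$ and the proximity to the Ubriaco entropy noted later in Section 5:

```python
import numpy as np
from scipy.special import gammaln

def ferreira_machado(p, alpha):
    """Entropy (55): a ratio of Gamma functions, computed via gammaln."""
    log_ratio = gammaln(1.0 - np.log(p)) - gammaln(1.0 - alpha - np.log(p))
    return np.sum(p * np.exp(log_ratio))

p = np.array([0.5, 0.3, 0.2])
print(ferreira_machado(p, 1.0), -np.sum(p * np.log(p)))           # Shannon at alpha = 1
print(ferreira_machado(p, 0.5), np.sum(p * (-np.log(p)) ** 0.5))  # close to Ubriaco
```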
In 2019, Machado and Lopes [80] proposed two fractional formulations of the Rényi entropy, $S_{q,\alpha}^{(ML1)}$ and $S_{q,\alpha}^{(ML2)}$. Their derivation adopts a general averaging operator, instead of the linear one that is assumed in the Shannon entropy (21). Let us consider a monotonic function $f(x)$ with inverse $f^{-1}(x)$. Therefore, for a set of real values $\{x_i\}$, $i = 1, 2, \ldots$, with probabilities $\{p_i\}$, we can define a general mean [81] associated with $f(x)$ as:

$$ f^{-1}\left(\sum_i p_i\, f(x_i)\right). \qquad (56) $$

Applying (56) to the Shannon entropy (21) we obtain:

$$ S = f^{-1}\left(\sum_i p_i\, f(I(p_i))\right), \qquad (57) $$

where $f(x)$ is a Kolmogorov-Nagumo invertible function [82]. If the postulate of additivity for independent events is considered in (56), then only two functions $f(x)$ are possible, consisting of $f_1(x) = c \cdot x$ and $f_2(x) = c \cdot \exp[(1-q)x]$, with $c, q \in \mathbb{R}$. For $f_1(x)$ we get the ordinary mean and we verify that $S = S^{(S)}$. For $f_2(x) = c \cdot e^{(1-q)x}$ we have the expression:

$$ S = \frac{1}{1-q} \ln \left(\sum_i p_i \cdot \exp\left[(1-q)\, I(p_i)\right]\right), \qquad (58) $$

which gives the Rényi entropy:

$$ S_q^{(R)} = \frac{1}{1-q} \ln \sum_i p_i^{q}, \quad q > 0,\; q \neq 1. \qquad (59) $$

If we combine (45) and (58), then we obtain:

$$ S_{q,\alpha}^{(ML1)} = \frac{1}{1-q} \ln \left(\sum_i p_i \cdot \exp\left[(1-q) \cdot I_\alpha(p_i)\right]\right) = \frac{1}{1-q} \ln \left(\sum_i p_i \cdot \exp\left[(q-1) \cdot \frac{p_i^{-\alpha}}{\Gamma(\alpha+1)}\left(\ln p_i + \tilde{\psi}\right)\right]\right). \qquad (60) $$

On the other hand, if we rewrite (22) as:

$$ S_q^{(R)} = \frac{q}{1-q} \ln \left[\left(\frac{1}{N}\sum_i p_i^{q}\right)^{\frac{1}{q}} \cdot N^{\frac{1}{q}}\right] = \frac{q}{1-q} \ln \left[\langle p_i \rangle_g \cdot N^{\frac{1}{q}}\right], \qquad (61) $$

where $\langle p_i \rangle_g = \left(\frac{1}{N}\sum_i p_i^{q}\right)^{\frac{1}{q}}$ is a generalized mean, then we obtain:

$$ S_{q,\alpha}^{(ML2)} = D^{\alpha} S_q^{(R)} = \frac{1}{N^{\alpha/q}}\, \frac{q}{1-q}\, \frac{\langle p_i \rangle_g^{-\alpha}}{\Gamma(\alpha+1)} \left[\frac{1}{q} \ln N + \ln \langle p_i \rangle_g + \tilde{\psi}\right]. \qquad (62) $$
In the limit, when $\alpha \to 0$, both $S_{q,\alpha}^{(ML1)}$ and $S_{q,\alpha}^{(ML2)}$ yield (22).
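A minimal sketch of (60) (an illustration added to this review), confirming that $\alpha \to 0$ recovers the Rényi entropy (22):

```python
import numpy as np
from scipy.special import digamma, gamma

def ml1(p, q, alpha):
    """Fractional Renyi entropy (60), built on the fractional information (45)."""
    psi_tilde = digamma(1.0) - digamma(1.0 - alpha)
    i_alpha = -p ** -alpha / gamma(alpha + 1) * (np.log(p) + psi_tilde)
    return np.log(np.sum(p * np.exp((1.0 - q) * i_alpha))) / (1.0 - q)

p = np.array([0.5, 0.3, 0.2])
q = 1.8
print(ml1(p, q, 1e-9))                       # alpha -> 0
print(np.log(np.sum(p ** q)) / (1.0 - q))    # Renyi entropy (22)
```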

5. Comparison of the Fractional-Order Entropies

In this section, we use the fractional entropy formulas to compute the entropy of both abstract probability distributions and real-world data series.

5.1. Fractional-Order Entropy of Some Probability Distributions

We calculate the entropy of four well-known probability distributions, namely those of Poisson, Gaussian, Lévy and Weibull. We consider these cases just with the purpose of illustrating the behavior of the different formulations. Obviously, other cases could be considered, but we limit the number for the sake of parsimony. Firstly, we present the results obtained with the one-parameter entropies $S_\alpha^{(AS)}$, $S_\alpha^{(U)}$, $S_\alpha^{(Y)}$, $S_\alpha^{(M)}$, $S_\alpha^{(J)}$, $S_\alpha^{(K)}$ and $S_\alpha^{(FM)}$. Then, we consider the two-parameter formulations $S_{q,\alpha}^{(RCJ)}$, $S_{q,\alpha}^{(ML1)}$ and $S_{q,\alpha}^{(ML2)}$. Table 1 summarizes the constants adopted for the distributions and the intervals of variation of the entropy parameters.
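For reference, the sketch below builds discretized versions of the four distributions following Table 1 (an illustration added to this review; the number of grid points and the renormalization of the sampled densities are assumptions, so entropy values computed from these vectors only approximate those underlying the figures):

```python
import numpy as np
from scipy.special import gamma

# Constant prefactors of the densities are dropped, since they cancel
# when each sampled vector is renormalized to sum to one.
def normalize(w):
    return w / np.sum(w)

z = np.arange(51)                               # Poisson, lambda = 4
poisson = normalize(4.0 ** z / gamma(z + 1))

x = np.linspace(-2, 2, 200)                     # Gaussian, sigma = 4, mu = 0
gauss = normalize(np.exp(-0.5 * (x / 4.0) ** 2))

x = np.linspace(0.1, 20, 200)                   # Levy, c = 4, mu = 0
levy = normalize(np.exp(-4.0 / (2 * x)) / x ** 1.5)

x = np.linspace(0.01, 2.5, 200)                 # Weibull, k = 1.5, lambda = 1
weibull = normalize(x ** 0.5 * np.exp(-x ** 1.5))

for name, p in [("Poisson", poisson), ("Gaussian", gauss),
                ("Levy", levy), ("Weibull", weibull)]:
    print(name, -np.sum(p * np.log(p)))         # Shannon values, grid-dependent
```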
Figure 1 depicts the values of $S_\alpha^{(AS)}$, $S_\alpha^{(U)}$, $S_\alpha^{(Y)}$, $S_\alpha^{(M)}$, $S_\alpha^{(J)}$, $S_\alpha^{(K)}$ and $S_\alpha^{(FM)}$ versus $\alpha \in [0, 1]$. We verify that, in the limits $\alpha \to 0$ or $\alpha \to 1$, the Shannon entropy values 2.087, 5.866, 4.953 and 5.309 are obtained for the Poisson, Gaussian, Lévy and Weibull distributions, respectively. Moreover, it can be seen that $S_\alpha^{(U)}$ and $S_\alpha^{(FM)}$ are very close to each other, $S_\alpha^{(AS)}$ does not obey positivity, $S_\alpha^{(J)}$ diverges at small values of $\alpha$, and $S_\alpha^{(M)}$ has a maximum at values of $\alpha$ close to 0.6 and diverges as $\alpha \to 1$.
Figure 2 portrays the values of $S_{q,\alpha}^{(RCJ)}$, $S_{q,\alpha}^{(ML1)}$ and $S_{q,\alpha}^{(ML2)}$ versus $\alpha \in [-0.6, 0.6]$ and $q \in [1.2, 2.2]$. We verify that, in the domain considered, the entropies vary slightly and do not diverge.

5.2. Fractional-Order Entropy of Real-World Data

We calculate the entropy of a real-world time series, namely the Dow Jones Industrial Average (DJIA) financial index. The DJIA raw data are available at the Yahoo Finance website (https://finance.yahoo.com/). Herein, we consider the stock closing values in the period from 1 January 1987 up to 24 November 2018, with a one-day sampling interval. Occasional missing values, as well as values corresponding to closing days, are estimated using linear interpolation. The processed DJIA time series, $x = \{x_1, x_2, \ldots, x_T\}$, with $T = 12{,}381$ points, is used to construct a histogram of relative frequencies, $f(x)$, with $N = 50$ equally spaced, non-overlapping bins, for estimating the probability distribution of x.
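The estimation procedure just described can be sketched as follows (an illustration added to this review; the series below is a synthetic placeholder, since the raw DJIA data are not distributed with the paper):

```python
import numpy as np

# Placeholder standing in for the interpolated DJIA closing values;
# in the paper, the real data (T = 12,381 points) come from Yahoo Finance.
rng = np.random.default_rng(0)
x = np.cumsum(rng.normal(size=12_381)) + 1000.0

# Histogram of relative frequencies with N = 50 equally spaced,
# non-overlapping bins, used to estimate the probability distribution of x.
counts, _ = np.histogram(x, bins=50)
p = counts / counts.sum()
p = p[p > 0]            # drop empty bins before taking logarithms

print("Shannon estimate:", -np.sum(p * np.log(p)))
```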
Figure 3a depicts the values of $S_\alpha^{(AS)}$, $S_\alpha^{(U)}$, $S_\alpha^{(Y)}$, $S_\alpha^{(M)}$, $S_\alpha^{(J)}$, $S_\alpha^{(K)}$ and $S_\alpha^{(FM)}$ versus $\alpha \in [0, 1]$. We verify that, as shown in Section 5.1, $S_\alpha^{(U)}$ and $S_\alpha^{(FM)}$ yield similar results, $S_\alpha^{(J)}$ diverges for small values of $\alpha$, and $S_\alpha^{(M)}$ has a maximum at values of $\alpha$ close to 0.6, diverging when $\alpha \to 1$.
Figure 3b-d show the values of $S_{q,\alpha}^{(RCJ)}$, $S_{q,\alpha}^{(ML1)}$ and $S_{q,\alpha}^{(ML2)}$ versus $\alpha \in [-0.6, 0.6]$ and $q \in [1.2, 2.2]$, yielding results of the same type as before.

6. Impact and Applications of the Fractional-Order Entropies

To assess the impact of the fractional-order entropies on the scientific community, we consider the number of citations received by the nine papers that first proposed them. Table 2 summarizes the results obtained from the Scopus database (www.scopus.com) on 7 November 2020. We verify that those nine papers were cited 218 times by 170 distinct papers, and that the expressions proposed by Ubriaco and Machado received the most attention.
To unravel the main areas of application of the fractional entropies, we use the VOSviewer (https://www.vosviewer.com/), which allows the construction and visualization of bibliometric networks [83]. The bibliometric data of the 170 papers that cite the nine papers proposing the fractional-order entropies were collected from Scopus for constructing Table 2, and are the input information to the VOSviewer. The co-occurrence of the authors' keywords in the 170 papers is analyzed, with the minimum value of co-occurrence of each keyword set to 3. Figure 4 depicts the generated map. We verify the emergence of six clusters, $C = \{C_1, \ldots, C_6\}$. At the top, the light-blue cluster, $C_1$, includes the fields of finance and financial time series analysis, while the light-green one, $C_2$, encompasses a variety of areas, such as solvents, fractals, commerce, and stochastic systems, tightly connected to some entropy-based complexity measures. On the right, the dark-green cluster, $C_3$, includes the areas of fault detection and image processing. At the bottom of the map, the red cluster, $C_4$, highlights the fields of chromosome and DNA analysis, while the dark-blue one, $C_5$, emphasizes some clustering and visualization techniques, such as multidimensional scaling and hierarchical clustering. On the left, the magenta cluster, $C_6$, includes keywords not related to applications.
In summary, we conclude that the fractional entropies were applied to a considerable number of distinct scientific areas and that we may foresee a promising future for their development by exploring the synergies between the two mathematical tools. The prevalence of some proposals, from the point of view of citations, may be due to the time elapsed since their formulation. Indeed, more recent formulations have not yet had sufficient time to disseminate in the community. Another reason may have to do with the type and audience of the journals where they were published. Nonetheless, a full bibliometric analysis is not the leitmotif of the present paper.

7. Conclusions

This paper reviewed the concept of entropy in the framework of FC. To the best of the authors' knowledge, the fractional entropies proposed so far were included in this review. The different formulations result from the adopted (i) fractional-order operator or (ii) generating function. In general, such entropies are non-extensive and converge to the classical Shannon entropy for certain values of their parameters. The fractional entropies have found applications in the area of complex systems, where the classical formulations revealed some limitations. FC promises a bright future for further developments of entropy and its applications.

Author Contributions

A.M.L. and J.A.T.M. conceived, designed and performed the experiments, analyzed the data and wrote the paper. Both authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Oldham, K.; Spanier, J. The Fractional Calculus: Theory and Application of Differentiation and Integration to Arbitrary Order; Academic Press: New York, NY, USA, 1974. [Google Scholar]
  2. Samko, S.; Kilbas, A.; Marichev, O. Fractional Integrals and Derivatives: Theory and Applications; Gordon and Breach Science Publishers: Amsterdam, The Netherlands, 1993. [Google Scholar]
  3. Miller, K.; Ross, B. An Introduction to the Fractional Calculus and Fractional Differential Equations; John Wiley and Sons: New York, NY, USA, 1993. [Google Scholar]
  4. Kilbas, A.; Srivastava, H.; Trujillo, J. Theory and Applications of Fractional Differential Equations; North-Holland Mathematics Studies; Elsevier: Amsterdam, The Netherlands, 2006; Volume 204. [Google Scholar]
  5. Plastino, A.; Plastino, A.R. Tsallis entropy and Jaynes’ Information Theory formalism. Braz. J. Phys. 1999, 29, 50–60. [Google Scholar] [CrossRef]
  6. Li, X.; Essex, C.; Davison, M.; Hoffmann, K.H.; Schulzky, C. Fractional Diffusion, Irreversibility and Entropy. J. Non-Equilib. Thermodyn. 2003, 28, 279–291. [Google Scholar] [CrossRef]
  7. Mathai, A.; Haubold, H. Pathway model, superstatistics, Tsallis statistics, and a generalized measure of entropy. Phys. A Stat. Mech. Appl. 2007, 375, 110–122. [Google Scholar] [CrossRef] [Green Version]
  8. Anastasiadis, A. Special Issue: Tsallis Entropy. Entropy 2012, 14, 174–176. [Google Scholar] [CrossRef] [Green Version]
  9. Tenreiro Machado, J.A.; Kiryakova, V. Recent history of the fractional calculus: Data and statistics. In Handbook of Fractional Calculus with Applications: Basic Theory; Kochubei, A., Luchko, Y., Eds.; De Gruyter: Berlin, Germany, 2019; Volume 1, pp. 1–21. [Google Scholar]
  10. Machado, J.T.; Galhano, A.M.; Trujillo, J.J. On development of fractional calculus during the last fifty years. Scientometrics 2014, 98, 577–582. [Google Scholar] [CrossRef] [Green Version]
  11. Ionescu, C. The Human Respiratory System: An Analysis of the Interplay between Anatomy, Structure, Breathing and Fractal Dynamics; Series in BioEngineering; Springer: London, UK, 2013. [Google Scholar]
  12. Lopes, A.M.; Machado, J.T. Fractional order models of leaves. J. Vib. Control. 2014, 20, 998–1008. [Google Scholar] [CrossRef] [Green Version]
  13. Hilfer, R. Application of Fractional Calculus in Physics; World Scientific: Singapore, 2000. [Google Scholar]
  14. Tarasov, V. Fractional Dynamics: Applications of Fractional Calculus to Dynamics of Particles, Fields and Media; Springer: New York, NY, USA, 2010. [Google Scholar]
  15. Parsa, B.; Dabiri, A.; Machado, J.A.T. Application of Variable order Fractional Calculus in Solid Mechanics. In Handbook of Fractional Calculus with Applications: Applications in Engineering, Life and Social Sciences, Part A; Baleanu, D., Lopes, A.M., Eds.; De Gruyter: Berlin, Germany, 2019; Volume 7, pp. 207–224. [Google Scholar]
  16. Lopes, A.M.; Machado, J.A.T. Fractional-order modeling of electro-impedance spectroscopy information. In Handbook of Fractional Calculus with Applications: Applications in Engineering, Life and Social Sciences, Part A; Baleanu, D., Lopes, A.M., Eds.; De Gruyter: Berlin, Germany, 2019; Volume 7, pp. 21–41. [Google Scholar]
  17. Valério, D.; Ortigueira, M.; Machado, J.T.; Lopes, A.M. Continuous-time fractional linear systems: Steady-state behaviour. In Handbook of Fractional Calculus with Applications: Applications in Engineering, Life and Social Sciences, Part A; Petráš, I., Ed.; De Gruyter: Berlin, Germany, 2019; Volume 6, pp. 149–174. [Google Scholar]
  18. Tarasov, V.E. On history of mathematical economics: Application of fractional calculus. Mathematics 2019, 7, 509. [Google Scholar] [CrossRef] [Green Version]
  19. Clausius, R. The Mechanical Theory of Heat: With Its Applications to the Steam-Engine and to the Physical Properties of Bodies; Van Voorst, J., Ed.; Creative Media Partners: Sacramento, CA, USA, 1867. [Google Scholar]
  20. Boltzmann, L. Vorlesungen über die Principe der Mechanik; Barth, J.A., Ed.; Nabu Press: Charleston, SC, USA, 1897; Volume 1. [Google Scholar]
  21. Shannon, C.E. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423, 623–656. [Google Scholar] [CrossRef] [Green Version]
  22. Jaynes, E.T. Information theory and statistical mechanics. Phys. Rev. 1957, 106, 620. [Google Scholar] [CrossRef]
  23. Ortigueira, M.D.; Machado, J.T. What is a fractional derivative? J. Comput. Phys. 2015, 293, 4–13. [Google Scholar] [CrossRef]
  24. Valério, D.; Trujillo, J.J.; Rivero, M.; Machado, J.T.; Baleanu, D. Fractional calculus: A survey of useful formulas. Eur. Phys. J. Spec. Top. 2013, 222, 1827–1846. [Google Scholar] [CrossRef]
  25. Lopes, A.M.; Tenreiro Machado, J.; Galhano, A.M. Multidimensional Scaling Visualization Using Parametric Entropy. Int. J. Bifurc. Chaos 2015, 25, 1540017. [Google Scholar] [CrossRef] [Green Version]
  26. Landsberg, P.T.; Vedral, V. Distributions and channel capacities in generalized statistical mechanics. Phys. Lett. A 1998, 247, 211–217. [Google Scholar] [CrossRef]
  27. Beck, C. Generalised information and entropy measures in physics. Contemp. Phys. 2009, 50, 495–510. [Google Scholar] [CrossRef]
  28. Tsallis, C. Possible generalization of Boltzmann-Gibbs statistics. J. Stat. Phys. 1988, 52, 479–487. [Google Scholar] [CrossRef]
  29. Kaniadakis, G. Statistical mechanics in the context of special relativity. Phys. Rev. E 2002, 66, 056125. [Google Scholar] [CrossRef] [Green Version]
  30. Naudts, J. Generalized thermostatistics based on deformed exponential and logarithmic functions. Phys. A Stat. Mech. Appl. 2004, 340, 32–40. [Google Scholar] [CrossRef] [Green Version]
  31. Abe, S.; Beck, C.; Cohen, E.G. Superstatistics, thermodynamics, and fluctuations. Phys. Rev. E 2007, 76, 031102. [Google Scholar] [CrossRef] [Green Version]
  32. Sharma, B.D.; Mittal, D.P. New nonadditive measures of entropy for discrete probability distributions. J. Math. Sci. 1975, 10, 28–40. [Google Scholar]
  33. Wada, T.; Suyari, H. A two-parameter generalization of Shannon–Khinchin axioms and the uniqueness theorem. Phys. Lett. A 2007, 368, 199–205. [Google Scholar] [CrossRef] [Green Version]
  34. Bhatia, P. On certainty and generalized information measures. Int. J. Contemp. Math. Sci. 2010, 5, 1035–1043. [Google Scholar]
  35. Asgarani, S. A set of new three-parameter entropies in terms of a generalized incomplete Gamma function. Phys. A Stat. Mech. Appl. 2013, 392, 1972–1976. [Google Scholar] [CrossRef]
  36. Hanel, R.; Thurner, S. A comprehensive classification of complex statistical systems and an axiomatic derivation of their entropy and distribution functions. EPL (Europhys. Lett.) 2011, 93, 20006. [Google Scholar] [CrossRef]
  37. Sharma, B.D.; Taneja, I.J. Entropy of type (α, β) and other generalized measures in information theory. Metrika 1975, 22, 205–215. [Google Scholar] [CrossRef]
  38. Kaniadakis, G. Maximum entropy principle and power-law tailed distributions. Eur. Phys. J. B-Condens. Matter Complex Syst. 2009, 70, 3–13. [Google Scholar] [CrossRef] [Green Version]
  39. Tarasov, V.E. Lattice model with power-law spatial dispersion for fractional elasticity. Cent. Eur. J. Phys. 2013, 11, 1580–1588. [Google Scholar] [CrossRef] [Green Version]
  40. Nigmatullin, R.; Baleanu, D. New relationships connecting a class of fractal objects and fractional integrals in space. Fract. Calc. Appl. Anal. 2013, 16, 911–936. [Google Scholar] [CrossRef]
  41. Lin, J. Divergence measures based on the Shannon entropy. IEEE Trans. Inf. Theory 1991, 37, 145–151. [Google Scholar] [CrossRef] [Green Version]
  42. Cover, T.M.; Thomas, J.A. Entropy, relative entropy and mutual information. Elem. Inf. Theory 1991, 2, 1–55. [Google Scholar]
  43. Ebrahimi, N.; Pflughoeft, K.; Soofi, E.S. Two measures of sample entropy. Stat. Probab. Lett. 1994, 20, 225–234. [Google Scholar] [CrossRef]
  44. Pincus, S.M. Approximate entropy as a measure of system complexity. Proc. Natl. Acad. Sci. USA 1991, 88, 2297–2301. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  45. Bandt, C.; Pompe, B. Permutation entropy: A natural complexity measure for time series. Phys. Rev. Lett. 2002, 88, 174102. [Google Scholar] [CrossRef] [PubMed]
  46. Pan, Y.; Chen, J.; Li, X. Spectral entropy: A complementary index for rolling element bearing performance degradation assessment. Proc. Inst. Mech. Eng. Part C J. Mech. Eng. Sci. 2009, 223, 1223–1231. [Google Scholar] [CrossRef]
  47. Fan, J.L.; Ma, Y.L. Some new fuzzy entropy formulas. Fuzzy Sets Syst. 2002, 128, 277–284. [Google Scholar] [CrossRef]
  48. Rosso, O.A.; Blanco, S.; Yordanova, J.; Kolev, V.; Figliola, A.; Schürmann, M.; Başar, E. Wavelet entropy: A new tool for analysis of short duration brain electrical signals. J. Neurosci. Methods 2001, 105, 65–75. [Google Scholar] [CrossRef]
  49. De Oliveira, E.C.; Tenreiro Machado, J.A. A review of definitions for fractional derivatives and integral. Math. Probl. Eng. 2014, 2014, 238459. [Google Scholar] [CrossRef] [Green Version]
  50. Sousa, J.V.D.C.; de Oliveira, E.C. On the ψ-Hilfer fractional derivative. Commun. Nonlinear Sci. Numer. Simul. 2018, 60, 72–91. [Google Scholar] [CrossRef]
  51. Katugampola, U.N. Correction to “What is a fractional derivative?” by Ortigueira and Machado [Journal of Computational Physics, Volume 293, 15 July 2015, Pages 4–13. Special issue on Fractional PDEs]. J. Comput. Phys. 2016, 321, 1255–1257. [Google Scholar] [CrossRef] [Green Version]
  52. Tarasov, V.E. No nonlocality. No fractional derivative. Commun. Nonlinear Sci. Numer. Simul. 2018, 62, 157–163. [Google Scholar] [CrossRef] [Green Version]
  53. Abdelhakim, A.A.; Machado, J.A.T. A critical analysis of the conformable derivative. Nonlinear Dyn. 2019, 95, 3063–3073. [Google Scholar] [CrossRef]
  54. Lenzi, E.; Mendes, R.; Da Silva, L. Statistical mechanics based on Rényi entropy. Phys. A Stat. Mech. Appl. 2000, 280, 337–345. [Google Scholar] [CrossRef]
  55. Parvan, A.; Biró, T. Extensive Rényi statistics from non-extensive entropy. Phys. Lett. A 2005, 340, 375–387. [Google Scholar] [CrossRef] [Green Version]
  56. Plastino, A.; Casas, M.; Plastino, A. A nonextensive maximum entropy approach to a family of nonlinear reaction–diffusion equations. Phys. A Stat. Mech. Appl. 2000, 280, 289–303. [Google Scholar] [CrossRef]
  57. Frank, T.; Daffertshofer, A. H-theorem for nonlinear Fokker–Planck equations related to generalized thermostatistics. Phys. A Stat. Mech. Appl. 2001, 295, 455–474. [Google Scholar] [CrossRef]
  58. Abe, S. A note on the q-deformation-theoretic aspect of the generalized entropies in nonextensive physics. Phys. Lett. A 1997, 224, 326–330. [Google Scholar] [CrossRef]
  59. Khinchin, A.I. Mathematical Foundations of Information Theory; Dover: New York, NY, USA, 1957. [Google Scholar]
  60. Shannon, C.E.; Weaver, W. The Mathematical Theory of Communication; University of Illinois Press: Urbana, IL, USA, 1963. [Google Scholar]
  61. Lesche, B. Instabilities of Rényi entropies. J. Stat. Phys. 1982, 27, 419–422. [Google Scholar] [CrossRef]
  62. Gell-Mann, M.; Tsallis, C. Nonextensive Entropy: Interdisciplinary Applications; Oxford University Press: Oxford, UK, 2004. [Google Scholar]
  63. Amigó, J.M.; Balogh, S.G.; Hernández, S. A brief review of generalized entropies. Entropy 2018, 20, 813. [Google Scholar] [CrossRef] [Green Version]
  64. Namdari, A.; Li, Z. A review of entropy measures for uncertainty quantification of stochastic processes. Adv. Mech. Eng. 2019, 11, 1687814019857350. [Google Scholar] [CrossRef]
  65. Abe, S. Nonextensive statistical mechanics of q-bosons based on the q-deformed entropy. Phys. Lett. A 1998, 244, 229–236. [Google Scholar] [CrossRef]
  66. Jackson, F.H. On q-functions and a certain difference operator. Earth Environ. Sci. Trans. R. Soc. Edinb. 1909, 46, 253–281. [Google Scholar] [CrossRef]
  67. Akimoto, M.; Suzuki, A. Proposition of a New Class of Entropy. J. Korean Phys. Soc. 2001, 38, 460–463. [Google Scholar]
  68. Abramowitz, M.; Stegun, I.A. (Eds.) Handbook of Mathematical Functions with Formulas, Graphs, and Mathematical Tables; Dover: New York, NY, USA, 1965. [Google Scholar]
  69. Ubriaco, M.R. Entropies based on fractional calculus. Phys. Lett. A 2009, 373, 2516–2519. [Google Scholar] [CrossRef] [Green Version]
  70. Yu, S.; Huang, T.Z.; Liu, X.; Chen, W. Information measures based on fractional calculus. Inf. Process. Lett. 2012, 112, 916–921. [Google Scholar] [CrossRef]
  71. Radhakrishnan, C.; Chinnarasu, R.; Jambulingam, S. A Fractional Entropy in Fractal Phase Space: Properties and Characterization. Int. J. Stat. Mech. 2014, 2014, 460364. [Google Scholar] [CrossRef] [Green Version]
  72. Wang, Q.A. Extensive generalization of statistical mechanics based on incomplete information theory. Entropy 2003, 5, 220–232. [Google Scholar] [CrossRef] [Green Version]
  73. Machado, J.T. Fractional Order Generalized Information. Entropy 2014, 16, 2350–2361. [Google Scholar] [CrossRef] [Green Version]
  74. Bagci, G.B. The third law of thermodynamics and the fractional entropies. Phys. Lett. A 2016, 380, 2615–2618. [Google Scholar] [CrossRef]
  75. Jalab, H.A.; Subramaniam, T.; Ibrahim, R.W.; Kahtan, H.; Noor, N.F.M. New Texture Descriptor Based on Modified Fractional Entropy for Digital Image Splicing Forgery Detection. Entropy 2019, 21, 371. [Google Scholar] [CrossRef] [Green Version]
  76. Yang, X.J. Advanced Local Fractional Calculus and Its Applications; World Science Publisher: New York, NY, USA, 2012. [Google Scholar]
  77. Karcı, A. New approach for fractional order derivatives: Fundamentals and analytic properties. Mathematics 2016, 4, 30. [Google Scholar] [CrossRef] [Green Version]
  78. Karcı, A. Fractional order entropy: New perspectives. Optik 2016, 127, 9172–9177. [Google Scholar] [CrossRef]
  79. Ferreira, R.A.; Tenreiro Machado, J. An Entropy Formulation Based on the Generalized Liouville Fractional Derivative. Entropy 2019, 21, 638. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  80. Machado, J.T.; Lopes, A.M. Fractional Rényi entropy. Eur. Phys. J. Plus 2019, 134, 217. [Google Scholar] [CrossRef]
  81. Beliakov, G.; Sola, H.B.; Sánchez, T.C. A Practical Guide to Averaging Functions; Springer: Cham, Switzerland, 2016. [Google Scholar]
  82. Xu, D.; Erdogmuns, D. Renyi’s entropy, divergence and their nonparametric estimators. In Information Theoretic Learning; Springer: Berlin/Heidelberg, Germany, 2010; pp. 47–102. [Google Scholar]
  83. Van Eck, N.J.; Waltman, L. Software survey: VOSviewer, a computer program for bibliometric mapping. Scientometrics 2010, 84, 523–538. [Google Scholar] [CrossRef] [PubMed] [Green Version]
Figure 1. The values of $S_\alpha^{(AS)}$, $S_\alpha^{(U)}$, $S_\alpha^{(Y)}$, $S_\alpha^{(M)}$, $S_\alpha^{(J)}$, $S_\alpha^{(K)}$ and $S_\alpha^{(FM)}$ versus $\alpha \in [0, 1]$ for the (a) Poisson, (b) Gaussian, (c) Lévy and (d) Weibull distributions.
Figure 2. The values of $S_{q,\alpha}^{(RCJ)}$, $S_{q,\alpha}^{(ML1)}$ and $S_{q,\alpha}^{(ML2)}$ versus $\alpha \in [-0.6, 0.6]$ and $q \in [1.2, 2.2]$ for the (a-c) Poisson, (d-f) Gaussian, (g-i) Lévy and (j-l) Weibull distributions.
Figure 3. The entropy of the DJIA stock index for daily closing values in the period from 1 January 1987 up to 24 November 2018, with one-day sampling interval: (a) $S_\alpha^{(AS)}$, $S_\alpha^{(U)}$, $S_\alpha^{(Y)}$, $S_\alpha^{(M)}$, $S_\alpha^{(J)}$, $S_\alpha^{(K)}$ and $S_\alpha^{(FM)}$ versus $\alpha \in [0, 1]$; (b-d) $S_{q,\alpha}^{(RCJ)}$, $S_{q,\alpha}^{(ML1)}$ and $S_{q,\alpha}^{(ML2)}$ versus $\alpha \in [-0.6, 0.6]$ and $q \in [1.2, 2.2]$.
Figure 4. The map of co-occurrence of the authors' keywords in the 170 papers extracted from Scopus for constructing Table 2. The minimum value of co-occurrence of each keyword is 3. The clusters are represented by $C = \{C_1, \ldots, C_6\}$.
Table 1. The constants adopted for the probability distributions and the intervals of variation of the entropy parameters.

| Distribution | Expression | Parameters | Domain | Order, 1-par. entropies | Order, 2-par. entropies |
|---|---|---|---|---|---|
| Poisson | $f(z) = \frac{\lambda^{z} e^{-\lambda}}{z!}$ | $\lambda = 4$ | $z = 0, 1, \ldots, 50$ | $\alpha \in [0, 1]$ | $\alpha \in [-0.6, 0.6]$, $q \in [1.2, 2.2]$ |
| Gaussian | $f(x) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{1}{2}\left(\frac{x-\mu}{\sigma}\right)^{2}}$ | $\sigma = 4$, $\mu = 0$ | $x \in [-2, 2]$ | $\alpha \in [0, 1]$ | $\alpha \in [-0.6, 0.6]$, $q \in [1.2, 2.2]$ |
| Lévy | $f(x) = \sqrt{\frac{c}{2\pi}}\, \frac{e^{-\frac{c}{2(x-\mu)}}}{(x-\mu)^{3/2}}$ | $c = 4$, $\mu = 0$ | $x \in [0.1, 20]$ | $\alpha \in [0, 1]$ | $\alpha \in [-0.6, 0.6]$, $q \in [1.2, 2.2]$ |
| Weibull | $f(x) = \frac{k}{\lambda}\left(\frac{x}{\lambda}\right)^{k-1} e^{-(x/\lambda)^{k}}$ | $k = 1.5$, $\lambda = 1$ | $x \in [0.01, 2.5]$ | $\alpha \in [0, 1]$ | $\alpha \in [-0.6, 0.6]$, $q \in [1.2, 2.2]$ |
Table 2. Citations received by the nine papers that proposed the fractional-order entropies, according to the Scopus database on 7 November 2020.

| Entropy | Equation Number | Authors | Reference | N. Citations | Year |
|---|---|---|---|---|---|
| $S_\alpha^{(AS)}$ | (37) | Akimoto and Suzuki | [67] | 5 | 2001 |
| $S_\alpha^{(U)}$ | (41) | Ubriaco | [69] | 88 | 2009 |
| $S_\alpha^{(Y)}$ | (42) | Yu et al. | [70] | 7 | 2012 |
| $S_{q,\alpha}^{(RCJ)}$ | (43) | Radhakrishnan et al. | [71] | 3 | 2014 |
| $S_\alpha^{(M)}$ | (46) | Machado | [73] | 79 | 2014 |
| $S_\alpha^{(J)}$ | (47) | Jalab et al. | [75] | 6 | 2019 |
| $S_\alpha^{(K)}$ | (49) | Karcı | [77] | 16 | 2016 |
| $S_\alpha^{(FM)}$ | (55) | Ferreira and Machado | [79] | 4 | 2019 |
| $S_{q,\alpha}^{(ML1)}$ | (60) | Machado and Lopes | [80] | 5 | 2019 |
| $S_{q,\alpha}^{(ML2)}$ | (62) | Machado and Lopes | [80] | 5 | 2019 |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
