
A modified belief entropy in Dempster-Shafer framework


Abstract

How to quantify uncertain information in the framework of Dempster-Shafer evidence theory is still an open issue. Quite a few uncertainty measures have been proposed in the Dempster-Shafer framework; however, the existing studies mainly focus on the mass function itself, while the information carried by the scale of the frame of discernment (FOD) in the body of evidence is ignored. Without taking full advantage of the information in the body of evidence, the existing methods are not as efficient as they could be. In this paper, a modified belief entropy is proposed by considering the scale of the FOD and the relative scale of a focal element with respect to the FOD. Inspired by Deng entropy, the new belief entropy is consistent with Shannon entropy in the sense of probability consistency. Moreover, with less information loss, the new measure overcomes the shortcomings of some other uncertainty measures. A few numerical examples and a case study are presented to show the efficiency and superiority of the proposed method.

1 Introduction

Dempster-Shafer evidence theory [1, 2] is effective in modeling and processing the uncertain information of intelligent systems. It has been extensively studied in many fields such as pattern recognition [3–8], fault diagnosis [9–12], multiple attribute decision making [13–15], risk analysis [16–20], controller design [21, 22] and so on [23–25]. However, some open issues in Dempster-Shafer evidence theory still need further study. Firstly, highly conflicting evidence may lead to counterintuitive results, so conflict management among different information sources should be addressed cautiously [26–29]. Secondly, the dependence among different pieces of evidence should be taken into consideration before applying the combination rule [30–32]. Thirdly, a widely applicable method of generating basic probability assignments (BPAs) should be developed to model uncertain information [33–35]. Finally, the incompleteness of the frame of discernment (FOD) should be taken into consideration in an open world [29, 36–38]. These open issues are all related to uncertainty modeling. One way to manage the uncertainty is to quantify it before further information processing.

Uncertainty often comes from several types of uncertain and incomplete information, including ignorance, vagueness and so on [39]. Uncertainty and ignorance are difficult categories to deal with [40]. Although ignorance increases the uncertain degree of information in an open world, with a proper uncertainty measure one can manage, or even decrease, that uncertain degree. Since uncertainty measurement is a hot topic in information processing [41–44], many theories have been developed for uncertainty modeling, such as Shannon entropy [45], possibility theory [46], fuzzy sets [47], Dempster-Shafer evidence theory [1, 2] and rough sets [48]. Some extended theories and hybrid methods have also been presented for uncertainty measurement, e.g. Hohle’s confusion measure [49], Yager’s dissonance measure [50], the weighted Hartley entropy [51], Klir & Ramer’s discord measure [52], Klir & Parviz’s strife measure [53], Deng entropy [54], generalized evidence theory [29], D numbers [55] and so on [56–61]. Among these methods, Shannon entropy is a well-known uncertainty measure in the probabilistic framework. For example, as a generalization of Shannon entropy, network entropy is an effective measure of the complexity of networks [62–65]. But Shannon entropy cannot be used directly in the framework of Dempster-Shafer evidence theory, because a mass function in evidence theory is a generalized probability assigned on the power set of the FOD. To address this issue, some modified methods based on Shannon entropy have been proposed [49–53], some of which have been successfully applied in real applications [66, 67]. However, these methods are not that effective in some cases [54, 56].

Recently, a new uncertainty measure named Deng entropy was proposed in the Dempster-Shafer framework. Deng entropy can measure the uncertain degree more efficiently than some other uncertainty measures in some cases [54]. Although Deng entropy has been successfully applied in some real applications [9–12, 68], it does not take the scale of the FOD into consideration, which means a loss of available information during information processing. This information loss can make the uncertainty measure fail in some cases. In order to overcome this shortcoming of Deng entropy, a modified belief entropy based on Deng entropy is proposed in this paper. The proposed belief entropy improves the performance of Deng entropy by considering the scale of the FOD and the relative scale of a focal element with respect to the FOD. Moreover, the proposed method retains all the merits of Deng entropy; thus it degenerates to Shannon entropy in the sense of probability consistency.

The rest of this paper is organized as follows. In Section 2, the preliminaries on Dempster–Shafer evidence theory, Shannon entropy, Deng entropy and some uncertainty measures in Dempster–Shafer framework are briefly introduced. In Section 3, the new belief entropy is presented. In Section 4, some numerical examples are presented, as well as a comparative study between the new belief entropy and some other uncertainty measures. In Section 5, a case study is presented to show the effectiveness and the potential application prospect of the new measure. The conclusions and ongoing work are given in Section 6.

2 Preliminaries

Some preliminaries are briefly introduced in this section, including Dempster-Shafer evidence theory, Shannon entropy, Deng entropy and some other typical uncertainty measures in Dempster-Shafer framework.

2.1 Dempster-Shafer evidence theory

Let Ω = {θ1, θ2, …, θi, …, θN} be a finite nonempty set of mutually exclusive and exhaustive events; Ω is called the frame of discernment (FOD). The power set of Ω, denoted as 2^Ω, is composed of 2^N elements, denoted as follows:

(1) 2^Ω = {∅, {θ1}, {θ2}, …, {θN}, {θ1, θ2}, …, {θ1, θ2, …, θi}, …, Ω}

A mass function m is defined as a mapping from the power set 2^Ω to the interval [0, 1], which satisfies the following conditions [1, 2]:

(2) m(∅) = 0, Σ_{A ∈ 2^Ω} m(A) = 1

If m(A) > 0, then A is called a focal element; the mass function m(A) represents how strongly the evidence supports the proposition A.

A body of evidence (BOE), also known as a basic probability assignment (BPA) or basic belief assignment (BBA), is represented by the focal sets and their associated mass values:

(3) (ℜ, m), where ℜ = {A ⊆ Ω : m(A) > 0}

where ℜ is a subset of the power set 2^Ω, and each A ∈ ℜ has an associated nonzero mass value m(A).

A BPA m can also be represented by its associated belief function Bel and plausibility function Pl, defined as follows:

(4) Bel(A) = Σ_{B ⊆ A} m(B),  Pl(A) = Σ_{B ∩ A ≠ ∅} m(B)
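As an illustrative sketch (not code from the paper), a BPA can be stored as a Python dict mapping frozensets to mass values, with Bel and Pl computed directly from Eq (4); the BPA below is hypothetical:

```python
def bel(bpa, A):
    """Belief of A, Eq (4): total mass of all focal elements contained in A."""
    return sum(m for B, m in bpa.items() if B <= A)

def pl(bpa, A):
    """Plausibility of A, Eq (4): total mass of all focal elements intersecting A."""
    return sum(m for B, m in bpa.items() if B & A)

# Hypothetical BPA on the FOD {a, b, c}
bpa = {frozenset('a'): 0.5, frozenset('ab'): 0.3, frozenset('abc'): 0.2}
print(round(bel(bpa, frozenset('ab')), 4))  # 0.8
print(round(pl(bpa, frozenset('b')), 4))    # 0.5
```

Note that Bel(A) ≤ Pl(A) always holds, since every subset of A also intersects A.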

In Dempster-Shafer evidence theory, two independent mass functions, denoted as m1 and m2, can be combined with Dempster’s rule of combination, defined as [1, 2]:

(5) m(A) = (1 / (1 − k)) Σ_{B ∩ C = A} m1(B) m2(C), for A ≠ ∅; m(∅) = 0

where k is a normalization constant representing the degree of conflict between m1 and m2, defined as [1, 2]:

(6) k = Σ_{B ∩ C = ∅} m1(B) m2(C)
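A minimal sketch of Dempster’s rule, Eqs (5) and (6), assuming the same dict-of-frozensets representation as above; the two BPAs below are hypothetical:

```python
def combine(m1, m2):
    """Dempster's rule of combination, Eqs (5) and (6)."""
    # k: total conflicting mass, i.e. mass of pairs with empty intersection
    k = sum(v1 * v2 for A, v1 in m1.items() for B, v2 in m2.items() if not (A & B))
    if k >= 1.0:
        raise ValueError("total conflict: combination undefined")
    out = {}
    for A, v1 in m1.items():
        for B, v2 in m2.items():
            C = A & B
            if C:  # skip the empty set, whose mass is renormalized away
                out[C] = out.get(C, 0.0) + v1 * v2 / (1.0 - k)
    return out

# Hypothetical BPAs over the FOD {a, b}
m1 = {frozenset('a'): 0.5, frozenset('ab'): 0.5}
m2 = {frozenset('b'): 0.5, frozenset('ab'): 0.5}
fused = combine(m1, m2)
# conflict k = 0.25; each of {a}, {b}, {a, b} receives mass 1/3
```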

2.2 Shannon entropy

As a measure of the information volume in a system or process, Shannon entropy plays a central role in information theory. It indicates that the information volume of each piece of information is directly connected to its uncertain degree.

Shannon entropy, also called information entropy, is defined as follows [45]:

(7) H = − Σ_{i=1}^{N} p_i log_b p_i

where N is the number of basic states and p_i is the probability of state i, satisfying Σ_{i=1}^{N} p_i = 1. If the unit of information is the bit, then b = 2.

2.3 Deng entropy

Deng entropy is a generalization of Shannon entropy in the Dempster-Shafer framework [54]. If the information is modelled in the framework of probability theory, Deng entropy degenerates to Shannon entropy. Deng entropy, denoted as Ed, is defined as follows [54]:

(8) Ed = − Σ_{A ⊆ X} m(A) log2 ( m(A) / (2^|A| − 1) )

where |A| denotes the cardinality of the proposition A and X is the FOD. If and only if the mass value is assigned only to single elements, Deng entropy degenerates to Shannon entropy; in this case, the form of Deng entropy is as follows:

(9) Ed = − Σ_{θ ∈ X} m(θ) log2 m(θ)

For more details about Deng entropy, please refer to [54].
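Eq (8) translates directly into code; the following sketch (not from the paper) again uses the dict-of-frozensets BPA representation:

```python
import math

def deng_entropy(bpa):
    """Deng entropy, Eq (8): Ed = -sum m(A) * log2( m(A) / (2^|A| - 1) )."""
    return -sum(m * math.log2(m / (2 ** len(A) - 1))
                for A, m in bpa.items() if m > 0)

# A vacuous BPA on a five-element FOD: Ed = log2(2^5 - 1) = log2 31
vacuous = {frozenset('abcde'): 1.0}
print(round(deng_entropy(vacuous), 4))  # 4.9542
```

For a Bayesian BPA every |A| equals 1, the denominator 2^|A| − 1 becomes 1, and the expression reduces to Eq (9), i.e. Shannon entropy.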

2.4 Uncertainty measures in Dempster-Shafer framework

Assume that X is the FOD, A and B are focal elements of the mass function, and |A| denotes the cardinality of A. Then, the definitions of some typical uncertainty measures in Dempster-Shafer framework are briefly introduced as follows.

2.4.1 Hohle’s confusion measure.

Hohle’s confusion measure, denoted as CH, is defined as follows [49]:

(10) CH = − Σ_{A ⊆ X} m(A) log2 Bel(A)

2.4.2 Yager’s dissonance measure.

Yager’s dissonance measure, denoted as EY, is defined as follows [50]:

(11) EY = − Σ_{A ⊆ X} m(A) log2 Pl(A)

2.4.3 Dubois & Prade’s weighted Hartley entropy.

Dubois & Prade’s weighted Hartley entropy, denoted as EDP, is defined as follows [51]:

(12) EDP = Σ_{A ⊆ X} m(A) log2 |A|

2.4.4 Klir & Ramer’s discord measure.

Klir & Ramer’s discord measure, denoted as DKR, is defined as follows [52]:

(13) DKR = − Σ_{A ⊆ X} m(A) log2 Σ_{B ⊆ X} m(B) (|A ∩ B| / |B|)

2.4.5 Klir & Parviz’s strife measure.

Klir & Parviz’s strife measure, denoted as SKP, is defined as follows [53]:

(14) SKP = − Σ_{A ⊆ X} m(A) log2 Σ_{B ⊆ X} m(B) (|A ∩ B| / |A|)

2.4.6 George & Pal’s conflict measure.

The total conflict measure proposed by George & Pal, denoted as TCGP, is defined as follows [56]:

(15) TCGP = Σ_{A ⊆ X} m(A) Σ_{B ⊆ X} m(B) (1 − |A ∩ B| / |A ∪ B|)

3 The proposed belief entropy

3.1 Problem description

In the framework of Dempster-Shafer evidence theory, uncertain information is modeled not only by mass functions; the FOD, e.g. the number of elements it contains, is also a source of uncertainty. However, Dubois & Prade’s weighted Hartley entropy and Deng entropy measure the uncertain degree of BOEs by taking into consideration only the mass function and the cardinality of each proposition; the scale of the FOD is totally ignored. Thus these methods cannot effectively measure the difference in uncertain degree between similar basic probability assignments on different FODs. A simple example of this limitation of Deng entropy and the weighted Hartley entropy is shown in Example 3.1.

Example 3.1. Consider a target identification problem, assume that two reliable sensors report the detection results independently. The results are represented by BOEs shown as follows:

Recalling Eq (8) of Deng entropy, the uncertainty measures of m1 and m2 are as follows: (16) (17)

Recalling Eq (12) of Dubois & Prade’s weighted Hartley entropy, the uncertainty measures of m1 and m2 are as follows: (18) (19)

The results calculated by Deng entropy and the weighted Hartley entropy are counterintuitive. The two BOEs have the same mass value assignment, but the FOD of the first BOE consists of four targets, denoted as a, b, c and d, while the second BOE has only three possible targets, denoted as a, b and c. Intuitively, the second BOE is expected to have less uncertainty than the first one; in other words, the uncertain degree of m1 should be bigger than that of m2. Both Deng entropy and the weighted Hartley entropy fail to quantify the difference in uncertain degree between these two BOEs. To address this issue, a modified belief entropy based on Deng entropy is proposed.

3.2 The new belief entropy

In the framework of Dempster-Shafer evidence theory, the new belief entropy based on Deng entropy is defined as follows:

(20) EMd(m) = − Σ_{A ⊆ X} m(A) log2 ( (m(A) / (2^|A| − 1)) · e^((|A| − 1)/|X|) )

where |A| denotes the cardinality of the focal element A, and |X| denotes the cardinality of X, i.e. the number of elements in the FOD. Compared with Deng entropy, the new belief entropy addresses more information in the BOE, including the scale of the FOD, denoted as |X|, and the relative scale of a focal element with respect to the FOD, denoted as (|A| − 1)/|X|.

The exponential factor in the new belief entropy represents the uncertain information in a BOE that is ignored by Deng entropy and some other uncertainty measures, such as the confusion measure, the dissonance measure, the weighted Hartley entropy, the discord measure and the strife measure. More importantly, by involving the scale of the FOD, the new uncertainty measure can effectively quantify the difference among BOEs even if the same mass values are assigned on different FODs. In addition, the new exponential factor does not affect the merits of Deng entropy, which will be discussed in detail in the remainder of this paper.
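Assuming the form of Eq (20) with the exponential factor e^((|A| − 1)/|X|), the new entropy can be sketched as follows; the BPA below is an illustrative placeholder, not the paper’s Example 3.1, and merely shows that the same mass assignment on FODs of different sizes now yields different values:

```python
import math

def modified_belief_entropy(bpa, fod):
    """Modified belief entropy, Eq (20):
    EMd = -sum m(A) * log2( m(A) / (2^|A| - 1) * e^((|A| - 1) / |X|) )."""
    n = len(fod)
    return -sum(m * math.log2(m / (2 ** len(A) - 1) * math.exp((len(A) - 1) / n))
                for A, m in bpa.items() if m > 0)

# The same mass assignment on FODs of different sizes yields different values.
bpa = {frozenset('ab'): 1.0}
e4 = modified_belief_entropy(bpa, frozenset('abcd'))  # FOD of four elements
e3 = modified_belief_entropy(bpa, frozenset('abc'))   # FOD of three elements
# e4 > e3: the larger FOD carries the larger uncertainty
```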

Recalling Example 3.1, the new belief entropy of the two BOEs is calculated as follows: (21) (22)

The comparison results of different uncertainty measures for Example 3.1 are shown in Table 1. It can be concluded that neither Dubois & Prade’s weighted Hartley entropy nor Deng entropy can measure the difference in uncertain degree between these two BOEs, while the new belief entropy can, by taking more of the available information in the BOE into consideration. In addition, according to Table 1, the first BOE m1 has a higher uncertain degree under the new belief entropy; this is reasonable because the FOD of m1 consists of four candidate targets, which means a larger information volume than that of the second BOE m2. This capability is not available with the weighted Hartley entropy or Deng entropy.

Table 1. Uncertainty measure of Example 3.1 with different methods.

https://doi.org/10.1371/journal.pone.0176832.t001

3.3 Property of the new belief entropy

Some properties of the new belief entropy are presented in this section, including the range of the new measure and its compatibility with Shannon entropy.

Property 1. Mathematically, the value range of the new belief entropy is [0, +∞).

Proof. According to Dempster-Shafer evidence theory, a focal element A consists of at least one element and the superior limit of its number of elements is the scale of the FOD, while a FOD Ω (the X in Eq (20)) consists of at least one element and has no superior limit; thus the ranges of |A| and |X| are the same, namely [1, +∞). The range of a mass function m(A) is (0, 1].

Recall Eq (20), where |A| ∈ [1, +∞), |X| ∈ [1, +∞) and m(A) ∈ (0, 1]. Every summand −m(A) log2((m(A)/(2^|A| − 1)) · e^((|A|−1)/|X|)) is nonnegative, and the sum is zero exactly when the whole mass is assigned to a single element, so the range of the new belief entropy is EMd(m) ∈ [0, +∞).

Property 2. The new belief entropy can degenerate to the Shannon entropy when the mass function is Bayesian.

Proof. Recall Eq (20): if the mass function m(A) is Bayesian, then the mass value (BPA) is assigned only to singleton subsets, so |A| ≡ 1. In this case, the new belief entropy degenerates to the following equation:

(23) EMd = − Σ_{θ ∈ X} m(θ) log2 m(θ)

Eq (23) is consistent with Eqs (9) and (7) when the mass function is Bayesian, because a mass function m(A) then degenerates to a Bayesian probability p_i in the sense of probability consistency.

4 Numerical example and discussion

In order to show the rationality and merit of the proposed belief entropy, some numerical examples are presented in this section. In Section 4.1, the compatibility of the new belief entropy with Shannon entropy and Deng entropy is verified with some simple numerical examples. In Section 4.2, the superiority of the new belief entropy over some other uncertainty measures is presented.

4.1 Compatibility with Shannon entropy

Example 4.1. Consider a target identification problem, if the target reported by the sensor is a with one hundred percent belief, then the mass function can be denoted as m({a}) = 1 in the frame of discernment X = {a}.

Shannon entropy H, Deng entropy Ed and the new belief entropy EMd are calculated respectively as follows: H = 0, Ed = 0, EMd = 0.

It is obvious that the uncertain degree for a certain event is zero. So the values of Shannon entropy, Deng entropy and the new belief entropy are all zero.

Example 4.2. Consider the mass function m({a}) = m({b}) = m({c}) = m({d}) = m({e}) = 0.2 in the frame of discernment X = {a, b, c, d, e}.

Shannon entropy H, Deng entropy Ed and the new belief entropy EMd are calculated respectively as follows: H = Ed = EMd = −5 × 0.2 log2 0.2 = log2 5 ≈ 2.3219.

According to Examples 4.1 and 4.2, if the mass values are assigned only to single elements, the result of the new belief entropy is consistent with Shannon entropy and Deng entropy. This compatibility with Shannon entropy and Deng entropy verifies the effectiveness and rationality of the proposed belief entropy.
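This consistency can be checked numerically; the following sketch (assuming Eqs (7), (8) and (20) as given above) evaluates all three measures on the Bayesian BPA of Example 4.2:

```python
import math

def shannon(p):
    """Shannon entropy, Eq (7), with b = 2."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def deng(bpa):
    """Deng entropy, Eq (8)."""
    return -sum(m * math.log2(m / (2 ** len(A) - 1)) for A, m in bpa.items() if m > 0)

def modified(bpa, fod):
    """Modified belief entropy, Eq (20)."""
    n = len(fod)
    return -sum(m * math.log2(m / (2 ** len(A) - 1) * math.exp((len(A) - 1) / n))
                for A, m in bpa.items() if m > 0)

fod = frozenset('abcde')
bayesian = {frozenset(x): 0.2 for x in 'abcde'}
# For a Bayesian BPA all three measures coincide: log2(5) ~ 2.3219
print(round(shannon([0.2] * 5), 4),
      round(deng(bayesian), 4),
      round(modified(bayesian, fod), 4))  # 2.3219 2.3219 2.3219
```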

4.2 Superiority of the new belief entropy

In this section, the numerical examples involve mass values on multi-element subsets, so Shannon entropy no longer applies; the comparison is therefore between the proposed belief entropy and some other uncertainty measures in the Dempster-Shafer framework.

Example 4.3. Consider the mass function m({a, b, c, d, e}) = 1 in the frame of discernment X = {a, b, c, d, e}.

Deng entropy Ed and the new belief entropy EMd are calculated as follows: Ed = log2(2^5 − 1) = log2 31 ≈ 4.9542, EMd = log2 31 − (4/5) log2 e ≈ 3.8000.

The result shows that both Deng entropy and the new belief entropy of this vacuous mass function are bigger than their values in Example 4.2. This is because the vacuous mass function in Example 4.3 means the information is totally unknown to the system, while the Bayesian mass function in Example 4.2 shows that the probability is equally distributed in the system. More information is available in Example 4.2 than with the vacuous mass function in Example 4.3, so the uncertain degree in Example 4.2 should be smaller. In addition, in Example 4.3, the uncertain degree indicated by the new belief entropy is smaller than that of Deng entropy; this is achieved by taking the scale of the FOD in the BOE into consideration. That is to say, by taking more available information into consideration, the uncertain degree measured by the new belief entropy is significantly decreased in comparison with Deng entropy. It is also safe to say that the new belief entropy is more accurate for uncertainty measurement in this case.
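For this vacuous BPA the two values follow directly from Eqs (8) and (20) (assuming the exponential-factor form of Eq (20) stated above) and can be checked by hand:

```python
import math

# Example 4.3: the vacuous BPA m(X) = 1 with |X| = 5
ed = math.log2(2 ** 5 - 1)              # Deng entropy, Eq (8): log2(31)
emd = ed - (4 / 5) * math.log2(math.e)  # modified entropy, Eq (20)
print(round(ed, 4), round(emd, 4))  # 4.9542 3.8
```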

In order to test the capacity and superiority of the new belief entropy, the example in [54] is recalled as the following example.

Example 4.4. Consider the mass function m({6}) = 0.05, m({3, 4, 5}) = 0.05, m(T) = 0.8 and m(X) = 0.1 in a FOD X = {1, 2, …, 14, 15} with fifteen elements denoted as Element 1, …, Element 15. T represents a variable subset whose size changes from one element (Element 1) to fourteen elements (Elements 1–14), as shown in Table 2.

Table 2. Modified Deng entropy with a variable element in T.

https://doi.org/10.1371/journal.pone.0176832.t002

Deng entropy Ed and the modified belief entropy EMd are calculated with the changing proposition T; the results are shown in Table 2 and Fig 1.

Fig 1. Comparison between the modified belief entropy and Deng entropy.

https://doi.org/10.1371/journal.pone.0176832.g001

Table 2 and Fig 1 show that the modified belief entropy is smaller than Deng entropy. This is reasonable, because more information in the BOE is taken into consideration by the modified belief entropy; the proposed method suffers less information loss than Deng entropy.

Fig 2 compares the modified belief entropy with some other typical uncertainty measures in the Dempster-Shafer framework.

In Fig 2, the uncertain degree measured by Hohle’s confusion measure never changes with the variation of the number of elements in proposition T, so it cannot measure the change of uncertain degree in this case. Similarly, Yager’s dissonance measure has a limited capacity in this case; neither of these two methods can capture the change in proposition T. The uncertain degree measured by Klir & Ramer’s discord measure, Klir & Parviz’s strife measure and George & Pal’s conflict measure decreases as the number of elements in proposition T increases. Thus, the confusion measure, the dissonance measure, the discord measure, the strife measure and the conflict measure cannot effectively capture the rise of the uncertain degree as the number of elements in proposition T increases.

The uncertain degree measured by Dubois & Prade’s weighted Hartley entropy, Deng entropy and the modified belief entropy rises significantly as the number of elements in proposition T increases. However, the weighted Hartley entropy and Deng entropy cannot distinguish the uncertain degrees of BOEs with similar BPAs on different FODs, as shown in Example 3.1. More importantly, by taking the scale of the FOD and the cardinality of each proposition into consideration simultaneously, the uncertain degree measured by the modified belief entropy is significantly decreased in comparison with Deng entropy. The proposed modified belief entropy takes advantage of more of the valuable information in a BOE, which makes it more reasonable and effective for uncertainty measurement in the Dempster-Shafer framework.

5 A case study

In order to show the effectiveness and the application prospects of the modified belief entropy, the case study in [69] and the fault diagnosis method in [10] are recalled in this section. While performing the fault diagnosis method in [10], this paper replaces Deng entropy with the new belief entropy.

Recall the example in [69]. Three fault types are denoted as F1, F2 and F3, and the fault hypothesis set is Θ = {F1, F2, F3}. Three sensors report diagnosis results independently; the results are modelled as BOEs, denoted as E1, E2 and E3, and the BPAs of the diagnosis results are shown in Table 3.

Based on the sensor reports in Table 3, which fault is occurring now: F1, F2 or F3? With Dempster’s rule of combination in Eq (5), the combination results of the sensor reports are shown in Table 4. It is hard to judge which fault has occurred, because the combination results obtained by the conventional Dempster’s rule of combination are very close to each other.

Table 4. Fused results with only Dempster’s rule of combination.

https://doi.org/10.1371/journal.pone.0176832.t004

In order to handle this problem, a fault diagnosis method based on Deng entropy was proposed in [10], where the reliability of sensor data is modelled as a weight for each BOE. Following the fault diagnosis method in [10], the weight of the ith BOE (i = 1, 2, 3) is defined as the product of a static reliability ws(i) and a dynamic reliability wd(i), denoted as follows [10]:

(24) w(i) = ws(i) × wd(i)

where the static reliability ws(i) of each BOE is listed in Table 5, and the dynamic reliability wd(i) is defined as follows [10]: (25) where Crd(i) is the credibility degree of the ith BOE Ei, Ed(i) is the Deng entropy of the ith BOE Ei, and max(Ed(i)) is the maximum Deng entropy value among all the BOEs. The values of Crd(i) and Ed(i) of each BOE in [10] are shown in Table 5.

Based on Eqs (24) and (25), the weight of each BOE based on the new belief entropy EMd(i) is defined as follows: (26) The modified belief entropy of each BOE is calculated as follows:

Now, it’s clear that max(EMd(mi)) = EMd(m1) = 2.0505. Based on Eq (26), the weight of each BOE based on the modified belief entropy is calculated as follows:

With a normalization step, the weight of each BOE is as follows:

Now, the BBAs in Table 3 can be modified with the normalized weight of each BOE; the weighted BBA of each proposition is calculated as follows:

Recall Dempster’s rule of combination in Eq (5): since there are three independent BOEs, the weighted BPA is fused with itself twice. The calculation is shown as follows:
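The overall weighting-and-fusion procedure can be sketched as follows; since Tables 3 and 5 are not reproduced in this text, the BPAs and weights below are hypothetical placeholders, not the paper’s values:

```python
def combine(m1, m2):
    """Dempster's rule of combination, Eqs (5) and (6)."""
    k = sum(v1 * v2 for A, v1 in m1.items() for B, v2 in m2.items() if not (A & B))
    out = {}
    for A, v1 in m1.items():
        for B, v2 in m2.items():
            C = A & B
            if C:
                out[C] = out.get(C, 0.0) + v1 * v2 / (1.0 - k)
    return out

def weighted_average(bpas, weights):
    """Weighted-average BPA built from the normalized weights of the BOEs."""
    total = sum(weights)
    out = {}
    for bpa, w in zip(bpas, weights):
        for A, m in bpa.items():
            out[A] = out.get(A, 0.0) + (w / total) * m
    return out

# Hypothetical sensor reports (placeholders, not Table 3's values)
F1, F2, F12 = frozenset(['F1']), frozenset(['F2']), frozenset(['F1', 'F2'])
bpas = [{F1: 0.6, F12: 0.4}, {F2: 0.7, F12: 0.3}, {F1: 0.65, F12: 0.35}]
weights = [0.4, 0.2, 0.4]  # hypothetical entropy-based weights, Eq (26)

avg = weighted_average(bpas, weights)
fused = combine(combine(avg, avg), avg)  # three BOEs: fuse twice with itself
```

With these placeholder numbers the fused belief concentrates on F1, mirroring the behaviour described for the real data.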

The fused results with the proposed uncertainty measure are compared with those of some other methods, as shown in Table 6. Intuitively, F1 is the fault type, because both the first BOE E1 and the third BOE E3 assign a large belief (no less than 60%) to F1, while the second BOE E2 may come from an abnormal sensor in comparison with the other two BOEs. The fused result of Yuan et al’s method with the new measure is compatible with Fan et al’s method and Yuan et al’s method (with Deng entropy). Although all three methods overcome the shortcoming of Dempster’s rule of combination and lead to the right conclusion, Yuan et al’s method with the new measure has the highest belief (89.51%) in the conclusion that F1 is the fault.

The case study demonstrates the effectiveness of the new belief entropy. In addition, the case study shows a promising application prospect of the new uncertainty measure.

6 Conclusions

In information processing, each tiny piece of information is valuable, and uncertain information should be addressed cautiously, especially when limited information is available. In this paper, a new belief entropy based on Deng entropy is proposed. The proposed method takes full advantage of the uncertain information in a BOE, including the mass function, the cardinality of each proposition and the scale of the FOD. By addressing more of the available information in BOEs, differences in uncertain degree that cannot be captured by some other uncertainty measures can now be distinguished successfully. Numerical examples show that the new belief entropy can quantify the uncertain degree of a BOE more accurately. The case study demonstrates the effectiveness and the application prospects of the new measure. Further study of this work will focus on applications of the proposed measure. The new belief entropy provides a promising way to measure the uncertain degree in decision making, fault diagnosis, pattern recognition, risk analysis and so on.

Acknowledgments

The authors are very grateful to the anonymous referees for the valuable and constructive comments, which have led to significant improvements in the manuscript.

Author Contributions

  1. Conceptualization: YT.
  2. Data curation: YT.
  3. Formal analysis: DZ YT.
  4. Funding acquisition: WJ YT.
  5. Investigation: DZ YT.
  6. Methodology: YT WJ.
  7. Project administration: DZ.
  8. Software: DZ YT.
  9. Supervision: DZ.
  10. Validation: DZ YT WJ.
  11. Visualization: YT.
  12. Writing – original draft: YT.
  13. Writing – review & editing: YT.

References

  1. Dempster AP. Upper and Lower Probabilities Induced by a Multi-valued Mapping. Annals of Mathematical Statistics. 1967;38(2):325–339.
  2. Shafer G. A Mathematical Theory of Evidence. Princeton: Princeton University Press; 1976.
  3. Denoeux T. A k-Nearest Neighbor Classification Rule Based on Dempster–Shafer Theory. IEEE Transactions on Systems Man & Cybernetics. 1995;25(5):804–813.
  4. Liu ZG, Pan Q, Dezert J. A new belief-based K-nearest neighbor classification method. Pattern Recognition. 2013;46(3):834–844.
  5. Ma J, Liu W, Miller P, Zhou H. An Evidential Fusion Approach for Gender Profiling. Information Sciences. 2015;333:10–20.
  6. Liu ZG, Pan Q, Dezert J, Mercier G. Credal classification rule for uncertain data based on belief functions. Pattern Recognition. 2014;47(7):2532–2541.
  7. Han D, Liu W, Dezert J, Yang Y. A novel approach to pre-extracting support vectors based on the theory of belief functions. Knowledge-Based Systems. 2016;110:210–223.
  8. Liu ZG, Pan Q, Dezert J, Martin A. Adaptive imputation of missing values for incomplete pattern classification. Pattern Recognition. 2016;52:85–95.
  9. Jiang W, Wei B, Xie C, Zhou D. An evidential sensor fusion method in fault diagnosis. Advances in Mechanical Engineering. 2016;8(3):1–7.
  10. Yuan K, Xiao F, Fei L, Kang B, Deng Y. Modeling Sensor Reliability in Fault Diagnosis Based on Evidence Theory. Sensors. 2015;16(1):1–13.
  11. Yuan K, Xiao F, Fei L, Kang B, Deng Y. Conflict management based on belief function entropy in sensor fusion. SpringerPlus. 2016;5:638. pmid:27330904
  12. Jiang W, Xie C, Zhuang M, Shou Y, Tang Y. Sensor Data Fusion with Z-Numbers and Its Application in Fault Diagnosis. Sensors. 2016;16(9):1509.
  13. Chin KS, Fu C, Wang Y. A method of determining attribute weights in evidential reasoning approach based on incompatibility among attributes. Computers & Industrial Engineering. 2015;87:150–162.
  14. Du WS, Hu BQ. Attribute reduction in ordered decision tables via evidence theory. Information Sciences. 2016;364:91–110.
  15. Fu C, Wang Y. An interval difference based evidential reasoning approach with unknown attribute weights and utilities of assessment grades. Computers & Industrial Engineering. 2015;81:109–117.
  16. Wang YM, Elhag TMS. A comparison of neural network, evidential reasoning and multiple regression analysis in modelling bridge risks. Expert Systems with Applications. 2007;32(2):336–348.
  17. Su X, Deng Y, Mahadevan S, Bao Q. An improved method for risk evaluation in failure modes and effects analysis of aircraft engine rotor blades. Engineering Failure Analysis. 2012;26(12):164–174.
  18. Fu C, Yang JB, Yang SL. A group evidential reasoning approach based on expert reliability. European Journal of Operational Research. 2015;246(3):886–893.
  19. Zhang X, Mahadevan S, Deng X. Reliability analysis with linguistic data: An evidential network approach. Reliability Engineering & System Safety. 2017;162:111–121.
  20. Jiang W, Xie C, Wei B, Zhou D. A modified method for risk evaluation in failure modes and effects analysis of aircraft turbine rotor blades. Advances in Mechanical Engineering. 2016;8(4):1–16.
  21. Yager RR, Filev DP. Including probabilistic uncertainty in fuzzy logic controller modeling using Dempster–Shafer theory. IEEE Transactions on Systems, Man, and Cybernetics. 1995;25(8):1221–1230.
  22. Tang Y, Zhou D, Jiang W. A New Fuzzy-Evidential Controller for Stabilization of the Planar Inverted Pendulum System. PLoS ONE. 2016;11(8):e0160416. pmid:27482707
  23. Wang YM, Yang JB, Xu DL, Chin KS. Consumer preference prediction by using a hybrid evidential reasoning and belief rule-based methodology. Expert Systems with Applications. 2009;36(4):8421–8430.
  24. Ma J, Liu W, Benferhat S. A belief revision framework for revising epistemic states with partial epistemic states. International Journal of Approximate Reasoning. 2015;59:20–40.
  25. Zhou K, Martin A, Pan Q, Liu ZG. Median evidential c-means algorithm and its application to community detection. Knowledge-Based Systems. 2015;74:69–88.
  26. Zadeh LA. A simple view of the Dempster-Shafer theory of evidence and its implication for the rule of combination. AI Magazine. 1986;7(2):85–90.
  27. Liu W. Analyzing the degree of conflict among belief functions. Artificial Intelligence. 2006;170(11):909–924.
  28. Schubert J. Conflict management in Dempster–Shafer theory using the degree of falsity. International Journal of Approximate Reasoning. 2011;52(3):449–460.
  29. Deng Y. Generalized evidence theory. Applied Intelligence. 2015;43(3):530–543.
  30. Su X, Mahadevan S, Han W, Deng Y. Combining dependent bodies of evidence. Applied Intelligence. 2015;44(3):634–644.
  31. Su X, Mahadevan S, Xu P, Deng Y. Dependence Assessment in Human Reliability Analysis Using Evidence Theory and AHP. Risk Analysis. 2015;35(7):1296–1316. pmid:25847228
  32. Su X, Mahadevan S, Xu P, Deng Y. Handling of Dependence in Dempster–Shafer Theory. International Journal of Intelligent Systems. 2015;30(4):441–467.
  33. Deng X, Liu Q, Deng Y, Mahadevan S. An improved method to construct basic probability assignment based on the confusion matrix for classification problem. Information Sciences. 2016;340:250–261.
  34. Yang Y, Liu Y. Iterative Approximation of Basic Belief Assignment Based on Distance of Evidence. PLoS ONE. 2016;11(2):e0147799. pmid:26829403
  35. Jiang W, Zhan J, Zhou D, Li X. A method to determine generalized basic probability assignment in the open world. Mathematical Problems in Engineering. 2016;2016:373142.
  36. Smets P, Kennes R. The transferable belief model. Artificial Intelligence. 1994;66(2):191–234.
  37. Smets P. Belief functions on real numbers. International Journal of Approximate Reasoning. 2005;40(3):181–223.
  38. Zhou D, Tang Y, Jiang W. A modified model of failure mode and effects analysis based on generalized evidence theory. Mathematical Problems in Engineering. 2016;2016:4512383.
  39. Zhang MJ, Wang YM, Li LH, Chen SQ, Slowinski R, Artalejo J, et al. A general evidential reasoning algorithm for multi-attribute decision analysis under interval uncertainty. European Journal of Operational Research. 2017;257(3):1005–1015.
  40. Pezzulo G, Lorini E, Calvi G. How do I Know how much I don’t Know? A cognitive approach about Uncertainty and Ignorance. In: Proceedings of the Cognitive Science Society. vol. 26; 2014.
  41. Denoeux T. Maximum Likelihood Estimation from Uncertain Data in the Belief Function Framework. IEEE Transactions on Knowledge & Data Engineering. 2013;25(1):119–130.
  42. Wen M, Kang R. Reliability analysis in uncertain random system. Fuzzy Optimization and Decision Making. 2016;15(4):491–506.
  43. Sankaran PG, Sunoj SM. Quantile-based cumulative entropies. Communications in Statistics—Theory and Methods. 2017;46(2):805–814.
  44. Conti PL, Marella D, Scanu M. How far from identifiability? A systematic overview of the statistical matching problem in a non-parametric framework. Communications in Statistics—Theory and Methods. 2017;46(2):967–994.
  45. Shannon CE. A mathematical theory of communication. ACM SIGMOBILE Mobile Computing and Communications Review. 2001;5(1):3–55.
  46. Feller W. An Introduction to Probability Theory and Its Applications (2nd ed.). New York: Wiley; 1957.
  47. Zadeh LA. Fuzzy sets. Information & Control. 1965;8(3):338–353.
  48. Pawlak Z. Rough sets. International Journal of Parallel Programming. 1982;11(5):341–356.
  49. Hohle U. Entropy with respect to plausibility measures. In: Proceedings of the 12th IEEE International Symposium on Multiple-Valued Logic; 1982. p. 167–169.
  50. Yager RR. Entropy and specificity in a mathematical theory of evidence. International Journal of General Systems. 1983;9(4):249–260.
  51. 51. Dubois D, Prade H. A note on measures of specificity for fuzzy sets. International Journal of General Systems. 1985;10(4):279–283.
  52. 52. Klir GJ, Ramer A. Uncertainty in Dempster–Shafer theory: A critical re-examination. International Journal of General Systems. 1991;18(2):155–166.
  53. 53. Klir GJ, Parviz B. A note on the measure of discord. In: Eighth International Conference on Uncertainty in Artificial Intelligence; 1992. p. 138–141.
  54. 54. Deng Y. Deng entropy. Chaos Solitons & Fractals. 2016;91:549–553.
  55. 55. Deng Y. D numbers: theory and applications. Journal of Information & Computational Science. 2012;9(9):2421–2428.
  56. 56. George T, Pal NR. Quantification of conflict in Dempster-Shafer framework: A new approach. International Journal of General Systems. 1996;24(4):407–423.
  57. 57. Sabahi F, Akbarzadeh-T MR. A qualified description of extended fuzzy logic. Information Sciences. 2013;244:60–74.
  58. 58. Deng Y, Liu Y, Zhou D. An Improved Genetic Algorithm with Initial Population Strategy for Symmetric TSP. Mathematical Problems in Engineering. 2015;2015:212794.
  59. 59. Yang Y, Han D. A new distance-based total uncertainty measure in the theory of belief functions. Knowledge-Based Systems. 2016;94:114–123.
  60. 60. Sabahi F, Akbarzadeh-T MR. Introducing validity in fuzzy probability for judicial decision-making. International Journal of Approximate Reasoning. 2014;55(6):1383–1403.
  61. 61. Deng Y. Fuzzy Analytical Hierarchy Process Based On Canonical Representation on Fuzzy Numbers. Journal of Computational Analysis and Applications. 2017;22(2):201–228.
  62. 62. Chen Z, Dehmer M, Shi Y. A Note on Distance-based Graph Entropies. Entropy. 2014;16(10):5416–5427.
  63. 63. Cao S, Dehmer M, Shi Y. Extremality of degree-based graph entropies. Information Sciences. 2014;278(10):22–33.
  64. 64. Chen Z, Dehmer M, Emmert-Streib F, Shi Y. Entropy bounds for dendrimers. Applied Mathematics & Computation. 2014;242:462–472.
  65. 65. Cao S, Dehmer M. Degree-based entropies of networks revisited. Applied Mathematics & Computation. 2015;261:141–147.
  66. 66. Frikha A, Moalla H. Analytic hierarchy process for multi-sensor data fusion based on belief function theory. European Journal of Operational Research. 2015;241(1):133–147.
  67. 67. Khodabandeh M, Shahri AM. Uncertainty evaluation for a Dezert–Smarandache theory-based localization problem. International Journal of General Systems. 2014;43(6):610–632.
  68. 68. Jiang W, Wei B, Qin X, Zhan J, Tang Y. Sensor Data Fusion Based on a New Conflict Measure. Mathematical Problems in Engineering. 2016;2016:5769061.
  69. 69. Fan X, Zuo MJ. Fault diagnosis of machines based on D-S evidence theory. Part 1: D–S evidence theory and its improvement. Pattern Recognition Letters. 2006;27(5):366–376.