Fuzzy Entropy: a Brief Survey

S. Al-sharhan, F. Karray, W. Gueaieb, and O. Basir
Department of Systems Design Engineering, University of Waterloo, Waterloo, Canada
Email: {salah,karray,wail,basir}@watfast.uwaterloo.ca

Abstract— This paper presents a survey of different types of fuzzy information measures. A number of schemes have been proposed to combine fuzzy set theory with the entropy concept as a fuzzy information measure. The entropy concept, as a relative degree of randomness, has been utilized to measure the fuzziness in a fuzzy set or system. However, a major difference exists between the classical Shannon entropy and the fuzzy entropy: while the latter deals with vagueness and ambiguous uncertainties, the former tackles probabilistic uncertainties (randomness).

Keywords: Entropy; Shannon entropy; Fuzzy logic; Fuzzy entropy; Information measure.

I. INTRODUCTION

Information theory [22] and, more recently, fuzzy set theory [29] have proved to be applicable to a wide variety of fields. Herein we present a survey of the fuzzy information measure, i.e., fuzzy entropy. Since there are different approaches to fuzzy entropy, we discuss each one separately. We also provide an introduction to fuzzy logic and information theory.

A. Fuzzy Logic and Fuzzy Sets

Fuzzy set theory has been developed to mimic the powerful capability of human reasoning in order to design systems that can deal effectively with complex processes. By definition, a fuzzy set is a set containing elements with varying degrees of membership [4]. This differs from classical (crisp) sets, in which elements have full membership (i.e., their membership is assigned a value of 1). Elements of a fuzzy set are mapped to a universe of membership values using a function-theoretic form: the function maps each element of a fuzzy set to a real value in the interval [0, 1]. Fuzzy set theory is very useful in modelling complex and imprecise systems. It has also been used very effectively in the area of control as a decision-making system [3], [24]. A fuzzy logic controller (FLC) consists of four main components: the fuzzification interface, the knowledge base, the inference engine, and the defuzzification interface. The fuzzification interface performs a scale mapping that transfers values of the input variables into corresponding universes of discourse. The knowledge base consists of a database that contains a process description, and a linguistic (fuzzy) control rule base that reflects the control goals. The third component, the inference engine, is the kernel of the FLC; it simulates human decision-making logic and generates control actions based on rules of inference. In contrast to the fuzzification interface, the defuzzification component generates a crisp-valued control action. Several defuzzification methods have been proposed in the literature. The most common ones are the centroid (center of area) method, the center of sums method, and the mean of maxima method. Further details can be found in [8], [6], [11], [23], [1], [24].

B. Information Theory

Although information theory, founded by Shannon [21], [22], dealt formally with communication systems at its inception, it has also been applied to other areas such as clustering, fuzzy logic systems, and decision making, to mention a few. Information theory is concerned with the quantification of information. The quantity of information is defined as the amount of information conveyed in an event and depends on the probability of that event [17]. The definition involves the logarithm of the probability of an event A, P(A), and is given in equation (1), where the minus sign is chosen to make the information content of A, I(A), positive, since log P(A) ≤ 0:

I(A) = -log P(A).   (1)

The average information over all the events is called the entropy. It is usually called the Shannon entropy when it refers to the classical information entropy, and is given by

H(X) = - Σ_{k=1}^{n} p_k log p_k,

where X is a random variable and p_k = P[X = x_k], k = 1, 2, ..., n, are the probabilities of its values. In this survey, we try to highlight several types of "entropy" measures of fuzziness. The rest of this paper is organized as follows: Section II introduces different types of fuzzy entropy. Section III summarizes the work and highlights future directions.

0-7803-7293-X/01/$17.00 © 2001 IEEE
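As a concrete illustration of equation (1) and the Shannon entropy above, the following minimal Python sketch (the function name is ours, not from the paper; base-2 logarithms are chosen so the result is in bits) computes H(X) for a discrete distribution:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(X) = -sum_k p_k * log2(p_k) of a discrete distribution.

    Terms with p_k = 0 are skipped, following the convention 0 * log 0 = 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly one bit of information per toss.
print(shannon_entropy([0.5, 0.5]))   # -> 1.0
# A biased coin is more predictable, hence carries less than one bit.
print(shannon_entropy([0.9, 0.1]))
```

A degenerate distribution (one event with probability 1) gives zero entropy, matching the intuition that a certain outcome conveys no information.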

II. FUZZY ENTROPY

The measure of uncertainty is adopted as a measure of information. Hence, measures of fuzziness are known as fuzzy information measures [15], [18]. The quantity of fuzzy information gained from a fuzzy set or fuzzy system is known as fuzzy entropy. It should be highlighted that the meaning of fuzzy entropy is quite different from that of the classical Shannon entropy, because no probabilistic concept is needed in order to define it [2], [5], [16]. This is due to the fact that fuzzy entropy captures vagueness and ambiguity uncertainties, while Shannon entropy captures randomness (probabilistic) uncertainty.

2001 IEEE International Fuzzy Systems Conference

Fuzzy entropy is defined using the concept of the membership function. In 1972, De Luca and Termini [12] defined a fuzzy entropy based on Shannon's function and introduced a set of properties that any fuzzy entropy should satisfy. These properties are widely accepted and have become a criterion for defining any new fuzzy entropy. The fuzzy entropy proposed by De Luca and Termini is shown in equation (2). It is defined in terms of the n membership values μ_i:

H_A = -K Σ_{i=1}^{n} ( μ_i log(μ_i) + (1 - μ_i) log(1 - μ_i) ).   (2)

The four properties of this fuzzy entropy are:

P-1: H_A = 0 iff A is a crisp set (μ_i = 0 or 1 for all x_i ∈ A).
P-2: H_A is maximum iff μ_i = 0.5 for all x_i ∈ A.
P-3: H_A ≥ H_{A*}, where H_{A*} is the entropy of A*, a sharpened version of A.
P-4: H_A = H_{A^c}, where H_{A^c} is the entropy of the complement set A^c.
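Equation (2) and properties P-1 and P-2 can be checked numerically. The sketch below is our own illustration (not from the paper); it uses base-10 logarithms and the normalized constant K = 1/n:

```python
import math

def fuzzy_entropy(mu, K=None):
    """De Luca-Termini fuzzy entropy, equation (2):
    H_A = -K * sum_i [mu_i*log(mu_i) + (1-mu_i)*log(1-mu_i)],
    with base-10 logs and K = 1/n by default (normalized version)."""
    K = K if K is not None else 1.0 / len(mu)
    total = 0.0
    for m in mu:
        for x in (m, 1.0 - m):       # the mu_i and (1 - mu_i) terms
            if x > 0:                # convention: 0 * log 0 = 0
                total += x * math.log10(x)
    return -K * total

# P-1: a crisp set (all memberships 0 or 1) has zero entropy.
print(fuzzy_entropy([0.0, 1.0, 1.0]) == 0.0)                    # -> True
# P-2: entropy is maximal when every membership equals 0.5.
print(fuzzy_entropy([0.5, 0.5]) > fuzzy_entropy([0.3, 0.7]))    # -> True
```

Note that by the symmetry of the summand in μ_i and 1 - μ_i, the same computation also exhibits property P-4 (a set and its complement share the same entropy).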

A. Probability Entropy of Fuzzy Events

In 1965, L. Zadeh founded fuzzy set theory [29]. He defined an entropy for a fuzzy event as the uncertainty associated with that event. This entropy is a weighted Shannon entropy in which the membership values are regarded as weights. In his work, Zadeh extended the Shannon entropy to apply as a fuzzy entropy to a fuzzy subset A of the finite set X = {x_1, x_2, ..., x_n} with respect to the probability distribution P = {p_1, p_2, ..., p_n} [30]. Such an entropy is expressed as

H = - Σ_{i=1}^{n} μ_A(x_i) p_i log(p_i),

where μ_A is the membership function of A and p_i is the probability of occurrence of x_i. One can notice that this situation contains the three types of uncertainty (randomness, ambiguity, and vagueness), that is, both randomness and fuzziness.

B. Shannon Fuzzy Entropy

Sander [19] presented a characterization of a fuzzy entropy that is formally analogous to the Shannon information entropy. He considered a discrete finite set X = {x_1, x_2, ..., x_n}, and let F_p, p ∈ [0, 1], denote the constant fuzzy sets, that is, F_p(x_i) = p for all x_i ∈ X, i = 1, 2, ..., n. The power of a fuzzy set f ∈ [0, 1]^X is defined as the quantity P(f) = Σ_{x ∈ X} f(x). Sander deduced that a fuzzy entropy is any mapping d : [0, 1]^X → [0, ∞). The Shannon fuzzy entropy is introduced as in equation (3):

H(f) = -c Σ_{i=1}^{n} f(x_i) ln f(x_i),   c > 0.   (3)

For fuzzy sets f and g, where the AND operation is the Min operator and the OR operation is the Max operator, Sander proved that the following three properties have to be imposed on a fuzzy entropy d in order that d(f) = H(f) for all f ∈ [0, 1]^X:

1. Sharpness: d(f) = 0 ⇔ f(X) ⊂ {0, 1}, f ∈ [0, 1]^X.
2. Valuation: d(f ∧ g) + d(f ∨ g) = d(f) + d(g), f, g ∈ [0, 1]^X.
3. Generalized additivity: there exist two mappings s, t : [0, ∞) → [0, ∞) such that d(f × g) = d(f) t(P(g)) + s(P(f)) d(g) for all f ∈ [0, 1]^X, g ∈ [0, 1]^Y, where X and Y are finite sets.

C. Non-probabilistic Entropy

De Luca and Termini [12] are considered among the first to extend the Shannon entropy to the measurement of fuzziness and to give it an information-theoretic interpretation. They characterize their fuzzy entropy by two major features. First, when they introduce the fuzzy entropy as a measure of the degree of fuzziness, they state that the meaning of this quantity is "quite different" from that of the Shannon entropy, since no probabilistic concept is needed to define it. Second, they define the properties that any fuzzy entropy intended to measure fuzziness uncertainty should satisfy; these properties were given in section II.

De Luca and Termini defined a function H(f) as an entropy on M(I), where I is a fuzzy set and M(I) is the class of all maps from I to a lattice L. Although conceptually different, the function H(f) has certain similarities with the Shannon entropy. Its range is the set of non-negative real numbers, and it is defined as in equation (4), where n is the number of elements of I and K is a positive constant (usually equal to 1/n, to obtain the normalized version of H(f)):

H(f) = -K Σ_{i=1}^{n} f(x_i) ln f(x_i).   (4)

From the definition of H(f), its maximum is reached when f(x) = 1/e for all x of I, that is, when f(x) ≠ 0.5. Hence, the requirement of P-2 is not fulfilled, and De Luca and Termini realized that H(f) is not a proper fuzziness measure. They therefore introduced a new entropy functional, d(f), which they call the entropy of the fuzzy set:

d(f) = H(f) + H(f^c),   (5)

where the complement f^c is defined pointwise as f^c(x) = 1 - f(x). From equation (5), d(f) = d(f^c), and d(f) can be written in terms of Shannon's function S(x) = -x ln x - (1 - x) ln(1 - x) as

d(f) = K Σ_{i=1}^{n} S(f(x_i)).   (6)

The fuzzy entropy H, using the log function, is given by

H = -K Σ_{i=1}^{n} ( μ_i log(μ_i) + (1 - μ_i) log(1 - μ_i) ),   (7)

where μ_i is the membership value of x_i and K is a constant (= 1/n). Note that H is expressed in terms of the base-10 logarithm 'log' and not the natural logarithm 'ln'. The entropy in equation (7) satisfies the properties of fuzzy entropy (P-1, P-2, P-3, and P-4) and has become one of the most popular fuzzy entropies used to compute the average fuzziness of a fuzzy set. This entropy can be interpreted


as the average amount of information arising from fuzziness. Correspondingly, the entropy of a single fuzzy event x_i in I is H(x_i) = S(μ_i). In 1974, De Luca and Termini extended their entropy concept to the L-fuzzy set, which generalizes the range of the characteristic function to a lattice [13].
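The failure of H(f) at property P-2 and its repair by d(f) can be verified numerically. The sketch below is our own illustration (not from the paper), using natural logarithms and K = 1/n:

```python
import math

def H(f, K=None):
    """Equation (4): H(f) = -K * sum_i f(x_i) * ln f(x_i)."""
    K = K if K is not None else 1.0 / len(f)
    return -K * sum(v * math.log(v) for v in f if v > 0)

def S(x):
    """Shannon's function S(x) = -x ln x - (1-x) ln(1-x)."""
    return -sum(v * math.log(v) for v in (x, 1.0 - x) if v > 0)

def d(f, K=None):
    """Equation (6): d(f) = K * sum_i S(f(x_i)) = H(f) + H(f^c)."""
    K = K if K is not None else 1.0 / len(f)
    return K * sum(S(v) for v in f)

# H peaks at f(x) = 1/e, not at 0.5, so it violates P-2 ...
print(H([1 / math.e]) > H([0.5]))     # -> True
# ... while d peaks at 0.5, as a fuzziness measure should.
print(d([0.5]) > d([1 / math.e]))     # -> True
```

One can also confirm the identity d(f) = H(f) + H(f^c) of equation (5) directly, since S(x) is exactly -x ln x plus its complement term.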

D. Total Information Measure of Fuzzy Sets

The total information measure of a fuzzy set was introduced by De Luca and Termini [12]. It is widely used as a measure of the uncertainty of information that contains both the fuzziness and randomness uncertainties. A total entropy measure is needed to clarify the relationship between the information in an ordinary set and in a fuzzy set, and there have been several attempts to combine probabilistic (Shannon) and possibilistic (fuzzy) entropies. De Luca and Termini introduced this entropy as follows. For a set I with randomly occurring events {x_1, x_2, ..., x_n} in an experiment, one and only one event may occur in each trial, with respective probabilities {p_1, p_2, ..., p_n}. Fuzzification of the set I induces two kinds of uncertainty.

The first is the uncertainty deduced from the "random" nature of the experiment. Its average is computed by the Shannon entropy

H(p_1, p_2, ..., p_n) = - Σ_{i=1}^{n} p_i log(p_i),   (8)

where H, in equation (8), also gives the average information received on learning which event has occurred.

The second is the uncertainty that arises from the fuzziness of the fuzzy set relative to the ordinary set. This is the amount of ambiguity involved in the interpretation of x_i (taking a decision on the element), and it is given by

S(μ_i) = -μ_i log(μ_i) - (1 - μ_i) log(1 - μ_i).

The statistical average, m, of the ambiguity over the whole set is given in equation (9), while the total entropy is given in equation (10):

m(μ, p_1, p_2, ..., p_n) = Σ_{i=1}^{n} p_i S(μ_i),   (9)

H_total = H(p_1, p_2, ..., p_n) + m(μ, p_1, p_2, ..., p_n).   (10)

H_total may be interpreted as the total average uncertainty related to a random experiment on I. The first uncertainty results from predicting which element of I will occur as a result of the experiment. The second uncertainty is due to deciding whether the value to be assigned to the element is 1 or 0. If the fuzziness is removed (i.e., m = 0), the total entropy H_total reduces to the Shannon entropy H(p_1, p_2, ..., p_n). On the other hand, if the randomness is removed, then H_total = S(μ_i); this means that there is no random experiment and only a fixed element, x_i, will occur.

Another total entropy of a fuzzy set was defined by Xie and Bedrosian [25], [26]. They considered a set containing the elements 0 and 1 with probabilities p_0 and p_1, where A' is an ordinary (sharp) set related to a fuzzy set A. Suppose that, for some reason, a new fuzzy set is deduced from the set A' due to a change in the sharpness of A': the membership values of an element are changed from 0 to an arbitrary value in the range [0, 0.5], and from 1 to an arbitrary value in the range [0.5, 1]. In this way the ordinary set A' has been changed to a fuzzy set A, which includes two types of uncertainty: the random uncertainty of an ordinary set and the fuzziness uncertainty. Xie and Bedrosian pointed out some drawbacks of the total fuzzy entropy of De Luca and Termini given in equation (10), and introduced another entropy of total uncertainty (equation (11)).

The entropy in equation (11) satisfies the four properties of fuzzy entropy (P-1 to P-4). However, it has the following drawback. An equivalence has been established between the fuzzy and Shannon information measures because both have the same mathematical form: if μ_i = p_i, one concludes that the average amount of fuzzy information produced by a fuzzy set containing n elements is "equivalent" to the average amount produced by the same number of independent binary Shannon information sources. There is no physical meaning to this equivalence beyond the fact that both measures yield the same numerical value, since the fuzzy and Shannon entropies are conceptually different. Another undesirable property of this total entropy is that it always reduces to H(p_0, p_1) if the fuzziness is removed, irrespective of the defuzzification process.

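The decomposition of equations (8)-(10), where the total entropy splits into a randomness term and a fuzziness term, can be sketched as follows (our own illustration, not from the paper; base-2 logarithms are an arbitrary choice):

```python
import math

def shannon(p):
    """Equation (8): H = -sum_i p_i * log2(p_i)."""
    return -sum(q * math.log2(q) for q in p if q > 0)

def S(x):
    """Shannon's function: the ambiguity of a single membership value."""
    return -sum(v * math.log2(v) for v in (x, 1.0 - x) if v > 0)

def total_entropy(p, mu):
    """Equation (10): H_total = H(p_1..p_n) + m,
    with m = sum_i p_i * S(mu_i) as in equation (9)."""
    m = sum(q * S(u) for q, u in zip(p, mu))
    return shannon(p) + m

p = [0.5, 0.5]
# Crisp memberships: the fuzziness term m vanishes and H_total
# reduces to the Shannon entropy, as stated in the text.
print(total_entropy(p, [1.0, 0.0]) == shannon(p))   # -> True
# Fuzzy memberships add the average ambiguity on top of the randomness.
print(total_entropy(p, [0.6, 0.4]) > shannon(p))    # -> True
```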

E. Hybrid Entropy

Hybrid entropy, introduced by Pal and Pal [14], aims at tackling the drawbacks of the total entropies introduced by De Luca-Termini and Xie-Bedrosian. The basic idea behind this entropy is that, since a fuzzy set is a generalization of an ordinary classical set, the entropy of a fuzzy set can be a generalized version of the classical entropy; in other words, the classical entropy becomes a special case when the fuzziness is removed. It is worth mentioning that when the fuzziness is removed, the value of the generalized entropy decreases. Pal and Pal argued that in the 0/1 example introduced by Xie and Bedrosian there is only one uncertainty: the difficulty of determining whether an incoming symbol is 0 or 1. This difficulty depends on two independent factors: the probability distribution and the possibility (fuzziness) distribution. Such a measure is called the hybrid entropy. For digital communication over a noisy channel (the same example used by Xie and Bedrosian for the total entropy), let p_0 and p_1 be the probabilities of occurrence of the 0 and 1 symbols, respectively, and let μ_i denote the membership function of the fuzzy set "symbol close to 1". The average likelihoods of interpreting a received symbol as 0, E_0, and as 1, E_1, are expressed in equations (12) and (13), respectively.

Both E_0 and E_1 are monotonically increasing functions of μ_i ∈ [0, 1]. Since p_0 and p_1 are the probabilities of occurrence of 0 and 1, respectively, the hybrid entropy of the fuzzy set A ("symbol close to 1") is defined in equation (14).

The hybrid entropy can be used as an objective measure for a proper defuzzification of a given fuzzy set, and it reduces to the probabilistic entropy in the absence of fuzziness.

F. Higher Order Entropy

The higher order entropy, also defined by Pal and Pal [14], measures the average uncertainty associated with an arbitrary subset of r supports (a combination). The De Luca and Termini entropy (section C) gives the average amount of difficulty in taking a decision about a single element's property. In many cases, however, it is desired to describe the property of a group of elements, i.e., a subset of the entire set. To tackle this problem, Pal and Pal established a higher order entropy in the following way. Let P be a fuzzy property set with a finite number of supports n, where μ_i denotes the degree to which x_i possesses the property P. A combination of r elements is composed out of the n elements. Let S_i^r denote the i-th such combination and μ(S_i^r) denote the degree to which the combination S_i^r, as a whole, possesses the property P. There are t = C(n, r) such combinations. The entropy of order r of the fuzzy set A is defined as

H^r = -(1/t) Σ_{i=1}^{t} ( μ(S_i^r) log μ(S_i^r) + (1 - μ(S_i^r)) log(1 - μ(S_i^r)) ).

H^r gives a measure of the average amount of difficulty in taking a decision on any subset of size r with respect to the property P. If r = 1, the entropy reduces to that of De Luca and Termini. H^r has the following properties:

1. H^r is maximum if μ_i = 0.5, and minimum if μ_i = 0 or 1, for all i = 1, 2, ..., n.
2. H^r ≥ H^{r*}, where H^{r*} is the r-th order entropy of a sharpened version of the fuzzy set.
3. H^r ≥ H^{r+1} for all μ_i ∈ [0, 0.5], and H^r ≤ H^{r+1} for all μ_i ∈ [0.5, 1].

G. Yager Entropy

Since the intersection of a fuzzy set and its complement is different from the empty set ∅, Yager [27], [28] introduced the cardinality of C = A ∩ A^c as a suitable fuzziness measure for the set A, with

μ_C(x) = min{ μ_A(x), 1 - μ_A(x) } = (1/2) ( μ_A(x) + (1 - μ_A(x)) - |μ_A(x) - (1 - μ_A(x))| ).

Then, for a finite universe X,

card(C) = (1/2) Σ_{x ∈ X} ( 1 - |μ_A(x) - (1 - μ_A(x))| ).   (15)

A fuzzy entropy measure H_A can be obtained from the above expression by dividing by the constant 0.5 card(X); this entropy is then normalized to 1. The sum Σ_{x ∈ X} |μ_A(x) - (1 - μ_A(x))| in equation (15) can be interpreted as the Hamming distance, ρ_1(A, A^c), between the sets A and A^c. Hence, H_A can be written as

H_A = 1 - ρ_1(A, A^c) / card(X).

Alternatively, the Euclidean distance, ρ_2(A, A^c), can be used instead of the Hamming distance to obtain an equivalent H_A. The Euclidean distance is given by

ρ_2(A, A^c) = ( Σ_{x ∈ X} ( μ_A(x) - (1 - μ_A(x)) )^2 )^{1/2}.

H. Kaufmann Entropy

Kaufmann [7] introduced an entropy to measure the fuzziness of a fuzzy set based on the relative membership values

φ_i = μ_i / Σ_{j=1}^{n} μ_j,   (16)

to which a Shannon-type entropy is applied. Kaufmann noticed that this method of computing the entropy of a fuzzy set does not depend on the effective values of μ but on their relative values, which come from the function given in equation (16). This is a drawback of the measure, because the relative values lead to the same entropy for different fuzzy sets and for ordinary sets.
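Yager's normalized measure, H_A = card(A ∩ A^c) / (0.5 card X), i.e., one minus the normalized Hamming distance between A and A^c, can be sketched as follows (our own illustration, not from the paper):

```python
def yager_entropy(mu):
    """Yager's fuzziness measure for a finite universe:
    H_A = 1 - rho_1(A, A^c) / card(X),
    where rho_1 is the Hamming distance between A and its complement."""
    n = len(mu)
    hamming = sum(abs(m - (1.0 - m)) for m in mu)   # rho_1(A, A^c)
    return 1.0 - hamming / n

# Crisp set: A and A^c are maximally distant, so the entropy is 0.
print(yager_entropy([0.0, 1.0]))    # -> 0.0
# Maximally fuzzy set: A coincides with A^c, so the entropy is 1.
print(yager_entropy([0.5, 0.5]))    # -> 1.0
```

Replacing the Hamming sum with a Euclidean norm gives the alternative form mentioned in the text; both agree on the crisp and maximally fuzzy extremes.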

I. Geometry of Fuzzy Sets and Entropy

Kosko [9], [10] introduced the fuzzy entropy of a fuzzy set based on the geometry of fuzzy sets and the distances between them. He viewed the fuzzy set as a fuzzy message and generalized the bit-value (bivalent) representation of an ordinary set to a fuzzy set, introducing fit (fuzzy unit) vectors as a generalization of bit vectors. For example, if X = {x_1, x_2, x_3, x_4} is a crisp set, then the subset A = {x_1, x_3} can be represented by the bit vector (1, 0, 1, 0), where the 1's represent the presence of an element in the subset and the 0's represent its absence. Kosko used the geometry of fuzzy sets to tackle the problem of the fuzzy power set, which can be summarized in the following question: what does the fuzzy power set F = 2^X, the set of all fuzzy subsets, look like? His answer is a cube (hypercube) in which every fuzzy subset is a point: the set of all fuzzy subsets is the unit hypercube I^n = [0, 1]^n. The geometry of fuzzy sets involves both the domain X = {x_1, x_2, ..., x_n} and the range of the mapping m_A : X → [0, 1]. The vertices of the cube I^n are the non-fuzzy sets; all points inside the cube are fuzzy subsets, and the coordinates of each point are the values of its membership function. On the vertices, the 0's and 1's represent the absence or presence of the elements (the fit vector).


To define a fuzzy entropy, Kosko used the hypercube together with the distance from the fuzzy subset A to its nearest vertex, A_near, and to its farthest vertex, A_far: if a denotes the distance l(A, A_near) to the nearest vertex and b denotes the distance l(A, A_far) to the farthest vertex, then the fuzzy entropy H(A) is the ratio of a to b. In other words,

H(A) = a / b.

This entropy satisfies the properties of fuzzy entropy (the De Luca and Termini properties). Moreover, Kosko used the overlap (intersection) and the underlap (union) to define another entropy based on the sigma count, i.e., the cardinality measure of a set. Since the overlap and the underlap characterize fuzziness, they can be involved in a measure of fuzziness. Using the geometry of the hypercube, Kosko proved that the fuzzy entropy of a fuzzy set is given by

H(A) = M(A ∩ A^c) / M(A ∪ A^c).   (17)

Although the fuzzy entropy defined by Kosko satisfies the four properties of fuzzy entropy (section II), it still suffers from a drawback, which pertains to the resources of the system: in the absence of high-performance machines, it may incur significant, undesirable computing overhead. To calculate the fuzzy entropy of a fuzzy set of 100 elements, for instance, a 100-dimensional hypercube has to be created. In similar work, Shang and Jiang [20] introduced a new fuzzy measure (entropy) based on the membership functions of the intersection and the union of a fuzzy set and its complement. If A is a fuzzy set, A^c is its complement, μ_{A∩A^c}(x_i) denotes the membership value of their intersection, and μ_{A∪A^c}(x_i) denotes the membership value of their union, then a fuzzy entropy can be defined as

H(A) = (1/n) Σ_{i=1}^{n} μ_{A∩A^c}(x_i) / μ_{A∪A^c}(x_i).

They proved that H(A) satisfies the De Luca-Termini axioms (P-1 to P-4) mentioned in section II. Furthermore, H(A) has the following properties in addition to the four axioms:

1. H(A ∪ B) + H(A ∩ B) = H(A) + H(B).
2. By analyzing the expression for H(A), a form of H(A) can be found that is similar to the fuzzy entropy proposed by Kosko [9], [10] based on the sigma count, as in equation (17), where M(A) = Σ_{i=1}^{n} m_A(x_i).
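Kosko's sigma-count ratio in equation (17) is cheap to compute directly from the membership values, without constructing the hypercube. A minimal sketch (our own illustration, using the standard min/max interpretation of intersection and union):

```python
def kosko_entropy(mu):
    """Kosko's fuzzy entropy, equation (17):
    H(A) = M(A intersect A^c) / M(A union A^c),
    where M is the sigma count (sum of memberships)
    and the complement A^c has membership 1 - mu."""
    overlap  = sum(min(m, 1.0 - m) for m in mu)   # M(A intersect A^c)
    underlap = sum(max(m, 1.0 - m) for m in mu)   # M(A union A^c)
    return overlap / underlap

print(kosko_entropy([0.0, 1.0]))     # -> 0.0  (crisp set)
print(kosko_entropy([0.5, 0.5]))     # -> 1.0  (maximally fuzzy set)
print(kosko_entropy([0.25, 0.75]))
```

Since the overlap is always dominated by the underlap, the ratio lies in [0, 1], vanishing exactly for crisp sets and reaching 1 when every membership equals 0.5.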

III. CONCLUSION

Different types of fuzzy information measures have been presented. The fuzzy entropy, as an extension of the classical entropy, has been introduced in its different aspects as a measure of the information in a fuzzy set. The main features and drawbacks of each entropy have also been highlighted. It is important to mention that the various definitions of entropy have been used for different types of applications. The authors are currently compiling and surveying the scope of applications of the different entropies and will report the findings once this work is completed.


REFERENCES

[1] M. Brown and C. Harris. Neurofuzzy Adaptive Modeling and Control. Prentice Hall, 1994.
[2] F. Criado and T. Gachechiladze. Entropy of fuzzy events. Fuzzy Sets and Systems, 88:99-106, 1997.
[3] C. W. de Silva. Intelligent Control: Fuzzy Logic Applications. CRC Press, 1995.
[4] D. Dubois and H. Prade. Fuzzy Sets and Systems: Theory and Applications. New York: Academic Press, 1980.
[5] J. Herencia and M. Lamata. Entropy measure associated with fuzzy basic probability assignment. In IEEE Int. Conf. on Fuzzy Systems, volume 2, pages 863-868, 1997.
[6] J. S. R. Jang, C. T. Sun, and E. Mizutani. Neuro-Fuzzy and Soft Computing. Prentice Hall, 1997.
[7] A. Kaufmann and M. Gupta. Introduction to Fuzzy Arithmetic: Theory and Applications. New York: Van Nostrand Reinhold Co., 1985.
[8] G. Klir, U. St. Clair, and B. Yuan. Fuzzy Set Theory: Foundations and Applications. Prentice-Hall, 1988.
[9] B. Kosko. Fuzzy entropy and conditioning. Information Sciences, 40:165-174, 1986.
[10] B. Kosko. Concepts of fuzzy information measure on continuous domains. Int. Journal on General Systems, 17:211-240, 1990.
[11] C. T. Lin and C. S. G. Lee. Neural Fuzzy Systems: A Neuro-Fuzzy Synergism to Intelligent Systems. Prentice Hall, 1996.
[12] A. De Luca and S. Termini. A definition of a non-probabilistic entropy in the setting of fuzzy sets theory. Information and Control, 20:301-312, 1972.
[13] A. De Luca and S. Termini. Entropy of L-fuzzy sets. Information and Control, 24(1):55-73, 1974.
[14] N. Pal and S. Pal. Higher order fuzzy entropy and hybrid entropy of a fuzzy set. Information Sciences, 61:211-221, 1992.
[15] A. Ramer. Fuzziness vs. probability. Int. Journal on General Systems, 17:241-248, 1990.
[16] B. Riecan and D. Markechova. The entropy of fuzzy dynamical systems, general scheme and generators. Fuzzy Sets and Systems, 96:191-199, 1998.
[17] A. Robert. Information Theory. New York: Dover Publications, 1990.
[18] I. Rudas and M. Kaynak. Entropy-based operations on fuzzy sets. IEEE Transactions on Fuzzy Systems, 6(1):33-39, 1998.
[19] W. Sander. On measures of fuzziness. Fuzzy Sets and Systems, 29:49-55, 1989.
[20] X. Shang and W. Jiang. A note on fuzzy information measures. Pattern Recognition Letters, 18:425-432, 1997.
[21] C. Shannon. A mathematical theory of communication. Bell Syst. Tech. J., 27:379-423, 1948.
[22] C. Shannon and W. Weaver. The Mathematical Theory of Communication. Urbana: University of Illinois Press, 1949.
[23] L. Tsoukalas and R. Uhrig. Fuzzy and Neural Approaches in Engineering. John Wiley and Sons, Inc., 1997.
[24] L. X. Wang. A Course in Fuzzy Systems and Control. Prentice Hall, 1997.
[25] W. Xie and S. Bedrosian. An information measure for fuzzy sets. IEEE Transactions on Systems, Man, and Cybernetics, 14(1), 1984.
[26] W. X. Xie and S. Bedrosian. The information in a fuzzy set and the relation between Shannon and fuzzy information. In 17th Annu. Conf. on Information Sciences and Systems, pages 249-254, 1983.
[27] R. Yager. On the measure of fuzziness and negation. Part I: Membership in the unit interval. International Journal on General Systems, 5:221-229, 1979.
[28] R. Yager. On the measure of fuzziness and negation. Part II: Lattices. Information and Control, 44:236-260, 1980.
[29] L. Zadeh. Fuzzy sets. Information and Control, 8:338-353, 1965.
[30] L. Zadeh. Probability measures of fuzzy events. J. Math. Anal. Appl., 23:421-427, 1968.