
Tutorial, IEEE WCCI 2016, Vancouver, Canada

Fuzzy Logic &


Machine Learning
H.R.Tizhoosh
KIMIA Lab, University of Waterloo, Canada

tizhoosh.uwaterloo.ca :: tizhoosh@uwaterloo.ca
WCCI 2016 :: Tutorial by H.R.Tizhoosh, KIMIA Lab, University of Waterloo :: http://tizhoosh.uwaterloo.ca/

Overview

u Fuzzy Logic
u Machine Learning
u Fuzzy Logic and Machine Learning
u A Critical Review
Fuzzy Logic

A Brief History of Fuzzy Logic

u 1920: Three-Valued Logic (Jan Łukasiewicz)

Source: https://en.wikipedia.org/wiki/Three-valued_logic

Vagueness
Source: www.uspsoig.gov



A Brief History of Fuzzy Logic

u 1923: Paper on vagueness (Bertrand Russell)

All traditional logic habitually assumes that precise symbols are being employed. It is therefore not applicable to this terrestrial life, but only to an imagined celestial existence.

Bertrand Russell

Vagueness

A Brief History of Fuzzy Logic

Vagueness

A Brief History of Fuzzy Logic

The vagueness of the word chair is typical of all terms whose application involves the use of the senses. In all such cases "borderline cases" and "doubtful objects" are easily found to which we are unable to say either that the class name does or does not apply.

Max Black
Vagueness: An exercise in logical analysis, 1937

Vagueness

Probability Theory (yesterday): Will it rain tomorrow? — uncertainty about whether the event occurs.

Fuzzy Logic (today, once the event occurs): It is raining! But what is the rain intensity? Drizzle, light, moderate, heavy, extreme?

Uncertain vs. Vague



On Friday, April 23, 1954, at 2:33 p.m., as Miss Emily Grierson died, 100% of the population of our town went to her funeral: the men through a (in 5% of cases usual) respectful affection for a fallen monument, 97.23% of the women out of curiosity to see the inside of her house, which no one save a 76-years-and-7-months-old manservant (50% gardener, 50% cook) had seen in >= 10 years.

Uncertain vs. Vague



When Miss Emily Grierson died, our whole town went to her funeral: the men through a sort of respectful affection for a fallen monument, the women mostly out of curiosity to see the inside of her house, which no one save an old manservant--a combined gardener and cook--had seen in at least ten years.

William Faulkner
A Rose for Emily

Uncertain vs. Vague



Modeling of Real World


Modeling of Human Reasoning

Set Theory
Logic
Measure Theory

u 1965: Paper of Fuzzy Sets (Lotfi Zadeh)


u 1966: Patern Recognition as interpolation of membership functions (Zadeh et al.)

Fuzzy Sets

u 1965: Paper of Fuzzy Sets (Lotfi Zadeh)


u 1966: Patern Recognition as interpolation of membership functions (Zadeh et al.)

Fuzzy Sets
WCCI 2016 :: Tutorial by H.R.Tizhoosh, KIMIA Lab, University of Waterloo :: http://tizhoosh.uwaterloo.ca/

Characteristic function: f_A(x) = 1 if x ∈ A, 0 if x ∉ A

The Law of Non-Contradiction: A ∩ Ā = ∅

The Law of Excluded Middle: A ∪ Ā = X

Fuzzy Sets

A = {(x, μ_A(x)) | x ∈ X}

A = ∫_X μ_A(x)/x

Example: Consider the set Neighbors of 4

A_classic = {3, 4, 5}

A_fuzzy = {0.6/1, 0.9/2, 1.0/3, 1.0/4, 1.0/5, 0.9/6, 0.6/7}

Fuzzy Sets


Fuzzy Sets

Intersection

μ_{A∩B}(x) = min(μ_A(x), μ_B(x)) ∀x ∈ X

A ∩ Ā ≠ ∅

Source: H.R.Tizhoosh, Fuzzy Bildverarbeitung, Springer, 1998

Fuzzy Sets

Union

μ_{A∪B}(x) = max(μ_A(x), μ_B(x)) ∀x ∈ X

A ∪ Ā ≠ X
Source: H.R.Tizhoosh, Fuzzy Bildverarbeitung, Springer, 1998

Fuzzy Sets

Complement

μ_Ā(x) = 1 − μ_A(x) ∀x ∈ X

Source: H.R.Tizhoosh, Fuzzy Bildverarbeitung, Springer, 1998

Fuzzy Sets
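The standard operations above can be sketched directly on membership vectors. A minimal sketch; the universe X = {1, …, 7} and the membership values are assumed purely for illustration:

```python
import numpy as np

# Two fuzzy sets over the same discrete universe, given as membership vectors
mu_A = np.array([0.6, 0.9, 1.0, 1.0, 1.0, 0.9, 0.6])
mu_B = np.array([0.0, 0.2, 0.5, 1.0, 0.5, 0.2, 0.0])

mu_intersection = np.minimum(mu_A, mu_B)  # standard fuzzy intersection (min)
mu_union        = np.maximum(mu_A, mu_B)  # standard fuzzy union (max)
mu_complement   = 1.0 - mu_A              # standard fuzzy complement

# Unlike crisp sets, the intersection of A with its complement
# is generally non-empty (the Law of Non-Contradiction fails):
mu_A_and_not_A = np.minimum(mu_A, 1.0 - mu_A)
```

Note how `mu_A_and_not_A` contains nonzero entries, which is exactly the A ∩ Ā ≠ ∅ property on the intersection slide.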

T-Norm
Boundary: T(0, 0) = 0, T(a, 1) = T(1, a) = a
Monotonicity: a ≤ c & b ≤ d ⟹ T(a, b) ≤ T(c, d)
Commutativity: T(a, b) = T(b, a)
Associativity: T(a, T(b, c)) = T(T(a, b), c)

Examples:
Yager: T(a, b) = 1 − min(1, ((1 − a)^w + (1 − b)^w)^(1/w)), w ∈ (0, ∞)
Dubois & Prade: T(a, b) = ab / max(a, b, α), α ∈ (0, 1)

Fuzzy Sets

T-Conorm (S-Norm)
Boundary: S(1, 1) = 1, S(a, 0) = S(0, a) = a
Monotonicity: a ≤ c & b ≤ d ⟹ S(a, b) ≤ S(c, d)
Commutativity: S(a, b) = S(b, a)
Associativity: S(a, S(b, c)) = S(S(a, b), c)

Examples:
Yager: S(a, b) = min(1, (a^w + b^w)^(1/w)), w ∈ (0, ∞)
Dubois & Prade: S(a, b) = (a + b − ab − min(a, b, 1 − α)) / max(1 − a, 1 − b, α), α ∈ (0, 1)

Fuzzy Sets
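The parameterized Yager pair can be written out in a few lines; a sketch, with the parameter value chosen arbitrarily. The boundary axioms can then be checked numerically:

```python
def yager_t_norm(a, b, w=2.0):
    # T(a, b) = 1 - min(1, ((1-a)^w + (1-b)^w)^(1/w))
    return 1.0 - min(1.0, ((1.0 - a) ** w + (1.0 - b) ** w) ** (1.0 / w))

def yager_s_norm(a, b, w=2.0):
    # S(a, b) = min(1, (a^w + b^w)^(1/w))
    return min(1.0, (a ** w + b ** w) ** (1.0 / w))

# Boundary conditions: T(a, 1) = a and S(a, 0) = a
t = yager_t_norm(0.7, 1.0)  # == 0.7
s = yager_s_norm(0.3, 0.0)  # == 0.3
```

For w → ∞ the Yager T-norm approaches min and the S-norm approaches max, i.e. the standard fuzzy intersection and union.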

Hypercube Representation

fuzzy set with one member fuzzy set with two members

Fuzzy Sets

Memberships

Interpretations of membership:
Membership as similarity
Membership as probability
Membership as intensity
Membership as approximation
Membership as compatibility
Membership as possibility

Generating MFs:
1. Subjective: intuition, expertise, knowledge
2. Automatic: clustering, neural nets, genetic algorithms

Fuzzy Sets

Memberships

Source: H.R.Tizhoosh, Fuzzy Bildverarbeitung, Springer, 1998

Fuzzy Sets

u 1972: Fuzzy controller for steam engine (Assilian and Mamdani)

Fuzzy Controller

u 1972: Fuzzy controller for steam engine (Aslani and Mamdani)

Fuzzy Controller
WCCI 2016 :: Tutorial by H.R.Tizhoosh, KIMIA Lab, University of Waterloo :: http://tizhoosh.uwaterloo.ca/

u 1972: Fuzzy controller for steam engine (Aslani and Mamdani)

Fuzzy Controller
WCCI 2016 :: Tutorial by H.R.Tizhoosh, KIMIA Lab, University of Waterloo :: http://tizhoosh.uwaterloo.ca/

Source: H.R.Tizhoosh, Fuzzy Bildverarbeitung, Springer, 1998

Fuzzy Controller

Fuzzy Controller

Source: H.R.Tizhoosh, Fuzzy Bildverarbeitung, Springer, 1998

Fuzzy Controller

COA (Center of Area)

Z_crisp = [ Σ_{z=α}^{β} z · μ(z) ] / [ Σ_{z=α}^{β} μ(z) ]

Source: H.R.Tizhoosh, Fuzzy Bildverarbeitung, Springer, 1998

Fuzzy Controller

Photograph of the "Fuzzy steam engine", Queen Mary College, 1974; reprint courtesy of Brian Gaines

Fuzzy Controller

u 1969: Concept of Fuzzy Partitioning (Enrique Ruspini)

Fuzzy Clustering

u 1973: First Fuzzy Clustering algorithm (FCM by Dunn and Bezdek)

Fuzzy Clustering

Fuzzy C-Means: M_fc = {U | μ_ik ∈ [0, 1]; Σ_{i=1}^{c} μ_ik = 1; 0 < Σ_{k=1}^{N} μ_ik < N}

1. Initialization

2. Class Centers: v_i = Σ_{k=1}^{N} (μ_ik)^m x_k / Σ_{k=1}^{N} (μ_ik)^m

3. Update Memberships: μ_ik = 1 / Σ_{j=1}^{c} (d_ik / d_jk)^(2/(m−1))

4. Stop Condition: ||U_t − U_{t−1}|| ≤ ε

Fuzzy Clustering
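The four FCM steps above can be sketched compactly with NumPy. A minimal sketch only; the function name, random initialization scheme, and the small constant guarding against division by zero are our assumptions, not part of the original algorithm statement:

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, eps=1e-5, max_iter=300, seed=0):
    """Minimal FCM sketch following the four steps above."""
    rng = np.random.default_rng(seed)
    N = X.shape[0]
    # 1. Initialization: random memberships, each column sums to 1
    U = rng.random((c, N))
    U /= U.sum(axis=0, keepdims=True)
    for _ in range(max_iter):
        Um = U ** m
        # 2. Class centers v_i (membership-weighted means)
        V = (Um @ X) / Um.sum(axis=1, keepdims=True)
        # 3. Membership update: mu_ik = 1 / sum_j (d_ik/d_jk)^(2/(m-1))
        d = np.linalg.norm(X[None, :, :] - V[:, None, :], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))
        U_new = inv / inv.sum(axis=0, keepdims=True)
        # 4. Stop when the membership matrix barely changes
        if np.abs(U_new - U).max() <= eps:
            U = U_new
            break
        U = U_new
    return U, V

# Two tight blobs: each point ends up with a dominant membership
X = np.vstack([np.zeros((5, 2)), np.full((5, 2), 5.0)])
U, V = fuzzy_c_means(X, c=2)
```

Unlike hard c-means, every point keeps a graded membership in every cluster; hard labels can be recovered afterwards by taking the argmax over clusters.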

FCM

With  2  features

Source: H.R.Tizhoosh, Fuzzy Bildverarbeitung, Springer, 1998

Fuzzy Clustering

u 1975: The Concept of Linguistic Variables (Lotfi Zadeh)

Linguistic Variables

u 1975: The Concept of Linguistic Variables (Lotfi Zadeh)

Linguistic Variables
WCCI 2016 :: Tutorial by H.R.Tizhoosh, KIMIA Lab, University of Waterloo :: http://tizhoosh.uwaterloo.ca/

"In retreating from precision in the face of overpowering complexity, it is natural to explore the use of what might be called linguistic variables, that is, variables whose values are not numbers but words or sentences in a natural or artificial language."

Zadeh

Linguistic Variables

(x, T(x), U, G, M)

x: variable name
T(x): set of its terms
U: universe of discourse
G: syntax rules
M: semantic rules

Linguistic Variables

Word | Average
always | 99
very often | 88
usually | 85
often | 78
relatively often | 65
balanced | 50
from time to time | 20
sometimes | 20
not usually | 10
seldom | 10
very seldom | 6
almost never | 3
never | 0
(Simpson, 1944)

Linguistic Hedges

μ_CON(x) = [μ(x)]^2

μ_INT(x) = 2[μ(x)]^2 for 0 ≤ μ(x) ≤ 0.5; 1 − 2[1 − μ(x)]^2 otherwise

μ_DIL(x) = [μ(x)]^0.5
Source: H.R.Tizhoosh, Fuzzy Bildverarbeitung, Springer, 1998

Linguistic Hedges

μ_A is known (bright, cold, tall, old, …)

μ_very A = CON(A)

μ_more or less A = DIL(A)

μ_very very A = CON(CON(A))

μ_not very A = 1 − CON(A)

μ_more A = (A)^1.25

μ_less A = (A)^0.75

Linguistic Hedges
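The hedge operators reduce to simple powers of the membership value. A small sketch; the concrete membership value for "tall" is assumed for illustration:

```python
def con(mu):
    # concentration ("very"): squares the membership, shrinking mid values
    return mu ** 2

def dil(mu):
    # dilation ("more or less"): square root, inflating mid values
    return mu ** 0.5

mu_tall = 0.81                       # assumed membership in "tall"
very_tall = con(mu_tall)             # 0.6561
more_or_less_tall = dil(mu_tall)     # 0.9
not_very_tall = 1.0 - con(mu_tall)   # 0.3439
very_very_tall = con(con(mu_tall))
```

Note that hedges only reshape an existing membership function; they never require new data, which is what makes them useful for building term sets like {tall, very tall, more or less tall} from a single base term.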

Fuzzy Logic

If a pixel is bright, then it is with high probability noise


Logic | Truth | Sets | Identity
Classical Reasoning | {0,1} | Crisp | Yes
Fuzzy Reasoning | [0,1] | Crisp | Yes
Approximate Reasoning | [0,1] | Fuzzy | Yes
Plausible Reasoning | [0,1] | Fuzzy | No
(relaxation increases from top to bottom)

Fuzzy Logic

1+4=5

‘about 1’ + ‘approximately 4’ = ?

Fuzzy Arithmetic

Extension Principle

μ_{A⊕B}(z) = sup_{x+y=z} min(μ_A(x), μ_B(y))

Source: H.R.Tizhoosh, Fuzzy Bildverarbeitung, Springer, 1998

Fuzzy Arithmetic
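On a discrete universe the extension principle is a direct sup-min computation. A sketch; the triangular membership values for 'about 1' and 'approximately 4' are assumed for illustration:

```python
from itertools import product

# Two discrete fuzzy numbers (membership values assumed)
about_1 = {0: 0.5, 1: 1.0, 2: 0.5}
approx_4 = {3: 0.5, 4: 1.0, 5: 0.5}

# Extension principle for addition:
# mu_{A+B}(z) = sup over all x + y = z of min(mu_A(x), mu_B(y))
result = {}
for (x, ma), (y, mb) in product(about_1.items(), approx_4.items()):
    z = x + y
    result[z] = max(result.get(z, 0.0), min(ma, mb))
```

The sum peaks with membership 1.0 at z = 5, i.e. 'about 1' + 'approximately 4' = 'about 5', with the fuzziness of both operands propagated into the result.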

Crisp Set: A = {x | x ∈ X, x has a certain property}

Fuzzy Set: Ã = {(x, μ_Ã(x)) | x ∈ X, μ_Ã(x) ∈ [0, 1]}

Type-II Fuzzy Set: Ã = {((x, u), μ_Ã(x, u)) | x ∈ X, u ∈ J_x ⊆ [0, 1]}

Type II Fuzzy Sets



Classical Set

Fuzzy Set

Fuzzy Set Type II

Type II Fuzzy Sets



u Mid 1970s – late 1980s: Booming of fuzzy applications in Japan and Europe

Rice Cooker Washing Machines Sendai Subway

Fuzzy Boom

u 1993: ANFIS (J.S.R. Jang)

▪ ANFIS (adaptive-network-based fuzzy inference system)
▪ A fuzzy inference system implemented in the framework of adaptive networks
▪ Hybrid learning procedure
▪ ANFIS can construct an input-output mapping based on both human knowledge (in the form of fuzzy if-then rules) and stipulated input-output data pairs

Fuzzy Learning

u 1993: ANFIS (J.S.R. Jang)

Fuzzy Reasoning

Equivalent ANFIS

Fuzzy Learning

u 1993: ANFIS (J.S.R. Jang)

Fuzzy Subspace

ANFIS with 9 rules

Fuzzy Learning

u 2004: Evolving Fuzzy Systems (P.P. Angelov et al.)

Fuzzy Evolution

Fuzzy Evolution

Evolving Fuzzy Image Segmentation

Fuzzy Evolution
Machine Learning

Heating Pressure Rice Cooker & Warmer

Bar Code Reader

OCR Pen

Smart Search
Source: http://earthxxii.deviantart.com/art/Etnogenez-Birth-of-Artificial-Intelligence-402715744
https://wall.alphacoders.com/by_sub_category.php?id=205999

A Brief History of Machine Learning

u 1901: First works on PCA (K.Person)


u 1933: PCA development (H. Hotelling)
u 2002: Principal Component Analysis (book by I. Jolliffe)

PCA Source: http://taygetea.com/, https://cnx.org/



A Brief History of Machine Learning

Source: https://www.youtube.com/watch?v=4pnQd6jnCWk

PCA

A Brief History of Machine Learning

>>> import numpy as np


>>> from sklearn.decomposition import PCA
>>> X = np.array([[-1, -1], [-2, -1], [-3, -2], [1, 1], [2, 1], [3, 2]])
>>> pca = PCA(n_components=2)
>>> pca.fit(X)
PCA(copy=True, n_components=2, whiten=False)
>>> print(pca.explained_variance_ratio_)
[ 0.99244... 0.00755...]

Source: http://scikit-learn.org/

PCA

u 1950: Alan Turing’s seminal paper

Source: Copeland, B. J. Artificial Intelligence (Oxford: Blackwell, 1993)

Turing Test

u 1980: Searle, John R., Minds, Brains, and programs

The Chinese Room


Source: macrovu.com

u 1958: Perceptrons (F. Rosenblatt)


u 1969: Limitations of Perceptrons (Minsky and Papert)

[Diagram: inputs x1 … xn weighted by w1 … wn, summed (Σ), passed through activation f to output y]
Perceptron
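Rosenblatt's learning rule fits in a few lines. A minimal sketch (function name, learning rate, and epoch count are our choices); it learns AND, which is linearly separable, while XOR would remain unlearnable, the limitation pointed out by Minsky and Papert:

```python
import numpy as np

def perceptron_train(X, y, lr=0.1, epochs=50):
    """Perceptron sketch: labels in {-1, +1}, linear threshold unit."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (w @ xi + b) <= 0:  # misclassified -> move the hyperplane
                w += lr * yi * xi
                b += lr * yi
    return w, b

# AND truth table with {-1, +1} labels
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y_and = np.array([-1, -1, -1, 1])
w, b = perceptron_train(X, y_and)
pred = np.sign(X @ w + b)
```

Because AND is separable, the perceptron convergence theorem guarantees the loop above stops making updates after finitely many mistakes.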

u 1982: Self-Organizing Maps (T. Kohonen)

Output layer

Input layer

SOM

u 1982: Self-Organizing Maps (T. Kohonen)

winning
neuron

neighborhood

SOM

u 1986: Backpropagation (Rumelhart et al.)

[Diagram: multilayer network of Σ/f units trained with backpropagation]

BackProp

u 1986: ID3 algorithm (J.R.Quinlan)


u 1993: C 4.5 algorithm (J.R.Quinlan)

Decision Trees

Decision Trees

Positive and negative Samples:

Expected information of the tree with A as root:

Information gain for taking A:

Decision Trees
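The ID3 quantities named on the slide — expected information of p positive and n negative samples, and the gain of splitting on an attribute A — can be computed directly. A sketch; the split counts below are an assumed example, not from the tutorial:

```python
import math

def entropy(p, n):
    """Expected information I(p, n) for p positive and n negative samples."""
    total = p + n
    out = 0.0
    for c in (p, n):
        if c:
            frac = c / total
            out -= frac * math.log2(frac)
    return out

def info_gain(p, n, splits):
    """Gain(A) = I(p, n) - sum over branches of weighted I(p_v, n_v)."""
    remainder = sum((pv + nv) / (p + n) * entropy(pv, nv)
                    for pv, nv in splits)
    return entropy(p, n) - remainder

# Hypothetical attribute splitting 9 positives / 5 negatives
# into three branches with the given (p_v, n_v) counts:
g = info_gain(9, 5, [(2, 3), (4, 0), (3, 2)])
```

ID3 chooses, at each node, the attribute with the largest gain; C4.5 refines this with the gain ratio to penalize many-valued attributes.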

u 1995: Support Vector Machines (Cortes and Vapnik)

SVM

The optimal
hyperplane

Optimal hyperplane
as linear
combination of
support vectors

Linear decision in the feature space

SVM

Labelled training data

Linearly separable

Or rewritten

The optimal hyperplane

Distance between two classes

…and for the optimal hyperplane
SVM
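The pieces above (optimal hyperplane, support vectors, class distance) can be reproduced on a toy set with scikit-learn. A sketch; the data points are assumed for illustration, and a large C is used to approximate the hard-margin case:

```python
import numpy as np
from sklearn.svm import SVC

# Two linearly separable classes
X = np.array([[-2, -1], [-1, -1], [-1, -2],
              [1, 1], [1, 2], [2, 1]], dtype=float)
y = np.array([-1, -1, -1, 1, 1, 1])

clf = SVC(kernel="linear", C=1e6)  # large C ~ hard margin
clf.fit(X, y)

w = clf.coef_[0]                   # normal of the optimal hyperplane
margin = 2.0 / np.linalg.norm(w)   # distance between the two classes
# The hyperplane is a linear combination of the support vectors only:
support_vectors = clf.support_vectors_
```

Only the points on the margin end up in `support_vectors`; removing any interior point leaves the decision surface unchanged.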

Non-Linearly Separable Cases

Feature Space
Input Space

SVM

u 1995: Boosting (Freund and Shapire)

Main idea:

Combine multiple weak classifier (instead


of trying to find the best classifier) in order
to increase classification accuracy

Boosting

Boosting

u 1995: Boosting (Freund and Shapire)

Given: (x_1, y_1), …, (x_m, y_m), y_i ∈ {−1, +1}
Initialize: D_1(i) = 1/m
For t = 1, …, T:
Train weak learner with D_t
Get weak hypothesis h_t and its error: ε_t = Σ_{i: h_t(x_i) ≠ y_i} D_t(i)

Set α_t = ½ ln((1 − ε_t)/ε_t)

Update D_{t+1}(i) = D_t(i) exp(−α_t y_i h_t(x_i)) / Z_t

Output final solution: H(x) = sign(Σ_{t=1}^{T} α_t h_t(x))

Boosting
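The AdaBoost loop above can be sketched with exhaustive threshold stumps as the weak learner. A minimal sketch; the stump search, the clipping of ε, and all function names are our choices for illustration:

```python
import numpy as np

def adaboost_stumps(X, y, T=10):
    """AdaBoost sketch with threshold stumps; labels in {-1, +1}."""
    n = len(y)
    D = np.full(n, 1.0 / n)          # D_1(i) = 1/n
    ensemble = []
    for _ in range(T):
        best = None
        # Weak learner: search over (feature, threshold, sign)
        for j in range(X.shape[1]):
            for thr in np.unique(X[:, j]):
                for s in (1, -1):
                    h = s * np.where(X[:, j] <= thr, 1, -1)
                    err = D[h != y].sum()
                    if best is None or err < best[0]:
                        best = (err, j, thr, s)
        err, j, thr, s = best
        err = min(max(err, 1e-10), 1 - 1e-10)   # avoid log(0)
        alpha = 0.5 * np.log((1 - err) / err)
        h = s * np.where(X[:, j] <= thr, 1, -1)
        D *= np.exp(-alpha * y * h)              # re-weight mistakes up
        D /= D.sum()                             # normalize (Z_t)
        ensemble.append((alpha, j, thr, s))
    return ensemble

def adaboost_predict(ensemble, X):
    score = sum(a * s * np.where(X[:, j] <= thr, 1, -1)
                for a, j, thr, s in ensemble)
    return np.sign(score)

X = np.array([[0.], [1.], [2.], [3.], [4.], [5.]])
y = np.array([1, 1, 1, -1, -1, -1])
ens = adaboost_stumps(X, y, T=5)
```

Each round concentrates weight on the examples the previous stumps got wrong, which is what lets many weak rules combine into a strong one.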

u 1995: Random Decision Forests (T.K. Ho)


u 2001: Random Forests (L.Breiman)

o A forest is a collection of several hundred to several thousand trees.
o The forest error depends on:
1. Correlation between trees: more correlation → more error
2. The quality of each individual tree (lower individual error → lower forest error)

Random Forests

Ensemble learning with a multitude of trees

Forest

Random Forests
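The correlation/quality trade-off is handled automatically by bagging plus random feature subsets. A sketch with scikit-learn on synthetic data (dataset shape and estimator count are arbitrary choices):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=200, n_features=8, random_state=0)

# Bagging + per-split feature sampling keeps the trees weakly correlated;
# oob_score_ estimates generalization error without a held-out test set.
forest = RandomForestClassifier(n_estimators=300, oob_score=True,
                                random_state=0)
forest.fit(X, y)

oob = forest.oob_score_
importances = forest.feature_importances_  # variable importance
```

The out-of-bag idea also reappears later in this tutorial, where a fuzzy random forest weights tree decisions by their OOB error.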

u 1995: Convolutional Neural Networks (LeCun and Bengio)


u 2006: Fast learning for deep belief nets (Hinton et al.)
u 2007: Greedy layer-wise training for deep nets (Bengio et al.)

Idea:
Neural networks can learn difficult recognition tasks if designed with more hidden layers (i.e., more than 5 hidden layers).

Challenge:
Training of deep networks was until recently practically impossible.

Deep Nets

[Diagram: convolution and max-pooling layers followed by fully connected layers]

Deep Nets

[-1.9 3.7 12.9 -23.8 … 21.6]

Deep Nets

Query
Images

Deep Nets
Fuzzy Logic &
Machine Learning

Fuzzy Logic & Random Forest

u Random forests, ensembles of weakly-correlated decision trees, can be used in


concert with fuzzy logic concepts to both classify storm types based on a
number of radar-derived storm characteristics and provide a measure of
“confidence” in the resulting classifications.
u The random forest technique provides measures of variable importance and
interactions, as well as methods for addressing missing data for transforming
the input data and structuring the final classification algorithm.

Fuzzy Logic & Random Forest

u Fuzzy Random Forest: A Multi-classifier based on a “forest” of randomly


generated fuzzy decision trees
u Combining
u the robustness of multi-classifiers
u the construction efficiency of decision trees
u the power of the randomness o increase the diversity, and
u the flexibility of fuzzy logic for data managing

Fuzzy Logic & Random Forest

1. Start with the set of training examples.
2. At any node N still to be expanded, compute the number of examples of each class.
3. Compute the standard information content.
4. At each node, search the set of remaining attributes to split the node:
• Select, with some criterion, the candidate attribute set to split the node.
• Compute the information content of each child node obtained from each candidate attribute.
• Select the candidate attribute whose information gain is maximal.
5. Divide N into sub-nodes according to the possible outputs of the attribute selected in the previous step.
6. Repeat steps 2-5 until the stop criterion is satisfied in all nodes.

Fuzzy Logic & Random Forest



Fuzzy Logic & Random Forest

u A multi-classifier system based on a forest of randomly generated


fuzzy decision trees (Fuzzy Random Forest),
u New method to combine their decisions to obtain the final decision of
the forest.
u The proposed combination is a weighted method based on the concept
of local fusion and on the data set Out Of Bag (OOB) error.

Fuzzy Logic & PCA

u Principal Components Analysis is sensitive to outliers, missing data,


and poor linear correlation between variables.
u Data transformations have a large impact upon PCA.
u Robust fuzzy PCA algorithm (FPCA): The matrix data is fuzzified, thus
diminishing the influence of the outliers.

Fuzzy Logic & PCA

u Nonlinear fuzzy robust principal component analysis (NFRPCA) algorithm


u After this preprocessing step the similarity classifier is then used for the
actual classification.
u The procedure was tested for dermatology, hepatitis and liver-disorder
data.
u Compared to results with classical PCA and the similarity classifier, higher
accuracies were achieved with the approach using nonlinear fuzzy robust
principal component analysis and the similarity classifier.

Fuzzy Logic & PCA

u A new method called PCA-TF is proposed that allows performing PCA on data sets of
trapezoidal (or triangular) fuzzy numbers, that may contain also real numbers and
intervals.
u A group of orthogonal axes is found that permits the projection of the maximum
variance of a real numbers’ matrix, where each number represents a trapezoidal fuzzy
number.
u The initial matrix of fuzzy numbers is projected to these axes by means of fuzzy
numbers arithmetic, which yields Principal Components and they are also fuzzy numbers.
u Based on these components it is possible to produce graphs of the individuals in two-
dimensional plane.
u It is also possible to evaluate the shape of the ordered pairs of fuzzy numbers and
visualize the membership function for each point on the z axis over the two-dimensional
xy plane.

Fuzzy Logic & PCA

[4] T. Denoeux, M. Masson, Principal Component Analysis of Fuzzy Data Using Autoassociative Neural Networks. IEEE Transactions on Fuzzy Systems, 12:336-349, 2004.

Fuzzy Logic & PCA

u Hybrid approach: Fuzzy + PCA


u A new PCA based monitoring that uses fuzzy logic capability
u The reason to use fuzzy logic: its good ability to approximate nonlinear
function with arbitrary accuracy
u Tested on Tennessee Eastman Process

Fuzzy Logic & PCA



Fuzzy Logic & PCA

The Tennessee Eastman (TE) Challenge Process



Fuzzy Logic & SVM

u A support vector machine (SVM) learns the decision surface from two distinct
classes of the input points.
u A fuzzy membership is applied to each input point
u The SVMs are reformulated such that different input points can make different
contributions to the learning of decision surface.
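A rough analogue of this FSVM idea can be sketched with scikit-learn's per-sample weights, which scale each point's slack penalty the way fuzzy memberships do. A sketch under assumptions: the synthetic data, the distance-to-class-mean membership function, and the use of `sample_weight` as a stand-in for the reformulated FSVM objective are all ours:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 0.5, (20, 2)),
               rng.normal(2, 0.5, (20, 2))])
y = np.array([-1] * 20 + [1] * 20)

# Membership as a function of distance to the class mean:
# far-away points (outlier candidates) get low membership.
centers = {-1: X[y == -1].mean(axis=0), 1: X[y == 1].mean(axis=0)}
d = np.array([np.linalg.norm(x - centers[c]) for x, c in zip(X, y)])
membership = 1.0 - d / (d.max() + 1e-9)

clf = SVC(kernel="linear")
clf.fit(X, y, sample_weight=membership)  # membership-weighted penalties
```

Points with membership near zero barely influence the decision surface, mirroring the FSVM behavior on outliers shown on the next slide.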

Fuzzy Logic & SVM


Since the fuzzy membership is a function of the mean and radius of each class, these two outliers are regarded as less important in FSVM training.

The result of SVM learning for data sets with outliers | The result of FSVM learning for data sets with outliers

Fuzzy Logic & SVM

u In SVM, an n-class problem is converted into n two-class problems.


u For the i-th two-class problem ones determines the optimal decision function
which separates class i from the remaining classes.
u Using the decision functions obtained by training the SVM, for each class, one
can define a truncated polyhedral pyramidal membership function.
u Since, for the data in the classifiable regions, the classification results are the
same for the two methods, the generalization ability of the FSVM is the same
with or better than that of the SVM.

Fuzzy Logic & SVM

Unclassifiable region by the two-class formulation

Class boundary with membership functions

Fuzzy Logic & SVM


u In the absence of additional information, fuzzy membership values are
usually selected based on the distribution of training vectors
u A number of assumptions are made about the underlying shape of this
distribution.
u An alternative method of generating membership values: generate
membership values iteratively based on the positions of training vectors
relative to the SVM decision surface itself.
u The algorithm is capable of generating results equivalent to an SVM with a
modified (non distance based) penalty (risk) function.

Fuzzy Logic & SVM

u The use of Receiver Operating Characteristic (ROC) Curve and the area under
the ROC Curve (AUC) has been used as a measure of the performance of
machine learning algorithms.
u A SVM classifier fusion model using genetic fuzzy system.
u Genetic algorithms are applied to tune the optimal fuzzy membership
functions.
u The performance of SVM classifiers are evaluated by their AUCs.
u AUC-based genetic fuzzy SVM fusion model produces not only better AUC but
also better accuracy than individual SVM classifiers.

Fuzzy Logic & SVM


[ROC plot: TP (True Positive) Rate vs. FP (False Positive) Rate; curves closer to the top-left corner are better]

Fuzzy Logic & SVM



Fuzzy Logic & SVM

u How to solve the sensitivity of SVM to noise and outliers


u Characterizations of fuzzy support vector machine (FSVM) can be analyzed.
u But the determination of fuzzy membership is a difficulty.
u New fuzzy membership function is proposed.
u Each sample points is given the tightness arranged forecasts by this method

Fuzzy Logic & SVM

u Fuzzy support vector machines (FSVMs) and the extraction of fuzzy rules from
SVMs.
u An FSVM is identical to a special type of SVM.
u Categorization and analysis of existing approaches to obtain fuzzy rules from
SVMs.
u Questioning the sense of extracting fuzzy rules from SVs:
• Simpler methods that output prototypical points (e.g., clustering approaches) can
used.

Fuzzy Logic & Deep Learning

u Fuzzy Restricted Boltzmann Machine (FRBM): the parameters governing


the model are replaced by fuzzy numbers.
u The original RBM becomes a special case in the FRBM, when there is no
fuzziness in the FRBM model.
u In the process of learning FRBM, the fuzzy free energy function is
defuzzified before the probability is defined.
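The defuzzify-then-define-probability step can be illustrated with a toy sketch. Triangular fuzzy numbers and centroid defuzzification are assumptions here; the FRBM paper's exact fuzzy arithmetic may differ.

```python
# Minimal illustration (not the FRBM derivation): represent a fuzzy
# free-energy value as a triangular fuzzy number (l, m, r), defuzzify it
# by its centroid, then plug the crisp value into a logistic probability.
import math

def centroid_triangular(l, m, r):
    """Centroid of a triangular fuzzy number with support [l, r] and peak m."""
    return (l + m + r) / 3.0

def probability_from_fuzzy_energy(l, m, r):
    """Crisp the fuzzy energy first, then define p = sigmoid(-F)."""
    f = centroid_triangular(l, m, r)
    return 1.0 / (1.0 + math.exp(f))

# A fuzzy energy symmetric around 0 behaves exactly like the crisp value 0.
p = probability_from_fuzzy_energy(-1.0, 0.0, 1.0)
```

When the fuzzy number collapses to a crisp value (l = m = r), the original RBM probability is recovered, mirroring the special-case relationship stated above.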

Fuzzy Logic & Deep Learning

[Diagram: an RBM is turned into an FRBM by replacing its crisp parameters with fuzzy numbers]
Fuzzy Logic & Deep Learning

u Text summarization: Measuring the worth of sentences for a summary


u Associating the Deep learning algorithm with fuzzy logic
u The fuzzifier is a process of translating the inputs into feature values.
Based on fuzzy values, rules are generated for each sentence by the
weight given to the features.
u A rule can be defined for the proposed approach as, a set of features
value is considered for judging the importance of sentences. The rules
are composed based on the importance of the each sentence.
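A minimal sketch of such rule-based sentence scoring follows. The feature names, membership shapes, and the use of the min t-norm for AND are illustrative assumptions, not the paper's exact design.

```python
# Hedged sketch: score each sentence with one fuzzy rule over toy features.
def mu_high(x):
    """Simple ramp membership for 'feature value is high' on [0, 1]."""
    return max(0.0, min(1.0, x))

def sentence_importance(keyword_ratio, position_score):
    # Rule: IF keyword_ratio is high AND position_score is high THEN important.
    # AND is realized by the min t-norm; the rule strength is the importance.
    return min(mu_high(keyword_ratio), mu_high(position_score))

sentences = {
    "intro sentence":  (0.8, 1.0),   # many keywords, first position
    "filler sentence": (0.1, 0.4),
}
scores = {name: sentence_importance(*f) for name, f in sentences.items()}
ranked = sorted(scores, key=scores.get, reverse=True)  # summary order
```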

“The proposed text summarization algorithm uses the fuzzy logic system has to assign class labels for the sentences, in order to compute the importance of each sentence. The fuzzy logic system accepts the pre summarized set of documents as input”

Fuzzy Logic & Deep Learning

u Embedding prior knowledge into the learning structure


u Two-step semi-supervised learning method called fuzzy deep belief networks
(FDBN) for sentiment classification
u A general deep belief networks (DBN) is trained by the semi-supervised
learning taken on training dataset.
u Then, a fuzzy membership function for each class of reviews is designed based
on the learned deep architecture.
u Since the training of DBN maps each review into the DBN output space, the
distribution of all training samples in the space is treated as prior knowledge
and is encoded by a series of fuzzy membership functions.
u A new FDBN architecture is constructed and the supervised learning stage is
applied to improve the classification performance of the FDBN.
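One way the encoded prior knowledge can be pictured is to fit a Gaussian-shaped membership function to each class's DBN outputs. This is a hedged 1-D sketch; the actual FDBN membership design may differ.

```python
# Sketch of the prior-knowledge step: build one Gaussian-shaped membership
# function per class from the mean/std of that class's DBN output values,
# so every new output gets a graded degree of belonging to each class.
import math

def gaussian_membership(outputs):
    """Return mu(x) fitted to one class's (toy, 1-D) DBN outputs."""
    mean = sum(outputs) / len(outputs)
    var = sum((o - mean) ** 2 for o in outputs) / len(outputs)
    std = math.sqrt(var) or 1e-9          # guard against zero spread
    return lambda x: math.exp(-((x - mean) ** 2) / (2 * std ** 2))

# Toy 1-D DBN output values for two sentiment classes.
mu_pos = gaussian_membership([0.8, 0.9, 0.85])
mu_neg = gaussian_membership([0.1, 0.2, 0.15])

# A new review whose DBN output lands near the positive cluster.
m_pos, m_neg = mu_pos(0.82), mu_neg(0.82)
```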

Fuzzy Logic & Deep Learning

Training of the FDBN consists of two stages:



Fuzzy Logic & Deep Learning

u Neural networks increasingly adopted in the prediction of exchange


rate
u However, most of them predict a specific number
u Small gap between the predicted values and the actual values may
lead to disastrous consequences.
u Forecast the fluctuation range of the exchange rate by combining
Fuzzy Granulation with Continuous-valued Deep Belief Networks
(CDBN)
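Fuzzy information granulation of a rate series can be sketched as follows, assuming triangular granules built from each window's minimum, median, and maximum (an illustrative choice, not necessarily the paper's).

```python
# Sketch of fuzzy information granulation: each window of the series is
# compressed into a (low, mid, high) triple, so a downstream model can
# forecast a fluctuation range rather than a single number.
def granulate(series, window):
    """Turn each non-overlapping window into a granule (min, median, max)."""
    granules = []
    for start in range(0, len(series) - window + 1, window):
        w = sorted(series[start:start + window])
        low, high = w[0], w[-1]
        if len(w) % 2:
            mid = w[len(w) // 2]
        else:
            mid = (w[len(w) // 2 - 1] + w[len(w) // 2]) / 2
        granules.append((low, mid, high))
    return granules

# Toy daily exchange rates, granulated into two windows of four days.
rates = [1.10, 1.12, 1.09, 1.11, 1.15, 1.14, 1.16, 1.13]
g = granulate(rates, 4)
# g[0] describes the first window's range: lowest, typical, and highest rate.
```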

Fuzzy Logic & Deep Learning

u Detecting abnormal behaviors in surveillance videos


u Using fuzzy clustering and multiple Auto-Encoders (FMAE).
u Many types of normal behaviors in the daily life: fuzzy clustering
u Multiple Auto-Encoders to estimate different types of normal behaviors
u Auto-Encoder is a good tool to capture common structures of normal
video due to large redundancies
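The fuzzy-clustering stage can be illustrated with a compact fuzzy c-means sketch on 1-D toy data; the FMAE paper's exact clustering setup is not specified here, so this shows only the general technique: every sample belongs to every cluster with a degree in [0, 1] instead of a hard assignment.

```python
# Compact fuzzy c-means (FCM) for 1-D data.
def fcm(points, c, m=2.0, iters=50):
    """Return memberships u (one row per point) and the cluster centres."""
    lo, hi = min(points), max(points)
    centres = [lo + (hi - lo) * j / (c - 1) for j in range(c)]  # spread init
    u = []
    for _ in range(iters):
        # Membership update: u_ij = 1 / sum_k (d_ij / d_ik)^(2/(m-1))
        u = []
        for p in points:
            d = [abs(p - ctr) + 1e-12 for ctr in centres]
            u.append([1.0 / sum((d[j] / d[k]) ** (2 / (m - 1))
                                for k in range(c)) for j in range(c)])
        # Centre update: membership-weighted mean of the points.
        centres = [sum((u[i][j] ** m) * points[i] for i in range(len(points)))
                   / sum(u[i][j] ** m for i in range(len(points)))
                   for j in range(c)]
    return u, centres

# Two well-separated groups of 1-D "behaviour features".
pts = [0.0, 0.1, 0.2, 5.0, 5.1, 5.2]
u, centres = fcm(pts, c=2)
```

Each row of `u` sums to 1; points near a cluster centre get a membership close to 1 for that cluster, while borderline behaviors are shared between clusters.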
A Critical Review

A Critical Review

u Machine learning has been changing rapidly


u “Hot topics” are changing quickly
u ML researchers are working on deep learning, manifold learning, structured
output pre-diction, sparsity and compressed sensing, constructive induction,
etc.,
u Most fuzzy researchers are still occupied with rule induction (“a topic that
matured and essentially stopped in ML research in the 1990s”).

A Critical Review

u Most works extend ML methods by delivery a fuzzy extension, e.g.,


u from rule induction to fuzzy rule induction,
u from decision trees to fuzzy decisions trees,
u from nearest neighbor estimation to fuzzy nearest neighbor estimation,
u from support vector machines to fuzzy support vector machines

A Critical Review
u “Fuzzification” of ML methods can be questioned:
u The intellectual challenge is typically not very high (scientific contribution
most likely not very deep)
u Increased flexibility through fuzzification could also be achieved by passing to
a more flexible non-fuzzy model class (e.g., using SVMs with Gaussian instead
of linear kernels)
u More flexibility may be a disadvantage (i.e., risk of overfitting)
u Increased computational complexity
u In some cases, the link to fuzzy sets and fuzzy logic appears to be somewhat
artificial (member-ship functions are used as a weighting function)

…however, potentials are there

Modeling:
u We need a suitable formalization of the problem
u Often overlooked in machine learning
u Fuzzy logic has much to offer in this regard

…however, potentials are there

Non-Inductive Inference:
u Transfer learning: Taking advantage of what has been learned in one domain
while learning in another domain
u Knowledge transfer: largely similarity-based or analogical
u Fuzzy inference can support that kind of formal reasoning

[Diagram: induction leads from a Specific Instance (Observation) up to a General Rule; deduction leads from the General Rule down to a Specific Instance (Prediction)]

Induction

u Going from specific to general
u Reasoning from evidence (observations/data) to draw a conclusion (establish a hypothesis)
u Always less certain than the evidence
u Used because examining all observations may be impossible/infeasible
u Example: neural networks, all types of classifiers

Deduction

u Going from general to specific
u A conclusion is drawn that follows logically
u Conclusion is true when the premise is true
u Used to exploit linguistically formulated knowledge
u Example: fuzzy rules established by experts

…however, potentials are there

Uncertainty:
u Uncertainty is everywhere
u Fuzzy framework can contribute to representation and handling of uncertainty
u Possibility theory for uncertainty formalisms, such as belief functions, and
imprecise probabilities, can be used for this purpose

Possibility Theory

Question: John is about 18, is he allowed to vote?

Answer: It is quite possible, but not certain

The uncertainty of an event is described in possibility theory at the same time:

• by the degree of possibility of this event and
• by the degree of possibility of the opposite event

Possibility Theory

Ω : reference set

A ⊆ Ω : event

g(A) : real number measuring the confidence in the occurrence of A

g(∅) = 0 (impossible event),   g(Ω) = 1 (sure event)

Possibility Theory

Monotonicity with respect to inclusion:

A ⊆ B ⇒ g(A) ≤ g(B)

Such set functions are called fuzzy measures, valuations, or confidence measures.

Possibility Theory
Consider the confidence measure concerning disjunctions:

∀A, B ⊆ Ω, g(A ∪ B) ≥ max(g(A), g(B))

The possibility measure:

∀A, B ⊆ Ω, Π(A ∪ B) = max(Π(A), Π(B))

Finite reference set:

∀A, Π(A) = sup{π(ω) | ω ∈ A},  where π(ω) = Π({ω})
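The maxitivity rule can be demonstrated directly for a finite reference set (the distribution π below is a toy example):

```python
# On a finite reference set, Pi(A) is the maximum of the possibility
# distribution pi over the elements of A (maxitivity, not additivity).
pi = {"w1": 1.0, "w2": 0.7, "w3": 0.3}

def Pi(event):
    """Possibility of an event, i.e. a subset of the reference set."""
    return max((pi[w] for w in event), default=0.0)

A, B = {"w2"}, {"w3"}
fused = Pi(A | B)   # equals max(Pi(A), Pi(B)), not Pi(A) + Pi(B)
```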

Possibility Theory
Consider the confidence measure concerning conjunctions:

∀A, B ⊆ Ω, g(A ∩ B) ≤ min(g(A), g(B))

The necessity measure:

∀A, B ⊆ Ω, N(A ∩ B) = min(N(A), N(B))

Finite reference set:

∀A, N(A) = inf{1 − π(ω) | ω ∉ A},  where π(ω) = Π({ω})

Possibility Theory

∀A, Π(A) = 1 − N(A^c)

Contradictory events (A^c denotes the complement of A):

max(Π(A), Π(A^c)) = 1

min(N(A), N(A^c)) = 0


Possibility Theory

Probability:  P(A ∪ B) = P(A) + P(B)  when  A ∩ B = ∅

Possibility:  Π(A ∪ B) = max(Π(A), Π(B))

Necessity:   N(A ∩ B) = min(N(A), N(B))

Possibility Theory

∀A, P(A) = Σ_{ω∈A} p(ω)

∀A, Π(A) = sup{π(ω) | ω ∈ A}

∀A, N(A) = inf{1 − π(ω) | ω ∉ A}

Σ_{ω∈Ω} p(ω) = 1

∃ω, π(ω) = 1

Possibility Theory

P(A) + P(A^c) = 1   vs.   max(Π(A), Π(A^c)) = 1

Π(A) + Π(A^c) ≥ 1

N(A) + N(A^c) ≤ 1

∀A, N(A) ≤ P(A) ≤ Π(A)
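The chain N(A) ≤ P(A) ≤ Π(A) can be verified exhaustively on a small example. The distributions p and π below are illustrative values chosen to be mutually consistent; they are not taken from the slides.

```python
# Numeric check of N(A) <= P(A) <= Pi(A) over every subset of a 3-element set.
from itertools import chain, combinations

omega = ["w1", "w2", "w3"]
p  = {"w1": 0.5, "w2": 0.3, "w3": 0.2}   # probability distribution
pi = {"w1": 1.0, "w2": 0.7, "w3": 0.3}   # possibility distribution

def P(A):  return sum(p[w] for w in A)
def Pi(A): return max((pi[w] for w in A), default=0.0)
def N(A):  return min((1 - pi[w] for w in omega if w not in A), default=1.0)

# Enumerate all 2^3 events and test the inequality chain on each.
events = [set(s) for s in chain.from_iterable(
    combinations(omega, r) for r in range(len(omega) + 1))]
ok = all(N(A) <= P(A) <= Pi(A) for A in events)
```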



Possibility Theory

Total Ignorance

∀ω ∈ Ω, p(ω) = 1/|Ω|   (e.g., 100 events: p = 0.01)

∀ω ∈ Ω, π(ω) = 1   (e.g., 100 events: π = 1)
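In code the contrast is immediate (a toy rendering of the 100-event case above):

```python
# Total ignorance over 100 events: probability must split the unit mass,
# while the possibility distribution can rate every event as fully possible.
n = 100
p_dist  = [1.0 / n] * n    # p(w) = 1/|Omega| = 0.01 for every event
pi_dist = [1.0] * n        # pi(w) = 1 for every event

total_p = sum(p_dist)      # probabilities are still forced to sum to 1
```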
