Module 2 Lecture Notes - AIML
MODULE – 2
➢ There are a variety of ways in which we can represent knowledge in Artificial
Intelligence.
a) Facts: truths in some relevant world; these are the things we want to represent.
b) Representations of facts in some chosen formalism; these are the things we will
actually be able to manipulate.
➢ These can also be structured at two levels;
a) The Knowledge Level: at which facts are described.
b) The Symbol Level: at which representations of the objects at the knowledge level are
defined in terms of symbols that can be manipulated by programs.
➢ There is a two-way mapping between Facts and Representations; these are called
"Representation Mappings".
➢ The forward representation mappings map from Facts to Representations, whereas
the backward representation mappings map the other way.
➢ One representation of facts is so common that it deserves special mention: natural
language sentences (particularly English sentences).
➢ Let us look at an example that uses mathematical logic as the representational
formalism;
a) Consider the English sentence: "Spot is a dog."
b) The fact represented above in English can be represented in logic as follows;
dog ( Spot )
c) Suppose that we also have a logical representation of the fact that "all dogs have tails";
∀x : dog(x) → hastail(x)
d) Then, using the deductive mechanisms of logic, we may generate the new representation
object as below;
hastail ( Spot )
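The deduction above can be sketched as one forward-chaining step. The fact encoding and function name below are illustrative, not part of the notes:

```python
# Facts are stored as (predicate, argument) tuples -- an illustrative encoding.
facts = {("dog", "Spot")}

def apply_rule(facts):
    """Apply the rule 'all dogs have tails': for every x with dog(x),
    infer hastail(x) and add it to the fact base."""
    inferred = {("hastail", x) for (pred, x) in facts if pred == "dog"}
    return facts | inferred

facts = apply_rule(facts)
print(("hastail", "Spot") in facts)  # True
```

Running the rule once derives hastail(Spot) exactly as the deductive mechanism of logic does in step d).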
➢ A good system for representing knowledge in a particular domain should
possess the following four properties;
i) Representational Adequacy:- the ability to represent all of the kinds of knowledge that
are needed in that domain.
ii) Inferential Adequacy:- the ability to manipulate the representational structures in such
a way as to derive new structures corresponding to new knowledge inferred from
old.
iii) Inferential Efficiency:- the ability to incorporate into the Knowledge structure additional
information that can be used to focus the attention of the inference mechanisms in the
most promising directions.
iv) Acquisitional Efficiency:- the ability to acquire new information easily.
➢ No single system optimizes all of these capabilities for representing knowledge.
➢ Multiple techniques exist for representing Knowledge.
➢ Some of the examples of Knowledge representation techniques are as follows;
i) Simple Relational Knowledge
✓ The simplest way to represent declarative facts is as a set of relations of the same
sort used in database systems, for example a table of objects and their attribute values.
✓ Relational knowledge of this kind can only describe the objects of the Knowledge Base.
ii) Inheritable Knowledge
✓ It is possible to augment the basic representation with inference mechanisms that
operate on the structure of the representation.
✓ By using the "Property Inheritance" method, we can have a hierarchy of classes, with
objects inheriting the attributes of the classes they belong to.
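The property-inheritance lookup described above can be sketched as follows; the class hierarchy, attributes, and object names are illustrative, not from the notes:

```python
# Illustrative isa hierarchy: child class -> parent class.
isa = {"Dog": "Mammal", "Mammal": "Animal"}
# Attributes stored locally on each class.
local_attrs = {
    "Animal": {"alive": True},
    "Mammal": {"legs": 4},
    "Dog": {"barks": True},
}
# instance links: object -> its class.
instance_of = {"Spot": "Dog"}

def get_attr(obj, attr):
    """Look up attr on the object's class, walking up the isa chain
    until the attribute is found or the hierarchy is exhausted."""
    cls = instance_of[obj]
    while cls is not None:
        if attr in local_attrs.get(cls, {}):
            return local_attrs[cls][attr]
        cls = isa.get(cls)          # follow the isa link upward
    return None

print(get_attr("Spot", "legs"))   # 4, inherited from Mammal
```

Spot has no "legs" attribute of its own; the value is inherited through Dog isa Mammal, which is exactly what property inheritance provides.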
iii) Inferential Knowledge
✓ Property Inheritance is not the only form of inference.
✓ Sometimes all the power of traditional logic is necessary to describe the inferences
that are needed.
✓ We can make use of first-order predicate logic to represent this additional
knowledge, using the standard symbols of predicate logic.
iv) Procedural Knowledge
✓ Procedural knowledge specifies what to do, and when to do it.
i) Important Attributes
✓ There are two attributes that are of very general significance: instance and isa.
✓ isa represents class inclusion (one class is a specialization of another), whereas
instance represents class membership (a particular object belongs to a class).
✓ These attributes are important because they support Property Inheritance.
✓ The names themselves do not matter; what matters is that they represent class
membership and class inclusion.
ii) Relationship among attributes
✓ The attributes that we use to describe objects are themselves entities that we
represent.
✓ There are four properties of attributes that are independent of the specific
knowledge they encode; they are;
❖ Inverses
▪ Entities in the world are related to each other in many different ways.
▪ In many cases, it is important to represent other views of
relationships.
▪ We can use attributes that focus on a single entity, but use them in
pairs, one the inverse of the other.
▪ Example: team(Pee-Wee-Reese, Brooklyn-Dodgers) can represent
the team information in two ways;
One associated with Pee-Wee-Reese -> team = Brooklyn-Dodgers
One associated with Brooklyn-Dodgers -> team-members = Pee-Wee-Reese
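Keeping an attribute and its inverse consistent can be sketched with two hypothetical dictionaries, one per view, updated together:

```python
team = {}            # player -> team          (the attribute)
team_members = {}    # team   -> set of players (its inverse)

def add_team_fact(player, club):
    """Record the fact once, updating both the attribute and its inverse
    so the two views of the relationship stay consistent."""
    team[player] = club
    team_members.setdefault(club, set()).add(player)

add_team_fact("Pee-Wee-Reese", "Brooklyn-Dodgers")
print(team["Pee-Wee-Reese"])                                # Brooklyn-Dodgers
print("Pee-Wee-Reese" in team_members["Brooklyn-Dodgers"])  # True
```

A single update routine is the usual way to guarantee that queries from either entity's point of view agree.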
❖ Existence in an isa hierarchy
▪ Just as there are classes of objects and specialized subsets of those
classes, there are attributes and specializations of attributes.
▪ Example: the attribute “height” is actually a specialization of the more
general attribute “Physical-Size”.
Problem 3: In many domains, it is not at all clear what the primitives should be.
Mary = daughter(brother(mother(Sue)))
Mary = daughter(sister(mother(Sue)))
Mary = daughter( brother(father(Sue)))
Mary=daughter(sister(father(Sue)))
If we do not know whether Mary is female, then we have another four
representations using son in place of daughter.
Therefore, the way to solve this problem is to change the set of primitives to;
Parent, Child, Sibling, Male, Female
Mary = Child(Sibling(Parent(Sue)))
With these primitives there is only one way to represent the relationship, whether
or not Mary's sex is known.
✓ In order to have access to the right structure for describing a particular situation, it
is necessary to solve all of the following problems;
iii) How to find a better structure if the one chosen initially turns out not to be
appropriate?
2) Predicate Logic
✓ Propositional logic is appealing because it is simple to deal with and a decision
procedure for it exists. Real-world facts can be written directly as propositions:
It is raining. RAINING
It is sunny. SUNNY
It is windy. WINDY
✓ In predicate logic, we can represent real-world facts as statements written as wffs
(Well-Formed Formulas).
Example: Consider the set of sentences given below. The facts described by these sentences can
be represented as a set of wffs in predicate logic as follows;
"Marcus was a man." Man(Marcus)
"Marcus was a Pompeian." Pompeian(Marcus)
"Caesar was a ruler." Ruler(Caesar)
7) People only try to assassinate rulers they are not loyal to.
∀x ∀y : person(x) ∧ ruler(y) ∧ tryassassinate(x, y) → ¬loyalto(x, y)
✓ Apart from ordinary predicates, we can also use the INSTANCE and ISA predicates.
✓ The predicate INSTANCE is binary: its first argument is an object, and its second
argument is a class to which the object belongs.
Example: instance(Marcus, Pompeian) and isa(Pompeian, Roman)
[2.3] Resolution
✓ Resolution is a procedure, which gains its efficiency from the fact that it operates on statements
that have been converted to a very convenient standard form.
✓ Resolution is used if various statements are given and need to prove a conclusion of those
statements.
✓ Resolution is a single inference rule which can efficiently operate on Conjunctive Normal Form.
✓ A clausal sentence is either a literal or a disjunction of literals.
Example: P ∨ Q
✓ A clause is a set of literals.
1. Eliminate implications
2. Standardize variables
3. Move negation inwards
4. Skolemization
5. Drop universal quantifiers
Example:
1) Eliminate implications
2) Standardize variables
- i.e. instead of using the same variable in all sentences, use different variables for each.
4) Skolemization
- i.e. remove the existential quantifiers and replace their variables by Skolem constants.
After Skolemization
Smile(A)
Graduating(B)
5) Drop universal quantifiers
After dropping the universal quantifiers
Smile(x)
Graduating(y)
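The first rewriting steps (eliminating implications, then moving negation inwards with De Morgan's laws) can be sketched on a small tuple-based formula encoding; the encoding and function names below are illustrative, not from the notes:

```python
# Formulas are nested tuples: ('->', a, b), ('&', ...), ('|', ...),
# ('~', a), or a plain string for an atom.

def eliminate_implication(f):
    """Rewrite a -> b as ~a | b, recursively."""
    if isinstance(f, str):
        return f
    op, *args = f
    args = [eliminate_implication(a) for a in args]
    if op == '->':
        a, b = args
        return ('|', ('~', a), b)
    return (op, *args)

def push_negation(f):
    """Move negation inwards: double negation and De Morgan's laws."""
    if isinstance(f, str):
        return f
    op, *args = f
    if op == '~' and not isinstance(args[0], str):
        iop, *iargs = args[0]
        if iop == '~':                                   # ~~a  =>  a
            return push_negation(iargs[0])
        if iop == '&':                                   # ~(a & b) => ~a | ~b
            return ('|', *[push_negation(('~', a)) for a in iargs])
        if iop == '|':                                   # ~(a | b) => ~a & ~b
            return ('&', *[push_negation(('~', a)) for a in iargs])
    return (op, *[push_negation(a) for a in args])

# (P & Q) -> R  =>  ~(P & Q) | R  =>  (~P | ~Q) | R
f = ('->', ('&', 'P', 'Q'), 'R')
print(push_negation(eliminate_implication(f)))
# → ('|', ('|', ('~', 'P'), ('~', 'Q')), 'R')
```

This reproduces the rewriting used below, where (P ∧ Q) → R becomes ¬P ∨ ¬Q ∨ R.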
Basis of Resolution
✓ The resolution procedure is a simple iterative process: at each step, two clauses,
called the parent clauses, are compared (resolved), yielding a new clause that has
been inferred from them.
✓ The new clause represents the ways in which the two parent clauses interact with
each other.
Winter ∨ Summer
¬Winter ∨ Cold
Resolving these two parent clauses yields the new clause: Summer ∨ Cold
Axiom: P             Clause form: P
Axiom: (P ∧ Q) → R   Clause form: ¬(P ∧ Q) ∨ R = ¬P ∨ ¬Q ∨ R
Axiom: (S ∨ T) → Q   Clause form: (¬S ∧ ¬T) ∨ Q = (¬S ∨ Q) ∧ (¬T ∨ Q)
Axiom: T             Clause form: T
To Prove: R
Step 1: Convert the given axioms into clause form, so that each clause is a disjunction of
literals with no conjunction inside it.
Example: A ∨ B is a valid clause.
A ∧ B is not; it must be split into the two separate clauses A and B.
Step 2: To prove the statement R by refutation, add its negation ¬R to the set of clauses and show that resolution derives a contradiction (the empty clause).
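The refutation in these two steps can be sketched as a small propositional resolution prover; the clause encoding (sets of string literals, with '~X' for negation) is illustrative:

```python
def negate(lit):
    """~P <-> P"""
    return lit[1:] if lit.startswith('~') else '~' + lit

def resolve(c1, c2):
    """All resolvents of two parent clauses (frozensets of literals)."""
    out = []
    for lit in c1:
        if negate(lit) in c2:
            out.append(frozenset((c1 - {lit}) | (c2 - {negate(lit)})))
    return out

def refutes(clauses):
    """Saturate under resolution; True iff the empty clause is derived."""
    clauses = set(clauses)
    while True:
        new = set()
        for a in clauses:
            for b in clauses:
                if a != b:
                    for r in resolve(a, b):
                        if not r:          # empty clause: contradiction
                            return True
                        new.add(r)
        if new <= clauses:                 # fixpoint, no refutation
            return False
        clauses |= new

# Clause forms of the axioms P, (P ∧ Q) → R, (S ∨ T) → Q, T:
axioms = [frozenset({'P'}),
          frozenset({'~P', '~Q', 'R'}),
          frozenset({'~S', 'Q'}),
          frozenset({'~T', 'Q'}),
          frozenset({'T'})]
print(refutes(axioms + [frozenset({'~R'})]))   # True: adding ¬R is contradictory, so R holds
```

The derivation it finds mirrors the hand proof: T with ¬T ∨ Q gives Q; P with ¬P ∨ ¬Q ∨ R gives ¬Q ∨ R; then Q gives R, which resolves with ¬R to the empty clause.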
✓ A procedural representation is one in which, the control information that is necessary to use the
knowledge is embedded in the knowledge itself.
✓ A declarative representation is one in which knowledge is specified, but how to use that
knowledge is not given.
✓ Logic programming is a programming language paradigm in which logical assertions are viewed as
programs. Example: PROLOG
✓ Matching
- Rules are selected by matching the current state against the pre-conditions of the rules.
Module – 2 Artificial Intelligence & Machine Learning (18CS71)
- The indexing method finds a match by using the current state as an index into
the rules and selecting the matching ones directly.
3) Concept Learning
➢ Concept learning is the process of searching through a large space of hypotheses
for the hypothesis that best fits the training examples.
Example (figure): a concept space of Gadgets, with Smart Phones and Tablets as
sub-concepts.
< Φ, Φ, Φ, Φ > represents the most specific hypothesis, which classifies every example as negative ("rejects all")
< ?, ?, ?, ? > represents the most general hypothesis, which classifies every example as positive ("accepts all")
Example: concept: Days on which Aldo enjoys his favorite water sport.
➢ The table below shows a set of example days, each represented by a set of attributes;
Example  Sky    AirTemp  Humidity  Wind    Water  Forecast  EnjoySport
1        Sunny  Warm     Normal    Strong  Warm   Same      Yes
2        Sunny  Warm     High      Strong  Warm   Same      Yes
3        Rainy  Cold     High      Strong  Warm   Change    No
4        Sunny  Warm     High      Strong  Cool   Change    Yes
➢ The attribute "EnjoySport" indicates whether Aldo enjoys his favorite water sport on that
day.
➢ The task is to learn to predict the value of "EnjoySport" for an arbitrary day based on the
values of its other attributes.
➢ Let each hypothesis be a vector of 6 constraints specifying the values of the 6 attributes
Sky, AirTemp, Humidity, Wind, Water, Forecast
➢ To illustrate the hypothesis that Aldo enjoys his favorite sport only on Cold days with High
humidity is represented as;
( ? , Cold, High, ?, ?, ? )
Algorithm
1) The first step of Find-S is to initialize h to the most specific hypothesis in H:
h = { Φ, Φ, Φ, Φ, Φ, Φ }
2) The first example in the table is positive. None of the Φ constraints in h are satisfied
by it, so each is replaced by the next more general constraint that fits the example:
h = { Sunny, Warm, Normal, Strong, Warm, Same }
3) The next example is also positive, so we further generalize h by substituting a '?' in
place of any attribute value in h that is not satisfied by the new example:
h = { Sunny, Warm, ?, Strong, Warm, Same }
4) The third example is negative, and the algorithm makes no change to h. In fact, the
Find-S algorithm simply ignores every negative example.
5) To complete our trace of Find-S, the fourth (positive) example leads to a further
generalization of h:
h = { Sunny, Warm, ?, Strong, ?, ? }
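The trace above can be reproduced with a minimal sketch of Find-S, assuming the standard EnjoySport training examples; None plays the role of Φ:

```python
def find_s(examples):
    """Find-S: start from the most specific hypothesis and minimally
    generalize it on each positive example; ignore negatives."""
    h = [None] * len(examples[0][0])        # most specific hypothesis (all phi)
    for x, label in examples:
        if label != 'Yes':                  # negative examples are ignored
            continue
        for i, v in enumerate(x):
            if h[i] is None:
                h[i] = v                    # phi -> the example's value
            elif h[i] != v:
                h[i] = '?'                  # mismatch -> generalize to '?'
    return h

examples = [
    (('Sunny', 'Warm', 'Normal', 'Strong', 'Warm', 'Same'), 'Yes'),
    (('Sunny', 'Warm', 'High',   'Strong', 'Warm', 'Same'), 'Yes'),
    (('Rainy', 'Cold', 'High',   'Strong', 'Warm', 'Change'), 'No'),
    (('Sunny', 'Warm', 'High',   'Strong', 'Cool', 'Change'), 'Yes'),
]
print(find_s(examples))   # ['Sunny', 'Warm', '?', 'Strong', '?', '?']
```

The result matches step 5 of the trace: h = { Sunny, Warm, ?, Strong, ?, ? }.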
The candidate elimination algorithm incrementally builds the version space given a
hypothesis space H and a set E of examples. The examples are added one by one;
each example possibly shrinks the version space by removing the hypotheses that are
inconsistent with the example. The candidate elimination algorithm does this by
updating the general and specific boundary for each new example.
• This can be considered an extended form of the Find-S algorithm.
• It considers both positive and negative examples.
• Positive examples are handled as in Find-S: they generalize the specific boundary.
• Negative examples specialize the general boundary.
S = { Φ, Φ, …, Φ } (initial specific boundary)
G = { ?, ?, …, ? } (initial general boundary)
Algorithm
Output :
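The boundary updates described above (positive examples generalize S, negative examples specialize G) can be sketched as follows for conjunctive hypotheses on the EnjoySport data. This is an illustrative implementation, not the exact pseudocode of the course; helper names and the choice of None for Φ are assumptions:

```python
def consistent(h, x):
    """True if hypothesis h classifies example x as positive."""
    return all(c == '?' or c == v for c, v in zip(h, x))

def generalize(s, x):
    """Minimal generalization of the specific hypothesis s to cover x."""
    return tuple(v if c is None else (c if c == v else '?')
                 for c, v in zip(s, x))

def more_general_or_equal(h, s):
    """True if h covers every example that s covers."""
    return all(hc == '?' or (hc == sc and sc != '?') for hc, sc in zip(h, s))

def specialize(g, x, values):
    """Minimal specializations of g that exclude the negative example x."""
    return [g[:i] + (v,) + g[i + 1:]
            for i, c in enumerate(g) if c == '?'
            for v in values[i] if v != x[i]]

def candidate_elimination(examples, values):
    n = len(examples[0][0])
    S = (None,) * n                       # most specific boundary (phi's)
    G = [('?',) * n]                      # most general boundary
    for x, label in examples:
        if label == 'Yes':                # positive: prune G, generalize S
            G = [g for g in G if consistent(g, x)]
            S = generalize(S, x)
        else:                             # negative: specialize offending G members
            new_G = []
            for g in G:
                if not consistent(g, x):
                    new_G.append(g)       # g already rejects x; keep it
                else:
                    new_G += [h for h in specialize(g, x, values)
                              if more_general_or_equal(h, S)]
            G = new_G
    return S, G

data = [
    (('Sunny', 'Warm', 'Normal', 'Strong', 'Warm', 'Same'), 'Yes'),
    (('Sunny', 'Warm', 'High',   'Strong', 'Warm', 'Same'), 'Yes'),
    (('Rainy', 'Cold', 'High',   'Strong', 'Warm', 'Change'), 'No'),
    (('Sunny', 'Warm', 'High',   'Strong', 'Cool', 'Change'), 'Yes'),
]
# Attribute value sets taken from the observed data (an assumption).
values = [set(x[i] for x, _ in data) for i in range(6)]
S, G = candidate_elimination(data, values)
print(S)   # ('Sunny', 'Warm', '?', 'Strong', '?', '?')
print(G)   # the maximally general hypotheses consistent with all examples
```

The final boundaries are S = (Sunny, Warm, ?, Strong, ?, ?) and G = {(Sunny, ?, ?, ?, ?, ?), (?, Warm, ?, ?, ?, ?)}; every hypothesis between them forms the version space.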