MOD 3.1 Propositional Logic and Reasoning
Prepared by: Anooa Joy
KNOWLEDGE-BASED AGENTS
• Implementation Level: This level is the physical representation of the knowledge level; it
specifies how the knowledge-based agent actually implements its stored knowledge.
KNOWLEDGE-BASED AGENT
TECHNIQUES USED FOR KNOWLEDGE
REPRESENTATION
• Logic: The basic method used to represent the knowledge of a
machine. The term logic here means applying formal reasoning over the
stored knowledge.
Logic can be further divided as:
1. Propositional Logic: This technique is also known
as propositional calculus, statement logic, or sentential logic. It is
used for representing knowledge about what is true and what is
false.
2. First-order Logic: Also known as predicate logic or first-
order predicate calculus (FOPL). This technique represents
objects using predicates and quantifiers. It differs
from propositional logic in that it can express the internal structure of a
sentence instead of treating it as an indivisible whole. In short, FOPL is a more
expressive extension of propositional logic.
TECHNIQUES USED FOR
KNOWLEDGE REPRESENTATION
3. Rule-based System: In a rule-based system, we impose if-then rules over the propositional
logic and first-order logic techniques. For example, take two variables A and B, both true;
a rule can then state: if the values of A and B are true, then the result is true. Such a
technique binds propositional as well as FOPL reasoning within explicit rules.
4. Semantic Networks: This technique stores knowledge in the system
in the form of a graph. Nodes of the graph represent objects that exist in the real
world, and arcs represent the relationships between those objects, showing
the connectivity of one object to another. For example, consider the
following knowledge stored in a machine (a small code sketch follows the list):
• Ram has a cycle.
• Ram is a boy.
• Cycle has a bell.
• Ram is 12 years old.
• Cycle has two paddles.
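The same facts can be stored as a small graph. A minimal sketch in Python (the (subject, relation, object) triple encoding and the helper name arcs_from are my own assumptions, not prescribed by the slides):

triples = [
    ("Ram", "has", "cycle"),
    ("Ram", "is_a", "boy"),
    ("cycle", "has_part", "bell"),
    ("Ram", "age", "12 years"),
    ("cycle", "has_part", "two paddles"),
]

# Follow every arc leaving a node to see how objects are connected.
def arcs_from(node):
    return [(rel, obj) for subj, rel, obj in triples if subj == node]

print(arcs_from("Ram"))    # [('has', 'cycle'), ('is_a', 'boy'), ('age', '12 years')]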
TECHNIQUES USED FOR
KNOWLEDGE REPRESENTATION
5. Frames: In this technique, knowledge is stored via slots and fillers. Slots are the entities
and fillers are their attributes, similar to fields in a database record; together they are stored in
a frame. Whenever there is a requirement, the machine infers the necessary information from the
frame to make the decision. For example, "Tomy is a dog having one tail" can be framed as
(a dict-based sketch follows item 6):
Tomy((Species (Value = Dog))
(Feature (Value = Tail)))
6. Script: An advanced technique built on frames. Here, the information is stored in the form
of a script containing all the required information. The system
infers the information from the script and solves the problem.
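For the frame in item 5, a minimal sketch assuming nested Python dicts as the encoding (the slides do not prescribe one; get_filler is a hypothetical helper):

# The Tomy frame: slots map to filler structures.
tomy = {
    "Species": {"Value": "Dog"},
    "Feature": {"Value": "Tail"},
}

# Fetch a slot's filler value, or None when the slot is absent.
def get_filler(frame, slot):
    return frame.get(slot, {}).get("Value")

print(get_filler(tomy, "Species"))   # Dog
print(get_filler(tomy, "Colour"))    # None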
LOGIC AS A KR LANGUAGE
WUMPUS WORLD
GAME DESCRIPTION
The Wumpus World is a cave consisting of rooms
connected by passageways. Lurking somewhere in the
cave is the Wumpus, a beast that eats any agent that
enters its room. The Wumpus can be shot by an agent,
but the agent has only one arrow. Some rooms contain
bottomless pits that trap any agent that wanders into
the room. Occasionally, there is a heap of gold in a
room. The goal is to collect the gold and exit the world
without being eaten.
WUMPUS WORLD ENVIRONMENT
• The agent always starts in the field [1,1].
• The task of the agent is to find the gold, return to the field
[1,1] and climb out of the cave.
• Squares adjacent to the Wumpus are smelly and squares
adjacent to a pit are breezy (adjacency is not diagonal)
• Glitter iff gold is in the same square
• Shooting kills Wumpus if you are facing it
• Wumpus emits a horrible scream when it is killed that can be
heard anywhere
• Shooting uses up the only arrow
• Grabbing picks up gold if in same square
• Releasing drops the gold in same square
PEAS DESCRIPTION
• Performance measure
• gold: +1000, death: -1000
• -1 per step, -10 for using the arrow
• Environment
• Squares adjacent to Wumpus are smelly
• Squares adjacent to pit are breezy
• Glitter iff gold is in the same square
• Gold is picked up by reflex, can’t be dropped
• Shooting kills Wumpus if you are facing it. It screams
• Shooting uses up the only arrow
• Grabbing picks up gold if in same square
• Releasing drops the gold in same square
• You bump if you walk into a wall
• Actuators: Face, Move, Grab, Release, Shoot
• Sensors: Stench, Breeze, Glitter, Bump, Scream
WUMPUS WORLD CHARACTERIZATION
1. Deterministic: Yes – outcomes are exactly specified
2. Static: Yes – the Wumpus and pits do not move
3. Discrete: Yes
4. Single-agent: Yes – the Wumpus is essentially a natural feature
5. Fully observable: No – only local perception
6. Episodic: No – what was observed before (breezes, pits, etc.) remains very
useful, so the environment is sequential.
EXPLORING THE WUMPUS WORLD
• Each sentence makes a claim about the world. An agent is said to believe the sentences it holds about the world.
• E.g., the language of arithmetic:
• x+2 ≥ y is a sentence; x2+y > {} is not a sentence
• x+2 ≥ y is true iff the number x+2 is no less than the number y
• x+2 ≥ y is true in a world where x = 7, y = 1
• x+2 ≥ y is false in a world where x = 0, y = 6
INFERENCING WITH KNOWLEDGE
AND ENTAILMENT
• Inferencing is how we derive:
• Conclusions from existing knowledge;
• New information from existing information. Inferencing might be used in both ASK and TELL
operations.
• Entailment is the generation or discovery that a new sentence is true given existing
sentences; it means that one thing follows logically from another. Entailment is a
relationship between sentences (i.e., syntax) that is based on semantics. A knowledge base KB
entails a sentence α if and only if α is true in all worlds where KB is true, i.e., KB ⊨ α
• E.g.
1. KB containing “the Phillies won” and “the Reds won” entails “Either the Phillies won or the
Reds won”
2. x+y = 4 entails 4 = x+y
MODELS
• Logicians typically think in terms of models, which are
formally structured worlds with respect to which truth can be
evaluated.
• We say m is a model of a sentence α if α is true in m.
• If M(α) is the set of all models of α, then KB ⊨ α iff
M(KB) ⊆ M(α)
• E.g.
1. KB = Phillies won and Yankees won
2. α = Phillies won
THE CONNECTION BETWEEN
SENTENCES AND FACTS
• The property of one fact following from another is mirrored by the property of one sentence being entailed
by another.
• If KB is true in the real world, then any sentence α derived from KB by a sound inference procedure is
also true in the real world
ENTAILMENT IN THE WUMPUS WORLD
• Situation after detecting nothing in [1,1], moving right, breeze in [2,1]
• Consider possible models for KB assuming only pits
• 3 Boolean choices ⇒ 8 possible models
WUMPUS MODELS
SYNTAX
• The symbols and the connectives together define the syntax of the language. Again, syntax is like grammar.
• TRUTH SYMBOLS: T (true) and F (false) are provided by the language; every truth symbol is either T or F.
• PROPOSITIONAL SYMBOLS: P, Q, R, etc. each mean something in the environment. Proposition symbols are sentences.
• E.g.: P means “It is hot”, Q means “It is humid”, R means “It is raining”; “If it is hot and humid, then it is raining” is written
(P Ʌ Q) ⇒ R
• Syntax can have:
• ATOMIC SENTENCE: Truth and propositional symbols are considered ATOMIC SENTENCES. Atomic sentences must have
truth assigned (i.e., be assigned T or F).
• COMPLEX SENTENCES: More complex sentences are formed using connectives. Sentences formed in this way can be
called Well-Formed Formula (WFF). The evaluation of complex sentences is done using truth tables for the connectives.
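To make the syntax concrete, here is a short sketch using sympy (the choice of library is my own; the slides do not prescribe one) that builds the hot/humid/raining sentence and evaluates it in one world:

# Build (P Ʌ Q) ⇒ R and evaluate it under one truth assignment.
from sympy import symbols
from sympy.logic.boolalg import And, Implies

P, Q, R = symbols("P Q R")        # P: "It is hot", Q: "It is humid", R: "It is raining"
sentence = Implies(And(P, Q), R)  # (P Ʌ Q) ⇒ R

# A world that is hot and humid but not raining falsifies the sentence.
print(sentence.subs({P: True, Q: True, R: False}))   # False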
SEMANTICS
• Need to be able to evaluate sentences to true or false. The truth tables define the semantics of the language.
LOGICAL CONNECTIVES
P Q ¬P P∧Q P∨Q P⇒Q P⇔Q
False False True False False True True
False True True False True True False
True False False False True False False
True True False True True True True
• And Elimination (inference rule): from a conjunction, either conjunct can be inferred, e.g., from A Ʌ B infer A.
VALIDITY AND SATISFIABILITY
• Validity: A sentence is valid if it is true in every model. A valid sentence is also
known as a tautology: it is necessarily true under every set of values.
Eg: A V ¬A, A ⇒ A
• Satisfiability: A sentence is satisfiable if it is true for at least some set of values.
• Both can be checked by truth-table enumeration.
• (P V Q) → (P Ʌ Q)
P Q PVQ PɅQ (P V Q) → (P Ʌ Q)
False False False False True
False True True False False
True False True False False
True True True True True
• From the above truth table, it is clear that the given expression is satisfiable but not valid.
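Truth-table enumeration is easy to mechanize. A minimal sketch in plain Python (the function name truth_column is my own), applied to the formula above:

from itertools import product

# Evaluate a boolean function over all 2^n assignments.
def truth_column(sentence, n_vars):
    return [sentence(*row) for row in product([False, True], repeat=n_vars)]

f = lambda p, q: (not (p or q)) or (p and q)   # (P V Q) → (P Ʌ Q)
column = truth_column(f, 2)
print(any(column))   # True  -> satisfiable (true in at least one row)
print(all(column))   # False -> not valid (false in some row)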
EXAMPLE 2:
• ((A → B) Ʌ A) → B
A B A→B (A → B) Ʌ A ((A → B) Ʌ A) → B
False False True False True
False True True False True
True False False False True
True True True True True
• The final column is true in every row, so ((A → B) Ʌ A) → B is valid (this is just Modus Ponens).
• Given a knowledge base KB (a set of sentences) and a sentence α (called a theorem): does the
KB semantically entail α? In other words, in all interpretations in which the sentences in the KB
are true, is α also true? I.e., KB ⊨ α?
• Three approaches:
• Truth-table approach
• Deduction using Inference rules
• Proof by Contradiction or Resolution-refutation
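A minimal sketch of the truth-table approach (the name tt_entails is my own, loosely echoing the textbook's TT-ENTAILS): α must be true in every assignment that makes all KB sentences true.

from itertools import product

# KB |= alpha iff alpha holds in every model of KB.
def tt_entails(kb, alpha, n_vars):
    for world in product([False, True], repeat=n_vars):
        if all(sentence(*world) for sentence in kb) and not alpha(*world):
            return False             # a model of KB in which alpha fails
    return True

# "The Phillies won" entails "the Phillies won or the Reds won":
kb = [lambda p, r: p]
alpha = lambda p, r: p or r
print(tt_entails(kb, alpha, 2))      # True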
DEDUCTION THEOREM & PROOF BY
CONTRADICTION
Deduction Theorem (connects inference and validity)
• KB ⊨ α if and only if KB ⇒ α is valid
Proof by Contradiction (Refutation, reductio ad absurdum)
• KB ⊨ α holds if and only if the sentence KB Ʌ ¬α is a contradiction (unsatisfiable).
• Monotonicity
• If we have a proof, adding information to the KB cannot invalidate it; i.e., the set of entailed
sentences can only increase as information is added to the KB.
DEDUCTION EXAMPLE
• Given KB:
• P Ʌ Q
• P → R
• (Q Ʌ R) → S
• Prove S: And-Elimination on P Ʌ Q gives P and Q; Modus Ponens on P and P → R gives R;
And-Introduction gives Q Ʌ R; Modus Ponens on (Q Ʌ R) → S gives S.
• R7: ¬(P1,2 V P2,1)
• Apply De Morgan's rule to R7:
• R8: ¬P1,2 Ʌ ¬P2,1
KB IN RESTRICTED FORMS
• If the sentences in the KB are restricted to certain special forms, some of the sound inference
rules become complete as well
• Example:
• Horn form (Horn normal form)
• CNF (Conjunctive Normal Forms)
PROPOSITIONAL THEOREM PROVING
• Searching for proofs is more efficient than enumerating models, since irrelevant
information can be ignored; truth tables have an exponential number of models.
• The idea of inference is to apply inference rules to the KB repeatedly.
• An inference rule can be applied whenever suitable premises are found in the KB.
• Theorem proving means applying rules of inference directly to the sentences.
• Two ways to ensure completeness:
1. Proof by resolution: use sequence of powerful inference rules (resolution rule) and construction
of / search for a proof. Resolution works best when the formula is of the special form CNF.
• Properties: typically requires translation of sentences into a normal form.
2. Forward or Backward chaining: use of modus ponens on a restricted form of propositions (Horn
clauses)
NORMAL FORMS
• Literal: A literal is an atomic sentence (propositional symbol), or the negation of an atomic
sentence. Eg:- p (positive literal), ¬p (negative literal)
• Clause: A disjunction of literals. Eg:- ¬p ∨ q
• Conjunctive Normal Form (CNF): A conjunction of disjunctions of literals, i.e., a conjunction
of clauses. Eg:- (A V ¬B) Ʌ (B V ¬C V ¬D)
• Disjunctive Normal Form (DNF): The dual of CNF: a disjunction of
conjunctions of literals. Eg:- (A1 Ʌ B1) V (A2 Ʌ B2) V … V (An Ʌ Bn)
• In DNF, it is an OR of ANDs (a sum of products), whereas in CNF, it is an
AND of ORs (a product of sums).
CNF TRANSFORMATION
• In propositional logic, the resolution method is applied only to clauses, i.e., disjunctions of
literals. The following steps convert a sentence into CNF:
1) Eliminate biconditionals by replacing A ⇔ B with (A → B) Ʌ (B → A).
2) Eliminate implications by replacing A → B with ¬A V B.
3) In CNF, negation (¬) appears only in literals, so move negation inwards using:
¬(¬A) ≡ A (double-negation elimination)
¬(A Ʌ B) ≡ (¬A V ¬B) (De Morgan)
¬(A V B) ≡ (¬A Ʌ ¬B) (De Morgan)
4) Finally, apply the distributive law to obtain the CNF:
(A1 V B1) Ʌ (A2 V B2) Ʌ … Ʌ (An V Bn).
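The four steps above are what sympy's to_cnf performs internally; a short sketch (the example formula is my own choice):

# Convert A ⇔ B to CNF; elimination, De Morgan and distribution happen inside to_cnf.
from sympy import symbols
from sympy.logic.boolalg import Equivalent, to_cnf

A, B = symbols("A B")
print(to_cnf(Equivalent(A, B)))   # e.g. (A | ~B) & (B | ~A)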
• Prove R from:
1. (P → Q) → Q
2. (P → P) → R
3. (R → S) → ¬(S → Q)
SOLUTION
• Convert each premise to CNF:
1. (P → Q) → Q ≡ ¬(¬P V Q) V Q ≡ (P Ʌ ¬Q) V Q ≡ (P V Q) Ʌ (¬Q V Q) ≡ (P V Q) Ʌ T ≡ P V Q
2. (P → P) → R ≡ ¬(¬P V P) V R ≡ (P Ʌ ¬P) V R ≡ (P V R) Ʌ (¬P V R)
3. (R → S) → ¬(S → Q) ≡ ¬(¬R V S) V ¬(¬S V Q) ≡ (R Ʌ ¬S) V (S Ʌ ¬Q)
≡ (R V S) Ʌ (¬S V S) Ʌ (R V ¬Q) Ʌ (¬S V ¬Q) ≡ (R V S) Ʌ T Ʌ (R V ¬Q) Ʌ (¬S V ¬Q)
• Clauses, plus the negated goal:
1. P V Q
2. P V R
3. ¬P V R
4. R V S
5. R V ¬Q
6. ¬S V ¬Q
7. ¬R (negated goal)
• Resolution steps:
8. S (from 4, 7)
9. ¬Q (from 6, 8)
10. P (from 1, 9)
11. R (from 3, 10)
12. F (contradiction, from 7, 11), so R is proved.
PROPOSITIONAL RESOLUTION EXAMPLE
• Consider the following Knowledge Base:
1. The humidity is high or the sky is cloudy.
2. If the sky is cloudy, then it will rain.
3. If the humidity is high, then it is hot.
4. It is not hot.
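A compact resolution-refutation sketch for this KB (the clause encoding and the symbols H = humid, C = cloudy, R = rain, T = hot are my own): negate the goal "it will rain" and saturate under resolution.

from itertools import combinations

def negate(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

# All resolvents of two clauses (clauses are frozensets of string literals).
def resolve(c1, c2):
    return [(c1 - {lit}) | (c2 - {negate(lit)}) for lit in c1 if negate(lit) in c2]

# Saturate under resolution; True when the empty clause is derived.
def refutes(clauses):
    clauses = set(clauses)
    while True:
        new = set()
        for c1, c2 in combinations(clauses, 2):
            for r in resolve(c1, c2):
                if not r:
                    return True      # empty clause: contradiction found
                new.add(r)
        if new <= clauses:
            return False             # nothing new: no contradiction derivable
        clauses |= new

kb = [frozenset({"H", "C"}),         # 1. humidity is high or sky is cloudy
      frozenset({"~C", "R"}),        # 2. cloudy -> rain
      frozenset({"~H", "T"}),        # 3. humid -> hot
      frozenset({"~T"})]             # 4. it is not hot
print(refutes(kb + [frozenset({"~R"})]))   # True: KB |= R ("it will rain")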
Given KB
• R1: (¬B1,1 V P1,2 V P2,1) Ʌ (¬P1,2 V B1,1) Ʌ (¬P2,1 V B1,1)
• R2: ¬B1,1
• R3: negation of the theorem ¬P1,2, i.e., P1,2
• R1 can be split up as R4: (¬B1,1 V P1,2 V P2,1), R5: (¬P1,2 V B1,1), R6: (¬P2,1 V B1,1)
• Resolving R5 with R2 (equivalently, Modus Tollens on P1,2 → B1,1 and ¬B1,1) yields R7: ¬P1,2
• R7 and R3 are complementary literals, a contradiction; hence the negated theorem is refuted and KB ⊨ ¬P1,2
HORN CLAUSES AND DEFINITE CLAUSES
• DEFINITE CLAUSE: A disjunction of literals of which exactly one is positive.
• (¬L1,1 V ¬Breeze V B1,1) Yes
• (¬B1,1 V P1,2 V P2,1) No
• HORN CLAUSE: A disjunction of literals of which at most one is positive, i.e., a CNF clause with
at most one positive literal. The positive literal is called the head, and the negative literals are
called the body. All definite clauses are Horn clauses.
• (¬L1,1 V ¬Breeze V B1,1) Yes
• (¬B1,1 V ¬P1,2 V ¬P2,1) Yes
• (¬B1,1 V P1,2 V P2,1) No
• Horn clauses are closed under resolution, i.e., if two Horn clauses are resolved we get back a Horn clause.
• Not all sentences in propositional logic can be converted into Horn form.
• GOAL CLAUSE: A clause with no positive literal.
• (¬L1,1 V ¬Breeze V B1,1) No
• (¬B1,1 V ¬P1,2 V ¬P2,1) Yes
HORN CLAUSES
• Horn clauses can be rewritten as implications, i.e., logic propositions of the form p1 Ʌ
… Ʌ pn → q.
• Eg: (¬C V ¬B V A) can be written as (C Ʌ B) → A
• KB = conjunction of Horn clauses.
• Modus Ponens (for Horn form): from p1, …, pn and p1 Ʌ … Ʌ pn → q, infer q.
• Inference with Horn Clauses can be done using forward and backward chaining
algorithms.
• The Prolog language is based on Horn Clauses.
• Deciding entailment with Horn Clauses is linear in the size of the knowledge base.
FORWARD AND BACKWARD CHAINING
• These algorithms are very natural and run in linear time
FORWARD CHAINING:
• Based on the rule of modus ponens: if we know P1, …, Pn and know (P1 Ʌ … Ʌ Pn) → Q, then we can
conclude Q. Whenever the premises of a rule are satisfied, infer its conclusion, and continue with any
rules that become satisfied.
• Forward chaining is also known as forward deduction or forward reasoning when used in an
inference engine. Forward chaining is a form of reasoning that starts with the atomic sentences in the
knowledge base and applies inference rules (Modus Ponens) in the forward direction to derive more
data until a goal is reached.
BACKWARD CHAINING:
• In backward chaining, we start with the goal and reason backwards through the rules that conclude it.
• The search starts from the query and works backwards.
FORWARD CHAINING
• IDEA: Begin with the facts (positive literals) in the knowledge base and determine whether the query can be
entailed by a knowledge base of definite clauses. If all premises of an implication are known, its conclusion is
added to the set of known facts. Eg: given L1,1 and Breeze, with (L1,1 Ʌ Breeze) → B1,1 in the knowledge base,
then B1,1 can be added.
• The rule: from p1, …, pn and (p1 Ʌ … Ʌ pn) → q, infer q.
• Every inference is an application of Modus Ponens, i.e., Modus Ponens can be used with forward chaining.
FORWARD CHAINING STEPS
1. Start with the given proposition symbols (atomic sentences).
2. Iteratively try to infer truth of additional proposition symbols
3. Continue until
– no more inference can be carried out, or
– goal is reached
FORWARD CHAINING
• Fire any rule whose premises are satisfied in the KB and add its
conclusion to the KB, until the query is found.
• AND-OR graph: multiple links joined by an arc indicate
a conjunction, where every link has to be proved, while
multiple links without an arc indicate a disjunction, where
any one link has to be proved.
AND-OR GRAPH
FORWARD CHAINING
• Current goal: L
• L can be inferred by A Ʌ B → L; both A and B are true, so L is true
• Current goal: M
• M can be inferred by B Ʌ L → M; both B and L are true, so M is true
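A minimal forward-chaining sketch of this trace (the rule encoding is my own; the rules A Ʌ B → L and B Ʌ L → M come from the example above):

# rules: list of (premises, conclusion); fire rules until nothing new is added.
def fc_entails(rules, facts, query):
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if conclusion not in facts and all(p in facts for p in premises):
                facts.add(conclusion)    # premises satisfied: fire the rule
                changed = True
    return query in facts

rules = [({"A", "B"}, "L"), ({"B", "L"}, "M")]
print(fc_entails(rules, {"A", "B"}, "M"))    # True: A, B |- L, then L |- M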
BACKWARD CHAINING
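A matching backward-chaining sketch under the same assumed rule encoding: a goal succeeds if it is a known fact, or if some rule concluding it has all of its premises provable; the search runs from the query backwards.

# seen guards against looping on cyclic rule sets.
def bc_entails(rules, facts, goal, seen=frozenset()):
    if goal in facts:
        return True
    if goal in seen:
        return False
    return any(conclusion == goal and
               all(bc_entails(rules, facts, p, seen | {goal}) for p in premises)
               for premises, conclusion in rules)

rules = [({"A", "B"}, "L"), ({"B", "L"}, "M")]
print(bc_entails(rules, {"A", "B"}, "M"))    # True: M <- B, L; L <- A, B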
THANK YOU