Module 3 - Logic

Logic: Deduction and Inductive Reasoning

ATTY. HERSIE A. BUNDA
Module Outcomes
Module No 3 ADGE: Logic: Deduction and Inductive Reasoning
At the end of the module, the students must be able to:
1. define propositional logic;
2. know the history of propositional logic; and
3. know the language of propositional logic.
Module Content:
Propositional Logic
Propositional logic, also known as sentential logic and statement logic, is the branch of logic that studies
ways of joining and/or modifying entire propositions, statements or sentences to form more complicated
propositions, statements or sentences, as well as the logical relationships and properties that are derived
from these methods of combining or altering statements. In propositional logic, the simplest statements
are considered as indivisible units, and hence, propositional logic does not study those logical properties
and relations that depend upon parts of statements that are not themselves statements on their own, such
as the subject and predicate of a statement. The most thoroughly researched branch of propositional logic
is classical truth-functional propositional logic, which studies logical operators and connectives that are
used to produce complex statements whose truth-value depends entirely on the truth-values of the simpler
statements making them up, and in which it is assumed that every statement is either true or false and not
both. However, there are other forms of propositional logic in which other truth-values are considered, or
in which there is consideration of connectives that are used to produce statements whose truth-values
depend not simply on the truth-values of the parts, but additional things such as their necessity, possibility
or relatedness to one another.
1. Introduction
Propositional logic largely involves studying logical connectives such as the words “and” and “or” and the
rules determining the truth-values of the propositions they are used to join, as well as what these rules
mean for the validity of arguments, and such logical relationships between statements as being consistent
or inconsistent with one another, as well as logical properties of propositions, such as being tautologically
true, being contingent, and being self-contradictory. (These notions are defined below.)
Propositional logic also studies ways of modifying statements, such as the addition of the word “not” that is
used to change an affirmative statement into a negative statement. Here, the fundamental logical principle
involved is that if a given affirmative statement is true, the negation of that statement is false, and if a
given affirmative statement is false, the negation of that statement is true.
What is distinctive about propositional logic as opposed to other (typically more complicated) branches of
logic is that propositional logic does not deal with logical relationships and properties that involve the parts
of a statement smaller than the simple statements making it up. Therefore, propositional logic does not
study those logical characteristics of the propositions below in virtue of which they constitute a valid
argument:
1. George W. Bush is a president of the United States.
2. George W. Bush is a son of a president of the United States.
3. Therefore, there is someone who is both a president of the United States and a son of a
president of the United States.
The recognition that the above argument is valid requires one to recognize that the subject in the
first premise is the same as the subject in the second premise. However, in propositional logic,
simple statements are considered as indivisible wholes, and those logical relationships and
properties that involve parts of statements such as their subjects and predicates are not taken into
consideration.
Propositional logic can be thought of as primarily the study of logical operators.
A logical operator is any word or phrase used either to modify one statement to make a
different statement, or join multiple statements together to form a more complicated
statement. In English, words such as “and”, “or”, “not”, “if … then…”, “because”, and
“necessarily”, are all operators.
A logical operator is said to be truth-functional if the truth-values (the truth or falsity, etc.) of the
statements it is used to construct always depend entirely on the truth or falsity of the statements
from which they are constructed. The English words “and”, “or” and “not” are (at least arguably)
truth-functional, because a compound statement joined together with the word “and” is true if both
the statements so joined are true, and false if either or both are false, a compound statement
joined together with the word “or” is true if at least one of the joined statements is true, and false
if both joined statements are false, and the negation of a statement is true if and only if the
statement negated is false.
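The truth conditions described above can be made concrete in code: a truth-functional operator is just a function from input truth-values to an output truth-value. Below is a minimal sketch in Python (the function names are my own):

```python
from itertools import product

# A truth-functional operator can be modeled as a function from the
# truth-values of its inputs to a truth-value. These three functions
# mirror the English "and", "or" and "not" as described in the text.
def AND(p, q):
    return p and q      # true only when both joined statements are true

def OR(p, q):
    return p or q       # true when at least one joined statement is true

def NOT(p):
    return not p        # true exactly when the statement negated is false

# Enumerating every combination of truth-values yields the familiar
# truth tables, confirming the output depends entirely on the inputs.
for p, q in product([True, False], repeat=2):
    print(p, q, AND(p, q), OR(p, q), NOT(p))
```

Because the output is fully determined by the input truth-values, the entire behavior of each operator is captured by its (finite) truth table, which is what makes exhaustive truth-table checking possible.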
Some logical operators are not truth-functional. One example of an operator in English that is not
truth-functional is the word “necessarily”. Whether a statement formed using this operator is true
or false does not depend entirely on the truth or falsity of the statement to which the operator is
applied. For example, both of the following statements are true:
2 + 2 = 4.
Someone is reading an article in a philosophy encyclopedia.
However, of the two statements formed from these by applying the operator “necessarily”, only the
first is true:
Necessarily, 2 + 2 = 4.
Necessarily, someone is reading an article in a philosophy encyclopedia.
Hence, the truth or falsity of a statement using the operator “necessarily” does not depend entirely
on the truth or falsity of the statement modified.
Truth-functional propositional logic is that branch of propositional logic that limits itself to the
study of truth-functional operators. Classical (or “bivalent”) truth-functional propositional
logic is that branch of truth-functional propositional logic that assumes that there are
only two possible truth-values a statement (whether simple or complex) can have:
(1) truth, and (2) falsity, and that every statement is either true or false but not both.
Classical truth-functional propositional logic is by far the most widely studied branch of
propositional logic, and for this reason, most of the remainder of this article focuses exclusively on
this area of logic. In addition to classical truth-functional propositional logic, there are other
branches of propositional logic that study logical operators, such as “necessarily”, that are not
truth-functional. There are also “non-classical” propositional logics in which such possibilities as (i)
a proposition’s having a truth-value other than truth or falsity, (ii) a proposition’s having an
indeterminate truth-value or lacking a truth-value altogether, and sometimes even (iii) a
proposition’s being both true and false, are considered.
2. History
The serious study of logic as an independent discipline began with the work of Aristotle (384-322
BCE). Generally, however, Aristotle’s sophisticated writings on logic dealt with the logic of
categories and quantifiers such as “all”, and “some”, which are not treated in propositional logic.
However, in his metaphysical writings, Aristotle espoused two principles of great importance in
propositional logic, which have since come to be called the Law of Excluded Middle and the Law of
Contradiction. Interpreted in propositional logic, the first is the principle that every statement is
either true or false, the second is the principle that no statement is both true and false. These are,
of course, cornerstones of classical propositional logic. There is some evidence that Aristotle, or at
least his successor at the Lyceum, Theophrastus (d. 287 BCE), did recognize a need for the
development of a doctrine of “complex” or “hypothetical” propositions, that is, those involving
conjunctions (statements joined by “and”), disjunctions (statements joined by “or”) and
conditionals (statements joined by “if… then…”), but their investigations into this branch of logic
seem to have been very minor.
More serious attempts to study statement operators such as “and”, “or” and “if… then…” were
conducted by the Stoic philosophers in the late 3rd century BCE. Since most of their original works
—if indeed these writings were ever produced—are lost, we cannot make many definite claims
about exactly who first made investigations into what areas of propositional logic, but we do know
from the writings of Sextus Empiricus that Diodorus Cronus and his pupil Philo had engaged in a
protracted debate about whether the truth of a conditional statement depends entirely on it not
being the case that its antecedent (if-clause) is true while its consequent (then-clause) is false, or
whether it requires some sort of stronger connection between the antecedent and consequent—a
debate that continues to have relevance for modern discussion of conditionals. The Stoic
philosopher Chrysippus (roughly 280-205 BCE) perhaps did the most in advancing Stoic
propositional logic, by marking out a number of different ways of forming complex premises for
arguments, and for each, listing valid inference schemata. Chrysippus suggested that the following
inference schemata are to be considered the most basic:
1. If the first, then the second; but the first; therefore the second.
2. If the first, then the second; but not the second; therefore, not the first.
3. Not both the first and the second; but the first; therefore, not the second.
4. Either the first or the second [and not both]; but the first; therefore, not the second.
5. Either the first or the second; but not the second; therefore the first.
Inference rules such as the above correspond very closely to the basic principles in a contemporary
system of natural deduction for propositional logic. For example, the first two rules correspond to
the rules of modus ponens and modus tollens, respectively. These basic inference schemata were
expanded upon by less basic inference schemata by Chrysippus himself and other Stoics, and are
preserved in the work of Diogenes Laertius, Sextus Empiricus and later, in the work of Cicero.
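The validity of schemata like these can be checked mechanically: an inference form is valid just in case no assignment of truth-values makes all of its premises true while its conclusion is false. A minimal sketch in Python, with the five Stoic schemata encoded in my own notation ("the first" as p, "the second" as q):

```python
from itertools import product

def implies(p, q):
    return (not p) or q   # material conditional: false only when p true, q false

def valid(premises, conclusion):
    """An inference form is valid iff no truth-value assignment makes
    every premise true while the conclusion is false."""
    for p, q in product([True, False], repeat=2):
        if all(f(p, q) for f in premises) and not conclusion(p, q):
            return False
    return True

# Chrysippus's five basic inference schemata:
schemata = [
    # 1. If the first, then the second; but the first; therefore the second (modus ponens).
    ([lambda p, q: implies(p, q), lambda p, q: p], lambda p, q: q),
    # 2. If the first, then the second; but not the second; therefore not the first (modus tollens).
    ([lambda p, q: implies(p, q), lambda p, q: not q], lambda p, q: not p),
    # 3. Not both the first and the second; but the first; therefore not the second.
    ([lambda p, q: not (p and q), lambda p, q: p], lambda p, q: not q),
    # 4. Either the first or the second [and not both]; but the first; therefore not the second.
    ([lambda p, q: (p or q) and not (p and q), lambda p, q: p], lambda p, q: not q),
    # 5. Either the first or the second; but not the second; therefore the first.
    ([lambda p, q: p or q, lambda p, q: not q], lambda p, q: p),
]

for i, (premises, conclusion) in enumerate(schemata, 1):
    print(i, valid(premises, conclusion))   # all five print True
```

Note that schema 4 requires the exclusive reading of "or"; with the inclusive reading, inferring "not the second" from "the first" would be invalid.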
Advances on the work of the Stoics were undertaken in small steps in the centuries that followed.
This work was done by, for example, the second century logician Galen (roughly 129-210 CE), the
sixth century philosopher Boethius (roughly 480-525 CE) and later by medieval thinkers such as
Peter Abelard (1079-1142) and William of Ockham (1288-1347), and others. Much of their work
involved producing better formalizations of the principles of Aristotle or Chrysippus, introducing
improved terminology and furthering the discussion of the relationships between operators.
Abelard, for example, seems to have been the first to clearly differentiate exclusive disjunction from
inclusive disjunction (discussed below), and to suggest that inclusive disjunction is the more
important notion for the development of a relatively simple logic of disjunctions.
The next major step forward in the development of propositional logic came only much later with
the advent of symbolic logic in the work of logicians such as Augustus De Morgan (1806-1871) and,
especially, George Boole (1815-1864) in the mid-19th century. Boole was primarily interested in
developing a mathematical-style “algebra” to replace Aristotelian syllogistic logic, primarily by
employing the numeral “1” for the universal class, the numeral “0” for the empty class, the
multiplication notation “xy” for the intersection of classes x and y, the addition notation “x + y” for
the union of classes x and y, etc., so that statements of syllogistic logic could be treated in quasi-
mathematical fashion as equations; for example, “No x is y” could be written as “xy = 0”. However,
Boole noticed that if an equation such as “x = 1” is read as “x is true”, and “x = 0” is read as “x is
false”, the rules given for his logic of classes can be transformed into a logic for propositions, with
“x + y = 1” reinterpreted as saying that either x or y is true, and “xy = 1” reinterpreted as meaning
that x and y are both true. Boole’s work sparked rapid interest in logic among mathematicians.
Later, “Boolean algebras” were used to form the basis of the truth-functional propositional logics
utilized in computer design and programming.
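Boole's reinterpretation can be illustrated directly: over the values 0 and 1, multiplication behaves as conjunction, and addition (capped at 1, the convention of the later "Boolean algebras" rather than Boole's own restricted addition) behaves as inclusive disjunction. A sketch under that reading:

```python
def conj(x, y):
    return x * y           # "xy": equals 1 just in case both x and y equal 1

def disj(x, y):
    return min(x + y, 1)   # "x + y", capped so that 1 + 1 stays within {0, 1}

# Reading "x = 1" as "x is true" and "x = 0" as "x is false", the
# arithmetic reproduces the truth tables for "and" and "or":
for x in (0, 1):
    for y in (0, 1):
        print(x, y, conj(x, y), disj(x, y))
```

The capped addition is an assumption on my part for the sake of a runnable sketch; Boole himself only defined "x + y" for disjoint classes.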
Modal propositional logics are the most widely studied form of non-truth-functional propositional
logic. While interest in modal logic dates back to Aristotle, by contemporary standards the first
systematic inquiry into modal propositional logic can be found in the work of C. I. Lewis in
1912 and 1913. Among other well-known forms of non-truth-functional propositional logic, deontic
logic began with the work of Ernst Mally in 1926, and epistemic logic was first treated
systematically by Jaakko Hintikka in the early 1960s. The modern study of three-valued
propositional logic began in the work of Jan Łukasiewicz in 1917, and other forms of non-classical
propositional logic soon followed suit. Relevance propositional logic is relatively more recent, dating
from the mid-1970s in the work of A. R. Anderson and N. D. Belnap. Paraconsistent logic, while
having its roots in the work of Łukasiewicz and others, has blossomed into an independent area of
research only recently, mainly due to work undertaken by N. C. A. da Costa, Graham Priest and
others in the 1970s and 1980s.
3. The Language of Propositional Logic
The basic rules and principles of classical truth-functional propositional logic are, among contemporary
logicians, almost entirely agreed upon, and capable of being stated in a definitive way. This is most easily
done if we utilize a simplified logical language that deals only with simple statements considered as
indivisible units as well as complex statements joined together by means of truth-functional connectives.
We first consider a language called PL for “Propositional Logic”. Later we shall consider two even simpler
languages, PL’ and PL”.
Consider, for example, the argument, “Paris is the capital of France and Paris has a population of over two
million; therefore, Paris has a population of over two million.” If we use the letter ‘P’ as our translation of
the statement “Paris is the capital of France” in PL, and the letter ‘Q’ as our translation of the statement
“Paris has a population of over two million”, and use a horizontal line to separate the premise(s) of an
argument from the conclusion, this argument could be symbolized in language PL as follows:

P ∧ Q
―――――
Q
In addition to statement letters like ‘P’ and ‘Q’ and the operators, the only other signs that sometimes appear
in the language PL are parentheses, which are used in forming even more complex statements. Consider
the English compound sentence, “Paris is the most important city in France if and only if Paris is the capital
of France and Paris has a population of over two million.” If we use the letter ‘M’ in language PL to mean
that Paris is the most important city in France, this sentence would be translated into PL as follows:

M ↔ (P ∧ Q)

The parentheses indicate that the conjunction ‘P ∧ Q’ as a whole stands on the right-hand side of the
biconditional. Compare this with the differently grouped statement:

(M ↔ P) ∧ Q
This latter statement asserts that Paris is the most important city in France if and only if it is the capital of
France, and (separate from this), Paris has a population of over two million. The difference between the
two is subtle, but important logically.
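That the two groupings really are logically different can be verified by brute force: there are truth-value assignments on which they disagree. A minimal sketch in Python (the letters M, P, Q follow the text; the encoding is mine):

```python
from itertools import product

def iff(a, b):
    return a == b   # biconditional: true when both sides share a truth-value

# Compare the two groupings of "M if and only if P and Q" over every
# assignment of truth-values to M, P and Q:
for M, P, Q in product([True, False], repeat=3):
    v1 = iff(M, P and Q)        # M if and only if (P and Q)
    v2 = iff(M, P) and Q        # (M if and only if P), and Q
    if v1 != v2:
        print(M, P, Q, v1, v2)  # assignments on which the groupings disagree
```

For instance, when M and Q are false and P is true, the first grouping comes out true while the second comes out false, so the placement of parentheses genuinely changes what is asserted.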
It is important to describe the syntax and make-up of statements in the language PL in a precise manner,
and give some definitions that will be used later on. Before doing this, it is worthwhile to distinguish
the language in which we will be discussing PL, namely, English, from PL itself. Whenever one
language is used to discuss another, the language in which the discussion takes place is called
the metalanguage, and the language under discussion is called the object language. In this context, the object
language is the language PL, and the metalanguage is English, or to be more precise, English
supplemented with certain special devices that are used to talk about language PL. It is possible in English
to talk about words and sentences in other languages, and when we do, we place the words or sentences
we wish to talk about in quotation marks. Therefore, using ordinary English, I can say that “parler” is a
French verb, and “P → Q” is a statement of PL. The following expression is part of PL, not English:

P → Q
This point may seem rather trivial, but it is easy to become confused if one is not careful.
In our metalanguage, we shall also be using certain variables that are used to stand for arbitrary
expressions built from the basic symbols of PL. In what follows, the Greek letters ‘α’, ‘β’, and so on, are used
for any object language (PL) expression of a certain designated form. For example, later on, we shall say
that, if α is a statement of PL, then so is ¬α. Notice that ‘α’ itself is not a symbol that appears in PL; it is a
symbol used in English to speak about symbols of PL. We will also be making use of so-called “Quine
corners”, written ‘⌜’ and ‘⌝’, which are a special metalinguistic device used to speak about object language
expressions constructed in a certain way. Suppose α is the statement ‘P’ and β is the statement ‘Q’; then ⌜α → β⌝ is
the complex statement ‘P → Q’.
Let us now proceed to giving certain definitions used in the metalanguage when speaking of the language
PL.
Definition: A statement letter of PL is defined as any uppercase letter written with or without a numerical
subscript.
Note: According to this definition, ‘A’, ‘B’, ‘C’, ‘P’, and ‘P₂’ are examples of statement letters. The numerical
subscripts are used just in case we need to deal with more than 26 simple statements: in that case, we can
use ‘P₂’ to mean something different than ‘P’, and so forth.
Definition: A connective or operator of PL is any of the signs ‘¬’, ‘∧’, ‘∨’, ‘→’, and ‘↔’.
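These definitions can be made concrete with a small evaluator for PL statements. The sketch below is my own illustration, not part of the module: it represents a complex statement as a nested tuple whose first element names a connective, and a simple statement as a bare statement letter.

```python
def evaluate(stmt, valuation):
    """Truth-value of a PL statement under an assignment of truth-values
    to statement letters. A statement is either a statement letter (a
    string) or a tuple whose first element is a connective sign."""
    if isinstance(stmt, str):                      # statement letter
        return valuation[stmt]
    op = stmt[0]
    if op == "¬":                                  # negation is the only unary connective
        return not evaluate(stmt[1], valuation)
    a = evaluate(stmt[1], valuation)
    b = evaluate(stmt[2], valuation)
    if op == "∧":
        return a and b
    if op == "∨":
        return a or b
    if op == "→":
        return (not a) or b                        # material conditional
    if op == "↔":
        return a == b                              # biconditional
    raise ValueError(f"unknown connective: {op}")

# The Paris example from the text, symbolized as M ↔ (P ∧ Q):
stmt = ("↔", "M", ("∧", "P", "Q"))
print(evaluate(stmt, {"M": True, "P": True, "Q": True}))   # → True
```

Because every connective of PL is truth-functional, this recursion over the structure of a statement is all that is needed: the truth-value of the whole is computed bottom-up from the truth-values of its parts.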
Reference:
https://iep.utm.edu/prop-log/