
DCS-T-713-MERS-2025 Main
Lecture Notes



Introduction to Reasoning Machines




Syllogisms


What is it?
A form of deductive argument/reasoning in which a conclusion is drawn from two given or assumed propositions (premises/statements). The premises and the conclusion are simple declarative statements constructed using only three simple terms between them, each term appearing twice (as a subject and as a predicate).
E.g. all dogs are animals;
all animals have four legs;
therefore all dogs have four legs.

The argument in such syllogisms is valid by virtue of the fact that it would not be possible to assert the premises and to deny the conclusion without contradicting oneself.
(Based on Oxford Dictionary and Encyclopedia Britannica)
3 types: Categorical, conditional and disjunctive.

Categorical
The traditional type is the categorical syllogism in which both premises and the conclusion are simple declarative statements that are constructed using as few as three simple terms between them, each term appearing twice.
Assumes all premises are true.

Example
All men are mortal.
No gods are mortal.
Thus, no men are gods.

Conditional
Implies an (unspoken) “if” in its premises.

Example
If you are injured,
I am qualified to assist with injuries.
Thus, I can heal you (given that you accept my help).

Disjunctive
Uses an either/or premise.
If it is known that at least one of two statements is true, and that it is not the former that is true, we can infer that it has to be the latter that is true.

Example
A solid thing cannot be in two places at the same time.
X is a solid thing currently at position P1.
Thus, it CANNOT currently be at position P2.
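
A minimal Python sketch of the either/or rule stated above: if P-or-Q holds and P is false, Q must be true. The rain/snow statements in the usage line are illustrative only.

def disjunctive_syllogism(p_or_q: bool, p: bool):
    """P or Q; not P; therefore Q."""
    if p_or_q and not p:
        return True       # Q must be true
    return None           # nothing can be concluded about Q

# "It is raining or it is snowing."  We learn that it is not raining.
print(disjunctive_syllogism(p_or_q=True, p=False))   # True: it is snowing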


Well-Known Syllogisms


Modus Ponens
If a conditional statement
if P, then Q
is accepted, and its stated antecedent P holds, then the consequent Q may rightly be inferred.
E.g. If it's raining, then it's cloudy.
It is raining.
Then it's cloudy.

Modus Tollens
A mixed syllogism that takes the form of
If P, then Q.
Not Q.
Therefore, not P.

Application of the general truth that if a statement is true, then so is its contrapositive (“if not-Q, then not-P” is the contrapositive of “if P, then Q”).
E.g., If it rains, the ground is wet.
The ground is NOT wet.
Then it did NOT rain.
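
Both rules can be sketched as simple operations on a set of propositional facts. A minimal Python illustration, reusing the rain/cloud statements from the examples above:

def modus_ponens(rule, facts):
    """If P, then Q.  P.  Therefore Q."""
    p, q = rule
    return facts | {q} if p in facts else facts

def modus_tollens(rule, facts):
    """If P, then Q.  Not Q.  Therefore not P."""
    p, q = rule
    return facts | {"not " + p} if ("not " + q) in facts else facts

rule = ("raining", "cloudy")                     # "If it's raining, then it's cloudy."
print(modus_ponens(rule, {"raining"}))           # {'raining', 'cloudy'}
print(modus_tollens(rule, {"not cloudy"}))       # {'not cloudy', 'not raining'}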


Boolean Logic

Number of values Two (True, False)
Atomic Operators Conjunction (AND), disjunction (OR), negation (NOT)
Combined Operators XOR, NAND, ANDN, ORN, NOR, … (all expressible using the atomic three)
Fundamental for all forms of computing.
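
A short Python sketch showing how the combined operators reduce to the atomic three; reading ANDN and ORN as AND-NOT and OR-NOT is an assumption about the abbreviations.

def xor(a, b):   return (a and not b) or (not a and b)
def nand(a, b):  return not (a and b)
def nor(a, b):   return not (a or b)
def andn(a, b):  return a and not b        # AND-NOT (assumed reading)
def orn(a, b):   return a or not b         # OR-NOT (assumed reading)

# Truth table for XOR
for a in (False, True):
    for b in (False, True):
        print(a, b, xor(a, b))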


Basic Reasoning Machine


Consists of
A set of rules
A working memory (WM)
An inference engine
Rules Expressed in some machine-readable way.
Each rule consists of n patterns.
WM Stores the information that the inference engine is working on at any point in time.
Inference Engine Matches rules against elements in WM.
Match When a rule matches an element in WM it fires, which means that the system implements what the rule says should happen.
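
A minimal Python sketch of the three components with a single match-fire step; the rule and facts are illustrative, not from the notes.

rules = [
    # each rule: a set of patterns (conditions) and a conclusion
    {"if": {"raining"}, "then": "ground wet"},
]
wm = {"raining"}                          # working memory

for rule in rules:                        # inference engine: MATCH
    if rule["if"] <= wm:                  # all patterns found in WM?
        wm.add(rule["then"])              # the rule fires
        print("fired:", rule, "->", wm)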


Reasoning Operations


Matching
Pattern matching is the main method that reasoning systems use in their operation.
It works along these lines (arrows indicate flow of data from one step to the next; data stores are in brackets; everything else is processes):
[WM] → [Data, Rules] → MATCH → [Output-1 (Conflict Set)] → RESOLVE → EXECUTE → [Output-2] → [WM]
Forward Chaining A data-driven method of reasoning in which the implications of existing data are deduced until an endpoint (goal) is achieved.
Repeated application of modus ponens that can be equated with 'deduction'.
Used in expert systems, business and production rule systems.
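
A Python sketch of the MATCH → RESOLVE → EXECUTE cycle run as forward chaining until a goal is reached or nothing new fires; the rules and facts are illustrative only.

rules = [
    ({"raining"}, "ground wet"),
    ({"ground wet", "freezing"}, "ground icy"),
]
wm = {"raining", "freezing"}
goal = "ground icy"

while goal not in wm:
    # MATCH: rules whose patterns are all in WM and whose conclusion is new
    conflict_set = [(c, q) for c, q in rules if c <= wm and q not in wm]
    if not conflict_set:
        break                             # nothing left to fire
    conds, concl = conflict_set[0]        # RESOLVE: pick the first match
    wm.add(concl)                         # EXECUTE: fire the rule

print(goal in wm, wm)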

Backward Chaining
A goal-driven reasoning method for inferring unknown truths from known conclusions (goal) by moving backward from a solution to determine the initial conditions and rules. Backward chaining is often applied in artificial intelligence (AI) and may be used along with its counterpart, forward chaining. 
Repeated application of modus ponens in reverse (from consequent back to antecedent), which can be equated with 'abduction'.
Used in automated theorem provers, proof assistants, and various artificial intelligence applications.
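
A Python sketch of backward chaining over the same kind of rules: start from the goal and recursively try to prove the conditions of any rule that concludes it. Rules and facts are illustrative only.

rules = [
    ({"raining"}, "ground wet"),
    ({"sprinkler on"}, "ground wet"),
    ({"ground wet", "freezing"}, "ground icy"),
]
facts = {"sprinkler on", "freezing"}

def prove(goal):
    if goal in facts:                     # a known fact: proven
        return True
    for conds, concl in rules:            # rules that could conclude the goal
        if concl == goal and all(prove(c) for c in conds):
            return True
    return False

print(prove("ground icy"))                # True, via 'sprinkler on' -> 'ground wet'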


Traditional Reasoning Categories


Deduction
Figuring out the implication of facts (or predicting what may come).
General → Specific.
Producing implications from premises.
The premises are given; the work involves everything else.
Conclusion is unavoidable given the premises (in a deterministic, axiomatic world).

Abduction
Figuring out how things came to be the way they are (or how particular outcomes could be made to come about, or how particular outcomes could be prevented).
The outcome is given; the work involves everything else.
Sherlock Holmes is a genius abducer.

Induction
Figuring out the general case.
Specific → General.
Making general rules from a (small) set of examples, e.g. 'the sun has risen in the east every morning up until now, hence, the sun will also rise in the east tomorrow'.

Analogy
Figuring out how things are similar or different.
Making inferences about how something X may be (or is) through a comparison to something else Y, where X and Y share some observed properties.


Fuzzy Reasoning

Fuzzy Logic (FL) Extends classical logic by allowing truth values between 0 and 1 instead of just {T, F}. Designed for handling vagueness and graded membership in categories (e.g., “tall,” “warm,” “near”).
FL Features Statements are not just true or false but can have a degree of truth (μ ∈ [0,1]). Uses membership functions and fuzzy sets. Combines with fuzzy operators (min, max, t-norms, etc.) for reasoning.
Evidence A membership function μ(x) defines the degree to which an element belongs to a fuzzy set. Example: μ_tall(John) = 0.7 means John is 70% in the set of tall people.
Uncertainty Expressed as graded truth values. Unlike probability (uncertainty of events), fuzzy logic models the vagueness of concepts. Example: temperature = “warm” with membership 0.6.
Deduction Rules are applied with fuzzy truth values. Example: “If temperature is high (0.8), then fan speed is fast (0.8).” Reasoning propagates partial truth instead of strict Boolean truth.
Abduction Fuzzy systems can suggest plausible fuzzy causes: e.g., “If ground is wet (0.6), rain likelihood might be fuzzy-high (0.6).” But abduction is not a central focus of FL.
Induction Membership functions can be learned from data: e.g., clustering methods to define fuzzy categories like “young,” “middle-aged,” “old.”
Analogy Less emphasized formally, but fuzzy similarity measures (e.g., cosine similarity, fuzzy overlap) allow analogy: “X is similar to Y with degree 0.75.”
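
A small Python sketch of the temperature/fan example above, using min as the fuzzy operator; the shape of the membership function (a ramp from 20 °C to 35 °C) is an assumption made for illustration.

def mu_high_temp(celsius):
    """Degree to which a temperature counts as 'high' (ramp 20..35 °C)."""
    return min(1.0, max(0.0, (celsius - 20.0) / 15.0))

truth_high = mu_high_temp(32.0)           # 0.8

# Fuzzy rule: "If temperature is high, then fan speed is fast."
fan_fast = min(1.0, truth_high)           # the partial truth propagates: 0.8
print(truth_high, fan_fast)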


Non-Axiomatic Reasoning


NAL (Non-Axiomatic Logic)
Distinguishes itself from other reasoning languages in that it is intended for knowledge in worlds where the axioms are unknown, not guaranteed, and/or fallible.
NAL is itself axiomatic, but it is designed for domains that are non-axiomatic.
NAL Features Instead of being either {T, F}, statements have a degree of truth, represented by values between 0 and 1 (a frequency and a confidence, defined below).
NAL uses term logic, which is different from propositional logic in the way it expresses statements.
Evidence w+ is positive evidence; w- is negative evidence.

Uncertainty
Frequency: f = w+ / w, where w = w+ + w- (total evidence).
Confidence: c = w/(w + k), where k ≥ 1.
Ignorance: i = k/(w + k).
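
The three quantities can be computed directly from the evidence counts. A short Python sketch, where k = 1 is used as the evidential horizon (an assumed default) and the no-evidence case is handled arbitrarily to avoid division by zero:

def nal_truth(w_plus, w_minus, k=1.0):
    w = w_plus + w_minus                  # total evidence
    f = w_plus / w if w > 0 else 0.5      # frequency (0.5 with no evidence: an assumption)
    c = w / (w + k)                       # confidence
    i = k / (w + k)                       # ignorance
    return f, c, i

print(nal_truth(w_plus=3, w_minus=1))     # (0.75, 0.8, 0.2)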

Deduction
The premises are given.
Figuring out the implication of facts (or predicting what may come). Producing implications from premises.
E.g., “The last domino will fall when all the other dominos between the first and the last have fallen”.
Represented as B → C < f1, c1 >, A → B < f2, c2 > ⊢ A → C < f3, c3 >
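
A Python sketch of this rule applied to concrete truth values; the truth function used here (f3 = f1·f2, c3 = f1·f2·c1·c2) is the one commonly given for NAL deduction and should be treated as an assumption of this sketch.

def nal_deduction(f1, c1, f2, c2):
    """B → C <f1, c1>, A → B <f2, c2>  |-  A → C <f3, c3>."""
    f3 = f1 * f2
    c3 = f1 * f2 * c1 * c2
    return f3, c3

# e.g. "domino_2 falls → domino_3 falls" and "domino_1 falls → domino_2 falls"
print(nal_deduction(0.9, 0.9, 1.0, 0.8))  # (0.9, 0.648)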

Abduction
A particular outcome X is given.
Figuring out how things came to be the way they are (or how particular outcomes could be made to come about, or how particular outcomes could be prevented).
E.g. Sherlock Holmes, who is a genius abducer.
Represented as B → C < f1, c1 >, A → C < f2, c2 > ⊢ A → B < f3, c3 >

Induction
A small set of examples is given.
Figuring out the general case. Making general rules from a (small) set of examples.
E.g. “The sun has risen in the East every morning up until now, hence, the sun will also rise in the East tomorrow”.
Represented as B → C < f1, c1 >, B → A < f2, c2 > ⊢ A → C < f3, c3 >

Analogy
A set of two (or more) things is given.
Figuring out how things are similar or different. Making inferences about how something X may be (or is) through a comparison to something else Y, where X and Y share some observed properties.
E.g. “What does a pen have in common with an arrow?” “What is the difference between a rock and a ball?”
Author of the Non-Axiomatic Logic (NAL) covered here: Pei Wang.





©2024 K.R.Thórisson