| Atomic Operators | Conjunction (AND), disjunction (OR), negation (NOT) |
| Combination of Rules | XOR, NAND, ANDN, ORN, NOR ... \\ Fundamental for all forms of computing. |
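\\
A minimal sketch, in Python, of how the combined operators reduce to the three atomic ones (the reading of ANDN/ORN as "AND/OR with the second argument negated" is an assumption, not something stated above):

<code python>
# Atomic operators
def NOT(a):    return not a
def AND(a, b): return a and b
def OR(a, b):  return a or b

# Combined operators, built only from AND, OR and NOT
def NAND(a, b): return NOT(AND(a, b))
def NOR(a, b):  return NOT(OR(a, b))
def XOR(a, b):  return OR(AND(a, NOT(b)), AND(NOT(a), b))
def ANDN(a, b): return AND(a, NOT(b))   # assumed reading: a AND (NOT b)
def ORN(a, b):  return OR(a, NOT(b))    # assumed reading: a OR (NOT b)

# Truth table for XOR as a quick check
for a in (False, True):
    for b in (False, True):
        print(a, b, XOR(a, b))
</code>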
| |
\\
| |
| \\ Induction | Figuring out the general case. \\ Specific -> General. \\ Making general rules from a (small) set of examples, e.g. 'the sun has risen in the east every morning up until now, hence, the sun will also rise in the east tomorrow'. |
| \\ Analogy | Figuring out how things are similar or different. \\ Making inferences about how something X may be (or is) through a comparison to something else Y, where X and Y share some observed properties. |
| |
\\
| |
=====Fuzzy Reasoning=====
| Fuzzy Logic (FL) | Extends classical logic by allowing truth values between 0 and 1 instead of just {T, F}. Designed for handling vagueness and graded membership in categories (e.g., “tall,” “warm,” “near”). |
| FL Features | Statements are not just true or false but can have a degree of truth (μ ∈ [0,1]). Uses membership functions and fuzzy sets. Combines with fuzzy operators (min, max, t-norms, etc.) for reasoning. |
| Evidence | A membership function μ(x) defines the degree to which an element belongs to a fuzzy set. Example: μ_tall(John) = 0.7 means John is 70% in the set of tall people. |
| Uncertainty | Expressed as graded truth values. Unlike probability (uncertainty of events), fuzzy logic models the vagueness of concepts. Example: temperature = “warm” with membership 0.6. |
| Deduction | Rules are applied with fuzzy truth values. Example: “If temperature is high (0.8), then fan speed is fast (0.8).” Reasoning propagates partial truth instead of strict Boolean truth. |
| Abduction | Fuzzy systems can suggest plausible fuzzy causes: e.g., “If ground is wet (0.6), rain likelihood might be fuzzy-high (0.6).” But abduction is not a central focus of FL. |
| Induction | Membership functions can be learned from data: e.g., clustering methods to define fuzzy categories like “young,” “middle-aged,” “old.” |
| Analogy | Less emphasized formally, but fuzzy similarity measures (e.g., cosine similarity, fuzzy overlap) allow analogy: “X is similar to Y with degree 0.75.” |
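\\
A minimal sketch of the fuzzy machinery in the table above, assuming Python and a made-up ramp membership function (the 25–35 °C range for “high temperature” is an illustrative assumption, not course data):

<code python>
# Membership function: degree to which a temperature counts as "high".
def mu_high(temp_c):
    if temp_c <= 25:
        return 0.0
    if temp_c >= 35:
        return 1.0
    return (temp_c - 25) / 10.0   # linear ramp between 25 and 35 °C

# Common fuzzy operators (min/max are one standard t-norm/s-norm choice).
def fuzzy_and(a, b): return min(a, b)
def fuzzy_or(a, b):  return max(a, b)
def fuzzy_not(a):    return 1.0 - a

# Rule from the table: "If temperature is high (0.8), then fan speed is fast (0.8)."
# In a Mamdani-style reading, the consequent inherits the antecedent's degree.
def fan_fast_degree(temp_c):
    return mu_high(temp_c)

print(mu_high(33))          # 0.8  -> "temperature is high" to degree 0.8
print(fan_fast_degree(33))  # 0.8  -> "fan speed is fast" to degree 0.8
print(fuzzy_and(0.8, 0.6))  # 0.6
print(fuzzy_not(0.7))       # 0.3  (e.g., "not tall" when mu_tall = 0.7)
</code>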
| |
\\
| Evidence | w<sup>+</sup> is positive evidence; w<sup>-</sup> is negative evidence. |
| \\ Uncertainty | Frequency: f = w<sup>+</sup> / w, where w = w<sup>+</sup> + w<sup>-</sup> (total evidence). \\ Confidence: c = w/(w + k), where k ≥ 1. \\ Ignorance: i = k/(w + k). |
| \\ Deduction | The **premises** are given. \\ Figuring out the implication of facts (or predicting what may come). Producing implications from premises. \\ E.g., "The last domino will fall when all the other dominos between the first and the last have fallen". \\ Represented as B → C < f1, c1 >, A → B < f2, c2 > ⊢ A → C < f3, c3 > |
| \\ Abduction | A particular **outcome X** is given. \\ Figuring out how things came to be the way they are (or how particular outcomes could be made to come about, or how particular outcomes could be prevented). \\ E.g. Sherlock Holmes, who is a genius abducer. \\ Represented as B → C < f1, c1 >, A → C < f2, c2 > ⊢ A → B < f3, c3 > |
| \\ Induction | A **small set of examples** is given. \\ Figuring out the general case. Making general rules from a (small) set of examples. \\ E.g. "The sun has risen in the East every morning up until now, hence, the sun will also rise in the East tomorrow". \\ Represented as B → C < f1, c1 >, B → A < f2, c2 > ⊢ A → C < f3, c3 > |
| \\ Analogy | A set of **two (or more) things** is given. \\ Figuring out how things are similar or different. Making inferences about how something X may be (or is) through a comparison to something else Y, where X and Y share some observed properties. \\ E.g. "What does a pen have in common with an arrow?" "What is the difference between a rock and a ball?" |
| | <sup>Author of the Non-Axiomatic Reasoning covered here: Pei Wang</sup> |
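\\
A minimal worked sketch of the evidence-based truth values from the table above, in Python. The helper names and the choice k = 1 are illustrative assumptions; the deduction truth function (f3 = f1·f2, c3 = f1·f2·c1·c2) follows Wang's NAL literature and is not stated in the table itself:

<code python>
# Frequency, confidence and ignorance from evidence counts, as defined above.
def truth_value(w_plus, w_minus, k=1):
    """Assumes at least one observation (w > 0) and evidential horizon k >= 1."""
    w = w_plus + w_minus        # total evidence
    f = w_plus / w              # frequency  f = w+ / w
    c = w / (w + k)             # confidence c = w / (w + k)
    i = k / (w + k)             # ignorance  i = k / (w + k); note c + i = 1
    return f, c, i

# 8 positive and 2 negative observations of some statement:
f, c, i = truth_value(8, 2)
print(f, c, i)                  # 0.8, ~0.91, ~0.09

# Deduction  B -> C <f1,c1>,  A -> B <f2,c2>  |-  A -> C <f3,c3>
# (truth function taken from Wang's NAL; an assumption on top of the table)
def deduction(f1, c1, f2, c2):
    f3 = f1 * f2
    c3 = f1 * f2 * c1 * c2
    return f3, c3

print(deduction(0.9, 0.9, 0.8, 0.9))   # conclusion is weaker and less confident
</code>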