public:t-713-mers:mers-25:causation-methodology-architecture [2025/10/21 08:13] – leonard (created 2025/08/26 09:40)
| Likely to be many ways? | For AGI the set of relevant self-programming approaches is likely to be a much smaller set than that typically discussed in computer science, and in all likelihood much smaller than often implied in AGI. |
| \\ Architecture | The possible solutions for effective and efficient self-programming are likely to be strongly linked to what we generally think of as the //architectural structure// of AI systems, since self-programming for AGI may fundamentally have to change, modify, or partly duplicate some aspect of the architecture of the system itself, for the purpose of being better equipped to perform some task or set of tasks. |
| What is Needed | What is NOT needed is a system that spews out rule after rule after rule, filling up a giant database of rules. That misses the point because what is needed, for any //particular// situation, is a //particular// reasoning chain -- in other words, we need **customized reasoning**. |
| Achieving **Customized Reasoning** | What is called for is the equivalent of a just-in-time compiler, but for //reasoning//: A reasoner that produces exactly the kind of reasoning needed for the particular situation. This would be the most compact way of creating logically consistent results where trustworthiness is //part of the reasoning//. |
| Trustworthiness Requires Meta-Reasoning | Trustworthiness of reasoning output can only be assessed with knowledge of the reliability of the rules used -- in other words, //rules about the rules//. This means that meta-reasoning is an inseparable part of the reasoning. |
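The two ideas above -- a just-in-time reasoner that builds only the chain a particular situation calls for, and meta-knowledge about rule reliability woven into that chain -- can be sketched in miniature. This is an illustrative toy, not a proposed implementation: the rule format, the rule names, and the use of a simple reliability product as the trust measure are all assumptions made for the example.

```python
# Toy sketch of "customized reasoning": a backward chainer that constructs
# only the reasoning chain needed for one particular query (rather than
# filling a database with every derivable rule), and that carries each
# rule's reliability -- a rule about the rule -- through the chain, so
# trustworthiness is computed as part of the reasoning itself.
# No cycle detection; rules and reliability values are illustrative.

from dataclasses import dataclass


@dataclass
class Rule:
    name: str
    premises: tuple       # propositions that must hold for the rule to fire
    conclusion: str       # proposition the rule derives
    reliability: float    # meta-knowledge: how much this rule can be trusted


def prove(goal, facts, rules, chain=None):
    """Backward-chain toward `goal`.

    Returns (chain, trust) on success, None on failure. `chain` lists only
    the rules actually used -- the customized chain for this situation.
    `trust` is the product of the reliabilities of those rules (a simple,
    illustrative meta-reasoning scheme).
    """
    chain = chain or []
    if goal in facts:
        return chain, 1.0
    for rule in rules:
        if rule.conclusion != goal:
            continue
        trust = rule.reliability
        sub_chain = chain + [rule.name]
        ok = True
        for premise in rule.premises:
            result = prove(premise, facts, rules, sub_chain)
            if result is None:
                ok = False
                break
            sub_chain, sub_trust = result
            trust *= sub_trust
        if ok:
            return sub_chain, trust
    return None


rules = [
    Rule("r1", ("wet", "cold"), "icy", 0.9),
    Rule("r2", ("raining",), "wet", 0.95),
]
facts = {"raining", "cold"}

print(prove("icy", facts, rules))   # chain [r1, r2] with trust 0.9 * 0.95
```

The point of the sketch is the shape of the output: the answer is not just "icy is derivable" but the particular chain used and a trust value derived from the rules' own reliabilities, so the meta-reasoning is inseparable from the reasoning, as the table above argues.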