| Constraint | A set of factors that limit the flexibility of that which it constrains. \\ Best conceived of as a //negative Goal// (//State// that is to be avoided). ||
| Family | A set whose elements share one or more common traits, within some "sensible" (pre-defined or definable) allowed variability in one or more of: the types of variables, the number of variables, or the ranges of those variables. ||
| \\ \\ Domains | A Family of Environments, <m>D ⊂ W</m>. \\ The concept of ‘domains’ as subsets of the world, where a particular bias of distributions of variables, values and ranges exists, may be useful in the context of tasks that can be systematically impacted by such a bias (e.g. gravity vs. zero-gravity). Each variable <m>v ~∈~ D</m> may take on any value from the associated domain <m>D</m>; for physical domains we can take the domain of variables to be a subset of real numbers, <m>bbR</m>. ||
| Solution | The set of (atomic) //Actions// that can achieve one or more //Goals// in one or more //Task-Environments//. ||
| Action | The changes an //Agent// can make to variables relevant to a //Task-Environment//. \\ (A code sketch illustrating these definitions is given at the end of this section.) ||
| | A medium or small number of Solutions. |
| | Instructions of varying detail are possible. |
| \\ Novelty | Novelty is unavoidable. \\ In other words, any AI operating in the physical world will encounter unforeseen circumstances. Accordingly, it should be able to handle them, since that is essentially why intelligence exists in the first place. |
| \\ What that means | 1. Since no agent will ever know everything, no agent (artificial or natural) can treat its knowledge as axiomatic: all of its knowledge must be assumed to be non-axiomatic (open to revision). \\ 2. It cannot be assumed that all the knowledge the AI system needs is known by its designer up front. This means it must acquire its own knowledge; all advanced AI systems must be //cumulative learners//. \\ 3. Since it must acquire its own knowledge incrementally, knowledge acquisition will introduce //knowledge gaps and inconsistencies//. \\ 4. A cumulative learning agent will continuously live in a state of //insufficient knowledge and resources// (with respect to perfect knowledge), due to the physical world's //limited time and energy//. |
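The table entries for //Constraint//, //Domains//, //Solution// and //Action// can be made concrete with a small data-structure sketch. The Python below is illustrative only and not part of the course material: the class names (''Variable'', ''Goal'', ''Constraint'', ''Action'', ''TaskEnvironment''), the interval model of a variable's domain, and the toy example are assumptions chosen to mirror the definitions above. A variable's domain is modelled as a closed interval of <m>bbR</m>, a //Constraint// as a negative //Goal// (a //State// predicate to avoid), and a //Solution// as a sequence of //Actions// that reaches all //Goals// without entering any avoided //State//.

<code python>
from dataclasses import dataclass
from typing import Callable, Dict, List, Tuple

# A State assigns a real value to each named variable of the Task-Environment.
State = Dict[str, float]


@dataclass
class Variable:
    """A variable of the Environment; its Domain is a subset of the reals,
    modelled here (an assumption) as a closed interval [low, high]."""
    name: str
    low: float
    high: float

    def contains(self, value: float) -> bool:
        return self.low <= value <= self.high


@dataclass
class Goal:
    """A Goal is a predicate over States: True when the goal State is reached."""
    name: str
    satisfied: Callable[[State], bool]


@dataclass
class Constraint:
    """A Constraint is a 'negative Goal': a predicate marking States to avoid."""
    name: str
    violated: Callable[[State], bool]


@dataclass
class Action:
    """An atomic Action changes one or more variables relevant to the Task."""
    name: str
    effect: Callable[[State], State]


@dataclass
class TaskEnvironment:
    variables: List[Variable]
    goals: List[Goal]
    constraints: List[Constraint]

    def admissible(self, state: State) -> bool:
        """A State is admissible if every variable lies within its Domain."""
        return all(v.contains(state[v.name]) for v in self.variables)

    def run(self, start: State, solution: List[Action]) -> Tuple[bool, State]:
        """Apply a candidate Solution (a sequence of Actions) to a start State.
        Success requires that every Goal is reached and that no Constraint is
        violated (and no variable leaves its Domain) along the way."""
        state = dict(start)
        for action in solution:
            state = action.effect(state)
            if not self.admissible(state) or any(c.violated(state) for c in self.constraints):
                return False, state
        return all(g.satisfied(state) for g in self.goals), state


# Toy usage: push a block to position 10 without ever passing position 11.
env = TaskEnvironment(
    variables=[Variable("x", low=0.0, high=12.0)],
    goals=[Goal("at_target", lambda s: abs(s["x"] - 10.0) < 0.5)],
    constraints=[Constraint("too_far", lambda s: s["x"] > 11.0)],
)
push = Action("push_right_by_2", lambda s: {**s, "x": s["x"] + 2.0})
success, final = env.run({"x": 0.0}, [push] * 5)
print(success, final)  # True {'x': 10.0}
</code>

Modelling a //Constraint// as a predicate over States to be avoided, rather than as a target to be reached, mirrors the "negative Goal" reading in the table above; the choice of a sequence of atomic //Actions// as a //Solution// is likewise just one simple way to instantiate the definition.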