public:t-713-mers:mers-24:knowledge

Current revision: 2024/09/20 10:09 by thorisson
  
====Controlled Experiment====
|  What is it?   | A fairly recent research method (approx. 200 years old, so, historically speaking) for testing hypotheses / theories.  |
|  Why is it Important?  | The most reliable way humanity has found to create reliable, sharable knowledge.    |
|  Why is it Relevant Here?  | Like individual learning, it involves dealing with new phenomena and //figuring them out//.   |
====What is an "agent"?====
|  "Agent"   | In this course an 'agent' is a "stand-alone" intelligent system.    |
|  "Intelligent System"   | We expect an "intelligent" system to be able to //learn autonomously//.    |
|  Minimum Learning  | That the system can learn //a task//.  |
|  "Task"  | A transformation of a state (typically in the environment) from one (steady-)state to another, that can be described (and thus verified) by a goal (described in some compressed way via a representation language).   |
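The task definition above can be sketched in code: a task pairs a start state with a goal predicate that both describes (compactly) and verifies the target state. This is only an illustrative sketch; the names `Task`, `verify`, and the door example are hypothetical, not part of the course material.

```python
# Illustrative sketch: a "task" as a transformation between two states,
# described and verified by a goal predicate. All names here are hypothetical.
from dataclasses import dataclass
from typing import Callable, Dict

State = Dict[str, str]  # a state as a simple attribute/value mapping

@dataclass
class Task:
    start: State
    goal: Callable[[State], bool]  # compressed description of the end state

    def verify(self, end: State) -> bool:
        """A task is achieved iff the resulting state satisfies the goal."""
        return self.goal(end)

# Example: transform a door from open to closed.
task = Task(start={"door": "open"}, goal=lambda s: s["door"] == "closed")
print(task.verify({"door": "closed"}))  # True: the goal verifies the new state
print(task.verify({"door": "open"}))   # False: the transformation did not happen
```

Note that the goal here is a predicate rather than a full state description, matching the idea that a goal describes the end state "in some compressed way."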
|  Constraint Diversity: \\ Breadth of constraints on solutions  | \\ If a system X can reliably produce acceptable solutions under a higher number of solution constraints than system Y, system X is more //powerful// than system Y.  |
|  Goal Diversity: \\ Breadth of goals  | If a system X can meet a wider range of goals than system Y, system X is more //powerful// than system Y.  |
|  Generality  | If a system X exceeds system Y on one or more of the above dimensions, we say X is more //general// than Y; typically, though, pushing for increased generality means pushing on all of the above dimensions.   |
|  General intelligence...  | ...means less needs to be known up front when the system is created; the system can learn to figure things out and how to handle itself, in light of **LTE**.   |
|  And yet: \\ The hallmark of an AGI  | A system that can handle novel or **brand-new** problems, and be expected to attempt to address //open problems// sensibly. \\ The level of difficulty of the problems it solves would indicate its generality.  |
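The dominance relation in the table above can be sketched as a simple comparison: summarize each system by the goals it can meet and the solution constraints it can satisfy, and call X more powerful than Y if X covers everything Y covers and strictly more on at least one dimension. The set-based measure and all names below are assumptions for illustration, not the course's formal definition.

```python
# Illustrative sketch (hypothetical names): comparing two systems along the
# goal-diversity and constraint-diversity dimensions via set coverage.

def more_powerful(x: dict, y: dict) -> bool:
    """X is more powerful than Y if X covers at least Y's goals and
    constraints, and strictly exceeds Y on at least one dimension."""
    covers = x["goals"] >= y["goals"] and x["constraints"] >= y["constraints"]
    strictly = x["goals"] > y["goals"] or x["constraints"] > y["constraints"]
    return covers and strictly

system_x = {"goals": {"fetch", "sort", "navigate"}, "constraints": {"time", "energy"}}
system_y = {"goals": {"fetch", "sort"}, "constraints": {"time"}}

print(more_powerful(system_x, system_y))  # True: X dominates Y on both dimensions
print(more_powerful(system_y, system_x))  # False: Y covers strictly less
```

This also shows why the relation is only a partial order: two systems whose goal sets merely overlap are incomparable, which matches the note that increasing generality usually means pushing on all dimensions at once.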
Last modified: 2024/09/20 08:56 by thorisson
