T-720-ATAI-2016 Main


Lecture Notes, F-12 01.03.2016

Attention the Concept

What it is In the domain of intelligent systems, the management of a system's resources is typically called “attention”.
Why it's important Any highly capable AI system will need to manage its resources sensibly. In any world (especially the physical world we inhabit), the content available for cognition – things to think about – vastly exceeds what we can mull over in a lifetime, let alone per unit of time. If we seek to build a highly autonomous system, that system must be able to decide on its own how to use its time to cognize.
William James “Everyone knows what attention is. It is the taking possession by the mind, in clear and vivid form, of one out of what seem several simultaneously possible objects or trains of thought. Focalization, concentration, of consciousness are of its essence. It implies withdrawal from some things in order to deal effectively with others, and is a condition which has a real opposite in the confused, dazed, scatterbrained state which in French is called distraction, and Zerstreutheit in German.” (James, The Principles of Psychology, 1890.)
Eyes are the window to attention The eyes are a strong indicator of where a human's attention is focused.
Listen where you look You tend to listen in the spatial direction you are looking.
Look where you listen Upon hearing a curious sound there is a strong tendency to automatically look in the direction the sound came from.
Cocktail party effect In a conversation at a cocktail party you are able to ignore the chatter around you, yet you can hear your name being mentioned by someone standing far away in the crowd. Selective attention. (Colin Cherry, 1953.)
Attention in action Awareness Test

Example of your attention in action: Stroop effect.
Instructions: First, read the words out loud. How did that work out for you? OK? Good.
Now, say out loud the name of the color of each word. What about this one?

But What IS Attention?

Filter Attention is like a lossy compressor: It throws out everything that is irrelevant.
Spotlight Attention is like a spotlight: Everything in the spotlight is easy to process; the rest is hard to attend to.
Top-down Goals and expectations steer attention.
Bottom-up The form and function of incoming stimuli steer attention.
Early selection “What goes in” is selected early on. Contradicted by the cocktail party effect. “Early” refers to a proposed “perceptual processing pipeline”.
Late selection “Data lingers around for a while” in the subconscious, until needed. “Late” refers to a proposed “perceptual processing pipeline”.
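The difference between early and late selection can be sketched as where a filter sits in a perceptual processing pipeline. The following is a minimal, purely illustrative sketch (all names, data shapes, and the `analyze` stand-in are hypothetical, not from any actual attention model):

```python
# Illustrative sketch: early vs. late selection as filter placement
# in a hypothetical perceptual pipeline.

def analyze(s):
    # Stand-in for semantic processing (e.g., word recognition).
    return {**s, "meaning": s["content"].upper()}

def early_selection(stimuli, relevant):
    """Filter BEFORE semantic processing: irrelevant input is never analyzed."""
    selected = [s for s in stimuli if s["channel"] in relevant]
    return [analyze(s) for s in selected]

def late_selection(stimuli, relevant):
    """Analyze everything; filter only at the response/selection stage."""
    analyzed = [analyze(s) for s in stimuli]
    return [a for a in analyzed if a["channel"] in relevant]

stimuli = [
    {"channel": "attended", "content": "hello"},
    {"channel": "ignored", "content": "your name"},
]

print(early_selection(stimuli, {"attended"}))  # "your name" never analyzed
print(late_selection(stimuli, {"attended"}))   # analyzed, then discarded
```

Both return the same selection, but only late selection ever processes the ignored channel – which is why hearing your own name across the room (cocktail party effect) is taken as evidence against strict early selection.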

Early selection.

Broadbent's filter model of attention (early selection).

Treisman's attenuation model of attention (early selection).

Deutsch-Norman model of attention (late selection).

Knudsen Attention Framework model of attention (late selection).

Attention in Narrow AI

Ignoring resources AI has for the most part ignored resource constraints – whether time, energy, memory, or other resources.
Data Relevance We know in advance what information is relevant to system operation.
Sampling Frequency We know in advance how frequently the system has to sample information.
Action Frequency We know in advance how frequently the system has to act.
Resource Availability We know in advance the resource requirements of the system.
Adaptation Dynamic adaptation to any of the above is not required.
Data Filtering Information filtering can be pre-programmed if the characteristics of relevant information are known in advance.
Resource Management Resource management and processing are hand-tuned for particular tasks and environments.
Is attention necessary? Cognitive resource management has not been of much concern in AI work that builds relatively simple systems for particular targeted problems.
Conclusion Attention is not really needed for narrow AI systems.
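The design-time assumptions listed above can be made concrete with a small, hypothetical control-loop sketch (sensor names, threshold, and the braking rule are all invented for illustration):

```python
# Hypothetical narrow-AI control step: data relevance, sampling rate,
# and filtering are ALL fixed by the designer, in advance.

RELEVANT_SENSORS = {"distance", "speed"}   # data relevance known in advance
SAMPLE_PERIOD = 0.1                        # sampling frequency fixed (seconds)

def step(readings):
    # Pre-programmed filtering: anything outside the known-relevant
    # set is simply dropped, with no runtime adaptation.
    filtered = {k: v for k, v in readings.items() if k in RELEVANT_SENSORS}
    # Hand-tuned reaction rule, acting at the same fixed rate we sample.
    return "brake" if filtered.get("distance", 1e9) < 5.0 else "cruise"

print(step({"distance": 3.2, "speed": 12.0, "temperature": 21.5}))  # brake
```

Nothing here decides *what to attend to*; the designer already decided. That is why attention, as resource management, never becomes a problem for such systems.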

In the Context of AGI

Is attention necessary? Yes. For systems capable of a wide range of actions in complex environments, explicit management of time and cognitive resources is not only useful, it is a necessity.
Example Consider an agent in a task-environment where some of the variables matter (<m>v_{i … n}</m>) and others don't (<m>v_{n+1 … m}</m>). An autonomous agent must find out for itself which variables belong to which category.
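One minimal (hypothetical) way an agent could sort variables into these two categories is to track how each observed variable correlates with goal achievement; the variable names, reward function, and correlation-based relevance measure below are illustrative only:

```python
# Hypothetical sketch: estimating which task-environment variables
# matter by correlating each with a goal/reward signal.
import random

random.seed(0)
N_VARS, N_STEPS = 4, 500

history = []
for _ in range(N_STEPS):
    v = [random.random() for _ in range(N_VARS)]
    reward = v[0] + v[1]          # in this toy world, only v1 and v2 matter
    history.append((v, reward))

def relevance(i):
    # Pearson correlation of variable i with the reward signal.
    xs = [h[0][i] for h in history]
    ys = [h[1] for h in history]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

scores = [relevance(i) for i in range(N_VARS)]
print([round(s, 2) for s in scores])  # first two high, last two near zero
```

Real task-environments are of course far harder (nonlinear, delayed, non-stationary effects), but the point stands: which variables matter must be *discovered*, not given.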
Prior AGI Attempts Significant limitations
Data filtering only
External information only
Realtime not addressed
Single-mode only

Functional Requirements for AGI Attention

Attention is transversal Resources have to be managed system-wide.
Data + Processes Attention in AGI is really a question of managing both data and processes.
Internal / External Managing internal cognitive processes and events is just as important as managing data coming in through the sensors.
System-wide quantification of data relevance
Goal-directed (top-down)
Novelty / unexpectedness (bottom-up)
System-wide quantification of process relevance
Operational experience: prior success/failure of processes (top-down)
Available data: missing data may prevent processes from running (bottom-up)
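The two quantification requirements above – top-down goal relevance and bottom-up novelty – can be sketched as a single priority score per data item. The weighting, field names, and scoring functions below are purely illustrative assumptions, not part of any published model:

```python
# Hypothetical sketch: one priority score combining top-down
# (goal-directed) and bottom-up (novelty/unexpectedness) relevance.

def priority(item, goals, expected):
    # Top-down: does the item relate to anything a current goal cares about?
    goal_rel = 1.0 if goals & item["tags"] else 0.0
    # Bottom-up: how far does the observed value deviate from expectation?
    novelty = min(abs(item["value"] - expected.get(item["source"], item["value"])), 1.0)
    return 0.6 * goal_rel + 0.4 * novelty   # arbitrary illustrative weighting

goals = {"obstacle"}
expected = {"lidar": 10.0}

a = {"source": "lidar", "tags": {"obstacle"}, "value": 9.8}    # goal-relevant
b = {"source": "lidar", "tags": {"wall-color"}, "value": 3.0}  # merely surprising
print(priority(a, goals, expected), priority(b, goals, expected))
```

Note that item `b` still gets nonzero priority from novelty alone – a system that scored only goal relevance would never notice the unexpected.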

Helgi Páll Helgason's Attention Model

Unified approach The same model is applied to data coming from the sensors and to data generated internally during cognition – including new processes, process initiation and termination, data generation, data deletion, etc.
Predictive capabilities - Capacity to generate predictions and expectations.
- Necessary control data for top-down attention in addition to goals.
Unified sensory pipeline - Data given identical treatment regardless of origin (external, internal).
Fine-grained - Data and processing units are small but numerous.
- Reasoning about small, simple components and their effects is significantly more tractable than for larger, more complex components.
Data-driven - All processing is triggered by the occurrence of data.
- Eliminates the need for fixed control loops, allowing for operation at multiple time scales and greater flexibility.
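The data-driven property – processing triggered by the occurrence of data rather than by a fixed control loop – can be sketched as a small publish/subscribe mechanism. This is a generic illustration of the *principle* only; it is not Helgason's actual implementation, and all names are invented:

```python
# Illustrative sketch of data-driven, fine-grained processing:
# small processes register on data patterns and fire only when
# matching data appears; nothing polls on a fixed control loop.
from collections import defaultdict

subscribers = defaultdict(list)   # pattern -> registered small processes

def on(pattern):
    def register(proc):
        subscribers[pattern].append(proc)
        return proc
    return register

def post(pattern, data):
    # The occurrence of data is what triggers processing.
    for proc in subscribers[pattern]:
        proc(data)

log = []

@on("sensor.sound")
def orient(data):
    log.append(f"orient toward {data['direction']}")

post("sensor.sound", {"direction": "left"})
post("sensor.light", {"level": 3})   # no subscriber: nothing runs
print(log)
```

Because each process is small and fires independently of any global clock, the same mechanism naturally supports operation at multiple time scales.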

Helgi Páll Helgason model of attention.

Lecture by Helgi Páll Helgason at AGI-12; scroll to [8:40] for an introduction to Helgi's model.

