IDEAL

Iterative Dialectic Engine for Automated Learning


Human consciousness

Terrence Deacon (1) argues persuasively that the breakthrough development for human consciousness was the acquisition of 'symbolic reference':  the ability to represent physical entities as abstract symbols, related to their referents by nothing save convention.  Uniquely, we humans can think of physical entities - including one another, and our own selves - as impersonal objects as well as unique 'souls' (in the sense of Kant's Ding an sich, 'thing-in-itself').  This doesn't sound like much of an advance!  However, as Deacon makes clear, without symbolic reference it is impossible to develop a language with grammatical structure, and it is the 'co-evolutionary' development of just such a language capacity that has led to what we term human consciousness.

Computers can do symbolic reference - that's how programs are written.  So why aren't they conscious?  This question presupposes, wrongly, that the 'missing link' for computers is the same as that for humans.  It has been argued here that the 'missing link' for artificial intelligence (AI) is not symbolic reference but the control of, and communication between, the components of our computing machines.  As Deacon says, the key to symbolic interpretation

...is not in the machinery itself but in the flow of patterns through it. (1)

So is AI simply a matter of wiring up the components correctly?  Not quite.  Humans have a number of faculties that are 'necessary but not sufficient' for consciousness:  capacities that are not 'missing links' (because they are also present in other species not having human consciousness) but without which consciousness would be impossible.  Principal amongst these are our senses, memory, and emotions.  Now, computers can 'sense' using various input devices, and they have very good 'memory'... but how can they have emotions?  Will a thinking machine ever sigh at the sight of a firm bottom or weep at the sight of a crimson sunset?

Some people have 'explained' the presence of the emotional faculty in modern humans as the inherited legacy of a more sensual past, when as noble savages we romped carefree across the plains of the Serengeti.  No:  humans have emotions because we need them.  Rita Carter suggests that the most important component of consciousness

...is not the ability to plan, or choose, or follow through a strategy despite the insistent urgings of our unconscious brain to chase each passing shadow.  Rather it is the intuitive sense of meaning that binds our perceptions into a seamless whole and makes sense of our existence.  Can that, too, be pinpointed?  Astonishingly, it seems that it can.  Meaningfulness is inextricably bound up with emotion. (2)

Emotions endow meaning, enabling us to attach more significance to some sensory stimuli than to others.  But 'attaching significance' to different items of information is well within the capacity of computers.  For example, if we quantify the 'relative importance' of information on a scale of 1-5, then we can include this measure of significance as a new field in a database table:

People_I_know       Relative_importance
-------------       -------------------
Bungalow Bill       5
Eleanor Rigby       2
Jude                4
Maxwell Edison      1
Michelle            4
Mr Kite             3
Polythene Pam       2
Rocky Raccoon       5
Sexy Sadie          3
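To make the idea concrete, here is a minimal sketch of such a table using Python's standard sqlite3 module.  The table and field names follow the example above; the in-memory database and the CHECK constraint are assumptions made purely for illustration.

    import sqlite3

    # In-memory database for illustration; a file path would persist the data.
    conn = sqlite3.connect(":memory:")

    # 'Relative_importance' holds the 1-5 significance measure described above.
    conn.execute("""
        CREATE TABLE People_I_know (
            name                TEXT PRIMARY KEY,
            Relative_importance INTEGER
                CHECK (Relative_importance BETWEEN 1 AND 5)
        )""")

    conn.executemany("INSERT INTO People_I_know VALUES (?, ?)", [
        ("Bungalow Bill", 5), ("Eleanor Rigby", 2), ("Jude", 4),
        ("Maxwell Edison", 1), ("Michelle", 4), ("Mr Kite", 3),
        ("Polythene Pam", 2), ("Rocky Raccoon", 5), ("Sexy Sadie", 3),
    ])

    # Recall the most 'significant' people first.
    for name, importance in conn.execute(
            "SELECT name, Relative_importance FROM People_I_know "
            "ORDER BY Relative_importance DESC"):
        print(name, importance)

Any query against such a table can privilege high-significance entries over low-significance ones - a computational analogue, however crude, of emotionally weighted recall.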

If 'relative importance' is deemed too crude a measure, then one could replace it with several fields, e.g. the 'primary emotions' identified by Carter as disgust, fear, anger and parental love:

People_I_know       Disgust   Fear   Anger   Parental_love
-------------       -------   ----   -----   -------------
Bungalow Bill       1         2      3       2
Eleanor Rigby       2         4      4       3
Jude                2         4      3       5
Maxwell Edison      3         3      1       3
Michelle            1         1      4       3
Mr Kite             3         5      1       4
Polythene Pam       5         5      2       4
Rocky Raccoon       5         3      2       1
Sexy Sadie          4         2      5       2
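The richer scheme translates just as directly.  The sketch below (again assuming sqlite3, continuing the example above) stores one field per primary emotion and then collapses the four scores into a single salience value; taking the maximum of the four is an illustrative choice, not something prescribed here.

    import sqlite3

    conn = sqlite3.connect(":memory:")

    # One field per primary emotion, each on the same 1-5 scale.
    conn.execute("""
        CREATE TABLE People_I_know (
            name          TEXT PRIMARY KEY,
            Disgust       INTEGER,
            Fear          INTEGER,
            Anger         INTEGER,
            Parental_love INTEGER
        )""")

    conn.executemany("INSERT INTO People_I_know VALUES (?, ?, ?, ?, ?)", [
        ("Bungalow Bill", 1, 2, 3, 2), ("Eleanor Rigby", 2, 4, 4, 3),
        ("Jude", 2, 4, 3, 5), ("Maxwell Edison", 3, 3, 1, 3),
        ("Michelle", 1, 1, 4, 3), ("Mr Kite", 3, 5, 1, 4),
        ("Polythene Pam", 5, 5, 2, 4), ("Rocky Raccoon", 5, 3, 2, 1),
        ("Sexy Sadie", 4, 2, 5, 2),
    ])

    # Collapse the four scores into one salience value: here, the strongest
    # single emotional response (an illustrative policy, not the only one).
    for name, salience in conn.execute(
            "SELECT name, MAX(Disgust, Fear, Anger, Parental_love) "
            "FROM People_I_know ORDER BY 2 DESC"):
        print(name, salience)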

It is concluded that any design for an automated learning engine must be able to express and record the relative importance of both its input data and its stored data.
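As one way of reading that requirement, the sketch below shows a bounded memory that records every datum together with its significance and, when full, forgets the least significant items first.  The class name, the capacity and the eviction policy are all illustrative assumptions, not part of the IDEAL design itself.

    import heapq
    import itertools

    class SignificanceStore:
        # A bounded memory that forgets the least significant data first.
        # (An illustrative reading of the design requirement above, not a
        # specification drawn from the text.)

        def __init__(self, capacity):
            self.capacity = capacity
            self._heap = []                # min-heap of (significance, seq, datum)
            self._seq = itertools.count()  # tie-breaker; data are never compared

        def record(self, datum, significance):
            heapq.heappush(self._heap, (significance, next(self._seq), datum))
            if len(self._heap) > self.capacity:
                heapq.heappop(self._heap)  # discard the least significant entry

        def recall(self):
            # Return stored data, most significant first.
            return [(d, s) for s, _, d in sorted(self._heap, reverse=True)]

    store = SignificanceStore(capacity=3)
    for name, importance in [("Bungalow Bill", 5), ("Eleanor Rigby", 2),
                             ("Jude", 4), ("Maxwell Edison", 1), ("Michelle", 4)]:
        store.record(name, importance)
    print(store.recall())  # Eleanor Rigby and Maxwell Edison have been forgotten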

References

  1. Terrence Deacon, The Symbolic Species (Penguin 1997).

  2. Rita Carter, Mapping the Mind (Phoenix 1998).



Copyright © Roger Kingdon 2004