Wednesday, January 30, 2008

More on consciousness

I mentioned my Abstraction abstracted paper to the CAS group. In responding to one of the comments I wrote the following.
Thanks for looking at the paper. The issue I was attempting to address is what abstraction is. I don't think we have a good answer. We all regard it as very important—especially in software and computational thinking. If you were going to define abstraction, what would you say? That's where I was going.

It wasn't where I was planning to go. (I'm not sure I knew where I was planning to go when I started.) But I kept getting pushed in the direction of attempting to understand what we mean by abstraction. I wound up saying that "abstraction" and "concept" are more or less the same. One contribution of the article (if there is one) is the claim that "concept" depends on consciousness. Of course there is a very large philosophical literature on conceptualization. I can't claim to know much of it. (For example, the Stanford Encyclopedia of Philosophy has this article on concepts, which I haven't mastered.) But it does seem to me useful to say that a concept is a distinction that one is aware of making or being able to make, and hence one that depends on some level of self-awareness. So a concept is our way of making discrete chunks out of our experience.
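As a loose software analogy (my own illustration, not something from the paper), here is a minimal Python sketch of that chunking idea: a continuous stream of "experience" (numbers) gets carved into a few discrete, named categories, and once the distinction exists as a symbol we can reason with it without going back to the raw readings. The function name and thresholds are made up for the example.

```python
def form_concept(temperature_c):
    """Map a continuous reading onto a discrete, nameable category."""
    # The cutoffs are arbitrary; the point is that a distinction is
    # being drawn and given a name.
    if temperature_c < 10:
        return "cold"
    elif temperature_c < 25:
        return "mild"
    else:
        return "hot"

# Raw "experience": a stream of continuous readings.
readings = [3.2, 18.5, 31.0]

# Chunked into concepts that can now be manipulated symbolically.
concepts = [form_concept(r) for r in readings]
print(concepts)            # ['cold', 'mild', 'hot']
print("hot" in concepts)   # True: thinking *with* the concept, not the raw data
```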

It's not clear how widespread the ability to form concepts is. For example, dogs can recognize their masters. So they have a concept (an abstraction) of their master in some sense. (So even proper nouns are abstractions—of the thing named.) But do we know whether dogs can mentally refer to their masters? Can a dog think to itself something like, "I wish my master were home so she could take me out for a walk"? Or is that too abstract? So perhaps some animals have the ability to form concepts but not to manipulate them mentally? Obviously I don't have all the answers.
So I guess that one of the important characteristics of concepts/abstractions is the ability to manipulate them mentally, to think with them, to use them to model the world. Apparently this is discussed in Rational Animals? by Susan Hurley and Matthew Nudds (Eds), including this paper by Ruth Garrett Millikan.

A further thought. Consciousness is an emergent phenomenon. Emergent phenomena are epiphenomenal. But we experience consciousness as real. (The qualia problem.) Why is that?

1 comment:

Anonymous said...

It is probably better to define "consciousness" as any inner state that mirrors an outer state. A stone warmed by the sun is "conscious" under that definition, but it doesn't have the high-resolution mirroring of the environment that humans have. Surely it is even less conscious than a lizard sleeping in the sun.

Simple organisms have inner states that direct their actions in relation to features in their environment. A microbe may be sensitive to levels of salinity, warmth, etc. It may be sensitive to traces of chemicals that are shed by prey creatures. Perhaps, beyond those simple orienting mechanisms, all they do is bump into things and try to absorb them, or try to escape being eaten by them.

More complex organisms have more complex inner states that map onto external states somehow. Homing pigeons can find their way home over long distances, even when they have been carried to the remote location in a covered cage.

At a certain point some animals begin to include themselves as identifiable units in their system of mapped entities. Elephants, chimpanzees, porpoises, etc. will react in a clearly self-aware way if they have regular access to mirrors and then experimenters covertly alter their appearance.

The old ideas that only humans have purposes, only humans have awareness of self, etc., cannot explain the empirical observations. It appears much more likely that consciousness is incrementally emergent. Self-awareness is not lacking in elephants and not perfect in humans.