I was thinking today about the human dream of creating a machine that is aware in the living sense of that word. An idea came to me that I'd like to discuss, and I'd be curious to hear what you think.
We learned from Darwin about evolution: traits that emerge and persist within a species tend to be necessary. We don't tend to see redundant body parts evolving, because natural selection has no motive to develop and maintain a part that confers no benefit. It is reasonable to think that whatever emerges within a species is beneficial to that species.
So how does that apply to consciousness/sentience? If we accept that some lifeforms don't have it (e.g. grass) and some do (e.g. humans), then those that do must benefit from it, or it wouldn't have evolved.
In which case, sentience must offer something beneficial to a lifeform that cannot be achieved by data processing alone.
That sounds quite profound to me, because we seem to expect that almost everything pertinent to the outward behaviour of an autonomous being could be simulated without any consciousness hovering above it. What does consciousness add to behaviour beyond what data processing can do?
Consider something like emotion. In terms of survival, the felt emotions themselves are not what matters; it is the behavioural changes they motivate that are. Surely those behavioural changes could be simulated without consciousness, as in the toy sketch below?
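To make that concrete, here is a minimal illustrative sketch in Python, entirely my own construction (the agent, names and thresholds are invented for the example, not taken from anywhere): a toy creature whose "fear" is just a stored number that modulates its behaviour. Nothing in it feels anything, yet it produces the flight response we would normally attribute to the emotion.

```python
# A toy agent whose "fear" is nothing more than a number driving behaviour.
# Purely illustrative: the class, names and thresholds are invented for this sketch.

class ToyAgent:
    def __init__(self):
        self.fear = 0.0  # an internal variable, not a felt emotion

    def sense(self, predator_distance):
        # "Fear" rises as the predator gets closer (simple inverse relation).
        self.fear = max(0.0, 1.0 - predator_distance / 10.0)

    def act(self):
        # Behaviour changes purely as a function of the stored value.
        if self.fear > 0.7:
            return "flee"
        elif self.fear > 0.3:
            return "freeze and watch"
        return "keep grazing"


agent = ToyAgent()
for distance in (9.0, 5.0, 1.0):
    agent.sense(distance)
    print(distance, "->", agent.act())
```

The behaviour we would label "fearful" falls out of plain data processing, which is exactly why it is puzzling what the felt experience itself adds.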
So I don't know what that extra something is.
One conclusion we might draw is that consciousness isn't useful without the ability of the mind to modify behaviour. We wouldn't expect natural selection to favour consciousness without a corresponding cognitive control mechanism. In evolutionary terms, the two would have to appear together; one without the other doesn't change behaviour.
So perhaps if we could understand what that extra something is, we would gain some insight into why some life forms evolve consciousness and some don't.
The obvious dividing line to compare across is animal versus vegetable: the former is generally considered conscious while the latter is not.
Vegetables have no ability to control their environment. They cannot change what soil they are growing in, or move themselves to find food. They can only adapt to their fixed environment. Their growth (I believe) is purely mechanical and devoid of any underlying mind. What could a mind possibly add to the behaviour of a blade of grass? Surely it would be a tremendous redundancy to evolve the machinery to support consciousness without it being beneficial to survival?
Animals, on the other hand, do move about. They look for food. They control, shape and adapt their environment to suit themselves. So how does consciousness assist that? Why couldn't this behaviour be achieved by data processing alone?
Of course, what I'm getting at is that once we understand what that extra something is, and how it could benefit the behaviour (and survival) of a machine, a machine could perhaps be evolved to have consciousness too.