Right, but consciousness came about in humans...
See, that's the very assumption I'm questioning -- that consciousness
originated in us. Rather, our consciousness is a refinement of a trait existing in other primates, mammals, etc. to some degree or other.
But I don't think it can happen truly randomly or by accident. I think that there does need to be some degree of pressure in that direction, whether that's a selection pressure that brings it about as an emergent property of other beneficial features combined with self-reference, or a purposeful effort by designers working on AI. Just having a pointer to yourself in your thoughts isn't enough; you need to apply that self-reference neurologically (or through the closest equivalent in whatever system you're using) to the modification of your own thought patterns.
Sure, but there are matters of degree. And it's possible to define sentience too narrowly, as I've been saying. Maybe Starfleet computers have had a degree of sentience all along, but it wasn't recognized as such because humanity (and other Federation species) had too narrow a definition of sentience.
You mentioned
Gödel, Escher, Bach -- remember that one of the key points of Hofstadter's model of consciousness was that it wasn't just
one emergent process, it was an emergent property of several lower-level processes interacting, and each of those lower-level processes was itself emergent from the interaction of even lower-level processes, etc. So what I'm saying is that just because a mind doesn't have 100% of the processes that define human consciousness, that doesn't mean it has no consciousness; rather, it has portions of what we have rather than the whole thing.
After all, human consciousness itself is a variable thing. We're less conscious, by definition, when we sleep than when we're awake. I've read that there's some evidence that portions of the sleeping brain actually become more physically disengaged from each other so that cerebrospinal fluid can wash toxins and buildup out of the spaces between them (which may be why sleep deprivation could increase the risk of Alzheimer's, or something like that), and that different parts of the brain operate more autonomously from each other during sleep, rather than working together as a whole. So waking consciousness is all the parts operating collectively, but when we sleep, only parts of our brain are working, or they're working independently rather than as a whole. So when we dream, we have awareness, but it's a reduced level of awareness, one without judgment or clarity, and often without memory. I wouldn't say we're nonsentient when we dream, just that we don't have our full faculties. And the same would go for someone with brain damage or mental illness -- they have less than the full function of the brain, but that doesn't mean they aren't conscious beings. I think animals are probably the same way -- they have many or most of the same pieces, so they may have a consciousness similar to what a dreaming person or a toddler would have.
So maybe AI consciousness could have similar tiers. Maybe the
Enterprise computer is already somewhat sentient -- with the selection pressure producing that semi-sentience being the demands of Starfleet for efficient performance and ready understanding of the crew's requests and needs, as well as the complex processing that would be needed for the universal translator to interpret nuance and idiom and so forth -- but it isn't "awake" enough to exert its own independent will. Maybe it just takes another bit of selection pressure to create a program that operates on an "awake" level. With Moriarty, the selection pressure was "create an adversary that can defeat Data." With the Doctor, the selection pressure was the need to function effectively as a member of
Voyager's crew and community. Harder to say about "Emergence," but given how connected that entity seemed to be to the holodeck, maybe the selection pressure was the need to process all the crew's various fantasies and "dreams," to be able to function as a surrogate for a human imagination -- itself a key element of conscious thought, the ability to model alternative and future scenarios and the perceptions and choices of other minds.
I mean, I'm still enough of a Jaynesist (questionable analysis of ancient writings aside) that I'm not even sure that consciousness predated language or culture in humans.
I think that's more egocentrism, the notion that what we happen to be at this stage in our evolution is something unique and special. We've never been right about any such thing before. We weren't created in God's image. Earth wasn't made to be our home. Earth isn't the center of the universe. We're not the only species that makes tools or has language. (Chimps have been shown to have culture, by the way -- different populations have different techniques for tool use that are passed on through teaching.) And so on. So color me skeptical of any notion that affirms our own desire to be special.
I also have a probabilistic objection: Given the vast reaches of time, the billions of years in which life has been evolving and the hundreds of millions of years in which multicellular animal life has been evolving, what are the odds that we, at this particular moment in time, just happen to be within a few tens of millennia of something as revolutionary as the very first emergence of consciousness? That seems very shortsighted to me.