Quote Originally Posted by Sandform View Post
I have observed debates on this as to "why" consciousness arose in life, because wouldn't something without consciousness that acted exactly as if it were conscious survive just as long? This becomes the core of the problem with alien life. Are they conscious, or are they unconscious things that merely seem conscious? Furthermore, is all life on Earth conscious? Of course it is no great leap of faith to assume that any animal with a brain is conscious, since it runs in the same fashion as ours; however, until we narrow down an exact knowledge of what consciousness is, or rather how it arises, we can never know for sure whether an alien life form that shares no common ancestry with us would in fact be conscious. For that matter, we could never be sure of an AI's consciousness. I have no doubt that if we communicate with alien life it would be intelligent, in the same way as our current computers are, but I would remain healthily skeptical as to the consciousness of such a being, excluding of course the possibility that it developed a brain exactly, or nearly, like ours.
I suppose this leads back to the initial formation of the brain. There is no doubt that these primitive brains were much simpler than ours. I think I mentioned earlier that they are basically sort of "reformatting machines". Imagine these flatworm creatures (which had basic brains and basic eyes) were selected for spending most of their time in the dark (just a stupid example). The brain might take the signals from the basic eyes (which were really just slightly curved light-sensitive patches) and convert them into muscle movement. The stronger the light signal, the stronger the muscle movement, meaning that when it's in light it will swim until it reaches the dark.
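The "reformatting machine" idea can be sketched in a few lines of code. This is purely illustrative: the worm, the light falloff, and all the numbers are assumptions, not a model of any real organism. The point is just that "brain" here is nothing more than a mapping from sensor signal to muscle output.

```python
# Illustrative sketch: the hypothetical flatworm's "brain" maps light
# intensity straight onto muscle output, so stronger light produces
# stronger swimming. All names and numbers are invented for the example.

def muscle_output(light_intensity):
    """Map a light reading (0.0 = dark, 1.0 = bright) to swim strength."""
    return light_intensity  # stronger signal -> stronger movement

def step(position, light_at):
    """Advance the worm: it swims harder the brighter it is here."""
    return position + muscle_output(light_at(position))

# Toy world: light fades as the worm moves in the +x direction.
def light_at(x):
    return max(0.0, 1.0 - x / 10.0)

x = 0.0
for _ in range(50):
    x = step(x, light_at)

# The worm ends up where it is (almost) dark, having "sought" nothing:
# the behaviour falls out of the signal-to-muscle mapping alone.
print(round(light_at(x), 3))
```

Notice there is no decision-making anywhere; the dark-seeking behaviour is just what the mapping produces when run forward.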

I wouldn't call this consciousness, but when you have to add a new module once your eyes get more dish-like, to recognise where light is coming from, that is an extra process. The more modules you add, the more such "thoughts" arise, and they compound rapidly. It is my belief that what we think of as "consciousness" (deliberately put in inverted commas) is tightly bound up with, if not synonymous with, these processes.

I think, however, that this only happens when the brain has these "thoughts", in the sense I (more or less) formally defined in my previous paragraph. What makes AI different is that, in programming terms, a machine built to pass some sort of Turing test is just a list of if/then statements, which works completely differently, even though the results appear the same to us. The reality is that the actions the AI takes don't come from the same place (or reasoning, I should say) that real intelligence comes from.
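The if/then point is easy to demonstrate. Below is a toy rule-list "chatbot" (the rules and replies are invented for the example): it can look responsive, but its "reasoning" is nothing more than pattern lookup.

```python
# A toy rule-based responder: a plain list of if/then checks that can
# *appear* conversational. The patterns and replies are invented here
# purely to illustrate the point.

RULES = [
    ("hello", "Hi there! How are you?"),
    ("how are you", "I'm fine, thanks for asking."),
    ("weather", "Looks like a nice day to me."),
]

def reply(message):
    text = message.lower()
    for pattern, answer in RULES:      # just a list of if/then statements
        if pattern in text:
            return answer
    return "Interesting. Tell me more."  # catch-all keeps the illusion going

print(reply("Hello!"))                   # seems friendly...
print(reply("What's the weather like?"))  # ...and seems attentive
```

Nothing here resembles the layered sensory processes described earlier; the output looks similar from outside, but it comes from a completely different kind of mechanism.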

So I suppose it follows that consciousness isn't really a boolean property: conscious or not conscious (like the processes going on in a Venus flytrap). I would say it has a scope, like a gradient. A fly is conscious because it is capable of reasoning, but it is less conscious than me because it is capable of less reasoning (this morning I watched a fly keep trying to fly through the glass of my window).