• Lucid Dreaming - Dream Views





    Thread: Consciousness and AI

    1. #1 - Arra

      Consciousness and AI

      I started writing this as a response to the AI thread in Science & Mathematics, but have realized it's more a philosophical topic.

      I've always had a problem with AI and the possibility of it simulating human consciousness. Anyone who's seen how computer programming is done knows it's basically just algorithms: conditional statements and loops, and maybe a few slightly higher-level concepts. Basically just rules. Give some part of the program some data, and it carries out some algorithm to determine what, in the end, to output. A basic component would just be: if the data is this, do this; if it's not, do something else.
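      To make "just rules" concrete, here is a toy sketch in Python (made up purely for illustration, not any real AI code): everything the program does bottoms out in conditionals and loops like these.

      Code:
          # A made-up toy "program": nothing but rules applied to input data.
          def respond(temperature_c):
              # if the data is this, do this; if it's not, do something else
              if temperature_c > 30:
                  return "hot: turn on the fan"
              elif temperature_c < 10:
                  return "cold: turn on the heater"
              else:
                  return "comfortable: do nothing"

          # loop over some input data and apply the rules to each reading
          for reading in [35, 5, 20]:
              print(respond(reading))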

      I've never helped make a large-scale program, so correct me if I'm wrong, but it seems that if all the computing power in the world were used to try to simulate a brain, it would just be a very complicated version of what I've described. How does consciousness arise at all through this? A behaviorist might say that the computer has simulated a human brain successfully if it behaves like one. But the computer would not have awareness; it wouldn't experience anything, such as the subjective experience of the color blue, which we all have but could not make some unconscious being comprehend if our lives depended on it.

      I'm an atheist, and so far it seems to me that natural selection is probably true and that I'm somehow wrong here. But I don't yet see where I've gone wrong.

      The response I usually get is that consciousness/awareness does arise through the basic components of the brain being put together in an appropriate formation. My argument is usually compared to emergent properties. People say it's like asserting that it's impossible for water to be wet because none of the H2O molecules are wet on their own. That on one level, yes, they're nothing but molecules interacting with one another through cohesion and that's all they're doing, but on another level wetness emerges. And the 'mystery' of how consciousness arises from brain processes is a parallel situation.

      But this response seems flawed to me. With water, the molecules interact, and when there are a sufficient number of them our brains are able to perceive them and interpret wetness from them. There is no property here that magically emerges. Every step can theoretically be explained in detail, so that it makes logical sense we should perceive the wetness. But there is no logical path that I can see that would lead from a complicated algorithm involving only physical components, as we know them, to consciousness.

    2. #2 - Xei

      If a program that simulates the brain (which is certainly possible, computers can simulate any causal physical situation) is 'just conditional statements and loops' as you put it, then surely a brain in nature also is just conditionals and loops?

      The question you need to ask is: if water and fat and protein and sugars and ions can cause consciousness, why can't silicon and electrons?

      Clearly the matter itself is not what's relevant: it is what the matter does that is relevant. If the matter acts like a brain, you will have consciousness, whether it's a brain in nature or on a computer.

    3. #3 - nina

      I don't understand why science has not yet been able to simulate consciousness, or create AI. If all a brain is, is just chemical processes and links, basic algorithms...then why hasn't science been able to replicate this with a computer? Are the links and processes in the brain too complex for science to replicate with present technology?

    4. #4 - Spartiate

      Well, there are around 1 quadrillion synapses in the brain; we haven't made neural networks that large yet. We also haven't mapped every neuron and synapse in the brain, so even if we had a large enough network, we wouldn't know how to connect it.
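      Just to give a rough sense of the scale involved (a back-of-envelope sketch in Python; the one-weight-per-synapse assumption is mine and purely illustrative):

      Code:
          # Back-of-envelope storage estimate; the numbers are illustrative assumptions.
          synapses = 1e15          # roughly 1 quadrillion synapses in a human brain
          bytes_per_synapse = 4    # assume a single 32-bit weight per synapse
          total_petabytes = synapses * bytes_per_synapse / 1e15
          print(total_petabytes, "petabytes just to store one number per synapse")
          # ~4 petabytes, before simulating any of the dynamics at all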

    5. #5 - Entaria

      To answer the question of why we have not been able to create an AI yet:

      It's all a matter of complexity.

      I'm not a computer expert or a neuroscience expert (I just have a basic knowledge, as a psychology student who focuses on behaviour), so someone please correct me if I'm wrong, but I think the current issue with creating an AI that mimics human intelligence is a lack of ability to let computers learn and respond with the same flexibility as humans (as well as a possible lack of computing power and memory; the human brain, as far as we know, is almost limitless in the amount of information it can store). The human brain has billions of neurons, each connecting to potentially thousands of others in hundreds of different ways. They grow in vast numbers when we are young to adapt to our situation, die off if they are no longer needed, and find new ways to connect themselves to adapt to new situations as we get older and learn new things. I believe this is the process that needs to be recreated with technology in order to have an AI that can truly mimic human intelligence.

      As it is, we can program a computer and tell it "react this way if this happens" or "react in one of these ways if this happens," but computers do not have the same capacity for learning and memory as a human, and so, faced with an entirely new experience, a computer would not be able to respond. We simply don't have the capacity to make a list of every possible situation a computer (or human) could find itself in, and every possible response that could be made. Not to mention that we don't even have a full and complete picture of personality (if you want an AI with a personality) and how it interacts with the situation at hand, let alone know enough to program this into a computer. Even forgetting personality and going simply with logic is a problem when a situation cannot be dealt with using "standard" logic.

      The problem with creating an AI with the technology we currently have is that we simply don't know enough about human psychology and all the little myriad things that determine our own behaviour to even consider simply programming a computer to act like a human. We can certainly reach some level of human-like behaviour, and even learning to some extent, with computers, but until we can create something similar in structure to a human brain, something that can be flexible and learn in a similar way to us, I believe it will be impossible to create an AI approaching anything even remotely close to the human-like intelligence shown in fictional AIs. Not to mention we don't even know how the physical process works. We know that there are various chemical reactions and electrical impulses in the brain when performing specific tasks, but we don't know, for example, how this type of activity in one section of the brain translates into the experience of remembering something.

      And just a note in terms of the example of experiencing colour: Our own experience of colour is nothing more than our eyes taking in a particular wavelength of light, which stimulates specific receptors, and is interpreted by the brain as "colour," which is actually an abstract concept, as colour does not physically exist. This process can certainly be created in a computer, given the right sensors and data about wavelengths, thus allowing that computer to perceive those wavelengths, and indirectly, colour. Although internally, a computer may not experience colour the same way we do (which in itself is completely subjective and different from person to person. I can point out a hundred things that are blue, but I cannot even begin to explain to anyone what blue is, or even know if you see the colour blue the same way I do), it would still be able to "see" colour, just as we can see colour.
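      As a toy illustration of the "right sensors and data about wavelengths" part, here is a made-up Python sketch that maps a measured wavelength to a colour label (the bands are approximate). It labels light the way we do, but says nothing about whether anything is experienced:

      Code:
          # Toy sketch: classify a visible wavelength (in nanometres) into a colour name.
          # The bands are approximate and purely illustrative.
          def colour_name(wavelength_nm):
              if 380 <= wavelength_nm < 450:
                  return "violet"
              elif 450 <= wavelength_nm < 495:
                  return "blue"
              elif 495 <= wavelength_nm < 570:
                  return "green"
              elif 570 <= wavelength_nm < 590:
                  return "yellow"
              elif 590 <= wavelength_nm < 620:
                  return "orange"
              elif 620 <= wavelength_nm <= 750:
                  return "red"
              return "outside the visible range"

          print(colour_name(470))   # -> "blue", but nothing here 'experiences' blue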

    6. #6 - DuB

      Quote Originally Posted by Aquanina View Post
      Are the links and processes in the brain too complex for science to replicate with present technology?
      Quote Originally Posted by Spartiate View Post
      Well there are around 1 quadrillion synapses in the brain, we haven't made neural networks that are large enough yet. We also haven't mapped every neuron and synapse in the brain, so even if we had a network large enough, we wouldn't know how to connect it.
      This is true, but I don't think most people really understand or appreciate just how wide the gap between meaty brains and simulated brains still is. The typical artificial neural network employed in modern computational neuroscience research is absurdly, laughably oversimplified. They "simulate" brains to about the same degree that paper airplanes "simulate" supersonic jets. And it's not simply a matter of network size or number of connections: the individual nodes themselves, the very foundations of the networks, do only a pale shadow of what actual neurons do. For example, nodes typically pass along "bits" of information in a binary fashion, unlike actual neurons which almost certainly transmit richer information encoded in the precise temporal patterns of their firings. What's more, we don't even have a thorough understanding of how actual neural cells and synapses work; we continue to learn very basic things about how they communicate with one another (recent example). In other words, the biological models most researchers are referring to when they design these nodes are just wrong.
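      To give a flavour of how crude the typical node is, here is a minimal sketch (my own illustration, not anyone's actual research code) of the classic threshold-unit abstraction. Compare that with a real neuron's dendritic trees, neurotransmitter chemistry, and precise spike timing:

      Code:
          # Minimal sketch of the classic artificial "neuron": a weighted sum and a threshold.
          # Real neurons do vastly more (dendritic computation, spike timing, chemistry).
          def node(inputs, weights, threshold):
              activation = sum(i * w for i, w in zip(inputs, weights))
              return 1 if activation >= threshold else 0   # passes along a single "bit"

          print(node([1, 0, 1], [0.5, 0.8, 0.3], threshold=0.7))   # -> 1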

      The worrying thing here is not how far we still have to go. The real worrying thing is that these conglomerations of (typically) a few dozen to a few hundred neuron-like abstractions are already really complicated. So complicated, in fact, that a lot of the time their behavior seems downright mysterious even to the researchers who designed them. While in principle one can always retrace any simulation to see exactly what happened to produce the outcome, in actual practice, doing so often requires sifting through more data than any human can reasonably make any sense of. It makes you wonder what it will be like if/when we have neural nets which actually do work a lot like brains. It seems rather likely that either (a) we won't ever get there, or (b) we will, but the behavior of these networks will be just as mysterious and indecipherable as real brains--which may represent a great pat on the back for humanity, but it's not clear how much we'd really learn from it.

    7. #7 - Shadow27

      Essentially:
      -The brain is an analogue machine using comparisons of likeness and logical assertions.
      -A computer program is a digital machine using calculations and data storage to evaluate and make logical assertions.
      -We don't know enough about the brain to know how the algorithms work as the brain is highly complex and not something we can take apart and examine.
      -The brain, being analogue, is difficult to represent through a digital medium.
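      On the last point, a made-up Python sketch of the analogue-to-digital problem: a smoothly varying signal has to be chopped into discrete steps before a digital machine can represent it, and some information is lost at every step.

      Code:
          # Toy sketch: quantising a continuous ("analogue") signal into a few digital levels.
          import math

          levels = 8                                   # pretend we have only 3 bits of resolution
          for t in range(10):
              analogue = math.sin(t / 3)               # smooth, continuous value
              digital = round(analogue * (levels / 2)) / (levels / 2)   # nearest coarse step
              print(f"t={t}  analogue={analogue:+.4f}  digital={digital:+.2f}  error={analogue - digital:+.4f}")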


    8. #8 - tommo

      Quote Originally Posted by Shadow27 View Post
      -We don't know enough about the brain to know how the algorithms work as the brain is highly complex and not something we can take apart and examine.
      We can take it apart and examine it. Just in really, really small bits. Or slowly. I can't find an article you don't have to pay for, but basically some scientists have replicated something like less than a square millimetre of rat brain tissue. Also, there are fMRIs, which show us which parts of the brain are used for different things.

      To the OP. Water is only wet because we aren't. It's the same as wood being hard because our skin is softer than wood. If our skin were steel, wood would feel pretty soft. If we were air or something, water would be pretty hard.

      So yes, we are just a complicated machine. If we could replicate a human brain, there's no reason why it wouldn't be conscious. Unless you believe in souls, etc. But you said you are an atheist, so you are just being illogical in thinking that there is something else which causes consciousness.

      Personally I think when we replicate a human brain, we'll be like "wow! that's incredible, we're so smart" and then we'll get over it. The really cool thing will probably come before this, although maybe afterward, depending on how we replicate the brain. It will be all these robots etc. that we've already created simple versions of. Robots that can do things better than humans can. Like playing chess (done), mathematics (done), chemistry (I don't think so, because it usually requires creativity, which we can't do yet to my knowledge), composing music, artwork (not yet), etc.

      When we put these things together in a robot, we won't care whether it's conscious or not. Consciousness is really sort of a bothersome thing. Why else would so many people be so determined to get rid of it? (Buddhists, or just people seeking ego-death, etc.) Consciousness just gets in the way a lot of the time, makes us reflect on things too much, causes exaggerated and prolonged stress, etc. Maybe we could make consciousness even better than it is in humans, so these sorts of problems don't arise. A sort of debugging of the human brain, hehe.

    9. #9 - Arra

      Quote Originally Posted by tommo View Post
      So yes we are just a complicated machine. If we could replicate a human brain, there's no reason why it wouldn't be conscious. Unless you believe in souls etc. But you said you are an Atheist so you are just being illogical thinking that there is something else which causes consciousness.
      It would be illogical to pretend that I didn't see the problem with consciousness I brought up in the OP. I see a genuine problem with consciousness being brought up by physical processes and therefore with the ability for AI to become conscious. I might be making a logical error, but it isn't illogical for me to wonder what that error is (it seems like that's what you're saying, sorry if you aren't, need to say this lest you accuse me again of arguing against unmade claims). I'm not going to believe something and ignore a potential logical flaw just because it's consistent with beliefs I already hold. But I do think it's likely I am making some logical error in my reasoning in the OP. I just want to know what it is. No one seems to be addressing it though.

      The responses so far seem to be saying that consciousness is possible in AI and it makes sense with a complex brain, without even addressing what I said. I don't care what the thread diverts to or if my post isn't answered. Perhaps it's so logically stupid and the error I'm making is so obvious to everyone that no one sees a point in bringing it up, or maybe no one understands my argument because I didn't explain it well. Just don't pretend you're answering my original claim when you aren't addressing it at all.

      Quote Originally Posted by tommo View Post
      Consciousness just gets in the way a lot of the time, makes us reflect on things too much, causes exaggerated and prolonged stress etc.
      Maybe we could make consciousness even better than it is in humans. So these sorts of problems don't arise. A sort of debugging of the human brain hehe
      Gets in the way of what? If we didn't have consciousness, we wouldn't be aware and wouldn't place value on anything. There would be no feeling of anything being valuable, no state would be better than any other state, there would be nothing to get in the way of.

    10. #10 - Xei

      Your original post isn't very clear. Are you saying that consciousness is impossible for AI only in the context of AI programs running on computers (or perhaps general mechanical devices, although that's vague, and it's not clear why the brain shouldn't be included), or are you saying that consciousness seems impossible in all circumstances? The first half of your post is very computer-centric, but the second half seems more general.

    11. #11 - nina

      Quote Originally Posted by Dianeva View Post
      or maybe no one understands my argument because I didn't explain it well. Just don't pretend you're answering my original claim when you aren't addressing it at all.
      this

    12. #12 - Xei

      No, not that. Maybe people thought they were addressing your argument because you didn't explain it well. Don't be so ungrateful, people are just here to discuss, they generally do it in a pretty polite manner, and they generally put a lot of thought and time into it and try to honestly reply to you.

    13. #13 - Entaria

      I was more addressing the question that was brought up by Aquanina about why we don't have advanced AI yet, and... kind of lost sight of your original post XD

      Anyway, in my own personal view, I believe that if we do manage to create a human-like brain, then consciousness, or at least some form of it (potentially not quite the same as our own perception of consciousness), should logically follow. But as it is, we cannot create an AI with the complexity necessary to mimic the human brain, and so there is no room for consciousness. Whether or not we will ever achieve that level of technology is up for debate. I don't believe it will happen any time in the near future, but a couple hundred years from now... who knows? It's possible that by then, we'll know enough about how our own brains work to be able to create an artificial "human" brain, but I think the most we'll be able to see for a long time is computers capable of mimicking behaviour on a basis of algorithms and chance, not through conscious thought.

    14. #14 - A Roxxor

      Technically, all you need in order to simulate/create a system like the human brain, and thus consciousness, is a self-expanding system that can process input and generate output, composed of arbitrary segments of data that are able to assign some kind of relevance to one another independently. The problem is that, while this is relatively trivial to create, the actual processing of the input and the internal connections become so exaggerated that it is impossible to get the speed of a distributed system like the human brain; it's sort of the same problem that's run into while attempting to simulate a physical system at the atomic level. So most AI systems are actually elaborate workarounds, and thus somewhat limited, so that they can actually function. In the past decade there have been many projects started to create a supercomputer which would emulate the structure of the human brain, a few by IBM I believe... So it is possible we may see such computing systems become available in the next decade or so. Perhaps sooner.

    15. #15 - Arra

      I don't care if the thread goes off the topic of the original post. Anything related to the title would be fine. I'd be happy if the thread involved any engaging discussion, even if it isn't exactly about the intended topic. Talk about whatever you want. Consciousness and AI are big subjects and you can go off on a lot of tangents. It just seemed like some people read what I said but sort of absentmindedly ignored it, pretending I said something else and straw-manning. I've realized I'm relatively terrible at judging the intentions and motivations of other people, so this might not at all be what anyone intended. If it really wasn't clear what I was trying to say, or if the reply was some tangent on AI in general and no one intended to be replying to the OP, that's fine.

      I am really talking about consciousness in general. Forget about AI for a moment. It doesn't seem to me that consciousness should be able to arise from physical processes. To me, consciousness arising from physical processes is like trying to make a green house from red bricks. Insisting that consciousness can arise from very complicated brain processes is like saying that if you pile red bricks into an unimaginably intricate formation, you'll eventually get a green house. That's basically what I'm trying to say in the original post. In the last two paragraphs, I bring up a common response to my argument, the one I predict most people will give, and attempt to refute it.

      I think AI is a good topic in which the issue I've described naturally comes up. Is it possible for a robot, hypothetically, to experience conscious awareness once it became complicated enough? For example, is it possible for a robot to experience the feeling of sadness or color, in the subjective way we humans experience it? This naturally leads to questioning whether consciousness can be constructed from entirely physical processes.

    16. #16 - DuB

      As a side note, I think this should be merged with the original AI thread. I can't keep straight which comments have been made in which thread, which suggests they're basically the same thread.

      I'll think about a more substantive reply soon

    17. #17 - Entaria

      Ah, I think I see what you're getting at now, Dianeva. I have to agree, trying to figure out how the experience of consciousness arises from what is seemingly just a big complicated collection of chemical and electrical reactions is something that is hard to wrap your head around. I'm gonna have to jump on board with you and say I have no freaking clue. It doesn't seem like something that should be possible, but obviously it is. I could just default to my religious mode and say "God did it," but that doesn't make for very interesting conversation, does it? Besides, though I am religious, I do believe that we need to do our best to come to a scientific explanation before placing it on a higher power of some sort.

      But yeah, this is a tricky question: how does consciousness come about in the first place? I mean, we don't really quite know how a bundle of firing neurons translates into you remembering what you had for dinner last night, what it tasted like, smelled like, etc., other than possibly it's all due to association and neurons triggering taste receptors in the brain or some such... I don't know, it's a tricky question, especially because consciousness and feeling are so hard to describe concretely in the first place, which makes it difficult to say how they actually come to exist.

      I suppose emotions could be in part evolutionary. Because we are social creatures, and our survival depends (or at least used to depend) on being accepted by the group, emotions would help with that. Showing sympathy and helping, although taxing for the individual, helps the group survive as a whole. Becoming attached to another person promotes sympathy and caring, and feeling sadness or grief if a person dies would be a deterrent to neglecting a person you care for. Anger keeps people from doing something to you, for fear of retribution, and fear keeps you from doing something stupid... Emotions would have helped to guarantee survival and the passing on of genes, and those who didn't experience emotion to the same extent may have been rejected by the group for acting in a manner that wasn't acceptable.

      Though this still doesn't quite resolve the issue of consciousness, maybe it at least resolves the issue of the emotional aspect of consciousness? Perhaps it's possible that the development of basic emotions resulted in consciousness evolving at the same time, as in my mind, emotions would be impossible without basic consciousness. Any thoughts? I won't be offended if you tear this thinking to bits, I'm mostly just coming up with it on the spot XD

    18. #18 - A Roxxor

      No, it's actually like saying a large group of people trading labor for capital, capital for goods and services, and vice versa would create an economy, then describing the economy as the result of the transfer of these things between people on a large scale. If you were to observe a single transaction, it would not necessarily behave like the system does overall, because the economy is the result of billions of those transactions, among other things.

      A better analogy would probably be that while a vast and dynamic array of billions of switches arranged in a given architecture with some basic method of changing the position of said switches may behave in the fashion of what we would describe as a computer, binary switches do not intrinsically possess these qualities. Consciousness and the brain can be explained in the same way. All we need is time to actually demonstrate this. Until then it is basically just talk :/
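      To make the switch analogy concrete, here's a made-up Python sketch: a one-bit adder built from nothing but a single kind of binary "switch" (NAND). No individual switch can add, but the arrangement as a whole does:

      Code:
          # Toy sketch: nothing here but one kind of binary "switch" (NAND),
          # yet wiring enough of them together yields addition.
          def nand(a, b):
              return 0 if (a and b) else 1

          def xor(a, b):
              c = nand(a, b)
              return nand(nand(a, c), nand(b, c))

          def full_adder(a, b, carry_in):
              s1 = xor(a, b)
              total = xor(s1, carry_in)
              carry_out = nand(nand(a, b), nand(s1, carry_in))
              return total, carry_out

          print(full_adder(1, 1, 1))   # -> (1, 1): 1 + 1 + 1 = 3 = binary 11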

      Quote Originally Posted by Dianeva
      I think AI is a good topic in which the issue I've described naturally comes up. Is it possible for a robot, hypothetically, to experience conscious awareness once it became complicated enough? For example, is it possible for a robot to experience the feeling of sadness or color, in the subjective way we humans experience it? This naturally leads to questioning whether consciousness can be constructed from entirely physical processes.
      Well, what exactly does feeling sadness, or seeing colour entail? What makes it likely that it is NOT your brain that is the cause of your consciousness? What makes a brain different from something that emulates a brain, but is made of transistors rather than neurons?

      As far as anything has shown, feelings and sight are all senses that are processed by the brain, so why wouldn't consciousness occur in a perfect emulation of a brain?

    19. #19 - Xaqaria

      Quote Originally Posted by Aquanina View Post
      I don't understand why science has not yet been able to simulate consciousness, or create AI. If all a brain is, is just chemical processes and links, basic algorithms...then why hasn't science been able to replicate this with a computer? Are the links and processes in the brain too complex for science to replicate with present technology?
      The main issue from a hardware standpoint is that our computers utilize binary logic gates (this or that) while neurons fire in much more complex structures (if this then that and that and that and...), and that only covers the electrical aspect of how neurons communicate. There are still more levels of communication within the chemical transmissions: the electrical signals also release chemical transmissions, and the chemical transmissions cause electrical signals to fire...
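      As a rough illustration of that contrast (a made-up sketch, not a faithful model of either hardware or neurons): a logic gate answers instantly from two inputs and holds no state, while even a crude "integrate-and-fire" cartoon of a neuron accumulates many weighted inputs over time, leaks, and only fires when a threshold is crossed; and this still ignores all of the chemistry.

      Code:
          # Crude cartoon of an "integrate-and-fire" neuron vs. a stateless logic gate.
          def and_gate(a, b):
              return a and b                          # answers instantly, two inputs, no state

          def integrate_and_fire(input_spikes, weights, threshold=1.0, leak=0.9):
              potential = 0.0
              fired_at = []
              for t, spikes in enumerate(input_spikes):            # spikes: one 0/1 per input line
                  potential = potential * leak + sum(s * w for s, w in zip(spikes, weights))
                  if potential >= threshold:                       # membrane potential crosses threshold
                      fired_at.append(t)
                      potential = 0.0                              # reset after firing
              return fired_at

          spikes = [(1, 0, 0), (0, 1, 0), (1, 1, 1), (0, 0, 0)]
          print(integrate_and_fire(spikes, weights=(0.4, 0.3, 0.5)))   # -> [2]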


    20. #20 - tommo

      I still don't understand how I didn't answer your question. I thought I was pretty straightforward.

      Consciousness HAS to arise from the brain. Where else would it come from? It is absurd to think that consciousness cannot arise from the brain if you are an atheist. There is nothing else for it to arise from.

      When we replicate the human brain exactly, how could you think that consciousness will not come with that? Yes, we cannot build a green house from red bricks. But if you add in some yellow and blue bricks, there will be the illusion of being green.

      It is just a trick of the brain and the way it is wired. Consciousness just forces us to stay alive because we perceive ourselves as.... ourselves.

      Quote Originally Posted by Dianeva View Post
      Gets in the way of what? If we didn't have consciousness, we wouldn't be aware and wouldn't place value on anything. There would be no feeling of anything being valuable, no state would be better than any other state, there would be nothing to get in the way of.
      We think we are separate from other things. Which makes us want to keep surviving. Putting ourselves above other things. The only thing we value because of consciousness is our own lives and those of our close relatives, because they share our genes. This is getting off the point though, and I don't want to convince you of this because you don't understand the consciousness issue in the first place. If you begin to understand the issue you raised, maybe you will see what I'm talking about regarding consciousness being a bother.

    21. #21 - bust113

      I didn't care to read the thread, because I am super tired right now, but this is my take on this.

      As you said, AI is only an algorithm, made up of conditional statements, loops, etcetera. But the human brain does quite the same thing: it puts all of the information through a very complex algorithm of conditions and loops. The brain IS a computer. A computer does not mean circuit boards and crazy wires; it simply means a particular object that processes information. Because of this, if you code an AI properly, you CAN in fact achieve consciousness. The human brain does not truly have complete consciousness, but these complex lines of code in our brains create this illusion. And it is possible to move consciousness to a machine, we just need to figure out how.

      That said, as consciousness can be moved to another machine (and after that speech I made, you won't believe that I believe in "spirits"), when our consciousness is transferred over, our actual spirit does not exist through the new entity. The new entity works like you, thinks like you, does everything like you, but it is not YOU. Our point of view, created by our spirit, only exists through the entity it was created from (so singularity will not be something we want to do; even though it is the person, you will just die and there will be a cloned robot of you acting like you). When that entity dies, the spirit does not take the perspective of the new entity; it just dies, and whatever death ideal is true will happen to it (Heaven, Hell, simple non-existence, my Homelanders theory, etcetera). And the spirit is what separates organic consciousness from AI consciousness.

    22. #22 - Xei

      Thanks for further explaining yourself, Dianeva.

      What we're actually dealing with here are fundamental unresolved questions of philosophy of mind, so it's not surprising you can't find any fallacy causing the contradiction. Consciousness does seem paradoxical.

      Going back to the water analogy (which I think is the single distinct argument you've been talking about so far, correct me if I'm wrong), what you're saying is:

      1. With water we understand step by step how the phenomenon of wetness arises. With consciousness we have no idea how to produce such an argument, or even how such an argument could work in general.

      Roxxor objects that this is based on limited knowledge and so is not some kind of fundamental problem. However I do think there is a point above. At least with wetness, before the knowledge of atoms and chemistry, we may have suspected that there was such a reduction. At any rate we do not perceive some kind of conceptual block. However with consciousness it seems that no reductionist argument would ever suffice.

      I have a couple of other arguments to add to the thread.

      2. All other holistic, emergent phenomena can be explained by saying they are real as concepts in our minds, but do not have any kind of fundamental physical existence. Wetness is clearly 'real' to us and it is very useful to approach the world in a holistic fashion, but the universe does not need any holistic knowledge in order to operate properly; it all just falls out automatically from the bottom layer.

      However, there is a fundamental problem when you try to take this holistic approach with consciousness: it forms a strange loop, a cyclical logical loop that we would call paradoxical. How can it be the case that the mind only exists as a holistic phenomenon, when we need a mind in the first place for holistic phenomena to 'exist'?

      3. Consciousness is singular. Wetness can be added to more wetness and what you get is still just wetness. However we feel fairly certain that minds are single entities that can't be glooped together in this fashion. It can be argued that this doesn't apply if we use a different emergent analogy, for example the emergent analogy of a ball bearing. However, ball bearings can be swapped, duplicated, etcetera, without any philosophical problems (using the above conception of holistic existence), whereas doing this to brains is a philosophical can of worms. If you delete a brain and then make an exact copy somewhere else, has the consciousness teleported? Well, why not, all that matters is the brain as a holistic entity. But then, what if you made two copies? Where has the consciousness teleported to? By symmetry it can't be either, and it looks like we have a contradiction.

      Quote Originally Posted by tommo View Post
      I still don't understand how I didn't answer your question. I thought I was pretty straight forward.

      Consciousness HAS to arise from the brain. Where else would it come from?
      It is absurd to think that consciousness cannot arise from the brain if you are an Atheist.
      There is nothing else for it to arise from.

      When we replicate the human brain exactly, how could you think that consciousness will not come with that?
      Yes, we cannot build a green house from red bricks. But if you add in some yellow and blue bricks, there will be the
      illusion of being green.
      This is missing the point of the thread. Dianeva has explained herself quite well now. Obviously she knows that consciousness arises from neural processes, so telling her "what else could it arise from" achieves nothing. What is of interest here is what is wrong with the argument that consciousness can't arise from the brain. So far your only answer has been this:

      It is just a trick of the brain and the way it is wired.
      Which is, to put it bluntly, pretty rubbish.

    23. #23 - Arra

      @Entaria
      I agree with just about everything you said. An evolutionary advantage for emotion, morality, etc. is easy to come up with when you consider that our ancestors often had to work together in a society to survive. But yeah, it doesn't solve the problem of consciousness itself: the difference between a brain processing the color red and reacting to it in appropriate ways, and the subjective experience of the color red. It would be impossible to describe to a red-green colorblind person what the color red looks like to us.

      Quote Originally Posted by A Roxxor View Post
      A better analogy would probably be that while a vast and dynamic array of billions of switches arranged in a given architecture with some basic method of changing the position of said switches may behave in the fashion of what we would describe as a computer, binary switches do not intrinsically possess these qualities. Consciousness and the brain can be explained in the same way. All we need is time to actually demonstrate this. Until then it is basically just talk :/
      But if someone were to take the time, in theory, he could create a logical path from the low-level behavior of the switches to the higher-level behavior of the computer. The switches turning on cause other switches to turn on, and eventually there's a pattern of switches being turned on and off which might be labeled a running process. That's all higher-level behavior or objects are, I think. The grouping/organizing of lower-level behavior or objects so that a situation is easier to conceptualize. For example, in reality, I'm a bunch of molecules or energy or whatever the base component of matter is. But our brains group large amounts of similar molecules together, so that instead of clusters of molecules, I'm now a heart, lungs, skin, etc. And those can be further grouped and be called a human, and those can be grouped and called a population, etc. There's no mystery as to how this happens. It's simply the grouping of lower-level things because it's easier to think of them on a higher level. There are no magical, extra things which emerge on the higher-level that weren't there already.

      I'm guessing you would agree with that last paragraph, but you think consciousness is just another example of lower-level processes being described at a higher level. But it seems to me it might not be. For every lower-level process or object, you can logically deduce, hypothetically, how the higher-level process or object arises. For example, if you have some bricks, you can insist that there is no 'shelter' in any of the bricks on their own. But you can deduce that, if you pile the bricks on top of one another in a certain way, you can create an enclosed area with them, which a person could go inside and protect themselves from the elements, and you define this as shelter. So shelter is a higher-level way of looking at the cluster of bricks. With consciousness, you can't do this. You couldn't explain to a color-blind person what a color looks like, even if you knew exactly how every component of the brain worked while one is perceiving a color. With a house, you can explain how the shelter arises from the basic components. The fact that you can't explain a conscious experience, but can explain other high-level things in terms of their basic components, highlights the difference I've been trying to get at.


      (I haven't read Xei's post yet, but it seems well written so I'm going to take a walk and get rid of this immense headache so I can focus my full attention on it.)

    24. #24 - A Roxxor

      Quote Originally Posted by Dianeva
      There are no magical, extra things which emerge on the higher-level that weren't there already.
      Exactly.



    25. #25 - Arra

      Thanks Xei. I'm glad it's clear now what I'm trying to argue.

      Quote Originally Posted by Xei View Post
      How can it be the case that the mind only exists as a holistic phenomenon, when we need a mind in the first place for holistic phenomena to 'exist'?
      Great point. While thinking about it more, I kept running into this problem. When considering things on a higher level, it takes consciousness for the higher-level idea to be formed. So how can consciousness be one of those higher-level ideas itself? This might be a bit different from what you're saying, but similar. (It might be compared to the argument that a god must have created life since life creates everything else that seems complex).

      @tommo
      I think I must have gone wrong in my argument somewhere, but don't understand how I have. Take this situation as an analogy:
      Spoiler for Geometry Paradox:

      It seems that you're telling me to ignore the gap, that it's illogical to think there's a gap there. But it's natural for someone presented with this situation to wonder where they've logically gone wrong. You don't doubt the truth of the geometrical facts you've dealt with all your life. You instead assume you must be going wrong with your reasoning somewhere, and strive to figure out where, just as I agree that I probably am making a reasoning error with the consciousness argument, but just don't know what it is yet.
