I started writing this as a response to the AI thread in Science & Mathematics, but have realized it's more a philosophical topic.
|
If a program that simulates the brain (which is certainly possible, since computers can simulate any causal physical situation) is 'just conditional statements and loops' as you put it, then surely a brain in nature also is just conditionals and loops?
|
I don't understand why science has not yet been able to simulate consciousness, or create AI. If all a brain is, is chemical processes and links, basic algorithms... then why hasn't science been able to replicate this with a computer? Are the links and processes in the brain too complex to replicate with present technology?
|
Well, there are around 1 quadrillion synapses in the brain; we haven't made neural networks that large yet. We also haven't mapped every neuron and synapse in the brain, so even if we had a network that large, we wouldn't know how to connect it.
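The scale alone is striking. As a rough back-of-envelope illustration (my own arithmetic, not from the post): even if each synapse were reduced to a single 32-bit weight, just storing the connectivity would take petabytes:

```python
# Back-of-envelope estimate (illustrative only): how much memory would
# storing one 32-bit weight per synapse require for a full-brain network?
SYNAPSES = 10**15        # ~1 quadrillion synapses (rough figure)
BYTES_PER_WEIGHT = 4     # one 32-bit float per connection

total_bytes = SYNAPSES * BYTES_PER_WEIGHT
petabytes = total_bytes / 10**15
print(f"{petabytes:.0f} PB just for the connection weights")  # → 4 PB
```

And that ignores the mapping problem entirely: knowing how much storage you need says nothing about which neuron connects to which.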
|
To answer the question of why we have not been able to create an AI yet:
|
This is true, but I don't think most people really understand or appreciate just how wide the gap between meaty brains and simulated brains still is. The typical artificial neural networks employed in modern computational neuroscience research are absurdly, laughably oversimplified. They "simulate" brains to about the same degree that paper airplanes "simulate" supersonic jets. And it's not simply a matter of network size or number of connections: the individual nodes themselves, the very foundations of the networks, do only a pale shadow of what actual neurons do. For example, nodes typically pass along "bits" of information in a binary fashion, unlike actual neurons, which almost certainly transmit richer information encoded in the precise temporal patterns of their firings. What's more, we don't even have a thorough understanding of how actual neural cells and synapses work; we continue to learn very basic things about how they communicate with one another (recent example). In other words, the biological models most researchers are referring to when they design these nodes are just wrong.
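To make concrete just how simple these nodes are, here is a toy sketch of my own (in the spirit of the classic McCulloch-Pitts threshold unit, not any particular research model): the node collapses all its inputs into a single 0-or-1 output, with no notion of spike timing at all.

```python
# A McCulloch-Pitts-style threshold unit: the kind of drastically
# simplified "neuron" the post describes. Everything it "knows" is
# reduced to one binary output per step.
def threshold_node(inputs, weights, threshold):
    """Fire (return 1) iff the weighted input sum reaches the threshold."""
    activation = sum(x * w for x, w in zip(inputs, weights))
    return 1 if activation >= threshold else 0

# Two binary inputs with equal weights: behaves like a logical AND gate.
print(threshold_node([1, 1], [0.6, 0.6], 1.0))  # → 1 (fires)
print(threshold_node([1, 0], [0.6, 0.6], 1.0))  # → 0 (silent)
```

Compare that to a real neuron, whose output is a temporal pattern of spikes shaped by thousands of chemically mediated synapses, and the paper-airplane analogy starts to look generous.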
|
Essentially:
|
We can take it apart and examine it. Just in really, really small bits, or slowly. I can't find an article you don't have to pay for, but basically some scientists have replicated something like less than a square millimetre of rat brain tissue. There are also fMRI scans, which show us which parts of the brain are used for different things.
|
It would be illogical to pretend that I didn't see the problem with consciousness I brought up in the OP. I see a genuine problem with consciousness arising from physical processes, and therefore with the ability of AI to become conscious. I might be making a logical error, but it isn't illogical for me to wonder what that error is (it seems like that's what you're saying; sorry if you aren't, but I need to say this lest you accuse me again of arguing against unmade claims). I'm not going to believe something and ignore a potential logical flaw just because it's consistent with beliefs I already hold. But I do think it's likely I am making some logical error in my reasoning in the OP. I just want to know what it is. No one seems to be addressing it, though.
|
Your original post isn't very clear. Are you saying that consciousness is impossible for AI solely in the context of AI programs running on computers (or perhaps general mechanical devices, though that's vague, and it's not clear why the brain shouldn't be included), or are you saying that consciousness seems impossible in all circumstances? The first half of your post is very computer-centric, but the second half seems more general.
|
No, not that. Maybe people thought they were addressing your argument because you didn't explain it well. Don't be so ungrateful; people are just here to discuss. They generally do it in a pretty polite manner, and they generally put a lot of thought and time into trying to reply to you honestly.
|
I was more addressing the question that was brought up by Aquanina about why we don't have advanced AI yet, and... kind of lost sight of your original post XD
|
Technically, all you need to simulate or create a system like the human brain (and thus consciousness) is a self-expanding system that can process input and generate output composed of arbitrary segments of data, segments that are able to assign some kind of relevance to one another independently. The problem is that while such a system is relatively trivial to create, the processing of the input and the internal connections become so extensive that it's impossible to match the speed of a massively parallel system like the human brain; these are much the same problems you run into when attempting to simulate a physical system at the atomic level. So most AI systems are deliberately simplified so that they can actually function. In the past decade there have been many projects started to create a supercomputer that would emulate the structure of the human brain, a few by IBM I believe... So it's possible we may see such computing systems become available in the next decade or so. Perhaps sooner.
|
I don't care if the thread goes off the topic of the original post. Anything related to the title would be fine. I'd be happy if the thread involved any engaging discussion, even if it isn't exactly about the intended topic. Talk about whatever you want; consciousness and AI are big subjects, and you can go off on a lot of tangents. It just seemed like some people read what I said but sort of absentmindedly ignored it, pretending I said something else and straw-manning. I've realized I'm relatively terrible at judging the intentions and motivations of other people, so this might not at all be what anyone intended. If it really wasn't clear what I was trying to say, or if the reply was some tangent on AI in general and no one intended to be replying to the OP, that's fine.
|
As a side note, I think this should be merged with the original AI thread. I can't keep straight which comments have been made in which thread, which suggests they're basically the same thread.
|
Ah, I think I see what you're getting at now, Dianeva. I have to agree: trying to figure out how the experience of consciousness arises from what is seemingly just a big complicated collection of chemical and electrical reactions is something that's hard to wrap your head around. I'm gonna have to jump on board with you and say I have no freaking clue. It doesn't seem like something that should be possible, but obviously it is. I could just default to my religious mode and say "God did it", but that doesn't make for very interesting conversation, does it? Besides, though I am religious, I do believe that we need to do our best to come to a scientific explanation before placing it on a higher power of some sort.

But yeah, this is a tricky question: how does consciousness come about in the first place? I mean, we don't really quite know how a bundle of firing neurons translates into you remembering what you had for dinner last night, what it tasted like, smelled like, etc., other than that it's possibly all due to association and neurons triggering taste receptors in the brain or some such... It's a tricky question, especially because consciousness and feeling are so hard to describe concretely in the first place, which makes it difficult to explain how they actually come to exist.

I suppose emotions could be in part evolutionary. Because we are social creatures, and our survival depends (or at least used to depend) on being accepted by the group, emotions would help with that. Showing sympathy and helping, although taxing for the individual, helps the group survive as a whole. Becoming attached to another person promotes sympathy and caring, and feeling sadness or grief if a person dies would be a deterrent to neglecting a person you care for. Anger keeps people from doing something to you, for fear of retribution, and fear keeps you from doing something stupid. Emotions would have helped to guarantee survival and the passing on of genes, and those who didn't experience emotion to the same extent may have been rejected by the group for acting in an unacceptable manner.

Though this still doesn't quite resolve the issue of consciousness, maybe it at least resolves the issue of the emotional aspect of consciousness? Perhaps it's possible that the development of basic emotions resulted in consciousness evolving at the same time, since, to my mind, emotions would be impossible without basic consciousness. Any thoughts? I won't be offended if you tear this thinking to bits; I'm mostly just coming up with it on the spot XD
|
No, it's actually like saying a large group of people trading labor for capital, capital for goods and services, and vice versa would create an economy, then describing the economy as the result of the transfer of these things between people on a large scale. If you were to observe a single transaction, it would not necessarily behave like the system does overall, because the economy is the result of billions of those transactions, among other things.
|
The main issue from a hardware standpoint is that our computers use binary logic gates (this or that) while neurons fire in much more complex patterns (if this, then that and that and that and...), and that only covers the electrical aspect of how neurons communicate. There are still more levels of communication within the chemical transmissions: the electrical signals release chemical transmitters, and the chemical transmitters in turn cause electrical signals to fire...
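To illustrate the contrast in a toy sketch of my own (not a biological model): a logic gate maps bits to a single bit, while even a crude rate-based "neuron" integrates many graded, weighted inputs into a graded output.

```python
import math

def and_gate(a, b):
    """Binary logic gate: output is strictly 0 or 1."""
    return a & b

def rate_neuron(inputs, weights, bias=-1.0):
    """Graded unit: weighted input sum squashed into a firing rate in (0, 1)."""
    drive = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-drive))  # logistic squashing

print(and_gate(1, 0))                                     # → 0
print(round(rate_neuron([0.2, 0.9, 0.5], [1.0, 2.0, 0.5]), 2))  # → 0.78
```

Even this graded unit still ignores everything the post mentions about chemical transmission; it only hints at how much richer than a logic gate a single neuron's input-output behavior is.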
|
I still don't understand how I didn't answer your question. I thought I was pretty straightforward.
|
I didn't care to read the thread, because I am super tired right now, but this is my take on this.
|
Thanks for further explaining yourself, Dianeva.
|
@Entaria

Thanks Xei. I'm glad it's clear now what I'm trying to argue.
|