    1. #1
      Xei
      UnitedKingdom Xei is offline
      Banned
      Join Date
      Aug 2005
      Posts
      9,984
      Likes
      3084

      Tell me about theoretical computer science

      Hey,

      Over summer I'd quite like to learn about how computers work, which I consider to be a large lacuna in my scientific knowledge.

      I'm really interested in the fundamental stuff: algorithms, Turing machines, all that jazz. I'd like to have some general idea of how a computer works on all levels: how transistors and binary work, how this relates to programming, etcetera. I'd like to be able to do some basic programming, too.

      I'm not really sure what this is all branched under (theoretical computer science was a bit of a guess), but if you could point me in the direction of some good resources, that'd be awesome. :V

      Cheers.

    2. #2
      Member Imaginer1's Avatar
      Join Date
      Jun 2010
      LD Count
      1
      Gender
      Posts
      44
      Likes
      3
      DJ Entries
      2
      OHHHH
      I can tell you a lot about this. Get ready for a wall of text!

      Basics
      Okay, to start, computers aren't as smart as they're cracked up to be, at least compared to you. One of the first models of a computer, called the Turing machine, was thought up by Alan Turing. The main idea is that there is a 'tape' that holds information, and a 'head' that edits that information according to specific rules. For example, a rule might be something along the lines of "If the head reads a 1 on the space where it is located, print a 0 on that space and move to the space on the right". You can do simple unary addition and detect palindromes with this.
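
      If it helps, here's a toy version in Python (my own sketch, nothing official): a rule table maps (state, symbol read) to (symbol to write, head move, next state), just like the rule above.

      Code:
      # A minimal Turing machine simulator (illustrative sketch).
      # This hypothetical rule table flips every bit on the tape,
      # then halts when the head reaches a blank cell.
      def run_turing_machine(tape, rules, state="start", blank=" "):
          tape = list(tape)
          head = 0
          while state != "halt":
              symbol = tape[head] if head < len(tape) else blank
              new_symbol, move, state = rules[(state, symbol)]
              if head < len(tape):
                  tape[head] = new_symbol
              else:
                  tape.append(new_symbol)
              head += 1 if move == "R" else -1
          return "".join(tape)

      # (state, symbol read) -> (symbol to write, head move, next state)
      rules = {
          ("start", "0"): ("1", "R", "start"),
          ("start", "1"): ("0", "R", "start"),
          ("start", " "): (" ", "R", "halt"),
      }

      print(run_turing_machine("1011", rules))  # prints "0100 " (trailing blank cell)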

      Today
      Today we have computers that do simple logic operations very, very fast. One of the core things in processing is the "bit", or "binary digit": the basic unit of information in computers, the well-known 1 or 0. Computers can hold information, but all they hold is tons and tons of 1's and 0's. They don't store "Hello"; they convert each letter into a sequence of 1's and 0's and store that.

      Binary is actually a number system in which you can write any number. Start with all 0's: [000]. To count, turn the digit on the far right into a 1: [001].
      But now we're stuck: with binary, that digit can't go any higher. So we turn that 1 back into a 0 and make the digit to the left of it a 1: [010].

      If that's hard to follow, think of it this way: reading from the right, the first digit stands for 1, the second digit for 2, the third digit for 4, continuing to double. For example, 5 is 101, or 4 + 1.
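
      You can check this doubling rule with a couple of lines of Python (just a sketch; Python's built-in bin() does the same job):

      Code:
      # Reading from the right, the digits are worth 1, 2, 4, 8, ...
      bits = "101"
      value = sum(int(b) * 2**i for i, b in enumerate(reversed(bits)))
      print(value)   # 5
      print(bin(5))  # '0b101', Python agrees
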
      If you want to learn more about bits, I recommend going to your local library, but these Wikipedia articles are reliable:
      http://en.wikipedia.org/wiki/Units_of_information
      http://en.wikipedia.org/wiki/Bit
      http://en.wikipedia.org/wiki/Binary_numeral_system

      How the Computer Thinks

      Really, your computer doesn't think; it takes inputs and turns them into outputs according to fixed rules. One of the big things in computers is Boolean logic.
      It's another 1's-and-0's thing: sometimes 1 means "true" and 0 means "false", and sometimes they're just digits representing a number. Complex functions can be built by wiring together 'logic gates', which perform simple operations on one or more binary digits.

      NOT - this inverts whatever goes in. If a 1 goes in, a 0 comes out, and vice versa.
      OR - this takes two inputs and checks whether at least one of them is a 1. If so, a 1 comes out.
      XOR - this stands for "exclusive or". It outputs a 1 if EXACTLY ONE of the inputs is a 1.
      AND - this outputs a 1 only if both inputs are 1.
      If you want to invert the output of a gate, add an N to its name to make NOR, XNOR, or NAND...
      and there are more. I don't know if my explanation can be understood, so check out these sites (there's also a small sketch after the links):

      http://computer.howstuffworks.com/boolean.htm
      http://joshblog.net/projects/logic-g...r/Logicly.html - this lets you actually play around with logic gates. A clock goes 1, 0, 1, 0.
      http://isweb.redwoods.cc.ca.us/instr...ogic/index.htm - learn how to add binary numbers together with logic in "Half Adder" and "Full Adder"
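
      Here's the sketch I mentioned: the gates written as tiny Python functions, plus the half adder from that last link. The function names are my own; it's just for playing around.

      Code:
      # Logic gates as tiny functions on 0/1 values.
      def NOT(a):     return 1 - a
      def AND(a, b):  return a & b
      def OR(a, b):   return a | b
      def XOR(a, b):  return a ^ b
      def NAND(a, b): return NOT(AND(a, b))

      # A half adder: XOR gives the sum bit, AND gives the carry bit.
      def half_adder(a, b):
          return XOR(a, b), AND(a, b)

      for a in (0, 1):
          for b in (0, 1):
              s, c = half_adder(a, b)
              print(a, "+", b, "-> sum", s, "carry", c)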

      INTO THE FUTURE!
      This is the last thing I'll talk about- it's a little hard to understand.

      We're thinking about making "quantum computers" by taking advantage of a very weird law of physics. Some events are COMPLETELY random. I don't mean like flipping a coin, where in principle you could take the laws of physics and number-crunch really hard to predict the outcome; I mean like sending a photon at a half-silvered mirror, which it has a 50% chance of passing through. Here's the fun part: as long as nothing observes the photon, both outcomes happen at once (some people describe this as the universe splitting in two). The photon has passed through the mirror and reflected off it AT THE SAME TIME! This is called "quantum superposition".

      So here's where we're going with this: say a calculation on a computer takes 1 second, and you want to run it on every number between 1 and 1200. That would take 1200 seconds (20 minutes) for a normal computer, doing one at a time. But a quantum computer could in principle tackle them all at once, because its string of quantum bits can be in every possible bit combination at the same time.
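
      If you want a (very rough) feel for superposition, you can simulate a single quantum bit's state vector with NumPy. This is only a toy illustration of the math, not how real quantum computers are programmed, and treating the half-silvered mirror as a Hadamard gate is my own simplification.

      Code:
      import numpy as np

      # A qubit's state is two amplitudes; their squared magnitudes are
      # the measurement probabilities. The Hadamard gate turns |0> into
      # an equal superposition of |0> and |1>.
      ket0 = np.array([1.0, 0.0])
      H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

      state = H @ ket0
      print(state)             # approx [0.707 0.707]
      print(np.abs(state)**2)  # [0.5 0.5], the 50/50 photon outcome
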
      Since this is a really mathy subject, I think you wouldn't want to enter it.

      I hope this gave you some information- thank you for letting my nerd heart run free.
      LD count: 1

      Lucid Dreaming Goals:
      1. Consistently remember multiple dreams each night [O]
      2. Become lucid [O]
      3. Fly [ ]
      4. Master lucid dreaming (at least 2 lucid dreams per night) [ ]

    3. #3
      DuB
      DuB is offline
      Distinct among snowflakes DuB's Avatar
      Join Date
      Sep 2005
      Gender
      Posts
      2,399
      Likes
      362
      Well, as I see it, the area can be (very) roughly divided into three levels: the hardware level, the software level, and the networking level.

      I read an excellent text on networking in high school that I recommend to anyone interested in the topic: Computer Networking: A Top-Down Approach Featuring the Internet. I read the 2000 edition, and if anything this 2009 edition that I linked to is probably better. I recall it as being comprehensive, clear, and thoughtfully arranged.

      What I've called the software level is where a lot of the fun is. (Although don't get me wrong, networking is a lot of fun as well, and once upon a time I intended to make it my career specialty.) You mentioned that you'd like to become acquainted with basic programming. Well, I think if you want to get the most out of your study in this field, you're going to want a pretty authoritative grasp of at least one language, and preferably some familiarity with a few others as well. It's a pretty essential tool, and most of the more theoretical aspects of CS (algorithms and whatnot) are going to be described or illustrated in terms of programming languages. To start off, I recommend C++ or Java. I've used both languages, but I learned them using primarily course materials, so I can't recommend any specific texts. More on that below.

      I was always least interested in the hardware aspect so I'll have the least to say here. I tended to pick up what I knew here or there, usually on the Web, so I don't have experience with any of the texts. An Amazon search pulled up this, Principles of Computer Hardware, which I think looks quite good, but I can't personally vouch for it.

      Given your academic and career interests, you should consider adding Computer Science as a second major or at least a minor. Learning a programming language is a practice-driven affair, not an insight-based one, and an instructor can really be invaluable in showing you how and what to practice. It also seems like a pretty useful formal certification to have when pursuing a job in any field with "computational" in the name. Anyway, in my experience the first course is mainly focused on getting students up to speed on a certain language (and weeding out the ones who just don't have the right stuff), but beyond that first course you pretty quickly get into various sorting/searching algorithms and all that jazz. Give it some thought.

    4. #4
      Xei
      UnitedKingdom Xei is offline
      Banned
      Join Date
      Aug 2005
      Posts
      9,984
      Likes
      3084
      Quote Originally Posted by Imaginer1 View Post
      Since this is a really mathy subject, I think you wouldn't want to enter it.
      I'm reading mathematics at the University of Cambridge so actually I'm pretty well equipped to cope with it, haha!

      Thanks for the info.
      Quote Originally Posted by DuB View Post
      Given your academic and career interests, you should consider adding Computer Science as a second major or at least a minor. Learning a programming language is a practice-driven affair, not an insight-based one, and an instructor can really be invaluable in showing you how and what to practice. It also seems like a pretty useful formal certification to have when pursuing a job in any field with "computational" in the name. Anyway, in my experience the first course is mainly focused on getting students up to speed on a certain language (and weeding out the ones who just don't have the right stuff), but beyond that first course you pretty quickly get into various sorting/searching algorithms and all that jazz. Give it some thought.
      Well, my interest is partly spurred by a computer project which is part of my maths course, actually. They put a lot of emphasis on the importance of programming for modern mathematics, although we're kinda left on our own. After my undergraduate degree I'm planning on doing a PhD in computational neuroscience where they teach you the basics of programming, so I don't really need to worry about 'minoring' in it (plus that doesn't really work in the UK education system, I don't think...).

      The language we're using right now is MATLAB, which I may stick with for now, because I was reading about programming neural networks, and it turns out MATLAB is the standard language for doing so.

      What I'm most interested in is a bottom-up approach, so that I can understand what the things I'm doing really mean. At the moment it's too much like magic.

    5. #5
      DuB
      DuB is offline
      Distinct among snowflakes DuB's Avatar
      Join Date
      Sep 2005
      Gender
      Posts
      2,399
      Likes
      362
      Quote Originally Posted by Xei View Post
      After my undergraduate degree I'm planning on doing a PhD in computational neuroscience where they teach you the basics of programming, so I don't really need to worry about 'minoring' in it
      I don't know, sounds unusual to be covering such basic stuff in a PhD curriculum. We have a computational neuroscience program in my department and the sense that I've gotten from talking to some of the grad students in that area is that they're expected to have some programming facility when they enter the program. But I haven't quizzed them too hard about it. In any case, it seems likely such programs would be pretty heavily biased in favor of accepting students with prior programming experience, even if it's not a formal requirement per se (although it may be).

      Quote Originally Posted by Xei View Post
      What I'm most interested in is a bottom-up approach so that I can understand what I'm doing really means.
      In that case it sounds like the second book I linked to may be more up your alley. The title makes it come off as a straightforward book about hardware, but you can tell from browsing the table of contents that it makes plenty of connections to the higher levels as well. But like I said, I haven't read it.

    6. #6
      Member Imaginer1's Avatar
      Join Date
      Jun 2010
      LD Count
      1
      Gender
      Posts
      44
      Likes
      3
      DJ Entries
      2
      Quote Originally Posted by Xei View Post
      I'm reading mathematics at the University of Cambridge so actually I'm pretty well equipped to cope with it, haha!
      Ok, to tell the truth, I don't understand the math of it. But I will tell you what else is in the future...

      Neural Networks.

      I'm too tired to type a whole other wall of text, so I'll just give you a good link:
      http://www.ai-junkie.com/ann/evolved/nnt1.html
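
      As a taste of what's in that tutorial, here's a single artificial neuron in Python (a sketch with made-up weights): a weighted sum of the inputs, squashed through a sigmoid.

      Code:
      import math

      # One artificial neuron: weighted sum of inputs plus a bias,
      # passed through the logistic sigmoid. Weights are arbitrary.
      def neuron(inputs, weights, bias):
          activation = sum(x * w for x, w in zip(inputs, weights)) + bias
          return 1 / (1 + math.exp(-activation))

      print(neuron([1.0, 0.5], [0.8, -0.4], bias=0.1))  # ~0.668
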
      LD count: 1

      Lucid Dreaming Goals:
      1. Consistently remember multiple dreams each night [O]
      2. Become lucid [O]
      3. Fly [ ]
      4. Master lucid dreaming (at least 2 lucid dreams per night) [ ]

    7. #7
      Member illidan's Avatar
      Join Date
      Jul 2007
      Gender
      Location
      2046
      Posts
      135
      Likes
      2
      Transistors and pretty much the whole electronics part of computers are not really part of theoretical computer science; they're part of computer engineering.

      As for theoretical computer science, some of the most important topics are (roughly sorted by difficulty):
      • formal logic
      • automata theory
      • formal languages
      • computability theory
      • complexity theory
      • algorithm theory
      • formal semantics
      • graph theory
      • cryptography
      • quantum computing


      (I apologize if I missed someone's favourite subject here.)

      The field is quite complex and many-sided. It doesn't really teach you how to program, though. There are many programmers out there who are purely practice-oriented and have absolutely no clue about theoretical computer science. Don't get me wrong, it's a fascinating field, but it's probably a good idea to learn programming first before you tackle any of the heavy theory. I'm saying this because most people already know how to program when they begin to study theoretical computer science. It's not that they wouldn't understand the theory otherwise, but that it wouldn't be of much use to them.

      If you don't just want to learn how to program but also want to understand how computers work, you probably want to learn the basics of computer science and computer engineering. If you're really interested in a bottom-up approach (as in from-the-very-bottom-up), you start out in physics. Here's my attempt to outline what such an approach might look like:
      • All about electricity, power, current, resistance, capacity, basic electric circuits etc.
      • Materials and conductivity: The properties of insulators, conductors and semiconductors.
      • How to build electrical elements using semiconductors. All about diodes, transistors etc.
      • How to perform boolean operations using transistors, i.e. how to build boolean gates.
      • How the binary system works. All about boolean functions and how to optimize them. Combinational/sequential logic.
      • How to use boolean gates to perform more complex operations such as addition and multiplication.
      • All about memory, microprogramming, program execution in hardware, pipelines etc.
      • Programming in assembly language. All about registers, instructions, different CPU designs (RISC, CISC, etc).
      • All about higher level programming languages and how they are translated to assembly code.
      • Programming constructs: statements, conditional statements, loops etc., and what these look like in assembly.
      • Simple algorithms (search and sorting algorithms etc.) and programming techniques (recursion, linked lists etc.); one of these is sketched just below.
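
      To illustrate that last item, here's what one of those simple algorithms looks like (binary search, written recursively; a sketch, with Python standing in for whatever language you end up learning):

      Code:
      # Recursive binary search on a sorted list: each step halves
      # the range that could still contain the target.
      def binary_search(items, target, lo=0, hi=None):
          if hi is None:
              hi = len(items) - 1
          if lo > hi:
              return -1  # not found
          mid = (lo + hi) // 2
          if items[mid] == target:
              return mid
          if items[mid] < target:
              return binary_search(items, target, mid + 1, hi)
          return binary_search(items, target, lo, mid - 1)

      print(binary_search([1, 3, 5, 8, 13, 21], 8))  # 3
      print(binary_search([1, 3, 5, 8, 13, 21], 4))  # -1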

      If you get this far, you should have the basic knowledge on how computers work. From there you could move on to more advanced topics.

      The topics in theoretical computer science that would be good to know when trying to understand how a computer works are probably just formal logic (for understanding (the optimization of) boolean functions, gates etc.) and automata theory (for understanding sequential circuits).
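
      To give a flavour of automata theory: a deterministic finite automaton is just a transition table plus a set of accepting states. Here's a sketch in Python (the machine and its state names are made up for illustration); it accepts binary strings containing an even number of 1s.

      Code:
      # DFA accepting binary strings with an even number of 1s.
      # (state, input symbol) -> next state
      transitions = {
          ("even", "0"): "even", ("even", "1"): "odd",
          ("odd",  "0"): "odd",  ("odd",  "1"): "even",
      }

      def accepts(string, start="even", accepting=("even",)):
          state = start
          for ch in string:
              state = transitions[(state, ch)]
          return state in accepting

      print(accepts("1011"))  # False (three 1s)
      print(accepts("1001"))  # True (two 1s)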

      Computer engineering is, as the name suggests, more engineery (the focus is on how to accomplish practical tasks), whereas theoretical computer science is highly formal (it's all about definitions, theorems, proofs etc.).
