
From a NAND Gate To Tetris

mikejuk writes "Long before the current crop of MOOCs (Massive Open Online Courses) there was a course that taught you all you needed to know about computers by starting from the NAND gate and working its way up through the logic circuits needed for a computer, on to an assembler, a compiler, an operating system, and finally Tetris. Recently one of the creators of the course, Shimon Schocken, gave a TED talk explaining how it all happened and why it is still relevant today. Once you have seen what is on offer at http://www.nand2tetris.org/ you will probably decide that it is not only still relevant but the only way to really understand what computers are all about."
  • by Anonymous Coward

    There's a circuit building platform which applies to this in the works:

    http://www.circuits.io/

    • by lindi ( 634828 )

      The point of nand2tetris is that you can understand all parts of the system. With circuits.io you can't even see the source code, let alone have time to read through all of it. Last time I checked they did not let you export your designs either (Gerber export does not count; it's not something you'd want to modify).

      • I thought nand2tetris sounded pretty interesting. My hopes were deflated when I found that half of the reading material does not exist (chapters 7-12), and that the entire "computer" project is simulated virtual hardware. It's a bit ironic for the author(s) to so emphatically profess a true understanding of computer hardware and then implement the entire concept in software. I was really hoping for some TTL breadboard / wirewrap / soldering... *something* tangible. It looks like interesting reading anyw
  • NAND Gate (Score:5, Funny)

    by zippo01 ( 688802 ) on Tuesday October 16, 2012 @02:24AM (#41666505)
    I watched this video and it really does seem like it would be a fun course! I'm not really sure about the whole God giving man NAND. Though that is prolly why my belief in God is 11. Hah, what a crappy NAND joke.
  • Logic is Logic (Score:5, Interesting)

    by thejuggler ( 610249 ) on Tuesday October 16, 2012 @02:31AM (#41666519) Homepage Journal
    Somewhere between learning to write my first "Hello World" program on the Apple IIe (and the TI99/4A) and making a career out of programming years later, I went to schools for Computer Repair and Bio-Medical Electronics. I still have a pile of 7400 series [wikipedia.org] IC chips and my breadboards amongst other electronic components. I learned analog and digital circuit design in the late '80s. The logic learned in those classes still applies to everyday programming today. No matter what I did in those previous careers, the training I did then still applies today. AND, OR, NOT, NAND, NOR, XOR and XNOR are still the 7 basic logic elements that make up all digital electronics and programming. From there Truth Tables are built and boolean algebra is applied to create any and all circuits and code today. In my humble opinion these are still essential to training people new to various IT fields. It's like having to learn nouns, verbs, adverbs and adjectives in order to write understandable thoughts. If you lack this basic understanding, learning the more advanced concepts is difficult at best. It's good to see these are still being taught somewhere.
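
    For anyone who wants to see those elements written down, here is a minimal Python sketch (my own illustration, not from any course) of the six two-input gates and their truth tables; NOT is the seventh, one-input element:

    # Sketch: the basic two-input logic gates as tiny Python functions.
    GATES = {
        "AND":  lambda a, b: a & b,
        "OR":   lambda a, b: a | b,
        "NAND": lambda a, b: 1 - (a & b),
        "NOR":  lambda a, b: 1 - (a | b),
        "XOR":  lambda a, b: a ^ b,
        "XNOR": lambda a, b: 1 - (a ^ b),
    }
    NOT = lambda a: 1 - a  # the one-input element

    # Print the truth table of each gate over all four input combinations.
    for name, gate in GATES.items():
        rows = [(a, b, gate(a, b)) for a in (0, 1) for b in (0, 1)]
        print(name, rows)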
    • by Anonymous Coward

      As a n00b, I just wondered: Are those 7 all the possible variations of the truth tables? (Except for the obvious TRUE and FALSE "elements" that always return true or false.)

      • Re:Logic is Logic (Score:5, Informative)

        by sFurbo ( 1361249 ) on Tuesday October 16, 2012 @04:27AM (#41666909)
        You have 2 inputs that can each be 1 or 0, so there are 4 different inputs. For each of those, you have 2 possible outputs, so there are 2^4=16 different truth tables.

        Of these, 8 are symmetrical in A and B (they give the same output for input (1,0) and (0,1)). These are AND, OR, NAND, NOR, XOR, XNOR, TRUE and FALSE.

        The remaining 8 are 4 sets of duplicates (if you switch A and B, we call the gate by the same name). These are A, NOT-A, A AND NOT-B, and its negation, NOT-A OR B. The last two do not seem to be standard gates, so no, there are, in fact, two more non-trivial truth tables for two inputs.
      • Re:Logic is Logic (Score:5, Informative)

        by famebait ( 450028 ) on Tuesday October 16, 2012 @04:31AM (#41666927)

        No, but it sums up all the useful/practical ones.
        If you only have two inputs, there are only 4 rows in the table:
        | A | B |
        | 0 | 0 |
        | 0 | 1 |
        | 1 | 0 |
        | 1 | 1 |

        This yields only 16 possible output columns:
        0000 - does not vary with input
        0001 - AND
        0010 - not commutative
        0011 - reacts only to A
        0100 - not commutative
        0101 - reacts only to B
        0110 - XOR
        0111 - OR
        1000 - NOR
        1001 - XNOR
        1010 - reacts only to B
        1011 - not commutative
        1100 - reacts only to A
        1101 - not commutative
        1110 - NAND
        1111 - does not react to input

        That makes 6 potentially desirable operations. The seventh is NOT, which takes only one input.
        The not commutative ones could conceivably be put to useful work, but in physical designs the asymmetry is impractical, and you can trivially construct them from other gates if need be. In fact some of the useful ones are also usually constructed from combinations of the others, and all of them *can* be constructed from combinations of NAND gates.
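
        If you want to verify the count, here is a small Python sketch (my own, just for illustration) that enumerates all 16 output columns over the same four input rows and tags the constant and non-commutative ones:

        from itertools import product

        ROWS = [(0, 0), (0, 1), (1, 0), (1, 1)]  # same row order as the table above

        for outputs in product((0, 1), repeat=4):  # 2^4 = 16 possible output columns
            table = dict(zip(ROWS, outputs))
            constant = len(set(outputs)) == 1
            symmetric = all(table[(a, b)] == table[(b, a)] for a, b in ROWS)
            tag = "constant" if constant else ("symmetric" if symmetric else "not commutative")
            print("".join(map(str, outputs)), tag)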

        • There's actually a really fantastic chart in Chapter 1 pg. 10 [nand2tetris.org] that I magnified and printed out as a quick reference. I also drew the gate pictures at the top so I could more easily interpret the schematics.

    • I found when I was at school (oh god, that was over 10 years ago) that there was *a lot* of overlap between my classes

      We learned basic circuitry in Physics, we learned basic Programming and Physics in Tech Studies (that mainly taught us Electronics, through Breadboards, PICMicros and low level programming), and we learned about 6502 Assembler in Computing Studies (although we were told we were the *last* class to learn assembly in CS that year)

      And in each of those three we learned about Combinational Logic too.

    • Re:Logic is Logic (Score:5, Informative)

      by dkf ( 304284 ) <donal.k.fellows@manchester.ac.uk> on Tuesday October 16, 2012 @03:44AM (#41666771) Homepage

      AND, OR, NOT, NAND, NOR, XOR and XNOR are still the 7 basic logic elements that make up all digital electronics and programming.

      Actually, real digital circuit design uses rather more elements than that, some of which can't be derived from those ideal elements either. Even excluding the clock generator (a thoroughly analog component at its core) there are still some really strange things you can usefully do with transistors that just won't model as anything simpler; my favorite is the arbiter, which determines which of two signals rose (or fell) first and is used to connect together parts of a chip that share a common power supply but run on unsynchronized clocks. Simplistic digital theory says it can never work, but in reality it's very effective (and it depends on the fact that transistors are analog devices, with some quantum mechanical behavior disambiguating the tricky cases). Mad, fun, mad fun!

      • Hmmm... I think I just wire the output of an inverter back to its input and I get an oscillator... Add more in a chain if I want a bigger delay, or a counter/divider... No analog circuitry required, just the fact that there is a propagation delay. :-)
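
        A toy Python sketch of that ring-oscillator idea (my own simplification: each inverter gets a one-step propagation delay, and one stage is seeded differently so the edge travels around the ring instead of every gate flipping in lockstep):

        def ring_oscillator(n_inverters=3, steps=18):
            # state[i] is the output of inverter i; inverter i reads the
            # previous step's output of inverter i-1 (the last one feeds the first).
            state = [1] + [0] * (n_inverters - 1)
            history = []
            for _ in range(steps):
                prev = state[:]
                state = [1 - prev[i - 1] for i in range(n_inverters)]
                history.append(state[0])
            return history

        # With 3 inverters the first stage toggles with period 2 * 3 = 6 steps.
        print(ring_oscillator())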
    • by mcgrew ( 92797 ) *

      Agreed completely. For years they seem to have completely removed the concepts of the hardware itself from programming. I've always thought this was a big mistake. Out of all the programming books I've read, the book that helped the most wasn't a book on programming, but the TTL Cookbook. I don't know if it's still in print, probably not, but it was excellent.

      • by skids ( 119237 )

        I've always thought this was a big mistake.

        It's a huge mistake. It's why we have programmers that don't understand the computational expense of memory/cache, bounce buffers, and parasitic intermediate representations. Without a critical mass of such programmers, there is not enough demand pressure for toolkits that work efficiently, and the bells and whistles are chased after instead, no matter how slow said bells and whistles make the system run. Worse yet, it's why useful paradigms become a hammer seeking a nail, as CS students don't think cri

        • by Cinder6 ( 894572 )

          Well, take heart that at some places it's still taught. A class on logic circuits was required for a CS degree where I went. They called it "Computer Hardware", admittedly a rather vague name. But we started at the gate level and built (well, simulated) a computer using circuits we designed. At the end of the semester, it had to be able to run arbitrary instructions, and we had to be able to hand-assemble code for it (I got tired of doing this and wrote an assembler). It wasn't as detailed as nand2tetr

      • Ah the memories - TTL Cookbook! I still have that on a bookshelf in the basement.
    • Re:Logic is Logic (Score:4, Informative)

      by tlhIngan ( 30335 ) <slashdot&worf,net> on Tuesday October 16, 2012 @10:41AM (#41669607)

      AND, OR, NOT, NAND, NOR, XOR and XNOR are still the 7 basic logic elements that make up all digital electronics and programming

      Of which, NAND and NOR are the primitives - you can construct any gate (and thus any truth table result) you want out of purely NAND or purely NOR gates.

      Why you pick one over the other comes down to limitations of CMOS - PMOS transistors have to be much larger than NMOS ones to be as fast. NAND puts the fast NMOS transistors in series, giving you much faster switching than if the PMOS transistors were in series (as they would be in a NOR gate).
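
      To illustrate that universality, here is a quick Python sketch (mine, not from the parent post) that builds the other basic gates out of nothing but a NAND primitive and checks them against the usual truth tables:

      def nand(a, b):
          return 0 if (a and b) else 1

      def not_(a):      return nand(a, a)
      def and_(a, b):   return not_(nand(a, b))
      def or_(a, b):    return nand(not_(a), not_(b))
      def nor(a, b):    return not_(or_(a, b))

      def xor(a, b):
          # the classic four-NAND construction
          c = nand(a, b)
          return nand(nand(a, c), nand(b, c))

      def xnor(a, b):   return not_(xor(a, b))

      # Sanity check against Python's own bitwise operators.
      for a in (0, 1):
          for b in (0, 1):
              assert and_(a, b) == (a & b)
              assert or_(a, b) == (a | b)
              assert xor(a, b) == (a ^ b)
              assert nor(a, b) == 1 - (a | b)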

    • by Anonymous Coward

      I also learned on the 7400 series, and still have massive piles of them around, even newer, faster SMD ones. While that provides a good educational foundation, I wonder how practical it is directly, though. I'll throw together a circuit that, when made with SMD, is the size of a credit card, and someone will ask why I didn't just use an FPGA or MCU... I then realize I could have made the circuit the size of a postage stamp, and had it be flexible enough to change the logic after assembly. As muc

    • Sometimes we need to step back a few paces to ask exactly what it is that we're trying to accomplish. When a teen-ager is ready to learn to drive, it is not essential for him/her to be able to rebuild the engine or put in new piston rings. After a person has an introductory-level understanding of what an automobile does, then he/she may want to explore the details in greater depth. Some people may actually turn out to be more interested in painting the car in different colors, or using it to transport si
  • by Anonymous Coward

    ... but my god gave me nor gates and an endless world of blocks.

  • by wdef ( 1050680 ) on Tuesday October 16, 2012 @02:54AM (#41666583)

    Looks great, much like I imagined studying Comp Sci ought to be. Ok one can get the book and use the materials for self-learning, but is there a list of institutions using the course for credit?

    So many great courses and great teachers around now. Pity they didn't get all this together way back in my day. I've just been working my way through http://michaelnielsen.org/blog/quantum-computing-for-the-determined/ [michaelnielsen.org] and am astonished at the simplicity and lucidity of Nielsen's teaching.

    • by donscarletti ( 569232 ) on Tuesday October 16, 2012 @05:12AM (#41667053)

      When I studied Comp Sci in the early 00s, we had a compulsory course on digital circuits, ground-up sort of stuff, NAND gates, Verilog, that sort of thing. If you didn't have a course like that, it is regrettable.

      My proudest moment was when my 80-something-year-old grandfather, whose own father had built radios for a living and whose brother is a retired electrical engineer, saw my textbook and grilled me about solid state switching. He said he did not understand how a signal could be selected based on another signal without the use of electromechanical relays. He knew roughly how a transistor works and I explained how they could be combined into AND, OR and NOT gates. From there, I drew a circuit diagram of a multiplexer and to him it was like some great realization that there was no perversion of God's laws going on inside a CPU (joke).

      He bought his first PC and Digital Camera within a month.
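
      For anyone curious what that multiplexer looks like when spelled out, a small Python sketch (my own illustration) of a 2-to-1 mux built only from AND, OR and NOT:

      def not_(a):    return 1 - a
      def and_(a, b): return a & b
      def or_(a, b):  return a | b

      def mux(a, b, sel):
          # 2-to-1 multiplexer: passes a when sel is 0, b when sel is 1.
          return or_(and_(a, not_(sel)), and_(b, sel))

      for a in (0, 1):
          for b in (0, 1):
              assert mux(a, b, 0) == a
              assert mux(a, b, 1) == b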

      • by mcgrew ( 92797 ) *

        He said he did not understand how a signal could be selected based on another signal without the use of electromechanical relays. He knew roughly how a transistor works and I explained how they could be combined into AND, OR and NOT gates.

        Something sounds funny there. Transistors play the same role as vacuum tubes, which were around before your grandpa was born. The first electronic computers used tubes. His brother was an electrical engineer; was your grandfather one as well, or just a hobbyist? How good w

        • No, the first computational devices were electromechanical punchcard databases; they used relays throughout, as did the automatic switching systems in the phone network. Tubes were only used as amplifiers when my grandfather was a young man; computers like Colossus and ENIAC that did in fact use vacuum tubes for switching were largely either obscure or classified until my grandfather was in his mid 30s and too busy with his legal practice, family and music to much care.
        • I might also add that in early digital, stored-program computers, relays were still used for multiplexing in the register and memory banks. The reasons are that 1) they would only be activated once per instruction, at decoding time, 2) they could be ganged to flip a lot of parallel wires at the same time driven by one coil, saving much space and complexity, and 3) they had lower resistance and lower leakage than a valve. Valves were only used for arithmetic in those days, where their speed was needed.
    • >Looks great, much like I imagined studying Comp Sci ought to be.

      This was actually what I got out of 4 years of Computer Science at my university (UC San Diego) - an understanding of everything that was going on from the moment I compiled my code and ran it, down to the transistor and logic gate level. We had to write our own compilers, create our own CPUs (including our own assembly language for it), and so forth. Very valuable stuff - at a certain point you realize all the grey areas about what is goin

  • by euroq ( 1818100 ) on Tuesday October 16, 2012 @03:14AM (#41666657)

    I love this! I am a hardcore developer who's done everything from assembly to Java. I have many non-technical friends who ask, "how does a computer work?" The short answer is that two electrical pulses, which we call either 0 or 1, go through something (a gate like NAND) to get an output of 0 or 1, and you combine that in a massive logic puzzle to get a computer. This course describes everything in detail. Love it. Well, not "everything", but certainly everything that technically curious people without formal training want to know.

    • I think it makes more sense to start with the "transistor is a switch" level of abstraction. Make some logic gates out of transistors and then go from there. Most people have heard that "computers are made out of transistors" and have probably always wondered where exactly they fit in.

      One could try to start even lower, and introduce some basic semiconductor physics--but I'm not sure of a clean way of introducing those concepts without lying a lot.

  • Quite tough to align multiple NAND gates without open spaces between them. With one circular and three flat faces it doesn't fit well. Oww and that annoying circle for the inverter... Happy that I live in Europe: Though I dislike our (square) logic gate symbols, they are great for tetris... http://en.wikipedia.org/wiki/Logic_gate#Symbols [wikipedia.org].

    See: we Europeans beat the USA even with logic gate-tetris !

  • Bottom Up Approach (Score:1, Interesting)

    by Anonymous Coward

    I agree with the bottom up approach to learning programming & CS.

    I started off learning BASIC, but it was very slow going. As soon as I got into 8086 Assembly everything clicked into place. After making a few simple games, I built my own boot loaders and toy OSs, languages, etc. without any teaching or courses needed. I remember thinking, "I wonder how you make a bootable disk?" Turns out everything I needed to know was in the BIOS documentation: Interrupt 19 [ctyme.com]

    I agree that knowing about the circuitry

    • by Anonymous Coward on Tuesday October 16, 2012 @03:59AM (#41666837)

      Your reply just made Donald Knuth cry.

      Sometimes you need to learn a generic and simplified technology before you can comprehend the incredibly complex and optimized real world examples. And sometimes real world examples are so narrowly designed that you would lose out on a general understanding of computing by focusing on that one design. Finally, sometimes real world examples carry the baggage of the past which can waste valuable time.

      • ...you are talking about valuable time, but seem to be suggesting that he learn MMIX? Really?
        • by vlm ( 69642 )

          ...you are talking about valuable time, but seem to be suggesting that he learn MMIX? Really?

          TAOCP is not a MMIX textbook... Wait till you learn what else is in the books.... Learning MMIX was about 0.001% of the way....

    • I agree with the bottom up approach to learning programming & CS.

      This is not necessarily a "bottom up" approach. Digital logic winds up being pretty important to know for writing very high level programs -- secure multiparty computation, model checking and formal verification systems, automated theorem proving, and so forth. This is not a stack, it is a cycle.

  • I haven't taken this particular course, but the "Introduction to Computer Design" course at my university, where we started with AND and OR gates, and ended by building a simple microprocessor, was definitely one of my favorites. It definitely had the feeling of magic: you figure out what you want to do, put together a bunch of random bits of logic, draw a box around it, and suddenly you've got an adder or an instruction decoder. I still feel that way whenever I write a really new bit of functionality.
  • by Artifakt ( 700173 ) on Tuesday October 16, 2012 @03:31AM (#41666725)

    I'm not saying it's a good idea to develop an elitist attitude towards the people that use them, but this explains why there's some rational basis for looking down on scripting languages. It's not that they are inherently bad or that the people who use them lack the ability to do 'real programming'. But they are basically all about not having to know anything at all about how the other layers of abstraction work, and a consequence is that they also don't give the programmer any real connection to how the hardware layer works and how you get from it to what they know.
                For example, if you know how an OS is generally compiled in a language such as C or C++, then the next step is understanding that the compiler is itself running 'on a level above' assembly language. Understand that, and it's a straightforward conclusion that a program can always be written in assembly that bypasses ANY controls the OS has about accessing different parts of memory, doing file copying, assigning user and admin permissions, and similar things. That program may be much less portable than something written in Perl, but it's inherently very powerful at what it does. It's not that people who program in assembly are necessarily any smarter or better at it than people who write Python. That's certainly debatable. The thing that isn't debatable is that the closer a programmer gets to machine language, the more they can do that nothing higher in the hierarchy can stop, position itself against, or even detect. At some point, that means trying to secure scripted code, or compiled code, or anything above assembly is like trying to defend a point with what may be a perfectly good machine gun, but the other side is the only one with stealthed, antimatter pumped, orbital X-ray laser arrays. They can have sloppy aim, lack elegance and inspiration, and still win.
                Nowadays, there are plenty of people working with a modern OS, even one that is still all compiled at just one level above assembly (if there are any real systems that you want to count as modern that still fit that, what with silverlight, dotnet, flash and so on, on just about every machine out there), who don't understand the hierarchical nature of coding worth a damn. It seems to get worse as you get to people writing applications for the various OSes. Some of these people are very good coders (or scripters, or whatever), but they really just can't write secure apps, because they don't really understand the difference between a script kiddie attacker and a threat that whole governments wish they could get on their side.
              That's just one of the things this course and others like it are supposed to fix. A lot of us need this. Hell, I've known this stuff for 35-40 years, and this reminds me I should get out the old books and do a little refresher. If you've read things about coding becoming as professional a discipline as aerospace engineering, and found yourself agreeing with any of them, this is where it starts.

    • by Anonymous Coward

      it's a straightforward conclusion that a program can always be written in assembly that bypasses ANY controls the OS has about accessing different parts of memory, doing file copying, assigning user and admin permissions, and similar things.

      I think you fail to understand how the OS controls these permissions. The operating system sets the level of protection in the hardware, and if a program violates that protection without the appropriate hardware privilege level, the microprocessor triggers a fault that transfers control back to the OS, which then terminates the program (GPF). The only way to change the privilege level is through an operating system call.

      • by mcgrew ( 92797 ) *

        I wrote a program twenty years ago that would reboot your computer. The program was four bytes long, its name took more disk space than its code. Of course, I didn't use an assembler, just DOS Debug.

        The closer you get to the bare wires, the more damage you can do.

    • by Rockoon ( 1252108 ) on Tuesday October 16, 2012 @05:21AM (#41667097)

      I'm not saying it's a good idea to develop an elitist attitude towards the people that use them, but this explains why there's some rational basis for looking down on scripting languages. It's not that they are inherently bad or that the people who use them lack the ability to do 'real programming'. But they are basically all about not having to know anything at all about how the other layers of abstraction work, and a consequence is that they also don't give the programmer any real connection to how the hardware layer works and how you get from it to what they know.

      The same argument could be used against C++, or C, and not just scripting languages like you claim. I know that most C programmers think they are doing low level programming, but they aren't.

      For example, if you know how an OS is generally compiled in a language such as C or C++, then the next step is understanding that the compiler is itself running 'on a level above' assembly language. Understand that, and it's a straightforward conclusion that a program can always be written in assembly that bypasses ANY controls the OS has about accessing different parts of memory, doing file copying, assigning user and admin permissions, and similar things.

      Umm, no. Just no. I have a great idea.. when you don't know what you are talking about, don't fucking talk. We both know that you don't know what you are talking about, which leads to the conclusion that you like to pretend to know what you are talking about... in short, you are a dishonest fuck.

      • by Arker ( 91948 )

        The same argument could be used against C++, or C, and not just scripting languages like you claim. I know that most C programmers think they are doing low level programming, but they aren't.

        I'm a little confused by your plus 5 mod, I usually get flambaited into oblivion when I say so, but I was taught that C, along with Pascal, pretty much embodied 'high level programming language' and anything more abstract isn't programming, it is scripting. Assembler is still quite abstract. Low level programming means h

        • I'm a little confused by your plus 5 mod, I usually get flambaited into oblivion when I say so, but I was taught that C, along with Pascal, pretty much embodied 'high level programming language' and anything more abstract isn't programming, it is scripting.

          I too was taught that C was a high level programming language. However, over the years, the C zealots seem to have been successful at redefining what a low level programming language is to include their pet favorite language. I was not taught that other languages were scripting. I was taught that there were assemblers, compilers, and interpreters.

          C isn't exclusive to compilers. There are plenty of C interpreters.
          Java's byte code, on the right architecture, is a full fledged assembly language.
          JavaSCRIPT

          • by Arker ( 91948 )

            The difference has nothing to do with compiled versus interpreted, that is a common misconception because early scripting languages were more often than not interpreted, but it's never been a hard and fast rule. Scripting is stringing together pre-existing tools with i/o and control logic. You can probably script in any language if you want to but some languages are written specifically for it. It's not a bad thing, it's very often the quickest most efficient way to deal with things.

          • by wdef ( 1050680 )

            Most famously, Perl people can get their knickers in a knot about calling a Perl program a "script", which is felt to be diminishing (rationally or not), since Perl is compiled to bytecode (depending on how it runs) and thus is not an interpreted or so-called "scripting" language. It can be autoconverted to C (not sure how good that is) and it can be compiled to binaries.

    • Two objections:
      1. The programmers who use scripting languages extensively often understand lower-level code, even down to the machine code, but choose not to use it because it creates a whole bunch of unnecessary headaches.
      2. Well-written scripting languages ensure that the lower-level layer is not vulnerable to the most likely forms of attack, like buffer overflows. That means that the lower-level attack doesn't work, so in your scenario you might have a good machine gun and a roof to protect your position

      • On 1. it's not about choosing to use a certain language. Yes there are plenty of valid reasons for choosing a scripting language for certain tasks. I tend to disagree that anyone who has only learned scripting languages is all that likely to have learned enough about the progression from machine language to whatever interprets their script. I don't think it happens enough to use that word 'often' the way you are using it. I don't even think that the people programming in classical compiled languages such as

        • My argument on point 1 is that it's hardly unusual for 4-year computer science program to expose students to low-level code (and demand they write some low-level code). You show a developer who mostly writes in Python some C or assembler and there's a good chance they'll understand what they're looking at.

          Regarding point 2, the specific phenomenon that I'm going to focus on is classic buffer overflows, because those still are some of the most common forms of attack (see CWE-120 [mitre.org]). In C, a programmer has to b

        • The only programming language I know is Python.

          But I also know Verilog, and use it a good bit more than the Python.

          If I could have an HDL that acted like Python, I'd love it. Languages (HDL and Programming) are tools, and the high-level ones tend to be better tools. The low-level ones tend to be good for learning, but there's a lot to be said for the effort saved by using a high-level language.
    • a program can always be written in assembly that bypasses ANY controls the OS has about accessing different parts of memory, doing file copying, assigning user and admin permissions, and similar things.

      You should read up on things like protected mode (30 years old), sandboxing, hypervisors, and so on. All are used to make sure your application doesn't just read/write whatever memory/ports it wants.
      And just because a compiler translates C to assembly, it doesn't mean you have full control over the generated assembly by tweaking the C-code (that's why there is inline assembly).

      The real problem is that the average Joe user doesn't want to be inconvenienced by all this security, he just wants to be able to run th

  • by jkrise ( 535370 ) on Tuesday October 16, 2012 @03:35AM (#41666737) Journal

    This speech at TED was featured in my local Linux User Group based in South India a few weeks back. I am planning on conducting this course for my College students in CS and IT branches.

    The video also features Pramode CE who runs a consultancy for Embedded Systems and Software in nearby Trichur in Kerala State.

    Very nice course, one I fully recommend for all ages.

  • Anyone else first read that as NAND2Tardis?
    Need coffee

    • Any ful kno the TARDIS is analogue....
      • by pugugly ( 152978 )

        No, the TARDIS actually does the low level operations in digitally manipulated planck time units

        The analog interface does have a discrepancy of +- 10^102nd, but this has no practical effect on the operations and can be ignored by the end user.

  • by Anonymous Coward

    This was given as a course in my university (the Hebrew University of Jerusalem) eight years ago, in which the lecturer was the other creator (Prof. Noam Nisan).
    I think the beauty of it is the fact that you manage to understand the principles of each level of the architecture without going into great depth on each one, and so it manages to remain interesting throughout the course.
    At the end of the course, we ended up writing xonix (which involved writing a non-recursive flood-fill algorithm).
    Some team even w

  • I'm heavily into teaching and computer science so I'm always looking into this kind of stuff. I came across this book and worked my way through all the exercises.

    I love this book/course so much that I'm looking for a place that'll allow me to teach this course. Salary not required. But it's all dot-net and stuff around here. Lamers.

  • Same concept, written by Charles Petzold in 1999:

    http://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319 [amazon.com]

  • by Anonymous Coward

    The NAND gate is a pretty high level abstraction of the quantum mechanics taking place within the semiconductor junctions. Maybe start with vacuum tubes, since those are a much simpler implementation of NAND...

    • by Teancum ( 67324 )

      Why start with vacuum tubes when you can go all the way back to Leyden jar [wikipedia.org] batteries and spinning copper wires using hand-made tools and raw ore you dug out of the ground? I suppose you also blow your own silica-based glass too?

    • by Hatta ( 162192 )

      NAND is more fundamental than transistors. You can build a computer without transistors, as you note with vacuum tubes. You cannot build a computer without using NAND. As he notes in the video, it's a CS course, not EE or physics.

  • Tanenbaum's [wikipedia.org] Structured Computer Organization takes a similar approach, going from the boolean logic of a transistor gate up to the OS and application level. I took a class with the first edition of Tanenbaum's book as the text in 1983 and I learned more about computers from it than from any other class before or since.

  • Slide 4 of the intro is wrong: Pong came out in the early 70s, and by the 80s we had the Z80 and the Motorola 68000 in most of our arcade games. http://www.nand2tetris.org/lectures/PDF/lecture%2000%20introduction.pdf [nand2tetris.org]
  • by Dishwasha ( 125561 ) on Tuesday October 16, 2012 @08:48AM (#41668349)

    Coincidentally I have been running through this course in my spare time and I have to say it is the best I have found in 10 years. I've been itching to build a homebrew CPU like http://www.homebrewcpu.com/ [homebrewcpu.com] but lacked the basic skills to design a proper ALU and such. Most other courses either start way too basic and then shoot too far forward, or they gloss over the basics and go right into advanced concepts. So far I have made it through Chapter 2 and I'm proud to say that I've built all the basic components in HDL without looking anything up outside of the course material. Being able to build complex components on top of basic components I built myself is very rewarding. This is a must-take course if you want a more intimate understanding of how computers work. And if building a computer from basic gates isn't nerdy enough for you, build your own [youtube.com] transistors [youtube.com].
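
    To give a flavor of that layering, here is a rough Python sketch (mine, not the course's HDL) of a ripple-carry adder stacked on top of half and full adders, the sort of component the early chapters have you build from gates:

    def xor(a, b):  return a ^ b
    def and_(a, b): return a & b
    def or_(a, b):  return a | b

    def half_adder(a, b):
        return xor(a, b), and_(a, b)            # (sum, carry)

    def full_adder(a, b, c_in):
        s1, c1 = half_adder(a, b)
        s2, c2 = half_adder(s1, c_in)
        return s2, or_(c1, c2)                  # (sum, carry-out)

    def add_bits(x, y):
        # Ripple-carry addition of two equal-length bit lists, LSB first.
        carry, out = 0, []
        for a, b in zip(x, y):
            s, carry = full_adder(a, b, carry)
            out.append(s)
        return out, carry

    # 5 + 3 = 8: [1,0,1,0] and [1,1,0,0] are 5 and 3, LSB first.
    print(add_bits([1, 0, 1, 0], [1, 1, 0, 0]))  # ([0, 0, 0, 1], 0)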

  • What would Alexey Pajitnov and Henk Rogers think about this? Their company successfully sued the publisher of an unlicensed Tetris clone for copyright infringement a few months ago.
  • . . .the only way to really understand what computers are all about.

    Really? The ONLY way?

  • I looked at the "Getting Started With Digital Logic - Logic Gates" part. Anybody who has actually built something with TTL on a breadboard should know that 7400 series gates can sink a lot more current than they can source. Connecting a logic output to ground through a LED may not draw enough current to turn the LED on fully. The right way to do it is to connect the LED between the logic output and the Vcc rail in a pull-down configuration (with a current limiting resistor). Of course, that gives you in
