Man Builds Giant Homemade Computer To Play Tetris (bbc.com) 127

An anonymous reader quotes a report from BBC: A man has finished building an enormous computer in the sitting room of his bungalow in Cambridge. James Newman started work on the "Megaprocessor," which is 33ft (10m) wide and 6ft (2m) high, in 2012. It does the job of a chip-sized microprocessor, and Mr Newman has spent $53,000 creating it. It contains 40,000 transistors and 10,000 LEDs, and weighs around half a ton (500kg). So far, he has used it to play the classic video game Tetris. Mr Newman, a digital electronics engineer, started the project because he was learning about transistors and wanted to visualize how a microprocessor worked. The components all light up as the huge device carries out a task. Mr Newman hopes the Megaprocessor will be used as an educational tool and is planning a series of open days at his home over the summer. You can watch a video demonstration of the monstrosity here.
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • DEC Logo as icon? (Score:5, Informative)

    by devphaeton ( 695736 ) on Tuesday July 05, 2016 @08:14PM (#52453119)

    Why the Digital Equipment Corporation logo as the icon for this story (and other DIY stuff)?

    Has /. gotten so young that nobody knows it means something more than just "digital", or has /. gotten so old that nobody remembers DEC?

    • I'll go with the first one. DEC would exclude the millennials and most definitely BeauHD [twitter.com]. Now I'm feeling old. Thanks a lot Beau!
    • by OzPeter ( 195038 )

      Complained about this the last time this logo was used. It's becoming an onion on my belt thing.

      To these kids everything is "digital"

    • Has /. gotten so young that nobody knows it means something more than just "digital"

      Replace "young" with "brain dead editors" and you have your answer.

    • Re: (Score:2, Interesting)

      by DRJlaw ( 946416 )

      Has /. gotten so young that nobody knows it means something more than just "digital", or has /. gotten so old that nobody remembers DEC?

      Or has DEC been dead and buried for so long (18 years) that someone has decided to repurpose the graphic simply because they can?

      *BINGO*

      DEC, dead. Compaq, dead. HP, dead enough [youtube.com].

      Let it go.

      • by sconeu ( 64226 )

        HP is only MOSTLY dead...

        • Unfortunately for HP, I think the only purpose they have for existing anymore is "to blave"(sic).
        • Too bad they can't get Miracle Max to help them out. Nobody has seen him around these parts for quite a while.
      • Why doesn't one of the enterprising young editors for /. come up with their own "Digital" graphic image to use on /.?
        Why reuse the corporate logo from a company from the beginnings of the digital era?
        Laziness is the answer.
    • by fred911 ( 83970 )

      Why do you bother with icons (options icons)? Ever since /.media reneged on the "thanks click here to reject ads" link, I've reinstalled ABP. Looking at icons? No thanks.

      • My reaction too. I keep seeing the bitching posts, but I don't see the icons that people are bitching about. Love that on /. of all places that the bitching gets upvoted. What happened to being tech savvy enough to block the shit that irritates you on the web?

    • Possibly because of the size, and because older DEC computers were built with "off the shelf" DEC logic modules, similar to how this computer was built with logic modules connected to each other by cables.
      • But older IBM computers were made with cards like that (one or a few flip-flops per card) and they didn't use an IBM logo for the article. A better icon for this sort of article would be a transistor symbol, or maybe a soldering iron (Steve Ciarcia used to say he wrote his best code in solder).

    • by tlhIngan ( 30335 )

      Why the Digital Equipment Corporation logo as the icon for this story (and other DIY stuff)?

      Has /. gotten so young that nobody knows it means something more than just "digital", or has /. gotten so old that nobody remembers DEC?

      The logo has fallen into public domain as well, I believe. I've been seeing tons of things with that logo on it - from music audio processing boxes to practically anything needing a fancy "digital" logo.

    • by skids ( 119237 )

      Yeah, they should have used a cray [wikipedia.org] logo. Because this is cray cray.

    • There are still some of us who remember DEC. I joined DEC in 1979 and left Digital in 1991. Those were the good old days...
  • Megaprocessor promptly died of slashdotting n/t
  • Video mirror (Score:3, Informative)

    by Anonymous Coward on Tuesday July 05, 2016 @08:26PM (#52453183)

    http://web.archive.org/web/20160705214332/http://www.megaprocessor.com/Images/megaprocessor-tour1-2mbps.mp4 [archive.org]

    I haven't seen a slashdotting in quite a while. I tried to dig up some mirrors (MirrorDot, CoralCDN, etc.), but they're all dead now. Internet Archive to the rescue.

    • by NotInHere ( 3654617 ) on Tuesday July 05, 2016 @08:27PM (#52453191)

      Probably because the webserver is running on that machine as well.

      • by Anonymous Coward

        if only there was free video hosting available online that could handle spikes...

        • by tepples ( 727027 )

          if only there was free video hosting available online that could handle spikes...

          Some video hosts can handle spikes but will take down videos at the drop of a hat, especially when Tetris clones are involved. Arika Co., Ltd, developer of an official Tetris game, sent a bunch of DMCA takedown notices to YouTube in May 2009, and one of these videos was a video about The Tetris Company's copyright enforcement practices.

    • Just found a faster mirror: https://www.youtube.com/watch?... [youtube.com]

      • Yes, now if the editor would just replace the link in TFS with the YouTube link, the guy might not be bankrupt tomorrow...
      • i learned something today... i learned that there are people out there that really suck at tetris.

  • by berchca ( 414155 ) on Tuesday July 05, 2016 @08:29PM (#52453197) Homepage

    Kilo-for-kilo, the cheapest hobby computer money can buy!

    • The cheapest hobby computer would probably be a PIC 10F202. In the little 6 or 8 pin package they are 10 cents or so in quantity, the tools to code them are free and the hardware to flash the binary object into them is a few dollars. The 24 bytes of RAM and 512 bytes (12 bit words, really) of program memory keeps the coder honest and frugal.

      • Yup. In more recent times I've used arduinos and raspberry pi's for most things but I do still keep quite a reserve of PIC processors.
    • Hardware is generally a fairly cheap hobby. The initial costs for soldering equipment, etching, and measuring gear (oscilloscope + logic analyzer) set you back a few hundred; after that you're looking at pennies for hours and hours of fun and entertainment. And given that the IoT is coming really soon now (tm), it can well be fun and profit.

      • Indeed, a good chunk of IoT type sensors and devices I have around the house I made myself.

        But it also depends on what you want to do/make. More and more keeping up with the Joneses means not just getting custom boards but being able to solder ball grid arrays. You need hot air skills, infrared gear, masks, etc.
  • by Anonymous Coward

    An educational tool to show how people waste money....

  • by plover ( 150551 ) on Tuesday July 05, 2016 @08:47PM (#52453279) Homepage Journal

    He built it because he could, of course, but he's planning on it becoming an educational display. It's just that a computer with no actual applications is a pretty boring thing for non-techies to behold.

    • I am sure even non-techies think this is impressive. Each one of those modules for that computer would have had to be assembled and tested by hand. Even then, this is no simple HACK computer. It has square root, for Christ's sake (even if it is a bit long in cycles). This thing is WAY over-engineered, yet very pretty to look at.

      I'd be interested to know how modular it is. That is, can you move the logic modules around to change the instructions, with the way he has those cables connected? I always liked the i

      • The entire time watching the video I was thinking about having to debug it. Finding the transistor that doesn't work anymore or the wire that broke inside the insulation has to be a fun exercise in narrowing down the problem until you've found the cause.

      • I'd go a step further and say that ONLY techies would be interested.
  • by Nkwe ( 604125 ) on Tuesday July 05, 2016 @08:54PM (#52453311)
    This is a prime example of what should be on the site. Thanks :)
    • In this case that's news about a nerd.
    • I agree - this was terrific. Very inspiring - one of those "I want to build one too." I remember way-back in college the instructor showed on the chalkboard how an ALU works (it was a 90 minute lecture). He drew clock lines - a few gates, memory, and a few binary instructions. He then walked through each clock tick - moved bits around - and visually showed the "computer" executing the instructions (it was something simple like Add two values and store in memory). But it made me sit up and notice.
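That chalkboard exercise translates directly into code. Here is a minimal sketch of the same idea, building an adder from a single primitive gate and "ticking" the carry through bit by bit; the choice of NAND as the primitive and the 9-gate decomposition are mine, not from the lecture described above:

```python
# Build an adder the way a gate-level lecture would: from one primitive.
def nand(a, b):
    return 0 if (a and b) else 1

def full_adder(a, b, cin):
    # Classic 9-NAND full adder decomposition.
    t1 = nand(a, b)
    t2 = nand(a, t1)
    t3 = nand(b, t1)
    s1 = nand(t2, t3)      # s1 = a XOR b
    t4 = nand(s1, cin)
    t5 = nand(s1, t4)
    t6 = nand(cin, t4)
    total = nand(t5, t6)   # sum = s1 XOR cin
    carry = nand(t4, t1)   # carry out
    return total, carry

def ripple_add(x, y, width=8):
    # Chain full adders, one per bit, carrying into the next stage.
    carry, result = 0, 0
    for i in range(width):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result

print(ripple_add(42, 27))  # 69
```

Stepping through `ripple_add` bit by bit is essentially the clock-tick walkthrough from the lecture, just in software instead of chalk.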

  • by localroger ( 258128 ) on Tuesday July 05, 2016 @08:54PM (#52453317) Homepage
    Maybe he should have gone for Space Invaders?
  • Apparently, someone needs to up their meds...
  • by Anonymous Coward

    ... you really can build a mainframe from the things you find at home [bsutton.com].

  • I'd bet that processing unit manufacturers (Intel, ATI/AMD, NVidia, ARM, etc.) have built things like this internally before, for years, just to understand better what they are doing.

    Can any insider of those companies confirm or deny my conjecture, please?

    • by localroger ( 258128 ) on Tuesday July 05, 2016 @09:46PM (#52453571) Homepage
      This is actually the way real computers were built in the 1950's and most of the 1960's. Integrated circuits weren't invented until the late 1960's, and integrated microprocessors in the mid-1970's. Before that if you had a computer, it was built like this (or even more primitively, with vacuum tubes and delay lines for memory). Although this video doesn't mention it the Megaprocessor is actually a clone of the 6502, based on the reverse engineering of that chip which was done by the visual6502 people. Actual discrete transistor designs were a bit more streamlined to reduce the discrete component count.

      The people who built early microprocessors mostly didn't bother emulating them first because they had a lot of experience with discrete design; processors were not mysterious to them and they had confidence that they knew what would work. The 6502 was in fact laid out entirely by hand directly in MOS masks, not more abstract circuit diagrams, and had to be reverse engineered in our day because no record remained of how its fine features worked.

      • Although this video doesn't mention it the Megaprocessor is actually a clone of the 6502, based on the reverse engineering of that chip which was done by the visual6502 people.

        No, you're thinking of the MOnSter 6502 [monster6502.com]. The Megaprocessor has its own instruction set [megaprocessor.com], with 4 (semi-)general purpose registers (some load and store instructions can only use R0 or R1 as the register source/destination and R2 or R3 as an index register).
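For anyone curious what that kind of register-file restriction looks like in practice, here is a toy sketch. The class, instruction names, widths, and constraint checks are all invented for illustration; this is not the Megaprocessor's actual ISA or encoding.

```python
# Illustrative only: a toy 4-register machine in the spirit of the
# description above, where loads/stores use R0/R1 as the data register
# and R2/R3 as an index register.
class ToyCPU:
    def __init__(self):
        self.r = [0, 0, 0, 0]     # R0..R3
        self.mem = [0] * 256      # tiny flat memory

    def load(self, rd, ri):
        assert rd in (0, 1), "only R0/R1 may be a load destination"
        assert ri in (2, 3), "only R2/R3 may index memory"
        self.r[rd] = self.mem[self.r[ri]]

    def store(self, rs, ri):
        assert rs in (0, 1) and ri in (2, 3)
        self.mem[self.r[ri]] = self.r[rs]

    def add(self, rd, rs):
        # ALU ops are general purpose: any register to any register.
        self.r[rd] = (self.r[rd] + self.r[rs]) & 0xFFFF

cpu = ToyCPU()
cpu.mem[10] = 5
cpu.r[2] = 10        # R2 points at address 10
cpu.load(0, 2)       # R0 <- mem[R2]
cpu.add(0, 0)        # R0 = 10
cpu.r[3] = 11
cpu.store(0, 3)      # mem[11] = 10
print(cpu.mem[11])   # 10
```

Encoding such constraints directly into the register file wiring is one way a discrete-transistor design keeps its transistor count down.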

      • Integrated circuits weren't invented until the late 1960's

        That's not really true, the AGC was using IC gates around 1962 already.

        Although this video doesn't mention it the Megaprocessor is actually a clone of the 6502, based on the reverse engineering of that chip which was done by the visual6502 people. Actual discrete transistor designs were a bit more streamlined to reduce the discrete component count.

        Which is peculiar because the 6502 should have not nearly as many as 40k transistors. (If I were building a CPU out of discrete transistors, I'd definitely go for some kind of stack machine, perhaps with unencoded instructions to boot.)

        • Integrated circuits weren't invented until the late 1960's

          That's not really true, the AGC was using IC gates around 1962 already.

          Perhaps they're thinking of MSI, which first showed up in the late 1960's; SSI predated that.

          Although this video doesn't mention it the Megaprocessor is actually a clone of the 6502, based on the reverse engineering of that chip which was done by the visual6502 people. Actual discrete transistor designs were a bit more streamlined to reduce the discrete component count.

          Which is peculiar because the 6502 should have not nearly as many as 40k transistors.

          And, in fact, the 6502 had more like 3.5k transistors [swtch.com]; the MOnSter 6502 [monster6502.com], which is what the "clone of the 6502" person was actually thinking of, has about 4300 transistors.

        • by jeremyp ( 130771 )

          Which is peculiar because the 6502 should have not nearly as many as 40k transistors.

          That's explained by the fact that it is not a 6502. The processor architecture is his own 16 bit design. This is a really impressive achievement. He designed the processor architecture from scratch, wrote an assembler and simulator and then built the thing out of (mostly) discrete transistors.

          • I would have been more impressed by a 16 bit design with 4k transistors. ;) Even better, by a design automatically generated rather than hand-designed, which is what everyone has been doing for the past seven decades, usually with dismal results.
      • by clovis ( 4684 ) on Tuesday July 05, 2016 @11:34PM (#52453979)

        When I was somewhat younger, I was a so-called field engineer responsible for keeping some discrete element computers running.

        Here's a picture of a module. This would be a single logic element such as a flip-flop, NAND gate, OR, etc.
        https://www.etsy.com/listing/2... [etsy.com]

        The CPU cabinet was a huge box full of these things. The I/O controllers were in another cabinet, and the memory was in another cabinet.
        The other boxes (storage, printers, card readers) had these same modules in them.
        I never was main support for a CPU using those modules, but had some peripherals that had those things inside.

        In more modern computers, these modules were replaced by logic cards. A PCB would have the transistors/diodes, etc to make a single element such as NAND gates, flip-flops or whatever, and these cards might have as many as 4 or even 6 logic elements on a single card. woo-eee!
        I was lucky to be supporting such modern machines.

        These old machines required hand-tuning such as manually synchronizing the clock signals between the near and far part of the cabinets.

        The oldest machine I had to maintain was an 80 column card reader that used mechanical relays for all the logic elements. That was so long ago that the nightmares have stopped.

  • by Guy Harris ( 3803 ) <guy@alum.mit.edu> on Tuesday July 05, 2016 @09:36PM (#52453523)

    Somebody else built a discrete-transistor 6502 processor [monster6502.com].

    And, of course, there's the non-integrated-circuit TTL 8008 [wikipedia.org], although that was probably SSI or MSI, not discrete transistors.

    • And no need to say, you could emulate whatever processor of yours on a PC, showing virtual LEDs or anything, within a day of dev.
  • Cambridge where?
    The one in England comes to mind, but there's also one in MA (and in umpteen other states).
    There's even one in the Waikato (NZ).

    • by Wagoo ( 260866 )

      of course England.. if it was one of the others THEN it would need to be specified :)

    • When someone speaks with an English accent and refers to "Cambridge" without mentioning the state, country, etc, obviously they are referring to the Cambridge in the Waikato in New Zealand. Everyone knows that.

      When in doubt, look it up on Wikipedia and see which one pops up with the link to the disambiguation page. That's probably the one they're referring to.

  • ...he's single.

  • When I was a kid, I started designing a computer that could play tic-tac-toe using only mechanical relays. About the time I realized how many thousands of relays were required, I decided it wasn't worth the effort. I don't understand this guy's thought processes... why spend thousands of dollars and use up half your house for something you could easily do with a $5 Raspberry Pi Zero?
    • Re: (Score:2, Insightful)

      by Anonymous Coward

      why spend thousands of dollars and use up half your house for something you could easily do with a $5 Raspberry Pi Zero?

      Because he can and because he presumably enjoyed doing so.

      It's the same reason some hobbyists still photograph with 19th-century film technology and it's part of the reason some amateur radio operators still use Morse Code (well, that, and because it may work when other ways of communicating over radio won't work as well, as efficiently, or at all under a given set of conditions).

    • by Anonymous Coward

      Why collect cars, trade cards, go fishing, hiking, play games, etc, etc? Because it's something you enjoy!

      In this instance he built something that he was able to share with others and hey, it's pretty cool as well :)

    • WTF? How on earth can you build a computer from scratch by buying a pre-built computer for $5?

      Or do you actually think the purpose of this was to play tetris?

      why spend thousands of dollars and use up half your house for something you could easily do with a $5 Raspberry Pi Zero?

      why do anything at all, you can experience everything you want by simply watching someone else do it online.

  • Can someone make a Tetris game that drops actual physical blocks down? Maybe on pulleys. Bonus points if filled rows actually explode.

  • by germansausage ( 682057 ) on Tuesday July 05, 2016 @11:13PM (#52453911)

    How can this be? An actual tech story on slashdot. Nothing about creationism, obese people, the lack of women in STEM or mass shootings. Maybe I'll see if it happens again tomorrow.

  • Visual computing (Score:4, Interesting)

    by Waccoon ( 1186667 ) on Tuesday July 05, 2016 @11:58PM (#52454065)

    The LEDs are the coolest part. I've had trouble seeing the video on his site since it's downloading very slowly, but I love what I'm seeing so far.

    Stuff like this reminds me of RAM scanning and memory ripping back in my Amiga days. Since the Amiga had no MMU and the video chip could address the entire range of the machine's main "chip" RAM, it was popular to fiddle with the screen display and scan through system memory. You could actually watch your computer running programs in realtime. The Amiga also used planar graphics, so you could see individual bits, rather than bytes, as pixels, allowing you to identify which memory locations were used for counters, timers, disk control logic, mouse pointer coordinates, and more. I wrote a whole bunch of programs in AMOS Basic that let me directly edit memory by drawing on the screen, bubble sort graphics, visually highlight specific memory addresses used by games, and do all kinds of cool nonsense.

    I miss those days when you could read any memory address without needing signed drivers and such. I've always wondered why memory visualization has totally disappeared. It might make for some interesting lessons in how modern programs actually use memory and how memory leaks happen.
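Something in that spirit is still easy to sketch today. Here text output stands in for the Amiga's video chip, rendering a buffer one bit per "pixel" so counters and flags would show up as flickering columns; the function and the sample data are mine, purely illustrative:

```python
# Amiga-style planar visualization sketch: each byte of the buffer
# becomes eight character "pixels", '.' for 0 and '#' for 1.
def render_bits(buf, width_bytes=4):
    rows = []
    for off in range(0, len(buf), width_bytes):
        row = ''.join(f'{b:08b}' for b in buf[off:off + width_bytes])
        rows.append(row.replace('0', '.').replace('1', '#'))
    return rows

frame = bytes([0x00, 0xFF, 0x81, 0x01])
for line in render_bits(frame):
    print(line)
# prints: ........#########......#.......#
```

Point the same function at a live snapshot of a process's memory (where the OS allows it) and you get a crude modern version of watching a program run.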

    • Re: Visual computing (Score:4, Interesting)

      by Bing Tsher E ( 943915 ) on Wednesday July 06, 2016 @12:30AM (#52454209) Journal

      I used to visually monitor small computers running by using a pair of 8 bit DACs connected to the address bus with the analog outputs connected to the X and Y of an oscilloscope in XY vector mode. Where the scope trace moved around on the screen showed the branching locations of the CPU. Even without really understanding which exact locations the processor was running through you could get a heuristic view of the program in action.
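The trick described above can be mimicked in software: split each address into high and low bytes and treat them as the X/Y deflection values the two DACs would produce. A rough sketch, with the trace data made up for illustration:

```python
# Emulate a pair of 8-bit DACs on a 16-bit address bus driving a scope
# in XY mode: high byte -> X deflection, low byte -> Y deflection.
def address_to_xy(addresses):
    return [((a >> 8) & 0xFF, a & 0xFF) for a in addresses]

# A fake trace: a program looping over addresses 0x1200..0x1203.
trace = [0x1200, 0x1201, 0x1202, 0x1203] * 3
points = address_to_xy(trace)
print(sorted(set(points)))  # [(18, 0), (18, 1), (18, 2), (18, 3)]
```

A tight loop shows up as a small, bright cluster of points; a branch to a distant routine drags the "beam" to a new region of the screen, which is exactly the heuristic view described above.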

      • Sweet. Audio RAM scans were pretty popular, too. It was always fun to play back memory and listen for certain patterns and guess what kind of data it was, which was easy in the days before everything was compressed (or encrypted).

        Closest thing I've heard that was similar to what you were doing is when engineers would put an AM radio next to a PDP-11 computer, and listen to the CPU working. By programming the CPU with differently timed loops, they could produce music over the radio. [youtube.com]

  • I am totally jealous.

  • This is an impressive learning device.

    The man should get a grant from some large software corp like MS or something to build a few of these and place them in education centers and science-museums.

  • hmmm -- might I suggest this is a topic for a modern James May to bring this subject to life?

    • would second this if only to get more of the above to watch.

      however i do not think may and his lot would have the chops for this.

      it doesn't actually move - and may is not that slow.

      • Capt Slow vs 1 Hz? I dunno - might be a tight race.

        Yes it doesn't move. But his "toy" challenges like the Legos house, toy train, and race track brought something interesting to life in an enthusiastic manner. If nothing else his enthusiasm and wit could make this computer more-cool.

  • in the video, i see display ram, but i did not hear or notice anything about core/process storage.

    always wanted to do something like this myself - bravo, bis, encore.

  • So THIS is the guy getting all the bitcoins!

  • This is very cool, but if the reason he did it was because "microprocessors were opaque" he should have just simulated it in Verilog or VHDL. Then he could follow all the operations he wanted at whatever detail he liked.
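In that vein, even plain code can give the "watch every signal" view a Verilog or VHDL simulator provides. A crude stand-in sketch, stepping a 3-bit synchronous counter one clock tick at a time and printing every flip-flop output like a waveform dump would; the counter and its next-state logic are my own example, not anything from the project:

```python
# Gate-level-ish next-state logic of a binary counter: toggle bit i
# when all lower bits are 1 (the carry rippling through an increment).
def step(q):
    carry = 1
    nxt = []
    for bit in q:            # q is LSB-first
        nxt.append(bit ^ carry)
        carry = bit & carry
    return nxt

q = [0, 0, 0]                # three D flip-flops, all reset
for tick in range(5):
    print(f"t={tick} q={''.join(map(str, reversed(q)))}")
    q = step(q)
# t=0 q=000, t=1 q=001, t=2 q=010, t=3 q=011, t=4 q=100
```

An HDL simulator does the same thing at scale, with a VCD dump and a waveform viewer in place of the print loop.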

  • More of these types of articles please. These are the things mass media does not report.
  • There is nothing unusual about it, to be sure; our world is full of computing geniuses. This story attracts attention only because of how the thing was created. But do you really think this Megaprocessor was made to play Tetris??
  • He should have built it in Minecraft (like many others have done to various degrees) and saved himself 50 grand. Museums could have virtual tours of the thing. :) Kids would love that. Put on your VR goggles in the museum and wander around the computer with your digital avatar, while a real person gives the tour to you via a headset.
