
'Retro Programming' Teaches Using 1980s Machines 426

Posted by samzenpus
from the old-timey-processing dept.
Death Metal Maniac writes "A few lucky British students are taking a computing class at the National Museum of Computing (TNMOC) at Bletchley Park using 30-year-old or older machines. From the article: '"The computing A-level is about how computers work and if you ask anyone how it works they will not be able to tell you," said Doug Abrams, an ICT teacher from Ousedale School in Newport Pagnell, who was one of the first to use the machines in lessons. For Mr Abrams the old machines have two cardinal virtues; their sluggishness and the direct connection they have with the user. "Modern computers go too fast," said Mr Abrams. "You can see the instructions happening for real with these machines. They need to have that understanding for the A-level."'"
This discussion has been archived. No new comments can be posted.
  • by John Hasler (414242) on Wednesday August 25, 2010 @03:12PM (#33372476) Homepage
    Uhm, we were saying that in the 1980s. At Bletchley Park they should be teaching with machines that actually are old.
  • by commodore64_love (1445365) on Wednesday August 25, 2010 @03:13PM (#33372482) Journal

    You would get exactly the same "feel" as you get with an old C=64 or Atari or Amiga machine. If your goal is to get down to the bare metal, then go ahead and do so. There's no need to dust off old machines that are on the verge of death from age.

  • I keep saying that (Score:3, Interesting)

    by Anonymous Coward on Wednesday August 25, 2010 @03:13PM (#33372484)

    ... if you want to know how computers work, learn microcontrollers with the Atmel 8-bit family of controllers (the ATMEGA8, for example). These things are wonderfully documented, and there is a free C/ASM development environment with an emulator (single-step, breakpoints, etc.). The real deal is just a few dollars for a development board (or get an Arduino, same thing). You don't get the full down-to-the-transistor insight, but that's really just a few experiments with TTL gate chips and LEDs away.

  • "Actors" (Score:5, Interesting)

    by Applekid (993327) on Wednesday August 25, 2010 @03:13PM (#33372488)

    They each took on the role of a different part of the machine - CPU, accumulator, RAM and program counter - and simulated the passage of instructions through the hardware.

    The five shuffled data around, wrote it to memory, carried out computations and inserted them into the right places in the store.

    It was a noisy, confusing and funny simulation and, once everyone knew what they were doing, managed to reach a maximum clock speed of about one instruction per minute.
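    The role-play described above maps directly onto the fetch-decode-execute cycle. A minimal sketch of the same idea in Python (the three-instruction machine here is made up for illustration, not the actual classroom exercise):

```python
# Minimal fetch-decode-execute simulator mirroring the classroom roles:
# program counter, RAM (the "store"), accumulator, and a CPU loop.
# The LOAD/ADD/STORE instruction set is hypothetical.

def run(program, memory):
    acc = 0      # accumulator
    pc = 0       # program counter
    while pc < len(program):
        op, addr = program[pc]       # fetch
        if op == "LOAD":             # decode + execute
            acc = memory[addr]
        elif op == "ADD":
            acc += memory[addr]
        elif op == "STORE":
            memory[addr] = acc
        pc += 1                      # advance the program counter
    return memory

# Add the values in cells 0 and 1, store the result in cell 2.
print(run([("LOAD", 0), ("ADD", 1), ("STORE", 2)], [2, 3, 0]))  # [2, 3, 5]
```

    Run by hand on a whiteboard, one student per variable, this is exactly the one-instruction-per-minute machine from the article.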

    I wish I had a teacher like this while in [US public] school.

  • Probably a good idea (Score:4, Interesting)

    by Anonymous Coward on Wednesday August 25, 2010 @03:17PM (#33372534)

    We ran some older machines in my first programming course. When you can see the direct results in speed (or lack of it), it can help teach better approaches. Writing a game and seeing the screen flicker when you ask the CPU to do too much is good motivation to find a more efficient approach. One of our instructors also did something like this with visual sorting procedures. If you can see the difference in speed between one sorting approach and another, it sinks in.
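    That visual-sorting idea is easy to recreate; a sketch (not the instructor's actual code) that prints the list as bars after every pass, so you can watch the algorithm converge:

```python
# Visual bubble sort: print one "frame" of bars per pass so the
# algorithm's progress (and its slowness) is literally visible.

def bubble_sort_visual(data):
    data = list(data)
    for i in range(len(data) - 1):
        swapped = False
        for j in range(len(data) - 1 - i):
            if data[j] > data[j + 1]:
                data[j], data[j + 1] = data[j + 1], data[j]
                swapped = True
        print("  ".join("#" * n for n in data))   # one frame per pass
        if not swapped:
            break                                 # already sorted
    return data

print(bubble_sort_visual([4, 1, 5, 2, 3]))  # [1, 2, 3, 4, 5]
```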

  • Old trick... (Score:3, Interesting)

    by ibsteve2u (1184603) on Wednesday August 25, 2010 @03:18PM (#33372556)
    I wasn't alone in keeping '286, 386, and '486 boxes around until Pentiums became prolific...and the same goes for dual cores etc...you write code that runs fast on the older generations, and you never hear user-land complaints about your stuff's performance on the new.

    Of course, with the advent of .NET....well, now you're only as good as Microsoft is.
  • Ridiculous (Score:2, Interesting)

    by 16K Ram Pack (690082) <`moc.liamg' `ta' `dnomla.mit'> on Wednesday August 25, 2010 @03:21PM (#33372598) Homepage

    The five soon discovered that just because a program was simple did not mean the underlying code was straight-forward. To make matters more testing, the BBC Micro offers a very unforgiving programming environment.

    My first piece of commercial programming was on a BBC Micro, and having that environment didn't teach me anything; it just made programming more of a pain than being able to cut and paste, set debug breakpoints and so forth. And it doesn't teach any more than using C#/VB, because it's a machine designed around using BASIC, which is itself an abstraction (and IIRC, you didn't have functions, so had to endure the horror of GOSUB/RETURN).

  • by tomhudson (43916) <.barbara.hudson. ... bara-hudson.com.> on Wednesday August 25, 2010 @03:25PM (#33372640) Journal

    It's not about "understanding low-level programming" - it's about having a direct connection between what you do and what happens. No virtual machine, no garbage collector, no super-fast compile/link/run/modify cycle (so you're going to take a few minutes to THINK about why something didn't work instead of just taking the "quick fix, let's test it and see if we got it right this time" route).

    screw your head on a crippling dinosaur

    The article never said they were using Windows.

  • by localman57 (1340533) on Wednesday August 25, 2010 @03:27PM (#33372660)
    If you want to get an intimate feel for writing programs without being able to waste resources, try embedded systems programming. The microchip 10F series has only a few dozen bytes of ram, and a couple hundred words of flash. And no hardware multiply. Making it do useful things is an art. Oh, and unlike some relic from the 70's, you can actually get a job programming for tiny microcontrollers.

    That said, it does seem like a cool class. One I'd like to take, but for personal interest, not professional development.
  • crufty calculator? (Score:3, Interesting)

    by chitselb (25940) on Wednesday August 25, 2010 @03:35PM (#33372762) Homepage

    from the link: "using 30-year-old or older machines."
    from the fine article: "First released in 1981; discontinued in 1994."

    I recently (three weekends ago) fired up my Commodore PET 2001 [flickr.com] (a *genuine* pre-1980 computer) and have been writing a Forth [6502.org] for it. It's really a lot of fun, and I'm finding that 30 years experience in various high-level languages has improved my "6502 assembler golf" game a lot. It's very incomplete, but the inner interpreter mostly works. Feel free to throw down on it here [github.com]

    Charlie

  • by BlindSpot (512363) on Wednesday August 25, 2010 @03:38PM (#33372804)

    10 years ago when I went through University, the core of the mandatory Assembly programming course was taught on the PDP-11 architecture, then 30 and now 40 years old.

    Granted it's not quite the same. We used emulators and not the real things. Also it was for different motivations. The prof felt it was simpler to teach the cleaner PDP-11 instruction set than the 80x86 or 680x0, although the course did eventually also extend to both. Also he happened to be an expert in systems like the PDP-11.

    However the idea of using old systems as teaching aids is hardly new - or news IMO.

  • by Nethead (1563) <joe@nethead.com> on Wednesday August 25, 2010 @03:42PM (#33372866) Homepage Journal

    You're an old hacker and may relate to this. I found free on craigslist a hand built (hand wrapped) Z80 CP/M box with dual 8" drives and a case of diskettes. No instructions or schematics. This winter I'm going to dig into it with my scope and logic probe and see if I can get the old baby working again. I was amazed that I was still able to just look at what components were on the boards and get a fairly clear idea of how it was put together. I figure the hard part will be following the address lines and seeing where the memory and I/O is at. I did see a few 74LS138s on board so I guess that is where I'll start. Now to pull the old DOS box out of the shed with the ISA EPROM burner and see what's there. From what I understand the boot ROM was home brewed. This should be very interesting.
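    The 74LS138s are a good starting point because that chip is a 3-to-8 decoder: the top three address lines select one of eight chip-enable outputs, carving a Z80's 64K address space into 8K regions. A sketch of the idea (the region map below is hypothetical, not the actual board's layout):

```python
# A 74LS138 3-to-8 decoder driven by A15..A13 divides the Z80's 64K
# address space into eight 8K chip-select regions. This particular
# ROM/RAM/IO assignment is made up for illustration.

REGIONS = ["ROM", "RAM0", "RAM1", "RAM2", "RAM3", "RAM4", "RAM5", "I/O"]

def chip_select(address):
    line = (address >> 13) & 0b111   # A15..A13 as a 3-bit select value
    return REGIONS[line]

print(chip_select(0x0000))  # ROM
print(chip_select(0x2000))  # RAM0
print(chip_select(0xE000))  # I/O
```

    Tracing which address lines reach the '138's select and enable pins tells you exactly how the designer partitioned memory and I/O.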

  • by Moridineas (213502) on Wednesday August 25, 2010 @03:44PM (#33372894) Journal

    I was in AP computer science over a decade ago. We used C++ with the "apstring" and "apvector" classes, which were similar to the STL.

    We of course had to implement bubble sort, quicksort, insertion sort, and so on.

    It was fairly slow on our computers (386s/486s/maybe one Pentium!) and you could REALLY see a visible difference between the different sorts. It was very obvious.

    I rewrote the sorts using standard C arrays instead of apvector. Even on those ancient computers, the differences were suddenly almost gone. Bubblesort using straight arrays was faster than apvector quicksort--at least for fairly small arrays. I don't remember the specifics anymore, but you had to be sorting IIRC several thousand things before there was much of a recognizable difference.
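    A rough Python analogue of that experiment: run the same bubble sort over a plain list and over a wrapper class that adds a layer of indirection on every element access (the wrapper stands in for apvector; none of this is the original C++ code):

```python
import time

class Vec:
    """Stand-in for apvector: every element access goes through a method."""
    def __init__(self, items):
        self._items = list(items)
    def __len__(self):
        return len(self._items)
    def __getitem__(self, i):
        return self._items[i]
    def __setitem__(self, i, v):
        self._items[i] = v

def bubble_sort(a):
    for i in range(len(a) - 1):
        for j in range(len(a) - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]

data = list(range(500, 0, -1))   # reversed input: worst case for bubble sort

t0 = time.perf_counter(); bubble_sort(list(data)); plain = time.perf_counter() - t0
t0 = time.perf_counter(); bubble_sort(Vec(data)); wrapped = time.perf_counter() - t0
# The wrapped version is consistently slower: same algorithm, same data,
# the only difference is the abstraction layer on each access.
print(f"plain list: {plain:.4f}s  wrapped: {wrapped:.4f}s")
```

    Same lesson as the apvector/raw-array comparison: for small inputs, per-access overhead can swamp the asymptotic difference between algorithms.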

    So yeah, that made a big impression on me. Then again that class, and intro classes in college were the last time I've had to write my own sorting algorithm...

    I think it's a good thing that people who have maybe only used 2ghz+ computers are given a chance to experience something else. I guess a better question would be, why is expanding your horizons ever a bad thing?

  • by Thud457 (234763) on Wednesday August 25, 2010 @04:05PM (#33373130) Homepage Journal
    How come nobody's done a hobby home brew CPU [homebrewcpu.com] using tubes and core memory?
  • by Moridineas (213502) on Wednesday August 25, 2010 @04:07PM (#33373158) Journal

    I think it's because apstring and apvector were both simplified versions of the real deal. And the entire source code for both was pretty small and understandable for people just getting into C++, templates, etc.

    We at least created modified versions of them as well, extending or re-implementing certain functions. I don't really remember too many specifics!

  • Re:Knowability (Score:1, Interesting)

    by Anonymous Coward on Wednesday August 25, 2010 @04:16PM (#33373284)

    What you're talking about is the fragmentation of computer science and computer programming. It used to be that one would learn the entire system; now, the system has got large enough that you really need to specialize in a component in order to be skilled enough to get anything done.

    Back when I was in university, we learned C, some Pascal, LISP, ASM for a number of architectures (including theoretical models), and then a bunch of Big O-related maths. By the time we were doing upper level courses, the hardware courses were all about the hardware and the software courses were development environment agnostic -- they didn't care what tools you used, they just wanted it fully documented so that the TA could come along and understand what you'd done and replicate the end product in the build environment. In both cases, it was really more about displayed knowledge through documentation than it was about what APIs you used to get it done.

    The thing about drivers and libraries is that SOMEONE has to write them -- a high level programmer has no need to understand them, but an A-level programmer is required to.

  • by ceoyoyo (59147) on Wednesday August 25, 2010 @04:18PM (#33373322)

    Oh, the luxury.

    In my digital design course we had to build a simple computer. After we'd demonstrated adders made out of NAND gates we were allowed to use an arithmetic unit chip, and wired things up with latches etc. so we had a workable bus. Programming was accomplished through DIP switches and output via an LED bank. Just like they used to do (substituting LEDs for light bulbs). When you programmed those things you made sure your code was efficient otherwise your hand would get tired flipping the switches.
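    The adders-from-NAND-gates exercise translates directly into code. A sketch (in Python rather than wired logic, obviously) where every gate is derived from a single NAND primitive:

```python
# Build everything from one NAND primitive, as in the digital design
# course: NOT, AND, OR, XOR, then a one-bit full adder.

def nand(a, b): return 1 - (a & b)
def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))
def xor(a, b):  return and_(or_(a, b), nand(a, b))

def full_adder(a, b, carry_in):
    """One-bit full adder: returns (sum, carry_out)."""
    s = xor(xor(a, b), carry_in)
    carry_out = or_(and_(a, b), and_(carry_in, xor(a, b)))
    return s, carry_out

print(full_adder(1, 1, 1))  # (1, 1), i.e. 1 + 1 + 1 = binary 11
```

    Chain the carry_out of one full adder into the carry_in of the next and you have the ripple-carry adder those NAND boards implemented.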

    The really cool part was when we wired two of them together.

  • by Monkeedude1212 (1560403) on Wednesday August 25, 2010 @04:22PM (#33373354) Journal

    A) It teaches people how to use unfamiliar hardware/software. Chances are the thing you are going to be running at your job is not going to be the thing you studied in university for.

    C) It teaches kids how computers actually work by peeling back layers of abstraction. Think about it, has the average person under 20 ever used a CLI? For anything? I think the closest people come these days to actually using a CLI is typing in something on the Windows "Run" dialog.

    I can't stress this enough. I'm 22 - so close to the age range you mentioned - and I had only ever used Windows 3.1 when I was around 3 to 5 years old, and even then it was just to boot up some old King's Quest or Math Tutor game. Beyond that, I only ever used the MS-DOS prompt on Windows 95 for ipconfig so that we could get a good Age of Empires game going. Once I got into the Polytechnic that changed a lot, because one of our professors was very Linux-happy, and learning to use the terminal on a Fedora machine was great experience. He then insisted that we learn to do the same (or attempt to do the same) in the command line on our Windows machines.

    I've asked other people a few years younger than me, who would even consider themselves computer experts because they know how to build a machine from scratch and the hotkeys associated with Windows - and any of the ones who haven't gone to school basically don't know how to use that kind of interface. Or if they've used it, they know how to run about 4 commands, but not how to navigate through the file structure to execute various tasks.

  • by Anonymous Coward on Wednesday August 25, 2010 @04:24PM (#33373380)

    So I just used Reflector to decompile System.IO.File.Move and, unsurprisingly, it calls the win32 MoveFile function. So why is the win32 function such a win?

  • by mspohr (589790) on Wednesday August 25, 2010 @04:28PM (#33373428)
    This reminds me of a quote I read from Philippe Khan back in the really old days. He used the original IBM PC (4.77 MHz) to test code (Turbo Pascal) when much faster (8 MHz) machines were available. He said he "liked to watch the computer work".
  • psh (Score:1, Interesting)

    by Anonymous Coward on Wednesday August 25, 2010 @04:38PM (#33373544)

    For my A-level we made our computers from LCD screens and PIC chips, and programmed them in assembler... This was just before the millennium; these computers were far more advanced.

  • by Lord Kano (13027) on Wednesday August 25, 2010 @05:19PM (#33374092) Homepage Journal

    Oh, and unlike some relic from the 70's, you can actually get a job programming for tiny microcontrollers.

    Just last year, I was working for a company (A fairly large one) and they were still running programs written in DEC FORTRAN 77 on Vaxen. A part of my job was to port these programs to an.......Alpha.

    LK

  • by Anonymous Coward on Wednesday August 25, 2010 @05:39PM (#33374392)

    I'd rather have errata than no errata. The latter probably just means the implementation errors are undocumented. The Atmels have a clean architecture and a nice instruction set which makes assembly programming an acceptable proposition. Then there's the Arduino system, which is just an Atmega under a catchier name: That means there's lots of working code and hardware projects. Whatever, yes, there are other microcontrollers out there and if you can get more local help for any of them, then that's probably a good enough reason to base your choice on. I really like the integrated development environment that Atmel makes available for free (but you're not forced to use it, you can use other compilers and programmers). Your mileage may vary.

  • by TheRaven64 (641858) on Wednesday August 25, 2010 @06:06PM (#33374828) Journal

    I was going to post and suggest that they should use 29-year-old computers instead, because the BBC Micro was (designed as) an absolutely superb platform for teaching programming. Then I read TFA, and it turns out that, in fact, that is exactly what they are doing.

    It was a huge step backwards when the BBC Model Bs in my school were replaced with 386 PCs. The PCs were networked and so had to have some security. The BBCs let you poke absolutely anything and booted straight into a programming language (a dialect of BASIC, but one that supported structured programming and included a built-in assembler), and included a collection of I/O ports that were trivial to connect your own projects to and drive from BASIC. The OS was stored in ROM, and if you did anything especially wrong, you just turned it off and on again and were back to the pristine condition.

  • by soliptic (665417) on Wednesday August 25, 2010 @07:28PM (#33375844) Journal

    I don't know what the A level syllabus is

    It's a little over a decade since I did mine, and I don't know how much they've changed. But FWIW mine involved partly learning algorithms / programming - in Pascal, with tiny bits of assembly - and partly a bunch of theoretical stuff such as binary (floating point) arithmetic, BNF, Codd's normal forms, basic hardware/architecture principles & protocols, etc. I can't claim to remember the proportion very accurately. Somewhere between 30:70 and 50:50 I think.

  • Re:Mid 1980's for me (Score:3, Interesting)

    by Glonoinha (587375) on Wednesday August 25, 2010 @09:41PM (#33376760) Journal

    Hey man, don't bag on ARCnet. That shit was insane - you could pull coax for as long as the eye could see and still maintain decent throughput (for the day) and it would tolerate about as much 'stupid' as any network I've ever encountered. I saw instances of some serious 'stupid' on ARCnet networks - including one length of cable that didn't quite reach, so they spliced it using two pieces of coathanger soldered on both ends to the frayed ends of each piece of coax. They used cardboard to keep the two pieces of coathanger separated.
    Craziest thing I've ever seen. And it still ran, full speed.

  • by TheRaven64 (641858) on Thursday August 26, 2010 @06:39AM (#33379128) Journal

    The problem with using an emulator is that you're then using an emulator. When you're learning to program, it's much more satisfying to have something running on real hardware - even obsolete real hardware - than in an emulator. If you're going to use a virtual platform, you may as well use something like Java, in terms of how enjoyable it is.

    One of the problems that they are trying to address with this idea is that modern programming is so abstracted from the real hardware that you don't get a feel for how things really work. Running an emulator would defeat this purpose.

    And don't downplay the I/O. Wikipedia tells you that the BBC had:

    serial and parallel printer ports; an 8-bit general purpose digital I/O port; a port offering four analogue inputs, a light pen input, and switch inputs; and an expansion connector (the "1 MHz bus") that enabled other hardware to be connected. Extra ROMs could be fitted (four on the PCB or sixteen with expansion hardware) and accessed via paged memory

    One of the machines in my school had a ZIF socket connected to one of the ROM slots, so you could burn programs to an EPROM and have them available on boot.

    The GPIO ('user') port was amazingly useful. It was mapped to a byte of address space, so you could read the 8 input pins and write the 8 output pins with PEEK and POKE commands from BASIC. My school computer lab had a load of things that you could connect to this port. Some were simple, like a 7-segment display (I learned hex programming with that - each segment was controlled by a single bit, so you needed to supply the hex value corresponding to the shape that you wanted). Some were more complex, like a set of traffic lights and a robot arm. I borrowed one and some light gates over the summer holidays and made it drive my Scalextric set. Not very well, because it was rapidly pulsing the controller rather than gently controlling it, but it managed to do one complete lap of a circuit before crashing. The code for that was relatively simple - it measured the speed at each light gate and adjusted the pulsing speed if it was too slow or too fast. You drove the car around once for it to record the 'correct' speed, and then you could race against it. Well, in theory - mostly you could race and it would crash, but I still learned a lot about control theory from the problems I encountered there (and from having a father with a PhD in control theory around to help, of course).
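    The one-bit-per-segment scheme described above can be sketched like this. The byte values use the common a-g segment convention (bit 0 = segment a), which may not match the school's actual hardware:

```python
# One output bit per segment, as on the BBC user port: POKEing a byte
# to the port address lights the segments whose bits are set. These
# patterns use the usual a..g convention; real hardware may wire it
# differently.

SEGMENTS = {
    0x0: 0x3F, 0x1: 0x06, 0x2: 0x5B, 0x3: 0x4F,
    0x4: 0x66, 0x5: 0x6D, 0x6: 0x7D, 0x7: 0x07,
    0x8: 0x7F, 0x9: 0x6F, 0xA: 0x77, 0xB: 0x7C,
    0xC: 0x39, 0xD: 0x5E, 0xE: 0x79, 0xF: 0x71,
}

def poke_digit(digit):
    """Return the byte you would POKE to the port to display `digit`."""
    return SEGMENTS[digit]

print(hex(poke_digit(8)))  # 0x7f - all seven segments lit
print(hex(poke_digit(1)))  # 0x6  - just the two right-hand segments
```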

