'Retro Programming' Teaches Using 1980s Machines
Death Metal Maniac writes "A few lucky British students are taking a computing class at the National Museum of Computing (TNMOC) at Bletchley Park using 30-year-old or older machines. From the article: '"The computing A-level is about how computers work and if you ask anyone how it works they will not be able to tell you," said Doug Abrams, an ICT teacher from Ousedale School in Newport Pagnell, who was one of the first to use the machines in lessons. For Mr Abrams the old machines have two cardinal virtues; their sluggishness and the direct connection they have with the user. "Modern computers go too fast," said Mr Abrams. "You can see the instructions happening for real with these machines. They need to have that understanding for the A-level."'"
They'll just use them to play Elite all day (Score:5, Funny)
Re: (Score:2)
http://en.wikipedia.org/wiki/File:An_der_sch%C3%B6nem_blauen_Donau.ogg [wikipedia.org]
Re:They'll just use them to play Elite all day (Score:5, Interesting)
I was going to post and suggest that they should use 29-year-old computers instead, because the BBC Micro was (designed as) an absolutely superb platform for teaching programming. Then I read TFA, and it turns out that, in fact, that is exactly what they are doing.
It was a huge step backwards when the BBC Model Bs in my school were replaced with 386 PCs. The PCs were networked and so had to have some security. The BBCs let you poke absolutely anything and booted straight into a programming language (a dialect of BASIC, but one that supported structured programming and included a built-in assembler), and included a collection of I/O ports that were trivial to connect your own projects to and drive from BASIC. The OS was stored in ROM, and if you did anything especially wrong, you just turned it off and on again and were back to the pristine condition.
Re: (Score:3, Funny)
*CONFIGURE TUBE
That got me into DR-DOS (I forget what version and can't be bothered to try and reason it out due to alcoholic indifference).
Oh, and a couple of months after I was playing with that crap my IT teacher at school turned up with a Schneider 8086 machine with an orange monochrome monitor and admitted that he had no idea how to use it but that it was the latest generation of computer that businesses were using. I actually got
Re: (Score:3, Interesting)
The problem with using an emulator is that you're then using an emulator. When you're learning to program, it's much more satisfying having something running on real hardware - even obsolete real hardware - than in an emulator. If you're going to use a virtual platform, you may as well use something like Java, in terms of how enjoyable it is.
One of the problems that they are trying to address with this idea is that modern programming is so abstracted from the real hardware that you don't
Re: (Score:3, Insightful)
Re: (Score:3, Funny)
Sadly, I remember when spelling and grammar were taught in schools. Those days seem to be gone as well, and I miss them much, much more than the days of limited computer resources.
Well, that is the end of my rant.
Mid 1980's for me (Score:3, Insightful)
We had to remote into this old Unix System V box and do a few exercises for our course. No, it's not as far back as these students are going, but it was helpful to become familiar with that kind of architecture, because you never know what's still going to be kicking around when you get on the job.
Re: (Score:2)
You really don't. You could end up with one of the two ARCnet plus deployments that were done.
Re: (Score:3, Interesting)
Hey man, don't bag on ARCnet. That shit was insane - you could pull coax for as long as the eye could see and still maintain decent throughput (for the day) and it would tolerate about as much 'stupid' as any network I've ever encountered. I saw instances of some serious 'stupid' on ARCnet networks - including one length of cable that didn't quite reach, so they spliced it using two pieces of coathanger soldered on both ends to the frayed ends of each piece of coax. They used cardboard to keep the two pi
makes sense (Score:5, Insightful)
Makes a lot more sense than starting them off in some poo like Java where they never need to know about the real hardware.
Re: (Score:2, Insightful)
Agree completely.
I like and use Java every day, but I would not suggest learning computer science with it.
I learned data structures and algorithms with plain C and pointers which IMHO is the proper way to do it.
Just recently I was reading a book about data structures and algorithms in Java, and it was very funny to see the hoops people have to jump through to create a simple linked list or stack... because Java is *not* made for that...
Re: (Score:2)
Re: (Score:3, Insightful)
B-b-but the article isn't about modern programming. It's about the A-level, which is about how computers work, as per TFS/A. Some people need to actually work on the hardware so that your modern programmers can implement their algorithms through an interface.
IMO: Great (Score:5, Insightful)
I am getting back into assembly programming after 8 years of C# and it is a bit of a shift in thought. My college switched from C/C++ to Java my senior year for incoming freshmen - a real shame. Programming is totally different when you never have to think about memory management.
Re: (Score:3, Insightful)
As a sysadmin, I agree.
I hate programmers. They need to stop writing such bloated crap, hogging my cycles and eating my RAM. Sure, it's OK on one system, but when you've got to push that resource use out across 10, 20, 30, or more servers?
That gets expensive, especially with the recent memory cost hike (yikes!). It's infuriating when the only significant change is the library/framework getting upgraded, and then you've got to upgrade a couple dozen clients and/or servers because of the resulting poor performance.
Re: (Score:3, Funny)
Now to make these 5 Apps and have them run at 1/2 of the memory footprint takes 40% more time.
Thankfully, you only need to spend six months beating your developers to be aware of memory bloat once. So, count the 40% as a capital cost and move on. (Added bonus: your cultural shift will cause new programmers to adapt or fail, thus extending the value of your investment in proper discipline!)
Re: (Score:3, Informative)
(why should you rewrite a framework every time you start a new application)
Lots of frameworks are corsets. They look nice, but they only get in the way when you want to get the job done.
'Modern computers go too fast,' (Score:4, Interesting)
Just dump Windows and goto DOS (Score:5, Interesting)
You would get exactly the same "feel" as you get with an old C=64 or Atari or Amiga machine. If your goal is to get down to the bare metal, then go ahead and do so. There's no need to dust off old machines that are on the verge of death (from age).
Re:Just dump Windows and goto DOS (Score:5, Insightful)
But DOS doesn't have all the neat tools.
I learned on a PDP-8; 16K of hand-strung RAM and a CPU slow enough that you could put an AM radio next to it and hear it compute.
This thing came with all sorts of neat tools, including assembler (of course) and a FORTRAN compiler.
You learned to program good, tight code and really, really thought about your data structures.
Sure, programming today is much, much easier and we can do lots more. I cringe when I look at FORTRAN IV code these days; it's painful. But it did teach me a lot.
Re:Just dump Windows and goto DOS (Score:5, Interesting)
Re:Just dump Windows and goto DOS (Score:4, Funny)
Or you could just fire up a terminal ... Oh, Windows. Never mind.
I just dusted off a couple of C-64s, an Amiga 500, a Sun 3/50 and an Apple IIc the other day. They were filthy. And on top of the box of cables I needed to get at.
Re: (Score:3, Interesting)
You're an old hacker and may relate to this. I found free on craigslist a hand built (hand wrapped) Z80 CP/M box with dual 8" drives and a case of diskettes. No instructions or schematics. This winter I'm going to dig into it with my scope and logic probe and see if I can get the old baby working again. I was amazed that I was still able to just look at what components were on the boards and get a fairly clear idea of how it was put together. I figure the hard part will be following the address lines a
Re: (Score:3)
>>>RTFA. Or even the stub. "Modern computers go too fast"
Which makes no sense to me. Whether you're using a modern 3000 megahertz CPU or the old 1 megahertz 6502, makes no difference. Both run faster than the human mind can analyze.
I keep saying that (Score:3, Interesting)
... if you want to know how computers work, learn microcontrollers with the Atmel 8-bit family of controllers (the ATMEGA8, for example). These things are wonderfully documented, there is a free C/ASM development environment with an emulator (single-step, breakpoints, etc.), and the real deal is just a few dollars for a development board (or get an Arduino, same thing). You don't get insight all the way down to the transistor level, but that's really just a few experiments with TTL gate chips and LEDs away.
"Actors" (Score:5, Interesting)
They each took on the role of a different part of the machine - CPU, accumulator, RAM and program counter - and simulated the passage of instructions through the hardware.
The five shuffled data around, wrote it to memory, carried out computations and inserted them into the right places in the store.
It was a noisy, confusing and funny simulation and, once everyone knew what they were doing, managed to reach a maximum clock speed of about one instruction per minute.
I wish I had a teacher like this while in [US public] school.
Sounds like fun (Score:2, Funny)
Re: (Score:3, Insightful)
Re:Sounds like fun (Score:5, Insightful)
New cars are wonderfully simple under the hood, once you strip away all the plastic. Ever taken apart an old carburetor before? Ever try to get it back together in working order? Give me a FI computer, airflow sensor, and fuel injector any day. Not surprisingly, cars went from a maintenance interval of 1,000 miles with a life expectancy of 50,000 miles to a maintenance interval of 10,000 miles and a life expectancy of 250,000 miles by *avoiding* complexity.
Re: (Score:3, Insightful)
Yes, lots. While I appreciate the old joke that "carburetor is a French word, meaning 'leave it alone'", I never found carburetors to be capricious, only more complex than 'screwdriver mechanics'. You have to know how the carburetor works, and you have to have the correct service manual, and you have to have the tools to assure the precision parts are all in spec.
The only difference with carburetor overhaul was you
Re: (Score:3, Informative)
Re:Sounds like fun (Score:4, Informative)
The model-T was even stranger: http://en.wikipedia.org/wiki/Model_T [wikipedia.org]
Probably a good idea (Score:4, Interesting)
We ran some older machines in my first programming course. When you can see the direct results in speed (or lack of it), it can help teach better approaches. Writing a game and seeing the screen flicker when you ask the CPU to do too much is good motivation to find a more efficient approach. One of our instructors also did something like this with visual sorting procedures. If you can see the difference in speed between one sorting approach and another, it sinks in.
Re:Probably a good idea (Score:5, Informative)
Re:Probably a good idea (Score:4, Insightful)
There's still nothing like having your actual computer take another 10s to run the same sort someone else's does in 1s. Our current machines are so fast that sorting 1000 items in .1s vs .01s means pretty much nothing to a learning programmer, even though the order of magnitude difference is the same. And harping on Big O isn't "getting your hands dirty".
There are people who learn quite well from theory. But that's not everybody, and actual, perceptible feedback is a very effective learning tool.
How does it work? (Score:3)
The computing A-level is about how computers work and if you ask anyone how it works they will not be able to tell you
That's what most people would say when you ask them how something works. Computers, fermentation, a wok . . . etc.
"Um . . . I dunno . . . "
Re:How does it work? (Score:4, Funny)
You don't understand - this is Bletchley Park, you know, the codebreakers during WW2. Old habits die hard. They *could* tell you, but then they'd have to kill you.
Re: (Score:3, Funny)
No, they could tell you, but you wouldn't be able to use that info on your A-levels for 50 years.
Old trick... (Score:3, Interesting)
Of course, with the advent of
Knowability (Score:5, Insightful)
One of the great things about the early micros (and probably the even-earlier minis) is that they were Knowable. With a little time, an intelligent person could become familiar with the workings of the entire architecture. I used to have a map of every memory location in the 64KB of ye olde C64 (most of it was user RAM of course) explaining what each byte was for. POKE a different value to a certain address, and the background color changes. PEEK at a certain address and it tells you the current hour. You could learn this... all of it. Obviously that's just not possible with modern computers (probably not even modern phones); no one person can grok the whole system.
Re: (Score:2, Insightful)
The professor does have a point. I remember writing one of my first BASIC programs. All it did was wait for a key to be pressed, then fill all character positions on the screen with that letter. I could watch it go line by line. Then I wrote the same program in 6502 assembler. The entire screen of characters changed instantly, as fast as I could type. A feeling of awe and sudden empowerment rushed through my vei
Re:Knowability (Score:4, Insightful)
Fortunately we've moved on
Still, the parent author has a good point about the nature of computers. There was a time when you could grasp every detail of the system. Now, to understand any one component means a lifetime of specialization. This is good because it makes computers extremely powerful, but it is bad in a way because it makes the world a mysterious place.
When I started learning html around 1997, it was simple. I could know every tag. I kept up with it through frames and into a small amount of javascript and css. Thirteen years later, many sites are extremely sophisticated, accounting for the nuances of various browsers, etc. It's certainly a formidable hobby at this point.
When the world is a mysterious place, it is a frightening place. A child growing up in the modern world sees all of our amazing technology as magic. Everything is a mysterious black box that just works, or when it doesn't, you call someone to fix it, or toss it and replace it. That's why I always respected Fred Rogers. Even if it was something as simple as taking children "behind the scenes" at the supermarket, he always strived to debunk the modern world at a pace anyone could keep up with.
We shouldn't resist technological progress, but we should be aware of how living in a high tech world affects people. Happiness is linked strongly to control. If you choose only to control your own thoughts, you will probably be very happy because success is almost guaranteed. But if, like most people, you choose to try to control the world around you, technology is a double edged sword. While in some senses it has afforded people unprecedented power, in other ways, growing up surrounded by inscrutable sophisticated electronics can be alienating.
Being able to know things is good for your health. Scientia potentia est!
Re:Knowability (Score:5, Insightful)
And where do drivers come from, faeries? Unless you want a generation of aging programmers who understand the workings of the machine to die off completely and become reliant on drivers originating from unknown mystical places, the younger generation of programmers MUST learn these things.
I do not think it is a coincidence that computers get faster every year, but my experiences as an end-user are not one bit improved since Windows 3.11. Things still break constantly. Mysterious transient problems. Multi-vendor finger-pointing because NONE of them truly understand the complete working system or want to be responsible for it. etc. etc. I still spend about 30% of my time just *waiting* for my PC to do God-knows-what between certain actions.
It's horrendous, and the problem, in my opinion, is that we have become reliant on "high level" app guys like yourself who just "trust" that whatever is beneath their app is going to do what it needs to magically and that it will all come to pass. And then the apps get heavier, and heavier, and round and round we go in perpetual mediocrity. I've been stuck in this feedback loop since 1989, and I'm telling you it makes me want to just swing a hammer for a living instead sometimes...
Re: (Score:3, Insightful)
Ridiculous (Score:2, Interesting)
The five soon discovered that just because a program was simple did not mean the underlying code was straight-forward. To make matters more testing, the BBC Micro offers a very unforgiving programming environment.
My first piece of commercial programming was on a BBC Micro and having that environment didn't teach me anything, it just made programming more of a pain than being able to cut and paste, set debug breaks and so forth. And it doesn't teach any more than using C#/VB because it's a machine designed around using BASIC, which is itself an abstraction (and IIRC, you didn't have functions, so had to endure the horror of GOSUB/RETURN).
Edlin (Score:3, Insightful)
Or was that the 70's? Gosh I can't remember now cuz I'm so old.
Edlin (Grr....) (Score:2, Insightful)
Re: (Score:2)
> Or was that the 70's?
1980: it came with early versions of MS-DOS. I avoided ever learning it by using CP/M-86 and Unix.
Emulation? (Score:2)
Wouldn't it be better to use the emulation route? For example, writing a program for the original gameboy, and running it through the emulator. I remember at university we learned assembly on an emulated MIPS. We could focus on the individual instructions, on hardware that was simple and clean, but it all ran on the unix servers (x terminals).
crufty calculator? (Score:3, Interesting)
from the link: "using 30-year-old or older machines."
from the fine article: "First released in 1981; discontinued in 1994"
I recently (three weekends ago) fired up my Commodore PET 2001 [flickr.com] (a *genuine* pre-1980 computer) and have been writing a Forth [6502.org] for it. It's really a lot of fun, and I'm finding that 30 years experience in various high-level languages has improved my "6502 assembler golf" game a lot. It's very incomplete, but the inner interpreter mostly works. Feel free to throw down on it here [github.com]
Charlie
Been there, done that (Score:5, Interesting)
10 years ago when I went through University, the core of the mandatory Assembly programming course was taught on the PDP-11 architecture, then 30 and now 40 years old.
Granted it's not quite the same. We used emulators and not the real things. Also it was for different motivations. The prof felt it was simpler to teach the cleaner PDP-11 instruction set than the 80x86 or 680x0, although the course did eventually also extend to both. Also he happened to be an expert in systems like the PDP-11.
However the idea of using old systems as teaching aids is hardly new - or news IMO.
Beebs are good machines (Score:2, Informative)
I was using them at college when they were new.
My first job was writing software that controlled scientific instruments, and there was an awful lot of educational software written for them because they were designed to be used in schools. The BASIC was more structured, and it could use microcassettes or 5 1/4" floppies with its own DOS.
In short, if you are going to use a dinosaur, it is the best dinosaur to choose
Mastery? (Score:3, Insightful)
For Mr Abrams the old machines have two cardinal virtues; their sluggishness and the direct connection they have with the user.
Another hacker learning skill you must obtain, that he forgot to mention, is how to completely master a system. This is different from merely learning enough.
At one point, I could tell you every minute detail of OS-9 (the Motorola 6809 OS, not the Apple product two decades later), and I also nearly mastered 68HC11 assembly, Z-80 assembly, and the PDP-8.
There is no point trying to teach kids how to master something using, say, the Linux kernel, because it's too freaking big, at least for a one- or two-semester course.
The mastery skill requires figuring out what you don't know and then figuring out how to find it. It's very much like spatial mapping: I see a blank spot in my map of how it all works, so how will I get from where I know to where I don't? You also learn the philosophy of a complete working system, sort of a CS-ecology mindset. Finally, there is a bit of reflective thinking that cuts across what are now usually broadly separated problem areas: look at how the memory allocation system has shaped the design of the I/O drivers, and vice versa.
Learning how to master a topic is a valuable skill, and at least for CS students, frankly best learned on the smaller older stuff. Too many newbies think asking small specific questions of google is all they need, and think they can scale up to a big project merely by asking more little questions, without thinking thru the big picture.
A fourth thing the dude forgot is that older computers were MORE powerful. Power is what comes out of the barrel of a gun; it's not P=I*V or MIPS. A single old MVS mainframe could run a small government department or a multinational corporation.
Compiling... (Score:3, Funny)
It's true, today's computers ARE too fast for students.
Kids today don't know the joy of being able to slack off for 5 to 10 minutes in class while their screen says "Compiling..."
"I like to watch the computer work." (Score:3, Interesting)
Who says what? (Score:3, Insightful)
I noticed a bunch of low (even 4 digit)/. user ids in this thread -- like the guy who got the CP/M box off craigslist. I think it would be quite interesting to do a correlation between low /. user IDs and opinion on the subject. The hypothesis is that older people will have a softer spot for older machines.
Myself? I think learning to program in older machines is a great idea. But then again I learned to program in Sinclair ZX-81's BASIC language -- back when 16kb was a memory expansion...
Any way this could be done everywhere? (Score:3, Informative)
I'm reminded of something that happened to me while I was a student assistant at a remote job entry location of a university's computer facilities.
The incoming batch of engineering freshmen were being taught, as was the tradition, to program badly in FORTRAN. An instructor assigned them the problem of counting the ways to make change for a dollar, assuming you had plenty of all the denominations of coins. How did he have them do it? Nested DO loops, one per denomination, with each denomination running from 0 to 100 / the denomination's value, of course!
The result? Bunches of students exceeding the thirty-second time limit for WATFIV jobs so their programs were cancelled before they finished. They'd run them again, of course--maybe the first time was a fluke. (The university ran on a 370/138 at the time....) Then they'd come in and ask how to run in a different job class so they weren't limited to thirty seconds.
I wrote a program in Algol W with a recursive function that would solve the general change making problem. It solved the specific one in 0.01 seconds. A friend and coworker (alas, no longer with us) wrote a non-recursive program in FORTRAN that took less than 0.01 seconds, so that the output showed it as running in 0.00 seconds. Our boss took the listings and output and had a discussion with the instructor. He, and I hope his students, learned something.
Nowadays, they wouldn't. Today's computers would run the horribly inefficient version so quickly that nobody would care, and they'd move on to the next thing.
So I applaud this approach, and hope everyone gets that experience.
Re: (Score:3, Funny)
That's not fair! We were also taught how to program badly in Lisp, Pascal and Modula-2!
Re:Does that make sense ? (Score:5, Insightful)
Re: (Score:2, Insightful)
Re:Does that make sense ? (Score:5, Funny)
How will the student then apply his knowledge to modern languages such as Java, C# ?
It's really pretty simple. After seeing what a computer can do with code intimately optimized for the machine it's running on, they will be exposed to the status quo in Java or C# and their heads will explode. Problem solved on our end!
Re: (Score:3, Funny)
Or at least cause their head veins to pulse as shown in this computer lab photo [treksf.com].
Re:Does that make sense ? (Score:4, Interesting)
I was in AP computer science over a decade ago. We used C++ using the "apstring" and "apvector" classes that were similar to the STL.
We of course had to implement bubble sort, quicksort, insertion sort, and so on.
It was fairly slow on our computers (386s/486s/maybe one Pentium!) and you could REALLY see a visible difference between the different sorts. It was very obvious.
I rewrote the sorts using standard C arrays instead of apvector. Even on those ancient computers, the differences were suddenly almost gone. Bubblesort using straight arrays was faster than apvector quicksort--at least for fairly small arrays. I don't remember the specifics anymore, but you had to be sorting IIRC several thousand things before there was much of a recognizable difference.
So yeah, that made a big impression on me. Then again that class, and intro classes in college were the last time I've had to write my own sorting algorithm...
I think it's a good thing that people who have maybe only used 2ghz+ computers are given a chance to experience something else. I guess a better question would be, why is expanding your horizons ever a bad thing?
Re: (Score:3, Interesting)
I think it's because apstring and apvector were both simplified versions of the real deal. And the entire source code for both was pretty small and understandable for people just getting into C++, templates, etc.
We at least created modified versions of them as well, extending or re-implimenting certain functions. I don't really remember too many specifics!
Re:Does that make sense ? (Score:4, Insightful)
How will the student then apply his knowledge to modern languages such as Java, C# ?
Do you believe that a school should teach Java, or teach programming?
BTW, C++ and kernel-mode C programming jobs aren't going away, and tend to pay better than Java jobs as the talent pool is growing smaller. Especially for kernel-mode programming, very few schools are turning out bright young talent with any relevant skills in that area, so the labor pool is aging out but the demand isn't shrinking.
Re:Does that make sense ? (Score:4, Insightful)
Re: (Score:3, Insightful)
Re: (Score:3, Interesting)
I don't know what the A level syllabus is
It's a little over a decade since I did mine, and I don't know how much they've changed. But FWIW mine involved partly learning algorithms / programming - in Pascal, with tiny bits of assembly - and partly a bunch of theoretical stuff such as binary (floating point) arithmetic, BNF, Codd's normal forms, basic hardware/architecture principles & protocols, etc. I can't claim to remember the proportion very accurately. Somewhere between 30:70 and 50:50 I think.
Re: (Score:3, Insightful)
I don't think this is something that's worth doing for *vocational* reasons. You don't do this because you'll produce a supply of programmers who are better at the flavor du jour of programming language. You don't do it as an *alternative* to access to modern machines either.
You do it for *educational* purposes, to produce people who understand on a deeper level what is going on than somebody who has studied for some kind of vocational certification. Perhaps they'll go onto be hardware designers, or sy
Re:Does that make sense ? (Score:5, Interesting)
That said, it does seem like a cool class. One I'd like to take, but for personal interest, not professional development.
Re:Does that make sense ? (Score:4, Informative)
They're A-level students, i.e. the final two years of school, ages 16-17 and 17-18. It's probably more interesting than making some crappy VB application, which is what I remember the A-level computing students doing (I didn't do the subject, I did extra maths instead -- it was much more useful for finding a place on a good CS course at university).
Re: (Score:3, Insightful)
If you want to get an intimate feel for writing programs without being able to waste resources, try embedded systems programming. The microchip 10F series has only a few dozen bytes of ram, and a couple hundred words of flash. And no hardware multiply. Making it do useful things is an art. Oh, and unlike some relic from the 70's, you can actually get a job programming for tiny microcontrollers.
Agreed on all of the above, but the experience of working on the relics will translate to modern embedded systems sufficiently well that I think there is value. In many cases the relics will be even slower and more RAM- and ROM-constrained than all but the tiniest of today's embedded microcontrollers.
Re: (Score:3, Insightful)
Re: (Score:3, Interesting)
Oh, the luxury.
In my digital design course we had to build a simple computer. After we'd demonstrated adders made out of NAND gates we were allowed to use an arithmetic unit chip, and wired things up with latches etc. so we had a workable bus. Programming was accomplished through DIP switches and output via an LED bank. Just like they used to do (substituting LEDs for light bulbs). When you programmed those things you made sure your code was efficient otherwise your hand would get tired flipping the swi
Re:Does that make sense ? (Score:4, Interesting)
Oh, and unlike some relic from the 70's, you can actually get a job programming for tiny microcontrollers.
Just last year, I was working for a company (A fairly large one) and they were still running programs written in DEC FORTRAN 77 on Vaxen. A part of my job was to port these programs to an.......Alpha.
LK
Au contraire (Score:2)
I have programmed on TRS-80s and 8088 w/8087s. Compiled C and Read & Go BASIC.
But now I'm programming python on an 8-core Xeon. When I'm writing a stored procedure or a nested loop of two recordsets, I ***STILL*** catch myself thinking about how slowly those instructions would take on a slower machine. "Do you know how LONG that looping will take?... oh. 0.000006 seconds. heh heh. I catch myself "subvocalizing" the loops, and I shy away from something "so resource intensive" and look for another, more e
Re: (Score:3, Insightful)
I'll bet, though, that in those cases where the performance really is that critical you're a lot more able to deal with it than someone who thinks the nested loop takes ZERO time.
Re:Au contraire (Score:4, Informative)
Re: (Score:2)
The resources of students have negative cost. That is, they pay _you_ for you to use their resources (their brains).
Re:Does that make sense ? (Score:5, Insightful)
A) It teaches people how to use unfamiliar hardware/software. Chances are the thing you are going to be running at your job is not going to be the thing you studied in university for.
B) It teaches kids how to not make mistakes in coding. Make a big enough mistake and the entire system goes down. Compilers are also a lot less fault tolerant.
C) It teaches kids how computers actually work by peeling back layers of abstraction. Think about it: has the average person under 20 ever used a CLI? For anything? I think the closest people come these days to actually using a CLI is typing something into the Windows "Run" dialog.
D) It puts things in perspective. It shows how you don't need a Core i7 to play games, that a graphics card with 100 times the memory of the entire computer isn't required to make art, etc.
E) It's fun. The old computers had a lot more easter eggs built in and lots of tiny quirks. These days you get a Dell/HP/Gateway/Acer/Asus/etc., slap Windows/Linux/OS X on it, and it's the same as any other Windows/Linux/OS X box. The old computers all had their little differences. Some were frustrating, of course, but when you don't have to do any too-serious work on it, digging out the old Commodore 64 can be kinda fun.
Re: (Score:2)
Re: (Score:2)
You were thinking of putting it in a loop.
Re:Does that make sense ? (Score:5, Funny)
Re: (Score:2, Funny)
I thought we were supposed to pick the two we liked and ignore the others...
Re: (Score:2)
Think about it, has the average person under 20 ever used a CLI? For anything?
The average person under 100 hasn't ever used a CLI. But I think it's just as important to programmers now as it was 25 years ago.
Re: (Score:2)
I love this idea. I learned 6502 assembly way back, and spent time in grad school hand-optimizing Fortran subroutines to extract the optimal performance out of a well-aged (in 1985) Cyber 730 system.
Most programmers I run into today have never once touched assembly, and many of them think that ANSI C is the same as assembly. While it has been a LONG time since I have coded anything, I have shown a couple of coders some old-school optimization techniques that blew their minds.
Of course I know that modern, opti
Re: (Score:2)
I think the closest people come these days to actually using a CLI is typing in something on the Windows "Run" dialog.
Most people don’t even use that. When the Run dialog opens, it contains the last thing that was executed from it, and on a non-technical user’s computer this always seems to be the last thing I executed from it, however long ago I last used their computer.
If you ask me, the closest thing people come these days to a CLI is the Location bar in Internet Explorer... maybe even the search bar in Google.
Re: (Score:2)
Another way to get to this level of demand is to have them work on extremely large datasets using mathematically sophisticated models. Even on a modern computer, developing a sparse matrix-represented tensor from R^{60 000} to R^{7500^2} will take a bit of planning.
This will train (mostly) the same skill set and also prepare them for real work. Unfortunately, most CS teachers don't know jack about numerical optimization/statistics/data mining.
Instead of old machinery, they can just code for the M
Conceptual Model (Score:3, Informative)
Another factor is that the conceptual model is simpler. It is possible to know the entire layout of one of these classic machines: the CPU, instruction set, registers, I/O chips, and memory layout. You can know exactly where a program will load in memory. A 6502 has an accumulator, an X register, a Y register, 6 flags, a stack pointer, and a program counter. It is possible to know exactly how the computer works on both a hardware and a software level.
Try that with a PC: what happens when you flip the power switch? Well, which
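The "whole machine fits in your head" point can be made concrete: the entire programmer-visible state of a 6502 is a handful of bytes plus a flat 64 KB address space. A hypothetical sketch (the `MOS6502State` class and field names are illustrative, not a real emulator):

```python
from dataclasses import dataclass, field

@dataclass
class MOS6502State:
    """Every register a 6502 programmer can see, in one small struct."""
    a: int = 0            # accumulator (8-bit)
    x: int = 0            # X index register (8-bit)
    y: int = 0            # Y index register (8-bit)
    p: int = 0b00100000   # status flags (N V - B D I Z C)
    sp: int = 0xFD        # stack pointer (offset into page $01)
    pc: int = 0x0000      # program counter (16-bit)
    # A flat 64 KB address space covers ROM, RAM, and memory-mapped I/O.
    mem: bytearray = field(default_factory=lambda: bytearray(64 * 1024))

state = MOS6502State()
print(f"{len(state.mem)} bytes of addressable memory")
```

Compare that to a modern x86-64 core, where even enumerating the architectural registers and cache hierarchy is a project in itself.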
Re: (Score:3, Interesting)
A) It teaches people how to use unfamiliar hardware/software. Chances are the thing you are going to be running at your job is not going to be the thing you studied in university for.
C) It teaches kids how computers actually work by peeling back layers of abstraction. Think about it: has the average person under 20 ever used a CLI? For anything? I think the closest people come these days to actually using a CLI is typing something into the Windows "Run" dialog.
I can't stress this enough. I'm 22, so close to the age range you mentioned, and I had only ever used Windows 3.1 when I was around 3 to 5 years old, and even then it was just to boot up some old Kings Quest or Math Tutor game. Beyond that I only ever used the MS-DOS prompt on Windows 95 for ipconfig so that we could get a good Age of Empires game going. Once I got into the Polytechnic that changed a lot, because one of our professors was very Linux happy and learning to use the terminal on a Fe
Re:Does that make sense ? (Score:5, Interesting)
It's not about "understanding low-level programming" - it's about having a direct connection between what you do and what happens. No virtual machine, no garbage collector, no super-fast compile/link/run/modify cycle (so you're going to take a few minutes to THINK about why something didn't work instead of just taking the "quick fix, let's test it and see if we got it right this time" route).
The article never said they were using Windows.
Better teach them C (Score:4, Insightful)
Absolutely. Better teach them C so they will know how data structures and memory management work.
Languages that try to do everything may help you write code faster but can be treacherous.
Let's see a simple example. In Python there is a subtle matter of memory management that can be dangerous to the untrained programmer. When you "copy" a list like this: a = b you are merely creating a second reference to the same list; when you copy like this: a = b[:] you are allocating memory for a new list and copying the contents.
When you know C, the difference between the two copy instructions above is obvious, but if you don't know what memory management is, this can become very difficult to understand. I bet there are many bugs created in Java, Python, and other modern languages that come from this inability to understand how the language works under the hood.
Working on old computers can be fun for some people, but to train programmers nothing beats learning C. C is close enough to the hardware to let one understand the details of how software runs, yet abstract enough to represent any typical von Neumann computer.
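The aliasing pitfall described above is easy to demonstrate. A minimal sketch:

```python
# a = b binds a second name to the SAME list object (like copying a
# pointer in C); a = b[:] allocates a new list and copies the elements.
b = [1, 2, 3]

alias = b      # no new list: both names refer to one object
copy = b[:]    # shallow copy: new list, same element values

b.append(4)    # mutate through the original name

print(alias)   # [1, 2, 3, 4]  -- the alias sees the mutation
print(copy)    # [1, 2, 3]     -- the independent copy does not
print(alias is b, copy is b)   # True False
```

Note that `b[:]` is only a shallow copy: if the list holds mutable objects, both lists still share those elements, which is the same pointer-vs-value distinction one level down.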
Re: (Score:2)
Exactly. Higher-level languages are _for_ people who already understand lower-level languages, just as calculators are _for_ people who already understand arithmetic. Schools don't give calculators to kindergarteners or any child who hasn't yet understood arithmetic. First understand arithmetic, and demonstrate you do so by working _without_ a calculator, _then_ be allowed a calculator. Giving people some high-level language on super-fast machines with "retina" pixel density and high-level languages i
Re:Does that make sense ? (Score:5, Insightful)
Yes, it makes great sense. When getting started, it really helps if you're forced to deal with the low level, and more so if you can actually see the low level.
I've spent a large part of my career writing software related to tape drives. It really helped me getting started that I could sit down in front of an old IBM 9-track reel-to-reel to test my code. Not the most useful thing for production data storage, but terrific for seeing problems with production code. Miss the end-of-tape marker? Flap-flap-flap-flap doh!
Similarly, writing and debugging production assembly code made me very comfortable with debugging and crash analysis in higher-level languages, even if I didn't quite have matching source. That experience in turn lets me understand "what really happens" with a language like C# or Java, and, for example, explain to people why the .NET file rename function is no substitute for the Win32 file rename system call, despite the fact that "they both just rename a file". Stuff that should be obvious to even a junior programmer but, well, isn't.
Re: (Score:3, Informative)
The .NET File.Move() (and FileInfo.MoveTo()) calls the Win32 MoveFile(), but you really need MoveFileEx().
Requirement: update a file in such a way that all other processes see the update as atomic, even if you crash at an arbitrary point. (Not an uncommon problem in systems programming.)
Solution: make the change to a copy of the file in a temp area, then "rename over" the known file to get the atomicity you need.
This approach to making file updates atomic is a fundamental property of a filesystem, but unav
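The write-to-temp-then-rename pattern described above translates directly to other languages. A sketch in Python (the `atomic_write` helper is illustrative; `os.replace` uses `rename(2)` on POSIX and `MoveFileEx` with `MOVEFILE_REPLACE_EXISTING` on Windows):

```python
import os
import tempfile

def atomic_write(path, data):
    """Update `path` so readers see either the old or the new
    contents in full, never a partial write."""
    dirname = os.path.dirname(os.path.abspath(path))
    # Write the new contents to a temp file in the SAME directory,
    # so the final rename stays within one filesystem.
    fd, tmp = tempfile.mkstemp(dir=dirname)
    try:
        with os.fdopen(fd, "w") as f:
            f.write(data)
            f.flush()
            os.fsync(f.fileno())  # ensure the data is on disk first
        os.replace(tmp, path)     # atomic "rename over" the target
    except BaseException:
        os.unlink(tmp)            # clean up the temp file on failure
        raise

demo_path = os.path.join(tempfile.gettempdir(), "settings.txt")
atomic_write(demo_path, "new contents\n")
```

Writing to the same directory matters because a rename across filesystems is not atomic; it degrades into a copy-then-delete.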
Re: (Score:2)
These systems have high-level I/O options (keyboard, monitor) that embedded systems don't, while avoiding all of the complexity (multi-level caching, speculative execution, out-of-order execution, etc.) of a modern desktop.
Re: (Score:2)
> These systems have high-level I/O options (keyboard, monitor) that embedded systems don't...
Some embedded systems have serial I/O and support terminals.
Re:History doesn't repeat itself (Score:5, Informative)
Actually the BBC Micro isn't far from the perfect embedded-system trainer.
From Wikipedia:
"The machine included a number of extra I/O interfaces: serial and parallel printer ports; an 8-bit general purpose digital I/O port; a port offering four analogue inputs, a light pen input, and switch inputs; and an expansion connector (the "1 MHz bus") that enabled other hardware to be connected. Extra ROMs could be fitted (four on the PCB or sixteen with expansion hardware) and accessed via paged memory. An Econet network interface and a disk drive interface were available as options. All motherboards had space for the electronic components, but Econet was rarely fitted. Additionally, an Acorn proprietary interface called the "Tube" allowed a second processor to be added. Three models of second processor were offered by Acorn, based on the 6502, Z80 and 32016 CPUs. The Tube was later used in third-party add-ons, including a Zilog Z80 board and hard disk drive from Torch that allowed the BBC machine to run CP/M programs."
Four ADCs, 8 bits of GPIO, and switch inputs, all available from BASIC on a machine with a floppy, keyboard, and monitor. Sweet.
I so wanted one of these back in the day. Too expensive and not really available in the US at the time.
Re:History doesn't repeat itself (Score:5, Insightful)
Seriously, how is this useful in modern computing, other than as a "Back in my day..." quote?
Learning how to use older/simpler machines is an excellent way to learn about a number of fundamental concepts. Modern computing, for all its advances, still operates off the same fundamental principles as it did fifty years ago; it's simply become orders of magnitude more complex.
Now, while it's perfectly possible to learn how to do this sort of thing using emulation or specialized training software, there's real value to having an appreciation of the history of the field you're planning to enter, and working with machines that were once considered state-of-the-art is a very effective way to gain a sense of just how insanely far computing has come. Note, too, that simply because you're never going to be called upon to program a PDP-8 in real life doesn't mean that you can't learn a fair amount of generally-applicable knowledge about hardware, logic, branching, execution, input, output, and instruction sets. In fact, by pulling yourself out of a familiar environment, you're forced to pay attention to important things that you'd otherwise happily ignore--like "well, how does what is in my head actually get into a computer's inner workings?"
Finally, always remember that programming is a subset of computer science. Even if all you ever expect to do is write code, a deeper knowledge of what goes on between the compiler and the electrons is going to be quite useful--and will make you a better coder, to boot.