
30 Years of the BBC Micro

Posted by timothy
from the still-living-in-the-basement-probably dept.
Alioth writes "The BBC has an article on the BBC Microcomputer, designed and manufactured by Acorn Computers for the BBC's Computer Literacy project. It is now 30 years since the first BBC Micro came out — a machine with a 2 MHz 6502, remarkably fast for its day; the Commodore machines at the time only ran at 1 MHz. While most U.S. readers will never have heard of the BBC Micro, the BBC's Computer Literacy project has had a huge impact worldwide, since the ARM (originally meaning 'Acorn RISC Machine') was designed for the follow-on to the BBC Micro, the Archimedes, also sold under the BBC Microcomputer label by Acorn. The original ARM CPU was specified in just over 800 lines of BBC BASIC. The ARM CPU now outsells all other CPU architectures put together. The BBC Micro has arguably been the most influential 8-bit computer the world has seen, thanks to its success planting the seed for the ARM, even if the 'Beeb' was not well known outside of the UK."
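The idea of "specifying a CPU" in a high-level language is just an interpreter over an instruction set. As a rough illustration of the approach — a toy register machine, not the real ARM ISA, with mnemonics and semantics invented for this sketch — here is the same technique in Python rather than BBC BASIC:

```python
# Toy register-machine interpreter, in the spirit of (but far simpler than)
# the BBC BASIC program that specified the original ARM.
# The opcodes and their semantics are invented for this sketch.

def run(program, regs=None):
    """Execute a list of (op, *args) tuples over 4 registers; return registers."""
    regs = regs or [0, 0, 0, 0]
    pc = 0
    while pc < len(program):
        op, *args = program[pc]
        pc += 1
        if op == "MOV":            # MOV rd, imm   -> rd = imm
            rd, imm = args
            regs[rd] = imm
        elif op == "ADD":          # ADD rd, rn, rm -> rd = rn + rm
            rd, rn, rm = args
            regs[rd] = regs[rn] + regs[rm]
        elif op == "SUB":          # SUB rd, rn, imm -> rd = rn - imm
            rd, rn, imm = args
            regs[rd] = regs[rn] - imm
        elif op == "BNZ":          # BNZ target, rt -> branch if rt != 0
            target, rt = args
            if regs[rt] != 0:
                pc = target
        else:
            raise ValueError(f"unknown opcode {op}")
    return regs

# Sum 1..5 with a loop: r0 = accumulator, r1 = counter.
prog = [
    ("MOV", 0, 0),      # r0 = 0
    ("MOV", 1, 5),      # r1 = 5
    ("ADD", 0, 0, 1),   # r0 += r1
    ("SUB", 1, 1, 1),   # r1 -= 1
    ("BNZ", 2, 1),      # loop back to ADD while r1 != 0
]
print(run(prog)[0])     # 15
```

The real specification was of course vastly more detailed (condition codes, barrel shifter, memory system), but the shape — a fetch/decode/execute loop in an interpreted language — is the same.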
  • jaded (Score:5, Insightful)

    by masternerdguy (2468142) on Thursday December 01, 2011 @08:53AM (#38225350)
    People used to get excited when a CPU clock was measured in MEGAHERTZ! Now we're jaded...
  • by Anonymous Coward on Thursday December 01, 2011 @09:00AM (#38225404)

    The most remarkable thing about the BBC is that they're still going, running production code.

    I had the good fortune of working with (or rather, near) one of these systems a few years ago. When I asked why they hadn't upgraded the machine in nearly 3 decades, the head of the system simply responded: "It still works."

  • Re:jaded (Score:5, Insightful)

    by syousef (465911) on Thursday December 01, 2011 @09:05AM (#38225448) Journal

    People used to get excited when a CPU clock was measured in MEGAHERTZ! Now we're jaded...

    The fucking things did not run a GUI that emulated transparent glass. They couldn't process the video or images that we use today. People used to get excited about ASCII art and how clever it was. Today you can see pictures Hubble has taken in intricate detail, and instead of playing ASCII strip poker people are viewing HD porn instantly.

    When home computers were new anything they could do was a marvel. Now we've seen what more processing power can do. We have a lot of bloat. We also have a lot of functionality that is taken for granted. You have to remember that international direct dialing was considered a wonder when the BBC microcomputer was introduced. ("What, you mean no operator connects you!?")

  • Speaking of apple (Score:2, Insightful)

    by goombah99 (560566) on Thursday December 01, 2011 @10:06AM (#38225990)

    In the late 1980s Apple Computer and VLSI Technology started working with Acorn on the second generation of the ARM core. So once again Apple is there. It's getting like the black monolith in 2001. Pick anything and Apple may not have invented it, but they did shape what it became.

  • Re:jaded (Score:4, Insightful)

    by Miamicanes (730264) on Thursday December 01, 2011 @10:38AM (#38226340)

    Which means we go back to the same strategies we used in the '80s and early '90s -- coprocessors. Or, put another way, multiple cores, stacked GPUs, DMA, hardware DSPs. And (gasp), the Second Coming of CISC.

    At the end of the day, RISC was a way to get cheaper megahertz (and later, gigahertz). Now that we've largely maxed out clock speed to the point where it's almost counterproductive, CISC is just about the only place we have left to go. Instead of wasting 50 cycles loading values into registers, operating on those registers, evaluating the outcome, and branching based upon it, you can have complex variable-length opcodes that use billions of transistors and have sinful amounts of silicon dedicated to niche operations that would have been absurd to even contemplate 25 years ago — and do it in far fewer clock cycles.
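    The load/operate/evaluate/branch sequence versus a single complex opcode can be sketched as a toy model (invented mnemonics, not any real ISA — just counting the dynamic steps each style takes for the same operation):

```python
# Toy contrast between RISC-style and CISC-style encodings of the same
# operation: increment a memory word and test whether it became zero.
# The "instructions" logged here are invented for illustration.

mem = {0x100: 7}

risc_trace = []
def risc_inc_and_test(addr):
    """RISC-style: each step does one small thing."""
    r0 = mem[addr];    risc_trace.append("LDR  r0, [addr]")   # load
    r0 = r0 + 1;       risc_trace.append("ADD  r0, r0, #1")   # operate
    mem[addr] = r0;    risc_trace.append("STR  r0, [addr]")   # store
    taken = r0 != 0;   risc_trace.append("BNE  loop")         # evaluate + branch
    return taken

cisc_trace = []
def cisc_inc_and_test(addr):
    """CISC-style: one opcode with a memory operand does it all."""
    mem[addr] += 1;    cisc_trace.append("INC  [addr]  ; sets flags")
    return mem[addr] != 0

risc_inc_and_test(0x100)
mem[0x100] = 7                      # reset for a fair comparison
cisc_inc_and_test(0x100)
print(len(risc_trace), len(cisc_trace))   # 4 1
```

    The dynamic-instruction count is the whole point of the parent's argument: the complex opcode buys fewer architectural steps per unit of work, at the cost of more silicon spent decoding and executing it.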

    There's a reason why a 16MHz 68000 can still run circles around a 100MHz ARM, and why a 1GHz Pentium-M beats a 1GHz Atom or ARM to a bloody pulp -- the CISC chips get more done behind the scenes with every public clock cycle. The fact that behind the curtain, they're secretly executing chains of RISC instructions with private, semi-asynchronous clocks as fast as they can & just presenting the public facade of a CISC architecture responding to a system-wide clock is a quibble. The point is that every time the public system clock ticks, they're getting WAY more done than a conventional RISC architecture could ever fantasize about. In effect, a modern AMD64 (or Core2) CPU is like a container full of virtual, disposable/pooled RISC processors that get instantiated to execute a single public opcode while privately dancing to the beat of their own drummer.

  • by itsdapead (734413) on Thursday December 01, 2011 @12:55PM (#38228400)

    The current ARM has little to do with the BBC Micro. Apple purchased a stake in Acorn with the goal of getting them to fab a low-powered CPU to power the Newton.

    ...except that if the 6502 BBC Micro hadn't happened, Acorn wouldn't have developed the ARM2/3 to use in the next-gen BBC Micro, and there wouldn't have been anything for Apple to buy into. It may have evolved since then, but Apple sure as hell didn't invent the ARM.

    The first ARM-based machines were the ARM2-powered Acorn Archimedes range, released in 1987, the entry level model of which was still branded as "BBC Micro". At the time, they kicked sand in the face of 80286-based machines. The Newton didn't appear until much later.

    Cheekily, in 1994, Apple touted their new PowerPC-based Macs as the first RISC-based personal computers.
