Hardware Entertainment Technology

Atari's Home Computers Turn 40 (fastcompany.com)

harrymcc writes: Atari's first home computers, the 400 and 800, were announced at Winter CES in January 1979. But they didn't ship until late in the year -- so over at Fast Company, Benj Edwards has marked their 40th anniversary with a look at their rise and fall. Though Atari ultimately had trouble competing with Apple and other entrenched PC makers, it produced machines with dazzling graphics and sound and the best games of their era, making its computers landmarks from both a technological and cultural standpoint.
  • It was good for getting up to speed on BASIC, but the keyboard was extremely painful to use. The 800 did not have that problem. The early Apple computers got a foothold in schools; this may have been part of Atari's undoing.
    • The early Apple computers got a foothold in schools; this may have been part of Atari's undoing.

      Sounds likely. My school got C64s at some point; after that most of my classmates got C64s as well.

      • Totally. We didn't fragment until the 16-bit era, but in Ontario, Canada we were all Commodore in the 8-bit era.

        As much as I wanted an Amiga, I had to settle for a PC as my parents forbade upgrading from the C64, and I could buy used PC parts by then and smuggle them home. A CGA 286 on a TV sucked. I finally got caught when someone gave me an amber composite monitor. I couldn't hide it. I got into so much trouble.

    • IIRC, my first computer lab had: a handful of Apple IIs, an Apple IIe (with a Winchester drive), a handful of Commodore C64s, a few TRS-80 Model 3s, and 2 Osbornes. I loved the Apple IIe most out of all of that mess. This would have been mid- to late-1983. I still vastly preferred sneaking over to the teletype machine in the library and playing Star Trek on the school board's mainframe when no one else was signed up.

  • by 50000BTU_barbecue ( 588132 ) on Saturday December 21, 2019 @06:39AM (#59544112) Journal

    I grew up when home computers were just starting. My parents sent me to a computer summer camp, which was basically a few hours a day on the computers there. They were all Atari 800s, with a teacher to show us things.
    So that was my first contact with any computers as a kid. However, money being tight, I eventually got a VIC-20 with a Datasette for Christmas. A little disappointing, but I soon got started on borrowing cartridges from friends and buying computer and electronics magazines.
    Then I got into hardware and was able to build a 16K RAM expander... 8Kx8 chips were expensive, like $20 apiece in 1983ish. $40 back then is like $100 today, a lot of cash for a 13 year old.
    I kept going with Commodore stuff but really my first computing experience was with the Atari 8 bit machines. I would often use the neighbor's machine to play with their kids too.
    Ah, simpler times. It was fun, and it's too bad Atari and C= didn't survive. I guess it was inevitable: as hardware became so powerful, computers all look the same now, and it's the software that matters. No one can describe their computer's graphics in terms of a chip's name or model, or even talk about a specific register doing this or that.
    Oh well. Enough old-man wool-gathering.

    • by hawk ( 1151 ) <hawk@eyry.org> on Saturday December 21, 2019 @02:44PM (#59544988) Journal

      They did a lot of interesting things with it, but they were doomed from the start.

      I had serial number 49 as my demo machine at the store where I worked at the time. It even had the original graphics chip that didn't make it into production (more colors for some modes, iirc, but otherwise compatible).

      Anyway, it was partly the future, while maintaining too much of the past and too much complication.

      The design used an existing bus (SS-50, iirc)--but got interrupted by the "new" FCC regulations late in the process. The result was that the connector *was* there on the motherboard--but was completely enclosed by the metal "pot" used as an RF shield. *That* is why it had that silly serial interface for *everything*.

      It used what was probably the fastest (per clock cycle) 8 bit CPU, the 6502, at almost twice the speed of the Apple ][--but disabled about half the time for the display circuitry.

      And fast 8 bit or not, the writing was already on the wall for 8 bits, with the 16s already shipping. There was no long term path forward.

      The cartridges made it easy to swap programs, but were more expensive and complicated. And putting one slot in the 400 and two in the 800 pretty much meant that no-one wasted resources on something needing the second slot.

      It used its own screwball BASIC, instead of the MBASIC available on nearly every other machine of the time. Sure, it had its own advantages, but it made translating existing programs difficult (and many of the small commercial programs of the time *were* in BASIC).

      The above-mentioned serial bus made things expensive. Rather than plugging a card into a cheap slot (Apple, S-100, and a couple of lesser-known ones) that had everything already decoded on the mainboard, and supplied power and the host CPU, every peripheral (e.g., floppy) had to have its own power supply, CPU, and interface circuit, and software at both ends (or firmware on the peripheral).

      On top of being expensive, the interface was s_l_o_w. And, as if that weren't enough, the designers of the original floppy apparently hadn't heard of interleaved sectors, so each track was laid out in simple sequential order. And so by the time it had finished transmitting data to the computer, the next sector had passed, and it had to wait for it to spin around again: to be clear, this means that it could read slightly *less* than one sector per disk revolution. The "Rev C" ROM upgrade (also included in later drives) alleviated this, but still left it slower than much of the competition.
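      A back-of-the-envelope sketch of that effect (my own illustrative model, not Atari's actual firmware; the sector count, rotation speed, and per-sector host delay are assumptions):

        # Reading one track's sectors in logical order under different
        # physical layouts. Assumed figures: 18 sectors/track, ~208 ms
        # per revolution, plus a fixed host-side transfer delay.
        SECTORS = 18
        REV_MS = 208.0
        SECTOR_MS = REV_MS / SECTORS     # time for one sector to pass the head
        HOST_DELAY_MS = 15.0             # assumed serial transfer + overhead

        def track_read_time(skip: int) -> float:
            """Total ms to read sectors 0..17 in logical order when logical
            sector i sits in physical slot (i * skip) % SECTORS.
            skip must be coprime with SECTORS to be a valid layout."""
            t = 0.0
            for i in range(SECTORS):
                angle = ((i * skip) % SECTORS) * SECTOR_MS
                wait = (angle - t) % REV_MS          # rotational latency
                t += wait + SECTOR_MS + HOST_DELAY_MS
            return t

        for skip in (1, 5, 7):   # 1 = consecutive layout, i.e. no interleave
            print(f"interleave {skip}: {track_read_time(skip):5.0f} ms/track")

      With a skip of 1, each read just misses the next sector and waits almost a full revolution, exactly as described above; an interleaved layout cuts the track time by a factor of three or so.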

      It had bank switched memory built in from the start (8k blocks? It's been a while), allowing a stated capacity of 192k across three slots--but while the memory cartridges made expansion easier, they also made it significantly more expensive (and the 400 had less capacity to keep costs down). And in spite of the supposed capacity, it was initially limited to 48k due to the lack of available cartridges greater than 16k. And the very fact that there was a *need* for enough memory to require bank switching was a hint of the impending doom . . .
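      For anyone who hasn't met bank switching: the general trick (sizes here are made up for illustration, not the 800's actual memory map) is a fixed window in the CPU's address space plus a bank-select register that decides which chunk of a larger physical RAM shows through it:

        # Generic bank-switching sketch: an 8-bit CPU sees 64K, but an 8K
        # window at $4000 can be pointed at any 8K bank of a larger RAM.
        WINDOW_BASE = 0x4000
        WINDOW_SIZE = 0x2000              # 8K window
        physical_ram = bytearray(192 * 1024)
        bank_select = 0                   # which bank the window shows

        def cpu_read(addr: int) -> int:
            if WINDOW_BASE <= addr < WINDOW_BASE + WINDOW_SIZE:
                phys = bank_select * WINDOW_SIZE + (addr - WINDOW_BASE)
                return physical_ram[phys]
            return 0xFF   # other regions (fixed RAM, ROM, I/O) elided

        bank_select = 11                  # point the window 88K into RAM
        physical_ram[11 * WINDOW_SIZE] = 0x42
        assert cpu_read(0x4000) == 0x42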

      There's plenty more like this, but it really comes down to great and clever implementations within the limits of 8 bit technology just as the curtain was falling, while simultaneously being more expensive in many ways (though not the initial purchase of the base unit).

      I would, though, like to be able to actually play Sands of Mars--which shipped on something like 15 double-sided (as in manually flip) diskettes . . .

  • 8085 versus 6502 (Score:3, Interesting)

    by reporter ( 666905 ) on Saturday December 21, 2019 @06:44AM (#59544116) Homepage

    The Apple II, the Atari 800, and the Commodore 64 have one common element: the 6502 microprocessor.

    If you are nostalgic for those home computers, you can relive part of the past by buying a laptop with an ARM microprocessor. When Sophie Wilson designed the ARM, she was inspired by the 6502 and the Berkeley RISC II. In other words, the ARM is a descendant of the 6502.

    The other popular processor in the late 1970s and early 1980s was the 8085. Its descendants are Celeron, Pentium, and Xeon.

    In other words, the current battle between x86 and ARM is a replay of the battle between 8085 and 6502 in the 1970s.

    • When Sophie Wilson designed it, she was inspired by the 6502 and the Berkeley RISC II. In other words, the ARM is a descendant of the 6502.

      "Inspired' only in the sense that she thought the 6502 was underpowered, and she needed something faster. I don't think that counts as making the ARM a descendant of the 6502.

      • "Inspired' only in the sense that she thought the 6502 was underpowered, and she needed something faster. I don't think that counts as making the ARM a descendant of the 6502.

        There was more to it than that - the interrupt handling and memory access were inspired by the 6502 [wikipedia.org], and a visit to the 6502's developers convinced Acorn that it was feasible to develop their own CPU.

        Also, although the 6502 may not technically be a RISC chip according to the tick-list taught in CS, the 6502 vs. Z80/8080/x86 holy wars of the time had a lot of parallels with the RISC vs. CISC debate*: the 6502 was a simple design, with a small instruction set and no microcode vs. the more complex Z80/x86 with

        • and even x86 has switched to a RISC-core + instruction decoder architecture.

          That's a hell of a spin. The x86 won the performance war because of the CISC instruction set, and the fastest silicon all have CISC instruction sets, including GPUs (you ain't seen nothing until you've seen a GPU instruction set, where an instruction can add two values and multiply two others while swizzling packed registers).

          The last RISC processor to be king of performance was the DEC Alpha, which died at the decoder-bandwidth altar. While CISC may be complicated to decode, it's still much less die area th

          • CISC vs RISC, and uKernel vs monolithic Kernel.

            We lost an entire generation on nonsense. Now even my desktop has a 12MB L3, making it all silly.

          • That's a hell of a spin.

            Not as much as trying to change the subject from general-purpose CPUs to GPUs.

            The x86 won the performance war because of the CISC instruction set, and the fastest silicon all have CISC instruction sets

            The x86 won the war because DOS/Windows won the OS war, not because it was a particularly good processor. Once IBM chose it, the game was up. The first ARM processors, designed for desktop PCs and workstations, ran rings around their x86 contemporaries, but couldn't run Windows (and hence MS Office, Internet Explorer etc.) and so never had a chance to escape from a UK Education/Enthusiast ghetto. After that, future ARM designs we

          • by shess ( 31691 )

            and even x86 has switched to a RISC-core + instruction decoder architecture.

            That's a hell of a spin. The x86 won the performance war because of the CISC instruction set,

            But not really because it was CISC. Rather, because CISC instruction sets provided a higher layer of abstraction which could be internally implemented in new and different ways, whereas RISC instruction sets baked in many underlying implementation decisions. Over time, it was worthwhile to tweak compilers to output different patterns of CISC instructions as implementations changed and sped up some instructions relative to others, but that work came after the fact, and was often motivated by the install

    • by Misagon ( 1135 )

      That's understandable. The 6502 was a non-pipelined CISC but it didn't need many clock cycles per instruction.
      It was often a case of one cycle per memory access and one cycle for the computation on-chip. Divide those tasks into different instructions and pipeline them, and you would basically get a RISC.

      But I would say that its main competitor on home computers was not the 8085 but the close sibling the Zilog Z80 (both derivatives of the 8080). The Z80 had some more complicated instructions, but it often needed four times as many clocks as the 6502 to complete an equivalent instruction. In home computers, the 6502 often ran at ~1 MHz but the Z80 was often clocked at ~4 MHz to stay competitive with it.

      • That's understandable. The 6502 was a non-pipelined CISC but it didn't need many clock cycles per instruction.
        It was often a case of one cycle per memory access and one cycle for the computation on-chip. Divide those tasks into different instructions and pipeline them, and you would basically get a RISC.

        But I would say that its main competitor on home computers was not the 8085 but the close sibling the Zilog Z80 (both derivatives of the 8080). The Z80 had some more complicated instructions, but it often needed four times as many clocks as the 6502 to complete an equivalent instruction. In home computers, the 6502 often ran at ~1 MHz but the Z80 was often clocked at ~4 MHz to stay competitive with it.

        I would say that the main competitor to the 6502 was the 6800. Which makes sense, since the 6502 was designed by ex-Motorola 6800 engineers.

        But the 8 bit CPU that I wish had taken off more was the Motorola 6809. That thing was incredible; but other than the Radio Shack Color Computer (CoCo), some industrial single-board computers, and Williams(?) pinball machines, it just never seemed to get the cred it deserved...

    • Re:8085 versus 6502 (Score:5, Interesting)

      by NoMoreACs ( 6161580 ) on Saturday December 21, 2019 @07:50AM (#59544192)

      The Apple II, the Atari 800, and the Commodore 64 have one common element: the 6502 microprocessor.

      If you are nostalgic for those home computers, you can relive part of the past by buying a laptop with an ARM microprocessor. When Sophie Wilson designed the ARM, she was inspired by the 6502 and the Berkeley RISC II. In other words, the ARM is a descendant of the 6502.

      The other popular processor in the late 1970s and early 1980s was the 8085. Its descendants are Celeron, Pentium, and Xeon.

      In other words, the current battle between x86 and ARM is a replay of the battle between 8085 and 6502 in the 1970s.

      I cut my teeth on Assembly Language (and sometimes just raw Machine Language) on my Apple 1.

      I have written literally tens of thousands of lines of 6502 Assembly.

      I have also written ARM7TDMI Assembly Language. Not a whole lot; but enough.

      And in between those two, I have written Assembly Language stuff for about a dozen microprocessor/microcontroller families, and multitudes of variants of each.

      About the only thing those two microprocessors, 6502 and ARM, have in common is that they are both von Neumann architectures.

        About the only thing those two microprocessors, 6502 and ARM, have in common is that they are both von Neumann architectures.

        I don't think anybody is claiming that the ARM is some sort of 16 or 32-bit update of the 6502 (that would be the 65C816 [wikipedia.org] as in the Apple IIGS).

        It's not about the instruction set (the 6502 instruction set wouldn't even make sense on a 32 bit machine, and since an ARM has always been able to simulate a 6502 in software faster than the real thing, backwards compatibility was never an issue) - but the influence of 6502's design on ARM vis. things like interrupt handling and hard-wired instructions is well documented in any article on the history of ARM [wikipedia.org].

        • ...but the influence of 6502's design on ARM vis. things like interrupt handling and hard-wired instructions is well documented in any article on the history of ARM [wikipedia.org]

          Wanting a "similar efficiency ethos" for ARM is a design philosophy choice but nothing about the 6502's physical architecture ended up in ARM.

      • I cut my teeth on Assembly Language (and sometimes just raw Machine Language) on my Apple 1.

        I have written literally tens of thousands of lines of 6502 Assembly.

        I have also written ARM7TDMI Assembly Language. Not a whole lot; but enough.

        And in between those two, I have written Assembly Language stuff for about a dozen microprocessor/microcontroller families, and multitudes of variants of each.

        Me too. I started on the Acorn Atom, later programmed the BBC Micro, and eventually had an Acorn Archimedes.

        About the only thing those two Microprocessors, 6502 and ARM, have in common is that they both are Von Neumann architecture.

        This. ARM was in no way "inspired" by 6502. There's zero common ground in the instruction set. None.

      • by AmiMoJo ( 196126 )

        The 6502 demonstrated a few things that pointed the way for ARM.

        The 6502 would get similar performance to a Z80 running 4x as fast. Many instructions were one cycle and RISCy in their simplicity. It had the special zero page, which showed how valuable a large number of registers could be.

        • by globe0 ( 1000025 )

          The 6502 would get similar performance to a Z80 running 4x as fast.

          Theoretically; in practice it was 'only' 2x-3x faster, and depending on the task even slower (the lack of registers hurt).

          Many instructions were one cycle

          Not a single 6502 instruction was one cycle; the shortest ones were 2 cycles long (NOP, INX/Y, TAX/Y, LDA/X/Y with an immediate operand, etc.). As soon as you needed to access memory it was 3+ cycles (zero page access was faster, while normal memory access took 4 and more - like indexed addressing modes crossing a page boundary).
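          To make those numbers concrete, here is a rough tally for a classic 6502 copy loop, using the documented datasheet timings (the loop itself is just an illustration; page-crossing penalties are ignored):

            # Cycle cost of:  loop: LDA data,X / STA dest,X / INX / BNE loop
            CYCLES = {"LDA abs,X": 4, "STA abs,X": 5, "INX": 2,
                      "BNE taken": 3, "BNE fall-through": 2}

            def copy_loop_cycles(n: int) -> int:
                per_iter = CYCLES["LDA abs,X"] + CYCLES["STA abs,X"] + CYCLES["INX"]
                return (n * per_iter
                        + (n - 1) * CYCLES["BNE taken"]
                        + CYCLES["BNE fall-through"])

            total = copy_loop_cycles(256)
            print(f"{total} cycles to copy 256 bytes")            # 3583
            print(f"~{total / 1.79e6 * 1000:.1f} ms at the Atari's 1.79 MHz")

          At roughly 14 cycles per copied byte, you can see why a ~1 MHz 6502 could keep up with a ~4 MHz Z80 doing equivalent work.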

    • The 8086 [wikipedia.org] is likely the CPU of which you're thinking. That was the one that launched modern x86 architectures. A variant of the 8086, the 8088 [wikipedia.org], had an 8-bit data bus and was found in the original IBM PC and early 1980s IBM PC clones.
    • The 8085 was only in the TRS-80 Model 100. Most good x86 machines of the time sported Z80 chips. Only a 6502 enthusiast (a far weaker, but also cheaper processor) would even mention the 8085. Intel didn't win back the x86 market from Zilog until the IBM PC used the 8088 processor. By that time it was a battle between the x86 and the 68000, and the 6502 was a processor for family/kids' computers in cheap plastic cases that you could buy in Department Stores.

      • by _merlin ( 160982 )

        It wasn't x86 until the 8086 (and its 8-bit friend the 8088). Both the 8085 and Z80 were designed to be a better 8080 with binary compatibility. The Z80 did a better job of that, powering a generation of CP/M machines as well as other home computers (and living on in the TI graphing calculators). The 8086 was only 8080-compatible at assembly language level.

          • The point being that what people term 'x86' today has a direct traceable lineage from the 8080 and its predecessor, the 8008.

    • Re:8085 versus 6502 (Score:5, Informative)

      by AmiMoJo ( 196126 ) on Saturday December 21, 2019 @11:11AM (#59544436) Homepage Journal

      Chuck Peddle, designer of the 6502, died this week. Not much has been said about it for some reason.

      • It's too bad we don't have some sort of globally accessible, free, distribution system for technical news. Perhaps one where we could even share our experiences about the given topic, and post ASCII swastikas.
        • by AmiMoJo ( 196126 )

          I did think about posting a story on his death but as often as not they get marked as spam and I temporarily lose the ability to post.

          I don't know if it's the anti-spam system going wrong or trolls moderating my submissions, but it's certainly discouraging.

    • > In other words, the current battle between x86 and ARM is a replay of the battle between 8085 and 6502 in the 1970s

      except... the 6502 accomplished roughly twice the work per clock tick as the 8080/Z80, while today's x86/AMD64 accomplishes 2-4x the work of an ARM per tick(*). So now, the advantage is switched.

      (*) Ergo, the reason why a 2Ghz 4-core i7 w/hyperthreading beats a 2.8Ghz 8-core ARM to a bloody pulp, then metaphorically flushes it down the toilet.

  • by AxisOfPleasure ( 5902864 ) on Saturday December 21, 2019 @08:13AM (#59544208)

    The Atari VCS 2600 made a huge impact in the UK, but Atari's 8-bits just never made much impact here, which is odd, as fellow US company Commodore did very well in the UK with the VIC20 and then the very popular CBM64, which fought against the Sinclair Spectrum.

    The microcomputer "revolution" of the 1980s was a fragmented mess, at least here in the UK; there must have been dozens of vendors all plying their own machines, all completely incompatible barring the MSX range. Here in the UK it was Sinclair and Commodore that took the crown, with BBC and Amstrad, then Atari, taking the next few places. As we hit 16-bit, Atari pushed the ST, which I absolutely loved coding on. My last 8-bits were the Amstrad 464 and then the 664; Amstrad skipped the 16-bit micro and went straight to the PC market with their clone, and that's where I got started with my first PC, which my parents bought in 1987. Never looked back.

    And ultimately that's what Atari should have done: once the ST had secured its niche in music studios, they should have used that to springboard into the PC market. But they were too busy fighting with Commodore in the proprietary 16-bit micro arena, and that's sadly where they spent their last days. Shame.

    • by janeil ( 548335 )

      I also loved coding on the Atari ST series' Motorola 68000; what a great CPU for its day. Sixteen 32-bit general purpose registers! The 'movem' opcode! I was really close to buying a TT030 but it just never seemed to be ready. How sad I was when I moved to Microsoft in the mid-90's and saw Intel 80386 assembly: all that offset BS and protected memory, what a drag.

      I still have my 1040Ste desktop and hard drive data stashed on all my machines to run with Steem every once in a while, it still works fine on wind

      • The 80386 was 'segmented' for compatibility with older generations. It proved to be the springboard for Linux once somebody was bold enough to throw away concern with being 'IBM compatible' and let the thing sing.

      • How sad I was when I moved to Microsoft in the mid-90's and saw Intel 80386 assembly: all that offset BS and protected memory, what a drag.

        The 386 never required you to use pages with offsets; it always had full 32-bit addressing.

      • by Misagon ( 1135 )

        It was even nicer to code on the Amiga. ;)
        I still have two Amigas and a STe. The Amiga's chipset was the true successor to the Atari 8-bit's, both having been designed by Jay Miner, and with some features being similar: e.g. display lists vs. copper lists.

        I hear you about x86 assembly. In the early '90s, after having become proficient with the 68000, I wanted to learn assembly programming for MS-DOS.
        I was baffled at how primitive the 80286 was in comparison to the elegant 68000. It was like being back to 65

        • That was MS-DOS being the constraint, not the 80286 processor. The 286 was specifically designed for much more than DOS, with protected mode, etc.

          • I spent literally *years* circa 1991-1993 trying to figure out how to do 386 assembly with flat addressing (ok, a 4GB segment, which circa 1992 *was* essentially infinite memory with flat addressing).

            It was supremely frustrating... "everyone" knew the 386 & beyond could *do* it... but nice books coherently explaining how to use it simply didn't exist... and when they finally did, stores like Borders didn't sell them, and finding them to special-order was damn hard unless you already knew what you were looking for (e

      • by nagora ( 177841 )

        The 68K was a great CPU for any day. The current Intel/AMD lineup is a pile of shit compared to it, at least in terms of programming model. If Motorola had done what Intel did and cut every corner it could in order to pump the CPU clock, maybe they would have won. But probably not, as Intel's shitty chips would never have been installed in a computer by the 1980s if market forces had anything to do with it.

        Garbage instruction set, terrible en/decoding pipeline, a whole ONE general purpose register (they alway

        • by Megane ( 129182 )
          The second stack pointer was probably the best thing about the 6809. The inner loop of a FORTH interpreter was literally two instructions. It's too bad that Motorola (apparently) decided it was too hard to make a microcode version of its instruction set when they did the 68HC11.
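          For the curious: in a threaded-code Forth, the inner interpreter ("NEXT") just fetches the next word's address and jumps to it, which the 6809's auto-increment and indirect addressing handle in two instructions (something like LDX ,Y++ / JMP [,X]). A minimal sketch of the same idea in Python (my illustration, not any historical implementation):

            # Threaded-code inner interpreter: a "thread" is a list of word
            # implementations; NEXT = fetch the next word, dispatch to it.
            def run(thread):
                ip = 0                  # interpreter pointer
                stack = []              # data stack
                while ip < len(thread):
                    word = thread[ip]   # NEXT step 1: fetch (LDX ,Y++)
                    ip += 1
                    word(stack)         # NEXT step 2: dispatch (JMP [,X])
                return stack

            def lit(n):                 # push a literal
                return lambda s: s.append(n)

            def add(s):                 # '+': pop two, push the sum
                b, a = s.pop(), s.pop()
                s.append(a + b)

            print(run([lit(2), lit(3), add]))   # [5]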
    • The Atari VCS 2600 made a huge impact in the UK, but Atari's 8-bits just never made much impact here, which is odd, as fellow US company Commodore did very well in the UK with the VIC20 and then the very popular CBM64, which fought against the Sinclair Spectrum.

      Atari never stood a chance; they used cartridges to distribute games when all the others used cassette tapes, which could be copied by connecting two cassette players together at the local computer club and pressing play on one and record on the other.

      (plus the Ataris were more expensive...)

      • by ncc74656 ( 45571 ) *

        Atari never stood a chance; they used cartridges to distribute games when all the others used cassette tapes

        On what planet? Cartridges were actually pretty common. They were used not only on the 8-bit Ataris, but the Commodore VIC-20, 64, and 128, TRS-80 Color Computer, TI-99/4 and 99/4A...even the IBM PCjr had cartridge slots. About the only platforms of note that didn't use cartridges were the Apple II and IBM PC, and on those, disk drives took over as primary storage fairly early on (the IIc, IIGS, a

        • Cartridges were actually pretty common. They were used not only on the 8-bit Ataris, but the Commodore VIC-20, 64, and 128, [...]

          While they did exist, I have never seen cartridges with software on the Commodore computers: all software came on tape (or on diskette, but Commodore 64/128 disk drives were expensive and slower than cassettes on other home computers, so disks were never popular on the C64/128).

          The only cartridges which were popular on the C64 were the Final Cartridge [c64-wiki.com] and others which expanded the functionality and made it easier to hack the machine.

          • by Megane ( 129182 )
            There were quite a few C64 cartridges, but they were almost all games, and most that I have found were produced by Commodore. Floppy disks and cassette tapes are cheaper to produce, never mind the C64's abysmal floppy disk speed.
      • by Megane ( 129182 )
        It was always kind of odd that Europe/UK stayed with cassette tape for so long, when the US quickly moved to floppy drives on everything. Even the slow floppy drive interface on the C64 can't explain it, as a C64 with a tape drive was quite rare in the US.
  • by Shemmie ( 909181 ) on Saturday December 21, 2019 @09:07AM (#59544262)

    "Though Atari ultimately had trouble competing with Apple and other entrenched PC makers, it produced machines with dazzling graphics and sound and the best games of their era"

    You spelt 'Amiga' wrong.

    This will never die. Unlike the companies.

    • by Saffaya ( 702234 )

      This article is about ATARI 8-bit computers, not the 16/32 ones, so bringing some AMIGA sour grapes isn't warranted.

      Unless you would add that the AMIGA was the design-wise successor to the ATARI 800 (same designers), in which case you would be right.

      And with its philosophy of power without the price, the ATARI ST was the spiritual successor to the C64 (same CEO).
      Hence, any animosity between ATARI ST and AMIGA fans is totally pointless.

      Both were awesome and had their own edge anyone with an impartial mind would

    • The Atari was launched in 1979, long before the Amiga, long before the C64 and even before the VIC 20.

      So yes, it had the best graphics/sound of its era.

  • I have very fond memories of being introduced to this machine in 2nd grade. I remember well its beeping membrane keyboard and cavernous cartridge slot concealed by a popup hatch. It threw off a lot of heat too when running -- the whole plastic chassis was generously perforated with vertical venting. But that's where I first learned rudimentary BASIC programming. Almost four decades later, it is truly remarkable to see how computing has changed the world, yet I feel that I still have a touchstone back to
    • I wonder how well the membrane keyboards of the early 80's held up over this time. I have seen lots of membrane keypads for POS systems that have the plastic worn away in just a few years. This is why I always thought that the Atari 800 was the better machine - it had independent key switches.
  • My first computer was the wire-wrapped 1802 COSMAC VIP I made following the design out of Popular Electronics. The 400 was my first purchased computer. My father and I drove to the USA to get me one. I had the choice of the Atari or an Apple, and I'm so glad I picked the Atari. Learned a lot from that machine.
  • Atari should have released a console version that ran the cartridge games. Would have been a vast improvement over the 2600. The Atari 800 could have been the development machine, and they would have sold a bundle of them. The Atari 5200 was a disaster, and the 7800 was too late. We could still be using Atari consoles if they had made that jump early.
    • by hawk ( 1151 )

      They would have had to do a hardware implementation of the 2600 for this to work--the 6502 didn't have the power to emulate itself and simulate the graphics.

      The 2600 used a 6502 variant in a smaller package with fewer memory address lines. It was responsible for *everything*, down to sending the retrace signals.

      While such a machine would probably have been desirable to folks whose 2600 went out, the extra cost for everyone else would have killed it.

      The Apple ][ emulation mode on the Apple /// was implemente

      • Not 2600 cartridge games, Atari 8 bit computer cartridge games. They were far more advanced.
        • by hawk ( 1151 )

          but we're *talking* about the 8 bit computer, the 400/800.

          I assume that the 1200 (?) was backward compatible with them.

          Later 68k based Atari machines would have had to pretty much do hardware emulation; Atari was over by the time software emulation could have been a thing.

        • by Megane ( 129182 )

          It was called the Atari 5200, and they still managed to fuck it up. Slight incompatibility with 400/800 code, horrible controllers, no SIO bus. The controllers were probably the worst part, flex circuits with tin-plated contacts that oxidized within months, anti-ergonomic fire button placement (having to use your thumb both to hold the controller and press the fire button), non-centering joysticks, and more. They also had 15-wire cables; if any wire broke inside they were basically useless. But they were a

    • by RedMage ( 136286 )

      Hindsight isn't all that clear here - I don't think it would have been feasible at home-console prices until the 5200 release date, and even that was pretty expensive for a home console when it first came out. The 5200 debuted at $269 USD in 1982, about the same time the C64 came out. The Atari 800 was still selling at around $500 USD or more if you wanted storage or memory. I'm not sure Atari could have made the 5200 attractive, especially without 2600 backward compatibility.
      By the time the 7800 came ou

      • by Megane ( 129182 )
        FWIW, the 2600 module for the 5200 came rather late in its life. Only the very last board revision of the original 4-port console was even compatible with it. You had to send in your 5200 to be "modified", probably a motherboard replacement, meaning no 4-port sold in stores may have been compatible. And it was still a parasite module like the one for Colecovision, and needed its own controllers. So it was basically a non-issue for selling the 5200.
    • > Atari should have released a console version that ran the cartridge games.

      They did: the Atari XEGS https://en.m.wikipedia.org/wik... [wikipedia.org]

      Remember, Atari as a company vanished from the earth for ~2 years. The XEGS was a 1985 product that finally became available in 1987 after Atari's resurrection.

  • The Atari 800 was awesome; I still have mine and play games on it regularly. It was built to last. It was where I started my programming hobby, and I still tinker with all things hardware. I just wish my old Atari 810 disk drive still worked, with its whopping 96K floppies :) It was much better than my Atari 410 tape drive. Nowadays I have an SIO to USB adapter to emulate the drive and store all the stuff on my PC.

  • by Tulsa_Time ( 2430696 ) on Saturday December 21, 2019 @05:33PM (#59545520)

    That is 500 in Computer (dog) years...

  • Yes, I'm the proud owner of an Atari 800 plus 2 "Happy" drives. Great little computer. The Atari 800 had a dedicated graphics chip (Antic) and dedicated sound chip (Pokey) plus a bunch of other stuff that was pretty forward-thinking.

    It's boxed up and tucked away down in the basement, but I still have it all- the console with the *real* keyboard, hundreds of games and their manuals, joysticks, Courier HST modem, etc etc. Tons of stuff all bagged and boxed. I should sell it as a bundle on ebay.

  • ...but I hated my friend's Atari computer.

    Who thought a membrane keyboard was a good idea?

    Ironically, we were playing on that (Star Viking?) when his dad, a senior VP at IBM, came in and told us they were soon going to come out with a computer to compete with the Apple, called a "PC".

  • I got mine the day I was leaving Seattle on vacation to San Diego for Christmas. I had no fear that I could connect it to my mom's TV. I sponsored the S*P*A*C*E (Seattle-Puget Sound Atari Computer Enthusiasts) group at our Seattle Byte Shop store. We eventually outgrew that space and had to move to a location in Kent, WA. Alan Kay brought us the printed-out listing for the 400/800 ROMs to encourage application development. He showed a video of him dressed as a computer chip explaining what the A
  • The original Atari 400 and 800 were designed when the FCC Class B requirements for electronic noise emissions from home electronic devices (like computers) were first adopted. At the time most consumer electronics manufacturers were saying it was an impossible standard to meet and trying to convince the FCC to back down; meanwhile, the engineers at Atari designed a home computer that not only met but comfortably exceeded the requirements. They did it by putting a metal cage around everything, even the plug-

  • Jay Miner was involved in creating the Atari home computer range; later on he developed the Amiga.

