Turning the Arduino Uno Into an Apple ][
An anonymous reader writes: To demonstrate how powerful modern computers are compared to their forebears, engineer Damian Peckett decided to approximate an Apple ][ with an Arduino Uno. In this post, he explains how he did it, from emulating the 6502 processor to reinventing how characters were displayed on the screen. "The Apple II used a novel approach for video generation. At the time, most microcomputers used an interlaced frame buffer where adjacent rows were not stored sequentially in memory. This made it easier to generate interlaced video. The Apple II took this approach one step further, using an 8:1 interlacing scheme, with the first line followed by the ninth line. This approach allowed Steve Wozniak to avoid read/write collisions with the video memory without additional circuitry. A very smart hack!" Peckett includes code implementations and circuit diagrams.
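For the curious, the text page follows the same pattern. Here is a minimal C sketch of the commonly documented text-page-1 address formula (taken from standard Apple II references as an assumption, not from Peckett's post), showing why line 0 is followed in memory by lines 8 and 16:

    #include <stdio.h>

    /* Commonly documented Apple II text-page-1 layout (standard
     * references, not Peckett's code): 24 rows of 40 bytes stored
     * 8:1 interleaved, leaving an 8-byte hole per 128 bytes.     */
    unsigned text_row_addr(unsigned row)    /* row: 0..23 */
    {
        return 0x400 + (row % 8) * 0x80 + (row / 8) * 0x28;
    }

    int main(void)
    {
        for (unsigned row = 0; row < 24; row++)
            printf("row %2u -> $%04X\n", row, text_row_addr(row));
        return 0;
    }

Rows 0, 8, and 16 land at $400, $428, and $450: adjacent in memory, exactly the ordering described above.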
Did this really need demonstration? (Score:1)
A very cheap modern computer is capable of emulating a 28-year-old cheap computer.
Re:Did this really need demonstration? (Score:4, Informative)
An Apple II with 4K of memory ($1,298 at the time) would cost $5,236.87 in today's dollars. While that may be a lot less than many computers of the era, I wouldn't call it a cheap computer by any stretch of the imagination.
Re: (Score:3)
To be fair, he was mostly emulating a MOS 6502 that would cost about $125 in today's dollars.
Re: (Score:3)
I hate to quote the fucking article because who reads that anyway? But "When the 6502 was first released it was priced at $25 USD. At the time this was unheard of, being up to six times cheaper than the nearest competitors. Some people even thought that the low price had to be some form of scam." does in fact establish that the processor was considered cheap.
Re: (Score:2)
The 6502 was cheap for a reason. It was the low-end alternative. It's no coincidence that the thing was eventually bought out by a video game manufacturer, or that so many warts and 'undocumented' features existed in it.
Fun to hack on. But professionals were specifying 6802s and a little later Z-80s.
The whole 'merit' of the 6502 and Apple at the time was an exclusive 'killer app' (Visicalc) that moved units. Marketing hustle, not technical superiority or even equivalence. That scent of ripe apple, back the
Re:Did this really need demonstration? (Score:5, Informative)
There was a 6501 before the 6502 (I had one), but it used a weird technology that meant it drew almost all its power from the clock lines (two-phase non-overlapping clock), and the interface voltages were also non-standard, so the 6502 was magnificently better. It was cheaper because of volume; the die size, and indeed the whole chip, was almost exactly the same. (I think they got some major order before it was even available for general release.) And there was a second source (Rockwell).
The 6800 was a superior processor if you did not have much string processing to do. The 68000 was an entirely different beast.
Re: (Score:1)
After they designed the 6800, they realised that processing strings on a 6800 was hell's own job because it only had one pointer register (although it was 16-bit).
Reminds me of my own adventures with assembly language on a Z80 CPU. In writing my first non-trivial (yet not that complex) program, I decided I wanted to establish a calling convention for subroutines as well as use the stack for temporary variables. On an 8086 this is pretty simple: at the beginning of a function you just "PUSH BP" and "MOV BP, SP", followed by "SUB SP, x" to make room for your temporary variables, then reference everything via "[BP+y]", with positive offsets pointing to the function arguments and negative offsets to the temporaries.
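To make that concrete, here's a hypothetical C simulation of the frame-pointer convention (offsets illustrative only; a real 8086 prologue also pushes the caller's BP, which is omitted here for brevity):

    #include <stdint.h>
    #include <stdio.h>

    /* Hypothetical sketch: simulate an 8086-style stack frame on a
     * byte array. Arguments sit at positive offsets from BP, locals
     * at negative ones. (A real prologue also does PUSH BP first.) */
    static uint8_t stack[64];

    int main(void)
    {
        int sp = sizeof stack;        /* the stack grows downward    */

        stack[--sp] = 42;             /* caller pushes a 1-byte arg  */
        sp -= 2;                      /* CALL pushes return address  */

        int bp = sp;                  /* MOV BP, SP                  */
        sp -= 4;                      /* SUB SP, 4: room for locals  */

        stack[bp - 1] = stack[bp + 2] * 2;   /* local = arg * 2      */
        printf("arg=%d local=%d\n", stack[bp + 2], stack[bp - 1]);
        return 0;
    }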
Re: (Score:2)
The 8085 actually had an undocumented instruction to add a constant value to SP and put the result in DE. But apparently all this was happening roughly at the time the 8088 was coming out, and one of their advertised "advantages" was being able to cross-assemble 8080 code to the 8086, so they memory-holed those instructions. Most 8085 clone cores had them though. And yes, I have also seen the result of trying to compile C for the Z-80, and it isn't pretty.
You should look into the 6809 to see what you get w
Re: (Score:1)
The 6502 was years before the 6802 or Z80
6800 - November 1974
6502 - 1975
z80 - July 1976
6802 - March 1977
Things went faster in the past.
Re: (Score:2)
You have it reversed - the 6800 predated the 6501.
The 6800 was done by Motorola, but those designers were
Re: (Score:2)
A while back I found a Heathkit ET-3400 at a thrift store. I tried writing a simple program for it when I realized that holy shit, the original 6800 doesn't even have the ABX instruction, that was in the 6801/6803 core. I mean, I knew all those other instructions like ADDD and MUL wouldn't be there, but I didn't know ABX would be missing too. And I was doubly annoyed because I had done a lot of 6809 programming so I already knew I was going to miss a lot of things.
And the 6809 was the most superior 8-bit C
Re: (Score:2)
The 6501 was MOS Technology's first chip, and sold for $20 in quantity one. It was designed to be pin-compatible with the 6800, though the instruction set was different, so it was not a drop-in replacement. The requirement for the two-phase clock was shared with that chip, so it really wasn't any harder to use. MOS Technology withdrew the 6501 from the market under legal pressure from Motorola; MOS Technology probably would have won the legal battle but did not have the resources to fight it.
Meanwhile they
Re:Did this really need demonstration? (Score:5, Informative)
The Apple 2 came out in 1977. VisiCalc was 1979 and didn't start development until 1978. No, Apple was not building the machine around VisiCalc.
Re: (Score:2)
Well, I don't know about "also ran". There were lots of CP/M machines. After basic word processing, VisiCalc was one of the first major applications for CP/M machines, moving them from hobbyist toys to usable for home businesses ...
Regardless, the Apple 2 was not designed around the specs for VisiCalc, per the GP. That's just not true.
Re: (Score:2)
VisiCalc was actually developed on a MicroMind? I didn't know that!
The ECD MicroMind was a tragic example of the perfect being the enemy of the good. It was an ambitious design for the time, notably including memory mapping hardware so the system could have more than 64K RAM, and a powerful graphics board that had both bitmap graphics capability and a programmable character generator. It used stackable boards rather than the usual card slots. But they spent so much time perfecting the design and adding more
Re: (Score:2)
Both the Atari and Commodore computers you mention had a couple of additional processors to handle video, sound, etc. They didn't go with just the 6502 alone. http://en.wikipedia.org/wiki/J... [wikipedia.org]
Re: (Score:2)
Mostly the 6502 was cheap because they made a marketing decision to make it cheap. The conventional wisdom of the time was to charge through the nose for small quantities of chips and soak developers. In theory this helped companies keep down the large-volume prices of their chips and make them more attractive to companies that were going to buy millions of them. MOS Technology, a startup chip maker, decided to try something radical to put itself on the map: sell single chips at prices low enough that hobbyists could afford them.
Re: Did this really need demonstration? (Score:1)
Re: (Score:2)
Actually, the 6502 only cost $6.95 in today's dollars.
http://www.mouser.com/ProductD... [mouser.com]
Re: (Score:3)
well, kind of.
I mean, it's neat because it's on such a crappy microcontroller.
the atmel avr used is pretty much an ancient microcontroller today. I mean, it's pretty much slower than the first pc our family had (8MHz x86 vs. 8MHz atmel, or maybe 16, depends). our first pc also had a video card to take care of the graphics display and so forth...
sure, if you loaded an emulator on a raspberry pi, I wouldn't give a shit or any props. but for this, yes.
Re: (Score:2)
Well, the 6502 was the shittiest processor of its generation (which is why the Apple dudes could afford it), so it makes sense to emulate low end with low end.
Your first PC had a CRT controller (likely a 6845 or derivative). That's a relatively simple LSI chip. People were doing 'cheap video' with software even back in the day (e.g. Lancaster's cookbook).
Re: (Score:2)
The AVR is a fine microcontroller. It isn't meant to be a CPU for a PC.
Re: (Score:2)
I said that it was ancient, not that it was shitty as such. well, that kind of makes it shitty if you compare it with what is on the market in 2015 - for any application. it's not particularly cheap either, and in the official arduino uno board it's ridiculously expensive for what it is.
sure, my 3d printer runs on an atmel avr. that's what makes the firmware rather shitty compared to what it could be, and impossible to improve upon: there's no space left in the rom, the ram is exhausted, and there are no empty cycles (96% of home 3
Re: (Score:2)
You must have a very different definition of crap than most of us use. It's not ancient either; it is current.
Perhaps you don't know what a microcontroller is for? Hint: for the purposes a microcontroller is intended for, the home computer would be a miserable choice, then or now.
Re: (Score:2)
It's for sale right now, but it's still ancient. You can get an ARM CPU for roughly the same price, with 10 times the clock, more peripherals, more choice of packages, more memory, and 32 bits instead of 8.
Re: (Score:2)
ARM is more expensive. You can get an AVR for $3 quantity 1. ARM also requires more of the board it's connected to and still consumes more power.
I like ARM and when the capability is needed, I wouldn't hesitate to recommend it but it's not a one size fits all world.
Re: (Score:2)
No, ARM isn't more expensive. Try, for example, the ST Microelectronics STM32F030R8T6. That's a Cortex-M0 ARM, 48MHz, 64K flash, 8K RAM, 55 I/O pins. $2.22 in quantity one. Reference: http://www.digikey.com/product... [digikey.com] That's just one part I happen to be familiar with; there may be even cheaper ARM alternatives out there.
Quantity one price of an ATMega328? $3.25. That's the surface mount version; the DIP is $3.38. Reference: http://www.digikey.com/product... [digikey.com]
It's true that if you stay with Atmel, ARM will b
Re: (Score:2)
ATTINY2313-20PU $1.62 [verical.com].
Not as fast and not as powerful, but if that's all you need, why overbuy? If you slow it way down, it comes in at 20 microamps @ 1.8V.
There's nothing wrong with the ARM, it's just not always what is needed.
One reason AVR looks more expensive is that it is currently the cool maker choice, so you see a lot of them at vastly inflated prices. One reason it's 'cool' is its ease of use and minimal demands for support circuitry.
Re: (Score:2)
Another factor is that the AVR chips are mostly still 5 volt parts. That means that they have to be made with a very out-of-date process and are much larger than current designs. (The processors used in AVR Arduinos can be run all the way down to 2V at reduced performance, but the fact that they allow 5V operation dictates the process used.) All the microcontroller ARM chips that I am familiar with are 3.3 volt chips (that's the maximum, most can also be run at lower voltages, typically down to 1.8V); highe
Re: (Score:2)
That is a very good point. I'm in a project now that needs low power. It is nice to be able to power sensors off of one of the DIO pins so I can power them down at will without adding to component count. And as you say, the 5V design will be more robust for little effort.
I tend to think of < 5V as something you do if you have to, never as a first choice.
Re: (Score:2)
for the purposes it serves in many arduino projects, it's crap. it's used for all kinds of things it's crap for: realtime motor control, motion planning, audio analysis, you name it. all kinds of stuff it's pretty crappy for, but it's the "standard", so it gets thrown in there.
the arduino atmels are a) not low power b) not powerful.
from what I can see of volume pricing, the atmels aren't that great there either.
it's not intended for any specific application either. it's intended to do any
Re: (Score:2)
Dunno about that... I think the AVR instruction set is probably faster at 8MHz than an 8086 at the same speed; it's at the very least equivalent. The x86 instruction set really is a mess...
Re: (Score:2)
The x86 instruction set really is a mess...
Clock for clock, a modern Intel CPU is much faster than an AVR. Who cares if it doesn't look pretty, when it gets the job done?
Re: (Score:2, Interesting)
> A very cheap modern computer is capable of emulating a 28-year-old cheap computer.
Actually, it may contradict common sense, but you're missing part of the "history".
As technologies evolve and brands consolidate, some old ideas are lost. Things like the mentioned 8:1 interleave, floppy drive skewing schemes, or the way images are generated on vector displays are harder to simulate (though feasible). And even if simulated, not everyone would know how to use them. I'm particularly reminded of ATARI 2600 imag
Re: (Score:2)
There's even a fair number of N64 emulators that can't play Ocarina of Time properly. There's a part very early in the game where y
Re: (Score:2)
Back in the mid '90s, I think everyone was surprised to find that you needed at least a 486DX-25 to emulate the Atari 2600. It was because the 2600 required cycle-accurate timing to emulate it properly. You could, and everyone did, do stuff with the Stella chip (which I call a "1-D" graphics chip) in the middle of a scan line, sometimes abusing its counter registers in interesting ways.
The N64 was a different beast with emulation. I think the biggest problem was needing a lot of RAM to emulate it properly
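Going back to the 2600 point: to see why it was so demanding, consider what cycle-accurate means in code. This is a hedged skeleton with hypothetical helper names, not any real emulator:

    #include <stdio.h>

    /* Hedged skeleton (hypothetical helpers): cycle accuracy means
     * interleaving CPU and TIA work every clock, so a register poked
     * mid-scanline takes effect mid-scanline.                        */
    static void cpu_step(void) { /* advance the 6507 by one cycle */ }
    static void tia_step(void) { /* emit one color clock of video */ }

    int main(void)
    {
        /* NTSC: 228 color clocks per line, 262 lines, TIA = 3x CPU */
        for (long clocks = 0; clocks < 228L * 262; clocks += 3) {
            cpu_step();
            tia_step(); tia_step(); tia_step();
        }
        puts("one frame emulated");
        return 0;
    }

Run that inner loop sixty times a second with real CPU and TIA logic inside and you can see where the 486's cycles went.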
Re: (Score:2)
I was playing with apple emulators 18 years ago and I had the same reaction.
Cool hack, but not very useful (Score:3)
The cool things are that he used an 8-bit AVR microcontroller to emulate the 6502, and that he used a USB chip on the prototyping board to create video...
Unfortunately, it runs much slower than a 1MHz 6502.
It appears that he did his own reverse-engineering of the 6502. One peculiarity that he may have missed is that it has undocumented op-codes, and those do show up in some programs.
Other people have done much more reverse engineering of the chip, down to the gate level even.
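For scale, the heart of any 6502 emulator is a dispatch loop like the hedged sketch below. The three opcodes are genuine 6502 encodings, but everything else is illustrative rather than Peckett's code; the default case is where undocumented op-codes would have to be handled:

    #include <stdint.h>
    #include <stdio.h>

    /* Illustrative 6502 fetch/decode/execute loop (not Peckett's
     * code). Only three documented opcodes are handled; a compatible
     * emulator must also decode the undocumented ones.              */
    static uint8_t mem[65536];

    int main(void)
    {
        uint16_t pc = 0x0200;
        uint8_t a = 0;

        /* tiny test program: LDA #$05 / ADC #$03 / BRK */
        mem[0x0200] = 0xA9; mem[0x0201] = 0x05;
        mem[0x0202] = 0x69; mem[0x0203] = 0x03;
        mem[0x0204] = 0x00;

        for (;;) {
            uint8_t op = mem[pc++];
            switch (op) {
            case 0xA9: a = mem[pc++]; break;           /* LDA #imm */
            case 0x69: a += mem[pc++]; break;          /* ADC #imm, carry ignored */
            case 0x00: printf("A=%d\n", a); return 0;  /* BRK: halt */
            default:   return 1;   /* undocumented or unimplemented */
            }
        }
    }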
Re: (Score:2, Interesting)
> Other people have done much more reverse engineering of the chip, down to the gate level even.
See: http://www.visual6502.org/JSSim/index.html [visual6502.org]
Re: (Score:2)
I thought the best part was reprogramming the 16u2 to bit-bang VGA.
Interlacing? WTF? (Score:4, Interesting)
most microcomputers used an interlaced frame buffer where adjacent rows were not stored sequentially in memory
First of all, I know a lot about micros from the late '70s and early '80s (I was there, maaaan!), and I can't remember a single one other than the Apple II series that didn't store rows sequentially.
This approach allowed Steve Wozniak to avoid read/write collisions with the video memory without additional circuitry.
I'm pretty sure the story I heard was that it saved one TTL chip in the counter chain to do it that way, which was just the kind of thing Woz would do.
Collisions? Exactly what kind of collisions are you talking about? IIRC, the Apple II used interleaved access, where the 6502 would access RAM on every other clock, and the video would access it in between. (This method was also used on the original Macintosh, though the 68000 sometimes needed a wait state.) But that has nothing to do with the funky row counters.
Re: (Score:2)
I don't understand the article's explanation. Anyone care to elaborate?
Re:Interlacing? WTF? (Score:5, Informative)
Woz designed the Apple ][ video system so the order it read data from RAM automatically fulfilled the DRAM refresh requirements. And, the video system reads were interleaved with CPU access, so the CPU never had to wait while video or DRAM refresh was happening, as was common with other designs.
The claim of an 8:1 interlace isn't really correct. The bitmapped memory layout used an 8:8:1 interleave: the first 8 rows were addressed 0x400 apart, and that pattern was then repeated 8 times with an offset of 0x80. Details can be Googled. Part of the reason for that was so DRAM refresh hit every required location often enough.
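In C, that 8:8:1 layout (the commonly documented hi-res page-1 formula, stated here as an assumption from standard references) looks like:

    #include <stdio.h>

    /* Commonly documented hi-res page-1 layout: rows 0x400 apart
     * within a group of 8, groups 0x80 apart, thirds 0x28 apart. */
    unsigned hires_row_addr(unsigned y)    /* y: 0..191 */
    {
        return 0x2000 + (y % 8) * 0x400
                      + ((y / 8) % 8) * 0x80
                      + (y / 64) * 0x28;
    }

    int main(void)
    {
        printf("y=0  -> $%04X\n", hires_row_addr(0));
        printf("y=1  -> $%04X\n", hires_row_addr(1));
        printf("y=8  -> $%04X\n", hires_row_addr(8));
        printf("y=64 -> $%04X\n", hires_row_addr(64));
        return 0;
    }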
Re: (Score:3)
I'm pretty sure the ZX Spectrum did interleaving with an 8:1 ratio.
When a game loaded off cassette tape and displayed its loading screen (it took about 10-20 seconds, I think), you'd see pixel rows 0, 8, 16, ... then pixel rows 1, 9, 17, ...
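The Spectrum layout is well documented; a short C sketch of the address decode reproduces exactly that loading pattern:

    #include <stdio.h>

    /* Well-documented ZX Spectrum bitmap layout: byte column x (0..31)
     * of pixel row y (0..191). Walking memory sequentially visits rows
     * 0, 8, 16, ... then 1, 9, 17, ... within each third of the screen. */
    unsigned spectrum_addr(unsigned y, unsigned x)
    {
        return 0x4000 | ((y & 0xC0) << 5)   /* third of the screen */
                      | ((y & 0x07) << 8)   /* scan line in cell   */
                      | ((y & 0x38) << 2)   /* character row       */
                      | x;
    }

    int main(void)
    {
        printf("row 0 -> $%04X\n", spectrum_addr(0, 0));
        printf("row 8 -> $%04X\n", spectrum_addr(8, 0));
        printf("row 1 -> $%04X\n", spectrum_addr(1, 0));
        return 0;
    }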
Re: (Score:3)
Ah, here's an article and a great video that demonstrate the ZX Spectrum's interlacing:
http://whatnotandgobbleaduke.b... [blogspot.co.uk]
Re: (Score:3)
First of all, I know a lot about micros from the late '70s and early '80s (I was there, maaaan!), and I can't remember a single one other than the Apple II series that didn't store rows sequentially.
The BBC Micro outside of Teletext mode didn't. It had a quite wacky scheme which (in Mode 0) went something like this:
The first byte corresponds to the first 8 pixels of row 1, one bit per pixel. I can't remember which endianness. The second byte corresponds to the first 8 pixels of row 2. And so on up to and
Re: (Score:3)
Mate, WTF?
You said you couldn't think of any computers other than the Apple II which had non-contiguous rows. I provided an example, which I assumed you wanted since you couldn't think of any others. No need to go on the attack about something I'm not disputing (whether it was the majority).
Anyway, the BBC wasn't exactly unique, as it used the 6845 http://en.wikipedia.org/wiki/M... [wikipedia.org], which was also used on other machines. Maybe not the majority, though I've no idea.
Re: (Score:1)
I came here to say pretty much exactly what you did. The funky addressing saved a chip. It's pretty widely documented / known.
Yes, the video used opposite bus phases from the CPU (and doubled as refresh counter for the DRAMs), so there were no wait states due to video fetch. But as you point out, that has nothing to do with the Apple ]['s weird video memory map.
Re: (Score:2)
The C64 used a non-sequential scheme that mirrored its character display.
Eight sequential bytes on most machines mean a linear series of pixels on the same scan line.
On the C64, those bytes got stacked up to form a character, each byte on a sequential scan line, assuming one starts at a character boundary.
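A small C sketch of the standard C64 bitmap address math (base address illustrative) makes the stacking obvious:

    #include <stdio.h>

    /* Standard C64 320x200 bitmap layout: eight sequential bytes form
     * one 8x8 cell, one byte per scan line. Base address illustrative. */
    unsigned c64_addr(unsigned base, unsigned x, unsigned y)
    {
        return base + (y / 8) * 320 + (x / 8) * 8 + (y % 8);
    }

    int main(void)
    {
        for (unsigned y = 0; y < 3; y++)
            printf("x=0 y=%u -> $%04X\n", y, c64_addr(0x2000, 0, y));
        return 0;
    }

At x=0, rows 0, 1, and 2 come out at $2000, $2001, and $2002: consecutive bytes stack vertically within the cell.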
Re: (Score:2)
for hires, rather than reading the same 40 bytes eight times in a row and feeding them to a character generator, eight different sets of 40 bytes were read (of which six set bits, and two danced around the colorburst signal. the pixel rate was just at the colorburst frequency, so shifting half a bit tickled it and gave a different set of colors. Not just clever, but fiendishly clever)
hawk
Re: (Score:2)
The Amstrad PCW also had a complex memory layout for the screen. It had a "Roller RAM" lookup table entry for each row, which could be modified in order to achieve fast scrolling of the screen. The memory for a single character was also stored sequentially.
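The idea, in a hedged C sketch (illustrative table format, not the PCW's actual registers): the display hardware fetches each scan line through a per-row pointer, so scrolling rotates pointers instead of copying 23K of pixels.

    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    /* Illustrative "Roller RAM" sketch (not the PCW's real format):
     * each scan line is fetched via this table, so a scroll is a
     * pointer rotation rather than a bulk copy of pixel data.     */
    enum { LINES = 256 };
    static uint16_t roller[LINES];

    static void scroll_up(unsigned n)
    {
        uint16_t tmp[LINES];
        memcpy(tmp, roller, sizeof roller);
        for (unsigned i = 0; i < LINES; i++)
            roller[i] = tmp[(i + n) % LINES];
    }

    int main(void)
    {
        for (unsigned i = 0; i < LINES; i++)
            roller[i] = (uint16_t)(0x8000 + i * 90);  /* fake line addresses */
        scroll_up(8);
        printf("line 0 now fetches from $%04X\n", roller[0]);
        return 0;
    }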
Re: (Score:2)
Checking wikipedia, that was needed because the display was 100% bit-mapped. You just can't push that much data around on an 8-bit processor for vertical scrolling. On a TRS-80 Color Computer, if you used every register possible with PSHS/PULS, scrolling 6K of bitmapped video data 8 pixels at a time was almost tolerable. The Amstrad had 23K.
And I'm still mad at myself for not buying up the two or three PCWs I saw at thrift stores back in the '90s, because I later found out that it used the same 3" floppy d
Re: (Score:2)
And I'm kicking myself for not buying the used Apple ][ in a wooden case at the surplus store around the corner, which I've come to realize wasn't a ][ at all . . . :(
hawk, who still has his 128k mac and 1802 wirewrap systems
6502 orgasms (Score:2)
Memories of programming with the 6502 instruction set are so delicious that the only thing comparable to them was my first orgasm.
I actually first did it on a General Electric GEPAC computer in 1966. It had an almost identical instruction set to the 6502, but with 24 bit words. Hip programmers expressed themselves in octal in those days.
Re: (Score:1)
I assume you were polite and cleaned off the top of the machine when you were done?
Re: (Score:1)
The Z80 OTOH was clearly better than all the others. To this day I find it had an elegant design.
Ugh, no. The assembly language looks OK-ish, but look at the instruction lengths and timing for anything using IX or IY, and the tons of undocumented instructions that may or may not work on different versions of the chip.
Re: (Score:3)
Forget the troll. I understand exactly how you felt: just like getting inside a flying saucer.
IMHO it's arguable that the 8080 was better than the 6502, though. Each had its advantages.
The Z80 OTOH was clearly better than all the others. To this day I find it had an elegant design.
I disagree.
Although I absolutely love coding Assembly language on the 6502, and have written tens of thousands of lines of same, I would rather the 6809 had taken off. The design of that processor was truly forward-thinking. It was a shame that only the Radio Shack (RIP) Color Computer (CoCo) employed that CPU, because it was closer to an 8/16-bit "miniature 68k" than it was to the 6800/6801/6501/6502 designs.
Among other things, It had an A and B Accumulator, which you could concatenate into one 16 bit
Re: (Score:3)
YES. 6809 was a truly elegant design that deserved to have far more success than it did.
Yeah, I always wondered about that. It never got transmogrified into an "HC" version and never became a microcontroller (oh, how much more fun the HC11 would have been if it had been based on a 6809 instead of a 6801!), etc. This article [wikipedia.org] states incorrectly that a modified version of the 6809 forms the CPU in the HC11; but I remember from the datasheet that it was called a modified 6800 (the 6801); so?
The other possibility is that Motorola was already investing heavily into the 68k R&D, and didn't want t
Re: (Score:3)
It may be the same reason we switched from 68HC11 to PIC and IBM used the 8088 instead of the 68000; Motorola was never customer friendly except with their literature. Availability and second sources were common problems.
Another possibility is that with the release of the 68008, there was no reason to further m
Re: (Score:3)
The Vectrex video game system also used the 6809.
And my guess as to why Moto didn't use the 6809 as the basis of the 6811/6812 is because they wanted to use microcode, and it would have been harder because of the post-byte index modes.
And then there was the 68000... apparently the marketing guys back in the day were dead set on only selling thousands of 68000s for full-blown Unix-type systems, and against selling millions of 68000s as an embedded processor. By the time of the Macintosh/Amiga/Atari ST when
Re: (Score:2)
The Vectrex video game system also used the 6809.
Yeah, I saw that in the Wikipedia article.
And my guess as to why Moto didn't use the 6809 as the basis of the 6811/6812 is because they wanted to use microcode, and it would have been harder because of the post-byte index modes.
Interesting. I am not familiar enough with the internal circuit topology of the '09 to comment. Did the PLA approach take more, less, or about the same silicon as a microcoded approach would have? Because that was around the time when Mot. (and other) electronics salespeople started talking about "nanoacres of silicon", LOL! So, if the microcode-based designs were done in less silicon, and with the microcontroller price wars heating up, I could see Mo
Re: (Score:2)
But the 6809 was the best. (inb4 6309 which no computer ever shipped with)
It is unfortunate that Motorola didn't use it as the basis of the 6811 and 6812, but it was probably harder to express the 6809 in microcode.
Re: (Score:2)
Memories of programming with the 6502 instruction set are so delicious that the only thing comparable to them was my first orgasm.
Especially true when comparing it to the 8088, which came four years later with its painfully segmented memory access.
CoCo (Score:1)
Re: (Score:2)
One of the gentile geniuses of our time
...and one of the Gentle Geniuses, too!
No Interlacing (Score:3)
The video produced by the Apple II is not interlaced at all. Many video devices used to mix and overlay video in studios had trouble with this fact. True, the video memory is not sequential, but that's not the same thing as interlacing. Way back in 1983 I had lunch with Woz and a half dozen or so mostly game developers at the Independent Developers conference. I asked him if he would want to change anything about the design of the II. He said he might add the two chips needed to make the video memory map sequential. Several of us, including myself, said that most of us would still use lookup tables for updating video memory anyway (they were faster), so it didn't really matter much. In the end he agreed.
As far as the 6502 being the shittiest processor of its generation, I would have to disagree. True, it has fewer registers and instructions (RISC?) than most even older designs like the 8080, but it did have some unique addressing modes that made it the perfect processor for the graphics the Apple did. This, coupled with the fact that you can access the 256 bytes of zero page much faster and use them much like processor registers (indexed memory referencing), made it one neat machine.
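The lookup-table trick mentioned above looked like this in spirit. Here is a hedged C rendering of the classic 6502 pattern (split low/high byte tables feeding (zp),Y indirect addressing; names illustrative):

    #include <stdint.h>
    #include <stdio.h>

    /* Hedged rendering of the classic 6502 pattern: precompute the 24
     * text-row base addresses into split low/high byte tables, as 6502
     * code did for LDA table,X / STA (zp),Y, instead of doing the
     * interleave math on every plot.                                  */
    static uint8_t row_lo[24], row_hi[24];

    static void init_tables(void)
    {
        for (int r = 0; r < 24; r++) {
            unsigned a = 0x400 + (r % 8) * 0x80 + (r / 8) * 0x28;
            row_lo[r] = a & 0xFF;
            row_hi[r] = a >> 8;
        }
    }

    int main(void)
    {
        init_tables();
        int r = 9, col = 5;
        printf("row %d, col %d -> $%04X\n", r, col,
               (unsigned)((row_hi[r] << 8 | row_lo[r]) + col));
        return 0;
    }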
Re: (Score:2)
Except for the annoying detail that if you wanted to be interoperable with anything that was written in Applesoft Basic and ProDOS, there weren't very many of them left to really play around with. [apple2.org.za]. One would usually have to resort to saving most of the entries they wanted to use, and then restoring them upon exit. This works, but I recall it wasn't very amenable to being interrupted with reset. While writing a custom reset handler mitigated some o
Re: (Score:2)
ProDOS
ProDOS was a Disk Operating System, not a Language.
Oh, and I wrote a preemptive RTOS in 6502 for my Apple ][-based Stage Lighting Controller back in 1982, using a hardware "time-slicer" (interrupt generator) running at 2 kHz, long before I knew what an RTOS was.
And I also wrote a "virtual-memory" system for Applesoft BASIC programs, that used the On-Error GoTo and "Ampersand Hook" to allow a programmer (we didn't call them "Developers" in those days!) to either write a program in "modular" form, or, in
Re: (Score:2)
You did preemptive multitasking on the Apple //? Way cool.... mine was strictly a cooperative multitasker, although it considered waiting for input (either from a remote connection or the keyboard) to be indicative that it was safe for the task requesting input to yield control. I had no hardware clock in my apple, so I could not do full-preemptive multitasking. As I said, when I was writing it I didn't even know the word 'multitasking' would describe what I was doing... I always described the mechanis
Re: (Score:2)
You did preemptive multitasking on the Apple //? Way cool.... mine was strictly a cooperative multitasker, although it considered waiting for input (either from a remote connection or the keyboard) to be indicative that it was safe for the task requesting input to yield control. I had no hardware clock in my apple, so I could not do full-preemptive multitasking.
Thanks for the props, LOL!
Looking back on it, it was actually pretty close to a true, modern RTOS, with semaphores and "mailboxes", "task-suspending", and the whole bit. I had 16 "slots" (threads) that could be managed at a time. I called the functions "TaskMaster", IIRC. It was born out of the need to have multiple asynchronous functions, such as crossfades and sequences (which I could even "nest" up to 8 levels deep!), and to manage the CHARACTER-BASED, OVERLAPPING "Windowing" system I created for it as we
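Something like this hedged C sketch captures the shape of a 16-slot round-robin kernel (illustrative, not the poster's TaskMaster; a real one would also save and restore CPU context in the 2 kHz interrupt):

    #include <stdio.h>

    /* Illustrative 16-slot round-robin scheduler skeleton (not the
     * poster's TaskMaster). A 2 kHz timer interrupt would call
     * schedule(); a real kernel also saves/restores registers.   */
    enum { SLOTS = 16 };

    typedef struct {
        void (*run)(void);   /* task entry point */
        int active;          /* slot in use?     */
    } Task;

    static Task tasks[SLOTS];
    static int current;

    static void schedule(void)
    {
        for (int i = 1; i <= SLOTS; i++) {
            int next = (current + i) % SLOTS;
            if (tasks[next].active) {
                current = next;
                tasks[next].run();
                return;
            }
        }
    }

    static void crossfade(void) { puts("crossfade step"); }

    int main(void)
    {
        tasks[3] = (Task){ crossfade, 1 };
        schedule();          /* runs the only active slot */
        return 0;
    }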
Re: (Score:2)
I started on the Z-80 and later had 6809, so I never could find much love for the 6502. But it started a revolution by being designed for high yield, and initially sold for $20 each quantity one when the 6800/8080/Z-80 processors were more like $200 each Q1.
I once got to use an Ohio Scientific Challenger III. It had 3 processors, 6502, 6800, and Z-80, but the people who owned it only ever used the 6502 with a version of Microsoft BASIC. It supported multi-user by having a 48K RAM card for each user at 0000
Re: (Score:2)
Wow, you don't come across people who've even heard of Ohio Scientific that often, much less actually used one. The first computer I ever used was a C2-OEM, with 8" floppies, and I have a (still working) C4P in my garage.
Re: (Score:2)
>Wow, you don't come across people who've even heard of
>Ohio Scientific that often, much less actually used one.
*sigh*
get off my lawn, I suppose. (I just reseeded it anyway)
hawk, suddenly feeling old
Re: (Score:2)
Prodos?
PRODOS????
damned newbies . . .
hawk
I wonder (Score:2)
I wonder how Woz feels about this kind of development. He has a /. account, so if you read this: did you ever think there would be computers powerful enough, and people interested enough, to implement your brainchild on a credit-card-sized machine with a different architecture within your lifetime? What do you think of the arduino movement in comparison with the DIY computer movement from our time?
Re: (Score:2)
I wonder how Woz feels about this kind of development. He has a /. account, so if you read this: did you ever think there would be computers powerful enough, and people interested enough, to implement your brainchild on a credit-card-sized machine with a different architecture within your lifetime? What do you think of the arduino movement in comparison with the DIY computer movement from our time?
Well, considering that Apple pretty-much did an Apple-//e-on-a-chip [wikipedia.org] back in 1991, I'd say he'd be rather bemused.
But supportive, nonetheless...
Re: (Score:2)
Earlier than that.
The Mac IIfx had a pair of chips each of which effectively had such a creature. One ran the serial/network ports, and I forget the other.
Had apple sold that chip, combined with the network that ran on the second (unused) pair of standard home wiring, they could have *owned* home automation years ahead . . .
hawk
Woz Sez He Regrets the Video Addressing Shortcut (Score:2)
Instead, we had BASCALC and HBASCALC calls in the Apple Monitor ROM.
And we liked it!
Absolutely fabulous article! (Score:3)
This was easily the best technical article ever linked in a Slashdot submission.
I just had to express my amazement. Holy shit, such a deliciously nerdy article...
Re: (Score:2)
What the fuck is an Apple ][? What the fuck does ][ stand for? Are you so fucking chic that typing 2 or II is a problem? Fucking hipsters.
OMFG! You're kidding, right?
You must IMMEDIATELY turn in every single computing device you own.