Hardware

Oldest-Known Version of MS-DOS's Predecessor Discovered (arstechnica.com)

An anonymous reader quotes a report from Ars Technica: Microsoft's MS-DOS (and its IBM-branded counterpart, PC DOS) eventually became software juggernauts, powering the vast majority of PCs throughout the '80s and serving as the underpinnings of Windows throughout the '90s. But the software had humble beginnings, as we've detailed in our history of the IBM PC and elsewhere. It began in mid-1980 as QDOS, or "Quick and Dirty Operating System," the work of developer Tim Paterson at a company called Seattle Computer Products (SCP). It was later renamed 86-DOS, after the Intel 8086 processor, and this was the version that Microsoft licensed and eventually purchased.

Last week, Internet Archive user f15sim discovered and uploaded a new-old version of 86-DOS to the Internet Archive. Version 0.1-C of 86-DOS is available for download here and can be run using the SIMH emulator; before this, the earliest extant version of 86-DOS was version 0.34, also uploaded by f15sim. This version of 86-DOS is rudimentary even by the standards of early-'80s-era DOS builds and includes just a handful of utilities, a text-based chess game, and documentation for said chess game. But as early as it is, it remains essentially recognizable as the DOS that would go on to take over the entire PC business. If you're just interested in screenshots, some have been posted by user NTDEV on the site that used to be Twitter.

According to the version history available on Wikipedia, this build of 86-DOS would date back to roughly August of 1980, shortly after it lost the "QDOS" moniker. By late 1980, SCP was sharing version 0.3x of the software with Microsoft, and by early 1981, it was being developed as the primary operating system of the then-secret IBM Personal Computer. By the middle of 1981, roughly a year after 86-DOS began life as QDOS, Microsoft had purchased the software outright and renamed it MS-DOS. Microsoft and IBM continued to co-develop MS-DOS for many years; the version IBM licensed and sold on its PCs was called PC DOS, though for most of their history the two products were identical. Microsoft also retained the ability to license the software to other computer manufacturers as MS-DOS, which contributed to the rise of a market of mostly interoperable PC clones. The PC market as we know it today still more or less resembles the PC-compatible market of the mid-to-late 1980s, albeit with dramatically faster and more capable components.

  • Old memories (Score:2, Interesting)

    by ddtmm ( 549094 )
    I don't remember QDOS but I remember QDOS II (or QDOS 2?) from about 1985-1987 or so, which I think stood for Quick DOS and ran on top of DOS. It was a great graphical way (as in, ASCII graphics) to use DOS for people who didn't really know how to use DOS. I wonder if the latter was related to the same company.
    • I still have a 5.25" floppy disk that is labeled, in handwriting, IBM DOS 1.0. The disk hasn't been readable for many years, even if I did have a drive to put it in.

      • by gavron ( 1300111 )

        I have one of those too! The label is fully readable, as is the QR code I had printed.

        • by vbdasc ( 146051 )

          Positively amazing, considering that the QR code was invented about 15 years after your diskette was made.

          • by guruevi ( 827432 )

            The QR standard is a relatively recent reinvention. 2D barcodes came into existence soon after the creation of what we now understand to be barcodes, and people now tend to misname older 2D barcodes (which I've seen on insurance and car registration documents for decades) as QR codes.

      • Re:Old memories (Score:5, Interesting)

        by ArchieBunker ( 132337 ) on Tuesday January 02, 2024 @11:43PM (#64126767)

        5.25" floppies tend to last much longer than their 3.5" counterparts. The 8 inch floppies last practically forever because the density isn't that high.

        • I wouldn't know! I do have an 8" floppy in my collection, but have never had a device that could read it.

        • by vbdasc ( 146051 )

          AFAIK, 8-inch diskettes use the same material and the same low-level recording density as 360KB 5.25-inch diskettes. The ubiquitous 250KB 8-inch diskette format seems so small in comparison because it writes only one side of the diskette and uses FM (frequency modulation) recording instead of MFM.
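          To make the FM-versus-MFM point concrete, here is a minimal Python sketch (the fm/mfm helpers are hypothetical illustrations, not real controller code) marking where flux transitions would fall for each encoding: FM forces a clock transition in every bit cell, while MFM needs at most one transition per cell, which is why it roughly doubles capacity at the same flux-transition rate.

            def fm(bits):
                # FM: a clock transition at the start of every bit cell, plus a
                # data transition mid-cell for each 1 bit.
                return ''.join('C' + ('D' if b else '.') for b in bits)

            def mfm(bits):
                # MFM: a data transition mid-cell for a 1; a clock transition at
                # the start of the cell only between two consecutive 0 bits.
                out, prev = [], 0
                for b in bits:
                    out.append(('C' if (prev == 0 and b == 0) else '.') + ('D' if b else '.'))
                    prev = b
                return ''.join(out)

            data = [1, 0, 1, 1, 0, 0, 1, 0]
            print(fm(data))   # CDC.CDCDC.C.CDC. -- a transition in every cell
            print(mfm(data))  # .D...D.D..C..D.. -- at most one transition per cell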

        • by tlhIngan ( 30335 )

          5.25" floppies tend to last much longer than their 3.5" counterparts. The 8 inch floppies last practically forever because the density isn't that high.

          Actually, 8" disks are plenty dense - the technology behind the 1.2MB format actually came from the 8" format, just shrunk down. 8" floppies actually can hold several MB of data.

          In fact, many people have connected 8" drives to PCs - there are adapters that convert the 34-pin floppy interface to the 50-pin 8" interface, and setting your BIOS to 5.25" 1.2MB mode is sufficient.

          • The typical format used with an 8" disk was 360kB or less. IIRC my Altos formatted to even less than that, like 300 or 320kB. So the sector sizes are enormous.

          • by vbdasc ( 146051 )

            8" floppies actually can hold several MB of data.

            How exactly? The math is simple. These floppies use 500 kbits per second raw data transfer when formatted with MFM, and spin at 6 revolutions per second (yes, these are exactly the same numbers as the 1.2MB 5.25'' HD floppies). This makes about 83.3 raw kbits per track. Multiply this by 2 sides and 77 cylinders, and you get about 1.6MB maximum theoretical capacity (because you can't use all the raw bits for data).

            A commonly used large 8'' floppy format was 1232KB (77 cylinders x 2 sides x 8 sectors per track x 1024 bytes per sector).
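            As a quick check of the arithmetic above, a small Python sketch (illustrative only; the variable names are made up, the numbers are the ones quoted in this comment) reproduces both the ~1.6MB raw ceiling and the 1232KB formatted figure:

              # Figures quoted above for an 8" double-density (MFM) drive
              bit_rate  = 500_000   # raw bits per second at the head
              rev_per_s = 6         # 360 rpm
              cylinders = 77
              sides     = 2

              raw_bits_per_track = bit_rate / rev_per_s              # ~83,333 bits
              raw_bytes = raw_bits_per_track * sides * cylinders / 8
              print(f"~{raw_bytes / 1e6:.1f} MB raw ceiling")        # ~1.6 MB (decimal)

              # A common formatted layout: 77 cylinders x 2 sides x 8 sectors x 1024 bytes
              print(77 * 2 * 8 * 1024 // 1024, "KB formatted")       # 1232 KB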

      • by AmiMoJo ( 196126 )

        There is a decent chance that the data can be recovered, if you think it might be valuable. Data recovery from floppy disks has come a long way in recent years.

    • Every time I think of early DOS versions, one command always bubbles up from the depths of my memory:

      PROMPT $P$G

      --
      Abort, Retry, Fail?

    • Although I never touched 86-DOS, I did have occasion to use EDLIN and frankly, I was not impressed at all.

      It did, however, allow me to edit the config.sys file that I had bungled, while every other tool to edit text files was unavailable.

      All hail EDLIN, the tool you never wanted to use, particularly when it was the only tool you could use.

      • Unix was not any better in this respect. Remember, ed(1) is the standard editor [wikipedia.org].
        • Re:Old memories (Score:4, Informative)

          by BigZee ( 769371 ) on Wednesday January 03, 2024 @06:23AM (#64127219)
          vi was introduced in 1976. Keep in mind that ed was written when using teletypes was still fairly common.
          • Yes, vi has page editing capabilities without EDLIN's weird (imo) one line at a time approach.

            My initial experience was with a DEC VAX (DECTPU) and DECstations (community college night job at a DEC-sponsored lab), so stepping backwards into EDLIN in the late '80s was no fun. However, this was all because the civil engineering company where my day job was used PCs for CAD and desktop publishing.

            I do not remember the details, but I was likely attempting to set up memory management on a 286 with a whopping 2MB of RAM.

      • You could always use COPY CON: CONFIG.TMP and type it in one line at a time. Not the most elegant method, but it works okay for simple files. Make sure you like it, then rename.

        That worked with only a bootable disk.
        • Hmmm, should have been workable on a single-drive machine. Boot with a disc we'll label "A:" (this disc will also need to contain the COPY command); enter "COPY CON B:\TEMP.TXT"; once COPY is running, the OS will prompt you to insert "disc B:", which you do, create your file, and hit CTRL-Z; the OS writes the data to B:\TEMP.TXT and then prompts you to re-insert "disc A:", from which it reloads any OS parts that had been unloaded before returning to the command prompt.

          A fankle, fo

      • As a DOS user for quite some years (My first PC was an IBM PC 5150 with 64kB on board, and 384kB on an ISA card) the very first thing copied to a new install was always an editor. Or if I had space on the floppy I was using at the time, Norton Commander, which had an editor.

        I had a 30 MB Quantum external MFM disk on my PC, but it eventually died.

        • I had a 30 MB Quantum external MFM disk on my PC, but it eventually died.

          My first computer came with a 42MB MFM hard drive, but I later re-formatted it with an RLL-capable controller card and it automagically became a 60MB hard drive. Very tricksy!

        • If your first computer had a disk drive, you were not a bleeding edge technology user.
    • Re: (Score:3, Interesting)

      I remember these times and was acquainted with some of those early Seattle area hackers -- hackers before the word was corrupted. I suffered a stroke a while back and it hinders my recollection of names, but I have a lot of memories. I had a friend, Rodger Modeen, a Boeing engineer who took an interest in 'phone phreaking' and was an acquaintance of the notorious 'Captain Crunch', the guy who found he could manipulate the phone system by using the whistle from the cereal box to generate the 2600 Hz tone.
      • Re:Old memories (Score:4, Informative)

        by vbdasc ( 146051 ) on Wednesday January 03, 2024 @04:21AM (#64127099)

        Another hotbed of activity was some folks who worked at a South Seattle fire station. These, in fact, were the ones who took the circulating pirated source code and created the package that became the Seattle Computer product that Bill Gates purchased for $50,000, resold to IBM, and then seized himself as MSDOS.

        I do not mean to disrespect your valuable information, but the Seattle Computer product (QDOS/86-DOS) was not pirated and its source code wasn't circulating (maybe you meant CP/M source code?). Its creator, Tim Paterson, might have been working shifts at a fire station (I have no idea), but he is better known as an employee of Seattle Computer Products, where he was also designing and soldering the S-100 boards that his company was selling.

        The theory that QDOS/86-DOS was somehow using pirated source code from CP/M was once fairly widespread, but it was completely debunked decades ago. These days, with the availability of the source code of early DOS, everyone can check it for themselves.

    • QDOS = "Quick and Dirty Operating System"
  • by Comrade Ogilvy ( 1719488 ) on Tuesday January 02, 2024 @11:44PM (#64126769)

    ...our Chief Scientist would proclaim that now that we have found Patient Zero, it is possible to devise a cure for Microsoft.

  • ...now it all starts to make sense.

    The subsequent over-complexity, inconsistency, and arcane limits of DOS commands and Windows APIs must have roots in the Quickness and Dirtiness of the original OS that Microsoft bought and, again quickly, adapted and extended.

    In fairness, they were dealing with the abomination of CPU chip design that was the 8086 processor. IBM's monstrosity of a chip can be contrasted with the Motorola 68000 of the same era. Take a look at their instruction sets side by side sometime if you have time, with an eye for elegance, orthogonality, simplicity, consistency of design...
    • by AmiMoJo ( 196126 ) on Wednesday January 03, 2024 @01:12AM (#64126887) Homepage Journal

      It was Intel's chip. IBM nearly went with the Motorola 68000, but couldn't secure a second source for it.

      The 68k is much nicer. Lots of registers (for the time) and support for 24 bits of flat address space.

      • by vbdasc ( 146051 ) on Wednesday January 03, 2024 @04:42AM (#64127123)

        The 68000 was created before its time, in a technical sense. It struggled to reach the projected yields and frequencies. It's telling that the Mac and the Amiga didn't appear before 1984. While the Lisa was released in 1983, its price was exorbitant.

        • by AmiMoJo ( 196126 )

          Another issue was slow RAM. The Amiga was heavily constrained by RAM speed, and therefore difficult to get good performance from when all the other hardware was competing for cycles too.

          It wasn't until the mid 80s that faster RAM became available. Even then, it wasn't cheap. The PC Engine being able to run its 6502 derivative at 8MHz was a real coup, and it was often faster than 68k for game code.

          • by Waccoon ( 1186667 ) on Wednesday January 03, 2024 @07:32AM (#64127285)

            In the early '80s, almost all computers used shared memory architectures. There was nothing wrong with the architecture of the original A1000 and A500.

            The real problem is that Commodore never updated the Amiga's memory controller over time. The AGA Amigas used modern 80ns RAM chips, but the memory controller (Alice) was hardwired to use 150 ns timing, which seriously starved the CPU. Also, the Amiga's logic didn't produce data cache control signals, so if you had a 68030, it couldn't run correctly without an extra memory controller, making upgrades very expensive. Trying to use a 68040 on an Amiga made things even more complicated, as it required burst mode operation and Agnus/Alice couldn't do that.

            The early Amigas were great. The later ones were a mess of hacks that didn't really work.

            • by AmiMoJo ( 196126 )

              I'm not saying it was a flaw exactly, but consider that the original Amigas were so short of memory bandwidth that 32 colour modes had a massive impact on available CPU and blitter cycles. High res modes paused the CPU and blitter during each scanline because all access slots were needed by video DMA.

              Getting good performance requires understanding and managing DMA slots. Modern developers can use WinUAE to make it easier, but back in the day you pretty much had to figure it out by experimentation. The Hard

          • Another issue was slow RAM. The Amiga was heavily constrained by RAM speed, and therefore difficult to get good performance from when all the other hardware was competing for cycles too.
            It wasn't until the mid 80s that faster RAM became available.

            There were SRAM expansions for Amigas. They installed as fast RAM, obviously, which is where you need the speed anyway.

          • Another issue was slow RAM. The Amiga was heavily constrained by RAM speed, and therefore difficult to get good performance from when all the other hardware was competing for cycles too.

            It wasn't until the mid 80s that faster RAM became available. Even then, it wasn't cheap. The PC Engine being able to run its 6502 derivative at 8MHz was a real coup, and it was often faster than 68k for game code.

            Only the first 512k of RAM was constrained, and even then the custom chips hit the RAM on odd cycles (the 68000 can only read memory on even cycles) to keep things moving at a decent pace.

            Amiga fast RAM is basically RAM above 512k, and the OS would prioritize the fast RAM for a performance boost (if the fast-RAM-first program was run).
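            As a toy illustration of the slot sharing described in this sub-thread (a deliberate simplification, not an emulation of the actual Agnus arbitration), assume the chip-RAM bus is split into alternating slots: the 68000 uses only the even ones, so chipset DMA in the odd slots costs the CPU nothing, while any video DMA that spills into the even slots stalls it. The function name and numbers below are hypothetical.

              def cpu_slot_share(total_slots=1000, even_slots_taken_by_video=0.0):
                  # Even slots nominally belong to the CPU; odd slots to chipset DMA.
                  even = total_slots // 2
                  stolen = int(even * even_slots_taken_by_video)
                  return (even - stolen) / even

              print(cpu_slot_share(even_slots_taken_by_video=0.0))  # low-res: CPU keeps its slots -> 1.0
              print(cpu_slot_share(even_slots_taken_by_video=1.0))  # busy hi-res scanline: CPU starved -> 0.0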

            • by Bert64 ( 520050 )

              What you're thinking of was called "chip RAM", which was shared between the Amiga custom chips and the CPU, whereas "fast RAM" was only accessible by the CPU and could be much faster.
              The limit was only 512k on the earliest models; it was increased to 2MB and remained there until the demise of Commodore.

            • by AmiMoJo ( 196126 )

              Most people only had 512k, or 1 meg where the extra 512k was "slow RAM", which the chipset couldn't access but which was on the same bus, so the CPU had to wait for access anyway.

              • Only on the 1000. By the A500 and A2000, anything beyond your first 512kB (or for later A500s, 1MB, but even then only if you did some additional work) was fast ram.

                • by AmiMoJo ( 196126 )

                  I don't think so. The A500 definitely had slow RAM as well. I think the key thing was that it could be fast RAM, but to keep costs down it could also be controlled by the chipset (Agnus?) so that a controller/bus arbiter was not needed.

      • by Agripa ( 139780 )

        The 68000 had disadvantages compared to the 8088.

        It did not support the existing inexpensive 8-bit peripherals, it had no legacy compatibility with CP/M applications, and there was no 8-bit version, which would have allowed a narrower memory bus and half as much bus logic to save cost.

        Later disadvantages of the 68000 include double indirect addressing and inability to restart from a fault because of lost state.

    • I wonder what would have happened if they'd started with some tiny embedded OS kernel like FreeRTOS or RTEMS, which would easily run on an 8088 with little memory, rather than a hacked-up bootstrap loader with extra crap kludged onto the side of it over time?
      • by AmiMoJo ( 196126 ) on Wednesday January 03, 2024 @03:48AM (#64127039) Homepage Journal

        To be fair, DOS itself isn't that bad. Most of the issues are to do with the 8086 and the IBM BIOS. The IBM architecture and ISA bus are not great either, particularly the way interrupts work.

        The most famous issue is due to the 8086's 1MB address space. Some of it is needed for things like the video memory and the various components that made up the IBM Personal Computer, leaving a maximum of 640k for RAM. Of course, the original machines came with as little as 16k RAM, so it probably didn't look like a major limitation at the time.

        By the standards of the day, i.e. against things like CP/M, it wasn't that bad.
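        To make the 1MB and 640K figures concrete: the 8086 forms a 20-bit physical address from a 16-bit segment and a 16-bit offset, and the IBM PC reserved everything from 640K upward for video memory, adapter ROMs, and the BIOS. A minimal Python sketch of the arithmetic (the segment values shown are just the conventional PC memory-map landmarks):

          def phys(segment, offset):
              # 8086 real mode: segment shifted left 4 bits plus offset,
              # wrapped to the 20 address lines (1 MB).
              return ((segment << 4) + offset) & 0xFFFFF

          print(hex(phys(0x0000, 0x0000)))               # 0x00000 - bottom of conventional RAM
          print(hex(phys(0xA000, 0x0000)), "= 640 KB")   # 0xA0000 - start of the reserved area
          print(hex(phys(0xFFFF, 0x000F)))               # 0xFFFFF - top of the 1 MB address space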

        • Well, that's only because CP/M was pretty bad, so almost anything was better than it. Compared to contemporaneous systems like OS-9, DOS was pretty dire.
        • That was one of the rather neat things about the BBC: the teletext main screen mode took up a mere 1K of RAM while allowing moderately rich semi-graphical interfaces, so you got almost the full 32K for data.

          • by AmiMoJo ( 196126 )

            I seem to recall its 6502 was running at 2MHz, so they must have splashed out on 4MHz RAM to use the "every other cycle" trick for the video hardware.

            Actually I seem to remember the fast RAM being a difficult part of the development. Was it only one Japanese manufacturer that offered it?

            The 6502 was nice like that. The Amiga was complex and hard to get good performance from in no small part due to the way it had to arbitrate RAM access, forcing the programmer to really think about it.

            • I seem to recall its 6502 was running at 2MHz, so they must have splashed out on 4MHz RAM to use the "every other cycle" trick for the video hardware.

              IIRC, and frankly this is going way back so I may not RC, the 6502 took two ticks to do anything. I don't know if that meant that it actually only hit the RAM every other cycle.

              • by AmiMoJo ( 196126 )

                It was something like that, basically you run the RAM twice as fast as the CPU and every other cycle is available for video etc

        • Just to note, the original IBM PC came with an 8088, not an 8086 (the PC-XT and PCjr also). That doesn't change any of the above, though; the chips are essentially the same except the 8086 has a 16-bit data bus and the 8088 an 8-bit data bus. The PS/2 models 25 and 30 along with the AT&T 6300 used an 8086, but the 8088 or 80286 seemed to be more commonly used.
          • by AmiMoJo ( 196126 )

            True indeed. The result was an 8 bit memory bus and 8 bit ISA, further limiting performance.

    • I actually started construction of a 6800 system, but as I've said many times, "If you live near Seattle, you've got to know Microsoft."
    • by vbdasc ( 146051 )

      ...now it all starts to make sense.

      The subsequent over-complexity, inconsistency, and arcane limits of DOS commands and Windows APIs must have roots in the Quickness and Dirtiness of the original OS that Microsoft bought and, again quickly, adapted and extended.

      If we're talking about APIs, original DOS stole them from CP/M, DOS 2.0 stole from Unix, DOS 3.0 stole some from IBM OSes, and Windows stole their API from the Mac. So, if you find DOS/Windows APIs cumbersome, it's the originals' fault :)

      monstrosity of a chip [i8086] can be contrasted with the Motorola 68000 of the same era. Take a look at their instruction sets side by side sometime if you have time, with an eye for elegance, orthogonality, simplicity, consistency of design...

      Funny, the MC68000 architecture is quite reminiscent of IBM's own System/360 from 1965.

      • The 68000 was a great chip. Unfortunately, Motorola didn't have a working 68000 at the time IBM was shopping.
        While the Intel 8088 was inferior, it was 1) available, and 2) familiar to IBM engineers (I believe as a micro-controller).
        This difficulty in making a working product is also the reason Acorn (as in ARM) built their own (and better) version of the 68000.

        Western Digital was created by ex-Motorola employees who created a better 6800 chip, i.e. the 6502. See a pattern developing?

        • by Agripa ( 139780 )

          The 68000 was a great chip. Unfortunately, Motorola didn't have a working 68000 at the time IBM was shopping.

          The 68000 was available. What was not available yet was the 68008 with an 8-bit external bus which would have kept the system price down.

      • by HBI ( 10338492 )

        The only accurate statement there is that DOS cloned some CP/M system calls into int 21h to ease translation.

        • by Agripa ( 139780 )

          The only accurate statement there is that DOS cloned some CP/M system calls into int 21h to ease translation.

          And cloned various data structures to be compatible with translated CP/M applications.
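          For a sense of how direct that cloning was, here is a partial, illustrative list (not the full table) of function numbers that mean the same thing as a CP/M BDOS call and as a DOS INT 21h function; the dictionary name below is made up:

            # Function numbers shared between CP/M BDOS calls and DOS INT 21h (AH value)
            shared_function_numbers = {
                0x01: "console input with echo",
                0x02: "console character output",
                0x09: "print a '$'-terminated string",
                0x0A: "buffered console input",
                0x0F: "open file via FCB",
                0x10: "close file via FCB",
            }
            # Keeping the numbers (and the FCB layout) the same is what made mechanical
            # translation of CP/M applications feasible.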

  • by dlarge6510 ( 10394451 ) on Wednesday January 03, 2024 @04:01AM (#64127059)

    > which contributed to the rise of a market of mostly interoperable PC clones

    That made for fun times. As a kid in the '90s I mostly started with Win 3.11 after my C64, but of course had to deal with DOS 6.22 during that time too.

    I still like using DOS on real hardware for many things, PIC programming and other serial stuff. I practically abandoned DOS and Windows once I found GNU/Linux. Those early days of home computing were full of discovery, upgrades, compatibility, and SHENANIGANS!

    The Microsoft FUD documents, Bill Gates vs Richard Stallman, Linus Torvalds, Bruce Perens, Eric S. Raymond and so many others. Microsoft being hauled into court for antitrust reasons, all the incompatibilities between IBM clones, and of course the blatant artificial incompatibilities between MS-DOS, PC-DOS and DR-DOS.

    You don't get that today. Everything works, and anything new is highly complicated, written in the language of the year, possibly cloud based, and may also be bloated simply because we are so spoilt with GiBs of memory, TiBs of storage and multiple cores running at several GHz. Lean, clever, efficient code need not apply, and almost every bit of hardware is the same as any other bit and is very much plug and play.

    The names involved today do include some of the old ones, but mostly are names of those running social networks which are as interesting as mud in my opinion.

    No wonder so many like retro tech, you actually had to learn stuff and think and troubleshoot.

    • All agreed. All CENG grads should be exposed to Carmack's inverse square root algorithm, to appreciate the world they are in and set the standards high.
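      For readers who haven't seen it, the routine alluded to here is the fast inverse square root made famous by Quake III (its true authorship is debated). A minimal Python transcription of the idea, reinterpreting the float's bits with struct; a sketch, not production code:

        import struct

        def fast_inv_sqrt(x):
            # Reinterpret the 32-bit float as an integer, apply the magic-constant
            # guess, reinterpret back, then refine with one Newton-Raphson step.
            i = struct.unpack('<I', struct.pack('<f', x))[0]
            i = 0x5f3759df - (i >> 1)
            y = struct.unpack('<f', struct.pack('<I', i))[0]
            return y * (1.5 - 0.5 * x * y * y)

        print(fast_inv_sqrt(4.0))   # ~0.499, versus the exact 0.5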
  • Archeologists discovered an early form of the wheel. The wheel, having nearly a third of its surface rounded, left the discoverers dumbfounded with amazement. "I couldn't believe it! A third of a wheel. It almost works," said Dwayne Dibbly. "It's very much like that early version of DOS."
  • "the site that used to be Twitter." ...reposted on a site that used to be relevant and not 2-7 days behind on info/news...lol
  • But for me the version where DOS started getting good was 3.3 (or I guess 3.31). And 5.0 had the biggest impact on my life, because I really got into QBASIC. It was so much easier to learn with the built-in help compared to GW-BASIC, despite being very similar languages (minus the line numbers)
