Supercomputing Cellphones Java Hardware Technology

Mobile Phones vs. Supercomputers of the Past 247

An anonymous reader writes "The recently published Top 500 list of the world's fastest supercomputers is based on the Linpack benchmark developed decades ago by Jack Dongarra. This same test has been ported to Android mobile phones, which means that we can compare the performance of our phones against that of the supercomputers of the past. For example, a tweaked Motorola Droid can hit 52 Mflop/s, which is more than 15 times faster than the CPUs used in the 1979 Cray-1." But even today's most powerful cellphones don't come with an integrated bench.
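A note on the scoring: Linpack derives its Mflop/s figure from the time taken to solve a dense n-by-n linear system, credited with a fixed nominal operation count of 2/3*n^3 + 2*n^2 flops. Here is a minimal sketch of that arithmetic in Java, the language of the Android port (the class and the sample inputs are illustrative assumptions, not the port's actual code):

    public class LinpackScore {
        // Standard Linpack credit: 2/3*n^3 + 2*n^2 flops for solving an
        // n x n dense system, however the implementation gets there.
        static double mflops(int n, double seconds) {
            double ops = (2.0 / 3.0) * n * n * n + 2.0 * (double) n * n;
            return ops / (seconds * 1.0e6);
        }

        public static void main(String[] args) {
            // Example: a 500 x 500 solve finishing in 1.6 s scores ~52 Mflop/s.
            System.out.printf("%.1f Mflop/s%n", mflops(500, 1.6));
        }
    }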
This discussion has been archived. No new comments can be posted.

Mobile Phones vs. Supercomputers of the Past

Comments Filter:
  • by Pojut ( 1027544 ) on Thursday June 03, 2010 @01:56PM (#32448928) Homepage

    ...make me kinda sad. On the one hand, I LOVE when I was born (1984). I'm old enough to remember a time without the Internet, without a PC in every home, and when cell phones were the size of briefcases...yet I'm still young enough to take advantage of technological innovations, keep up with advances, and appreciate the impact it has on our lives.

    On the other hand, I wonder how much amazing stuff I would see had I been born even just 20 years later. In my lifetime I have already watched (for example) the state-of-the-art NES give way to the average gaming PC with a video card capable of over 1 teraflop of processing power. How much extra innovation and advancement would I see if I had STARTED with those 1+ teraflop cards?

    "Fifteen hundred years ago everybody knew the Earth was the center of the universe. Five hundred years ago, everybody knew the Earth was flat, and fifteen minutes ago, you knew that humans were alone on this planet. Imagine what you'll know tomorrow." -Kay

    • by sznupi ( 719324 )

      On the bright (sort of...) side - being born later probably wouldn't have prevented you from becoming dismissive of new things past a certain point, anyway.

    • Re: (Score:3, Interesting)

      by __aapspi39 ( 944843 )

      At the risk of appearing pedantic, it's worth pointing out that fewer people thought the world was flat than is commonly believed -
      http://en.wikipedia.org/wiki/Myth_of_the_Flat_Earth [wikipedia.org]

      • Re: (Score:3, Insightful)

        by sznupi ( 719324 )

        While technically true, one can't help but wonder what the prevalent folk views were.

        Hey, even now some "theories" are just arbitrarily dismissed a bit too commonly...

    • If the transhumanists are right [wikipedia.org], you may have been born just in time for the really cool stuff.
      • They're not, in my opinion at least. I was a true believer for a while. But the actual progress just isn't matching the hoped-for predictions. From a life extension perspective, there's really no chance. I've been following things for about a decade now, and the progress has been pretty much zilch. And it's not that much better for the concept of uploading. fMRI is still king of brain scanning. And as great as it is, it's about as close to what we'd need there as saying that someone is immortal because the
        • fMRI is still king of brain scanning.

          While fMRI is currently the dominant technique for scanning brain activity, it wouldn't be used for "uploading your brain". In order to preserve the information in your brain we really need the network diagram. Currently there are a few groups in the world automating the process of cutting and scanning neural tissue with electron microscopy: Dr. Winfried Denk (serial block-face scanning electron microscopy) and Dr. Clay Reid [acm.org], to name two. Currently, they can only scan a 1mm^3 piece of brain. But it's really

          • Re: (Score:3, Insightful)

            by sznupi ( 719324 )

            ...so it would really boil down to how useful running your brain simulation is to the rest of humanity. Guess the answer to that.

            • It occurs to me that much of what our brain is doing involves things like breathing, standing upright, and processing sensory input. If we really understood the brain, we might be able to ignore the parts that would not be needed for a disembodied brain.... reduce the storage and the computational requirements.....

              It is hard for me, and yes, I am a neuroscientist, to guess how far away we are from understanding the brain enough to safely exclude bits.

              • by sznupi ( 719324 )

                When the question really becomes "which parts of our brain make us?", perhaps it's no longer so certain that messing directly with the brain is required at all? (I already wrote about this in a nearby post [slashdot.org])

                More generally, an honest answer to the question "what is our essence?" might prove unpopular with people who wish, one way or the other (many old ways around the world, in every culture...), for individual immortality; while BTW forgetting they have become quite dissimilar to themselves from two or thr

              • by Lennie ( 16154 )

                If we can make a copy like above, we can experiment the hell out of it, leave all kinds of parts 'turned off'.

                You aren't hurting a real human, are you? Hmm, interesting ethics debate that's gonna be.

                • If we can make a copy like above, we can experiment the hell out of it, leave all kinds of parts 'turned off'.

                  You aren't hurting a real human, are you? Hmm, interesting ethics debate that's gonna be.

                  Those are great questions. If we do a proper job at emulation, you certainly could cause pain by manipulating the activity in the circuit. If we emulate the full body experience then taking away breathing or heart beat might be quite disturbing even though it shouldn't "harm" the mind. Ya... the future is going to be an interesting place!

        • by sznupi ( 719324 )

          I think waiting for any "breakthrough" is a grave error, generally. And it's nothing new; humans have wished for such "breakthroughs" in their individual immortality for a long, long time - we had many resurrection deities, and early Christians were absolutely convinced they would see a breakthrough very quickly, basically within a generation.

          But wishes rarely work out as predictions of future events; when the latter do happen anyway, people are generally taken by surprise or at the least didn't see it coming in quite

        • by 0123456 ( 636235 )

          But the actual progress just isn't matching the hoped for predictions.

          I remember some famous SF writer (I think it was Clarke) pointing out years ago that people tend to overestimate short-term progress and underestimate long-term progress, because our expectations tend to be linear while progress -- absent government regulation -- tends to be exponential.

          Moore's law is an obvious example. Telling someone in 1990 that in twenty years you'd be able to buy the equivalent of a Cray Y-MP (which is about what my Atom-330 benchmarks as) for $50 and it would be about the slowest mass-market C

          • Re: (Score:3, Interesting)

            by takev ( 214836 )
            No they wouldn't; in 1990, everyone who was actually buying machines like the Cray knew about Moore's law.

            In fact, articles from that time were talking about how to use Moore's law, together with an estimate of how long a calculation would need to run, to decide when to buy the computer so as to finish said calculation quickest (provided that you couldn't or wouldn't upgrade the computer while the calculation was running).

            This included economic calculations about the price of hardware, inflation and interest
      • by sznupi ( 719324 )

        I don't really see how "transhumanist" is applicable in the case of people very much clinging to their individual lives. Which is very...good ol' human-like; and quite typical generally.

    • by corbettw ( 214229 ) on Thursday June 03, 2010 @02:15PM (#32449184) Journal

      I was born in 1971. Which means if I were a computer I would be obsolete and replaced by a faster, younger model with prettier looks.

      Come to think of it, maybe I am a computer....

      • by jollyreaper ( 513215 ) on Thursday June 03, 2010 @03:27PM (#32450214)

        I was born in 1971. Which means if I were a computer I would be obsolete and replaced by a faster, younger model with prettier looks.

        Come to think of it, maybe I am a computer....

        Take heart! There might be an older, poorly-dressed, socially stunted computer geek willing to collect you for the sheer historical value. Of course, that usually means stored with a dozen other castoffs in the basement. I don't like where this is going.

      • I was born in 1971. Which means if I were a computer I would be obsolete and replaced by a faster, younger model with prettier looks.

        By those criteria, if I were a computer, I'd be running on vacuum tubes and large enough to fill a good-sized room.

        Uh, come to think of it...

    • by iamhassi ( 659463 ) on Thursday June 03, 2010 @02:18PM (#32449218) Journal
      "On the other hand, I wonder how much amazing stuff I would see had I been born even just 20 years later (than 1984)"

      If you were born in 2004 you would have missed out on everything. All you'd know is multi-core processors, terabytes and petabytes, touchscreen everything, wireless internet everywhere, 24/7 access to everyone you don't really know and directions to anywhere from anywhere available in your pocket. You'd have no appreciation for any of it and probably know nothing about computers because modern operating systems are far better than offerings in the 90s.

      Trust me when I say you were born at the right time.
      • by nomadic ( 141991 )
        Trust me when I say you were born at the right time.

        Hey it's Paul Simon!
      • by sznupi ( 719324 )

        Trust me when I say you were born at the right time.

        That's probably one of the most prevalent misconceptions in recorded history, right beside "the demise of youth will doom the civilisation soon"...

    • by grumbel ( 592662 )

      On the other hand, I wonder how much amazing stuff I would see had I been born even just 20 years later.

      Answer: Not much, as you would take it all for granted and be unable to appreciate it [youtube.com].

    • by natehoy ( 1608657 ) on Thursday June 03, 2010 @02:42PM (#32449592) Journal

      I'm only a couple of decades older than you. I agree with you, but I also realize that I take it as a given that, during the course of my lifespan, there's always been television (not color to start with, but there was TV), that indoor plumbing and lights have always been around, flight is not only possible but commonplace and pretty much always has been, and the moon landing happened before I was born.

      A part of me regrets missing the introduction of all of those exciting technologies and innovations, because to me they are all background things that just are. They aren't wondrous, they just are.

      No matter where you live in history, there are always improvements that you'll appreciate. But there's always amazing stuff that was there before, which you will only see as part of the world as it's always been, and there will be even more amazing stuff that comes after you, which would probably blow your mind if you ever had the chance to see it (or would be so far beyond your comprehension that you couldn't appreciate it).

      You don't truly appreciate the amazing parts of an advance unless you've watched those parts happen.

      To me, computers (and video games, etc), color/stereo televisions, microwaves, mobile phones, digital wristwatches, and many of the things you no doubt take for granted are marvels. When I was a kid, they largely did not exist. Which is not to say all of them were completely unavailable, but when I was growing up no one I knew owned any of them and they were brand new.

      I both envy my grandparents (now all dead) and my yet-to-be-born grandchildren the wonders of their lifetimes that I will never see the way they do. The wonders of my grandparents are my commonplace items. The wonders of my grandchildren are probably beyond my imagination.

      But that's just human nature. We want to see it all. And eventually we learn we'll never succeed. It's both heartening and saddening at the same time.

      • by schon ( 31600 ) on Thursday June 03, 2010 @04:35PM (#32451054)

        when I was born (1984)

        I'm only a couple of decades older than you. [...] during the course of my lifespan, there's always been television (not color to start with, but there was TV), that indoor plumbing and lights have always been around, flight is not only possible but commonplace and pretty much always has been, and the moon landing happened before I was born.

        .. and people could always do simple arithmetic.

    • Re: (Score:3, Insightful)

      by Kjella ( 173770 )

      I think it really all depends on perspective - my old man worked with everything from vacuum tube computers and magnetic core memory up to PCs before he retired, and he thought the advancements were pretty damn amazing. If you want to crown the most revolutionary era of computing, there's very heavy competition. The 40s saw the first real computers, the 50s the transistor, the 60s the mainframes, the 70s the minicomputer, the 80s the PC, the 90s the Internet, the 00s mobile devices and wireless. Every one of them a rev

    • Re: (Score:3, Insightful)

      by tuomoks ( 246421 )

      A good time to be born, LOL. Don't take seriously the replies from other youngsters; they know maybe less than you about how it was before. Cell phones the size of a briefcase were actually in use already in '82 - well, "cell" is a strong word, they were NMT phones, but the size is correct. I had one to carry with me and one in the car, ouch! Yes, hardware efficiency has gone up a lot; the problem is that the waste in software has grown even faster, so in many ways we are still on the same level. Fun, games, beautiful(?) pictures, etc ar

    • I LOVE when I was born (1984). I'm old enough to remember a time without the Internet

      No you're not. The Web is not the Internet. Good thing you're still young enough to be schooled, kid! :)

  • Time machine (Score:3, Insightful)

    by MrEricSir ( 398214 ) on Thursday June 03, 2010 @01:57PM (#32448938) Homepage

    So if I read this correctly, the point of this article is that we should get a time machine so we can go back to the 70's and impress people with our smartphones?

    See the problem here is that they won't have wifi or 3G coverage. All we'll be able to do is show those people of the ancient past Angry Birds and maybe one of those "pull-my-finger" apps. It just won't be all that impressive.

    • Re: (Score:3, Insightful)

      I was about to say, all this computing power finally in the hands of the ordinary person and what's the most popular application? Fart Button...

    • by Anonymous Coward on Thursday June 03, 2010 @02:03PM (#32449034)

      ...by our time machines and shaved privates.

    • by kindbud ( 90044 )

      So if I read this correctly, the point of this article is we should get a time machine so we can go back to the 70's and impress people with our smartphones?

      It's already happened. How do you think we got the tech in the first place?

    • by mcrbids ( 148650 )

      Taking the 15x performance increase over the 1979 Cray, we find that there are about 4 doublings to get to 15x (~16x), meaning that the Android phone roughly compares to a 1985 supercomputer, which doesn't surprise me at all. My cheap, now antiquated WinMo smartphone easily plays 486-era DOS games in a virtual-box emulator, despite being a radically different chipset (ARM, not x86). So factor in an approximately 50% cut in performance due to emulation, and you have my phone demonstrably comparing to (at least!) a
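      The parent's estimate survives a quick sanity check. A back-of-envelope version in Java, using the 52 vs. 3.4 Mflop/s Linpack figures cited elsewhere in the thread and assuming the classic 18-month doubling period (both inputs are assumptions, not measurements):

        public class DoublingCheck {
            public static void main(String[] args) {
                double ratio = 52.0 / 3.4;                          // Droid vs. Cray-1, Linpack n=100
                double doublings = Math.log(ratio) / Math.log(2.0); // about 3.9 doublings
                double years = doublings * 1.5;                     // about 5.9 years at 18 months per doubling
                System.out.printf("%.1f doublings, 1979 + %.1f years%n", doublings, years);
            }
        }

      That lands squarely on a mid-1980s supercomputer, as the parent says.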

    • See the problem here is that they won't have wifi or 3G coverage.

      That's the minor problem. You can fake a connection. I would be thinking more of the missing content. Back then UNIX man pages were the most exciting deal around. And BTW, today they're still pretty cool.

  • Integrated bench (Score:3, Interesting)

    by Animats ( 122034 ) on Thursday June 03, 2010 @01:58PM (#32448962) Homepage

    It's sad. I was at the Computer History Museum in Mountain View a few years ago, where they had a Cray-1 in a corner of the lobby, just sitting there, used as a bench. It's not even labeled; some visitors think it's just furniture.

    • I saw that on Wikipedia just now and thought the same thing; it would be like turning the retired Space Shuttles into a restaurant.

      A very undignified end for a brilliant piece of engineering.

      • by Aladrin ( 926209 )

        Are you kidding me? That restaurant would make so much money!

      • Actually, no!

        If done *properly* by someone with visionary capital, a really decked-out restaurant with future tech would be 2025-now.

        But no, we'll get some twerp with a Meijer background who would want to make it kitschy.

    • a suitable penalty for such disrespect would be to work out all the prime numbers up to 10,000, on paper. ...and then to chop down the highest tree in the forest with a herring

  • 1979 tech still wins (Score:5, Informative)

    by Skarecrow77 ( 1714214 ) on Thursday June 03, 2010 @02:09PM (#32449104)

    For example, a tweaked Motorola Droid can hit 52 Mflop/s, which is more than 15 times faster than the CPUs used in the 1979 Cray-1.

    "The Cray-1 had 12 pipelined functional units" and had "floating point performance generally about 136 MFLOPS. However, by using vector instructiosn carefully and building useful chains, the system could peak at 250 MFLOPS."

    http://en.wikipedia.org/wiki/Cray-1 [wikipedia.org]

    • by Hatta ( 162192 )

      Now rate them for watts/FLOP and tell us who wins.
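      Taking the bait, with loudly assumed figures: the Cray-1's draw is commonly cited at roughly 115 kW, and a phone SoC under load is guessed here at about 1 W (a guess, not a measurement). A sketch of the division:

        public class WattsPerMflops {
            public static void main(String[] args) {
                double cray  = 115000.0 / 3.4; // roughly 34,000 W per Mflop/s at Linpack n=100
                double droid = 1.0 / 52.0;     // roughly 0.02 W per Mflop/s
                System.out.printf("Cray-1: %.0f W/Mflops, Droid: %.3f W/Mflops%n", cray, droid);
            }
        }

      Under those assumptions, it isn't close.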

    • Ya, but let's see the Cray-1 make a phone call and then fit in your pocket.
    • by imgod2u ( 812837 )

      I'm curious about the Linpack results when running under Dalvik. I believe Froyo (Android 2.2) enabled Dalvik to start using the FPU of the ARM ISA. So what were the results before? Emulated floating point?

      On top of that, the ARM ISA allows for SIMD operations. I would assume the VM isn't capable of that.

      If we're to compare processing power, the Cortex A8 at 1GHz (A4, OMAP 3640, overclocked Droid) is capable of a vector multiply (2 at a time) every 3 cycles and an add every 2 cycles. So that's about 400 MFLOPS.

    • I also seem to recall the Cray-1 had 64-bit words, hence 64-bit floats. All I can find for ARM is 32-bit floats.
  • Speed (Score:3, Funny)

    by DebianDog ( 472284 ) <dan@dAUDENanslagle.com minus poet> on Thursday June 03, 2010 @02:19PM (#32449236) Homepage

    Did they think they could run their website on a Droid too? Man it is slow.

    • Re: (Score:3, Funny)

      by Bigbutt ( 65939 )

      No no, it's running on a Cray-1. If it were running on a Droid, it'd be a bit faster.

      [John]

  • by flaming-opus ( 8186 ) on Thursday June 03, 2010 @02:24PM (#32449304)

    I thought it was strange that the article author was reporting that a Cray-1, which had a peak performance of around 130 Mflops, produced only 3.4 Mflops on Linpack. Looks like the author doesn't understand the benchmark very well.

    If you look at the data quoted in the article, the n=100 result gives the Cray-1 a score of either 3 or 12 Mflops, depending on which entry you look at. There is no n=1000 result listed for the Cray-1, but one can expect, looking at the Cray X-MP results, that it would be around 100, given the peak performance. The ETA10 would likely get a couple thousand Mflops on Linpack with n=1000.

    The Cray-1 is more than a little dated. That said, if you look at supers from the early 90's, they can still do things that modern commodity hardware can't. As fast as a Xeon or Opteron is, it doesn't have 300 Gbytes/second of memory bandwidth. Even late-80's supercomputers exceed desktops in some metrics, though probably not in raw ALU performance if the data all fits into L1 cache. The cost to run a desktop, however, is pretty compelling, and desktops don't leak Freon when they run.

    • by afidel ( 530433 )
      Actually, an 8-way Opteron with the newest processors will have a STREAM rate of ~320GB/s and a heck of a lot more than 8GB of RAM available =) Besides, the T90 was mid-90's, not early =)
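      For reference, STREAM numbers come from a handful of simple kernels. A minimal Java sketch of the "triad" one (the array size is an arbitrary choice here, the official benchmark is C/Fortran with stricter rules, and a single unwarmed pass understates the JIT, so treat the result as a rough floor; it also needs a few hundred MB of heap):

        public class Triad {
            public static void main(String[] args) {
                int n = 10000000;
                double[] a = new double[n], b = new double[n], c = new double[n];
                java.util.Arrays.fill(b, 1.0);
                java.util.Arrays.fill(c, 2.0);
                long t0 = System.nanoTime();
                for (int i = 0; i < n; i++) a[i] = b[i] + 3.0 * c[i]; // the triad kernel
                long t1 = System.nanoTime();
                double bytes = 24.0 * n; // two 8-byte reads plus one 8-byte write per element
                // bytes per nanosecond is numerically equal to GB/s
                System.out.printf("triad: %.2f GB/s (a[0]=%.1f)%n", bytes / (t1 - t0), a[0]);
            }
        }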
  • Mobile Phone: a device that can make telephone calls and can be easily transported in a pocket or purse.
    Supercomputer: a computing device that people call 'super'.

    One is a quantitative definition, and one is a qualitative definition. I will let you decide which is which!
    • Mobile Phone: a device that can make telephone calls and can be easily transported in a pocket or purse.

      Actually, that last part is a fairly modern result.

      Initially, "mobile" phones were huge. I remember some that were basically a brief-case sized battery with a corded phone attached. No way you could put 'em in a pocket or a purse.

      Just like some of the early "portable" computers (luggables) were still heavy boxes with a CRT in it -- sure, you could move them from one place to another more easily than a d

  • by petes_PoV ( 912422 ) on Thursday June 03, 2010 @02:29PM (#32449390)
    Not rendering bitty little colour screens or scanning for viruses. Plus the code was written to extract every last drop of power out of the architecture. So when you compare the amount of WORK a machine from the 70s or 80s did (my university's mainframe had a FORTRAN compiler that needed less than 131 kWords of memory - today the GRUB bootloader is bigger than that) with a more modern box, with all its overheads and inefficiencies, the balance isn't as great as the scoffers might think.
    • That's because in the time it takes to optimize everything into itty-bitty pieces, the next generation of hardware comes out and is faster without bothering. There are operating systems out there written entirely in assembly, and assuming they're done properly I can imagine they are quite lean... but it takes forever to add features.
    • by somenickname ( 1270442 ) on Thursday June 03, 2010 @02:36PM (#32449484)

      Not rendering bitty little colour screens or scanning for viruses. Plus the code was written to extract every last drop of power out of the architecture. So when you compare the amount of WORK a machine from the 70s or 80s did (my university's mainframe had a FORTRAN compiler that needed less than 131 kWords of memory - today the GRUB bootloader is bigger than that) with a more modern box, with all its overheads and inefficiencies, the balance isn't as great as the scoffers might think.

      Does that make it any less impressive that a cell phone is putting up these kinds of numbers? Does it make it less impressive that you can code up Linpack in Java, throw it at a JVM, and rely on the JIT compiler to optimize the DAXPY for the hardware on the fly? I think both of those things are pretty damn impressive.
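      The DAXPY the parent mentions is a one-line kernel; in Java it is just the following (a sketch, not the benchmark port's actual code), and it's exactly the kind of loop a JIT can unroll or vectorize per device:

        // DAXPY: y = alpha*x + y, the inner loop of Linpack's solver.
        // The JIT, not the programmer, decides how to optimize it.
        static void daxpy(int n, double alpha, double[] x, double[] y) {
            for (int i = 0; i < n; i++) {
                y[i] += alpha * x[i];
            }
        }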

  • Somehow, I'm not so impressed, considering Moore's Law predicts a roughly 1 million-fold (≈ 2^(30/1.5), one doubling every 18 months) increase in transistor count over the span of 30 years...

  • by Doc Ruby ( 173196 ) on Thursday June 03, 2010 @02:32PM (#32449434) Homepage Journal

    For example, a tweaked Motorola Droid can hit 52 Mflop/s, which is more than 15 times faster than the CPUs used in the 1979 Cray-1.

    Cray's approach [wikipedia.org] to supercomputing wasn't just to make the CPU fast. Indeed, he outcompeted faster CPUs by making every part of his computers fast, so no power in the machine was wasted waiting for something else. IO and memory throughput were his special focus. A Droid's CPU is bottlenecked by the rest of the device.

    Pointing out this unfair comparison isn't just whining about missing Cray's point. There's a lot of power in that Droid that the SW can't exploit, because its bottlenecks leave the fast parts waiting. Not only does that slow them down, it also wastes electrical energy - which is the biggest problem in mobile devices.

    LINPACK isn't the best way to measure supercomputers, and "nanocomputers" like mobile phones could be better if they learned something from Cray's research 40 years ago.

    • by whyde ( 123448 ) on Thursday June 03, 2010 @03:21PM (#32450144)

      Actually, for mobile devices, the most important metric is performance per unit of power instead of just performance per unit time. After a certain speed/throughput has been reached, nobody cares how fast the CPU is, only how long the battery lasts.

      For scientific purposes, back when Cray was building systems, you got charged for every second you had access to the computer. So you carefully composed the solution to your problem to make darned sure every whizz-bang aspect of the computer was doing something useful all the time. Today, you just want to play a game for a while, then make a voice call, and you don't want the battery to fizzle out before you get home (and maybe have some juice left for watching a show during the train ride home).

      Mobile devices don't try to match the throughput of all parts of the system, because it's not in anybody's interest to keep the I/O subsystem saturated close to capacity 100% of the time you're using your Droid/iPhone... in fact, they turn them off (go into a low power state) and do aggressive power management that is coordinated system-wide.

  • by MauiMaker ( 1802288 ) on Thursday June 03, 2010 @02:48PM (#32449676)
    Back in 1983, I worked at Digital Productions [wikipedia.org] where we had one of the very few commercially owned Cray (X-MP) computers. We were doing 'proper work' of making some of the earliest CGI for film and advertising. There was a bit of film before (Tron, Westworld, Looker, JPL stuff, etc) but The Last Starfighter [wikipedia.org] was the first major film to use CGI exclusively for its spaceships, etc. in flying sequences. (Robert Preston drove a mockup car for ground scenes.) Each minute of film took (on rough avg) an hour of CPU time. All the rendering code was written in FORTRAN and ran on the Cray, outputting to film on a custom digital film printer.

    Today, the games you can play on your iPhone/Android or even the aging Nintendo DS have better graphics!! Resolution is a lot lower (not 3000x5000!) but at the screen size it certainly looks much better - and rendered in real time!

  • It's an interesting comparison, but let's keep in mind that "porting" an app from one platform to another is not performance-neutral. Efficiencies can be gained or lost depending on the compiler or the person rewriting the code; a 90% performance penalty in poorly compiled code would not be unusual. Raw mathematical computations like the ones Linpack performs port most cleanly, but since this version runs on top of a Java platform, the system overhead is already probably higher than what the Cray has to slog through. So the p
  • This is sad really (Score:4, Insightful)

    by scorp1us ( 235526 ) on Thursday June 03, 2010 @03:19PM (#32450098) Journal

    Remember this scene in hackers?

    PHREAK: Yo. Check this out guys, this is insanely great, it's got a 28.8 BPS modem!
    DADE: Yeah? Display?
    CEREAL: Active matrix, man. A million psychedelic colors. Man, baby, sweet, ooo!
    NIKON: I want it.
    PHREAK: I want it to have my children! ...
    KATE: What the hell are you doing?
    DADE: It's cool, I'm just looking.
    KATE: It's too much machine for you.
    DADE: Yeah?
    KATE: I hope you don't screw like you type.
    DADE: It has a killer refresh rate.
    KATE: P6 chip. Triple the speed of the Pentium.
    DADE: Yeah. It's not just the chip, it has a PCI bus. But you knew that.
    KATE: Indeed. RISC architecture is gonna change everything.
    DADE: Yeah. RISC is good.

    Now, imagine all that excitement over the processing power and bandwidth they had, even on a 28.8 modem - power that we now have multiples of... in our pockets. Where is it being leveraged for the good of mankind? Folding and SETI are good starts, but they haven't taken off. We've got tons of idle cycles... You'd figure there'd be some processing client where you get paid for your cycles, but that only exists as illegal botnets. Where's the open utility computing? Why don't my computers' idle cycles pay for themselves?

    They were supposed to make our lives easier, but for as much as they empowered us, the exception processing got dumped on us. The nature of that work is different from the regular rhythmic routine of normal processing. Exceptions are urgent, require more effort and as a result are more stressful. And any news you get is when something is wrong.

    I like the idea of being able to chat with people on the other side of the planet, but I haven't figured out what good it is to me. We don't have much in common with each other. I like the idea that I can do my own stock trading, but this usually means I lose money instead of my money manager. ;-p

    Computers now cause as many problems as they solve (Goldman Sachs, AIG, I'm looking at you!) Is our society any better? Are people happier? Or are we more stressed out?

    (And what has my /. commenting gotten me? Not a date or a dollar, for sure!)

    • Um, Folding and SETI on all your phones would indeed process a lot of data.

      And kill a lot of batteries.

      Nice idea, not. All your chargers are belong to us.

  • There was a time when 15 Crays in your pocket wooed a gal.

    (And it was cool to drink, drive and to get oneself and others killed. Times change. Mostly for the better.)
    • Re: (Score:3, Funny)

      Having 15 Crays in your pocket never wooed a gal.
      Having enough $ in your pocket to buy 15 Crays... That is another story.
  • When I started work as a computer programmer, the supercomputer of the time was the CDC 6600, which had just taken the crown from the Ferranti Atlas.

    When I took early retirement about 7 years ago, I often carried four devices, each of which needed about the power of the 6600 to function effectively:
    A mobile phone
    An MP3 player
    A PDA (mainly used as an ebook reader)
    A GPS (OK, I didn't carry this all that often)

    A composer/researcher was using our University
