Less Is Moore

Hugh Pickens writes "For years, the computer industry has made steady progress by following Moore's law, derived from an observation made in 1965 by Gordon Moore that the amount of computing power available at a particular price doubles every 18 months. The Economist reports, however, that in the midst of a recession many companies would now prefer that computers get cheaper rather than more powerful, or, applying the flip side of Moore's law, do the same for less. A good example of this is virtualisation: using software to divide up a single server computer so that it can do the work of several, and is cheaper to run. Another example of 'good enough' computing is supplying 'software as a service' via the Web, as done by NetSuite and Google, sacrificing the bells and whistles offered by conventional software that hardly anyone uses anyway. Even Microsoft is jumping on the bandwagon: the next version of Windows is intended to do the same as the last version, Vista, but to run faster and use fewer resources. If so, it will be the first version of Windows that makes computers run faster than the previous version. That could be bad news for computer-makers, since users will be less inclined to upgrade, only proving that Moore's law has not been repealed: more people are simply taking the dividend it provides in cash rather than in processor cycles."
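The "dividend" framing in the summary comes straight from the 18-month doubling figure. A minimal sketch of that compounding, assuming the summary's popular paraphrase of Moore's observation (the doubling period and upgrade-cycle length are illustrative):

```python
# Compounding implied by the summary's figure: computing power per dollar
# doubling every 18 months (a popular paraphrase of Moore's observation).
def growth_factor(years, doubling_period_years=1.5):
    """Multiple of baseline capability after `years` at the stated doubling rate."""
    return 2 ** (years / doubling_period_years)

# Over a typical 3-year corporate upgrade cycle, capability quadruples,
# which is the choice the article describes: 4x the power at the old price,
# or roughly the old power at a fraction of the price.
print(growth_factor(3))    # 4.0
print(growth_factor(1.5))  # 2.0
```

Whether one takes that factor as extra cycles or as a cheaper machine is exactly the cash-versus-cycles dividend the summary describes.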
  • Faster Windows! (Score:1, Insightful)

    by D Ninja ( 825055 ) on Wednesday January 28, 2009 @03:07PM (#26642583)

    If so, it will be the first version of Windows that makes computers run faster than the previous version.

    Nooo...computers are running at exactly the same speed. They just won't have to chew through bloated software. Microsoft is (supposedly) making their software more efficient.

    Can't stand writers who don't understand tech.

  • by Opportunist ( 166417 ) on Wednesday January 28, 2009 @03:09PM (#26642609)

    Let's be honest here. What does the average office PC run? A word processor, a spreadsheet, an SAP frontend, maybe a few more tools. And then we're basically done. This isn't exactly rocket science for a contemporary computer; it's heavy on neither the CPU nor the GPU. Once the computer is faster than the human, i.e. as soon as the human doesn't have to wait for the computer to respond to his input, when the input is "instantly" processed and the user never sees a "please wait, processing" indicator (be it an hourglass or whatever), "fast enough" is achieved.

    And once you get there, you don't want faster machines. More power would essentially go to waste. We reached that point about 4-5 years ago. Actually, we're already one computer generation past "fast enough" for most office applications.

  • Sigh (Score:4, Insightful)

    by Bastard of Subhumani ( 827601 ) on Wednesday January 28, 2009 @03:11PM (#26642635) Journal

    companies would now prefer that computers get cheaper rather than more powerful

    There's already a method for that: it's called by the catchy title "buying a slightly older one".

    A related technique is called "keeping the one you've already got".

  • by benjfowler ( 239527 ) on Wednesday January 28, 2009 @03:13PM (#26642669)

    This could simply be down to the tanking economy: people look at what they're spending, and quickly realise that:

    1) the upgrade treadmill over the last twenty years has produced insanely powerful and dirt-cheap hardware. When was the last time you had trouble running Linux on your hardware? I'm old enough to remember!

    2) and that you don't need teraflops of CPU/GPU power just to draw greasepaper-style borders around your Microsoft Word windows. Perhaps the entire industry has woken up and seen how unbelievably wasteful modern computing is, and have decided to take the dividend of Moore's Law in cash instead.

    3) recessions are good for purging wasteful and suboptimal behaviour generally.

    Maybe people will realize what an obscene waste of money and computing power an operating system like Windows Vista, which requires a gig of RAM to run, really is.

  • Re:Or... (Score:2, Insightful)

    by craighansen ( 744648 ) on Wednesday January 28, 2009 @03:20PM (#26642775) Journal
    In fact, M$ might end up settling these Vista-Capable lawsuits by offering upgrades to W7, especially if it's faster on the barely-capable hardware that is the subject of the suit. Cheap way to settle for them...
  • by Hognoxious ( 631665 ) on Wednesday January 28, 2009 @03:22PM (#26642805) Homepage Journal

    as soon as the human doesn't have to wait for the computer to respond to his input, when the input is "instantly" processed and the user never sees a "please wait, processing" indicator (be it an hourglass or whatever), "fast enough" is achieved.

    It's Moore's other law - once fast enough is achieved, you have to slow it down with shite like rounded 3d-effect buttons, smooth rolling semi-transparent fade-in-and-out menus and ray-traced 25 squillion polygon chat avatars.

  • by Anonymous Coward on Wednesday January 28, 2009 @03:22PM (#26642811)

    There are many options. Multiseat, the Linux Terminal Server Project (LTSP), thin clients (netbooks and barebones desktops), etc. In other words, the return of time-sharing, evolved and rebuilt with modern technology.


    A low-end computer is enough for 99% of the work. Almost nobody needs or uses a 4-core CPU (except for games), and even in the enterprise this kind of power usually goes unused.

  • by xs650 ( 741277 ) on Wednesday January 28, 2009 @03:36PM (#26643027)

    Even Microsoft is jumping on the bandwagon: the next version of Windows is intended to do the same as the last version, Vista, but to run faster and use fewer resources. If so, it will be the first version of Windows that makes computers run faster than the previous version.

    Without Vista, MS wouldn't be able to claim that 7 was faster than their previous version of Windows.

  • Re:Bad Logic (Score:5, Insightful)

    by timholman ( 71886 ) on Wednesday January 28, 2009 @03:38PM (#26643085)

    Microsoft is notorious for ignoring customer desires to fix what they have and offering unprompted additions and UI changes.

    Yes, and as so many have pointed out, their history of doing so is now backfiring on them in a big way. And it's not just with Vista, it's with Office as well.

    Case in point - several months ago my department bought upgrade licenses to Office 2008. I was perfectly happy with Office 2004, but I installed Office 2008 because I knew that if I didn't, I wouldn't be able to read whatever new formats that Office 2008 supported. It had happened with every other Office upgrade cycle in my experience - you either upgraded or you'd be unable to exchange documents with your peers.

    But something funny happened this time - I have yet to receive a .docx, .xlsx, or .pptx file from anyone. I have quite consciously chosen to save every document in .doc, .xls, or .ppt "compatibility" format. Everybody I talk to says they're doing exactly the same thing. Everyone now knows the game that Microsoft plays, and no one is willing to play it anymore. I could have stayed with Office 2004 and never noticed the difference. So what motivation will I have to upgrade to the next version of Office?

    If it weren't for Microsoft's OEM licensing deals, Vista would have a tiny fraction of its current market share. XP is "good enough". But Microsoft doesn't push Office onto new machines the way it does Windows, the older Office formats are also "good enough", and you have open source alternatives like OpenOffice if Microsoft tries to deliberately break Office compatibility on the next version. I fully expect Microsoft's Office revenues to take a steep dive in the next few years. The Vista debacle is only the beginning.

  • Re:Bad Logic (Score:4, Insightful)

    by jonbryce ( 703250 ) on Wednesday January 28, 2009 @03:41PM (#26643125) Homepage

    Not really, especially in the days when you had Intel and AMD racing to be the producer of the fastest chip.

  • Re:Hmm... (Score:0, Insightful)

    by roe-roe ( 930889 ) on Wednesday January 28, 2009 @03:55PM (#26643351) Homepage
    While I can appreciate your sentiment, you *can't* get a decent laptop for $500. You can get a laptop that will run XP or GNU/Linux or *BSD for $500. But the world uses Windows, and if you are going to be running Vista well, you are looking at $800 for the laptop. And while that is phenomenal, TFA is trying to convey that over the next few months they want to take the $800 laptop and make it cost $500, and the $500 desktop and make it cost $400. Industries hurting now don't care where we are going to be in 100 years or even how far we have come in 10. The industry has been chasing an ever-increasing sliding scale of performance. Consumers have benefited by getting more powerful machines.

    Oddly enough, Moore's observations are still viable, but it is the economy that is going to slow the trend. Demand is shifting from the same price point to a lower one. This will cause a momentary dip in the trend. Once the new price point stabilizes, Moore's Law will again be relevant.
  • Re:Let's see (Score:4, Insightful)

    by jank1887 ( 815982 ) on Wednesday January 28, 2009 @03:58PM (#26643411)
    you can pipe to more from a DOS prompt too. So a few of us get to stay on the lawn.
  • Re:Sigh (Score:3, Insightful)

    by RiotingPacifist ( 1228016 ) on Wednesday January 28, 2009 @04:19PM (#26643771)

    He said older, not used. An older CPU is still as safe as a newer one.

  • by fuzzyfuzzyfungus ( 1223518 ) on Wednesday January 28, 2009 @04:20PM (#26643791) Journal
    While I can see the desire for cheaper rather than more powerful, I do wonder how much of the power/price tradeoff curve actually makes sense. Traditionally, the very high end of the curve makes very limited sense, since it is the nightmare world of low yields, early-adopter taxes, and super-critical enterprise stuff. In the middle, the power/price curve tends to be roughly linear, before gradually becoming less favorable at the bottom because of fixed costs.

    As long as a processor, say, has to be tested, packaged, marked, shipped, etc. (which costs very similar amounts, whether the die in question is a cheap cut-down model or a high-end monster), there is going to be a point below which cutting performance doesn't actually cut price by any useful amount. The hard drive is the same way. Any drive has a sealed case, controller board, motor, voice-coil unit, and at least one platter. Below whatever the capacity of that basic drive is, there are no real cost savings to be had. (Incidentally, that is one of the interesting things about flash storage. HDDs might be 10 cents a gig in larger capacities, but that doesn't mean you can get a 4-gig drive for 40 cents; I had a quick look, and you can't get anything new for under about $35. With flash, you might be paying 100 cents a gig, but you pretty much can get any multiple you want.)

    Cost, overall, is gradually being whittled down; but, once all the low hanging super high margin products are picked off, there is going to be a point past which it simply isn't possible to exchange price for performance at any reasonable rate. Used and obsolete gear offers a partial solution(since it can be, and is, sold below the cost of production in many cases) but that only works if your needs are small enough to be fulfilled from the used market.
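The fixed-cost floor described above can be put in a toy model. This is only a sketch using the comment's own rough 2009-era figures (the ~$35 HDD floor, ~10 cents/GB for disk, ~$1/GB for flash); the exact numbers are illustrative assumptions, not market data:

```python
# Toy cost model for the parent's point: a fixed per-unit cost (case,
# controller, motor, testing, shipping) puts a floor under any drive's price,
# so below some capacity, cutting capacity stops cutting price.
def drive_price(capacity_gb, fixed_cost, per_gb):
    """Price = fixed per-unit cost + marginal cost per gigabyte."""
    return fixed_cost + per_gb * capacity_gb

HDD = dict(fixed_cost=33.0, per_gb=0.10)    # ~10 cents/GB, ~$35 floor
FLASH = dict(fixed_cost=2.0, per_gb=1.00)   # ~$1/GB, tiny fixed cost

for gb in (4, 40, 400):
    print(gb, round(drive_price(gb, **HDD), 2), round(drive_price(gb, **FLASH), 2))
```

At 4 GB the flash drive is far cheaper despite its 10x higher per-gig cost, while at 400 GB the HDD wins easily, which is exactly why a 4-gig hard drive can never cost 40 cents: the fixed cost dominates at the bottom of the range.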
  • Re:Bad Logic (Score:3, Insightful)

    by CaptCovert ( 868609 ) on Wednesday January 28, 2009 @04:21PM (#26643801)
    There are a few things you're not considering here, though:
    1. The Office Open XML format (.docx, .pptx, .xlsx, etc.) is the default for Office 2008, and while those in your department may trade docs in the old format now, sooner or later people will get lazy and start creating docs in the newer format.
    2. If you do any document transfers with other companies, eventually, you will see the dreaded .docx. What will you do then?

    I'm not even going to get into the number of FOSS-based companies you leave in the cold by hanging onto .doc and the proprietary document format that it represents instead of using the freely available OOXML specification.

  • by Peepsalot ( 654517 ) on Wednesday January 28, 2009 @04:23PM (#26643837)
    In recent years not only has CPU performance been increased, but the efficiency in terms of power consumption per unit of work has greatly improved.

    Even if the majority of users begin to realize they have no practical use for top-end CPUs with gobs of processing power, everyone still benefits from higher-efficiency CPUs. It reduces electric bills, simplifies cooling systems, allows for smaller form factors, etc. I think in the future power efficiency will become more important as people start to care less about having the ultimate killer machine in terms of processing power. People are already performing actions on their mobile devices (iPhone, Blackberry, etc.) which were possible only on a desktop in past years. The strict power requirements of these devices with tiny batteries will continue to demand improvements in CPU technology.

    I'm waiting for the day when it is common to see completely passively cooled desktop computers, with solid state hard disks, no moving parts, sipping just a few watts of power without emitting a single sound.
  • by GreatBunzinni ( 642500 ) on Wednesday January 28, 2009 @04:27PM (#26643887)

    The article you pointed out is pure nonsense. It claims that bloat isn't important because memory costs dropped. Not only that, it tries to base that claim on this idiotic metric of dollars per megabyte: the fact that software like Microsoft's Excel bloated from a 15MB install in the 5.0 days to a 146MB install in the 2000 days is somehow a good thing, because in the 5.0 days it took "$36 worth of hard drive space" while "Excel 2000 takes up about $1.03 in hard drive space". No need to justify a nearly ten-fold footprint increase. We are saving money by installing more crap to do the exact same thing.

    In fact, the idiot that wrote that article even had the audacity to state:

    In real terms, it's almost like Excel is actually getting smaller!

    Up is down, left is right, bloat is actually good for you.

    But people still complain. Although it appears that we should be grateful for all that bloat, we are somehow being ungrateful by believing that all that bloat is unnecessary. But fear not, the idiot that wrote the article has a nice accusation for all those bloat haters out there:

    I guess those guys just hate Windows.

    Yes, that's it. We don't hate an orders-of-magnitude increase in bloat just to perform exactly what was easily done with a fraction of the resources. We don't hate being forced to spend money on hardware only to be left with a less-than-adequate solution compared with the previous generation. We simply hate Windows. Good call.

    The article is bad and you should feel bad for posting a link to it.
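The dollars-per-megabyte metric this comment attacks can be reproduced from the figures it quotes. A short sketch, using only the numbers stated in the comment (install sizes and the quoted dollar amounts; the implied disk prices fall out of the division):

```python
# Reproducing the "dollars of disk" metric the parent comment objects to,
# using the figures quoted there.
excel5_mb, excel5_disk_cost = 15, 36.00        # "$36 worth of hard drive space"
excel2000_mb, excel2000_disk_cost = 146, 1.03  # "$1.03 in hard drive space"

# Implied price per MB in each era:
print(excel5_disk_cost / excel5_mb)                   # 2.4  ($/MB, Excel 5.0 era)
print(round(excel2000_disk_cost / excel2000_mb, 4))   # 0.0071 ($/MB, circa 2000)

# The comment's counterpoint: the install itself grew nearly ten-fold.
print(round(excel2000_mb / excel5_mb, 1))             # 9.7
```

The metric only looks favorable because disk prices fell roughly 300-fold over the same period; the footprint growth it excuses is the ~10x figure on the last line.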

  • by pz ( 113803 ) on Wednesday January 28, 2009 @04:31PM (#26643957) Journal

    We more-or-less got enough computing power for most things with the introduction of the PIII 1GHz CPU. You might not agree with this, but it's at least approximately true. A computer outfitted with that processor and reasonable RAM browses the web just fine, plays MP3s, reads email, shows videos from YouTube, etc. It doesn't do everything that you might want, but it does a lot.

    If we took the amazing technology that has been used to create the 3 GHz multi-core monsters with massive on-chip cache memory in a power budget of 45W or so in some cases, and applied it to a re-implementation of the lowly PIII, we'd win big. We'd get a PIII 1GHz burning a paltry few watts.

    And this is precisely why chips like the Intel Atom have been so successful. Reasonable computing power for almost no electricity. We don't necessarily need just MORE-FASTER-BIGGER-STRONGER, which is the path Intel and AMD have historically put the most effort into following, we also need more efficient.

  • Re:More is More (Score:3, Insightful)

    by ianare ( 1132971 ) on Wednesday January 28, 2009 @04:35PM (#26644027)

    If computer speeds are twice as fast, and it currently takes you ten seconds to accomplish Task A, and a new computer will allow you to accomplish that same task in 5 seconds .... getting a new computer is not that big of a deal.

    Depends on the frequency of the task too. In Photoshop, going from 10 to 5 secs for a simple task done often (say noise reduction) is a big deal. Or if an IDE takes 1 vs 2 seconds to load the popup showing class variables/methods. It doesn't sound like a big deal, but when you have to do it hundreds of times a day, believe me, those seconds add up! I would gladly upgrade (and have upgraded) a PC for these kinds of real-world improvements.

  • by Overzeetop ( 214511 ) on Wednesday January 28, 2009 @04:47PM (#26644243) Journal

    Actually, what we need is a massively fast processor which can scale - quickly - to a "slow" processor like the PIII. Most of the time my systems are idle, and I'd be happy with them running at 400MHz if I could do it for a 90% savings in power. When I hit the gas, though, and want to load my ipod with music from my FLAC collection, doing on the fly transcoding, I want both (or all 4, or 8) cores running at 3+GHz, making my file transfer speeds the bottleneck. I don't care if I burn 150W-200W on the processor at those times, as long as it happens quickly.

    I don't use my processor much, but when I use it I want it to be fast. Common apps like AutoCAD, Adobe Acrobat, and anything processing images are just painful on my 1.86GHz P4 mobile (close to a Core 2), but I live with it because I'm too cheap to upgrade. If I could increase the speed by a factor of 5-10, but scale the power back for the 99% of the time I don't need it, I could get better battery life and a faster machine. As it is, if I want that kind of speed improvement, I'm looking at a machine which requires a 3lb brick of an AC adapter and an 8lb boat anchor that gets about 2 hours of best-case runtime. (Apple's new laptop notwithstanding.)

  • Not necessarily. For some AI problems that may be true, but what about doing things like recognizing a picture, a face, a rock, a tree? That takes some massive image processing bandwidth. If you take cues from biology, it's a massively parallel fuzzy logic thing, but still, it's a LOT of computational power. AI advances are still not purely in the algorithm.
  • Re:More is More (Score:3, Insightful)

    by Archangel Michael ( 180766 ) on Wednesday January 28, 2009 @05:07PM (#26644593) Journal

    But you don't stop doing those tasks because they take 10 seconds. While you do have a valid point, those chunks of time don't stop you, they only annoy you.

    Trust me, you don't know what you DO NOT do because it takes too long to do it. I've seen tasks grow in time from a minute to 1.5 hours as the logs grew due to increased activity. As the time to process the logs mounted, I processed them less frequently, often running them over weekends and evenings because I couldn't afford to process them during the workday.

    Getting a new machine and having that 1.5 hour log processing suddenly become four minutes meant I could process the logs just about anytime I had 4 or 5 minutes to spare.

    It was much easier to find 5 minutes than 1.5 hours.

  • by Moraelin ( 679338 ) on Wednesday January 28, 2009 @05:26PM (#26644879) Journal

    Well, actually it's just proof that history repeats itself. Because this thing has happened before. More than once.

    See, in the beginning, computers were big things served by holy priests in the inner sanctum, and a large company had maybe one or two. And they kept getting more and more powerful and sophisticated.

    But then it branched. At some point someone figured that instead of making the next computer that can do a whole megaflop, they could make a minicomputer. And there turned out to be a market for that. There were plenty of people who preferred a _cheap_ small computer to doubling the power of their old mainframe.

    You know how Unix got started on a computer with 4k RAM, which actually was intended to be just a co-processor for a bigger computer? Yeah, that's that kind of thing at work. Soon everyone wanted such a cheap computer with a "toy" OS (compared to the sophisticated OSs on mainframes) instead of big and powerful iron. You could have several of those for the price of a big powerful computer.

    Then the same thing happened with the micro. There were plenty of people (e.g., DEC) who laughed at the underpowered toy PCs and assured everyone that they'd never replace the mini. Where is DEC now? Right. It turned out that a hell of a lot of people had more need for several cheap PCs ("cheap" back then meaning "only 3 to 5 thousand dollars") than for an uber-expensive and much more powerful mini (costing tens to hundreds of thousands).

    Heck, in a sense even multitasking appeared as sorta vaguely the same phenomenon. Instead of more and more power dedicated to one task, people wanted just a "slice" of that computer for several tasks.

    Heck, when IBM struck it big in the computer market, waay back in the 50's, how did they do it? By selling cheaper computers than Remington Rand. A lot of people had more use for a "cheap" and seriously underpowered wardrobe-sized computer than for a state of the art machine costing millions.

    Heck, we've even seen this split before, as portable computers split into normal laptops and PDAs. At one point it became possible to make a smaller and seriously less powerful PDA, but which is just powerful enough to do certain jobs almost as well as a laptop does. And now it seems to me that the laptop line has split again, giving birth to the likes of the Eee.

    So really it's nothing new. It's what happens when a kind of machine gets powerful enough to warrant a split between group A who needs the next generation that's 2x as powerful, and group B which says, "wtf, it's powerful enough for what I need. Can I get it at half price in the next generation?" Is it any surprise that it would happen again, this time to the PC? Thought so.

  • Re:Hmm... (Score:2, Insightful)

    by quickOnTheUptake ( 1450889 ) on Wednesday January 28, 2009 @05:31PM (#26644959)

    What happens when UPS can do 1 hour delivery to most places on the globe?

    How would they manage that? Take a scenario of shipping to the opposite side of the globe. The earth's diameter is almost 8000 miles. That means the package would be moving at least 8000mph. Over 10X the speed of sound. . . . if it can go through the center of the earth. And this doesn't include any of the overhead of sorting, and pickup/dropoff.
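The parent's back-of-the-envelope numbers check out. A quick sketch, using round published figures for the earth's dimensions and sea-level speed of sound (the one-hour window and straight-line path are the parent's hypothetical):

```python
# Sanity check of the parent's numbers: a one-hour delivery to the opposite
# side of the globe implies thousands of mph, even on the impossible
# straight-line path through the earth's core.
EARTH_DIAMETER_MI = 7918         # mean diameter, miles
EARTH_CIRCUMFERENCE_MI = 24901   # equatorial circumference, miles
SPEED_OF_SOUND_MPH = 767         # sea level, approximate

through_the_core = EARTH_DIAMETER_MI / 1.0           # mph for a 1-hour trip
over_the_surface = (EARTH_CIRCUMFERENCE_MI / 2) / 1.0  # realistic great-circle route

print(round(through_the_core / SPEED_OF_SOUND_MPH, 1))   # ~Mach 10.3
print(round(over_the_surface / SPEED_OF_SOUND_MPH, 1))   # ~Mach 16.2
```

The surface route, the only one an aircraft could actually fly, is even worse than the parent's "over 10X the speed of sound" figure, and that's before any sorting or pickup/dropoff overhead.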

  • by Qbertino ( 265505 ) on Wednesday January 28, 2009 @06:37PM (#26645943)

    It should be obvious, shouldn't it? Our work environment of choice has been the desktop metaphor for about 20 years now. Today's computers are powerful enough to handle very luxurious desktop environments. I've basically replaced my very first PC - the first-ever ATX big-tower casing, an InWin from 1996, that weighs around about a metric ton - with a Mac Mini. 3D-wise I even think it's a downgrade, although I only have a GeForce 4200 Ti as my latest 3D card in there.

    But, as others here have pointed out already, it consumes about a tenth of the power, makes almost no noise at all - even now I can barely hear it - and it is like 40 times as small. Meanwhile FOSSnix-based systems are only getting better without making computing skills obsolete, making it even more financially attractive to go for cheap and small.

    The next performance race for most people will only take place after the standards for power consumption, size and noise have been raised and met. After that, regular computers will be heading for more power again. I presume that next league will stall after a decade again, when $200 computers the size of my external HDD have reached the power to render photorealistic motion graphics in real time.

  • by Peaquod ( 1200623 ) on Wednesday January 28, 2009 @10:02PM (#26648357)
    Amen! The phrase "Moore's Law" irritates me to no end. I understand that it is the common vernacular, but it is almost always misused here on /. "Moore's Law" was simply an observation that has remained remarkably consistent over time. And it had nothing to do with cost or "computing power" - just that the number of transistors per unit area doubles roughly every 18 months. There is no "Law" to be followed or violated! Sheesh!
  • by Glasswire ( 302197 ) <glasswire AT gmail DOT com> on Wednesday January 28, 2009 @10:06PM (#26648399) Homepage

    The observation behind Moore's Law doesn't say anything about performance. It's a projection about the rate at which transistor density on a wafer grows every 18 months. Historically most companies, including Intel, have used this to make bigger dies for larger and more complex processors, but you can also use the improvements in transistor real estate to simply make smaller dies and hence more of them, increasing yield and bringing down the price. So bringing the price down isn't different from Moore's Law; it's just another way to use it.
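The bigger-dies-versus-cheaper-dies tradeoff can be sketched with simple geometry. This is a deliberately crude model under stated assumptions: a 300mm wafer, a hypothetical 100mm^2 design, density doubling each 18-month generation, and no accounting for edge loss or yield:

```python
import math

# Sketch of the tradeoff above: each density doubling lets the same design
# fit in half the die area, so a wafer yields roughly twice as many chips.
# Crude model: ignores edge loss, defect yield, and fixed wafer costs.
def dies_per_wafer(wafer_area_mm2, die_area_mm2):
    return int(wafer_area_mm2 // die_area_mm2)

wafer = math.pi * (300 / 2) ** 2   # 300 mm wafer, area in mm^2
die = 100.0                        # hypothetical chip design, mm^2

for generation in range(3):        # three 18-month density doublings
    area = die / (2 ** generation) # same transistor count, half the area each time
    print(generation, round(area, 1), dies_per_wafer(wafer, area))
```

Each shrink roughly doubles the die count per wafer, which is the "smaller dies, more of them, lower price" path the comment contrasts with spending the density on bigger designs.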

  • False choice (Score:3, Insightful)

    by symbolset ( 646467 ) on Wednesday January 28, 2009 @10:39PM (#26648627) Journal

    The false-choice fallacy presents things as though you must pick option a or option b. As far as I can tell, businesses want not only option a and option b, they also want "the same performance in fewer watts". And a number of other things.

    By presenting the trend as a singular choice the author presents a false choice. What is actually happening is that the computing ecosystem is becoming more diverse. As we select from a richer menu, we are enabled to pursue our goals large and small with equipment that suits the application. It's a good thing.

  • by lpq ( 583377 ) on Wednesday January 28, 2009 @11:48PM (#26649123) Homepage Journal

    "If so, it will be the first version of Windows that makes computers run faster than the previous version."

    This is the second bit of falseness. First -- WinXP was faster than WinME.

    Second -- WinXP is still quite a bit faster than Win7.

    The article states that Win7 improves over Vista in areas where Windows was "OS-bound". However, it says there is NO IMPROVEMENT in Win7 for applications. It was applications that took a 10-15% performance hit in Vista vs. XP due to the overhead of the DRM'd drivers.

    As near as I can tell from everything that leaked out about Vista before, during and after its development, MS added (not replaced) a whole new abstraction layer. They tried to make it transparent where they could, but this was the basic reason why nearly all drivers that actually touched hardware had to be rewritten -- the USER has to be completely isolated from the real hardware, so DRM can detect hardware/software work-arounds, unauthorized modifications, or "taps" into the unencrypted data stream. This goes down to the level of being able to tell if something is plugged into a jack by impedance changes -- if impedance or electrical specs don't match exactly what the circuit is supposed to produce, the OS is supposed to assume tampering and mark the OS state as "compromised".

    Think of it as similar to the Linux kernel's tainted bit. Once it's set, it is supposed to be irreversible unless you reboot, because the integrity of the kernel has been compromised. The DRM in Vista and Win7 was spec'ed to be similar but with finer-grained sensors -- if anything is off, a code path takes too long to execute, or circuits don't return their expected values, the OS is to assume the box is insecure, so content can be disabled or downgraded at the content-provider's option.

    All this is still in Win7 -- the only difference is that all the drivers that were broken during the switch to Vista won't be rebroken -- so you won't get anywhere near the OEM flak and blog reports about incompatibilities -- those will all be buried with Vista, along with your memories, or so MS hopes. But MS has already made it clear that you won't be able to upgrade an XP machine to Win7 -- so they can control the experience from the start, either by starting with corrupted Vista drivers or Win7 drivers -- take your pick. Both are designed to ensure your compliance, but more importantly, both cause performance degradation that everyone pays for by needing a bigger machine than they needed for XP.

    The whole planet will be paying an excess carbon tax -- forever -- all to add in content-producer's demanded wish list.

    This whole bit about the IT industry warming up to Win7 because it's not so bad compared to Vista just makes me want to puke. It's still corrupt and slow.

    The government should require MS to open-source WinXP, so it can be supported apart from MS, which is obviously going for a "content-control" OS (as the next-gen Apples are slated to be). This will be the beginning of the end for non-commercial, open-source OS's or media boxes. It will be all pay-to-play -- just like Washington.

  • by Anonymous Coward on Wednesday January 28, 2009 @11:50PM (#26649135)

    Way back when I was in college (Windows 3.1 days) I read every computer magazine I could get my hands on to learn what I could about Windows and computing in general. Then a few months before Windows 95 came out the hype began, and I realized that computer magazines for the general public were really just PR tools for Microsoft which you pay for. I'm noticing this same trend on the internet today. Notice the subtle little plug for Windows 7 in the above article (... If so, it will be the first version of Windows that makes computers run faster than the previous version.) I've heard all this before... I stopped buying computer magazines and I certainly tune out most of the marketing drivel I read on the Internet.
