"Good Enough" Computers Are the Future 515

An anonymous reader writes "Over on the PC World blog, Keir Thomas engages in some speculative thinking. Pretending to be writing from the year 2025, he describes a world of 'Good Enough computing,' wherein ultra-cheap PCs and notebooks (created to help end-users weather the 'Great Recession' of the early 21st century) are coupled to open source operating systems. This is possible because even the cheapest chips have all the power most people need nowadays. In what is effectively the present situation with netbooks writ large, he sees a future where Microsoft is priced out of the entire desktop operating system market and can't compete. It's a fun read that raises some interesting points."
  • ...and doggone, people like them!
    • by The Grim Reefer2 ( 1195989 ) on Wednesday April 22, 2009 @06:50PM (#27681133)

      Pretending to be writing from the year 2025

      In the year twenty twenty-five if Intel is still alive. If Microsoft can survive they may find...

  • meh (Score:5, Insightful)

    by Captain Splendid ( 673276 ) <capsplendid at gmail.com> on Wednesday April 22, 2009 @03:24PM (#27678587) Homepage Journal
    Been saying it since the Pentium II days. This "always-be-upgrading-to-the-latest-spec" mentality is fine for hardcore users, but for everybody else, "good enough" happened quite a few hardware generations ago. The sad part is that we're only now having this conversation.
    • Re:meh (Score:5, Interesting)

      by Feminist-Mom ( 816033 ) <feminist DOT mom AT gmail DOT com> on Wednesday April 22, 2009 @03:30PM (#27678669)
      There will always be higher res movies to view and process, and more data from the world to be saved. I remember one colleague telling me in 1995 that if I got a 2 gig drive it would never be full.
      • Re: (Score:3, Insightful)

        You're missing the point. As I said above, there's a difference between hardcore users and genpop. My mom's a great example. A simple, lightweight email/web/Skype tablet, and she's all set. And ~500MHz of processing power is all you really need for that.
        • Re:meh (Score:5, Insightful)

          by Chris Burke ( 6130 ) on Wednesday April 22, 2009 @04:32PM (#27679607) Homepage

          And ~500MHz of processing power is all you really need for that.

          Yeah, well, in the days of the Pentium II, which topped out at 450MHz, that would have been "hardcore".

          So clearly the needs of even the most modest computer users have gone up substantially.

          And assuming the software industry continues to find interesting things for people like your mom to do with their computer, then this will continue.

          Don't get me wrong, there's a "good enough" in every computing generation and there's nothing wrong with people targeting that instead of the latest n' greatest. More and more people are, which is why netbooks are becoming so popular. But the bottom-end netbook of five years from now will be significantly more powerful than the bottom-end netbook of today, and odds are that extra performance will give someone who doesn't need anything more than a netbook a real benefit.

          Point is -- "good enough" is real and valid, but still a moving target.

          • Re: (Score:3, Insightful)

            Point is -- "good enough" is real and valid, but still a moving target.

            Writing everything in Python (or god forbid, a language whose interpreter is written in Python) is not helping us lower our minimum requirements.

            To fanboys: I'm a python-lover like yourself. Love writing it. It's just that everyone else should write in C so their code is fast to run :D
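
            To put a rough number on that, here's a minimal sketch (Python 3; timings will vary by machine) comparing the same summation done in a pure-Python loop versus the C-implemented sum() builtin -- the gap is the interpreter overhead being complained about:

                import time

                N = 10_000_000

                # Pure-Python loop: every iteration goes through the bytecode interpreter.
                start = time.perf_counter()
                total = 0
                for i in range(N):
                    total += i
                loop_s = time.perf_counter() - start

                # Same result from the built-in sum(), whose loop runs in C.
                start = time.perf_counter()
                total = sum(range(N))
                builtin_s = time.perf_counter() - start

                print(f"interpreted loop: {loop_s:.2f}s; C builtin: {builtin_s:.2f}s")

            On typical hardware the builtin wins by several times, which is the whole argument for pushing hot paths down into C.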

          • Re:meh (Score:5, Insightful)

            by roc97007 ( 608802 ) on Wednesday April 22, 2009 @07:53PM (#27681655) Journal

            >> And ~500MHz of processing power is all you really need for that.

            > Yeah, well, in the days of the Pentium II, which topped out at 450MHz, that would have been "hardcore".

            > So clearly the needs of even the most modest computer users have gone up substantially.

            Yes, but have they gone up recently? Sure, if you go back far enough you'll find unusable computers by the standards of today's average user, but seriously... when was the last time the "average" user really had to upgrade?

            Let's parameterize this to make sure we're all talking about the same thing. "Average", for the purposes of this comment, being defined as someone who uses email, browses the web, plays a few card games, and oh hell, including -- to be brutally fair -- casual video viewing from the likes of YouTube and Hulu.

            Let's see... The absolute cheapest desktop system I can conveniently find at the moment has a 1.8GHz Celeron and a gig of memory. That's an embarrassment of riches for the paltry functions described above. Laptops: I have at home a Thinkpad 240X (500MHz Pentium III, memory maxed out) made in June 2000 -- nearly a whole DECADE ago now -- that will do all of those things, *and* play DivX-encoded videos fullscreen without hesitation, and I don't think you could buy a new laptop anywhere today, for any price, that didn't have significantly better specs. *Phones* have better specs.

            To the Linux geeks out there -- yes, you can get more bang for hardware buck with Linux, but don't flatter yourself into thinking that's the only reason ultra-cheap computers are "good enough". Windows XP runs fine for average usage (see above) on hardware made a decade ago. (And of Windows versions, XP itself is "good enough" for the average user, but that's another story.)

            This is not a Linux Phenomenon. It's a case of the manufacturers not shifting paradigms fast enough. In the Old Days (say, the 1990s) we really needed a steep performance development curve, because all kinds of new stuff was happening that would make use of every computing cycle one could conveniently afford. Windows tended to drive this, because each new version needed faster hardware to drive it, and there was (arguably) more functionality (or fewer bugs) in each new version to warrant upgrading.

            But shortly after the turn of the century, two things happened: (1) A version was released of the most popular OS on the planet that (finally) was solid enough that the average user didn't immediately aspire to upgrade. (2) Hardware performance leapfrogged past what most people really needed, especially considering (1) above. With no Killer App and no new lugubrious-yet-tantalizing release of Windows to drive it, hardware was suddenly too fast for main street.

            It was fairly recently that the industry finally understood that there was a market for Cheap. What followed was a scramble to adapt to this new market. You could hear the grinding of continental paradigm shift. Even Microsoft -- for God's sake -- is starting to become concerned about performance, instead of just assuming that Moore's Law will somehow compensate for unchecked bloat.

            Let's face it: There is no consumer Killer App for the quad-core Nehalem. There are a few painfully cutting-edge geeks who may find a use for that kind of power, but most are just fooling themselves -- playing a game of my-cpu-runs-hotter-than-yours. Yet you can put together a killer Nehalem-based PC for less than the cost of a widescreen TV.

            And that's not even taking the economy into account.

            > And assuming the software industry continues to find interesting things for people like your mom to do with their computer, then this will continue.

            That's the current problem (if you want to call it that). The software industry has nothing your mom would need current midrange hardware to run, with nothing in particular coming up.

            Maybe there is a killer app waiting out there -- maybe when true AI becomes practical, it'll drive another technology race. But there hasn't been anything for a while.

            • Re: (Score:3, Interesting)

              by Just Some Guy ( 3352 )

              Yes, but have they gone up recently?

              Yes. In my house, the defining point was our purchase of a Flip video camera. It's a little $150 flash-based unit meant for people like us who want grandma to see movies of the kids with minimal hassle. You shoot your video, plug it into a Mac or PC's USB port, double-click the runnable software that's stored on the camera itself, and watch your movies. If you want to upload them to YouTube, select the desired clips and click the "upload" button - the software handles the rest.

              It's a slick little camera...

              • Re: (Score:3, Interesting)

                by roc97007 ( 608802 )

                How old is "older"? Daughter is still happy with her dual 800 G4 from 2001. She's a heavy Photoshop user, and response on her elderly G4 is better than on her more recent Dell (2GHz Pentium 4, circa 2004). There's a brisk business out there in non-current Macs -- you can probably offer her a substantial upgrade for a paltry sum without even stepping into a Mac store.

                I do video editing at home (my video camera is a little higher end than a Flip), and in 2008 I bought an AMD Athlon 64 3200+ (I think it was...

      • by LWATCDR ( 28044 )

        For viewing, an Atom/ION system will work very well. Processing? One does wonder if an Atom/ION system would do that as well for your average home user if the editing software took advantage of the GPU.

      • MP3 has been around for how long? There's a peak to the curve of diminishing returns. I have a 61" TV, and anything higher than 1920x1080 would be almost useless from far enough away to see the whole screen. Why would video have to get any higher res? There are human physical limits to vision and hearing. We got to the audio limits quite a while ago with MP3. Video is just now getting there. The only thing left is 3D or some other killer app, but for right now, all the media most people care to consume is h...
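
        As a back-of-the-envelope check on that claim, here's a sketch assuming the common rule of thumb that normal vision resolves about one arcminute (a rough heuristic, not a vision-science result):

            import math

            DIAGONAL_IN = 61.0          # the 61" TV from the comment above
            H_PIXELS = 1920             # 1080p horizontal resolution
            ASPECT_W, ASPECT_H = 16, 9

            # Screen width and the size of one pixel, in inches.
            width_in = DIAGONAL_IN * ASPECT_W / math.hypot(ASPECT_W, ASPECT_H)
            pixel_in = width_in / H_PIXELS

            # Distance at which one pixel subtends one arcminute (small-angle approximation).
            one_arcmin = math.radians(1 / 60)
            distance_ft = pixel_in / one_arcmin / 12

            print(f"pixels become indistinguishable beyond ~{distance_ft:.1f} ft")

        That works out to roughly eight feet for a 61" 1080p panel, so past normal couch distance, extra resolution really is invisible.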
        • Re:meh (Score:5, Funny)

          by Obfuscant ( 592200 ) on Wednesday April 22, 2009 @04:15PM (#27679319)
          And even HD is overkill if you ask many older people... they can barely tell the difference.

          I AM an old person, you insensitive clod.

          I can tell the difference, I just don't care. If I want to see a high-resolution sunset, I'll go outside and watch it live. I don't need to see every nose hair on the news reporter or every pore and pimple on these damn kids who seem to be everywhere on TV these days. And get off my lawn...

      • by Jezza ( 39441 )

        Sure, but most people can be an iteration (or two) behind. Very few people need (or could even benefit from) the latest hardware hotness. Will you need to upgrade? Sure. Do you need the fastest machine you can get your grubby mitts on? Probably not.

    • Re:meh (Score:5, Insightful)

      by im_thatoneguy ( 819432 ) on Wednesday April 22, 2009 @03:37PM (#27678781)

      Yeah I expect "The Year of Good Enough Hardware" will coincide with the 10th anniversary of the "Year of Linux on the Desktop".

      We didn't need fast computers for everyday computing and then we started indexing the entire hard drive.
      We didn't need fast computers for everyday use and then we started watching YouTube h264.
      We didn't need fast computers for everyday use and then we wanted to be able to preview documents without opening them.
      We didn't need fast computers for everyday use and then we wanted to be able to...

      The list goes on and on. (For a concrete taste of the first item, see the indexing sketch below.)
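
      Here is a minimal sketch of what a desktop indexer spends its cycles on: walking the filesystem and building an inverted index from words to the files containing them. build_index and the path are hypothetical; real indexers do far more work per file:

          import os
          from collections import defaultdict

          def build_index(root):
              """Map each word to the set of files containing it."""
              index = defaultdict(set)
              for dirpath, _dirnames, filenames in os.walk(root):
                  for name in filenames:
                      path = os.path.join(dirpath, name)
                      try:
                          with open(path, errors="ignore") as f:
                              for line in f:
                                  for word in line.lower().split():
                                      index[word].add(path)
                      except OSError:
                          continue  # unreadable file; skip it
              return index

          # Even this toy version chews CPU and I/O on a big home directory:
          # index = build_index("/home/alice")  # hypothetical path
          # print(len(index), "distinct words indexed")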

    • Re:meh (Score:4, Interesting)

      by grocer ( 718489 ) on Wednesday April 22, 2009 @03:44PM (#27678883)
      Good enough has changed because Linux keeps up with the Windows upgrade cycle... I attempted to dust off a Pentium II 300 with 448MB RAM, 40 gig hard drive, CD-ROM, DVD/CD-RW, ESS Maestro II PCI sound card, and an Nvidia TNT2 (32MB). To get a mostly usable system (partially attributable to broken ACPI), I went from Ubuntu 8.04 to 8.10 to Xubuntu 8.10 before ultimately making a reasonable net appliance with FreeBSD 7.1 & Xfce4... that lasted about two weeks until I got a DFI AK76-SN with an Athlon XP 1800+, 512MB RAM, and an Nvidia Ti4200 (128MB) from my brother because I was bitching about not being able to get a stable system from ancient hardware... granted, I moved from circa '97-'98 hardware to probably about '00-'01, but what a difference 3 years makes when the current kernel has been basically synced to the MS upgrade cycle because that's what's been driving hardware development... I now have Ubuntu 8.04 running on a completely usable system, no difference from the XP Pro box upstairs in terms of functionality.
      • Re:meh (Score:5, Insightful)

        by Moebius Loop ( 135536 ) on Wednesday April 22, 2009 @03:59PM (#27679095) Homepage

        ...but what a difference 3 years makes when the current kernel has been basically synced to the MS upgrade cycle...

        I don't think it's fair to claim the kernel is synced to the MS upgrade cycle -- the kernel is not the problem, it's the desktop environments and the distros that feature them that are chasing the OSX/MS "bells and whistles".

    • Re:meh (Score:5, Insightful)

      by Eil ( 82413 ) on Wednesday April 22, 2009 @03:46PM (#27678923) Homepage Journal

      Agreed completely. There was a time when I absolutely had to have the latest and greatest just to get things done. Now, my home and work PCs are years old and are running CPUs that were low-budget even when brand new.

      Unless you're building some kind of specialized business or research system, the only reason to shell out thousands of dollars on hardware is if you're doing virtualization or are a hardcore gamer with no social life. :P

      • by hurfy ( 735314 )

        I'll go with "it's already here" too :)

        The fastest computer in the office is a P4 3GHz. As long as it has HT, it will do for office work here. If I bought antivirus for the SonicWall and deleted it off the desktops, we could still be using 2GHz or less machines.

        Our main app uses a terminal emulator. Using Windows XP with antivirus running takes at least a P4 2.5GHz HT CPU to successfully emulate a DUMB terminal without lag from the local computer.

        Most horsepower now seems to go to run the system itself :(

    • Re:meh (Score:5, Insightful)

      by Facegarden ( 967477 ) on Wednesday April 22, 2009 @03:46PM (#27678937)

      Been saying it since the Pentium II days. This "always-be-upgrading-to-the-latest-spec" mentality is fine for hardcore users, but for everybody else, "good enough" happened quite a few hardware generations ago. The sad part is that we're only now having this conversation.

      Eh, it seems different now - companies don't just have a range of products from slow to fast; they actually champion some of their slower products (netbooks). Even power users are buying a netbook for on-the-go use, because they are mostly good enough. Sure, we have big fast desktops, but this is the first time even power users are buying low-powered machines.
      -Taylor

    • by Yvanhoe ( 564877 )
      The sad part is that "good enough to run badly written software" is what everybody wants, and that there is no limit to what it can consume.
    • Re:meh (Score:4, Insightful)

      by rolfwind ( 528248 ) on Wednesday April 22, 2009 @04:26PM (#27679497)

      Been saying it since the Pentium II days. This "always-be-upgrading-to-the-latest-spec" mentality is fine for hardcore users, but for everybody else, "good enough" happened quite a few hardware generations ago. The sad part is that we're only now having this conversation.

      Being "good enough" depends on your usage. If all you do is small spreadsheets, good enough may have happened in the 1980s. If all you do is word processing, also 1980s. Now, going into office suites, depending on your need and use like powerpoint, it could have been anywhere from the late 90s to mid-2000s.

      But it's also based on expectations, and expectations are too often influenced by past experiences rather than by the imagination of what could be.

      I program and I browse. Programming can always use a faster computer at compile time for C-type languages. 10 years ago, I would have said my computer back then would always be good enough for browsing. Most content was static, and it displayed the pages easily enough. You know what happened? Flash, Ajax, and the rest - watching videos, more dynamic pages, etcetera. What the internet "should be" has been redefined. Should I pretend this is the end of the road and no other advances in what we think of as the internet will happen? Definitely not. For one, higher-speed connections will keep transforming what we think our www experience should be. And a more powerful computer is necessary for that.

      And videogames aren't even fooling us yet with their graphics. They've gotten damned good, but they aren't out of the uncanny valley yet. And we're not even at the beginning of 3D displays yet, still looking at these boring 2D planes - when will that happen, and what is the killer app there?

      How many undiscovered killer apps are there still? When will the first good AI come out? Or robots with real AI?

      "Good enough" is not good enough. I can't even believe it's a subject worth pondering. It's not exciting and not a reason to be in the computer field. It's static and boring. The story of humanity is the story of constant progress. The only reason people are looking into it is that the MHZ wars have stagnated, and people haven't the best solutions yet how to harness multi-core, a type of despondent response to the seeming lack of progress, the gigantic leaps and bounds computers were making just 5-9 years earlier. Those are problems worth looking into, but I know computers now are definitely not good enough.

      The interface alone is still entirely too dumb, for one.

  • by Midnight Thunder ( 17205 ) on Wednesday April 22, 2009 @03:25PM (#27678603) Homepage Journal

    This kind of reminds me of the '640KB should be enough for everyone' theory. If everyone is just content surfing the web and writing e-mails, then sure, the 'good enough' solution sounds fair, but if 'good enough' also means dealing with a Windows ME experience, then no thanks. At the same time, what is considered 'good enough' will evolve over time as new solutions are created and user expectations evolve.

    Will my 'good enough' computer handle my photo library and my 32MP entry level camera, and recognise the faces in my photo collection? This sounds like far-fetched stuff today, but as these technologies percolate down from high-end systems and people get used to the computer doing more of their mind-numbing repetitive tasks, user expectations will adapt and people will want them in their 'good enough' computers.

    In many ways plenty of people are already using 'good enough' computers. Whether they are satisfied with them is a whole other question.

    • Re: (Score:3, Insightful)

      by lewiscr ( 3314 )

      Will my 'good enough' computer handle my photo library and my 32MP entry level camera, and recognise the faces in my photo collection? This sounds like far-fetched stuff today, but as these technologies percolate down from high-end systems and people get used to the computer doing more of their mind-numbing repetitive tasks, user expectations will adapt and people will want them in their 'good enough' computers.

      Today's "Good Enough" computer won't. Tomorrow's "Good Enough" computer will.

      And from the FA, a "Good Enough" computer won't last forever. It just has to last long enough that Microsoft destroys itself because people don't buy a new OS every 2 years.

    • by mr_mischief ( 456295 ) on Wednesday April 22, 2009 @03:40PM (#27678835) Journal

      iLife '09 already tries to categorize your photos by setting and subject (and does a decent job, if the demos are to be believed). It uses face recognition and any embedded GPS data in the image file from your camera to do so.

      BTW, I'm not an Apple fanboy, and I'm pissed that this is what was covered in their presentation Sunday, which was supposed to be about how environmentally friendly their systems and manufacturing processes are.

    • by mcrbids ( 148650 ) on Wednesday April 22, 2009 @03:45PM (#27678921) Journal

      The reality is that computers today "live longer" than they used to. Having a 9-10 year old computer was once unthinkable; it's now almost normal for just about any old Pentium 4 to still be in use today, and the Pentium 4 was apparently released in late 2000. [raptureready.com]

      I put a new (but cheap!) AGP video card into an older P4 desktop computer (hint: PC-133 RAM!) that my son now uses to play Spore - one of the newer, hotter games around - and it plays just great.

      It's a trend - computers are "doing" for longer than they used to. They are in use for longer, and people hang on to them longer. They are less willing to buy the top-end because there's no reason to.

      • Re: (Score:3, Insightful)

        They are in use for longer, and people hang on to them longer. They are less willing to buy the top-end because there's no reason to.

        You pretty much hit the nail on the head for Microsoft's problem as well. I may be one of the few people that doesn't have that much of a problem with Vista and 7, but if I didn't get my copies from my university for $24.00 I would never have transitioned. XP just works, and more to the point, was designed to work on those older machines.

    • by jollyreaper ( 513215 ) on Wednesday April 22, 2009 @03:46PM (#27678929)

      This kind of reminds me of the '640KB should be enough for everyone' theory. If everyone is just content surfing the web and writing e-mails, then sure, the 'good enough' solution sounds fair, but if 'good enough' also means dealing with a Windows ME experience, then no thanks. At the same time, what is considered 'good enough' will evolve over time as new solutions are created and user expectations evolve.

      That last sentence is the key to the whole debate. There's been wicked kewl shite just over the horizon ever since I've been in computers and for quite a few years beforehand. But we've reached a point where the innovations in software don't really require more horsepower on the user's machine.

      If we strictly consider the office work environment, we pretty much had everything we needed with Win2k and Office 2k. There's been no new killer app introduced since then. Probably the only argument to be made is that there's more in Excel 2007 than in 2k, but those extra goodies came at the price of a lot of crap.

      Also bear in mind that the customer base has fragmented tremendously. Computer users used to be a unified market of geeks and business types but now it's as fragmented as the user base for home entertainment. Some people are happy with a small broadcast TV, some people need a thousand cable channels and a 72" screen with all the doodads. Both people are in the same general market but their segments are widely divergent.

      Will my 'good enough' computer handle my photo library and my 32MP entry level camera, and recognise the faces in my photo collection? This sounds like far-fetched stuff today, but as these technologies percolate down from high-end systems and people get used to the computer doing more of their mind-numbing repetitive tasks, user expectations will adapt and people will want them in their 'good enough' computers.

      Call that ten years from now. I don't have an interest in photography now, and probably won't by then, but since you do, you'll be happy to upgrade for those features. I know I'll have a different machine by then and will be doing different things. Your mother might still be happy running on your trade-down; it does everything she needs.

      In many ways plenty of people are already using 'good enough' computers. Whether they are satisfied with them is a whole other question.

      Fifteen years ago most people didn't have a need for web and email so developing that need was pretty big in the first place. Some may never progress beyond that point.

    • If you are still running everything locally then 'good enough' is a moving target. I've found that my desktop requirements are dropping as I move my storage to a NAS appliance, my applications to server class hardware, etc. In business it is very much the same. Doesn't have to be the cloud.
    • by LWATCDR ( 28044 )

      "Will my 'good enough' computer handle my photo library, my 32MP entry level camera, recognize the faces in my photo collection. "
      I hope they don't come out with 32MP entry level cameras.
      First of all even a 10 MP is good enough for some very large prints and is more than big enough for most monitors.
      Second 32MP with crappy optics will still give you crappy pictures. Optics are not driven by Moore's law.
      What hopefully will happen is that senors speed, color, and low light performance will increase. Probably

  • Get what you pay for (Score:4, Interesting)

    by mc1138 ( 718275 ) on Wednesday April 22, 2009 @03:25PM (#27678607) Homepage
    I'm all for cutting costs using an open source OS, but the problem with increasingly cheaper hardware is staying power. Yeah, it might be all you need, but how long is it going to be around for? Of course the trade-off is: is it cheaper to get short-term cheap computers, or long-term expensive computers? And, to top it all off, if we do switch to a disposable computing model, will we have recycling programs in place to make sure we reuse the rare and valuable parts, and keep the really toxic parts out of landfills?
    • I'm all for cutting costs using an open source OS, but the problem with increasingly cheaper hardware is staying power.

      I think you're missing the point, which I take to be this: We've reached the point, with regard to hardware, where we *already* have "staying power" in all but the highest-end applications.

      Even now, what you'd most likely deem "cheap" hardware is more than capable of running the most common applications well, and the OSes themselves are sufficiently reliable that one of the compellin...

  • by explosivejared ( 1186049 ) * <hagan.jaredNO@SPAMgmail.com> on Wednesday April 22, 2009 @03:25PM (#27678615)
    There is nothing particularly insightful about the article. Obviously the largest portion of the computer using population would never need cutting edge power, so effectively "good enough" has always been the paradigm. How many of us have super computers? This is just a piece with some wishful thinking hoping that people eventually see through Microsoft's coerced perpetual upgrade cycle.
  • by levell ( 538346 ) on Wednesday April 22, 2009 @03:27PM (#27678643) Homepage

    The article argues that people won't upgrade from XP - it expects that as MS tries to force them, people will migrate to Linux instead. I think as Microsoft discontinues support for XP, people will move to Windows 7 - sales of Windows-based netbooks seem to be much higher than for Linux.

    Whether the same will hold true when the time comes for MS to try to get people to upgrade from Windows 7 to whatever comes next is too early to tell. Hopefully by then Linux will have managed to gain enough market share that most people have heard of it and/or know someone running it, and the barrier to a non-MS OS will be much lower.

    • by tchuladdiass ( 174342 ) on Wednesday April 22, 2009 @03:34PM (#27678743) Homepage

      But Windows won't run on the next generation of netbook computers (the ARM-based ones, such as what Freescale/Pegatron is coming out with). Unless you count WinCE. But Linux will run the same apps as always, since everything can be (and has been) ported.

      Of course, this hinges on the assumption that ARM-based netbooks will take off, and I think they will. For one thing, they get much better battery life than you can get out of an x86 (even though Atom is low powered, you still have the thirsty chipset). And the prices are better than most of the x86 netbooks ($100 to $200).

      • by levell ( 538346 ) on Wednesday April 22, 2009 @03:45PM (#27678899) Homepage

        Not all Linux software is OSS. If Flash (for example) wasn't available on ARM, I think it would make them less attractive - my wife spends a lot of time watching the BBC iPlayer on our Asus Eee.

        This article [engadget.com] claims the ARM version of Flash will be out in May. I hope it is. I like our Eee, but an 8-hour battery life for the price they are talking about would be enough to make me buy one.

    The article argues that people won't upgrade from XP - it expects that as MS tries to force them, people will migrate to Linux instead. I think as Microsoft discontinues support for XP, people will move to Windows 7 - sales of Windows-based netbooks seem to be much higher than for Linux.

      I have no issue upgrading to a 'new and better' operating system, on the condition that I see some worth in what's new and better. The issue I have with Windows Vista and Windows 7 is that, other than higher hardware require...

      • Re: (Score:2, Insightful)

        by levell ( 538346 )

        I have no issue upgrading to a 'new and better' operating system, on the condition that I see some worth in what's new and better.

        Once XP isn't supported and security flaws continue to be discovered, staying on XP will be unappealing.

        • by shutdown -p now ( 807394 ) on Wednesday April 22, 2009 @09:20PM (#27682291) Journal

          Once XP isn't supported and security flaws continue to be discovered, staying on XP will be unappealing.

          I don't even think that's going to matter for the majority of people. They will go and buy a new PC eventually, and that will come with Win7. If that doesn't suck on release as much as Vista did (and all indications from the beta seem to hint that it's actually very good), then Win7 will stay. And that's all there will be to it.

    • Re: (Score:3, Interesting)

      by jmorris42 ( 1458 ) *

      > ...sales of Windows-based netbooks seem to be much higher than for Linux.

      True... now. But look at the factors Microsoft had to deal with to make that happen.

      1. They had to 'encourage' the vendors to go upscale and forget about the low end. Of course, most didn't need much encouraging anyway, since this whole $200-and-falling netbook idea scared the willies out of most of 'em. But while computer companies fear the low margins pricing that low will entail, the consumer electronics people see an opportunit...

    • Re: (Score:3, Insightful)

      by Eil ( 82413 )

      Hopefully by then Linux will have managed to gain enough market share that most people have heard of it and/or know someone running it, and the barrier to a non-MS OS will be much lower.

      Linux is already far more prevalent than many of us could have dreamed a decade ago. Linux ships on cell phones, set-top boxes, routers, laptops, desktops, even TVs. Linux practically runs the Internet. Any sysadmin worth his salt knows how to install a distro and get basic services up and running. While it's not quite a house...

    • Re: (Score:3, Insightful)

      by PopeRatzo ( 965947 ) *

      The author of this article's supposition requires a 15-year economic downturn. Whenever I hear this predicted, say, from World Net Daily or one of the other far-right publishers, it surprises me that coming up with this prediction requires the belief that there will be no technological advances in that period. It's amazing what a couple of new technologies can do to economic predictions. In the eighties, we thought there would be recession forever, but then personal computers and the Internet came along...

  • About 10 years ago it became apparent that the good-enough PC was the future. Hot-swappable parts (up to entire CPUs) and redundant data storage meant that, for many applications, running 20 computers with five down at any time became an effective solution.

    I don't know how much less 'good enough' computers are going to become over the next ten years. It might be an issue of power, but I think what happened is that we realized that computers became overpowered for the average user. This is not an issue of goo...

  • The future? (Score:4, Insightful)

    by nurb432 ( 527695 ) on Wednesday April 22, 2009 @03:30PM (#27678663) Homepage Journal

    Hell, that was 10 years ago.

    If we hadn't let the programmers run amok and had instead forced them to write efficient code, what we had back then would have been 'good enough' for most people (not all, but most).

    And to prove my point, I'm still running a 10-year-old desktop with a 900MHz PIII running FreeBSD on a daily basis.

    • Re:The future? (Score:4, Insightful)

      by Just Some Guy ( 3352 ) <kirk+slashdot@strauser.com> on Wednesday April 22, 2009 @04:17PM (#27679369) Homepage Journal

      If we hadn't let the programmers run amok and had instead forced them to write efficient code, what we had back then would have been 'good enough' for most people.

      OK, so give me an example of inefficient code and your explanation for why it's inefficient. As a professional programmer, I get tired of bearing the blame for "bloat". Sure, I write in high-level languages instead of assembler these days. I write database-backed web applications, and while I'm capable of implementing them on bare hardware, my boss would much rather just buy a faster server and let me code in Python than wait 4 years while I hammer out a prototype in assembler. The end result is that I can add new features in a timeframe that our customers will tolerate.

      If we were stuck on the hardware of 1999, we'd be writing software the same way we did in 1999. Having been there, it sucked compared to what we can do today and I would never voluntarily go back. Do carpenters build "bloated" homes because they use general-purpose fasteners to bind pieces of standardized wood together, or are you willing to tolerate a little deviation from the ideal because you don't want to wait while they grow a tree in the exact shape of your blueprints? Well, I want to do the same for software. If you consider that "bloat", then you don't understand modern software development and what it delivers to end users.

    • Oh really? (Score:5, Insightful)

      by Sycraft-fu ( 314770 ) on Wednesday April 22, 2009 @04:19PM (#27679387)

      Well then, get on writing efficient code that'll decode HD video on a 900MHz processor. Don't tell me that's something "normal users" don't want; video on PCs is exploding, and people are all about a higher-res, better-looking picture. Don't forget the 5.1 audio that goes with it, and HRTF calculations for those who want to wear headphones but get surround. Oh, it can't handle that? Well, there you go then.

      I get real tired of this "programmers aren't efficient" whining, as though the be-all, end-all of coding should be the smallest program possible. No, it shouldn't; computers are getting more powerful, and we should use that power. There are a number of reasons for programs to get bigger and require more power:

      1) Features. I don't want computers to be stuck and never get any better. I want more features in my software. This goes for all software, not just power-user-type apps. For example, one thing I really value in Office 2003 (and 2007) is its inline spell checker. It is very good at figuring out what I mean when I mistype, and it learns from the kinds of mistakes I make to autocorrect and make more accurate guesses in the future. Well, guess what? That kind of feature takes memory and CPU. You don't get that for free. No big deal; my computer has lots of both. But it isn't "bloat" that it has features like that rather than being a very simple text editor.

      2) Manageability of code. Generating really optimized code often means generating code that is difficult to work with. I mean, in the extreme, you go for assembly language. You get the smallest programs doing that, and if you are good at it, the fastest. OK, great, but maintaining an assembly program is a bitch, and it is easy for errors, including security issues like buffer overflows, to sneak in. Now compare that to doing the same thing in a fully managed language like Java or C#. Code will be WAY bigger, especially if you take the runtimes into account. However, it'll be much cleaner and easier to maintain. No, it won't be as efficient, but does it matter? For many tasks there's plenty of power, so that's fine.

      3) New technologies. HD video is an example that is out now; true speech understanding (as in, you can command the computer using natural language) would be one that we haven't reached yet. These are things that are only possible because of increased processor power and memory/storage capacity. Look at video on the computer. For a long time it was nonexistent; then when it started, it was little postage-stamp-sized things that weren't useful, to now, where you have full-screen HD that looks really smooth. It wasn't as though people haven't always wanted better video; it was that computers back in the day couldn't handle it. Only recently have drives become large enough to hold it, and CPUs fast enough to decode it in realtime.

      4) Faster response. Computers have gotten MUCH faster at user response. The goal is that users should never have to wait on their system, ever, for anything. The computer should be waiting on the human, not the other way around. We keep getting closer and closer. If you don't try new systems it is hard to appreciate, but the strides have been massive. As a simple example, I remember back in high school when I went to print a paper for school, I'd issue the print command and wander to the kitchen. Printing a 5-page paper was a lengthy process. The computer had to use all its resources for some time to render the text and formatting into what the printer could handle. Now, I submit a 50-page print job with graphics and all, and it is spooled nearly immediately. The printer has the entire job seconds later, since these days the printer has its own processor and RAM. It is printing before I can walk over to it. Things that I used to have to wait on are now fast.

      5) Better multitasking. People like to be able to have their computers do more than one thing at a time and not bog down. It can be simple things like listening to music, downloading a file, and surfing the web, but not that long ago it wasn't possibl...

  • Welcome to my world (Score:3, Informative)

    by jsiren ( 886858 ) on Wednesday April 22, 2009 @03:31PM (#27678689) Homepage

    Not willing to spend a lot of money on something that will lose its value faster than... well... anything, really, I adopted the "good enough computing" doctrine years ago: I find computers that are sufficiently powerful for my use as cheaply as possible - nowadays they're usually free. I have gotten several perfectly good computers by saying "I can take that off your hands if you want."

    So far all my software needs have been covered with Linux and other open source software.

    I do have two Macs, but they follow the same philosophy: the combination of hardware+software is good enough for the purpose, and keeps its value better than a PC. [source: local sales of secondhand computers]

  • Umm. Yeah? (Score:5, Insightful)

    by fuzzyfuzzyfungus ( 1223518 ) on Wednesday April 22, 2009 @03:31PM (#27678695) Journal
    Is predicting the rise of "good enough" really all that bold? Although we don't think of it this way, the rise of "good enough" has already happened at least once.

    Remember all those $10,000+ Real Serious Workstations, running Real Serious OSes that real computer users did real work on, back when the kiddies were twiddling bits on the Z80 box they built in their garage? All of them are dead. Almost all computers now in use are the direct descendants of the low end crap of the past.

    Further, even within the category of boring x86s, almost all of us are already running something much closer to "good enough" than to "good". Some enormous proportion of PCs are in the sub-$1000 category, which still entails a bunch of tradeoffs (not nearly as many as it used to, but still).

    It will, indeed, be interesting if Microsoft hits the chopping block during the next round of "good enough"ing (or, more realistically, gets shoved to rather more cost-insensitive business sectors that like backwards compatibility, the same way IBM was); but "good enough" is already all around us.
  • Especially since the advent of "Slop-Ware" and Windows versions that need exponentially more power and capacity than the last version.

  • Even now, the low end, cheaper netbooks [often with no CD drive or even hard drive] are very popular.

    A lot of people like to use them as a smaller, less costly replacement or addition to a full blown laptop.

  • We have to remain careful of competition - being cheaper doesn't help if someone is selling hardware or software under market price in order to maintain market share.

    Nobody can deny that Microsoft is basically giving Windows XP away for free on netbooks. They can afford to, with vast amounts of money stashed from other overpriced software; Linux can't make up for a loss like that.

    What we need to do is beat Microsoft on usability in every aspect, not just price - including marketability, liability...

  • 2025! (Score:5, Funny)

    by D Ninja ( 825055 ) on Wednesday April 22, 2009 @03:37PM (#27678777)

    The year of the Linux desktop!

  • by orev ( 71566 ) on Wednesday April 22, 2009 @03:37PM (#27678783)

    Microsoft knew this a long time ago. That's why they are where they are today... everywhere. You don't need something that's perfect and awesome; you just need something good enough that people can get by. The cost savings you get by not putting tons of effort into perfection can be passed on to consumers, who almost always buy on price alone.

  • By selling Windows XP you can bundle in a lot of trial versions of programs like Microsoft Office, virus scanners, etc. New computers are stuffed with adware these days.

    This means the effective price of Windows XP is actually negative - something Linux cannot compete with. Who wants to pay to bundle a trial of an office package with Linux, when Linux comes with OpenOffice preinstalled?

  • Cars (Score:3, Insightful)

    by Cillian ( 1003268 ) on Wednesday April 22, 2009 @03:39PM (#27678813) Homepage
    To continue the usual car analogy: this isn't what has happened with technology such as cars. Cars were "good enough" long ago, but these days most cars still have an excess of performance and are far from merely "good enough". OK, I'm not entirely serious - I think we'll reach a point with computers where the performance gain becomes negligible (either that, or the current trend of bloat and crap increasing and everything being just as slow will continue). As there has been a recent surge in more environmental/efficient cars, similar things seem to be happening to computers - there have been a decent number of advances in power saving in technology these days.
    • Re: (Score:3, Insightful)

      by PitaBred ( 632671 )
      That's EXACTLY what happened with cars. We have the technology to build Bugatti Veyrons. Most people still buy Toyota Corollas, though. They're good enough. They don't have all the bells and whistles, they don't have all the performance, but they're the right price and they do everything people really need them to do. Sure, it'd be nice to have a lot of the extras, but people don't find them worth it. Same thing with newer computers... the old P4 is fast enough to watch the videos of the grandkids and send e...
  • by notarockstar1979 ( 1521239 ) on Wednesday April 22, 2009 @03:45PM (#27678903) Journal
    I've been in small/medium-sized business support for a while, and I'm here to tell you that "Good Enough Computers" are the standard. You'll have a few engineers and designers (along with a boss or two who is a wannabe nerd) who have the latest and greatest, but the vast majority of users in those businesses have had good enough computers for a long time. Sally Dataentryspecialist has a computer that she can type up Word documents on. Jimmy Executive has a laptop that's just good enough to browse porn and play DVDs. This includes home computers. They never ask about some brand new state-of-the-art system (see exceptions above); it's always about the eMachine or Gateway that their dear grandmother left them when she died, and the only use it saw before they had it was visiting church websites on Sunday.

    This is especially true in small town America.
  • But "good enough" computing won't suffice for gamers. They're usually the ones who drive the cycle of upgrading usually anyways. Most gamers' systems are ridiculously overpowered (mine included), and will continue to be so, well after games have reached the point to be indistiguishable from reality. They're always going to want to push that just one FPS more, that extra level of AA, etc. PC gaming enthusiasts won't go away, and as more generations grow up with computers, they'll become more adept at using t
  • by russotto ( 537200 ) on Wednesday April 22, 2009 @03:55PM (#27679047) Journal

    Parkinson's law is "Work expands to fill all available time." It applies to processing power too. What's "good enough" today won't be "good enough" tomorrow, because someone will invent some CPU-sucking, memory-hogging, disk-flogging killer app that everybody will want to have.

    I don't know what it will be. But then again, who predicted grandmothers would be editing home movies of their grandkids on their computers? Try that on a machine which is just "good enough" for email and the web.

    • Re: (Score:3, Interesting)

      Parkinson's law is "Work expands to fill all available time." It applies to processing power too. What's "good enough" today won't be "good enough" tomorrow, because someone will invent some CPU-sucking, memory-hogging, disk-flogging killer app that everybody will want to have.

      Sure, but over time the percentage of computers that are sold new, and in general use, that are significantly below the "top of the line" increases -- and that's not just a prediction of the future, but I think something that is true o...

  • We will, within ten to twenty years, find ourselves in a world where, in theory, everyone has access to the same technology. The fact is Microsoft, Oracle, IBM and all of the others will realize that the Internet is like Aerosmith, the poor man's Rolling Stones. All data will be centralized and coordinated by governments and the wealthy to be parlayed into election results and financial gain.

    Those unlikely enough to be on the outside of this new class-based proprietary world will be lulled into believin...
  • The obvious going mainstream seems to be the stimulus for it ceasing to be true. Extrapolating from the popularity of sensor-equipped devices like the Wii & iPhone, it seems likely that computers that monitor and respond to your gestures, voice, and attention will be arriving soon.

  • For most people, software written a decade or more ago was "Good Enough", and they don't need modern technology.

    It's called retrocomputing when you use old computers and old software. You can buy them cheap at auctions, garage sales, and on eBay.

  • Fallacy (Score:4, Insightful)

    by Un pobre guey ( 593801 ) on Wednesday April 22, 2009 @04:04PM (#27679145) Homepage
    This is an oft-repeated fallacy, that most people don't need powerful CPUs or OSes. A post above claims to have been saying this since the Pentium II days. This is essentially the same short-sightedness as the apocryphal "640K ought to be enough for anybody" remark from way back when.

    It is patently and obviously ridiculous. A Pentium II PC, especially on a Pentium II-compatible motherboard with its memory and other characteristics, would not be an acceptable platform for the average user. It would be very slow and would immediately have memory issues. Current graphics hardware would probably not be compatible, and even if it were, 3D software like OpenGL or the MS equivalent would have unacceptably bad performance. Contemporary games would be dreary experiences indeed.

    Lots of multimedia authoring software can use as many cores and as much RAM as you can afford (see the sketch after this comment). 3D gaming environments with ever more active objects, each with some amount of basic AI and moving parts, will also keep pushing the envelope even further. "Tab creep" in your web browser, where you end up accumulating open tabs, each with graphics, JavaScript, and maybe audio or video, gives memory footprints well into the hundreds of MB.

    Maybe deaf and blind little old ladies with severe arthritis can get by with a Pentium II, but not too many others. In 2025, the things that will pass for personal computer desktops (something like them will still exist in spite of the cyclical "The PC is Dead" hype) will have a dozen or more CPU cores, or perhaps hundreds of smaller cores of various kinds to distribute different types of processing. Cache memory will be much larger than today's, as will system RAM and storage. Software will be similar to today's except for far greater detail and granularity of content, and multiple new ways to interact with the data. That will demand a lot of compute power.

    No doubt people will continue to say things like "an exaflop and a zettabyte ought to be enough for anyone," and people like me will continue to deride and mock them.
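
    As an illustration of the multi-core point above, here is a minimal sketch of how an authoring-style workload can spread across every core you can afford (render_tile and its busywork are hypothetical stand-ins, not any real application's API):

        import multiprocessing as mp
        import time

        def render_tile(tile_id):
            """Stand-in for one CPU-bound chunk of a media-authoring job."""
            total = 0.0
            for i in range(2_000_000):
                total += (i * tile_id) % 7
            return total

        if __name__ == "__main__":
            tiles = list(range(16))
            start = time.perf_counter()
            with mp.Pool() as pool:  # one worker process per core by default
                results = pool.map(render_tile, tiles)
            elapsed = time.perf_counter() - start
            print(f"{len(results)} tiles in {elapsed:.2f}s on {mp.cpu_count()} cores")

    More cores, shorter wall-clock time: exactly why "as many cores as you can afford" isn't wasted on this class of software.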

  • by number6x ( 626555 ) on Wednesday April 22, 2009 @05:01PM (#27680019)

    With the debut of Windows 3.1 'good enough' became the accepted norm in computing.

    You could pay more for a NeXT workstation, a Sun workstation, or even a Mac. However Windows 3.1 was 'good enough'. Most people didn't need networking support built in, or the compilers or software that was available for the other platforms.

    You could have gone all multimedia with a fancy Amiga that did incredible sound and graphics, but 16 colors and trading files via floppy was 'good enough' for the majority of people. You could add hardware and software to Windows 3.1 computers if you really had a need to network them. The computers Windows ran on were capable of displaying better graphics (games that booted to DOS showed this), but Windows 3.1 was 'good enough'.

    Windows 3.1 really did make computers easier to use. Macs, Amigas and NeXT did a 'better' job of making computers easier for people, but Windows 3.1 did a 'good enough' job at making things easier. At about US$2,400.00, a mid-range computer with Win 3.1 on it was a lot cheaper than the competition. It was 'good enough' and cheaper.

    The history of economics shows that 'good enough' and cheap wins.

    Think of the 'best' hamburger that you ever ate...

    Did you think of a plain old McDonald's hamburger? Probably not. In any scale of human measure (taste, smell, satisfaction) McDonald's hamburgers rarely rank as 'best'. But measured in market share the McDonald's hamburger is the best.

    Ford's Model T was not as fast or as fancy or as comfortable or as good in quality as the hand crafted automobiles it competed with. But thanks to mass production and economies of scale it was cheaper and it was 'good enough'. Ford and other mass produced vehicles dominated the market. There are still purpose built vehicles, but they are a small specialty segment of the market.

    'Good enough' and cheap is always the 'best' when you consider things from a market dominance point of view. What a human thinks is 'best' and what the market thinks is 'best' are not the same thing.

  • by highways ( 1382025 ) on Wednesday April 22, 2009 @05:16PM (#27680167)

    "I think there is a world market for about five computers"

    http://en.wikipedia.org/wiki/Thomas_J._Watson [wikipedia.org]

    They were good enough then. Since then, the market has expanded a little.
