"Good Enough" Computers Are the Future
An anonymous reader writes "Over on the PC World blog, Keir Thomas engages in some speculative thinking. Pretending to be writing from the year 2025, he describes a world of 'Good Enough computing,' wherein ultra-cheap PCs and notebooks (created to help end-users weather the 'Great Recession' of the early 21st century) are coupled to open source operating systems. This is possible because even the cheapest chips have all the power most people need nowadays. In what is effectively the present situation with netbooks writ large, he sees a future where Microsoft is priced out of the entire desktop operating system market and can't compete. It's a fun read that raises some interesting points."
Smart enough... (Score:2)
Re:Smart enough... (Score:5, Funny)
Pretending to be writing from the year 2025
In the year twenty twenty-five if Intel is still alive. If Microsoft can survive they may find...
Re:Smart enough... (Score:5, Funny)
Re:Smart enough... (Score:5, Insightful)
It's funny because the same feeling people get about using Linux (will it run what I need it to?), I now get when I boot into Windows. I sit there in front of Windows and wonder: what can I do with this? I'm not sure it's going to run the applications I need it to. The tables have turned.
Re:Smart enough... (Score:5, Interesting)
I know exactly what you mean. I had to boot into Vista the other day to update my iPod, and it was a mess. I mean, it's pretty much a brand new install, and I've done as much as possible to reduce running services and apps, but still...it can barely handle a single browser on this computer. And it's so damn unresponsive. Combine that with the horror that is iTunes (it just starts doing all kinds of crap that I don't want it to do, and it slows my computer to such a crawl that it takes ten minutes to get my mouse to the cancel button) and what should have taken five minutes ended up taking over an hour.
After that experience I have truly realized why I love Linux. I love it because, even on my $500 Dell Vostro, I can run a browser with 15 tabs open, and leave it running for weeks at a time (an old, leaky firefox even!)...while running KDevelop and Pidgin and Amarok and Konsole and Epiphany (yes, I run two browsers sometimes) and kate and whatever else I need. And nothing slows down. I love it because I can squeeze almost 6 hours of life out of a battery that can barely hit 3 on Vista. I love it because I can do 'sh passmount.sh' and punch in a password rather than typing in some huge string, typing in a username and password, selecting a drive and hitting next 6 times...but if I want GUI tools, they're right there too. I love it because all of my apps run. All of them. From Fantasy General and Zone Raiders (old DOS games) to World of Warcraft and Command and Conquer: Tiberium Wars. Basically, I love it because it does what I want. Everything I want. But _only_ what I want.
Re:Smart enough... (Score:4, Informative)
Mounting my shared disk space from the university. Which I need to do frequently, as I don't have a printer, so I have to transfer all my papers over to the network space and then go to a lab to print. So in Windows, I have to type in the massive string that is whatever the hell the drive is I'm trying to mount (I don't even know it), then put in my username and password, and select a drive to mount it to, etc, etc. On Linux I just run a shell script and enter my password.
Though I suppose I could have both of them do automount, but I don't like automounting network disks like this...because if I'm not on the university's network, the system starts spewing error messages about not being able to find it. And as I leave the campus network at least once a week, and have multiple network disks that I mount, that would also be rather annoying.
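For illustration, a passmount.sh along the lines the poster describes might be created like this. All of the names below (server, share, mount point, username) are hypothetical placeholders, and `mount -t cifs` requires the cifs-utils package plus root privileges or a matching /etc/fstab entry; mount.cifs prompts for the password because none is supplied on the command line.

```shell
# Write out a hypothetical passmount.sh; every name here is a placeholder.
cat > passmount.sh <<'EOF'
#!/bin/sh
# Mount the university SMB share at ~/uni. Since no password is given
# in the options, mount.cifs prompts for it interactively.
sudo mount -t cifs //fileserver.example.edu/homes "$HOME/uni" \
    -o username=myuser,uid="$(id -u)",gid="$(id -g)"
EOF
chmod +x passmount.sh
```

After that, the whole workflow is `sh passmount.sh` plus one password prompt, which is the convenience the poster is pointing at.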
Re:Smart enough... (Score:4, Informative)
I did wonder if it was something like that, but if you can create a bash script then you can also create a logon script: use something like
net use U: /del
net use P: /del
net use U: \\MY_SERVER\users
net use P: \\MY_other_server\public
in your Windows logon script then it will auto-mount the drives (without any annoying messages resulting from a persistent share not being able to find the network path when you're not on the Uni network). We use that kind of thing in our logon scripts at work.
(copied the script lines above from http://www.windowsnetworking.com/kbase/WindowsTips/WindowsNT/AdminTips/Logon/WindowsNTLoginScriptTricksandTips.html [windowsnetworking.com] )
Re: (Score:3, Insightful)
Imagine... Windows without cygwin. Unusable.
Re:Smart enough... (Score:4, Informative)
GNOME?
Re: (Score:2)
It's "websites/web sites", not "web-sights".
Re:Smart enough... (Score:5, Insightful)
Have you actually seen Linux? Honestly - you CAN learn the CLI (and a powerful skill it is) but you really don't NEED to (no more than you need to use the CLI in Windows).
Take a look at Ubuntu (one of the easiest Linux distros out there). It's simple to install. Adding applications is easy. Updating is easy. Seriously, what's not to like (apart from the brown colour scheme)?
You can get plenty of paid support, from proper firms (Oracle, Novell, IBM - to name a few). I'm not sure where the engineers live, but they've got jobs (even if they don't have windows).
Re:Smart enough... (Score:5, Funny)
I like the brown color scheme. It gives Ubuntu a warm, earthy feel.
Re:Smart enough... (Score:4, Funny)
I know - I'm an insensitive (earthy) clod.
Re:Smart enough... (Score:5, Insightful)
I feel that if Ubuntu made it easier to change to well made themes, it would cause many people to take a second look, if not a first one. Design is important, and that includes the look and feel. Imagine if during the setup you were given up to ten themes in different colors, all "professionally" done. I'm not trying to hate on Ubuntu, I love it, but when I show it to people with the default colors they go uh, yeah, ok....
They could even stick to earthy tones and cooler colors. I like blues myself. I would even be willing to donate to this, because I feel that strongly that it could help adoption. I just don't have the artistic skills myself.
Re:Smart enough... (Score:5, Insightful)
I feel that if Ubuntu made it easier to change to well made themes...
How is System>>Preferences>>Appearance anything but easy?
You'll note he said "easier". A default theme that more people like is easiest. A picker during install is easier.
And, as we're talking about mass-market users here, a "System" menu is just about the scariest thing they can imagine. They don't see, "This is where you can customize your computer to just the way you want it. Have at it, you freedom-lovin' hacker-dude!" They see, "click the wrong button and you're fucked." Except all the buttons are in Chinese. And they don't read Chinese.
Re: (Score:3, Interesting)
It's simple to install. Adding applications is easy. Updating is easy. Seriously, what's not to like
It doesn't run photoshop and itunes.
Before you harp "run wine or gimp" normal people don't know about wine, gimp, or why they should use it. They want to attach their iphone to their computer like Steve Jobs says they can with a Mac or Windows box. They want to run photoshop because it is already what they know. They don't want to be told a random piece of software won't run because it's not open source or a random device won't work because the manufacturer didn't open their specs. They don't care about
Re:Smart enough... (Score:5, Insightful)
And how many of those people running Photoshop actually paid for it?
Re:Smart enough... (Score:4, Insightful)
This is year 2025, the ITMS opened its protocols when YouMedia became the dominant player. Adobe has released a Linux port of its whole creative suite and it's available for purchase in your distro's package manager. GIMP merged again with Cinepaint and it is now the dominant photo editor among starting photo aficionados.
16 years is more than enough time for this stuff to happen.
Re: (Score:3, Insightful)
16 years is more than enough time for this stuff to happen.
That's what they said about ubiquitous jetpacks and flying cars back in the 1960's.
Re:Smart enough... (Score:4, Interesting)
Before you harp "run wine or gimp" normal people don't know about wine, gimp, or why they should use it. They want to attach their iphone to their computer like Steve Jobs says they can with a Mac or Windows box.
Actually, I'd suggest that if they actually need Photoshop, they should use it. If not, they should use Gimp. And I'd suggest Amarok, but that's another matter...
What, exactly, do you suggest?
They don't care about FSF's definition of "free" or even "free as in beer" since they'll gladly throw cash at expensive gadgets and software sold by Apple, MS, and Adobe.
No, but they may start to care when they actually want something Linux does well, or some software on Linux, and Windows won't do it.
This began with Firefox. At first it was a vocal minority, and there wasn't much change -- most websites would still be designed for IE, and many of them would look terrible in Firefox. And, as you predicted, people blamed Firefox. (Well, Mozilla at first, and then Firefox.)
The first thing that Firefox did right was the extension concept. In fact, Firefox was born out of the idea that Mozilla had a solid foundation, but too much crap built-in that could be done as an extension.
The real catalyst came with just a few of those extensions. Firebug made Firefox possibly the best browser to develop on -- very quickly, web developers started to prefer Firebug, and dislike Internet Explorer. Of course, to this day, few sites will actually be so bold as to refuse supporting IE, but similarly, few sites will not work on Firefox.
That was really a prerequisite to getting most users to even consider Firefox. Users don't like to even think about the browser, so asking them to run two -- Firefox for most things, and IE for that one last site -- is lunacy.
The other important extensions are Greasemonkey and the various blockers -- adblock, flashblock, noscript, etc. These are important in that they give the user a reason to love Firefox -- who wants to go back to the web before Adblock? These are features IE doesn't have, because there's no incentive -- why would Microsoft sabotage their own live.com ads?
It's worth noting: Firefox didn't have to add ActiveX support. (It's been added, but it's buggy enough that people use IE anyway.) There are still sites Firefox cannot be used for. Yet Firefox has forced IE back below 80% marketshare.
I see no reason Linux can't do the same thing. It will take longer, but it is possible. But constructive criticism will be useful here, because it's unlikely Linux will ever just run Photoshop, at least until Linux gains sufficient marketshare that Adobe targets it. (Not that this stopped them from porting the Flash player or Acrobat Reader...)
So, what does Linux do well now, or what is it that Windows and OS X really suck at that Linux could do?
Re:Smart enough... (Score:5, Insightful)
I would argue that it is Adobe's responsibility, as they are the company creating a product that people are paying outrageous amounts of money for. They get to choose what operating systems they wish to make it for.
It doesn't matter whose responsibility it is. It doesn't matter whose fault it is. What matters is whose problem it is. And it's Linux's problem.
If you want people to use your system, and they can't due to some issue, that issue is your problem.
I always wonder, when Linux advocates blame some third party, what they think the end-user's thought process is going to be. Do you think some graphics artist is going to think, "I want to run Photoshop, but it doesn't work under Linux. It's not Linux's fault, so I guess it's OK. I'll just run Linux and wait for Adobe to port Photoshop over."
The blame is Adobe's, the problem is Linux's.
only if things work the way they should. (Score:5, Insightful)
I recently tried Ubuntu after leaving Linux as my primary OS in 2003. You're wrong. The GUIs are only fine if you're willing to stick with their narrow limitations. I think it's because they're constantly being rewritten instead of incrementally improved.
Examples:
When I hooked up a second display and clicked "detect displays", it did nothing. No error message, no effect. I see no way to fix this without editing config files manually.
My sound doesn't work at all. It's listed properly in all the config screens, but nothing comes out of the speakers. Now what do I do? I see no easy way to try different driver or other things without delving into a kernel module mess. Hello, terminal.
How do I disable that wretched shutdown beep with a GUI? The mute control has no effect on it, nor does disabling the system beep in the sound preferences.
This is basic stuff that's been an issue for 10 years.
Sorry, but desktop Linux in 2009 gave me the same experience as desktop Linux in 2003, i.e. 3 days of googling and slogging through manuals to get things working. The process is a tad smoother now, but it's still only good for two groups: Grandma, who'll leave it the way it is, and experts who live Linux. Almost everyone I've ever met falls somewhere in between. It's hard to be just savvy in Linux. It's all or nothing.
Pretty skins are just that, skin deep.
Don't give me that paid support crap. I've never called MS support. I've never called Apple support. I can figure out how to maintain their systems by using them. If I'm going to have to pay someone to help me with how to do basic things in Linux then I might as well just buy one of the other two.
meh (Score:5, Insightful)
Re:meh (Score:5, Interesting)
Re: (Score:3, Insightful)
Re:meh (Score:5, Insightful)
And ~500Mhz of processing power is all you really need for that.
Yeah, well, in the days of the Pentium II which topped out at 450MHz, that would have been "hardcore".
So clearly the needs of even the most modest computer users have gone up substantially.
And assuming the software industry continues to find interesting things for people like your mom to do with their computer, then this will continue.
Don't get me wrong, there's a "good enough" in every computing generation and there's nothing wrong with people targeting that instead of the latest n' greatest. More and more people are, which is why netbooks are becoming so popular. But the bottom-end netbook of five years from now will be significantly more powerful than the bottom-end netbook of today, and odds are that extra performance will give someone who doesn't need anything more than a netbook a real benefit.
Point is -- "good enough" is real and valid, but still a moving target.
Re: (Score:3, Insightful)
Point is -- "good enough" is real and valid, but still a moving target.
Writing everything in Python (or god forbid, a language whose interpreter is written in Python) is not helping us lower our minimum requirements.
To fanboys: I'm a python-lover like yourself. Love writing it. It's just that everyone else should write in C so their code is fast to run :D
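A rough sense of the interpreter overhead the poster is talking about can be had from a toy timing comparison: the same sum computed by a pure-Python loop versus the C-implemented builtin `sum()`. This is not a rigorous benchmark; absolute numbers will vary by machine, and the function names are just illustrative.

```python
import timeit

def py_sum(n):
    """Sum 0..n-1 with an explicit interpreted loop."""
    total = 0
    for i in range(n):
        total += i
    return total

N = 100_000
# Both paths compute the same value; only the speed differs.
assert py_sum(N) == sum(range(N))

t_py = timeit.timeit(lambda: py_sum(N), number=20)
t_c = timeit.timeit(lambda: sum(range(N)), number=20)
print(f"pure-Python loop: {t_py:.3f}s, builtin sum: {t_c:.3f}s")
```

On typical hardware the interpreted loop loses by a wide margin, which is the poster's point: pushing hot paths into compiled code keeps minimum requirements down.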
Re:meh (Score:5, Insightful)
>> And ~500Mhz of processing power is all you really need for that.
> Yeah, well, in the days of the Pentium II which topped out at 450MHz, that would have been "hardcore".
> So clearly the needs of even the most modest computer users have gone up substantially.
Yes, but have they gone up recently? Sure, if you go back far enough you'll find unusable computers by the standards of today's average user, but seriously... when was the last time the "average" user really had to upgrade?
Let's parameterize this to make sure we're all talking about the same thing. "Average", for the purposes of this comment, being defined as someone who uses email, browses the web, plays a few card games, and oh hell, including -- to be brutally fair -- casual video viewing from the likes of youtube and hulu.
Let's see... The absolute cheapest desktop system I can conveniently find at the moment has a 1.8G Celeron and a gig of memory. That's an embarrassment of riches to perform the paltry functions described above. Laptops: I have at home a Thinkpad 240X (500 Mhz Pentium III, memory maxed out) made in June 2000 -- nearly a whole DECADE ago now -- that will do all of those things, *and* play DIVX encoded videos fullscreen without hesitations, and I don't think you could buy a new laptop anywhere today, for any price, that didn't have significantly better specs. *Phones* have better specs.
To the Linux geeks out there -- yes, you can get more bang for hardware buck with Linux, but don't flatter yourself into thinking that's the only reason ultra-cheap computers are "good enough". Windows XP runs fine for average usage (see above) on hardware made a decade ago. (And of Windows versions, XP itself is "good enough" for the average user, but that's another story.)
This is not a Linux Phenomenon. It's a case of the manufacturers not shifting paradigms fast enough. In the Old Days (say, the 1990's) we really needed a steep performance development curve because all kinds of new stuff was happening that would make use of every computing cycle one could conveniently afford. Windows tended to drive this, because each new version needed faster hardware to drive it, and there was (arguably) more functionality (or fewer bugs) in each new version to warrant upgrading.
But shortly after the turn of the century, two things happened: (1) A version was released of the most popular OS on the planet that (finally) was solid enough that the average user didn't immediately aspire to upgrade. (2) Hardware performance leapfrogged past what most people really needed, especially considering (1) above. With no Killer App and no new lugubrious-yet-tantalizing release of Windows to drive it, hardware was suddenly too fast for main street.
It was fairly recently that the industry finally understood that there was a market for Cheap. What followed was a scramble to adapt to this new market. You could hear the grinding of continental paradigm shift. Even Microsoft -- for God's sake -- is starting to become concerned about performance, instead of just assuming that Moore's Law will somehow compensate for unchecked bloat.
Let's face it: There is no consumer Killer App for the quad core Nehalem. There are a few painfully cutting edge geeks that may find a use for that kind of power, but most are just fooling themselves -- playing a game of my-cpu-runs-hotter-than-yours. Yet you can put together a killer Nehalem-based PC for less than the cost of a wide screen TV.
And that's not even taking the economy into account.
> And assuming the software industry continues to find interesting things for people like your mom to do with their computer, then this will continue.
That's the current problem (if you want to call it that). The software industry has nothing your mom would need current midrange hardware to run, with nothing in particular coming up.
Maybe there is a killer app waiting out there -- maybe when true AI becomes practical, it'll drive another technology race. But there hasn't been anything for awhile.
Re: (Score:3, Interesting)
Yes, but have they gone up recently?
Yes. In my house, the defining point was our purchase of a Flip video camera. It's a little $150 flash-based unit meant for people like us who want grandma to see movies of the kids with minimal hassle. You shoot your video, plug it into a Mac or PC's USB port, double-click the runnable software that's stored on the camera itself, and watch your movies. If you want to upload them to YouTube, select the desired clips and click the "upload" button - the software handles the rest.
It's a slick little camera
Re: (Score:3, Interesting)
How old is "older"? Daughter is still happy with her dual 800 G4 from 2001. She's a heavy Photoshop user, and response on her elderly G4 is better than her more recent Dell (2 Ghz Pentium 4, circa 2004). There's a brisk business out there in non-current Macs -- you can probably offer her a substantial upgrade for a paltry sum without even stepping in a Mac store.
I do video editing at home, (my video camera is a little higher end than a Flip) and in 2008 I bought an AMD Athlon 64 3200+ (I think it was
Re: (Score:2)
For viewing, an Atom/ION system will work very well. Processing? One does wonder if an Atom/ION system would do that as well for your average home user if the editing software took advantage of the GPU.
Re: (Score:2)
Re:meh (Score:5, Funny)
I AM an old person, you insensitive clod.
I can tell the difference, I just don't care. If I want to see a high-resolution sunset, I'll go outside and watch it live. I don't need to see every nose hair on the news reporter or every pore and pimple on these damn kids who seem to be everywhere on TV these days. And get off my lawn...
Re:meh (Score:4, Funny)
But HD is higher resolution than the real world...
Re: (Score:2)
Sure, but most people can be an iteration (or two) behind. Very few people need (or could even benefit from) the latest hardware hotness. Will you need to upgrade? Sure. Do you need the fastest machine you can get your grubby mitts on? Probably not.
Re:meh (Score:5, Insightful)
Yeah I expect "The Year of Good Enough Hardware" will coincide with the 10th anniversary of the "Year of Linux on the Desktop".
We didn't need fast computers for everyday computing and then we started indexing the entire hard drive.
We didn't need fast computers for everyday use and then we started watching YouTube h264.
We didn't need fast computers for everyday use and then we wanted to be able to preview documents without opening them.
We didn't need fast computers for everyday use and then we wanted to be able to...
The list goes on and on.
Re:meh (Score:5, Insightful)
You're kidding, right? Only tech enthusiasts want to watch YouTube? Tech enthusiasts are the ones using features like Spotlight (and whatever Microsoft calls their version of it in the Vista start menu)? Tech enthusiasts want to preview their documents before opening them, because they can't remember the name of the file they want but they know what it looks like?
Please. Tech enthusiasts are the ones who don't need these features, because we can get along just fine without them (although YouTube has some pretty awesome stuff on it). It's the non-technical users who need indexing and previews.
Re:meh (Score:4, Interesting)
Re:meh (Score:5, Insightful)
...but what a difference 3 years makes when the current kernel has been basically synced to the MS upgrade cycle...
I don't think it's fair to claim the kernel is synced to the MS upgrade cycle -- the kernel is not the problem, it's the desktop environments and the distros that feature them that are chasing the OSX/MS "bells and whistles".
Re:meh (Score:5, Insightful)
Agreed completely. There was a time when I absolutely had to have the latest and greatest just to get things done. Now, my home and work PCs are years old and are running CPUs that were low-budget even when brand new.
Unless you're building some kind of specialized business or research system, the only reasons to shell out thousands of dollars on hardware is if you're doing virtualization or are a hardcore gamer with no social life. :P
Re: (Score:2)
i'll go with the already here also :)
The fastest computer in the office is a P4-3GHz. As long as it has HT it will do for office work here. If I bought anti-virus for the SonicWall and deleted it off the desktops we could be using 2GHz or less machines still.
Our main app uses a terminal emulator. Using Windows XP with antivirus running takes at least a P4 2.5GHz HT CPU to successfully emulate a DUMB terminal without lag from the local computer.
Most horsepower now seems to go to run the system itself :(
Re:meh (Score:5, Insightful)
Been saying this since the Pentium II days. This "always-be-upgrading-to-the-latest-spec" is fine for hardcore users, but for everybody else, "good enough" happened quite a few hardware generations ago. The sad part is that we're only now having this conversation.
Eh, it seems different now - companies don't just have a range of products ranging from slow to fast, they actually champion some of their slower products (netbooks). Even power users are buying a netbook for on-the-go use, because they are mostly good enough. Sure, we have big fast desktops, but this is the first time even power users are buying low-powered machines.
-Taylor
Re: (Score:2)
Re:meh (Score:4, Insightful)
Being "good enough" depends on your usage. If all you do is small spreadsheets, good enough may have happened in the 1980s. If all you do is word processing, also 1980s. Now, going into office suites, depending on your need and use like powerpoint, it could have been anywhere from the late 90s to mid-2000s.
But it's also based on expectations, and expectations are too often influenced by past experiences rather than by the imagination of what could be.
I program and I browse. Programming can always use a faster computer at compile time for C-type languages. 10 years ago, I would have said my computer back then would always be good enough for browsing. Most content was static; it displayed the pages easily enough. You know what happened? Flash, Ajax, and the rest - watching videos, more dynamic pages, etcetera. What an internet "should be" has been redefined. Should I pretend this is the end of the road and no other advances in what we think of as the internet will happen? Definitely not. For one, higher-speed connections will keep transforming what we think our www experience should be. And a more powerful computer is necessary for that.
And videogames aren't even fooling us yet with their graphics. They got damned good, but they aren't out of uncanny valley yet. And we're not even beginning in 3D displays yet, still looking at these boring 2D planes - when will that happen, what is the killer app there?
How many undiscovered killer apps are there still? When will the first good AI come out? Or robots with real AI?
"Good enough" is not good enough. I can't even believe it's a subject worth pondering. It's not exciting and not a reason to be in the computer field. It's static and boring. The story of humanity is the story of constant progress. The only reason people are looking into it is that the MHz wars have stagnated and people haven't yet found the best ways to harness multi-core; it's a kind of despondent response to the seeming lack of progress, after the gigantic leaps and bounds computers were making just 5-9 years earlier. Those are problems worth looking into, but I know computers now are definitely not good enough.
The interface alone is still entirely too dumb, for one.
Re: (Score:2)
Your desktop is a classic GameBoy?
What is 'good enough'? (Score:5, Interesting)
This kind of reminds me of the '640KB should be enough for everyone' theory. If everyone is content just surfing the web and writing e-mails, then sure, the 'good enough' solution sounds fair, but if 'good enough' also means dealing with a Windows ME experience then no thanks. At the same time, what is considered 'good enough' will evolve over time as new solutions are created and user expectations evolve.
Will my 'good enough' computer handle my photo library and my 32MP entry-level camera, and recognise the faces in my photo collection? This sounds like far-fetched stuff today, but as these technologies percolate down from high-end systems and people get used to the computer doing more of their mind-numbing repetitive tasks, user expectations will adapt and want them in their 'good enough' computers.
In many ways plenty of people are already using 'good enough' computers. Whether they are satisfied with them is a whole other question.
Re: (Score:3, Insightful)
Will my 'good enough' computer handle my photo library and my 32MP entry-level camera, and recognise the faces in my photo collection? This sounds like far-fetched stuff today, but as these technologies percolate down from high-end systems and people get used to the computer doing more of their mind-numbing repetitive tasks, user expectations will adapt and want them in their 'good enough' computers.
Today's "Good Enough" computer won't. Tomorrow's "Good Enough" computer will.
And from the FA, a "Good Enough" computer won't last forever. It just has to last long enough that Microsoft destroys itself because people don't buy a new OS every 2 years.
Re:What is 'good enough'? (Score:5, Insightful)
Re:What is 'good enough'? (Score:5, Informative)
iLife '09 already tries to categorize your photos by setting and subject (and does a decent job, if the demos are to be believed). It uses face recognition and any embedded GPS data in the image file from your camera to do so.
BTW, I'm not an Apple fanboy, and I'm pissed that's what was covered in their presentation Sunday that was supposed to be about how environmentally friendly their systems and manufacturing processes are.
"Good enough" is what people actually DO (Score:5, Insightful)
The reality is that computers today "live longer" than they used to. Having a 9-10 year old computer was once unthinkable; it's now almost normal for just about any old Pentium 4 to still be in use today, and the Pentium 4 was apparently released in late 2000. [raptureready.com]
I put a new (but cheap!) AGP video card into an older P4 desktop computer (hint: PC-133 RAM!) that my son now uses to play Spore - one of the newer, hotter games around - it plays just great.
It's a trend - computers are "doing" for longer than they used to. They are in use for longer, and people hang on to them longer. They are less willing to buy the top-end because there's no reason to.
Re: (Score:3, Insightful)
They are in use for longer, and people hang on to them longer. They are less willing to buy the top-end because there's no reason to.
You pretty much hit the nail on the head for Microsoft's problem as well. I may be one of the few people that doesn't have that much of a problem with Vista and 7, but if I didn't get my copies from my university for $24.00 I would never have transitioned. XP just works, and more to the point, was designed to work on those older machines.
Re:What people DO is take photos and video (Score:4, Insightful)
Not really. The megapixel wars are basically over. We've just about topped out what people realistically need in a camera. Video is the same. There is really no need to ever capture more than 1080p for home use; the human eye simply can't perceive any quality improvements on screen sizes that are realistic for the home. So basically if your computer can play 1080p video with 7.1-channel lossless audio, you're pretty much at the end of the tunnel for most people.
Re:What is 'good enough'? (Score:5, Insightful)
This kind of reminds me of the '640KB should be enough for everyone' theory. If everyone is content just surfing the web and writing e-mails, then sure, the 'good enough' solution sounds fair, but if 'good enough' also means dealing with a Windows ME experience then no thanks. At the same time, what is considered 'good enough' will evolve over time as new solutions are created and user expectations evolve.
That last sentence is the key to the whole debate. There's been wicked kewl shite just over the horizon ever since I've been in computers and for quite a few years beforehand. But we've reached a point where the innovations in software don't really require more horsepower on the user's machine.
If we strictly consider the office work environment, we pretty much had everything we needed with win2k and office2k. There's been no new killer app introduced since then. Probably the only argument to be made is that there's more in excel 2007 than in 2k but those extra goodies came at the price of a lot of crap.
Also bear in mind that the customer base has fragmented tremendously. Computer users used to be a unified market of geeks and business types but now it's as fragmented as the user base for home entertainment. Some people are happy with a small broadcast TV, some people need a thousand cable channels and a 72" screen with all the doodads. Both people are in the same general market but their segments are widely divergent.
Will my 'good enough' computer handle my photo library and my 32MP entry-level camera, and recognise the faces in my photo collection? This sounds like far-fetched stuff today, but as these technologies percolate down from high-end systems and people get used to the computer doing more of their mind-numbing repetitive tasks, user expectations will adapt and people will want these features in their 'good enough' computers.
Call that ten years from now. I don't have an interest in photography now, probably won't by then, but since you do you'll be happy to upgrade for those features. I know I'll have a different machine by then and will be doing different things. Your mother might still be happy running on your trade-down, it does everything she needs.
In many ways plenty of people are already using 'good enough' computers. Whether they are satisfied with them is a whole other question.
Fifteen years ago most people didn't have a need for web and email so developing that need was pretty big in the first place. Some may never progress beyond that point.
depends on where the apps are run (Score:2)
Re: (Score:2)
"Will my 'good enough' computer handle my photo library, my 32MP entry level camera, recognize the faces in my photo collection. "
I hope they don't come out with 32MP entry-level cameras.
First of all, even 10MP is good enough for some very large prints and is more than big enough for most monitors.
Second, 32MP with crappy optics will still give you crappy pictures. Optics are not driven by Moore's law.
What will hopefully happen is that sensor speed, color, and low-light performance will increase.
Re:What is 'good enough'? (Score:4, Funny)
I have no idea why, but this is how it sounded when I read it:
(Futurama)
Pr. Farnsworth: How good is this computer?
Fry: Good Enough.
Pr. Farnsworth: That's not good enough!
Get what you pay for (Score:4, Interesting)
I think you're missing the point (Score:3, Interesting)
I think you're missing the point, which I take to be this: we've reached the point, with regard to hardware, where we *already* have "staying power" in all but the highest-end applications.
Even now, what you'd most likely deem "cheap" hardware is more than capable of running the most common applications well, and the OS' themselves are sufficiently reliable that one of the compellin
Re: (Score:2)
"Good Enough" is now and always has been (Score:5, Interesting)
Re:"Good Enough" is now and always has been (Score:5, Funny)
How many of us have super computers?
I own a PS3, you insensitive clod!
People will upgrade to Windows 7 (Score:5, Insightful)
The article argues that people won't upgrade from XP - it expects that as MS tries to force them, people will migrate to Linux instead. I think as Microsoft discontinues support for XP, people will move to Windows 7 - sales of Windows based netbooks seem to be much higher than for Linux.
Whether the same will hold true when the time comes for MS to try to get people to upgrade from Windows 7 to whatever comes next, it's too early to tell. Hopefully by then Linux will have managed to gain enough market share that most people have heard of it and/or know someone running it, and the barrier to a non-MS OS will be much lower.
Re:People will upgrade to Windows 7 (Score:5, Interesting)
But Windows won't run on the next generation of netbook computers (the ARM-based ones, such as what Freescale/Pegatron is coming out with), unless you count WinCE. But Linux will run the same apps as always, since everything can be (and has been) ported.
Of course, this hinges on the assumption that ARM-based netbooks will take off, and I think they will. For one thing, they get much better battery life than you can get out of an x86 (even though Atom is low powered, you still have the thirsty chipset). And the prices are better than most of the x86 netbooks ($100 to $200).
Re:People will upgrade to Windows 7 (Score:5, Interesting)
Not all Linux software is OSS. If Flash (for example) weren't available on ARM, I think it would make these machines less attractive - my wife spends a lot of time watching the BBC iPlayer on our Asus EEE.
This article [engadget.com] claims the ARM version of Flash will be out in May. I hope it is. I like our EEE, but an 8-hour battery life at the price they are talking about would be enough to make me buy one.
Re: (Score:2)
The article argues that people won't upgrade from XP - it expects that as MS tries to force them, people will migrate to Linux instead. I think as Microsoft discontinues support for XP, people will move to Windows 7 - sales of Windows based netbooks seem to be much higher than for Linux.
I have no issue upgrading to a 'new and better' operating system, on the condition that I see some worth in what's new and better. The issue I have with Windows Vista and Windows 7, is that other than higher hardware require
Re: (Score:2, Insightful)
I have no issue upgrading to a 'new and better' operating system, on the condition that I see some worth in what's new and better.
Once XP isn't supported and security flaws continue to be discovered, staying on XP will be unappealing.
Re:People will upgrade to Windows 7 (Score:4, Insightful)
Once XP isn't supported and security flaws continue to be discovered, staying on XP will be unappealing.
I don't even think that's going to matter for the majority of people. They will go and buy a new PC eventually, and that will come with Win7. If that doesn't suck on release as much as Vista did (and all indications from the beta seem to hint that it's actually very good), then Win7 will stay. And that's all there will be to it.
Re: (Score:3, Interesting)
> ..sales of Windows based netbooks seem to be much higher than for Linux.
True... now. But look at the factors Microsoft had to deal with to make that happen.
1. They had to 'encourage' the vendors to go upscale and forget about the low end. Of course most didn't need much encouraging anyway, since this whole $200-and-falling netbook idea scared the willies out of most of 'em. But while computer companies fear the low margins that pricing that low will entail, the consumer electronics people see an opportunity
Re: (Score:3, Insightful)
Linux is already far more prevalent than many of us could have dreamed a decade ago. Linux ships on cell phones, set top boxes, routers, laptops, desktops, even TVs. Linux practically runs the Internet. Any sysadmin worth his salt knows how to install a distro and get basic services up and running. While it's not quite a house
Re: (Score:3, Insightful)
The author of this article's supposition requires a 15-year economic downturn. Whenever I hear this predicted, say, from World Net Daily or one of the other far-right publishers, it surprises me that coming up with this prediction requires believing there will be no technological advances in that period. It's amazing what a couple of new technologies can do to economic predictions. In the eighties, we thought there would be recession forever, but then personal computers and the Internet came along.
the future is the past (Score:2)
I don't know how much less 'good enough' computers are going to become over the next ten years. It might be an issue of power, but I think what happened is that we realized computers had become overpowered for the average user. This is not an issue of goo
The future? (Score:4, Insightful)
Hell, that was 10 years ago.
If we hadn't let the programmers run amok and had instead forced them to write efficient code, what we had back then would have been 'good enough' for most people (not all, but most).
And to prove my point, I'm still running a 10-year-old desktop with a 900MHz PIII, running FreeBSD on a daily basis.
Re:The future? (Score:4, Insightful)
If we hadn't let the programmers run amok and had instead forced them to write efficient code, what we had back then would have been 'good enough' for most people.
OK, so give me an example of inefficient code and your explanation for why it's inefficient. As a professional programmer, I get tired of bearing the blame for "bloat". Sure, I write in high-level languages instead of assembler these days. I write database-backed web applications, and while I'm capable of implementing them on bare hardware, my boss would much rather just buy a faster server and let me code in Python than wait 4 years while I hammer out a prototype in assembler. The end result is that I can add new features in a timeframe that our customers will tolerate.
If we were stuck on the hardware of 1999, we'd be writing software the same way we did in 1999. Having been there, it sucked compared to what we can do today and I would never voluntarily go back. Do carpenters build "bloated" homes because they use general-purpose fasteners to bind pieces of standardized wood together, or are you willing to tolerate a little deviation from the ideal because you don't want to wait while they grow a tree in the exact shape of your blueprints? Well, I want to do the same for software. If you consider that "bloat", then you don't understand modern software development and what it delivers to end users.
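The parent's trade-off (developer time versus CPU time) can be made concrete with a hypothetical sketch, not taken from the article: a word-frequency counter is a few lines in a high-level language, while a hand-rolled C or assembler version would need its own hash table, string handling, sorting, and memory management.

```python
# Illustrative only: a word-frequency count in a handful of lines.
# The "inefficiency" of the high-level runtime buys an enormous
# reduction in development and maintenance time.
from collections import Counter

def top_words(text, n=3):
    """Return the n most common lowercase words in text."""
    words = text.lower().split()
    return Counter(words).most_common(n)

print(top_words("the cat and the dog and the bird"))
```

For the workloads most users run, the extra cycles this costs are invisible, which is exactly the point the parent is making about buying a faster server instead of waiting years for a hand-optimized prototype.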
Oh really? (Score:5, Insightful)
Well then, get on writing efficient code that'll decode HD video on a 900MHz processor. Don't tell me that's something "normal users" don't want; video on PCs is exploding, and people are all about a higher-res, better-looking picture. Don't forget the 5.1 audio that goes with it, and the HRTF calculations for those who want to wear headphones but still get surround. Oh, it can't handle that? Well, there you go then.
I get really tired of this "programmers aren't efficient" whining, as though the be-all, end-all of coding should be the smallest program possible. No, it shouldn't: computers are getting more powerful, and we should use that power. There are a number of reasons for programs to get bigger and require more power:
1) Features. I don't want computers to be stuck and never get any better. I want more features in my software. This goes for all software, not just power-user apps. For example, one thing I really value in Office 2003 (and 2007) is the inline spell checker. It is very good at figuring out what I mean when I mistype, and it learns from the kinds of mistakes I make to autocorrect and make more accurate guesses in the future. Well, guess what? That kind of feature takes memory and CPU. You don't get it for free. No big deal; my computer has lots of both. But it isn't "bloat" that the program has features like that rather than being a very simple text editor.
2) Manageability of code. Generating really optimized code often means generating code that is difficult to work with. In the extreme, you go for assembly language. You get the smallest programs doing that, and if you are good at it, the fastest. OK, great, but maintaining an assembly program is a bitch, and it is easy for errors, including security issues like buffer overflows, to sneak in. Now compare that to doing the same thing in a fully managed language like Java or C#. The code will be WAY bigger, especially if you take the runtimes into account. However, it'll be much cleaner and easier to maintain. No, it won't be as efficient, but does that matter? For many tasks there's plenty of power, so that's fine.
3) New technologies. HD video is an example that is out now; true speech understanding (as in, you can command the computer using natural language) would be one we haven't reached yet. These are things that are only possible because of increased processor power and memory/storage capacity. Look at video on the computer. For a long time it was nonexistent; then, when it started, it was little postage-stamp-sized things that weren't useful, and now you have full-screen HD that looks really smooth. It wasn't as though people haven't always wanted better video; it was that computers back in the day couldn't handle it. Only recently have drives become large enough to hold it and CPUs fast enough to decode it in real time.
4) Faster response. Computers have gotten MUCH faster at user response. The goal is that users should never have to wait on their system, ever, for anything. The computer should be waiting on the human, not the other way around. We keep getting closer and closer. If you don't try new systems it is hard to appreciate, but the strides have been massive. As a simple example, I remember back in high school when I went to print a paper for school, I'd issue the print command and wander to the kitchen. Printing a 5-page paper was a lengthy process: the computer had to use all its resources for some time to render the text and formatting into something the printer could handle. Now I submit a 50-page print job with graphics and all, and it is spooled nearly immediately. The printer has the entire job seconds later, since these days the printer has its own processor and RAM. It is printing before I can walk over to it. Things that I used to have to wait on are now fast.
5) Better multitasking. People like to be able to have their computers do more than one thing at a time and not bog down. It can be simple things like listening to music, downloading a file, and surfing the web, but not that long ago that wasn't possible
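The buffer-overflow point in item 2 can be sketched with a minimal, hypothetical example (Python standing in for any managed language; `safe_read` is an invented helper, not a real API): an out-of-bounds access raises a catchable exception instead of silently scribbling over adjacent memory the way an unchecked C buffer write can.

```python
# Illustrative only: managed runtimes bounds-check every access.
# The runtime costs cycles, but an off-by-one bug becomes a clean,
# catchable error rather than silent memory corruption.
def safe_read(buf, i):
    """Return buf[i], or None if i is past the end of the buffer."""
    try:
        return buf[i]
    except IndexError:
        return None

buf = [10, 20, 30]
print(safe_read(buf, 1))   # in-bounds read
print(safe_read(buf, 99))  # out-of-bounds: caught, no corruption
```

That bounds check on every access is part of the "bloat" being complained about, and it is exactly the overhead that keeps whole classes of security bugs out of managed code.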
Welcome to my world (Score:3, Informative)
Not willing to spend a lot of money on something that loses its value faster than... well... anything, really, I adopted the "good enough computing" doctrine years ago: I find computers that are sufficiently powerful for my use as cheaply as possible - nowadays they're usually free. I have gotten several perfectly good computers by saying "I can take that off your hands if you want."
So far all my software needs have been covered with Linux and other open source software.
I do have two Macs, but they follow the same philosophy: the combination of hardware+software is good enough for the purpose, and keeps its value better than a PC. [source: local sales of secondhand computers]
Umm. Yeah? (Score:5, Insightful)
Remember all those $10,000+ Real Serious Workstations, running Real Serious OSes that real computer users did real work on, back when the kiddies were twiddling bits on the Z80 box they built in their garage? All of them are dead. Almost all computers now in use are the direct descendants of the low end crap of the past.
Further, even within the category of boring x86s, almost all of us are already running something much closer to "good enough" than to "good". Some enormous proportion of PCs are in the sub-$1000 category, which still entails a bunch of tradeoffs(not nearly as many as it used to; but still).
It will, indeed, be interesting if Microsoft hits the chopping block during the next round of "good enough"ing(or, more realistically, gets shoved to rather more cost insensitive business sectors that like backwards compatibility, the same way IBM was); but "good enough" is already all around us.
Never going to Happen. (Score:2, Insightful)
Especially since the advent of "Slop-Ware" and Windows versions that need exponentially more power and capacity than the last version.
Netbooks (Score:2)
Even now, the low end, cheaper netbooks [often with no CD drive or even hard drive] are very popular.
A lot of people like to use them as a smaller, less costly replacement or addition to a full blown laptop.
big dangers (Score:2)
We have to remain careful of competition - being cheaper doesn't help if someone is selling hardware or software under market price in order to maintain market share.
Nobody can deny that Microsoft is basically giving Windows XP away for free on netbooks. While Microsoft is entirely able to do this, Linux vendors can't make up for that kind of loss with vast amounts of money from other overpriced software.
What we need to do is beat Microsoft on usability in every aspect, not just price, including marketability, liability
2025! (Score:5, Funny)
The year of the Linux desktop!
Microsoft knew this a long time ago (Score:4, Interesting)
Microsoft knew this a long time ago. That's why they are where they are today... everywhere. You don't need something that's perfect and awesome, you just need something good enough so people can get by. The cost savings you get by not putting tons of effort into perfection can be passed on to consumers, who almost always buy on price alone.
Re:Microsoft knew this a long time ago (Score:4, Interesting)
The most memorable lines from Pirates of Silicon Valley...
- We're better than you are! We have better stuff.
- You don't get it, Steve. That doesn't matter!
Windows is Adware now (Score:2)
By selling Windows XP, OEMs can bundle in a lot of trial versions of programs like Microsoft Office, virus scanners, etc. New computers are stuffed with adware these days.
This means the effective price of Windows XP is actually negative, something Linux cannot compete with. Who wants to pay to bundle a trial of an office package with a Linux system that comes with OpenOffice preinstalled?
Cars (Score:3, Insightful)
Re: (Score:3, Insightful)
Small/Medium Businesses (Score:5, Insightful)
This is especially true in small town America.
I'm surprised no one's mentioned this (Score:2, Insightful)
Parkinson's Law applies (Score:5, Insightful)
Parkinson's law is "work expands to fill all available time". It applies to processing power too. What's "good enough" today won't be "good enough" tomorrow, because someone will invent some CPU-sucking, memory-hogging, disk-flogging killer app that everybody will want to have.
I don't know what it will be. But then again, who predicted grandmothers would be editing home movies of their grandkids on their computers? Try that on a machine which is just "good enough" for email and the web.
Re: (Score:3, Interesting)
Sure, but over time the percentage of computers that are sold new, and in general use, that are significantly below the "top of the line" increases -- and that's not just a prediction of the future, but I think something that is true o
Ego Sum Magis Cynical (Score:2)
Those unlikely enough to be on the outside of this new class-based proprietary world will be lulled into believin
The curse of PC world (Score:2)
The obvious going mainstream seems to be the stimulus for it ceasing to be true. Extrapolating from the popularity of sensor-equipped devices like the Wii and iPhone, it seems likely that computers that monitor and respond to your gestures, voice, and attention will be arriving soon.
Good enough computing is Retrocomputing (Score:2)
For most people, software written a decade or more ago was "good enough", and they don't need modern technology.
It's called retrocomputing when you use old computers and old software. You can buy them cheap at auctions, garage sales, and on eBay.
Fallacy (Score:4, Insightful)
It is patently and obviously ridiculous. A Pentium II PC, with a Pentium II-era motherboard, memory, and other components, would not be an acceptable platform for the average user. It would be very slow and would immediately run into memory issues. Current graphics hardware would probably not be compatible, and even if it were, 3D software using OpenGL or the Microsoft equivalent would have unacceptably bad performance. Contemporary games would be dreary experiences indeed.
Lots of multimedia authoring software can use as many cores and as much RAM as you can afford. 3D gaming environments with ever more active objects, each with some amount of basic AI and moving parts, will also keep pushing the envelope even further. "Tab creep" in your web browser, where you end up accumulating open tabs, each with graphics, JavaScript, and maybe audio or video, gives memory footprints well into the hundreds of MB.
Maybe deaf and blind little old ladies with severe arthritis can get by with a Pentium II, but not too many others. In 2025, the things that pass for personal computer desktops (something like them will still exist in spite of the cyclical "The PC is Dead" hype) will have a dozen or more CPU cores, or perhaps hundreds of smaller cores of various kinds, to distribute different types of processing. Cache memory will be much larger than today, as will system RAM and storage. Software will be similar to today's except for far greater detail and granularity of content, and multiple new ways to interact with the data. That will demand a lot of compute power.
No doubt people will continue to say things like "an exaflop and a zettabyte ought to be enough for anyone," and people like me will continue to deride and mock them.
'good enough' computing became the norm in 1991 (Score:5, Interesting)
With the debut of Windows 3.1 'good enough' became the accepted norm in computing.
You could pay more for a NeXT workstation, a Sun workstation, or even a Mac. However Windows 3.1 was 'good enough'. Most people didn't need networking support built in, or the compilers or software that was available for the other platforms.
You could have gone all multimedia with a fancy Amiga that did incredible sound and graphics, but 16 colors and trading files via floppy was 'good enough' for the majority of people. You could add hardware and software to Windows 3.1 computers if you really had a need to network them. The computers Windows ran on were capable of displaying better graphics (games that booted to DOS showed this), but Windows 3.1 was 'good enough'.
Windows 3.1 really did make computers easier to use. Macs, Amigas, and NeXT did a 'better' job of making computers easier for people, but Windows 3.1 did a 'good enough' job. At about US$2,400, a mid-range computer with Win 3.1 on it was a lot cheaper than the competition. It was 'good enough' and cheaper.
The history of economics shows that 'good enough' and cheap wins.
Think of the 'best' hamburger that you ever ate...
Did you think of a plain old McDonald's hamburger? Probably not. In any scale of human measure (taste, smell, satisfaction) McDonald's hamburgers rarely rank as 'best'. But measured in market share the McDonald's hamburger is the best.
Ford's Model T was not as fast or as fancy or as comfortable or as good in quality as the hand crafted automobiles it competed with. But thanks to mass production and economies of scale it was cheaper and it was 'good enough'. Ford and other mass produced vehicles dominated the market. There are still purpose built vehicles, but they are a small specialty segment of the market.
'Good enough' and cheap is always the 'best' when you consider things from a market dominance point of view. What a human thinks is 'best' and what the market thinks is 'best' are not the same thing.
They were in the 40's too. (Score:3, Insightful)
"I think there is a world market for about five computers"
http://en.wikipedia.org/wiki/Thomas_J._Watson [wikipedia.org]
They were good enough then. Since then, the market has expanded a little.
Re:Compare/Contrast with Apple (Score:5, Funny)
I'm sorry, you seem to have failed to make any kind of point.
Please re-insert your thought process and try again.
Re: (Score:2)
And yes, it is design over function. Consumer Reports rates Dyson as performing worse than many other vacuums, and it lands near the bottom of the price/performance ranking. Same with BMW. The price premium is not nearly worth the minuscule actual improvement in the product.
Re: (Score:3, Insightful)
And when people have reduced budgets because the economy tanks, "design over function" companies like BMW, Dyson and Apple will go by the wayside.
I read earlier today that Apple's profits went up this quarter.
Re:Compare/Contrast with Apple (Score:5, Interesting)
Apple is, in fact, a significant beneficiary of "good enough". They make mostly laptops, which always have worse price/performance and worse absolute performance than desktops. Nobody much cares, because laptops are more convenient and they are fast enough for the job (even within the laptop market, Apple doesn't bother with any dual-HDD offerings or SLI setups, because the lower-spec stuff is good enough). On the desktop side, all of Apple's consumer offerings are all-in-ones with extremely limited expandability. Nobody (except gamers) much cares, because the stuff built in is good enough, and PCI blanking plates are ugly.
Having a manufacturer selling limited-performance hardware, with minimal expansion capacity, distinguished by industrial design and software, rather than performance, and doing quite well is exactly what "good enough" looks like.
That doesn't mean Apple is the only part of "good enough": el-cheapo Walmart desktops and netbooks are also a (larger, in marketshare terms) part. But Apple is hardly in opposition to "good enough".
Re: (Score:2)