Less Is Moore 342
Hugh Pickens writes "For years, the computer industry has made steady progress by following Moore's law, derived from an observation made in 1965 by Gordon Moore that the amount of computing power available at a particular price doubles every 18 months. The Economist reports however that in the midst of a recession, many companies would now prefer that computers get cheaper rather than more powerful, or by applying the flip side of Moore's law, do the same for less. A good example of this is virtualisation: using software to divide up a single server computer so that it can do the work of several, and is cheaper to run. Another example of 'good enough' computing is supplying 'software as a service,' via the Web, as done by Salesforce.com, NetSuite and Google, sacrificing the bells and whistles that are offered by conventional software that hardly anyone uses anyway. Even Microsoft is jumping on the bandwagon: the next version of Windows is intended to do the same as the last version, Vista, but to run faster and use fewer resources. If so, it will be the first version of Windows that makes computers run faster than the previous version. That could be bad news for computer-makers, since users will be less inclined to upgrade — only proving that Moore's law has not been repealed, but that more people are taking the dividend it provides in cash, rather than processor cycles."
Let's see (Score:5, Funny)
Less: 120884 bytes
More: 27752 bytes
Wow, that's right!
Re:Let's see (Score:5, Funny)
Even more literally..!
$ ls -i /usr/bin/less /usr/bin/more
3603778 /usr/bin/less
3603778 /usr/bin/more
Re: (Score:3, Informative)
moo@you:~$ ls -l /usr/bin/more
-rwxr-xr-x 1 root root  30316 2008-09-25 07:08 /usr/bin/more
moo@you:~$ ls -l /usr/bin/less
-rwxr-xr-x 1 root root 120884 2008-02-01 20:51 /usr/bin/less
Re: (Score:3, Informative)
From a quick check of systems within easy reach...
They are the same on OSX, FreeBSD, OpenBSD
They're different on Solaris, and more is nonexistent on CentOS afaict.
Re:Let's see (Score:5, Funny)
All the windows experts are scratching their heads now.
That's OK, maybe that'll make them get off of our lawn too.
Re:Let's see (Score:4, Insightful)
Re:Let's see (Score:5, Funny)
Re: (Score:3, Funny)
But if you know that, don't you have your OWN yard to have people get off of?
Your basement has a lawn?
There, fixed that for you.
Re:Let's see (Score:5, Funny)
Those don't exist, and our server has a peculiar way of letting me know that:
kosh ~ 3 % ls -l $(which Less)
ls: Less not found
ls: not not found
ls: found not found
:-D
Economics (Score:2, Interesting)
Operating versus capital (Score:2)
Proof that Moore's law is driven by economics as much as (or even more than) technological discovery/innovation?
You have a good point that this could be a test of your hypothesis. The purchase of a computer by a company or government is frequently considered a "capital" purchase, even though over time the cost of computing is dominated by the operating costs of software, power, upgrades and IT.
However, since capital is usually scarce in organizations, it tends to drive acquisition decisions. People buying things that they can't easily replace will tend to seek higher-performance equipment.
But that may be about to ch
Or that history repeats itself (Score:5, Insightful)
Well, actually it's just proof that history repeats itself. Because this thing has happened before. More than once.
See, in the beginning, computers were big things served by holy priests in the inner sanctum, and a large company had maybe one or two. And they kept getting more and more powerful and sophisticated.
But then it branched. At some point someone figured that instead of making the next computer that could do a whole megaflop, they could make a minicomputer. And there turned out to be a market for that. There were plenty of people who preferred a _cheap_ small computer to doubling the power of their old mainframe.
You know how Unix got started on a computer with 4k RAM, which actually was intended to be just a co-processor for a bigger computer? Yeah, that's that kind of thing at work. Soon everyone wanted such a cheap computer with a "toy" OS (compared to the sophisticated OSs on mainframes) instead of big and powerful iron. You could have several of those for the price of a big powerful computer.
Then the same thing happened with the micro. There were plenty of people (e.g., DEC) who laughed at the underpowered toy PCs, and assured everyone that they'd never replace the mini. Where is DEC now? Right. Turned out that a hell of a lot of people had more need of several cheap PCs ("cheap" back then meaning "only 3 to 5 thousand dollars") than of an uber-expensive and much more powerful mini (costing tens to hundreds of thousands).
Heck, in a sense even multitasking appeared as sorta vaguely the same phenomenon. Instead of more and more power dedicated to one task, people wanted just a "slice" of that computer for several tasks.
Heck, when IBM struck it big in the computer market, waay back in the 50's, how did they do it? By selling cheaper computers than Remington Rand. A lot of people had more use for a "cheap" and seriously underpowered wardrobe-sized computer than for a state of the art machine costing millions.
Heck, we've even seen this split before, as portable computers split into normal laptops and PDAs. At one point it became possible to make a smaller and seriously less powerful PDA, but which is just powerful enough to do certain jobs almost as well as a laptop does. And now it seems to me that the laptop line has split again, giving birth to the likes of the Eee.
So really it's nothing new. It's what happens when a kind of machine gets powerful enough to warrant a split between group A who needs the next generation that's 2x as powerful, and group B which says, "wtf, it's powerful enough for what I need. Can I get it at half price in the next generation?" Is it any surprise that it would happen again, this time to the PC? Thought so.
Re: (Score:3, Funny)
Because you don't need more cycles in biz (Score:5, Insightful)
Let's be honest here. What does the average office PC run? A word processor, a spreadsheet, an SAP frontend, maybe a few more tools. And then we're basically done. This isn't really rocket science for a contemporary computer; it's neither heavy on the CPU nor on the GPU. Once the computer is faster than the human, i.e. as soon as the human doesn't have to wait for the computer to respond to his input, when the input is "instantly" processed and the user never sees a "please wait, processing" indicator (be it an hourglass or whatever), "fast enough" is achieved.
And once you get there, you don't want faster machines. More power would essentially go to waste. We reached that point about 4-5 years ago. Actually, we're already one computer generation past "fast enough" for most office applications.
Re:Because you don't need more cycles in biz (Score:5, Insightful)
It's Moore's other law - once fast enough is achieved, you have to slow it down with shite like rounded 3d-effect buttons, smooth rolling semi-transparent fade-in-and-out menus and ray-traced 25 squillion polygon chat avatars.
Re: (Score:2, Troll)
If I cannot turn that crap off, I don't want the software. If I can turn it off, I turn it off.
An interface is supposed to do its job. When I play games or when I watch an animation, I want pretty. When I work, I want efficiency. Don't mix that and we'll remain friends.
Re:Because you don't need more cycles in biz (Score:5, Funny)
It's Moore's other law - once fast enough is achieved, you have to slow it down with shite like rounded 3d-effect buttons, smooth rolling semi-transparent fade-in-and-out menus and ray-traced 25 squillion polygon chat avatars.
Actually, that's Cole's Law [wikipedia.org], which states that an unused plate space must be occupied with cheap filler that no one really wants.
Re:Because you don't need more cycles in biz (Score:5, Funny)
It's usually expressed as Gates' Corollary to Moore's Law: Whatever Moore Giveth, Gates Taketh Away.
Re: (Score:2)
We reached that point about 4-5 years ago. Actually, we're already one computer generation past "fast enough" for most office applications.
Exactly. I have a 5-year-old laptop with a Pentium 2.4 gigahertz processor, but even with today's software (latest versions of OO.org, Firefox, Google Earth, etc.) it runs just fine. Sure, a newer computer would be somewhat faster, but this is not "so slow it's painful" like my Pentium 133 was 5 years after I bought it.
It works well enough that I recently put in a larger hard drive and a new battery to keep it useful for the foreseeable future, because I do not intend to replace it until it dies (or until
Re: (Score:2)
And once you get there, you don't want faster machines. More power would essentially go to waste. We reached that point about 4-5 years ago. Actually, we're already one computer generation past "fast enough" for most office applications.
Agreed. There would have to be a new paradigm shift (ok, I fucking hate that word, maybe "usage need change"?) to make an upgrade worthwhile.
For my personal needs, DOS was good for a long time. Then Win95 came out and true multitasking (ok, kinda working multitasking that still crashed a lot) made an upgrade compelling. I couldn't really use any of the browsers on my old dos box and Win95 opened a whole new world. That computer got too slow for the games eventually and that drove the next upgrade. Online vide
Re: (Score:2)
And, bluntly, I don't see any "we must have this!" features in any office standard application, at least since 2000.
Multitasking was a compelling reason. It became possible to run multiple applications at once. Must-have.
Better interweaving between office products and email was a must-have, too.
Active Directory (and other, similar technologies) made administering multiple accounts a lot easier and certainly helped speed up rollouts. Also a must-have (for offices, but we're talking office here).
And so on.
Re: (Score:3, Interesting)
Let's be honest here. What would we like the average office PC to be doing? If they are beefy enough to run a grid on, and so also perform many of the data retention, de-duplication, HPC functions, and many other things, then yes, having faster-better-more on the desktop at work could be interestingly useful. Software is needed to use hardware that way, meh.
There will never be a time in the foreseeable future when beefier hardware will not be met with requirements for its use. Call that Z's corollary to Moor
Re: (Score:2)
I have an AMD dual core 4400 with a couple Nvidia 7300's that will take anything I throw at it. I don't think I'll ever need another computer unless the thing fries. They have also become so cheap that they have also become commodities. Why fix something for
Re: (Score:2)
the industry will almost die in the next few years. [...] I don't think I'll ever need another computer unless the thing fries.
Which means that clamoring for cheap will mean hardware makers will design _more_ early failure into hardware, and reduce warranties to nil.
Re: (Score:2)
If we were talking about CPU power, I'd completely agree with you. A Pentium IV was fast enough for most people, and a modern Core2Duo is more than enough. I still get to points where my sys
Re:Because you don't need more cycles in biz (Score:5, Funny)
Re: (Score:2)
You forgot "anti virus software". By fare the biggest (and probably least useful) resource hog.
Re: (Score:3, Insightful)
Re: (Score:3, Interesting)
Still, will that processing have to be on the client? Probably not. Heck, I don't pretend to be a computer science Einstein, but I wouldn't be at all surprised that the limiting factor in functional AI is access to piles and piles of centralized data ("piles" is a technical CS term that means "a lot"). I don't need a fancy computer to access Google, and when Google is finally self-aware I don't suppose I'll need more than a web browser to talk to it either.
It's pretty certain that computers are going t
Or... (Score:3, Interesting)
That could be bad news for computer-makers, since users will be less inclined to upgrade only proving that Moore's law has not been repealed, but that more people are taking the dividend it provides in cash, rather than processor cycles.
Or, it could be good news for them. Especially in light of things like the "Vista Capable" brouhaha, and the impact Vista had on RAM prices when fewer than the projected number of consumers ran out to buy upgrades.
Maybe Intel and NVidia are going to be wearing the sadface, but I'm willing to wager HP and the like are almost giddy with the thought of not having to retool their production lines yet again. They get to slap on a shiny new OS and can keep the same price point on last year's hardware.
Some of the corporations in the world buy new hardware simply to keep it 'fresh' and less prone to failure. My own company has recycled a number of Pentium 4 machines that are still quite capable of running XP and Internet Explorer. With the costs of new desktop hardware at an all-time low for us, we get to paint a pretty picture about ROI, depreciation, budgets, and the like.
Re: (Score:2, Insightful)
Re: (Score:2)
Unfortunately, if companies start buying computers in 5 or 6 year cycles instead of 2 or 3 year cycles then HP definitely won't be giddy. They'll be even less giddy if the average price of a desktop PC drops to $200 and the average price of a laptop sinks to $500.
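Quick back-of-the-envelope sketch of why (Python; the prices and cycle lengths are made up for illustration, not real vendor figures):
# Illustrative only: annualized hardware spend per seat under different
# replacement cycles and price points (made-up numbers).
def annual_spend(price, cycle_years):
    return price / cycle_years

old = annual_spend(1000, 3)   # ~$333/year per desktop on a 3-year cycle
new = annual_spend(200, 6)    # ~$33/year per desktop on a 6-year cycle
print(f"old: ${old:.0f}/yr  new: ${new:.0f}/yr  ratio: {new / old:.0%}")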
Sigh (Score:4, Insightful)
There's already a method for that: it's called by the catchy title "buying a slightly older one".
A related technique is called "keeping the one you've already got".
Re: (Score:2)
A related technique is called "keeping the one you've already got".
I don't know... That sounds expensive.
It's funny, but it's also true. (Score:2)
A previous company I worked for would lease their workstations for 3 years. That did mean that they were constantly paying for computers ... and rolling out new boxes.
But there weren't many problems with the HARDWARE during those 3 years.
As they started keeping the workstations longer, there were more problems with the hardware AND there were problems with replacing the hardware that broke. Which was leading to a non-uniform desktop environment. It's more difficult to support 100 different machines with 100
Re: (Score:3, Interesting)
In some markets it simply hasn't been possible to buy a 2 year-old spec until recently. In particular, laptops used to vanish at the point at which they made sense for someone who just wanted a portable screen and keyboard with which to take notes and maybe enter some code. The only way to get a laptop much cheaper than the premium spec was to go s/h, and s/h laptops have never been that attractive an option.
Machines with the relative spec of NetBooks would have sold fine a decade ago. It's just that the la
Re: (Score:3, Insightful)
He said older, not used. An older CPU [ebuyer.com] is still as safe as a newer one [ebuyer.com]
This is nothing new (Score:5, Interesting)
Some of you may remember the 1980s and early 1990s, where PCs started out costing $5,000 and declined slowly to around $2,500 for name brand models.
Around 1995, CPUs exceeded the GUI requirements of all the apps then popular (this is pre-modern gaming, of course). Around 1996 and into 1997 the prices of PCs fell off a cliff, down to $1,000.
Those who fail to remember history...
Re: (Score:2, Interesting)
Adjust those numbers for inflation... Or better, retro-adjust current prices to 1980 prices.
I do remember falling prices in the nineties, but now a PC is pretty close to an impulse buy. For me in 2000, a 2000€ PC was already an impulse buy (that said, I was single, a successful consultant with a brand-new sports car, so my "impulses" were a bit off). These days an Eee PC is an impulse buy for anyone who loves toys and has a bit of spare money.
This is not a repeat of the previous price-falls, this is the
Re: (Score:3, Informative)
Now, let's take the Asus Eee PC ($280) back to 1990: adjusted for inflation, you would have paid the equivalent of $172.28 for it. That's a PC that would have beaten your top-of-the-line $2,500 1990 PC to smithereens. (The i486 came out in 1989!)
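Quick sanity check of that conversion (Python sketch; the cumulative inflation factor is just the one implied by those two figures, not an official CPI number):
# Deflate the roughly $280 Eee PC price back to 1990 dollars.
price_now = 280.00
inflation_factor = 1.625   # assumed cumulative 1990 -> today inflation
print(f"${price_now / inflation_factor:.2f} in 1990 dollars")  # about $172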
Poor Intel (Score:2)
Continually making the same thing for less money is not a very good business model.
Pretty soon the customers will be asking for the same performance, free.
Reminds me of the old quote, "We have been doing so much with so little for so long we are now qualified to do anything with nothing at all".
Comment removed (Score:5, Insightful)
Re: (Score:3, Interesting)
2) and that you don't need teraflops of CPU/GPU power just to draw greasepaper-style borders around your Microsoft Word windows
You're dead right there. I always wondered why I could play games a few years ago that had some stunning graphics, yet ran very well on a 900 MHz PC with 256 MB of RAM; yet Vista needs 1 GB and a graphics card that's better than that old spec just to draw a poxy bit of transparent titlebar.
I'd blame the managed .NET stuff, but the WDDM is native! Makes the inefficiency even worse.
Re: (Score:3, Interesting)
Maybe people will realize what an obscene waste of money and computing power an operating system like Windows Vista, which requires a gig of RAM to run, really is.
Hear, hear. To a lesser extent the same is true of all modern GUI's. Most GUI programmers today seem to have no clue how to write efficient, usable software. I am still waiting for GUI software that responds fast enough. I'm an experienced computer programmer/user and I'm endlessly irritated by the slow response of computers that are supposed t
Smaller (Score:2)
Re: (Score:2)
PC hardware has left software requirements somewhat behind, unless you want to run the very latest games.
My dual core PC from 2007 is still more than sufficient in terms of performance. The price to put a similar or better machine together has dropped from 800 Euros to 500 Euros, however (without monitor). That is assuming
-the same case
-a comparable power supply
-same memory (2 GByte)
-a slightly faster but less power-hungry CPU (AMD 45nm vs. 90nm, each in the energy-efficient version)
-a faster GPU (ATI 4670
The bells and whistles nobody uses... (Score:4, Interesting)
sacrificing the bells and whistles that are offered by conventional software that hardly anyone uses anyway
I think if you took out all the features that 'hardly anyone uses' you wouldn't have much of a product left. Bloatware and the 80/20 Myth [joelonsoftware.com]
Re: (Score:2)
So I'm starting to suspect that fretting about bloatware is more of a mental health problem than a software problem.
Amen.
Re:The bells and whistles nobody uses... (Score:5, Insightful)
The article you pointed out is pure nonsense. It claims that bloat isn't important due to the fact that memory cost dropped. Not only that, it tries to base that claim on an idiotic dollars-per-megabyte metric, and on how the fact that software like Microsoft's Excel bloated from a 15MB install in the 5.0 days to a 146MB install in the 2000 days is somehow a good thing, because in the 5.0 days it took "$36 worth of hard drive space" while "Excel 2000 takes up about $1.03 in hard drive space". No need to justify a roughly tenfold increase in footprint. We are saving money by installing more crap to do the exact same thing.
In fact, the idiot that wrote that article even had the audacity to state:
Up is down, left is right, bloat is actually good for you.
But people still complain. Although it appears that we should be grateful for all that bloat, we are somehow being ungrateful by believing that all that bloat is unnecessary. But fear not, the idiot that wrote the article has a nice accusation for all those bloat haters out there:
Yes, that's it. We don't hate orders-of-magnitude increases in bloat just to perform exactly what was easily done with a fraction of the resources. We don't hate being forced to spend money on hardware only to be left with a less-than-adequate solution compared with the previous generation. We simply hate Windows. Good call.
The article is bad and you should feel bad for posting a link to it.
Dirty Industry Secret (Score:4, Interesting)
Years back, when everyone in the mainstream was trotting out how many MHz/GHz their processors ran at and how their Latest And Greatest system was *soooo* much better, I insisted that the computer industry had a dirty little secret: mid to low end computers would work just fine for 90% of users out there. Computer makers didn't want people knowing this and instead hoped that they would be convinced to upgrade every 2 or 3 years. Eventually, though, people learned that they were able to read their e-mail, browse the web, and work on documents without buying a system with a bleeding edge processor and maxed out specs. This seems like the continuation of the secret's collapse. People are realizing that not only don't they need to buy a system with a blazing fast processor just to send out e-mail, but they don't need to buy 10 different servers when one powerful (but possibly still not bleeding edge) server can run 10 virtual server instances.
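The virtualization math is easy to sketch (Python; the utilization figure is an assumption for illustration, not a measurement):
# Ten lightly loaded servers versus one host running ten virtual machines.
servers = 10
avg_utilization = 0.08                   # assume ~8% average CPU load each
combined = servers * avg_utilization
print(f"combined load: {combined:.0%} of one host")   # ~80%, still fits
print(f"boxes to buy, power and cool: 1 instead of {servers}")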
Faulty on many fronts (Score:2)
1) No one is following Moore's law. It's a description of what happens.
2) You can, of course, come up with some equation that describes the cost of a set amount of processor power over time.
3) This article and this summary make bad economic assumptions and use faulty logic. I suggest to all reading the comments that it's not worth reading.
That is all.
Re: (Score:2, Funny)
Incorrect about Moore's law (Score:5, Informative)
Re: (Score:2)
Yes, but you don't expect anyone on a site like this to actually know your random "factoid".
What we would also like to see is how many computing powers such as this equal a standard US LOC?
Microsoft got told off (Score:2)
But it's unlikely, and I would hesitate to say Microsoft has actually preempted anything; I'd say they're responding to what they've been
Hmm... (Score:4, Interesting)
I can buy a $350 mini laptop, a $500 decently specced laptop, or a $500 desktop with what would have been unbelievable specs not long ago. I remember when I picked up Computer Shopper and was thrilled that there were any bare-bones desktops that sold at the $1K mark. Now you can get full-featured systems for under $500 that do things that $2-3K machines couldn't do.
Really, there is no such thing as a "Moore's Law." It's Moore's trend lines that have been holding. That it lasted 10 years, much less this long, has been utterly amazing. I fully expect us to run into problems keeping up with "Moore's Law" before 2100. 5-10 years after the trend is broken, the future folks will either forget about it entirely or look back and kinda giggle at us like we were just silly about it all. 50-100 years later no one will care, though everyone will be making use of the by-products of it. Do you notice where the stuff for roads comes from, or which Roman engineer built the most or best roads? That's generally what they'll think of any computing device older than 20 years. If Moore's law holds until 2050, every computing device that we've currently made will be either trash or museum pieces by that time. Heck, you have people getting rid of or upgrading cell phones almost every 3-6 months already.
We imagine replicators in Star Trek, but we don't need them with Walmart and 3-6 months for new products to come out. Consider Amazon+UPS next day shipping. Replicator tech would have to be cheaper and faster than that to compete. I think that it's more likely that we'll keep on improving our current tech. What happens when UPS can do 1 hour delivery to most places on the globe? Replicators might spring up, but only for designers to use to spend a week making 10K units of something that went on sale today, would be sold out in two weeks, and would be discounted by the week after. Face it; we are already living in a magical golden age. We just want it to be 1000x better in 50 years.
Vista deserves credit... (Score:5, Insightful)
Even Microsoft is jumping on the bandwagon: the next version of Windows is intended to do the same as the last version, Vista, but to run faster and use fewer resources. If so, it will be the first version of Windows that makes computers run faster than the previous version.
Without Vista, MS wouldn't be able to claim that 7 was faster than their previous version of Windows.
More is More (Score:5, Interesting)
One of the things I learned many years ago is that the value of computer and computing speed isn't a function of how fast something runs. Rather, it is a matter of whether or not you actually run something.
If computer speeds are twice as fast, and it currently takes you ten seconds to accomplish Task A, and a new computer will allow you to accomplish that same task in 5 seconds .... getting a new computer is not that big of a deal.
However, if you run Task B, which takes 1.5 hours to complete, and a new computer will run that same task in say 4 minutes (Real world example from my past, log processing), the difference isn't necessarily the 86 minute difference, but rather if and how often you actually run that task.
It is endlessly amusing to see "real world benchmarks" that run in 3 minutes for most processors, separated by less than 2x. Or frames per second. Or... whatever.
When shit takes too long to get done, you tend NOT to do it. If the difference is a few seconds, that is nice and all, and a few seconds may be of interest to "extreme" hobbyists.
But Real World differences are not marginally decreasing from 300 to 279 seconds. Sorry, but those extra few seconds aren't going to prevent you from running that Task.
The true measure is not how fast something gets done, but whether or not you actually do the task at all, because the time involved is prohibitive.
Re: (Score:3, Insightful)
If computer speeds are twice as fast, and it currently takes you ten seconds to accomplish Task A, and a new computer will allow you to accomplish that same task in 5 seconds .... getting a new computer is not that big of a deal.
Depends on the frequency of the task too. In Photoshop, going from 10 to 5 secs for a simple task done often (say noise reduction) is a big deal. Or if an IDE takes 1 vs 2 seconds to load the popup showing class variables/methods. It doesn't sound like a big deal, but when you have to do it hundreds of times a day, believe me, those seconds add up! I would (and have) gladly upgrade a PC for these kinds of real world improvements.
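A back-of-the-envelope sketch of how they add up (Python; the counts are assumptions for illustration):
# What "a few seconds, hundreds of times a day" costs over a year.
seconds_saved = 5          # e.g. a filter dropping from 10s to 5s
times_per_day = 200
work_days_per_year = 220
hours = seconds_saved * times_per_day * work_days_per_year / 3600
print(f"roughly {hours:.0f} hours saved per year")   # ~61 hours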
Re: (Score:3, Insightful)
But you don't stop doing those tasks because they take 10 seconds. While you do have a valid point, those chunks of time don't stop you, they only annoy you.
Trust me, you don't know what you DO NOT do because it takes too long to do it. I've seen tasks grow in time, from a minute to 1.5 hours as the logs grew due to increased activity. As the time to process the logs mounted, the less frequently I processed them, often running them over weekends and evenings because I couldn't afford to process them during
Hardware and Software (Score:2)
The reality is that hardware has pulled so far ahead of software, it will be years before we exploit our current level of technology to its capacity.
We have some apps that don't understand how to spread work between CPUs. (We have some OSes that barely grasp that.) We have applications that were designed in a time of 16-bit machines and fairly low limits on memory that have been patched and slowly moved along, when they really need a completely new architecture underneath now to function well. We
And this is new? (Score:2)
The first 486 I got my hands on came with a $5,000 price tag.
My first Pentium came in, well spec'd, around $2,500.
The PCs I was building for myself ran about $1,500 five years ago and the last one was down around $1,100 - all "mid-range" machines, capable of playing the latest games in fairly high quality and reasonably functional for at least 18 months to 2 years.
Since a little after that Pentium, the systems I see more casual friends buying have dropped from few people buying $3,000 laptops to a fair numb
what Microsoft Giveth (Score:2)
Actually (Score:2)
by Gordon Moore that the amount of computing power available at a particular price doubles every 18 months
-------------
BZZZZ wrong answer
Moore's original law states that the number of transistors we are able to pack into a given size of silicon real estate inexpensively doubles every year. He changed this prediction to every 2 years in 1975, which bolstered the perceived accuracy of his prediction.
Number of transistors for a particular price is a moving target which is entirely dependent on the suppl
More than Moore('s law) (Score:5, Insightful)
As long as a processor, say, has to be tested, packaged, marked, shipped, etc. (which costs very similar amounts whether the die in question is a cheap cut-down model or a high end monster), there is going to be a point below which cutting performance doesn't actually cut price by any useful amount. Something like the hard drive is the same way. Any drive has a sealed case, controller board, motor, voice coil unit, and at least one platter. Below whatever the capacity is of that basic drive, there are no real cost savings to be had. (Incidentally, that is one of the interesting things about flash storage. HDDs might be 10 cents a gig in larger capacities; but that doesn't mean you can get a 4 GB drive for 40 cents. I had a quick look, and you can't get anything new for under about $35. With flash, you might be paying 100 cents a gig; but you pretty much can get any multiple you want.)
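A little sketch of that cost-floor argument (Python, using the rough prices quoted above; the exact numbers are only illustrative):
# Hard disks have a fixed-cost floor; flash scales down to tiny capacities.
def hdd_price(gb):
    return max(35.0, 0.10 * gb)   # case, motor, board dominate small drives

def flash_price(gb):
    return 1.00 * gb

for gb in (4, 16, 64, 350, 1000):
    print(f"{gb:5d} GB   hdd ${hdd_price(gb):7.2f}   flash ${flash_price(gb):7.2f}")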
Cost, overall, is gradually being whittled down; but, once all the low hanging super high margin products are picked off, there is going to be a point past which it simply isn't possible to exchange price for performance at any reasonable rate. Used and obsolete gear offers a partial solution(since it can be, and is, sold below the cost of production in many cases) but that only works if your needs are small enough to be fulfilled from the used market.
Performance Race is Shifting Towards Perf. / Watt (Score:4, Insightful)
Even if the majority of users begin to realize they have no practical use for top end CPUs with gobs of processing power, everyone still benefits from higher efficiency CPUs. It reduces electric bills, simplifies cooling systems, allows for smaller form factors, etc. I think power efficiency will become more important in the future as people start to care less about having the ultimate killer machine in terms of processing power. People are already performing actions on their mobile devices (iPhone, BlackBerry, etc.) which were possible only on a desktop in past years. The strict power requirements of these devices with tiny batteries will continue to demand improvements in CPU technology.
I'm waiting for the day when it is common to see completely passively cooled desktop computers, with solid state hard disks, no moving parts, sipping just a few watts of power without emitting a single sound.
What the world needs ... (Score:3, Insightful)
We more-or-less got enough computing power for most things with the introduction of the PIII 1GHz CPU. You might not agree with this, but it's at least approximately true. A computer outfitted with that processor and reasonable RAM browses the web just fine, plays MP3s, reads email, shows videos from YouTube, etc. It doesn't do everything that you might want, but it does a lot.
If we took the amazing technology that has been used to create the 3 GHz multi-core monsters with massive on-chip cache memory in a power budget of 45W or so in some cases, and applied it to a re-implementation of the lowly PIII, we'd win big. We'd get a PIII 1GHz burning a paltry few watts.
And this is precisely why chips like the Intel Atom have been so successful. Reasonable computing power for almost no electricity. We don't necessarily need just MORE-FASTER-BIGGER-STRONGER, which is the path Intel and AMD have historically put the most effort into following; we also need more efficient.
Re: (Score:3, Insightful)
Actually, what we need is a massively fast processor which can scale - quickly - to a "slow" processor like the PIII. Most of the time my systems are idle, and I'd be happy with them running at 400MHz if I could do it for a 90% savings in power. When I hit the gas, though, and want to load my iPod with music from my FLAC collection, doing on-the-fly transcoding, I want both (or all 4, or 8) cores running at 3+GHz, making my file transfer speeds the bottleneck. I don't care if I burn 150W-200W on the proces
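Linux already does a crude version of this with cpufreq governors; here's a minimal sketch for peeking at what a box is doing (Python; assumes the standard cpufreq sysfs files are present for cpu0):
# Read the current cpufreq governor and clock speeds on Linux.
base = "/sys/devices/system/cpu/cpu0/cpufreq/"

def read(name):
    with open(base + name) as f:
        return f.read().strip()

print("governor:", read("scaling_governor"))               # e.g. "ondemand"
print("current :", int(read("scaling_cur_freq")) // 1000, "MHz")
print("max     :", int(read("cpuinfo_max_freq")) // 1000, "MHz")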
Leveling off == very bad for Microsoft (Score:5, Interesting)
To make matters worse, without constant upgrades, Microsoft and ISV's can't count on new API's becoming widespread anytime soon, so they have to write applications for the lowest common denominator. This prevents Microsoft from forcing its latest agenda onto everyone -- and even worse, it could potentially provide the WINE team enough time to reach feature parity with Windows XP. (Spare me the lecture, have you tried WINE lately? It's surprisingly good these days.)
All in all, Microsoft is being forced to stand still in a place where they can't afford to. Commoditization sucks when you're a monopolist, doesn't it?
tail wagging dog (Score:3, Interesting)
> [windows 7 same as] Vista, but to run faster and use fewer resources. If so, it will be the first version of Windows that makes computers run faster than the previous version. That could be bad news for computer-makers, since users will be less inclined to upgrade only proving that Moore's law has not been repealed, but that more people are taking the dividend it provides in cash, rather than processor cycles.
I think this somewhat misses the point. People are less likely to buy new hardware in an economic downturn. It doesn't really have anything to do with whether the next version of Windows drives hardware sales, as previous versions have done.
If Windows 7 really "runs faster with fewer resources" than Vista, (I'm hopeful, but this won't be established until it's actually released) then it could be that Microsoft is recognizing the fact that they will get more over-the-counter purchases if they make it more likely to run on legacy hardware. Else, people will just stick with what they have. It's the economy, not Microsoft, that's the main driver.
I am actually hopeful that we've broken the mindless upgrade cycle. I'm sorry it took a recession to do it.
What has changed really? (Score:3, Interesting)
For chip manufacturers little will change, performance per watt and cost/die/wafer require the same thing: ever smaller transistors that use less power per iteration. It's the same thing. So in reality Moore's Observation is still iterating unchecked, it's just the end packaging that will be different.
Instead of dozens of billion-transistor multicore behemoths from a wafer, they will get hundreds of tiny cut-down processors with a lower transistor count.
Now, it's been shown that the latter is the more profitable approach.
Captain Obvious strikes again (Score:3, Insightful)
It should be obvious, shouldn't it? Our work environment of choice has been the desktop metaphor for about 20 years now. Today's computers are powerful enough to handle very luxurious desktop environments. I've basically replaced my very first PC - the first ever ATX bigtower casing, an InWin from 1996, that weighs around a metric ton - with a Mac Mini. 3D-wise I even think it's a downgrade, although I only have a GeForce 4200 Ti as my latest 3D card in there.
But, as others here have pointed out already, it consumes about a tenth of the power, makes almost no noise at all - even now I can barely hear it - and it is like 40 times as small. Meanwhile FOSSnix-based systems are only getting better without making computing skills obsolete, making it even more financially attractive to go for cheap and small.
The next performance race for most people will only take place after the standards for power consumption, size and noise have been raised and met. After that, regular computers will be heading for more power again. I presume that next league will stall after a decade again, when $200 computers the size of my external HDD have reached the power to render photorealistic motion graphics in real time.
Faster Windows, woo hoo. (Score:4, Informative)
> first version of Windows that makes computers run faster than the previous version.
So now it will only be 10x slower than Linux instead of 100x for the same operations.
Moore NEVER mentioned computing power (Score:3, Informative)
For goodness sake, Moore's law never specified anything to do with "computing power"!
Moore observed that typically the number of transistors doubled _on the lowest-price process_ around every 2 years.
At least the poster got something right: the cost of the process.
But, it's not a law AT ALL; it's a self-fulfilling prophecy! Manufacturers know the target they have to hit (Moore's!) and they do everything they can to hit it. Anything less would result in company failure.
What Moore's Law Really Says (Score:3, Insightful)
The observation behind Moore's Law doesn't say anything about performance. It's a projection that transistor density on a wafer doubles roughly every 18 months. Historically most companies, including Intel, have used this to make bigger dies for larger and more complex processors, but you can also use the improvements in transistor real estate to simply make smaller dies and hence more of them, increasing yield and bringing down the price. So bringing the price down isn't a departure from Moore's Law, it's just another way to use it.
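To make that concrete: the observation is just a doubling curve for transistor counts; what you spend the budget on is a separate decision. A tiny sketch (Python; the 1971 Intel 4004 figure is the commonly cited starting point, used here purely for illustration):
# The bare doubling curve: transistor count doubling every two years.
start_year, start_count = 1971, 2300
doubling_period = 2.0                      # years per doubling

def transistors(year):
    return start_count * 2 ** ((year - start_year) / doubling_period)

print(f"{transistors(2009):.2e} transistors")   # on the order of a billion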
Hmnn? (Score:3, Funny)
1. Take random article from news site
2. Somehow manage to make it justify a new slashdot story that includes a link to ooold blog promoting windows 7.
3. ?????
4. Profit / Win laptop ?
How is Vista Seven related to this at all? It didn't get faster by doing less... That article states clearly that it is just using a more responsive interface. I mean, come on!...
False choice (Score:3, Insightful)
The presentation of the false choice fallacy is that you must choose option a or option b. As far as I can tell, businesses want not only option a and option b, they also want "the same performance in less watts". And a number of other things.
By presenting the trend as a singular choice the author presents a false choice. What is actually happening is that the computing ecosystem is becoming more diverse. As we select from a richer menu, we are enabled to pursue our goals large and small with equipment that suits the application. It's a good thing.
Who's been bought off by MS? (Score:3, Insightful)
"If so, it will be the first version of Windows that makes computers run faster than the previous version."
This is the 2nd bit of falseness -- WinXP was faster than WinME.
Second -- WinXP is still quite a bit faster than Win7.
The article states that Win7 improves over Vista in areas where Windows was "OS-bound." However, it says there is NO IMPROVEMENT in Win7 for applications. It was applications that noticed a 10-15% performance hit in Vista vs. XP due to the overhead of the DRM'd drivers.
As near as I can tell from everything that leaked out about Vista before, during and after its development, MS added (not replaced) a whole new abstraction layer. They tried to make it transparent where they could, but this was the basic reason why nearly all drivers that actually touched hardware had to be rewritten -- the USER has to be completely isolated from the real hardware -- so DRM can detect hardware/software work-arounds or unauthorized modifications or "taps" into the unencrypted data stream.
This goes down to the level of being able to tell if something is plugged into a jack due to impedance changes -- if impedance or electrical specs don't match exactly with what the circuit is supposed to produce, the OS is supposed to assume tampering and mark the OS state as "compromised." Think of it as similar to the Linux kernel's tainted bit. Once it's set, it is supposed to be irreversible unless you reboot -- because the integrity of the kernel has been compromised. The DRM in Vista/Win7 was spec'ed to be similar but with finer-grained sensors -- so if anything is off -- a code path takes too long to execute, or circuits don't return their expected values -- it's to assume the box is insecure, so content can be disabled or downgraded at the content provider's option.
All this is still in Win7 -- the only difference is that all the drivers that were broken during the switch to Vista won't be re-broken -- so you won't get anywhere near the OEM flak and blog reports about incompatibilities -- those will all be buried with Vista -- along with your memories -- or so MS hopes. But MS has already made it clear that you won't be able to upgrade an XP machine to Win7 -- so they can control the experience from the start -- either by starting with corrupted Vista drivers or Win7 drivers -- take your pick. Both are designed to ensure your compliance, but more importantly, both cause performance degradation that everyone pays for by needing a bigger machine than they needed for XP.
The whole planet will be paying an excess carbon tax -- forever -- all to add in content-producer's demanded wish list.
This whole bit about the IT industry warming up to Win7 because it's not so bad compared to Vista just makes me want to puke. It's still corrupt and slow.
The government should require MS to open-source WinXP -- so it can be supported apart from MS -- which is obviously going for a "content-control" OS (like the next-gen Apples are slated to be). This will be the beginning of the end for non-commercial, open-source OS's or media boxes. It will be all pay-to-play -- just like Washington.
Re: (Score:2)
Moore's "Law" is an observation not a law
And you're arguing semantics, not actual facts. Ever heard of "gravity, it's not just 14ft/sec^2, it's the law"? Same usage.
I think what you are witnessing is consumers and businesses hurting because of the shrinking economy and a $250 netbook is looking mighty affordable to them.
...which is pretty much what the original article is stating: consumers want lower prices, not faster PCs.
This isn't going to stop any of the companies doing R&D to keep pace with Moore's observation.
It certainly will slow R&D as they lay off workers, so I challenge you on this point too.
Re: (Score:2)
And you're arguing semantics, not actual facts. Ever heard of "gravity, it's not just 14ft/sec^2, it's the law"? Same usage.
Not to be overly pedantic in this thread on semantics, but... 14ft/sec^2??
http://www.google.com/search?q=1+g+to+ft/s%5E2 [google.com]
Re:Bad Logic (Score:4, Interesting)
I guess that's what happens when you cut and paste computer science terms from an Economist article. In the next sentence, you state correctly that Moore's "Law" is an observation not a law! It's not that the computer industry (and I think we're only talking hardware here) follows this observation, it's that historically it has held true. No one's going to make a huge leap in R&D to be able to put 10x the number of transistors on a chip only to have engineers come down on them to stop it saying "no one has ever broken Moore's Law and we're not going to start now!" That idea is preposterous. We're limited by our own technology that happens to follow an ok model, it's not a choice!
Yes, that's all true, but if you don't think chip makers throw up graphs with a curve on them for Moore's Law and use that as a guideline for where they should be in the future, which could be called "following"... you're mistaken. Obviously if the observation continues to hold true, that's only because of the advances in R&D that produce new technology. However those advances come as a result of choices, like how much and what kind of R&D to do, and those choices are themselves driven in part by Moore's Law.
Now as far as going faster and getting 10x more transistors on a chip, sure, that's not much of a choice. That's because the industry is already busting its ass to maintain the current exponential trend. For that very reason I'd never take the phrase "following Moore's Law" to mean intentionally limiting technology advancement. Au contraire, if anything I take it to mean we're "following" in the sense that you'd be "following" Usain Bolt in the 200m dash -- if you're anywhere near keeping up, you're a bad ass. The only motivation would be to drop off that pace.
Which, to some extent, we've already seen in the 00's. It's still exponential growth, but the time factor has increased somewhat. I can't remember the data I saw, but it appeared to have gone from a doubling every 18 months to 24?
By the way, I agree the examples are pretty poor. For virtualization you want the newest beefiest processor with the best hardware support for virtualization you can get. The whole idea is that you want a single machine to appear as though it is a plethora of machines each with enough horsepower to do whatever that specific machine needs to do. This is the opposite of just wanting to do the same thing cheaper, it's wanting to do the same thing times a plethora, so you need a machine that is at least one plethora times as powerful. Being cheaper overall is just a desirable side effect. I hope you agree that "plethora" is a great word.
Re: (Score:2)
Yes, that's all true, but if you don't think chip makers throw up graphs with a curve on them for Moore's Law and use that as a guideline for where they should be in the future, which could be called "following"... you're mistaken. Obviously if the observation continues to hold true, that's only because of the advances in R&D that produce new technology. However those advances come as a result of choices, like how much and what kind of R&D to do, and those choices are themselves driven in part by Moore's Law.
Precisely. Those choices are key and the Moore's Law expectation absolutely has to factor into it somewhere. My own opinion is that it sets the limit for where they can stop and relax their efforts, internally.
That's because the industry is already busting its ass to maintain the current exponential trend. For that very reason I'd never take the phrase "following Moore's Law" to mean intentionally limiting technology advancement.
Given the amount of secrecy in this industry, I'm not certain how you can back this statement up with any fact. My own assumption is that 'they' have developed technology far more capable than what they currently claim to be working on at any given time. I personally believe that what they claim i
Re:Bad Logic (Score:5, Informative)
My own opinion is that it sets the limit for where they can stop and relax their efforts, internally.
They can never stop and relax. They're chasing an exponential growth curve.
Given the amount of secrecy in this industry, I'm not certain how you can back this statement up with any fact. My own assumption is that 'they' have developed technology far more capable than what they currently claim to be working on at any given time. I personally believe that what they claim is on the drawing board is actually in prototype, what they claim to be in dev is actually ready for production, and their 'latest and greatest' is already old tech.
I've worked at several processor companies with top-of-the-line fab tech, including Mr. Moore's. While my NDAs probably mean I can't tell you anything specific with regard to scheduling, I can tell you without fear of revealing any secrets that you're way off base. They are not sand-bagging with more advanced tech waiting in the wings.
The only sense in which you are correct is that the 'latest and greatest' thing you can buy is old tech relative to things then under development. That's because there's typically a year give or take (usually give) between receiving the first silicon from the fab in the transistor node the product was designed for and all the validation, bug fixes, and spins on the product before it's ready to be sold. That means the fab tech has to be done and mostly stable by the time you start this process, so go roughly six months back before that where they're making test chips in the new fab to make sure it's working. And development of that fab tech before it's ready to run its first test chip wafer is two or more years before that, with R&D going on for years before that.
So yeah, when you could buy a 65nm CPU in the store, there may have been a 45nm CPU or just a test chip coming out of a fab somewhere, and a 32nm lithography machine being developed somewhere else, and a lab somewhere working out how 22nm lithography would work. But that's not 'sandbagging' because all of those things were years of serious non-stop development away from becoming products! Keeping on the exponential growth curve means that there has to be a constant pipeline of developments, and this pipeline is quite long.
And believe me, if they could increase the rate at which those future techs become available for making product, they would. "Sandbagging" means wasting competitive advantage, and wasting money. The machinery in the fabs for each node cost billions of dollars, and they depreciate rapidly. If they had some new tech working flawlessly, but weren't using it in products and just waiting in the wings, they'd be flushing hundreds of millions down the toilet. Time to market is one of the most important things they look at.
Honestly, if you look at actual press releases and actual product launches, it's much more likely that what they claim is a prototype is really on the drawing board, and what they claim to be ready for production is really still in development -- see the AMD Barcelona for the most recent example. You think they had the Phenom II just waiting in the wings while they got beat up in the press and the market over the launch of Phenom?
Now this isn't to say that they wouldn't sandbag if it were possible, and to some minor extent they have. When K7 had a big leg up over P3 in frequency headroom, or Core 2 vs aging K8s, sure they held back a little to get more margin on a cheaper part. But we're talking a speed grade delayed by a month or two. Barely noticeable noise on the curve. Actually tracking that curve requires non-stop expenditures and execution of R&D, and any significant slip-up could send a company flat on its face. To slow development on purpose? Ridiculous.
In my mind, this is the only way to sustain this curve - by limiting the release of new technology onto the market until Moore's says that it is time for it.
Think about it in terms of l
Re: (Score:2)
I think what you are witnessing is consumers and businesses hurting because of the shrinking economy and a $250 netbook is looking mighty affordable to them.
Even more affordable: keep using the same computer I've been using up until now, even if it's four or five years old at this point, for a cost of $0. It's still 'fast enough' for almost everything I'd want to do, and if I start to run out of disk space, I can buy external USB storage for $100 per terabyte.
Re: (Score:2)
"Vista makes your computer run faster?"
I think you misread the article. They claim Windows 7 is going to be the first Windows version that is faster than its predecessor (in this case Vista) on the same hardware.
Re: (Score:2)
Wasn't NT4 faster than NT 3.51? Or at least, it had lower memory requirements.
Re: (Score:2)
The most popular SAAS application is webmail. People use that because it saves the hassle of setting up an email client, and they can use it from any computer by just firing up a web browser.
I personally don't like it because I like to have my data stored on my own computer, although I do have webmail software installed on my own computer so I can read my emails when outside.
Re:Bad Logic (Score:5, Insightful)
Yes, and as so many have pointed out, their history of doing so is now backfiring on them in a big way. And it's not just with Vista, it's with Office as well.
Case in point - several months ago my department bought upgrade licenses to Office 2008. I was perfectly happy with Office 2004, but I installed Office 2008 because I knew that if I didn't, I wouldn't be able to read whatever new formats that Office 2008 supported. It had happened with every other Office upgrade cycle in my experience - you either upgraded or you'd be unable to exchange documents with your peers.
But something funny happened this time - I have yet to receive a .docx, .xlsx, or .pptx file from anyone. I have quite consciously chosen to save every document in .doc, .xls, or .ppt "compatibility" format. Everybody I talk to says they're doing exactly the same thing. Everyone now knows the game that Microsoft plays, and no one is willing to play it anymore. I could have stayed with Office 2004 and never noticed the difference. So what motivation will I have to upgrade to the next version of Office?
If it weren't for Microsoft's OEM licensing deals, Vista would have a tiny fraction of its current market share. XP is "good enough". But Microsoft doesn't push Office onto new machines the way it does Windows, the older Office formats are also "good enough", and you have open source alternatives like OpenOffice if Microsoft tries to deliberately break Office compatibility on the next version. I fully expect Microsoft's Office revenues to take a steep dive in the next few years. The Vista debacle is only the beginning.
Re:Bad Logic (Score:4, Informative)
If you only installed Office 2008 for the new file formats, you can wipe it and go back to whatever ancient version you were using, as there are updates available which add support for the new xml based format. Obviously the old binary format must be so much that I'm not sure what my point was any more.
Re: (Score:3, Insightful)
I'm not even going to get into the number of FOSS-based companies you leave in
Re: (Score:3, Interesting)
The Open Office XML format (.docx, .pptx, .xlsx, etc)
That's the "Office Open XML format". MS was trying to create confusion with OpenOffice.org's format, OpenDocument Format, when they named it but they weren't quite blatant enough about it to call it "Open Office XML".
I'm not even going to get into the number of FOSS-based companies you leave in the cold by hanging onto .doc and the proprietary document format that it represents instead of using the freely available OOXML specification.
I wonder how well F/LOSS suites handle MSOOXML at this point. They seem to handle the old proprietary formats just fine, and given the nature of the freely-available MSOOXML spec, it's not unlikely that they haven't implemented all of it yet.
Re: (Score:2)
[...]If so, it will be the first version of Windows that makes computers run faster than the previous version.
Aside from how ridiculous that statement sounds to me ("Vista makes your computer run faster?")
Of course ${OS} ${VERSION} won't bump your CPU cycle frequency or increase your cache size.
But if one OS performs tasks in less time than another, if one thinks of "the computer" as an (OS, hardware) bundle, it does make the computer
Re:Bad Logic (Score:5, Informative)
Um, no it wasn't. "Moore's law" is a term that was coined after Gordon Moore gave a presentation showing that the company was managing to double transistor density roughly every year. This observation created an interesting problem for the company. What should they do with all those extra transistors?
One option was that they could keep getting higher yields on existing chips, eventually driving the cost per unit to mere fractions of a penny. The other option was that Intel could do something useful with all that extra circuitry and maintain higher prices.
Considering that contemporary CPUs of the time were barely more powerful than the interrupt controller sitting next to them, using that silicon for sophisticated 32bit processors with on-die floating point units and SIMD instructions seemed like a no-brainer for the company. Thus as each successive generation of technology has made CPUs smaller, Intel has used the extra space to add more features and more optimizations.
At this point, things are getting a bit ridiculous. CPU manufacturers have so much extra space on which to work that they can fit 2-4 CPU cores on a single die and STILL produce a smaller chip than the last generation.
Re:Bad Logic (Score:5, Informative)
And now I'm going to do something shocking and unprecedented. I'm going to look up the actual quote, instead of guessing what Moore's "Law" means.
"April 1965:
"The complexity for minimum component costs has increased at a rate of roughly a factor of two per year ..." Notice he claimed *complexity* not power doubled, and that it happened EVERY year. His original statement has not held true.
Re: (Score:2)
Translation: The number of transistors doubles.
I said nothing about "power". Those are your words, not mine.
Re:Bad Logic (Score:4, Informative)
Actually, it's called Moore's law because he plotted it in his 1965 paper while at Fairchild Semiconductor.
specifically:
"The complexity for minimum component costs has in
creased at a rate of roughly a factor of two per year (see graph on next page). Certainly over the short term this rate
can be expected to continue, if not to increase. Over the
longer term, the rate of increase is a bit more uncertain, although there is no reason to believe it will not remain nearly
constant for at least 10 years. That means by 1975, the number of components per integrated circuit for minimum cost
will be 65,000.
I believe that such a large circuit can be built on a single
wafer."
"CPU manufacturers have so much extra space on which to work that they can fit 2-4 CPU cores on a single die and STILL produce a smaller chip than the last generation."
Either you put that poorly, or you have no idea how a fab works.
There is no extra space.
Re: (Score:2)
One might even interpret it as a self fulfilling prophecy.
Why sell chips that are 10x as fast, when you can sell chips that are 2x as fast, then sell the same people new chips that are 2x as fast, repeatedly.
It almost seems like a cartel engaged in price fixing. I expect that time will reveal that is what it always has been...
Re:Bad Logic (Score:4, Insightful)
Not really, especially in the days when you had Intel and AMD racing to be the producer of the fastest chip.
Re: (Score:2)
What a wonderful thing, being able to come to one place and read all about wonderful microsoft...just wonderful!
Forewarned is forearmed, as they say. Still, at least we're not discussing Apple imploding into Steve Jobs' digestive system again, or gaining vital insight into which brand of deodorant* Linus Torvalds uses.
*Real Linux developers don't use deodorant, so that'd be silly.
Re: (Score:2)