Less Is Moore

Hugh Pickens writes "For years, the computer industry has made steady progress by following Moore's law, derived from an observation made in 1965 by Gordon Moore that the amount of computing power available at a particular price doubles every 18 months. The Economist reports, however, that in the midst of a recession, many companies would now prefer that computers get cheaper rather than more powerful, or, applying the flip side of Moore's law, do the same for less. A good example of this is virtualisation: using software to divide up a single server computer so that it can do the work of several and is cheaper to run. Another example of 'good enough' computing is supplying 'software as a service' via the Web, as done by Salesforce.com, NetSuite and Google, sacrificing the bells and whistles offered by conventional software that hardly anyone uses anyway. Even Microsoft is jumping on the bandwagon: the next version of Windows is intended to do the same as the last version, Vista, but to run faster and use fewer resources. If so, it will be the first version of Windows that makes computers run faster than the previous version. That could be bad news for computer-makers, since users will be less inclined to upgrade — only proving that Moore's law has not been repealed, but that more people are taking the dividend it provides in cash, rather than processor cycles."
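For concreteness, here is a back-of-the-envelope sketch of the doubling arithmetic the summary cites. The 18-month period is the figure quoted above; the script and the sample horizons are purely illustrative.

```python
# Compounding under the popular form of Moore's law quoted above:
# computing power per dollar doubles every 18 months. Illustrative only.

DOUBLING_PERIOD_MONTHS = 18

def power_multiplier(years: float) -> float:
    """How much more computing power the same money buys after `years` years."""
    doublings = years * 12 / DOUBLING_PERIOD_MONTHS
    return 2 ** doublings

for years in (1.5, 3, 6, 10):
    m = power_multiplier(years)
    print(f"after {years:>4} years: {m:7.1f}x the power for the same price, "
          f"or the same power for roughly 1/{m:.0f} of the price")
```

The "dividend in cash" the submitter mentions is the reciprocal of the same multiplier: the same workload bought at a fraction of the price, rather than more cycles at the same price.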
  • Economics (Score:2, Interesting)

    by Mephistophocles ( 930357 ) on Wednesday January 28, 2009 @03:06PM (#26642559) Homepage
    Proof that Moore's law is driven by economics as much as (or even more than) technological discovery/innovation?
  • Or... (Score:3, Interesting)

    by BobMcD ( 601576 ) on Wednesday January 28, 2009 @03:11PM (#26642631)

    That could be bad news for computer-makers, since users will be less inclined to upgrade — only proving that Moore's law has not been repealed, but that more people are taking the dividend it provides in cash, rather than processor cycles.

    Or, it could be good news for them. Especially in light of things like the "Vista Capable" brouhaha, and the impact Vista had on RAM prices when fewer than the projected number of consumers ran out to buy upgrades.

    Maybe Intel and NVidia are going to be wearing the sadface, but I'm willing to wager HP and the like are almost giddy with the thought of not having to retool their production lines yet again. They get to slap on a shiny new OS and can keep the same price point on last year's hardware.

    Some corporations buy new hardware simply to keep it 'fresh' and less prone to failure. My own company has recycled a number of Pentium 4 machines that are still quite capable of running XP and Internet Explorer. With the costs of new desktop hardware at an all-time low for us, we get to paint a pretty picture about ROI, depreciation, budgets, and the like.

  • This is nothing new (Score:5, Interesting)

    by georgewilliamherbert ( 211790 ) on Wednesday January 28, 2009 @03:12PM (#26642665)

    Some of you may remember the 1980s and early 1990s, when PCs started out costing $5,000 and declined slowly to around $2,500 for name-brand models.

    Around 1995, CPUs exceeded the GUI requirements of all the apps then popular (this is pre-modern gaming, of course). Around 1996 and into 1997 the prices of PCs fell off a cliff, down to $1,000.

    Those who fail to remember history...

  • by Tenek ( 738297 ) on Wednesday January 28, 2009 @03:16PM (#26642721)

    sacrificing the bells and whistles that are offered by conventional software that hardly anyone uses anyway

    I think if you took out all the features that 'hardly anyone uses' you wouldn't have much of a product left. Bloatware and the 80/20 Myth [joelonsoftware.com]

  • by Jason Levine ( 196982 ) on Wednesday January 28, 2009 @03:18PM (#26642741) Homepage

    Years back, when everyone in the mainstream was trotting out how many MHz/GHz their processors ran at and how their Latest And Greatest system was *soooo* much better, I insisted that the computer industry had a dirty little secret: mid- to low-end computers would work just fine for 90% of users out there. Computer makers didn't want people knowing this and instead hoped that they would be convinced to upgrade every 2 or 3 years. Eventually, though, people learned that they were able to read their e-mail, browse the web, and work on documents without buying a system with a bleeding-edge processor and maxed-out specs. This seems like the continuation of the secret's collapse. People are realizing that not only do they not need to buy a system with a blazing-fast processor just to send out e-mail, but they don't need to buy 10 different servers when one powerful (but possibly still not bleeding-edge) server can run 10 virtual server instances.

  • Re:Bad Logic (Score:4, Interesting)

    by Chris Burke ( 6130 ) on Wednesday January 28, 2009 @03:26PM (#26642885) Homepage

    I guess that's what happens when you cut and paste computer science terms from an Economist article. In the next sentence, you state correctly that Moore's "Law" is an observation not a law! It's not that the computer industry (and I think we're only talking hardware here) follows this observation, it's that historically it has held true. No one's going to make a huge leap in R&D to be able to put 10x the number of transistors on a chip only to have engineers come down on them to stop it saying "no one has ever broken Moore's Law and we're not going to start now!" That idea is preposterous. We're limited by our own technology that happens to follow an ok model, it's not a choice!

    Yes, that's all true, but if you don't think chip makers throw up graphs with a curve on them for Moore's Law and use that as a guideline for where they should be in the future, which could be called "following"... you're mistaken. Obviously if the observation continues to hold true, that's only because of the advances in R&D that produce new technology. However those advances come as a result of choices, like how much and what kind of R&D to do, and those choices are themselves driven in part by Moore's Law.

    Now as far as going faster and getting 10x more transistors on a chip, sure, that's not much of a choice. That's because the industry is already busting its ass to maintain the current exponential trend. For that very reason I'd never take the phrase "following Moore's Law" to mean intentionally limiting technology advancement. Au contraire, if anything I take it to mean we're "following" in the sense that you'd be "following" Usain Bolt in the 200m dash -- if you're anywhere near keeping up, you're a badass. The only choice would be to drop off that pace.

    Which, to some extent, we've already seen in the '00s. It's still exponential growth, but the time factor has increased somewhat. I can't remember the data I saw, but it appeared to have gone from a doubling every 18 months to one every 24?

    By the way, I agree the examples are pretty poor. For virtualization you want the newest beefiest processor with the best hardware support for virtualization you can get. The whole idea is that you want a single machine to appear as though it is a plethora of machines each with enough horsepower to do whatever that specific machine needs to do. This is the opposite of just wanting to do the same thing cheaper, it's wanting to do the same thing times a plethora, so you need a machine that is at least one plethora times as powerful. Being cheaper overall is just a desirable side effect. I hope you agree that "plethora" is a great word.
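    A rough sketch of that sizing arithmetic, with entirely made-up per-guest requirements, just to show that the consolidated host has to cover the sum of its guests plus some hypervisor overhead:

```python
# Toy consolidation estimate. Every figure here is hypothetical.

guests = [
    {"name": "mail",  "cores": 2, "ram_gb": 4},
    {"name": "web",   "cores": 2, "ram_gb": 4},
    {"name": "db",    "cores": 4, "ram_gb": 16},
    {"name": "build", "cores": 8, "ram_gb": 8},
]

OVERHEAD = 1.15  # assume ~15% hypervisor overhead; a guess, not a measurement

host_cores = sum(g["cores"] for g in guests) * OVERHEAD
host_ram_gb = sum(g["ram_gb"] for g in guests) * OVERHEAD

print(f"{len(guests)} guests -> host needs roughly "
      f"{host_cores:.0f} cores and {host_ram_gb:.0f} GB of RAM")
```

    The saving comes from buying and running one big box instead of several small ones, not from needing less total horsepower.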

  • by zappepcs ( 820751 ) on Wednesday January 28, 2009 @03:31PM (#26642947) Journal

    Let's be honest here. What would we like the average office PC to be doing? If they are beefy enough to run a grid on, and so also handle data retention, de-duplication, HPC functions, and many other things, then yes, having faster-better-more on the desktop at work could be interestingly useful. Software is needed to use hardware that way, meh.

    There will never be a time in the foreseeable future when beefier hardware will not be met with requirements for its use. Call that Z's corollary to Moore's Observation.... if you want.

  • Hmm... (Score:4, Interesting)

    by kabocox ( 199019 ) on Wednesday January 28, 2009 @03:32PM (#26642971)

    I can buy a $350 mini laptop, a $500 decently specced laptop, or a $500 desktop with what would have been unbelievable specs not long ago. I remember when I picked up Computer Shopper and was thrilled that there were any bare-bones desktops that sold at the $1K mark. Now you can get full-featured systems for under $0.5K that do things that $2-3K machines couldn't do.

    Really, there is no such thing as a "Moore's Law." It's Moore's trend lines that have been holding. That it lasted 10 years, much less this long, has been utterly amazing. I fully expect us to run into problems keeping up with "Moore's Law" before 2100. 5-10 years after the trend is broken, future folks will either forget about it entirely or look back and kinda giggle at us like we were just silly about it all. 50-100 years later no one will care, though everyone will be making use of the by-products of it. Do you notice where the stuff for roads comes from, or which Roman engineer built the most or best roads? That's generally what they'll think of any computing device older than 20 years. If Moore's law holds until 2050, every computing device that we've currently made will be either trash or a museum piece by that time. Heck, you have people getting rid of or upgrading cell phones almost every 3-6 months already.

    We imagine replicators in Star Trek, but we don't need them with Walmart and new products coming out every 3-6 months. Consider Amazon + UPS next-day shipping. Replicator tech would have to be cheaper and faster than that to compete. I think it's more likely that we'll keep on improving our current tech. What happens when UPS can do 1-hour delivery to most places on the globe? Replicators might spring up, but only for designers to use them to spend a week making 10K units of something that goes on sale today, sells out in two weeks, and is discounted the week after. Face it; we are already living in a magical golden age. We just want it to be 1000x better in 50 years.

  • by jawtheshark ( 198669 ) * <{moc.krahsehtwaj} {ta} {todhsals}> on Wednesday January 28, 2009 @03:36PM (#26643043) Homepage Journal

    Adjust those numbers for inflation... Or better, retro-adjust current prices to 1980 prices.

    I do remember falling prices in the nineties, but now a PC is pretty close to an impulse buy. For me in 2000, a 2000€ PC was already an impulse buy (that said, I was single, a successful consultant with a brand-new sports car, so my "impulses" were a bit off). These days an EEE PC is an impulse buy for anyone who loves toys and has a bit of spare money.

    This is not a repeat of the previous price-falls, this is the computer becoming a throw-away consumer item like a toaster. (Running NetBSD obviously ;-) )

  • More is More (Score:5, Interesting)

    by Archangel Michael ( 180766 ) on Wednesday January 28, 2009 @03:43PM (#26643157) Journal

    One of the things I learned many years ago is that the value of computing speed isn't a function of how fast something runs. Rather, it is a matter of whether or not you actually run something at all.

    If computer speeds are twice as fast, and it currently takes you ten seconds to accomplish Task A, and a new computer will allow you to accomplish that same task in 5 seconds .... getting a new computer is not that big of a deal.

    However, if you run Task B, which takes 1.5 hours to complete, and a new computer will run that same task in say 4 minutes (Real world example from my past, log processing), the difference isn't necessarily the 86 minute difference, but rather if and how often you actually run that task.

    It is endlessly amusing to see "real world benchmarks" that run in 3 minutes for most processors, separated by less than 2x. Or frames per second. Or... whatever.

    When shit takes too long to get done, you tend NOT to do it. If the difference is a few seconds, that is nice and all, and a few seconds may be of interest to "extreme" hobbyists.

    But the Real World differences that matter are not marginal decreases from 300 to 279 seconds. Sorry, but those extra few seconds aren't going to prevent you from running that task.

    The true measure is not how fast something gets done, but whether you do the task at all, because if the time involved is prohibitive, you won't.
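    To put rough numbers on that, here is a throwaway sketch using the task times from this comment (10 s vs 5 s, and 1.5 hours vs 4 minutes); the runs-per-week figures are invented for illustration.

```python
# Does a speed-up change whether you run a task at all? Task times are the
# comment's examples; the runs-per-week counts are made up.

tasks = [
    # (name, old seconds, new seconds, runs per week -- hypothetical)
    ("Task A (small job)",      10,      5,      50),
    ("Task B (log processing)", 90 * 60, 4 * 60,  5),
]

for name, old_s, new_s, runs in tasks:
    saved_min_per_week = (old_s - new_s) * runs / 60
    print(f"{name}: saves {old_s - new_s} s per run, "
          f"about {saved_min_per_week:.0f} min/week at {runs} runs/week")
```

    The per-run saving on Task A never changes whether you bother running it; cutting Task B from an hour and a half to four minutes is what changes behaviour.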

  • by Jason Earl ( 1894 ) on Wednesday January 28, 2009 @04:09PM (#26643601) Homepage Journal

    Still, will that processing have to be on the client? Probably not. Heck, I don't pretend to be a computer science Einstein, but I wouldn't be at all surprised if the limiting factor in functional AI is access to piles and piles of centralized data ("piles" is a technical CS term that means "a lot"). I don't need a fancy computer to access Google, and when Google is finally self-aware I don't suppose I'll need more than a web browser to talk to it either.

    It's pretty certain that computers are going to get faster, but, at least for now, what I really want is for my computers to become less expensive to purchase and to run. Google can buy faster computers if it wants. I'll settle for one that uses less electricity or that comes as a prize in a Cracker Jacks (tm) box.

    Then again, I spend most of my day in either a web browser or Emacs, and neither of those applications is in dire need of a fast processor.

  • Re:Sigh (Score:3, Interesting)

    by melonman ( 608440 ) on Wednesday January 28, 2009 @04:26PM (#26643873) Journal

    In some markets it simply hasn't been possible to buy a 2 year-old spec until recently. In particular, laptops used to vanish at the point at which they made sense for someone who just wanted a portable screen and keyboard with which to take notes and maybe enter some code. The only way to get a laptop much cheaper than the premium spec was to go s/h, and s/h laptops have never been that attractive an option.

    Machines with the relative spec of NetBooks would have sold fine a decade ago. It's just that the laptop-manufacturing cartel refused to sell them to us. NetBooks have broken that cartel, and I'm expecting the shock waves to shake up prices a lot further up the spec rankings. That's what really worries companies like Sony - a $2000 Vaio becomes harder to justify the more the price of the basic product drops, and the more people demonstrate that the basic product is good enough for many tasks.

  • Re:Bad Logic (Score:3, Interesting)

    by swillden ( 191260 ) <shawn-ds@willden.org> on Wednesday January 28, 2009 @04:39PM (#26644083) Journal

    The Open Office XML format (.docx, .pptx, .xlsx, etc)

    That's the "Office Open XML format". MS was trying to create confusion with OpenOffice.org's format, OpenDocument Format, when they named it but they weren't quite blatant enough about it to call it "Open Office XML".

    I'm not even going to get into the number of FOSS-based companies you leave in the cold by hanging onto .doc and the proprietary document format that it represents instead of using the freely available OOXML specification.

    I wonder how well F/LOSS suites handle MSOOXML at this point. They seem to handle the old proprietary formats just fine, and given the nature of the freely-available MSOOXML spec, it's not unlikely that they haven't implemented all of it yet.

  • by IGnatius T Foobar ( 4328 ) on Wednesday January 28, 2009 @04:39PM (#26644087) Homepage Journal
    This is dangerous territory for Microsoft to be in. Levelling off of computer power means that buyers are getting off the upgrade treadmill -- they're not buying new computers every couple of years. Preloads on new computers are where Microsoft makes the bulk of their Windows sales.

    To make matters worse, without constant upgrades, Microsoft and ISVs can't count on new APIs becoming widespread anytime soon, so they have to write applications for the lowest common denominator. This prevents Microsoft from forcing its latest agenda onto everyone -- and even worse, it could potentially provide the WINE team with enough time to reach feature parity with Windows XP. (Spare me the lecture, have you tried WINE lately? It's surprisingly good these days.)

    All in all, Microsoft is being forced to stand still in a place where they can't afford to. Commoditization sucks when you're a monopolist, doesn't it?
  • tail wagging dog (Score:3, Interesting)

    by roc97007 ( 608802 ) on Wednesday January 28, 2009 @04:48PM (#26644255) Journal

    > [windows 7 same as] Vista, but to run faster and use fewer resources. If so, it will be the first version of Windows that makes computers run faster than the previous version. That could be bad news for computer-makers, since users will be less inclined to upgrade — only proving that Moore's law has not been repealed, but that more people are taking the dividend it provides in cash, rather than processor cycles.

    I think this somewhat misses the point. People are less likely to buy new hardware in an economic downturn. It doesn't really have anything to do with whether the next version of Windows drives hardware sales, as previous versions have done.

    If Windows 7 really "runs faster with fewer resources" than Vista (I'm hopeful, but this won't be established until it's actually released), then it could be that Microsoft is recognizing the fact that they will get more over-the-counter purchases if they make it more likely to run on legacy hardware. Otherwise, people will just stick with what they have. It's the economy, not Microsoft, that's the main driver.

    I am actually hopeful that we've broken the mindless upgrade cycle. I'm sorry it took a recession to do it.

  • by w0mprat ( 1317953 ) on Wednesday January 28, 2009 @05:05PM (#26644559)
    I agree that the shift is towards smaller, cheaper, more energy-efficient and ubiquitous computers. However, it doesn't necessarily follow that computers won't get faster, or that software won't appear to make use of the extra speed. Moore's inaccurately named Law is still holding true at the bleeding edge, driven by gaming, content creation and research. What we are getting is a growing gap between the lowest end and the highest end. The high end will become a smaller slice of revenue, for sure.

    For chip manufacturers little will change: performance per watt and cost per die and per wafer require the same thing, ever smaller transistors that use less power with each iteration. So in reality Moore's Observation is still iterating unchecked; it's just the end packaging that will be different.

    Instead of dozens of billion-transistor multicore behemoths from a wafer, they will get hundreds of tiny cut-down processors with a lower transistor count.

    Now, it's been shown that the latter is a more profitable approach.
  • by gbjbaanb ( 229885 ) on Wednesday January 28, 2009 @06:23PM (#26645763)

    2) and that you don't need teraflops of CPU/GPU power just to draw greasepaper-style borders around your Microsoft Word windows

    You're dead right there. I always wondered why I could play games a few years ago that had some stunning graphics, yet ran very well on a 900MHz PC with 256MB of RAM; yet Vista needs 1GB and a graphics card that's better than that old spec just to draw a poxy bit of transparent titlebar.

    I'd blame the managed .NET stuff, but the WDDM is native! Makes the inefficiency even worse.

  • by bit01 ( 644603 ) on Wednesday January 28, 2009 @06:41PM (#26645999)

    Maybe people will realize what an obscene waste of money and computing power an operating system like Windows Vista, which requires a gig of RAM to run, really is.

    Hear, hear. To a lesser extent the same is true of all modern GUIs. Most GUI programmers today seem to have no clue how to write efficient, usable software. I am still waiting for GUI software that responds fast enough. I'm an experienced computer programmer/user and I'm endlessly irritated by the slow response of computers that are supposed to be operating in the GHz range.

    Take gedit, the gnome notepad-like editor, as an example. On startup this opens almost 3000 files ("strace -t -f -o t.log gedit .bashrc;grep 'open(' t.log|wc") to edit a single (1) text file. That's insane.

    It takes more than a second. That is way too slow and is the main reason why I still often use vi and nedit. Vi opens 58 files on startup. That's still too many, but it's orders of magnitude faster than gedit and most GUI editors. nedit is an ancient GUI editor with decent functionality, including rectangular copies, that opens ~200 files on startup.

    Incidentally, I'm well aware that file opening is only one factor amongst many that slow programs down. It's a good proxy for measuring poor programming practices, though.
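    A minimal sketch of that kind of measurement, assuming a Linux system with strace installed and the editors on the PATH; the editor list and the use of --version (so each program exits on its own) are just illustrative choices.

```python
# Count open()/openat() syscalls at program startup, the same idea as the
# strace one-liner above. Assumes Linux with strace installed.
import os
import re
import subprocess
import tempfile

def count_opens(command):
    """Run `command` under strace and count open/openat calls across all processes."""
    with tempfile.TemporaryDirectory() as tmp:
        log_path = os.path.join(tmp, "strace.log")
        subprocess.run(
            ["strace", "-f", "-e", "trace=open,openat", "-o", log_path, *command],
            stdout=subprocess.DEVNULL,
            stderr=subprocess.DEVNULL,
        )
        with open(log_path) as log:
            return len(re.findall(r"\bopen(?:at)?\(", log.read()))

# Hypothetical comparison; exact counts will vary by distribution and configuration.
for editor in (["vi", "--version"], ["nano", "--version"], ["gedit", "--version"]):
    print(editor[0], count_opens(editor))
```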

    GUI programmers need to understand that for experienced computer users, speed of response is a major factor in the usability of any program. A GUI with lots of prettiness is useless if it's not fast. And fast is not seconds. It is milliseconds.

    ---

    Don't be a programmer-bureaucrat; someone who substitutes marketing buzzwords and software bloat for verifiable improvements.
