
Less Is Moore (342 comments)

Hugh Pickens writes "For years, the computer industry has made steady progress by following Moore's law, derived from an observation made in 1965 by Gordon Moore that the amount of computing power available at a particular price doubles every 18 months. The Economist reports, however, that in the midst of a recession, many companies would now prefer that computers get cheaper rather than more powerful, or, applying the flip side of Moore's law, do the same for less. A good example of this is virtualisation: using software to divide up a single server computer so that it can do the work of several, and is cheaper to run. Another example of 'good enough' computing is supplying 'software as a service' via the Web, as done by Salesforce.com, NetSuite and Google, sacrificing the bells and whistles that conventional software offers but hardly anyone uses anyway. Even Microsoft is jumping on the bandwagon: the next version of Windows is intended to do the same as the last version, Vista, but to run faster and use fewer resources. If so, it will be the first version of Windows that makes computers run faster than the previous version. That could be bad news for computer-makers, since users will be less inclined to upgrade — proving not that Moore's law has been repealed, but that more people are taking the dividend it provides in cash rather than in processor cycles."
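
As a back-of-the-envelope illustration of that "dividend", here is a minimal sketch, assuming the summary's 18-month doubling period (Moore's own 1975 revision was two years):

    # Sketch: two ways to take the Moore's-law dividend. Assumes the
    # summary's 18-month doubling period, not Moore's 1975 two-year figure.

    DOUBLING_PERIOD_YEARS = 1.5

    def performance_multiple(years):
        """Relative performance available at a fixed price after `years`."""
        return 2 ** (years / DOUBLING_PERIOD_YEARS)

    def price_multiple(years):
        """Relative price of a fixed level of performance after `years`."""
        return 1 / performance_multiple(years)

    for years in (1.5, 3.0, 6.0):
        print(f"after {years} years: {performance_multiple(years):.0f}x the performance "
              f"at the same price, or the same performance at "
              f"{price_multiple(years):.0%} of the price")
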
  • Let's see (Score:5, Funny)

    by Rik Sweeney ( 471717 ) on Wednesday January 28, 2009 @03:05PM (#26642547) Homepage

    Less: 120884 bytes
    More: 27752 bytes

    Wow, that's right!
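
    (If you want to reproduce the comparison, a quick sketch; install paths and byte counts vary by system, so the numbers above are one machine's snapshot:)

        # Compare the on-disk sizes of less(1) and more(1).
        # Paths and byte counts vary by system.
        import os
        import shutil

        for prog in ("less", "more"):
            path = shutil.which(prog)
            if path:
                print(f"{prog} ({path}): {os.path.getsize(path)} bytes")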

  • Economics (Score:2, Interesting)

    Proof that Moore's law is driven by economics as much as (or even more than) technological discovery/innovation?
    • Proof that Moore's law is driven by economics as much as (or even more than) technological discovery/innovation?

      You have a good point; this could be a test of your hypothesis. To a company or government, the purchase of a computer is frequently considered a "capital" purchase, even though, over time, the cost of computing is dominated by the operating costs of software, power, upgrades and IT.

      However, since capital is usually scarce in organizations, it tends to drive acquisition decisions. People buying things that they can't easily replace will tend to seek higher-performance equipment.

      But that may be about to ch

    • by Moraelin ( 679338 ) on Wednesday January 28, 2009 @05:26PM (#26644879) Journal

      Well, actually it's just proof that history repeats itself. Because this thing has happened before. More than once.

      See, in the beginning, computers were big things served by holy priests in the inner sanctum, and a large company had maybe one or two. And they kept getting more and more powerful and sophisticated.

      But then it branched. At some point someone figured that instead of making the next computer which can do a whole megaflop, they could make a minicomputer. And there turned out to be a market for that. There were plenty of people who preferred a _cheap_ small computer to doubling the power of their old mainframe.

      You know how Unix got started on a computer with 4k RAM, which actually was intended to be just a co-processor for a bigger computer? Yeah, that's that kind of thing at work. Soon everyone wanted such a cheap computer with a "toy" OS (compared to the sophisticated OSs on mainframes) instead of big and powerful iron. You could have several of those for the price of a big powerful computer.

      Then the same thing happened with the micro. There were plenty of people (e.g., DEC) who laughed at the underpowered toy PCs and assured everyone that they'd never replace the mini. Where is DEC now? Right. It turned out that a hell of a lot of people had more need of several cheap PCs ("cheap" back then meaning "only 3 to 5 thousand dollars") than of an uber-expensive and much more powerful mini (costing tens to hundreds of thousands).

      Heck, in a sense even multitasking appeared as sorta vaguely the same phenomenon. Instead of more and more power dedicated to one task, people wanted just a "slice" of that computer for several tasks.

      Heck, when IBM struck it big in the computer market, waay back in the '50s, how did they do it? By selling cheaper computers than Remington Rand. A lot of people had more use for a "cheap" and seriously underpowered wardrobe-sized computer than for a state-of-the-art machine costing millions.

      Heck, we've even seen this split before, as portable computers split into normal laptops and PDAs. At one point it became possible to make a smaller and seriously less powerful PDA, but which is just powerful enough to do certain jobs almost as well as a laptop does. And now it seems to me that the laptop line has split again, giving birth to the likes of the Eee.

      So really it's nothing new. It's what happens when a kind of machine gets powerful enough to warrant a split between group A who needs the next generation that's 2x as powerful, and group B which says, "wtf, it's powerful enough for what I need. Can I get it at half price in the next generation?" Is it any surprise that it would happen again, this time to the PC? Thought so.

  • by Opportunist ( 166417 ) on Wednesday January 28, 2009 @03:09PM (#26642609)

    Let's be honest here. What does the average office PC run? A word processor, a spreadsheet, an SAP frontend, maybe a few more tools. And then we're basically done. This isn't really rocket science for a contemporary computer; it's neither heavy on the CPU nor on the GPU. Once the computer is faster than the human, i.e. as soon as the human doesn't have to wait for the computer to respond to his input, when the input is "instantly" processed and the user does not get to see a "please wait, processing" indicator (be it an hourglass or whatever), "fast enough" is achieved.

    And once you get there, you don't want faster machines. More power would essentially go to waste. We reached this point about 4-5 years ago. Actually, we're already one computer generation past "fast enough" for most office applications.

    • by Hognoxious ( 631665 ) on Wednesday January 28, 2009 @03:22PM (#26642805) Homepage Journal

      as soon as the human doesn't have to wait for the computer to respond to his input, when the input is "instantly" processed and the user does not get to see a "please wait, processing" indicator (be it an hourglass or whatever), "fast enough" is achieved.

      It's Moore's other law - once fast enough is achieved, you have to slow it down with shite like rounded 3d-effect buttons, smooth rolling semi-transparent fade-in-and-out menus and ray-traced 25 squillion polygon chat avatars.

      • Re: (Score:2, Troll)

        by Opportunist ( 166417 )

        If I cannot turn that crap off, I don't want the software. If I can turn it off, I turn it off.

        An interface is supposed to do its job. When I play games or when I watch an animation, I want pretty. When I work, I want efficiency. Don't mix that and we'll remain friends.

      • by QRDeNameland ( 873957 ) on Wednesday January 28, 2009 @04:06PM (#26643559)

        It's Moore's other law - once fast enough is achieved, you have to slow it down with shite like rounded 3d-effect buttons, smooth rolling semi-transparent fade-in-and-out menus and ray-traced 25 squillion polygon chat avatars.

        Actually, that's Cole's Law [wikipedia.org], which states that an unused plate space must be occupied with cheap filler that no one really wants.

      • by powerlord ( 28156 ) on Wednesday January 28, 2009 @04:31PM (#26643945) Journal

        It's Moore's other law - once fast enough is achieved, you have to slow it down with shite like rounded 3d-effect buttons, smooth rolling semi-transparent fade-in-and-out menus and ray-traced 25 squillion polygon chat avatars.

        It's usually expressed as Gates' Corollary to Moore's Law: Whatever Moore Giveth, Gates Taketh Away.

    • by Skater ( 41976 )

      We reached this point about 4-5 years ago. Actually, we're already one computer generation past "fast enough" for most office applications.

      Exactly. I have a 5-year-old laptop with a 2.4 GHz Pentium processor, but even with today's software (the latest versions of OO.org, Firefox, Google Earth, etc.) it runs just fine. Sure, a newer computer would be somewhat faster, but this is not "so slow it's painful" like my Pentium 133 was 5 years after I bought it.

      It works well enough that I recently put in a larger hard drive and a new battery to keep it useful for the foreseeable future, because I do not intend to replace it until it dies (or until

    • And once you get there, you don't want faster machines. More power would essentially go to waste. We reached this point about 4-5 years ago. Actually, we're already one computer generation past "fast enough" for most office applications.

      Agreed. There would have to be a new paradigm shift (ok, I fucking hate that phrase; maybe "usage-need change"?) to make an upgrade worthwhile.

      For my personal needs, DOS was good for a long time. Then Win95 came out, and true multitasking (ok, kinda-working multitasking that still crashed a lot) made an upgrade compelling. I couldn't really use any of the browsers on my old DOS box, and Win95 opened a whole new world. That computer eventually got too slow for the games, and that drove the next upgrade. Online vide

      • And, bluntly, I don't see any "we must have this!" features in any standard office application, at least since 2000.

        Multitasking was a compelling reason. It became possible to run multiple applications at once. Must-have.

        Better interweaving between office products and email was a must-have, too.

        Active Directory (and other, similar technologies) made administering multiple accounts a lot easier and certainly helped speed up rollouts. Also a must-have (for offices, but we're talking offices here).

        And so on.

    • Re: (Score:3, Interesting)

      by zappepcs ( 820751 )

      Let's be honest here. What would we like the average office PC to be doing? If they're beefy enough to run a grid on, and so also perform many of the data-retention, de-duplication, and HPC functions, and many other things, then yes, having faster-better-more on the desktop at work could be interestingly useful. Software is needed to use hardware that way, meh.

      There will never be a time in the foreseeable future when beefier hardware will not be met with requirements for its use. Call that Z's corollary to Moor

    • by mrbcs ( 737902 ) *
      EXACTLY! You, sir, fine Opportunist, have won the thread! This is why the industry will almost die in the next few years. The upgrade cycle died long ago. People are happy with their machines, and the last (arguable) thing to cause upgrades was games.

      I have an AMD dual-core 4400 with a couple of Nvidia 7300s that will take anything I throw at it. I don't think I'll ever need another computer unless the thing fries. They have also become so cheap that they have become commodities. Why fix something for

      • the industry will almost die in the next few years. [...] I don't think I'll ever need another computer unless the thing fries.

        Which means that clamoring for cheap will lead hardware makers to design _more_ early failure into hardware, and reduce warranties to nil.

    • Once the computer is faster than the human ... "fast enough" is achieved ... And once you get there, you don't want faster machines. More power would essentially go to waste. We reached this point about 4-5 years ago. Actually, we're already one computer generation past "fast enough" for most office applications.

      If we were talking about CPU power, I'd completely agree with you. A Pentium IV was fast enough for most people, and a modern Core2Duo is more than enough. I still get to points where my sys

    • by Tubal-Cain ( 1289912 ) * on Wednesday January 28, 2009 @03:54PM (#26643321) Journal
      We reached the point of "fast enough" years ago. Computers were so fast we needed a semi-useful way to waste CPU cycles. And so the GUI was born.
    • You forgot "anti-virus software". By far the biggest (and probably least useful) resource hog.

  • Or... (Score:3, Interesting)

    by BobMcD ( 601576 ) on Wednesday January 28, 2009 @03:11PM (#26642631)

    That could be bad news for computer-makers, since users will be less inclined to upgrade — proving not that Moore's law has been repealed, but that more people are taking the dividend it provides in cash rather than in processor cycles.

    Or, it could be good news for them. Especially in light of things like the "Vista Capable" brouhaha, and the impact Vista had on RAM prices when fewer than the projected number of consumers ran out to buy upgrades.

    Maybe Intel and Nvidia are going to be wearing the sadface, but I'm willing to wager HP and the like are almost giddy at the thought of not having to retool their production lines yet again. They get to slap on a shiny new OS and keep the same price point on last year's hardware.

    Some of the corporations in the world buy new hardware simply to keep it 'fresh' and less prone to failure. My own company has recycled a number of Pentium 4 machines that are still quite capable of running XP and Internet Explorer. With the costs of new desktop hardware at an all-time low for us, we get to paint a pretty picture about ROI, depreciation, budgets, and the like.

    • Re: (Score:2, Insightful)

      by craighansen ( 744648 )
      In fact, M$ might end up settling these Vista-Capable lawsuits by offering upgrades to W7, especially if it's faster on the barely-capable hardware that's the subject of the suit. A cheap way to settle, for them...
    • Unfortunately, if companies start buying computers in 5- or 6-year cycles instead of 2- or 3-year cycles, then HP definitely won't be giddy. They'll be even less giddy if the average price of a desktop PC drops to $200 and the average price of a laptop sinks to $500.

  • Sigh (Score:4, Insightful)

    by Bastard of Subhumani ( 827601 ) on Wednesday January 28, 2009 @03:11PM (#26642635) Journal

    companies would now prefer that computers get cheaper rather than more powerful

    There's already a method for that: it's called by the catchy title "buying a slightly older one".

    A related technique is called "keeping the one you've already got".

    • A related technique is called "keeping the one you've already got".

      I don't know... That sounds expensive.

      • A previous company I worked for would lease their workstations for 3 years. That did mean that they were constantly paying for computers ... and rolling out new boxes.

        But there weren't many problems with the HARDWARE during those 3 years.

        As they started keeping the workstations longer, there were more problems with the hardware AND there were problems with replacing the hardware that broke. Which was leading to a non-uniform desktop environment. It's more difficult to support 100 different machines with 100

    • Re: (Score:3, Interesting)

      by melonman ( 608440 )

      In some markets it simply hasn't been possible to buy a 2-year-old spec until recently. In particular, laptops used to vanish at the point at which they made sense for someone who just wanted a portable screen and keyboard with which to take notes and maybe enter some code. The only way to get a laptop much cheaper than the premium spec was to go s/h, and s/h laptops have never been that attractive an option.

      Machines with the relative spec of NetBooks would have sold fine a decade ago. It's just that the la

  • This is nothing new (Score:5, Interesting)

    by georgewilliamherbert ( 211790 ) on Wednesday January 28, 2009 @03:12PM (#26642665)

    Some of you may remember the 1980s and early 1990s, when PCs started out costing $5,000 and declined slowly to around $2,500 for name-brand models.

    Around 1995, CPUs exceeded the GUI requirements of all the apps then popular (this is pre-modern gaming, of course). Around 1996 and into 1997 the prices of PCs fell off a cliff, down to $1,000.

    Those who fail to remember history...

    • Re: (Score:2, Interesting)

      by jawtheshark ( 198669 ) *

      Adjust those numbers for inflation... Or better, retro-adjust current prices to 1980 prices.

      I do remember falling prices in the nineties, but now a PC is pretty close to an impulse buy. For me in 2000, a 2000€ PC was already an impulse buy (that said, I was single, a successful consultant with a brand-new sports car, so my "impulses" were a bit off). These days an Eee PC is an impulse buy for anyone who loves toys and has a bit of spare money.

      This is not a repeat of the previous price-falls, this is the

      • Re: (Score:3, Informative)

        by jawtheshark ( 198669 ) *
        Just filling in the gaps: Source [bls.gov]
        • $5,000 in 1980 = $12,889.87 in 2008
        • $2,500 in 1990 = $4,063.22 in 2008
        • $1,000 in 1997 = $1,323.52 in 2008

        Now, let's take the Asus Eee PC ($280) back to 1990. In 1990 you would have paid $172.28 for it. That's a PC that would have beaten your top-of-the-line $2,500 1990 PC to smithereens. (The i486 came out in 1989!)
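
        (The arithmetic behind those conversions, as a sketch; the CPI factors are back-derived from the figures above, so the exact values depend on which BLS series you use:)

            # CPI factors implied by the quoted figures (actual values
            # depend on the BLS series used).
            CPI_TO_2008 = {
                1980: 12889.87 / 5000,  # ~2.58
                1990: 4063.22 / 2500,   # ~1.63
                1997: 1323.52 / 1000,   # ~1.32
            }

            def to_2008_dollars(amount, year):
                return amount * CPI_TO_2008[year]

            def from_2008_dollars(amount, year):
                return amount / CPI_TO_2008[year]

            print(f"$5000 in 1980 = ${to_2008_dollars(5000, 1980):.2f} in 2008")
            print(f"$280 in 2008 = ${from_2008_dollars(280, 1990):.2f} in 1990")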

  • Continually making the same thing for less money is not a very good business model.

    Pretty soon the customers will be asking for the same performance, free.

    Reminds me of the old quote: "We have been doing so much with so little for so long, we are now qualified to do anything with nothing at all".

  • Comment removed (Score:5, Insightful)

    by account_deleted ( 4530225 ) on Wednesday January 28, 2009 @03:13PM (#26642669)
    Comment removed based on user account deletion
    • Re: (Score:3, Interesting)

      by gbjbaanb ( 229885 )

      2) and that you don't need teraflops of CPU/GPU power just to draw greasepaper-style borders around your Microsoft Word windows

      You're dead right there. I always wondered why I could play games a few years ago that had some stunning graphics, yet ran very well on a 900 MHz PC with 256 MB RAM; yet Vista needs 1 GB and a graphics card better than that old spec just to draw a poxy bit of transparent titlebar.

      I'd blame the managed .NET stuff, but the WDDM is native! That makes the inefficiency even worse.

    • Re: (Score:3, Interesting)

      by bit01 ( 644603 )

      Maybe people will realize what an obscene waste of money and computing power an operating system like Windows Vista, which requires a gig of RAM to run, really is.

      Hear, hear. To a lesser extent the same is true of all modern GUIs. Most GUI programmers today seem to have no clue how to write efficient, usable software. I am still waiting for GUI software that responds fast enough. I'm an experienced computer programmer/user and I'm endlessly irritated by the slow response of computers that are supposed t

  • Cost is very important, particularly in business, but for home users the price of a "good enough" PC has been in the same $600-1200 range for a long time. What I think will drive sales, particularly for the home and mobile-professional markets, is smaller size, fewer wires, and lower energy use. Folks are willing to pay more for that rather than for more powerful chips that they don't need. This should be good news for OEMs, because it is easier to show the "average" user a more aesthetic case, or more wireless peripher
    • PC hardware has left software requirements somewhat behind, unless you want to run the very latest games.
      My dual core PC from 2007 is still more than sufficient in terms of performance. The price to put a similar or better machine together has dropped from 800 Euros to 500 Euros, however (without monitor). That is assuming
      -the same case
      -a comparable power supply
      -same memory (2 GByte)
      -a slightly faster but less power-hungry CPU (AMD 45nm vs. 90nm, each in the energy-efficient version)
      -a faster GPU (ATI 4670

  • by Tenek ( 738297 ) on Wednesday January 28, 2009 @03:16PM (#26642721)

    sacrificing the bells and whistles that conventional software offers but hardly anyone uses anyway

    I think if you took out all the features that 'hardly anyone uses' you wouldn't have much of a product left. Bloatware and the 80/20 Myth [joelonsoftware.com]

    • So I'm starting to suspect that fretting about bloatware is more of a mental health problem than a software problem.

      Amen.

    • by GreatBunzinni ( 642500 ) on Wednesday January 28, 2009 @04:27PM (#26643887)

      The article you pointed out is pure nonsense. It claims that bloat isn't important because the cost of storage dropped. Not only that, it tries to base the claim on the idiotic metric of dollars per megabyte: the fact that software like Microsoft's Excel bloated from a 15 MB install in the 5.0 days to a 146 MB install in the 2000 days is supposedly a good thing, because in the 5.0 days it took "$36 worth of hard drive space" while "Excel 2000 takes up about $1.03 in hard drive space". No need to justify a near-tenfold footprint increase; we are saving money by installing more crap to do the exact same thing.

      In fact, the idiot who wrote that article even had the audacity to state:

      In real terms, it's almost like Excel is actually getting smaller!

      Up is down, left is right, bloat is actually good for you.

      But people still complain. Although it appears we should be grateful for all that bloat, we are somehow being ungrateful for believing that all that bloat is unnecessary. But fear not, the idiot who wrote the article has a nice accusation for all those bloat-haters out there:

      I guess those guys just hate Windows.

      Yes, that's it. We don't hate orders-of-magnitude increases in bloat just to perform exactly what was easily done with a fraction of the resources. We don't hate being forced to spend money on hardware only to be left with a less-than-adequate solution compared with the previous generation. We simply hate Windows. Good call.

      The article is bad and you should feel bad for posting a link to it.

  • by Jason Levine ( 196982 ) on Wednesday January 28, 2009 @03:18PM (#26642741) Homepage

    Years back, when everyone in the mainstream was trotting out how many MHz/GHz their processors ran at and how their Latest And Greatest system was *soooo* much better, I insisted that the computer industry had a dirty little secret: mid- to low-end computers would work just fine for 90% of users out there. Computer makers didn't want people knowing this and instead hoped they could be convinced to upgrade every 2 or 3 years. Eventually, though, people learned that they were able to read their e-mail, browse the web, and work on documents without buying a system with a bleeding-edge processor and maxed-out specs. This seems like the continuation of the secret's collapse. People are realizing not only that they don't need to buy a system with a blazing-fast processor just to send out e-mail, but that they don't need to buy 10 different servers when one powerful (but possibly still not bleeding-edge) server can run 10 virtual server instances.

  • 1) No one is following Moore's law. It's a description of what happens.

    2) You can, of course, come up with some equation that describes the cost of a set amount of processor power over time.

    3) This article and this summary make bad economic assumptions and use faulty logic. I suggest to all reading the comments that it's not worth reading.

    That is all.

  • by rminsk ( 831757 ) on Wednesday January 28, 2009 @03:22PM (#26642817)
    Moore's law does not say "that the amount of computing power available at a particular price doubles every 18 months." Moore's law says that the number of transistors that can be placed inexpensively on an integrated circuit increases exponentially, doubling approximately every two years.
    • Yes, but you don't expect anyone on a site like this to actually know your random "factoid".

      What we'd also like to see is how many of these units of computing power equal a standard US LOC.

  • There is a possibility (however unlikely) that Microsoft has made a very clever and astute move in going for a streamlined version of their current technology rather than a new iteration weighed down by arbitrary new features. It's clever because it's timed for release in a global recession and a switch in focus to developing markets - the 'next billion' users.

    But it's unlikely, and I would hesitate to say Microsoft has actually preempted anything; I'd say they're responding to what they've been
  • Hmm... (Score:4, Interesting)

    by kabocox ( 199019 ) on Wednesday January 28, 2009 @03:32PM (#26642971)

    I can buy a $350 mini laptop, a $500 decently spec'd laptop, or a $500 desktop with what would have been unbelievable specs not long ago. I remember when I picked up Computer Shopper and was thrilled that there were any bare-bones desktops that sold at the $1K mark. Now you can get full-featured systems for under $500 that do things $2-3K machines couldn't do.

    Really, there is no such thing as a "Moore's Law." It's Moore's trend lines that have been holding. That it lasted 10 years, much less this long, has been utterly amazing. I fully expect us to run into problems keeping up with "Moore's Law" before 2100. 5-10 years after the trend is broken, future folks will either forget about it entirely or look back and kinda giggle at us like we were just silly about it all. 50-100 years later no one will care, though everyone will be making use of its by-products. Do you notice where the stuff for roads comes from, or which Roman engineer built the most or best roads? That's generally what they'll think of any computing device older than 20 years. If Moore's law holds until 2050, every computing device that we've currently made will be either trash or a museum piece by then. Heck, you have people getting rid of or upgrading cell phones every 3-6 months already.

    We imagine replicators in Star Trek, but we don't need them with Walmart and 3-6 months for new products to come out. Consider Amazon + UPS next-day shipping. Replicator tech would have to be cheaper and faster than that to compete. I think it's more likely that we'll keep on improving our current tech. What happens when UPS can do 1-hour delivery to most places on the globe? Replicators might spring up, but only for designers to use to spend a week making 10K units that go on sale today, sell out in two weeks, and get discounted the week after. Face it; we are already living in a magical golden age. We just want it to be 1000x better in 50 years.

  • by xs650 ( 741277 ) on Wednesday January 28, 2009 @03:36PM (#26643027)

    Even Microsoft is jumping on the bandwagon: the next version of Windows is intended to do the same as the last version, Vista, but to run faster and use fewer resources. If so, it will be the first version of Windows that makes computers run faster than the previous version.

    Without Vista, MS wouldn't be able to claim that 7 was faster than their previous version of Windows.

  • More is More (Score:5, Interesting)

    by Archangel Michael ( 180766 ) on Wednesday January 28, 2009 @03:43PM (#26643157) Journal

    One of the things I learned many years ago is that computer and computing speed isn't just a function of how fast something runs. Rather, it is a matter of whether or not you actually run something.

    If computer speeds are twice as fast, and it currently takes you ten seconds to accomplish Task A, and a new computer will allow you to accomplish that same task in 5 seconds .... getting a new computer is not that big of a deal.

    However, if you run Task B, which takes 1.5 hours to complete, and a new computer will run that same task in, say, 4 minutes (a real-world example from my past: log processing), the difference isn't necessarily the 86-minute saving, but rather whether and how often you actually run that task.

    It is endlessly amusing to see "real-world benchmarks" that run in about 3 minutes on most processors, separated by less than 2x. Or frames per second. Or... whatever.

    When shit takes too long to get done, you tend NOT to do it. If the difference is a few seconds, that is nice and all, and a few seconds may be of interest to "extreme" hobbyists.

    But the real-world differences that matter are not a marginal decrease from 300 to 279 seconds. Sorry, but those extra few seconds aren't going to prevent you from running that task.

    The true measure is not how fast something gets done, but whether or not you actually do the task at all, because the time involved is prohibitive.
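
    (A toy model of that point; the 15-minute "worth running" threshold is made up for illustration, while the 90-to-4-minute case is the log-processing example above:)

        # Toy model: a speedup matters when it moves a task across a
        # "worth running" threshold, not when it shaves a few seconds.
        THRESHOLD_MIN = 15.0  # hypothetical patience limit, in minutes

        # (before, after) times in minutes, from the examples above
        cases = {
            "Task A": (10 / 60, 5 / 60),         # 10 s -> 5 s: nice, not decisive
            "Task B (log processing)": (90, 4),  # 1.5 h -> 4 min: changes behavior
        }

        for name, (before, after) in cases.items():
            print(f"{name}: {before:.2f} -> {after:.2f} min; "
                  f"worth running before: {before <= THRESHOLD_MIN}, "
                  f"after: {after <= THRESHOLD_MIN}")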

    • Re: (Score:3, Insightful)

      by ianare ( 1132971 )

      If computer speeds are twice as fast, and it currently takes you ten seconds to accomplish Task A, and a new computer will allow you to accomplish that same task in 5 seconds .... getting a new computer is not that big of a deal.

      Depends on the frequency of the task, too. In Photoshop, going from 10 to 5 secs for a simple task done often (say, noise reduction) is a big deal. Or if an IDE takes 1 vs. 2 seconds to load the popup showing class variables/methods. It doesn't sound like a big deal, but when you have to do it hundreds of times a day, believe me, those seconds add up! I would gladly upgrade (and have upgraded) a PC for these kinds of real-world improvements.

      • Re: (Score:3, Insightful)

        But you don't stop doing those tasks because they take 10 seconds. While you do have a valid point, those chunks of time don't stop you; they only annoy you.

        Trust me, you don't know what you DO NOT do because it takes too long to do it. I've seen tasks grow in time from a minute to 1.5 hours as the logs grew due to increased activity. As the time to process the logs mounted, I processed them less frequently, often running them over weekends and evenings because I couldn't afford to process them during

  • The reality is that hardware has pulled so far ahead of software that it will be years before we exploit our current level of technology to its capacity.

    We have some apps that don't understand how to task between CPUs. (We have some OSes that barely grasp that.) We have applications that were designed in a time of 16-bit machines and fairly low limits on memory, and that have been patched and slowly moved along when they really need a completely new architecture underneath to function well. We

  • The first 486 I got my hands on came with a $5,000 price tag.

    My first Pentium came in, well spec'd, around $2,500.

    The PCs I was building for myself ran about $1,500 five years ago and the last one was down around $1,100 - all "mid-range" machines, capable of playing the latest games in fairly high quality and reasonably functional for at least 18 months to 2 years.

    Since a little after that Pentium, the systems I see more casual friends buying have dropped from few people buying $3,000 laptops to a fair numb

  • Flash taketh away.
  • by Gordon Moore that the amount of computing power available at a particular price doubles every 18 months

    -------------
    BZZZZ wrong answer
    Moore's original law states that the number of transistors we are able to pack into a given size of silicon real estate inexpensively doubles every 18 months. He changed this prediction to every 2 years in 1975, which bolstered the perceived accuracy of his prediction.

    Number of transistors for a particular price is a moving target which is entirely dependent on the suppl

  • by fuzzyfuzzyfungus ( 1223518 ) on Wednesday January 28, 2009 @04:20PM (#26643791) Journal
    While I can see the desire for cheaper rather than more powerful, I do wonder how much of the power/price tradeoff curve actually makes sense. Traditionally, the very high end of the curve makes very limited sense, since it is the nightmare world of low yields, early-adopter taxes, and super-critical enterprise stuff. In the middle, the power/price curve tends to be roughly linear, before gradually becoming less favorable at the bottom because of fixed costs.

    As long as a processor, say, has to be tested, packaged, marked, shipped, etc. (which costs very similar amounts whether the die in question is a cheap cut-down model or a high-end monster), there is going to be a point below which cutting performance doesn't actually cut price by any useful amount. Hard drives are the same way. Any drive has a sealed case, controller board, motor, voice-coil unit, and at least one platter. Below whatever capacity that basic drive has, there are no real cost savings to be had. (Incidentally, that is one of the interesting things about flash storage. HDDs might be 10 cents a gig in larger capacities, but that doesn't mean you can get a 4-gig drive for 40 cents; I had a quick look, and you can't get anything new for under about $35. With flash, you might be paying 100 cents a gig, but you pretty much can get any multiple you want.)

    Cost, overall, is gradually being whittled down; but once all the low-hanging, super-high-margin products are picked off, there is going to be a point past which it simply isn't possible to exchange price for performance at any reasonable rate. Used and obsolete gear offers a partial solution (since it can be, and is, sold below the cost of production in many cases), but that only works if your needs are small enough to be fulfilled from the used market.
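
    (A minimal sketch of that fixed-cost floor, using the ballpark figures above (10 cents/GB disks with a roughly $35 floor vs. 100 cents/GB flash with essentially none); the numbers are the comment's, not market data:)

        # Fixed-cost floor: HDDs at ~10 cents/GB but never below ~$35
        # (case, controller, motor, platter); flash at ~100 cents/GB
        # scales down smoothly. Ballpark figures from the comment above.
        HDD_FLOOR = 35.00
        HDD_PER_GB = 0.10
        FLASH_PER_GB = 1.00

        def hdd_price(gb):
            return max(HDD_FLOOR, gb * HDD_PER_GB)

        def flash_price(gb):
            return gb * FLASH_PER_GB

        for gb in (4, 16, 64, 500):
            h, f = hdd_price(gb), flash_price(gb)
            print(f"{gb:>4} GB: HDD ${h:6.2f}, flash ${f:6.2f} "
                  f"-> {'flash' if f < h else 'HDD'} is cheaper")
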
  • by Peepsalot ( 654517 ) on Wednesday January 28, 2009 @04:23PM (#26643837)
    In recent years, not only has CPU performance increased, but efficiency in terms of power consumption per unit of work has also greatly improved.

    Even if the majority of users begin to realize they have no practical use for top-end CPUs with gobs of processing power, everyone still benefits from higher-efficiency CPUs. It reduces electric bills, simplifies cooling systems, allows for smaller form factors, etc. I think power efficiency will become more important in the future as people start to care less about having the ultimate killer machine in terms of processing power. People are already performing actions on their mobile devices (iPhone, Blackberry, etc.) which were possible only on a desktop in past years. The strict power requirements of these devices, with their tiny batteries, will continue to demand improvements in CPU technology.

    I'm waiting for the day when it is common to see completely passively cooled desktop computers, with solid state hard disks, no moving parts, sipping just a few watts of power without emitting a single sound.
  • by pz ( 113803 ) on Wednesday January 28, 2009 @04:31PM (#26643957) Journal

    We more or less got enough computing power for most things with the introduction of the 1 GHz PIII CPU. You might not agree with this, but it's at least approximately true. A computer outfitted with that processor and reasonable RAM browses the web just fine, plays MP3s, reads email, shows videos from YouTube, etc. It doesn't do everything that you might want, but it does a lot.

    If we took the amazing technology that has been used to create the 3 GHz multi-core monsters with massive on-chip cache memory in a power budget of 45W or so in some cases, and applied it to a re-implementation of the lowly PIII, we'd win big. We'd get a PIII 1GHz burning a paltry few watts.

    And this is precisely why chips like the Intel Atom have been so successful. Reasonable computing power for almost no electricity. We don't necessarily need just MORE-FASTER-BIGGER-STRONGER, which is the path Intel and AMD have historically put the most effort into following; we also need more efficient.

    • Re: (Score:3, Insightful)

      by Overzeetop ( 214511 )

      Actually, what we need is a massively fast processor which can scale - quickly - down to a "slow" processor like the PIII. Most of the time my systems are idle, and I'd be happy with them running at 400 MHz if I could do it for a 90% savings in power. When I hit the gas, though, and want to load my iPod with music from my FLAC collection, doing on-the-fly transcoding, I want both (or all 4, or 8) cores running at 3+ GHz, making my file-transfer speeds the bottleneck. I don't care if I burn 150W-200W on the proces

  • by IGnatius T Foobar ( 4328 ) on Wednesday January 28, 2009 @04:39PM (#26644087) Homepage Journal
    This is dangerous territory for Microsoft to be in. Levelling off of computer power means that buyers are getting off the upgrade treadmill -- they're not buying new computers every couple of years. Preloads on new computers are where Microsoft makes the bulk of their Windows sales.

    To make matters worse, without constant upgrades, Microsoft and ISVs can't count on new APIs becoming widespread anytime soon, so they have to write applications for the lowest common denominator. This prevents Microsoft from forcing its latest agenda onto everyone -- and even worse, it could potentially give the WINE team enough time to reach feature parity with Windows XP. (Spare me the lecture; have you tried WINE lately? It's surprisingly good these days.)

    All in all, Microsoft is being forced to stand still in a place where they can't afford to. Commoditization sucks when you're a monopolist, doesn't it?
  • tail wagging dog (Score:3, Interesting)

    by roc97007 ( 608802 ) on Wednesday January 28, 2009 @04:48PM (#26644255) Journal

    > [windows 7 same as] Vista, but to run faster and use fewer resources. If so, it will be the first version of Windows that makes computers run faster than the previous version. That could be bad news for computer-makers, since users will be less inclined to upgrade — proving not that Moore's law has been repealed, but that more people are taking the dividend it provides in cash rather than in processor cycles.

    I think this somewhat misses the point. People are less likely to buy new hardware in an economic downturn. It doesn't really have anything to do with whether the next version of Windows drives hardware sales, as previous versions have done.

    If Windows 7 really "runs faster with fewer resources" than Vista (I'm hopeful, but this won't be established until it's actually released), then it could be that Microsoft recognizes that it will get more over-the-counter purchases if it makes the OS more likely to run on legacy hardware. Otherwise, people will just stick with what they have. It's the economy, not Microsoft, that's the main driver.

    I am actually hopeful that we've broken the mindless upgrade cycle. I'm sorry it took a recession to do it.

  • by w0mprat ( 1317953 ) on Wednesday January 28, 2009 @05:05PM (#26644559)
    I agree that the shift is towards smaller, cheaper, more energy-efficient and ubiquitous computers. However, it doesn't necessarily follow that computers won't get faster, or that software won't make use of it. Moore's inaccurately named Law is still holding true at the bleeding edge, driven by gaming, content creation and research. What we are getting is a growing gap between the lowest end and the highest end. The high end will become a smaller slice of revenue, for sure.

    For chip manufacturers, little will change: performance per watt and cost per die per wafer require the same thing, ever-smaller transistors that use less power with each iteration. So in reality Moore's Observation is still iterating unchecked; it's just the end packaging that will be different.

    Instead of getting dozens of billion-transistor multicore behemoths from a wafer, they will get hundreds of tiny cut-down processors with a lower transistor count.

    Now, it's been shown that the latter is the more profitable approach.
  • by Qbertino ( 265505 ) <moiraNO@SPAMmodparlor.com> on Wednesday January 28, 2009 @06:37PM (#26645943)

    It should be obvious, shouldn't it? Our work environment of choice has been the desktop metaphor for about 20 years now. Today's computers are powerful enough to handle very luxurious desktop environments. I've basically replaced my very first PC (the first-ever ATX big-tower case, an InWin from 1996 that weighs around a metric ton) with a Mac Mini. 3D-wise I even think it's a downgrade, although I only have a GeForce 4200 Ti as my latest 3D card in there.

    But, as others here have pointed out already, it consumes about a tenth of the power, makes almost no noise at all (even now I can barely hear it), and is about 40 times smaller. Meanwhile, FOSSnix-based systems are only getting better without making computing skills obsolete, making it even more financially attractive to go for cheap and small.

    The next performance race for most people will only take place after the standards for power consumption, size and noise have been raised and met. After that, regular computers will be heading for more power again. I presume that next league will stall after a decade again, when $200 computers the size of my external HDD have reached the power to render photorealistic motion graphics in real time.

  • by John Sokol ( 109591 ) on Wednesday January 28, 2009 @07:15PM (#26646467) Homepage Journal

    > first version of Windows that makes computers run faster than the previous version.

    So now it will only be 10x slower than Linux instead of 100x, for the same operations.

  • by M0b1u5 ( 569472 ) on Wednesday January 28, 2009 @07:53PM (#26647031) Homepage

    For goodness sake, Moore's law never specified anything to do with "computing power"!

    Moore observed that, typically, the number of transistors doubled (on the lowest-price process) around every 2 years.

    At least the poster got something right: the cost of the process.

    But, it's not a law AT ALL; it's a self-fulfilling prophecy! Manufacturers know the target they have to hit (Moore's!) and they do everything they can to hit it. Anything less would result in company failure.

  • by Glasswire ( 302197 ) on Wednesday January 28, 2009 @10:06PM (#26648399) Homepage

    The observation behind Moore's Law doesn't say anything about performance. It's a projection of the rate at which transistor density on a wafer grows every 18 months. Historically, most companies, including Intel, used this to make bigger dies for larger and more complex processors, but you can also use the improvements in transistor real estate to simply make smaller dies and hence more of them, increasing yield and bringing down the price. So bringing the price down isn't different from Moore's Law; it's just another way to use it.

  • Hmnn? (Score:3, Funny)

    by Vexorian ( 959249 ) on Wednesday January 28, 2009 @10:30PM (#26648565)

    Even Microsoft is jumping on the bandwagon: the next version of Windows is intended to do the same as the last version, Vista, but to run faster and use fewer resources. If so, it will be the first version of Windows that makes computers run faster than the previous version

    1. Take random article from news site
    2. Somehow manage to make it justify a new slashdot story that includes a link to ooold blog promoting windows 7.
    3. ?????
    4. Profit / Win laptop ?

    How is Vista Seven related to this at all? It didn't get faster by doing less... The article clearly states that it just uses a more responsive interface. I mean, come on!...

  • False choice (Score:3, Insightful)

    by symbolset ( 646467 ) on Wednesday January 28, 2009 @10:39PM (#26648627) Journal

    The false-choice fallacy is presenting the situation as if you must choose option A or option B. As far as I can tell, businesses want not only option A and option B, they also want "the same performance in fewer watts". And a number of other things.

    By presenting the trend as a singular choice, the author presents a false choice. What is actually happening is that the computing ecosystem is becoming more diverse. As we select from a richer menu, we can pursue our goals, large and small, with equipment that suits the application. It's a good thing.

  • by lpq ( 583377 ) on Wednesday January 28, 2009 @11:48PM (#26649123) Homepage Journal

    "If so, it will be the first version of Windows that makes computers run faster than the previous version."

    This is the 2nd bit of falseness -- WinXP was faster than WinME.

    Second -- WinXP is still quite a bit faster than Win7.

    The article states that Win7 improves over Vista in areas where Windows was "OS-bound". However, it says there is NO IMPROVEMENT in Win7 for applications. It was applications that took a 10-15% performance hit in Vista vs. XP due to the overhead of the DRM'd drivers. As near as I can tell from everything that leaked out about Vista before, during and after its development, MS added (not replaced, but added) a whole new abstraction layer. They tried to make it transparent where they could, but this was the basic reason why nearly all drivers that actually touched hardware had to be rewritten: the USER has to be completely isolated from the real hardware, so that DRM can detect hardware/software workarounds or unauthorized modifications or "taps" into the unencrypted data stream. This goes down to the level of being able to tell whether something is plugged into a jack by impedance changes; if impedance or electrical specs don't match exactly what the circuit is supposed to produce, the OS is supposed to assume tampering and mark the OS state as "compromised". Think of it as similar to the Linux kernel's tainted bit. Once it's set, it is supposed to be irreversible until you reboot, because the integrity of the kernel has been compromised. The DRM in Vista/Win7 was spec'ed to be similar, but with finer-grained sensors, so that if anything is off (a code path takes too long to execute, or circuits don't return their expected values), the OS is to assume the box is insecure, so content can be disabled or downgraded at the content provider's option.

    All this is still in Win7. The only difference is that all the drivers that were broken during the switch to Vista won't be re-broken, so you won't get anywhere near the OEM flak and blog reports about incompatibilities; those will all be buried with Vista, along with your memories, or so MS hopes. But MS has already made it clear that you won't be able to upgrade an XP machine to Win7, so they can control the experience from the start: either by starting with corrupted Vista drivers or with Win7 drivers, take your pick. Both are designed to ensure your compliance, but more importantly, both cause performance degradation that everyone pays for by needing a bigger machine than they needed for XP.

    The whole planet will be paying an excess carbon tax, forever, all to add in the content producers' demanded wish list.

    This whole bit about the IT industry warming up to Win7 because it's not so bad compared to Vista just makes me want to puke. It's still corrupt and slow.

    The government should require MS to open-source WinXP, so it can be supported apart from MS, which is obviously going for a "content-control" OS (like the next-gen Apples are slated to be). This will be the beginning of the end for non-commercial, open-source OSes or media boxes. It will be all pay-to-play, just like Washington.
