Hardware Technology

Fifty Years of Moore's Law

HughPickens.com writes: IEEE is running a special report on "50 Years of Moore's Law" that considers "the gift that keeps on giving" from different points of view. Chris Mack begins by arguing that nothing about Moore's Law was inevitable. "Instead, it's a testament to hard work, human ingenuity, and the incentives of a free market. Moore's prediction may have started out as a fairly simple observation of a young industry. But over time it became an expectation and self-fulfilling prophecy—an ongoing act of creation by engineers and companies that saw the benefits of Moore's Law and did their best to keep it going, or else risk falling behind the competition."

Andrew "bunnie" Huang argues that Moore's Law is slowing and will someday stop, but the death of Moore's Law will spur innovation. "Someday in the foreseeable future, you will not be able to buy a better computer next year," writes Huang. "Under such a regime, you'll probably want to purchase things that are more nicely made to begin with. The idea of an "heirloom laptop" may sound preposterous today, but someday we may perceive our computers as cherished and useful looms to hand down to our children, much as some people today regard wristwatches or antique furniture."

Vaclav Smil writes about "Moore's Curse" and argues that there is a dark side to the revolution in electronics for it has had the unintended effect of raising expectations for technical progress. "We are assured that rapid progress will soon bring self-driving electric cars, hypersonic airplanes, individually tailored cancer cures, and instant three-dimensional printing of hearts and kidneys. We are even told it will pave the world's transition from fossil fuels to renewable energies," writes Smil. "But the doubling time for transistor density is no guide to technical progress generally. Modern life depends on many processes that improve rather slowly, not least the production of food and energy and the transportation of people and goods."

Finally, Cyrus Mody tackles the question: what kind of thing is Moore's Law? "Moore's Law is a human construct. As with legislation, though, most of us have little and only indirect say in its construction," writes Mody. "Everyone, both the producers and consumers of microelectronics, takes steps needed to maintain Moore's Law, yet everyone's experience is that they are subject to it."
This discussion has been archived. No new comments can be posted.

  • > Andrew "bunnie" Huang argues that Moore's Law is slowing and will someday stop,

    I think we've been hearing about the end of Moore's law for the last 15 years... inevitably, some process improvement comes along and it all keeps on going.

    Yeah, it may "eventually" stop when transistors are built with just 3 atoms. Then we'll switch over to photonics or quantum, then some weird hyper-dimensional shit.

    • That "weird hyper-dimensional shit" is the kind of innovation he is talking about. That stuff doesn't just fall from the sky. Lots of people have to innovate the hell out of that stuff for years or decades. What website do you think you are reading, anyway?

      • >> weird hyper-dimensional shit

        If you are thinking space-time or something like warp speed, I'm not sure there is enough power ever to achieve that in our lifetime.
        If you are thinking LxWxH + trinary chips... that could happen. Granted, I like to dream, but the thought of a trinary chip just seems like wishful thinking.

    • by Waffle Iron ( 339739 ) on Tuesday April 14, 2015 @05:45PM (#49474573)

      I think we've been hearing about the end of Moore's law for the last 15 years... inevitably, some process improvement comes along and it all keeps on going.

      I don't think that it's necessarily "inevitable". Take aviation, for example. There were arguably exponential increases in the capability of aircraft for 55 years, from 1903 to 1958, when the Boeing 707 was introduced. Ever since, further progress on economically viable aircraft has been pretty much limited to incremental increases in fuel economy and marketing strategies to keep costs down by keeping planes full.

      • by rioki ( 1328185 )

        I don't know; switching from aluminium and titanium to composite materials such as carbon fiber is a really big deal in aviation. But this is something that you don't see and thus don't recognize. Would you know that the A350 and 787 are almost entirely made of plastic?

        I agree that Moore's Law is slowing, but I doubt that we will see a slowdown in innovation. We have already seen a shift from more powerful to smaller and more energy efficient. The number of applications that need raw power are getting le

        • All the plastic helps with the incremental improvements in fuel economy: approximately 2X better over the past 57 years. I also neglected to mention safety, which has improved a good deal more than fuel economy. That's all OK, but it's nothing like the dramatic changes that happened prior to the 707. After nearly six decades, today's planes still look very similar to a 707, are about the same size, and go the same speed.
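          As a rough back-of-the-envelope check (a sketch only, taking the 2X-over-57-years figure above at face value and using a ~1.5-year doubling as the Moore's Law comparison point), that works out to roughly 1% per year versus tens of percent per year for transistor counts:

          # Annualized rates, assuming the figures quoted above
          fuel_rate = 2 ** (1 / 57) - 1        # 2x fuel-economy gain spread over 57 years
          moore_rate = 2 ** (1 / 1.5) - 1      # doubling every ~18 months
          print(f"aviation fuel economy: ~{fuel_rate:.2%}/year")   # ~1.22%/year
          print(f"Moore's Law cadence:   ~{moore_rate:.1%}/year")  # ~58.7%/year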

          • by rioki ( 1328185 )

            Yes, but how is that different from CPUs? Many little breakthroughs in technology, most of which you don't see.

    • > Andrew "bunnie" Huang argues that Moore's Law is slowing and will someday stop,

      I think we've been hearing about the end of Moore's law for the last 15 years... inevitably, some process improvement comes along and it all keeps on going.

      Yeah, it may "eventually" stop when transistors are built with just 3 atoms. Then we'll switch over to photonics or quantum, then some weird hyper-dimensional shit.

      15 years ago they were talking about some weird 3 dimensional transistor shit.

    • >> Yeah, it may "eventually" stop when transistors are built with just 3 atoms
      Funny, I was thinking along the same lines. I recall when they got to 9 or 10 atoms as being the smallest they could go; then 2 or 3 years later someone came out with 8. I do like Moore's Law as a benchmark of what can be achieved, and not just in chips but in data storage and power consumption.

      I really wish I could find more benchmarks on progress. It's just fun to learn stuff like this.

      Oh, by the way... I'm guessing (using Moore's L

  • by TheNarrator ( 200498 ) on Tuesday April 14, 2015 @05:08PM (#49474341)

    That guy is going to be pissed when we don't get cold supercomputers with billions of times more power than the brain using reversible computing.

    • Re: (Score:2, Interesting)

      by Just Some Guy ( 3352 )

      That guy is going to be pissed when we don't get cold supercomputers with billions of times more power than the brain using reversible computing.

      Kurzweil may or may not be nuts, but the data [singularity.com] seems to be going his way so far.

      • Re: (Score:2, Insightful)

        by Anonymous Coward
        I think this XKCD [xkcd.com] says it all.
        • I love xkcd, but... no. That's true if you extrapolate a trend from 3 or 4 data points. If you extrapolate it from a few hundred, then it starts to look a little more predictive.
    • by Zeroko ( 880939 )
      Making use of reversible computing, we could build fully 3-D circuitry since there would be much less power to dissipate (although still some to correct hardware errors & perhaps to clean up crashed processes). This would in turn get around no longer being able to make smaller transistors, & thus could be one future direction. Fabrication might be more tricky, but more money could go into such projects if it is not going into smaller, smaller, smaller. Software would similarly require changes, but a
      • by smaddox ( 928261 )

        Fully 3D circuitry is limited more by the requirement to have a single crystal and by the economics of circuit fabrication than by power density. Furthermore, neuromorphic computing (which is advancing rapidly) has the potential to solve power density and yield issues, but Si wafers are still cheap compared to mask steps.

  • "We are assured that rapid progress will soon bring self-driving electric cars,

    Uh.... [google.com]

    hypersonic airplanes,

    Well... [networkworld.com]

    individually tailored cancer cures,

    cough-cough [ajmc.com]

    and instant three-dimensional printing of hearts and kidneys.

    You see... [theguardian.com]

    We are even told it will pave the world's transition from fossil fuels to renewable energies,"

    Aww screw it. [bloomberg.com]

    Could there have been worse examples of "LOL those crazy promises!"?

    • by Anonymous Coward

      Seeing as every one of those except maybe the fossil-fuels-to-renewables transition doesn't exist yet, yeah, there could have been worse examples.

      Case in point: point me to where I can buy a fully autonomous car right now. How about I ask again 5 years from now? Also, where can I buy a ticket on even a supersonic aircraft, let alone a hypersonic one? The only supersonic airliner was taken out of service years ago as it wasn't economically feasible. And what? There's a cure for cancer let alone a custom

      • by linearZ ( 710002 )

        Google is 5 years out from having their cars on the market, just like they were 5 years ago.

        Google is a lot closer than 5 years. The computing and sensing technology now exists to make this reasonably priced. The remaining problem is an engineering one: developing the algorithms to work properly as a driver.

        Well within 5 years (try 2 years), both Google and Uber will be running low speed taxi services in dense city areas using their respective vehicles. You may not be able to purchase the vehicle or drive the freeways, but Uber and Google will replace a lot of Uber drivers and cabbies. Google is ver

        • Well within 5 years (try 2 years), both Google and Uber will be running low speed taxi services in dense city areas using their respective vehicles.

          If they are lucky, within five years they will have the algorithms necessary to self-drive a car. From there, expect another 5-10 years debugging the software and making it safe enough for the public.

          Go look up how long it takes to build flight-safe airplane software, and then realize that car software is much more complicated.

    • These things are still relatively rare, expensive, and nowhere near the level of completeness that most clickbait articles, breathlessly written by reporters with no technical knowledge, would imply.

      These are all things that people (especially reporters selling headlines) want very badly, but not necessarily things that will ever be able to become practical enough to make it out of R&D and into common use.
    • Yes, but people thought that these things would happen at a pace similar to the pace of computer technology development. They didn't; they took a lot longer.

      Also, a lot of these are still in development, not yet at the stage of a real product, or have seen only limited adoption.

  • The idea of an "heirloom laptop" may sound preposterous today, but someday we may perceive our computers as cherished and useful looms to hand down to our children, much as some people today regard wristwatches or antique furniture."

    It is preposterous... Even if it were impossible to make computers faster in any way in the future (extremely unlikely, given the countless avenues there are to explore in terms of speed), even then the innovation in computers is not and would not be limited to speed, so no, computer heirlooms won't ever happen, stupid person.

    • by James McGuigan ( 852772 ) on Tuesday April 14, 2015 @05:39PM (#49474523) Homepage

      “It’s your father’s Sinclair ZX Spectrum. This is the weapon of a computer hacker. Not as clumsy or as random as an iphone, but a more elegant weapon for a more civilized age. For years, the hackers were the guardians of peace and justice in the internet. Before the dark times, before the NSA.”

      • "It's your father's Sinclair ZX Spectrum. This is the weapon of a computer hacker. Not as clumsy or as random as an iphone, but a more elegant weapon for a more civilized age.

        Um, yeah. I had one of those, and elegant is not a word that was used to describe them, even when new. Being that I was alive back then, I can also assure you that it was not a more civilized age either. Crime and pollution were much worse than now. Racial prejudices were starting to die off, and sexual orientation prejudices were very prevalent.

        For years, the hackers were the guardians of peace and justice in the internet. Before the dark times, before the NSA."

        I'll give you that. Hackers were pretty damn benevolent. Most cracking was meant to be more for humor or to see if you could do it, than anything harmful. But the

        • by MacTO ( 1161105 )

          Um, yeah. I had one of those, and elegant is not a word that was used to describe them, even when new.

          Elegant depends upon context, and I would argue that those computers were elegant in the context of their era. Difficult to use, sure. Yet compare that to the technology that preceded it. If you needed to type something out, typewriters sure were simple. Needed to make changes, then you needed to use a correction tape. Except that wasn't always appropriate, so you had that thing called drafts. What about doing calculations? There were machines of various sorts that could handle that, yet you had to

          • I had an Apple II from a couple of years prior also. It was a much better computer on every point, other than maybe size/weight. Well, it did have less RAM once I added the 16K RAM pack to the Sinclair. I also had a TI 99/4A from the year before the Sinclair was released. I got the Sinclair for the novelty and because it was relatively cheap. So, no, it was not elegant even then.
          • by mjgday ( 173139 )

            Um, yeah. I had one of those, and elegant is not a word that was used to describe them, even when new.

            Elegant depends upon context, and I would argue that those computers were elegant in the context of their era. Difficult to use, sure. Yet compare that to the technology that preceded it. If you needed to type something out, typewriters sure were simple. Needed to make changes, then you needed to use a correction tape. Except that wasn't always appropriate, so you had that thing called drafts. {snip}Spreadsheets {snip} Accounting software {snip}

            We're talking about a ZX Spectrum, right?

            I know they had a Word-processor and probably a spreadsheet for the Speccy, but that's hardly an average use case.

            Manic Miner and Jet Set Willy were elegant tho, elegant and awesome!

    • The pace of technological advance has been accelerating for some time, and "Moore's law" was not the driving force by any means, because the phenomenon started long before the invention of the transistor or integrated circuit.

      Let's compare 0 AD and 1000AD. Sure, there are some advances and changes, but by and large not too different. Jumping from one time to the next, technology is going to be the least of your concerns as far as differences go.
      Now let's go from 1000AD to 1500AD. Changes are a little more appa
      • The last major, world-changing thing was the internet - some 25 years ago. Since then we've just seen it get better and better - but no real breakthroughs

        Before that it was jet planes and antibiotics - mid 50s

        Before that motor cars - 1900 or so

        Before that railroads - 1830 or so

        Now it may be that we are waiting for the next major breakthrough.
        • I disagree. The internet of 25 years ago, and the internet of today, are very different things. Even the internet of 10 years ago is noticeably different than today, partly because I can take it with me in the palm of my hand, in ways that weren't possible then (or were limited to the ridiculously wealthy) - and that's not solely a function of computing power increases. It's improvements in a lot of things, from battery storage capacity and size to spectrum use to the establishment of robust wireless data
        • The last major, world-changing thing was the internet - some 25 years ago. Since then we've just seen it get better and better - but no real breakthroughs

          Um, 25 years ago was... 1990. In that time we've gone from computers being a comparative rarity (many people didn't even have a home PC) to nearly 80% of the population carrying round a computer in their pocket. No one had cellphones in 1990 to a first approximation. Now almost everyone does.

        • omg - what rock are you under?

          I'll list some for you:
          1) smart phones - world changing - 13 years ago with the BlackBerry when people started to use them
          2) human genome sequencing - world changing - 15 years ago - completed 2000, but finalized in 2003
          3) digital cameras - world changing - average people didn't start using them until 16 years ago, 1999.
          4) LCD monitor - world changing - 17 years ago
          5) rebirth of the electric cars - world changing - 7 years ago
          6) Linux - 24 years ago
          7) Amazon - 21 years ago -- a

          • I'll give you the smart phone; as a luddite who refuses to use one, I tend to forget their significance. Digital cameras - also true. Genome sequencing - not yet THAT significant; whilst helpful for law enforcement, we've yet to see its wider application. LCD monitors - only significant as leading towards smartphones etc. LINUX, Amazon and electric cars - nah - not that significant.

            However the central experience of western life - of living in nuclear families in dispersed suburbs, travelling to work in
    • At some point it will cease to make sense to update your computer on a regular basis. I have a 10 year old one that is fine for internet browsing and word processing. I have a friend who still uses Windows 2000 on hers - though her household does have another one. As computers get to the point of being good enough for all but the latest, most processor-intensive activities, the concept of keeping an heirloom one - especially ones designed to be upgradeable - will probably make more and more sense.
      • At some point it will cease to make sense to update your computer on a regular basis. I have a 10 year old one that is fine for internet browsing and word processing

        Regular, yes; heirloom, no. The time it takes for hardware to become physically obsolete to the point of uselessness has increased and will continue to increase, but a whole generation through which zero innovation in computers happens? Barring a post-apocalyptic scenario, that's not going to happen.

        ...A nail clipper is extremely limited in its purpose and possible number of designs; it has a very attainable optimal design after which no substantial improvement can be made. The current and most prevalent nail clipper design is extremely ele

  • Moore's Law is over (Score:5, Interesting)

    by Anonymous Coward on Tuesday April 14, 2015 @05:18PM (#49474401)

    Incidentally, Moore's law technically died sometime last year, as Intel failed to ship its new node within "18-24 months" of its last one, meaning the density of transistors did not, for anyone, double within the time limits specified by Moore's Law. With the other foundries (TSMC/GloFo/Samsung) still ramping up the same feature density with FinFET transistors that Intel had 3 years ago, and 10nm bringing even more difficulties than Intel's "14nm", it's a question how much longer feature size can continue to shrink at all, let alone keep to the Moore's Law cadence of every 18-24 months.

  • by eth1 ( 94901 ) on Tuesday April 14, 2015 @05:43PM (#49474561)

    Andrew "bunnie" Huang argues that Moore's Law is slowing and will someday stop

    Moore's Meta-Law:
    The number of people predicting the end of Moore's Law doubles every eighteen months!

  • by ajedgar ( 67399 ) on Tuesday April 14, 2015 @05:45PM (#49474571)

    I remember watching Star Trek (TOS) and thinking how fantastic it would be to have all that storage in that little cartridge the size of a matchbook; books, movies, medical records, the Encyclopedia Galactica, all on one little memory device. I never expected it to happen in my lifetime.

    Then in 1985 once the initial glow of the original Macintosh had worn off a little, my brother and I brainstormed on what our _ultimate_ computer would be: 1024x768 TrueColor display, a whole _8_ megabytes of memory, and a 50 MHz 68000-series CPU. Wheee!

    Now we have 128 GB microSD cards smaller than your fingernail. And that super-computer in your pocket that happens to make phone calls? It's more powerful than a 4-processor Cray Y-MP M90 circa 1992.

    We've come a long way!

    --aj;

    • For me it was reading The Adolescence of P-1 in the late 1980s; its mention of 'gigantic' 70 MB disk drives gave me a laugh.
      https://en.wikipedia.org/wiki/The_Adolescence_of_P-1 [wikipedia.org]

    • by Agripa ( 139780 )

      Unfortunately, high-density flash memory has a retention of months to years unless it is scrubbed. That makes it great for SSDs, which are regularly used, but useless for archival purposes or even as a replacement for magnetic and optical removable media in many applications.

      • Source? I'm interested in this.
        • by Agripa ( 139780 )

          The manufacturers do not like to advertise this, so specifications are in short supply. I ran some of my own tests on various unused USB flash drives I had lying around, and none of them retained data for more than a year, whether powered or unpowered, so I assume they do no background scrubbing. SSDs generally have better documentation and will specify something like 1 year of unpowered retention. Beware of "typical" specifications, which have almost no meaning.

  • Koomey's law (Score:5, Interesting)

    by Sara Chan ( 138144 ) on Tuesday April 14, 2015 @05:49PM (#49474601)
    Moore's law is sort of a mangled version of Koomey's law [wikipedia.org]. Koomey's law states that the number of computations per joule of energy dissipated has been doubling every 1.6 years. It appears to have been operative since the late 1940s: longer than Moore's law. Moreover, Koomey's law has the appeal of being defined in terms of basic physics, rather than technological artefacts. Hence, I prefer Koomey's law, even though Moore's law is far more famous.

    There is another interesting aspect to Koomey's law: it hints at an answer to the question "for how long can this continue?" The hinted answer is "until 2050", because by 2050 computations will require so little energy that they will face a fundamental thermodynamic constraint—Landauer's principle [wikipedia.org]. The only way to avoid that constraint is with reversible computing [wikipedia.org].
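    For anyone curious where a mid-century cutoff like that comes from, here is a minimal back-of-the-envelope sketch. The starting efficiency below is only an illustrative placeholder (not a measured value), so the printed number of years is illustrative rather than a precise match for 2050; the Landauer bound itself is just k·T·ln 2 per bit erased.

    import math

    k_B = 1.380649e-23                   # Boltzmann constant, J/K
    T = 300.0                            # room temperature, K
    landauer = k_B * T * math.log(2)     # minimum energy to erase one bit, ~2.9e-21 J
    max_ops_per_joule = 1.0 / landauer   # ~3.5e20, treating one op as one bit erased

    ops_per_joule_now = 1e12             # placeholder for today's efficiency (ops/J)
    doubling_years = 1.6                 # Koomey's law doubling time
    years_left = doubling_years * math.log2(max_ops_per_joule / ops_per_joule_now)
    print(f"Landauer limit: {landauer:.2e} J/bit; ~{years_left:.0f} years left at Koomey's pace")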
    • Reversible computing requires infinite storage. Won't and can't happen.

      • by Zeroko ( 880939 )
        Reversible computing in no way requires infinite storage...you just compute something, copy the answer, & then un-compute it (by computing each value in reverse order & XORing it with its original copy, for example). You then only need storage for the maximum size of temporary data plus the final answer, just like now. You get a speed penalty for all that un-computation, of course, but not infinite storage. Plus, you can still expend energy occasionally to erase data (such as the data left over from
        • Reversible computing is nothing more than making everything a one-to-one function.
          Current computing is merely functional, but not one-to-one. Operations such as XOR are not one-to-one functions because XOR(0,1) = 1 and XOR(1,0) = 1; given the output 1 and the function XOR you cannot recover the inputs.

          Reversible computing makes all operations one-to-one, and thus reversible. This is achieved by storing some of the inputs for many-to-one functions. If you want to reverse more than one step (the whole poin

          • by Zeroko ( 880939 )

            B=A XOR B (leaving A unchanged) is a reversible operation & is what I meant. More generally, B=f(A) XOR B is reversible (in fact, self-inverse), where f can be any (even irreversible) function.

            Sure, you need to save the input to otherwise-irreversible steps, but the point is that you can erase a known value, & since there was some method to compute the intermediate values in the first place, they can be removed from memory in reverse order. (This is a known method—I did not come up with it.) T
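            As a tiny illustration of that compute/copy/uncompute pattern, here is a sketch in ordinary (physically irreversible) Python; the names and the example function are made up for illustration, and the real energy argument is about the hardware, not the language:

            def xor_step(f, a, b):
                # Logically reversible update: (a, b) -> (a, b XOR f(a)).
                # Applying it twice restores (a, b), even if f itself is many-to-one.
                return a, b ^ f(a)

            f = lambda x: (x * x) & 0xFF     # squaring mod 256: clearly not one-to-one
            a, b = 7, 0
            a, b = xor_step(f, a, b)         # "compute": b now holds f(a)
            answer = b                       # copy the answer out
            a, b = xor_step(f, a, b)         # "uncompute": b is back to 0, no garbage left
            assert (a, b) == (7, 0) and answer == f(7)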

            • With XOR you don't need any additional storage, but there are functions (whether they be at the transistor level or at the application level) where many variables result in a much smaller output. You need additional storage in these cases. You are also not always free to overwrite variables if you can recover them, because many functions may take a single variable as input. Recovering X may involve stupidly long chains, and you need storage to go back through the whole chain for as far as you want to rec

              • by Zeroko ( 880939 )

                Hmm. I suppose that can be true in an iterative setting (needing to store some data from every iteration), & that the only hope of avoiding that is rewriting the whole loop to be fully reversible so it does not consume space every iteration. (It cannot take more space than linear in the run time, at any rate.) I was imagining recursive functions with stack allocation for each, but I should know better since I use tail recursion all the time. So I guess I was only right about iteration- & tail-recurs

              • by Zeroko ( 880939 )
                Reversible Space Equals Deterministic Space [ucsd.edu] says that for a Turing machine running in time T(n) & space S(n), you can get the space & time both linear in T(n) (as I suggested) or space O(S(n) log T(n)) with time O(T(n)^(1+epsilon)) or space O(S(n)) with time exponential in T(n). So there is a tradeoff, but the space does not have to be (more than linearly) worse if you are willing to wait (way too long, of course, unless you are already worrying about the heat death of the universe), & not much
                • Yes, you can trade off time with storage, but it's still on the order of n for storage, which is infinite in the general sense and impractical in the real-world sense.
                  Think of all the non-reversible operations a single CPU core @ 3 GHz chews through in a year.

                  • by Zeroko ( 880939 )
                    The space requirement is not infinite for reversible computing unless it is also infinite for irreversible computing (& thus equally impractical), even if you want a polynomial slowdown. The paper proves this. That 3 GHz CPU either has finite external memory (& thus loops or stops after at most exponentially many steps (or, in the real world, suffers hardware failure)) or infinite external memory (in which case, you have already solved the infinity problem).
    • Well, in the 35 years until 2050 there will be approximately 23 more Moore's law doublings, which means computing chips will be around 8.4 million times more powerful than now. So around 60 iPhone 41s in 2050 will have the same computing power as all of the 500 million iPhones currently on the planet.

      That should allow us to do a lot of cool stuff.

      As an aside, I consider Moore's law as more a product of the geometric progression of chip lithography. You increase feature resolution by a linear amount and yo
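      For what it's worth, the arithmetic behind those figures, assuming the ~18-month cadence and the 500 million iPhone count quoted above, is just:

      doublings = (2050 - 2015) / 1.5          # ~23 doublings in 35 years
      growth = 2 ** int(doublings)             # 2**23 = 8,388,608, i.e. ~8.4 million
      phones = 500_000_000 / growth            # ~60 hypothetical "iPhone 41s"
      print(doublings, growth, round(phones))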

    • by tlhIngan ( 30335 )

      Moore's law is sort of a mangled version of Koomey's law. Koomey's law states that the number of computations per joule of energy dissipated has been doubling every 1.6 years. It appears to have been operative since the late 1940s: longer than Moore's law. Moreover, Koomey's law has the appeal of being defined in terms of basic physics, rather than technological artefacts. Hence, I prefer Koomey's law, even though Moore's law is far more famous.

      There is another interesting aspect to Koomey's law: it

  • ... under a loose interpretation. Mooers' law [wikipedia.org] is 56.
  • by pjrc ( 134994 ) <paul@pjrc.com> on Tuesday April 14, 2015 @06:01PM (#49474669) Homepage Journal

    Regarding Andrew "bunnie" Huang's ridiculous article....

    As commercial success and product differentiation start to depend less on quickly leveraging the latest hardware and more on algorithmic improvements, companies will not magically become more inclined to publish source code. When the path to improved performance involves massive man-hours optimizing code, small teams & startups will not somehow gain an advantage.

    Click baiting "open source" and an interactive graph might bring a lot of page views, but the entire premise is truly absurd.

  • I am feeling sick and sad that my generation could be the failure that couldn't keep up with Moore's law and is looking for excuses, marketing incompetence as innovation.

    Especially considering that we can't even fucking go to the moon anymore, and the motherfuckers who did it used fucking 64kb computers.

    2 out of 2. We are self-appointed lazy losers full of ourselves and deserve no respect from our ancestors.

    • Yeah, I wonder why this generation hasn't discovered new elements or new fundamental forces, or new Euclid's theorems. Stupid generation.

      Are you seriously this stupid?

    • by khallow ( 566160 )
      Just feel good that you'll never be as bad as your children.
  • Chris Mack begins by arguing that nothing about Moore's Law was inevitable. "Instead, it's a testament to hard work, human ingenuity, and the incentives of a free market.

    Humans working hard and having ingenuity, and being incentivized by the free market are all things that are sort of inevitable in themselves. I don't mean to diminish those positive features of humanity, but I think it's ok to take them for granted in the sense that I don't think it is likely for those things to stop being features of humanity barring some kind of catastrophe.

    Was Moore's Law going to be as true as it was with 100% probability? No, some stuff could have gone wrong. Some people might have

  • From the summary:

    But the doubling time for transistor density is no guide to technical progress generally. Modern life depends on many processes that improve rather slowly, not least the production of food and energy and the transportation of people and goods.

    A lot of progress depends on information technology, though. For example our understanding of biochemical processes. Or the capability of satellites that monitor what's going on with our planet. Or our understanding of quantum effects in semiconductor materials, in turn the basis for IC's, LED lighting, and a whole slew of other applications. Our use of smartphones & related communication technology. Or even something as "low-tech" as logistics.

    Make computation cheaper, a

  • Yes I know technically the number of transistors on a chip is still doubling every 18 months or so; and yes that means cheaper chips that use less power. Yes that is all fine and good. But kids today don't seem to remember back when having twice as many transistors pretty much meant having twice the computing power. That 486 could do twice as much at the same clock speed as the 386 -- and the 486 was eventually going to be sold at higher clock speeds. And you didn't need to recompile anything to take advant

    • by serviscope_minor ( 664417 ) on Tuesday April 14, 2015 @07:34PM (#49475161) Journal

      I don't think it's that you're older. Home computing was very much in its infancy in the 80s, and only started growing up in the 90s. As with all things, it was a period of wild optimism, rapid change, rapid improvements and huge variety. Now it's settling down and becoming much more boring as all the low-hanging fruit has gone and larger and more expensive operations are required to squeeze out the remaining performance.

      The exact same thing happened in both the automobile and aeroplane industries as well, but I was born long after they entered the boring phase.

      In the early 1900s, any yahoo with a bicycle garage, a couple of petrol engines, a good supply of wood, some optimism and some giant brass ones could build and fly a primitive aircraft. And they did, in huge quantities. There were all sorts of wacky things like rotary engines where the whole crankcase rotates, wings that twisted, weird patterning and layouts of wings, on-wing gantries for in-flight servicing of broken-down engines, and so on and so forth.

      Now it's about shaving 0.1% off the fuel burn by optimising for short-haul versus long-haul flights and so on.

      IOW, it's not "things were better when we were kids"; rather, many industries have gone through these transitions and computing is no exception.

  • For the past thirty years, experts have told us that Moore's Law is likely to end within ten years.

    What do the experts today think? Predictions are in: Moore's Law will probably end in about ten years [pcworld.com].

    Good to see some things never change.

  • Back in the 1960's there was a pundit/gadfly by the name of Herb Grosch, who posited a similar law about cost/speed of the various models of IBM computers. One of my first jobs at Honeywell EDP Division was analyzing the law as it applied to the 360 line. Fitted perfectly. Then we hired away a guy from IBM Hq who told us it was their pricing strategy.
  • by manu0601 ( 2221348 ) on Tuesday April 14, 2015 @07:46PM (#49475227)
    Once Moore's law is a thing of the past, developers will have to take care of software performance, instead of requiring the latest hardware to run badly optimized code.
  • Computers opened the flood gate to a limitless wonderland. Moore's law is simply a method of observing the radical progress in computer power. Frankly I think that five years from now we will marvel at how we got by with today's computers and electronics. I don't think we have even seen the beginning of what is surely going to occur. Maybe I'm a mindless idiot with a foggy grasp of reality but my view is that we will soon have computers capable of writing and testing programs on their own without h
    • by Anonymous Coward

      In 5 years? Why? Computers 5 years ago were really the same as today's computers. Same with computers 20 years ago.

      Computers have been the same for many years now. Just faster.

      And there is no such thing as AI or machine intelligence. There has been ZERO progress in that field.

  • thanks you for the info. http://www.educa.net/curso/que... [educa.net]
  • And in 18 months it will be 100 years old!
  • Electronics are progressing faster than we meat puppets can deal with. We're going to have issues as electronics have the capability to take over more and more of what we humans do.

    When you ask someone, "What do you do?" you generally get their job as the answer. It's part of our internal definition. What happens when you do nothing (and get paid nothing)?

    • Then you run into the predictions of the Technocracy movement [wikipedia.org], where the price system collapses and most people have no job and zero income. Without jobs there are no consumers; without consumers there are no jobs. It's inevitable as the amount of work an individual can do increases with technology.

  • ...we all give up.

    Even if we have to invest exponentially more resources into shrinking transistors, the industry is very likely to continue to invest. They will give up when the R&D costs are high enough that there is no longer any profit. But marketing has really pushed people to upgrade to new devices that they don't need; if marketing continues to do its job, then we'll see Moore's Law working for quite some time to come.

  • I've heard the term for years and thought I understood it. However, this thread seems to contain a lot of debate on exactly what Moore's Law means... I don't believe it actually has anything to do with CPU power doubling or transistor density. Can somebody clarify a precise definition?

    Here is my interpretation...

    If I buy a CPU today for X dollars, then in 18 months a CPU will exist that contains roughly twice the number of transistors and also costs X dollars to purchase.
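    One way to write that interpretation down as a formula (just a sketch; the 18-month doubling period is the figure most commenters here use, and the starting count is an arbitrary example):

    def transistors_at_constant_cost(months, n0, doubling_months=18):
        # Projected transistor count purchasable for the same money, if the cadence holds.
        return n0 * 2 ** (months / doubling_months)

    print(transistors_at_constant_cost(18, 1e9))   # ~2 billion after 18 months
    print(transistors_at_constant_cost(36, 1e9))   # ~4 billion after 36 months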

     
