Hardware Technology

Michio Kaku's Dark Prediction For the End of Moore's Law 347

Posted by timothy
from the techno-malthusian dept.
nightcats writes "An excerpt from Michio Kaku's new book appears at salon.com, in which he sees a dark economic future within the next 20 years as Moore's law is brought to an end when single-atom transistors give way to quantum states. Kaku predicts: 'Since chips are placed in a wide variety of products, this could have disastrous effects on the entire economy. As entire industries grind to a halt, millions could lose their jobs, and the economy could be thrown into turmoil.'" Exactly the way the collapse of the vacuum tube industry killed the economy, I hope.
This discussion has been archived. No new comments can be posted.

  • by andreicio (1209692) on Sunday March 20, 2011 @08:32AM (#35549336)

    No one will take a disaster prophecy seriously if you can't even be bothered to pair it with some planetary alignment or ancient calendar.

  • The major difference being the tube/valve industry was done in by the transistor - i.e. we had a viable replacement that was better. The problem with the transistor is that we don't (yet) have a viable replacement.
    • by frnic (98517) on Sunday March 20, 2011 @08:45AM (#35549412)

      Before we had transistors we didn't have them yet either.

    • The major difference being the tube/valve industry was done in by the transistor - i.e. we had a viable replacement that was better. The problem with the transistor is that we don't (yet) have a viable replacement.

      There's a big difference between then and now. We have a lot of people/companies/countries trying to drive the progress and development of new technologies. Small startups can play a role, or even become the new leaders; ungodly international conglomerates can 'change or die'. There's a brave new world out there ... but there's always a brave new world out there. Now you young folks go get it and bring it back to those of us who are tired, cranky and complacent.

    • by maxwell demon (590494) on Sunday March 20, 2011 @08:52AM (#35549454) Journal

      So what? Already today the chips are just perfect for most applications. Add 20 more years of Moore's law, and we won't even need more powerful chips. You'll have the power of today's supercomputers on your cell phone. I doubt Moore's law would continue even if physically possible, because there will be no need for it.

      • by mwvdlee (775178)

        Until even the most complex task imaginable can be computed in less time than it takes you to click a button, there will be a need for more processing power.

        • by kdemetter (965669)

          Yes, and this 'need for more processing power' is exactly what Moore's law exploits: Moore's law basically dictates that the demand for processing power doubles every year.

          As a result, it's most profitable to follow this demand.

          Speeding it up would be silly (even if new technology would allow it), because that means you lose money:

          For example, if I suddenly were to create a processor which has 10,000x the processing power, I would go bankrupt:

          - Either it would be so expensive, that no one would

          • by Anne Thwacks (531696) on Sunday March 20, 2011 @10:38AM (#35550186)
            Moore's Law ensures that every year people will find that their computer is too slow

            No - Microsoft does that. Moore's law ensures that new computers can perform better at the same rate that MS adds bloat to their software, or marginally faster. By avoiding the use of Windows, I can continue to use my 4 year old PC or ten year old Sparc machines. YMMV

          • by mwvdlee (775178)

            If you could make your 10,000x CPU, chances are you'll be filthy stinking rich in no time.
            It may be expensive for individuals, but for companies it'd probably beat having a room full of racks including cables, cooling, etc.
            If it'd be cheap, people would want a faster one next year, in order to play their realtime 3D raytraced games on eight 2048x1536 screens instead of just four.

      • Re:On vacuum tubes. (Score:5, Interesting)

        by jbolden (176878) on Sunday March 20, 2011 @10:22AM (#35550072) Homepage

        Today's chips were perfect for most applications in the 1980s. Once WordPerfect could outrun a human in terms of spell check and could outrun even the fastest printers, CPU upgrades didn't do much. Same with Lotus 1-2-3: once complex spreadsheets with lots of macros could be processed faster than a human could read a spreadsheet....

        But all that excess power led to the GUI. And then technologies like OLE. Which drove up requirements by orders of magnitude. But OLE hasn't really hit another generation because everything is so unstable. Imagine the next generation of applications that have data embedded from dozens of devices and hundreds of websites. I do a Quicken report which

        a) contacts my banks internet connections and pulls in all the credit card transactions
        b) hits each of those vendors (100+) with the credit processing number and pulls up all the items for each transaction
        c) does an item lookup to figure out what sort of expenses they are and prorates out general costs, like sales tax. That's 1000s of web information requests for an annual report.

        That sort of data processing we don't yet have and certainly not on cellphones. Another area is AI where systems are underpowered.
        Imagine a news search engine that knows my entire browsing history. Like a Pandora across all my news choices for the last year. I search for a story, and because the system knows my preferences on dozens of dimensions, it's able to feed me the stories that best fit my preferences. Analyzing every article every day to do simple word counts is about the limit of a massive Google datacenter. Analyzing every article every day to determine how much scientific background it assumes in biology, in chemistry, in mathematics; what sort of editorial biases it has; how human-interest heavy the presentation is; how respected the journal is... that's way beyond what we can do today.
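        The multi-source report described above could be sketched like this (every name and data source here is a hypothetical stub; real banks and vendors would be network calls):

```python
# Hypothetical sketch of the multi-source expense report described above.
# All functions are stubs standing in for bank/vendor web requests.

def fetch_card_transactions(bank):
    # stub for step a: returns (vendor, processing_number, amount) tuples
    return [("acme", "txn-1", 40.0), ("globex", "txn-2", 60.0)]

def fetch_line_items(vendor, processing_number):
    # stub for step b: each transaction resolves to itemized purchases
    return [{"item": "widget", "cost": 30.0, "category": "office"},
            {"item": "sales tax", "cost": 10.0, "category": "tax"}]

def annual_report(banks):
    # step c: categorize and prorate the line items into totals
    totals = {}
    for bank in banks:
        for vendor, txn, _amount in fetch_card_transactions(bank):
            for line in fetch_line_items(vendor, txn):
                totals[line["category"]] = totals.get(line["category"], 0.0) + line["cost"]
    return totals

print(annual_report(["mybank"]))  # → {'office': 60.0, 'tax': 20.0}
```

        A real version would replace the stubs with thousands of network requests, which is exactly the workload the comment says we can't yet do casually, let alone on cellphones.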

      • by Caraig (186934) *

        And 640K is all we'll ever need. =)

        On a more serious note, really... I think it's some sort of corollary to Moore's law: Processing needs will always expand to fill the available processing capacity. In short, we're going to be using our pocket computers with quantum-state processors, and still be wondering why frickin' Outlook is running so slow.

        • by dakohli (1442929)

          Agreed.

          As our computers have become more capable, they actually demand more from users, and productivity may not increase as much as you think.

          In 1990, we had typing pools in my organisation, which produced memos, letters and other paperwork. Now that there is a computer on every desk, we have lost these pools. Now, the originator must write his/her own staff-work. It is expected that grammar and spelling be correct. Supervisors will edit for style, and often send it right back down to the originator.

      • I agree that the doomtastic prophecies seem rather overblown. If nothing else, the fact that it is (comparatively) easy to make predictions about the endpoint of Moore's law, and the approximate performance thereof, makes being taken by surprise harder. And surprise, not limitations in themselves, is where complex systems really tend to bite it. We know how large atoms are, we have predictions of varying pessimism regarding how many you need to make a transistor that will actually work, and work often enoug
      • I don't dispute that there is some limit that we will approach with respect to computing speed. What I don't see is evidence for 'economic collapse' as a consequence. Surely there will always be a need for programmers? Maybe more so, because efficient programming will yield greater speeds, and you won't be able to get by with mediocre-quality ones and lazily rely on hardware getting faster. Similarly speaking, with 2011 hardware alone we are still nowhere near reaching the full economic capabilities o

    • by martyros (588782) on Sunday March 20, 2011 @10:45AM (#35550246)

      I'm by no means a hopeless optimist, but I think the arguments he's making here don't really make much sense. He's focusing exclusively on one aspect -- the increase in speed / computation power -- and saying that when that stops developing, everything will stop and die massively.

      That doesn't make any sense. Cars got faster between 1910 and 1930. But after they reached "as fast as humans can actually control them safely", they stopped getting faster, by and large. Did that cause a collapse of the industry? Did everyone completely stop buying cars? Consider airplanes -- between 1900 when the first flight happened, to WW2 where they were a critical part of strategy, they got faster. But once they reached the limits of speed / air resistance economics, they stopped getting significantly faster -- at least as far as most consumers are concerned. Now the main difference in passenger experience between a plane made 30 years ago and one made 10 years ago is whether the in-flight entertainment is on one shared screen, or each person has their own screen. This lack of increase in airplane speed has somehow failed to destroy the airline economy.

      When transistors hit their limit, there will still be huge amounts of transforming to do. Even within technology, there are things to do: there's a whole avenue of domain-specific chips to pursue. With the exception of GPUs (and possibly cryptography), there has been until now no point in making chips to do one specific thing; by the time you made it, Intel's CPUs would be more powerful at doing whatever it was you were going to do anyway. When we really hit the limit of silicon, that will become a rich avenue to explore.

      Outside of technology, there's even more. Culturally, we don't even know what to do with all the computing we could have. If my sink or table or door or wall isn't as smart as it could be, it's not because there aren't enough transistors, it's because we don't know what to do with the transistors. I'd say that the biggest limitation right now to ubiquitous computing isn't so much number of transistors, as what to do with the transistors. Will there ever be a task that my microwave will perform that will require 4 cores of an i7 supplemented with GPUs? User interfaces, techniques, and all kinds of other things are still wide-open. I'd go so far as to say that computing power isn't nearly the biggest difference between the computers of today and the computers of five years ago.

      The main point is, there's still a lot of innovation that can be done that will not somehow fail if chips don't get faster.

      • by Culture20 (968837)

        That doesn't make any sense. Cars got faster between 1910 and 1930. But after they reached "as fast as humans can actually control them safely", they stopped getting faster, by and large. Did that cause a collapse of the industry? Did everyone completely stop buying cars? Consider airplanes -- between 1900 when the first flight happened, to WW2 where they were a critical part of strategy, they got faster. But once they reached the limits of speed / air resistance economics, they stopped getting significantly faster -- at least as far as most consumers are concerned. Now the main difference in passenger experience between a plane made 30 years ago and one made 10 years ago is whether the in-flight entertainment is on one shared screen, or each person has their own screen. This lack of increase in airplane speed has somehow failed to destroy the airline economy.

        The Concorde was a commercial endeavor pushing the speeds of consumer air travel, but it died, and no one has started another supersonic jet company.

      • by evilWurst (96042)

        Well, yes and no. The physics is unforgiving; as Kaku says, we're going to hit a transistor shrink wall. At that point, the easy advances are over.

        The transistor shrink wall isn't the same thing as peak computation power, only a predictor of it. We have room for advancement in how well we use the transistors we have; once we've got them as small as possible, we can improve how densely we pack them, and how efficiently we utilize them for computation. Those problems are harder to solve, so we haven't been d

  • Dark predictions (Score:5, Insightful)

    by Wowsers (1151731) on Sunday March 20, 2011 @08:33AM (#35549348) Journal

    I predict a dark future for Michio Kaku's new book.... namely, the bargain bin.

    • Re: (Score:2, Informative)

      by Anonymous Coward

      Yeah, it's embarrassing when someone who's brilliant within his area of expertise starts nosing into other fields (in this case economics and the electronics industry) just to say stupid things. By the way, he did this before, although the previous victim was biology. Why do physicists think they are masters of all sciences? [scienceblogs.com] Granted, that was in response to a question, but he really should have said ‘I have no clue’. Why oh why do experts always think they're experts in everything?

    • ... and subsequently the landfill.
  • by Hazel Bergeron (2015538) on Sunday March 20, 2011 @08:34AM (#35549354) Journal

    Software developers are going to have to consider increasing efficiency as they make their wares more complex! And we might have to actually implement concurrency research which is under two decades old!

    Who knows, we might even end up with the responsiveness of my RISC OS 2 Acorn A3000 in 1990.

    • Who knows, we might even end up with the responsiveness of my RISC OS 2 Acorn A3000 in 1990.

      Ah, my trusty old friend Acorn... what went wrong?

    • by rolfwind (528248)

      I think they have been increasing efficiency the past decade, because, just as a casual observer, I haven't seen the types of gains that were to be had from 1995 to 2003ish anymore.

      Not in clock speed increases (yeah, I know this isn't everything, but it certainly was part of the equation) in CPUs, and HDDs just don't increase in capacity like they used to. Back in the early-mid 00s, I upgraded from 40GB to 320GB, nearly an order of magnitude bigger. And now, in my price range, I'm looking at jumping fr

      • by Jiro (131519)

        I wouldn't call a factor of 6, when I was expecting a factor of 8, "a lot less increase than I was expecting"

      • It's more or less undeniable that we are going to run out of low-hanging fruit as time goes on(it is, after all, entirely reasonable to take advantage of the most accessible potential improvements to your technology, which necessarily leaves you with improved technology and a harder set of future improvements). Just imagine how disappointed somebody upgrading from a vacuum tube based system to an IC based one would be with the rate of progress at any point from then on...

        The part where I think Kaku goes
  • Does anyone pay any attention to Michio Kaku? He isn't quite as much a woo merchant as Deepak Chopra, perhaps one could compare him to the likes of Henry Stapp or Fritjof Capra.
  • Oh really? (Score:3, Insightful)

    by Anonymous Coward on Sunday March 20, 2011 @08:39AM (#35549384)

    Apparently people can't:
    make cluster computers
    make boards with multiple chip sockets
    make extension boards that go in PCI or potential future slots
    use distributed computing
    [insert many other ways to get around limited processing power]

    Man, we sure are screwed in 20 years time, computers will start rebelling against us because we can't make them smaller than the boundaries of the universe!

    On a more serious note, this is retarded. Period.
    20 years is a VERY long time.
    By then, we'd probably actually have the beginnings of working quantum computers that are useful.
    By then, we'd have almost certainly found out how to get around or deal with these problems, possibly even taking advantage of quantum effects to reduce circuit complexity and power needs.
    Who knows, but I know one thing for sure: the world won't end, life will go on as usual, and this book will still be shit.

  • by gearloos (816828) on Sunday March 20, 2011 @08:44AM (#35549400)
    Michio Kaku is not necessarily the best in his field, mediocre at best, but he has the biggest voice. I was talking to an older woman a while back, and she is a devoted fan of his. I asked her what she knew of him other than that he does "layman's" breakdown commentaries of physics for the Discovery Channel, and she actually thought badly of me for trying to undermine her opinion of "the top physicist in the world today". Well, that's definitely HER opinion and not mine. Just because he has a big mouth (media-wise) does not make him remotely right on anything, is the point I'm trying to make here. Oh, I just got it: now I understand politics, lol
    • I'd bet most of the top people in their field don't take the time to make their field publicly accessible. Stephen Hawking comes to mind as a counterexample with a few books, but I can't think of a single mathematician counterexample. My point is Michio Kaku doesn't have to be a "top physicist", and I wouldn't even expect him to be. That he popularizes technical stuff is enough for me.

      He probably has a good point, too, that at least eventually Moore's law failing will have strong economic impacts, and it's
    • by rolfwind (528248)

      Michio just seems very self aggrandizing to me - especially with that one series of his where he dreams up of solutions to various things.

      I really liked Sagan, and I think Brian Cox is a worthy successor to him; he just has and transmits that passion of his for science, and I really like that. Neil deGrasse Tyson also comes close. Then there are the Mythbusters; while sometimes horrible in their scientific method, they're at least showing practical experiments and how to do things besides theory.

      And

    • by MoellerPlesset2 (1419023) on Sunday March 20, 2011 @12:39PM (#35551234)

      Michio Kaku is not necessarily the best in his field, mediocre at best, but he has the biggest voice.

      I agree. But this isn't really news; This is how it's _always_ worked. The public is not going to figure out the merits of your scientific achievements on their own, and then give you attention that's proportionate to that. It's the same as in any other area: You have to market yourself.

      Linus Pauling was arguably the most famous chemist of the last century. But he wasn't actually that important. The quantum-chemical contributions he made were in reality on-par with those of Mulliken, Hund and Slater. Many would say Slater should've shared in his first Nobel prize. But it was Pauling who wrote "The nature of the chemical bond", it was Pauling who popularized the subject, it was Pauling who was the bigger educator and public figure (which was not limited to chemistry). Richard Feynman was one of the most famous physicists. And while his contributions are also beyond question, they were arguably not a lot larger than those of, say, Murray Gell-Mann, who is nowhere near as famous. Because Gell-Mann was not a big educator. His popular-scientific books didn't sell anywhere near as well. Dirac was as important as Bohr when it came to quantum theory, but he wasn't anywhere near the popular and public figure Bohr was. And so he's also less known.

      What bothers me about Kaku isn't the fact that his fame is disproportionate to his scientific contributions, or even the fact that it leads people to think he's a greater scientist than he is. What annoys me about Kaku is his propensity to comment on stuff that he doesn't know much or anything about. For instance, his statements on evolution, which were harshly (but justly) criticized recently by PZ Myers. Or his commenting on the Deepwater Horizon spill, or the Fukushima disaster (which he, IMO recklessly, called the worst disaster second only to Chernobyl, even though it's far from clear that it'd be worse than Three Mile Island or Windscale at this point, and it's certainly several orders of magnitude less severe than Chernobyl). And now we have him commenting on Moore's Law, even though he's not a solid-state physicist.

      I suspect he's letting his ego cloud his better judgment. It's not uncommon - the aforementioned Pauling, for all his scientific merits, had a whole bunch of bad, crankish ideas in areas outside his field (nuclear physics, vitamin megadoses, anesthesiology). I don't believe at all Feynman was the humble guy he tried so hard to make himself out to be, but to his credit, he was quite respectful of other fields and did not have that propensity to make himself out to be an expert on things he didn't know much about. Of course, there's also the possibility that it's not about Kaku's ego and that he just genuinely doesn't actually give a damn about educating the public, and is more interested in just getting attention for himself. But I'm prepared to give him the benefit of the doubt on that.

  • Gradual transition (Score:5, Insightful)

    by Kjella (173770) on Sunday March 20, 2011 @08:44AM (#35549404) Homepage

    Sooner or later it will come to an end, but it will come slowly as the challenges rise, the costs skyrocket, and the benefits are lower due to higher leakage and lifetime issues. And designs will continue to be improved; if you're no longer constantly redesigning for a tick-tock every two years, you can add more dedicated circuits to do special things. Just for example, look at the dedicated high-def video decoding/encoding/transcoding solutions that have popped up. In many ways it already has stopped, in that single-core performance hasn't improved much for a while; it's been all multicore and multithreading of software. Besides, there are plenty of other computer-ish inventions to do, like laying fiber networks everywhere, mobile devices, display technology. The world will still be in significant change 20 years from now. Just perhaps not on the fundamental CPU core / GPU shader level.

    • by Sir_Sri (199544)

      And there's a lot to be done with different architectures, and just plain organizing computers differently (basically changing up how the CPU, GPU and various memory systems connect to each other).

      Right now all of that experimental stuff pretty much stays experimental, or custom, because by the time you get it out the door the traditional CPU-GPU market has gone through a tick-tock cycle and no matter how good your idea was, it's still not as economical as a newer version of your traditional hardware.

      Once t

    • by jbolden (176878)

      Excellent, agreed. I said a similar thing: http://hardware.slashdot.org/comments.pl?sid=2045520&cid=35549980 [slashdot.org]

      The end of CPU/GPU density is nowhere near the end of computers "getting better". Your points about all the display technologies and wiring are good ones. We still aren't using the CPU technology we have today in most devices.

    • by Dogtanian (588974)

      In many ways it already has stopped in that single-core performance hasn't improved much for a while

      Amen... I'd like to emphasise that point.

      Yes, it's absolutely correct that Moore's Law relates to the number of transistors; however, for years many people took it as being synonymous with increases in clock speed and performance, because the two pretty much *did* correlate until recently. And while it's not broken yet according to the actual definition, the easy and "free" performance increases that most people took as being an inevitable consequence of Moore's Law *have* massively diminished in the past

  • His view (Score:3, Insightful)

    by Anonymous Coward on Sunday March 20, 2011 @08:46AM (#35549420)

    His view is based upon the chip and not on the device.

    What I'm seeing is folks (manager types) using their iPhone as their business computer, eliminating the laptop and even their desktop. They're on the move, and what they need is a portable communications device that also has some other apps.

    Spreadsheets? That's for the back office staff. The same goes for anything else that you still need a desktop/laptop for.

    So what's my point - desktops and laptops are becoming a commodity back office device (like the typewriter in the past) and the demand has stabilized and as far as business apps are concerned, there isn't any need for more power - bloatware aside.

    To head off the "64K of RAM is all anyone really needs" comments, that was then, this is now. Back then, we were at the birth of the PC revolution. Now, we're in the mature commodity stage. Will we need more power in the future? Yes. But at Moore's law increases? Nope.

    The future is efficiency, portability and communication.

    PC's are inefficient for most uses; therefore, there won't be any "death" or "economic" destruction - just some "old" companies hitting the skids (Dell) or vastly changing their business if they can (HP).

  • There has always been a suffering factor built into changes in technology spilling over and causing changes in society. Usually the suffering has been rather confined. The buggy-whip workers destroyed by the new automobile market were not such a large group. But now things are different and less predictable. A great example is the office staff eliminated by the cell phone. As cell phones took over, the small company was able to get rid of millions of girl Friday types that had answere

  • by s-whs (959229) on Sunday March 20, 2011 @09:10AM (#35549556)
    He made similar economic predictions in the BBC Horizon episode "The dark secret of Hendrik Schoen" (2004).

    That was the day I lost all respect for Kaku. His economic predictions are moronic (there will always be change, abrupt changes in what creates wealth), and in that Horizon documentary his comments seemed ludicrously off track as well.
  • by mo (2873) on Sunday March 20, 2011 @09:16AM (#35549590)
    Even with transistors the same size, there are so many avenues to explore in processor design. Just off the top of my head, how about a memristor-based analog co-processor for tasks like facial detection or language/speech recognition. How about processors with asynchronous clocks, or clockless designs. Sure, they're harder to build, but once transistor sizes stop shrinking, might as well spend the effort, because designs will have a much longer lifecycle.
  • Kaku is a hack (Score:5, Insightful)

    by thasmudyan (460603) <udo.schroeter@gmail. c o m> on Sunday March 20, 2011 @09:26AM (#35549636) Homepage

    This guy is trying to establish himself as some kind of authority on futurism, but I just perceive him as an attention whore who actually contributes very little. Maybe I'm alone in thinking this, but his TV series "Physics of the Impossible" was one big self-aggrandizing marketing gig. I barely made it through two episodes that essentially consisted of the professor rehashing old science fiction concepts and passing them off as his own inventions. Every episode ended with a big "presentation" in front of dozens of fawning admirers. Before the credits rolled, they made sure to cram in as many people as possible saying how great and ground-breaking his ideas were. It was disgusting.

    Are there physical limits to Moore's law? Sure. We already knew that. Circuits can't keep getting smaller and smaller indefinitely, and we already ran into the limit on reasonable clock speeds several years ago. And despite this, the computer industry hasn't cataclysmically imploded.

    • by Hatta (162192)

      guy is trying to establish himself as some kind of authority on futurism, but I just perceive him as an attention whore who actually contributes very little.

      Isn't that pretty much the same thing?

    • Like many top people in academia, he's a professional schmoozer and salesman. Seriously, if you actually look at many principal investigators, they slap their names on papers when the grad students and postdocs do all the work and have most of the ideas.
  • While parts of technology might stop progressing as fast, other parts of technology will start getting optimized, to get over the halting of that other part. So if hardware stops getting faster, people will start optimizing software (which is currently extremely inefficient), until we get a better HW/SW tech at some point later in the future. There's a very nice comment on the Amazon page of the book by JPS, give it a read.

    • by tgd (2822)

      On this one?

      Other than techniques for self-promotion and publicity, what is he generally correct on? I mean, he's not Dr. Oz level of self-important whackjob, but that's not saying much.

  • by drolli (522659) on Sunday March 20, 2011 @09:49AM (#35549818) Journal

    From weird analogies and a certain amount of misunderstanding, the excerpt draws strange conclusions.

    a) Misunderstanding how the frequency spacing relates to the required number of cycles: the correct assumption would be that if light has 10^14 Hz and you restrict yourself to a single-octave circuit (for the sake of simplicity, let's say a 10% relative-bandwidth circuit), then, if you can modulate fast enough, you can cram in 10^13 * log2(S/N) bits per second, so probably 10^14 bits/second. That is a lot.
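    A back-of-the-envelope version of that estimate, assuming the standard Shannon capacity formula C = B * log2(1 + S/N) and an arbitrary 30 dB signal-to-noise ratio (my numbers, not the poster's):

```python
import math

# Optical carrier around 10^14 Hz; assume a 10% relative-bandwidth channel.
carrier_hz = 1e14
bandwidth_hz = 0.10 * carrier_hz                   # B = 10^13 Hz

snr = 1000                                         # assumed S/N (30 dB)
capacity_bps = bandwidth_hz * math.log2(1 + snr)   # Shannon capacity

print(f"{capacity_bps:.2e} bits/second")           # on the order of 10^14 bits/second
```

    Even with this modest assumed S/N, the channel lands right around the 10^14 bits/second figure above.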

    b) Limits to Moore's Law: Moore's law is an economic law. There is no physical limit that I see which can be reached technologically by 2020 (in mass production). There is a technological limit to what can be produced, but going into the third dimension and new materials will give the opportunity to continue on the same course for a while. If you look at what physicists are currently working on, you realize that the end of silicon/metal/oxide technology will not be the end of Moore's Law or classical computing

    c) "on the atomic level i cant know where the electron is". As it happens to be i work on quantum computation and i really hate to explain that: If you arrange a specific situation, then you cant know where the electron is on the atomic scale. If the statement would be as general as he makes it, it would be impossible to have different chemical configurations of the same stoichiometric mixtures. SIngle-molecule electronic/magnetic configurations. The quantum tunnel coupling in single molecule magnets between states can be designed, and i dont see a specific reason why it should be impossible to realize single molecule devices in which tunneling does not play a role

    d) He does not understand FETs, AFAIU.

    e) Contrary to his opinion, very thin 2DEGs exist, and I don't see a reason why, upon (finding and) choosing the right layers, the confinement can't be very steep in the third direction (not infinite, but also not requiring more than 50nm thickness)

    The funny thing is that he forgot what already is, and probably will remain (there *may* be ways out, like superconductors or ballistic transport, but don't bet on it), a real problem for all classical/room-temperature computers: heat. While designing smaller elements may be possible using the right physics/technology, reducing the capacitances of lines (associated with an energy loss in the line resistance per switching) will be difficult. Once we *really* stack in the third dimension, it will take a lot of clever computer scientists (and maybe mathematicians) to reduce the needed interconnects, since otherwise stacking in the third dimension won't give us anything besides memory capacity.
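    To put rough numbers on that line-capacitance point (all values below are illustrative assumptions, not measurements): each switching event dissipates roughly (1/2)CV^2 in the line resistance, so dynamic power grows with line count, capacitance, voltage, and clock rate:

```python
# Illustrative dynamic-power estimate for charging interconnect lines,
# using the textbook (1/2)*C*V^2 energy per transition. All values assumed.
cap_per_line_f = 1e-15   # 1 fF per line (assumption)
vdd_v = 1.0              # supply voltage (assumption)
freq_hz = 3e9            # 3 GHz clock (assumption)
activity = 0.1           # fraction of lines switching per cycle (assumption)
n_lines = 1e9            # a billion interconnect lines (assumption)

energy_per_switch_j = 0.5 * cap_per_line_f * vdd_v**2
power_w = n_lines * activity * freq_hz * energy_per_switch_j

print(f"{power_w:.0f} W")  # 150 W just from charging the wires
```

    Shrinking the transistors doesn't help here unless C or V shrinks with them, which is exactly why heat, not feature size, looks like the binding constraint.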

    So to conclude: I believe that by 2050 the definition of Moore's law will be obsolete. But it will not break down because we are unable to make circuits smaller; it will break down because it may be too expensive to make them smaller, or because powering and cooling the circuits may become impractical. We will probably have a replacement of Moore's law by an equivalent scaling law for power per switching.

    • by mbone (558574)

      I am curious about your take on quantum computers. My impression is that, if they are ever actually made into an operational product, they are likely to have a profound impact in certain areas (watch out, public key encryption!), but are unlikely to be much use in sending emails or watching videos.

    • It's a real shame that I don't have modpoints today. You've made a really insightful post.

      Although you've stuck to the physical issues rather than the economic ones, you've captured the main point there with:

      but it will not break down because we are unable to make circuits smaller, but because it may be too expensive to make them smaller or powering and cooling the circuits may become impractical.

      Most industries transition from a period of rapid growth to a longer period of slower growth. Once we come up ag

  • Pseudo-economist (Score:3, Insightful)

    by Boona (1795684) on Sunday March 20, 2011 @09:51AM (#35549826)
    Another pseudo-economist out to tell us that an increase in productivity and a lowering of living costs will be a net loss for society. Michio Kaku, can you please take an economics 101 class before writing a book about the economic impact of anything? The general population is already economically illiterate, and this only fuels the problem. Thanks.
  • by Anonymous Coward on Sunday March 20, 2011 @10:08AM (#35549974)

    Kaku is an embarrassment. In the mid/late 90s he presented himself as a "nuclear physicist" to the major news outlets (he is no such thing; he's a field theorist) and jumped on the till-then fringe protest movement opposing the launch of the Cassini mission. The opposition was based on the idea that the nuclear batteries on the probe posed a danger in the event of a launch accident. Never mind that there had previously been launch accidents with the same battery type (military sats) and the ceramic plutonium cores were simply recovered and _reused_, because they're practically indestructible. (The batteries are just warm bricks. Radioactive decay keeps them cooking and thermoelectrics generate the juice. There are no controls to go wrong, no parts to break, nada. That's why they're used. The ceramic itself is terrifically tough.)

    Anyway, Kaku saw the cameras and the bright lights, decided that he was a nuclear physicist, and started spouting all sorts of total nonsense to frighten the unwashed masses. He has a long history of pretending to know things. Google "Kaku evolution blather" for another example. I watched him give a seminar once while I was in grad school, and I spent the hour squirming in embarrassment at him and his self-aggrandizement.

    Yes, I loathe the man. I'm a physicist, and he just perpetuates the image of people in my field as hubristic egoists. He needs to be shouted down and run out of the media. There are lots of really good popularizers out there (DeGrasse-Tyson, Greene, etc.) who, yes, need to establish a public presence to make a career, but who are also actually interested in facts and science and education and know their own limits.

    • I mean I agree Kaku is a blight on science. (I mean his explanation of why E=MC^2 on the science channel basically amounted to "That's what Stone Cold Al Einstein said so.") Still Tyson? I mean we are talking about a guy who romanticizes so much how black holes suck everything down around them. Honestly, the way he puts it he makes it seem as though if a black hole were to go through the solar system getting sucked in would be the major concern. (You know, not mentioning that you'd have to get fairly close
  • Just read the article, haven't read the comments yet.

    Moore's law, as far as CPUs and GPUs go, has already slowed down considerably this entire decade. As for memory, the chips aren't that thin yet. What this means is what everyone has been saying for a long time: more cores, more RAM. More cores means applications need to be parallelizable. That's at least a one-time overhaul of most of the world's code base.

    Let's assume hardware improvements in general slow down. This leads to a hardwar

  • Moore's law is not so bad compared to the bigger picture of running out of resources. An economy that assumes exponential growth will be thrown into turmoil, and millions^Wbillions will lose their jobs.
  • Here is a news flash. I have it on good authority that

    - eventually Moore's law will fail, and

    - the world will continue to roll through the void. Life will go on, and we will not burn our MacBook Pros for heat, nor turn our rack-mounted servers into crude dwellings.

  • by buddyglass (925859) on Sunday March 20, 2011 @10:47AM (#35550252)

    Worth noting: this table [wikipedia.org], specifically the overall rows at the top for men and women. Income for men has been flat since 1970 when adjusted for inflation. All the income gains have come from women entering the workforce, going from partial to full employment, and/or the gradual elimination of sex discrimination, which drives down wages. One could also argue that the cost of living has actually risen faster than official inflation measures, especially when one includes the additional costs necessitated by both partners working full time (day care, outsourcing tasks like cleaning and yard work, etc.).

    MOS transistors were developed prior to 1970, but not by much, and they didn't really start catching on until the 1970s. Now I'm certainly not arguing causation here, but by the same token I'm not sure it's valid to suggest (via sarcasm) that the move from vacuum tubes to transistors ushered in a new golden era of prosperity.

    I can't remember the book I read it in, but the author made the claim that if you look at the part of Moore's law about human computing power, i.e. doubling our ability to compute numbers every 18 months, and temporarily ignore the other parts of Moore's law, it holds true going back to prehistoric times.

    In other words, silicon computer chips took up where solid-state circuitry left off, which took up where the abacus left off, and so on, going all the way back to putting marks in the sand.

    If we get to the theoretical limits of si

    There are parts of atoms that already lend themselves well to binary behavior. Of course, figuring out how to manipulate them would be a challenge. But Moore's law pertains to the number of transistors that can economically be put on an IC, right? Well, if you had an infinitely big chip, you could put an infinite number of transistors on it, so there is no theoretical limit. It's just a question of economics. Additionally, I don't think we have done much playing in 3D as far as chips go. We have most
  • Moore's Law doesn't say chips get smaller, it says they get 2x the transistors every 18 months. It doesn't matter if chips hit a wall where transistors can't get smaller. As long as they continue to get cheaper, we will simply start growing by making bigger chips, or more chips, or 3D chips.

    Further, we can get speed other ways: no moving parts, better software design and optimization, simplification. iOS v4.3 on a single core 1GHz ARM feels faster than Mac OS v10.6 on dual core 2GHz Intel because of factors
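    The doubling claim is easy to sketch numerically. The 18-month period is the figure quoted in the comment, and the 1971 starting count of 2,300 transistors (the Intel 4004) is a well-known historical figure; the projection itself is just illustrative, and famously overshoots reality:

```python
# Transistor count under an assumed 18-month doubling period.
# Starting count is the Intel 4004's 2,300 transistors (1971).

def transistors(years, n0=2300, doubling_years=1.5):
    """Transistor count after `years`, doubling every `doubling_years`."""
    return n0 * 2 ** (years / doubling_years)

# Forty years of doubling every 18 months, i.e. roughly 1971 -> 2011:
print(f"{transistors(40):.3g}")  # far above any real 2011 chip's count
```

    Real 2011 CPUs carried on the order of a billion transistors, which is why the two-year doubling period is the more commonly cited form of the law.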

  • by Ephemeriis (315124) on Sunday March 20, 2011 @12:29PM (#35551160)

    I am so sick of seeing Michio Kaku all over the place...

    It made sense back when he was talking about string theory. He's a physicist, after all. But these days he's just some generic scientist who's more than happy to show up on TV and talk about anything even vaguely scientific.

    Did you see him commenting on the whirlpool formed after the earthquakes in Japan? Because a physicist is obviously the most qualified person they could find to talk about ocean currents and plate tectonics and whatnot.

    What makes Michio Kaku any more qualified to talk about Moore's Law than I am? It isn't like he actually knows anything about microchip fabrication or economics or industrial processes... The guy is a physicist.

  • I disagree (Score:4, Interesting)

    by aaaaaaargh! (1150173) on Sunday March 20, 2011 @12:56PM (#35551332)

    Like others I believe Kaku is wrong. Here is my prediction:

    Within the next 20 years massively parallel processing will become more and more common, machines with a few dozen, hundreds, or even thousands of cores will be the rule, and programming languages / compilers will be able to automatically turn sequential programs into parallel ones whenever this is possible. Almost all practical computing problems and needs will turn out to be highly parallelizable. The impact of this change on the economy will be zero. Computers will never stop becoming faster and faster.

    Fifty years from now, or earlier, our massively parallel conventional machines will be replaced by quantum computers. These will first be available to governments and big companies, and within a short period of time they will be miniaturized and become affordable to end consumers.
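    One caveat to the "almost everything is parallelizable" prediction is Amdahl's law: the serial fraction of a program caps the speedup no matter how many cores you add. A minimal sketch (the 95% parallel fraction is just an example value):

```python
# Amdahl's law: speedup on n cores is limited by the serial fraction
# of the workload, no matter how many cores are available.

def amdahl_speedup(parallel_fraction, n_cores):
    """Maximum speedup on n_cores when `parallel_fraction` of the
    work can run in parallel (Amdahl's law)."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_cores)

# Even with 1000 cores, a program that is 95% parallel tops out near 20x:
print(f"{amdahl_speedup(0.95, 1000):.1f}x")  # -> 19.6x
```

    This is why auto-parallelizing compilers alone can't guarantee that thousand-core machines keep delivering Moore's-law-style gains.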
