Hardware Technology

Michio Kaku's Dark Prediction For the End of Moore's Law

nightcats writes "An excerpt from Michio Kaku's new book appears at salon.com, in which he sees a dark economic future within the next 20 years as Moore's law is brought to an end when single-atom transistors give way to quantum states. Kaku predicts: 'Since chips are placed in a wide variety of products, this could have disastrous effects on the entire economy. As entire industries grind to a halt, millions could lose their jobs, and the economy could be thrown into turmoil.'" Exactly the way the collapse of the vacuum tube industry killed the economy, I hope.
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Dark predictions (Score:5, Insightful)

    by Wowsers ( 1151731 ) on Sunday March 20, 2011 @08:33AM (#35549348) Journal

    I predict a dark future for Michio Kaku's new book.... namely, the bargain bin.

  • Oh really? (Score:3, Insightful)

    by Anonymous Coward on Sunday March 20, 2011 @08:39AM (#35549384)

    Apparently people can't:
    make cluster computers
    make boards with multiple chip sockets
    make extension boards that go in PCI or potential future slots
    use distributed computing (a minimal sketch of this one follows the comment)
    [insert many other ways to get around limited processing power]

    Man, we sure are screwed in 20 years' time; computers will start rebelling against us because we can't make them smaller than the boundaries of the universe!

    On a more serious note, this is retarded. Period.
    20 years is a VERY long time.
    By then, we'll probably have the beginnings of quantum computers that are actually useful.
    By then, we'll almost certainly have found ways around or through these problems, possibly even taking advantage of quantum effects to reduce circuit complexity and power consumption.
    Who knows? But I know one thing for sure: the world won't end, life will go on as usual, and this book will still be shit.
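    A minimal sketch of the distributed-computing workaround mentioned in the list above: split a job across worker machines instead of waiting for a faster single chip. This is only an illustration using the Python standard library; the node addresses and port below are hypothetical placeholders, not anything from the discussion.

        # cluster_sketch.py -- split a sum-of-squares job across worker machines.
        # Standard library only; "node1.example"/"node2.example" and port 6000
        # are placeholders.
        import sys
        from multiprocessing.connection import Client, Listener

        AUTHKEY = b"not-a-secret"  # shared key so coordinator and workers can pair up

        def run_worker(port):
            """Accept one chunk of numbers and send back the sum of their squares."""
            with Listener(("0.0.0.0", port), authkey=AUTHKEY) as listener:
                with listener.accept() as conn:
                    chunk = conn.recv()                   # list of ints from the coordinator
                    conn.send(sum(n * n for n in chunk))  # partial result

        def run_coordinator(addresses):
            """Split the input into one chunk per worker, farm them out, combine results."""
            numbers = list(range(1_000_000))
            size = -(-len(numbers) // len(addresses))     # ceiling division
            chunks = [numbers[i:i + size] for i in range(0, len(numbers), size)]
            conns = [Client(addr, authkey=AUTHKEY) for addr, _ in zip(addresses, chunks)]
            for conn, chunk in zip(conns, chunks):
                conn.send(chunk)                          # send all chunks before collecting,
            total = sum(conn.recv() for conn in conns)    # so the workers compute concurrently
            for conn in conns:
                conn.close()
            print("sum of squares:", total)

        if __name__ == "__main__":
            if sys.argv[1] == "worker":                   # on each node: python cluster_sketch.py worker 6000
                run_worker(int(sys.argv[2]))
            else:                                         # then: python cluster_sketch.py coordinator
                run_coordinator([("node1.example", 6000), ("node2.example", 6000)])

    The same shape scales out by adding addresses to the list; none of it requires any single chip to get faster.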

  • by gearloos ( 816828 ) on Sunday March 20, 2011 @08:44AM (#35549400)
    Michio Kaku is not necessarily the best in his field, mediocre at best, but he has the biggest voice. I was talking to an older woman a while back who is a devoted fan of his. I asked her what she knew of him other than that he does "layman's" breakdown commentaries on physics for the Discovery Channel, and she actually thought badly of me for trying to undermine her opinion of "the top physicist in the world today". Well, that's definitely HER opinion and not mine. The point I'm trying to make is that just because he has a big mouth (media-wise) does not make him remotely right about anything. Oh, I just got it: now I understand politics, lol.
  • Gradual transition (Score:5, Insightful)

    by Kjella ( 173770 ) on Sunday March 20, 2011 @08:44AM (#35549404) Homepage

    Sooner or later it will come to an end, but it will come slowly as the challenges rise, the costs skyrocket, and the benefits shrink due to higher leakage and lifetime issues. And designs will continue to be improved: if you're no longer constantly redesigning for a tick-tock every two years, you can add more dedicated circuits to do special things. Just look at the dedicated high-def video decoding/encoding/transcoding solutions that have popped up. In many ways it has already stopped, in that single-core performance hasn't improved much for a while; it's been all multicore and multithreading of software. Besides, there are plenty of other computer-ish inventions to pursue, like laying fiber networks everywhere, mobile devices, and display technology. The world will still be changing significantly 20 years from now, just perhaps not at the fundamental CPU code / GPU shader level.
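    A minimal sketch of that "multicore and multithreading of software" point, assuming nothing beyond the Python standard library: the same CPU-bound work run once on a single core and then spread across all local cores. The function and numbers are illustrative only.

        # multicore_sketch.py -- throughput from more cores rather than a faster core.
        import time
        from concurrent.futures import ProcessPoolExecutor

        def heavy_item(n):
            """Stand-in for CPU-bound per-item work (say, encoding one video frame)."""
            return sum(i * i for i in range(n))

        def main():
            items = [200_000] * 32

            start = time.perf_counter()
            serial = [heavy_item(n) for n in items]          # one core, one item at a time
            t_serial = time.perf_counter() - start

            start = time.perf_counter()
            with ProcessPoolExecutor() as pool:              # defaults to one worker per core
                parallel = list(pool.map(heavy_item, items))
            t_parallel = time.perf_counter() - start

            assert serial == parallel
            print(f"serial {t_serial:.2f}s vs parallel {t_parallel:.2f}s")

        if __name__ == "__main__":
            main()

    Processes rather than threads are used so CPython's GIL doesn't serialize the CPU-bound work; either way, the gains come from parallelism in software rather than from a faster single core.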

  • by sydneyfong ( 410107 ) on Sunday March 20, 2011 @08:45AM (#35549410) Homepage Journal

    Yeah, maybe we should stop the waste and employ human operators to send telegrams like they did in the good old days, and scribes to write documents by hand....

  • by frnic ( 98517 ) on Sunday March 20, 2011 @08:45AM (#35549412)

    Before we had transistors, we didn't have them yet either.

  • His view (Score:3, Insightful)

    by Anonymous Coward on Sunday March 20, 2011 @08:46AM (#35549420)

    His view is based upon the chip and not on the device.

    What I'm seeing is folks (manager types) using their iPhone as their business computer - eliminating the laptop and even their desktop. They're on the move, and what they need is a portable communications device that also has some other apps.

    Spreadsheets? That's for the back office staff. The same goes for anything else that you still need a desktop/laptop for.

    So what's my point? Desktops and laptops are becoming commodity back-office devices (like the typewriter in the past), demand has stabilized, and as far as business apps are concerned there isn't any need for more power - bloatware aside.

    To head off the "64K of RAM is all anyone really needs" comments: that was then, this is now. Back then, we were at the birth of the PC revolution. Now we're in the mature commodity stage. Will we need more power in the future? Yes. But at Moore's-law rates? Nope.

    The future is efficiency, portability and communication.

    PCs are inefficient for most uses; therefore, there won't be any "death" or "economic" destruction - just some "old" companies hitting the skids (Dell) or vastly changing their business if they can (HP).

  • by Anonymous Coward on Sunday March 20, 2011 @08:47AM (#35549424)

    Really, IT has far more wide-ranging applications than a fridge and can create new ways of doing things; these may not always be better, but a good proportion of them are. People who think that IT is a waste are usually the same people who think the space program is a waste or that education is a waste. Progress has to come from somewhere; it is not magically pooed from the butts of celebrities or political figures as they dance about appealing to the masses.

  • by maxwell demon ( 590494 ) on Sunday March 20, 2011 @08:52AM (#35549454) Journal

    So what? Already today's chips are more than adequate for most applications. Add 20 more years of Moore's law and we won't even need more powerful chips: you'll have the power of today's supercomputers on your cell phone. I doubt Moore's law would continue even if it were physically possible, because there will be no need for it.

  • Kaku is a hack (Score:5, Insightful)

    by thasmudyan ( 460603 ) on Sunday March 20, 2011 @09:26AM (#35549636)

    This guy is trying to establish himself as some kind of authority on futurism, but I just perceive him as an attention whore who actually contributes very little. Maybe I'm alone in thinking this, but his TV series "Physics of the Impossible" was one big self-aggrandizing marketing gig. I barely made it through two episodes that essentially consisted of the professor rehashing old science fiction concepts and passing them off as his own inventions. Every episode ended with a big "presentation" in front of dozens of fawning admirers. Before the credits rolled, they made sure to cram in as many people as possible saying how great and ground-breaking his ideas were. It was disgusting.

    Are there physical limits to Moore's law? Sure. We already knew that. Circuits can't keep getting smaller indefinitely, and we ran into the limit on reasonable clock speeds several years ago. And despite this, the computer industry hasn't cataclysmically imploded.

  • Pseudo-economist (Score:3, Insightful)

    by Boona ( 1795684 ) on Sunday March 20, 2011 @09:51AM (#35549826)
    Another pseudo-economist out to tell us that an increase in productivity and a lowering of living costs will be a net loss for society. Michio Kaku, can you please take an Economics 101 class before writing a book about the economic impact of anything? The general population is already economically illiterate, and this only fuels the problem. Thanks.
  • Re:Stupid comment (Score:2, Insightful)

    by Anonymous Coward on Sunday March 20, 2011 @10:07AM (#35549958)

    Uh, no. He's a gawdawful writer. The entire excerpt was a dreary and largely useless lead-in to the final paragraph. Kaku writes not as if he believes in using two words where one will do, but in using a hundred words where one will do.

    And what do you get when you slog your way through to the last paragraph? The shocking news that quantum effects will put an end to conventional integrated circuits.

    Jiminy Cricket! I wish I was smart enough to make that prediction! It's only been common knowledge in the tech community for a couple of decades. Maybe there's a Nobel Prize for belaboring the obvious that Kaku's going for.

    The implication of the article, which Kaku is smart enough not to get too explicit about, is that when that sad day arrives, AMD and Intel - they'll still be the only two microprocessor manufacturers of any note - will produce their final chips, none of which will work. Oh, the tragedy! Oh, the humanity! Oh, if only they'd listened to Michio Kaku while there was still time!

    Of course, long before then Kaku will have cashed the checks from this piece of dreck.

    All the phony Luddites who moan about the arrogance of technophiles will have had their conceits confirmed that technology is the crystallization of hubris. That's probably what they're tweeting each other right now on their iPhone 2s.

    Meanwhile, back in the real world, Kaku's dark prognostications will be forgotten in less time than it takes AMD and Intel to produce the next generation of microprocessors.

  • by Anonymous Coward on Sunday March 20, 2011 @10:08AM (#35549974)

    Kaku is an embarrassment. In the mid/late 90s he presented himself as a "nuclear physicist" to the major news outlets (he is no such thing; he's a field theorist) and jumped on the till-then fringe protest movement opposing the launch of the Cassini mission. The opposition was based on the idea that the nuclear batteries on the probe posed a danger in the event of a launch accident. Never mind that there had previously been launch accidents with the same battery type (military sats), and the ceramic plutonium cores were simply recovered and _reused_ because they're practically indestructible. (The batteries are just warm bricks. Radioactive decay of the plutonium keeps them cooking and thermoelectrics generate the juice. There are no controls to go wrong, no parts to break, nada. That's why they're used. The ceramic itself is terrifically tough.)

    Anyway, Kaku saw the cameras and the bright lights, decided that he was a nuclear physicist, and started spouting all sorts of total nonsense to frighten the unwashed masses. He has a long history of pretending to know things. Google "Kaku evolution blather" for another example. I watched him give a seminar once while I was in grad school, and I just spent the hour squirming in embarrassment for him and his self-aggrandizement.

    Yes, I loathe the man. I'm a physicist, and he just perpetuates the image of people in my field as hubristic egoists. He needs to be shouted down and run out of the media. There are lots of really good popularizers out there (deGrasse Tyson, Greene, etc.) who, yes, need to establish a public presence to make a career, but who are also actually interested in facts and science and education and know their own limits.

  • by wierd_w ( 1375923 ) on Sunday March 20, 2011 @10:25AM (#35550094)

    Amusingly, that only confirms Kaku's prediction.

    If your existing refrigerator is perfectly good, then what incentive do you have to buy the NEW refrigerator?
    If you don't buy NEW refrigerators, how does the refrigerator manufacturer stay in business?

    For a more geek-friendly variant on this, look at Microsoft. Their last three "new" versions have mostly been about Microsoft's bottom line and less about true innovation. (E.g., look how hard they are trying to kill Windows XP.)

    When you reach the point where your company can no longer just add bloat, call it new, and sell it like hotcakes - because the existing product is arguably just as good, if not better, given the physical limitations of the device - profitability grinds to a halt and the industry suffers mightily.

    What you would see instead is a service industry created in place of a product industry.... Oh wait, that's already happening!

  • by martyros ( 588782 ) on Sunday March 20, 2011 @10:45AM (#35550246)

    I'm by no means a hopeless optimist, but I think the arguments he's making here don't really make much sense. He's focusing exclusively on one aspect -- the increase in speed and computation power -- and saying that when that stops improving, everything will grind to a halt and die with it.

    That doesn't make any sense. Cars got faster between 1910 and 1930. But after they reached "as fast as humans can actually control them safely", they stopped getting faster, by and large. Did that cause a collapse of the industry? Did everyone completely stop buying cars? Consider airplanes: between the first powered flight in 1903 and WW2, when they were a critical part of strategy, they got faster. But once they reached the limits of speed and air-resistance economics, they stopped getting significantly faster -- at least as far as most consumers are concerned. Now the main difference in passenger experience between a plane made 30 years ago and one made 10 years ago is whether the in-flight entertainment is on one shared screen or each person has their own screen. This lack of increase in airplane speed has somehow failed to destroy the airline economy.

    When transistors hit their limit, there will still be huge amounts of transformation left to do. Even within technology there are things to do: there's a whole avenue of domain-specific chips to pursue. With the exception of GPUs (and possibly cryptography), there has until now been little point in making chips to do one specific thing; by the time you made one, Intel's CPUs would be more powerful at doing whatever it was you were going to do anyway. When we really hit the limit of silicon, that will become a rich avenue to explore.

    Outside of technology, there's even more. Culturally, we don't even know what to do with all the computing we could have. If my sink or table or door or wall isn't as smart as it could be, it's not because there aren't enough transistors, it's because we don't know what to do with the transistors. I'd say that the biggest limitation right now to ubiquitous computing isn't so much number of transistors, as what to do with the transistors. Will there ever be a task that my microwave will perform that will require 4 cores of an i7 supplemented with GPUs? User interfaces, techniques, and all kinds of other things are still wide-open. I'd go so far as to say that computing power isn't nearly the biggest difference between the computers of today and the computers of five years ago.

    The main point is that there's still a lot of innovation to be done, and it won't somehow stop if chips don't get faster.

  • by Ephemeriis ( 315124 ) on Sunday March 20, 2011 @12:29PM (#35551160)

    I am so sick of seeing Michio Kaku all over the place...

    It made sense back when he was talking about string theory. He's a physicist, after all. But these days he's just some generic scientist who's more than happy to show up on TV and talk about anything even vaguely scientific.

    Did you see him commenting on the whirlpool formed after the earthquakes in Japan? Because a physicist is obviously the most qualified person they could find to talk about ocean currents and plate tectonics and whatnot.

    What makes Michio Kaku any more qualified to talk about Moore's Law than I am? It isn't like he actually knows anything about microchip fabrication or economics or industrial processes... The guy is a physicist.

  • by MoellerPlesset2 ( 1419023 ) on Sunday March 20, 2011 @12:39PM (#35551234)

    Michio Kaku is not necessarily the best in his field, mediocre at best, but he has the biggest voice.

    I agree. But this isn't really news; this is how it's _always_ worked. The public is not going to figure out the merits of your scientific achievements on their own and then give you attention proportionate to them. It's the same as in any other area: you have to market yourself.

    Linus Pauling was arguably the most famous chemist of the last century. But he wasn't actually that much more important than his contemporaries: the quantum-chemical contributions he made were on par with those of Mulliken, Hund, and Slater, and many would say Slater should've shared in his first Nobel Prize. But it was Pauling who wrote "The Nature of the Chemical Bond", it was Pauling who popularized the subject, and it was Pauling who was the bigger educator and public figure (and not only in chemistry). Richard Feynman was one of the most famous physicists, and while his contributions are also beyond question, they were arguably not a lot larger than those of, say, Murray Gell-Mann, who is nowhere near as famous - because Gell-Mann was not a big educator, and his popular-science books didn't sell anywhere near as well. Dirac was as important as Bohr when it came to quantum theory, but he was nowhere near the popular public figure Bohr was, and so he's less well known.

    What bothers me about Kaku isn't the fact that his fame is disproportionate to his scientific contributions, or even the fact that it leads people to think he's a greater scientist than he is. What annoys me about Kaku is his propensity to comment on things he doesn't know much, or anything, about. For instance, his statements on evolution, which were harshly (but justly) criticized recently by PZ Myers. Or his commenting on the Deepwater Horizon spill and the Fukushima disaster (which he, IMO recklessly, called the worst disaster second only to Chernobyl, even though at this point it's far from clear that it's worse than Three Mile Island or Windscale, and it is certainly several orders of magnitude less severe than Chernobyl). And now we have him commenting on Moore's Law, even though he's not a solid-state physicist.

    I suspect he's letting his ego cloud his better judgment. It's not uncommon - the aforementioned Pauling, for all his scientific merits, had a whole bunch of bad, crankish ideas in areas outside his field (nuclear physics, vitamin megadoses, anesthesiology). I don't believe for a moment that Feynman was the humble guy he tried so hard to make himself out to be, but to his credit, he was quite respectful of other fields and did not have that propensity to pass himself off as an expert on things he didn't know much about. Of course, there's also the possibility that it's not about Kaku's ego at all, and that he just genuinely doesn't give a damn about educating the public and is more interested in getting attention for himself. But I'm prepared to give him the benefit of the doubt on that.
