Michio Kaku's Dark Prediction For the End of Moore's Law
nightcats writes "An excerpt from Michio Kaku's new book appears at salon.com, in which he sees a dark economic future within the next 20 years as Moore's law is brought to an end when single-atom transistors give way to quantum states. Kaku predicts: 'Since chips are placed in a wide variety of products, this could have disastrous effects on the entire economy. As entire industries grind to a halt, millions could lose their jobs, and the economy could be thrown into turmoil.'" Exactly the way the collapse of the vacuum tube industry killed the economy, I hope.
No planetary alignment? (Score:5, Funny)
Noone will take a disaster prophecy seriously if you can't even be bothered to pair it with some planetary alignment or ancient calendar.
Re: (Score:2)
Or Nostradamus
Re: (Score:3)
What do Herman's Hermits [peternoone.com] have to do with silicon technology disasters?
On vacuum tubes. (Score:2)
Re:On vacuum tubes. (Score:5, Insightful)
Before we had transistors, we didn't have them yet either.
Re: (Score:3)
Re: (Score:2)
The major difference being the tube/valve industry was done in by the transistor - i.e. we had a viable replacement that was better. The problem with the transistor is that we don't (yet) have a viable replacement.
There's a big difference between then and now. We have a lot of people/companies/countries trying to drive the progress and development of new technologies. Small startups can play a role, or even become the new leaders; ungodly international conglomerates can 'change or die'. There's a brave new world out there ... but there's always a brave new world out there. Now you young folks go get it and bring it back to those of us who are tired, cranky and complacent.
Re:On vacuum tubes. (Score:5, Insightful)
So what? Today's chips are already more than adequate for most applications. Add 20 more years of Moore's law, and we won't even need more powerful chips. You'll have the power of today's supercomputers on your cell phone. I doubt Moore's law would continue even if it were physically possible, because there will be no need for it.
Re: (Score:3)
Until even the most complex task imaginable can be computed in less time than it takes you to click a button, there will be a need for more processing power.
Re: (Score:3)
Yes, and this 'need for more processing power' is exactly what Moore's law exploits: Moore's law basically dictates that the demand for processing power doubles every year.
As a result, it's most profitable to follow this demand.
Speeding it up would be silly (even if new technology would allow it), because that means you lose money:
For example, if I suddenly were to create a processor with 10,000x the processing power, I would go bankrupt:
- Either it would be so expensive that no one would
Re:On vacuum tubes. (Score:4, Funny)
No - Microsoft does that. Moore's law ensures that new computers can perform better at the same rate that MS adds bloat to their software, or marginally faster. By avoiding the use of Windows, I can continue to use my 4 year old PC or ten year old Sparc machines. YMMV
Re: (Score:2)
If you could make your 10,000x CPU, chances are you'd be filthy stinking rich in no time.
It may be expensive for individuals, but for companies it'd probably beat having a room full of racks including cables, cooling, etc.
If it'd be cheap, people would want a faster one next year, in order to play their realtime 3D raytraced games on eight 2048x1536 screens instead of just four.
Re:On vacuum tubes. (Score:5, Interesting)
Today's chips were perfect for most applications in the 1980s. Once WordPerfect could outrun a human in terms of spell check and could outrun even the fastest printers, CPU upgrades didn't do much. Same with Lotus 1-2-3, once complex spreadsheets with lots of macros could be processed faster than a human could read a spreadsheet....
But all that excess power led to the GUI. And then technologies like OLE. Which drove up requirements by orders of magnitude. But OLE hasn't really hit another generation because everything is so unstable. Imagine the next generation of applications that have data embedded from dozens of devices and hundreds of websites. I do a Quicken report which
a) contacts my banks internet connections and pulls in all the credit card transactions
b) hits each of those vendors (100+) with the credit processing number and pulls up all the items for each transaction
c) does an item lookup to figure out what sort of expenses they are and prorates out general costs, like sales tax. That's 1000s of web information requests for an annual report.
That sort of data processing we don't yet have and certainly not on cellphones. Another area is AI where systems are underpowered.
Imagine a news search engine that knows my entire browsing history. Like a Pandora across all my news choices for the last year. I search for a story and, because the system knows my preferences on dozens of dimensions, it's able to feed me the stories that most fit my preferences. Analyzing every article every day to do simple word counts is about the limit of a massive Google datacenter. Analyzing every article every day to determine: how much scientific background is this assuming in biology, in chemistry, in mathematics; what sort of editorial biases does it have; how human-interest heavy is the presentation; how respected is the journal.... that's way beyond what we can do today.
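For concreteness, here is a minimal and entirely hypothetical sketch of the preference-weighted ranking imagined above; the dimension names, scores, and user profile are invented for illustration and are not from any real system.

    # Hypothetical sketch of preference-weighted news ranking; all names and
    # values are made up for illustration.
    ARTICLE_DIMENSIONS = ["biology_depth", "chemistry_depth", "math_depth",
                          "editorial_bias", "human_interest", "journal_respect"]

    def score_article(article_features, user_profile):
        """Weight each analyzed dimension of an article by the reader's
        learned preference for that dimension and sum the result."""
        return sum(article_features[d] * user_profile.get(d, 0.0)
                   for d in ARTICLE_DIMENSIONS)

    # Toy data: per-article scores (0..1) from some imagined analysis pass,
    # and a profile learned from a year of browsing history.
    articles = {
        "story_a": {"biology_depth": 0.9, "chemistry_depth": 0.2, "math_depth": 0.1,
                    "editorial_bias": 0.3, "human_interest": 0.1, "journal_respect": 0.8},
        "story_b": {"biology_depth": 0.1, "chemistry_depth": 0.2, "math_depth": 0.0,
                    "editorial_bias": 0.7, "human_interest": 0.9, "journal_respect": 0.2},
    }
    user = {"biology_depth": 1.0, "journal_respect": 0.5, "human_interest": -0.5}

    ranked = sorted(articles, key=lambda name: score_article(articles[name], user),
                    reverse=True)
    print(ranked)  # story_a ranks first for this (made-up) reader

The scoring itself is trivial; the point of the parent is that producing those per-article dimension scores for every article, every day, is the expensive part.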
Re: (Score:2)
And 640K is all we'll ever need. =)
On a more serious note, really... I think it's some sort of corollary to Moore's law: Processing needs will always expand to fill the available processing capacity. In short, we're going to be using our pocket computers with quantum-state processors, and still be wondering why frickin' Outlook is running so slow.
Re: (Score:2)
Agreed.
As our computers have become more capable, they actually demand more from users, and productivity may not increase as much as you think.
In 1990, we had typing pools in my organisation, which produced memos, letters and other paperwork. Now that there is a computer on every desk, we have lost these pools. Now, the originator must write his/her own staff-work. It is expected that grammar and spelling be correct. Supervisors will edit for style, and often send it right back down to the originator.
Re: (Score:2)
Re: (Score:3)
I don't dispute that there is some limit we will approach with respect to computing speed. What I don't see is evidence for 'economic collapse' as a consequence. Surely there will always be a need for programmers? Maybe more so, because efficient programming will yield greater speeds and you won't be able to rely on mediocre programmers and lazily count on hardware getting faster. Similarly speaking, with 2011 hardware alone we are still nowhere near reaching the full economic capabilities o
Re: (Score:3)
Right now it's hard to get refrigerators that maintain proper temperature at different points. Having a system that can manipulate airflow based on what's inside it: i.e. is running a fluid dynamics program, is taking pictures of its internal contents and analyzing them, is offering an interface to your computer...
There is nothing like that on the market, and yes it would be a huge economic value. Keeping food at the right temperature allows people to store better foods which can lead to them buying more s
Re: (Score:2)
No, you probably need a computer with the power of a 286 to do that, maybe less. But the question was how much smarter refrigerators need to be, and that was a useful example because it's a place where we aren't taking advantage of 20 year old technology that could today be implemented cheaply enough.
Re:On vacuum tubes. (Score:5, Insightful)
amusingly, that only confirms Kaku's prediction.
If your existing refrigerator is perfectly good, then what incentive do you have to buy the NEW refrigerator?
If you don't buy NEW refrigerators, how does the refrigerator manufacturer stay in business?
For a more geek friendly variant on this, look at microsoft. Their last 3 "New" versions have mostly been about Microsoft's bottom line, and been less about true innovation. (EG--look how hard they are trying to kill windows XP.)
When you reach a point where your company can no longer just add bloat, call it new, and sell it like hotcakes because the existing product is arguably just as good, if not better, due to physical limitations of the device, then you end up with profitability grinding to a halt and industry suffering mightily.
What you would see instead is a service industry created instead of a product industry.... Oh wait, we already are!
Re: (Score:2)
In the worst case, the PC industry would probably become something like the fashion industry. People have been making shoes for thousands of years, and the shoe industry still finds a way of making people pay. And often even for crappy, ill-fitting shoes.
In the best case, stuff will become cheaper and people will have extra money to spend on other crap, and companies will try to convince them to spend it.
The likely case is just like how refrigerator manuf
Re: (Score:2)
Except for products that really shouldn't ever wear out---
Except for REALLY bad designs, normal operating load on an integrated circuit should have it last many times longer than its predicted obsolescence by Moore's law. You can find perfectly functional Atari STs and IBM XTs even today, more than 20 years later.
Compare to shoes, which wear out as a consequence of use (like brake pads on cars..) or refrigerators, which have moving parts (the compressor, and pals)--- the solid-state electronics i
Re: (Score:3)
My previous desktop computer failed after about seven years.
And yes, integrated circuits do wear out. [wikipedia.org] Indeed, the smaller the structures, the sooner the chip will fail.
And I have a fully functioning Commodore 64 (and Atari 2600, Atari 5200, NES, and a PSX) plugged into a 10 year old TV in my bedroom. I'm typing this on a 7 year old laptop, which works fine sans some issues from the aging battery. The DVD-ROM drive in my main computer is almost 10 years old. My mom's old computer, which we replaced because of software bloat, was nearing 15 years old; her new one is made of 5 year old components and will probably last another 5-10 years (ignoring moving parts). Hell, her mon
Re: (Score:3)
I usually buy a new appliance because something mechanical breaks and with commodity manufacturing it's cheaper than calling a repair man. If it has a faster chip, great. If it doesn't, who cares. In the case of a refrigerator I'm buying cold, not smart...
Re: (Score:2, Flamebait)
Their last 3 "New" versions have mostly been about Microsoft's bottom line, and been less about true innovation. (EG--look how hard they are trying to kill windows XP.)
Amusing.
The geek body-slams XP for ten years.
But is first and loudest to be heard wailing at its EOL gravesite mourning.
What you would see instead is a service industry created instead of a product industry.... Oh wait, we already are!
Last I heard from the geek, service was the way to find profit in FOSS.
Re: (Score:2)
Re: (Score:3)
amusingly, that only confirms Kaku's prediction.
If your existing refrigerator is perfectly good, then what incentive do you have to buy the NEW refrigerator?
If you don't buy NEW refrigerators, how does the refrigerator manufacturer stay in business?
...
I don't know about you, but no one I know buys new refrigerators because a new model came out. They buy new fridges when they go bad and can't be fixed cheaply.
Look, the whole argument is stupid.
So, let's say we hit the end point of CPUs, big whoop. We aren't magically going to stop needing devices with CPUs. We'll still need them. Nothing lasts forever; we'll need to replace old stuff. New people are born every day, and they are going to need stuff with CPUs in them. Oh, dang, earth's population
Re: (Score:2)
Otherwise, we'll never be able to play Crysis 16 on Windows 2030.
Re:On vacuum tubes. (Score:4, Insightful)
I'm by no means a hopeless optimist, but I think the arguments he's making here don't really make much sense. He's focusing exclusively on one aspect -- the increase in speed / computation power -- and saying that when that stops developing, everything will stop and die massively.
That doesn't make any sense. Cars got faster between 1910 and 1930. But after they reached "as fast as humans can actually control them safely", they stopped getting faster, by and large. Did that cause a collapse of the industry? Did everyone completely stop buying cars? Consider airplanes -- between 1900 when the first flight happened, to WW2 where they were a critical part of strategy, they got faster. But once they reached the limits of speed / air resistance economics, they stopped getting significantly faster -- at least as far as most consumers are concerned. Now the main difference in passenger experience between a plane made 30 years ago and one made 10 years ago is whether the in-flight entertainment is on one shared screen, or each person has their own screen. This lack of increase in airplane speed has somehow failed to destroy the airline economy.
When transistors hit their limit, there will still be huge amounts of transforming to do. Even within technology, there are things to do: there's a whole avenue of domain-specific chips to pursue. With the exception of GPUs (and possibly cryptography), there has been until now no point in making chips to do one specific thing; by the time you made it, Intel's CPUs would be more powerful at doing whatever it was you were going to do anyway. When we really hit the limit of silicon, that will become a rich avenue to explore.
Outside of technology, there's even more. Culturally, we don't even know what to do with all the computing we could have. If my sink or table or door or wall isn't as smart as it could be, it's not because there aren't enough transistors, it's because we don't know what to do with the transistors. I'd say that the biggest limitation right now to ubiquitous computing isn't so much number of transistors, as what to do with the transistors. Will there ever be a task that my microwave will perform that will require 4 cores of an i7 supplemented with GPUs? User interfaces, techniques, and all kinds of other things are still wide-open. I'd go so far as to say that computing power isn't nearly the biggest difference between the computers of today and the computers of five years ago.
The main point is, there's still a lot of innovation that can be done that will not somehow fail if chips don't get faster.
Re: (Score:2)
That doesn't make any sense. Cars got faster between 1910 and 1930. But after they reached "as fast as humans can actually control them safely", they stopped getting faster, by and large. Did that cause a collapse of the industry? Did everyone completely stop buying cars? Consider airplanes -- between 1900 when the first flight happened, to WW2 where they were a critical part of strategy, they got faster. But once they reached the limits of speed / air resistance economics, they stopped getting significantly faster -- at least as far as most consumers are concerned. Now the main difference in passenger experience between a plane made 30 years ago and one made 10 years ago is whether the in-flight entertainment is on one shared screen, or each person has their own screen. This lack of increase in airplane speed has somehow failed to destroy the airline economy.
The Concorde was a commercial endeavor pushing the speeds of consumer air travel, but it died, and no one has started another supersonic jet company.
Re: (Score:3)
Well, yes and no. The physics is unforgiving; as Kaku says, we're going to hit a transistor shrink wall. At that point, the easy advances are over.
The transistor shrink wall isn't the same thing as peak computation power, only a predictor of it. We have room for advancement in how well we use the transistors we have; once we've got them as small as possible, we can improve how densely we pack them, and how efficiently we utilize them for computation. Those problems are harder to solve, so we haven't been d
Dark predictions (Score:5, Insightful)
I predict a dark future for Michio Kaku's new book.... namely, the bargain bin.
Re: (Score:2, Informative)
Yeah, it's embarrassing when someone who's brilliant within his area of expertise starts nosing into other fields (in this case economics and the electronics industry) just to say stupid things. By the way, he did this before, although the previous victim was biology. Why do physicists think they are masters of all sciences? [scienceblogs.com] Granted, that was in response to a question, but he really should have said ‘I have no clue’. Why oh why do experts always think they're experts in everything?
Re: (Score:2)
and this is a bad thing? (Score:5, Funny)
Software developers are going to have to consider increasing efficiency as they make their wares more complex! And we might have to actually implement concurrency research which is under two decades old!
Who knows, we might even end up with the responsiveness of my RISC OS 2 Acorn A3000 in 1990.
Re: (Score:2)
Who knows, we might even end up with the responsiveness of my RISC OS 2 Acorn A3000 in 1990.
Ah, my trusty old friend Acorn... what went wrong?
Re: (Score:2)
RISCOS and the Archimedes line were about a decade ahead of the competition when they first came out, but sadly Not Invented Here syndrome killed them in the end.
Indeed. Application bundles and something similar to the OS X dock spring to mind - not suggesting anything, just saying - but it is rather amusing to me that in a way we have Aunty Beeb to thank for it all.
Re: (Score:2)
I think they have been increasing efficiency the past decade, because, just as a casual observer, I haven't seen the types of gains that were to be had from 1995 to 2003ish anymore.
Not in clock speed increases (yeah, I know this isn't everything, but it certainly was part of the equation) in CPUs, and HDDs just don't increase in capacity like they used to. Back in the early-mid 00s, I upgraded from 40GB to 320GB, nearly an order of magnitude bigger. And now, in my price range, I'm looking at jumping fr
Re: (Score:2)
I wouldn't call a factor of 6, when I was expecting a factor of 8, "a lot less increase than I was expecting"
Re: (Score:3)
The part where I think Kaku goes
Re: (Score:2)
"There's no point letting it go unused most of the time because you have super fast immediate software that took ages to develop."
Sure there is. When the silicon is idle, it draws less power. Lower power draw increases battery life for portable devices, lower operating temperatures and lower overall utility bills. Part of the problem with current industrial society is that it gobbles energy like a child in a candy store. Is your contribution to this problem a solution, or a compounding factor?
The notion th
Seriously (Score:2)
Oh really? (Score:3, Insightful)
Apparently people can't:
make cluster computers
make boards with multiple chip sockets
make extension boards that go in PCI or potential future slots
use distributed computing
[insert many other ways to get around limited processing power]
Man, we sure are screwed in 20 years time, computers will start rebelling against us because we can't make them smaller than the boundaries of the universe!
On a more serious note, this is retarded. Period.
20 years is a VERY long time.
By then, we'd probably actually have the beginnings of working quantum computers that are useful.
By then, we'd have almost certainly found out how to get around or deal with these problems, possibly even taking advantage of quantum effects to reduce circuit complexity and power needs.
Who knows, but I know one thing for sure: the world won't end, life will go on as usual, and this book will still be shit.
This is a perfect example of the world today (Score:5, Insightful)
Re: (Score:3)
He probably has a good point, too, that at least eventually Moore's law failing will have strong economic impacts, and it's
Re:This is a perfect example of the world today (Score:4, Interesting)
Hawking isn't even a top physicist. I mean, he's a serious, good physicist, and an inspiring guy, just not one of the 5-10 best physicists alive today. Kaku on the other hand is just a popularizer. Which is fine. Except that the guy seems to be a hack and huge self promoter.
Re: (Score:2)
Michio just seems very self-aggrandizing to me - especially with that one series of his where he dreams up solutions to various things.
I really liked Sagan, and I think Brian Cox is a worthy successor to him; he just has and transmits that passion of his for science, and I really like that. Neil deGrasse Tyson also comes close. Then there are the Mythbusters; while sometimes horrible in their scientific method, they're at least showing practical experiments and how to do things besides theory.
And
Re:This is a perfect example of the world today (Score:5, Insightful)
I agree. But this isn't really news; This is how it's _always_ worked. The public is not going to figure out the merits of your scientific achievements on their own, and then give you attention that's proportionate to that. It's the same as in any other area: You have to market yourself.
Linus Pauling was arguably the most famous chemist of the last century. But he wasn't actually that important. The quantum-chemical contributions he made were in reality on-par with those of Mulliken, Hund and Slater. Many would say Slater should've shared in his first Nobel prize. But it was Pauling who wrote "The nature of the chemical bond", it was Pauling who popularized the subject, it was Pauling who was the bigger educator and public figure (which was not limited to chemistry). Richard Feynman was one of the most famous physicists. And while his contributions are also beyond question, they were arguably not a lot larger than those of, say, Murray Gell-Mann, who is nowhere near as famous. Because Gell-Mann was not a big educator. His popular-scientific books didn't sell anywhere near as well. Dirac was as important as Bohr when it came to quantum theory, but he wasn't anywhere near the popular and public figure Bohr was. And so he's also less known.
What bothers me about Kaku isn't the fact that his fame is disproportionate to his scientific contributions, or even the fact that it leads people to think he's a greater scientist than he is. What annoys me about Kaku is his propensity to comment on stuff that he doesn't know much or anything about. For instance, his statements on evolution, which were harshly (but justly) criticized recently by PZ Myers. Or his commenting on the Deepwater Horizon spill and the Fukushima disaster (which he, IMO recklessly, called the worst disaster second only to Chernobyl, even though it's far from clear that it'd be worse than Three Mile Island or Windscale at this point, and it is certainly several orders of magnitude less severe than Chernobyl). And now we have him commenting on Moore's Law, even though he's not a solid-state physicist.
I suspect he's letting his ego cloud his better judgment. It's not uncommon - the aforementioned Pauling, for all his scientific merits, had a whole bunch of bad, crankish ideas in areas outside his field (nuclear physics, vitamin megadoses, anesthesiology). I don't believe at all Feynman was the humble guy he tried so hard to make himself out to be, but to his credit, he was quite respectful of other fields and did not have that propensity to make himself out to be an expert on things he didn't know much about. Of course, there's also the possibility that it's not about Kaku's ego and that he just genuinely doesn't actually give a damn about educating the public, and is more interested in just getting attention for himself. But I'm prepared to give him the benefit of the doubt on that.
Re: (Score:2)
Kaku educates the public about where to find his books. That's about it.
Listen to Kaku's radio show sometime if it is still on the air. It's two hours long, about 40% of the show is an advertisement for his book/tv appearance/book signing. There is usually an interview with someone interesting, who gets cut off mid-sentence so Kaku can talk about his book and cut to the radio station's commercials.
Re: (Score:2)
Gradual transition (Score:5, Insightful)
Sooner or later it will come to an end, but it will come slowly as the challenges rise, the costs skyrocket and the benefits are lower due to higher leakage and lifetime issues. And designs will continue to be improved; if you're no longer constantly redesigning for a tick-tock every two years, you can add more dedicated circuits to do special things. Just for example, look at the dedicated high-def video decoding/encoding/transcoding solutions that have popped up. In many ways it already has stopped in that single-core performance hasn't improved much for a while; it's been all multicore and multithreading of software. Besides, there are plenty of other computer-ish inventions to pursue, like laying fiber networks everywhere, mobile devices, display technology - the world will still be in significant change 20 years from now. Just perhaps not on the fundamental CPU core / GPU shader level.
Re: (Score:2)
And there's a lot to be done with different architectures, and just plain organizing computers differently (basically changing up how the CPU, GPU and various memory systems connect to each other).
Right now all of that experimental stuff pretty much stays experimental, or custom, because by the time you get it out the door the traditional CPU-GPU market has gone through a tick-tock cycle and no matter how good your idea was, it's still not as economical as a newer version of your traditional hardware.
Once t
Re: (Score:2)
Excellent, agreed. I said a similar thing: http://hardware.slashdot.org/comments.pl?sid=2045520&cid=35549980 [slashdot.org]
The end of CPU/GPU density is nowhere near the end of computers "getting better". Your points about all the display technologies and wiring are good ones. We still aren't using the CPU technology we have today in most devices.
Re: (Score:2)
In many ways it already has stopped in that single-core performance hasn't improved much for a while
Amen... I'd like to emphasise that point.
Yes, it's absolutely correct that Moore's Law relates to the number of transistors- however, for years many people took it as being synonymous with increases in clock speed and performance because the two pretty much *did* correlate until recently. And while it's not broken yet according to the actual definition, the easy and "free" performance increases that most people took as being an inevitable consequence of Moore's Law *have* massively diminished in the past
His view (Score:3, Insightful)
His view is based upon the chip and not on the device.
What I'm seeing is folks (manager types ) using their iPhone as their business computer - eliminating the laptop and even their desktop. They're on the move and what they need is a portable communications device that also has some other apps.
Spreadsheets? That's for the back office staff. The same goes for anything else that you still need a desktop/laptop for.
So what's my point - desktops and laptops are becoming a commodity back office device (like the typewriter in the past) and the demand has stabilized and as far as business apps are concerned, there isn't any need for more power - bloatware aside.
To head off the "64K of RAM is all anyone really needs" comments, that was then, this is now. Back then, we were at the birth of the PC revolution. Now, we're in the mature commodity stage. Will we need more power in the future? Yes. But at Moore's law increases? Nope.
The future is efficiency, portability and communication.
PC's are inefficient for most uses; therefore, there won't be any "death" or "economic" destruction - just some "old" companies hitting the skids (Dell) or vastly changing their business if they can (HP).
Amplified Future Shock (Score:2)
There has always been a suffering factor built into changes in technology spilling over and causing changes in society. Usually the suffering has been rather confined. The buggy whip workers destroyed by the new automobile market were not such a large group. But now things are different and less predictable. A great example is in office staff eliminated by the cell phone. As cell phones took over, small companies were able to get rid of millions of girl Friday types that had answere
Re: (Score:2)
Not really new from him. (Score:4, Interesting)
That was the day I lost all respect for Kaku. His economic predictions are moronic (there will always be change, including abrupt changes in what creates wealth), and in that Horizon documentary his comments seemed ludicrously off track as well.
memristor-based analog computers (Score:3)
Kaku is a hack (Score:5, Insightful)
This guy is trying to establish himself as some kind of authority on futurism, but I just perceive him as an attention whore who actually contributes very little. Maybe I'm alone in thinking this, but his TV series "Physics of the Impossible" was one big self-aggrandizing marketing gig. I barely made it through two episodes that essentially consisted of the professor rehashing old science fiction concepts and passing them off as his own inventions. Every episode ended with a big "presentation" in front of dozens of fawning admirers. Before the credits rolled, they made sure to cram in as many people as possible saying how great and ground-breaking his ideas were. It was disgusting.
Are there physical limits to Moore's law? Sure. We already knew that. Circuits can't keep getting smaller and smaller indefinitely, and we already ran into the limit on reasonable clock speeds several years ago. And despite this, the computer industry hasn't cataclysmically imploded.
Re: (Score:2)
guy is trying to establish himself as some kind of authority on futurism, but I just perceive him as an attention whore who actually contributes very little.
Isn't that pretty much the same thing?
Re: (Score:3)
Kaku is wrong on this one (Score:2)
While parts of technology might stop progressing as fast, other parts of technology will start getting optimized to get over the halting of that other part. So if hardware stops getting faster, people will start optimizing software (which is currently extremely inefficient), until we get better HW/SW tech at some point later in the future. There's a very nice comment on the Amazon page of the book by JPS; give it a read.
Re: (Score:2)
On this one?
Other than techniques for self-promotion and publicity, what is he generally correct on? I mean, he's not Dr. Oz level of self-important whackjob, but that's not saying much.
I am a solid state quantum physicist (Score:5, Interesting)
From weird analogies and a certain amount of misunderstanding, the excerpt draws strange conclusions.
a) Misunderstanding how the frequency spacing relates to the required number of cycles: The correct reasoning would be that if light has a frequency of 10^14 Hz and you restrict yourself to a single-octave circuit (for the sake of simplicity, let's say a 10% relative bandwidth circuit), then, if you can "cram" ideally and modulate fast enough, you get 10^13 * log2(S/N) bits per second - so probably around 10^14 bits/second. That is a lot. (A worked version of this estimate follows the list.)
b) Limits to Moore's Law: Moore's law is an economic law. There is no physical limit that I see which can be reached technologically until 2020 (in mass production). There is a technological limit to what can be produced, but going into the third dimension and using new materials will give the opportunity to continue on the same course for a while. If you look at what physicists are currently looking at, you realize that the end of silicon/metal/oxide technology will not be the end of Moore's Law or of classical computing.
c) "On the atomic level I can't know where the electron is." As it happens, I work on quantum computation and I really hate having to explain this: it is only when you arrange a specific situation that you can't know where the electron is on the atomic scale. If the statement were as general as he makes it, it would be impossible to have different chemical configurations of the same stoichiometric mixtures, or single-molecule electronic/magnetic configurations. The quantum tunnel coupling between states in single-molecule magnets can be designed, and I don't see a specific reason why it should be impossible to realize single-molecule devices in which tunneling does not play a role.
d) He does not understand FETs, AFAIU.
e) Contrary to his opinion, very thin 2DEGs exist, and I don't see a reason why, upon (finding and) choosing the right layers, the confinement can't be very steep in the third direction (not infinite, but also not requiring more than 50 nm of thickness).
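For reference, the arithmetic behind point (a) is just the Shannon capacity formula applied to the numbers given above; the signal-to-noise ratio is an assumed value, chosen purely for illustration:

    C = B * log2(1 + S/N)
    B = 10% of 10^14 Hz = 10^13 Hz            (the "single-octave, ~10% relative bandwidth" simplification)
    assume S/N ~ 10^3, so log2(1 + S/N) ~ 10
    C ~ 10^13 * 10 = 10^14 bits per second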
The funny thing is that he forgot what already is, and probably will (there *may* be ways out, like superconductors or ballistic transport, but don't bet on it) really remain, a problem for all classical/room-temperature computers: heat. While designing smaller elements may be possible using the right physics/technology, reducing the capacitances of the lines (associated with an energy loss in the line resistance per switching) will be difficult. Once we *really* stack in the third dimension, it will take a lot of clever computer scientists (and maybe mathematicians) to reduce the needed interconnects, since otherwise stacking in the third dimension won't give us anything besides memory capacity.
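The "energy loss per switching" mentioned here is usually estimated with the standard first-order CMOS dynamic-power relation; the numeric values below are assumptions chosen only to show the scale of the problem, not figures from the excerpt:

    E_switch ~ C * V^2                 (energy per full charge/discharge of a node capacitance C at supply voltage V)
    P_dynamic ~ a * C * V^2 * f        (a = activity factor, f = clock frequency)

    Assumed example: C = 1 fF, V = 1 V, f = 3 GHz, a = 0.1
    P ~ 0.1 * 10^-15 F * (1 V)^2 * 3*10^9 Hz ~ 0.3 microwatts per node,
    so a billion such switching nodes is already ~300 W

That is why line capacitance and cooling, rather than raw feature size, tend to set the practical ceiling.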
So to conclude: I believe that by 2050 the definition of Moore's law will be obsolete, but it will not break down because we are unable to make circuits smaller; it will break down because it may be too expensive to make them smaller, or because powering and cooling the circuits may become impractical. We will probably replace Moore's law with an equivalent scaling law for power per switching operation.
Re: (Score:2)
I am curious about your take on quantum computers. My impression is that, if they are ever actually made into an operational product, they are likely to have a profound impact in certain areas (watch out, public key encryption!), but are unlikely to be much use in sending emails or watching videos.
Re: (Score:2)
It's a real shame that I don't have modpoints today. You've made a really insightful post.
Although you've stuck to the physical issues rather than the economic issues, you've captured the main point there with
Most industries transition from a period of rapid growth to a longer period of slower growth. Once we come up ag
Re: (Score:2)
I actually thought about a similar title, but I find it necessary to point out one's own profession - unlike the author of the article. That is because an electrical engineer or a chemist may come to other conclusions - and I would find them very interesting. On the other hand, I am really pissed that everybody who has heard the words "uncertainty relation" believes he can state things like "everything small must be quantum and tunnel" wherever he or she needs to invoke the argument that classical physics doe
Pseudo-economist (Score:3, Insightful)
Re: (Score:2)
yeah well, productivity has been increasing for years but people who actually work for a living are seeing their _real_ wages fall. After all one way you get better productivity is to pay people less for the same work.
Wow - you especially need to take a microeconomics class. In said class you would learn:
a) As productivity increases, profitability increases. Increased profitability leads to greater incentive to provide supply of product and thus, higher demand for labor.
b) Holding supply of labor constant, increased demand for labor will raise real wages.
c) Holding demand for labor constant, increased supply of labor will decrease real wages.
Now look at the increase in real wages (constant 1982 dollars) over the
Kaku is a blight on science (Score:3, Insightful)
Kaku is an embarrassment. In the mid/late 90s he presented himself as a "nuclear physicist" to the major news outlets (he is no such thing - he's a field theorist) and jumped on the till-then fringe protest movement opposing the launch of the Cassini mission. The opposition was based on the idea that the nuclear batteries on the probe posed a danger in the event of a launch accident. Never mind that there had previously been launch accidents with the same battery type (military sats) and the ceramic plutonium cores were simply recovered and _reused_ because they're practically indestructible. (The batteries are just warm bricks. Low-level radioactive decay keeps them cooking and thermoelectrics generate the juice. There are no controls to go wrong, no parts to break, nada. That's why they're used. The ceramic itself is terrifically tough.)
Anyway, Kaku saw the cameras and the bright lights, decided that he was a nuclear physicist, and started spouting all sorts of total nonsense to frighten the unwashed masses. He has a long history of pretending to know things. Google "Kaku evolution blather" for another example. I watched him give a seminar once while I was in grad school and I just spent the hour squirming in embarrassment for him and his self-aggrandizement.
Yes, I loathe the man. I'm a physicist and he just perpetuates the image of people in my field as hubristic egoists. He needs to be shouted down and run out of the media. There are lots of really good popularizers out there (deGrasse Tyson, Greene, etc.) who, yes, need to establish a public presence to make a career, but who are also actually interested in facts and science and education and who know their own limits.
Wait Tyson is good at popularizing science? (Score:2)
So I just read the article (Score:2)
Just read the article, haven't read the comments yet.
Moore's law, as far as CPUs and GPUs go, has already slowed down considerably this entire decade. As far as memory goes, the chips aren't that thin yet. What this means is what everyone has been saying for a long time: more cores, more RAM. More cores means applications need to be parallelizable. That's at least a one-time overhaul of most of the world's code base.
Let's assume hardware improvements in general slow down. This leads to a hardwar
Limits of growth in general (Score:2)
Give me a break (Score:2)
Here is a news flash. I have it on good authority that
- eventually Moore's law will fail, and
- the world will continue to roll through the void. Life will go on, and we will not burn our MacBook Pros for heat, nor turn our rack-mounted servers into crude dwellings.
hmm (Score:3)
Worth noting this table [wikipedia.org]? Specifically the overall rows at the top for men and women. Income for men has been flat since 1970 when adjusted for inflation. All the income gains have come from women entering the workforce, going from partial to full employment, and/or the gradual elimination of sex discrimination which drives down wages. One could also argue the cost of living has actually risen faster than official inflation measures, especially when one includes the additional costs necessitated by both partners working full time. (Day care, outsourcing tasks like cleaning and yard work, etc.)
MOS transistors were developed prior to 1970, but not by much, and they didn't really start catching on until the 1970s. Now I'm certainly not arguing causation here, but by the same token I'm not sure it's valid to suggest (via sarcasm) that the move from vacuum tubes to transistors ushered in a new golden era of prosperity.
Dumb.. (Score:2)
I can't remember the book I read it in, but the author made the claim that if you look at the part of Moore's law about human computing power, i.e. doubling our ability to compute numbers every 18 months, and temporarily ignore other parts of Moore's law, it holds true going back to prehistoric times.
In other words silicon computer chips took up where the abacus and solid state circuitry left off, etc, going all the way back to putting marks in the sand on the ground.
If we get to the theoretical limits of si
Subatomic particles (Score:2)
Grow in other ways than smaller transistors (Score:2)
Moore's Law doesn't say chips get smaller; it says they get 2x the transistors every 18 months. It doesn't matter if chips hit a wall where transistors can't get smaller. As long as they continue to get cheaper, we will simply start growing by making bigger chips, or more chips, or 3D chips.
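As a rough illustration of what that rate compounds to (the 20-year horizon is just an example, matching the summary's time frame):

    N(t) = N_0 * 2^(t / 1.5)               (t in years, doubling every 18 months)
    over 20 years: 2^(20 / 1.5) = 2^13.3, roughly a 10,000x increase

So a wall on transistor size only ends the trend if bigger dies, more chips, and 3D stacking stop multiplying N_0.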
Further, we can get speed other ways: no moving parts, better software design and optimization, simplification. iOS v4.3 on a single core 1GHz ARM feels faster than Mac OS v10.6 on dual core 2GHz Intel because of factors
professional expert (Score:5, Insightful)
I am so sick of seeing Michio Kaku all over the place...
It made sense back when he was talking about string theory. He's a physicist, after all. But these days he's just some generic scientist who's more than happy to show up on TV and talk about anything even vaguely scientific.
Did you see him commenting on the whirlpool formed after the earthquakes in Japan? Because a physicist is obviously the most qualified person they could find to talk about ocean currents and plate tectonics and whatnot.
What makes Michio Kaku any more qualified to talk about Moore's Law than I am? It isn't like he actually knows anything about microchip fabrication or economics or industrial processes... The guy is a physicist.
I disagree (Score:4, Interesting)
Like others I believe Kaku is wrong. Here is my prediction:
Within the next 20 years massively parallel processing will become more and more common, machines with a few dozen, hundreds, or even thousands of cores will be the rule, and programming languages / compilers will be able to automatically turn sequential programs into parallel ones whenever this is possible. Almost all practical computing problems and needs will turn out to be highly parallelizable. The impact of this change on the economy will be zero. Computers will never stop becoming faster and faster.
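As a minimal sketch of the data-parallel style predicted above, here is the classic pool-of-workers pattern using Python's standard library; the word-counting workload is a made-up stand-in for any embarrassingly parallel job:

    # Data-parallel sketch: the same function is mapped over independent chunks,
    # so adding cores adds throughput without changing the program's structure.
    # The word-count workload is a placeholder, not a real application.
    from multiprocessing import Pool

    def count_words(chunk):
        return len(chunk.split())

    if __name__ == "__main__":
        chunks = ["the quick brown fox", "jumps over", "the lazy dog"] * 1000
        with Pool() as pool:                  # one worker per available core by default
            totals = pool.map(count_words, chunks)
        print(sum(totals))                    # 9000 words across all chunks

Whether compilers will ever perform this transformation automatically, as the parent predicts, is of course the open question.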
Fifty years from now, or earlier, our massively parallel conventional machines will be replaced by quantum computers. These will first be available to governments and big companies, and within a short period of time they will be miniaturized and become available and affordable to end consumers.
Re:Maybe IT will stop sucking up 10% of economy (Score:5, Insightful)
Yeah, maybe we should stop the waste, and employ human operators to send telegraphs like they did in the good old days, scribes to write documents by hand....
Re: (Score:2)
Humans as the most important resource is the old economy ... due to overpopulation and automation the most important resources for a society are slowly becoming natural resources. For the competition between first world countries it's already true, the country with the highest median wealth has a trade surplus based on oil. The comparative advantage of an educated labour force is diminishing very fast. So no, all those people freed up in IT will not flow towards creating more wealth in other ways. It will j
Re: (Score:3, Insightful)
Really, IT has far more wide-ranging applications than a fridge and can create new ways of doing things; these may not always be better, but a good proportion of them are. People who think that IT is a waste are usually the same people who think the space program is a waste or that education is a waste. Progress has to come from somewhere; it is not magically pooped from the butts of celebrities or political figures as they dance about appealing to the masses.
Um, refrigerators use a lot of energy. (Score:2)
I take it that you are too young to pay the electricity bill... Basement? Cooler down there?
Re: (Score:2)
Re: (Score:3)
Imagine if someone else came up with a "new refrigerator" and the efforts on maintaining the "new refrigerator" came to suck up 10% of the economy.
How big of an LCD will this fridge have? Will it have USB 3, Thunderbolt or Gigabit Ethernet? How about WiFi, a full Bluetooth implementation or this newfangled NFC stuff? Will my better half be able to hook up a scale that not only weighs me before I open the fridge but after, to see exactly what I took out? Will a pre-recorded movie play that tells me I shouldn't be eating whatever I just took out, reminding me of my diet or just asking "are you going to bring me one, too?" What about commercials? "I see
Re: (Score:3)
Wait, does that mean I've been wasting the 20-30% of my budget that I spend on food? I sure am going to miss it. Oh well, at least my pastime of throwing dollar coins at drains only costs me about 2% of my income and is therefore not wasteful.
Re: (Score:2)
Re: (Score:3, Funny)
Re: (Score:2)
Even if Moore's law come to an end, we can still improve the performance of the systems via parallelism.
And by returning to writing efficient software.
Re: (Score:2)
I can confidently predict that will not be what happens.
Re: (Score:2)
Only up to a point, because increasing performance through parallelism alone means adding higher power requirements. If you think 300 W is a lot of power now, then it will be a heck of a lot worse after we've compensated for Moore's law for a couple of years by adding more parallelism.
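To put rough numbers on that (the 300 W starting point is from the parent; the doubling cadence and the assumption of zero per-core efficiency gains are purely illustrative): if performance comes only from adding cores, power scales with core count.

    P_total ~ n_cores * P_core
    Year 0: 300 W  ->  Year 2: 600 W  ->  Year 4: 1,200 W  ->  Year 6: 2,400 W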
Re: (Score:2)
Still, we have some limit
Re: (Score:2)
Sure, but the power used by computers has gone down as we went from, say, 8" drives to 5 1/4", then 3 1/2", then 2 1/2" drives. For laptops this is a problem. But let's say the average computer used as much electricity as the average room air conditioner - how much of a problem would that really be? Especially if (given the cost) the system were shared out to dumb terminals all through the house so there was only one of them.
Re: (Score:2)
The last time I replaced both my desktop and my laptop was when both of them were knackered from punishing overuse (9 years and 5 years respectively). Same goes for my last phone, and my last TV. If the only replacement computers available were essentially the same as my old ones, I'd still have replaced them, the same as I would replace a broken oven with a new (but not substantially improved) oven. And incidentally, the biggest difference between my new CPUs and the old CPUs was in the number of cores- something w
Re:Cores (Score:2)
Actually, you brought up a problem the *excerpt* doesn't even get into - the whole cores & threading tussle. (But this is from a whole book, so we can't speak to the whole contents!) It might mean that a 64-core computer only uses some 4 cores, because not every dev can cope with the complexity of parceling out tasks to an undefined number of cores and have it optimize every time. In that sense we might lose ground against Moore's law early.
Maybe it would take a hardware plateau for the big soft
Re: (Score:2, Insightful)
Uh, no. He's a gawdawful writer. The entire excerpt was a dreary and largely useless lead-in to the final paragraph. Kaku writes not as if he believes in using two words where one will do, but in using a hundred words where one will do.
And what does the reader get when you slog your way through to the last paragraph? The shocking news that quantum effects will put an end to conventional integrated circuits.
Jiminy Cricket! I wish I was smart enough to make that prediction! It's only been common knowledge in the
Re: (Score:2)
The real subject of quantum physics isn't matter at a scale of less than size N; it's matter at any scale where there's a probabilistic state rather than a discrete one. Several million atoms in Bose-Einstein condensate form can be a single quantum event, and a single electron can be a classical one. What's down at the bottom of the scale isn't unlimited smallness, but something either limited or unlimited and stranger and stranger. Maybe we will be able to store or manipulate data in amounts that extend Moore's law there
Re: (Score:2)
The end is never really the end. Just the beginning of the next thing. What makes anyone think that the quantum level is the end of smallness?
*I* don't know, but I do know that that's the bleeding obvious type of question I thought up when I was something like 11 years old (and I'm not *that* damn smart by any means). I strongly suspect that physicists have therefore already considered that issue, and even if it were true there are probably quantum issues deriving from the likes of the uncertainty principle [wikipedia.org] that would come into play anyway.