HP Answers The Question: Moore's Law Is Ending. Now What? (hpe.com) 95
Long-time Slashdot reader Paul Fernhout writes:
R. Stanley Williams, of Hewlett Packard Labs, wrote a report exploring the end of Moore's Law, saying it "could be the best thing that has happened in computing since the beginning of Moore's law. Confronting the end of an epoch should enable a new era of creativity by encouraging computer scientists to invent biologically inspired devices, circuits, and architectures implemented using recently emerging technologies." This idea is also explored in a shorter, broader article by Curt Hopkins, also with HP Labs.
Williams argues that "The effort to scale silicon CMOS overwhelmingly dominated the intellectual and financial capital investments of industry, government, and academia, starving investigations across broad segments of computer science and locking in one dominant model for computers, the von Neumann architecture." And Hopkins points to three alternatives already being developed at Hewlett Packard Enterprise -- neuromorphic computing, photonic computing, and Memory-Driven Computing. "All three technologies have been successfully tested in prototype devices, but MDC is at center stage."
Re: (Score:2)
This might be possible. A Pentium-II on current die tech would be so small that transistor switching speed would pretty much be your only concern. However, you'd need the 82459AD-version L2 cache controller/Tag RAM chip to address 3.5-4GB of RAM.
Re:Frequency stalled (Score:4, Insightful)
I think even the P2 could address 64 GB of RAM, though with a maximum of 4 GB per process. It's something a lot of people don't realize, because Microsoft disabled it on consumer OS versions.
Re: (Score:2)
I always wondered about that (until 64-bit machines became common and made it moot.) But were there any 32-bit processors that brought out more than 32-bits worth of address lines to physical RAM?
Re: (Score:2)
I always wondered about that (until 64-bit machines became common and made it moot.) But were there any 32-bit processors that brought out more than 32-bits worth of address lines to physical RAM?
Huh? Many 32-bit processors, especially those which supported some form of physical address extension, could address more than 4 gigabytes of physical memory. That includes all Intel processors starting with the Pentium Pro, which supported 64 gigabytes of physical address space. Beyond that, practicality and market segmentation were usually what limited the maximum amount of memory. Microsoft held back greater-than-4-gigabyte systems in the consumer space for years.
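As a rough illustration of the addressing arithmetic: from the Pentium Pro on, PAE exposes a 36-bit physical address space (2^36 bytes = 64 GiB), while each 32-bit process still sees a 4 GiB virtual space. The C sketch below assumes GCC or Clang on x86 (which provide <cpuid.h>) and simply checks the documented PAE and PSE-36 feature bits in CPUID leaf 1.

/* Sketch: query CPUID leaf 1 and report whether the CPU advertises PAE
 * (36-bit physical addressing on classic 32-bit parts). Assumes GCC/Clang
 * on x86/x86-64, which provide <cpuid.h>. */
#include <stdio.h>
#include <cpuid.h>

int main(void) {
    unsigned int eax, ebx, ecx, edx;

    if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx)) {
        fprintf(stderr, "CPUID leaf 1 not supported\n");
        return 1;
    }

    int pae   = (edx >> 6)  & 1;   /* bit 6:  Physical Address Extension */
    int pse36 = (edx >> 17) & 1;   /* bit 17: 36-bit page size extension */

    printf("PAE: %s, PSE-36: %s\n", pae ? "yes" : "no", pse36 ? "yes" : "no");
    printf("With PAE, physical space is 2^36 bytes = %llu GiB;\n",
           (unsigned long long)1 << (36 - 30));
    printf("each 32-bit process still sees 2^32 bytes = 4 GiB of virtual space.\n");
    return 0;
}

Build and run it on the machine in question to see which of the two addressing extensions the CPU reports.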
Re: Frequency stalled (Score:2)
Re: (Score:2)
Many CPUs still don't need active cooling. The fans in my laptop (XPS 13, i5-6200U) only come on under heavy load, and then only because of the cramped interior of the machine. The "M" CPUs in things like the new MacBooks use even less power, and the Atoms less still; I had an old Atom netbook whose cooling fan died, but the machine still ran fine (including under load) without it. (I think its CPU had a TDP of 2.5W.)
Re: (Score:3)
Because higher frequencies mean higher temperatures.
That is nonsense. Modern CPUs need more power, hence they produce more heat.
Remember when CPUs didn't need active cooling?
Yes; my 6502 ran at about 110C, but its transistors still worked at that temperature. Modern CPUs have such small transistors that they cannot work at that temperature anymore, as the electrons would simply jump several transistors away through the substrate.
Re: Frequency stalled (Score:2)
Re: Moore's Law isn't a law at all (Score:2)
Law has a scientific and a colloquial usage. "Murphy's Law" is equally not a law, and yet we're not swayed by that argument to change its name. Moore's Law was named by someone and it's in the common vernacular now. Nobody will have any idea what you're talking about if you refer to it as "Moore's Observation" or "Moore's Conjecture". It's not that you're being overly pedantic, you'
Even Moore expected it to end years ago... (Score:1)
It was simply an observation of the then-current progression of increases in transistor count in a commercially available part over a given period of time. While it has been used as a rule of thumb regarding the complexity of parts you should expect from your competitors in the same period of time, it has mostly at this point become a form of religion amongst Silicon Valley types, leading to it being a self-fulfilling prophecy, rather than something that was a necessity for the evolution that happened. Opti
Re: (Score:2)
Actually, law doesn't have a scientific meaning. It once did, but that was back when they thought of the universe as being run by the laws of God.
Related scientific terms are things like theory, hypothesis, conjecture, guesstimate, etc. None of them imply perfection. A theory is a hypothesis that has passed several tests. A conjecture is an incompletely formulated hypothesis. And a guesstimate is a conjecture that doesn't have any calculations behind it, but may fit the known data.
When you look at how
Re: (Score:2)
OT: your browser needs new batteries so your comments don't look like this - "âoeMooreâ(TM)s Observationâ or âoeMooreâ(TM)s Conjectureâ. Itâ(TM)s not that youâ(TM)re"
Re: (Score:2)
Law has a scientific and a colloquial usage. Murphy's Law is equally not a law...
In the usual usage, it's not even a law. Typically, a law is something that can be expressed in mathematical form, e.g. the law of gravitation or the laws of thermodynamics. Murphy's Law should actually be Murphy's Theory.
Re: (Score:3)
Hypothesis is a bit strong - it's more like an observation. An observation that has pushed the silicon industry forward by having that continuous improvement as a goal. But nobody thought it was some natural law.
We still have many ways to continue Moore's law. Going 3D with either monolithic chips (several logic layers per chip) or die stacking (or, of course, die stacking of multiple monolithic 3D dies). Increasing the size of chips is another way. Most chips aren't near the reticle limit and one can actually make
Re: (Score:2)
We still have many ways to continue Moore's law.
We have many theoretical ways of going forward; the problem is that no practical way has actually turned up. Computing speeds have stalled over the past several years. There are too few data points to be certain that Moore's law is dead, but it does not look good.
Re:Moore's Law isn't a law at all (Score:5, Interesting)
It's also hitting a wall for 2 reasons:
- As shrinks get closer & closer to atomic scales, they become more difficult, and therefore more expensive. As a result, despite other trends like larger-diameter wafers, process shrinks no longer result in cost savings, which is the only reason (other than capacity) that one would wanna do them in the first place
- Unlike past years, where applications would grow in complexity to quickly overwhelm the CPUs of the time, multiprocessing has completely changed the game. Although programming using multithreading & multiprocessing techniques has been around for a while, there ain't too many applications that can overwhelm multiple cores. That is a good part of the reason that Intel & AMD have slowed down in their CPU sales: not too many people have to replace laptops that they've had for years. With that gravy train drying up, there ain't much of a case to spend billions on process shrinks.
Re: (Score:2)
It's also hitting a wall for 2 reasons:
The purpose of the article wasn't to argue about the reasons it's hitting a wall, but that the fact that it is hitting a wall is a good thing for the industry. It's an optimistic view of the industry, but not terribly convincing. If the industry can't deliver results quickly, investment will dry up and Moore's law will turn from exponential, to linear, to flat.
Moore's "Economic" Law Re:Moore's Law isn't a la (Score:1)
" If the industry can't deliver results quickly, investment will dry up and Moore's law will turn from exponential, to linear, to flat."
Turn Moore's Law around a bit and bear with me:
I've begun to wonder if Moore's Law is better thought of primarily economically rather than scientifically... addressing issues with product pipeline management, upgrade demand/appetite etc. Given that Moore was managing a technology that grew in jumps and spurts as discoveries were made and problems overcome; he mig
Re: (Score:1)
More on topic; I did a quick search to see if anyone shared my view from an economic perspective and thought this was interesting: https://techpinions.com/moores-law-begins-and-ends-with-economics/46575 [techpinions.com]
Not exactly what I was stating but I found it interesting that he also made the observation that "It was an estimation that became a self-fulfilling prophecy".
Re: (Score:2)
Re: (Score:2)
True. But yield also gets complicated when factoring in other parameters that may be affected as a result of a shrink. Yeah, you'll get the square of as many die/wafer, but that doesn't necessarily imply that you won't have more defects per wafer.
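To put rough numbers on that trade-off, here is a back-of-the-envelope sketch in C. The defect density, die areas, and the simple Poisson yield model (yield = exp(-D0 * A)) are illustrative assumptions, not real process data; the point is only that a smaller die gives more candidate dies per wafer and a better per-die yield, while saying nothing about whether the wafer run itself got cheaper.

/* Back-of-the-envelope die-count and yield sketch. Uses the simple Poisson
 * approximation yield = exp(-D0 * A); all numbers are made up for
 * illustration, not real process data. Link with -lm. */
#include <stdio.h>
#include <math.h>

int main(void) {
    const double pi = 3.141592653589793;
    double wafer_diameter_mm = 300.0;
    double wafer_area = pi * (wafer_diameter_mm / 2.0) * (wafer_diameter_mm / 2.0); /* ~70,686 mm^2 */
    double d0 = 0.001;                        /* assumed defect density, defects per mm^2 */
    double die_areas[] = { 100.0, 50.0 };     /* mm^2: before and after a hypothetical shrink */

    for (int i = 0; i < 2; i++) {
        double a = die_areas[i];
        double dies  = wafer_area / a;        /* ignores edge losses and scribe lines */
        double yield = exp(-d0 * a);          /* simple Poisson yield model */
        printf("die %.0f mm^2: ~%.0f dies/wafer, yield ~%.1f%%, ~%.0f good dies\n",
               a, dies, 100.0 * yield, dies * yield);
    }
    return 0;
}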
Re: (Score:2)
But even that power reduction has limits. Something like 0.7V is the voltage a diode must see before current flows, so internally there are going to be floors on Vdd: one can't just keep reducing it with process shrinks the way one did when going from 5V to 3.3V to 1.8V. So while there may have been in-circuit level shifters in the past, one can't keep shrinking and hoping to go from 12 hr battery life to 24.
CPUs generally will be made on the cutting edge processes, and so will higher density semiconductor me
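For anyone who wants the arithmetic behind the voltage point above: dynamic switching power scales roughly as C*V^2*f, which is why the historical drop from 5V toward 1V bought so much, and why the shrinking headroom above the diode/threshold voltage leaves little left to harvest. A small C sketch with made-up capacitance and frequency values:

/* Rough dynamic-power scaling sketch: P_dyn ~= C * V^2 * f.
 * The capacitance and frequency values are made up for illustration. */
#include <stdio.h>

int main(void) {
    double c = 1.0e-9;      /* assumed switched capacitance, farads */
    double f = 2.0e9;       /* assumed clock, Hz */
    double vdd[] = { 5.0, 3.3, 1.8, 1.0, 0.8 };

    for (int i = 0; i < 5; i++) {
        double p = c * vdd[i] * vdd[i] * f;   /* watts */
        printf("Vdd = %.1f V -> dynamic power %.1f W (%.0f%% of the 5 V case)\n",
               vdd[i], p, 100.0 * (vdd[i] * vdd[i]) / (5.0 * 5.0));
    }
    return 0;
}

The quadratic dependence on Vdd is the whole story: halving the supply voltage cuts dynamic power by roughly four, which is exactly the lever that stops working once Vdd approaches the threshold region.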
Author Looking to Extend "Moore's Law" (Score:5, Insightful)
Yes, I know "Moore's Law" isn't a law but an observation.
When I RTFA, it seems the author is looking at different technologies to continue the growth of computing capability per unit of space. I also get the impression that Mr. Williams is looking to fund projects he has an eye on by saying that Si-based chips will soon no longer be economically improvable, and that VC/investment money should be looking at alternative technologies rather than continued shrinking of Si chip features.
Unfortunately, I don't see a fundamental shift in what Mr. Williams is looking for the resulting devices to do. I would think that if he was really planning on dealing with the end of Moore's Law, he would be looking at new paradigms in how to perform the required tasks, not new ways of doing the same things we do now.
Regardless, the physical end of our ability to grow the number of devices on a slab of Si has been forecast for more than forty years now. Don't forget that as the devices have gotten smaller, the overall wafer and chip sizes have grown, as have yields, which means a continuing drop in cost per Si capacitor/transistor along with an increase in capability per chip. I would be hesitant to invest in technologies that depend on Si chips' trend of becoming cheaper and more capable ending within the next few years.
Re:Author Looking beyond "Moore's Law" (Score:3, Interesting)
The issue he offers up for consideration is that further spending of even more $B to move Moore one step to 5nm or beyond would be better spent on looking to other
Re: (Score:1)
Slogging through The Machine. Better than nothing but I tire of not having actual gear to use, same for 3D-Xpoint for that matter.
Re: Moore's Law Is Ending (Score:2)
Moore's law just leads to bloatware. (Score:1)
Stopping Moore's law means that millennial programmers' addiction to bloatware like react.js and Electron will be curbed and we will be forced to write lean, mean programs again. Remember when a gigabyte of RAM meant a high-end workstation? Now it's a cheap netbook.
Ssshhhh, don't frighten the children! (Score:2)
Don't tell them they might have to bin their handholding, inefficient, bloated frameworks, or trade in their scripting or VM languages for something that compiles to machine code, where they might - horrors! - actually have to have a clue about how memory (de)allocation, threading, multi-process code, DB normalisation, and sockets actually work. Or know how to pick the best sorting algorithm for the data size and complexity they're working with and not just hope the 21 year old hipster who wrote the dUdeF
Been down this road before (Score:5, Funny)
Sounds like HP is about to make an itanic breakthrough
Or alternatively... (Score:2)
We start to megazord the chips, kinda like those HBM memory modules, but for several things like cores, GPUs, or even lower-level components like ALUs, pipelines, etc.
Of course, we're talking about a cooling and interconnection hell here, but it will probably be more economically feasible than trying to make sub-atomic transistors.
Maybe it's a good thing for computer science (Score:2)
It's not good for anyone else. Fast, simple, cheap improvements mean my computer today is absurdly faster than the C64 I had 30 years ago. And my dad can tell how much faster those were than vacuum tubes 60 years ago. A friend of mine has two classic cars, ~30 and ~50 years old. Maybe they're not quite as reliable or safe as modern cars, but they go fast enough to mingle well with other cars. I think it'd be pretty sad if in 2047 base performance is pretty much the same and we just do it "smarter"
Re: (Score:2)
I wouldn't expect too much improvement on the algorithm side of things, we've already been optimizing those as much as possible for years because current hardware can't keep up.
As for more cores, eventually power and heat requirements will become too large. Even if you can generate free power, the heat will have to go somewhere, which will become a practical limit within just a few iterations if we want to keep following Moore's law. At the point where your computer is providing the heat for your entire house
Re: (Score:2)
I wouldn't expect too much improvement on the algorithm side of things, we've already been optimizing those as much as possible for years because current hardware can't keep up.
I think that they're talking about doing more low-level programming instead of things like Java.
Re: (Score:2)
I don't know how quickly stuff will change, but we have two fronts: first, new main memory technologies, like magnetic-based RAMs, and secondly, different operating systems to exploit them. Better/different operating systems are probably the slow part.
Memristor-based RAM (see: https://en.wikipedia.org/wiki/... [wikipedia.org]; non-volatile main memory that retains its contents when switched off) will blur the distinction between RAM and the "hard drive" or any other external memory.
Bottom line that means with a new operation sys
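One way to get a feel for what non-volatile main memory does to software is today's memory-mapped files: once a persistent region is mapped into the address space, "saving" is just writing through a pointer. A minimal POSIX sketch in C (the file name and size are arbitrary, and real persistent-memory programming models add cache-flush and ordering guarantees this toy ignores):

/* Toy analogy for persistent main memory: map a file, write through a
 * pointer, and the data is there on the next run. File name and size are
 * arbitrary; real NVM programming models add cache-flush/ordering rules. */
#include <stdio.h>
#include <string.h>
#include <fcntl.h>
#include <unistd.h>
#include <sys/mman.h>

int main(void) {
    const size_t size = 4096;
    int fd = open("persistent.bin", O_RDWR | O_CREAT, 0644);
    if (fd < 0) { perror("open"); return 1; }
    if (ftruncate(fd, size) < 0) { perror("ftruncate"); return 1; }

    char *mem = mmap(NULL, size, PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0);
    if (mem == MAP_FAILED) { perror("mmap"); return 1; }

    printf("previous contents: \"%.64s\"\n", mem);   /* empty on the first run */
    strcpy(mem, "state written straight through a pointer");

    munmap(mem, size);   /* with real persistent memory there is no load/save step at all */
    close(fd);
    return 0;
}

Run it twice: the second run prints the string the first run stored, with no explicit load or save anywhere in the program.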
Re: (Score:2)
Considering the alternatives that he is proposing, and the increasing costs of fabs as features shrink, it might be a good thing.
OTOH, either neuromorphic chips or memory centered chips will drastically redefine the art of programming, possibly rendering current skills obsolete. So it's not likely to be good for the current programmers. (Even multiprocessing needs changes that aren't yet widespread.) The third option, photonic computing, might well not redefine programming, but that's a "might" depending
Not good at all. (Score:2)
There's nothing good about this at all for a consumer. Maybe for a lazy, has-been corporation like HP, who can wallow in their simplistic and outdated designs that barely need to change. This is a sign that a market has reached stagnation and has nowhere else to go. As a hardware and computer nerd, this is a dark age.
Re: (Score:3)
https://msdn.microsoft.com/en-... [microsoft.com]
Registry settings that can be modified to improve operating system performance
This section provides a description of recommended values for several registry entries that impact operating system performance. These registry entries can be applied manually or can be applied via the operating system optimization PowerShell script included in Windows PowerShell Scripts.
Increase available worker threads
At system startup, Windows creates several server threads that operate as part
More efficient programming (Score:1)
We've all observed that as CPU and RAM have increased in speed/capacity, software has bloated horribly. Maybe now it's time to rethink crap like .net* and layers of virtualization, and go back to efficient code writing.
* (I've seen many major software projects more than quadruple in size, become buggier, and run much slower when they switched to .net)
Re: More efficient programming (Score:2)
And end to laziness (Score:1)
Moore's Law has allowed too many to justify lazy design decisions and programming in computer architecture. With the end of Moore's Law, progress will come by offloading the main CPUs as much as possible, echoing construction of mainframes way back when. By coprocessing I/O, audio, and even most of the GUI, and OS kernel, the main CPUs can be reserved for actual user processes. Much of the housekeeping of a modern OS does not demand the degree of processing power of the main CPUs, but often hardware has bee
Re: And end to laziness (Score:2)
Moore's Fork (Score:5, Interesting)
Re: (Score:2)
I'm very interested in H.265 video capabilities in machines. My old Nexus 7 2013 model doesn't handle HEVC well at all, but newer ARM chips have H.265 decoding built in. Same for Intel and AMD chips -- I have an old dual-core laptop that struggles to play HEVC (nearly 100% CPU usage, but it pulls it off assuming no background processes trip it up), yet newer laptops can play HEVC without much CPU usage at all, thanks to dedicated hardware decode support for it.
I've been eyeing ARM-based boards like the Raspberry, orang
Re: (Score:2)
We already did hit it. We knew it was coming for 20 years and haven't come up with an alternative to what we have now. Progress is NOT inevitable.
Re: (Score:2)
In fact, it is an old trend that is coming back.
Old systems had plenty of specialized chips for sound, graphics, IO, etc... As CPUs became more powerful, these specialized features ended up as software run by the CPU. Now, these accelerator chips are coming back, the difference being that they tend to be on the same die as the main CPU (SoC).
the birth of a legend (Score:1)
Your grandfather's Hewlett Packard made calculators that were the envy of engineers everywhere. The pilgrims of NASA jet-packed to the blast-proof Taj Mahal by the Boeing load.
Your father's HP made printer ink that was the envy of Rupert Murdoch. Bean counters sprouted sturdy beanstalks, and spouted unto the clouds in ecstasy. (This was before the one true cloud to rule them all.)
Today's HP makes drivel that's the envy of one last, eccentric greybeard who lives in a ratty shack near the beach, with old n
Re: (Score:2)
Um... the recent server offerings are pretty impressive. You should check it out.
What it really means (Score:2)
Re: (Score:2)
The answer is "not much faster". In fact, the computer you own now isn't significantly faster than the one you had 5 years ago.
No, the answer is "much faster" using dedicated hardware designs for AI applications. The first generation of Google's TPU was already 15 to 30 times faster than a generic GPU, and used much less power. This was their first attempt, and Google's not even an experienced high performance chip designer.
There is still a lot to gain, not only by limiting precision and implementing special functions, but also by recognizing that occasional mistakes are not a big problem, which allows a more efficient design.
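To make the "limiting precision" point concrete: much of the win in accelerators of this kind comes from doing the bulk arithmetic in 8-bit integers with a scale factor instead of 32-bit floats. A toy C sketch of a symmetric int8-quantized dot product (the data and scale choices are made up, and real designs add zero-points, saturation handling, and calibrated ranges):

/* Toy symmetric int8 quantization of a dot product, to show why reduced
 * precision is "good enough" for many ML-style workloads. Data and scale
 * choices are illustrative only; values are assumed to lie in [-1, 1]. */
#include <stdio.h>
#include <stdint.h>
#include <math.h>

int main(void) {
    float a[8] = { 0.9f, -0.3f, 0.5f, 0.1f, -0.7f, 0.2f, 0.8f, -0.4f };
    float b[8] = { 0.2f,  0.6f, -0.1f, 0.9f, 0.4f, -0.5f, 0.3f,  0.7f };

    /* symmetric quantization: q = round(x / scale), scale = range / 127 */
    float sa = 1.0f / 127.0f, sb = 1.0f / 127.0f;

    int32_t acc = 0;
    float ref = 0.0f;
    for (int i = 0; i < 8; i++) {
        int8_t qa = (int8_t)lrintf(a[i] / sa);
        int8_t qb = (int8_t)lrintf(b[i] / sb);
        acc += (int32_t)qa * (int32_t)qb;      /* cheap integer multiply-accumulate */
        ref += a[i] * b[i];                    /* float reference */
    }
    float approx = acc * sa * sb;              /* rescale the integer result back */

    printf("float dot product: %f\n", ref);
    printf("int8  dot product: %f (error %f)\n", approx, fabsf(approx - ref));
    return 0;
}

The integer result lands within a small rounding error of the float one, which is exactly the kind of "occasional mistake" the comment above is talking about.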
Re: (Score:3)
Good thing (Score:2)
Ever wonder why there are constantly so many versions, updates, patches, and problems? Because the hardware keeps getting updated, which gives us new potential features and introduces new problems.
Old phones keep slowing down, and losing battery life because of this.
With consistent hardware, we can:
a) Take more time and develop software WITHOUT bugs. Yes, I know this sounds ridiculous, because no one can take years to develop software when their competitors do it in weeks. No longer a problem once the hardw
Chicken little will have wrote better code. (Score:3)
For starters, I've been reading "the sky is about to fall" articles for at least 20 years about how "in 2-3 more years, Moore's Law is going to slam into a barrier imposed by the laws of physics." The entire world of computing will come crashing down and burn, the beast will rise from the pit, the keymaster and gatekeeper will find each other, the dead will dig themselves up from their graves, dogs and cats living together, mass hysteria. The doom-and-gloom crowd have been wrong every time so far. Every time, some clever person at IBM or Intel has figured out a way to cross the streams and save the world. So why should I trust the chicken littles that we're living in the end times now?
And even if Moore's Law slows down or pauses, there's plenty of room in the hardware we have today for continued improvement on the software side. Developers will just have to rely less on "pay no attention to the man behind the curtain" languages and frameworks and go back to optimizing their code for performance... like they used to before crap like Java, .Net, and Rails encouraged everyone to be lazy and rely on ever-improving CPUs to make their apps not suck. Why should the hardware guys do all the work, after all? Hell, they can start by writing their code to be properly multi-threaded. My desktop, for example, has a core i7 with 4 physical cores and 8 virtual ones via hyper-threading. I couldn't begin to count the number of times I've watched some program or another run a single core up to 100%, stop there, and ignore the 7 other threads it could be running simultaneously to improve its performance nearly 8-fold, no new or faster hardware needed.
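To make the multi-threading complaint concrete, here is a minimal pthreads sketch in C that spreads an embarrassingly parallel summation across however many hardware threads the machine reports (the workload is a placeholder; the interesting part is the splitting pattern). Build with -pthread.

/* Minimal example of spreading work across all available hardware threads
 * instead of pegging one core: each thread sums its slice of a big array.
 * The workload itself is a placeholder. */
#include <stdio.h>
#include <stdlib.h>
#include <pthread.h>
#include <unistd.h>

#define N 10000000UL

static double *data;

struct slice { size_t begin, end; double partial; };

static void *sum_slice(void *arg) {
    struct slice *s = arg;
    double acc = 0.0;
    for (size_t i = s->begin; i < s->end; i++)
        acc += data[i];
    s->partial = acc;
    return NULL;
}

int main(void) {
    long nthreads = sysconf(_SC_NPROCESSORS_ONLN);   /* e.g. 8 on a 4C/8T i7 */
    if (nthreads < 1) nthreads = 1;

    data = malloc(N * sizeof *data);
    if (!data) return 1;
    for (size_t i = 0; i < N; i++) data[i] = 1.0;

    pthread_t *tid = malloc(nthreads * sizeof *tid);
    struct slice *sl = malloc(nthreads * sizeof *sl);
    if (!tid || !sl) return 1;
    size_t chunk = N / nthreads;

    for (long t = 0; t < nthreads; t++) {
        sl[t].begin = t * chunk;
        sl[t].end   = (t == nthreads - 1) ? N : (t + 1) * chunk;   /* last thread takes the remainder */
        pthread_create(&tid[t], NULL, sum_slice, &sl[t]);
    }

    double total = 0.0;
    for (long t = 0; t < nthreads; t++) {
        pthread_join(tid[t], NULL);
        total += sl[t].partial;
    }
    printf("threads: %ld, total: %.0f\n", nthreads, total);

    free(sl); free(tid); free(data);
    return 0;
}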
Re: (Score:2)
Re: (Score:2)
I was trying to type "will have to write better code.". I'm not sure how exactly "will have wrote" came out, beyond that whole post being a pre-coffee incident.
What now? How about a complete change in computing (Score:1)
As transistors get closer to the minimum allowed size, and with clock speeds unable to push past ~4.3GHz (unless extreme cooling is used), the CPU has reached stagnation; it's just clever marketing, with little features built into the CPU to differentiate each processor (along with the stupid naming conventions now used, which mean nothing to many people). It means we should be looking elsewhere.
We need to dump silicon, or find a hybrid way of using current nm production process with a new material to increase computing p
Any headline that ends in a question mark... (Score:1)
Meg Whitman's Corollary to Moore's Law: (Score:2)
Now that Moore's Law is ending, (Score:1)