Chip Power Breakthrough Reported by Startup 174
Carl Bialik from WSJ writes "The Wall Street Journal reports that a tiny Silicon Valley firm, Multigig, is proposing a novel way to synchronize the operations of computer chips, addressing power-consumption problems facing the semiconductor industry. From the article: 'John Wood, a British engineer who founded Multigig in 2000, devised an approach that involves sending electrical signals around square loop structures, said Haris Basit, Multigig's chief operating officer. The regular rotation works like the tick of a conventional clock, while most of the electrical power is recycled, he said. The technology can achieve 75% power savings over conventional clocking approaches, the company says.'"
Article text for the hard-of-linking (Score:3, Informative)
By DON CLARK
May 8, 2006; Page B6
A tiny Silicon Valley company is proposing a novel way to synchronize the operations of computer chips, addressing power-consumption problems that are a major issue facing the semiconductor industry.
Multigig Inc., a closely held start-up company in Scotts Valley, Calif., says its technology is a major advance over the clock circuitry used on many kinds of chips.
Semiconductor clocks work like the drum major in a marching band, sending out electrical pulses to keep tiny components on chips performing operations at the right time. In microprocessor chips used in computers, the frequency of those pulses -- also called clock speed -- helps determine how much computing work gets done per second.
One problem is that the energy from timing pulses flows in a one-way pattern through a chip until it is discharged, wasting most of the power. Clocks account for 50% or more of the power consumption on some chips, estimates Kenneth Pedrotti, an associate professor of electrical engineering at the University of California at Santa Cruz.
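The scale of the clock-power problem described above can be sketched with the textbook CMOS dynamic-power formula. All the numbers below are invented ballpark values for illustration, not figures from the article:

```python
# Illustrative only: the capacitance, voltage, and frequency figures
# below are made-up ballpark values, not measurements of any real chip.

def dynamic_power(c_farads, v_volts, f_hertz, activity=1.0):
    """Classic CMOS dynamic-power estimate: P = a * C * V^2 * f."""
    return activity * c_farads * v_volts**2 * f_hertz

# Why clocks dominate: the clock net toggles every single cycle
# (activity = 1), while typical logic nodes toggle far less often.
clock_p = dynamic_power(1e-9, 1.2, 3e9, activity=1.0)   # 1 nF clock net
logic_p = dynamic_power(4e-9, 1.2, 3e9, activity=0.1)   # 4 nF of logic

print(f"clock ~ {clock_p:.2f} W, logic ~ {logic_p:.2f} W")
```

With these assumed numbers the clock net burns more power than four times as much logic capacitance, which is roughly the "50% or more" situation Prof. Pedrotti describes.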
Partly for that reason, companies such as Intel Corp. have all but stopped increasing the clock speeds of microprocessors, a popular way to increase computing performance through most of the 1990s.
John Wood, a British engineer who founded Multigig in 2000, devised an approach that involves sending electrical signals around square loop structures, said Haris Basit, Multigig's chief operating officer. The regular rotation works like the tick of a conventional clock, while most of the electrical power is recycled, he said. The technology can achieve 75% power savings over conventional clocking approaches, the company says.
A typical chip would use an array of timing loops, in a grid akin to a piece of graph paper, Mr. Basit said. The loops automatically synchronize their timing pulses. That feature helps address a problem called "skew" -- the slightly different arrival times of timing pulses throughout a typical chip -- that tends to limit clock precision.
Multigig says its self-synchronizing loops can run efficiently at unusually high frequencies.
Mr. Pedrotti said past attempts to address the skew problem have tended to increase power consumption. He and his students, some of whom receive research funding from Multigig, have performed simulations that so far back up the company's claims, though the team is just about to start tests using actual chips, he said.
Multigig is in talks to license its technology to chip makers, as well as design some of its own products to use the clock technology. Besides microprocessors and other digital chips, the approach could help synchronize frequencies of communication chips, Mr. Basit said.
"This is a dramatic way of clocking circuits," said Steve Ohr, an analyst at Gartner Inc. He cautioned it could take years to get existing manufacturers to modify existing products to take advantage of the new technology. "Intel is not going to redesign the Pentium tomorrow because of it," he said.
I'll call bullshit (Score:2)
In the opposite corner we have the asynchronous processing folks who tell us that removing clocking will improve power consumption.
These are at odds with each other and someone has gotta be wrong. I smell a VC scam.
Re:I'll call bullshit (Score:2)
Re:I'll call bullshit (Score:4, Interesting)
Larger drivers = larger power.
Therefore, if you've got a method to make your clocks arrive more accurately, then you've got more timing margin between FFs and therefore can use smaller drivers.
Clock trees are also the major consumer of power in most designs, so anything that can reduce them is good.
Async removes the clock altogether so you save power there.
So yes both of them can be right.
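The parent's margin argument can be made concrete with the standard setup-time constraint. The delay numbers below are hypothetical, chosen only to show how reduced skew frees up slack:

```python
# Hypothetical numbers illustrating the parent's point: less clock skew
# leaves more setup margin between flip-flops, which in turn lets you
# use smaller (lower-power) drivers on the data paths.

T_CLK = 1000.0   # clock period, ps (1 GHz)
T_CQ = 100.0     # flop clock-to-Q delay, ps
T_LOGIC = 700.0  # worst-case combinational delay, ps
T_SETUP = 50.0   # flop setup time, ps

def setup_slack(skew_ps):
    """Slack in the classic setup constraint:
    T_clk >= t_cq + t_logic + t_setup + t_skew."""
    return T_CLK - (T_CQ + T_LOGIC + T_SETUP + skew_ps)

print(setup_slack(150.0))  # sloppy clocking: no margin left at all
print(setup_slack(20.0))   # tight clocking: margin to trade for smaller drivers
```

With 150 ps of skew the path is right at the edge; with 20 ps there are 130 ps of slack that a designer could spend on weaker, cheaper drivers.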
This is a breakthrough! (Score:4, Funny)
Re:This is a breakthrough! (Score:2)
not far off (Score:3, Informative)
Not so fast (Score:1)
D'oh! Looks like I won't be getting 12 hour battery life on my laptop anytime soon!
Re:Not so fast (Score:2, Funny)
Simple Math (Score:4, Informative)
Lower power means lower heat, too. (Score:3, Interesting)
Now, whether it is linear or not, any heat reduction is a Good Thing (tm).
Hopefully we can choose between faster chips at the heat levels we have now, or the same speed chips at a 37.5% reduction in heat (and points in between).
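The 37.5% figure above is just the two claimed fractions multiplied together; both inputs are claims from the article, not measurements:

```python
# The parent's arithmetic, spelled out. Both inputs are claims from
# the article, not measured data.
clock_share = 0.50    # clocks' share of chip power (Pedrotti's estimate)
clock_saving = 0.75   # MultiGig's claimed reduction in clock power

chip_saving = clock_share * clock_saving
print(f"{chip_saving:.1%}")  # chip-level reduction in power (and heat)
```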
Re:Lower power means lower heat, too. (Score:2)
Only for the CPU (Score:3, Interesting)
Re:Only for the CPU (Score:2)
Advertising lingo - "up to" (Score:4, Insightful)
Remember, in advertising-speak, "up to" means "less than". Values between 0% and 75% fulfill the conditions of being "up to a 75% savings".
Re:Advertising lingo - "up to" (Score:2)
Re:Advertising lingo - "up to" (Score:2)
You are a tard. "up to" means "at this point under definable conditions". It's like the EPA ratings on your car, EG: 23 city, 31 highway.
If you drive on a level highway, with a fully-tuned car, with a recent oil change, properly inflated tires, at 75 degrees Fahrenheit, at *exactly* 55 miles per hour, you'll see your 31 MPG.
But reality is that your tires are a tad low, you haven
Re:Advertising lingo - "up to" (Score:2)
Oh really? Check a dictionary, or even use simple logic. "Up to N" == "not greater than N" == "less than or equal to N". Assuming values can't go negative, "less than or equal to N" == "between 0 and N".
The only reason EPA ratings on cars are reasonably accurate is because it's legally required - they are forced to demonstrate that the mileage figure they give can be achieved under "perfect" conditions. Without that, there would be
Re:Advertising lingo - "up to" (Score:2)
Thank you.
Re:Advertising lingo - "up to" (Score:2)
You are a tard. "up to" means "at this point under definable conditions". It's like the EPA ratings on your car, EG: 23 city, 31 highway. If you drive on a level highway, with a fully-tuned car, with a recent oil change, properly inflated tires, at 75 degrees Fahrenheit, at *exactly* 55 miles per hour, you'll see your 31 MPG.
Poor example...you forgot to add in "with your air conditioner off" as a condition. That said, I think the post you reply to makes a good point about advertising in general.
Just as car co
Re:Advertising lingo - "up to" (Score:2)
They are required, by EPA regulation, to report what the car got on a dynamometer under certain cycles meant to simulate actually driving. The EPA regulates that this is the way. The EPA is therefore responsible for the validity of the results.
Some c
Re:Advertising lingo - "up to" (Score:2)
No, the EPA is responsible for the accuracy of the results under the specified conditions. The validity of the results is a function of whether the test conditions reflect the typical use of your average vehicle. Since (in most cases) they fail to set test conditio
Re:Advertising lingo - "up to" (Score:2)
Re:Advertising lingo - "up to" (Score:2)
Radical Breakthrough? (Score:1)
Why not? If this works it sounds like Moore's law would continue, and would give whatever company that deployed it first a performance advantage.
Is this really so radical we'll have to wait years to get it on our desks?
Re:Radical Breakthrough? (Score:5, Insightful)
Because first they're going to get a bunch of their theoreticians to work the math on the problem to make sure it's viable. Then they're going to get a bunch of their VLSI modellers to run virtual simulations on the clock modification to refine exactly how great the potential efficiency gain would be. If that turns out OK then they'd produce some simple mock-ups of the new clock architecture to make sure that it functions correctly in hardware. Then they'd go about the expensive and time-consuming process of redesigning the current chip architectures to include the new style clock. Then they'd produce an initial fabrication of the chip to run through extensive hardware testing (and on the inevitable failure they'd hop two steps back and try again.) Once they were happy with the design they'd scale up to full production and roll it out.
Everybody in the microprocessor design world remembers this [wikipedia.org] all too well.
Re:Radical Breakthrough? (Score:1)
Re:Radical Breakthrough? (Score:2)
Nah, that's when they bring in the bunny-suited lawyers to prove that they were the ones that invented the technology all along.
:-)
Re:Radical Breakthrough? (Score:2)
Pentium 4 has 64 flaws, Core Duo has 34 and counting...
At this point releasing a CPU with only one obscure FDIV bug would probably be a day to celebrate.
Re:Radical Breakthrough? (Score:3, Informative)
The Pentium also had the more egregious F00F bug, the nonexistent opcod
Re:Radical Breakthrough? (Score:2, Funny)
Re:Radical Breakthrough? (Score:2)
are any of the core solo/duo ones that bad?
Re:Radical Breakthrough? (Score:2)
It isn't every day you can read in a news article how to use Windows Calculator to make your computer output incorrect math... I remember being tickled to death that it did exactly what the article said it would.
Re:Radical Breakthrough? (Score:2)
Tools will need a rehack. (Score:3, Informative)
Why not?
For starters the automated design tools will need a rehack.
Current synchronous chips use a "clock tree" to try to get all the flops and latches to clock at once. Then the design tools assume that the outputs flip at the same time and try to route the signals so they all get through the logic to set up the flops in time for the next clock.
This scheme will produce waves of clocking that propagate across/around the chip. So
Getting closer (Score:3, Funny)
Seriously though, I'll look forward to seeing this new chip in production, since more energy-efficient chips mean less waste heat, and thus quieter computers with fewer fans. I'll trust it when I see it; I'm not so swayed by a company that is still just a "startup", probably looking to get a boost to its stock price by announcing a breakthrough.
Re:Getting closer (Score:1)
Re:Getting closer (Score:2)
You might save a lot of power (Score:2, Informative)
No overclocking (Score:3, Interesting)
Re:No overclocking (Score:2)
You're assuming that A. there can be only one pulse in flight at a time (which is probably not the case) and B. that the breadth of the pulse is constant. I would expect that in such a design, calculation might occur on the rise and the value would be propagated to the next stage of the CPU on the fall, which would mean that the pulse width and number of concurrent pulses in the loop could be adjusted to allow for significant va
Re:No overclocking (Score:2)
Re:No overclocking (Score:2)
No, but it could improve the stability of the circuit if you have problems with the computation not consistently being done by the time it needs to be propagated on the back side of the pulse. If it gets to a certain width, of course, you'd end up reducing the pulse multiplier or else you'd have problems with the data not being propagated before the next clock arrives. The point is that there's some range in the middle where it wou
Chip technology is awesome (Score:1, Interesting)
First of all, I can barely grasp how chips work in the first place, lots of yes-no-maybe so gates that the electrons have to pass through.
So, would it be possible to make a 3-D chip? Where, instead of one line or branch for the electrons to follow, there's a crazy-ass network for them to flow through?
Re:Chip technology is awesome (Score:1)
Re: (Score:3, Insightful)
Re:Chip technology is awesome (Score:2)
"But the team, writing in Physical Review Letters, believes the effect may be useful in driving coolants through overheating computer microchips."
Fractals? (Score:2)
However, one possible solution to the problem you pose would be to design the chips with lots of little holes and pumping fluid through it. A design could be based on a fractal 3D
Re:Fractals? (Score:2)
Never mind fluids... I seriously wonder why we don't already use chips formed as Sierpinski carpets, with plain ol' copper or aluminum cooling running through the chip!
You wouldn't even need a true fractal on today's chips... A mere second or third order carpet would vastly improve CPU cooling at almost no expense (well, i
Re:Fractals? (Score:2)
The problem really has to do with creating too much heat in the first place (e.g. chewing through the storage capacity of your laptop battery), and what the heck you do with the heat once it's off the chip and on the heat sink. Apple's PowerBook / MacBook case is
Re:Chip technology is awesome (Score:2)
Best solution: invent room-temperature superconductor, make the chip out of that, profit.
Second-best solution: Handle it the same way office buildings do, by "installing air-conditioning ducts". i.e. little hollow tubes full of moving air (or some sort of coolant) that run through the cube at intervals carrying the excess heat away.
Third-best solution: Run the chip slowly enough that only a little bit of heat is generated: littl
Re:Chip technology is awesome (Score:2)
There have been some interesting research projects carried out using Sierpinski cubes as the chip fabrication layout, and using the channels in t
Re:Chip technology is awesome (Score:2)
There's a circuit in the chip, which is not just "one line or branches"... it really already is a "crazy ass network" it flows through. You might be able to change the layout slightly and make the circuit itself more efficient by giving yourself the freedom of working in 3 dimensions... however I bet that would be harder to design, manufacture, and cool.
Re:Chip technology is awesome (Score:2)
A question which generates this many informative/interesting replies is, by definition, "interesting". It's merely not "informative".
-Chris
Re:Chip technology is awesome (Score:5, Informative)
P.S. In this context, the correct spelling of nerd is E-N-G-I-N-E-E-R ;^)
In most respects, chips today are ALREADY 3D in that there are multiple layers of planar (flat) metal wiring (anywhere from 4 to 8) connected by vias (vertical interconnects) over a single layer of transistors. The routing of signals on each layer is deliberately designed to be a crazy-ass network (to avoid electromagnetic signal-coupling noise between adjacent wires).
However, in current technology there's still only 1 layer of transistors, and the main limitation on adding more is that there's no good way to get rid of the transistors' heat. Even today there isn't a good way to get rid of the heat from the single layer of transistors in current chips, let alone a big pancake stack (or lasagna) of transistor layers. People are already starting to stack memory chips that don't get too hot, and I'm sure they'll eventually start doing other kinds of stacks too as they get better at figuring out the heat problem...
Heat is your enemy (Score:2)
Current CPUs keep the transistors very very close to the heatsink and still struggle to keep them cool. If you had a cube shaped chip then it would be near impossible (with traditional processes).
There are some interesting projects to get miniature coolant pipes running through the chip, but that's a way off.
Re:Chip technology is awesome (Score:1)
Sounds good but what about size? (Score:4, Insightful)
Other techniques, like multiple independent clock areas that can be shut down when not in use, seem far more beneficial, IMHO.
Re:Sounds good but what about size? (Score:1)
Re:Sounds good but what about size? (Score:2)
Multiple clock areas already exist (in fact, I'd say they exist in all modern SOCs; certainly every one I've ever worked on).
Certainly the last 3 chips I worked on were more power limited than area limited, and with modern processes that is becoming ever more the case - so another tool in the chest to trade area against power would be welcome.
less power consumption for your berserker! (Score:1, Funny)
Re:less power consumption for your berserker! (Score:2)
Sci-Fi Weapons to Join US Arsenal
U.S. Considers Anti-Satellite Laser
World domination here we come...
Whoops (Score:2)
Sci-Fi Weapons to Join US Arsenal [slashdot.org]
U.S. Considers Anti-Satellite Laser [slashdot.org]
vaporware...? (Score:5, Insightful)
Re:vaporware...? (Score:2)
Sometimes a new mind working on a problem can yield solutions much faster than 1000 people thinking "the old way".
Re:vaporware...? (Score:4, Insightful)
I'm in no way qualified to comment on the actual technology here, but I will submit that this situation isn't as unlikely as it might seem. For many problems, the potential solution-space is so large (and the cost of trying out various approaches is so significant) that even a large R&D lab with a big budget and years of effort can end up missing what in retrospect is a very clever and useful solution. It's easy to get bogged down trying "just one more tweak" of your first (or second or third) approach that you never look around and notice the other approach hiding in plain sight. Even worse, a given organization can easily build up a culture that says "this is the way we do things, because this is the way we know things work", which can discourage even bright new employees from looking at alternative methods. (i.e. Why "start from scratch" with approach B when your company has invested millions in developing approach A?)
A new startup, on the other hand, doesn't have all that baggage that might limit their point of view. Or even more likely, some bright person may have had The Big Idea, and decided to found a startup to exploit it and get rich, rather than donating his idea to some pre-existing corporation.
That said, there is plenty of room for bullshit vaporware in the world too
Re:vaporware...? (Score:2)
"More" does not always mean more (Score:2)
This is not always true. As one increases the number of humans, one increases the inherent inefficiency [wikipedia.org].
A more interesting question is whether a thousand people who love their work can do more work than 10 people who ~hate~ one another. (I'll let you know how it turns out. CV available upon request.)
Induction feedback ... (Score:2, Informative)
Technical paper? (Score:1)
Re:Technical paper? (Score:3, Informative)
Re:Technical paper? (Score:4, Informative)
http://multigig.com/pub.html [multigig.com] has some whitepapers. I read the ISSCC 2006 slide set, which let me know the general technique.
Basically, they produce a clock ring carrying a "differential" clock pair that swaps neg and pos after one lap, so its frequency is tuned by its own capacitance and inductance. They call it a "Moebius" loop since it's not really a differential pair; the clock wave makes two round trips before getting back to the start. Neighboring loops can be tuned together (though I'm not sure whether that's done just by routing the wave throughout the chip). They didn't seem to mention synchronizing the period to outside sources, and I'm not sure how they'll be able to do that.
The clocking is not the interesting part to me, but rather their logic strategy. The trick is that the logic itself has no connection to power or ground. The clock nets provide the "power and ground", and all logic must be done as differential (a and abar as inputs, q and qbar as outputs). This is where they get the power savings from--the swings are reduced and there's no path to power or ground to drain away charge. Without really discussing it, charge seems to just shift around on internal nodes between the differential logic states. They then use pure NMOS FETs for logic, which removes all PMOS. The logic will never reach the power rail, though--it will always be a Vt drop below. I just looked this over quickly, but it seems the full-swing clocks and lack of PMOS make this work out fine.
For quick adoption, they'll need to work out clever techniques to connect this logic to standard clocked logic. Otherwise, it looks only a little bit easier to use than asynchronous logic. The issues they face seem very similar to asynchronous logic issues--tool support, interface to standard clocked logic, debug, test, etc.
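The "frequency tuned by its own capacitance and inductance" idea can be sketched with a simple lossless transmission-line model. Every number below is invented for illustration; real rings and their parasitics will differ:

```python
import math

# Rough sketch of how a rotary-clock ring's own parasitics could set
# its frequency, using a simple lossless transmission-line model.
# All per-unit values and the loop length are invented assumptions.

L_PER_M = 4e-7     # loop inductance per metre (H/m), assumed
C_PER_M = 1.6e-10  # loop capacitance per metre, incl. clock load (F/m), assumed
LOOP_LEN = 3e-3    # loop circumference, 3 mm, assumed

# Wave speed on the line and the time for one lap around the loop.
v = 1.0 / math.sqrt(L_PER_M * C_PER_M)
t_lap = LOOP_LEN / v

# The Moebius-style cross-over inverts the wave each lap, so two laps
# make one full clock period: f = 1 / (2 * t_lap).
f = 1.0 / (2.0 * t_lap)
print(f"{f / 1e9:.1f} GHz")
```

With these assumed parasitics the lap time is about 24 ps, giving a ring that free-runs at around 20 GHz; the point is only that geometry and parasitics, not an external oscillator, pick the frequency.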
It's not vapor.
Sounds like adiabatic logic (Score:2)
The ring thing sounds like it's just a new clock generation scheme to go with the existing adiabatic logic techniques (which do have rather
Re:Sounds like adiabatic logic (Score:2)
I call BS (Score:5, Insightful)
The article seems to say that the 'tick' of the clock is carrying energy throughout the chip and when the 'tick' hits the edge, the energy is lost. Electronics in your typical digital circuit does not work that way. Energy does not flow through the chip with the signals (ok, it does theoretically, but that amount is negligible compared with the dynamic losses in the gates mentioned below).
You get power dissipation in each gate or buffer that changes state because of some signal, irregardless of the direction in which the information is flowing. You can not recycle this power. This comes directly from the basic principle behind CMOS technology (used by almost all digital chips today) - you are charging and discharging a capacitor.
Typical example that running signals in a circuit does not save power: take a ring oscillator (a number of inverters wired in a loop). This circuit will oscillate (send changing signals through its loop) and consume a considerable amount of power.
Re:I call BS (Score:4, Informative)
http://www.eetimes.com/news/latest/showArticle.jh
Looks interesting. I wonder what they mean by 'taps', and whether they calculated their power savings right (would each register need its own tap, and if not, is the buffer needed to boost the power from the loop included in the clock-system power?)
Geometry is significant (Score:2)
On its first analog-to-digital converter, MultiGig will implement one physical ring with four phases. Taps can be implemented at any point around the ring to gain access to any of the four phases.
I interpret this as: they will have 4 "clock" wires, each carrying a square wave with a 1/4-on, 3/4-off duty cycle, with each wire a quarter period out of phase with the next. Since the wire arrangement has previously been described as a square, this creates an in
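That reading of the four-phase taps can be sketched in a few lines. This is purely an illustration of the interpretation above, not of MultiGig's actual circuit:

```python
# Sketch of the parent's reading of the four-phase taps: four wires,
# each carrying a 25%-duty square wave, each shifted by a quarter
# period. Purely illustrative; not MultiGig's actual waveforms.

PERIOD = 8  # timesteps per clock period

def phase_wave(phase, steps=PERIOD):
    """One period of a 1/4-on square wave, offset by `phase` quarters."""
    start = phase * (steps // 4)
    return [1 if start <= t < start + steps // 4 else 0
            for t in range(steps)]

for p in range(4):
    print(p, phase_wave(p))
```

A nice property of this arrangement: at every timestep exactly one of the four phases is high, so any point on the chip can tap whichever phase arrives at the right moment.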
Re:Geometry is significant (Score:2)
Aha. That makes it clear. That probably will work to save a lot of power. Neat. I hope many chipmanufacturers (AMD, Xilinx, etc) will be able to successfully use it.
Re:I call BS (Score:5, Informative)
You're half right. You're right that what's going on is a charging and discharging of a cap, but you're wrong that the charge can't be recycled. A conventional clock works by connecting the gates of a bunch of devices (i.e. capacitance) to Vdd, then after a little time connecting it to ground instead. Wait a little bit, then repeat. What effectively happens is that you dump some amount of charge from Vdd to ground each switch, and it's gone (i.e. it's heat now). A water analogy would be a tub of water above you (Vdd), a bucket in your hand (the capacitance), and the ground (gnd). You pour some water from the tub into your bucket (charge the cap), then dump it on the ground.
It doesn't have to be this way. There are actually ways to charge a capacitor, and then pull the charge back out again (without dumping it to ground)! I'm going to assume you're familiar with LRC circuits, and how they can resonate when an impulse is applied. What's going on during the oscillations? Charge is moving into the capacitor, and then being pulled back out to the inductor. The same charge goes back and forth, ideally forever (of course, in practice, the resistance isn't 0, so you put out some heat and the oscillations die down). I'm not sure what exactly the water analogy would be - maybe a wave sloshing back and forth in a trough.
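The energy bookkeeping behind that comparison can be put in a toy calculation. The capacitance, voltage, frequency, and loss fraction are all invented; the 25% loss figure is chosen only because it reproduces the company's claimed 75% saving:

```python
# Toy comparison of the two clocking styles described above; all
# numbers are invented assumptions. Conventional clocking dissipates
# C*V^2 per cycle (charge pulled from Vdd, then dumped to ground);
# a resonant LC clock ideally only pays for resistive losses.

C = 1e-9   # clock load capacitance, F (assumed)
V = 1.2    # supply voltage, V (assumed)
F = 3e9    # clock frequency, Hz (assumed)

conventional_p = C * V**2 * F  # energy C*V^2 thrown away every cycle

# Resonant clock: charge sloshes between L and C; suppose wiring
# resistance eats 25% of the circulating energy each cycle.
loss_fraction = 0.25
resonant_p = loss_fraction * conventional_p

print(conventional_p, resonant_p)
```

Under these assumptions the resonant scheme burns a quarter of the conventional clock power, i.e. exactly the 75% saving MultiGig claims; whether real rings hit a 25% loss fraction is the open question.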
I recently attended a seminar where the presenter talked about clocking based on LRC oscillations and he had actually fabbed chips that worked. The basic idea was to put an inductor on the die, and set up oscillations between the inductor and the clock load capacitance, which results in a ticking clock. Of course, you get a sinusoidal clock instead of a nice almost-square-wave, so your circuits have to be designed a little bit differently, but the point is, it works and is doable.
Now, the technology described in this article, as best as I can tell, uses another idea - transmission lines. In a normal design, your clock grid basically looks like a bunch of capacitors with resistors in between (i.e. distributed RC). It takes time for a signal to propagate - signals propagate much slower than the speed of light, because you actually have to charge up the capacitance along the line through the resistance of the line itself. Imagine a long trough that's empty. You start pouring water in, and although water reaches the far side pretty quickly, you don't actually observe it until the water level at the far end is half way up. Signals propagate differently when wires are set up as transmission lines - they propagate at much closer to the speed of light, because you're actually sending a wave down the line (imagine creating a ripple on a trough of water, instead of actually filling and emptying the trough).
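The RC-vs-transmission-line contrast above comes down to scaling: distributed-RC delay grows with the square of wire length, while transmission-line time of flight grows linearly. The per-unit wire values below are invented for illustration:

```python
import math

# Sketch of the parent's point: on a distributed-RC wire the delay
# grows with the square of length (Elmore-style estimate), while on a
# transmission line the wavefront moves at a fixed velocity.
# All per-unit-length values are invented assumptions.

R_PER_M = 1e5    # wire resistance, ohm/m (assumed)
C_PER_M = 2e-10  # wire capacitance, F/m (assumed)
L_PER_M = 5e-7   # wire inductance, H/m (assumed)

def rc_delay(length):
    """Distributed-RC delay ~ 0.5 * R' * C' * length^2 (Elmore estimate)."""
    return 0.5 * R_PER_M * C_PER_M * length**2

def tl_delay(length):
    """Transmission-line time of flight: length * sqrt(L' * C')."""
    return length * math.sqrt(L_PER_M * C_PER_M)

for mm in (1, 2, 4):
    l = mm * 1e-3
    print(f"{mm} mm: RC {rc_delay(l):.2e} s vs TL {tl_delay(l):.2e} s")
```

With these numbers the two are comparable at 1 mm, but at 4 mm the RC delay has grown 16x while the transmission-line delay has only grown 4x, which is why wave-based clock distribution gets more attractive as chips get bigger and faster.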
Now, I don't understand how they combined charge recycling and transmission lines (I don't understand transmission lines all that well), but your arguments aren't good reasons to disregard the claims made by the company.
If you're interested, here [cmu.edu] is a little bit of info about the talk I went to.
Typical example that running signals in a circuit does not save power: take a ring oscillator (a number of inverters wired in a loop). This circuit will oscillate (send changing signals through its loop) and consume a considerable amount of power.
If you created an oscillator between an inductor and a capacitor, on the other hand, once you started it going, it would continue for a long time with minimal energy injected in the future.
Re:I call BS (Score:2, Insightful)
Re:I call BS - Wrong, Wrong, WRONG! (Score:3, Informative)
I didn't say electron flow speed changes. I said signal propagation speed
Re:I call BS - Wrong, Wrong, WRONG! (Score:2)
Re:I call BS (Score:2)
Worse than that, this isn't entirely a quirk of the technology; it's partly a basic limitation of physics/information theory. There's a certain amount of energy tha
Re:I call BS (Score:2)
Worse than that, this isn't entirely a quirk of the technology; it's partly a basic limitation of physics/information theory. There's a certain amount of energy that must be expended to delete a bit of data, and that's a hard limit.
We're still orders of magnitude away from caring about that, though.
Re:I call BS (Score:2)
Goddamnit, please don't resort to making up words in the midst of an otherwise readable post. It impacts your credibility more than you (apparently) realize...
Re:I call BS (Score:2)
Voltage flowing in loops.. (Score:3, Insightful)
Well now... (Score:2)
Re:Well now... (Score:2)
Only 75%? Personally, I would be 100% less likely to use a car that really can't go anywhere. Unless I were homeless or looking for a nice seedy place in which to Fornicate Under Carnal Knowledge...
Clockless CPUs? (Score:2)
Re:Clockless CPUs? (Score:3, Informative)
Compared to all the other logic in a CPU, from the decoders to the schedulers to the ALUs and load-store, and then all the support pipeline registers, control logic, etc., not to mention the cache...
The problem with "doing away with the clock" is being able to co-ordinate things in some usable amoun
EETimes: Xmas gift leads to rotary wave epiphany (Score:2, Interesting)
http://www.eetimes.com/news/latest/showArticle.jhtml;jsessionid=SG3NCFVRB3QWEQSNDBESKHA?articleID=187200783 [eetimes.com]
the EE Times piece (in the printed edition not up on the web) has a sidebar,
with neat background on the inventor:
________
Christmas present leads to rotary wave epiphany
The Rotary Traveling Wave technology was the brainchild of MultiGig Inc.
founder and chief technology officer John Wood, a self-taught inventor
and son of an inventor w
low power computing (Score:2)
This (highly technical) paper describes what I'm talking about:
http://www.zyvex.com/nanotech/electroTextOnly.html [zyvex.com]
This article mentions a "helical logic" which sounds a bit like what this invention is...
Re:nah (Score:2)
" Multigig, have performed simulations that so far back up the company's claims, though the team is just about to start tests using actual chips, he said. "
Given lots of unknown factors that can arise when you're using real electrons on real silicon, I like the idea, but I'll happily wait for the prototype before thinking this would be a net good thing.
Re:nah (Score:1)
Re:nah (Score:4, Interesting)
Re:nah (Score:2)
Re:nah (Score:2)
Re:nah (Score:2)
eta(trans) * eta(gen) * eta(charge) * eta(discharge) * eta(motor) * eta(trans) =
0.85 * 0.
Re:nah (Score:2)
There's a new lithium-ion variant with a nanotube-array electrode that might be a good solution for that. Charges 85% of capacity in minutes, which implies enormous power densities and minimal percentage losses to heat.
(There's also a new lead-acid cell design using graphite rather than lead for the structural support of the plates that makes a similar sort of improvement in lead-a