AMD Eyes Major Socket Change (pcgamer.com)
An anonymous reader quotes a report from PC Gamer: According to a tweet from Executable Fix, a well-known leaker, AMD will finally move away from PGA to LGA with the shift to AM5, the new socket set to replace AM4. They say the new socket design will be LGA-1718 -- the number representing the number of pins required for the package. They also note that a coming generation of AMD chips will support DDR5 and PCIe 4.0 with a 600-series chipset.
When we talk about PGA, we're most often discussing processors with pins sticking out the underside of a chip that slot into a motherboard with a compatible socket. An LGA design will instead see a flat array of connection points on the processor, which will align with pins within the motherboard's socket. Either way you look at it, you're getting some very bendable, if not breakable, pins. But in my opinion it's much easier to bend those pins on the CPU. While a shift to LGA may seem somewhat trivial, the change will mark a major shakeup in AMD's desktop lineup.
Eyes Major Socket Change (Score:2, Offtopic)
I see what subbie did there.
Re: (Score:1)
Electricity + Eye Sockets? .... Ruunnnn!....
Re: Eyes Major Socket Change (Score:1, Funny)
* ElectroBOOM has entered the chat
Good Luck getting a Motherboard RMA'd (Score:2)
Re: (Score:2)
That would be a claim against the carrier. Also, have stuff like that shipped by USPS; they generally handle things like that better, even if it's slower.
Re: (Score:3)
Re: (Score:2)
Did you use a credit card or PayPal? I had a faulty clothes iron, which broke again after the warranty ended (the first failure was covered). The credit card company refunded the original amount, and I threw the thing away.
Try giving them a call. Hopefully you can get your money back...
Re: (Score:2)
Good luck getting a processor RMA'd. Personally I prefer to replace a $200 motherboard rather than a $500 processor.
Re: (Score:2)
ASUS has had really crappy thermal design for a while now. I have completely stopped buying from them.
Comment removed (Score:5, Interesting)
Re: (Score:2)
The advantage is that in shipping it's easier to protect an LGA package than a PGA from damage. Although I never really had any problems with PGA, because when boxed the pins were enclosed and faced the boxed cooler. You'd basically have to hit the processor hard, and that could cause other damage beyond just the pins, something LGA wouldn't protect from either.
Re:Potential best option. (Score:5, Insightful)
That worked well in the sub-GHz regime, and when the pins were further apart. These days the socket/interface starts to become a real engineering challenge in and of itself.
Backwards compatibility. (Score:2)
Well, considering they've been using LGA [wikipedia.org] on their Threadripper and Epyc lines, it's not that big of a step; it's just on the AM4 [wikipedia.org] desktop side that it is. The thing is, it looks like AM5 will still retain the chipset design, and remembering when Ryzen first came out, it was held back by chipset availability.
Re: (Score:1)
Backwards compatibility would be great if CPUs got cheaper over time, reliably. More often than not, they just get discontinued when demand drops. That leaves anyone wanting to upgrade with the daunting prospect of finding a good-condition used part.
Re: (Score:1)
From what I've read elsewhere (mostly 3Dcenter.de, a German hardware magazine), AMD will change both the socket and memory interface with Zen4. The article says "a coming generation" for DDR5, but the LGA socket and DDR5 come together.
Which makes sense, because it means one cut in compatibility rather than two. Some people will still moan about the lack of an upgrade path, but this one was announced quite early.
Reminds me of ... (Score:4, Informative)
In preparation for soldered-on chips? (Score:1)
I fear this might be done to allow easy soldering on by criminally greedy OEMs.
It's not actually an advantage to have the pins on the MB, is it...
Re: (Score:1)
yea like PGA is impossible to solder, oh wait they have been doing it for like 50 years dink
Re:In preparation for soldered-on chips? (Score:4, Informative)
I don't know how LGA would be "in preparation for" BGA (soldered on). If you wanted BGA, you'd use BGA, not LGA.
> It's not actually an advantage to have the pins on the MB, is it.
There are two main advantages. First, LGA allows for more pins in a given area. More pins for more PCIe lanes, etc. Second, pins get bent or damaged, so you want the pins to be on the cheaper part. If you have a $600 CPU on a $150 motherboard, you want LGA. Because you'd rather replace a $150 part than a $600 part.
Also, arguably pins on the board are less likely to get damaged in the first place. When you swap a CPU, or re-install it for any reason, the CPU is likely laying on the bench while the motherboard is in the relative safety of the case.
But mainly, LGA (pins on the board) lets you have more pins.
That's one reason the AMD PGA CPUs are physically larger than the Intel LGA parts - they have to be bigger to fit all the pins, since PGA pins require more space per pin.
Re:In preparation for soldered-on chips? (Score:4, Insightful)
Bent pins on the CPU are easier to see and bend back. Bent pins on the motherboard are much more difficult to notice and also more difficult to bend back.
Re: (Score:2)
On a modern high-density chip, bending a bent pin back is an incredibly fickle process that's very likely to fail. You're very likely to break the pin in the process.
Re: (Score:2)
Still, bending back a pin on an AM4 CPU looks easier than bending back a pin on an LGA motherboard (which I had to do at least once).
Also, if a pin is bent on a CPU, I can notice it more easily (as the CPU will not go into the socket), while a bent pin on a motherboard is not so apparent and is only noticeable by its effects (no boot, etc.).
At least to me, the pins on the motherboard appear much flimsier and easier to bend by accident than the pins on an AM4 CPU.
Re: (Score:2)
It looks easier, but really isn't, which is my point. They snap very easily and it's quite rare to be able to bend a CPU pin back without it breaking.
As for identification, there I would push back. The high density of the LGA makes it immediately obvious when a pin is out of place (the pattern is broken). The reality here is that moving to LGA is necessary precisely to increase the pin density of the next generation of chips, so even if the pins were technically possible to put on the CPU side, they'd be equally impossible to
Re: (Score:2)
It used to be easy with just the tip of a mechanical pencil. The pins are too small now for that to work very well.
I'll tell you what's really hard. I accidentally pulled out a PCIe x16 slot along with the graphics card because the case was too small to unlatch the card from the slot correctly. Putting the pins in the right position to slide the plastic slot back on took hours and hours.
Re: (Score:2)
It used to be easy with just the tip of a mechanical pencil.
I always used some PCB that had about the same thickness as the gap between pins on the CPU.
Re: (Score:2)
That was a desperation attempt, I DO NOT RECOMMEND!
Re: (Score:1)
> Bent pins on the motherboard are much more difficult to notice and also more difficult to bend back.
But doable! It took me all afternoon, but I got the 3 bent pins of my LGA 775 straightened up and the thing booted again.
Re: (Score:2)
I have bent back a pin on an LGA socket as well, but noticing which pin was bent and bending it back was more difficult than with a PGA CPU.
Re: (Score:2)
Sure, but:
1: In the worst-case scenario, do you want to solder directly on a CPU or on a motherboard?
and
2: In the even worse scenario, do you want to replace your CPU or your motherboard?
Motherboards are simple and cheap.
Re: (Score:2)
Soldering on the CPU seems easier than on the motherboard, since the pins on the motherboard are really close together and overlap to an extent, while I may be able to get a really small soldering iron onto the CPU.
Replacing the motherboard is usually cheaper (unless it's an old server, in which case CPUs are cheaper than the motherboard), but replacing the CPU is easier.
Re: (Score:2)
That's a reasonable argument. There are reasonable arguments both ways, which is why one of the major manufacturers chose LGA and the other PGA. That held until the number of pins got to be too high, with Ryzen.
Re: (Score:2)
LGA allows for more pins in a given area.
How does that work? I spent some time thinking about this and can't come up with a reason why you'd be able to reduce pin spacing in LGA compared to PGA.
Re: (Score:2)
Re: In preparation for soldered-on chips? (Score:2)
Look at how the PGA is secured.
Basically a grid of openings where the pins drop in, so each pin is surrounded by a wall of material, and there are limits on how thin those walls can be. But with an LGA, there is no material surrounding each pin, so the pins can get closer to each other.
Re: (Score:2)
In an LGA, the material surrounding each pin is at the bottom of each pin: you still have to secure the pins.
Re: (Score:3)
There is a very important 3rd advantage: LGA connections have better electrical properties.
Re: (Score:1)
If you have a $600 CPU on a $150 motherboard, you want LGA. Because you'd rather replace a $150 part than a $600 part
Sigh. Beyond a beige box desktop, both components cost a lot more, and replacing a motherboard is far from free. You have downtime, technician time, etc.
Re: (Score:2)
Intel moved to LGA, what, 15 years ago? I don't remember seeing any LGA chips getting soldered down by anyone.
Also if OEMs want soldered down chips AMD will happily sell them BGA chips to use [quoracdn.net].
Re: (Score:2)
Why? If OEMs wanted to lock the user in they could do it either with a locked down BIOS or by buying one of the many BGA chips that AMD already sells.
It's not actually an advantage to have the pins on the MB, is it...
It is. There's no advantage to having pins on the processor; in fact, there are many disadvantages.
I want to upgrade... (Score:2)
But honestly, why? My AMD 3-core 3.6GHz, 8GB RAM, GeForce GTX 750 Ti 2GB system works for everything I do, including the games I do want to play. No killer app has come out that makes me want to spend $700 on a new computer system. Maybe if I did something that really was CPU intensive, but most of what my system does is run old games and the web browser (which seems to use more resources than almost any of the games, WTF).
The newest game I have is SC2 with all the expansions. Maybe Wurm Unlimited, but I think SC2 takes mor
Re: (Score:3, Insightful)
what? have you not seen the techtubers? 142FPS at 1080p UNPLAYABLE due to 4ns micro stutter! how can you possibly play anything that doesn't look like a PS1????
Re: (Score:2)
> techtubers
This sounds like an opportunity cost.
Re: (Score:2)
You know, I still play many things on an FX8350. No issues with quite a few games.
Re: (Score:2)
Starcraft 2 came out in 2010. Your hardware was decent gear when it was released in (I estimate) ~2014/2015. If Starcraft 2 is the newest game you're playing, it's no wonder that you don't feel compelled to upgrade.
Re: (Score:2)
I have a system with a 7-year-old CPU (4 yo GPU) and I would have no reason at all to upgrade were it not for games. I like to play AAAs sometimes and the system is not good enough for some.
It's a bit sad to just upgrade for the games but I won't be going top of the line and the new PC should last me through the lifespan of the new generation of consoles. So it sho
Re: I want to upgrade... (Score:1)
I got a 770 for the garage computer and it plays modern games fine on medium to high at 1080p.
Is it 4K ultra 144Hz? No, but it's fine.
Re: (Score:2)
I got a 770 for the garage computer and it plays modern games fine on medium to high at 1080p.
Is it 4K ultra 144Hz? No, but it's fine.
But that's the point. You can make out the gray hairs on your opponent when looking through a high-power scope from way too close for anything vaguely realistic. All in 144Hz glory as they sway in the AI-generated wind and are ray-trace-reflected in the water droplets on their far-too-prominent cheekbones.
Don't get me wrong, I enjoy the higher realism to a degree but it's really gotten to the point of overkill. I tend to enjoy NES/SNES games more...plus there's no upsell/loot boxes/expansion packs.
Re: (Score:2)
I was getting along with an AM3-socket system (I don't game) and then I moved to Firefox as a browser which ate up three-quarters of the available 8GB of DDR3 RAM and begged for more. I maxed out the RAM to 16GB and Firefox proceeded to use three quarters of that and started paging to hard disk (for some reason it wouldn't use the SATA SSD as virtual RAM).
I built myself a new AM4 Ryzen system this spring with 32GB of DDR4 RAM and now Firefox won't use more than 5GB or so of RAM for some reason. Go figure.
Re: (Score:2)
My FX-8350 system has a Gigabyte G1 Gaming motherboard and I've pumped it up to 32GB of ~1900 MHz RAM. (Corsair Vengeance) It's glorious for an ancient potato. I disabled paging.
Re: (Score:2)
I'm just hoping my system continues to work with no component failures but it would be a "real" shame to "have" to immediately order new parts to build a system. That would be just "horrible".
Actually, right now it would be pretty miserable, with the shortages on so much and the panic buying raising prices on everything else. It may be a fantastic time when this is over and the market overcorrects and has a surplus of everything.
Re: (Score:1)
Owner of a similarly capable PC here: AMD quad-core Phenom II, 12GB RAM and a Radeon HD 7850 / 2GB. In the last 1-2 years, game compatibility on Linux has suffered from game developers switching to Vulkan. My HD 7850 only supports Vulkan 1.0, which seems insufficient in some cases, in particular War Thunder.
Considering Egosoft games, "Standard" X3 will run decently on your system (some mods will tax it more). I also consider it quite good.
Re:I want to see a challenge (Score:5, Insightful)
Re: (Score:2)
Re: (Score:2)
I think you probably can get it very quick if you use most of your 40 pins for PCIe lanes.
But the package would need to have some internal VRMs etc. to distribute power across the chip, given that modern chips use their thousands of connections mostly for separate VCCs and GNDs.
Re: (Score:2)
Re: (Score:2)
They tried that with RAMBUS memory a while back. The problem with RAM is that it's very latency sensitive, and multiple parallel serial buses tend to have higher latency (but better throughput). They might be able to pull it off, though. I don't think it will make the number of pins decrease, however: a PCIe x16 socket has more pins than an AGP 4x socket and a lot more than classic PCI (slightly fewer than PCI-X).
What serial links do is allow you to have lots of pins running really fast. In a parallel bus those pins
Re: (Score:2)
They are retrying with CXL [samsung.com], a new standard that is said to be based on PCIe. A PCIe x16 interface contains 164 pins, which is still fewer than the 288 pins on current RAM sticks.
For the latency issue, I imagine one day the latency-critical part of memory will be embedded into the CPU package, like how they moved the northbridge into the CPU last decade, or in other words, turning what Intel experimented with in the 5775C into mainstream common practice.
Re: (Score:2)
You've got that a little backwards - an x1 bus isn't a fragment of an x16, an x16 is 16 separate x1 channels all running to the same mechanical socket. Kind of a significant difference from a logic perspective. Among other things, rewiring them back into separate mechanical sockets isn't actually very cool.
And high performance video cards are one of the *very* few common devices that have any use for an x16 socket. Even high-speed NVME drives only use x4.
For almost everything else a single PCIe x1 socket i
Re: (Score:2)
There are plans to make main memory run serially, following the trend of other interfaces like PCI -> PCIe. When that happens, there may be a pin-count shrink for CPU sockets.
That won't happen. The move to serial was made possible by advances in signalling technology, and that same technology is applied to the existing parallel interfaces on memory modules to make them as fast as they currently are.
Much like how PCIe x16's speed benefit isn't from 16 data packets sent one after the other in serial, but rather from 16 parallel PCIe lanes.
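For a rough sense of the scaling being described, here is a minimal back-of-the-envelope sketch (my own illustration, not from the thread; the per-lane signalling rates and 128b/130b encoding factor are the published PCIe figures): an x16 link is 16 independent lanes, so usable bandwidth grows linearly with lane count rather than any single lane running 16x faster.

```python
# Approximate per-direction PCIe bandwidth: lanes run independently, so an
# x16 link is simply 16 lanes' worth of x1 bandwidth aggregated.
PCIE_GENS = {
    "3.0": (8.0, 128 / 130),   # 8 GT/s per lane, 128b/130b line encoding
    "4.0": (16.0, 128 / 130),  # 16 GT/s per lane, 128b/130b line encoding
}

def lane_bandwidth_gb_s(gen: str) -> float:
    """Usable GB/s per lane, per direction (ignoring protocol overhead)."""
    gt_per_s, encoding_efficiency = PCIE_GENS[gen]
    return gt_per_s * encoding_efficiency / 8  # 8 bits per byte

for gen in PCIE_GENS:
    x1 = lane_bandwidth_gb_s(gen)
    print(f"PCIe {gen}: x1 ~{x1:.2f} GB/s, x4 ~{4 * x1:.1f} GB/s, x16 ~{16 * x1:.1f} GB/s")
```

The point of the comment above is exactly this: the gain from more lanes comes from parallelism at the link level, not from serializing the same traffic faster.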
Re: (Score:2)
Well, not many people make big DIPs anymore, certainly not 40 pin DIPs. With that said, it's actually not that hard to wire up a small QFN or small LGA chip by hand. It's a bit tedious but I've managed to solder LGA16 chips (0.5mm pitch) with a cheap reflow oven and rework them by hand with a cheap rework station. I've also deadbugged a few different QFNs in my time. I generally like mounting them on a DIP carrier so I can then stick them in a breadboard and wire up the peripheral circuitry easily.
Anyway th
Re: (Score:2)
Yeah, that's what I was thinking. My impression is that most of the pins on a CPU are dedicated to communicating with RAM, and you *really* don't want to slow that down.
On the other hand if you're integrating RAM, etc into the chip, so that you're basically talking about a system-on-chip package, the sky's the limit. The Raspberry Pi Pico board is basically a tiny chip (plus some supporting electronics) mounted on a 40-pin DIP breakout board, I can't think of any reason you couldn't integrate a full-power
Cheaper (Score:3)
Moving the pins to the socket transfers the cost from the CPU to the motherboard.
Re: (Score:2)
Which makes sense as the CPU is usually more expensive than the motherboard. If you are going to damage one it might as well be the cheaper part.
Re: (Score:2)
In my budget gamer builds the CPU and MB are usually very close to the same price.
Isn't that still a common type of configuration these days? I certainly see a lot of them offered.
Re: (Score:2)
I guess it depends what you consider a gaming rig. For eSports and the like you are right. For high-performance gaming the CPUs are pretty expensive these days.
Re: (Score:2)
Do the pins themselves actually cost much though? Especially compared to the matching ZIF socket?
I'm not sure how the total cost of the two connectors compares, but I suspect either one pales in comparison to the surrounding hardware, and that the majority of the cost is on the motherboard side either way, whether it's a PGA ZIF socket or an LGA array of micro-spring (I think?) pins. The biggest cost difference is likely which part has to be replaced if a pin breaks - and the CPU is usually far more exp
I damaged an LGA mobo last week (Score:5, Interesting)
I was really wishing I could have just bent a CPU pin and straightened it with some tweezers, then put it in the socket, and not gone through all that grief.
After saying all that, I still don't mind LGA, as the CPUs are easy to clean of any thermal paste screw-ups, but keep that thermal paste well away from an exposed mobo! I have screwed up a Dell R710 by having specks of Arctic Silver paste fall into the socket and screw things up royally. Granted, that could happen with PGA also.
Anyways - I need to not drink too much coffee and be jittery before doing CPU swaps.
Re: (Score:2)
I haven't dealt with them personally, is it really that much harder to bend an LGA pin straight again? I mean, they're usually denser so you'd need finer tools and a steadier hand, but beyond that?
Re: (Score:2)
It's fucking impossible. I bought a second-hand mobo with some bent pins in the LGA socket, and never could get them realigned. They naturally point every which way so you can't easily tell where any particular pin should go or when they're properly lined up.
PGA on the other hand is pretty simple, as all the pins line up and point in the same direction -- just look down the rows until you find a pin out of line and straighten it. An empty fine-point mechanical pencil works well to grab the pin. The only dan
Re: (Score:2)
Ah, that makes sense. Most of the close-up pictures I found showed the pins in neatly aligned rows like freshly-raked grass, usually poking through some sort of grid structure, presumably a support base for the chip.
Seems like it shouldn't be *that* hard though - just look at the bottom of the chip to find the position of the contact corresponding to the bent pin, presumably the tips are still on a nice grid layout, wherever the base may be located. Perhaps with the aid of overlaying a mirrored photograph
Re: (Score:2)
Come to think of it - are you sure your problem had anything to do with misaligned pins? Motherboards without CPUs are one of those products I would shy away from buying second-hand, since unlike CPUs which are often replaced with newer versions, there's very little reason to replace a motherboard unless it's defective.
The fact that someone would even sell you a mobo with bent pins suggests they're not the most reputable of dealers.
Re: (Score:2)
Anyways - I need to not drink too much coffee and be jittery before doing CPU swaps.
Indeed. It is a bit like brain-surgery, although unlike a surgeon you can work as slowly as you like. But mistakes get costly very fast.
DDR5 (Score:2)
Will we see any benefit to DDR5? My understanding is that there is no benefit to RAM over 3200MHz, and 3200MHz RAM is already really common.
Or is the idea that current-gen CPUs are built for slower DDR4 RAM, and when we upgrade the spec they will be able to take advantage of way faster speeds?
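For a rough sense of what the higher transfer rates buy on the bandwidth side, here is a minimal sketch of my own (not from the thread; the speed grades below are standard JEDEC transfer rates in MT/s): peak bandwidth per module is just the transfer rate times the 64-bit bus width, while latency in nanoseconds stays roughly flat across generations, which is part of why faster RAM often shows limited real-world gains.

```python
# Peak theoretical bandwidth per module = transfer rate (MT/s) * 64-bit bus / 8.
# Speed grades are standard JEDEC examples, not figures from the thread.
def dimm_peak_gb_s(mt_per_s: int, bus_width_bits: int = 64) -> float:
    """Peak GB/s for one module (ignores latency and real-world efficiency)."""
    return mt_per_s * bus_width_bits / 8 / 1000

for name, mts in [("DDR4-3200", 3200), ("DDR5-4800", 4800), ("DDR5-6400", 6400)]:
    print(f"{name}: ~{dimm_peak_gb_s(mts):.1f} GB/s peak")
# DDR4-3200 ~25.6 GB/s, DDR5-4800 ~38.4 GB/s, DDR5-6400 ~51.2 GB/s
```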
Re: (Score:2)
It'll definitely improve the APUs
Re: (Score:3)
I like that they used ECC per 32-bit channel (so, two channels of 40 bits each). I've been worried that higher chip densities and larger amounts of memory are going to need more ECC for rare events. With each DDR5 DIMM having two "banks", each with its own ECC, it's a step in the right direction.
Plus, if I read specs for DDR5 correctly, they now include voltage regula
Re: (Score:2)
I had missed that DDR5 had ECC as a standard feature. Finally! That alone is worth holding off on my next system upgrade.
Of course, if they're integrating ECC as a standard feature I imagine it's at least partially because the technology is more prone to bit-errors, but so long as the corrected reliability is better it's a win. Especially if they're using an extra parity bit to reliably at least detect 2-bit errors (assuming your 40-bit number is authoritative? You only need 39 to fix 1-bit errors).
And
Re: (Score:2)
The 40-bit channel is only for the connection from the DDR5 to the CPU—and that part is optional, resulting in both ECC and "non-ECC" configurations where the "non-ECC" version has on-die ECC for data in storage but no ECC for data in transit. According to the draft standard linked on the Wikipedia page the on-die ECC uses 8 check bits to provide reliable single-error correction for 128-bit data blocks. However, not all double-bit errors are detectable by the ECC logic, and some double-bit errors may
Re: (Score:2)
Interesting, and wonderful news, thanks. So we should only actually have to pay for 6.25% extra "parity RAM" instead of 25%. I suspect the incremental cost of the extra data lines and chipset logic to detect transmission errors is comparatively tiny - hopefully small enough that even most midrange motherboards include it.
It's a real shame that perfect Hamming codes fall just barely short of efficiently covering the power-of-two data sizes that have become popular in computers - That 8
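The check-bit counts being tossed around in this subthread follow directly from the Hamming bound: for d data bits, single-error correction needs the smallest r with 2^r >= d + r + 1, plus one extra parity bit for SECDED (detect, but not correct, double-bit errors). A minimal sketch of my own, not from the posts:

```python
# Minimum check bits for single-error correction (SEC) via the Hamming bound:
# smallest r with 2**r >= d + r + 1. Add one more parity bit for SECDED.
def sec_check_bits(data_bits: int) -> int:
    r = 1
    while 2 ** r < data_bits + r + 1:
        r += 1
    return r

for d in (32, 64, 128):
    r = sec_check_bits(d)
    print(f"{d} data bits: SEC needs {r} check bits ({r / d:.1%} overhead), "
          f"SECDED needs {r + 1}")
# 32 -> 6 (SECDED: 7, i.e. the 39-bit figure mentioned above)
# 128 -> 8, matching the on-die 8-checks-per-128 code and its 8/128 = 6.25% overhead
```

The 40-bit side-band channel (32 data + 8 check bits) pays 25% instead, because it protects each 32-bit transfer independently rather than amortizing the check bits over a 128-bit block.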
Re: (Score:2)
Zen 3+ or Zen 4? (Score:2)
What I want to know is has AMD settled on AM5 for the Zen 3+ coming later this year, or will they hold off until Zen 4?
No Better, Just a Shift of Responsibility (Score:2)
Bring on the Elastomers (Score:2)
Putting the pins on the chip (AMD) makes the chip the vulnerable-to-damage part.
Putting the pins on the motherboard (Intel) makes the motherboard the vulnerable-to-damage part.
However in the lab, for high pin count test chips, we use elastomeric connectors. A rubber sheet with millions of vertical conductors embedded in it. The elastomeric sheet is squished between the pads on the board and the pads on the chip. This can connect thousands of pins without risk of damaging either the board or the chip.
These s
PGA pins are not that "bendable" (Score:3)
These are hardened brass, usually. You need to try to bend some to see that. I understand that not many people are willing to do that to a new CPU, but I tried it on some older ones. These pins are actually quite sturdy, and you need to drop the CPU or drop something on the pins at the right angle to bend them. Or you need to misalign it on insertion and use a lot of force, and I mean a lot.
My impression is that PGA is actually a lot more robust mechanically, but LGA has better electrical properties. This is specifically interesting for PCIe 4.0, where speeds get really high and any deviation from a "raw straight wire" causes loss of speed. An LGA contact is basically a "spear" being pushed straight upwards, and that is about as good as you can get. Also, LGA allows tighter packaging because the contacts are simpler mechanically. The downside is that you cannot change the CPU very often before the socket becomes unreliable. I think for Intel the stated number was that you must not open and close the socket more than 5 times before replacing the mainboard. Let's hope AMD can push that to 10 or so, and then I think this should be entirely fine.
As AMD keeps using a socket as long as it makes sense (quite unlike Intel, which obviously makes it intentionally hard to keep using a CPU or a mainboard when the other changes), I think this move is quite reasonably timed, and I expect AMD will keep using this new socket for quite a while.
shitty move if no upgrades become available (Score:2)