Intel Launches New Chipset
mikemuch writes "The new P35 and G33 chipsets, codenamed 'Bear Lake', are now available. They have a new memory controller that supports DDR3 RAM at up to 1333MHz, a new southbridge, and will support the upcoming 45nm Penryn CPUs. They don't yet have a genuinely new GPU — the GMA 3100 is pretty much the same as the GMA 3000 in the G965 chipset."
For a little more technical info you can also check out the Hot Hardware writeup.
What's Different (Score:4, Insightful)
What does Penryn need that's new and different in the way of support? Is it just a bump in FSB speed?
Re:What's Different (Score:4, Informative)
I know. I'm not all that excited, either.
Re: (Score:2)
Re: (Score:3, Interesting)
Rumour has it (I haven't kept up, so maybe rumour had it) that SSE4 would include scatter-gather instructions. These allow you to specify multiple memory addresses to be loaded into the same vector. This makes auto-vectorisation much easier for compilers, since your memory layout no longer has to be designed with vectorisation in mind.
If this is true, then it might need co-operation from the memory controller to work effectively. Since Intel's memory controllers are on the north bridge chip, it would
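(For what it's worth, here's a concrete sketch of the idea. SSE4 as announced doesn't include gather; the intrinsic below is the form from Intel's later AVX2 extension, which eventually delivered it, shown purely to illustrate what scatter-gather buys a compiler: the eight indices are arbitrary, so the memory layout no longer matters.)

    /* Sketch: gather eight floats from non-contiguous addresses into
       one vector register. Uses the AVX2 intrinsic, not SSE4; compile
       with: gcc -mavx2 -O2 gather.c */
    #include <immintrin.h>
    #include <stdio.h>

    int main(void) {
        float table[64];
        for (int i = 0; i < 64; i++)
            table[i] = (float)i;

        /* Eight arbitrary indices: no layout constraint on 'table'. */
        __m256i idx = _mm256_setr_epi32(3, 17, 5, 42, 0, 9, 60, 21);

        /* One gather loads table[3], table[17], ... into v.
           Scale = 4 because the elements are 4-byte floats. */
        __m256 v = _mm256_i32gather_ps(table, idx, 4);

        float out[8];
        _mm256_storeu_ps(out, v);
        for (int i = 0; i < 8; i++)
            printf("%.0f ", out[i]);
        printf("\n");
        return 0;
    }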
Re-state the question. (Score:5, Interesting)
A new chipset for DDR3 is logical in this situation: the chipset has to handle different, electrically incompatible memory.
But why does a new CPU need a new chipset?!?!?
Meanwhile, in AMD's land, there's a standard between the chipset and the CPU called HyperTransport.
As long as both the CPU and the chipset follow the same protocol, or a compatible variation of it (AM2 uses HT 2.0; AM2+ and AM3 use HT 3.0), you can pretty much pair anything you want.
The only restriction for a motherboard is a compatible socket (the CPU has an on-die memory controller and speaks directly to the RAM sticks, so there are different socket types for different memory configurations: 754 is for single-channel DDR, 939 is for dual-channel DDR, AM2 is for DDR2, and Socket F for Opterons is for DDR2 with many more HyperTransport links), and even that is getting stabilised (future AM2+ and AM3 CPUs can plug into today's AM2 boards).
Why can't Intel guarantee the same kind of stability?!?!?
Oh, yes, I know: they make chipsets and earn money by selling more motherboards.
Even back in the Pentium II/III era they went through the same cycle, releasing several incompatible chipsets and slot/socket formats to pump up motherboard sales, even though the same Slot 1 PII motherboard could have lasted until the last PIII using nothing but adapter slockets.
Meanwhile, AMD gets recommended on various websites (like Ars Technica) as the preferred solution for entry- and mid-level machines, because of cheaper boards and more stable (and upgradable) hardware.
The stability of AM2/AM2+/AM3 is one of AMD's biggest advantages over LGA775 and should be put forward.
Re: (Score:2)
The stability of AM2/AM2+/AM3 is one of AMD's biggest advantages over LGA775 and should be put forward.
What do you mean by "sho
Yeah I too wait on even better. (Score:3, Insightful)
I agree on this point. Although, as I said, there's a good commitment from AMD to stabilising the AM2/AM2+/AM3 family, we could hope for even better.
Now that the on-CPU-die memory controller has definitely decoupled the CPU/Memory (the fast evolving part) from the northbridge/motherboard (much more constant - except maybe for the graphical c
Re:Re-state the question. (Score:4, Interesting)
Note that that's not just "AMD land," that's IBM land, VIA land, Transmeta land, HP land, SUN land, and every-other-chip-manufacturer-except-Intel land.
Re:Re-state the question. (Score:5, Informative)
Why would Intel invest in chipsets and motherboards when the profit margins are slim (as compared to the much higher profit margins for a CPU)? For one, the investment in chipsets and motherboards has saved the company from major disasters on several occasions by early detection of obscure bugs. Knowledge of internal problems can allow the company to delay or cancel a product (such as Timna [pcworld.com]), which is much less harmful to a stock price than shipping a broken product.
By the way, divisions within a company that constitute a material [wikipedia.org] portion of earnings are required to report their revenue. If you want to know whether or not Intel makes money from chipsets, you can look it up in public records.
Re: (Score:2)
Re: (Score:1)
Re: (Score:2)
The real lesson you should've learned is to always buy CPU+MB+RAM in bundled form, where the retailer has already put the three components together and guarantees that they work. (MWave charges all of $9 for the service.) With a motherboard bundle, you eliminate all of the guesswork and you're sure to get a working setup. Some places call this an "assemble & test" o
You must be new here (Score:2)
Re: (Score:2, Insightful)
Why can't Intel guarantee the same kind of stability?!?!?
You've got to be fucking shitting me. What are you high on? Because I'd like some of that. I can't see a single statement in your post that isn't absurd and that doesn't turn the truth on its head.
There are plenty of reasons to favor AMD over Intel, but sockets are not one of them.
Have you checked the longevity of LGA775, the only desktop and entry-level server socket that matters? And have you compared that to the longevity of AMD's sockets? Have you read the fucking article? Have you looked at Intel's CP
Re: (Score:2)
Re: (Score:1)
Are you SERIOUSLY trying to say that 3 separate AM- systems are more stable than one socket?
Re: (Score:2)
I looked for a definitive answer from nVidia or eVGA, but it's not clear whether nForce 680i boards will support Penryn/Wolfdale or not. FWIW, a moderator at the eVGA forums thinks they will [evga.com], but nobody knows for sure.
So unless Intel says otherwise, chipsets from other vendors may work with Penryn, regardless of Intel's chipset refresh for DDR3. I mean, most enthusiast boards do well over 1333 MHz FSB, and also have fine-grained voltage adjustments. Unless t
Re: (Score:1)
My old BT2 board still performs as well as current boards on the market.
That preferred solution probably doesn't represent an average between GPU-driven and CPU-driven markets. Hmm.
SSE 4 (Score:4, Informative)
As far as I know, gcc only supports up to SSE3 intrinsics. Look in pmmintrin.h
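To be concrete, here's a trivial sketch using one of those SSE3 intrinsics from pmmintrin.h (the horizontal add, which is new in SSE3):

    /* Sketch: SSE3 horizontal add via pmmintrin.h.
       Compile with: gcc -msse3 -O2 hadd.c */
    #include <pmmintrin.h>
    #include <stdio.h>

    int main(void) {
        __m128 a = _mm_setr_ps(1.0f, 2.0f, 3.0f, 4.0f);
        __m128 b = _mm_setr_ps(10.0f, 20.0f, 30.0f, 40.0f);

        /* _mm_hadd_ps is new in SSE3; the result is
           {a0+a1, a2+a3, b0+b1, b2+b3} = {3, 7, 30, 70}. */
        __m128 s = _mm_hadd_ps(a, b);

        float out[4];
        _mm_storeu_ps(out, s);
        printf("%g %g %g %g\n", out[0], out[1], out[2], out[3]);
        return 0;
    }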
Re: (Score:3, Interesting)
Maybe it does not count since it was an AMD invention rather than an Intel invention?
Re: (Score:2)
Re: (Score:2)
If you really want to get off of the bad hardware, you could maybe go with POWER or Alpha or go invent something else completely new. MS tried with Alpha for a while, but no one bought it. So it looks like it's really our fault, not theirs.
Re: (Score:2)
Amen. Microsoft supported x86, MIPS, PowerPC, and Alpha with the first release of NT 4.0. Nobody bought PowerPC and MIPS, and very, very few people bought Alpha. So by the time Win2k came around, Windows was x86-only. I really hoped Alpha or PowerPC would succeed and get us off the multilayered hack that is x86, but the masses did not agree with me.
The good thing about this debacle for MSFT is that toda
Re: (Score:2)
Nothing lazy about it.
Intel offered up the Itanium, a 64-bit platform that ran existing 32-bit code in slow motion. Folks making purchase decisions looked at
Re:What's Different (Score:4, Informative)
Re: (Score:1)
Re: (Score:2)
Do you have a link for that? (Preferably from Intel, or a motherboard vendor, or a review site that talked to Intel) Because I can't find anywhere that old motherboard incompatibility is stated definitively.
Sleep States (Score:3, Informative)
Penryn does C6 [google.com]. I don't know which, if any, requirements are satisfied in current boards.
The subsystems of the board (buses, controllers, GPU, etc.) need to function by themselves while the processor is off. I'd imagine there are also certain hardware requirements to bring the CPU out of C6 that the new boards provide.
The average enthusiast probably doesn't need outstanding battery life; it's just a nice extra. But for business/professional use, this is a very welcome development.
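If you want to see which C-states your own kernel and board actually expose, Linux's cpuidle sysfs interface will tell you; a minimal sketch, assuming the standard /sys/devices/system/cpu/cpu0/cpuidle layout:

    /* Sketch: list the idle (C-)states the kernel exposes for cpu0.
       Assumes the standard Linux cpuidle sysfs layout. */
    #include <stdio.h>
    #include <string.h>

    int main(void) {
        char path[128], name[64];
        for (int i = 0; i < 16; i++) {
            snprintf(path, sizeof path,
                     "/sys/devices/system/cpu/cpu0/cpuidle/state%d/name", i);
            FILE *f = fopen(path, "r");
            if (!f)
                break;                      /* no more states */
            if (fgets(name, sizeof name, f)) {
                name[strcspn(name, "\n")] = '\0';
                printf("state%d: %s\n", i, name);
            }
            fclose(f);
        }
        return 0;
    }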
Re: (Score:1)
OT: External Intel(r) gfx? (Score:1, Interesting)
Re: (Score:3, Interesting)
Re: (Score:1, Interesting)
Intel is therefore the best for Linux.
Re: (Score:2)
Yep -- a GMA 950 will outperform even a GeForce 8800 GTX, when the GeForce is using the nv driver!
Re: (Score:2)
Of course, it's 2D-only, so those fancy Nvidia cards are basically worthless for 3D video if you want all open-source drivers.
Kudos to Intel for releasing open-source video drivers; I just wish they'd make stand-alone PCIe boards with their chips.
Re: (Score:2)
Do Intel's X3000 open-source drivers have support for the T&L units and vertex shaders built into the X3000? The Windows drivers certainly don't [intel.com], even after 9 months on the market!
Almost 3 months ago, beta drivers were promised, but they have yet to surface. The X3000 is still using the processor to perform vertex shading / T&L, just like the GMA 900 / 950, and that's why it still gets beat by the old Nvidia GeForce 61
Re:OT: External Intel(r) gfx? (Score:4, Informative)
Well, not really, no. But huge numbers of run-of-the-mill business PCs, plus the Apple "consumer" line (mini, iMac, and MacBooks), use the standard Intel graphics hardware. It does OK for most people's purposes, and the installed base is huge; for those reasons, a bump in capabilities for the onboard graphics chip would be noteworthy.
Re:OT: External Intel(r) gfx? (Score:5, Informative)
I wouldn't go redundant like this, but both of the other replies are from ACs, and many people will never read them or know they exist.
So far, Intel is the only company with supported OSS drivers. AMD has "promised" to deliver them for ATI cards, but who knows how long that will take? And nVidia has made no such promise.
In addition, if we could get these chips without shared memory, graphics performance would likely improve and they wouldn't drag down the rest of the system. So that would be a great thing.
When we get OSS drivers for ATI, it might become possible to use one reliably under Linux (or any OS other than Mac OS, for which Apple participates in driver development). But ATI's drivers are poop anyway. Regardless, those who want a 100% OSS system cannot buy a current nVidia card, as they are unsupported; and an older nVidia card still in production is likely to come from one of the least-reputable vendors, so a card supported by the 'nv' driver that's worth using will be hard to come by. Intel is currently the only credible choice for accelerated video with OSS drivers.
Re: (Score:2)
Which is probably why they have claimed they will do it. Their drivers stink. If they can get people to code up quality drivers for little to no expense, suddenly they are much more competitive with NVIDIA, plus they've bought mindshare in the OSS community.
Re: (Score:2)
Yeah, it sounds like a win-win situation for ATI. All the proprietary, encumbered code in the world hasn't enabled them to create drivers that are worth one tenth of one shit. I've kept trying ATI off and on over the
Re: (Score:2)
The only people getting shafted by open-source software are the proprietary software vendors that compete directly against OSS solutions. Too bad, so sad. Time to move on to something else; I don't see Cadence, MentorGraphics, or AutoDesk complaining much about OSS software cutting into their business. Although AutoDesk bette
Re: (Score:2)
Re:OT: External Intel(r) gfx? (Score:4, Informative)
Re: (Score:2)
The problem isn't just one of overall bandwidth use, but also one of contention. Further, when used for 3D, memory consumption will be greater, because not only the framebuffer but also texture memory sits in system RAM.
And of course, you don't actually get
Re: (Score:2)
The problem isn't just one of overall bandwidth use, but also one of contention.
Sure, but that can be dealt with. I'm not exactly sure how current chipsets handle it, but they certainly have some buffering for the display, together with some logic for prioritization: while the display FIFO is fairly full, requests from the display controller to the memory controller get low priority, and as the FIFO drains their priority increases (see the toy sketch below).
Further, when used for 3D, memory consumption will be greater, because not only the framebuffer but also texture memory sits in system RAM.
Yes, but as I said, that doesn't count as "drags down system performance". It will "only" drag down 3D performance. It will eat some RAM, true, but as long as you have
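To make the prioritization hand-waving above concrete, here's a toy model (emphatically not how any real chipset works; just the shape of the idea): the display controller's arbitration priority rises as its scanout FIFO drains, so it always wins memory access before the FIFO underflows.

    /* Toy model of display-vs-CPU memory arbitration (a sketch only;
       real chipsets are far more involved). */
    #include <stdio.h>

    typedef struct {
        int fifo_level;   /* words currently buffered */
        int fifo_size;    /* FIFO capacity */
    } display_t;

    /* 0 = lowest priority, 255 = must-service-now: priority scales
       inversely with how full the scanout FIFO is. */
    static int display_priority(const display_t *d) {
        return 255 - (255 * d->fifo_level) / d->fifo_size;
    }

    int main(void) {
        display_t d = { .fifo_level = 0, .fifo_size = 64 };
        for (d.fifo_level = 64; d.fifo_level >= 0; d.fifo_level -= 16)
            printf("fifo %2d/64 -> priority %3d\n",
                   d.fifo_level, display_priority(&d));
        return 0;
    }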
Re: (Score:2)
This is irrelevant, because the maximum load on the system only occurs when it's doing 3D graphics.
What we need is two versions of essentially the same graphics card, one IGP and one standalone. But good luck finding that.
Or more to the point, we need to run another benchmark while a 3d benchma
Re: (Score:2, Informative)
With current memory speeds and dual-channel bandwidth, system memory can handle the additional traffic load of the graphics subsystem without suffering that much. And as for the 3D performance of those budget cards, that's mainly GPU-bound, not memory bo
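(For a rough sense of scale, a back-of-the-envelope figure assuming dual-channel DDR2-800: 2 channels × 8 bytes × 800 MT/s = 12.8 GB/s of peak bandwidth, while scanning out a 1280×1024 desktop at 32 bpp and 60 Hz costs about 1280 × 1024 × 4 × 60 ≈ 0.3 GB/s, only a few percent of it.)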
Re: (Score:2)
512MB is adequate for Windows XP (although you will notice the step up to 1GB) and for OSX (ditto), but not enough for Vista. It's more than enough for any Linux you care to use.
Note that the only OS with which you are actually doomed with 512MB is Vista...
Now, with that said, OSX is slow no matter how much RAM you have,
Re: (Score:1)
Re: (Score:1)
Note that the only OS with which you are actually doomed with 512MB is Vista...
Mmh ya, you're right, the "quite doomed" thing was... quite stretched. =)
After all, my main machine at home is also an XP2500+ with just 768MB, and it's okay with Windows XP. My sister still uses a Duron 600 with 512MB and she's fine surfing and writing.
My point was about the shared-memory stuff: its drawbacks would kick in only with a limited amount of system memory, and you'd be better off simply adding RAM than thinking about discrete memory for the integrated graphics.
And anyway I think nobody would
Re: (Score:2)
My Windows system was honestly a bit sluggish for my tastes with 1GB, though (not so much within an app as when SWITCHING between apps. If I was playing WoW and tabbed out to check something on the web, the system ground to a c
Beer Lake? (Score:3, Funny)
*hic* Best name evar!
..oh, wait.
Intel can quit development now (Score:2, Funny)
Re: (Score:3, Funny)
Re: (Score:1)
Re: (Score:1)
Or with actual performance testing (Score:4, Informative)
Re: (Score:1)
When Do We Get Onchip DSPs? (Score:4, Interesting)
All because DSP is more parallelizable than true general-purpose processing, and parallelization is the best route to increasing CPU power, just as the data to be processed is becoming inherently more parallel, simply streams of "signals", as multimedia convergence redefines computing.
So when will Intel reverse its epoch of NSP ("native signal processing" on the general-purpose CPU) and deliver new uPs with DSP embedded in hardware?
Re: (Score:3, Interesting)
Probably about the same time that web application developers realize that their problems (particularly AJAX) can be solved more efficiently with a DSP architecture and start designing tiers of servers in a pipelined DSP configuration. Considering the amount of computer science exhibited by this industry, I'd peg it at sometime around a quarter to never.
Re: (Score:3, Insightful)
And there's not much sense in Web apps being processed by an FIR or full-spectrum mixer.
All I really get from your comment is that you don't know what DSPs do, or what Intel does - or maybe how Web apps work.
Re: (Score:3, Interesting)
Intel's designs are driven by what drives the sales of their processors. For right now, that's gobs of desktop and PC Server machines. The alternative architectures are in no danger of knocking Intel out of that position. They will carve themselves a niche for now, which is why Intel has been more worried about AMD than they've been worried about IBM. Which means that Intel will sit up and take notice of the DSP-oriented chips if an
Re:When Do We Get Onchip DSPs? (Score:4, Insightful)
DSP is fast math at the expense of fast logic. Web apps have at least as much logic as math, inextricably intertwined. DSP of Web apps is inappropriate. DSPs on a chip with fast logic would be good for Web apps and everything else. Intel sells lots of CPUs to process Web apps. And IBM/Toshiba/Sony are planning to sell lots of Cells to do so.
I know what you're talking about. And I know that you don't.
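For the record, "fast math" means loops like this one: a minimal direct-form FIR filter in C (a sketch; the multiply-accumulate in the inner loop is exactly what DSP hardware dedicates silicon to, and there's no branchy logic for it to get stuck on):

    /* Sketch: direct-form FIR filter. The h[k]*x[i-k] multiply-
       accumulate is the bread and butter of DSP hardware. */
    void fir(const float *x, float *y, int n,
             const float *h, int taps) {
        for (int i = taps - 1; i < n; i++) {
            float acc = 0.0f;
            for (int k = 0; k < taps; k++)
                acc += h[k] * x[i - k];   /* one MAC per tap */
            y[i] = acc;
        }
    }

Web-app request handling has nothing that maps onto that shape.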
Re: (Score:2)
Re: (Score:2)
ArsTechnica has good coverage [arstechnica.com], and you can find more at C|Net [com.com]. Incidentally, the AC was right -- simply googling for "Intel 80 core processor" yields plenty of results. (Googling for "Intel 80 core processor Slashdot" will find the Slashdot article to which I provided the link.) Instead of ripping on the guy for being passive-aggress
Re: (Score:2)
When someone makes a counterintuitive claim, they should cite it. When asked politely to cite it, they are obligated to cite it. When they're obnoxious about it, they deserve scorn.
FWIW, "coverage on Slashdot" neither makes a claim common knowledge, nor really offers any substantiation to a claim, unless that coverage cites reli
Re: (Score:2)
Congratulations! You've won the "Stupidest Passive-Aggressive Slashdotter of the Hour" award. You win a crusty old 1990s joke to add to your collection.
Re: (Score:2)
Then, like the classic Anonymous Coward asshole, you turned totally obnoxious. So I can tell without any further help from you that you are indeed that stupid.
Re: (Score:2)
was polite.
Then you turned into an obnoxious asshole, which was completely consistent with that worthless kind of AC posting.
If you'd just cited your original anonymous claim, or just given the citation in the face of the good reason I asked for it, we could have had an interesting
Re: (Score:1)
You're wrong. That wasn't polite. Now that you have a non-AC that disagrees with you on that point, will you please just drop it?
Re: (Score:2)
I remind you that responding to ACs who make uncited claims demands only the minimum of politeness.
Re: (Score:2)
Cunt.
Re: (Score:1)
AC - 1
Brave poster - 0
Re: (Score:2)
Re: (Score:2)
Re: (Score:1)
Re: (Score:2)
NSP is the way to go only when there's CPU
Re: (Score:2)
As I mentioned, increasingly large segments of all PC processing are suited for DSP. So others, like Sony, are putting DSPs on the CPU.
I started programming/designing for DSPs over a decade and a half ago, about half my programming/designing career, so I can tell you - as I did. Proper design, proper information architecture, eliminates "crazy deadlock bugs". Or you can design it wrong, on a regular CPU without DSP
DDR3 RAM at up to 1333MHz? ::Yawn:: (Score:5, Funny)
Re: (Score:2, Funny)
DDR2 and DDR3 on the same mobo (Score:1)
What's with these codenames? (Score:1)
Re: (Score:1)
Code names are mostly for internal use anyway. It's a way of referring to a project before marketing gets a hold of it and names it something officially.
I can't believe I need to explain the purpose of code names to someone on Slashdot.
Re: (Score:2)
Also, not all designs get released. It confuses all references to "the next next next Intel chip" when "the next next Intel chip" gets canceled (see the 4.0GHz P4). Likewise, "the next Windows" isn't very descriptive, as MS has separate desktop and server lines. Windows ME would've thrown everything off, as there wasn't originally supposed to
ECC? (Score:2)
No (Score:2)
Minor nitpick: 1333 is CPU bus, not RAM. (Score:3, Interesting)
But, as many have already discovered, the previous P965 chipset can be made to support DDR2 faster than its specced 800MHz, and processors above its specced 1066MHz FSB, so a 1333MHz FSB will PROBABLY work just fine with minor BIOS tweaking, but it's still unofficial.
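(For reference: the "1333" is 1333 MT/s on a quad-pumped bus, i.e. a 333MHz base clock, and at 8 bytes per transfer that works out to roughly 1333 × 8 ≈ 10.7 GB/s of peak FSB bandwidth.)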
I'm waiting for X38, with its dual X16 PCI-E 2.0 slots, among other improvements.
Improvement but not much. (Score:2)
First off, the PCI-E 2.0 support is apparently only in the X38 'enthusiast' chipset, so scratch that one.
Also, the P35 seems fairly good with DDR3, but it comes at a hell of a price premium and certainly isn't an insane speed bump.
On top of this, the P35 supports the 45nm Penryn CPUs. Guess what? The 965 chipset also supports Penryn if the boards are designed with it in mind: some may not work, some may only need a BIOS update, but you will see Penryn working on 965 boards
Re: (Score:3, Interesting)
On another topic, I love the screenshots of the upcoming motherboards. Computer components are getting so colorful. I remember back when you got a green motherboard with black and whit
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:1)