Intel Hardware

Preview of Intel's Dual-Core Extreme Edition 289

Posted by Hemos
from the kicking-the-tires dept.
ThinSkin writes "Intel let ExtremeTech.com sneak behind the curtain of its anticipated Dual-Core Pentium Extreme Edition processor for a full performance preview with benchmarks. Built from essentially two Prescott cores on one die, the Extreme Edition 840 processor clocks in at 3.2GHz and contains a beefed-up power management system to keep the CPUs running cool during use. Expect Intel's dual-core line to hit the streets sometime this quarter. No word on pricing yet." Update: 04/04 17:26 GMT by T : Timmus points out FiringSquad's preview, too, writing "The benchmark results are mixed, with a few applications taking advantage of the new CPU, and some that don't." And Kez writes in reference to this article to say: "Our article on HEXUS.net, covering the P4 EE in detail, states the price as £650 (that's what we're looking at in the UK anyway, not sure about the U.S.)."
  • by ackthpt (218170) * on Monday April 04, 2005 @11:18AM (#12134563) Homepage Journal
    I love superlatives like 'Extreme' in a product name. It's so funny to look at, years later. "Hey, remember this old clunker? It was ' EXTREME !'"
    "Yeah, by today's standards it's EXTREMELY slow!"
    "Only dual core, ha ha ha ha hah!"

    I guess they can't very well call it 840i, as they've already used that for a chipset, but maybe Intel should stick to names ending with -ium and -on instead of something which timelessly proclaims some chunk of doped silicon as superior.

    Next up from Intel, the Ultra-Spifftronic-Wowee-Zappo Triple Core, with extra schmaltz!

  • Cool?!? (Score:5, Insightful)

    by Cruithne (658153) on Monday April 04, 2005 @11:19AM (#12134576)
    Running cool during use? It seems to me they'll need the power management to keep it from melting itself, judging from the heat output of just one of those beasts...
  • Extreme edition (Score:4, Insightful)

    by thundercatslair (809424) on Monday April 04, 2005 @11:22AM (#12134613)
    Why do Intel's marketers think that naming it "Extreme Edition" will make it sell more?
  • by Blitzenn (554788) on Monday April 04, 2005 @11:26AM (#12134640) Homepage Journal
    I think Intel's decision to leave out extensions developed by AMD is going to kill the processor fairly quickly. Granted, they bought the rights to them from AMD, but there must be some royalty-type deal here, because Intel is only including a handful of them. That will make their processor increasingly incompatible with the already-accepted AMD architecture. Why is Intel so reluctant to admit they are behind? They are going to kill themselves with that attitude. A couple more processor iterations and failures like this, and I expect Intel to make moves to get out of the desktop processor market altogether.
  • by LiENUS (207736) <slashdot@nOSPam.vetmanage.com> on Monday April 04, 2005 @11:28AM (#12134663) Homepage
    It looks like gamers won't be all that interested in this offering. Even once games support multi-threading, this won't end up boosting their maximum framerate much. Instead it will raise the minimum framerate and give them smoother gameplay. While this is a great improvement, unfortunately most gamers seem interested only in their max fps and not the minimum. For workstations, however, this will be great: a lower cost than dual processors means graphics design companies and advertising agencies can get their jobs done more quickly and efficiently.
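The min-versus-max framerate point above can be sketched with a toy frame-time model (all timings are made-up figures, not benchmarks of any real chip): a periodic background task that lands on the render core tanks the worst frame, while on a second core it never touches the render loop.

```python
# Toy frame-time model: the game does 10 ms of render work per frame, plus an
# occasional 20 ms background task (AI tick, asset load). On one core the task
# steals render time and craters the worst frame; with two cores it runs on
# the other core. All numbers are illustrative, not measured.

RENDER_MS = 10.0
TASK_MS = 20.0

def frame_times(frames, cores):
    """Per-frame times in ms for a run of `frames` frames."""
    times = []
    for i in range(frames):
        t = RENDER_MS
        if i % 30 == 0 and cores == 1:  # background task hits the render core
            t += TASK_MS
        times.append(t)
    return times

for cores in (1, 2):
    ts = frame_times(120, cores)
    avg_fps = 1000.0 / (sum(ts) / len(ts))
    min_fps = 1000.0 / max(ts)
    print(f"{cores} core(s): avg {avg_fps:.0f} fps, min {min_fps:.0f} fps")
```

In this model the average fps barely moves between one and two cores, but the minimum fps roughly triples — exactly the "smoother, not faster" effect the comment describes.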
  • Uh, right.... (Score:4, Insightful)

    by imroy (755) <imroykun@gmail.com> on Monday April 04, 2005 @11:31AM (#12134695) Homepage Journal
    ...contains a beefed-up power management system to keep the CPUs running cool during use

    So in other words... unless you have extreme cooling, this thing will never run at full speed for long. When it does, it will quickly heat up, and this power management will throttle the clock speed and core voltage. Apps may start up a little faster, but long-term consumers of CPU cycles (e.g. media encoding, some games, etc.) won't see much improvement. But I'm sure lots of clueless consumers will go for this new eXtreme CPU. Can't wait to see what bullshit analogy Intel will come up with for the TV ads...
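The throttling argument above can be illustrated with a crude thermal model (every constant here is invented for the sketch; real TM2-style throttling is far more involved): a short burst finishes at the headline clock, while a sustained load trips the temperature limit and averages well below it.

```python
# Toy thermal-throttling model: a hypothetical power-management unit drops
# the clock whenever die temperature crosses a limit, so sustained full-load
# throughput falls short of the headline clock. All figures are made up.

BASE_CLOCK_GHZ = 3.2
THROTTLED_CLOCK_GHZ = 2.8
TEMP_LIMIT_C = 70.0

def simulate(seconds, ambient=40.0, heat_per_ghz=12.0, cooling=30.0):
    """Return total work done (GHz-seconds) under a crude heat-in/heat-out model."""
    temp = ambient
    work = 0.0
    for _ in range(seconds):
        clock = BASE_CLOCK_GHZ if temp < TEMP_LIMIT_C else THROTTLED_CLOCK_GHZ
        temp = max(ambient, temp + clock * heat_per_ghz - cooling)
        work += clock
    return work

# A 2-second burst sustains the full clock; a 60-second run averages lower
# because the throttle kicks in once the die heats up.
print(simulate(2) / 2, simulate(60) / 60)
```

This is the commenter's point in miniature: startup-style bursts look great, long encodes do not.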

  • by Stevyn (691306) on Monday April 04, 2005 @11:33AM (#12134717)
    Then they'll call it "ExtremeX!"

    I feel bad for the engineers who come up with these designs, which are then crapped on by their marketing department.
  • by utlemming (654269) on Monday April 04, 2005 @11:52AM (#12134915) Homepage
    Worse yet, how many of the people who could truly benefit from the power the Extreme Edition offers don't, because of the stigma of the name? I was recently told a story about a guy who turned down a job offer because he didn't fit the culture of the company: apparently every workstation had the latest, greatest gadgets, from fancy faddish mice to modded computer cases with flashing neon lights. While those things looked cool, he didn't feel he would fit in with a company that spent money on the cool stuff as opposed to spending money on development. I have to say that I feel the same way. When I am in the market for computing power, I am not interested in the faddish stuff -- I am interested in the raw numbers and whether the computer can do what I need it to do. With names like "Extreme" you're marketing to the gamers and not necessarily to the programming professional. The marketing departments should at least market a similar chip with similar abilities as a "Developer Edition." But I guess the people who would be interested in that are the guys buying the Xeons and Opterons.
  • Re:Ketchup (Score:3, Insightful)

    by hawkbug (94280) <psx AT fimble DOT com> on Monday April 04, 2005 @11:54AM (#12134931) Homepage
    Sigh... I'll say it again: Intel is NOT shipping true dual-core chips. They slapped two dies onto one package. And if you understand manufacturing, it's much more expensive to do this, and Intel would not do this in large volume without charging a massive amount of cash for each chip. And by the way, when you say shipping, can you show where you can currently purchase one of these chips? I didn't think so. It's called a paper launch, and Intel, Nvidia, ATI, and AMD are all notorious for using them. Intel might make a few of these chips and provide them to Dell for the high-end gaming segment, and Dell might sell 100 of those machines for PR. When Intel can put two cores on a single die, and can actually ship them, and people like us can buy them from places like Newegg.com, then you can claim that Intel is shipping dual-core chips. I'm not a processor fanboy either - I'm also telling you AMD is not shipping chips either, and when they do, it won't count until we can actually purchase and use them. What I am saying is that AMD will be first out of the door to ship true dual-core chips. But you know what? It doesn't matter who is first - it matters who makes the best chip for the least amount of cash. Then we'll see who succeeds and who doesn't. Paper launches don't count.
  • Re:How about (Score:2, Insightful)

    by Anonymous Coward on Monday April 04, 2005 @12:22PM (#12135211)
    Please, stop posting. You know absolutely fuck all about what you're talking about, and you're adding even more misinformation to this article.

    I would also like to say that I don't know anything about AMD's offering of dual-core, so I can't comment on why their way is better. I'm sure it is, because AMD's way is always better, but I don't actually have proof of that.

    That comment pretty much says it all about your experience in this field.
  • by MortisUmbra (569191) on Monday April 04, 2005 @12:43PM (#12135460)
    Is the way they benchmark it.

    Listen, for office productivity and "how fast can I open spreadsheets", nobody SHOULD need more than one CPU.

    The rendering tests were a little disappointing (I seem to recall a bigger gap in the AMD benchmarks), but really the point of dual CPUs is, as anyone who has used one knows, responsiveness.

    Yeah, rendering times dropping to 60% of normal is nice, but let me tell you: where a normal single-CPU system would sit there gurgling and choking on its own vomit because some dirty little application decides it MUST use up all the CPU time, dual-CPU systems just go "eh, whatever, he's being a jerk, I can help you over here."

    It is SO nice to use a dual-CPU system in daily routine usage (which for me is QUITE varied) just for the increase in responsiveness alone.
  • by Blitzenn (554788) on Monday April 04, 2005 @01:52PM (#12136216) Homepage Journal
    Agreed, but there's one additional point to make about something you brought up. Intel has already lost their spot with Microsoft. Windows XP was written for the AMD processor, not the Intel cores. Microsoft actually had to go back and 'patch' their software to make it work properly with the current line of Intel processors. The new version of Windows, yet to be officially named, is also written around the AMD instruction set.

    Thanks for naming the instruction sets that I mentioned. I could not remember the name. I believe that Intel spent enough money in the AMD store that AMD granted them the rights to name the instruction sets anyway they liked, as long as they did not change the functionality. That has to rub the Intel people the wrong way.
  • by Blitzenn (554788) on Monday April 04, 2005 @02:07PM (#12136364) Homepage Journal
    Here is the exact filed text of the ten-year licensing agreement you are referring to: Intel AMD cross licensing agreement [findlaw.com] Nowhere is there any legal language in it that covers future developments. That would be a really stupid business move on anyone's part.

    The real story here is what caused Intel to agree to a license agreement to begin with. They actually were caught with their pants down on this one. They had reverse engineered everything and attempted to move forward with their reverse engineered plans and AMD blew the whistle. Intel admitted their deed, to save on litigation that was obviously not in Intel's favor (based on their forced agreements with AMD in the past).

    No, AMD definitely did this in a separate agreement and was very happy to make it public. They just wanted the publicity of Intel bringing up the rear for a change.
  • Re:Yawn... (Score:4, Insightful)

    by volsung (378) <stan@mtrr.org> on Monday April 04, 2005 @02:10PM (#12136398)
    BTW, this is why the Opterons have on-chip memory controllers. Then your aggregate memory bandwidth scales with the number of CPUs (assuming your OS is suitably NUMA-aware) and you can sidestep this problem. (More or less: a memory-hog process could start stealing bandwidth from the other CPUs if its working set doesn't all fit in one CPU's memory bank.)
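The scaling argument above can be put in numbers with a toy bandwidth model (the 6.4 GB/s figure and the spill fraction are invented for illustration, not Opteron specs): per-CPU controllers multiply aggregate bandwidth, a shared bus doesn't, and a memory hog spilling onto a neighbor's node claws some of it back.

```python
# Toy NUMA bandwidth model: each CPU has its own memory controller worth
# LOCAL_BW of bandwidth. A NUMA-aware OS keeps each process on its own node,
# so aggregate bandwidth scales with CPU count; one shared front-side bus
# does not. A hog whose working set spills to a remote node steals a
# fraction of that node's bandwidth. All figures are illustrative.

LOCAL_BW = 6.4  # GB/s per on-chip controller (made-up figure)

def aggregate_bw(cpus, shared_bus=False, hog_spill=0.0):
    """Usable aggregate bandwidth in GB/s.

    hog_spill: fraction of one remote node's bandwidth consumed by a
    memory hog whose working set doesn't fit in its local bank.
    """
    if shared_bus:
        return LOCAL_BW                  # every CPU contends for the same bus
    total = cpus * LOCAL_BW              # bandwidth scales with node count
    return total - hog_spill * LOCAL_BW  # spillover eats a neighbor's share

print(aggregate_bw(2, shared_bus=True))   # shared bus: no scaling
print(aggregate_bw(2))                    # NUMA, well-behaved: 2x
print(aggregate_bw(2, hog_spill=0.5))     # hog steals half a remote bank
```

The last line is the comment's caveat: NUMA scaling holds only while each working set stays in its local bank.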
  • Re:How about (Score:3, Insightful)

    by fitten (521191) on Monday April 04, 2005 @02:56PM (#12136877)
    Dual core is two cores on a single die. Intel's solution may not be optimal by any measure, but to call it not "true" dual core is simply AMD apologists trying to live down not being the first out the gate with one. Yes, AMD's solution is much better (in my opinion and others) and it will be the one that I buy, but you cannot dismiss Intel's chips simply because they hurt your pride.
  • Re:Ketchup (Score:3, Insightful)

    by spitefulcrow (713858) <sam@dividezero.net> on Monday April 04, 2005 @02:59PM (#12136918) Journal
    If you had read TFA, you'd know that Intel is NOT shipping the 840 EE yet. What the sites that posted TFAs received were generic boxes containing sneak previews of a chip that will most likely ship in a few months.
  • Re:Cool?!? (Score:2, Insightful)

    by oc255 (218044) <milkfilk @ y a h oo.com> on Monday April 04, 2005 @03:27PM (#12137244) Homepage
    But the Apple G5 hair dryer reportedly pulls much less. Seriously, how are we going to put these in 1U units, stacked tight?! When will air not cut it? I bought a dual-Xeon box that seriously will blow a piece of paper out of your hands if you stand at the back of it in the rack.

    Apple has liquid-cooled options (dual 2.5GHz), Dell does not (yes, custom PCs exist). When will we start seeing major PC names going liquid?
