Intel Hardware

Preview of Intel's Dual-Core Extreme Edition

Posted by Hemos
from the kicking-the-tires dept.
ThinSkin writes "Intel let ExtremeTech.com sneak behind the curtain of its anticipated Dual-Core Pentium Extreme Edition processor for a full performance preview with benchmarks. Bundled with essentially two Prescott cores on one die, the Extreme Edition 840 processor clocks at 3.2GHz and contains a beefed-up power management system to keep the CPUs running cool during use. Expect Intel's dual-core line to hit the streets sometime this quarter. No word on pricing yet." Update: 04/04 17:26 GMT by T : Timmus points out FiringSquad's preview, too, writing "The benchmark results are mixed, with a few applications taking advantage of the new CPU, and some that don't." And Kez writes in reference to this article to say: "Our article on HEXUS.net, covering the P4 EE in detail, states the price as £650 (that's what we're looking at in the UK anyway, not sure about the U.S.)."
  • by kwoo (641864) <kjwcode&gmail,com> on Monday April 04, 2005 @11:18AM (#12134557) Homepage Journal

    On SlashMark? Namely, how many seconds does it take to compile the Linux kernel? :P

  • by ackthpt (218170) * on Monday April 04, 2005 @11:18AM (#12134563) Homepage Journal
    I love superlatives like 'Extreme' in a product name. It's so funny to look at, years later. "Hey, remember this old clunker? It was ' EXTREME !'"
    "Yeah, by today's standards it's EXTREMELY slow!"
    "Only dual core, ha ha ha ha hah!"

    I guess they can't very well call it 840i, as they've already used that for a chipset, but maybe Intel should stick to names ending with -ium and -on instead of something which timelessly proclaims some chunk of doped silicon as superior.

    Next up from Intel, the Ultra-Spifftronic-Wowee-Zappo Triple Core, with extra schmaltz!

    • by mikael (484) on Monday April 04, 2005 @11:23AM (#12134623)
      Somehow, Extremium and Extremon don't seem to have the same rhyme. Next up from Intel, the Ultra-Spifftronic-Wowee-Zappo Triple Core, with extra schmaltz! The local ice-cream van used to sell those during the Summer holidays - you had to eat them immediately, otherwise they would melt before you got inside.
    • by Stevyn (691306) on Monday April 04, 2005 @11:33AM (#12134717)
      Then they'll call it "ExtremeX!"

      I feel bad for the engineers who come up with these designs which are then crapped on by their marketing department.
      • Then they'll call it "ExtremeX!" I feel bad for the engineers who come up with these designs which are then crapped on by their marketing department.

        Which probably has a lot to do with the success of the Dilbert strip.

        This morning, on the way in to work, the BBC World Service had another feature on management (flavor-of-the-day) trends. I suppose marketing does the same thing, but nobody has actually put their finger on it, yet.

      • by utlemming (654269) on Monday April 04, 2005 @11:52AM (#12134915) Homepage
        Worse yet, how many of those people that could truly benefit from the power the Extreme Edition can offer don't, because of the stigma of the name? I was recently told a story about a guy that had a job offer but refused because he didn't fit the culture of the company: apparently every workstation had the latest, greatest gadget, from the fancy faddish mice to modded computer cases with flashing neon lights. While those things looked cool, he didn't feel that he would fit in with a company that spent money on the cool stuff as opposed to spending money on development. I have to say that I feel the same way. When I am in the market for computing power, I am not interested in the faddish stuff -- I am interested in the raw numbers and whether the computer can do what I need it to do. With names like "Extreme" you're marketing to the gamers and not necessarily to the programming professional. The marketing departments should at least market a similar chip with similar abilities as a "Developer Edition." But I guess the people that would be interested in them are the guys buying the Xeons and the Opterons.
    • well for the most part they use "EE" in place of Extreme Edition.. maybe later on they can give it a better definition like Extra Expensive
    • by SilentChris (452960) on Monday April 04, 2005 @12:17PM (#12135145) Homepage
      Kind of reminds me of the old MST3K skit.

      Crow: Hey Mike!
      Tom: Hi Mike!
      Mike: What's going on?
      Tom: EXTREME! That's what's going on Mike!
      Crow: Yeah, Mike! You should try it!
      Mike: Extreme what exactly?
      Crow: Well, take me for example. I'm into extreme Yoga... SURRRRRRGE!!!!
      Tom: And I'm into philately. I own you, Venezuela 1947! Extreeeme postage! Woo!
      Crow: Now have you thought about what you'd like to be extreme about, Mike?
      Tom: No fear, Miguel.
      Mike: I'm not really extre...oh, you know what? I really like rice.
      Crow: Ahh, well, EXTREME... RICE!!
      Tom: Rice! Thermo nuclear protection! Wooo!
      Crow: Yeah! See, Mike. Isn't rice better when it's extreme?
      Mike: Sure is, uh, we'll be right back.
      Crow: WAAAOOO!!
      Tom: Haaaaaa!!
  • Sweet! (Score:4, Funny)

    by kmartshopper (836454) on Monday April 04, 2005 @11:18AM (#12134567)
    ... something else we can use to make breakfast with!
  • Holy Cow... (Score:4, Funny)

    by Robotron23 (832528) on Monday April 04, 2005 @11:19AM (#12134573) Homepage
    We recently returned from a road trip to discover a very large box waiting for us.

    If the processor's that big, how the heck will I fit it on my motherboard?!
    • by pla (258480) on Monday April 04, 2005 @11:31AM (#12134688) Journal
      If the processor's that big, how the heck will I fit it on my motherboard?!

      Well, the processor itself only takes a few square inches - The rest of the box held the liquid nitrogen cooling system needed to keep the thing slightly cooler than the surface of the sun.
    • by ackthpt (218170) *
      If the processor's that big, how the heck will I fit it on my motherboard?!

      That was the heatsink. The processor and motherboard were in a small brown box being crushed beneath it (as dictated by Galactic Shipping Directive 4.07a(7ii)).

    • by pg110404 (836120)
      If the processor's that big, how the heck will I fit it on my motherboard?!

      Simple. You don't fit the processor on the motherboard, you fit the motherboard on the processor.

      Just don't forget to reinforce the desk.
  • Cool?!? (Score:5, Insightful)

    by Cruithne (658153) on Monday April 04, 2005 @11:19AM (#12134576)
    Running cool during use? It seems to me they'll need the power management to keep it from melting itself, judging from the heat output of just one of those beasts...
    • Re:Cool?!? (Score:2, Interesting)

      by pg110404 (836120)
      It seems to me they'll need the power management to keep it from melting itself

      Don't forget the 50 Gigawatt power supply!

      The processor alone consumes (last I heard) about 100 watts and if it's essentially two processors in one, will require a really really good power supply. That means to use this proc, you'll instantly need 100 extra watts out of your power supply.

      If they have to have power management to keep it from meltdown, just how much more computing CAN you get out of it anyway? To me the seco
      • 125W .

        holy crap...

        We're starting to get into the range of *NEW HOUSE WIRING REQUIRED* for computers.

        And most people don't need much more than what's in my pocketPC to do their email/word proc. hehe.

        • Re:Cool?!? (Score:4, Informative)

          by timeOday (582209) on Monday April 04, 2005 @01:23PM (#12135909)
          Well, don't get too excited, your average hair drier pulls 1800 W.
    • 130W design power consumption [pcmag.com], impressive number, right?

      On the other hand, the Freescale e600 dual core [freescale.com] has a power budget of 15W.

      If I were the designer of the next hybrid car, I'd go after the second one.
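A quick back-of-envelope comparison of the two power figures quoted above. The wattages are the ones cited in the thread; the 12 V automotive rail is my own illustrative assumption for the hybrid-car angle:

```python
# Back-of-envelope comparison of the two quoted power budgets.
# Wattages come from the figures linked above; the 12 V rail is
# just an illustrative assumption, not vendor data.
P4_EE_WATTS = 130   # quoted design power, dual-core Extreme Edition
E600_WATTS = 15     # quoted power budget, Freescale e600 dual core

ratio = P4_EE_WATTS / E600_WATTS
print(f"x86 part draws about {ratio:.1f}x the power")  # ~8.7x

# Current draw off a 12 V automotive rail:
amps_p4 = P4_EE_WATTS / 12
amps_e600 = E600_WATTS / 12
print(f"~{amps_p4:.1f} A vs ~{amps_e600:.2f} A at 12 V")
```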

  • How about (Score:4, Informative)

    by Adult film producer (866485) <van@i2pmail.org> on Monday April 04, 2005 @11:20AM (#12134583)
    we just call it what it is, a two-die module. This is not true dual core but two cores slapped into one chip package... Sure, you'll only be using one socket, but that's about the only difference. Architecturally, you will need to look at AMD's offerings for true dual-core.
    • we just call it what it is, a two-die module. This is not true dual core but two cores slapped into one chip package... Sure, you'll only be using one socket, but that's about the only difference. Architecturally, you will need to look at AMD's offerings for true dual-core.

      Shush! You're taking the glimmer off the chrome, just as Intel, in a slap-dash manner, try to recapture some sort of legitimacy after getting spanked by AMD, right after totally dissing 64 bits.

      You hear a tinny voice say, "32 bits should

      • Intel got spanked by AMD's on-chip memory controller. The cpus support 64bit, but they handed intel its ass in 32bit mode thanks to the lower latency.

        It was quite a strategy, showboating x64 while the memory controller silently kicked ass.
    • by MankyD (567984)
      This is not true dual core but two cores slapped into one chip package...

      Care to elaborate on the difference?
      • In the AMD world the cpus talk across the HT at like "really fast" and then they talk to the northbridge.

        In the Intel world they all share the northbridge.

        Now think about "cache coherency"...

        Tom
        • Re:How about (Score:5, Informative)

          by hawkbug (94280) <psx AT fimble DOT com> on Monday April 04, 2005 @11:44AM (#12134838) Homepage
          I don't think so - AMD boards don't have a northbridge... the memory controller is on the CPU itself.

          http://www.anandtech.com/memory/showdoc.aspx?i=2006 [anandtech.com]

          See the last paragraph
        • Re:How about (Score:3, Interesting)

          by Dink Paisy (823325)
          So what you are saying is that AMD CPUs have more overhead due to cache coherency traffic on the point-to-point CPU links, whereas Intel CPUs don't generate cache coherency traffic except on invalid misses, since they can snoop the shared memory bus? And perhaps you could clear up for me what the northbridge for a newer AMD CPU does. I thought the main function of the northbridge was the memory controller, which is included on die on newer AMD CPUs.
          • Re:How about (Score:5, Interesting)

            by tomstdenis (446163) <tomstdenisNO@SPAMgmail.com> on Monday April 04, 2005 @11:54AM (#12134926) Homepage
            No, you got it backwards. The AMD cpus [as I understand it] have DEDICATED pipes to the other cpus. They're 8/16 bits wide and run at [forget, but think it goes up to 1.6GHz].

            So cpu 2 and cpu 3 could talk and not get in the way of cpu 1 and the memory bus. Yes, there is "northbridge" for memory but there still is a memory bus. The Intel cpus have no dedicated bus and ALL talk over the same bus.

            Not having either combo of boxes I can't tell you which is faster but usually AMD is much faster than Intel just on the pure "not being a Ghz pusher".

            Tom
            • No, there is not a northbridge for the K8 core. Everything related to memory is on the chip itself.
              • I missed the "no" in there...

                The point is ...

                There is but ONE SHARED memory bus, which is a BOTTLENECK if used for cache coherency AND memory access. Holy fuck, comprehend much?

                Tom
                • Ok, so you miss the "No" and expect me to take your post for what you meant? Sorry, but I'm used to speaking in English.
      • Re:How about (Score:5, Informative)

        by freidog (706941) on Monday April 04, 2005 @01:20PM (#12135884)
        This is not true dual core but two cores slapped into one chip package...

        Care to elaborate on the difference?

        Typically what they mean is that Intel's design is not functionally different than having two distinct processors as you would in a typical SMP setup.
        If you look at the diagrams on the second page of the article, you'll see there's no direct communication between the two cores on die. If the two cores want to check cache coherency or system resource access, it's arbitrated over the system bus.
        AMD uses a 'System Request Interface' that all cores on a die connect to. There's actual local communication between the two cores. You don't have to hop onto the system bus (or HTT link in this case) to request something that's sitting right next to you. This really only works well since Opteron is a NUMA architecture to begin with: you don't have to go snooping around to see who else is using the data, because unless the local SRI has 'checked it out' you have exclusive access, and you don't need to verify that.
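The arbitration point above can be caricatured in a few lines. This is a toy count of off-chip bus transactions under invented numbers, not a model of either company's actual coherency protocol:

```python
# Toy count of off-chip bus transactions for the two layouts described
# above. The transaction counts are invented purely for illustration;
# real MESI/MOESI coherency protocols are far more involved.

def shared_fsb_traffic(coherency_checks: int, memory_accesses: int) -> int:
    """Shared-FSB layout: every core-to-core coherency check and every
    memory access competes for the one external bus."""
    return coherency_checks + memory_accesses

def sri_traffic(coherency_checks: int, memory_accesses: int) -> int:
    """SRI-style layout: core-to-core checks resolve on-die, so only
    memory accesses leave the chip."""
    return memory_accesses

checks, accesses = 1_000, 5_000
print("shared FSB:", shared_fsb_traffic(checks, accesses))  # 6000
print("on-die SRI:", sri_traffic(checks, accesses))         # 5000
```

The difference is exactly the coherency traffic, which is the poster's point: on the shared-bus design it steals memory bandwidth; on the SRI design it doesn't.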
    • by mobiux (118006)
      But then you would need to admit that AMD's technology is technically superior.

      And I doubt if intel marketing would appreciate that very much.

      He would then find himself cut off and unable to make these "preview" articles.
    • by Jeff DeMaagd (2015) on Monday April 04, 2005 @11:42AM (#12134819) Homepage Journal
      A die is a term for a discrete piece of silicon. My understanding is that both cores are on the same piece of silicon, even if they don't share anything other than power and FSB connections. I would say that it is a single die module.
      • Die vs. core (Score:3, Interesting)

        by Prof. Pi (199260)
        A die is a term for a discrete piece of silicon.

        You are correct. A "two die module" would have two separate pieces of silicon, interconnected through one of several techniques.

        But this is /., where you're supposed to cheer for AMD and mock everything Intel ever does. Just remember this, and you can get lots of 'Informative' mod points, even if you don't understand even the most basic terms of chip manufacturing. At least that's what I can figure by looking at what gets modded up around here.

    • Re:How about (Score:2, Interesting)

      by jskelly (151002)
      Isn't there also a dual-core PowerPC/G5 in the works? I think it hasn't been announced officially, but it seems to have been accidentally confirmed by IBM [theregister.co.uk] and by Apple [theregister.co.uk] as well.
    • Re:How about (Score:3, Insightful)

      by fitten (521191)
      Dual core is two cores on a single die. Intel's solution may not be optimal by any measure, but to call it not "true" dual core is simply AMD apologists trying to live down not being the first out the gate with one. Yes, AMD's solution is much better (in my opinion and others) and it will be the one that I buy, but you cannot dismiss Intel's chips simply because they hurt your pride.
  • by Anonymous Coward on Monday April 04, 2005 @11:20AM (#12134584)
    If one of the cores generates a floating-point error, the other core can be used to correct the problem by adding both errors together to derive a slightly larger error.
  • Ketchup (Score:2, Interesting)

    Intel is just playing catch-up now to AMD. With AMD's 64-bit architecture being chosen by the market over Intel's shoddy architecture, Intel is ahead only in name-recognition. As the article says, AMD has been working on their dual-core offering for a year longer than Intel. AMD is a year ahead in development. Their offering is likely to be much more robust than Intel's with that extra year.

    But, who knows? Intel seems to be shipping first. And we all know, Real Artists Ship.
    • by Blitzenn (554788)
      I think Intel's decision to leave out extensions developed by AMD is going to kill the processor fairly quickly. Granted, they bought the rights to them from AMD, but there must be some royalty-type deal here, because Intel is only including a handful of them. That will make their processor increasingly incompatible with the already accepted AMD architecture. Why is Intel so grudging to admit they are behind? They are going to kill themselves with that attitude. A couple more processor iterations and fai
      • Actually there is no royalty deal; AMD and Intel have cross-licensing agreements stemming from a lawsuit several years back.
        • Actually there is no royalty deal; AMD and Intel have cross-licensing agreements stemming from a lawsuit several years back.

          Are you so sure that that cross licensing arrangement covers technologies that are subsequently developed? Forever and ever? Or does it only cover existing technologies at the time of the agreement?
        • Here is the exact filed text of the ten-year licensing agreement you are referring to: Intel AMD cross licensing agreement [findlaw.com] Nowhere is there any legal language in it that covers future developments. That would be a really stupid business move on anyone's part.

          The real story here is what caused Intel to agree to a license agreement to begin with. They actually were caught with their pants down on this one. They had reverse engineered everything and attempted to move forward with their reverse enginee
  • Dear Intel, (Score:4, Funny)

    by Triumph The Insult C (586706) on Monday April 04, 2005 @11:21AM (#12134605) Homepage Journal
    I think it's great that you are developing new products.

    However, because of your poor form of not making documentation or firmware freely available, I will instead be sending my personal dollars, and (significantly larger) work budget, to AMD.
  • Extreme edition (Score:4, Insightful)

    by thundercatslair (809424) on Monday April 04, 2005 @11:22AM (#12134613)
    Why do intel marketers think that if they name it "extreme edition" it will sell more?
  • by Rude Turnip (49495) <valuation@gm[ ].com ['ail' in gap]> on Monday April 04, 2005 @11:24AM (#12134629)
    If I wanted to build a Windows system for gaming, would I have to buy Windows XP Pro for multiprocessor support... or is this dual core configuration invisible to the OS, meaning I could get away with XP Home for $100 less?
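Whether the OS even sees the second core is easy to check programmatically. A minimal sketch (Python here purely for illustration; the 2005-era answer on XP would involve Task Manager instead):

```python
import os

# Ask the OS how many logical processors it exposes. A dual-core chip
# in a single socket should appear as (at least) two.
logical_cpus = os.cpu_count()
print(f"OS sees {logical_cpus} logical processor(s)")
```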
  • People are actually asking how much it's going to cost?

    The Answer is simple

    An arm, a leg and your left testicle* - it's Intel afterall

    --------
    *or ovary if you're a woman
  • by LiENUS (207736) <slashdot.vetmanage@com> on Monday April 04, 2005 @11:28AM (#12134663) Homepage
    It looks like gamers won't be all that interested in this offering. Even once games support multi-threading, this won't end up boosting their max framerate much. Instead this will raise the lower framerate and give them smoother gameplay. While this is a great improvement, unfortunately most gamers seem only interested in their max fps and not the minimum. However for workstations this will be great: lower cost than dual processors means graphics design companies and advertising agencies can get their jobs done quicker and more efficiently.
    • by Anonymous Coward
      You're right on the money there. If there's one very disturbing trend in the retail computer market it is the reliance on gaming for sales. I browse computer shops all the time, and the other day my wife was with me just to kill some time before catching the subway.
      We walked into a computer shop and a sales guy jumped on us in no time. We let him show us around and do his schtick for a bit, and then I asked him why they didn't have any machines slower than 1GHz for my wife who just browses the web but mai
      • eBay is a good place to buy EPIA boards.

        The reason they don't make 1 GHz CPUs is because they would never sell enough of them for proportionally-lower pricing to make sense. Chip manufacturing is full of sweet spots. This is why Mini-ITX boards with the slower Centaur processors are actually significantly more expensive than commodity Intel/AMD boards. They amount to a low-volume niche product with no economies of scale to speak of, so you won't save any money just because you're buying a slower CPU. Y
  • Uh, right.... (Score:4, Insightful)

    by imroy (755) <imroykun@gmail.com> on Monday April 04, 2005 @11:31AM (#12134695) Homepage Journal
    ...contains a beefed-up power management system to keep the CPUs running cool during use

    So in other words... unless you have extreme cooling, this thing will never run at full speed for long. Because when it does, it will quickly heat up, and this power management will throttle the clock speed and core voltage. Apps may start up a little faster, but long-term consumers of CPU cycles (e.g. media encoding, some games, etc.) won't see much improvement. But I'm sure lots of clueless consumers will go for this new eXtreme CPU. Can't wait to see what bullshit analogy Intel will come up with for the TV ads...
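The throttling behavior described above can be sketched as a toy feedback loop. Every constant here is invented; real thermal management adjusts voltages and multipliers in hardware, not in a Python loop:

```python
# Toy thermal-throttling loop. All numbers are made up; the point is
# only the qualitative behavior: sustained load heats the die until
# power management backs the clock off.
clock_ghz = 3.2
temp_c = 40.0
TEMP_LIMIT = 85.0     # made-up throttle threshold
MIN_CLOCK = 2.8       # made-up throttled clock floor

for second in range(60):            # one minute of full load
    temp_c += clock_ghz * 1.5       # heating scales with clock (made up)
    temp_c -= 3.0                   # fixed cooling capacity (made up)
    if temp_c > TEMP_LIMIT:         # power management kicks in
        clock_ghz = max(MIN_CLOCK, round(clock_ghz - 0.1, 1))

print(f"after 60 s of load: {clock_ghz:.1f} GHz, {temp_c:.0f} C")
```

Under these made-up constants the chip settles at its throttled floor well before the minute is up, which is the poster's complaint: burst work runs at full clock, sustained work doesn't.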

  • Buuuuut (Score:2, Interesting)

    by skomes (868255)
    Why do we have dual cores? Everybody's admitted they are going to be prohibitively expensive, so is it just for show? Let's see some AFFORDABLE dual cores before we start heralding them as the future of processors.
  • Dual core ? (Score:4, Funny)

    by Pop69 (700500) <billy AT benarty DOT co DOT uk> on Monday April 04, 2005 @11:42AM (#12134820) Homepage
    Does that mean I'll be able to fry two eggs at once ?
  • by IdJit (78604) on Monday April 04, 2005 @11:50AM (#12134896)
    Does it have a hemi?
  • by alta (1263) on Monday April 04, 2005 @11:53AM (#12134924) Homepage Journal
    Now the spyware on all my users' machines will have a processor all to itself. That means the users will have the second processor to run Word, Excel, et al...

    That means they'll leave me alone and quit bitching about slow machines for a while! Woohoo! Oh, and will help that winword.exe that keeps crashing and staying backgrounded. Woot!

    (Yes, I know the spyware will take over both proc's. Let me dream)
  • Long term solution? (Score:3, Interesting)

    by Jugalator (259273) on Monday April 04, 2005 @12:07PM (#12135041) Journal
    Excuse me if this sounds unusually stupid at Slashdot, but they will, in other words, release 3.2 GHz dual core models initially? Won't they then have developed a new technology just to hit the problematic clock frequencies spoken of at ~4 GHz almost immediately? I was always thinking of something like two 1.6 GHz cores, possibly with some tricks to achieve similar speeds to a current 3.2 GHz P4... Am I missing something here, or is this just an unusually short term solution?
    • Short term flop (Score:2, Interesting)

      by Blitzenn (554788)
      It's more like a short term flop in my eyes. With this dual core barely beating a slightly faster-clocked single core processor in only a small handful of tests, coupled with its extremely high cost, it's dead before it even hits the streets. People are not going to spend 2 or 3 times the amount of cash for that kind of performance. It's just not going to happen.

      I agree that the expectation is double the core, double the power. This test processor is dismal in that regard. I guess we will all have to
  • Apple n Oranges (Score:2, Interesting)

    by zioncity (862007)
    I wonder how it will compare to a dual core G5 chip from Apple.... whenever they get it out, which with all this dual core news from Intel, I would think it would be soon.

    WWDC perhaps?
  • When it comes down to actually purchasing anything "dual", the price always leads me to buy two separate systems with money left over to bank.

    In the future no doubt more applications will include multi-threading, but I'm not holding my breath.
    What do you think the stability will be like with yet more bloated code?

    When setting up servers, you will most certainly find that "dual" is something to stay away from. Running multiple machines is far more economical, easier to replace components, and moving your
  • by hirschma (187820) on Monday April 04, 2005 @12:37PM (#12135397)
    The review is useless without comparing their test box to an Opteron dually. Since the details of how AMD is going to implement dual core are well known, they could take an existing AMD dually and hobble it with a slower HyperTransport setting, which would give a pretty accurate simulation.

    This lack of comparison indirectly tells me that AMD's dual core solution is going to wipe the floor with Intel's, even more so than the current AMD performance advantage over Intel on single core procs.

    I wonder how big a gun Intel put to their head. I also wonder how much AMD is pissed off at being "scooped", when they've been working at this for a much longer time.

    jh
  • by MortisUmbra (569191) on Monday April 04, 2005 @12:43PM (#12135460)
    Is the way they benchmark it.

    Listen, for office productivity and "how fast can I open spreadsheets", nobody SHOULD need more than one CPU.

    The rendering tests were a little disappointing (I seem to recall a bigger gap in the AMD benchmarks), but really the point of dual CPU's is, as anyone who has used one knows, responsiveness.

    Yeah, rendering times dropping to 60% of normal is nice, but let me tell you: where a normal single CPU system would sit there gurgling and choking on its own vomit because some dirty little application decides it MUST use up all the CPU time, dual CPU systems just go "eh, whatever, he's being a jerk, I can help you over here."

    It is SO nice to use a dual CPU system in daily routine usage (which for me is QUITE varied) just for the increase in responsiveness alone.
  • Multi-processor systems, which includes multi-core and, AFAIK, hyper-threading, need an OS which will reasonably distribute tasks across computing resources. Under DOS...er, Windows, this seems to be the app's responsibility, since the article referred to "threaded applications, such as 3D Studio Max, Photoshop, and Premiere." I know that Perl (ActiveState) doesn't have thread support under Windows. I haven't tried it under Linux.
  • Yawn... (Score:3, Interesting)

    by gillbates (106458) on Monday April 04, 2005 @12:56PM (#12135616) Homepage Journal

    even the Extreme Edition dual core CPU only has an 800MHz effective FSB, not 1066MHz

    It doesn't make much sense to put two processors on the same bus, and then lower the bus speed. And, as the benchmarks showed, single-threaded applications ran slower on the dual-core processor than on the regular P4

    I understand "dual core" has a certain market appeal - much like faster clock speeds. Never mind the fact that bus bandwidth and hard drive speed have a greater overall effect on system performance.

    Those who want dual cores would be better off buying a computer that was designed to support multiple cpu's - for example, a UNIX workstation. It doesn't matter how many cores you put on a chip if your memory bus can't feed them:

    1. A P4 can theoretically execute 2 instructions every clock cycle.
    2. Make that 4 instructions/clock for a dual core.
    3. Each instruction averages 4 bytes of data access. Since we'll consider the instructions to be cached, we'll ignore the memory access for them, for now. So we're up to 16 bytes of throughput per clock cycle.
    4. At 3200 MHz, times 16 bytes/clock, we're up to 51,200 MB/s theoretical throughput.
    5. Yet, the 800 MHz FSB (which transfers 8 bytes/cycle) can only do 6400 MB/s throughput.
    Granted, 6.4 GB/s is very fast - but even a single core P4 can saturate the memory bus. What point is there in adding another core (aside from marketing hoopla) when the bus can't run fast enough to support it?

    It seems to me that Intel added the power management features to the chip because they knew that the second core was going to be idle most of the time.
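The numbered estimate above, redone in a few lines. The issue rate and bytes-per-instruction figures are the comment's own assumptions, not measured values:

```python
# Reproduce the back-of-envelope throughput estimate from the comment.
CLOCK_MHZ = 3200          # 3.2 GHz core clock
INSTR_PER_CLOCK = 2       # theoretical peak per core (comment's figure)
CORES = 2
BYTES_PER_INSTR = 4       # assumed data traffic; instruction fetch ignored

demand_mb_s = CLOCK_MHZ * INSTR_PER_CLOCK * CORES * BYTES_PER_INSTR
print("CPU data demand:", demand_mb_s, "MB/s")   # 51200

FSB_MHZ = 800             # effective (quad-pumped) FSB clock
FSB_BYTES_PER_CYCLE = 8   # 64-bit front-side bus
supply_mb_s = FSB_MHZ * FSB_BYTES_PER_CYCLE
print("FSB supply:", supply_mb_s, "MB/s")        # 6400

print(f"bus oversubscribed {demand_mb_s // supply_mb_s}x")  # 8x
```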

    • Re:Yawn... (Score:4, Insightful)

      by volsung (378) <stan@mtrr.org> on Monday April 04, 2005 @02:10PM (#12136398)
      BTW, this is why the Opterons have on-chip memory controllers. Then your aggregate memory bandwidth scales with the number of CPUs (assuming your OS is suitably NUMA-aware) and you can sidestep this problem. (More or less. A memory-hog process could start stealing bandwidth from the other CPUs if its working set doesn't all fit in one CPU's memory bank.)
  • Are still waiting for the 12-sided die...Oh, wait...
