Basics of Modern Intel CPUs

Doggie Fizzle writes "For those who think you can drop a Xeon into your Celeron system for an upgrade... 'Although there are currently only two main players in the CPU market, AMD and Intel, the number of choices is still enough to make the typical consumer's head spin. Each manufacturer has a few different models to promote, and many of these models can be found in a few different form factors (namely, the "sockets" to which they connect) that exclude interchangeability. This two-part series of Tech Tips will look at a few details of each of the currently-supported CPU (Central Processing Unit) sockets and how they are all similar and different from one to another' "
This discussion has been archived. No new comments can be posted.

  • Finally. (Score:5, Funny)

    by Xzzy ( 111297 ) <sether@@@tru7h...org> on Thursday June 02, 2005 @03:08PM (#12707078) Homepage
    Thank god someone finally explained what the acronym "CPU" meant, I've been wondering about that for years, quietly bobbing my head like I know what's going on anytime someone mentions it.

    And I owe it all to Slashdot.
    • Me too. Now that that's taken care of, I can continue researching FPU and GPU. I think that there might be a connection between them, but so far it's still up in the air - like my work in figuring out APU...
      • Re:Finally. (Score:4, Funny)

        by peragrin ( 659227 ) on Thursday June 02, 2005 @03:15PM (#12707144)
        Just wait till you get to FPGAs.

        Just before you do, tape it; I want to watch your head spin up and explode. :-)
    • Despite the dripping sarcasm it actually is a decent article and starts off with a clear explanation of socket designations.

      Granted, I may be the only person on Slashdot here primarily for the science articles, and therefore the only person who didn't know that stuff, but at least I've heard of AMD and that linus thingy...
  • Confusion (Score:3, Interesting)

    by ackthpt ( 218170 ) * on Thursday June 02, 2005 @03:10PM (#12707091) Homepage Journal
    Also, AMD is casting doubt [nforcershq.com] on Intel's claim of dual core. It explains how Intel beat them to market by just taking a cheap shortcut.

    As if sockets aren't enough, there are now two video card standards, AGP and SLI (card: PCI-E), which caught me by surprise. I had to change my order before shipping as I didn't realise I could not use an AGP card with the new SLI/PCI-E configuration. Is it better? I don't need to spend $$$; my existing video card works fine, I just wanted to upgrade the mobo and CPU.

    • There were already PCI and AGP slots before, so why did this suddenly catch you by surprise?

      Besides, SLI is nothing new: the Voodoo cards did it way back when on bog-standard PCI slots. It's only of use to hardcore gamers, and it's been proven time and time again that they'll pay anything to be the top geek with the most frames per second for a few months.
    • Just shows how confusing it is... the two standards for video card slots/video buses are AGP and PCI-E. SLI is Nvidia's proprietary method of making two SLI-capable video cards on the PCI-E bus act as one video device. ATI has now introduced their own version, which they are calling Crossfire. I think that these proprietary techs are limited to PCI-E cards, but I don't know if that's just because they are only made for those cards, or if there's something inherent in the tech to cause it. Still, it's not of
      • That's just it... how many Mobos have 2 AGP slots? I personally haven't seen one, but maybe there's one out there.
        • I thought that only one AGP slot per board was part of the spec.
          (to avoid overloading the video bus, as happened when VLB was the fast video bus)

          But, maybe I'm remembering speculation and not specification.
      • A motherboard cannot have two AGP slots unless it has a horribly modified bridging chipset. Reason? AGP is an accelerated graphics *port*, not a *bus*. Although, in fairness, PCI-E is port-based, it was designed from the ground up to have multi-device capability, whereas AGP was designed to be a SINGLE video adapter interface to the system.
      • Re:Confusion (Score:2, Insightful)

        by Eugene ( 6671 )
        I've never seen a MB chipset that can handle more than 1 AGP bus, but PCI Express is a different beast. The current implementation is that you can have either a 1x PCI-E 16-lane or a 2x PCI-E 8-lane configuration; the overall *available* bandwidth is the same, since current-gen video cards do not utilize that much bandwidth anyway. Future chipset/GPU designs might give you more PCI-E lanes to play with. What I'm really concerned about now is the lack of a standard, forcing people to stick with either ATI chipset/GPU fo
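
        For a rough feel of the numbers, here's a quick back-of-the-envelope sketch in Python of why one 16-lane slot and two 8-lane slots offer the same aggregate bandwidth. The ~250 MB/s per-lane, per-direction figure is the commonly quoted number for first-generation PCI Express and is an assumption for illustration, not something from the article.

        # Rough PCI Express bandwidth sketch. Assumes ~250 MB/s of usable
        # bandwidth per lane, per direction (first-generation PCI-E).
        PER_LANE_MB_S = 250  # assumed per-lane, per-direction bandwidth

        def slot_bandwidth(lanes: int) -> int:
            """Approximate one-direction bandwidth of a single slot, in MB/s."""
            return lanes * PER_LANE_MB_S

        single_x16 = slot_bandwidth(16)       # one x16 slot
        dual_x8 = 2 * slot_bandwidth(8)       # two x8 slots (SLI-style split)

        print(f"1 x16 slot : {single_x16} MB/s")        # 4000 MB/s
        print(f"2 x8 slots : {dual_x8} MB/s in total")  # 4000 MB/s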
    • Re:Confusion (Score:3, Interesting)

      by stienman ( 51024 )
      It explains how Intel beat them to market by just taking a cheap shortcut.

      The article explains:
      Although Intel was the first to launch its dual-core processor solution for desktop PCs, Richard commented that a real dual-core processor should be one that integrates two cores onto the same die.

      Ah yes, that old trick.
      1) Notice other company has beat you to market
      2) Panic!
      3) Define technology to exclude competitor's product
      4) Indicate that you actually beat them to market with a "technically correct (accordin
      • Err...well before Intel came out with their dual core, I was under the impression that dual core processors did mean on the same die. Maybe half the people paying attention thought it meant something it didn't, or maybe AMD and I were just suffering under the exact same interpretation of the phrase.
        • Dual-core simply means you have two tightly-coupled cores. Generally this implies they are in the same package at least, and there is low-bandwidth interconnect, some means for IPC, probably a shared chipset, and some amount of shared memory hierarchy, perhaps a shared L2 or L3 cache.

          The reason most people think it is synonymous with dual-cores-on-the-same-die is simply because it has historically not been simple or cheap enough to integrate multiple die in the same package. Intel has the capability to d
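
          As an aside, on a Linux box you can see the package-versus-core distinction directly in /proc/cpuinfo. Here's a minimal sketch, assuming a Linux system whose /proc/cpuinfo exposes the "physical id" and "core id" fields (not all kernels or CPUs do):

          # Minimal sketch: count physical packages and distinct cores per
          # package from /proc/cpuinfo (Linux only; fields may be missing).
          from collections import defaultdict

          packages = defaultdict(set)  # physical id -> set of core ids

          with open("/proc/cpuinfo") as f:
              phys_id = None
              for line in f:
                  if line.startswith("physical id"):
                      phys_id = line.split(":")[1].strip()
                  elif line.startswith("core id") and phys_id is not None:
                      packages[phys_id].add(line.split(":")[1].strip())

          if not packages:
              print("no package/core topology fields found")
          else:
              for pkg, cores in sorted(packages.items()):
                  print(f"package {pkg}: {len(cores)} core(s)")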
      • Re:Confusion (Score:3, Insightful)

        by drgonzo59 ( 747139 )
        Well I care if they are on the same die or not. Being on the same die means faster communication between them; it also means less heat, which means a smaller heatsink, less noise and less electricity consumed. Whether it should be called "dual core" or "2x" or whatever other marketing lingo is irrelevant to me.
        • Well I care if they are on the same die or not. Being on the same die means faster communication between them
          False.
          it also means less heat
          False.
          which means a smaller heatsink, less noise and less electricity consumed.
          False.

          Not only are these assumptions not inherently coupled with the number of die integrated in the package, but... separating the cores out to multiple die actually can reduce latencies and enable better connectivity due to stacking, and furthermore has a drastic impact on yield, test time, an
          • How do you explain the consistent dual-core advantage of AMD in benchmarks? If connectivity is better and latencies are better, the results should be better too. Stacking might not be as advantageous: the two cores cannot be put too close to each other because of the heat dissipation issue, and if you put them far apart you might as well put them on the same die, which will allow for better cooling too. From a theoretical point of view having two stacked cores might mean shorter latency, but from an engineering and a cost
            • Separate die do not preclude the benefits of a single plane. They could be placed side by side if heat dissipation is critical and an adequate heat dissipation vehicle/substrate cannot be packaged adjacent to the stacked die. But... I am definitely not an expert on packaging technology.

              But your question is definitely a valid one. There are countless architectural reasons why AMD would perform better. They could have better queueing mechanisms, faster IPC (inter-process communication), faster interconnect,
              • Yeah, clearly AMD is playing a marketing game here. Two dies in one package sure looks and sounds like a dual-core to me.

                Some people are die-hard fans of one or the other (AMD or Intel); I am a fan of what works better and is cheaper at the same time. It was Intel for me a while ago, now it's AMD, but it might change, who knows...

      • The real question is:

        Is the dual core Intel product multi die or single die?

        I'd read the article, but the server hosting the horrible article is down.
      • Re:Confusion (Score:3, Insightful)

        by Vellmont ( 569020 )

        Some may think of it as a "cheap trick", but the reality is that
        1) The result is the same

        Absolutely false. CPUs aren't like cables holding up a bridge, where you can twist two around each other and get double the strength. You can't just take two CPU cores on separate dies, put them in the same package, and expect the same performance as a single-die CPU. First of all, there needs to be communication between the two cores, and that slows down performance by quite a bit. If you look at the performance dif
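
        If you want a crude feel for what cross-core communication costs, here is a hypothetical ping-pong microbenchmark using Python's multiprocessing Pipe. It measures process-to-process round trips through the OS rather than raw cache-coherence latency between cores, so treat the numbers as illustrative only:

        # Crude ping-pong sketch: time round trips between two processes,
        # which on a multi-core machine usually land on different cores.
        # This measures OS/IPC overhead, not raw core-to-core latency.
        import time
        from multiprocessing import Process, Pipe

        ROUND_TRIPS = 10_000

        def echo(conn):
            """Child process: bounce every message straight back."""
            for _ in range(ROUND_TRIPS):
                conn.send(conn.recv())
            conn.close()

        if __name__ == "__main__":
            parent_end, child_end = Pipe()
            child = Process(target=echo, args=(child_end,))
            child.start()

            start = time.perf_counter()
            for _ in range(ROUND_TRIPS):
                parent_end.send(b"ping")
                parent_end.recv()
            elapsed = time.perf_counter() - start
            child.join()

            print(f"{ROUND_TRIPS} round trips in {elapsed:.3f} s "
                  f"({elapsed / ROUND_TRIPS * 1e6:.1f} us each)")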
      • Ah yes, sounds like the "good ol' days" of the Pentium II, back when the CPU and L2 were "in the same package" but not on the same die. What a wonderful solution that was; it worked so ... wait, it barely limped along. The L2 ran at 1/2 the speed of the processor, and the Celeron 300A beat the pants off it.

        Intel likes to take shortcuts, hoping nobody notices, or that nobody cares. Unfortunately, nobody does, and the ones who can actually benefit from the proper technology are the ones who suffer.

        It
    • That caught you by surprise? You mean you actually figured out you had something called a "video card", that it was actually inside the box you have sitting on your desk, AND you have successfully opened up your case, AND you identified which card was the video card somehow, AND you are considering putting a new one in its place, AND you have gone through the trouble to determine what is lacking about your card and which card is the best upgrade for it? All that trouble, AND you didn't think to read the
  • Only two ? (Score:4, Insightful)

    by alexhs ( 877055 ) on Thursday June 02, 2005 @03:11PM (#12707109) Homepage Journal
    Although there are currently only two main players in the CPU market, AMD and Intel [...]

    Huh?
    What about IBM and all those embedded CPUs?
    Did you mean the PC desktop CPU market?
    • IBM (well, their subsidiary, VIA) is still in the PC desktop CPU market. Sites like that ignore them, though, because H4rdc0r3 G4m3rz don't want a 20-watt motherboard in a 17 cm x 17 cm form factor if it's only fast enough to run all their desktop (non-game) applications, watch HD videos, be a SOHO router/firewall/fileserver, be an educational Beowulf cluster node, etc. They want power because the salesperson told them they need power.
    • Re:Only two ? (Score:4, Interesting)

      by spaceyhackerlady ( 462530 ) on Thursday June 02, 2005 @04:25PM (#12707803)

      On a day-to-day basis I run into four kinds of CPUs: x86 (typing this on one), UltraSPARC (most of the boxes at work, plus an Ultra 5 I bought on eBay to play with), ARM (my Palm - one of the new ones), Power PC (stuff at work) and several 68k derivatives (various boxes at work from little to seriously studly).

      This doesn't include the niche processors, Analog Devices and TI DSPs, various PICs, and so on.

      ...laura who actually owns a DragonBall development board

      • "On a day-to-day basis I run in to four kinds of CPUs:
        1. x86 (typing this on one),
        2. UltraSPARC (most of the boxes at work, plus an Ultra 5 I bought on EBay to play with),
        3. ARM (my Palm - one of the new ones),
        4. Power PC (stuff at work)
        5. and several 68k derivatives (various boxes at work from little to seriously studly)
        ."


        NOBODY EXPECTS THE SPANISH INQUISITION!!!!!

      • I bet you also run into a Z80 CPU somewhere too. Those things seem to be everywhere.
      • Apart from x86 PCs, I use my iPod (ARM), PocketPC (ARM), and N-Gage (ARM). ARM: the biggest CPU supplier most people have never heard of.

        At the office I have three old computers for decoration (two actually work): a Sinclair ZX Spectrum, an Apple IIc, and an Atari 400. Remembering what CPUs they use is left as an exercise for someone who's not about to go to lunch...

    • Your comment might have been helpful if you were actually pointing out a misunderstanding that people were having, or the article being just plain wrong about something. Everyone reading the article knows that we're talking about PCs.
      • Everyone reading the article knows that we're talking about PCs.

        No. The article summary mentions Xeon, and when I think Xeon I think servers. In the server market there are more than two CPU manufacturers: Intel, AMD, IBM, Sun, Fujitsu, NEC, HP.
      • Everyone reading the article knows that we're talking about PCs.

        Then there are Mac bigots, like me, who like to point out that Macs are Personal Computers too!

        But maybe those are just the old ones who remember when the term "PC" actually encompassed a wide range of choices and "IBM compatible PC" was just a marketing phrase.

        But, like you said, we all know what we're talking about when the term PC is used.
  • Surely... (Score:2, Funny)

    by simonew ( 887293 )
    ...the kind of people who visit this site know the difference between their 478s and their 939s?
  • by Animats ( 122034 ) on Thursday June 02, 2005 @03:19PM (#12707174) Homepage
    This isn't about architecture. It's just a one-page note about CPU chip sockets. Big deal.

    Who picked the article title?

  • To avoid the Slashdot pounding
  • Authorship (Score:3, Informative)

    by Stankatz ( 846709 ) on Thursday June 02, 2005 @03:25PM (#12707243)
    "Doggie Fizzle writes[...]" No, Jason Kohrs wrote it. "Doggie Fizzle" copied and pasted it. I think the /. editors need to change their format a bit so as not to mislead readers about who writes these "summaries".

    (And thanks in advance for moderating me "Troll" or "Offtopic" for pointing this out.)
  • by SeaFox ( 739806 ) on Thursday June 02, 2005 @03:41PM (#12707373)
    "Although there are currently only two main players in the CPU market, AMD and Intel, the number of choices is still enough to make the typical consumer's head spin."

    Maybe this is why the near monopoly in operating systems is a good thing. Giving customers an actual choice seems to be enough to make their heads spin.

  • A story where my .sig has some relevance.
  • Interesting analogy (Score:4, Informative)

    by Zapraki ( 737378 ) on Thursday June 02, 2005 @03:51PM (#12707458)
    Ok, this is somewhat OT, but I think it's the best "layman" description of processor improvement that I've ever read. This is from Clock Speed: Tell Me When it Hertz by H. Gilbert, Dec. 22, 2004, available at http://pclt.cis.yale.edu/pclt/PCHW/clockidea.htm/ [yale.edu]; a rough numerical sketch of these five ways follows the excerpt.
    There are five ways to increase the processing power of a CPU or the teaching power of a High School.

    Raise the clock speed - In the analogy, this corresponds to reducing the time available for each class period. If the teacher can talk faster, and if the students behave and listen more closely, this can work up to a point. Each student gets done with the school day earlier.

    Build a Pipeline - A more complicated solution shortens the class period, but then breaks each subject into a sequence of steps. If it takes 45 minutes to cover Algebra, and that time cannot be reduced, then the subject could be covered in three consecutive 15-minute periods. A simpler subject might be covered in just one period. After all, there is no reason other than the convenience of scheduling why every class for every subject lasts the same length of time. Students get done quicker, but only if some of the subjects are lightweight.

    Parallelism - Add more classrooms and more students. No one student learns anything faster, but at the end of the day the school has taught more people in the same amount of time. Of course, this only works if you have more students in the school district to teach.

    Class Size - Double the number of students in each classroom. High schools don't like to do this. Computers, however, can easily switch from 32- to 64-bit operations. This will not affect most programs, but the particular applications that need processing power (games, multimedia) can be distributed in a 64-bit form to get more work done per operation.

    Build a Second School - Sometime in '05 or '06 both Intel and AMD will begin to ship "multi-core" processor chips. This creates a system with two separate CPUs. An individual program won't run any faster, and if these chips have a slower clock it may even run more slowly. However, two programs will be able to run at once, and programs that require the most performance (games, multimedia) can be written to use both CPUs at once.
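
    Here is the rough numerical sketch mentioned above: a toy Python model of those levers, treating throughput as clock x work-per-cycle x cores. The baseline numbers are invented purely for illustration and are not from the quoted article.

    # Toy throughput model: clock speed, work per cycle, and core count.
    # All numbers below are invented for illustration only.
    def throughput(clock_ghz: float, work_per_cycle: float, cores: int = 1) -> float:
        """Relative throughput, in billions of operations per second."""
        return clock_ghz * work_per_cycle * cores

    baseline     = throughput(2.4, 1.0)           # the starting chip
    faster_clock = throughput(3.0, 1.0)           # raise the clock (or pipeline deeper)
    wider        = throughput(2.4, 1.5)           # more work per cycle (parallel units, 64-bit ops)
    dual_core    = throughput(2.2, 1.0, cores=2)  # second "school": slower clock, two cores,
                                                  # but only helps software that uses both

    for name, val in [("baseline", baseline), ("faster clock", faster_clock),
                      ("wider core", wider), ("dual core", dual_core)]:
        print(f"{name:12s}: {val:.1f} G ops/s")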

    • This is a terrible analogy, since it likens increasing clock speed to reducing the length of lessons, which essentially makes each clock cycle less useful.

      While clock speed isn't directly proportional to performance, in general you expect *some* improvement in performance from an increase in clock speed, all things being equal. Given that class periods include wasted time at the start and end, you'd expect a *drop* in performance from the "analogous" reduced time per class period.
    • This analogy becomes even more appropriate if you include the explanation of "increasing the bus speed".
  • by be-fan ( 61476 )
    An article that puts the word "socket"* in quotes is definitely an article that does not belong on Slashdot :)

    *) No, it is not wrong for me to use quotation marks around "socket" when I bitched about the author using quotation marks around "socket". The author was quoting to define the word "socket". I was quoting because I was referring to "socket" the word, not socket the thing.
  • Oh! So the socket T MCU things are just big Basic Stamps? With all those I/O pins I bet I could control two Battlebots at once!

  • While I can appreciate that this article is aimed at the person who doesn't know anything, it misses a ton of CPU socket types, many of which are either very common right now or becoming very common.

    The article is way too terse and doesn't really describe much. This same article could have been (and probably should have been) 6 or 7 pages long. The fact that it only talks about Intel processors is silly as well.
  • Did anyone else see the colors on the page?! Good thing Firefox lets us turn off CSS (View > Page Style > No Style), because it hurt my eyes to even look at the page. Especially their link text color.

    Too bad I didn't think to do it until I had already read the article...

  • In the article they state: "The Intel naming system used for the Pentium 4 processors in this class uses letters to represent the frontside bus speeds present. An "A" means 400 MHz, "B" means 533 MHz, and "C" means 800 MHz. So, a Pentium 4 2.4C would offer greater performance than a 2.4B or a 2.4A, despite them all having the same 2.4 GHz clock speed."

    But this is wrong.

    Original Pentium 4 CPUs used the Willamette core and ranged in speed from 1.3 GHz to 2 GHz in 100 MHz increments. They were built on a 0.18 m
  • Ummm did they forget about ARM, MIPS, PPC, SPARC?

"The one charm of marriage is that it makes a life of deception a neccessity." - Oscar Wilde

Working...