Leaked Pictures of Socket F

Robbedoeske writes "Dutch-language site Tweakers.net has the first pictures of AMD's Socket F, aka Socket 1207. This socket introduces support for DDR2 memory, and some say it will offer an integrated PCI Express controller on the CPU. Socket F is meant to be used in systems with more than one Opteron CPU."
This discussion has been archived. No new comments can be posted.

  • Translation (Score:5, Funny)

    by Anonymous Coward on Tuesday November 08, 2005 @09:11AM (#13978209)
    For those who can't read Dutch: Socket F looks like any normal chip.
  • It looks similar to Intel's new design with the pins on the socket; hopefully it isn't as easy to damage.
  • by Killjoy_NL ( 719667 ) <slashdot@@@remco...palli...nl> on Tuesday November 08, 2005 @09:12AM (#13978220)
    I'm Dutch, so I could read the forum post that started it all.

    He actually said he counted all the pins, just to be sure to give enough information.

    Funny stuff (being Dutch rocks)
    • by FST777 ( 913657 )
      All right, for the fun of it:

      At our forum, Gathering of Tweakers, the first pictures of AMD's Socket F have emerged. In May we wrote that AMD had put a new CPU socket on its roadmap. The new socket would have 1207 connection points and would be meant for multi-Opteron servers. To prevent a CPU with support for DDR memory from being placed in a DDR2 socket and vice versa, a new socket was needed. The extra pins that become available with this step are rumored to be used for an integrated PCI Express control

  • EP? (Score:5, Funny)

    by Anonymous Coward on Tuesday November 08, 2005 @09:13AM (#13978230)
    Eerste post??? (Dutch for "first post")
  • PGA (Score:4, Interesting)

    by theantipop ( 803016 ) on Tuesday November 08, 2005 @09:14AM (#13978238)
    If true, it is interesting to see AMD moving to a pin grid array-style CPU connection. Intel has used this for a little while now with their Socket 775 Pentium 4 chips. While there was concern over broken pins resulting in unusable motherboards, it now seems to be a relatively robust mechanism. I wonder what advantages AMD saw that led them to this design. I also wonder if their Socket M2, the 940-pin solution for next year's Athlons, will use the same socket design.
    • Re:PGA (Score:4, Informative)

      by sarahemm ( 707486 ) <{sarahemm} {at} {sarahemm.net}> on Tuesday November 08, 2005 @09:31AM (#13978349) Homepage
      I think you mean Land Grid Array (LGA [wikipedia.org]). Pin Grid Array (PGA [wikipedia.org]) is what they've been using since the 486 (386?) days...
  • by Anonymous Coward on Tuesday November 08, 2005 @09:15AM (#13978241)
    The more I learn about Apple and Intel the more worried I get.

    IBM is cranking out killer PPC chips.
    AMD is cranking out killer x86 chips.

    And Intel looks like they are ready to compete in some sort of Special Olympics for Computer Chips.

    How the hell can AMD be making such better chips while companies like Dell still sell Intel-powered crap?

    • by pivo ( 11957 ) on Tuesday November 08, 2005 @09:26AM (#13978319)
      How the hell can AMD be making such better chips while companies like Dell still sell Intel-powered crap?

      That's easy: Marketing
      • by Anonymous Coward
        Marketing money. Intel writes big checks to Dell as "co-operative marketing" funds. Also, Intel maintains its own sales force, including technical support for developers, on the corporate level. Guess how they decide which HW platform to recommend: whose laptops they show up with.

        Finally, Dell isn't much of an engineering company - they need to keep their offerings simple, both for their supply chain and support. Helps to keep it easy and cheap to acquire and sell.
    • While AMD and IBM make technically superior chips, they simply don't have the mass manufacturing capability to compete with Chipzilla; a side effect of the huge capacity is the ability to have the quantity of procs available to offer deep discounts to high-volume customers (e.g. Dell and Apple) and still make money.

      On a side note, the stuff due out from Intel by the time Apple switches the PowerMacs doesn't look too shabby at all - of course, we'll have to see what IBM/AMD are offering to compete.
    • by NixLuver ( 693391 ) <stwhite&kcheretic,com> on Tuesday November 08, 2005 @10:20AM (#13978741) Homepage Journal
      How the hell can AMD be making such better chips while companies like Dell still sell Intel-powered crap?

      Initially I think you have to consider exactly what Apple is trying to achieve. IBM won't play ball with Apple's laptop designs, and the PowerBooks (as much as I love 'em) are being left behind, pretty badly, by x86 stuff. Intel mobile chips, as near as I can tell, offer the very best performance per watt of any mobile solution at the high end (the G4 still kicks the crap out of 'em at comparable speeds, but since the fastest mobile G4 Mac you can get is 1.67 GHz, it's a moot point).

      And one thing the geek community loses sight of is that when we talk about AMD 'kicking the crap out of Intel', it's on a pass/fail basis; overall, they have traded the 'speed lead' several times since the initial offering of the Athlon, and rarely has one led the other, in dollars per MIP, by more than 3-5%. Since most websites that do comparative benchmarks trim the chaff so you can see the difference, the average page-scanning consumer or geek gets a warped impression. If we have a scale that's 1000 units long, and Intel's chip does 990, and the AMD chip does 995, and we only show the last 10 units, it looks like the AMD is twice as fast, when it's really only 0.5% faster. These days, a hardware site will pronounce a significant win over a 3% overall difference in performance!
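
      To make that concrete, here's a tiny back-of-the-envelope sketch (hypothetical Python, using the made-up 990/995 numbers above rather than real benchmark data) of how a truncated axis distorts the picture:

          def apparent_ratio(a, b, axis_min):
              # Ratio of the two bar lengths when the chart's axis starts at axis_min.
              return (a - axis_min) / (b - axis_min)

          intel, amd = 990, 995  # hypothetical scores from the example above
          for axis_min in (0, 985):
              print(f"axis from {axis_min}: bars suggest {apparent_ratio(amd, intel, axis_min):.2f}x")
          print(f"actual advantage: {amd / intel - 1:.1%}")

      With the full axis the bars suggest 1.01x; show only the last 10 units and they suggest 2.00x, for the same 0.5% real gap.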

      Application also matters. For instance, I do a lot of recording with pro hardware and software. The fact is that most of the software is optimized for Intel chips much more so than for AMD, so in side-by-sides I see about 20% better performance for the same hardware and software on my P4 over my Athlon. In some cases as much as 200-300%; I assume those are REALLY optimized for the P4. But if I run games on the two machines, the Athlon is 5-10% faster across the board (with the same video card).

    • Since when did technical superiority have anything to do with market dominance????
    • How is this +4 interesting? By any logical standard this is offtopic. Anyway, there are simple answers to all of your questions. 1. IBM is cranking out killer PPC chips. True, but they are not cranking them out fast enough, and they are not making good on their roadmap (PPCs were supposed to hit 3+ GHz in 2004). Also, it doesn't matter, as only Apple uses PPCs in mainstream computers, and Intel/AMD/Microsoft aren't just going to switch one day. 2. AMD is cranking out killer x86 chips. Also true, but
    • by Kjella ( 173770 ) on Tuesday November 08, 2005 @01:40PM (#13980660) Homepage
      How the hell can AMD be making such better chips while companies like Dell still sell Intel-powered crap?

      Quite easy when you realize that the majority of consumers don't actually use the full capacity of their CPU very often. If you look at games, the GPU is far more important than the CPU, which leaves heavy CPU use to media encoders, compilers and scientific processing. That's not really a big share of the market.

      Civ4 mins: 1.2 GHz or equivalent
      SW Battlefront II mins: 1.5 GHz or equivalent
      Call of Duty II mins: Pentium 4 1.4 GHz or AMD Athlon 1.4 GHz, or equivalent
      Age of Empires III: 1.4 GHz or higher processor, or equivalent
      F.E.A.R. mins: Pentium 4 1.7 GHz or equivalent
      Sims 2: 800 MHz processor or equivalent
      Quake 4: 2.0 GHz or equivalent

      Those are some of the latest games released. The P4 2.0 GHz was shipping in June 2002, so they are over three years behind the state of the art. And games are normally the most intense apps a user has. Basically, an Intel machine does pretty much everything a computer user wants to do, and so does an AMD. The rest is simply mindshare and momentum. Intel can drop their prices at any time if the market is slipping. They are simply balancing extra profit against the threat AMD poses. If they don't watch out, they'll take a spanking in the professional market though, where admins are much more aware of what they're buying...
  • next step? (Score:4, Interesting)

    by Anonymous Coward on Tuesday November 08, 2005 @09:16AM (#13978245)
    Personally I never imagined integrating a PCI Express controller into a CPU. If this trend of integration continues, what would be the next logical step?
    • Graphics controller on chip. Intel has shown such a chip at IDF this past fall.
      • No. Icky. Bad. You don't take the fastest changing part of a system and put it in a component that changes the slowest. Also, strapping a 110W GPU to a 60W CPU is not a smart thing.

                  -Charlie

        P.S. Bad, bad, bad. No cookie.
        • I have a feeling the integrated graphics will be, well, integrated graphics. I.e., a very, very low-end graphics card that works just fine as long as you don't care about 3D performance.
          • Re:Ewwww (Score:2, Insightful)

            by ichigo 2.0 ( 900288 )
            Isn't integrating graphics on-chip a waste of transistors then? Unless Intel has given up on gamers and is aiming its processors at low-end users and workstations...
    • Back to the '80s (Score:4, Insightful)

      by cronot ( 530669 ) on Tuesday November 08, 2005 @09:33AM (#13978364)
      The first thing that came to mind after reading the parent and its replies is that this is coming closer to what microcomputers used to be back in the '80s, with the MSX, ZX Spectrum, etc. Well, maybe the keyboard will remain detachable, as will any user-interactive peripheral, but everything else used to be much closer to the CPU back then.
    • by Barny ( 103770 )
      Integrated CPU/northbridge/southbridge on a 12,156-pin socket; some say it will support 4 channels of DDR4 and be able to heat a large home :)
    • by hal2814 ( 725639 ) on Tuesday November 08, 2005 @09:47AM (#13978445)
      The next step would be that my co-workers would actually be correct when they refer to the box that houses the motherboard, video card, memory, etc. as a "CPU."
      • Re:next step? (Score:3, Insightful)

        by cyxxon ( 773198 )
        Hm, but what will then happen to my coworkers who always refer to it as "the hard drive" (and in reality store all their stuff on a mapped network drive)?
    • Re:next step? (Score:4, Interesting)

      by pla ( 258480 ) on Tuesday November 08, 2005 @09:54AM (#13978499) Journal
      Personally I never imagined integrating a PCI Express controller into a CPU. If this trend of integration continues, what would be the next logical step?

      Single-chip computers - a CPU, and a totally passive backplane that does nothing but provide real estate for connectors. And most likely, you wouldn't strictly need any extra cards, with a decent (but not high-end, thus the need for a bus at all) GPU included right on-die.

      Realistically, I expect two-chip computers as far more likely. Something along the lines of having your CPU plug directly into your video card, which has the standard video card parts on one side, and standard motherboard connectors on the other. And the whole thing could mount via a SECC-style connector to a power bus, right inside something just a tad bigger than current ATX power supplies.

      Drives? Uhhh... I'll have to think about that one. ;-)
      • With computer hardware of this day and age, wouldn't that create really unmanageable heat density? Don't get me wrong, I love the idea, cause that would mean you could get something about the W x H dimensions of an LCD monitor, just not as big on the large face (if I'm envisioning what you meant correctly) - just thinking of how to dissipate the heat from a Radeon 9800 and an Opteron if they're literally on the same board.
        • cause that would mean you could get something about the W x H dimensions of an LCD monitor

          Actually, I had something more like a double-high and slightly deeper DVD drive in mind (including an actual DVD drive as part of it, of course). But then, I prefer cubes to pizza-boxes, personally... The actual shape wouldn't much matter. "Much smaller", at any rate, with the cooling probably taking up more room than the active components themselves.

          I suppose that would make water cooling a lot more attractive -
      • Sounds like you're talking about the P.A. Semi [theinquirer.net] chip recently announced. =]

        Well, sans GPU. But given the PCI Express interface, a custom one-off wouldn't be that hard to tack onto the board. Given the preposterous RAM bandwidth on the PA Semi chips, a solution like nVidia's TurboCache would work great: just have one unified RAM for processor and GPU.

        That's one other thing I don't expect to see integrated any time soon: RAM. As for storage in general, hopefully flash will continue growing in capacity at a va
      • Single-chip computers - a CPU, and a totally passive backplane that does nothing but provide real estate for connectors. And most likely, you wouldn't strictly need any extra cards, with a decent (but not high-end, thus the need for a bus at all) GPU included right on-die.

        What about heat output? The next logical step isn't to shove them closer together - it's to use the real estate you already have. Either the iMac G5 model, or back to the Commodore model (your keyboard is your computer). If you detached th
    • They've added the memory controller and now they're adding the PCIe controller. If they keep adding things to the chip, soon it will be so big that they'll just put the expansion slots directly onto the CPU. It WILL be the motherboard.
    • by Anonymous Coward
      The next logical step is to integrate the user into the chip. Chips will become huge, and the housing market will merge with the computer industry.
    • by eth1 ( 94901 )
      ...600W power supply? :P
    • Personally I never imagined integrating a PCI Express controller into a CPU. If this trend of integration continues, what would be the next logical step?

      The display, keyboard and mouse.
    • 2 GB of main memory on the chip, or better yet a GPU. Then a wireless chip and finally the audio subsystem. If we can put all of these things on the chip we could have a griddle and make pancakes.
      • I agree: memory. More specifically, large quantities of DRAM. This is more a result of the decreasing feature size and trying to figure out what to do with all the additional transistors available. The GPU is an interesting idea, but I think DRAM is easier, solely because the heat densities on GPUs and CPUs are so high that the lower heat density of DRAM would make it easier to add to a CPU. In all likelihood the DRAM will actually act more like a gigantic exclusive L3 cache, of course with the option of having no
    • Didn't we learn anything after Cyrix?
    • I'm not sure if putting the PCIe bus on a chip is a good idea for a general-purpose CPU, but then, I've been wrong about stuff like this. I didn't think that putting a memory bus on a general-purpose CPU was a good idea either, but apparently, it worked out pretty well.

      I think the Alpha EV8 was supposed to have a built-in network interface. I don't know what sort of network interface it was supposed to be, though - whether it was Gigabit Ethernet, 10Gig Ethernet or just a generic processor network. Projects are un
    • Personally I never imagined integrating a PCI Express controller into a CPU. If this trend of integration continues, what would be the next logical step?

      More "pins" - it's the new MHz.
      • If they keep putting more stuff on the CPU, at some point there should be a decrease in pin count, as there isn't much offboard to talk to any more! I like the other guy's idea of putting the DRAM on there, that should slash the pin count. Personally I've been stuck at 1GB DRAM for a few years now and don't need more. Let's make it 2GB onboard DRAM and call it good.
    • Given that more than two cores has quickly diminishing returns unless you have ridiculously CPU-bound multi-threaded code, I would be surprised if we didn't see some very large caches or even a portion of main memory on die. OSes are getting NUMA support for Opterons; on-die memory would be another case of that.
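
      For a rough sense of those diminishing returns, Amdahl's law (the standard formula, not something from this thread; a hypothetical Python sketch) caps the speedup by the serial fraction of the code:

          def amdahl_speedup(parallel_fraction, cores):
              # Ideal speedup when only parallel_fraction of the work scales with core count.
              serial = 1.0 - parallel_fraction
              return 1.0 / (serial + parallel_fraction / cores)

          for p in (0.5, 0.9, 0.99):
              gains = ", ".join(f"{n} cores: {amdahl_speedup(p, n):.2f}x" for n in (2, 4, 8))
              print(f"{p:.0%} parallel -> {gains}")

      Even 90%-parallel code only gets about 3.1x from four cores and 4.7x from eight, which is why spending transistors on cache or on-die memory can look more attractive than more cores.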
  • I'm an AMD fanboy, oh yeah. Their chips are soooo good. Problem is that in Europe there weren't enough press releases about the AMD vs. Intel dual-core duel [amd.com]. In fact there is nothing about that. Looks like all the PR people quietly took a large sum of money from Intel, and this duel is totally ignored by the media [google.com]. I feel bitter about that.

    Yes, I heard that AMD has launched a big ad campaign in the US, but sadly this is not the case for Europe.

    and all the universities in western Europe are forced to buy upgrades from Dell. I r
    • I heard that AMD has launched a big ad campaign in the US, but sadly this is not the case for Europe

      Funny, with Adblock, Flashblock, TiVo, Netflix and BitTorrent, I can't recall the last commercial or ad I've seen.
      • Excellent. The government eat wendys subliminal advertising scheme is working perfectly. World buy pepsi domination is at hand.
        • Funny, with Adblock, Flashblock, TiVo, Netflix and BitTorrent, I can't recall the last commercial or ad I've seen.

        Try getting out of the house. Advertising is everywhere.

        • Only listen to podcasts while on the road, and cover your eyes.
        • Try getting out of the house
          WTF, there is nothing outside of the basement, why would I want to leave that?

          Advertising is everywhere.
          Define everywhere. Yeah, magazines have advertisements, but I don't have to buy them. Newspapers have advertisements; I don't buy them either. On the radio, it is easy enough to change channels during a commercial, but NPR and the BBC don't have a bunch anyway. Billboards with pretty ladies I can't ignore, but I can't tell you what they were advertising. Are you in a being hel
    • No point really. The ones who would wish to know how good AMD is already know. Some people have very close relatives who have taken care of this for them. The rest will just believe what salesmen tell them no matter what. So they usually end up with something like a P4 with 256MB of RAM on the cheapest ECS board with a Via chipset. And an Nvidia GF5200 (hey, the board is a total no-name, but it's Nvidia after all, and Nvidia is the best). And the cheapest LCD monitor NOT connected via DVI (it's not that salesmen don't tell about D
    • "and all the universities in western europe are forced to buy upgrades from Dell."

      Nonsense! We have Viglens with P4s in them...
  • NP (Score:5, Funny)

    by kevin_conaway ( 585204 ) on Tuesday November 08, 2005 @09:19AM (#13978267) Homepage
    Ahh, nerd porn. While the rest of the world is looking at leaked photos of Janet Jackson or Paris Hilton, we're looking at photos of AMD's new processor.
    • Re:NP (Score:5, Funny)

      by evilviper ( 135110 ) on Tuesday November 08, 2005 @09:26AM (#13978318) Journal
      While the rest of the world is looking at leaked photos of Janet Jackson or Paris Hilton, we're looking at photos of AMD's new processor.

      No, no... This isn't even pictures of a new AMD processor... it's pictures of the SOCKET where the processor will go.

      It's more like pr0n pictures of a bra or a bikini, without anyone wearing them... :-(
      • by Vo0k ( 760020 )
        Nope, that's a closeup of the sexxy part of the motherboard, where the CPU inserts all its pins.
        Since the introduction of the zero-force sockets, plugging a CPU in is not that arousing anymore...
        • Nope & nope - that's a picture of all the pins on the socket where the CPU will *rest*.

          A tiny bed of nails for dual core Opterons. Helps them focus! Ssh!
      • by Surt ( 22457 )
        Are you kidding me? The socket is clearly the female side of this connection. Any nerd caught looking at the pins on his CPU may as well be checking out his friends ... you know.
    • by MouseR ( 3264 )
      Which goes to show that in this world, there's a socket for everyone.
    • by Hoi Polloi ( 522990 ) on Tuesday November 08, 2005 @09:52AM (#13978483) Journal
      You just made me realize that reading the article description got me as excited as looking at nekked pics of Paris Hilton. The big difference is that AMD CPUs are much more interesting than her, and more talented. They are both about as flat and prickly, though.
    • While the rest of the world is looking at leaked photos of Janet Jackson or Paris Hilton, we're looking at photos of AMD's new processor.

      I know I'd rather see a CPU than Paris Hilton. You can at least try to carry on a convo with a CPU... and odds are it will have something more intelligent to say.
  • Translation (Score:5, Informative)

    by Anonymous Coward on Tuesday November 08, 2005 @09:26AM (#13978322)
    The first photographs of AMD's Socket F have shown up on our Gathering of Tweakers forum. We wrote about AMD having put its new processor socket on its roadmap last May. The new socket is said to have 1207 connection points and is intended for multi-Opteron servers. To prevent the insertion of a DDR-supporting processor into a DDR2-socket and vice versa, a new socket design was necessary. The extra pins that came available are said to be used for an integrated PCI Express controller. What's remarkable is that there's a clear separation in the middle of the socket. This could indicate that each core of a dual-core Opteron has its own set of contacts and thus is treated as two separate processors.

    The photographs furthermore show that Socket F, like Intel's Socket 775, will feature pins that make contact with the processor. This is a so-called LGA socket: the CPU will no longer feature pins that have to be pushed into the socket. Socket F is also called Socket 1207, but careful counting reveals that the socket only features 1206 pins. The socket supports DDR2 533, 667 and 800MHz memory, which allows AMD to compete with Intel's FB-DIMM plans. The latter is scheduled to introduce its dual-core Dempsey platform in April, featuring the Greencreek chipset with support for FB-DIMM memory.
  • Yet another socket (Score:5, Interesting)

    by Cerberus7 ( 66071 ) on Tuesday November 08, 2005 @09:29AM (#13978336)
    Yay. I'm still on the fence about whether all of these different sockets are a good thing or not. I've gone from Socket 7 to Super Socket 7 to Socket A over the course of the last several years. Now it seems there are way too many different sockets to choose from, and who knows which will show the same kind of longevity that my past choices have. What's a guy to do?
    • elmuerte is waiting for one socket to rule them all.

      damn, that must be pure evil.
    • Realize that a new motherboard is only about 10% of overall system cost, and that by the time you can afford and need a processor upgrade you can probably afford a motherboard upgrade to go with it.

      As a backup, the best strategy will tend to be to buy early in a socket's lifetime. Buy a new socket with a low end chip, and figure to upgrade to the highest chip that socket will support when said chip becomes cheap.

      • Aye, your strategy is my own -- works great, too. I did this with Socket A years back and again with socket 478.

        As a side note, I've found that I get better longevity out of a build (or "assembly", for those who are picky about terminology) if I wait for a board that has a combination of well-developed solid features on it, too.... sound, on-board VGA, SATA, SPDIF support, etc.

        What it probably comes down to is letting enough time go by such that all the major motherboard manufacturers have the opportunity t

    • by Jendi ( 917869 )
      There really aren't that many to choose from -- your choices are basically defined by "server vs desktop" motherboard (e.g. Socket 940 dual Opteron/registered memory, or Socket 939 desktop) and "AMD vs Intel". After that, sure, you'll want to stretch your current investment as far as possible, but at some point you have to bite the bullet and replace your motherboard and memory; how else are you going to keep getting loads more lovely memory bandwidth for your system?

      IMHO, I'm going to try and wait until AMD M2
    • It's no good complaining about the socket, because even if that stayed the same, the rest of the board you buy today wouldn't be compatible with tomorrow's DDR2 or perhaps DDR3 RAM.
  • by Anonymous Coward
    On our forum, Gathering of Tweakers, the first photographs of AMD's Socket F have emerged. In May we wrote that AMD had put a new processor socket on its roadmap. The new socket would count 1207 connection points and is intended for multi-Opteron servers. To prevent a processor with support for DDR memory from being inserted into a DDR2 socket and vice versa, a new socket was necessary. The extra pins that become available could, according to reports, be used for an integrated PCI Express controller o
  • My god, it's full of holes.
  • Look at all them pins!

    I thought having those pins on the socket was a stupid idea, but it's interesting to note that even if you did damage the pins on the motherboard, chances are it would be cheaper to replace it than the processor itself. Though only replacing the processor would be much more convenient.

  • Great, I haven't even built my new AMD64 system, and now I have the latest and greatest dangled in front of me. It will require not only a new CPU, but a new motherboard and new RAM (DDR2 vs. DDR).

    Must...not...chase...bleeding...edge!
  • by lpangelrob ( 714473 ) on Tuesday November 08, 2005 @10:00AM (#13978574)
    I don't know which came first, Double Data Rate or Dance Dance Revolution, but I curse the second group that used the DDR acronym.

    Every single time I see DDR and compatibility, I think, wait, why do you need anything else with DDR?

  • by Hymer ( 856453 )
    ...it looks just so sexy... It's so big, and there are so many holes...
    My GOD... YES... YES...

    Sorry... It just came over me...

    --

    Real CPUs have the cooler mounted with two 10mm nuts...
  • by kriston ( 7886 ) on Tuesday November 08, 2005 @11:15AM (#13979194) Homepage Journal
    I can't tell from the photographs - is this socket going to take a pinless processor like Intel's Socket-775, or are we stuck with over 1000 fragile, whisker-like pins? I started appreciating my new Socket-775 system after I installed my Socket-754 one with all the fragile pins on it. At first I thought it was silly, but after straightening out more than a couple whisker-thin pins on my Athlon 64 CPUs I'm hoping Socket-F follows the precedent of using pin pads.
    • after straightening out more than a couple whisker-thin pins on my Athlon 64 CPUs

      If you are straightening pins, you're not being nearly careful enough. Socket 754 pins are considerably beefier than, for example, Socket 478's, and few people complained about the Pentium 4 Northwood being easy to damage.

      LGA is more about better electrical connectivity than preventing bent pins. Remember that most CPUs go into OEM systems, which are assembled by people who are much better at inserting CPUs than you are.
  • by UnknowingFool ( 672806 ) on Tuesday November 08, 2005 @11:57AM (#13979679)
    A new AMD Socket? No wonder the tech room was covered in drool this morning.

    God, I hope that was drool.
