
Intel X58 To Be First Non-NVIDIA Chipset To Get SLI

Vigile writes "In a somewhat surprising move from a company that is used to holding its proprietary technologies close to its chest, NVIDIA has announced that it is opening up a 'certified SLI motherboard' program for boards using the upcoming Intel X58 chipset. The X58 is Intel's core logic offering for Nehalem/Bloomfield processors, and many people wondered how NVIDIA would support SLI on a platform for which it had admitted to not developing a chipset. At first, NVIDIA was pushing the use of its dedicated nForce 200 chip, but it has instead decided to open up the SLI technology to X58 motherboards that meet certain NVIDIA requirements. This leaves a lot of questions about NVIDIA's previous SLI statements, how the pricing of the certification affects partners, and whether NVIDIA's chipset business is truly at its end now."
  • by bestinshow ( 985111 ) on Thursday August 28, 2008 @09:11AM (#24778085)

    Maybe they got a CSI interconnect license from Intel in return for the SLI technology.

    Or the days of proprietary GPU ganging technology are coming to an end. Intel already does Crossfire in its chipsets, and AMD's GPUs are the best right now, so that's two strikes against NVIDIA among people who buy Intel-based computers.

    • by confused one ( 671304 ) on Thursday August 28, 2008 @09:33AM (#24778351)

      Maybe they got a CSI interconnect license from Intel in return for the SLI technology.

      That's exactly what was reported during IDF: Intel wouldn't license the CSI interconnect unless it was part of a cross-license for nVidia's SLI.

      • Re: (Score:3, Informative)

        by jensend ( 71114 )
        If nV really were getting QPI in exchange for this, then this would be a big win for consumers all around. However, the news I'm seeing says otherwise, for instance this bit from Tech Report [techreport.com]:

        [nV spokesman Tom] Petersen also told us Intel wasn't a party to Nvidia's decision to allow SLI on the X58, so there's no apparent quid pro quo here.

        Nvidia does not plan to abandon its chipset business entirely and will continue to make core-logic products for other Intel platforms, like the current Core 2 one.

        So "we'll

        • Or perhaps they'll continue making low-end high volume integrated graphics motherboard chipsets and are just leaving the low-volume neckbeard motherboard segment to Intel.

        • Aw crap. I hadn't seen that article yet. I wish I could find the link to the article which was, obviously, incorrect.
    • by suso ( 153703 ) *

      Didn't 3dfx have SLI capability 10 years ago?

      • Re:3dfx (Score:4, Informative)

        by bigstrat2003 ( 1058574 ) * on Thursday August 28, 2008 @10:04AM (#24778789)
        Same name, different tech. nVidia SLI [wikipedia.org] vs 3dfx SLI [wikipedia.org].
        • Re: (Score:3, Interesting)

          by suso ( 153703 ) *

          Different technologies that accomplish the same thing:

          • Scalable Link Interface (SLI) is a brand name for a multi-GPU solution developed by Nvidia for linking two or more video cards together to produce a single output.
          • Scan-Line Interleave (SLI) from 3dfx is a method for linking two (or more) video cards or chips together to produce a single output.

          Sure, it "changed dramatically", but don't all technologies change over time? 3dfx just didn't make it, especially since most of its IP was acquired by Nvidia

    • Re: (Score:3, Insightful)

      by citsacras ( 745706 )
      All OEM X58 motherboard manufacturers will have to submit their boards for certification by NVIDIA. NVIDIA will also be charging an undisclosed certification fee. If the board passes certification, NVIDIA issues a BIOS key enabling SLI. The NVIDIA SLI drivers check for the presence of this key. NVIDIA will continue to design chipsets for Penryn-based platforms, but it will not be making any QPI-enabled chipsets for Nehalem. Thus, with Nehalem, the only way to get SLI support with an LGA-1366 MB would be to
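The key-checking scheme described above can be sketched in a few lines. This is purely a conceptual illustration: the driver scans the system BIOS image for a vendor-issued "approval key" and enables multi-GPU mode only when it is found. The key string and layout here are invented; NVIDIA has never published the real format.

```python
# Hypothetical sketch of a BIOS-key gate for SLI; the signature below
# is an invented placeholder, not NVIDIA's actual key.
APPROVAL_KEY = b"NV-SLI-CERTIFIED"

def sli_enabled(bios_image: bytes) -> bool:
    """Return True if the BIOS image carries the approval key."""
    return APPROVAL_KEY in bios_image

# A certified X58 board would ship with the key embedded in its BIOS;
# an uncertified board would not.
certified_bios = b"\x00" * 64 + APPROVAL_KEY + b"\x00" * 64
plain_bios = b"\x00" * 144

print(sli_enabled(certified_bios))  # True
print(sli_enabled(plain_bios))      # False
```

The point of such a scheme is that it requires no chipset hardware at all, which is consistent with the "arbitrary software restriction" complaints elsewhere in this thread.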
    • 'bout time!

      Up until now, nVidia restricted SLI to its own nForce chipsets to try to force users to buy them. But in general the Intel chipsets were better, so unless someone was really, really fixated on SLI, they just went for the Intel ones and got a single graphics card. And if they were really out for top performance, they'd just go for Intel with two ATI cards; it only hurt nVidia in the end. Their fixation on trying to force people onto their nForce chips just slightly injured their graphics card sales.

      `Jar

  • Nvidia (Score:2, Interesting)

    by Anonymous Coward

    Sounds to me like Nvidia has something up their sleeves. Or maybe not. Being a participant in the Intel chipset market might just mean they are trying to keep their foot in the door. They probably have a better grasp on what's going to happen in the future. Being compatible with Intel chipsets (I think) is a big step in keeping their dominance in the graphics adapter world. Without Intel acceptance, they might have problems with market share in the future when Intel releases their GPU. Way to go Nvidia, for keeping

    • Re:Nvidia (Score:4, Insightful)

      by pak9rabid ( 1011935 ) on Thursday August 28, 2008 @10:54AM (#24779507)

      Sounds to me like Nvidia has something up their sleeves.

      Such as licensing their SLI technology to Intel so that they can get an x86 license in return?

      • This is an interesting idea, as Intel actually needs viable competitors to avoid getting hit by the same antitrust laws that almost killed Microsoft. If AMD's CPU arm continues as it has done in the past year, I cannot see them lasting very long, so Intel needs to add some competitors.
      • Intel has nothing nvidia wants.

        nVidia, after a few acquisitions, can make 386SX processors that they do sell. I have no idea how much the i686s have changed.

        As for 64-bit x86, well, the only people good for that are AMD. Intel's EM64T is just a botched AMD64, because Itanium was a spectacular failure. It doesn't look very likely that AMD will deal with nVidia, though. There's probably a bit of bad blood now; nVidia's CEO wanted to be CEO of the merged company, and that's why the merger never happened but AMD bought ATI instead

        • Re: (Score:3, Funny)

          by Trogre ( 513942 )

          All the better. The GTX series cards are looking to be completely craptastic. Outlook not bright.

          Truer words have never been spoken. Tell me, what do you think of Exchange Server?

    • Seems to me that this is a last resort by Nvidia to stay in business.

      1) Almost every hardcore gamer will be building an Intel Bloomfield/X58 system in the next 12 months (along with companies like Falcon Northwest that cater to hardcore gamers).

      2) Many of the largest mobo makers refused to incorporate the nForce 200 chip in their designs; it runs hot and adds cost and licensing restrictions.

      3) AMD/ATI has just released new high-end graphics cards that are actually competitive, and that work in multi-GPU configurations

  • Stop it already! (Score:5, Insightful)

    by Anonymous Coward on Thursday August 28, 2008 @09:25AM (#24778255)

    This leaves a lot of questions about NVIDIA's previous SLI statements, how the pricing of the certification affects partners, and if NVIDIA's chipset business is truly at its end now.

    Will you stop posting that stupid bloody rumour? There is no evidence, it has been flat-out denied by nVidia, and it would be a stupid move. It is a made-up, outright lie, propagated by idiots like The Inquirer and by fools who never read the multiple retractions. Frankly, I wish nVidia would start suing anyone and everyone who insists on reposting that stupid crap.

    I don't even like nVidia, but this sort of stuff just pisses me off.

    • Welcome to Slashdot - Where rumors are continually reported as true. Enjoy your stay.
    • Kind of like how Apple denied that they'd use x86 until they actually switched?
    • Without a QPI license they can't make dual x16-slot motherboards (the DMI interface doesn't have that kind of bandwidth), so they are relegated to making low-end motherboard chipsets only ... that is not a place they want to be.

      AMD can't really save them there, since AMD is pretty much dead in the high end at the moment ... and they are in bed with ATI, which makes it hard for NVIDIA to compete on the integrated front.

      If they don't get a QPI license they will phase out their chipset business; the "denial" was

  • The going rumor is that Intel and Nvidia struck a deal where Intel would get a license to make SLI chipsets, and Nvidia gets a license to make chipsets for the upcoming i7 (Nehalem) platform.
  • by Catalina588 ( 1151475 ) on Thursday August 28, 2008 @09:29AM (#24778301)
    I have a Foxconn-built 590i nVidia reference board that supports SLI and has a raft of other features like smart Ethernet, RAID, yada yada, yada. I also use a Skulltrail with SLI chips in it.

    The reason most gamers buy nVidia reference boards is to get SLI. With nVidia now certifying other vendors, starting with Intel's X58, the nVidia reference SLI motherboard market is RIP 2008.

    • by Reapman ( 740286 )

      Actually, I have an nVidia chipset-based board and I don't use SLI... I like what nVidia has done in that business: opened it up and made it easier to work on stuff... sure, some of the software they have for overclocking and such is crappy, but better than any Intel-based chipset I've ever used.

      I won an Intel-based motherboard (a couple years back, I admit) and if it showed me anything, it's that Intel knows nothing about the gaming business. I trust nVidia to make better chipsets because I've personally had great

      • by rgviza ( 1303161 )

        They're getting out because their chipsets suck and they are focusing on their core business, which is GPUs. Your mobo had an MCP55 southbridge with a broken interrupt timer. I have the same chipset (on a DFI board). It won't rear its ugly head unless you use a dual-core CPU and game or use Linux.

        Without a BIOS patch you'll kernel panic on Linux unless you use a switch that tells the kernel to use its software timer (coded just for this issue). A while back Microsoft put in a patch that detects this issue and

      • by daviee ( 137644 )

        I have an nVidia 680i board and now a 780i board. I got it for fancy features at the time, but don't overclock.

        I'm not sure how Intel chipsets are right now, but to me it seems like nVidia boards (mostly all reference boards) are cutting things too close. E.g., four DIMMs in the slots and the voltage drops? What kind of design is that?

        • by Reapman ( 740286 )

          I've had horrific motherboards from AMD (wouldn't even fit the STOCK fan on the motherboard, was a Slot-style older board), Intel (CUV4X-D I think, pretty sure it was Intel; I only once got the kernel to load with both CPUs enabled), VIA, etc. etc. I don't think these chipsets are crap because nVidia doesn't know what they're doing; they're crap because people want stuff as cheap as possible, and so they cut corners.

          nVidia made stuff like tweaking motherboard settings easier and I like what they've done for

    • nVidia made the best chipsets for Socket A; they were desirable because they overclocked well and were faster and more stable than VIA's, etc. They also had very good integrated sound with SoundStorm, whereas integrated sound had not been good enough even for casual music listening until that point. All before SLI.
    • Too right. I've had a selection of nVidia boards since nForce 2, but all the fun of X-Fi + 4GB RAM + Conroe + nVidia boards became too much of a headache, and I jumped ship to Intel. nVidia boards have better features - a lot more PCIe lanes, for a start - but having a board that doesn't constantly fall over at random or cause the sound card to be completely non-functional under certain conditions is worth it. I went through 4 nVidia boards from different vendors (nForce 4, a couple of 650s and a 680i) before I

  • What is a Bloomfield and why should I care?
    • by DnemoniX ( 31461 )

      Bloomfield is Intel's new quad-core gaming CPU, with a bulk price to manufacturers of around $999 per unit. In other words, unless you are a hardcore gamer with money to burn, you probably shouldn't care about it at all.

      • Or maybe if I wanted to run Vista. ;-) Thanks.
      • The $999 chip is only the highest-end model. In 2008 the 2.66GHz Bloomfield (or Core i7, by its new marketdroid name) will be $284, which is still not cheap, but affordable considering it is a brand-new next-generation part. In 2009 the Lynnfield mainstream version of Nehalem will appear.

  • Competition is good ... it keeps prices low, and performance only goes up.
    But teaming AMD/ATI (nothing to do here, they are one company now) against Intel/nVidia I don't think is good. How long before software is heavily optimized for Intel/nVidia or ATI/AMD? Or worse: only works on Intel/nVidia || ATI/AMD?

    • Re: (Score:3, Informative)

      Intel chipsets also support ATI Crossfire, so apparently there's no "teaming" intended. Some software is already heavily optimized for AMD vs. Intel CPUs and nVidia vs. ATI GPUs.
  • Opening up a larger market for their bread-and-butter card sales can't hurt. Probably a bigger win for nVidia than trying to continue to cut in on chipset sales. Intel's X38 & X48 chipsets have been major successes, and have probably boosted sales of ATI (er, AMD) boards. Both nVidia and Intel have a vested interest in reducing the market share of AMD... so it's not completely off the wall. Makes you wonder what sort of tradesies are involved. Probably not an x86 license.
  • ATI Crossfire runs on any board without this lock-down crap. Also, having only the X58 for the new Intel CPUs sounds like a bad idea: $200-$350 motherboards for the new Intel CPU vs. $100-$300 AMD boards with a lot more choice.

  • by phoenix.bam! ( 642635 ) on Thursday August 28, 2008 @09:59AM (#24778707)

    But doesn't SLI mean NVIDIA sells two high end graphics cards? Why wouldn't they do this? It makes perfect sense in every way possible.

    • And previously, it meant they also sold you an Nvidia Chipset motherboard. And, not many people buy Nvidia chipsets just for the chipset anymore. That's why.

    • Re: (Score:2, Funny)

      by RiotNrrd ( 35077 )

      But doesn't SLI mean NVIDIA sells two high end graphics cards? Why wouldn't they do this? It makes perfect sense in every way possible.

      I was thinking this exactly. When I upgraded my system last winter I could not find a SLI board that I really wanted, so I was stuck with having to run only one video card.

      If Apple were to implement SLI in their Mac Pros I might dump my PC completely and just get a big, shiny, dual-GPU-having, silver box of HELL YEAH!

  • SLI (Score:1, Flamebait)

    Isn't SLI a con?
    • Re: (Score:3, Funny)

      Isn't SLI a con?

      No.

  • nForce boards (Score:4, Informative)

    by linuxpng ( 314861 ) on Thursday August 28, 2008 @10:16AM (#24778981)

    have been problematic for me. I've recently purchased my first Intel system board (since I don't overclock) and can say that I've had much better stability. There is no downside to having Intel stability applied to SLI. That being the case, maybe fewer games will have such half-assed support (or none) for multiple GPUs.

    • So true. My first SLI board had to be replaced four times in under six months, and then the last one finally died too.

      It was never stable and always hot. There were severe problems with the memory controller onboard.

      I had to apply extreme air cooling to maintain operating temperatures.

  • by rAiNsT0rm ( 877553 ) on Thursday August 28, 2008 @11:07AM (#24779697) Homepage

    OK, I am a longtime gamer (Atari 2600 onward) and have been building PCs for over 15 years. History has repeated itself time and time again, yet everyone still falls for the same crap. Games cost a lot to produce; no game maker is going to make a game targeted at some minute fraction of their audience. When 90-95% of the PCs in homes aren't even SLI-capable, what deludes people into buying such a niche product and then expecting to be catered to?

    Tech demo "games" are what people always point to each time SLI tries to enter the market (way back to Voodoo days), and today it's a title like Crysis. Everybody spends and spends and builds mammoth PCs to get the highest FPS in it, but no one actually *plays* it as a game; it is just a benchmark and eye candy demo. Then they sit back and whine when all of the "blockbuster" games don't utilize a fraction of their uber systems. WoW, Warhammer: AR, GRID, Assassin's Creed, Spore, etc. all run fine on systems over 4 years old, because that is the middle of the road that developers are going to target for the most profit. Sure, they may throw in an "ultra high mode" for the few bleeding-edgers, but it is always an afterthought and either buggy or incomplete.

    None of this is new. Stop throwing $500 into SLI video cards and $300 mainboards, IT ISN'T WORTH IT. Also, another rule of thumb that has always been proven right over time: If a card (video included) requires 2 slots or more for either cooling or "daughter cards" then it is an immature technology and will be streamlined into a single slot solution soon for much cheaper due to the reduced manufacturing costs.

    • Machines that is. If I only support the very latest then my sales will be a fraction of what they are.

    • by zachtib ( 828265 )

      You're right on several of these points. I built my machine a year ago for about $600: Core 2 Duo 2.1GHz, 2GB DDR2, and an Intel motherboard. I threw in a Geforce 6600GT that I already had and that machine has been able to handle just about everything I wanted to play on it. I recently ordered an 8800GT to replace my aging 6600GT, but that still puts the total cost at under $800.

      One thing I would argue on is Assassin's Creed. My younger brother bought it for PC, and I found it to be fairly hardware intensive

      • For goodness sakes, it's a laptop with fairly low-mid specs, and the fact that it is playable at all is confirmation enough. Assassin's Creed was the most demanding of the games I listed, and on par with a number of other top titles like Bioshock, Hellgate, etc.

        I'm not coming down on you, just responding to your one "argument" which I think even you would agree is fairly thin.

      • by Khyber ( 864651 )

        128 megs of video RAM? Were you running XP or Vista? Vista allows for memory sharing, and Assassin's Creed ran just fine on a dual-core 1.8GHz C2D with 1 gig of system memory dedicated to video and one gig dedicated to the OS and programs. The GPU is a faulty-as-hell 8600M GS, for which I have to send the laptop in to have it replaced. By default, however, my 8600M has a dedicated 512 megs of RAM before the Vista-added memory.

    • When 90-95% of the PCs in homes aren't even SLI capable what deludes people into buying such a niche product and then expecting to be catered to?

      None of this is new. Stop throwing $500 into SLI video cards and $300 mainboards, IT ISN'T WORTH IT.

      That's been my experience as well. Video games are developed to be played. They're looking to sell lots and lots of them. They want the widest audience possible. Sure, a game may run better or look nicer on a bleeding edge system...but it won't require that kind of hardware. Developers are aiming for decent, not amazing hardware.

      Sure, if money is absolutely no object then go right ahead and pour it into SLI and GPUs and whatnot. But generally speaking you can get more for your money if you put it into

      • The problem with SLI is you spend 4x the money to get a 20-30% boost, maybe 50% if the game is particularly sloppy in its graphics pipeline, and often 0% if the game doesn't support SLI at all - hell, some of them even crash under SLI.
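The parent's "4x the money for a 20-30% boost" claim is easy to put in cost-per-frame terms. The figures below are invented round numbers, not benchmarks: a $200 single card doing 60 fps versus $800 of SLI hardware buying a 30% frame-rate gain.

```python
# Back-of-the-envelope cost-per-frame arithmetic with invented numbers.
single_cost, single_fps = 200.0, 60.0   # one midrange card
sli_cost, sli_gain = 800.0, 0.30        # board + two cards, 30% faster

sli_fps = single_fps * (1 + sli_gain)               # 78 fps
single_dollars_per_fps = single_cost / single_fps   # ~$3.3 per fps
sli_dollars_per_fps = sli_cost / sli_fps            # ~$10.3 per fps

print(round(single_dollars_per_fps, 1))  # 3.3
print(round(sli_dollars_per_fps, 1))     # 10.3
```

Even under these generous assumptions, each frame per second costs roughly three times as much on the SLI setup, which is the thrust of the comment above.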

        Meanwhile you can buy a cheap board, and a single card one step up, that will deliver steady performance across all games with far less compatibility/tweaking issues to worry about.

        I can't help but roll my eyes and belittle people when they blow $200-300 on a board, then buy a

    • I gotta disagree on Assassin's Creed; it is a fairly intensive game. I have a Pentium D clocked at 3.4 GHz and a 7600GT... and while that is not an "awesome system", it is within 4 years old and struggles with Assassin's Creed.

      Second, single-slot variations of most cards are available with a smaller heat sink. Also, I liquid cool, so almost anything I put in will only take one slot. Of course there are the dual-GPU cards as well, which offer better performance than their single-card counterparts in SLI/Crossfire.
    • Re: (Score:3, Informative)

      Crysis. Everybody spends and spends and builds mammoth PCs to get the highest FPS in it but no one actually *plays* it as a game, it is just a benchmark and eye candy demo. Then they sit back and whine when all of the "blockbuster" games don't utilize a fraction of their uber systems. WoW, Warhammer: AR, GRID, Assassins Creed, Spore, etc. all run fine on systems over 4 years old.

      WotLK is bogging down below 30 fps on systems with a brand-new 4870 using the new shadow options, and even without them on 30" screens, which seem to be the target for SLI. So WoW benefits from SLI. Warhammer slows to the teens in large RvR encounters at 1920x1200, which is becoming a common resolution (seen on 22-28" monitors), on a 4870, but not on a 4870x2 or a couple of GTX 280s. So Warhammer benefits from SLI. Assassin's Creed is slow at points on an 8800 GTX at 1680x1050, so with higher resolution it sh

      • 1920x1200 is becoming a common resolution? Hardly, check out the Half Life 2 survey, a cool 2.29% of users are running it.

        http://www.steampowered.com/status/survey.html [steampowered.com]

        1024x768 and 1280x960 are still what most users are running, by far. At those resolutions, a decent midrange card is more than any average user needs. Their games will run great.

        Basically, SLI is great for those who are willing to pay through the nose to run the game at crazy resolutions, AA and AF maxxed to the tits. Because god knows that j

        • 1920x1200 is becoming a common resolution? Hardly, check out the Half Life 2 survey, a cool 2.29% of users are running it.

          http://www.steampowered.com/status/survey.html [steampowered.com]

          1024x768 and 1280x960 are still what most users are running, by far. At those resolutions, a decent midrange card is more than any average user needs. Their games will run great.

          Basically, SLI is great for those who are willing to pay through the nose to run the game at crazy resolutions, AA and AF maxxed to the tits. Because god knows that just improves the game experience so much that it's worth shelling out more $$$.

          Common among enthusiasts, who are the target market of high-end video cards? Yes, most definitely. I saw a recent poll, I think on the HardOCP forums, that showed 1920 and 1680 basically tied as the most popular resolutions for that enthusiast community. NV and ATI aren't selling multi-card systems for 1024x768, but the higher-resolution market is quite large, growing very quickly, and is the market that has money to spend and people who can truly benefit from multi-card systems (assuming they don't care
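The resolution argument in this subthread is ultimately about pixel throughput, so the raw arithmetic is worth spelling out: each contested resolution, relative to 1024x768.

```python
# Pixels per frame at each resolution mentioned in the thread, and the
# multiple of the 1024x768 baseline each one represents.
resolutions = {
    "1024x768":  1024 * 768,
    "1280x960":  1280 * 960,
    "1680x1050": 1680 * 1050,
    "1920x1200": 1920 * 1200,
}

base = resolutions["1024x768"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels} pixels ({pixels / base:.2f}x)")
```

At 1920x1200 the GPU is pushing nearly three times as many pixels per frame as at 1024x768, which is why the high-resolution crowd looks at multi-GPU setups while midrange cards remain fine for everyone else.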

    • by Molochi ( 555357 )

      I have to say I agree with you. I only recently upgraded to a new gaming system and went with a single ATI 4850 for most of these reasons. I happily game on an HDTV at 1920x1080 or 1360x768, so I'm not running the ultra-high-resolution tech demos that seem to be needed to show the "benefits" of a system that uses a kW PSU to power a pair (or triple or quad) of leet GPUs.

      I have mixed feelings about the multislot cards though. Even though the 4850 uses a single-slot design, only uses a single PCIe aux plug, and

      • I run my 4850 at 1920x1080 and I am amazed at its performance. Haven't tried Crysis yet, but CoD4 and every other game I have tried runs at a minimum of 60fps with graphics turned all the way up. The 4850 is a hot card, but only because they throttle the fan down by default. I think it was a really dumb thing to do. There's an easy fix, but you have to edit an XML file (which I have no problem doing, but most people would). I cranked the fan up to 100% from the default of 7% and saw the temperature drop from
    • by Fweeky ( 41046 )

      Everybody spends and spends and builds mammoth PCs to get the highest FPS ... 90-95% of the PCs in homes aren't even SLI capable

      I suggest you look up the meaning of "everybody", as you seem to be confusing it with something else.

      IT ISN'T WORTH IT

      Not for most people, no, but for game developers, people with huge monitors, and people with enough money to make the extra cost irrelevant, the tradeoff might be different.

      Of course, Extreme Edition/Ultra/etc are probably silly even for them; pay 70% more to get an extra 10% performance? At least SLI can nearly double performance.

      Crysis .. no one actually *plays* it as a game, it is just a benchmark and eye candy demo

      Um, I'm pretty sure I played it as a game; it's a fun FPS, at least for the f

  • Another fine article (AFA?) is here [xbitlabs.com].

    From AFA:

    At the Nvision event in San Jose, California, Nvidia outlined another plan: it will certify certain Intel X58-based mainboards for SLI compliance and will provide "approval keys that will be integrated into the system BIOS for boards that pass certification". The company said that it will charge mainboard makers for SLI compliance, but right now the terms are unknown.

    This smells like yet another "we'll put arbitrary software restrictions in our stuff because we're greedy" move. Wonder why they are the only ones with no free drivers whatsoever?

    I call bullshit. I stopped using their chipsets long ago, but now I'll actually switch to AMD for video. No more NVidia - hello, software freedom.

    Now if I could only find a P45-based board that can run FreeBIOS....

    • Anyone have any examples where the NVIDIA drivers detect your chipset, and if it's not an NVIDIA chipset your card underperforms? Sounds like that's what they'd be doing in this instance as well. IANAL, but that doesn't sound legal. They can't force you to use their chipset to run their graphics card. Admitting that their software will look for approved keys and hinder performance if they are not found definitely sounds anti-competitive. Isn't licensing the technology enough?
      • by Godji ( 957148 )
        They do something of the sort. A friend bought the nForce 680i SLI when it came out. It had a BIOS option to increase PCI-E bandwidth with certain high-end nVidia video cards by some 15%. The option would only work with a 8800-series card. Interestingly, the option disappeared in later BIOS revisions.

        Yet another example: the only difference between a commodity GeForce and a really expensive Quadro is a switch in the driver. The driver enables special features for Quadros, including the ability to run SLI
  • Is it just me, or is Intel "releasing" a new chipset every 6 months that does fuckall better than the last one? It's like they've gotten an nVidia DNA transplant.

    Just like the GeForce 9 and GT200 have been craptacular rehashes of existing tech, the X38/X48 and now X58 are eerily similar to, and in many cases worse performers than, the P35 they're supposed to replace.

    X38 was supposed to have "unofficial" SLI support. X48 too. Now X58 has "official" SLI. Big whoop! Given the inflated price of these boards, I s

  • So...
    Everyone forgot already that there was a time SLI worked fine on Intel boards?
    Really?
  • Is this anything more than an Nvidia driver change, or was SLI lurking in the X58 all along and Intel was just waiting for permission to turn it on? Just what is required from a chipset to support SLI anyway? The implication is that it's more than the connecting cable between the cards. Could SLI have been running on non-Nvidia boards long before now? So many questions.
