Four GPU Motherboard

didde writes "The people over at Tom's Hardware are running a story on Gigabyte's experiments with quadruple GPUs on one motherboard. Perhaps we'll need something cooler than liquid metal to keep this beast from running hot?" From the article: "About half a year ago, we learned that Gigabyte was working on a graphics card that integrates two GeForce 6600GT graphics chips. While we were impressed with the out-of-the-box approach from Gigabyte, there was of course the question of whether two of those cards could be combined for a total of four graphics chips."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Quad Cards? (Score:3, Interesting)

    by OverlordQ ( 264228 ) on Friday May 27, 2005 @11:41AM (#12655656) Journal
    Maybe if you won the lottery and/or work is (for some odd reason) paying for it. 4 GFX cards that'll run SLI, or whatever SLI for 4 gfx cards is, will probably take up 75% of the total cost of a machine.
    • Re:Quad Cards? (Score:2, Insightful)

      by Morticae ( 801527 )
      Not necessarily.
      It's the same concept as a Beowulf supercomputer.

      With the possibility of parallelism, we can use cheaper cards in tandem and get the same power as a high-end graphics card (or one that doesn't exist) for far less money.

      It also helps with failures--if one node fails you can simply replace it without the entire system (your $1000 graphics card) going down.

      Redundant systems and parallel computing are the wave of the future wooooooo!
      • Re:Quad Cards? (Score:2, Insightful)

        by gcauthon ( 714964 ) *
        A cluster of X components is never going to be as reliable as a single component. If you buy more of something, your odds of seeing a defect go up, not down. You are correct that if one card fails you only need to replace that one card. However, a card failure is now roughly four times as likely. Supercomputers are not for the thrifty and neither are multi-GPU systems.
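        A quick back-of-the-envelope illustration of that point, assuming failures are independent and using a per-card failure rate made up purely for the example:

```python
# Hypothetical numbers: assume each card independently has a 2% chance of
# failing in a given year. With four cards, the chance that at least one
# fails is 1 - (1 - p)^4.
p_single = 0.02
n_cards = 4

p_any_failure = 1 - (1 - p_single) ** n_cards
print(f"one card  : {p_single:.1%} chance of a failure this year")
print(f"four cards: {p_any_failure:.1%} chance of at least one failure")
# -> about 7.8%, i.e. just under four times the single-card risk.
```

        For small failure rates, 1 - (1 - p)^4 is just under 4p, which is where the "four times as likely" figure comes from.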
    • Re:Quad Cards? (Score:4, Informative)

      by taskforce ( 866056 ) on Friday May 27, 2005 @11:45AM (#12655702) Homepage
      This is 2 cards with 2 GPUs on them each, not 4 cards. Last year Gigabyte launched their dual GPU cards, but they couldn't run in SLi. At the time one of the main comments from reviewers and fans who were shocked by the power was "Whoa, wouldn't it be cool to run 2 of those in SLi and have 4 GPUs!"
    • Yes, but like dual SLI Voodoo2 cards, how freakishly long could you go with such a solution before being forced to say "ok, my freakishly powerful PC is finally too out of date to play the latest games"?

      I wouldn't be surprised if a quad machine lasted a cool decade with the current rate of technological advances in PCs.
  • Limitations (Score:5, Informative)

    by taskforce ( 866056 ) on Friday May 27, 2005 @11:42AM (#12655676) Homepage
    One of the major limitations of the GB Dual GPU cards is that they only worked on Gigabyte's own proprietary motherboards, which is useless for people who use other brands of motherboards; this was supposedly because the card used SLi in some strange way (2 SLi links across the GPUs, as opposed to across 2 boards).

    I would hope that they would be able to get these to run on all SLi boards; I've always thought one of the main strengths of building your own PC was the compatibility between different brands of components.

  • My God! (Score:5, Funny)

    by JudgeFurious ( 455868 ) on Friday May 27, 2005 @11:43AM (#12655682)
    ...It's full of GPUs!
    • "All these motherboards are yours except Macintosh. Attempt no modern video cards there."
    • You saw it here first:
      ATI and NVIDIA will get into a race to see how many GPUs they can fit in one computer/on one card. This will be the new benchmark - out the door, pipelines; see ya, memory; the new way to go is the "ATI 14 GPU XTREME LAVA HEAT CARD".
  • Voodoo 5 6000 anyone? Is this thing going to require its own external power brick just to run the board?
    • You just brought back many not so fond memories. The first really high-end card I ever bought was a Voodoo 5 5500. That damn thing never worked right and the drivers never really left beta, as 3dfx was absorbed by Nvidia about 6 months after its release, IIRC. I still have the card, the box and the manuals though, haha. It is still the biggest and heaviest card I own.
    • My GeForce FX 5700 card already has an external power connector.
    • Voodoo 5 6000 anyone? Is this thing going to require its own external power brick just to run the board?

      And the thrust from the cooling fans will be enough to power an executive jet.
  • welcome our Longhorn-running overlords
  • by Swamii ( 594522 ) on Friday May 27, 2005 @11:45AM (#12655711) Homepage
    Gigabyte has stated they will throw in a free Nuclear Power Plant to help pay for power consumption when you buy one of their 4-card chipsets.
  • by LegendOfLink ( 574790 ) on Friday May 27, 2005 @11:46AM (#12655717) Homepage
    Remember Carmack promising us real-time rendering for full CG movies? Can you imagine a game with the visuals of the Shrek series?

    Personally, as an old-skool gamer, I'm hoping that if it ever comes to that, gameplay won't be completely forgotten, as the ratio of gameplay to graphics seems to diminish every day.
    • by Anonymous Coward

      Can you imagine a game with the visuals of the Shrek series?

      How many graphics cards do I need not to see that though?

    • As graphics get closer to "good enough" reality, games will *have* to focus on gameplay over eye candy.

      • As graphics get closer to "good enough" reality, games will *have* to focus on gameplay over eye candy.

        Not if you have enough hyped-up (testosterone-charged) pre-teen boys wanting the latest and greatest visuals. Marketers have only three adjectives to describe their product(s):

        Latest, bestest, greatest.

        The marketers have this all sewn up, and it don't take too many brains to figure it out.

        Get 5%, make noise, look cool and the rest will follow. Do you actually think that the Beatles phenomenon
    • Comment removed (Score:5, Insightful)

      by account_deleted ( 4530225 ) on Friday May 27, 2005 @12:00PM (#12655895)
      Comment removed based on user account deletion
      • There have ALWAYS been a huge number of "games" with horrible gameplay. The only difference now is that the crap looks nice.

        I always thought the main gripe was that nowadays the crap looks nice and costs millions and millions of dollars to produce. What you end up with, therefore, is a gaming industry that's become a big-money factory system run by huge media conglomerates who A.) overwork their employees and B.) are highly risk averse, meaning they are far more likely to produce mediocre games based o


      • It always makes me laugh to hear "old-school" gamers complain about companies putting graphics ahead of gameplay.


        No, those are newbies. Old timers complain about having graphics in games at all.

        hawk who understands that nethack is the only game that matters
      • dodge this stuff and shoot this stuff

        So, the measure of gameplay is now a matter of how many different elements are in one game?

        Lots of old games are good. Lots of new games are good. But a game doesn't have to have 80-million different things to do in it in order to be good. Some might say that focusing on one thing and doing it well more often makes for a superior gaming experience than trying to cram everything the developers ever thought of into the context of the game.
  • Why? (Score:5, Insightful)

    by nmg196 ( 184961 ) * on Friday May 27, 2005 @11:47AM (#12655731)
    Can anyone think of a reason why you need more than one of these cards? Currently my machine runs the most complex game I can think of (Half-Life 2) at 1280x960 at more frames per second than my monitor even scans at.

    Why would you need it to be 4 times faster than that?

    OK, I can see that a handful of people might want to play at 1600x1200 if they have a decent monitor, but usually, running at resolutions higher than that is fairly pointless unless you have a 21" or bigger monitor. The average monitor can't do resolutions that large without blurring the pixels together from what I've seen.
    • Re:Why? (Score:3, Funny)

      Blurring the pixels results in free real time antialiasing!
    • Re:Why? (Score:4, Insightful)

      by MoralHazard ( 447833 ) on Friday May 27, 2005 @11:55AM (#12655824)
      Half-Life 2 is the most complex game you can think of, right now. Shit, I remember people saying the exact same thing about the ATI Rage 128 and the original GeForce, right about the time the first Half-Life game came out.

      3D game animation is one of the few areas in which ordinary PC consumers run programs that routinely push the limits of their machines. Your machine might be enough to run HL2 perfectly well, but just give it a year or two. Game designers WILL push the envelope of technology, and your machine will eventually struggle to play the newest games.

      Remember, Gigabyte isn't shipping this Quad-GPU motherboard, yet. This might not hit shelves until next year. At which point it still might be overkill, but it'll be ready for the next-gen games.
      • "Remember, Gigabyte isn't shipping this Quad-GPU motherboard, yet. This might not hit shelves until next year."
        But by next year nVidia will have the next generation of video chip out. Gigabyte is using the 6600GT. Isn't the 6800 Ultra out already? Would four 6600GTs give you more power than two 6800 Ultras?

          Would four 6600GTs give you more power than two 6800 Ultras?

          Now how the heck am I supposed to know that?
          • Since you are posting your opinion, I thought you might have some facts to back it up.
            The specs for a 6800 Ultra are:
            Memory 512 MB
            Memory Bandwidth 33.6 GB/sec.
            Fill Rate 6.4 billion texels/sec.
            Vertices per Second 600 Million
            Memory Data Rate 1050 MHz
            Pixels per Clock (peak) 16
            Textures per Pixel* 16
            RAMDACs 400 MHz
            And for the 6600 GT:
            Memory Bandwidth 16.0 GB/sec.
            Fill Rate (texels/sec.) 4.0 billion
            Vertices per Second 375 million
            Memory Data Rate 1000 MHz
            Pixels per Cl
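            For what it's worth, here's a naive aggregate comparison built from the figures quoted above; it assumes perfectly linear scaling across GPUs, which no real SLI setup achieves, so treat it as a paper upper bound rather than a prediction:

```python
# Naive paper comparison of 4x 6600 GT vs 2x 6800 Ultra, using the specs
# quoted in the parent comment and assuming perfect linear scaling.
gt_6600 = {"fill_gtexels_s": 4.0, "mem_bw_gb_s": 16.0, "vertices_m_s": 375}
ultra_6800 = {"fill_gtexels_s": 6.4, "mem_bw_gb_s": 33.6, "vertices_m_s": 600}

def scale(card, n):
    """Multiply every figure by the number of GPUs (pure wishful thinking)."""
    return {key: value * n for key, value in card.items()}

print("4x 6600 GT   :", scale(gt_6600, 4))     # 16.0 Gtexels/s, 64.0 GB/s, 1500 M verts/s
print("2x 6800 Ultra:", scale(ultra_6800, 2))  # 12.8 Gtexels/s, 67.2 GB/s, 1200 M verts/s
```

            On paper the quad 6600 GT setup wins on fill rate and vertex throughput but has slightly less total memory bandwidth; in practice SLI overhead and duplicated memory would narrow or erase that gap.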
    • Never used one of those new 24" Widescreen Dell LCDs, have you? 12ms response time, 1920x1200 resolution, and all the pixels really exist...
      • Re:Why? (Score:3, Funny)

        by pla ( 258480 )
        Never used one of those new 24" Widescreen Dell LCDs, have you? 12ms response time, 1920x1200 resolution, and all the pixels really exist...

        Yeah, but after buying one of those, you can't afford a quad-GPU system.

        Or games.

        Or food.
        • Spending ~$1,000 isn't that much for an LCD of that size and quality which will last for more than 3 years. That's especially the case when you consider the insane amounts of money other "normal" people have no problem wasting on options for their status symbol cars. So, if you're the type of person who stares at a screen for >8 hours per day, it's a good investment.

          The 24" Dell's are among the best available today (I've done my research but am still waiting for my 3.5yr-old 19" CRT to die first); most

          • The problem with low-response-time LCDs is that they generally compromise on image quality. While the 25 ms 213T does ghost when playing games, it is also one of the few LCD monitors that has image fidelity good enough for my eyes.

            In the quest for the marketability of low response times, LCD manufacturers have been moving to LCD panel designs that just don't deliver image quality.

            • I now own one of those Dell 1920x1200 24" monitors.

              Eizo is traditionally the Rolls-Royce of monitors for image quality.

              The image quality of my Dell 2405FPW is just as good as, if not better than, my 19" Eizo L675 monitor that cost nearly $4000 a coupla years ago.
    • Re:Why? (Score:3, Insightful)

      by Agave ( 2539 )
      There are monitors smaller than 21"?? :)

      I will never understand why someone will spend $500+ on a videocard and then skimp on the monitor.
    • Re:Why? (Score:4, Insightful)

      by UnknowingFool ( 672806 ) on Friday May 27, 2005 @11:58AM (#12655869)
      I can only think of multiple displays with multiple monitors. Instead of one card handling all the monitors, each monitor (or group of monitors) is handled by a separate card, i.e. one for right, one for center, one for left, one for top or behind. For gamers, this could be used to create panoramic views. It could also be used for large multi-monitor demo displays, but for the average person, I don't see a big need.
      • In SLI configurations, only the first graphics card can be used to output a signal.

        All traffic that goes out from the other boards goes across the SLI bus.
    • Re:Why? (Score:3, Interesting)

      by eht ( 8912 )
      HD video output is 1920x1080; do real-time rendering of movies - why send an extremely large pre-rendered movie when you can send just a couple of scene files (though, like compiled code vs. source code, they could end up being larger than the compiled version)?

      Play game with real HD graphics.

      Don't limit the idea to just computer monitors.

      TV stations could use it to make real-time HD talking heads: your anchorwoman is sick, but she signed a release allowing her features to be used when she is sick, so you render her, or have her be
      • >> Play game with real HD graphics.

        wooo. HD = 1920*1080

        I already play UT2004, Doom3, Halflife2 etc at 1920*1200 silky smooth with just a single BFG 6800 ultra card.
      • Just as an aside, real-time rendering of movies from the scene files would indeed be bigger than the resultant movie, probably by quite a margin - mainly because of textures, unless they're all procedural...
    • Re:Why? (Score:5, Funny)

      by aliquis ( 678370 ) on Friday May 27, 2005 @12:01PM (#12655896)
      Yeah, and also, why do you need more than 640kB of ram? really? ;)

      (on topic: even more advanced games will be released)
    • What about when games let you use more than one monitor? Either a side-by-side configuration, or a left-center-right config would be awesome, giving you peripheral vision. I think the 3-monitor setup would be a great way to use a big/expensive center monitor and a couple of smaller/cheaper ones on the sides.
    • " Can anyone think of a reason why you need more than one of these cards? Currently my machine runs the most complex game I can think of (HalfLife 2) at 1280x960 at more frames per second than my monitor even scans at."

      Running 4 copies of it? Joking, but not entirely. One of these days Valve might wise up and make their games more competition-friendly like QuakeWorld, which has a nice split-screen mode that makes spectating matches a lot easier.

      That, and of course if videocards were only made to run the
    • Doom 3 is more graphically complex/intense than Half-Life 2 at times. And this is all we have today - games don't even have realistic soft shadows yet. This is an 'investment in the future'. Also, some games actually play differently the higher your frame rate. Also, things like anti-aliasing and anisotropic filtering. Compare This [tomshardware.com]. The difference between SLI and non-SLI is huge. If you're a graphics card whore (aka 1600x1200x6AAx16AF), this is the way to go. Now, imagine buying one of these things now. Next y
    • Just FYI, I have a Radeon X800 XT and an IBM P275 21-inch monitor that runs Half-Life 2 at 2048x1536@75Hz. Silky smooth, unless I crank up the anti-aliasing, and yes, even at that high res, AA DOES improve image quality. Nonetheless, the X800 does a decent job, usually 40-80 FPS, with few dips below that.
    • Can anyone think of a reason why you need more than one of these cards?

      I bet you can't understand why you need a monkey with four asses [spscriptorium.com] either! Some people just don't get it!

    • Can you imagine how cool it would be to be able to do full ray tracing in a large complex game world in REAL TIME???

      As it is ray tracing a single frame of a medium polycount room takes a fair bit of time...

    • The average monitor can't do resolutions that large without blurring the pixels together from what I've seen.

      Keep in mind that people doing SLI won't have "average" monitors.

      I don't have SLI but I love playing at 1600x1200 on my 21" CRT. I wish I could play at 2048x1536, but I don't know if games support that.
    • It's a good question.

      Well, the Source engine in HL2 was designed to run acceptably on middle-of-the-road hardware. The excellent performance of your card may be, in part, due to efficiencies in the underlying engine. In my game group, the most GPU-taxing game has been Doom 3, hands down. Also, speed isn't everything. There's better-quality lighting, shading, texture/bump mapping, AA, and all the rest. It may be overkill today, but I'd really like to see how much better things could look -- regardless of s

  • ...then the PC will again be a cooler thing to play games on than the PSP and Xbox 2! I suspect the PSP and Xbox 2 will be cooler for a while now, until all computers have quadruple or more GPUs...
    • The GPUs that the two consoles use are made by nVidia and ATI, and you can bet your ass that before or shortly after the consoles have been released, you'll be able to get them in the form of (bloody expensive) PCIe cards. Considering that ATI are now also working on multi-GPU technology, the results from both should be interesting.
  • by brotherscrim ( 617899 ) on Friday May 27, 2005 @11:50AM (#12655767) Journal
    The new Gillette MACH 6© 6 GPU motherboard, with comfort strip.
  • ...but I can think of a lot more important things to do with 4 8x PCIe lanes than dual-SLI. Like pumping several dual-input monitors, or perhaps up to 8 single-input displays.

    -theGreater.
    • No, it's not just you.

      I'm waiting for somebody to suggest that you could take a single PC, provide it with four graphics cards, plug a USB hub and four sets of keyboard and mice into it, and use it to serve four users.

      It's "revenge of the mini"...!
  • I can't afford even one new video card!!!

    I am still running my good ole Voodoo3 3500 w/tv-IN/OUT. For a Linux desktop though, it still kicks major butt!
  • Is this four GPUs driving a single display? What is this SLI stuff?
    • by kebes ( 861706 ) on Friday May 27, 2005 @12:29PM (#12656160) Journal
      The 4 GPUs are on two dual-core cards. You could use this in an SLI setup to run a single monitor with ridiculous amounts of graphics power, or two monitors with still amazing graphics rendering, or more monitors if you wanted to, I suppose.

      SLI is Scalable Link Interface. [wikipedia.org] It's a way to have two video cards running a single display. If, for instance, you have a video game with really high graphics requirements, but you don't want your frames-per-second (fps) to drop, then you could use the two graphics cards to render alternating frames (a toy sketch of the idea appears below). That way, you have a high frame rate combined with the best graphics. In theory you can double the graphics complexity of whatever you are trying to render. In practice, of course, it can be hard to get it running, and for many games/applications it won't make any difference whatsoever. It's still very much a "power gamer" setup, only for people who (1) have the money, (2) like tinkering, (3) enjoy being "bleeding edge" just for the heck of it, (4) really like their games to look slick... at any cost!

      Despite the fact that SLI is currently seen to be sorta frivolous by many, it's quite possible that SLI (or multi-GPU cards) will become common in the future, and will in fact be required to play modern games.
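      Here is that toy sketch of the alternate-frame idea; the render_on function is invented purely as a stand-in for the driver handing a frame to a GPU:

```python
# Alternate-frame rendering (AFR) in miniature: successive frames are dealt
# out to the GPUs round-robin, so with n GPUs each one only has to finish
# every n-th frame on time.
NUM_GPUS = 4  # e.g. two dual-GPU cards linked with SLI

def render_on(gpu_id: int, frame: int) -> None:
    # Stand-in for submitting one frame's work to a particular GPU.
    print(f"GPU {gpu_id} renders frame {frame}")

for frame in range(8):                  # the first eight frames of a game
    render_on(frame % NUM_GPUS, frame)  # GPUs 0,1,2,3,0,1,2,3,...
```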
      • Very informative - thanks!
      • Sadly, no one yet offers a solution that provides memory-sharing between GPUs, and you end up using a lot of very expensive, fast memory to duplicate the same texture and geometry data on each card.

        It seems like a colossal waste to me; given that the memory represents a large proportion of the cost of a high-end gfx card, one would think they'd borrow some knowledge from SMP designs to make better use of it.
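        To put a rough number on that duplication (the 256 MB per card below is just an example figure):

```python
# With SLI-style rendering every GPU keeps its own full copy of the textures
# and geometry, so buying more cards buys more silicon but no extra usable
# capacity for the working set.
mem_per_card_mb = 256  # illustrative figure, not a specific card's spec

for n_cards in (1, 2, 4):
    purchased = n_cards * mem_per_card_mb
    usable = mem_per_card_mb  # each card must still hold the whole working set
    print(f"{n_cards} card(s): {purchased} MB of RAM bought, "
          f"~{usable} MB of unique data held")
```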
        • While dual-GPU is relatively new (not including the Voodoo SLI), the only way sharing the same data between GPUs could be achieved would be by giving them access to the same set of RAM. Unfortunately, while it saves money, I would imagine that given the amount of data that has to be read from the RAM, the performance loss would be a problem.
      • Last time I checked, each card had two GPUs, with RAM for each. I don't think they're dual-core - nVidia would have had to have made them specially.
      • Despite the fact that SLI is currently seen to be sorta frivolous by many, it's quite possible that SLI (or multi-GPU cards) will become common in the future, and will in fact be required to play modern games.

        Graphics cards already are massively parallel. The level of parallelisation will only increase, but I think there are more efficient ways of increasing performance than duplicating everything - for instance, it's just extremely wasteful to have individual memory per card. It's necessary for running d
  • by pg110404 ( 836120 ) on Friday May 27, 2005 @12:02PM (#12655914)
    .....and a hell of a lot of porn. How sweet is that?
  • Gallium costs around US$500/kg. It's hard to say how much they would need in this product, but it wouldn't surprise me if the gallium alone adds $30-50 to the cost.
    • At least it's not quite as bad.
      The $500/kg is for semiconductor-grade gallium (99.9999% pure).
      As the material used should be a gallium alloy (to make it liquid at 20°C), purity should not be an issue, so industrial-grade gallium at $150/kg can be used.
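      Taking those figures at face value, the implied amount of metal is small; this is just arithmetic on the numbers in the two comments above:

```python
# How much gallium would add $30-50 to the cost at $500/kg, and what would
# that same mass cost at the $150/kg industrial grade?
price_semi = 500.0  # $/kg, 99.9999% pure
price_ind = 150.0   # $/kg, industrial grade

for added_cost in (30.0, 50.0):
    grams = added_cost / price_semi * 1000
    industrial_cost = grams / 1000 * price_ind
    print(f"${added_cost:.0f} at $500/kg -> {grams:.0f} g of gallium "
          f"(~${industrial_cost:.0f} at $150/kg)")
# -> $30-50 implies roughly 60-100 g of gallium, or about $9-15 at the
#    industrial-grade price.
```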

  • by vectorian798 ( 792613 ) on Friday May 27, 2005 @12:20PM (#12656077)
    I know a lot of you are gonna be saying that there is no mobo with two x16 PCI-E slots so let me point out one right now:

    Tyan Thunder K8WE [tyan.com] - definitely the top of the line for dual-Opteron mobos right now IMHO.

    Anyways, the reason this is a stupid idea is of course that as soon as someone 'upgrades' to this and squeezes out a frame rate higher than our monitors can display or our eyes can detect, we will have our next-gen cards and games.

    Next-gen cards of course will have hardware features (read: steeped in the architecture) that, no matter what you do, this generation of cards won't be able to support. For example, think of the GeForce 4MX versus the GeForce 3 Ti 200. As you may know, the 4MX does not have any shaders and the Ti 200 does. Even if I bundled up four 4MXs, I would not be able to render reflective water in Far Cry or Half-Life 2 (assuming the game in question allowed it with our inferior GPU in the first place) simply because there is no dedicated hardware for volumetric per-pixel effects.

    So then, instead of getting more GPUs (or spending money on a more expensive mobo just to be able to SLI), people should just wait until we actually need that extra juice - and now certainly is not the time. I recall that in one of the Unreal 3 Engine demos from a long while back, someone commented that the 6800s would run U3 like crap even on low settings (I think they said 25 FPS).
  • Quad PCI GPUs offer the best $:MFLOPS we can get, especially for PCs. Who's got SW to harness their linear algebra engines to run a pool of LAME MP3 encoder engines, without bugging the CPU one bit?
  • This is seriously good news for solid modelers and animators. These are two fields where you can never have enough horsepower. It also may prove useful in rendering farms (nvidia is working on hardware acceleration for render farms).

    On the downside, you can only use one monitor in SLI mode, and most pros would rather saw off their own genitals than go to a single monitor setup. The workaround would be to grab an older PCI card for the secondary display device. Kinda sux.

    BBH
  • Nothing in the article even implied that both pairs of GPUs would subsequently be merged a second time so that all four GPUs were processing the one image. Or did I miss something?

"Protozoa are small, and bacteria are small, but viruses are smaller than the both put together."

Working...