Four GPU Motherboard 220
didde writes "The people over at Tom's Hardware are running a story on Gigabyte's experiments with quadruple GPUs on one motherboard. Perhaps we'll need something cooler than liquid metal to keep this beast from running hot?" From the article: "About half a year ago, we learned that Gigabyte was working on a graphics card that integrates two GeForce 6600GT graphics chips. While we were impressed with the out-of-the-box approach from Gigabyte, there was of course the question of whether two of those cards could be combined for a total of four graphics chips."
Quad Cards? (Score:3, Interesting)
Re:Quad Cards? (Score:2, Insightful)
It's the same concept as a Beowulf supercomputer.
With the possibility of parallelism, we can use cheaper cards in tandem and get the same power as a high-end graphics card (or one that doesn't exist) for far less money.
It also helps with failure: if one node fails, you can simply replace it rather than losing the entire system (your $1000 graphics card).
Redundant systems and parallel computing are the wave of the future wooooooo!
Re:Quad Cards? (Score:2, Insightful)
Re:Quad Cards? (Score:4, Informative)
Re:Quad Cards? (Score:5, Informative)
Re:Quad Cards? (Score:2)
I wouldn't be surprised if a quad machine lasted a cool decade with the current rate of technological advances in PCs.
Limitations (Score:5, Informative)
I would hope that they would be able to get these running on all SLI boards; I've always thought one of the main strengths of building your own PC was the compatibility between different brands of components.
Re: (Score:2)
My God! (Score:5, Funny)
Re:My God! (Score:3, Funny)
Re:My God! (Score:2)
ATI and NVIDIA will get into a race to see how many GPUs they can fit in one computer/on one card. This will be the new benchmark - out the door pipelines, see ya memory, the new way to go is the "ATI 14 GPU XTREME LAVA HEAT CARD".
Re:My God! (Score:2)
I'm waiting for it to go to 11!
---
telnet://sinep.gotdns.com [gotdns.com] -- Play TW2002 and LORD!
Re:My God! (Score:4, Informative)
Rendering through multiple pipelines at a time lets you increase rendering speed almost linearly, as long as you accept the trivial restriction that you must get the same image as output no matter what order you render the pixels in. It's the opposite of the CPU business: in CPUs, they started adding multiple chips first, while on the GPU side of things, they added multiple cores (or their equivalent) first. This is partly because it's easier to decode one instruction at a time and broadcast the decoded signal to every pipeline than it is to decode multiple instructions and route them to the correct cores. That isn't to say it's impossible, but a couple million more transistors are enough to make you think twice about whether you really want two cores.
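The near-linear scaling the parent describes can be sketched in a few lines (an illustrative toy, not how a real GPU is programmed): since each pixel depends only on its own coordinates, scanlines can be dealt out to any number of "pipelines" and merged back, and the output is identical regardless of pipeline count or completion order.

```python
from concurrent.futures import ThreadPoolExecutor

WIDTH, HEIGHT = 8, 8

def shade(x, y):
    # Stand-in for a per-pixel shader: depends only on (x, y).
    return (x * 31 + y * 17) % 256

def render_rows(rows):
    # Each "pipeline" shades its own disjoint set of scanlines.
    return {y: [shade(x, y) for x in range(WIDTH)] for y in rows}

def render(num_pipelines):
    # Deal scanlines round-robin across pipelines, render in parallel,
    # then merge. Because pixels are independent, the merged frame is
    # the same no matter how many pipelines ran or which finished first.
    buckets = [range(p, HEIGHT, num_pipelines) for p in range(num_pipelines)]
    frame = {}
    with ThreadPoolExecutor(max_workers=num_pipelines) as pool:
        for part in pool.map(render_rows, buckets):
            frame.update(part)
    return [frame[y] for y in range(HEIGHT)]

assert render(1) == render(4)  # same image regardless of pipeline count
```

The "trivial restriction" is what makes this work: with no dependencies between pixels, the only serial cost is the final merge.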
Re:My God! (Score:4, Informative)
Voodoo 5 (Score:2)
Re:Voodoo 5 (Score:2)
They already do. (Score:2)
Re:Voodoo 5 (Score:2)
And the thrust from the cooling fans will be enough to power an executive jet.
Re:Voodoo 5 (Score:2)
Current GeForces are designed to be parallelizable, kinda like Opterons or Xeons.
I for one (Score:2)
Re:I for one... (Score:2)
In other news... (Score:5, Funny)
Re:In other news... (Score:2)
So...how much longer until... (Score:5, Interesting)
Personally, as an old-skool gamer, I'm hoping that if it ever comes to that, gameplay won't be completely forgotten, as the ratio of gameplay to graphics seems to diminish every day.
Re:So...how much longer until... (Score:2, Funny)
Can you imagine a game with the visuals of the Shrek series?
How many graphics cards do I need not to see that though?
Re:So...how much longer until... (Score:3, Insightful)
Re:So...how much longer until... (Score:2)
As graphics get closer to "good enough" reality, games will *have* to focus on gameplay over eye candy.
Not if you have enough hyped-up (testosterone-charged) pre-teen boys wanting the latest and greatest visuals. Marketers have only three adjectives to describe their product(s):
Latest, bestest, greatest.
The marketers have this all sewn up, and it don't take too many brains to figure it out.
Get 5%, make noise, look cool and the rest will follow. Do you actually think that the Beatles phenomenon…
Comment removed (Score:5, Insightful)
Re:So...how much longer until... (Score:3, Insightful)
I always thought the main gripe was that nowadays the crap looks nice and costs millions and millions of dollars to produce. What you end up with, therefore, is a gaming industry that's become a big-money factory system run by huge media conglomerates who A.) overwork their employees and B.) are highly risk averse, meaning they are far more likely to produce mediocre games based o…
Re:So...how much longer until... (Score:2)
It always makes me laugh to hear "old-school" gamers complain about companies putting graphics ahead of gameplay.
No, those are newbies. Old timers complain about having graphics in games at all.
hawk who understands that nethack is the only game that matters
Re:So...how much longer until... (Score:2)
Re:So...how much longer until... (Score:2)
besides, nethack is the natural evolution of rogue.
(however, I'm still skeptical of the color and ASCII animation of spells . .
hawk
Re:So...how much longer until... (Score:2)
So, the measure of gameplay is now a matter of how many different elements are in one game?
Lots of old games are good. Lots of new games are good. But a game doesn't have to have 80-million different things to do in it in order to be good. Some might say that focusing on one thing and doing it well more often makes for a superior gaming experience than games that try to cram every thing the developers ever thought of into the context of the game.
Namco (Score:2)
But anyway, you shouldn't complain so much. Just like the great stories have already been told, the game ideas have already been done. They can be improved upon or redone, however, just like movies can be remade or stories can be retold with new novel twists on the characters, situations, or events. Games will continue to improve or maybe just change. But that's okay.
Why? (Score:5, Insightful)
Why would you need it to be 4 times faster than that?
OK, I can see that a handful of people might want to play at 1600x1200 if they have a decent monitor, but running at resolutions higher than that is usually fairly pointless unless you have a 21" or bigger monitor. From what I've seen, the average monitor can't do resolutions that large without blurring the pixels together.
Re:Why? (Score:3, Funny)
Re:Why? (Score:2)
Re:Why? (Score:4, Insightful)
3D game animation is one of the few areas in which ordinary PC consumers run programs that routinely push the limits of their machines. Your machine might be enough to run HL2 perfectly well, but just give it a year or two. Game designers WILL push the envelope of technology, and your machine will eventually struggle to play the newest games.
Remember, Gigabyte isn't shipping this quad-GPU motherboard yet. It might not hit shelves until next year, at which point it still might be overkill, but it'll be ready for the next-gen games.
That is what I do not understand. (Score:2)
But by next year nVidia will have the next generation of video chip out. Gigabyte is using the 6600GT. Isn't the 6800 Ultra out already? Would four 6600GTs give you more power than two 6800 Ultras?
Re:That is what I do not understand. (Score:2)
Now how the heck am I supposed to know that?
Re:That is what I do not understand. (Score:2)
The specs for a 6800 Ultra are:
Memory: 512 MB
Memory Bandwidth: 33.6 GB/sec
Fill Rate: 6.4 billion texels/sec
Vertices per Second: 600 million
Memory Data Rate: 1050 MHz
Pixels per Clock (peak): 16
Textures per Pixel: 16
RAMDACs: 400 MHz
And for the 6600 GT:
Memory Bandwidth: 16.0 GB/sec
Fill Rate: 4.0 billion texels/sec
Vertices per Second: 375 million
Memory Data Rate: 1000 MHz
Pixels per Cl…
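For what it's worth, here's the naive arithmetic on those quoted peaks (a toy Python comparison; it ignores SLI overhead and driver efficiency, and the figures are theoretical maxima):

```python
# Peak figures quoted above for each card.
gt_6600 = {"bandwidth (GB/s)": 16.0, "fill (Gtexels/s)": 4.0, "verts (M/s)": 375}
ultra_6800 = {"bandwidth (GB/s)": 33.6, "fill (Gtexels/s)": 6.4, "verts (M/s)": 600}

def aggregate(card, n):
    # Naive n-way scaling: multiply every peak figure by the GPU count.
    return {k: v * n for k, v in card.items()}

quad_gt = aggregate(gt_6600, 4)        # four 6600GT chips
dual_ultra = aggregate(ultra_6800, 2)  # two 6800 Ultras

for key in quad_gt:
    print(f"{key}: 4x6600GT = {quad_gt[key]}, 2x6800U = {dual_ultra[key]}")
```

On paper the quad-6600GT setup wins on fill rate (16.0 vs 12.8 Gtexels/s) and vertex rate (1500M vs 1200M), while the dual Ultras edge it on memory bandwidth (67.2 vs 64.0 GB/s). In practice, scaling is far from perfect.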
Re:Why? (Score:2)
Re:Why? (Score:3, Funny)
Yeah, but after buying one of those, you can't afford a quad-GPU system.
Or games.
Or food.
Re:Why? (Score:2)
The 24" Dells are among the best available today (I've done my research but am still waiting for my 3.5-year-old 19" CRT to die first); most…
Re:Why? (Score:2)
In the quest for the marketability of low response times, LCD manufacturers have been moving to panel designs that just don't deliver image quality.
Re:Why? (Score:2)
Eizo is traditionally the Rolls-Royce of monitors for image quality.
The image quality of my Dell 2405FPW is just as good as, if not better than, my 19" Eizo L675 monitor that cost nearly $4000 a coupla years ago.
Re:Why? (Score:3, Insightful)
I will never understand why someone will spend $500+ on a videocard and then skimp on the monitor.
Re:Why? (Score:4, Insightful)
Re:Why? (Score:2)
All traffic that goes out from the other boards goes across the SLI bus.
Re:Why? (Score:3, Interesting)
Play games with real HD graphics.
Don't limit the idea to just computer monitors.
TV stations could use it to make real-time HD talking heads: your anchorwoman is sick, but she signed a release to use her features in case she is sick, so you render her, or have her be…
Re:Why? (Score:2)
wooo. HD = 1920*1080
I already play UT2004, Doom3, Halflife2 etc at 1920*1200 silky smooth with just a single BFG 6800 ultra card.
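Quick pixel-count check on those resolutions (plain arithmetic, nothing vendor-specific): "HD" at 1920x1080 is actually fewer pixels than the 1920x1200 this poster is already driving.

```python
# Total pixels per frame at each resolution mentioned in the thread.
resolutions = {
    "1600x1200": 1600 * 1200,
    "1920x1080 (HD)": 1920 * 1080,
    "1920x1200": 1920 * 1200,
}
for name, pixels in sorted(resolutions.items(), key=lambda kv: kv[1]):
    print(f"{name}: {pixels:,} pixels")
```

So a card that handles 1920x1200 "silky smooth" already pushes about 11% more pixels per frame than 1080-line HD.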
Re:Why? (Score:2)
Re:Why? (Score:5, Funny)
(on topic: there will be released even more advanced games)
What about... (Score:2)
Re:Why? (Score:2)
Running 4 copies of it? Joking, but not entirely. One of these days Valve might wise up and make their games more competition-friendly, like QuakeWorld, which has a nice split-screen mode that makes spectating matches a lot easier.
That, and of course if videocards were only made to run the…
Re:Why? (Score:2)
Re:Why? (Score:2)
Re:Why? (Score:2)
I bet you can't understand why you need a monkey with four asses [spscriptorium.com] either! Some people just don't get it!
Real time ray tracing. (Score:2)
As it is, ray tracing a single frame of a medium-polycount room takes a fair bit of time...
Re:Why? (Score:2)
Keep in mind that people doing SLI won't have "average" monitors.
I don't have SLI but I love playing at 1600x1200 on my 21" CRT. I wish I could play 2048x1536, but I don't know if games support that.
Re:Why? (Score:2)
Well, the Source engine in HL2 was designed to run acceptably on middle-of-the-road hardware. The excellent performance of your card may be due, in part, to efficiencies in the underlying engine. In my game group, the most GPU-taxing game has been Doom3, hands down. Also, speed isn't everything. There's better-quality lighting, shading, texture/bump mapping, AA, and all the rest. It may be overkill today, but I'd really like to see how much better things could look -- regardless of s…
Re:Why? (Score:2)
Some of us do stupid things like that to get guys.
---
telnet://sinep.gotdns.com [gotdns.com] -- Come play TW2002 and LORD!
Re:Why? (Score:2)
Re:Why? (Score:2)
When this becomes standard... (Score:2)
Re:When this becomes standard... (Score:2)
What's next? (Score:5, Funny)
Re:What's next? (Score:2)
Maybe it's just me... (Score:2)
...but I can think of a lot more important things to do with 4 8x PCIe lanes than dual-SLI. Like pumping several dual-input monitors, or perhaps up to 8 single-input displays.
-theGreater.
Re:Maybe it's just me... (Score:2)
I'm waiting for somebody to suggest that you could take a single PC, provide it with four graphics cards, plug a USB hub and four sets of keyboard and mice into it, and use it to serve four users.
It's "revenge of the mini"...!
I can't afford 1 new GPU... (Score:2)
I am still running my good ole Voodoo3 3500 w/tv-IN/OUT. For a Linux desktop though, it still kicks major butt!
Could someone please explain how this works? (Score:3, Interesting)
Re:Could someone please explain how this works? (Score:5, Informative)
SLI is Scalable Link Interface. [wikipedia.org] It's a way to have two video cards running a single display. If, for instance, you have a video game with really high graphics requirements, but you don't want your frames-per-second (fps) to drop, then you could use the two graphics cards to render alternating frames. That way, you have high frame rate combined with the best graphics. In theory you can double the graphics complexity of whatever you are trying to render. In practice, of course, it can be hard to get it running, and for many games/applications won't make any difference whatsoever. It's still a very much "power gamer" setup, only for people who (1) have the money, (2) like tinkering, (3) enjoy being "bleeding edge" just for the heck of it, (4) really like their games to look slick... at any cost!
Despite the fact that SLI is currently seen to be sorta frivolous by many, it's quite possible that SLI (or multi-GPU cards) will become common in the future, and will in fact be required to play modern games.
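For the curious, the alternate-frame mode described above can be sketched as a trivial scheduler (a toy illustration; `afr_schedule` is a made-up name for this comment, not an NVIDIA API):

```python
# Toy sketch of alternate-frame rendering (AFR), one mode SLI supports:
# even frames go to one GPU, odd frames to the other, and the output
# stream interleaves them back in order.
def afr_schedule(num_frames, num_gpus=2):
    # For each frame index, return which GPU renders it.
    return [frame % num_gpus for frame in range(num_frames)]

print(afr_schedule(6))  # [0, 1, 0, 1, 0, 1]
```

Each GPU effectively gets num_gpus times the time budget per frame, which is where the "double the graphics complexity" claim comes from; in practice, synchronization and driver overhead eat into that.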
Mod Parent Up Please! (Score:2)
Re:Could someone please explain how this works? (Score:2)
It seems like a colossal waste to me; given that the memory represents a large proportion of the cost of a high-end gfx card, one would think they'd borrow some knowledge from SMP designs to make better use of it.
Re:Could someone please explain how this works? (Score:2)
Re:Could someone please explain how this works? (Score:2)
Re:Could someone please explain how this works? (Score:2)
Graphics cards already are massively parallel. The level of parallelisation will only increase, but I think there are more efficient ways of increasing performance than duplicating everything - for instance, it's just extremely wasteful to have individual memory per card. It's necessary for running d…
4 GPUs, 4 monitors......... (Score:5, Funny)
Pricey! (Score:2)
It's hard to say how much they would need in this product, but it wouldn't surprise me if the gallium alone adds $30-50 to the cost.
Not true. (Score:2)
The $500/kg figure is for semiconductor-grade gallium (99.9999% pure).
As the material used should be a gallium alloy (to make it liquid at 20C), purity should not be an issue, so industrial-grade gallium at $150/kg can be used.
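Working that backwards (simple arithmetic; the $30-50 figure is the grandparent's estimate and the per-kg prices are the ones quoted here), the implied gallium mass at each grade is:

```python
# Mass of gallium implied by a given dollar cost at a given price grade.
def implied_mass_kg(cost_usd, price_per_kg):
    return cost_usd / price_per_kg

for price in (500, 150):  # semiconductor vs industrial grade, $/kg
    low, high = implied_mass_kg(30, price), implied_mass_kg(50, price)
    print(f"${price}/kg: {low:.2f}-{high:.2f} kg")
```

At industrial-grade prices, a $30-50 gallium bill would mean roughly 0.2-0.33 kg of alloy in the cooler, which seems like a lot; the real amount (and cost) is probably lower.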
Bandwidth and the Potential of this Card (Score:4, Insightful)
Tyan Thunder K8WE [tyan.com] - definitely the top of the line for dual-Opteron mobos right now, IMHO.
Anyway, the reason this is a stupid idea is of course that as soon as someone 'upgrades' to this and squeezes out a refresh rate higher than our monitors can produce or our eyes can detect, we will have our next-gen cards and games.
Next-gen cards will of course have hardware features (read: steeped in the architecture) that this generation of cards won't be able to support, no matter what you do. For example, think of the GeForce 4MX versus the GeForce 3 Ti 200. As you may know, the 4MX does not have any shaders and the Ti 200 does. Even if I bundled up four 4MXs, I would not be able to render reflective water in Far Cry or Half-Life 2 (assuming the game in question allowed it with our inferior GPU in the first place), simply because there is no dedicated hardware for volumetric per-pixel effects.
So then, instead of getting more GPUs (or spending money on a more expensive mobo just to be able to SLI), people should just wait until we actually need that extra juice - and now certainly is not the time. I recall that in one of the Unreal 3 Engine demos from a while back, someone commented that the 6800s would run U3 like crap even on low settings (I think they said 25 FPS).
LAME GPUs (Score:2)
Re:LAME GPUs (Score:2)
Good News for CAD and Animators... But... (Score:2)
On the downside, you can only use one monitor in SLI mode, and most pros would rather saw off their own genitals than go to a single-monitor setup. The workaround would be to grab an older PCI card for the secondary display device. Kinda sux.
BBH
But do they all merge into one? (Score:2)
Re:So... (Score:2)
Re:So... (Score:2)
Re:So... (Score:2)
Re:So... (Score:2)
And let me guess, you used to bullseye womp rats with it back home, and they're not much bigger than two meters, right?
Re:So... (Score:4, Informative)
Well here's the list from NewEgg.
SLI Equipped Motherboards [newegg.com].
Why would 4X the GPUs require 65X the cooling? (Score:2)
Re:4 GPU (Score:2)
Covered in hot grits, no less! (Score:5, Insightful)
This is /., we've gotta give people like you something to whine about.
Obligatory speeling errur.
Natalie Portman? (Score:2)
Surely there's a picture of her naked in here somewhere. Possibly in... SOVIET RUSSIA.
Re:Kinda pissed at Gigabyte, actually... (Score:2)
Re:Pwnz0r (Score:2)
Re:One (perhaps four) Word(s) (Plus some others) (Score:2)
Re:And yet... (Score:2)
dude, you use a dial-up modem? You're sooo not the target market for these graphics cards. They're trying to sell these to people who can afford broadband. If you still remember when computers used real modems, they've already written you off as too poor to be a customer...
Re:Movies (Score:2)