Dual Video Cards Return 264
Kez writes "I'm sure many Slashdot readers fondly remember the era of 3dfx. SLI'd Voodoo 2s were a force to be reckoned with. Sadly, that era ended a long time ago (although somebody has managed to get Doom III to play on a pair of Voodoo 2s). However, Nvidia has revived SLI with its GeForce 6600 and 6800 cards. SLI works differently this time around, but the basic concept of using two cards to share the rendering work is the same. Hexus.net has taken a look at how the new SLI works, how to set it up (and how not to), along with benchmarks using both of the rendering modes available in the new SLI." And reader Oh'Boy writes "VIA's latest press tour stopped by in the UK, and TrustedReviews have some new information on VIA's latest chipsets for the AMD Athlon 64, the K8T890 and the K8T890 Pro, which supports DualGFX. But what has emerged is that DualGFX doesn't support SLI after all, at least not for the time being, since it seems nVidia has somehow managed to lock other manufacturers' chipsets out of working properly with SLI. VIA did, on the other hand, have two ATI cards up and running, although not in SLI mode."
New trend ? (Score:5, Insightful)
And we're not even speaking of how much power (wattage) these 'dual solutions' consume...
Re:New trend ? (Score:5, Funny)
Eventually, barring any further technological advances, perhaps we'll even resort to modular clustering. Once again, the enthusiast's computer will be larger than a refrigerator!
Note to self (Score:2)
Re:New trend ? (Score:2)
Re:New trend ? (Score:2)
What I am more interested in is whether we will see SMP boards supporting dual-core AMD64 CPUs. It could be interesting, since from what I have read the AMD64 is NUMA when using more than one CPU, but I would guess that a dual-core chip would be more of a UMA design.
Re:New trend ? (Score:3, Insightful)
Re:New trend ? (Score:2, Informative)
From wikipedia [wikipedia.org]
My memory differs from Wikipedia's: I seem to remember there being a Voodoo 4 4000, and I believe the 3dfx site listed it as the Voodoo 6 6000 and not a Voodoo 5 6000. Although most of my information came from looking around the 3dfx site back in the day, so it may be they listed cards that weren't actually released (like the 5000, which I remember seeing there too). you can f
Re:New trend ? (Score:2)
I'm no expert, but I was under the impression that the bottleneck in current graphics cards is the amount of memory and the speed of the bus that the texture data has to travel down. Apparently the actual 3D geometry is very easy to process; it's the rendering and its associated problems that slow things down.
Re:New trend ? (Score:2)
Re:New trend ? (Score:2)
We've seen dual-core GPUs already. What do you think a multi-pipeline GPU is?
Re:New trend ? (Score:2)
Isn't that what Zell Miller [about.com] runs on his PC?
Re:New trend ? (Score:2)
BIOS support permitting, you'll be able to drop the dual-core CPUs into existing boards (assuming the board itself supports it, which from the sounds of things some will with just a BIOS update).
Buy the second a year later (Score:5, Insightful)
Re:Buy the second a year later (Score:3, Funny)
Re:Buy the second a year later (Score:2)
A friend of mine did the same with CPUs back in the day. He bought a dual Pentium 2 board with one processor - together it ran about $400. Then a year or so later he bought two matched processors for less than $80 (combined) and gave the old one to his brother. We're still using the dual P2 system as a file server.
Re:Buy the second a year later (Score:2)
So 1 year later you can add more brute force, but chances are that you will be behind the technology curve.
They didn't work before. Nothing has changed since. They won't work now.
Dual cards are just a way for the card sellers to make more money. We see them now ONLY because it's easy to do on the PCI Express bus.
What I would like for
Re:New trend ? (Score:2, Informative)
Folding@home (http://folding.stanford.edu/ [stanford.edu]) is about to enter GPU-based folding:
http://forum.folding-community.org/viewtopic.php?p =75287#75287 [folding-community.org]
Interesting times ahead...
Re:New trend ? (Score:5, Insightful)
I don't think so. Quoting from Intel's web site: "Moore observed an exponential growth in the number of transistors per integrated circuit and predicted that this trend would continue." Many people assume Moore's Law states that the speed of processors will double every 18 months, and that the fact that it is becoming difficult to increase clock speed now means that Moore's Law is finished. However, increasing speed is a consequence of higher clock speeds and higher transistor counts. Dual cores mean you can increase the number of transistors per IC further and actually use them to do real work, rather than simply adding a huge cache (as was done with the latest Itanic). End result: more speed, a higher transistor count, and Moore's Law is still fine. In fact, dual cores could mean that the transistor count increases at greater than Moore's Law rates in the near term. Of course, some might question whether a siamesed pair of processors actually constitutes a single IC.....
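The exponential framing above is easy to sanity-check numerically. A quick sketch of the doubling arithmetic (using the popular 18-month figure, which is a common quotation rather than Moore's original wording):

```python
# Project transistor counts under Moore's observation:
# the count doubles roughly every `doubling_period` months.
def projected_transistors(start_count, months, doubling_period=18):
    """Exponential growth: start_count * 2^(months / doubling_period)."""
    return start_count * 2 ** (months / doubling_period)

# Starting from 100 million transistors, 36 months is two doublings:
print(projected_transistors(100e6, 36))  # 400 million
```

The point of the comment stands either way: whether the extra transistors go into a second core or a bigger cache, the count itself can keep doubling even while clock speeds stall.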
Re:New trend ? (Score:2)
Re:New trend ? (Score:3, Interesting)
As long as the software developers do (with their crazy per-processor schemes) it doesn't matter. Microsoft got that right in one go (I still don't like them, but they seem to do more right lately). Others will probably follow suit, at least for the PC/small server market.
And the rest is academic. Call it what you like, as long as it speeds up my PC and gives me better response time? Since the proc
Re:New trend ? (Score:2)
That's the rub right there. Moore's law under its broad interpretation - "computers get exponentially faster" - was great because new processors could run the same old programs with exponentially increasing speed. (Moore's law under its narrow interpretation - transistor count - is quite useless, since nobody cares about transistor counts per se).
N parallel processors are never as good as a single
Re:New trend ? (Score:2)
Power consumption (Score:5, Informative)
SLI power consumption can be significant! [anandtech.com]
Re:Power consumption (Score:2)
That's a lot, but still a damn sight better than double.
Re:Power consumption (Score:3, Informative)
Re:New trend ? (Score:2)
Re:New trend ? (Score:2)
Re:New trend ? (Score:2, Interesting)
A long time ago I had an Obsidian X-24 graphics card, which was basically an SLI Voodoo2 setup on one card that drew its power from a single PCI slot. It used so much power that my computer would just power off without warning quite frequently.
A 350 watt power supply fixed the problem (I had a 250 watt), and that was a LOT of power back then. Now I have a 400 watt Antec power supply which was the recommended solution f
Err you aren't trying hard enough. (Score:5, Insightful)
As a lover of flight sims I'll be first in line to buy a mother board that can support 10 video cards. Along with an array of cheap monitors I will finally have a wrap around view of the sim world. This can apply easily to any game.
First person shooters could finally have peripheral vision (one center and two on the sides) along with an inventory and map screen. That brings the grand total to five.
Driving games could finally have a true perspective instead of the stupid 3rd person or 1/3 screen in car view. So at least three monitors.
RTS resource monitors, sat view, and ground maps. Well that could become quite the array depending on how much you wanted covered. Say anywhere from 3-12 monitors.
Same for Massive Multiplayer Online Games. I could see a use without trying hard that would require at least six monitors.
You could double, triple or even quadruple up on the number of cards for any one monitor that requires higher-end graphics. There are always those twisted monkeys who come up with graphics that won't run on any one GPU these days. For example, those lovely to-the-horizon maps that show up in various games and add about 100 meters of high detail every year. I see another scenario where people boost their system's performance by picking up cheaper versions of cards they already own, to keep their graphics improving without breaking the bank. (We can all remember when GF 2 cards cost $400 each; that'll buy you 50 of them these days.)
Who could afford all this, you ask? Well, just about anyone these days. I've got a stack of 17-inch CRT monitors in the garage that I picked up for $5 apiece, just begging to be used. With the advent of sub-$100 video cards and cheap CRT monitors, not every output would have to be super high-res. Peripheral views, 2D maps, and inventory lists would be just fine on something equivalent to a GeForce 4 MX ($32 new). You could seriously enhance your gaming machine for the price of one top-of-the-line, latest-and-greatest video card from ATI/Nvidia.
So you keep your two monitor display, for me I'm going to check to see if the wiring in my computer room can handle the extra 10 monitors I plan on adding.
Dual mania! (Score:2)
Looks better on SLI Voodoo2's than on my Rad 9800. (Score:5, Funny)
Who to Trust (Score:5, Insightful)
Re:Who to Trust (Score:2)
See, it works out in the end.
Re:Who to Trust (Score:2)
-Both logic and RTFA will hurt me
Re:Who to Trust (Score:3)
If you look at the history of video cards, you will see that whenever they succeed in reaching the limit in one particular technology, they will continue to move on something else. First it was screen resolution, then pixel depth, followed closely by 2D pixblitting, then 3D acceleration, multi-texturing, then programmable vertex and finally fragment programs,
On framerates... (Score:2)
Re:On framerates... (Score:2)
Re:On framerates... (Score:2)
Re:On framerates... (Score:4, Insightful)
However, there's a much more important factor at work here that confounds the film-vs-video-card comparison: video game frames are not the same as film frames. The biggest problem in this regard is motion blur. Here's a little exercise. Try it out in real life if you have the equipment, or just think along through it:
Let's say you were to use a video camera and capture 30 frames in 1 second. The subject is your own hand, waving up and down quickly.
Now let's say you rendered a 1 second video using the 3D engine du jour, also 30 frames, of a hand waving up and down quickly.
If you were to look at the 30 film frames, they would not be crisp. Each one of them would likely exhibit motion blur. However, when played at a rate of 30fps, to the human eye, that motion blur looks smooth.
If you were to look at the 30 rendered frames, there is no motion blur. Each frame is rendered crisply. The problem with this is, when played at 30fps, instead of smoothly moving from one frame to the next, the hand appears to jump between frames. There is no intermediate data to allow a smooth flow from frame to frame.
There are two ways around this: first, you could simulate motion blur in the engine. Second, you can pump the FPS up high enough that there is intermediate data for your eye to take in and do the motion blur on its own. The former of these options seems much more likely.
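The engine-side option is often approximated with an accumulation buffer: render several samples spread across the shutter interval and average them. A toy sketch of that averaging idea (render_at is a made-up stand-in for a real renderer, here producing a 1D strip of pixels):

```python
# Sketch of accumulation-buffer motion blur: average several
# sub-frame samples to approximate the blur a film camera captures.

def render_at(t):
    # Toy "renderer": one bright pixel moving across an 8-pixel strip.
    frame = [0.0] * 8
    frame[int(t * 8) % 8] = 1.0
    return frame

def motion_blurred_frame(t0, t1, samples=4):
    """Average `samples` renders taken across the shutter interval."""
    acc = [0.0] * 8
    for i in range(samples):
        t = t0 + (t1 - t0) * i / samples
        for p, v in enumerate(render_at(t)):
            acc[p] += v / samples
    return acc

# A crisp frame has one sharp pixel; the blurred frame smears the
# same total energy over the pixels the subject crossed.
print(render_at(0.0))
print(motion_blurred_frame(0.0, 0.5))
```

The averaged frame is exactly the "intermediate data" the comment describes, baked into a single image instead of shown as extra frames.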
Re:On framerates... (Score:2, Insightful)
This way, when we've the desire to gawk at some doodad in the game world for three minutes at a time, we can enjoy it in full detail, but when you're being bumrushed by five beasties, your first reaction isn't to bask in the per-pixel lit glory. That is when the engine can crank down the detail and turn up the FPS (and potentially the amount of carbohydrates being pumped into your bloodstream.)
Possible reason... (Score:2, Interesting)
One more night of examining other motherboards and I decided to buy a mb based on nForce2Ultra chipset. After install
Intel & SLI (Score:5, Informative)
http://www.nvidia.com/object/IO_17070.html [nvidia.com]
I'm looking forward to a P4 NForce board.
Re:Intel & SLI (Score:2)
The only thing different here is that nVidia might introduce a cheaper way to get such a board.
Re:Intel & SLI (Score:2)
AlienWare (Score:5, Informative)
You can already buy from the alienware luxury collection some gaming systems featuring SLI
http://www.alienware.com/ALX_pages/choose_alx.asp
SLI != SLI (Score:4, Informative)
It does make me wonder if the technology is capable of truly scaling
However, given the cost, and comparing what the 6800 can handle by itself to the evolution of games, it appears to me that it will be no more costly to simply upgrade to a 6900/7000/whatever when it is required. I can easily get by for the next year or two on a 6800 Ultra, especially since I would need a new computer for an SLI setup anyway (I don't have PCI-E, though I do have PCI-X, but not for gaming needs). And I will be saving on electricity and mean time to failure (though that doesn't seem to be much of an issue with video cards).
Not saying I don't see the attraction, but I don't get anywhere NEAR interested in 3D gaming enough to be spending that kind of dough.
Re:SLI != SLI (Score:5, Funny)
I propose a new acronym, TSIRT which will be the standard of rendering performance, similar to the "LOC" (Library Of Congress) reference when comparing download speeds.
Re:SLI != SLI (Score:5, Funny)
Re:SLI != SLI (Score:3, Informative)
Actually, nvidia's solution does either [anandtech.com], based on their own testing of which performs better for a given game. The drivers include profiles for the 100 most popular 3D titles that state which technique to use.
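For what it's worth, the two techniques those profiles choose between divide the work very differently: split-frame rendering shares each frame between the cards, while alternate-frame rendering gives each card whole frames in turn. A toy sketch of the two scheduling ideas (the even half-split and simple alternation are simplifications for illustration; the real driver balances the split dynamically):

```python
# Toy illustration of the two SLI work-division modes.

def sfr_schedule(num_frames, scanlines=8):
    """Split-frame: card 0 gets the top half of every frame,
    card 1 the bottom half."""
    return [{0: range(0, scanlines // 2),
             1: range(scanlines // 2, scanlines)}
            for _ in range(num_frames)]

def afr_schedule(num_frames):
    """Alternate-frame: whole frames alternate between the two cards."""
    return [frame % 2 for frame in range(num_frames)]

print(afr_schedule(4))  # [0, 1, 0, 1]
```

SFR helps per-frame rendering load; AFR helps throughput but adds a frame of latency, which is roughly why different games favor different modes.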
Re:SLI != SLI (Score:2)
Everyone keeps getting caught up in the idea that you would have to use this with just one monitor. I see its potential much like a RAID controller's. Sure, you can have two drives run together twice as fast, but you can also use it to control the drives individually and increase the number you have. I can run either 2-3 RAIDs with 4 IDE devices on my computer or control 10 IDE/SATA devices wit
Re:SLI != SLI (Score:2)
AnandTech has a nice article [anandtech.com] about Nvidia SLI.
It includes an explanation of how it works, power consumption figures, and benchmarks of several games.
Re:SLI != SLI (Score:4, Informative)
Horizontal sweep is measured in kHz.
That, and the fact that CRT monitors lend themselves to horizontal divisions (top/bottom, not left/right) since they sweep top to bottom during refresh.
Double The Money (Score:5, Funny)
quad-card cash-vacuum (Score:2)
Interestingly, that might even work. According to the tests I saw (Anandtech or TechReport, can't remember), the PCIe videocards are only using about 4x of the available 16x anyway, so even with dual cards, they're only using half of the available PCIe lanes, so if they can figure out how to do it, quad cards _could_ work, in theory.
Not that you'd find enough suckers with enough money to make it worthwhile, I bet.
I just wish my recently-purchas
Re:quad-card cash-vacuum (Score:2)
Re:quad-card cash-vacuum (Score:2)
I don't know if the x16 graphics slot is handled differently than other PCIe slots. If so, you'd have to manage that somehow, as the upcoming SLI mobos have done. Really, I'd rather see chipsets with more than 20 PCIe lanes. I'd like a mobo with an x16 for graphics, say 2 x4 slots, and 2 x1 slot
Re:Double The Money (Score:2, Insightful)
You know what? Comments like yours are worthless. Thanks for your opinion that you think gaming isn't worth spending money on. The fact of the matter is, I am a gaming hobbyist. I like games, and I really like games running well on my rig. Setups like this push the dollar envelope, true, but how is it any worse than spending $1000 on a new golf driver?
Come to think of it, SLI is better than a driver because the improvements are evident and more dramatic compared to
Re:Double The Money (Score:2)
I would gladly buy two nVidia based PNY Quadro FX 4400 cards with a dual Opteron motherboard that supports SLI. I would use these two graphics cards in non-SLI mode most of the time so that they can drive 4 1600x1200 (or 2 3840x2400) screens at the same time, and use SLI only to simulate very detailed environments. I would also buy Matrox QID Pro cards that handle 4 monitors per card, a total of eight monitors. This setup would cost at least $5700 for the video cards alone ($900 for the QID Pro and about $2
Re:Double The Money (Score:5, Interesting)
"I know, let's make it so that if you buy a second one a year later, it'll work WITH the first one!"
No one needs to buy two right off the bat. One is usually more than enough for any modern game. But one for a few hundred now, and the other for less than $100 later? That's a bargain basement upgrade, and one that's far more sensible than getting the new mid-range card now, and the new mid-range card a year from now.
Now, if someone *wants* to buy two top of the line cards today, more power to them. They want the ultra-high-resolution games with all the effects cranked up, and they have the money. It makes their games look nicer, while my games run well enough. We both win, and Nvidia no longer sits on piles of unused chips.
Re:Double The Money (Score:2)
power consumption??? (Score:2, Insightful)
Should have invested in... (Score:5, Funny)
Dual webservers. Would have delayed the Slashdotting.
Re:Should have invested in... (Score:2)
Ironic? (Score:5, Insightful)
Re:Ironic? (Score:2)
Re:Ironic? (Score:2, Insightful)
I Feel Another Commercial Coming On (Score:5, Funny)
2 nvidia SLI cards: $600
Getting 4 FPS anyway because 40,000 people are on the same server as you: Priceless.
Re:I Feel Another Commercial Coming On (Score:3, Informative)
Read the articles. Look at the prices. Compare the benchmarks. Make the right decision.
That giant sucking sound... (Score:4, Funny)
I weep for that man's router.
Ouch on Costs! (Score:3, Informative)
What I'd like to see.. (Score:4, Insightful)
=Smidge=
Re:What I'd like to see.. (Score:2)
See also the UK "PC Pro" magazine (Score:4, Informative)
This month the UK "PC Pro" magazine has a review [pcpro.co.uk] of the Scan White Cobra [scan.co.uk] gaming machine.
This is a fine example of SLI running with jaw dropping performance...a quote from the review puts Doom 3 running at 98fps!
Now I know what I want for Christmas, just not a snowball's chance in hell of getting one! :)
-- Pete.
Re:See also the UK "PC Pro" magazine (Score:2)
A snowball would fare worse in a Scan White Cobra.
Hercules? (Score:2, Offtopic)
32x (Score:3, Interesting)
The PCI Express standard allows for 32x lanes. The nVidia SLI uses two 8x lanes. Wouldn't it be nice if a motherboard supported two (or more) 32x slots with 32x graphics cards working in parallel? Think ray tracing: at those bandwidths, and given that there is an ergonomic limit on how small a pixel on a display can be, the average triangle could be smaller than a pixel. This isn't true ray tracing, but the effect is there.
On a similar note, are GPUs a good platform for genuine ray tracing?
Re:32x (Score:5, Insightful)
Ray Tracing uses the CPU to do all of the work. Video chips are optimized to do a lot of "shortcuts" and "tricks" to render a scene, and the math is completely different. Trying to make them do something else is like trying to strap fins on a donkey and turn it into a fish.
A dual-core CPU, on the other hand, would work wonders on ray tracing.
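The CPU-bound math in question is mostly intersection tests, and each pixel's ray is independent, which is exactly why extra cores scale so well. A minimal ray-sphere test, the bread and butter of a ray tracer, looks something like this (the scene values are made up for illustration):

```python
import math

# Minimal CPU-style ray tracing primitive: ray-sphere intersection.
# Solves |origin + t*direction - center|^2 = radius^2 for t,
# assuming `direction` is normalized (so the quadratic's a == 1).

def ray_sphere_hit(origin, direction, center, radius):
    """Return the nearest positive hit distance, or None on a miss."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

# A ray shot down the z-axis at a unit sphere 5 units away
# hits its near surface at distance 4:
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # 4.0
```

A second core can simply take half the pixels; no shared state is needed beyond the scene description, which is read-only during the trace.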
Re:32x (Score:2)
Re:32x (Score:2)
Re:32x (Score:2)
GPGPU (Score:4, Interesting)
Check out http://www.gpgpu.org/ [gpgpu.org] for cool stuff. And if I'm not mistaken, it is already possible to use SLI.
Cheers,
May the GeForce be with you! (Score:3, Funny)
A case of nVidia acting on the SLI?
VIA SLI pictures Houston AMD Tech Tour October (Score:2)
SLI is a rip off. (Score:3, Interesting)
This simply forces you to get a new motherboard. Which I guess is a win for intel and nvidia eh?
Let's see: get dual cards, which requires a new motherboard, or wait and get a new video card that has dual GPUs, which takes about 10 minutes to install at most.
I bet you ATI will do the dual GPU solution first and nvidia will go "fuck we should have learned from 3dFX's voodoo 5500"
I had a 5000 series card; dual GPUs on the SAME card, amazing concept!
The dual Voodoo cards made sense in a day when you had a lot of spare PCI slots. But ever since we've gone to the methodology of a single graphics slot, it's not simply a matter of slapping in a new video card and connecting an SLI connector; you have to get a whole new motherboard.
I DO agree with a previous statement: if we could go up to 4 cards and 4 CPUs on a system, that kind of flexibility would be awesome.
Modern CPU's cannot handle this... (Score:2, Insightful)
If you get two 6800 GTs working together... well, if one GT is bottlenecked by most CPUs (the GPU actually has to wait a bit for the CPU to catch up), how can that CPU possibly keep up with two?
I say that we should wait to buy SLI technology until better CPU's come out, or if
Just what I needed (Score:4, Funny)
Two video cards = 2 x resolution? (Score:3, Funny)
Comment removed (Score:4, Insightful)
Re:SLI? (Score:2, Informative)
Re:SLI? (Score:3, Informative)
--
so really, who is hotter? Alley or Alley
Re:SLI? (Score:4, Informative)
--
So really, who is hotter? Alley or Alley's sister?
Re:TWO Cards! (Score:5, Funny)
I think that's the first time the actual moderation of a post has made me laugh more than the post itself.
Re:this is sweet (Score:4, Informative)
Each card renders half of the same image. So each card needs access to the full texture set.
So 2x256 cards still only gives you 256 megs for your textures.
Re:SLIing other GeForces (Score:3, Informative)
There is a bridge adapter for the cards; if you look around, they apparently come in PCB and ribbon styles, and connect to a funky new cutout on top of the card's PCB.
Re:SLIing other GeForces (Score:2)
Re:SLIing other GeForces (Score:2)
Is the SLI only compatible with the new GeForce 6x00 series or can you use an older GeForce set?
Only some of the new GeForce 6x00 cards (not all) can be used for SLI. You need a special connector on the cards.... also, there are no PCI express versions of older GeForce cards anyway AFAIK.
Re:why so little support for gamers? (Score:2)
I would absolutely love to have a dual desktop again, but mostly just because it was more responsive and handled multitasking under load so much more gracefully. There's little better in computing than having a processor running at 100% and still having a usable desktop running on the other pr
Re:why so little support for gamers? (Score:5, Informative)
Right now, the answer is pretty simple. If you want a game to use multiple processors at the same time, you need to include more than one execution thread--the programmer has to divide the work in such a way that two or more processors can do it. It's quite hard to build a multithreaded game; there was some SMP support in Quake III, but it wasn't very stable and didn't provide a huge performance boost.
With a multithreaded application, you have to guard against strange bugs that are very, very hard to fix. If your multithreaded application runs into a deadlock every hundred thousand frames or so, it will be next to impossible to isolate, and production will end up being slower than it already is. While I'm sure that writing multithreaded games will happen in the near future, I don't think it will catch on very quickly.
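To sketch why it's hard: even the simplest per-frame split needs a synchronization point, and any shared state the threads touch is a potential race or deadlock. The subsystem names below are hypothetical stand-ins, not Quake III's actual design:

```python
import threading

# Toy sketch of splitting per-frame work across two threads, the way
# a multithreaded game engine might divide independent subsystems.

def update_physics(state):
    state["physics_done"] = True  # stand-in for real physics work

def update_ai(state):
    state["ai_done"] = True  # stand-in for real AI work

def run_frame(state):
    """Run both subsystems in parallel, then join before rendering.

    The join is the easy part; the hard, rare bugs come from the two
    threads touching the same shared state without locking, or from
    two locks being taken in opposite orders (deadlock).
    """
    threads = [threading.Thread(target=update_physics, args=(state,)),
               threading.Thread(target=update_ai, args=(state,))]
    for t in threads:
        t.start()
    for t in threads:
        t.join()  # both must finish before the frame can be drawn
    return state

print(sorted(run_frame({})))  # ['ai_done', 'physics_done']
```

Here the two workers write different keys, so the split is safe; the moment both need to mutate the same entity list, you're in exactly the "every hundred thousand frames" territory the comment describes.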
Re:why so little support for gamers? (Score:2)
not useful. (Score:2)