Dual Video Cards Return
Kez writes "I'm sure many Slashdot readers fondly remember the era of 3dfx. SLI'd Voodoo 2's were a force to be reckoned with. Sadly, that era ended a long time ago (although somebody has managed to get Doom III to play on a pair of Voodoo 2's). However, Nvidia have revived SLI with their GeForce 6600 and 6800 cards. SLI works differently this time around, but the basic concept of using two cards to share the rendering work is the same. Hexus.net has taken a look at how the new SLI works and how to set it up (and how not to), along with benchmarks using both of the rendering modes available in the new SLI." And reader Oh'Boy writes "VIA's latest press tour stopped by the UK, and TrustedReviews have some new information on VIA's latest chipsets for the AMD Athlon 64, the K8T890 and the K8T890 Pro, which support DualGFX. But what has emerged is that DualGFX doesn't support SLI after all, at least not for the time being, since it seems Nvidia has somehow managed to lock other manufacturers' chipsets out of working properly with SLI. VIA did, on the other hand, have two ATI cards up and running, although not in SLI mode."
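(For those who haven't read up on it: the two SLI rendering modes the Hexus article benchmarks are split-frame rendering, where each card draws a portion of every frame, and alternate-frame rendering, where the cards take turns on whole frames. The toy Python sketch below is only meant to illustrate how the work gets divided; it is not how Nvidia's driver is actually implemented, and the scanline count is made up.)

FRAME_HEIGHT = 1024  # hypothetical scanline count

def split_frame_rendering(frame_number):
    """SFR: each GPU renders a slice of every frame (naive 50/50 split here;
    the real driver load-balances the split dynamically)."""
    half = FRAME_HEIGHT // 2
    return {
        "gpu0": "frame %d, scanlines 0-%d" % (frame_number, half - 1),
        "gpu1": "frame %d, scanlines %d-%d" % (frame_number, half, FRAME_HEIGHT - 1),
    }

def alternate_frame_rendering(frame_number):
    """AFR: the GPUs simply take turns rendering whole frames."""
    gpu = "gpu0" if frame_number % 2 == 0 else "gpu1"
    return {gpu: "frame %d, all scanlines" % frame_number}

for f in range(4):
    print("SFR:", split_frame_rendering(f))
    print("AFR:", alternate_frame_rendering(f))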
New trend? (Score:5, Insightful)
And we're not even talking about how much power (wattage) these 'dual solutions' consume...
Who to Trust (Score:5, Insightful)
power consumption??? (Score:2, Insightful)
Ironic? (Score:5, Insightful)
What I'd like to see.. (Score:4, Insightful)
=Smidge=
Buy the second a year later (Score:5, Insightful)
Re:New trend? (Score:5, Insightful)
I don't think so. Quoting from Intel's web site: "Moore observed an exponential growth in the number of transistors per integrated circuit and predicted that this trend would continue." Many people assume Moore's Law states that the speed of processors will double every 18 months, and that the fact that it is becoming difficult to increase clock speed now means Moore's Law is finished. However, increasing speed is a consequence of both higher clock speeds and higher transistor counts. Dual cores mean you can increase the number of transistors per IC further and actually use them to do real work, rather than simply adding a huge cache (as was done with the latest Itanic). End result: more speed, a higher transistor count, and Moore's Law is still fine. In fact, dual cores could mean that the transistor count increases faster than Moore's Law in the near term. Of course, some might question whether a siamesed pair of processors actually constitutes a single IC.....
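To put numbers on the observation quoted above, here's a back-of-envelope sketch. The 24-month doubling period and the 100-million-transistor starting point are assumptions for illustration, not figures from Intel.

def projected_transistors(start_count, months, doubling_period_months=24):
    """Project a transistor count that doubles every doubling_period_months."""
    return start_count * 2 ** (months / doubling_period_months)

# Hypothetical starting point: 100 million transistors on one IC.
for years in (1, 2, 5):
    count = projected_transistors(100e6, years * 12)
    print("%d years -> roughly %d million transistors" % (years, round(count / 1e6)))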
Re:Double The Money (Score:2, Insightful)
You know what? Comments like yours are worthless. Thanks for your opinion that gaming isn't worth spending money on. The fact of the matter is, I am a gaming hobbyist. I like games, and I really like games running well on my rig. Setups like this push the dollar envelope, true, but how is it any worse than spending $1000 on a new golf driver?
Come to think of it, SLI is better than a driver because the improvements are evident and more dramatic compared to less expensive solutions. It improves my overall gaming experience, and in my mind it is worth every penny.
Why don't you tell us what your hobbies are, so the collective group can crap all over them.
Re:New trend? (Score:3, Insightful)
Re:32x (Score:5, Insightful)
Ray tracing uses the CPU to do all of the work. Video chips are optimized around a lot of "shortcuts" and "tricks" for rendering a scene, and the math is completely different. Trying to make them do something else is like trying to strap fins on a donkey and turn it into a fish.
A dual-core CPU, on the other hand, would work wonders for ray tracing.
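To make the "CPU does all of the work" point concrete, here's a minimal ray tracing sketch: every pixel is an independent ray/geometry intersection solved with ordinary floating-point math, which is exactly the kind of embarrassingly parallel workload extra CPU cores soak up. The scene (one sphere, orthographic rays) is obviously a toy.

import math

WIDTH, HEIGHT = 32, 16
SPHERE_CENTER, SPHERE_RADIUS = (0.0, 0.0, 3.0), 1.0

def hits_sphere(origin, direction):
    """Solve the ray/sphere quadratic; True if the hit is in front of the origin."""
    ox, oy, oz = (origin[i] - SPHERE_CENTER[i] for i in range(3))
    dx, dy, dz = direction
    a = dx * dx + dy * dy + dz * dz
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - SPHERE_RADIUS ** 2
    disc = b * b - 4.0 * a * c
    return disc >= 0.0 and (-b - math.sqrt(disc)) / (2.0 * a) > 0.0

for y in range(HEIGHT):
    row = ""
    for x in range(WIDTH):
        # each pixel gets its own ray, fired straight down +z
        px = (x / WIDTH - 0.5) * 4.0
        py = (0.5 - y / HEIGHT) * 2.0
        row += "#" if hits_sphere((px, py, 0.0), (0.0, 0.0, 1.0)) else "."
    print(row)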
Modern CPUs cannot handle this... (Score:2, Insightful)
If you get two 6800 GTs working together... well, if one GT is already bottlenecked by most CPUs (the GPU has to sit and wait a little for the CPU to catch up), how can that CPU possibly keep up with two?
I say we should wait to buy SLI until better CPUs come out, unless you already have a dual-CPU setup, or even wait until dual-core CPUs come out.
Well, that sounds expensive to me, better start saving...
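The parent's point can be put into a crude frame-time model: the frame rate you see is capped by whichever side is slower, the CPU preparing each frame or the GPU(s) drawing it. The millisecond figures below are made up purely for illustration.

def fps(cpu_ms_per_frame, gpu_ms_per_frame, num_gpus=1):
    """Crude model: SLI splits the GPU work evenly, but the CPU work is not split."""
    effective_gpu_ms = gpu_ms_per_frame / num_gpus
    return 1000.0 / max(cpu_ms_per_frame, effective_gpu_ms)

print("1 GPU :", fps(cpu_ms_per_frame=16.0, gpu_ms_per_frame=20.0), "fps")
print("2 GPUs:", fps(cpu_ms_per_frame=16.0, gpu_ms_per_frame=20.0, num_gpus=2), "fps")
# With 16 ms of CPU work per frame, the second GPU only takes you from 50 fps
# to 62.5 fps -- the CPU becomes the ceiling, which is the complaint above.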
Re:Ironic? (Score:2, Insightful)
Err you aren't trying hard enough. (Score:5, Insightful)
As a lover of flight sims, I'll be first in line to buy a motherboard that can support 10 video cards. Along with an array of cheap monitors, I will finally have a wrap-around view of the sim world. This can easily apply to any game.
First-person shooters could finally have peripheral vision (one center monitor and two on the sides) along with an inventory and map screen. That brings the grand total to five.
Driving games could finally have a true perspective instead of the stupid third-person or 1/3-screen in-car view. So at least three monitors.
RTS resource monitors, sat view, and ground maps. Well, that could become quite the array depending on how much you wanted covered; say anywhere from 3 to 12 monitors.
Same goes for massively multiplayer online games. Without trying hard, I could see a use that would require at least six monitors.
You could double, triple or even quadruple up on the number of cards for any one monitor that needs higher-end graphics. There are always those twisted monkeys who come up with graphics that won't run on any one GPU these days; for example, those lovely to-the-horizon maps that show up in various games, which add about 100 meters of high detail every year. I see another scenario where people boost their system's performance by picking up cheaper versions of cards they already own, keeping their graphics improving without breaking the bank. (We can all remember when GeForce 2 cards cost $400 each; that'll buy you 50 of them these days.)
Who could afford all this, you ask? Well, just about anyone these days. I've got a stack of 17-inch CRT monitors in the garage that I picked up for $5 apiece, just begging to be used. With sub-$100 video cards and CRT monitors, not every output would have to be super high-res: peripheral views, 2D maps, and inventory lists would be just fine on something equivalent to a GeForce 4 MX ($32 new). You could seriously enhance your gaming machine for the price of one top-of-the-line, latest-and-greatest video card from ATI/Nvidia.
So you keep your two-monitor display; as for me, I'm going to check whether the wiring in my computer room can handle the extra 10 monitors I plan on adding.
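In case anyone actually wants to do that wiring check, here's a rough back-of-envelope calculation. The per-monitor wattage, PC draw, and circuit rating are assumptions, not measurements; check your own hardware and breaker.

CRT_WATTS = 80       # assumed draw of one 17" CRT
PC_WATTS = 400       # assumed draw of the gaming box itself
CIRCUIT_VOLTS = 120
CIRCUIT_AMPS = 15

monitors = 12        # the existing pair plus the 10 extra
total_watts = monitors * CRT_WATTS + PC_WATTS
total_amps = total_watts / CIRCUIT_VOLTS

print("Total draw: %d W, about %.1f A on a %d A circuit"
      % (total_watts, total_amps, CIRCUIT_AMPS))
# 12 * 80 + 400 = 1360 W, roughly 11.3 A -- close enough to a 15 A breaker that
# a dedicated circuit starts to look sensible.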
Re:On framerates... (Score:2, Insightful)
This way, when we have the desire to gawk at some doodad in the game world for three minutes at a time, we can enjoy it in full detail, but when we're being bumrushed by five beasties, our first reaction isn't to bask in the per-pixel-lit glory. That is when the engine can crank down the detail and turn up the FPS (and potentially the amount of carbohydrates being pumped into our bloodstream).
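A minimal sketch of that kind of dynamic detail scaling, assuming a single scalar "detail" knob and some invented thresholds; real engines expose many more levers (resolution, LOD, shadow quality, and so on).

TARGET_FPS = 60
detail_level = 1.0   # 1.0 = full per-pixel glory, 0.0 = bare minimum

def adjust_detail(measured_fps, detail):
    """Nudge the detail level toward whatever the hardware can sustain."""
    if measured_fps < TARGET_FPS * 0.9:     # falling behind: shed detail
        return max(0.0, detail - 0.1)
    if measured_fps > TARGET_FPS * 1.2:     # plenty of headroom: add detail back
        return min(1.0, detail + 0.05)
    return detail

for fps in (75, 72, 58, 45, 44, 59, 70, 80):   # fake once-a-second measurements
    detail_level = adjust_detail(fps, detail_level)
    print("measured %3d fps -> detail %.2f" % (fps, detail_level))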
Re:On framerates... (Score:4, Insightful)
However, there's a much more important factor at work here that confounds the film-vs-video-card comparison: video game frames are not the same as film frames. The biggest problem in this regard is motion blur. Here's a little exercise. Try it out in real life if you have the equipment, or just think it through:
Let's say you were to use a video camera and capture 30 frames in 1 second. The subject is your own hand, waving up and down quickly.
Now let's say you rendered a 1 second video using the 3D engine du jour, also 30 frames, of a hand waving up and down quickly.
If you were to look at the 30 film frames, they would not be crisp. Each one of them would likely exhibit motion blur. However, when played at a rate of 30fps, to the human eye, that motion blur looks smooth.
If you were to look at the 30 rendered frames, there would be no motion blur; each frame would be rendered crisply. The problem is that when played at 30fps, instead of smoothly moving from one frame to the next, the hand appears to jump between frames. There is no intermediate data to allow a smooth flow from frame to frame.
There are two ways around this: first, you can simulate motion blur in the engine. Second, you can pump the FPS up high enough that there is intermediate data for your eye to take in and do the motion blur on its own. The former of these options seems much more likely.
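One common way to fake the first option is temporal supersampling: render several sub-frame samples per displayed frame and average them. The sketch below only tracks the hand's position rather than whole images, and the sample count and wave frequency are made up, but it shows the idea.

import math

FPS = 30
SUBSAMPLES = 8    # renders blended into each displayed frame

def hand_position(t):
    """Hypothetical stand-in for rendering the waving hand at time t (seconds)."""
    return math.sin(2.0 * math.pi * 2.0 * t)   # waving at 2 Hz

def blurred_frame(frame_index):
    """Average SUBSAMPLES positions spread across the frame's 1/FPS exposure."""
    t0 = frame_index / FPS
    samples = [hand_position(t0 + i / (SUBSAMPLES * FPS)) for i in range(SUBSAMPLES)]
    return sum(samples) / len(samples)

for f in range(5):
    crisp = hand_position(f / FPS)
    print("frame %d: crisp sample %+.2f vs blurred %+.2f" % (f, crisp, blurred_frame(f)))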