The First Quad SLI Benchmarks

An anonymous reader writes "X-bit labs have a preview of NVIDIA's Quad SLI system based on two GeForce 7900 GX2 cards. Each GeForce 7900 GX2 gets 512 MB of on-board memory and is connected through a special bridge chip, with 16X PCIe lanes, to the other daughter card and to the rest of the system. The two GPUs on each card work in SLI mode, with the core and memory clocked lower than on a single-GPU card, at 550 MHz and 1.2 GHz (DDR). For Quad SLI, NVIDIA has introduced a new SLI mode, 'AFR of SFR', in which the cards take turns rendering frames and each frame is split between the two GPUs of the card rendering it. The GX2 cards are benchmarked (when possible) at a resolution of 2560x1600 with 32X SLI AA and compared to a CrossFire X1900 XTX system on a variety of games."
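A rough sketch of the scheduling idea behind "AFR of SFR" (illustrative only -- this is not NVIDIA's driver logic, and real SFR balances the split point dynamically by load rather than halving the screen):

    # Illustrative "AFR of SFR" scheduler -- not NVIDIA's actual driver code.
    # Assumptions: 2 cards with 2 GPUs each; cards alternate whole frames (AFR),
    # and each frame is split across the rendering card's two GPUs (SFR).
    CARDS = 2
    GPUS_PER_CARD = 2

    def assign_work(frame_index, screen_height):
        """Return (gpu_id, scanline_range) work items for one frame."""
        card = frame_index % CARDS              # AFR: cards take turns on whole frames
        half = screen_height // GPUS_PER_CARD   # fixed split here; real SFR load-balances
        work = []
        for gpu in range(GPUS_PER_CARD):        # SFR: split the frame within the card
            gpu_id = card * GPUS_PER_CARD + gpu
            work.append((gpu_id, (gpu * half, (gpu + 1) * half)))
        return work

    for frame in range(4):
        print(frame, assign_work(frame, screen_height=1600))
    # frames 0 and 2 land on GPUs 0-1 (card 0); frames 1 and 3 on GPUs 2-3 (card 1)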
  • HTIALOA (Score:5, Funny)

    by yogikoudou ( 806237 ) on Sunday April 30, 2006 @01:36PM (#15232804) Homepage
    Hell That Is A Lot Of Acronyms
    • OMFG you are right! Why don't they stop using them, though... it makes people feel inferior because they don't know what they stand for...
    • Well, you haven't had a look at the photography scene lately, have you? Nikon is a real winner with its lens naming..

      There's the NIKON 70-200 F2.8 G IF-ED AF-S, but the winner goes to.... the AF-S DX VR Zoom-Nikkor 18-200mm f/3.5-5.6G IF-ED
      Eat your heart out :)
  • by LiquidCoooled ( 634315 ) on Sunday April 30, 2006 @01:39PM (#15232817) Homepage Journal
    Why is this?

    It's obvious we expect more processing power, but the prices nowadays are silly.
    It's also fucked up the benchmarking, because you can't just look for the card you're interested in; you have to check whether it's in SLI or QSLI mode.
    • 1st, dual-core IS more expensive with CPUs, too. Just compare.

      Well, because unlike CPUs, the margins are a LOT lower.
      If you disregard the highest end, you pay maybe $200-300 for a card that has $100 worth of memory alone on it.
      And that for chips made on the newest processes with 300-400 mm^2 die sizes.

      CPUs, meanwhile, are barely reaching 150 mm^2. That's why one can easily add another die to a CPU without breaking manufacturing boundaries, while this is not possible with GPUs.
      Another reason: GPUs are much m
    • The reason may be that dual GPUs are not dual-core but two GPUs, usually on two different PCBs?

      In other words, they're merely sticking two full graphics cards together, while dual-core CPUs put the cores and the dual-CPU handling logic in a single physical package.

      Dual GPU is twice as expensive to buy because it's twice as expensive to make in the first place.

  • by DrunkenTerror ( 561616 ) on Sunday April 30, 2006 @01:41PM (#15232825) Homepage Journal
    Twenty-seven pages? Gimme a fucking break. Think they're milking it a touch?
    • by Bogtha ( 906264 ) on Sunday April 30, 2006 @02:26PM (#15233037)

      From the Onion article, February 2004:

      Stop. I just had a stroke of genius. Are you ready? Open your mouth, baby birds, cause Mama's about to drop you one sweet, fat nightcrawler. Here she comes: Put another aloe strip on that fucker, too. That's right. Five blades, two strips, and make the second one lather. You heard me--the second strip lathers. It's a whole new way to think about shaving. Don't question it. Don't say a word. Just key the music, and call the chorus girls, because we're on the edge--the razor's edge--and I feel like dancing.

      From CNN [cnn.com], September 2005:

      Gillette has escalated the razor wars yet again, unveiling a new line of razors on Wednesday with five blades and a lubricating strip on both the front and back.

  • Old news... (Score:3, Funny)

    by suv4x4 ( 956391 ) on Sunday April 30, 2006 @01:41PM (#15232827)
    Who cares about quad SLI, gimme the octet SLI, I just sold my house and I'm ready to buy one.
  • I wonder (Score:5, Insightful)

    by masterpenguin ( 878744 ) on Sunday April 30, 2006 @01:51PM (#15232873)
    I wonder what percentage of people who will be running quad SLI 7900s live in their parents' basement.

    Although I'm a college student, my experience is that once people graduate college (and are making the money to afford these toys), they generally realize what a waste of money it is to stay on the bleeding edge of PC gaming tech.

    I don't know though, perhaps there is a larger market for these than I think.
    • There's something to be said for being on the bleeding edge, I suppose; to some people, money is no object, and not all of them live in their parents' basements. This technology is marketed towards the Alienware crowd that has no problem dropping $5,000 on a flashy all-out computer system. The rest of us are probably not going to be gaming on a quad-SLI system any time soon.

      I agree, however, that the bleeding edge becomes sub-par so quickly that it's like buying a brand-new car -- it loses some absurd per

    • Get more experience then
    • Re:I wonder (Score:5, Interesting)

      by Kjella ( 173770 ) on Sunday April 30, 2006 @03:41PM (#15233347) Homepage
      Although I'm a college student, my experience is that once people graduate college (and are making the money to afford these toys), they generally realize what a waste of money it is to stay on the bleeding edge of PC gaming tech.

      That's one of the phases. But there's another phase in which you find that it's a lot easier to free up money than to free up time. Or to put it in other words: you'd rather pay to have real fun than spend time having sorta-fun on the cheap. I had a machine (AMD 2000+) that became unstable. Tried RAM tests, CPU burn, 3DMark loops, disk scans & defrags, voodoo and exorcism to no avail; nothing revealed an actual problem except practical use.

      I bought myself a new machine and retired the old one to life as one of the world's most overpowered home file servers. Why? Because I'd literally wasted *days* of my spare time on annoyance and grief over surprise reboots. I was so pissed I considered getting a Mac, but the x86 Macs weren't out yet. Why? "Just works(tm)". If I put any reasonable "price" on the time I'd been wasting, it far more than covered the difference.

      Another thing I don't do is seriously price-chase. I find a serious online retailer (either one I know previously, or one with a good customer base and rep), and as long as their prices aren't really out of whack (looking at 2-3 serious shops, I'm usually within 5% of those I know cut corners on stock, service and support) I buy it. Before, I'd check for various special offers, calculate whether the postage still made it preferable to buy from different suppliers, try out various semi-serious sites with attractive prices, etc.

      To bring this back to Oblivion... I find it a very good game playing at half-res (960x600) on my 1920x1200 24" LCD monitor. I've tried it at 1920x1200 just to see what it looks like, and I don't feel it makes that much of a difference. That, and I like my XPC, which doesn't require a huge case and doesn't sound like an airplane taking off, which I imagine this setup will. But if I seriously felt "I need quad SLI to play this at 1920x1200 to really enjoy this game", I wouldn't really have a problem doing that.

      Compared to the number of hours I've spent (and would spend with future games; presumably it'd last a little while) it wouldn't be unreasonable. This LCD is overkill in the same way: if you put it as "Do you really need more than a mainstream 19" LCD?", the answer is no. But then I'd just have a slightly bigger number in an account statement somewhere. I don't mean the cash is burning in my pocket. But if FPS games are what you do for fun, it's not an unreasonably expensive hobby compared to many others.

      I know someone who spent $3000 on a piano, someone who spent $3000 on an HD camcorder, someone who likes to tune up his car for God knows what. All for their personal hobby, because that's what they do in their spare time, and they want their spare time to be fun. You need some disposable income to do that. Around here, it's easy to "rent/buy yourself to death" with a too-expensive apartment/house. Then you sit there, don't go out, don't make any big purchases; you make the rent but live a sparse, plain and boring life. You choose what makes you happy.
      • Re:I wonder (Score:5, Interesting)

        by Aladrin ( 926209 ) on Sunday April 30, 2006 @07:11PM (#15234180)
        A lot of people don't get this concept. But it really DOES happen. I constantly have NO free time. I've even considered moving closer to work and spending hundreds of dollars a month extra to save a 1.5 hour daily drive. 7.5 hours a week doesn't sound like much, until you find yourself avoiding going to the grocery store for as long as possible because you simply don't have time.

        I'm paying off my student loans at about 6x the minimum payments. Money is definitely not the issue. Just time.

        Likewise, where I used to play every game that came out, now I only hand-pick the very best ones and I get seriously ticked if any are crap and waste my time. It's quite a marked change from the boy who always said 'I'm bored.' to the person I am today.
        • Pay the extra to get rid of the drive. Reverse your thinking about it: how much would someone have to pay you to drive an hour and a half every day for no reason? Are you in fact getting paid that much?

        • Or how about you get another job that pays less but doesn't take up all your time, and actually have some time over for yourself?

          I know, I know, CRAZY TALK.
      • I had a machine (AMD 2000+) that became unstable. Tried RAM tests, CPU burn, 3DMark loops, disk scans & defrags, voodoo and exorcism to no avail; nothing revealed an actual problem except practical use.

        Sounds like a bad power supply to me.

      • What's the make/model of that 24" LCD?

        I'm debating replacing my 19" CRT with a 1600x1200 LCD, but a 1920x1200 LCD would give me more screen real estate without sacrificing vertical resolution.
    • A guy I know goes nuts like this; he has SLI'd GeForce 7800s and such, way overkill. He's a Lieutenant in the US Army (not in Iraq). The military basically covers all his expenses, so he's got money to throw around, and this is what he chooses to throw it at.

      Then of course there are people who just make tons of money. Another friend makes over $200,000 per year and, while he doesn't buy things like this, he does drop the same kind of money on silly gadgets. For example, a PIX 515 to guard his home network. Necess
  • ...But good luck getting anything but the demo that ships with your prepackaged pair of identical cards to run on such a setup.

    Don't worry, though - The sequel to your favorite game might support such a configuration (assuming you have the right card model, the right rev of that model, the right motherboard, the right BIOS, and the right OS) somewhere around the time single-GPU cards have 8x (i.e. twice what this would yield, if you can get it to work) the power of anything available today.


    Does this have serious geek-cred? Sure. Would anyone but a total masochist try to run such a configuration, for anything more than bragging rights? HELL no!
    • Does this have serious geek-cred? Sure. Would anyone but a total masochist try to run such a configuration, for anything more than bragging rights? HELL no!
      I think this is beyond bragging rights; it's like people who buy Hummers just to have the biggest toy on the block. We call it compensation =)
    • Er. That would be a very nice high-and-mighty I'm-better-than-you-because-I'm-not-jealous-of-your-real-ultimate-power-promise speech, if it were true. As it stands, it appears that the config supports:

      • Call of Duty
      • Chronicles of Riddick
      • Doom III
      • Far Cry
      • F.E.A.R.
      • Half-Life 2
      • Quake 4
      • Serious Sam 2
      • The Elder Scrolls IV: Oblivion
      • Project Snowblind

      Now granted, I didn't check every one of those pages to make sure they didn't contain a "we couldn't get it to run" blurb, but a random sampling produces successes

    • by m50d ( 797211 ) on Sunday April 30, 2006 @02:37PM (#15233090) Homepage Journal
      Erm, wtf? The driver shipped with the card plugs it into OpenGL and DirectX, the game outputs to those and doesn't care, and everything is happy.
      • Erm, wtf? The driver shipped with the card plugs it into OpenGL and DirectX, the game outputs to those and doesn't care, and everything is happy.

        Kinda like the late great Voodoo 3? Yeah, a real breeze to use. Just pop it in and let the drivers do the rest. Riiiiiiight...
          • Wow, what an ignorant statement. The Voodoo line of cards didn't use OpenGL or Direct3D. Rather, they used their own proprietary library called 3dfx. That's why one had to worry about whether or not one's graphics card would be supported by their software. Now, there are very few problems unique to only one card manufacturer, and everybody is working through Direct3D. The olden days of video card compatibility problems are all but gone.
          • Company was 3dfx. The library was Glide.
            • Wow, what an ignorant statement. The Voodoo line of cards didn't use OpenGL or Direct3D [...] Now, there are very few problems unique to only one card manufacturer, and everybody is working through Direct3D. The olden days of video card compatibility problems are all but gone.

            Wow, what an ignorant statement.

              Do you have any idea what a driver does? As someone who has made a living writing them, allow me to enlighten you a tad...

            OpenGL and Direct3D provide an Application Programming Interface for games
    • So I guess you missed the part of the article where they played it with about a dozen of today's most popular games, off the shelf. You know, that whole benchmarking part?
      • So I guess you missed the part of the article where they played it with about a dozen of today's most popular games, off the shelf. You know, that whole benchmarking part?

        So I guess you didn't make it up to page 10. You know, the page titled "Bang Bang: Here Come Problems"? Where they show horribly mangled screenshots and make such comments as (bolding mine):

        Before we proceed with the benchmark scores, we would like to stress that Nvidia GeForce 7900 quad SLI technology does not seem to be matur

        • Of course I read that. That's not the point. I'm not debating that the driver support at the moment blows.

          You made the assertion that this technology requires special software support from the games. That is not true, it is all handled in the driver in a way that is transparent to the app. Yes, it's buggy now, but that is beside the point.
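          A toy model of that transparency (a sketch of the idea only; real drivers are vastly more complex): the game talks to the API and never learns how many GPUs sit behind it.

            # Toy model of driver-transparent multi-GPU rendering -- purely illustrative.
            class Driver:
                """Stands in for the vendor driver behind OpenGL/Direct3D."""
                def __init__(self, gpu_count):
                    self.gpu_count = gpu_count
                    self.frame = 0

                def present(self, draw_calls):
                    gpu = self.frame % self.gpu_count   # e.g. AFR: GPUs take turns per frame
                    self.frame += 1
                    return f"frame {self.frame}: {len(draw_calls)} draw calls on GPU {gpu}"

            class Game:
                """Issues the same draw calls no matter how many GPUs exist."""
                def __init__(self, api):
                    self.api = api

                def render_frame(self):
                    return self.api.present(["sky", "terrain", "player"])

            game = Game(Driver(gpu_count=4))   # swap in gpu_count=1; Game code is unchanged
            for _ in range(3):
                print(game.render_frame())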
  • by Prophetic_Truth ( 822032 ) on Sunday April 30, 2006 @01:51PM (#15232879)
    This will be yesteryear's technology when the next architecture comes out. In the video card market, new architectures seem to arrive every year. My new cap on video cards is $300/year. I got a 7900GT; you can do a voltage mod, buy a $30 cooler, and with overclocking get the same performance as a 7900GTX. They both use the exact same GPUs. Google for guides.
    • As the article starts out with:

      You can buy a $200,000 Italian sports car or a $30,000 Japanese car and add $20,000 in parts to get almost the same performance. But you'll likely never get the same shit-eating grin.

      Now, for most people, a Ford or a Honda is plenty. They'd much rather have an OK car and the $180,000 difference that they never had anyway. But that doesn't devalue what Ferrari or Lamborghini offer those who are willing to pay for it.

      Similarly, yes, 1960's Ferrari probably can't hold a candle to 1980's higher end Nissan - but the driver who can afford a Ferrari in 1960 has had 20 years of awesome enthusiast's driving and has likely bought 1980's Ferrari that Nissan's 1980 model still can't touch.
      • > Similarly, yes, 1960's Ferrari probably can't hold a candle to 1980's higher end Nissan - but the driver who can afford a Ferrari in 1960 has had 20 years of awesome enthusiast's driving and has likely bought 1980's Ferrari that Nissan's 1980 model still can't touch.

        May I also point out that if you kept that 1960's Ferrari maintained and in good condition, it would likely be worth MUCH MORE--even in inflation-adjusted dollars--than you paid for it. It would be fun AND a good investment. This is alm
        • Don't forget to add the cost of maintaining a Ferrari in collector condition for forty years. That's probably a large multiple of the original price.
          • > Don't forget to add the cost of maintaining a Ferrari in collector condition for forty years. That's probably a large multiple of the original price.

            Ok, no problem.

            Let's say I bought a brand new Ferrari 250 GT "Nembo" Spyder [sportscarmarket.com] for $12,000 back in 1964, then like most owners kept it garaged and drove it mostly evenings and weekends. And let's say that over the years I'd also spent another $36,000 on repairs and servicing... three times its original purchase price.

            After ten years parts would be dif
        • What'll make you sick is the cars-for-sale ads in the old car magazines. I've got R&T from the '50s to the mid-'90s, and it hurts seeing Carrera-engined Speedsters for $2,000 with the engine blown (the core engine alone is worth $50k+ today), '65 Mustangs in good shape for under $1k, not to mention the showroom prices on the *new* muscle cars. Heck, even original spare parts are worth $$ to some people, especially the concours crowd.
    • Pick a price you are willing and able to afford once per year, and buy there. You are generally better off getting a lesser card more often than a great card once in a while. So set a range for yourself and then upgrade about once per year. For gamers, I recommend shooting for the $150-200 range if possible. Get a card like that once per year, and you'll find all games will run fine, even near the end of that cycle. Don't give in to the temptation to get a more expensive card thinking it'll be good for lon
  • What is the reason for this? Why would you spend $1000 for a high framerate? At least for casual (or even hardcore) gaming, I find this stupid.
    • What is the reason for this? Why would you spend $1000 for a high framerate?

      Playable framerates (e.g. >30, more like ~60) when playing modern games with very high quality detail (which makes the experience more immersive).

      Paying top dollar for a new title and then playing it at 15 FPS is something that would be pretty stupid (I can't see how that would be 'fun' no matter how 'casual' a gamer you are - and if you are happy turning down the graphics details to the low end then you might as well settle for an a
    • To be leet.

      People with a lot of money (obviously) like to spend it on nice things! If you have the means to afford a quad-SLI rig, your priorities change.

      Me, I'll stick to my 64MB Radeon.
    • Isn't it obvious? The hot chicks at every LAN party always go home with the guy with the highest frame rate!

      Wait, you say there are no hot chicks at your LAN parties? Hmm... none at mine either!

  • What resolution? (Score:2, Interesting)

    by DarthChris ( 960471 )
    FTFS (emphasis added):
    "The GX2 cards are benched (when possible) at resolution of 2560 by 1600 with 32X SLI AA and compared to a Crossfire x1900 XTX system on a variety of games."

    Who actually has a monitor capable of such a high resolution?
    Secondly, correct me if I'm wrong, but CrossFire currently is two cards side by side, and if four cards don't perform significantly better than two, I'd be very worried.

    • Re:What resolution? (Score:2, Informative)

      by null-sRc ( 593143 )
      Who actually has a monitor capable of such a high resolution?

      *raises hand*

      and so does anyone who bought the 3007WFP on sale during the recent Dell days... Oblivion at native res is only about 30 fps... I'd prefer to quad it up for a decent 100+ fps
    • Hm. That's the resolution of the Dell 30" LCDs.

      And maybe you're asking the wrong question: it's not that people would buy those cards and then wonder what to do with the resolution (although I'm sure there will be some of those, too), but rather that there are people who already spend thousands on ultra-high-res toys like the 30" Apple or Dell displays, or those IBM displays with 4xxx*3xxx resolutions that sell for half a fortune.
      • Well, going by the article, it's not at all worth it. Seeing that a 7900GT can run most current games fairly comfortably at 1920x1200 (23-24" widescreen), you can save a grand on the cards and almost two grand more on the monitor. Plus, going by the pre-slashdotting benchmarks, the performance wasn't nearly what it could (should) have been - it got its arse kicked by CrossFire X1900XTs in quite a few configs. I knew the ATI flagship had a lead over that of nVidia, but it's certainly not over twice as pow
    • I do. I keep it at 1600 since that's all the EDID claims it will do, but it has no trouble with that res.
    • Re:What resolution? (Score:3, Informative)

      by Surt ( 22457 )
      2560x1600 is a nice resolution to use with the Cinema Display, or with the Dell 3007.
      http://www1.us.dell.com/content/topics/topic.aspx/global/products/monitors/topics/en/monitor_3007wfp?c=us&l=en&s=gen&~section=specs [dell.com]
      It's not too expensive a monitor, and it's popular with gamers who have the kind of money to buy quad SLI.
  • I think the site has been Slashdotted...
  • The real question (Score:1, Interesting)

    by Anonymous Coward
    How much does it take to cool these things?

    Even if someone gave me $1k and told me I could only use it on a quad-sli setup, I don't think I'd take it, mainly because I suspect that the cards would fry everything in a five-mile radius without watercooling.
  • "...Nvidia GeForce 7900 quad SLI is not faster than ATI Radeon X1900 XT CrossFire (which is known for high performance in high resolutions and with FSAA) across the board and may even lose to dual GeForce 7900 GTX setup."

    So... can anyone explain *what's the point* then?
  • This seems like a poor solution for, e.g., first-person shooter gamers, who mainly want high frame rates to lower the total latency (the time between controller movement and seeing the result on the screen, not to be confused with network latency). In the worst case, in a quad AFR setup running at 30 fps, each GPU would actually be rendering at 7.5 fps, or 133 milliseconds per frame, versus the 33 milliseconds per frame of 30 fps on a single GPU. (Not counting the additional latency
    • Yeah, but if you read the info (hell, even the summary), it states that it does not do quad AFR; it load-balances each card's pair of GPUs and AFRs between the cards. And given that I have seen (on theinquirer.net) this thing pulling about 45 fps in F.E.A.R. on one of those 30" monsters, the latency would be negligible anyway.

      This argument was raised when they first started with the modern SLI tech, especially when the renderer doesn't support SLI properly. I have never found any more lag than would otherwise be present (obje
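      For anyone who wants the latency arithmetic from this sub-thread spelled out, a back-of-the-envelope sketch (it ignores driver queuing, vsync and display latency):

        # Back-of-the-envelope AFR latency math -- ignores queuing, vsync, etc.
        def per_gpu_frame_time_ms(output_fps, units_alternating):
            # Under AFR, each rendering unit starts a new frame only every
            # `units_alternating` output frames, so its render time per frame
            # can stretch to that many output-frame slots.
            return 1000.0 * units_alternating / output_fps

        print(per_gpu_frame_time_ms(30, 1))  # single GPU at 30 fps: ~33 ms/frame
        print(per_gpu_frame_time_ms(30, 4))  # pure quad AFR at 30 fps: ~133 ms/frame
        print(per_gpu_frame_time_ms(30, 2))  # AFR of SFR: only the two cards alternate, ~67 ms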
  • IIRC, didn't one of the video card companies do the thing with cards alternating on rendering frames? Didn't they call it SLI back then too, except it stood for something else, and that was the only way it sped things up (alternating frames)?
  • i'm wetting my pants just by thinking about it
  • The new megahurtz madness. How many CPUs/GPUs do you have? Ohhh... that is so yesterday.

    Too bad they don't come back in style in 10 years.
  • by eebra82 ( 907996 ) on Sunday April 30, 2006 @05:55PM (#15233917) Homepage
    Reading the comments above made me realize that a lot of people don't understand what NVIDIA has done here. Let me point out a few things for you:

    * This is not hardware for the mass market. In fact, even the dual SLI setup is overkill, and exists mainly as "we knew how to do it, so we did it to prove it."
    * This system is not supposed to be cheap, and it is most definitely not intended to be the most cost-effective solution per fps.
    * Although only a few will buy this, it is far more valuable for NVIDIA to kill ATI's chances of dethroning them from the top of the performance charts.
    * Such excessive memory bandwidth is suited to extreme resolutions that are currently unsupported by over 95 percent of monitors. The point is not that we should play our games at these levels, but to prove that it is possible (see the rough arithmetic sketched after this comment).
    * NVIDIA gets an edge over ATI among game developers because, performance-wise, they will be able to run future games on setups comparable to single cards that are two or even three generations away.
    * Yes, it's a waste of electricity, but if you're a member of Greenpeace, then wait a few more generations before you buy a cow-approved graphics card that fits into this category.
    * One user was upset, claiming that it would be stupid to waste $1000 on a setup like this. I agree, but if you happen to drive a Ferrari, and you are debt-free with a few million bucks stored away, then why not settle for the best if you can afford it? And you can obviously get your 17-year-old Slashdot-reading neighbour to put in watercooling or whatever to make it silent, too. Point is, some people will buy this, and being able to afford something isn't being stupid.

    Last but not least, we should all remember that the CPU is now the bottleneck. It will be interesting to see what a CPU a year from now can do for this rig.
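    Rough arithmetic behind the extreme-resolution point above (my own sketch; it assumes 4 bytes of color and 4 bytes of depth/stencil per sample, and ignores the framebuffer compression real hardware applies):

      # Rough multisampled framebuffer sizing -- assumed per-sample sizes, no compression.
      def framebuffer_mb(width, height, aa_samples):
          bytes_per_sample = 4 + 4   # RGBA8 color + 24/8 depth/stencil (assumed)
          return width * height * aa_samples * bytes_per_sample / (1024 * 1024)

      print(framebuffer_mb(1280, 1024, 4))   # ~40 MB: fine on any modern card
      print(framebuffer_mb(2560, 1600, 4))   # ~125 MB
      print(framebuffer_mb(2560, 1600, 8))   # ~250 MB: why per-GPU memory starts to matter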
    • Just like last year, when ATI came out with the first card with 512MB on it. When asked what the use was for this, an exec said "none, we did it because we could and it makes a great ad campaign."

      Someone will make use of the capabilities but it will never be mainstream.

      The real bottleneck on systems is still the bus; when bus speed is closer to CPU speed, that will disappear. Of course, any time you access the hard drive it seems like a 486. That is still a major bottleneck for those with less than 1GB of RA
    • Well yes, that all makes sense... but how are they doing anyone any good by being *slower* than current SLI/Crossfire setups? That's not really a good show-off...
    • I agree, but if you happen to drive a Ferrari, and you are debt-free with a few million bucks stored away, then why not settle for the best if you can afford it?

      Maybe because $1,000 can do a lot of good for the world feeding/clothing/curing people, four hefty-draw video cards contribute more to global warming and include a lot more toxic waste than one card, and you can get 90% of the gaming experience by dropping $150 on a GF6800.

      Basically, because if you do run four cards like this just because you have
  • I remember when I got suckered into buying two Voodoo cards placed into my PC in SLI mode.

    the demo disc that came with it showed off -: DAZZLING :- effects...
      and _NONE_ of the fucking games I owned or subsequently bought got any significant benefit from it...

    hmmmmm...

    No wonder people are buying consoles - they are tired of being milked!
  • So $1000 gets you cards to throw in the bin once Vista comes out; by that time you'll be wanting octo 8800GTX SLI 1GB cards...
  • To those that say this will be obsolete in a year... NONSENSE!! It'll never be obsolete! This is finally so fast that it will be THE LAST PIECE OF COMPUTER HARDWARE YOU EVER HAVE TO BUY! It can only appreciate in value.

    I'm investing in 10.

    All those predictors judging this new hardware against countless previous years of industry/market behavior across thousands of computer products... they're all wrong. ROFL @ them.
  • What's the point, really? Four separate graphics cards in one PC case? I have one 7900GT and that's loud enough; I wouldn't want to hear four of them at once! Realistically, it's all aimed at the high-end user market, which is more likely to purchase high-end QuadroFX cards. I hope I don't see this on the consumer market any time soon; I would just feel compelled to upgrade again. I do believe that there is no need to force this technology onto the gaming scene; game developers aren't making use of the dual core CPU be
  • So this, for now, is a case of two in the hand being better than four in the bush. I really see a hard time for four graphics cards to ever become the norm even in high-end geekdom computers, especially when we have barely even touched the capability of two graphics cards. The creators of games are barely messing with SLI/Crossfire, let alone quad. This is way too over the edge... Look at SLI: it first came out around, what, 2000? And it's just now being accepted? Put it back in storage, Nvidia, and give us the public about 5 years
  • of this endless upgrade path. I sold my gaming rig and bought a cheapie Dell 8100. 256MB of RAM and a GeForce MX is all I should need. But I got this itch to play COD again. Hmmm, if that system had 1GB of RAM and a GeForce Ti4200, it could run COD pretty well. Ah, but then that CPU is the weak link. Maybe if I got a Socket 423 to 478 adapter and a P4 2.8GHz CPU......
  • If you want to go out and dump a couple grand on some graphics cards and what not, go ahead.

    But
    a) It's not going to make you a better gamer.
    b) Your system isn't going to start running uber fast.
    c) The only thing you are going to get out of it is bragging rights.

    Don't waste your money.
