ASUS Designs Monster Dual-GTX285 4GB Graphics Card
suraj.sun writes to mention that ASUS has designed its own monster graphics card based on the GeForce GTX 295. While the card retains the GeForce GTX 295 name and device ID, and remains compatible with existing NVIDIA drivers, ASUS has made a couple of modifications of its own. "The company used two G200-350-B3 graphics processors, the same ones that power the GeForce GTX 285. The GPUs have all 240 shader processors enabled, as well as the complete 512-bit GDDR3 memory interface. This dual-PCB monstrosity holds 32 memory chips and 4 GB of total memory (each GPU accesses 2 GB of it). Beyond that, each GPU runs at exactly the same clock speeds as the GeForce GTX 285: 648/1476/2400 MHz (core/shader/memory)."
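As a rough sanity check on those figures, the per-GPU memory bandwidth follows from the usual formula. A minimal Python sketch, using only the numbers quoted above and assuming the 2400 MHz memory clock is the effective (double-pumped) GDDR3 data rate:

# Peak memory bandwidth (GB/s) = (bus width in bits / 8) * effective data rate in GT/s.
BUS_WIDTH_BITS = 512        # per GPU, as quoted
EFFECTIVE_RATE_GTPS = 2.4   # 2400 MHz effective memory clock
NUM_GPUS = 2

per_gpu_bw = (BUS_WIDTH_BITS / 8) * EFFECTIVE_RATE_GTPS
print(f"Per-GPU bandwidth:  {per_gpu_bw:.1f} GB/s")            # ~153.6 GB/s
print(f"Aggregate (2 GPUs): {per_gpu_bw * NUM_GPUS:.1f} GB/s") # ~307.2 GB/s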
I surrender. (Score:5, Funny)
Re:I surrender. (Score:5, Funny)
Maybe, but it will work better on Windows. Just ask Asus... :P
Re:I surrender. (Score:5, Funny)
I really miss a "Well Played Sarcasm" mod option.
Re: (Score:3, Funny)
Re: (Score:2, Informative)
Graphics cards have their dedicated BIOS on-board.
Re: (Score:2, Flamebait)
Re: (Score:3, Funny)
Yes, the kind they use at Jurassic Park.
Re: (Score:3, Funny)
It's a Unix system! I know this!
Re:I surrender. (Score:5, Funny)
Aren't we supposed to hate Asus this week?
Re: (Score:2)
Has there been an official announcement from Microsoft or Asus about the wacky website?
Re:I surrender. (Score:4, Funny)
So, when will we be getting dual-PSU cases... (Score:5, Funny)
...so we can dedicate a full 2nd 1KW Power Supply Unit for the graphics card alone?
Re:So, when will we be getting dual-PSU cases... (Score:5, Informative)
Re: (Score:2)
There are loads of cheap cases that support 2 PSUs....or you could just grab an old server case.
Re: (Score:2)
Re: (Score:3, Interesting)
Well, even as hungry as this card is, a 1kW PSU would still do fine. Computers don't use as much power as people seem to think. However, there are actually larger PSUs for sale. For example, E-Power sells a 2000 W PSU. It is an external enclosure that houses the actual PSU components, with a bunch of wires connecting to an internal patch bay that you hook your cables into. Completely overkill, but then hey, so is this GPU.
Re: (Score:2)
I remember an NVIDIA SLI configuration that required something like 1350 watts as well (I believe OC'd 8800 GTXs a couple of years ago). The GTX 260 I have requires 500 W, so I can only assume this will need 900-1000 W at bare minimum (it has more stuff, but the 500 W figure includes other components, so it's not simply double the single-GPU number).
The really scary thing is that a 1480 W draw at the wall would nearly blow a fuse by itself (120 V * 15 A = 1800 W), God forbid adding a laser printer like I have on that wall (
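The wall-outlet math above is easy to check. A quick Python sketch; the 1480 W system figure comes from the comment and the printer spike is an illustrative assumption, not a measurement of this card:

# A standard US 15 A, 120 V branch circuit tops out at 1800 W, and common
# practice is to keep continuous loads to about 80% of that (1440 W).
WALL_VOLTAGE = 120.0        # volts
BREAKER_AMPS = 15.0         # amps
CONTINUOUS_DERATE = 0.80    # 80% rule of thumb for continuous loads

circuit_limit = WALL_VOLTAGE * BREAKER_AMPS            # 1800 W
continuous_limit = circuit_limit * CONTINUOUS_DERATE   # 1440 W

system_draw = 1480.0        # hypothetical draw at the wall, per the comment above
printer_spike = 600.0       # assumed laser-printer fuser spike

print(f"Circuit limit:        {circuit_limit:.0f} W")
print(f"80% continuous limit: {continuous_limit:.0f} W")
print(f"PC alone:             {system_draw:.0f} W "
      f"({'over' if system_draw > continuous_limit else 'under'} the 80% limit)")
total = system_draw + printer_spike
print(f"PC + printer spike:   {total:.0f} W "
      f"({'would trip the breaker' if total > circuit_limit else 'still OK'})")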
Re: (Score:2)
Well, the circuit breaker box shouldn't be too far from your room in your mom's basement so adding another circuit should be easy.
Re: (Score:3, Informative)
Except you can't do that. A 4870x2 is already crossfired on the card itself. Quad crossfire involves two of those cards. Four of them would need 8 way crossfire, which ATi does not have support for.
Nor, for that matter, would such a thing be useful. You do not get linear scaling with multiple SLI/crossfire cards. As you start tacking more on, your gains rapidly decrease. About the only case where it would be useful is for extremely high resolution displays, but we are talking beyond 2560x1600.
You are also m
Re: (Score:3, Informative)
This monster demanded vast amounts of power. So they designed an external power supply that plugged into the back of it.
Never saw
Re: (Score:2, Interesting)
I recently had to upgrade from a two-year-old 500 W power supply because it didn't have enough (6-pin?) power cables for my GeForce 9800 GTX. I was honestly disappointed, but went ahead and bought a new one. I now have a 700 W power supply from Rocketfish, and I think that's quite insane.
In the end, I think graphics card manufacturers might just go back to ex
I feel nerd-emasculated (Score:5, Funny)
Oh dear. My primary computer has half as much RAM as a graphics card.
(Hangs head in shame.)
Re: (Score:2, Insightful)
A true nerd looks at this card the way an off-roader looks at an H2: It's bigger than it needs to be, costs more than it should, and is at best no better at what it's supposed to be good at than something a third the price. Oh, and only rich posers actually own one.
It's not tech for the sake of tech. It's tech because you can do something cool with it that makes you a nerd. And there's not really much you can do with this that you can't do just as well while spending less money.
A nerd can get his compute
Re: (Score:3, Insightful)
How can you compare something that costs $80,000 (plus running costs) to something that costs $800?
The other big difference is that this thing will be "normal" in a couple of years and only cost $100. Mid-range PCs will have this as standard.
A Hummer, OTOH, will still be just as expensive and just as stupid.
Re:I feel nerd-emasculated (Score:5, Funny)
How can you compare something that costs $80,000 (plus running costs) to something that costs $800?
You're on /. and you're questioning a car analogy?!
Re: (Score:2)
Re:I feel nerd-emasculated (Score:5, Insightful)
How can you compare something that costs $80,000 (plus running costs) to something that costs $800?
He didn't compare them, he used their few similar traits to illustrate a point. A common use of analogies.
Re: (Score:2)
He used a few similar traits to illustrate a point?
That IS comparison. Textbook comparison.
If I examined the similarities between my ass and a hole in the ground, that would also be a comparison. But I still wouldn't be able to tell the difference between my ass and a hole in the ground, because I only compared, and did not contrast.
Speaking of which, if you compare something, you've made a comparison. If you contrast somet
Re: (Score:3, Informative)
How can you compare something that costs $80,000 (plus running costs) to something that costs $800?
The other big difference is that this thing will be "normal" in a couple of years and only cost $100. Mid-range PCs will have this as standard.
A Hummer, OTOH, will still be just as expensive and just as stupid.
His off-roading analogy makes sense... if you are familiar with off-roading. My old first-generation 1989 4Runner will destroy a Hummer H2 off-road, and I am in the process of buying another rolling chassis for it today. Total cost? $500.00 off of Craigslist.
An H2 is something that APPEARS to do well off-road, but in reality it does not. Plus, when parts come flying off of your off-roading vehicle (and if you are doing real off-roading, THEY WILL FLY OFF), replacing those parts on an old 4Runner is cheaper
Re: (Score:2)
If you're expecting him to get anywhere NEAR the FPS in Crysis with his POS model that someone with that rig is getting, you're smoking crack. I don't care HOW tweaked your computer is; if it doesn't have a comparable video card, it's not going to happen.
Now, if he uses his nerddom to hack and sabotage the fast computer, maybe.
Or if he uses his supreme skills to write an oldschool demo that performs better on his particular kit than the supercard.... yeah.
Or maybe if you set up the competition to where the c
Re: (Score:2, Insightful)
But unless you set up a ridiculous scenario, a high-end video card is almost always going to beat a low-end or old one.
Unless you pick a ridiculous scenario (e.g. who can get the highest FPS in Crysis with full detail on everything, 32x antialiasing, etc.), a medium and a high-end card will both give you the same key features (being able to play most modern games in high detail), while the high-end card will draw more power and make more noise. The end result is that a true geek will never buy this monstrosity but a poser will. Hummer analogy win!
Re: (Score:2)
An H2 is something that APPEARS to do well off-road, but in reality it does not. Plus, when parts come flying off of your off-roading vehicle (and if you are doing real off-roading, THEY WILL FLY OFF), replacing those parts on an old 4Runner is cheaper than an H2.
So, his analogy is valid. An off-road nerd can get much more out of a 1st-gen 4Runner than an H2, in the same way an IT nerd can get more out of a non-4GB card than the twit who likes to drop $800/month on his gaming system.
Take this link with a grain of salt, but apparently the 2008 H2 is a pretty damn good offroad vehicle (comparable to the 2008 4Runner) - http://autos.aol.com/gallery/top-10-off-road-suvs [aol.com]
HUMMER H2 (with optional air suspension): Ground clearance: 9.7", approach angle: 42.8 degrees, departure angle: 40.0 degrees
The toughest-looking SUV on the market is also one of the most capable off-roaders out there. The impressive H2 can climb a 60 degree grade -- a feat even a mountain goat would appreciate.
Re: (Score:3, Informative)
That wasn't his point. The point was that for somebody who actually knows what the heck he's doing, it's overkill. By a wide margin. On a par with the Killer NIC. Yes, it will perform a little better. But for a real-world application, it's really not worth the added cost.
Case in point, my laptop has a Core 2 Duo @ 1.66GHz, 2GB of RAM, and a 256MB GeForce 8600M GT. It's driving a 1680x1050 LCD. The lappy is
Re: (Score:2)
How can you compare something that costs $80,000 (plus running costs) to something that costs $800?
It's called an analogy - WHOOSH!
Big expensive video card "with status" is to the computer nerd who doesn't know any better as big expensive four-wheeler "with status" is to the wanna-be off-road driver who doesn't know any better. Seemed pretty cut and dried to me.
Re: (Score:2)
Yeah, I meant the performance, not the physical construction.
My job is 3D-graphics programming but I never pay more than about $150 for a graphics card. It's pointless trying to stay ahead on the graphics performance curve.
Re: (Score:3, Insightful)
Says the nerd who can't afford it. I think it would have been awesome to put this in a liquid-cooled quad-SLI setup, even though it'd require its own power circuit, AC unit and noise-isolated room in the basement. You can't say that's NOT nerdy...
Re: (Score:3, Interesting)
Re: (Score:2)
No problem, feel free to buy one and use it to the fullest, with the blessings of the rest of us. It's the high-end buyers and early adopters that subsidize the rest of us. Without them, there would be no sub-$100 cards that were worth a damn. They spend way too much for what they get, but the rest of us win.
Mal-2
Re: (Score:2)
Nope. In reality they sell very few of these cards; they mostly make them to get their brand name in the headlines. Any actual sales are just a bonus.
Re: (Score:3, Insightful)
Re: (Score:3, Insightful)
A true nerd looks at this card the way an off-roader looks at an H2: It's bigger than it needs to be, costs more than it should, and is at best no better at what it's supposed to be good at than something a third the price. Oh, and only rich posers actually own one.
It's not tech for the sake of tech. It's tech because you can do something cool with it that makes you a nerd. And there's not really much you can do with this that you can't do just as well while spending less money.
A nerd can get his computer, with half as much RAM and less processor power, to do everything this card can do. And do it better.
Sort of... But not really.
I mean, I understand where you're going with this, and I generally agree. I usually buy a $100 video card and put the extra money into RAM and CPU. Generally that works out pretty well for me. But my needs are relatively low...
I play things like WoW and EVE at 1280x1024. I don't have a ginormous monitor, and I don't play a whole lot of visually impressive high-speed games. If I had a big ol' monitor and wanted to play something like Crysis at 2560x1600, it just wouldn't hap
Re:I feel nerd-emasculated (Score:5, Insightful)
The article doesn't mention the price, but I suppose it would cost more than the GTX 295, so this card would be expensive. The advantage, though, is that you can stick enough graphics power in a single slot to drive a 30-inch monitor at the highest settings with playable framerates in almost any game. So while I cannot speak for every nerd, this is surely not tech purely for the sake of tech. I know of nothing with half the RAM and less processor power that can do everything this card can do. Perhaps you could prove me wrong on that point?
So while some think your post is insightful, I think you have no idea what you are talking about. This card was made to fill a niche in the high-end gamers' market, pure and simple.
Re: (Score:2)
If the card is cheaper than a pair of the cards it's made from, plus the price premium of an SLI motherboard, then it's a massive win for anybody who wants that level of graphics. Not only could it be really useful in that context, but there are lots of "professional" applications where the Quadro feature set is unnecessary and this card might fit in nicely, if the price were right.
Re: (Score:2)
This card was made to fill a niche in the high-end gamers' market, pure and simple.
Or any other industry that needs huge uber-FPS, uber-high-def video. I'm thinking military, medical and "content creation" applications, but certainly there are others.
And CUDA apps would certainly scream on such a card...
Re: (Score:2)
And here you thought you just had to upgrade to be a good nerd. According to this post you have to make that piece of shit you already have perform miracles!
Good luck with that!
Re: (Score:2)
Heck yes. I played Left4Dead on a PC from 2002, and was getting 30fps!
But I did finally succumb and pick up a brand new 8800GS... for $45CAD... about $35 USD.
True geeks don't need a monster card like this. ;)
Re: (Score:3, Interesting)
True enough, but think of what a nerd could do with this?
I have a GTX 295. It is by far the most monstrous card I have ever put into a machine since the early PC days. I can't begin to imagine what its big brother looks like or how much power it will suck down.
As to the point about getting the most out of your equipment, there are people who have the skills to do things like modify and fix cheap or old cars and equipment. Personally, I find that neat. However, I never do that. The closest I get is buyi
Re: (Score:2)
2GiB RAM? What are you doing on Slashdot?
Re: (Score:2)
GiB is a term made up by a consortium of fools.
GB is proper.
Re: (Score:2)
The only fools were the ones who thought they could redefine kilo, mega, giga, etc.
Re: (Score:2)
The only fools are the ones who don't understand the difference between qualitative and quantitative measurements.
Re: (Score:3, Insightful)
Well, since numerical prefixes are by definition quantitative...
Re: (Score:2)
2^30. Your point?
Re: (Score:3, Interesting)
I was told by an NVIDIA scientist that the amount of memory these video cards come with is actually more a result of the kinds of parts available at the speeds needed for the number of address lines they need to connect than of any application requirement.
GDDR3 (Score:4, Interesting)
Bleh on the GDDR3. The Radeon HD 4870 I just picked up for $200 has GDDR5, which is just smoking-fast memory.
Re:GDDR3 (Score:5, Informative)
Re: (Score:3, Informative)
Ah, 448-bit vs 256-bit on the 4870. Didn't catch that, thanks for the correction.
Re: (Score:2)
Yeah, but ATI's is longer (this, of course, actually being what the discussion is all about).
Re: (Score:2)
Funny you should mention that; the 4870 is the first "good" video card I've ever owned.
Before this one I was on an AGP Mac circa 2003 with a Radeon 9000 Pro (32MB version). :)
Re: (Score:2)
Hmmm, maybe I was right:
http://www.anandtech.com/video/showdoc.aspx?i=3341&p=1 [anandtech.com]
"The Radeon HD 4870 and 4850 both use a 256-bit memory bus like the 3870 before it (as well as NVIDIA's competing GeForce 9800 GTX), but total memory bandwidth on the 4870 ends up being 115.2GB/s thanks to the use of GDDR5. Note that this is more memory bandwidth than the GeForce GTX 260 which has a much wider 448-bit memory bus, but uses GDDR3 devices."
Different card, but you get the idea.
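The AnandTech numbers quoted above drop straight out of the standard bandwidth formula. A short Python sketch; the GTX 260 memory clock (~999 MHz, about 2 GT/s effective) is my assumption for the stock part:

def bandwidth_gbps(bus_bits: int, effective_gtps: float) -> float:
    """Peak memory bandwidth in GB/s: bytes per transfer times transfers per second."""
    return (bus_bits / 8) * effective_gtps

# Radeon HD 4870: 256-bit bus, GDDR5 at 900 MHz base clock -> 3.6 GT/s effective.
print(f"HD 4870: {bandwidth_gbps(256, 3.6):.1f} GB/s")    # 115.2, matching the quote

# GeForce GTX 260: 448-bit bus, GDDR3 at ~999 MHz -> ~2.0 GT/s effective.
print(f"GTX 260: {bandwidth_gbps(448, 1.998):.1f} GB/s")  # ~111.9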
Re: (Score:2)
Oh yes, I'm aware. Actually, the most common Erdos number is null, since 99% of the population doesn't have one.
Batteries not included (Score:5, Funny)
Customers who bought this also bought:
Delonghi PAC C100 Portable Air Conditioner 10,000 BTU
Re: (Score:2)
Customers who bought this also bought: Two (2) Delonghi PAC C100 Portable Air Conditioner 10,000 BTU
Fixed that for you :p
Re: (Score:2)
Dual GPU card (Score:2)
So if I'm reading this right, they've taken what would normally be a dual-card solution and put it on a single card. This should have been an obvious next step.
Re: (Score:3, Insightful)
So what happens when you SLI two of these badboys together?
Re:Dual GPU card (Score:5, Funny)
Re: (Score:2)
Re: (Score:3, Informative)
So what happens when you SLI two of these badboys together?
The card supports quad-SLI, so I guess you just end up with 4 285s in SLI.
Re: (Score:3, Funny)
The government opens up the taps on the Strategic Petroleum Reserve, as lights dim from Key West to Keokuk.
Does this mean I can run Solitaire in FULL RES? (Score:5, Funny)
Someone had to say it. I bet a Minefield comment will beat me to the punch...
Re: (Score:3, Funny)
Minefield? No, this thing still doesn't have enough memory for Firefox.
I wonder... (Score:2, Interesting)
Re: (Score:3, Funny)
Re: (Score:2)
You create The Singularity
There, fixed that for you.
Re: (Score:3, Interesting)
No graphics card maps the entire framebuffer into the physical address space, even on 64-bit OSes. It'll just use up a few tens of MBs for BAR0, a few more for BAR1, and so on. The driver manages all the framebuffer memory for you; all the client has to do is call the equivalent of malloc().
Re: (Score:2)
On my 32-bit XP machine with 4 GB of physical memory and two 512 MB 8800 cards, it shows up as 3 GB of memory. Remove one card and it shows up as 3.5 GB, so it looks like the OS is mapping the entire VRAM framebuffer into the physical address space?
Re: (Score:2, Informative)
No, Windows can only access 3.5 GB of system memory; the remaining 0.5 GB will be mapped above 4 GB in the physical address space. When you have lots of PCI devices in the system, they take up some of the physical address space. So if your PCI(E) devices take up 1 GB of space, the BIOS will fit less of that 4 GB of RAM into the 4 GB physical address space. Your PCI devices would already be allocating BARs like I said earlier. Like the AC said, you can enable PAE to reclaim some of that lost space. I know
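A sketch of the accounting described above; the two 512 MB BARs match the grandparent's cards, and smaller chipset/device windows are ignored to keep it simple:

# Why 4 GB of installed RAM shows up as ~3 GB on a 32-bit OS without PAE:
# device MMIO windows (graphics BARs, etc.) must sit below 4 GB, so whatever
# they claim is carved out of the address space left for RAM.
ADDRESS_SPACE_MB = 4 * 1024   # 4 GB addressable without PAE
INSTALLED_RAM_MB = 4 * 1024   # 4 GB of physical RAM

mmio_reservations_mb = {
    "GPU 0 framebuffer BAR": 512,   # first 512 MB 8800
    "GPU 1 framebuffer BAR": 512,   # second card; remove it and ~0.5 GB comes back
}

visible_ram = min(INSTALLED_RAM_MB,
                  ADDRESS_SPACE_MB - sum(mmio_reservations_mb.values()))
print(f"RAM visible to the 32-bit OS: {visible_ram} MB (~{visible_ram / 1024:.1f} GB)")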
Re: (Score:2)
Thanks for the info
BitchinFast3D (Score:5, Funny)
http://www.russdraper.com/images/fullsize/bitchin_fast_3d.jpg [russdraper.com]
Re: (Score:2)
I remember that one, thanks for posting it! :D
Puerto de Graficos Acelerados Gigante!!
Oblig Lowtax Quote (Score:5, Funny)
Re: (Score:2)
Even get rid of your wrinkles...
It was inevitable (Score:2, Funny)
I think this may actually support a higher resolution and framerate than real life.
Overcompensate much? (Score:2)
Re: (Score:3, Funny)
What's a VHS tape?
Re: (Score:3, Funny)
Never thought I'd feel old at 20, but there it is...
Not particularly useful even for Folding at Home (Score:2)
Folding at Home GPU clients don't require all this graphics RAM. You would probably do just as well with two GTX 295s.
Can't use it all anyway (Score:2)
This dual-PCB monstrosity holds 32 memory chips, and 4 GB of total memory (each GPU accesses 2 GB of it).
In normal SLI setups the video memory has to be 'mirrored' for each GPU. Being a dual-PCB card means it probably works the same way, and so it's effectively a 2GB card.
All we need now... (Score:3, Funny)
Re: (Score:3, Informative)
AH! so... (Score:2)
So this is what is needed to run Windows 7. Good work, Asus!
Hmmmmm.... (Score:3, Insightful)
Meh (Score:2, Funny)
My Intel GMA950 is still overkill for Diablo II and Starcraft.
Please don't mention Diablo III or Starcraft II.
Will they price it right this time? (Score:2, Funny)
YES!! Just In Time For Duke Nukem Forever!!!! (Score:3, Funny)
What about Crysis (Score:3, Funny)
Two G200-350-B3 graphics processors... (AWESOME)
240 shader processors enabled... (HELLS YEAH!)
512-bit GDDR3 and 4 GB of total memory... (I JUST WET MYSELF)
I put in Crysis, max out the settings, and play the game...
*Ahh just a bit Choppy*...
*Oh no, not the "Lag o' Death"*...
*Shit, now it's Freezing*...
***C R A S H!!*** (SON OF A F*CKING B*TCH)
Yes, even the best graphics card mankind has ever made can't compete with the horrible coding of Crysis!
Re: (Score:3, Funny)
I'm working on my game, called Titanographic, and it requires a 16GB graphics card.
Coincidentally, so will DirectX 12 *ducks*
Re: (Score:2)
Re: (Score:2)
Dude, don't get your hopes up.