AMD-ATI Ships Radeon 2900 XT With 1GB Memory
MojoKid writes "Prior to AMD-ATI's Radeon HD 2000 series introduction, rumors circulated regarding an ultra-high-clocked ATI R600-based card that featured a large 1GB frame buffer. Some even went so far as to say the GPU would be clocked near 1GHz. When the R600 arrived in the form of the Radeon HD 2900 XT, it was outfitted with 'only' 512MB of frame buffer memory, and its GPU and memory clock speeds didn't come close to the numbers in those early rumors. Some of AMD's partners, however, have since decided to introduce R600-based products that do feature 1GB frame buffers, like the Diamond Viper HD 2900 XT 1GB, in both single-card and CrossFire configurations. At 2GHz DDR, the memory on the card is also clocked higher than on AMD's reference designs, but the GPU remains clocked at 742MHz."
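For context, a quick back-of-the-envelope sketch of what that 2GHz effective memory clock buys, assuming the R600's 512-bit memory interface (the bus width is an assumption here, not something the summary states):

    #include <stdio.h>

    int main(void) {
        /* Assumed: R600's 512-bit memory interface (64 bytes per transfer). */
        const double effective_rate = 2.0e9;  /* 2GHz DDR effective data rate */
        const int bus_width_bits = 512;

        double bytes_per_sec = effective_rate * (bus_width_bits / 8.0);
        printf("Peak memory bandwidth: %.1f GB/s\n", bytes_per_sec / 1e9);
        return 0;  /* prints 128.0 GB/s */
    }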
But... (Score:5, Funny)
That could be viewed as a serious question (Score:3, Informative)
Re:But... (Score:5, Funny)
Re:But... (Score:5, Funny)
Re:But... (Score:4, Funny)
Re: (Score:3, Insightful)
Re: (Score:2)
http://ati.amd.com/products/fireglv8650/index.html [amd.com]
Re:But... (Score:4, Informative)
I know you're kidding, but as a matter of fact, it is supported under Linux by a couple different drivers.
A good review of the 2900 XT under Linux [phoronix.com]
In fact, you have options:
Using the proprietary driver [phoronix.com]
Using the open source driver [phoronix.com]
Re:But... (Score:5, Interesting)
also, more vespene gas (Score:5, Funny)
Well, that's because when they tried to build the 1GB units, a loud voice was heard saying "We require more minerals", and production was blocked.
Re: (Score:1)
Re: (Score:2)
Re: (Score:2, Informative)
http://en.wikipedia.org/wiki/StarCraft#Gameplay [wikipedia.org]
Re: (Score:1, Redundant)
Considering 32-bit OSes are still mainstream.. (Score:4, Insightful)
Wow! Now that 4GB of main system memory I installed has been pared back down to a more manageable 2GB!
WHEE!
Until 64-bit becomes more mainstream, cards like this will only become more and more detrimental to the systems they're installed in.
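A rough sketch of the arithmetic behind that complaint; the aperture sizes below are illustrative assumptions, not measured values:

    #include <stdio.h>

    int main(void) {
        /* A 32-bit CPU addresses 4GB total; memory-mapped devices are
           carved out of that same space and shadow physical RAM. */
        const double total_mb = 4096.0;  /* 2^32 bytes */
        const double vram_mb  = 1024.0;  /* 1GB video card aperture */
        const double mmio_mb  =  256.0;  /* assumed: PCI/chipset/BIOS regions */

        printf("RAM visible to a 32-bit OS: ~%.0f MB\n",
               total_mb - vram_mb - mmio_mb);  /* ~2816 MB of the 4GB installed */
        return 0;
    }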
Re: (Score:1)
Re: (Score:2)
Re: (Score:3, Funny)
Re: (Score:2)
Re:Considering 32-bit OSes are still mainstream.. (Score:4, Funny)
Re: (Score:3, Funny)
Too bad it was designed by left-handed monkeys on crack.
Re: (Score:1)
Re: (Score:3, Funny)
Re: (Score:2)
You mean like doubly so? As in, just as the money spent on doubling the memory is almost completely wasted, the money spent on doubling the graphics chips is almost completely wasted?
Re: (Score:2)
Re: (Score:2)
Sure, for anything that remains strictly on the graphics card, it's great. But for anything else (functions besides raw graphics in a game, like AI, or non-gaming applications), stealing that memory allocation space degrades overall system performance.
Re: (Score:3, Interesting)
Re: (Score:2, Insightful)
Oh wait. You meant Windows. Sorry, I do apologize.
Re: (Score:2)
The problem isn't the OS, it's the hardware support. In my case, USB wi-fi dongles; none of the ones I have kicking about the place have driver support from the manufacturer (thanks for that, Netgear!) and I don't fancy trailing cat5 cabling through my house again.
And yes, Linux is great, and yes it was my primary desktop OS for a couple of years, but it simply doesn't support all the software I want to run (and yes, that includes
Re: (Score:3, Insightful)
This is where open source trumps closed source, hands down.
In the majority of cases, having an open source 32-bit driver, almost automatically implies having a 64-bit driver. It's just a recompile. Yes, a lot of the time there will be bugs, but since developers are usually on "higher end" 64-bit systems, those bugs are usually fixed quickly.
I'm running 64-bit Debian, and have never had a problem with drivers. My video card, sound card, firewire card, USB devices, network cards and printer all work
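To illustrate the kind of recompile bug the parent is talking about, here's a contrived classic 32-bit-ism (my own example, not from any real driver):

    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        int x = 42;

        /* The bug: stuffing a pointer into an int. On 32-bit both are
           4 bytes and it "works"; on 64-bit the address gets truncated
           and dereferencing it can crash.
           int addr = (int)&x;   <-- the buggy line */

        /* The fix a 64-bit developer commits: a pointer-sized integer. */
        uintptr_t addr = (uintptr_t)&x;
        printf("x lives at %p, value %d\n", (void *)addr, *(int *)addr);
        return 0;
    }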
Re: (Score:2)
Re: (Score:2, Insightful)
Re: (Score:2, Funny)
Re: (Score:2)
However, you'll still have the luxury of running multiple processor-intensive apps without bringing the whole system to a standstill.
Re: (Score:2)
Re: (Score:1, Funny)
Exactly. That's why I'm holding out for 128 bit CPUs for my next upgrade.
I'm holding out for 65-bit CPUs for my next upgrade.
Re: (Score:1)
Re: (Score:2)
On the PPC side, since the 745x at least, there has been support for Extended Addressing (H
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Seriously. What is with the lack of uptake on 64-bit?
Severe lack of motivation from the majority of consumers, to whom it offers no benefits (and, due to frequently buggy and/or nonexistent hardware and/or drivers, numerous disadvantages).
Re: (Score:3, Interesting)
I'm not talking about an XBox. I'm talking about a PC.
An XBox has half a gig of memory, half of which is dedicated to graphics at a relatively low-res output.
I'm talking about a gaming PC with 2+GB of RAM in it and how graphics cards with multiple gigs of memory are detrimental to overall system performance (including gaming) in a 32-bit memory map.
Re:"Framebuffer memory" (Score:5, Funny)
Re: (Score:2, Offtopic)
Re: (Score:2)
More info on the 4GB limit (Score:2, Informative)
A good article on it is here: http://www.dansdata.com/askdan00015.htm [dansdata.com]
UEI++ (Score:2, Interesting)
Re: (Score:3, Informative)
Finally some effects! (Score:5, Funny)
Re: (Score:2)
Re: (Score:1)
Useless! (Score:5, Insightful)
Re: (Score:2, Informative)
Re: (Score:2, Informative)
Re: (Score:1, Interesting)
Re: (Score:2)
Re: (Score:3, Informative)
Memory doesn't make a card faster, except at REALLY insane resolutions (way higher than 4MP, I suspect) when you
Re: (Score:3, Insightful)
With an 8800GTS/320, I, like most review sites, struggle at times to stay above 60FPS in 1024x768 with all the eyecandy on.
Re: (Score:2)
STALKER had to have one of the heaviest GPU-resource-consuming options bumped down to keep frame rates in the 20-30 range (which is what I consider playable for single-player games, where I prefer eyecandy over frames; multiplayer would go the other way).
STALKER also displayed some warping on the side screens, showing it was neither designed nor QA'd for unconventionally wide
Re: (Score:1)
We've had 256MB cards for a few years now for "normal" resolutions like 1024x768 and 1280x1024... but that's not hardcore anymore. Hardcore is SLI GTXes (or HD2900s) driving a 30" Dell at 2560x1600.
Comment removed (Score:4, Informative)
Re: (Score:1)
Re: (Score:3, Informative)
Basically, other than the framebuffer for what's actually displayed on screen, none of the graphics card memory is dependent on screen resolution.
Anyway, this card isn't useful *now*. That's because video game producers target the cards that are widely available. 2 years from now you'
Re: (Score:2)
Re: (Score:1)
Re: (Score:3, Insightful)
Quad 32" screens at 1600x1200 fit in 32MB (Score:3, Informative)
What you *do* need it for is texture and vertex data, but even then games aren't really going to use it - they're designed for current hardware.
Nope, the only people who'll buy this are ignorants with too much money*.
[*] Not that there's any shortage of those.
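Spelling out the arithmetic from the title, assuming 32 bits per pixel and single buffering:

    #include <stdio.h>

    int main(void) {
        /* Four 1600x1200 screens at 32 bits (4 bytes) per pixel. */
        const long pixels = 4L * 1600 * 1200;
        const long bytes  = pixels * 4;

        printf("Framebuffer: %.1f MB\n", bytes / (1024.0 * 1024.0));
        /* ~29.3 MB -- comfortably inside a 32MB card, as the title says. */
        return 0;
    }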
Re: (Score:2)
This is probably redundant but.. (Score:3, Insightful)
Notice how some of the newer games see less performance degradation on some of the 640MB nVidia cards than equivalently clocked 320MB versions of the same card.
Depends on what side you're on... (Score:2)
Anyway, if it finds a market, more power to them. Engineers do their jobs, people get paid. Welcome to the open market. (:
Re: (Score:3, Funny)
Re: (Score:2)
Re: (Score:2)
When I bought my 1900 XT several months ago, I decided to get the cheaper 256MB version instead of the 512MB version because benchmarks showed there wasn't even a 5% difference in frame rates between the two cards. I play all my games at 1680x1050 with 4x AA and 8x
Big screens.... (Score:2)
Useful for 3D animation work. (Score:5, Informative)
Sounds useful for 3D animation work, where you need all that memory for textures. Remember, by the time players see a game, the textures have been "optimized"; stored at the minimum resolution that will do the job, and possibly with level of detail processing in the game engine. Developers and artists need to work with that data in its original, high-resolution form.
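To put rough numbers on that, consider one uncompressed source texture; the 4096x4096 size is purely an illustrative assumption:

    #include <stdio.h>

    int main(void) {
        /* One uncompressed 4096x4096 RGBA source texture, 4 bytes/texel. */
        const double base_mb = 4096.0 * 4096.0 * 4.0 / (1024.0 * 1024.0);

        /* A full mipmap chain adds roughly one third on top of the base. */
        const double with_mips_mb = base_mb * 4.0 / 3.0;

        printf("Base level: %.0f MB, with mipmaps: ~%.0f MB\n",
               base_mb, with_mips_mb);  /* 64 MB base, ~85 MB with mips */
        return 0;
    }

A handful of source assets at that size chews through a 512MB card quickly, which is where a 1GB frame buffer starts to earn its keep.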
Re: (Score:3, Informative)
Ahh... (Score:5, Funny)
Re: (Score:1)
Anyhow I just read the review in the newest CPU and they gave it what could best be described as a "meh". Give it a few more iterations and then we might have a respectable competitor to the current top-shelf Nvidia offering, but of course by then there will be something better out...
Frame buffer? You mean video ram? (Score:5, Informative)
Re:Frame buffer? You mean video ram? (Score:5, Funny)
What is the point for most users? (Score:2, Interesting)
Re: (Score:1)
At the extreme end of the scale it is bad value. However, if you do need a new graphics card, it often works out better to go towards the high end. I'd rather spend $300 on one card that keeps up with the latest games for two years than $180 on a mid-range card that will need to be updated with another
Eventually (Score:2)
Possible to be used as system RAM? (Score:2, Insightful)
Re: (Score:1)
Gentoo guide [gentoo-wiki.com]
Re: (Score:1)
How this is newsworthy now (Score:1, Informative)
bitchin (Score:5, Funny)
Yes but (Score:1)
Wake me when it's got 8GB (Score:1)
Seriously though, we're already seeing problems with PSUs that can't deliver enough on the 12-volt rail (the 2900XT needs 24 amps by itself), and now they want to push that up to 26-28 amps? So where is the power going to come from? The wicked fairy's curse?
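For scale, those rail currents work out as follows (P = V x I):

    #include <stdio.h>

    int main(void) {
        /* Power drawn on the 12V rail: P = V * I. */
        printf("24A x 12V = %dW\n", 24 * 12);  /* 288W for the card alone */
        printf("28A x 12V = %dW\n", 28 * 12);  /* 336W at the projected draw */
        return 0;
    }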
Re: (Score:2)
That limitation is a design choice. Beefier supplies are no problem to build.
Re: (Score:2)
Whither my RAM on ia32? (Score:2)
"lawl"
nVidia Eyez (Score:1)
Can't resist! (Score:1)
Re: (Score:1)
I'm not feeding the trolls... (Score:2, Interesting)
Re: (Score:1, Insightful)
And AFAICS, the statement that "ATI's hardware is better it's just the drivers that let them down" sounds pretty unsubs
But is it? (Score:1, Insightful)
Re: (Score:1)