Graphics Memory Sizes Compared: How Much Is Enough? 347
EconolineCrush writes "Trying to decide whether to get a 64MB graphics card or spring for the 128MB version? Hit up this article, which explores the performance of ATI and NVIDIA-based cards with 64 and 128MB of memory, before swiping your credit card. Not so long ago 32MB was the top end for graphics memory on consumer video cards, but now even budget cards are available with 128MB. 128MB might seem excessive now, but a year from now 64MB cards might just be obsolete."
What are you doing with it? (Score:2, Insightful)
Re:What are you doing with it? (Score:2)
If I were to get a new video card I would probably be using that video RAM for SWAP
Re:What are you doing with it? (Score:2)
If I were to get a new video card I would probably be using that video RAM for SWAP ;)
That would be a cool hack. Is there any reason why unused video RAM couldn't be utilized by the OS for other purposes? It'd be awesome to have an extra 100MB of ultra high-speed swap space for when you're not playing games.
A way to map that video RAM into your regular address space would be a really cool hack.
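The idea above, minus the driver details: on Linux one could in principle mmap() a framebuffer device such as /dev/fb0 and treat spare video RAM as byte-addressable storage. The device path and access rules are assumptions (real drivers differ on permissions and cacheability), so a plain file stands in for the device in this sketch:

```python
import mmap
import os

# Sketch of the hack: map a block of "video memory" into the process's
# address space and treat it as fast scratch storage. The real target on
# Linux would be something like /dev/fb0 (an assumption; driver behavior
# varies), so an ordinary file stands in for it here.
FB_SIZE = 1 << 20  # pretend the card exposes 1 MB of spare VRAM

with open("fake_vram.bin", "wb") as f:
    f.truncate(FB_SIZE)

fd = os.open("fake_vram.bin", os.O_RDWR)
vram = mmap.mmap(fd, FB_SIZE)

vram[0:11] = b"hello, vram"          # ordinary byte-level access
assert bytes(vram[0:5]) == b"hello"  # reads back like regular memory

vram.close()
os.close(fd)
os.remove("fake_vram.bin")
```

With a real framebuffer mapping, the OS could in principle layer a swap file or block device on top of this region, which is exactly what the poster is wishing for.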
Re:What are you doing with it? (Score:2, Informative)
You don't?
OOhh. .
http://slashdot.org/article.pl?sid=02/09/02/23212
Chris
Re:What are you doing with it? (Score:2)
UT2003 / Doom 3 (Score:2, Insightful)
Re:What are you doing with it? (Score:2)
Re:What are you doing with it? (Score:2)
Re:What are you doing with it? (Score:2)
I do the reverse. My game box has a voodoo4 PCI and a Radeon 9000 in it, so I can still play my favorite glide game, Mechwarrior 2.
3dfx was and still is a quality product, and there's no reason to get rid of your 3dfx hardware, regardless of what the "oooh! Shiny!" nvidia people tell you.
Re:What are you doing with it? (Score:2)
However, my GeForce2 seriously slows down on Morrowind. Fortunately, Morrowind is perfectly playable at low framerates, but that's not my point.
My point is, of course Counter-Strike and Quake3 run just fine on your 64MB Radeon: both of those games are older than your card!
Re:What are you doing with it? (Score:5, Interesting)
Frame, depth, and stencil buffer: 1024x1024x(32 bits + 32 bits + 8 bits) = 9 megabytes
9 megabytes so far. No problem. Double that when I push the resolution up to 1600x1200 for demos, but we'll ignore that for the moment. Now, the model I'm using has 19MB of surface texture, so we're up to 28MB. The system I'm running on this poor hypothetical card uses 512x512 textures to replace distant geometry. Each one takes up 768K of memory and I've generally got a working set of between 40 and 60 textures. There's another 30-45MB, so total usage is somewhere between 58 and 73MB. Add in shadow maps and we lose another 20 or 30MB. The 64MB card is now swapping to AGP memory. The 128MB card is filling fast. It's adequate for the current generation of the system I'm running, but before I can write the next couple versions I'm going to have to implement some serious resident-texture-set cache management.
Now, you can certainly argue that this is an atypical application. You would be quite correct. However, I do need that much video memory and I do use it. Yes, it's massive overkill if you want to play Quake, Unreal, whatever, but once you start looking into more exotic applications it's easy to get into situations where you can use arbitrary amounts of texture RAM. Real-time shading can get you there in a hurry, too, once you start using textures as lookup tables for expensive-to-compute functions (e.g. light fields or factorized BRDFs) or caching the framebuffer for later re-use.
So yes, 90% of the programs 90% of users will run will currently fit neatly in 64MB of video memory, but there definitely exist systems that require more than that.
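The budget above can be checked with quick arithmetic. The figures come from the post itself; the 50-texture working set and 25 MB of shadow maps are midpoints of the ranges the poster gives:

```python
# Back-of-the-envelope VRAM budget from the post above.
MB = 1 << 20

framebuffer = 1024 * 1024 * (4 + 4 + 1)   # 32-bit color + 32-bit depth + 8-bit stencil
surface_textures = 19 * MB                # the model's surface texture
impostors = 50 * (768 * 1024)             # ~50 of the 512x512 stand-in textures, 768K each
shadow_maps = 25 * MB                     # midpoint of "20 or 30MB"

total = framebuffer + surface_textures + impostors + shadow_maps
print(f"framebuffer: {framebuffer / MB:.1f} MB")
print(f"total:       {total / MB:.1f} MB")  # well past a 64 MB card
```

The total lands around 90 MB, which is why the 64 MB card in the example is already swapping to AGP memory while the 128 MB card still has headroom.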
What's the answer? (Score:2)
I got a GeForce 4 Ti 4200 with 128 megabytes and video input for $160. The 64 MB version with no video in was $130. So, the difference is $30. For $30, I'd get the extra 64 MB.
Re:What's the answer? (Score:2)
Re:What's the answer? (Score:2)
If it was only $30, I'd agree, but I picked up the 64 MB 4200 for $100 this weekend at Best Buy. Also, for what it's worth, the 64 MB version of the 4200 ships with faster memory (3.6 ns, clocked at 250 MHz DDR) than the 128 MB version (4 ns, clocked at 222 MHz DDR).
Re:What's the answer? (Score:2)
Is it worth it? Depends on if you want that much soda.
Memory for texures, etc. (Score:2, Insightful)
For those of you planning to never buy another game, well, why ask in the first place?
Bah... (Score:4, Funny)
Bah. Next thing you'll be trying to tell me my Voodoo 3 2000 PCI is obsolete!
Re:Bah... (Score:3, Funny)
Re:Bah... (Score:2)
Bah. Next thing you'll be trying to tell me my Voodoo 3 2000 PCI is obsolete!
Sure is, I've got a Voodoo 3 3000 AGP!
Re:Bah... (Score:2)
removable RAM? (Score:5, Interesting)
Re:removable RAM? (Score:2, Interesting)
Re:removable RAM? (Score:5, Interesting)
JOhn
Re:removable RAM? (Score:3, Funny)
No doubt you'll be receiving an extra bonus from the Taiwanese motherboard manufacturers this month. Of course I suppose that assumes they're able to recognize solder drips and scorch marks as good reasons to assume the board was not received DOA...
(Yes, I realize there are overclockers all over the world -- I apologize in advance for my horribly US-centric post... geez...)
Re:removable RAM? (Score:2, Troll)
Anyway... yes, engineers could design in removable RAM, but consider this: the RAM types are changing all the time! You would pay an extreme premium for the specialty RAM because it would vary from manufacturer to manufacturer, card to card, even model to model. The memory keeps getting faster, so the connectors would have to change constantly to keep up with the speed. For now it really is just easier to solder it on. Besides, that card of yours will be worthless in two to three years anyway...
Hope that helps dispel some of the skepticism.
JOhn
Re:removable RAM? (Score:3, Informative)
So now you are probably asking: how much capacitance can a typical graphics chip or processor drive? Well, I tried to find the datasheet on nVidia's website for their GeForce chips, but didn't turn up a thing. So I went to Intel's site and looked up the datasheet for their 845G [intel.com] chipset with integrated graphics. If you look on page 525 you will see that the output drive for the Intel chip is 12pF. So now you can probably see the problem. Assuming all we drove were memory modules directly from the 845G (which we wouldn't in real life), we could put just two to three 256Mb (32MB) modules on board without connectors. If we put the Molex connector you specified in between, that number changes to 1 - 2 chips. In real life we would put a nice buffer with a stronger output drive in between the 845G and the memory, like TI's 24-Bit to 48-Bit Registered Buffer [ti.com]. That sucker has a 30pF drive and each buffer could easily drive 6 - 7 modules for a total of 256MB of RAM without the connector. Add a connector and this number dwindles to 4 - 5. Anyway, I'm sure you get the point. At this scale even a 2pF connector makes a big difference.
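The load budget in that paragraph can be sketched as a one-liner. The 12 pF and 30 pF drive figures and the 2 pF connector come from the post; the per-module input capacitance (~4 pF) is an assumption chosen so the arithmetic reproduces the post's module counts:

```python
# Rough capacitive-load budget implied by the figures above. The ~4 pF
# per-module load is an assumption, not a datasheet value; the connector
# adds capacitance to each connection it sits in.
def modules_drivable(drive_pf, module_pf=4.0, connector_pf=0.0):
    return int(drive_pf // (module_pf + connector_pf))

print(modules_drivable(12))                  # 845G driving modules directly
print(modules_drivable(12, connector_pf=2))  # with a 2 pF connector in the path
print(modules_drivable(30))                  # registered buffer, soldered down
print(modules_drivable(30, connector_pf=2))  # buffer plus connector
```

Under these assumed numbers the four cases come out to 3, 2, 7, and 5 modules, matching the post's "two to three", "1 - 2", "6 - 7", and "4 - 5" ranges.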
However, after saying all that, I should mention that I do not believe these electrical considerations are the main or only reason the cards are not expandable. I think a lot of it has to do with demand. Very few people are going to upgrade their video card with more memory. I don't know any Matrox Millennium owners, including me, who upgraded the memory on their video cards, because the economies of scale for a specialized memory make it much more expensive to produce than consumers like myself are willing to pay. In addition, by the time I would upgrade a card's memory, I could probably buy a brand new card with that much memory for the price of the piddly module.
JOhn
Re:removable RAM? (Score:2)
Well, you're wrong. Capacitive loading affects the signal integrity to a great degree, and adding any metal to the conduction path will add capacitance. Then there are the reflection and EMI effects of stubs on the line (unfilled sockets).
Why do you suppose most motherboards only have three DIMM sockets? Because it is too much trouble to get a four-socket design working, and the ones that do have four sockets only work right if you fill them in a certain way (i.e. those closest to the CPU first, etc...).
Or why do most PCI implementations only have 4 sockets? Same reason. How about AGP or PCI-X? At 133MHz, they can only work with one socket.
Re:removable RAM? (Score:3, Interesting)
Re:removable RAM? (Score:2, Insightful)
Re:removable RAM? (Score:2)
Re:removable RAM? (Score:5, Informative)
Another reason in addition to the ones posted by other users... when are you going to upgrade the memory on your graphics card? Perhaps 12 months after you bought it? Two years?
--If you're going to stay close to the cutting edge in PC graphics, you'd be buying a new card at that point. Considering the pace at which PC graphics card technology advances, your card would be fairly dated by then anyway and you'd be looking at another one.
--If you were going to buy the extra video memory fairly soon after purchasing your card when it's still bleeding-edge, why not buy a card with that much RAM in the first place?
Of course, other posters have noted lots of good reasons as well such as the profits made by board/chip manufacturers on the extra RAM, physical RAM connection issues, etc, etc.
Re:removable RAM? (Score:2)
I think the optimal situation would be to have stage 1 and stage 2 texture memory. Stage 1 would be fairly large, perhaps 32-64MB just for textures, and stage 2 would be one or two SDRAM DIMMs. Then you could swap things in from secondary texture memory as necessary, or render them directly if they're small.
Re:removable RAM? (Score:2)
At any rate I've never found a time where I said "Wow, my GPU is just fine, but I really need more RAM!" Whenever I upgrade, it's because I need a new GPU, which also comes with more RAM. Now as apps grow they need more RAM than what a given GPU has, BUT they have also grown in other ways that simply outdo the GPU entirely, and you need a new one.
As a side note, ultra high end GPUs like some of the Wildcat series DO have upgradable RAM, BUT only extra texture memory; the main memory is still soldered on for signaling reasons.
iMac has it . . . (Score:2)
My Bondi Blue iMac has the ability to upgrade the video memory from 2MB to 6MB....
Re:removable RAM? (Score:2)
64Meg Card obsolete? (Score:5, Insightful)
If a piece of hardware is doing what you need it to do, then it is not obsolete. Not everyone plays/needs/wants the latest UT2003/Doom3 game.
Re:64Meg Card obsolete? (Score:2, Informative)
Comic-not
Re:64Meg Card obsolete? (Score:2)
But you make a good point. Just because hardware or software is old doesn't mean it gets any worse than when it was new!! (eyeing prominent desktop icon for WordPerfect 5.1)
Re:64Meg Card obsolete? (Score:2, Insightful)
Re:64Meg Card obsolete? (Score:2)
Nice to see somebody can actually think before they post. Even an AC.
Re:64Meg Card obsolete? (Score:2)
Maybe we need a new term meaning "obsolete for a particular level of usefulness".
Re:64Meg Card obsolete? (Score:2)
Cripes, the best video card I've got here is a 16mb G200 that just barely keeps up with a P3-500. But it does what I need of it, so who cares if it's not the latest and greatest??
Maybe they should have put that memory in the server (Score:2)
And even with the slashdot 128mb we still can't take out more than 25 sites a second...
enough is when winamp truly rivals LSD (Score:3, Funny)
Wouldn't that be cool. You could make your freakin trip end when you needed it to. Game Over Man. Legal too.
Where's my flying car! (Score:5, Funny)
Yeah, I know: If you're the kind of gormless muggins who thinks the only use for $2500 worth of steel, plastic and silicon is to play Doom and Quake, then this kind of crap is important to you. I, however, have real work to do and couldn't care less about this kind of garbage. <grumble> <wheeze>
Re:Where's my flying car! (Score:2)
As others have pointed out, on-board memory is much faster than AGP.
A decent PC no longer costs $2,500. If a $100 video card can turn a sub-$1,000 PC into a game machine, I'm all for it.
Re:Where's my flying car! (Score:2)
32Mb still does for me (Score:4, Insightful)
It all depends on what you want, how much you're trying to spend, and what you're doing with it. If it's a luxury but you know it, what the heck! When I bought mine it was pretty expensive, but I liked it. On the other hand, a cheaper card may work wonders these days, and you can get a GeForce 2 for maybe a fifth of what I paid for mine.
The game market is the driver for 3D graphics cards, so what's your game?
Bad joke. (Score:4, Funny)
Obsolete a year from now? (Score:4, Interesting)
Comparison Article in Sharky Extreme (Score:4, Informative)
must be a day slowing down (Score:4, Insightful)
Does the amount of memory matter as much as the bandwidth of the memory on the card? What about the GPU on the card? I had an nVidia TNT2-M64 with 32MB SDRAM which I used to play practically every game that came out at 1024x768 and 16-bit colour. Direct3D and OpenGL were perfect, on Windows and Linux systems (thank you, Loki).
Bottom line: I don't think 64MB will become obsolete in a year, even for hardcore gamers. Hardcore gaming is defined by how good you are at a game compared to other humans and how little time you spend mastering parts of it, not by a GeForce4 card with 128MB or a Parhelia with 256MB just because you can afford the big cards or have a rich parent. C'mon, there are hardcore gamers here; it's a joke that 64MB would become obsolete.
Obsolete? So what? (Score:5, Insightful)
So? A year from now, 128MB might be a low-end card, too. So in a year, buy a new card. Don't invest in tomorrow's technology today at a premium, when you can get it tomorrow at a discount. That's why smart buyers invest in modular components. When your hardware gets outdated, pluck and chuck.
I never invest in the top-end. I buy in the middle ground. Why? Because components drop from high-end to mid-range very quickly, but then stay there a long time before obsolescing to the low-end (or dead-end). And when a product drops from the high-end to the middle ground, the pricetag typically gets cut in half.
Re:Obsolete? So what? (Score:2)
When your hardware gets outdated, pluck and chuck.
That sound you hear is a million Macintosh zealots twitching and convulsing while they try and convince themselves that lack of upgradeability is a GOOD thing because it's "less confusing". :)
Re:Obsolete? So what? (Score:2)
Still, the upgrade options do kind of suck. Adaptec, Nvidia, and ATI are the only manufacturers who support the Mac. Bleh.
Re:Obsolete? So what? (Score:2)
Me either. Still holding out for the possibly mythical G5/Power4 though.
Adaptec, Nvidia, and ATI are the only manufacturers who support the Mac.
For graphics ATI and Nvidia pretty much cover the bases. Hard drives aren't Mac-specific, just drop in a Western Digital or Seagate, format it using OS X's Disk Utility, and go. What else do you need?
Re:Obsolete? So what? (Score:2)
There's a lot more than ATI and Seagate out there. I can deal with not having Matrox video cards (kinda sucks) or Tekram SCSI cards (oh well), but paying double the PC price for your hardware -- when you can even find any upgrades -- is just outrageous. It's better to just buy a brand new Mac than try to upgrade an old Mac. I'm sure they're designed that way.
Re:Obsolete? So what? (Score:2)
Actually it's the sound of them pointing and laughing because the only thing to do with year old PC components is "pluck and chuck" as the man said. The mac zealots are all watching their two year old hardware draw bids on ebay and taking more than half of their original purchase price to the bank. And people say macs are more expensive than PCs...
Re:Obsolete? So what? (Score:2)
Dude, they get more money back because they start life as twice as expensive! PCs don't hold their value as much because they start at rock-bottom pricing. Sheesh, the reality distortion field is pretty strong around here.
Re:Obsolete? So what? (Score:3, Funny)
No! No! No!
Please DO invest today in the top-end graphics cards! Spend two to three hundred $ buying the best cards on the market! (Or more!)
You see, unlike the parent poster, I think this is a positively brilliant plan for each and every one of you in the high-end gaming crowd!
Look at the benefits: State of the art technology, frame rates so fast that subliminal advertising is practical, bitBLTs that could move your entire DNA encoding in one transfer, and colour depth that makes the games so close to real life you never have to leave your chaise-lounge and encounter the real world!
And as a nice bonus for those of us in the category of the less driven to best-of-the-best-damn-the-cost, there is this:
As all of the high end gamers drive the market up, some really decent hardware becomes really cost-effective and affordable for the rest of us!
So yes, Please Please Please DO buy the BEST and Most Expensive! Drive the market as hard as it can be driven! The mild and meek will quietly thank you and buy really nice (but obviously outdated) products for a bargain basement price!
Ooops.... forgot to tag the whole post <SARCASM>
I asked a similar question on the newsgroups (Score:2)
Getting an nvidia? 128 or 64? Read this... (Score:5, Informative)
Ti-4600: Highest price, best features, 10.4 GB/s memory bandwidth, 650 MHz memory clock
Ti-4400: High price, excellent features, 8.8 GB/s memory bandwidth, 550 MHz memory clock
Ti-4200 (1): Decent price, great features, will handle BF1942 and UT2003, 64 MB limit, 8 GB/s memory bandwidth, 500 MHz Memory clock
Ti-4200 (2): High price, great features, slowest out of all 4 thanks to memory speeds, will handle BF1942 and UT2003, 128 MB limit, 7.1 GB/s memory bandwidth, 444 MHz memory clock.
Basically, on the 4200s, if you go for double the memory at almost double the price, you will see a performance hit.
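The bandwidth figures in the list above follow directly from effective (DDR) memory clock times bus width; all four GeForce4 Ti boards use a 128-bit memory bus:

```python
# Memory bandwidth = effective clock x bus width, for the GeForce4 Ti
# line quoted above (all 128-bit buses; clocks are DDR-effective MHz).
def bandwidth_gb_s(effective_mhz, bus_bits=128):
    return effective_mhz * 1e6 * (bus_bits / 8) / 1e9

for name, mhz in [("Ti 4600", 650), ("Ti 4400", 550),
                  ("Ti 4200 64MB", 500), ("Ti 4200 128MB", 444)]:
    print(f"{name}: {bandwidth_gb_s(mhz):.1f} GB/s")
```

This is why the 128MB 4200 is the slowest of the four: its 444 MHz memory yields only 7.1 GB/s against the 64MB board's 8.0 GB/s.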
After my research (urged on by PNY's box), I decided that by the time I need 128 Mhz, I'll also want the features of a chip beyond the current Nvidia line.
Of course, if you want anything that performs beyond the 4200, then why bother reading anything here in slashdot? You're getting at least 128 MB on your card
So, this weekend, I found a 64 MB Ti 4200 for $129, and it printed out a $30 rebate at the counter. Happy day, indeed. I spent the rest of the weekend playing OpenGL-boosted Doom [doomsdayhq.com] and Hexen.
BTW, if you are completely out of the know, but love gaming, do not but the MX series of cards. They are not for you.
Trollstomping...by the time I need 128 MB, not MHz (Score:2)
I also meant to say "do not buy the MX series". And a few other typos.... If I took time to get them all right, my post would have hit near the bottom of the list, thanks to the active troll population here on
G'night!
Do NOT buy MX cards (Score:2)
AMEN to that. I helped a buddy pick out a new graphics card last weekend so he could play Battlefield 1942 (which requires hardware T&L). I recommended a GF4 Ti4200 to him, but he opted for the cheaper GF4 MX 460 instead. (I tried to warn him, really I did)
Hmm... no fan on this heat sink. Oh well.. maybe that's a blessing... no moving parts to break down. I'm sure it won't overheat.. I mean they test this stuff, and if it ran too hot, of course they'd slam a fan on it. Right? Right???
Hmm... Unreal Tourney locks up after 5 minutes.
Hmm... Max Payne locks up after 1 minute.
Hmm... BF42 locks up in SECONDS.
How can they sell this shit? Doesn't it get some cursory testing?? I even UNDERclocked the damn thing to minimum speed, it still froze on absolutely everything we threw at it.. the more advanced the graphics, the faster it crashed. Anyway, we returned it and picked up a cheap Radeon 7500 which has been running like a champ. ARE YOU READING THIS, NVIDIA!?
Re:Do NOT buy MX cards (Score:2)
btw, I purposely removed the stock fan from my gf4 mx, I like my computers quiet. It works perfectly with only a decent heatsink on it...
Re:Getting an nvidia? 128 or 64? Read this... (Score:2)
The MX series of the GF4's do not support the full line of graphical features of even the GF3's let alone the GF4's - the "GF4MX" series is a very misleading name.
This is not true, however, of the GF2 MX series of cards. These are a great value (esp. the 64meg ones) for light or even casual gamers.
Personally, if you want the best possible graphics on the latest game engines all for a reasonable price, I think you should seriously consider the GF4 Ti4400. The ATI cards seem to be getting better driver support as well, so their higher end (8500s or 9000s) may warrant a good look too.
Re:Getting an nvidia? 128 or 64? Read this... (Score:2)
Edward Tufte, author of The Visual Display of Quantitative Information, would not be pleased with PNY's marketing. The bar graph on the back of the box implies that the 64 MB version is 150% faster than the 128 MB version, based on a 3D Mark score that is only 2% higher.
I chose 64 MB for the value ($100), because it seems silly to spend more on something that is already eclipsed by the Radeon 9700 Pro, not to mention the forthcoming NV30.
Re:Getting an nvidia? 128 or 64? Read this... (Score:2)
It is a nice try for backing up the "Stomps" claim.... a claim that could be (mostly) wiped out with a little bit of overclocking on the 128MB variant.
Still, like I've said, it's not worth the money in my case to spend that much more money on a 4200.
Re:Getting an nvidia? 128 or 64? Read this... (Score:2)
64MB of extra memory on the 4200 is sorta like sticking monster truck tires on a Camaro. It's wasted money.
Re:Getting an nvidia? 128 or 64? Read this... (Score:2)
Re:Ti4200 == Speed Demon (Score:2)
Well, to be quite honest.... (Score:2)
Also, the manufacturer of the card does make a difference. Just a note (look at PNY's RAM sinks. wheee doggy)
texture memory (Score:3, Insightful)
However, if you intend to play Q3 or whatever enough at superhighres, ultracolordepth, whateverwhatever, then you may want more Video RAM. Crank down the texture detail a little bit and you don't need as much, I'm sure the game is just as fun.
AGP, fast video cards and video RAM are all about games. But when you can buy a whole PS2 for the cost of an expensive video card, it makes you think a bit.
With my old 15" 1024x1024maxres monitor it doesn't matter much anyhow - phorm
Re:texture memory (Score:2)
Re:texture memory (Score:2)
This, by the way, has got to be one of the most useless features of AGP. Not only is the transfer from main memory to the card slow as molasses, it's also tying up the memory so that the CPU has to wait for the video card in order to get to RAM. So while the graphics card is rendering your scene out of main memory textures, it's also making it impossible for the CPU to get to RAM in order to do geometry processing for the next frame of animation.
Not to mention, these cards are coming out with enough video RAM to rival the main memory, so the entire point is moot anyway. Well, unless you've got 1 GB of RAM like I do..
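The "slow as molasses" claim checks out on paper. As a sketch: AGP is 32 bits wide at a 66 MHz base clock, multiplied by its transfer mode; the 8 GB/s on-card figure is a typical GeForce4 Ti value. Both are nominal peaks that ignore the CPU/RAM contention described above, which only makes AGP texturing worse:

```python
# Peak bus rates in GB/s. AGP transfers 4 bytes per clock edge at a
# 66 MHz base clock; "mode" is the 1x/2x/4x/8x multiplier.
def agp_gb_s(mode):
    return 66e6 * mode * 4 / 1e9

local_vram = 8.0  # GB/s, assumed on-card memory bandwidth (e.g. a Ti 4200)
print(f"AGP 4x: {agp_gb_s(4):.2f} GB/s")
print(f"on-card memory is ~{local_vram / agp_gb_s(4):.0f}x faster")
```

Even AGP 8x tops out near 2.1 GB/s, so any working set that spills out of video RAM pays a large penalty every frame.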
Re:texture memory (Score:2)
AGP will let you run the textures from your video card off of system RAM, but you might as well go fix a can of soup, feed the cat, etc for all the good it does you speed-wise.
Ok, so maybe that's too extreme.
Not to mention, these cards are coming out with enough video RAM to rival the main memory
Spending money on 256MB/512MB video cards seems ludicrous to me, when people I know are just scraping that much up in system RAM (I'm at 512 myself until I get a new motherboard). If a game takes more RAM than my standard PC configuration, I think it's time to dull those pixels a bit in favor of a little decency as far as memory consumption. I imagine when these cards start toting that much RAM and CPU power, they're also going to be so hot that you could probably find a case mod that allows them to boil your morning coffee...
The newest innovation, part video card, part base heater, part toaster. Don't forget the heatsink and fan! - phorm
Re:texture memory (Score:2)
Ridiculous for most users (Score:2)
I'm much more interested in why I can't pick up cheap ($20) two-head or four-head video cards, or ones with decent video out alongside VGA out.
Re:Ridiculous for most users (Score:3, Funny)
what? try smoking something else (Score:2)
if you're even questioning if the cheapest card you can find on the shelves today has enough ram, you're being silly.
video quality and number & type of outputs are all that matter. go buy a game console if you think otherwise.
--
ask not what your pocketbook can do for you but what you can do for your pocketbook.
I'm not trying to brag or anything, but.. (Score:3, Informative)
and I play alpha centauri, starcraft, freeciv, etc. And I have been using it day and night since around 1993 without it melting, and with no noisy cooling fans. Considering it cost me one buck, I think that it is not a bad bargain.
Re:I'm not trying to brag or anything, but.. (Score:2)
Re:I'm not trying to brag or anything, but.. (Score:2)
It came with two 5.25" floppies full of drivers, sadly, all of them totally useless today. However, my card was supported natively by PSpice for DOS, so I could have 1024x768 graphs of circuit simulations onscreen. Nice.
Re:I'm not trying to brag or anything, but.. (Score:2)
I really can't stand... (Score:2)
I'm impressed by people who can get by with old, "outdated" hardware. That's REAL geekdom. Anyone who can make their old shit work and is proud of it is a real geek. People who buy the newest just to buy the newest are nothing but the new yuppies. How fucking boring.
Not just for gamers (Score:2, Informative)
If other windowing systems head in the same direction (and MS indicate that Windows will, in a couple of years' time - X... who knows? Anyone have a plan there?), the advice will presumably apply equally.
Re:Not just for gamers (Score:2)
Performance is far superior to what it was in OS X 10.1.5, of course. The genie and scale effects are WAY faster. Scrolling has improved by a lot, and the new eye candy goes at a perfect speed. The only thing still bad is resizing windows, and even that is around a 4X improvement over OS X 10.1.5. Most importantly, my load average has gone down a bit. While writing this my load average has been about 0.60; back in 10.1.5 it would have been about 1.35 to 1.6.
As always, your mileage may vary and speed is subjective, but I have found it zippy. You definitely don't need the latest and greatest video card, but it can't hurt.
Sten
i don't think so. (Score:2, Insightful)
It's mostly texture memory (Score:5, Informative)
If you do not use OpenGL/Direct3D, then any RAM above, say, 8MB (you may be doing dual or triple-head at 1600x1200 32-bit or more) is completely useless.
The extra bandwidth on the cards is also useless, as only 3D operations are accelerated across the super-fast buses built into these cards.
Everything else, including 2D blits in the majority of available OpenGL/Direct3D drivers, is handled by the host CPU and involves reading from system RAM and passing that data across the AGP bus.
I am not aware of many (any?) games that can take advantage of more than 64MB of texture RAM, and while games that *may* take advantage of >64MB are on the horizon, the big news for games is vertex/pixel shaders, rather than the ability to texture map hundreds of megabytes of pixel data per frame.
There are applications that will benefit from the availability of 128MB or more texture RAM, but these are typically custom-written scientific visualisation apps; conceivably you could also use 128MB of textures to do realtime previews in your Lightwave/3DS Max/Maya/Blender scenes.
However, the actual utility of this RAM for most desktop users and even gamers is rather questionable. I don't doubt that the Radeon 9700 and the NVidia Ti4600 are fast cards, but they still rely heavily on the host CPU to achieve their stellar performance. The professional cards, by contrast, provide much more capable geometry engines and accelerate practically all of the OpenGL pipeline, whereas the consumer cards are focussed mostly on texturing and fillrate optimization, ideal for games but not necessarily optimal for other forms of 3D work.
That being said, the pace of development at Intel and AMD has made it more difficult to justify using dedicated hardware for these steps, as a 2GHz Athlon will probably out-light-and-transform dedicated OpenGL hardware, which is much more costly and low-volume to produce.
The SGI O2 is a good example of a machine that simply uses system memory to store textures, and while the SGI's graphics system is not in the same class as some of the more modern 3D boards from NVidia and 3DLabs, it is certainly sufficient to do impressive texture-mapping demos. This is really not an option on current x86 architectures, but it is a useful example of the 'other' way to handle texture memory, as it lets the system make maximum use of the resources available - i.e. when 3D graphics are not in use, the 'texture memory' is available to the apps, and vice versa.
I think it is amazing that we now have consumer cards with more texture memory than was typically available as system RAM in a mid-range 3D workstation a few years ago, but the unfortunate thing is that very, very few people can put those capabilities to real use with the current crop of system architectures, applications and games available.
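For a feel of how quickly texture RAM disappears, here's a small sketch. A full mipmap chain costs about 4/3 of the base level (1 + 1/4 + 1/16 + ... converges to 4/3), so even a modest set of large 32-bit textures exhausts a 64MB card; the texture size is illustrative:

```python
# Cost of one texture with a full mipmap chain, in bytes.
MB = 1 << 20

def mipmapped_bytes(width, height, bpp=4):
    total, w, h = 0, width, height
    while True:
        total += w * h * bpp               # this mip level
        if w == 1 and h == 1:
            break
        w, h = max(1, w // 2), max(1, h // 2)
    return total

one = mipmapped_bytes(1024, 1024)
print(f"one 1024x1024 32-bit texture: {one / MB:.2f} MB")
print(f"textures that fit in 64 MB:   {64 * MB // one}")
```

About a dozen such textures fill a 64MB card before the framebuffer is even accounted for, which is the arithmetic behind "very few people can put those capabilities to real use" cutting both ways.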
Re:It's mostly texture memory (Score:2)
Well, the article shows (as did Anand [anandtech.com], and others, in June) that Jedi Knight II can use the extra memory for a 10 - 25% increase in FPS. We've heard the Unreal Tournament 2003 will use more detailed textures than the demo, so 128 MB may help there, too.
Re:It's mostly texture memory (Score:2)
Look at QuartzGL. Next generation *2D* compositing can use much more than the standard framebuffer. Folks have posted evidence here on
Of course, this isn't a reason to go with a 128 card over a 64 necessarily. On the mac it's a reason to go with a 32MB card rather than a 16 or an 8MB card. At some point, however, they might figure out how to suck up a ton of RAM and accelerate scrolling of composited windows (which I understand they haven't done yet in QuartzGL) and all of a sudden you might want massive amounts of VRAM in your next windows machine.
How about 1GB of RAM? (its real--see msg) (Score:2)
1GB of 128-bit DDR memory at 333MHz
1GHz Motorola 7455 CPU (i.e. an apple G4 chip)
custom memory controller
SIMD vector math unit
PCI-X host interface
Yes, you can be the first on your block to have a graphics card that runs its own operating system!
Why you want 128MB. (Score:5, Insightful)
Ok Ok, so it's a Mac OS X thing, so what? How long before M$ innovates this feature into Windows? How long before it's patched into XFree86?
Think of all the cool things you can do, both for visual pleasure and UI functionality, by operating in an accelerated 3D environment while the main CPU is free to crunch away at whatever it is you have your CPU doing, thus improving overall speed. Yes, I realize the CPU still has to instruct the card what to do, but at least we're not blitting while we're trying to host web pages, for example.
For that you're going to need texture memory. Lots of texture memory. When you run out of memory on the card, the framebuffers must be stored in RAM. When those framebuffers are needed, you'll need to swap them into the card's RAM. This will cause the main CPU to stutter as it pumps a couple 8-9MB buffers through the system & PCI bus, which, needless to say, will get old fast, especially if the framebuffers get paged out to a swap file. Yuck!
Of course, maybe you should wait until the other two of the Big Three implement this in some form (I know some work has been done on a 3D window manager for X; no idea if it's meant to take advantage of acceleration, though). I've heard rumor that M$ is working on it for Windows XP(ensive) 2005 or 6 or whatever it is, and I'm sure some Linux hacker has it working on his overclocked Athlon box already. Either way, you probably want to be ready for this. Or wait and buy a card when it finally happens, when 128MB will be standard.
Since color depths will probably never exceed 48-bit (32-bit + alpha) and screen resolutions top out at 2???X???? or whatever the current highest is, it'd take quite a few windows open at once to use all that memory up. Assuming about 8 megs per window, which is admittedly above average for most windows (sans Photoshop or web browsers), you'd get about 14 or 15 windows open at once.
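Rough numbers behind the stutter and window-count claims. The 133 MB/s figure is classic 32-bit/33 MHz PCI's peak rate, an assumption on my part since the post only says "the system & PCI bus":

```python
# Cost of swapping one window's backing store over the bus, plus how many
# such windows fit on-card. Per-window size is the post's 8 MB estimate.
window_mb = 8
vram_mb = 128

pci_mb_s = 33e6 * 4 / 1e6                 # 32-bit @ 33 MHz = 132 MB/s peak
swap_ms = window_mb / pci_mb_s * 1000
print(f"swapping one window: ~{swap_ms:.0f} ms")
print(f"windows that fit on-card: {vram_mb // window_mb}")
```

Each swap costs roughly 60 ms, several whole frames at 60 Hz, which is exactly the stutter described above. The raw count is 16 windows in 128MB; the post's 14 or 15 leaves room for the desktop's own framebuffer.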
Oh well, someday, you'll be sorry your card doesn't have 512MB on-board
Tim Sweeney endorses texture caching. (Score:3, Interesting)
"This is something Carmack and I have been pushing 3D card makers to implement for a very long time. Basically it enables us to use far more textures than we currently can. You won't see immediate improvements with current games, because games always avoid using more textures than fit in video memory, otherwise you get into texture swapping and performance becomes totally unacceptable. Virtual texturing makes swapping performance acceptable, because only the blocks texels that are actually rendered are transferred to video memory, on demand.
Then video memory starts to look like a cache, and you can get away with less of it - typically you only need enough to hold the frame buffer, back buffer, and the blocks of texels that are rendered in the current scene, as opposed to all the textures in memory. So this should let IHV's include less video RAM without losing performance, and therefore faster RAM at less cost.
This does for rendering what virtual memory did for operating systems: it eliminates the hardcoded limitation on RAM (from the application's point of view.)"
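The cache behaviour Sweeney describes can be sketched as a least-recently-used pool of texture tiles. Tile granularity and the eviction policy here are illustrative, not any particular card's scheme:

```python
from collections import OrderedDict

# Minimal sketch of "video memory as a cache": texture tiles are uploaded
# on demand and the least-recently-used tile is evicted when the (small)
# on-card budget is full.
class TileCache:
    def __init__(self, capacity_tiles):
        self.capacity = capacity_tiles
        self.tiles = OrderedDict()       # tile_id -> texel block (LRU order)
        self.uploads = 0                 # host -> VRAM transfers performed

    def fetch(self, tile_id):
        if tile_id in self.tiles:        # resident: just mark it recently used
            self.tiles.move_to_end(tile_id)
        else:                            # miss: upload, evicting LRU if full
            self.uploads += 1
            if len(self.tiles) >= self.capacity:
                self.tiles.popitem(last=False)
            self.tiles[tile_id] = f"texels:{tile_id}"
        return self.tiles[tile_id]

cache = TileCache(capacity_tiles=3)
for tile in ["a", "b", "c", "a", "d", "a"]:  # "a" stays hot; "b" goes cold
    cache.fetch(tile)
print(cache.uploads, list(cache.tiles))
```

Only four uploads happen for six fetches, and the cold tile "b" is the one evicted: the application sees "all textures available" while VRAM holds just the working set, which is Sweeney's virtual-memory analogy.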
Warning: Radeons unsupported, regardless of RAM (Score:2)
DON'T buy an ATI.
The DRI team aren't allowed to implement S3 Texture Compression, so you won't be able to run UT2003 or any other games which use Texture compression.
The DRI team aren't allowed to implement ATI's HyperZ technology.
The Gatos team aren't allowed to implement TV-out.
Everywhere I turn ATI are advising that I am not allowed to use feature 'X' under Linux.
ATI are now releasing closed-source FireGL drivers for their newer Radeons. But I paid $AUS500 for my 64MB DDR VIVO Radeon only a year ago and I don't need to upgrade yet, thank you. And the FireGL drivers are slower and less stable than the DRI drivers.
ATI should provide closed-source binary-only modules for the DRI drivers to add features which are patented. But instead they force their customers to upgrade early and suffer inferior quality drivers. Not I! I am going back to bloody nVidia. And I swore I'd never do that..
sorry, but that article was ridiculous (Score:2)
Re:more more more (Score:2)
Re:RADEON 9700 (Score:2, Insightful)
And when the games come out that will emphasize the difference between the two (which is when you'll really want one anyway), the price of a 9700 or similar card will be half what it is today.
11 GB (Score:2)
Re:Tired of the rat race (Score:2, Insightful)
Well, don't listen to them then! But I think you will find a lot of people will want, or require, the latest stuff, regardless of whether the companies are trying to push it or not.
Are KDE and GNOME suddenly going to decide to render everything in OpenGL?