ATI Announces 512MB Graphics Card

Annoyed.Gamer writes "Today ATI announced their first 512MB graphics card, the X800 XL 512MB. I have some systems that don't have more than 512MB of system memory, much less that much on a graphics card. According to AnandTech, the 512MB card can't outperform its 256MB counterpart and costs 50% more. ATI's favorite, Half-Life 2, showed the only real performance increase in the entire article. Overall a disappointment, especially because ATI for some reason didn't outfit their highest-end GPUs with 512MB, only the mid-range X800 XL."
  • by Anonymous Coward on Wednesday May 04, 2005 @11:28AM (#12432119)
    And you have the nerve to submit articles to Slashdot?
  • by ackthpt ( 218170 ) * on Wednesday May 04, 2005 @11:29AM (#12432124) Homepage Journal
    The 512MB card can't outperform its 256MB counterpart and costs 50% more.

    I'd be thrilled just to have my ALL-IN-WONDER® 9800 Pro not be so damn fragile. It often comes up with bars and artifacts, and I keep rebooting until it behaves. I've tried all the driver and firmware updates and fiddled with AGP voltage settings to no avail. Graphics benchmarks all pass with flying colors (no pun intended), then the PC crashes when I start up some games. Meanwhile, a $37 graphics card (with a $10 rebate) from Circuit City is 100% reliable (except I can't watch TV on it). Time for the ATI/Nvidia race to focus on quality rather than quantity.

    • by bfischer ( 648685 ) on Wednesday May 04, 2005 @11:33AM (#12432158)
      Your mobo doesn't have a VIA chipset, does it? There is a known problem with the 9700/9800 and some VIA chipsets (and VIA and ATI keep pointing fingers at each other).
    • by Ford Prefect ( 8777 ) on Wednesday May 04, 2005 @11:36AM (#12432189) Homepage
      If it's not a motherboard chipset conflict, try pointing an extra fan at it.

      I had a GeForce4 Ti which suffered from nasty screen corruption in some games, which was fixed with the aid of a CPU fan from a 486 blowing air in the general direction of the graphics card.

      Yeah, high tech, I know. Even better - said fan was held in place with a mounting bracket from a 386's hard disk. :-)
    • I'd make sure that puppy is cooled VERY well. I had a lot of problems with mine (9800 pro w/128MB) due to heat issues. Even blew the crap out of my first one.

      I put two case fans in mine (intake at the front, exhaust at the rear) and changed my power supply to an Enermax (with yet another fan). This stopped all my problems (Asus A7N8X nForce2 board with an AMD 2800+ CPU). A friend has basically the same configuration and was having problems as well until he added more cooling.

      I've talked to techs who
    • I personally think ATI is horrible when it comes to support and especially when it comes to writing reliable drivers.

      I decided about two years ago to purchase a Radeon 9800 with 256MB when it first came out. I had to order it from overseas, it was so new. However, the graphics drivers suck. I see more artifacting than I ever have before. The same thing happens on my laptop, which has a Radeon 9600. It has to be ATI and not the games, because the artifacting happens in every graphical application.

      It's the la
    • The 9700/9800s do have serious quality control issues. I bought a 9700 close to two years ago; here's its actual history:

      1st one > started drawing artifacts on the screen about 30 seconds into a game.

      2nd one > worked great for a week, then started corrupting textures and vectors (weird protrusions would pop out of walls, etc.), and in 2D mode the fonts would look all sparkly; when you typed, the sparkles would change.

      3rd one > DOA; didn't POST at all.

      4th one > DOA; also didn't POST.
  • out of hand (Score:5, Interesting)

    by Kaamoss ( 872616 ) on Wednesday May 04, 2005 @11:29AM (#12432127) Homepage
    Things are getting somewhat out of hand as far as graphics cards go. It seems like every 4-6 months there is a new line of cards out with slightly better specs in the $500-or-so price range. I have a GeForce4 Ti 4800 128MB and it runs all of my games, including Doom 3 and Half-Life 2, just fine. I'm not sure how people even justify the cost to themselves.
    • Re:out of hand (Score:5, Insightful)

      by TrippTDF ( 513419 ) <{moc.liamg} {ta} {dnalih}> on Wednesday May 04, 2005 @11:39AM (#12432223)
      Why can't more people think like the parent?? I really, really don't get it. While I like my games to look good, I am really fine with my system as it is. Are you ready for this, everyone? It's a 1.4GHz AMD, 512MB DDR and a (gasp) GeForce4 MX 440! It ran Doom 3 and HL2 quite well. Sure, I didn't get the full effects of the games, but I still played them quite nicely performance-wise.

      On a side note, my office computer is a Dual 2.8 Ghz P4 machine, and I don't see a difference in normal day-to-day office stuff. Hell, my old 400MHz G3 laptop is just as capable as my office machine for 95% of the work that I do. All those guys out there dropping $500 every 6 months on new cards are not showing the muscle under the hood, but rather their lack of brains. Or their large quantity of spending cash, due to the fact that they still live at home. (I'm totally getting flamed for that last comment, but that's cool.)
      • Re:out of hand (Score:3, Insightful)

        by gosand ( 234100 )
        While I like my games to look good, I am really fine with my system as it is. Are you ready for this, everyone? It's a 1.4GHz AMD, 512MB DDR and a (gasp) GeForce4 MX 440! It ran Doom 3 and HL2 quite well. Sure, I didn't get the full effects of the games, but I still played them quite nicely performance-wise.

        I am sure I am way in the minority, but my Windows system is an Athlon 900 (slot), 512MB SDRAM, Win98, and an ATI AIW 32MB video card. It plays all my games fine (except the latest Ghost Recon, which

      • Unlike you, a lot of people seem to think it's the graphics that make a game good. Personally I think they're just the icing on the cake.
      • Re:out of hand (Score:5, Insightful)

        by DoubleD ( 29726 ) on Wednesday May 04, 2005 @12:04PM (#12432464)
        Shhhhh!

        We should thank these people who are willing to pay for bleeding-edge graphics performance. They enable us to pay bottom dollar for yesterday's technology that performs 90% as well.

        You do not have to understand a performance enthusiast to benefit from their pocketbook.
      • I look at large images in 'roam' mode on the screen, so that I can view a 25Kx25K image (a typical scan from a Leica scanner).

        These cards, with specialized software, stuff that image quite nicely into card memory, which allows my system to roam smoothly on a high-end display.

        Course, I don't know about *this* card, just others that have 512mb.

        In fact, I did inquire with one manufacturer about upgrading a card to 1gb... talk about eyeballs popping ;)
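        A rough check of the sizes involved (an editorial sketch in Python; the 24-bit RGB figure is an assumption, not the poster's). Raw, a scan that size far exceeds even this card, which is presumably where the specialized software's tiling or compression comes in, and why a 1GB card sounds attractive:

            full_side = 25000                # 25K x 25K scan
            bytes_per_px = 3                 # assuming 24-bit RGB, no alpha
            full_bytes = full_side ** 2 * bytes_per_px
            print("full image: ~%.0f MiB raw" % (full_bytes / 2.0 ** 20))      # ~1788 MiB
            vram = 512 * 2 ** 20
            print("fits in 512MB VRAM: %.0f%%" % (100.0 * vram / full_bytes))  # ~29%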
      • dual p4? (Score:3, Informative)

        by crabpeople ( 720852 )
        "my office computer is a Dual 2.8 Ghz P4 machine,"

        I doubt the accuracy of this statement, especially since a dual P4 machine does not exist.

        You either have:
        1) a new dual-core EE CPU (unlikely)
        2) a dual Xeon server (more unlikely)
        3) a normal P4 with Hyper-Threading (most likely)

        Just because it has two CPU bars in Task Manager does not mean you are running a dual system, my friend.

        The reason you don't see a difference between a P4 2.8 and an AMD 1.4 is because the 1.4 is an AMD :)
        put a p4 1.4 and a p4 2.
      • Re:out of hand (Score:5, Informative)

        by danila ( 69889 ) on Wednesday May 04, 2005 @02:52PM (#12434096) Homepage
        Well, both HL2 and Doom 3 had renderers for old versions of DirectX. Some people even managed to run Doom 3 on a Voodoo 2 [google.com]. Yes, any graphics card can probably handle the levels and the characters moving around. You don't need an X800 for that. But if you don't mind low-res textures, low-poly models, no bump mapping, no shadows, no dynamic lighting, then you will essentially be playing something only slightly better than Quake 2 and Half-Life. What's the point? I could probably also watch video on a 386 (an MPEG-1 in a 160x120 window), but is it the same as watching High Definition DivX [divx.com]?

        Good video cards allow better image quality in games. If you don't need better image quality, that's fine, but most people disagree with you.
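        To put rough numbers on the image-quality point (an editorial sketch in Python; the texture sizes are illustrative assumptions, not figures from either game): a square RGBA texture with a full mipmap chain costs about 4/3 of its base level, so resolution gets expensive fast.

            def texture_mib(side, bytes_per_px=4):
                base = side * side * bytes_per_px     # base mip level
                return base * 4.0 / 3 / 2 ** 20       # mipmap chain adds ~1/3

            for side in (256, 1024, 2048):
                print("%dx%d: %.1f MiB" % (side, side, texture_mib(side)))
            # 256x256: 0.3 MiB, 1024x1024: 5.3 MiB, 2048x2048: 21.3 MiB

        Quadrupling linear texture resolution costs 16x the memory, which is where an extra 256MB would eventually earn its keep.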
      • Re:out of hand (Score:3, Insightful)

        by joto ( 134244 )
        Why can't more people think like the parent?? I really, really don't get it.

        Most people do think like that. But the extreme gamers are the ones who bring the prices down for the rest of us. Please let them continue...

      • " Sure, I didn't get the full effects of the games, but I still played them quite nicely performance-wise. "

        That's like asking a kid who has been blind since birth how he feels about not seeing anything his whole life. Of course he doesn't miss what he never had. Until you experience a high-end system displaying high-end graphics, you can't speak about how good or bad your old system is. You are 'blind' to what you have never seen. How can I explain what red looks like to a blind person? How can I ex
    • For one, developers need them to anticipate graphics performance a couple of years into the future.

      Average gamers though... yeah, I don't see the point.

    • If you want to complain, do so to the people who actually buy the cards at $500 or so. The cards wouldn't be selling for that much if there wasn't an enthusiast market out there to pay such prices.
    • "I have a GeForce Ti4800 128mb and it runs all of my games, including doom3 and halflife two just fine. I'm not sure how people even justify the cost to them selves."

      First off, not all the world is video games, and in the Windows world, many high-end graphics tools correctly take advantage of extra video RAM.

      That said, more video RAM directly translates to the ability to render faster at higher resolutions. No increase in performance? Try running an MMORPG with 50+ mobs/characters in front of you in 1280x
    • I'm not sure how people even justify the cost to them selves.

      Who cares, as long as they keep doing it?

      See, I have roughly zero interest in the latest FPS game ("Jak and Daxter" on PS2 is more my style), but this irrational push for the latest and greatest means that you and I get to buy some amazing year-old hardware for next to nothing. $30 will get you an MX 440. Joe Gamer would look on it with distaste, but it's screamingly fast for the easy work I ask of it. With the upcoming OpenGL desktops, I

  • by Stevyn ( 691306 ) on Wednesday May 04, 2005 @11:29AM (#12432129)
    But it's only going to outperform in a situation that requires more memory. Having extra memory that goes unused doesn't make a difference.
  • it's funny.. laugh.. (Score:5, Interesting)

    by Fry-kun ( 619632 ) on Wednesday May 04, 2005 @11:30AM (#12432132)
    sounds like the author could use this little gem: http://kerneltrap.org/node/143 [kerneltrap.org] :)
  • Chicken and egg (Score:5, Insightful)

    by ergo98 ( 9391 ) on Wednesday May 04, 2005 @11:30AM (#12432135) Homepage Journal
    To be the master of the obvious, of course there will be no, or limited, benefit of that much memory on your video card.

    The reason is obvious: game designers target the prevalent market. Given that there are a limited number (zero) of users with 512MB of onboard video memory, few video game makers are going to require 512MB of simultaneous textures (or even 256MB, and to a degree not even 128MB). Doom 3 may, as the article states, have 500MB of textures, but I highly doubt they are all used simultaneously.

    This is just another card for people with the money to say "just in case...".
    • Re:Chicken and egg (Score:5, Informative)

      by dzym ( 544085 ) on Wednesday May 04, 2005 @11:36AM (#12432188) Homepage Journal
      Au contraire. Doom 3 in the "Ultra" mode will most definitely require 512MB of graphics card memory to run well, because it is loading that much data: not just the art, but every layer of processing that goes over the textures, like normal mapping, shaders, etc.

      Otherwise you get hitching in scenes when Doom 3 needs to quickly swap that amount of data out for another batch (opening doors, switching from rendering the level to reading the PDA, etc.), because it will be pulling data across AGP from the main system memory bank.
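      A hedged sketch of the arithmetic behind that claim (editorial, in Python): Ultra mode's distinguishing feature is skipping texture compression, and the DXT ratios below are typical figures, not measurements from the game.

          art_raw_mb = 500             # ~500MB of uncompressed texture data
          dxt5_mb = art_raw_mb / 4.0   # DXT5: ~4:1 versus raw RGBA
          print("Ultra (raw): %d MB -> wants a 512MB card" % art_raw_mb)
          print("High (DXT5): %.0f MB -> fits easily alongside buffers in 256MB" % dxt5_mb)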

      • Re:Chicken and egg (Score:2, Insightful)

        by ergo98 ( 9391 )
        Doom 3 in the "Ultra" mode will most definitely require 512MB of graphics card memory to run well

        But this article shows otherwise: there was almost no difference with 512MB of video card memory. The reason is most certainly that different subsets are used in different areas, and the hit on AGP/PCI Express to pull the active set into video card memory is momentary and largely irrelevant. If every 30 seconds you need to purge and cycle in through ultra-high-speed AGP 8x or PCI Express, that really isn't t
        • by justins ( 80659 ) on Wednesday May 04, 2005 @11:52AM (#12432360) Homepage Journal
          But this article shows otherwise - there was almost no difference having 512MB of video card memory.

          No, it does not. It shows the limitations of a benchmark which is focused solely on frames-per-second performance.

          The effects of texture thrashing will be perceptible (and distracting) at times to the human player, but they won't do much at all to affect such a benchmark.

          If every 30 seconds you need to purge and cycle in through ultra-high speed AGPx8 or PCI Express, that really isn't that great of a hit.

          It's a noticeable flaw, every 30 seconds, even if it doesn't show up when all you measure is "frames per second."
  • About as useful... (Score:4, Insightful)

    by Ahman_Ra ( 839208 ) on Wednesday May 04, 2005 @11:30AM (#12432137)
    I agree, it's about as useful as a Humvee in the city.
  • by Anonymous Coward on Wednesday May 04, 2005 @11:32AM (#12432152)
    Carmack said that you'd need a 512MB card to use the Ultra quality mode. If John Carmack is reading this: any idea why Doom 3 performed no better in Ultra mode with the 512MB card than with the 256MB card?
  • by pieterh ( 196118 ) on Wednesday May 04, 2005 @11:32AM (#12432153) Homepage
    Every time some manufacturer adds globs of memory, be it huge disks, huge memories, fat network pipes... we all go "no one will ever use that, 640K is enough for anybody"... and 24 months later we're wondering how we ever lived without it.

    Somewhere, someone is thinking of a killer application that needs 512MB of video RAM to work.

    I just can't, for the life of me, imagine what it could be...
    • Somewhere, someone is thinking of a killer application that needs 512MB of video RAM to work. I just can't, for the life of me, imagine what it could be...
      Virtual-reality porn. Duh.
    • Everquest, times 6.

      Yes, people really do play six characters at once.
    • by Space cowboy ( 13680 ) * on Wednesday May 04, 2005 @11:44AM (#12432272) Journal

      With Quartz 2D Extreme (marketing!) putting the entire rendering of the display onto the graphics card as an OpenGL surface, and lots of the display-rendering code itself being stored there as well, you can never have too much RAM - especially with the composition manager etc. all eating up gobs of it...

      Simon
    • ...if there is some way to harness the GPU as an add-on vector processor, it could get very interesting.

      Now - I'm just an end-user, sitting here working with a large project in Revit [autodesk.com], an application that brings even fast PCs stacked with RAM to their knees. It's basically a database with a graphical interface and so every little operation results in refreshes and an element of regeneration of the display. It's a good tool, with great potential, yet that lag is a total patience-killer.

      If the vector oper

    • Somewhere, someone is thinking of a killer application that needs 512MB of video RAM to work. I just can't, for the life of me, imagine what it could be...

      The GUI system of OS X relies more and more on the GPU to do drawing operations. The approach Apple seems to take is to store everything graphics-related in the GPU's memory; this includes window contents (for compositing), but also bitmaps, font glyphs, etc.

      This approach gives improved performance, but eats up a lot of graphical memory, s

    • Shouldn't it be possible to use the memory when (ab)using the graphics card for general-purpose programming? I imagine, e.g., a server (read: no graphics) for an MMORPG or some other 3D simulation should run better on a GPU than on a "normal" CPU. A whole MMORPG world could easily fill 512MB, even without textures.
  • by SunFan ( 845761 ) on Wednesday May 04, 2005 @11:32AM (#12432156)

    Just because some games don't use that other 256MB doesn't mean that no apps use it. The "pro" cards have been at 512MB to 640MB for a while now. They wouldn't even bother selling them if no one knew what to do with them.
  • by Anonymous Coward on Wednesday May 04, 2005 @11:34AM (#12432164)
    While this may not lead to huge increases in performance for gaming applications, scientific applications stand to gain tremendously from increased memory for visualizing large datasets.

    A lot of applications in biology (3D microscopy, macromolecule interactions, MRI, etc.), weather modeling, and oil field visualization, to name just a few, are hungry for more onboard video memory.
    • by xRelisH ( 647464 ) on Wednesday May 04, 2005 @02:31PM (#12433887)
      There are already cards with a lot of onboard memory made for these sorts of applications. Both NVIDIA and ATI have been making workstation-class cards for ages that come with loads of onboard memory.

      This card is supposed to be a gamer's card, as it's optimized for such things. Workstation cards are the opposite; most of them perform poorly in games even though their specs may lead one to believe otherwise.
  • This may be a simple question - but how would the amount of memory and the performance of the card relate to each other? I can understand how having a faster GPU can be a benefit, but I fail to see how having more RAM (past a certain point) is a benefit.

    Obviously if you don't have enough (e.g. 64MB of RAM when the game engine needs about 128MB) there will be a performance hit, but if the game has all the memory it needs, what would the point of having more be?
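    One way to see it (an editorial sketch in Python; the resolution and AA settings are assumptions for illustration): part of VRAM is fixed framebuffer overhead, and everything left over caches the game's textures and geometry.

        w, h, bpp = 1600, 1200, 4          # 32-bit color
        color = w * h * bpp * 2            # double-buffered color
        depth = w * h * 4                  # 24-bit depth + 8-bit stencil
        msaa = w * h * bpp * 4             # 4x multisampled color samples
        print("framebuffer: ~%.0f MiB" % ((color + depth + msaa) / 2.0 ** 20))  # ~51 MiB

    Once the whole working set fits on the card, additional RAM simply sits idle, which is exactly what the article's benchmarks show.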
    • but if the game has all the memory it needs what would the point of having more be?
      So the unused RAM can pitch in and help the other RAM go faster, silly! Now, please excuse me, I gotta go paint some racing stripes on my case to up my 3dMark score!
    • You know, this is a common misconception that I find amongst people who claim to "know a lot about computers". I ask them what kind of video card they bought, and they say "It's a 256 MB NVidia card". Me: "... Ookay.. give me some numbers or letters, something. FX? Ultra? 5600? What?" Him: "Uh, I don't know, I just bought it because it had the most RAM and only cost $150".

      These pseudo-techie people seem to have the misconception that a video card's performance these days is entirely reliant on how
  • Driver or hardware? (Score:3, Interesting)

    by Gondola ( 189182 ) on Wednesday May 04, 2005 @11:35AM (#12432184)
    Well, hopefully the performance issues are driver related and not hardware bottlenecks.

    On a somewhat unrelated note, why don't these tests ever include MMORPGs? I'd like to think that a very crowded area in EverQuest during a raid, with a lot of spell effects going off, would challenge even the highest-end video card on the market. Arguably, including some other types of games (MMORPGs specifically) would make the tests more appropriate and well-rounded than six different FPSes.

    Of course, the problem would be fair testing of what is obviously a dynamic environment. My feeling is that two identical machines attending the same event with almost identical viewpoints could be arranged. It would just require some social coordination to get the testers included in these events.
    • On a somewhat unrelated note, why don't these tests ever include MMORPGs

      As much as that would be nice, unless your MMORPG allows you to record and play back a demo, it would be impossible to make any meaningful comparisons between runs and/or different cards.
    • by chinard ( 555270 )
      EQ2 and WoW are both extremely taxing on video hardware and should be benchmarked in these tests. Let's not kid ourselves, FPS shooters are not the be-all and end-all of gaming technology. If ANYTHING is going to make use of that extra texture memory, it's going to be MMOGs, due to the fact that you are dealing with 3000+ players per server. In high-population areas it would be easy to have 80+ mobs on screen (players and NPCs), each wearing something different and setting off different spells and effects.
  • Graphic Apps (Score:3, Interesting)

    by alecks ( 473298 ) on Wednesday May 04, 2005 @11:37AM (#12432199) Homepage
    I've always wondered: would a program like Photoshop benefit from 512MB of video RAM? Or does it work some other way, not using video RAM like that? Of course, let's assume that you are working with 600+ MB PSD files...
    • Re:Graphic Apps (Score:2, Informative)

      by archen ( 447353 )
      I think VRAM wouldn't be of any help there. What Photoshop shows you is just pixels dumped to the screen in 2D. All Photoshop rendering and operations are done by the CPU, which is why Photoshop will complain about things like the resolution (too small, not enough colors) but doesn't care about your graphics card (well, it doesn't say so on the box, anyway).

      But I don't really know either =P
    • Re:Graphic Apps (Score:3, Interesting)

      by Queer Boy ( 451309 ) *
      I've always wondered: would a program like Photoshop benefit from 512MB of video RAM?

      Only if the card manufacturer writes a hardware plug-in for Photoshop to use it, and I've never seen one outside of Radius (and that was for processing, not RAM).

  • by ivan256 ( 17499 ) * on Wednesday May 04, 2005 @11:38AM (#12432214)
    Since Apple has just released software that takes advantage of huge amounts of video memory, and they have a big ATI logo on the page describing it, perhaps the release of Tiger has something to do with the announcement of this card... If that's the case, trying to figure out what this has to do with gaming performance misses the point.

    From the "Core Image" page [apple.com]:

    When a programmable GPU is present, Core Image utilizes the graphics card for image processing operations, freeing the CPU for other tasks. And if you have a high-performance card with increased video memory (VRAM), you'll find real-time responsiveness across a wide variety of operations.
  • by cliffski ( 65094 ) on Wednesday May 04, 2005 @11:38AM (#12432217) Homepage
    As someone who's worked at various big games companies, and writes his own stuff too, I really would rather someone at ATI attended a 'driver stability for dummies' course than got all macho about 16-terabyte RAM cards.
    If ATI cards were twice the speed of Nvidia's, I'd still avoid them, simply because Nvidia drivers are rock solid and unfussy, whereas the ATI driver 'environment' is usually a bug-ridden barrel of unstable bloatware that avoids standards like the plague.
    Your mileage may vary, etc., blah blah.
    • Here's the deal. You don't show up on the spreadsheets, so you "don't exist" to them.

      A sale is quantifiable on the sheet. A lost sale is an abstract concept that requires human intelligence to comprehend and take into account.

      So time and money "wasted" on coding drivers looks like a pure expense with no payback to the bean counters who think the computer has all the answers.

      This is the sort of shit that happens when you abdicate your rightful place as the thinking component of the system to a slice of ro
    • Your mileage may vary etc blah blah

      Haha, yeah. It's funny, Nvidia's stuff is the only thing to ever blue screen any of my win2k machines. The ATIs have always given me a little warning by shitting all over themselves and giving me time to close down and reboot the machine.

      I wonder if AGP drivers are a variable which affects the stability and performance of various cards differently. There are always people who swear up and down that they have better experience with one brand or the other, and they certa

  • As software makers add eye candy, the graphics board becomes more important than the CPU. The advent of graphics cards such as this suggests that perhaps the CPU and main RAM are becoming less important to system performance.

    I wonder when the GPU will supplant the CPU? I'm sure it would be much easier for ATI to add a few million transistors for some general CPU performance than for Intel/AMD/IBM to replicate a high-power GPU. The CPU-needs of the core logic of basic applications are pretty minimal and
  • by wowbagger ( 69688 ) on Wednesday May 04, 2005 @11:44AM (#12432278) Homepage Journal
    Many people have asked "What the @#$%$# would you USE 512M of Video RAM for?"

    Others have responded with various games as the killer app.

    And perhaps, today, they are the driver for this much VRAM.

    However, there is a use for a card with that much VRAM that isn't gaming - compositing window managers.

    Apple's MacOS, Microsoft's Longhorn, and *nix's various compositing WMs all operate by giving each active window its own chunk of memory sufficient to hold the whole window, and then treating that memory as a texture for a polygon and letting the 3D hardware do the final compositing onto the display. This allows for effects like translucent windows, smooth window movement, quick resizing of windows, simplified backing store (handling windows overlapping other windows), and many other useful items. These aren't just "eye candy", but things that make the system much more useful.

    Now, think about how many windows you have open right now. Think about how many windows a power user may have open. Think about how much memory that can burn to give all those windows their own space.

    512M of VRAM isn't overkill for such situations - it's barely enough, and video card vendors are starting to look at supporting virtualization for the card's memory needs (especially in PCI Express cards, where the card can have a decent amount of bandwidth to system memory).
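    A sketch of that arithmetic (editorial, in Python; the window mix is an assumed example): each composited window holds a full off-screen RGBA buffer.

        windows = [(1600, 1200)] * 4 + [(1024, 768)] * 20    # a power user's desktop
        backing = sum(w * h * 4 for w, h in windows)         # 4 bytes per pixel
        print("window backing store alone: ~%.0f MiB" % (backing / 2.0 ** 20))  # ~89 MiB

    Add double buffering, glyph and bitmap caches, and the desktop itself, and 256MB stops looking extravagant.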
    • I doubt 512MB is necessary for that unless maybe you are running two 30" Cinema displays, and even then... Basically that is 128 million pixel storage capacity at 32bpp. One frame on a 30" Cinema display would be about 4.1 million pixels, so you'd need to have 16 full screen apps before running out on a system with 256MB cards.

      My Mac mini does pretty well with 32MB on a 2048x1536 screen resolution, which is 3.2MP.
  • by Minute Work ( 749085 ) <ipirate@y[ ]o.com ['aho' in gap]> on Wednesday May 04, 2005 @11:46AM (#12432290)
    They probably had some extra RAM lying around and the marketing guys urged them to just put it in the card. That way they could claim...

    512 MEGABYTES OF MEMORY!!!
    TWICE THE MEMORY OF ANY OTHER GRAPHICS CARD OUT THERE!
    NO OTHER GRAPHICS CARD COMPARES!

    I expect ATI to come out with a sound card next month with a volume control that goes up to 11.
  • This card has EIGHT THOUSAND times as much RAM as my first computer had. (It was a Sanyo MBC-555 with 64KB RAM.)

    Truly, we live in an age of wonders.
  • by RealProgrammer ( 723725 ) on Wednesday May 04, 2005 @11:48AM (#12432307) Homepage Journal
    Sometimes software comes out which is "too slow", or "bloated", and doesn't become popular.

    For instance, the Lotus SmartSuite products were way ahead of Microsoft's Office suite when they were released, but the entire package took about 25 1.4MB floppies, I think, and then would hardly run on the typical system at the time. A couple of years ago I was looking for some clip art and loaded it from CD. On modern hardware, the package was quite pleasant to use.

    There were some bugs in SmartSuite, and Microsoft did a number on compatibility at the API level, but I think overall it was the bloatware aspect that hurt it the most. A few years later the package seems rather sprightly and compact.

    Hardware suffers from the opposite problem: the attitude of "Why would I need that much?", which hardware vendors play into by offering products with overkill specs in the wrong areas. Since they can't double processor speed, doubling the amount of RAM is the next best thing, right?

    No, the next best thing would be to offer rock-solid reliability in the hardware and drivers. Make it cheaper. Ship the source for your drivers. I want it to work, and if it doesn't work I want there to be a way to fix it.

    I know that's not how the video card business works. If you're not at the cutting edge, you're an also-ran. I just wish it weren't that way.

    Sorry for rambling. To tie it all together, I think vendors get caught up in having features their marketing department can brag about, rather than delivering products their customers can use most effectively.
  • by Shuh ( 13578 ) on Wednesday May 04, 2005 @11:51AM (#12432339) Journal
    The extra memory is to keep the CPU from having to busy itself writing graphics to backing-stores in the RAM.

    http://arstechnica.com/reviews/os/macosx-10.4.ars/14 [arstechnica.com]
  • Nvidia GeForce cards have had 512MB of RAM for a few months now, with similar caveats from reviewers that it really doesn't make a huge difference in performance.
  • by lax-goalie ( 730970 ) on Wednesday May 04, 2005 @12:02PM (#12432446)

    Once a Mac version of this is available, Core Image [apple.com] and "Quartz 2D Extreme" [apple.com] will put the extra vram to pretty good use.

    Ars has a pretty good explanation about why the extra elbow room will make a difference, namely, the GPU won't have to hit its backing cache in RAM [arstechnica.com] as often.

  • (There are other posts, probably more, along this line: for one [slashdot.org] and for another [slashdot.org].)

    I tend to agree that people will find uses for 512MB of memory in video cards. Of course, there's the infamous Gates quote about "no one will ever need more than..." (or words to that effect)...

    I have NO idea what I'd use 512MB of memory for on a video card... My first inclination might be to back up the hard drives from my first three or four PCs into the video card's memory each night ;-). But I do know I'm using technology in

  • As Apple has demonstrated, and Microsoft sometime or other will, moving GUI rendering into the graphics card is an ongoing process. So it's no surprise to see card makers introducing products which they can dangle in front of the system vendors and hope to have included in the build of a new system.

    Graphics cards aren't JUST designed for games...although it's hard to believe from what you read here.

    Of course we have games to thank for great graphics cards which have allowed for the GUI to move onto the card
  • by isecore ( 132059 ) <isecore@NOSPAM.isecore.net> on Wednesday May 04, 2005 @12:09PM (#12432512) Homepage
    ... from buying it. There are always tons of spoiled teenagers out there in tweaktown who HAVE TO HAVE TEH LATEST SH1T!

    This is the real reason why ATI even does such a weird-ass thing.

    -Mommy, my penis is shrinking!
    -Well son, let's get you a new videocard then!

    That's just my opinion and experience of dealing with teenage computer users these days.
  • so is now a good time to upgrade from my 16MB ATi Radeon All-In-Wonder?
  • I've learned (Score:4, Insightful)

    by Anonym1ty ( 534715 ) on Wednesday May 04, 2005 @12:19PM (#12432616) Homepage Journal
    I for one have learned over the past many years not to ask the question: "What would you ever need all that for?" when it comes to computers.
  • Well duh (Score:3, Insightful)

    by Jugalator ( 259273 ) on Wednesday May 04, 2005 @12:21PM (#12432641) Journal
    According to AnandTech, the 512MB card can't outperform its 256MB counterpart and costs 50% more.

    Can that have anything to do with texture resolutions not being there yet? They'll no doubt get there in the future though, so I can only see this as the first 512MB card, with more to come. I don't think it's really "bad", just a little bit ahead of its time.
    • Re:Well duh (Score:5, Insightful)

      by cbreaker ( 561297 ) on Wednesday May 04, 2005 @12:43PM (#12432904) Journal
      Phew, at least someone said it.

      I see a lot of really sour posts on this one about how it's stupid, ridiculous, how a P3 500 is just fine, how last year's game runs great..

      They say it costs twice as much but only helps one game? Then I say it's a sign of things to come. They've said this same crap about 3D video board memory for years. "You don't need 64MB!!!" "You'll never use 128!!" "256? You're stupid!"

      If the video boards all have gobs of memory, then the games will all start to have gobs of high resolution, bump mapped, great looking textures. Why is this a bad thing? When the next generation of games hits the shelves in a year or so, they'll use that video memory.

  • And... (Score:3, Funny)

    by kaoshin ( 110328 ) on Wednesday May 04, 2005 @12:33PM (#12432780)
    The result after renaming the Half-Life executable is?
  • Graphics Research (Score:3, Informative)

    by EmersonPi ( 81515 ) on Wednesday May 04, 2005 @01:05PM (#12433121)
    Actually, I know a lot of graduate students who will be really happy about this. It turns out that for a lot of research uses, 512MB of RAM would be really useful. Examples include 3D volume dataset visualization and general-purpose GPU computation (GPGPU).

    I don't know where ATI expects to make the money on this (certainly not that much $$$ in the research market), but I'm personally glad that they released this card.

    The big question in my mind now is how good the cache performance is on this new card.
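    For a feel of the volume-visualization numbers (an editorial sketch in Python; a scalar volume at an assumed 2 bytes per voxel):

        for n in (256, 512, 1024):
            mib = n ** 3 * 2 / 2.0 ** 20
            print("%4d^3 volume: %6.0f MiB" % (n, mib))
        # 256^3: 32 MiB, 512^3: 256 MiB, 1024^3: 2048 MiB

    A single 512^3 dataset already fills half the card before any transfer-function or gradient textures are added.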
  • Usefulness... (Score:4, Insightful)

    by paithuk ( 766069 ) on Wednesday May 04, 2005 @01:26PM (#12433322) Homepage
    Although at first sight this card may have no use, think about Apple's Quartz technology, which uses the graphics card's video memory to hold all viewable window elements so that they can be rendered quickly and efficiently without requiring that data be paged in and out of main memory. With the new Longhorn graphics technology being announced this week, it's probably an emerging market that ATI wants to take full advantage of. Plus, the scientific applications stand to benefit (but I noticed somebody already mentioned this).
