ATI Releases Five New Radeons

An anonymous reader writes "Eager to retake the performance crown from NVIDIA, ATI has announced five new releases for their Radeon product line. The latest card features 512MB of GDDR4 memory running at 1000MHz, making it currently the fastest single-GPU VGA card out there. From the review: 'ATI has proven they are a leader and not a follower with the X1950 XTX. ATI has released the world's first consumer 3D graphics card with GDDR4 memory, clocked at the highest ever stock speed, that chews through games when it comes to high definition gaming. Memory bandwidth looks to once again be the defining factor in 3D performance. With a re-designed heatsink/fan unit, faster memory, and a lowered price, the ATI Radeon X1950 XTX and CrossFire Edition are both serious 3D gaming video cards for the [H]ardcore that offer some value over NVIDIA's more expensive 7950 GX2. ATI's CrossFire dual GPU gaming platform looks to have just grown up.'"
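A quick sanity check on the headline spec: memory bandwidth is just clock rate x transfers per clock x bus width. Here's a minimal sketch in Python, assuming the 256-bit memory bus that period spec sheets list for the X1950 XTX (the bus width isn't stated in the summary):

mem_clock_hz = 1000e6       # 1000 MHz GDDR4 base clock, per the summary
ddr_factor = 2              # GDDR4 transfers data on both clock edges
bus_width_bytes = 256 // 8  # assumed 256-bit memory bus

bandwidth = mem_clock_hz * ddr_factor * bus_width_bytes
print(f"{bandwidth / 1e9:.0f} GB/s")  # -> 64 GB/s

Against the X1900 XTX's commonly quoted 775 MHz memory clock on the same bus (about 49.6 GB/s), that works out to roughly 29% more bandwidth, which lines up with the "almost 30% increase" figure cited in a review further down the thread.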
  • Screw ATI (Score:5, Funny)

    by Anonymous Coward on Wednesday August 23, 2006 @10:29AM (#15962344)
What I want to know is where can I get the world's fastest accelerated EGA graphics card? I want to play King's Quest II.
    • Re: (Score:3, Funny)

      I've got a crate full of Diamond Stealth 3d 2000 cards to sell if you are interested.

      They have 2MB of EDO memory (upgradable to 4) for the ultimate experience.

      It runs on PCI. Express graphics are guaranteed.
      • Re: (Score:2, Interesting)

        Why don't you rig all of them up in one massive SLI configuration? Pointless, no doubt, but cool...
      • It's got PCI!? Wow!

You know, the RISC processor is going to change everything.

      • by LilGuy ( 150110 )
I used to have one of those. I 'bout shit myself when my dad brought it home one day. Looking at the box, I damn near pissed myself. I thought, WOW! Look at those graphics! Then we installed it and it didn't really do shit. The games wouldn't run, and all my old games looked the same... yeah, that was a total bummer.
Anyone got any 3dfx cards in their original boxes? I'll take them off your hands (of course I'll pay). I've got quite a collection of them going now...
    • Re: (Score:3, Funny)

      by Ed Avis ( 5917 )
      You're forgetting the naming conventions of gaming hardware reviews. Remember that a video card is called a VGA, a hard disk is called a hard drive, memory is called sticks, and most importantly a computer is always called a rig. (A fast computer is a 'sweet rig'.)
      • Remember that a video card is called a VGA, a hard disk is called a hard drive,

IBM [ibm.com] seems to disagree with you. They have both words on that page to make Google happy, but refer to them directly as hard drives ("Troubleshoot my hard drive", "your hard drive", "SCSI hard drive", etc.). Google puts the drive:disk ratio at 167,000,000 [google.com] to 68,300,000 [google.com].

        In other words, don't make a campaign out of "hard disk" being correct unless you enjoy being rebuffed.

    • Re: (Score:3, Funny)

      by briggsb ( 217215 )
      Who needs EGA? I'm still waiting for this card [bbspot.com] with text game acceleration.
  • Drivers? (Score:5, Insightful)

    by Recovering Hater ( 833107 ) on Wednesday August 23, 2006 @10:30AM (#15962351)
I just can't help but wonder what their Linux driver support will be like. If it is the same or worse, then honestly it only means that ATI has produced five more cards to ignore. Harsh? Maybe, but there is some truth in that statement.
    • I just can't help but wonder what their Linux driver support will be like.

      Most of us don't need these cards. These are for hardcore gamers. As in shoot-em-ups that will only run on Windows, not real games like nethack. ATI won't be very concerned about the lost market.

      • by spitzak ( 4019 )
        There are "hard core gamers" who want to be able to reboot their Windows machine into Linux, so lack of a Linux driver certainly is a factor.
    • Re:Drivers? (Score:4, Interesting)

      by nonmaskable ( 452595 ) on Wednesday August 23, 2006 @11:27AM (#15962790)
      Recently, I got a laptop with ATI video. The 8.25 drivers worked perfectly, but all the drivers since have died with:

      fglrx(0) PreInitDAL failed
      fglrx(0) R200PreInit failed

The total failure of ATI quality control to regression-test even recent cards like the one in my laptop is astounding. I'll never make the mistake of buying ATI products again.
    • Re: (Score:3, Informative)

      by jdgreen7 ( 524066 )
      It appears that ATI is making an effort to improve their Linux driver support:

      ATI's Linux Driver Page [ati.com].

They just released their 8.28.8 drivers a couple of days ago, and they had released the previous version about 3 weeks before that. So there are some changes being made, at least. Also, with the AMD merger, they are considering opening up the source code to at least portions of the driver, so I personally expect ATI to become a serious player in the Linux space in the not-too-distant future.

      • There have been many builds of the fglrx driver released, and all of them have been pure shite. A new version of the same old ATI driver is no more compelling than a new version of the same old Windows NT.
    • I just can't help but wonder what their Linux driver support will be like. If it is the same or worse then honestly, it only means that ATI has produced five more cards to ignore. Harsh? Maybe, but there is some truth in that statement.

      This comment would be equally valid if you s/Linux/Windows/. ATI has consistently displayed an utter inability to write drivers since, well, time immemorial.

The same goes for Windows gamers too. Any serious gamer just chooses Nvidia; you know that with Nvidia your games will all render properly and the drivers will always work.

      I don't see how ATI has such high quality hardware and such crappy drivers, on two platforms, year after year.
  • by spyrochaete ( 707033 ) on Wednesday August 23, 2006 @10:30AM (#15962361) Homepage Journal
    This does indeed look like a fantastic video card, but I found this comment from TFA a little funny:

    ATI has proven they are a leader and not a follower with the X1950 XTX

    No, X1950 XTX, ATI's top of the line card, sounds nothing like Nvidia's top model, 7950GTX.
    • by TheDreadSlashdotterD ( 966361 ) on Wednesday August 23, 2006 @11:13AM (#15962673) Homepage
      You're right, ATI has more X for XTREME!!! It has to be BETXERXXXXER!!!
    • ATI is clearly the leader in this race. Their top of the line card has THREE TIMES the number of X's in its name as the also-ran nVidia flagship card.
    • I once described the video card I wanted to my parents by saying "the one with all the letters in the name."

      I love my X850 XT PE.
Nvidia started using the "XT" moniker (6800 XT) after ATI used it for years.

ATI started using the "GT" moniker (x800 GT, x1900 GT), and even extended the moniker to "GTO", after the incredible success of Nvidia's 6600 and 6800 GT.

      They've been doing this sort of thing for years.

      And while ATI looks like they ripped off the "950" name with the x1950 XTX, this one again cuts both ways:

      ATI was the first to use "50" increments in their product naming. Witness the Radeon 9250 and 9550, which were released a few ye
      • Re: (Score:3, Informative)

        by Gnavpot ( 708731 )

Nvidia started using the "XT" moniker (6800 XT) after ATI used it for years.

        [...]

        Since the release of the GeForce 4 series, all Nvidia product numbers have been divisible by 100. The GeForce 7950 GX2 is the first and only exception to this rule, and is obviously inspired by ATI's recent naming schemes.

        Both statements are wrong, though the correct answer will not change anything in the Nvidia vs. ATI naming race:
        Nvidia 5900XT and 5600XT are approx. ½ year older than Nvidia 6800XT.
        Nvidia 5950 is approx

      • The GeForce 7950 GX2 is the first and only exception to this rule, and is obviously inspired by ATI's recent naming schemes.

There's certainly a lot of borrowing going on in tech companies' marketing departments. It makes sense to an extent, so that people know what they're buying and how it stacks up vs. competitors' products. The whole Athlon XP line (another stolen name) was numbered to resemble the GHz rate of Pentium equivalents.

        You may be right about Nvidia stealing the "50" from ATI, but I thou
  • by peterdaly ( 123554 ) * <{petedaly} {at} {ix.netcom.com}> on Wednesday August 23, 2006 @10:33AM (#15962381)
These cards are nice... for Windows users.

    What the new AMD-led ATI can do to help show leadership is to release the information (or even drivers) needed for Linux to take full advantage of their cards' capabilities.

    ATI has seemed unwilling to do this. I hope that changes under the new AMD administration.

    What I've heard in the Linux community is to stay away from anything ATI if you plan to use it with Linux. Too bad really, because they really do make nice cards.
    • by Whiney Mac Fanboy ( 963289 ) * <whineymacfanboy@gmail.com> on Wednesday August 23, 2006 @10:49AM (#15962499) Homepage Journal
Well said - but to expand on that a little:

      What I've heard in the Linux community is to stay away from anything ATI if you plan to use it with Linux.

      The same applies to nvidia. Try Intel or Unichrome cards. Support companies that support FOSS.

Oh, and for the people who'll inevitably reply with the "they can't release the source, because of 3rd party IP" line (I am tired of that particular whine) - why can't ATI/Nvidia release the source for the code they do have IP rights over? (And allow the OSS community to fill in the blanks.)
      • Re: (Score:2, Informative)

        by genooma ( 856335 )
Well, Nvidia might not release their drivers as Free Software, but at least their drivers, unlike ATI's, work.
      • by Otter ( 3800 )
I'm not (to put it mildly) a graphics card buff, and had never heard of "Unichrome" -- that has got to be the most uninviting name for a GPU ever. It sounds like it should run the green-on-black monitor of an Apple ][ or VT100. Compare to "ATI Radeon X1950 XTX" (which itself could use another X or two).

        Anyway, if you have political issues with Nvidia that's one thing, but otherwise they've run fine under Linux for years.

        • by Whiney Mac Fanboy ( 963289 ) * <whineymacfanboy@gmail.com> on Wednesday August 23, 2006 @11:33AM (#15962847) Homepage Journal
and had never heard of "Unichrome" -- that has got to be the most uninviting name for a GPU ever.

They're very low-end (used in cheap laptops, VIA's embedded line, etc.), so if you're a Windows-gamer-fanboy, you're not going to have heard of them. (And if you judge a card by its name, you have bigger problems than that.)

          Anyway, if you have political issues with Nvidia that's one thing, but otherwise they've run fine under Linux for years.

No, they haven't. They run better than ATI's offerings, but there are a number of things that don't work correctly. (TwinView doesn't support multiple monitors with different resolutions, framebuffer/X switching support is poor, you can't report (Linux) bugs to the kernel team, you're allowing an unaudited binary blob to run in kernel-land; I can go on and on.)

If Nvidia & ATI were the only choices, then fine, I'd recommend Nvidia's buggy binary blob over ATI's buggier binary blob. But they're not. Two companies have offered the specs & a reference GPL'd driver - I recommend them, and I think other supporters of FOSS should do likewise.

Calling a recommendation of a driver that actually supports Linux over one that doesn't 'political' is.... well - let's say I suspect you have a political agenda of your own.
          • by Otter ( 3800 )
Calling a recommendation of a driver that actually supports Linux over one that doesn't 'political' is.... well - let's say I suspect you have a political agenda of your own.

            Huh? Using "a very low-end" card over one with much more Linux functionality, even if not 100% of its Windows functionality, because of their licensing terms -- what is that if not "political"? Not that there's anything wrong with putting your money behind your politics.

            • Huh? Using "a very low-end" card over one with much more Linux functionality,

1) I listed two cards (not just the low-end one)
              2) A bug-free but slow/low-featured card is superior to a buggy fast/full-featured card (IMO)
              3) I'm sick of people dismissing support for cards without kernel-mode binary blobs as 'political', when it's practical.
              4) As other people in this thread have pointed out, Nvidia's Linux drivers are only useful for Linux - think of the AtheOS, *BSD (minus FreeBSD), etc., etc. users out there. Pleas
          • Re: (Score:2, Informative)

            by gdamore ( 933478 )
            There is another major point that the Linux fans are missing here.

            Having source makes the device useful for something _other_ than Linux. Like NetBSD.

            ATI have historically been easier to work with than Nvidia in this regard. One can get source for some ATI products. And they are willing to work under NDA. I've even produced a radeonfb for NetBSD using information that was under NDA (and had never been released outside ATI before), and ATI let me release the drivers back to the community under a BSD

          • ...you can't report (linux) bugs to the kernel team,...

No more than you can report kernel bugs for any kernel issue where you have some random hunk of code involved, regardless of whether you have the source available to you or not. The kernel developers aren't going to help you unless they feel you are running their code and their code only.

            you're allowing an unaudited binary blob to run in kernelland,

            Unless you, personally, have audited the code you don't know if it has been audited or not. If y
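            For the curious, the "their code only" policy is mechanized as the kernel taint flag. A minimal sketch, assuming a Linux box: bit 0 of /proc/sys/kernel/tainted is set the moment a proprietary module such as fglrx or nvidia.ko loads, and upstream developers check it before touching a bug report.

            # Report whether this kernel is tainted by a proprietary module.
            with open("/proc/sys/kernel/tainted") as f:
                taint = int(f.read())

            if taint & 1:  # bit 0 ('P'): a non-GPL module has been loaded
                print("Tainted: upstream will likely bounce your bug report.")
            else:
                print("No proprietary modules loaded.")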

      • by gid13 ( 620803 )
        Furthermore, who does this 3rd party IP belong to, and can ATI/Nvidia pressure them a bit to release it?
      • What I've heard in the Linux community is to stay away from anything ATI if you plan to use it with Linux.

        The same applies to nvidia. Try Intel or Unichrome cards. Support companies that support FOSS. Oh, and for the people who'll inevitably reply with the "they cant release the source, because of 3rd party IP" (I am tired of that particular whine) - why can't ATI/Nvidia release the source for the code they do have IP rights over? (and allow the OSS community to fill in the blanks).

        Oh, and for the p

"you can't use those cards, because they're not releasing the source" (I am tired of that particular whine) - why can't people be left alone to use hardware that works? (and stop telling people what to do.)

          Those of us who want the source (or, at the very least, the specs so we can write our own drivers) want it because we want our hardware to work. The only thing I have ever seen crash FreeBSD was the nVidia binary driver. The vast majority of Linux kernel panics I have seen have been as a result of the nVidia binary drivers. T

        • "you can't use those cards, because they're not releasing the source"

          Whoa! Chill, I wasn't using the imperative, just suggesting people use FOSS friendly cards to support FOSS.

          It would be more like I'm saying:

          "if you support FOSS you shouldn't use those cards, because they're not releasing the source"

As to the rest of your rant: if you're a gamer, sure, you're probably dual-booting to Windows and perhaps an nvidia/ati solution is better for you. But as far as choosing ATI/Nvidia for pure Linux? Not many peo
      • You still need ATI and NVIDIA for gaming in Linux with 3D support.
    • I'm sure they will completely dominate if they just focus on that less than 2% of the market that runs Linux. And out of that 2%, how many need a top of the line 3d card?

      I'm sorry, but it's a waste of time for them.

  • Drivers (Score:5, Insightful)

    by achacha ( 139424 ) on Wednesday August 23, 2006 @10:33AM (#15962385) Homepage
They keep releasing new cards, and the drivers that support them keep slowing down and mangling the performance of the previous cards (I currently had to uninstall the 6.8 Catalyst and use 6.6 because the FPS rate got cut in half due to a conflict with FSAA/Bloom effects, and the 6.7 driver refuses to install because it thinks the card is not an ATI, while 6.6 and 6.8 do). This has been their history. I have been buying ATI cards since the mid 90s (glutton for punishment, I suppose), and every time a new card comes out and I install the new drivers, my slightly older card runs slower, or the drivers crash, or effects are blurred. I think they really need to beef up their driver development to keep up with the constant release of new hardware; what good is a new card if you are worried the drivers will be problematic?

    While NVIDIA is not perfect, the two cards I have from them work perfectly with their drivers. ATI may release better-featured cards, but their drivers leave something to be desired.

I just built my own system, and got an ATI Radeon card... Well, first of all, I tried downloading their updated Catalyst drivers, and it is 100 MB. WTF is that? This is not an operating system, it's a phreaking driver, for pete's sake!

Next... I noticed that text on my LG LCD monitor (20-inch widescreen) was of really poor quality. I even installed ClearType from Microsoft; it didn't help much. Started thinking it was my monitor, but then hooked it up to my laptop that has NVidia. Wow! What a difference! Even wit
      • by Moraelin ( 679338 ) on Wednesday August 23, 2006 @12:27PM (#15963325) Journal
Next... I noticed that text on my LG LCD monitor (20-inch widescreen) was of really poor quality. I even installed ClearType from Microsoft; it didn't help much. Started thinking it was my monitor, but then hooked it up to my laptop that has NVidia. Wow! What a difference! Even without ClearType, the text was so much better.


        I can tell you exactly what happens there, because I've put some time into diagnosing the exact same problem on my Acer AL2032W monitor. And it still pisses me off that the problem _still_ isn't fixed, in spite of being known for ages.

The problem starts like this: some cretin at ATI decided that, if it detects a DVI cable, it should automatically trust the highest resolution reported by the monitor, and, here's the idiotic part, never allow the user or the monitor drivers to override it. So if it reads 1600x1200 as the highest supported resolution, any other resolution you choose will automatically be either scaled to 1600x1200 or centered in a 1600x1200 image. There is no option that lets me say, basically, "fuck off and just send the image as it is to the monitor."

        Why is that an idiotic idea? Well, here's why: because some monitors support resolutions higher than their native one. E.g., there are a ton of 1280x1024 monitors which report that they also support 1600x1200. And the AL2032W has a 1680x1050 native resolution but _also_ supports 1600x1200. They just down-scale that to their native resolution.

        So think of the following scenario: let's say your monitor is a 1280x1024 one, but affected by the abovementioned quirk, and you set your desktop resolution to 1280x1024. It should be crystal clear, right? Well, on an Nvidia card it would be, but on an ATI card it goes wrong.

        What ATI will do there is scale your 1280x1024 image to 1600x1200 first, before sending it to the monitor. Which makes it all fuzzy already. But then your monitor has received an image which doesn't fit its native resolution. So it will rescale this 1600x1200 image back to 1280x1024. This doesn't re-create the original crystal-clear image, but adds _more_ fuzziness to it.
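        To make the double-rescale damage concrete, here is a toy 1-D sketch in Python (made-up sizes, plain linear interpolation, not ATI's actual scaler): a crisp black/white edge resampled 1280 -> 1600 -> 1280 comes back with grey fringes, which on desktop text reads as fuzz.

        def resample(signal, new_len):
            # Linear-interpolation resample; no external libraries needed.
            old_len = len(signal)
            out = []
            for i in range(new_len):
                pos = i * (old_len - 1) / (new_len - 1)
                lo, frac = int(pos), pos - int(pos)
                hi = min(lo + 1, old_len - 1)
                out.append(signal[lo] * (1 - frac) + signal[hi] * frac)
            return out

        edge = [0.0] * 640 + [1.0] * 640  # a sharp edge, "1280 pixels" wide
        round_trip = resample(resample(edge, 1600), 1280)
        print([round(v, 2) for v in round_trip[637:643]])  # greys appear: the edge is smeared

        Each resample smears information across neighbouring samples, so the second pass cannot undo the first; the step never comes back clean.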

        Yes, I know what you mean by "really poor quality" there, and even that is mildly put. It's piss-poor quality. It was so fuzzy on my monitor that it gave me headaches in less than an hour.

        And the really idiotic and annoying part is that it doesn't even allow you to override that. Once it's decided 1600x1200, that's it. Whoever designed it had the arrogance to decide that surely the user is too stupid to know such technical details, so let's not trust the user with the power to set something else. I find that not only utterly idiotic (since we just saw that it can guess wrong), but outright offensive.

        Anyway, there are two solutions to this:

        1. Download the Omega drivers. Strangely enough those are smart enough to read the native resolution, not the maximum supported one.

        2. Use a VGA cable. On VGA it does allow you to set your maximum resolution and frequency yourself.

(This also goes in case someone wants to jump in with the usual "just set the resolution in the control centre" advice. Trust me: it works over a VGA cable, but over DVI it doesn't.)

Personally I find both solutions pretty annoying. Number 1 involves installing a non-official, unsupported driver. (And if you know how drivers run in kernel mode in Windows, you'll understand what's scary about running non-official drivers just downloaded off some web site.) Number 2 basically involves throwing the whole "digital" part out the window, and using an LCD monitor as a glorified analog CRT with larger pixels.
        • by geekoid ( 135745 )
          I would never buy an ATI card, but I was given one.

I loaded the Omega driver and it works much better, and I've never had any issues with the drivers as far as viruses, spyware, crashes, etc...

    • I've had several ATI cards and haven't had any problems with any of them.
    • by Renraku ( 518261 )
      Feh. They're probably doing it on purpose.

Imagine if car companies were also the gas companies and could modify the gasoline to make their older models run worse/less efficiently/slower?

      You wouldn't be able to get a car to work beyond 3 years.
  • by Brit_in_the_USA ( 936704 ) on Wednesday August 23, 2006 @10:36AM (#15962398)
    It is generally predicted that DirectX 10 cards will be with us in a few months (holiday season or just after).

    Are sales declining because of anticipation of this?
    Will ATI and Nvidia be able to shift large quantities of cards over the next few months, with people like myself waiting for the next (significant) generation?

Aside: Yes, I am aware that these cards will still pack a punch in DirectX 10 games and will not be obsolete overnight, but the unified shader/vertex architecture of DirectX 10 seems to be a big shift in card design and will offer a lot of features to game designers that aren't efficiently doable on the older hardware, so you may be stuck with a worse-looking rendering of a new game.
Well, it depends on your upgrade cycle. If you are the kind of person who upgrades when each new generation of cards comes out, then go for it; when the DX 10 cards are released, just upgrade again. Also, maybe you are buying a new computer now but don't plan on upgrading to Vista for some time, maybe you don't game (there are other uses for 3D cards), or maybe you just like giving new OSes a year before you jump over. Again, these'd be good.

      Also there's always the possibility that they are DX 10 cards. I don't know
  • by Lev13than ( 581686 ) on Wednesday August 23, 2006 @10:36AM (#15962404) Homepage
    Please note that the term "Radeon [wikipedia.org]" is not an official definition. The term was recently proposed by the International Videocard Union [wikipedia.org] to define a "Graphics card which costs less than $200 American dollars to purchase and whose shape is more highly inclined and angular than a traditional card".
    This definition causes all sorts of problems, such as how to define dual-card setups and what happens when a Radeon is attached to a daughter card rather than a motherboard. Videostronomers are currently divided between those who favour the term "Radeon" and those who argue that we should stick with the current definition favoured by consumers, which is "the weird square-ish blue plug at the back of my Dell".
  • by RShizzle ( 983535 ) on Wednesday August 23, 2006 @10:37AM (#15962411) Homepage
I don't get how appending more X's, T's and the occasional G to ever-increasing numbers helps me understand the capabilities of a card... except that it's *Awesome* and I HAVE to buy it!
    • by Chandon Seldon ( 43083 ) on Wednesday August 23, 2006 @11:10AM (#15962655) Homepage

They're model numbers. The only requirement is that they be different between different cards, so customers can see that different products are different. Beyond that, marketing can do whatever they want with them - it doesn't really matter.

      Surprisingly, the marketing departments at ATI and Nvidia have settled on a highly structured and informative system for model numbers (for something generated by marketing departments).

      Here's how it works: Take the "X1950 XTX". That splits into 4 segments: "X1" is the generation, "9" is the class, "50" is the revision, and "XTX" is the specific model. Nvidia uses exactly the same system: for the 7950 GX2, we have generation 7, class 9, revision 50, specific model GX2.

      Generation usually changes yearly. Class splits into (generally): 0-3 is low-end, 5-7 is mid-range, and 8-9 is high-end. The revision number allows more recent products to have higher numbers than older products. Generally for ATI "Pro"... Now, that still doesn't let you determine which card is "better" based on the model number, but model numbers never do that. Which is better, an "AMD Opteron 165" or an "AMD Athlon64 FX-50"?
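      The scheme is regular enough to put in a few lines of code. A sketch, assuming the field layout described above (the commenter's reverse-engineered convention, not anything official from ATI or Nvidia):

      import re

      def parse_model(name):
          # Split per the scheme above: generation / class / revision / model.
          m = re.match(r"(X?\d)(\d)(\d\d)\s*(\w*)", name)
          if not m:
              raise ValueError(f"doesn't fit the scheme: {name!r}")
          gen, klass, rev, model = m.groups()
          k = int(klass)
          segment = "low-end" if k <= 3 else "mid-range" if k <= 7 else "high-end"
          return {"generation": gen, "class": f"{klass} ({segment})",
                  "revision": rev, "model": model or "(base)"}

      print(parse_model("X1950 XTX"))  # generation X1, class 9 (high-end), revision 50, model XTX
      print(parse_model("7950 GX2"))   # generation 7,  class 9 (high-end), revision 50, model GX2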

  • by Vigile ( 99919 ) on Wednesday August 23, 2006 @10:38AM (#15962421)
http://www.pcper.com/article.php?type=expert&aid=287 [pcper.com]

Here the review talks up the single X1950 XTX card but finds the CrossFire platform from ATI still very under-developed.
  • ATI is Evil (Score:4, Insightful)

    by SleeknStealthy ( 746853 ) on Wednesday August 23, 2006 @10:39AM (#15962426)

Although it was touched on a little above, I couldn't pass up the opportunity to rant about ATI. ATI does not support more than 2-3 generations of cards; their driver development quickly stops, and their Catalyst drivers are ridiculously huge.

    On the Linux side of things their support is so freaking lame it is ridiculous. Reverse-engineered open source drivers are 10X better than the drivers developed by ATI. ATI is pathetic, and any company that releases such terrible software under their name does not have very high standards and cannot be trusted. I had a Radeon 8500 and I will never recommend or waste my money on such pathetic ATI junk again.

  • Comment removed (Score:3, Interesting)

    by account_deleted ( 4530225 ) on Wednesday August 23, 2006 @10:40AM (#15962434)
    Comment removed based on user account deletion
  • HEXUS.review (Score:3, Informative)

    by unts ( 754160 ) on Wednesday August 23, 2006 @10:48AM (#15962486) Journal
    Here's HEXUS's review [hexus.net].

    Loads more reviews out there too. Anyone feel like making a list?
  • About time (Score:3, Funny)

    by salad_fingers ( 908746 ) on Wednesday August 23, 2006 @10:50AM (#15962506)
I can finally play Wolf3d at 200x AA. Took long enough...
This just makes the market that much more confusing for the customer. I'm not arguing it, I'm just saying that this type of marketing screws over the not-so-technically-inclined user. Releasing multiple cards at once is confusing to customers. I personally don't trust any number or letter string following Radeon or GeForce #. A while back I got tricked by the GeForce 4 MX: I bought the card thinking it was just a step under the GeForce 4 Ti. Boy was I wrong; that card didn't even outperform the
  • by thatguywhoiam ( 524290 ) on Wednesday August 23, 2006 @11:01AM (#15962592)
    ATI Radeon X1950 XTX and CrossFire Edition

    Is there some kind of rule that says we can only use letters like X, N, R, and words like CrossFire, to denote 'cool' products mainly aimed at men?

    Just once I'd like to see an ATI Shiny B001 LALA and FluffyPants Edition. Just to shake things up.

  • by Fallen Kell ( 165468 ) on Wednesday August 23, 2006 @11:07AM (#15962633)
I mean, come on, people. The whole point of benchmarks is to show how well the different cards perform with the same settings!!!! The numbers they post for all the cards have different configurations and settings behind them. Show me ALL the cards running the SAME EXACT settings and give me their results. Don't just arbitrarily show what you consider as "playable" speeds and then show the game settings used to produce those speeds. How in the world are these guys staying afloat when making horrendous reviews like this?
    • by Sycraft-fu ( 314770 ) on Wednesday August 23, 2006 @12:41PM (#15963439)
Well, they should have used better controls, but this kind of thing isn't useless. I don't really give a shit about balls-to-the-wall FPS. I personally find that anything above about 60fps doesn't really look any smoother to me, and yes, I do have a monitor that supports higher refresh rates. From 30-60 is acceptable, but much below 30 starts to annoy me. So my question for hardware is: in the range of 30-60fps, what kind of quality can I get on a given game? Can I crank it to 1600x1200? Can I kick up FSAA? Can I turn on all the shiny options?

That's what's really relevant. I don't care if card X gets 200fps in 1024x768 mode and card Y gets 300fps. Both are way above my "give a shit" boundary. What I want to know is at what level do they start to drop to the point where I'll notice.
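      That "where does it drop below my boundary" question is easy to pose as a filter over benchmark data. A sketch with entirely made-up numbers, just to show the shape of the comparison a reviewer could publish:

      # (width, FSAA, aniso, avg_fps) -- hypothetical benchmark rows for one card
      results = [
          (1024, 0, 0, 142),
          (1280, 2, 4, 88),
          (1600, 4, 8, 47),
          (1600, 6, 16, 24),
      ]

      FLOOR = 30  # below this it starts to annoy; above ~60 the eye stops caring

      playable = [r for r in results if r[3] >= FLOOR]
      best = max(playable, key=lambda r: (r[0], r[1], r[2]))
      print("Max comfortable settings:", best)  # -> (1600, 4, 8, 47)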
    • How in the world are these guys staying afloat when making horrendous reviews like this?

      They are floating on free graphic cards.
  • by Ginger Unicorn ( 952287 ) on Wednesday August 23, 2006 @11:09AM (#15962651)
I'm serious, not trolling: is anyone on here any more than utterly ambivalent to the fact that there is yet another slight incremental advancement in the power of video cards?

    I am interested to hear from anyone who is genuinely excited by this news. I'm also interested in hearing from someone who would pay £400 to increase their rendering power by 15%.

    (Yes, I know that only applies to people who already have the current fastest video card, but I'd love to know if anyone is actually rich and bored enough to replace bleeding edge with bleeding edge at every opportunity.)
    • by King_TJ ( 85913 )
Well, the *only* reason this gives me any "good vibes" at all is knowing it will help push prices down on cards that are more within the price range I'd shop in.

But since I've almost completely switched over to Macs these days, it matters even less than before. (With the new Mac Pros using EFI instead of a BIOS, we Mac users are now stuck hoping ATI and nVidia will go to the trouble of releasing custom Mac versions of some of their cards that work with EFI. And so far, the *only* one shipping today is a crap
Seriously, as much as I'd love to be able to afford a high-end graphics card, I'm more interested in finding the cheapest card that will run Oblivion reasonably well. Any recommendations?
I got into this habit of internet gaming on the PC and upgrading my video card every two years. It has been ongoing since getting my second Voodoo card for SLI performance in Quake 2. I've frankly avoided the console market for a long time; I was one of those PC gaming snobs. However, I finally got off the upgrading crack and now smoke from the console opium tent. I took the plunge on a 360. You fellow husbands out there know how this works. I laid out a very convincing rationalization for thi
  • by Azure Khan ( 201396 ) on Wednesday August 23, 2006 @12:17PM (#15963236)
    Do you know how one of these companies could be a "leader"?

    They could start by unifying features into a tight and manageable product set, and eliminate some degree of confusion about features and chipsets from the market.

    -AND-

    They could stop working on the "problem" of pushing more triangles, and work on the real problem with modern video cards: Power. Personally, I don't really need photorealistic graphic quality if it means I have to keep two power supplies in my system, or plug my video card directly into the wall.

    Graphic quality is already impressively high, so maybe it's time to step back and improve the underlying technology and give the market time to absorb and upgrade. Like others, I still work on my ATI Radeon Pro 9800 with 256MB of RAM. I'm not upgrading anytime soon, because there are fewer and fewer AGP cards available, and I'm not willing to replace my entire otherwise completely functional system just to get a PCI-E slot. There are a lot of people like me, who are waiting, and I'm no Luddite. I like my gadgets. But keeping up with PC improvements has become a game of diminishing returns, since I run huge graphics and multimedia applications, plus most of the games on the market, very comfortably on my AMD64 3400+ processor with 1GB of RAM. I have yet to find a game I WANT to play that doesn't play quite nicely on my hardware.
They are addressing the issue of power draw; take the Nvidia 7600GT/GS - it doesn't even need a separate Molex connector, it gets all its power from the PCI-E bus. The thing about feature sets is, you want one thing (a low-wattage card with less cutting-edge performance) and _I_ want a card that pushes polygons as fast as possible, and hang the power use; that's why there is more than one card in a range.

      Also, as new chips come out, the old ones are retired, but not immediately. So the question often is do
  • All graphics cards seem to do one or two things really well, then fail at the rest. The only consistency I've noticed between brands is their crappy fans. If you look at it funny it will start making that familiar buzzing noise and will fail (obviously) not long after that. Until video cards develop either better fans or a cheap and reliable liquid cooling system, I shall hate all video cards and consider them a necessary evil.
  • by mikemuch ( 870535 ) * on Wednesday August 23, 2006 @12:32PM (#15963359) Homepage
    At the end of the day, the worth of the Radeon X1950 XTX comes down to this: Does the improved memory bandwidth you get from GDDR4 really make a difference if you don't change anything else about the card? Unfortunately, the answer is no. In most games, at high resolutions like 1600x1200 with 4x antialiasing and 8x anisotropic filtering applied, the speed goes up by a modest 5% to 8% over the Radeon X1900 XTX. If that's all you get from an almost 30% increase in memory bandwidth, color us unimpressed.

    X1950 XTX review [extremetech.com]
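    The "almost 30% more bandwidth, only 5% to 8% more speed" result has a tidy Amdahl's-law reading: only part of each frame is bandwidth-bound. A back-of-envelope sketch (my own arithmetic, not ExtremeTech's):

    # If a fraction f of frame time scales with memory bandwidth, then
    # speedup = 1 / ((1 - f) + f / bw_gain); solve for f from the observed speedup.
    bw_gain = 1.30  # ~30% more memory bandwidth

    for speedup in (1.05, 1.08):  # the review's observed 5-8% range
        f = (1 - 1 / speedup) / (1 - 1 / bw_gain)
        print(f"{speedup:.2f}x overall -> ~{f:.0%} of frame time bandwidth-bound")

    On those numbers, only about a fifth to a third of the X1900 XTX's frame time was actually waiting on memory, so a bandwidth-only bump was never going to move the needle much.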
  • Did anyone else get the idea that "anonymous reader" is a marketdroid at ATI?
Say what you want about the drivers... but ATI has the highest quality TV-out. Since I have been using my computers almost exclusively on televisions for 11 years now, over 5 ATI cards, this is very important to me. I'm more than willing to try out a few drivers to fix things. Once I know what driver set works, I note that information down so I don't have to go through the hassle next time I re-install (if ever... my Windows 2000 installation is 3.5 yrs old with 280 programs installed, and still ticking.)
...I guess. I've flipped between ATI and Nvidia for years. Currently I have an ATI part in my primary desktop at home; it plays COD2 pretty well. BUT, it's an older mobo with AGP... no PCI-E. So, if I want more "ohhh... pretty!" from Doom3, FarCry, HL2, etc., I'll be buying an Nvidia 7800GS AGP part. ATI has indeed had driver issues, and I've experienced them with the Radeon 9800 Pro I've got now. OTOH, my older Athlon machine is STILL running an old Nvidia Ti4200 and loving it. So, unless ATI offers up some
  • Nice CPU (Score:2, Interesting)

    by operagost ( 62405 )
    About half of our credit unions run on servers with about the same amount of RAM and half the clock speed of this card.
  • Disclaimer: I use Nvidia and have since the TNT2 was out. I'm not a fanboy though.

    I have never held a grudge against ATI but these driver issues have been a problem for them for a VERY long time. I remember buying my first TNT2 card and back then, the competing product from ATI (can't remember what it was) was riddled with driver problems. So I avoided ATI like the plague and went with Nvidia. Wash, rinse and repeat for each iteration of cards...

    It's very interesting to me that here we are - 10
  • Oh goody!
    Now I can play Duke Nukem at 120 frames per second despite the fact that human eyes aren't capable of seeing that much data.
Given the fact that we seem to be reaching the law of diminishing returns on video cards, shouldn't the hardware manufacturers instead start to concentrate on the weakest link by improving the capabilities of the human?
    I need eyes that can handle more fps. I need more bandwidth from my eyes to my brain. I definitely need more processing power in my brain.
    I won't mention one
    • by geekoid ( 135745 )
      "120 frames per second "

      sigh.

      There are two factors:
1) Page flipping.
      2) 120 FPS will drop when there are 10-20 other people on the screen moving around.
      I can get 60FPS in WoW, but as soon as it gets busy with people it drops to 20.
      Of course even at 20 it's pretty good.
