ATI, Nvidia Reveal New $250 Graphics Cards

ThinSkin writes "As part of their 'Spring Refresh,' both AMD and Nvidia reveal new $250 graphics cards, the Radeon 4890 and GeForce GTX 275. ExtremeTech takes both cards and runs them through a gamut of gaming and synthetic benchmarks to decide which card triumphs over the other. Long story short, the GeForce takes the cake with impressive performance at its price, while the Radeon didn't show a high improvement over the cheaper Radeon 4870."
This discussion has been archived. No new comments can be posted.

  • by FreakinSyco ( 873416 ) on Thursday April 02, 2009 @02:25PM (#27435183)

    Yet HotHardware tests their 4890 and shows that it outperforms the 4870 in every category...

    http://hothardware.com/Articles/ATI-Radeon-HD-4890-RV790-Unveiled/ [hothardware.com]

    and I quote:

    "In every test, the Radeon HD 4890 (Asus EAH4890) was faster than the 1GB Radeon HD 4870, and the overclocked 4890 (Asus EAH4890 TOP) simply increased the card's overall lead. In comparison to competing offerings from NVIDIA, the Radeon HD 4890 is faster than the GeForce GTX 260 Core 216 overall, but it didn't quite keep pace with the just announced GeForce GTX 275."

    • Confusion (Score:2, Insightful)

      by mac1235 ( 962716 )
      Hardware site 1 disagrees with hardware site 2! Who can we trust!
      • No one. We see this every time these two companies release cards. Look around each site at the scores they've given the other company's cards; my bet is that if one company wins every time, you can find the problem.
      • Both agree that the Nvidia 275 outperforms the ATI 4890, so really, both sites agree. The differences between the sites' results for the ATI card are probably due to differences in the platforms (CPU, motherboard, RAM, FSB) the card was installed in.
      • Re: (Score:2, Informative)

        The site that provides more concrete data, particularly about their test methodology. After all, "In god we trust. All others must bring data."

        ExtremeTech chose not to test at 2560x1600, despite using a beta driver that significantly alters the resolution scaling performance of the NVidia cards. This means their numbers and thus their conclusions are worthless for anybody who wants to use the higher resolution. On the other hand, they didn't test at 1680x1050 either, so their tests didn't reveal the signific

        • by HTH NE1 ( 675604 )

          I'm more interested in performance at resolutions 3840x1024 and 5040x1050 myself (max resolution capabilities of the analog and digital Matrox TripleHeads2Go respectively).
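For scale, the pixel counts of those multi-monitor modes can be compared with a single 30" panel. A quick sketch (the triple-head modes are three 1280x1024 or 1680x1050 panels side by side; the single-panel comparison is my own addition):

```python
# Total pixels pushed per frame for each display mode. More pixels means
# proportionally more fill-rate and memory bandwidth demanded of the GPU.
modes = {
    "TripleHead2Go analog (3x1280x1024)": 3840 * 1024,
    "TripleHead2Go digital (3x1680x1050)": 5040 * 1050,
    "30-inch panel (2560x1600)": 2560 * 1600,
}

# List the modes from least to most demanding.
for name, px in sorted(modes.items(), key=lambda kv: kv[1]):
    print(f"{name}: {px:,} pixels")
```

By raw pixel count, the digital triple-head mode is the most demanding of the three, which is why high-resolution benchmark data matters for those setups.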

    • by Zakabog ( 603757 ) <.moc.guamj. .ta. .nhoj.> on Thursday April 02, 2009 @02:34PM (#27435331)

      No improvement of the 4870??

      and I quote (from the summary):

      "Long story short, the GeForce takes the cake with impressive performance at its price, while the Radeon didn't show a high improvement over the cheaper Radeon 4870."

      Both articles say the same thing: the GeForce 275 is a better performer, and the 4890 isn't much of an improvement over the 4870 (though it is still an improvement).

      • by electrosoccertux ( 874415 ) on Thursday April 02, 2009 @03:05PM (#27435757)

        I didn't read the HotHardware article. Did they specify which card wins at which resolutions? Did they test with the newest 185 Nvidia drivers? The 185s are moderately slower than the 182s.

        Anandtech, my personal favorite reviewer (none of that 1 paragraph/page + 100 page article nonsense *cough tomshardware cough*) tells a different story [anandtech.com].

        In case you don't feel like clicking: the 4890 takes the cake hands down on 24" and sub-24" displays (1920x1200 resolution and lower). At 2560x1600, it's a tossup.

        Considering you can buy the 4890 right now [newegg.com] and the GTX275 won't be available for 2 more weeks [dailytech.com], I think it's pretty clear which card to get.

        • Re: (Score:1, Funny)

          by Anonymous Coward
          The nVidia card of course
        • by afidel ( 530433 )
          24" for 1920*1200, hehe. Try a 42" 1080p lcd tv with a real home theater for surround sound effects, much more fun than a little PC monitor =)
      • by Ecuador ( 740021 )

        Hmm, both articles I read today from my preferred sites (HardOCP and Anandtech) said that it was a close call, but the ATI a bit faster (and the only one actually available for purchase).

        Anyway, my gaming days are long gone (except the occasional Wii sports & Civilization), so it is a moot point for me...

      • by Ecuador ( 740021 ) on Thursday April 02, 2009 @03:35PM (#27436137) Homepage

        Replying to my own post above, I actually quickly went through the ExtremeTech article (ok, ok, take back my /. badge) and confirmed why it is not one of my "preferred" sites. To call the nVidia card better, they divide the cost by the average FPS over the games they tested. Yes, the raw FPS averaged over different games; it doesn't take a PhD in statistics to figure out the problem here. Then they admit that the ATI overclocks very well (with the included utilities from ATI and ASUS) while the nVidia does not, but instead of stating this as an advantage for the ATI card, they complain about the cards not shipping at the higher speeds they can easily attain. And finally, they list the nVidia at a "$250 street est." price, while the ATI gets "MSRP $259" when the latter is selling right now on NewEgg for $230 after MIR, or $250 directly. Great job, ExtremeTech!
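The statistical problem being pointed at here can be sketched quickly: averaging raw FPS across different games lets high-FPS titles dominate the result, while a geometric mean of per-game ratios weights every game equally. The numbers below are invented for illustration, not taken from either review:

```python
# Why averaging raw FPS across different games is misleading.
import statistics

fps_a = {"shooter": 120, "rts": 30}   # card A, FPS per game (invented)
fps_b = {"shooter": 100, "rts": 40}   # card B, FPS per game (invented)

# Naive approach: mean of raw FPS. The high-FPS shooter dominates,
# so card A "wins" even though it loses badly in the RTS.
naive_a = statistics.mean(fps_a.values())   # 75.0
naive_b = statistics.mean(fps_b.values())   # 70.0

# Better: geometric mean of per-game ratios (B relative to A),
# so every game counts equally regardless of its absolute FPS.
ratios = [fps_b[g] / fps_a[g] for g in fps_a]
geo = statistics.geometric_mean(ratios)

print(naive_a, naive_b)   # naive average favors card A
print(geo)                # > 1.0: card B is faster on a per-game basis
```

The two summaries disagree about which card is "faster", which is exactly the trap a cost-per-average-FPS figure falls into.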

        • instead of stating this as an advantage for the ATI card they complain about the cards not shipping at the higher speeds they can easily attain

          Why should I have to overclock to get the most out of my brand new graphics card?

          • You shouldn't, which is why the review should focus on the as-sold spec instead of what a team of monkeys with a cryogenic cooler and 10 years of experience can accomplish. The cards likely ship at the speeds they do because the failure rates at higher speeds were too much of a liability for the manufacturer. Higher clocks may shorten the lifespan to before the warranty expires, or cause data corruption resulting in weird errors getting blamed on their equipment, and they don't want the hit in reputatio

          • by Ecuador ( 740021 )

            I am not a gamer, but reading the reviews it seems that the ATI cards are conservatively clocked to allow their partners to sell at higher clocks, while also giving users a utility with a simple slider to OC. So we are talking about somewhat "manufacturer-sanctioned" overclocking that is done by simply moving a slider in a software utility. For the specific ASUS card tested, an ASUS utility was also included which can even increase the voltage for higher speeds (the HardOCP unit was stable at 1GHz).
            It

    • by MrHanky ( 141717 )

      Not only that, but the linked ExtremeTech article shows that the Radeon 4890 is faster than the 4870 in all cases, and in many cases faster than nvidia's offering as well.

  • Beware (Score:4, Interesting)

    by HunterZ ( 20035 ) on Thursday April 02, 2009 @02:27PM (#27435209) Journal

    The Inquirer (I know, they hate nVidia with a passion) is speculating that the GT275 may be a relabeled GT260, except for reviewer cards which may be relabeled GT280's: http://www.theinquirer.net/inquirer/news/599/1051599/nvidia-hoodwinks-reviewers-mythical-gt275s [theinquirer.net]

    I guess this is common for ATI/AMD and nVidia to do, but it's the first I've heard of it and it seems awfully slimy.

    • by MoFoQ ( 584566 ) on Thursday April 02, 2009 @02:31PM (#27435263)

      yea...but don't think it's a "they"...just one crusader (don't think he wears a cape): Charlie Demerjian

      He just hates nVidia with a passion.
      Supposedly, there was some sort of "tiff" between them...them, like many companies, wanting to limit negative reviews, etc.

      either way, grains (not just grain) of salt required for his articles regarding NV.

      • Re: (Score:3, Insightful)

        My impression of Charlie's nVidia coverage is that he "sweats the small stuff," "makes mountains out of molehills," etc. But on the big stuff, he's pretty accurate. For example, he had the scoop on the overheating nVidia chips months before it came out anywhere else. That screw-up cost nVidia (and partners like HP) double and maybe triple-digit millions of dollars in losses. I would consider the claims in that article to be "big stuff": sending high-end cards to reviewers and low-end (and mostly unavai

        • The only problem with this theory is that the video card companies have been called out on this every time they have done it in the past. It is very simple for a review site, once it sees allegations like this, to go to their local store and simply buy a card. And the other problem is that there are simply too many review sites for companies like ATI or Nvidia to send review samples to. A new review site pops up almost every 2-3 months, and if one of those picks up a retail product (or as is the case
          • by MoFoQ ( 584566 )

            true...but video card companies aren't the only ones.

            many semiconductor companies do it mainly to increase yields off of a single wafer.
            Not all chips on a wafer can pass as high-end parts. Some have to be "modified" into lower-end parts; even then, at least they aren't throwing that piece of silicon away.
            Case in point, celerons and durons (blast from the past)
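That binning process can be sketched as a toy simulation. The thresholds, SKU names, and yield distribution below are invented for illustration; real binning depends on per-die test results:

```python
# Toy sketch of speed binning: each die from a wafer is tested for its
# maximum stable clock and sold under whichever SKU it qualifies for.
import random

random.seed(0)  # make the sketch repeatable

def bin_die(max_stable_mhz):
    """Assign a die to a (hypothetical) SKU based on its tested clock."""
    if max_stable_mhz >= 900:
        return "high-end"
    if max_stable_mhz >= 750:
        return "mid-range"   # same silicon, downclocked or partly disabled
    return "low-end"

# Simulate 1000 dies whose max stable clocks vary around 850 MHz.
dies = [random.gauss(850, 80) for _ in range(1000)]

counts = {}
for mhz in dies:
    sku = bin_die(mhz)
    counts[sku] = counts.get(sku, 0) + 1

print(counts)  # every working die sells as *something*
```

The point is exactly the one made above: nothing that works gets thrown away, it just ships under a cheaper label.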

        • by MoFoQ ( 584566 )

          the problem is...he's crying wolf too many times...sure...once in a while, he's bound to get it right...but other times...no.

          in the end, it just hurts him and his crusade.
          If he at least wore the mask of being unbiased, his observations would carry more clout.
          Case in point: look at the comments on his articles; many claim that he's... well, let's just say they aren't nice comments.
          Sometimes more damage can be done by being, or at the very least seeming to be, objective. It adds credibility.

          either way, I was

    • by Sycraft-fu ( 314770 ) on Thursday April 02, 2009 @03:21PM (#27435975)

      The 275 is something of a "cross" between a 260 and 280. The 280 has 240 shader cores, 1GB of RAM and a 512-bit memory bus. The 260 has 192 or 216 cores (depending on the version), 896MB of memory and a 448-bit memory bus. They are both 65nm parts. Well, the 275 is a 55nm part, and nVidia's spec page says it has 240 shader cores, 896MB of memory and a 448-bit memory bus. Hence, like I said, a cross between the old two.
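That spec comparison can be captured in a quick sanity check (figures as quoted in the comment above; the 260 entry uses the 216-core "Core 216" variant):

```python
# GT200-family specs as quoted above: shader cores, memory, bus width, process.
specs = {
    "GTX 280": {"cores": 240, "mem_mb": 1024, "bus_bits": 512, "nm": 65},
    "GTX 260": {"cores": 216, "mem_mb": 896,  "bus_bits": 448, "nm": 65},  # Core 216
    "GTX 275": {"cores": 240, "mem_mb": 896,  "bus_bits": 448, "nm": 55},
}

# The 275 pairs the 280's shader count with the 260's memory subsystem,
# i.e. the "cross" described above, on a newer 55nm process.
assert specs["GTX 275"]["cores"] == specs["GTX 280"]["cores"]
assert specs["GTX 275"]["bus_bits"] == specs["GTX 260"]["bus_bits"]
print("275 = 280 shaders + 260 memory subsystem")
```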

      Ok, well, that leaves one of two situations for the Inquirer conspiracy theory:

      1) nVidia is giving the reviewers cards with more RAM. Possible, but not likely. Also, it wouldn't give significantly better results; it turns out that much RAM isn't useful for games these days.

      2) nVidia is lying on their product spec page: they are sending 240sp versions to reviewers, and 192 or 216 core versions to the public. Very unlikely; they'd get sued for false advertising.

      I just don't buy it. I suppose in theory they could do something like increase the clocks on review cards, but that is really likely to get noticed. Those reviewers know how to run utilities like GPU-Z as well as the rest of us.

      I am starting to think nVidia needs to sue Charlie Demerjian for libel. There's not much question his intent is malicious, and he certainly puts out false information. His only defense would be that he didn't know it was false and of course that brings up the question as to why he didn't check, being a journalist and all.

      I'll believe this if someone has some kind of real proof, but this seems totally unsubstantiated.

      • The 275 is a 285 with ROPs disabled; that's the only difference. Or a Core 216 with ROPs enabled. Either way, they are the same chip, the GT200. If you look at the specs, it's obvious they based the 275 on the GTX 260 reference-design PCB from the Core 216/GTX 260.

        Either way, I was totally underwhelmed by both cards in terms of performance; I'm not sure why everyone is getting their panties in a bunch. Anyone owning an 8800 GTS/GTX from the last gen can still play every game on the market.

        For anyone who owns a

  • ATI 4890 on linux (Score:5, Informative)

    by MC68040 ( 462186 ) <henricNO@SPAMdigital-bless.com> on Thursday April 02, 2009 @02:28PM (#27435229) Homepage

    This might be useful to someone like me, Phoronix just reviewed the 4890 on Linux with the ATI catalyst drivers:
    http://www.phoronix.com/scan.php?page=article&item=amd_radeon_hd4890&num=1 [phoronix.com] :)

    • Yes that is great news indeed!

      I am looking to replace an onboard (Asus / nVidia) video card with an ATI one.

      That might be contrary to what most people say around here, but I've always had more luck on Linux with the ATI drivers than the Nvidia driver.

      Rep

      • I'm running ATI now (Ubuntu 8.10, 64-bit), but with my onboard card I have fewer options than with nVidia: the options pane looks just horrible, I cannot tilt my screen, and (biggest point) I cannot do any 3D while having Compiz (desktop acceleration) enabled. It pretty much sucks, and I am thinking of buying nVidia again. I moved to ATI because of the promises of better drivers, but IMHO they haven't really delivered.

        OTOH, it's a fast, fan-less GPU with dedicated memory for an on-board and my machine is mainly

        • by Nikker ( 749551 )
          I have to agree with you there. I bought an Asus board with an onboard ATI/AMD GPU on it. It's just for a media centre PC, so graphics don't mean that much, and I figured with all the source released and Linux support getting a bit more press it might be ok for now... I gotta say, you've burned me before, ATI, and this was no exception; it wasn't even a cutting-edge GPU either. Ubuntu would id the card, and I installed both the radeonhd driver and the blobs; both sucked pretty hard when it came to even basic desktop ef
        • by makomk ( 752139 )

          biggest point) I cannot do any 3D while having Compiz (desktop acceleration) enabled.

          I think that the latest ATI driver release is supposed to fix this, finally. Took them long enough. Users of xserver 1.6 may have to wait for the 9.4 drivers next month, though...

      • by Svenne ( 117693 )

        Keep in mind that ATI's driver doesn't support vsync on Xvideo, which means you'll experience lots of tearing when watching videos.

    • by ianare ( 1132971 )

      Do you know if this version of Catalyst finally supports the new X.org server? I can't upgrade to Ubuntu 9.04 because of this issue...

  • by MooseMuffin ( 799896 ) on Thursday April 02, 2009 @02:29PM (#27435243)

    Nothing in the linked article or the other various reviews of the two cards I've seen today concludes "the Radeon didn't show a high improvement over the cheaper Radeon 4870."

  • by PIBM ( 588930 ) on Thursday April 02, 2009 @02:30PM (#27435247) Homepage

    I'm not sure how to square that summary with TFA. It seems that the Radeon beat the nVidia in most cases, even at $ per average FPS.

    So why is this tagged with nvidia as the winner?

  • by Colonel Korn ( 1258968 ) on Thursday April 02, 2009 @02:30PM (#27435251)

    Looking at a wider range of reviews, I think we can call this round a draw. That means the real winners are consumers, because the selling point will become price.

    Or, if you read the most interesting review of these cards, you'll see why maybe nvidia will skip the price game this time and instead try (and fail) to sell their cards based on physx:

    http://www.anandtech.com/video/showdoc.aspx?i=3539 [anandtech.com]

  • by Anachragnome ( 1008495 ) on Thursday April 02, 2009 @02:50PM (#27435539)

    Kind of a useless test from my point of view, that of someone that would be looking to upgrade, for one simple reason.

    These comparisons never seem to include the last generation of cards, and thus are of no real value to me, since I cannot determine how much of an upgrade I would be getting.

    I don't care how many fucking cores it has if it doesn't perform better than what I have right now.

    Benchmark testing my own machine (as a comparison tool) is sort of useless as well, since the REST of my machine may be totally different from what they used.

    Which is better than which is an entirely moot point if neither is better than what I have.

    • by GeorgeS ( 11440 )

      AMEN Brother!!
      It's like a car comparison telling me one gets better MPG but never telling me how many MPG it actually gets.

      I don't care if it gets 2 MPG more than the other if they both only get around 5-7 MPG, and therefore they both suck. /end rant
      Sorry, I need some sugar :)

    • GPUReview (Score:5, Informative)

      by TypoNAM ( 695420 ) on Thursday April 02, 2009 @03:21PM (#27435965)
      Check out GPUReview's video card compare and see what the theoretical performance differences are:
      http://www.gpureview.com/show_cards.php [gpureview.com]

      It does appear that the just-announced cards aren't listed on that site yet to compare against, unfortunately.
      • You are correct. Neither the GTX 275 (not listed) nor the Radeon 4890 (listed, but no data) is available for comparison on that site.

        I already tried that. That very same site is what I used to decide on my GeForce 8600GTS, and I am quite happy with the results. I feel the comparison was pretty accurate, having used the card for almost a year now, and having had a chance to compare to the cards friends have.

        I stand by my original statement. The test in the article provides me with no usable information other the

  • Wow (Score:3, Funny)

    by SnarfQuest ( 469614 ) on Thursday April 02, 2009 @02:53PM (#27435605)

    $250.00. That's more than half of what I paid for my entire system new. The cost of a new system seems to be heavily based on the graphics/monitor. Used to be RAM and disk.

    • The problem is you are comparing a very high-end graphics card to the price of a very low-end system. These two cards aren't quite the highest end offered, but they are up there. These are the serious-gamer cards. We are talking "all details up, high rez, fast frame rate" territory. Thus they are targeted at higher-end systems.

      A $500 system is very low end. Nothing wrong with that, just recognize that. Now that means that all the components are thus lower performance. In a system of that price, probably

      • Years ago I knew why some premium sound cards were worth the money, but I've long since lost track of the marketplace (aside from reading various things in the last year about old versus current versus upcoming Linux sound support).

        So I'm curious, what do you get on today's systems that makes it worthwhile to pay $200 for a sound card?

        I had the feeling it was no longer for wavetable, nor for number of bits of d-to-a conversion. Is it 7.1 surround, or what?

        • Re:$200 sound card? (Score:5, Informative)

          by Sycraft-fu ( 314770 ) on Friday April 03, 2009 @12:57AM (#27441431)

          Depends on the purpose of the soundcard. In the pro arena, it is number of inputs and outputs, and quality of said inputs and outputs. In the consumer arena it is also I/O quality and to a lesser extent variety, but hardware effects as well.

          The effects are pretty simple. Some cards, like the X-Fi, have a DSP onboard and can process 3D spatialization effects in hardware. This is of use for some games. Some games, like say Unreal Tournament 3, only do rudimentary software processing. So if you lack a card that handles OpenAL acceleration, you get only 2 channels and little effects. If you have a hardware OpenAL card, you get full surround and all the effects. Some games can go both ways and simply sound a little better with hardware, or just load the CPU less. Others can't use hardware acceleration at all. It isn't a magic bullet, but useful if you are a gamer (and I am, very much so).

          The quality of inputs and outputs is a bigger one, and one you can spend a lot more money on. You are correct that basically every converter these days is 24-bit. However, that isn't the tricky part. The tricky part is all the supporting analog hardware around it. This is the difference between a system with a high and a low SNR, and things like that. For example, you will find that nothing comes near the theoretical ~144dB SNR that 24-bit offers. You may find that a cheap soundcard gets 90-100dB, and it may even be lying about that by reporting the D/A component values rather than the final signal, whereas a pro card might get close to 120dB (and really deliver it).
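The ~144dB figure comes from the ideal-converter rule of roughly 6dB of dynamic range per bit (more precisely, 6.02N + 1.76 dB for an ideal N-bit converter). As a sketch:

```python
# Theoretical dynamic range of an ideal N-bit converter.
# Real hardware falls well short because the analog path adds its own noise.
def ideal_snr_db(bits):
    """SNR of an ideal converter with a full-scale sine input, in dB."""
    return 6.02 * bits + 1.76

print(ideal_snr_db(16))  # ~98 dB  (CD audio)
print(ideal_snr_db(24))  # ~146 dB (the "~144 dB" ballpark quoted above)
```

The gap between that ideal figure and the 90-120dB that cards actually measure is exactly the analog-quality difference being described.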

          Also, they often have outputs built to deal with special loads like low-impedance headphones. If I plug my headphones into the onboard sound, there is audible noise. Why? In part the lower SNR, but mostly because its opamp is getting overloaded: the headphones are rather low impedance (40 ohms), which demands more current than it can deliver. My X-Fi has a headphone port, which has either a buffer to provide more current or a better opamp (or both), so it has no problem powering them without generating audible noise.

          So higher end cards get better overall sound quality, at least if paired with higher quality components later on. Does it matter? Depends on you mostly. Me, I like high quality sound. I have a "home theater" setup for my computer, not computer speakers.

          It isn't necessarily something for everyone, but some people like it. Also it may fall by the wayside in consumer systems if HDMI becomes popular. That carries surround audio, in addition to video. So instead of a soundcard you could hook that in a receiver and handle all the D/A conversion there. The digital signal would be output by the video card, of course.

      • Five hundred is "very low end"? Not for general purpose desktops. Think more like $250-300.

  • Funny (Score:4, Funny)

    by Dunbal ( 464142 ) on Thursday April 02, 2009 @03:27PM (#27436057)

    I thought price fixing was illegal in the US. I'm curious as to how the two top competitors manage to release "new" products at the same time and for roughly the same price (within $50 of each other)...

    Ohhh wait, must be that "free" market at work again?

    • Re:Funny (Score:4, Insightful)

      by geekoid ( 135745 ) <{moc.oohay} {ta} {dnaltropnidad}> on Thursday April 02, 2009 @03:40PM (#27436211) Homepage Journal

      Both have the same entry price point.
      That's not uncommon at all. In fact there is a whole method for calculating it.

      There doesn't need to be a conspiracy or collusion or 'price fixing' for 2 similar products to have the same price point.

      • That's not uncommon at all. In fact there is a whole method for calculating it.
        There doesn't need to be a conspiracy or collusion or 'price fixing' for 2 similar products to have the same price point.

        Quite true. The situation for the consumer, however, is the same. And really, even if they aren't agreeing on a price, if a group colludes on an algorithm to set the price, then they still collude.

      • "There doesn't need to be a conspiracy or collusion for 'price fixing' for 2 similar products to have the same price point."

        There, fixed that for you.

    • by jandrese ( 485 )
      It's certainly not because they both want to compete at a very lucrative price point, nor is it the case that they have various cards at different price points (which also compete with each other). This is a clear case of market manipulation, call the FTC!
    • Pay attention (Score:3, Informative)

      by jgtg32a ( 1173373 )
      When ATI released the 48x0 cards, their top of the line, they had something like 80% of the performance of the top nvidia cards and cost half as much.

      Prices on cards are dropping.

      And as others have stated its a price point.
    • Re: (Score:3, Funny)

      by int69h ( 60728 )

      Strangely enough, McDonald's, Burger King, and Wendy's all sell their sandwiches for roughly the same price as well. Clearly it's the work of a burger cartel.

    • Re: (Score:3, Interesting)

      by Kjella ( 173770 )

      In perfect competition price = marginal cost, and unless there are cost differences there shouldn't be a price difference either. Even if you include the write-down costs of assets for a sustainable zero-profit operation, it still shouldn't differ. The situation is more like this: around here there are three gas stations very close to each other, and they're usually within, in USD, $0.015/liter of each other. Why? Because it's damn hard to sell overpriced gas if doing a live market survey takes less than a minute. When someon

  • Since the world is slowly (rapidly?) moving towards the laptop/notebook, this market surely can't be a growth one.

    i.e., in my house I've moved all my kids' computers (4), mine, and my wife's to laptops. Oh, and my low-power home server broke, so I switched it to an EEE 701, since the 20MB/sec it cranks out is more than sufficient for the G/N network.

    That, and none of the new chips they're bringing out seem to be much better than renamed versions of the old ones. My "old" 7900GTX has about the same performance as the 9600

  • I'm not sure about nVidia, but ATI has (for a while now) offered "X2" versions of its high-end cards, with two GPUs connected via integrated CrossFire (like SLI). These cards are more expensive, to be sure, but last time I checked the 4870 X2 was the best single card available.

    Presumably the 4890 is also available with two GPUs? How about the 275?

  • For years I used the ATI Rage Pro 128 MB AGP card in all my computers, because it worked reliably with all linuxes and other OS's and was cheap, and performance was more than good enough. At one point I bought 15 of the cards for $3 each on ebay, and used them one by one in various servers and desktops.

    Now, most of the motherboards have PCI Express slots in them. I would like to use ATI cards with all open-source drivers. What is the equivalent card I should be purchasing ?

    • Keep an eye on Phoronix to see what works (and performs) well with Linux. While they don't inspire any confidence in their abilities to do an in-depth, comprehensive assessment of several pieces of hardware, they are good at comparing one ATI card to the next and one version of the kernel or drivers to the next. They also do a great job of reporting on the status of new and upcoming features in the open-source drivers.

  • Hmmm, I read the article and paid great attention to the benchmarks. 4890 tends to score better.

    Here, read the FPS results for yourself, all run by ExtremeTech at 1920x1200 (from about 22" to 27" monitors). Each cell is noAA/4xAA FPS:

    Game                     GTX 275    HD 4890
    Crysis                   24/19      24/21
    Far Cry 2                68/56      79/56
    Left 4 Dead              125/105    126/95
    COD5 World in Conflict   61/40      56/38
    Company of Heroes        99/84      69/60
    Supreme Commander        66/64      68/63
    HAWX                     71/43      61/54
    Stormrise                29/28      47/42
    Stalker Clear Sky        50/23      48/23

    OCUK Price of
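For what it's worth, the numbers quoted above can be tallied with a short script. Summing the noAA and 4xAA results per game is a crude aggregate (my own choice, not either site's methodology), but it shows the per-game winners:

```python
# ExtremeTech 1920x1200 FPS results quoted above: (noAA, 4xAA) per card.
results = {
    "Crysis":                 ((24, 19),  (24, 21)),
    "Far Cry 2":              ((68, 56),  (79, 56)),
    "Left 4 Dead":            ((125, 105), (126, 95)),
    "COD5 World in Conflict": ((61, 40),  (56, 38)),
    "Company of Heroes":      ((99, 84),  (69, 60)),
    "Supreme Commander":      ((66, 64),  (68, 63)),
    "HAWX":                   ((71, 43),  (61, 54)),
    "Stormrise":              ((29, 28),  (47, 42)),
    "Stalker Clear Sky":      ((50, 23),  (48, 23)),
}

wins = {"GTX 275": 0, "HD 4890": 0, "tie": 0}
for game, ((a_no, a_aa), (b_no, b_aa)) in results.items():
    a, b = a_no + a_aa, b_no + b_aa   # crude: sum both AA settings
    if a > b:
        wins["GTX 275"] += 1
    elif b > a:
        wins["HD 4890"] += 1
    else:
        wins["tie"] += 1

print(wins)  # {'GTX 275': 4, 'HD 4890': 5, 'tie': 0}
```

By this crude count the 4890 edges out the 275 in per-game wins, which matches the parent's reading of the benchmarks, though several games are close enough that the margin is within noise.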

  • ok, I'm not seeing how the 275 "takes the cake". The benchmarks all show the 4890 beating it in-game... AND with less power consumption and a lot less heat... and the 275 is $10 more, and that's before rebates :P lol. As far as I can see, the 4890 took the cake... and ate it too :P
