Graphics / AMD / Upgrades / Hardware / Games

AMD Delivers DX11 Graphics Solution For Under $100

Posted by timothy
from the does-not-improve-c64-aztec dept.
Vigile points out yesterday's launch of "the new AMD Radeon HD 5670, the first graphics card to bring DirectX 11 support to the sub-$100 market and offer next-generation features to almost any budget. The Redwood part (as it was codenamed) is nearly 3.5x smaller in die size than the first DX11 GPUs from AMD while still offering support for DirectCompute 5.0, Eyefinity multi-monitor gaming and of course DX11 features (like tessellation) in upcoming Windows gaming titles. Unfortunately, performance on the card is not revolutionary even for the $99 graphics market, though power consumption has been noticeably lowered while keeping the card well cooled in a single-slot design."

  • Why? (Score:3, Insightful)

    by Nemyst (1383049) on Saturday January 16, 2010 @05:34PM (#30793830) Homepage
    I'm sorry, I've seen this news go all around tech sites and... I don't get it. Yay, DX11. The biggest new features I could see about it were hardware tessellation and compute shaders. What, this requires a powerful GPU in the first place to be of any use? Something much, much better than this card? Oh...

    Seriously, good for AMD, but I just don't see the point. Say it's a good card, say it has very low power consumption, but hyping DX11 when it has no particular benefit - especially at this price point - is absolutely useless.

    And before anyone says I'm just bashing AMD, my computer has a 5850.
  • by Anonymous Coward on Saturday January 16, 2010 @05:56PM (#30794036)

    Which is still plenty powerful enough to run any game that also launches on the Xbox 360.

    It also does it without your having to buy a new PSU. The DX11 bits are just there to help cheap people (like myself) feel comfortable buying the card, knowing it'll still play the games that come out next year (even if poorly), since games that use DX11 are already starting to appear.

    It's a good move from ATI, targeted at cheap gamers that are looking to breathe life into an older computer.

  • Re:Why? (Score:1, Insightful)

    by Anonymous Coward on Saturday January 16, 2010 @05:58PM (#30794054)

    I don't get it.

    Of course you don't. This card is for people who have a lower resolution monitor (under 1440 x 900), since at lower resolutions it can run all modern games comfortably. About 50% of people still run at 1280 x 1024 or below, and for them this is a great graphics card. It gives good performance at a reasonable price, and it has the latest features.

  • by tji (74570) on Saturday January 16, 2010 @06:11PM (#30794160)

    I'm not a gamer, so the 3D features are not important to me. I am an HTPC user, and ATI has always been a non-factor in that realm. So, I haven't paid any attention to their releases for the last few years.

    Has there been any change in video acceleration in Linux with AMD? Do they have any support for XvMC, VDPAU, or anything else usable in Linux?

  • Re:Why? (Score:5, Insightful)

    by Sycraft-fu (314770) on Saturday January 16, 2010 @06:13PM (#30794188)

    Well, the things that may make DX11 interesting in general, not just for high end graphics:

    1) Compute shaders. These actually work on any DX10-or-higher card through the DX11 API (just with lower shader profile versions). The reason they're useful even on low end cards is that some things run drastically faster on a GPU, so even a low end GPU beats the CPU. I don't have a good example specific to compute shaders, but an older non-compute-shader example would be HD video: you can play HD H.264 on a low end CPU so long as you have a GPU that can handle the acceleration, and it doesn't have to be a high end one either. (See the first sketch after this list.)

    2) 64-bit precision. Earlier versions of DX required at most 32-bit FP, since that is generally all you need for graphics (32 bits per channel, that is). However, other math needs higher precision, and DX11 adds 64-bit FP support as a capability you can query for (see the second sketch below). On the high end 5000 series parts it works well, too: 64-bit FP runs slower than 32-bit FP, but it's still plenty quick to be useful.

    3) Multithreaded rendering / GPU multitasking. DX11 offers much, much better support for having multiple threads and programs talk to the GPU at the same time. The idea is to have the GPU fully preemptively multitask, just like the CPU: a general purpose resource that can be addressed by multiple programs without stepping on each other. (The deferred contexts in the second sketch are the in-process piece of this.)
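
    To make (1) concrete, here is a rough C++ sketch of driving a compute shader through the D3D11 API on a DX10-class card; the cs_4_0 profile and the hardware-options check are what make that possible. This is purely illustrative and not from TFA: the kernel, the names, and the buffer size are made up, and error handling is omitted.

        #include <d3d11.h>
        #include <d3dcompiler.h>
        #include <cstring>
        #pragma comment(lib, "d3d11.lib")
        #pragma comment(lib, "d3dcompiler.lib")

        // Trivial kernel: double every element of a 1024-float buffer.
        // cs_4_0 is the "lower shader version" DX10-class hardware can run.
        static const char* kKernel =
            "RWStructuredBuffer<float> data : register(u0);\n"
            "[numthreads(64, 1, 1)]\n"
            "void main(uint3 id : SV_DispatchThreadID) { data[id.x] *= 2.0f; }\n";

        int main() {
            ID3D11Device* dev = nullptr;
            ID3D11DeviceContext* ctx = nullptr;
            D3D_FEATURE_LEVEL want = D3D_FEATURE_LEVEL_10_0, got;
            D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                              &want, 1, D3D11_SDK_VERSION, &dev, &got, &ctx);

            // Compute shaders are optional on DX10-class parts; ask the driver.
            D3D11_FEATURE_DATA_D3D10_X_HARDWARE_OPTIONS opts = {};
            dev->CheckFeatureSupport(D3D11_FEATURE_D3D10_X_HARDWARE_OPTIONS,
                                     &opts, sizeof(opts));
            if (!opts.ComputeShaders_Plus_RawAndStructuredBuffers_Via_Shader_4_x)
                return 1;  // no CS 4.x support on this card/driver

            // Compile against the cs_4_0 profile and create the shader.
            ID3DBlob* bytecode = nullptr;
            D3DCompile(kKernel, strlen(kKernel), nullptr, nullptr, nullptr,
                       "main", "cs_4_0", 0, 0, &bytecode, nullptr);
            ID3D11ComputeShader* cs = nullptr;
            dev->CreateComputeShader(bytecode->GetBufferPointer(),
                                     bytecode->GetBufferSize(), nullptr, &cs);

            // A structured buffer the kernel can read and write (slot u0).
            D3D11_BUFFER_DESC bd = {};
            bd.ByteWidth = 1024 * sizeof(float);
            bd.Usage = D3D11_USAGE_DEFAULT;
            bd.BindFlags = D3D11_BIND_UNORDERED_ACCESS;
            bd.MiscFlags = D3D11_RESOURCE_MISC_BUFFER_STRUCTURED;
            bd.StructureByteStride = sizeof(float);
            ID3D11Buffer* buf = nullptr;
            dev->CreateBuffer(&bd, nullptr, &buf);

            D3D11_UNORDERED_ACCESS_VIEW_DESC ud = {};
            ud.Format = DXGI_FORMAT_UNKNOWN;
            ud.ViewDimension = D3D11_UAV_DIMENSION_BUFFER;
            ud.Buffer.NumElements = 1024;
            ID3D11UnorderedAccessView* uav = nullptr;
            dev->CreateUnorderedAccessView(buf, &ud, &uav);

            // Bind and launch 16 groups x 64 threads = 1024 threads.
            ctx->CSSetShader(cs, nullptr, 0);
            ctx->CSSetUnorderedAccessViews(0, 1, &uav, nullptr);
            ctx->Dispatch(16, 1, 1);
            return 0;
        }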
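
    And a second sketch for (2) and (3), reusing dev and ctx from above. Double support is a capability you query before compiling shaders that use 'double', and deferred contexts are how worker threads record command lists for the immediate context to replay. Again just an illustration; the worker-thread work is only indicated by a comment.

        // (2) 64-bit floats in shaders are a queryable capability.
        D3D11_FEATURE_DATA_DOUBLES dbl = {};
        dev->CheckFeatureSupport(D3D11_FEATURE_DOUBLES, &dbl, sizeof(dbl));
        if (dbl.DoublePrecisionFloatShaderOps) {
            // Safe to compile shaders that use 'double' here.
        }

        // (3) Multithreaded rendering: record on a deferred context from a
        // worker thread, then replay on the immediate context.
        ID3D11DeviceContext* deferred = nullptr;
        dev->CreateDeferredContext(0, &deferred);
        // ... issue draws/dispatches on `deferred` from the worker thread ...
        ID3D11CommandList* cmds = nullptr;
        deferred->FinishCommandList(FALSE, &cmds);
        ctx->ExecuteCommandList(cmds, TRUE);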

    It's a worthwhile new API. Now I'm not saying "Oh, everyone needs a DX11 card!" If you have an older card and it works fine for you, great, stick with it. However, there is a point to wanting DX11 in all segments of the market. Hopefully we can start having GPUs be used for more than just games on the average system.

    Also, it makes sense from ATi's point of view. Rather than maintaining separate designs for separate lines, they unify everything: their low end DX11 parts are the same design as their high end DX11 parts, just less of it. Fewer shaders, fewer ROPs, narrower memory controllers, etc. That makes more sense for a low end part than a totally new design, and it keeps costs down, since most of the development cost was paid for by the high end parts.

    In terms of hyping it? Well that's called marketing.

  • Re:Why? (Score:5, Insightful)

    by Anonymous Coward on Saturday January 16, 2010 @06:17PM (#30794224)
    Same thing was said about DX10. And about HD4670.

    And about DX9 before that. And DX8 before that. And on and on. I'm amazed by how many people here don't seem to "get" that advances in technology are precisely how technology moves forward. I mean, it's really a pretty simple concept.
  • Re:Why? (Score:4, Insightful)

    by Kjella (173770) on Saturday January 16, 2010 @06:38PM (#30794408) Homepage

    For example, using 2-3 GPUs in one box, people doing architectural visualization can get their results in minutes instead of days.

    Yeah, and the point was that those people wouldn't be buying this card. Face it, a GPGPU isn't a general purpose CPU; we have some companies that are already damn good at making those. That means you either need it or you don't, and once you do, you'll probably want a lot of it. Companies and research institutions certainly have the money, and even if you're a poor hungry student you can probably afford to invest $200-300 in your education for an HD 5850, which has a ton of shaders compared to this. The only real purpose of this card is to phase in a chip built on a smaller process that'll be cheaper to produce. All they could have gained in performance they've instead cut in size.

  • Re:Why? (Score:5, Insightful)

    by BikeHelmet (1437881) on Saturday January 16, 2010 @06:52PM (#30794510) Journal

    Google Earth across 6 monitors from a single $100 card? Seems like technology is heading in the right direction!

  • Re:Why? (Score:2, Insightful)

    by Anonymous Coward on Saturday January 16, 2010 @06:56PM (#30794558)

    I'm sorry, I've seen this news go all around tech sites and... I don't get it. Yay, DX11. The biggest new features I could see about it were hardware tessellation and compute shaders. What, this requires a powerful GPU in the first place to be of any use? Something much, much better than this card? Oh....

    Sounds like AMD wants to pull an "NVIDIA GeForce FX 5200" on the market to see what happens. The FX 5200 was a massive failure: it was hyped for its DX9 Pixel Shader 2.0 features, which it ran at a grand 1-3 fps. Don't get me started on its unbelievably poor GLSL support either... But hey, it IS "The way it's meant to be played", so can YOU even complain!?

  • Re:Why? (Score:5, Insightful)

    by MrNaz (730548) * on Saturday January 16, 2010 @07:03PM (#30794618) Homepage

    Face it, GPGPU isn't a general purpose CPU, we have some companies that are already damn good at making those.

    Not quite accurate. While GPGPU != CPU, there are things that GPGPUs can do far better than CPUs, and those things are more common than you'd think: video encoding, image processing, and other big data-parallel number crunching, for a start.

    The only real purpose of this card is to phase in a chip built on a smaller process, that'll be cheaper to produce.

    Even though I don't agree that that's the only reason, isn't making the same product more cheaply a worthy cause in and of itself?

    I feel that you are being unduly dismissive.

  • by DAldredge (2353) <SlashdotEmail@GMail.Com> on Saturday January 16, 2010 @07:23PM (#30794770) Journal
    If it wasn't for those games, your 3D accelerator would cost much more than it currently does.
  • Re:Why? (Score:3, Insightful)

    by Antiocheian (859870) on Sunday January 17, 2010 @04:42AM (#30796988) Journal

    A new DirectX version is not technology moving forward. CUDA and PhysX are.
