An Early Peek At AMD's Radeon HD 4870 X2
Dr. Damage writes "AMD has quite a hit in the Radeon HD 4000 series. Coming up next is a product code-named R700, a high-end graphics card based on two 4870s paired together. TechReport has a preliminary look at how the card — to be called the Radeon HD 4870 X2 — performs. Nvidia could have one heck of a fight on its hands."
radeonhd driver? (Score:1)
Any idea if the radeonhd driver will be in a usable state for these? Or does nVidia still lack competition on the Linux front?
Re:radeonhd driver? (Score:5, Informative)
By the time they ship, we might have released working 3D drivers for these, through xf86-video-ati and xf86-video-radeonhd. Can't guarantee anything, though, since we don't even have the documentation, but I do know that there's been some NDA work going on already.
And yes, I AM a Mesa dev. :3
Re:radeonhd driver? (Score:5, Interesting)
Is there a place that lays out the current state of Radeon support in the various drivers in a way that someone who isn't a developer can make sense of?
When I was putting together my current box last week, trying to figure out which card was better to get was a pain when it came to the AMD hardware. I ended up getting the GTX 260, because it was the best performing card that fit into my budget and I knew it would work fine under Linux.
I couldn't make any sense of the state of the drivers for Radeon hardware. I gathered that the radeonhd driver was the actively developed one, but RV7XX hardware wasn't listed as supported [x.org]. The latest catalyst drivers [ati.com] didn't list support for the 4850/4870 either, so hearing that both drivers have working 3D support for a card not yet released is... not really odd, but the contradictions are symptomatic.
Re: (Score:1, Informative)
Radeon Driver Feature Matrix [x.org]
The RV7xx chips are similar to the R600 series, so the rightmost columns apply to them too.
For news, Phoronix is your friend (Score:2)
Phoronix [phoronix.com] is a very Linux-oriented news site which also closely follows development in both Radeon open source drivers, in the GeForce nouveau project, and in the official binary drivers from ATI and nVidia.
radeonhd.org [radeonhd.org] is a sister site they've put up which more specifically holds news about both drivers and links to specific resources.
Every once in a while, they do some benchmarks and thus you can have an idea about how these drivers perform.
Re: (Score:2)
Re: (Score:2)
Or does nVidia still lack competition on the Linux front?
They still have Intel to deal with, though. Granted, Intel's graphics cards are usually lower-end than nVidia's and ATI's cards, but even with nVidia you still have to configure things; with Intel it's simply install it and it works.
Re: (Score:1)
Drivers (Score:2)
The 4870 X2 has already been supported for a week [phoronix.com] (2D only) by both open source drivers, thanks to the AtomBIOS support.
For 3D see what the Mesa developer said a couple of posts above.
The Windows Catalyst and the Linux fglrx drivers share a lot of common code, and AMD has pledged to make an effort to keep the quality of the Linux drivers up.
The HD 3000 series saw very quick support in the closed source drivers, so the HD 4xx0 series will probably be supported on Linux quickly as well.
91+ degrees (Score:3, Funny)
Now that's a nice heater for the winter
Heat (Score:2, Interesting)
Re: (Score:2, Informative)
I de
excuse me (Score:4, Informative)
Re: (Score:1)
Re: (Score:2)
Re: (Score:2)
With the exception of laptops, are there any graphics cards available that won't make my room an inferno when I'm gaming?
GeForce 2 was pretty good, or maybe a Radeon 8500?
Nice and toasty (Score:5, Funny)
Crysis benchmarks are very good (Score:4, Insightful)
Re:Crysis benchmarks are very good (Score:4, Informative)
Re: (Score:2)
Re: (Score:2)
dont think so (Score:2)
Re: (Score:2)
length is important (Score:2)
wow (Score:2)
It's great to see ATI come back so strong. I honestly didn't expect it - it's keeping the entire market sharp. Good riddance to the $1,000+ prices of the 8800 Ultra a year ago!
I didn't see those prices ever. A year ago I was using my Radeon 1600 without any need for an upgrade. I upgraded to a 3870 for Age of Conan around 4 months or so ago, and at that time prices for the 3870 were around $280. The 8800 was around what, $500-600 or something. Good thing I didn't see $1k prices.
Quoting from TFA (Score:5, Interesting)
However, playing with this early sample of 4870 X2 is a vivid reminder that we don't make these choices in a vacuum. The reality is that a single Radeon HD 4870 GPU is nearly fast enough to keep pace with the GeForce GTX 280. Even if you're running a game that lacks a driver profile or simply doesn't scale well with more than one GPU, the 4870 X2 ought to perform awfully well. And when it does get both GPUs going, as our results show, it's by far the fastest single video card we've ever tested. If this is how AMD rolls, it's hard to complain.
That's good news for gamers' wallets.
holy @$#^#^%&# FSM! (Score:3, Funny)
FTFA:
That's, erm, considerable--beyond the obvious graphics applications, that's the sort of computing power that may one day enable men to figure out what women want.
If you are a guy and are looking at video cards to figure out what women want... errr, you're doing it wrong!
Even if you are referring to CPU cycles, they've tried this once, almost unanimously across the galaxy: 42 is not what women want.
Re: (Score:2)
It's funny you should say that.
On several occasions, a high end, overpowered video card is exactly what my wife has been looking for.
Re: (Score:1)
TFA discusses what women want? (Score:5, Funny)
... the X2's 1600 total stream processors have a peak computational rate of 2.4 teraflops. That's, erm, considerable--beyond the obvious graphics applications, that's the sort of computing power that may one day enable men to figure out what women want.
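As an aside, that 2.4 TFLOPS figure checks out with simple arithmetic. The sketch below assumes the commonly quoted 750 MHz core clock and one multiply-add (two floating-point operations) per stream processor per clock; those are spec-sheet assumptions, not numbers from the article.

    # Back-of-envelope check of the 2.4 TFLOPS figure quoted above.
    # Assumed (not from the article): 750 MHz core clock, one multiply-add
    # (2 FLOPs) per stream processor per clock.
    stream_processors = 1600          # two RV770 GPUs x 800 stream processors each
    flops_per_sp_per_clock = 2        # a multiply-add counts as two operations
    core_clock_ghz = 0.75             # 750 MHz
    print(stream_processors * flops_per_sp_per_clock * core_clock_ghz / 1000)  # -> 2.4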
Allow me to note that the very idea of plugging a woman's desires into a matrix processing unit is precisely what women do not want. It simply won't work.
To effectively compute female emotions, you'd need something like a quantum computer where you get all possible results at once (and I do mean simultaneously), usually with lots of yelling, doors slamming, and things being thrown.
Re: (Score:3, Funny)
It's noble of you to suggest, but I don't have what it takes to risk my life for science.
Re: (Score:1)
Re: (Score:1)
Allow me to note that the very idea of plugging a woman's desires into a matrix processing unit is precisely what women do not want. It simply won't work.
To effectively compute female emotions, you'd need something like a quantum computer where you get all possible results at once (and I do mean simultaneously), usually with lots of yelling, doors slamming, and things being thrown.
With the expected result of there being a "silent treatment?"
Re: (Score:2)
Re:TFA discusses what women want? (Score:5, Funny)
To effectively compute female emotions, you'd need something like a quantum computer where you get all possible results at once (and I do mean simultaneously), usually with lots of yelling, doors slamming, and things being thrown.
Sorry, it's not that easy, though you're right - it's a quantum effect. Womanly wants operate according to the uncertainty principle. It is possible to figure out what a woman wants, but as soon as you do, it's no longer true. If you think you're about to figure out what she's going to want, and you may very well be right, then you can't know what she wants right now, so you're still wrong.
Re: (Score:1)
It's exactly like trying to prove chaos theory, but the sheer act of measuring the computation changes the result, so you're always wrong.
Re: (Score:1)
Women's emotions reflect those of men. In order to know what women want one must first know what men want.
Given that the human race achieved peace only by pointing nuclear warheads at itself, good luck!
htpc usage - audio out (Score:2, Informative)
One bonus about these ATI HD series cards is that they support audio out through DVI. With a DVI-to-HDMI dongle they will also output 5.1/7.1 digital sound. Great for people who are using their PC as a home theatre hub.
4800 running too hot? (Score:5, Informative)
Make a profile in the Catalyst Control Center and make sure ATI OverDrive is enabled and checked. Now find the profile files in:
C:\Documents and Settings\{user name}\Local Settings\Application Data\ATI\ACE
Open the profile you just created in Notepad and change these lines:
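For illustration only, since the exact lines vary by Catalyst version and the feature names here are recalled from memory rather than taken from the comment, a manual fan-speed override in one of those ACE profile XML files looks roughly like this:

    <!-- Hypothetical sketch: names may differ between Catalyst releases -->
    <Feature name="FanSpeedAlgorithm_0">
      <Property name="FanSpeedAlgorithm" value="Manual" />
    </Feature>
    <Feature name="FanSpeedPercentTarget_0">
      <Property name="Want" value="40" />
    </Feature>

Save the file and re-apply the profile from CCC; the "Want" percentage becomes the fixed fan speed.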
My 4870 still idles at 58C or so, but anything over 30% is just too loud for me to have running all the time. Swapping the thermal paste on the GPU has also produced some good results for people.
Why do you think it is too hot? (Score:5, Insightful)
You have a misconception about what temperatures should be. They should be whatever the manufacturer rates the part at. Not all parts have problems with high temperatures. My 8800 runs at about 90C and has done so for a long time, still works great.
Have some faith in the companies to test this. They have it run hot because it can run hot without ill effects.
Re: (Score:1)
>90C
8800 GTX?
-Viz
Re: (Score:2)
I see what you are trying to say. Nonetheless, you must bear in mind that the graphics card isn't the only component of a PC. There are also a whole gob of components which may not have such a high thermal tolerance. So your graphics card may work well at very
Re: (Score:2)
How is turning UP the GPU fan going to help the OTHER components?
Re: (Score:2)
Yeah, it takes heat off the GPU, but it dumps it into the "common air" zone, which increases the temperature of the air flowing over other components, reducing the effectiveness of their heatsinks and fans.
PCB substrate is, unfortunately, a fairly poor thermal conductor, which is why we need the fans in the first place.
Now, a ducted GPU fan might have some benefit, depending on how the ductwork is routed.
Nevertheless, the most important factor is not the temperature of the GPU, but the heat dissipated by
Re: (Score:2)
Re: (Score:2)
Uhh...
When talking about the temperature of the CPU with respect to fan speed, the conversation is about dissipating a set amount of heat, not "not generating it".
The energy is still there in the form of heat, it's just a matter of where it's located.
Re: (Score:2)
Aren't you buggered if the ambient temperature in your parts goes up 10C, though?
Re: (Score:2, Insightful)
*Doing this mod disables active fan control on your card. The fan will run at the set percentage of its full speed all the time. Setting that number too low can result in overheating and permanent damage to your card. Mod at your own risk.
Re: (Score:2)
Is it actually a problem? These things are designed to cope with very high temperatures. My 8800GTS 512 idles at 68C; I can't say I'm too worried about it. By the time it dies, it's going to be getting sand kicked in its face by £30 passively cooled cards.
Re:4800 running too hot? (Score:5, Informative)
In fact, the article addresses this issue; see this page [techreport.com]:
All of the Radeon HD 4800-series cards we've tested have produced some relatively high GPU temperatures, and this early X2 card is no exception. When we asked AMD about this issue in relation to the 4850 and 4870 cards now shipping, they told us the products are qualified at even higher temperatures (over 100°C) and tuned for low noise levels. In other words, these temperatures are more or less by design and not necessarily a problem.
Get ATI Tray Tools (Score:3, Informative)
~50 watt video card (Score:2)
Are there any decent video cards that run without adding another case fan and a 1000W PSU to my system?
Re: (Score:2)
I for one... (Score:2)
I run an 8800GT off a decent 380W power supply. The power on the 12V line is abnormally high for a 380W rating, but still. The 8800GT does require an extra connector. My Antec Solo keeps it respectable at medium fan speed on the single 12 cm fan. I pummelled it repeatedly over the months and could not get it to hang or do anything erratic, so I'm confident that this power supply is adequate for my setup.
A problem pervading power supply 'requirements' is that no vendor can require that simple rating. Th
Yea but what about memory? (Score:2, Informative)
They need to get the memory bus width straightened out. The 4870 GPU does about 1.2 TFLOPS, the Nvidia GTX 280 something like 933 GFLOPS, but the GTX 280 beats it handily in framerates.
This is largely because the 280 can get textures from memory to the GPU much faster (115 GB/s vs 141 GB/s, a 256-bit bus vs a 512-bit bus on the 280) for compositing. The 280 also has 1GB of video memory.
Given equal memory subsystems the 4870 would smoke it. The memory subsystem on the 4870 is a huge handicap.
Unless the upcoming dual GPU doub
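For what it's worth, those bandwidth figures fall straight out of bus width times effective transfer rate. A minimal sketch in Python, assuming the commonly quoted memory clocks (spec-sheet values, not numbers from the article): 900 MHz quad-pumped GDDR5 on the 4870, 1107 MHz double-pumped GDDR3 on the GTX 280.

    # Peak memory bandwidth = bus width in bytes * effective transfer rate.
    # The transfer rates below are assumed spec-sheet values, not from the article.
    def bandwidth_gb_s(bus_bits, effective_mt_s):
        """Theoretical peak bandwidth in GB/s."""
        return (bus_bits / 8) * effective_mt_s / 1000.0

    print(bandwidth_gb_s(256, 3600))   # Radeon HD 4870: 256-bit, 3600 MT/s -> ~115.2 GB/s
    print(bandwidth_gb_s(512, 2214))   # GeForce GTX 280: 512-bit, 2214 MT/s -> ~141.7 GB/s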
Re: (Score:1)
You are talking about the 4870, not the 4870 X2. The 4870 X2 has twice the memory bandwidth of a single 4870 (duh) and performs accordingly.
Re: (Score:2, Informative)
No, it has the same bandwidth to each GPU. They don't share texture memory. If they did, it would be a lot faster than two 4870s in Crossfire mode.
As it is, the 4870s in Crossfire edge it out. They alternate frames and use discrete memory allocated to the individual GPUs for textures. It's a pair of RV770 GPUs with the same problem on one PCB.
4870s that aren't memory starved will smoke this, like I said in the last post. This card is still memory starved. It's two 256-bit data paths, one to each GPU. The auth
Re: (Score:2)
Wrong.
Read the article: the 4870 X2 will come with 2GB of memory, giving it an effective 1GB since the contents need to be duplicated for each GPU.
Nvidia g84 and g86 chipset problems (Score:1)
Re:1gb mem (Score:5, Informative)
1Gb != 1GB
Re: (Score:2)
Try reading the comment mine was attached to.
Re: (Score:3, Informative)
Re: (Score:2)
I'll choose the 8Gb, not the 1GB!
But they are the same!
No, 8Gb is more than 1GB, as 8 is a larger number than 1.
It is too bad that people just don't want to type out GigaByte and GigaBit. Heck, I would like to see GibiByte and GibiBit as well, so you really can tell the difference.
Re: (Score:2)
16 chips actually. You missed the eight chips per processor part in your calculations.
Re: (Score:2)
Re: (Score:2)
Read the first sentence again: "The board has eight Hynix GDDR5 memory chips per graphics processor".
Eight x 1 Gib per GPU = 8 Gib = 1 GiB per GPU.
Re:Driver Support (Score:5, Informative)
Re: (Score:2, Funny)
Re: (Score:3, Informative)
While I had no problems running XP or Vista using ATI drivers, I certainly have issues running X on Linux with ATI drivers. X keeps crashing at the weirdest times, whereas I have no problem with NVidia drivers.
Re: (Score:1, Informative)
I had more trouble getting X to work properly with the ATI drivers than the NVidia drivers, but I've gotten both to work (and stable) recently. My biggest nightmare was when I tried to use an ATI card with Sabayon linux. I could only get half of the graphical features working at any given time, but beyond that I haven't had any issues.
Re: (Score:1)
Re: (Score:2)
Are you serious? With the fglrx drivers, I get KDE4's compositing features. Great. But any time I try to shut down X, I get a hard lock. With the radeonhd driver, I get FEWER crashes (using the git code - the last released version didn't work for me, either), but no compositing, and even video is shaky. (AMD64, quad-core, with an ATI HD 3870 card.)
This is compared to my old nvidia-based P4, where video was *always* rock-solid using the proprietary drivers.
Re: (Score:1)
I don't know if this will help you but there is a fix suggested there that resolved the issue for me.
Re: (Score:2)
Unfortunately, that isn't true in my case (Radeon HD 3850). 8.6 led to the death of my motherboard--it ran some hardware autodetection program and I guess the pre-selected options were incorrect for my case (northbridge and IDE options were selected).
After rebooting, Windblows blue-screened shortly after entering the desktop. It kept doing that (and sometimes even rebooted itself without a BSOD), so I tried to install an older driver in safe mode. After the installation failed (the hardware detection por
Re: (Score:3, Interesting)
Re: (Score:1)
Trounced is a strong word for performance; you only trounce the competition if you can wallop t
Re: (Score:3, Informative)
Once again, games do not support drivers. (Score:1, Insightful)
Nor do drivers support games. Drivers support the OS and the API. The only thing a game supports is the API and the OS.
Re: (Score:2)
Re: (Score:1)
Too true. I won't be the first person to try them out. I remember the old Catalyst drivers as being the impetus for my initial switch to Nvidia. I'll wait until the dust settles around the holiday season and see how the other geeks at work do using ATI cards. Two guys plan on switching to 48xx something. One guy for sure won't, even if Nvidia is slower, because he can't sta
Re: (Score:2)
It makes me laugh that people keep saying that Macs are too expensive, then they turn around and say stupid things like "$400 is a good price for a video card". $400 is 2/3 of the price of a Mac mini.
What a stupid argument. If you want that video card you want to play games. If you want a Mac that will play games, it will cost damn near twice as much as a comparable PC that will do so. If all I want is a web browser I can pick one of those up for a couple of hundred dollars.
Re: (Score:2)
Wow.
There is no need to spend $400 for a video card to play games. There are PLENTY of $150-range cards that are more than adequate to play everything currently available, and almost everything at "max settings."
Further, if your budget for video cards is $400, you're better off spending $150 now and $150 in two years than blowing it all at once.
Compute power is a moving target. The wise buyer optimizes for cost per relative capabilities over time.
Re: (Score:2)
Wow. There is no need to spend $400 for a video card to play games. There are PLENTY of $150-range cards that are more than adequate to play everything currently available, and almost everything at "max settings."
I quite agree. Which is why I think that the extra markup Apple put on top of this is incredible.
Re: (Score:2)
Please name one person who has said both of these things.
Re: (Score:1)
Re: (Score:2)
All Macs have a GPU. Some have Intel, some have ATI, some have nVidia. I don't see how GPUs are a Windows/Linux-only topic.
Re: (Score:2)
Re:Video card prices vs Mac prices (Score:4, Insightful)
Calling people stupid for buying a $1,500 Mac is okay, but calling people stupid for buying a $400 video card is a troll.
Typical slashdot.
Re: (Score:2)
Re: (Score:2)
Right, so the only measure of value is how much "gaming" your hardware will be enough for.
Therefore, typical Slashdot.
(Never mind the original point that you are paying a couple hundred dollars more for the video card just to turn some inconsequential settings from 10 to 11.)
Re: (Score:1, Insightful)
You need to understand that some people use the computer for stuff that goes a bit beyond sending granny some emails or browsing MySpace, for stuff where some computational muscle does in fact make a difference. Some enjoy playing recent computer games, which are very GPU-intensive, and some even need a high-powered machine for number-crunching tasks for which they are paid. For that class of users, powerful computing components are needed, components whose cost is proportional to the price
Your argument doesn't have much substance (Score:1)
... So I'll make sure mine does. My argument might be that Apple computers, while they do not typically represent a great value, really aren't as overpriced as everyone tries to make them sound. I'm far from an Apple fanatic, but I have to tell you, I get tired of seeing people argue something they've made no effort to crunch numbers on.
For example, their $2800 Mac Pro desktop [apple.com] has dual 2.8GHz Xeons (Harpertown, quad-core). On Newegg, they are each over $700 [newegg.com] a pop. The board to put them in is a Dual
Re: (Score:2)
What, exactly, are you going to do with your three-thousand-dollar laptop, anyway? If your answer is CAD, you're probably buying such machines in bulk, and you're probably not going to be able to use Macs anyway.
If your answer is research, you've probably got a hundred-plus-node cluster stashed away in a fishbowl room at a corridor crossing in one of your university's buildings. (Macs make great front-ends to research computers, though, but I'm not sure the $2800 model would be a wise investment.)
If your answer i
Re: (Score:2)
Second: yes, we use PCs to do more than just 'arranging our mp3s', 'shuffling our photo albums', 'typing a document' and 'surfing the internet'. That includes gaming, one of the biggest forms of entertainment of the 21st century.
Re: (Score:1)
Well, if it were a video card that you bought through Apple, it would likely be marked up to the point where you'd need a mortgage to cover it.
Re: (Score:2)
Changed my password, some fucker was using my account. Sorry for the crap he/she wrote.
Re: (Score:3, Insightful)
Changed my password, some fucker was using my account. Sorry for the crap he/she wrote.
Me too!
Man, my impersonator was a real jerk. Nothing but lucid, excellent posts from now on.
Re: (Score:2)
Re: (Score:2)
Competition = good news for the consumer.
Not really. If you look at the Linux support for both nVidia and ATI you will find that they are both lacking. And Intel isn't much competition for them because, even though their chips are commonly used, they aren't as high-end as nVidia's or ATI's offerings.
4850 (Score:3, Informative)