AMD's Radeon R9 290 Delivers 290X Performance For $150 Less
crookedvulture writes "The back and forth battle for PC graphics supremacy is quite a thing to behold. Last week, Nvidia cut GeForce prices in response to the arrival of AMD's latest Radeons. That move caused AMD to rejigger its plans for the new Radeon R9 290, which debuted today with a higher default fan speed and faster performance than originally planned. This $400 card offers almost identical performance to AMD's flagship R9 290X for $150 less. Indeed, it's often faster than Nvidia's $1000 GeForce Titan. But the 290 also consumes a lot more power, and its fan spins up to 49 decibels under load. Fortunately, the acoustic profile isn't too grating. Radeon R9 290 isn't the only new graphics card due this week, either. Nvidia is scheduled to unveil its GeForce GTX 780 Ti on November 7, and that card could further upset the balance at the high end of the GPU market. As AMD and Nvidia trade blows, PC gamers seem to be the ones who benefit."
Additional reviews available from AnandTech, PC Perspective, Hot Hardware, and Tom's Hardware.
290X (Score:5, Informative)
Re: (Score:2)
You're not the only one. I was about to get my credit card to order parts for a Final Fantasy XIV DX11 box.
Anandtech Fucked Up (Score:4, Interesting)
They used a shitty case with an absolutely horrible acoustic profile to measure the card noise and got a whopping 57 dB.
Had they bothered to use a real case, it would have been almost half as loud (it looks like everyone else managed to stay under 50 dB).
It's like Anandtech never heard of Delta Fans, either.
Re: (Score:2)
Okay, I'll bite. What's wrong with the Phantom 630 case that Anandtech used? It has reviewed reasonably well, as far as I can tell.
Re: (Score:2)
The acoustic profile is absolutely horrible compared to, say, a HAF 922 or Fractal Define R4, hence Anand's nearly-double loudness measurement versus everyone else who had the brains to choose a proper computer case.
Well, they could've gone worse. They could've gone with an old SGI tower.
Re: (Score:2)
Then again, they are measuring the noise of the graphics card, not the noise dampening of a case.
They could measure it with no case at all...
Re: (Score:2)
Derp, you're right. My HAF had foam panel inserts added, I forgot.
Re: (Score:2)
Re: (Score:2, Informative)
It probably has more to do with taking measurements at a 12-inch distance rather than something reasonable, or even standard, like 3 feet. While they're not perfect, I find that TechPowerUp [techpowerup.com] has the best noise measurements and the largest sample size of different cards.
Re:Anandtech Fucked Up (Score:5, Insightful)
Re: (Score:2)
Fanboi of what, moron?
Show me the fanboyism. I think you'll find I can and do bash anything at any opportunity, including your dumb ass.
I've got cheap-ass $10 Chinese cases quieter than that crappy thing Anandtech chose.
Also, if Anandtech complains about noise, it's obvious they've never had a Delta fan or a 5800FX.
Re: (Score:2)
According to many other tests, the 290 and 290X in fact run hotter, and thus need beefier fans, than their GeForce counterparts (the 780 and its cousin the Titan) - and by a rather large factor.
If you don't mind a little noise, though, the AMD is a pretty good value right now. My 780 TF is nearly silent under load. I chose quiet - but if I had a really good case, I'd be tempted.
Re:Anandtech Fucked Up (Score:5, Insightful)
Noise measurements (all noise measurements, not just those related to PC hardware) are always suspect:
What is the ambient noise level?
What is the test environment? (Is it a well-isolated anechoic chamber, a common desk with a computer near the corner of the room, or is it on the deck of a boat, or on the back of a llama? It makes a huge difference.)
What is the distance between the rig under test and the measurement rig with the microphone?
Is this test rig calibrated? (To what standard?)
What are the properties of the noise? (If it is 57 dBA at only 1.5 kHz, it is very annoying to me. If it's 57 dBA only at 25 kHz, it is annoying only to my dog.)
Is the noise different in differing directions?
How do you know?
Did you measure it?
It's all important, lest the resultant number be absolutely unimportant.
Also: Meh. "This blue car sounds better than that other blue car!" is roughly as accurate as a nondescript "noise measurement" of computer hardware.
ambient heat (Score:2)
Ambient heat makes a bigger difference, as the fan will have to work harder and spin faster, and therefore louder, to keep up.
Particularly when the mode of cooling is basically shoving as much ambient air at the problem to solve it.
You can try to correct for it, but then you assume the cooling curve is consistent, which it isn't.
Re: (Score:2)
A bigger difference? How so?
Sound falls off at 6 dB per doubling of distance. There's a world of difference between Joe's measurement at 3 inches from the card and Sherli's measurement at 1 m from the card.
And that's just -one- vector for process-induced measurement SNAFU.
Sure, ambient heat (or rather, the speed of the fan, which may or may not be directly related to ambient or any other temperature, depending on yet other test conditions) makes a profound difference. A bigger one? Nope, sorry. Not buyin'.
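As a quick sketch of that falloff, here's the 6 dB-per-doubling rule in Python (this assumes a free-field point source; real rooms reflect, so treat it as a rough normalization, not ground truth):

    import math

    def spl_at_distance(spl_ref, d_ref, d_new):
        # Inverse-square law: level drops 20*log10(d_new/d_ref) dB,
        # i.e. about 6 dB per doubling of distance from a point source.
        return spl_ref - 20 * math.log10(d_new / d_ref)

    # The thread's 57 dB reading at 12 inches, normalized to 1 m (~39.4 in):
    print(spl_at_distance(57.0, 12.0, 39.4))  # ~46.7 dB

If distance were the whole story, that one correction alone would put the 57 dB reading under the 50 dB the other reviews reported.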
Re: (Score:2)
Oh yes, I still remember the Delta fan on that particular GlobalWin copper CPU cooler. I built the PC for an almost-deaf gramps, and he actually complained about how loud it was. Vacuum cleaners pale in comparison.
Re: (Score:2)
Re: (Score:2)
Decibels are a logarithmic scale: for every +/-10 dB, you've either doubled or halved the perceived loudness. What nonsense are you speaking?
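In code, that rule of thumb looks like this (note the distinction: sound power grows 10x per +10 dB; perceived loudness only roughly 2x):

    def loudness_ratio(delta_db):
        # Perceptual rule of thumb: +10 dB sounds about twice as loud,
        # so the ratio is 2 ** (delta_db / 10). Physical sound power
        # instead scales as 10 ** (delta_db / 10).
        return 2 ** (delta_db / 10)

    print(loudness_ratio(57 - 49))  # a 57 dB card sounds ~1.7x as loud as a 49 dB one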
Better headline: AMD's Radeon R9 290 Slashvertised (Score:5, Insightful)
Seriously, it could only have been worse if there was "ON SALE NOW!" in the summary. Then again, there is "Nvidia cut GeForce prices," so meh.
Re: (Score:2)
Not really. It has to run harder to reach performance similar to a higher-tier card, with high energy consumption, lots of noise, and a GTX 780 Ti coming soon.
May sell to some, not to others.
Re: (Score:2)
You must have read the fucking summary, cheater.
May sell to some, not to others.
I built a quiet gamer for the kid after years of wind tunnel simulators, and spent quite a bit extra on that silence, too.
The extra I spent would buy one of these things, actually. The point I was getting to is that I recently realised that the gunfire and explosions pretty well drown out most other noises in the region, and we could have stepped up to a faster card anyway.
It'll make a nice Home Theater when we move on.
Re: (Score:2)
The point I was getting to is that I recently realised that the gunfire and explosions pretty well drown out most other noises in the region
That's what headphones are for. Seriously, even if noise wasn't an issue, I used to notice that headphones actually made me a better player in online games, because I could more accurately judge where an enemy was just from the sound alone. So, unless you've got a perfectly positioned surround sound setup hooked up to the PC, headphones are probably best for everyone.
Re: (Score:2)
True about hearing other players (or so I've heard), but the cheap-ass $100 headphones have never lasted. I got an extra little baby Mackie and a Rode* condenser mic, but still haven't scored the good-quality headphones for that upgrade. (Got any brand recommendations for that?) I think the kid likes to rumble the neighborhood anyway.
Hmmm... 5.1 eh?
*I can't render that letter, do I spell it Rude?
Re: (Score:2)
I don't think price is necessarily anything to go by when it comes to headset quality, and I can't recommend any as I've not bought any for years, sorry. Just check plenty of Amazon reviews and you should get a good idea of build quality, etc.
Re: (Score:2)
Yeah, I'm gonna get some regular studio-grade or audiophile-type, but I haven't shopped for those since 1978. I'll be fooled by a reasonable facsimile of durability; reckon they'll pack that with viable transducers. I don't even know who the manufacturers are these days.
The hoboroadie buys one piece at a time, so it takes a while to integrate.
Re: (Score:2)
I know that Sennheiser make good headphones, though avoid the ones with carbon fibre headbands. Mine cracked after a few months of popping them on and off my head. I then used a metal/leather headband from a cheaper Sennheiser set that my flatmate wasn't using, and it was comfortable. The transducers were great though, and you could replace the cabling very easily if needed, so I think a pair with a good headband would last you a long time.
Re: (Score:2)
(there is a lot to be said for a good bassy rumble though :D )
Re: (Score:2)
Considering how shitty nvidia drivers have been since ~292.xx? They'd have to pay me to buy one of their cards at this point. It seems they've done a great flip, as has happened in the past, and outsourced their driver development to 3 cats and a dog.
Re:Better headline: AMD's Radeon R9 290 Slashverti (Score:4, Interesting)
It wasn't always that way. For the most part you could just use the latest drivers and everything would be OK, but about 2 years ago I started having issues where one game wouldn't work with one driver while another game wouldn't work with the drivers the first one worked with... which bothered me but didn't push me over the edge. Then came the reports in June of the newest drivers killing cards and rendering horrible artifacts in many games...
It's a shame, because I was really eyeballing that vanilla GTX 650 that runs on 64 watts...
In the interim I picked up an A10-6800K with its integrated HD 8670D, which I am extremely impressed with (low expectations shattered), and now I am eyeballing the HD 7790 that runs on 85 watts.
Re: (Score:2)
It wasn't always that way. For the most part you could just use the latest drivers and everything would be OK, but about 2 years ago I started having issues where one game wouldn't work with one driver while another game wouldn't work with the drivers the first one worked with...
Maybe YOU haven't been having these problems, but they have literally been a part of the nVidia world since the GeForce, if not earlier.
Re: (Score:2)
As opposed to AMD's drivers working flawlessly?
Oh no. I have been having problems with their video drivers since long before ATI even had a 3d accelerator.
It's a GPU issue, I don't think any particular brand is doing better than the other.
So far, I'll still have to side with nVidia. Their drivers mostly work, while ATI's mostly fail. For me. Perhaps if people run different software, their experiences are reversed. But the nVidia driver version problem has long been an issue, with some bugs only appearing for some titles in later drivers as the way old functions work is diddled to accommodate new software. Perhaps each and ever
Re: (Score:2)
I was looking at lower-end cards to get a 3-year-old desktop with integrated AMD graphics to last another year, but run Star Citizen. The tower had a 300 watt power supply, so I was looking at having to replace both the power supply and get a video card, at around $150. Or about 1/10th of what I'm planning to spend next year, when it will be time to upgrade PCs anyway.
Well, I ended up getting the R7 240, which runs on 30 watts. I know it's about the equivalent of a 6670, but it will run Star Citizen on Low/Medium.
Title should focus on AMD vs Nvidia (Score:5, Interesting)
The real story is that a $400 AMD card can perform as well as or better than a $1000 Nvidia one....
Re: (Score:2)
The FPS per dollar scatter plot on page 9 of the linked article (here [techreport.com]) is really telling. There's a surprisingly tight correlation between dollars and FPS for almost all of the cards, and then the GTX Titan is way off in no man's land. Nvidia's going to have to drop the price, unless it's just there to soak up money from people with more dollars than sense.
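As a toy illustration of that FPS-per-dollar metric (the card names, prices, and FPS below are invented placeholders, not figures from the review):

    # Rank cards by value; all numbers here are made up for illustration.
    cards = {
        "Card A": (400, 60.0),    # (price in dollars, average FPS)
        "Card B": (550, 63.0),
        "Card C": (1000, 66.0),   # the outlier-in-no-man's-land pattern
    }

    for name, (price, fps) in sorted(cards.items(),
                                     key=lambda kv: kv[1][1] / kv[1][0],
                                     reverse=True):
        print(f"{name}: {1000 * fps / price:.0f} FPS per $1000")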
Re:Title should focus on AMD vs Nvidia (Score:5, Informative)
The Titan isn't positioned as a high-end gaming card so much as a low-end scientific computing card. It's the cheapest GPU with reasonable double-precision floating-point performance. For whatever reason, most Kepler cards run DP operations at 1/24th the speed of single-precision, but the Titan and most of the Tesla cards can do so at 1/3rd the speed. There, the Titan costs thousands less than the comparable Tesla cards (the K20 is listed on Newegg for $3500, and the K20X is on Amazon for $7700).
The fact that the Titan also gets some buys from gamers with way too much money is just a side bonus. Ever since the 780 came out, it's been extremely wasteful to get a Titan for gaming. And Nvidia's own 780 Ti is likely to outperform the Titan in games for $300 less. Really, I think the only reason they ever marketed it as a gaming card was as a publicity stunt - they held the title of "fastest card ever" for quite a while, and by an impressive lead.
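A back-of-the-envelope sketch of what those DP ratios mean for throughput (the single-precision figure below is a placeholder, not a quoted spec):

    def dp_tflops(sp_tflops, dp_ratio):
        # Theoretical double-precision rate given the single-precision
        # rate and the hardware's DP:SP ratio described above.
        return sp_tflops * dp_ratio

    sp = 4.5  # hypothetical SP TFLOPS for a Titan-class card
    print(dp_tflops(sp, 1 / 24))  # ~0.19 TFLOPS at the consumer-Kepler ratio
    print(dp_tflops(sp, 1 / 3))   # ~1.5 TFLOPS at the Titan/Tesla ratio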
Re:Title should focus on AMD vs Nvidia (Score:5, Interesting)
This.
Yesterday and today I installed 20 Titans in a compute cluster at work, replacing the crappy GTX 480s that crash constantly. The cost of these ($20K) would buy us TWO nodes on the local K20 cluster.
We don't really care that much about the float performance, even; much of our code is memory-bandwidth bound, and much of the rest runs iterative sparse-matrix solvers that can be run in "mixed precision", where you iterate a hundred times in single precision, do one update in double precision, a hundred more in single... So we could use the cheaper gamer cards, but the Titan's a price/performance sweet spot that we can't beat. It's even faster than the K20, and compared to the other gamer cards the 6GB memory gives us a huge amount of flexibility.
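The pattern described is essentially mixed-precision iterative refinement. Here's a generic dense-matrix sketch in NumPy (the poster's actual code uses sparse iterative solvers, and the single-precision phase there is many iterations, not one direct solve):

    import numpy as np

    def mixed_precision_solve(A, b, iters=5):
        # Do the heavy solve work in float32, then correct the answer
        # with residuals computed in float64.
        A32 = A.astype(np.float32)
        x = np.linalg.solve(A32, b.astype(np.float32)).astype(np.float64)
        for _ in range(iters):
            r = b - A @ x                                    # double-precision residual
            d = np.linalg.solve(A32, r.astype(np.float32))   # single-precision correction
            x += d.astype(np.float64)
        return x

    rng = np.random.default_rng(0)
    A = rng.standard_normal((100, 100)) + 100 * np.eye(100)  # well-conditioned test matrix
    b = rng.standard_normal(100)
    x = mixed_precision_solve(A, b)
    print(np.linalg.norm(A @ x - b))  # residual shrinks toward double-precision accuracy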
Re: (Score:2)
Lattice gauge theory calculations. We use Monte Carlo techniques to numerically evaluate the Feynman path integral to simulate the behavior of quarks in the medium-energy regime, which is the interesting one since it governs the properties of protons and neutrons. Other techniques (perturbative QCD) work fantastically at higher energies, and at low energies effective field theories like chiral perturbation theory work, but in the medium-energy regime all anyone knows how to do is to brute-force the quark dy
Re:Title should focus on AMD vs Nvidia (Score:5, Informative)
But again, for gaming, it's entirely unnecessary. Heck, it's extremely likely that the 780 Ti, which should be revealed in a few days, will basically be a Titan with higher clocks, slower double-precision operations (whereas the 780 has a few fewer cores), and less VRAM.
um (Score:3)
Who the hell spends $400+ on a video card anymore? How many games will come out in the next year that will get any benefit from a card over $200? Two? Maybe three? And don't forget, a year from now the $200 mid-range cards will outperform this card anyway.
Re: (Score:2)
That's a year of gaming on top settings, or emerging 4K resolutions to consider. We have this generation of games, SSDs, Windows 8.1, CPUs, bandwidth, RAM, and LCDs at ~usable levels.
The "GPU," as one card or more, is the interesting part to get right, with drivers and ongoing issues.
Drop the resolution and quality and today's mid-range cards are good, but where is the fun in that?
How the brands write their code, deal with the heat and work over 2 or
Re: (Score:2)
Since dual top-of-the-line cards (SLI/CF) from AMD/Nvidia barely put out 30 fps at 4K in recent games, I'm guessing we need better video cards, not worse.
Or better eyes (Score:2)
Re: (Score:2)
1080p was a "novelty" at first, too.
Re: (Score:2)
But the human eye can't tell the difference.
http://icdn3.digitaltrends.com/image/720vs1080-625x1000.png [digitaltrends.com]
unless you've got your nose to the monitor and have a 30" screen
This is just like the audiophile garbage. We hit "max quality" in audio sometime in the '80s after CDs came out. And yes, if you have crap speakers you can still get poor quality, but the fact of the matter is any stereo at Walmart that costs more than $200 would produce sound indistinguishable from a $10k "audiophile" amp you got from a boutiq
Re:um (Score:5, Interesting)
Re: (Score:2)
What sort of games do you play? Flight sim?
The larger monitors tend to have higher latencies, so they're not so good for games where higher lag would make a difference. Should be fine for flight sims I guess.
http://www.displaylag.com/display-database/ [displaylag.com]
There aren't many big monitors with 16ms lag (16ms = 1 frame at 60Hz), except maybe some Sonys? For some reason the lag tends to get crappier the bigger the screen gets. Despite what the database says, I don't consider 30ms lag to be great when it comes to
Re: (Score:2)
Also, 14 year olds who have daddy's credit card number and want super-realistic explosions while playing CoD / Battlefield online.
Re: (Score:3)
I do, but it's not as bad as you think. I started buying the $1k cards about 10 years ago, and sell them after a year for roughly $700-800. There always seem to be people looking for "older" cards to SLI with their current setup. So although I initially did pay $1k to buy into the game, so to speak, I rarely spend more than $200-300 to upgrade to the latest and greatest at any given time. It's not like I'm dropping $1k a year.
Do I need it? Definitely not, since the popularity of consoles has gimped the ad
Re:um (Score:5, Insightful)
That's kinda how all consumer (and even most non-consumer) stuff works.
You have the enthusiasts who for whatever reason have a stronger interest in the technology and are willing to spend significantly more for slightly better. They fund the R&D until it makes it down to the cheaper mass consumer pricing.
Personally I don't see anything wrong with this. I for one was an early adopter of SSDs. I bought one (then another) when 30G was still a big deal. I knew in a few years you'd get way more capacity for way cheaper... but I didn't care; it was something I wanted to play around with.
If someone has the money to spend and is going to get enjoyment out of paying $1000 for a card where a $200 or so card would probably do, so what... their money, their hobby.
Re: (Score:2)
You obviously didn't buy an SSD early on. They were $400 as well.
Re:um (Score:4, Informative)
I think you're underestimating how much GPU power games need these days. I bought a Dell 30" monitor 5 years ago, which I'm still using for gaming. The native resolution is 2560x1600, so not even close to the new 4K ones. At that resolution, my 3-year-old Radeon 5870 was struggling to get smooth framerates in several games. So I bought the new GTX 780 when it came out for $600. The new card is fantastic; I can finally play The Witcher 2 at full resolution with high settings, same with Bioshock Infinite, etc. Keep in mind, the new 4K resolutions will demand even more out of GPUs, so it's not likely that demand will go down all that much yet.
Sure, if you're a gamer who fires up a 1080p console port once in a while, a cheap GPU will do. If you're an avid gamer who needs more than 1080p, you still need to buy the $400+ cards to keep up.
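For a sense of why resolution drives GPU demand, compare raw pixel counts; all else being equal, shading and fill work scale roughly with pixels per frame:

    # Pixels per frame at the resolutions discussed above.
    resolutions = {
        "1080p": (1920, 1080),
        "30-inch WQXGA": (2560, 1600),
        "4K UHD": (3840, 2160),
    }

    base = 1920 * 1080
    for name, (w, h) in resolutions.items():
        print(f"{name}: {w * h / 1e6:.1f} MP, {w * h / base:.2f}x the pixels of 1080p")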
Re: (Score:3)
The first commercial WQXGA displays were released in 2010, so (barring time travel) I call shenanigans.
And I think you overestimate the dot pitches discernible by the human eye :)
4k is just a hype machine.
Re: (Score:3)
Re: (Score:3)
My monitor is a Dell 3007WFP, which was released in the US in December 2005 (so 8 years ago). It's also far from being the only 30" monitor released at that time with that resolution, although most modern 30 inchers seem to be 16:9 instead of my preferred 16:10.
As for pixels, I can definitely see pixels on mine if I disable AA in games, although I only need 2xAA for it to appear totally smooth. 4K would probably take it close to the point where the pixels aren't visible anymore from 3 feet away even with AA
Re: (Score:2)
290 times the performance? (Score:2)
Sign me up!
Fans? Who uses fans? (Score:2)
You're spending how much on video hardware, and you're still running air cooling?
Put in a good water loop already, sheesh... :-/
Who needs a Bugatti Veyron? (Score:4, Insightful)
I mean, there are no roads where you can safely and legally drive it at its top speed, so you may as well get a Mazda MX-5. Similarly, every single time there is a new graphics card out, the Slashdot response is the same: "Who needs this? There is minimal difference between this and that! Are there any games taking advantage of this?"
If you have the money and you're an avid gamer, why not? If you can afford to spend $500 on a graphics card every year, I'm sure you also have a top-notch monitor with a massive resolution. Also, I'm sure there is always another setting you can switch on in Crysis N. Most of the people who buy these cards aren't suckers. They know a card won't provide them with 3x as much enjoyment even though it costs 3x as much. They simply can afford to stay above the affordability sweet spot.
They also pave the way for the rest of us and ensure that there will be a card next year which does the same for half the price.
I can't help but think this reaction is mostly about penis^H^H^H^H^Hgraphics card envy.
Re: (Score:2)
Astroturf much?
Just out of interest, since I could do with a bit of free cash, how much is AMD paying for shilling these days?
Re: (Score:2)
It gets WORSE. Now the new consoles are here, each with 8GB of memory, the average amount of GPU memory needed for 1080P or above is about to rise above 2GB for the first time.
Erm... this doesn't even make sense. At all. Are you claiming that to display 1080P you need a graphics card with 2GB onboard? Are you claiming that the new consoles have 8GB GPU RAM? What?
Besides, the amount of onboard RAM has long been an utterly useless metric for determining graphics performance. Since RAM is so cheap, nVi
Re: (Score:2)
Nvidia has been holding VERY profitable meetings with every possible technical site, explaining in detail just how they should trash the new AMD cards in their forthcoming reviews.
It's not uncommon for competitors to run "debunking" presentations for their partners and vendors in anticipation of their competition's releases.
Re: (Score:2)
If the two players in the duopoly don't keep their game up, they'll have a 3rd player move in on their turf by the name of Intel.
Re: (Score:2)
I've been pretty impressed with how well Intel's integrated graphics have been doing... with the HD 3000, I can play many modern games on intermediate settings with no problem at 1920x1080, which I imagine is good enough for a majority of users.
Re: (Score:2)
Re: (Score:2)
I find the Intel HD 4000 quite capable of handling modern games. It does a reasonable job with 3D-accelerated graphics. I find it about equivalent to, or maybe a bit faster than, older low-end stuff like the Radeon HD 5450. Games won't have the fastest frame rates, and you'll want to tune the graphics options to the least demanding settings, but they work. And the drivers may be buggy with DirectX 11, but DirectX 9 works.
The point of a chipset like Intel's HD line is low power usage, not high performance. A s
Re: (Score:2)
Which is why they now have the 5200 series with dedicated on-chip video RAM.
Re: (Score:3)
S3? Shit cards, and everyone who bought them got burned.
PowerVR? Desktop cards were shite, and everyone who bought them got burned.
SiS? Everyone who bought their desktop 3D cards got burnt.
Matrox? Their 3D gaming stuff was shite.
Getting the point? The problem was that all the competition lied about their cards even more than the two that remain.
Re: (Score:2)
I still have a 3dfx card. Booyeah!
Re: (Score:2)
Me too... it's collecting dust along with the Matrox it was paired with...
Re: (Score:2)
I remember beta testing Everquest and feeling very smug about my hardware
Damn my life was depressing back then. Advice to hardcore PC gamers: spend less time grinding/leveling on your computer and more time grinding/leveling in real life.
Re: (Score:2)
Awesome... I waited until near the end of their run to finally pick up their business-class Voodoo3 with 8MB of SGRAM to replace my crappy S3 Virge something.
Then ATi said they'd more actively support OSS drivers for Linux, so I picked up an All-in-Wonder Radeon 7500 with 64MB RAM and a built-in TV tuner. But ultimately nothing ever came out of that that wasn't already reverse-engineered and supported by the OSS Gatos team. Later still, when ATi finally started releasing their closed-source fglrx drivers
Re: (Score:2)
I miss the days when video cards made sense
Re: (Score:2)
Corollary: Damn it's nice to have third party 3D graphics frameworks nowadays.
Re: (Score:2)
Matrox was a little behind on their general 3D acceleration, but they were ahead of everyone else with multi-monitor support on consumer-level cards.
Re:Are PC gamers benefiting ? (Score:5, Informative)
From a marketplace that used to be served by 6 competing vendors into a duopoly marketplace that is currently served by only 2 vendors, the pace of innovation has slowed to a crawl.
We're most definitely not in a duopoly marketplace at the moment. There are currently only 2 companies offering high-performance, consumer-priced 3D cards, but there are other companies in the graphics business. The most popular graphics card used by people using Steam is the Intel HD Graphics 3000 [steampowered.com], for example. Matrox is still about, too, but not competing in consumer 3D.
To be honest, I can't really remember a time in which there were more than 3 (possibly 4) major players in the high end consumer 3D market. Matrox dabbled, but never got close to a cost-efficient gaming card, really IMO... the closest they came was the G400 IIRC. That was the era when you could possibly claim there were 4 competing vendors. Soon after, Matrox left the market to concentrate on 2D, and 3dfx disappeared up their own arse. I'm not sure who the other 2 you are alluding to are.... SiS, VIA?
Re: (Score:2)
I'm always curious about that survey. Wouldn't it count two graphics devices in most systems? Intel integrated, and then the discrete NVIDIA/AMD card? I always buy without any integrated video because it has bitten me in the past, but it can be hard to find the perfect mobo sometimes.
Re: (Score:2)
Matrox dabbled, but never got close to a cost-efficient gaming card, really IMO... the closest they came was the G400 IIRC. That was the era when you could possibly claim there were 4 competing vendors. Soon after, Matrox left the market to concentrate on 2D, and 3dfx disappeared up their own arse. I'm not sure who the other 2 you are alluding to are.... SiS, VIA?
3DLabs? PowerVR? Rendition? Granted, most of them only released 2-3 generations worth of graphics chips, but they did give us options back then. I remember how PowerVR delivered competition when they released the Kyro after the success of the PowerVR2 in the Dreamcast.
Re: (Score:2)
To be honest, I can't really remember a time in which there were more than 3 (possibly 4) major players in the high end consumer 3D market.
So we had 3dfx, Rendition, ATI, PowerVR, Matrox, S3, 3dlabs... and I know I'm forgetting at least one of the major players at the time, but that's seven, not four. Some of the lower-end Oxygen cards were priced competitively with high-end gaming GPUs today, so you don't get to quibble about cards positioned towards consumers. If they were on the market, they count. SiS didn't have 3D accelerators back then. (They have brought them out since; they are Windows-only garbage with little to no actual 3d accelera
Re: (Score:2)
Rendition were never competitive, PowerVR are still about, S3 are still about, Matrox are still about, and 3dlabs weren't gaming cards. Yeah, ok, I'm stretching a bit ;)
However, if you do include embedded graphics, there are still plenty of players on the market. Also, that market only lasted 2 years or so, and it was a brand-new market, so there were bound to be more competitors. Those that fell out of the market and survived in some form have gone to the embedded market.
Re: (Score:2)
Yea, but none of them really mattered to anyone who knew what they were doing.
Back in those days, you had 3DFX and then you had everyone else.
Well, that's nonsense. Actually, it's complete fucking bullshit. People who knew what they were doing most certainly did not buy 3DFX. Until the Voodoo 2 came out, support was shit. When the Voodoo 2 came out, its visual quality was shit. I got a Permedia 2 and was within a couple FPS with higher visual quality for less money, with support for higher resolutions.
Perhaps you were not one of the people who knew what they were doing.
Re: (Score:2)
Re: (Score:2)
Now get off my lawn.
Re: (Score:2)
More accurately, they bought the "assets" of 3DFX after 3DFX decided it would be a good idea to go into competition with all their customers.
That way, they could pick and choose which expenses they kept (buildings, employees, etc.)
Re:Are PC gamers benefiting ? (Score:4, Insightful)
Better question: what game actually requires this?
Seriously now. Unless you're trying to just throw money away on some 6-screen rig or something, a single screen at 1920x1080 will run almost all of today's games fine on a 3-year-old card. "Bleeding edge" is a function of throwing your money away on diminishing returns.
Re: (Score:3)
Depends on at what point those diminishing returns start to diminish for you. Having a high framerate while having max texture and shader detail turned on at the same time is nice. You don't need the million-dollar sports car to get you to work, but that doesn't mean it isn't nice.
Re: Are PC gamers benefiting ? (Score:2)
Re: (Score:2)
Depends on at what point those diminishing returns start to diminish for you.
Yeah but *my* hobbies are better. Therefore anyone who doesn't spend money on what I consider worthwhile is an idiot, and anyone who does is clearly very smart.
Re: (Score:2)
Crysis 2. Metro 2033. Metro: Last Light. Supreme Commander 1 and 2.
Just off the top of my head - all those games will benefit greatly from a faster card (or CPU, in the case of Supreme Commander).
Re: (Score:2)
Exactly. Also, Tomb Raider 2013.
I would love to be able to play all my games at 2560x1440 @ 120+ Hz using LightBoost on a single monitor, but that is not realistically possible for another 10 years.
Re: Are PC gamers benefiting ? (Score:2)
For the last time: some of us gamers want a guaranteed 120+ Hz @ 1080p with all the bells & whistles, with LightBoost.
Check out nVidia's G-Sync if you want to learn more:
www.geforce.com/hardware/technology/g-sync
Re: (Score:2)
I have a single screen at 2560x1440, and I can almost play Dota 2 (not a demanding game) at full settings with an HD 6770. I get about 35 fps, but horrible jerking / stuttering. There is a large difference in how well I can play with better fps, and I also get less headache. Turning down the settings is a solution, but it makes it slightly more difficult to keep up, and I need all the help I can get. I want to play on Linux only, but on Linux I get ~13 fps with the same card. So I need a card that's 4 times a
Re: (Score:2)
Almost all 3D games, if you want a uniform 60 fps full-time at 1080p. Intel "HD Graphics" is still shit, and the same goes for most integrated graphics. And the most obvious benefit of the "GPU wars" is better prices for low- and mid-range discrete cards.
Re: (Score:2)
Because clearly the state of today's software will never change, and never become more resource hungry once those resources are available in the market.
Re: (Score:2)
Better question: what game actually requires this?
Seriously now. Unless you're trying to just throw money away on some 6-screen rig or something, a single screen at 1920x1080 will run almost all of today's games fine on a 3-year-old card.
X-Plane... though some argue that's not a "game." Even on 1080p with the latest, fastest consumer GPUs, you can't max out all the GPU-dependent settings on a scenery-heavy area without fps dropping to single digits.
Re: (Score:2)
Get the kids hooked on the meth and the crack cocaine, 'cause once they're hooked on that, you know what's next: marijuana. Then jazz music. Forget about it. - Brent Butt
Re: (Score:3)
A 1.4GHz P3 is a hell of a lot faster than a 1.4GHz P4.
Sort of like how a 1.6GHz Pentium M is around the same performance as a 2.4GHz P4...
Re: (Score:3)
Re: (Score:2)
A number can't be trademarked; that's why they lost it.
Re: (Score:2)
Re: (Score:2)
what kind of framerate do you get in Natural Selection 2 with that?