NVIDIA GeForce GTX 690 Benchmarked
MojoKid writes "NVIDIA has lifted the embargo on benchmarks and additional details of their GeForce GTX 690 card today. According to a few folks at NVIDIA, company CEO Jen-Hsun Huang told the team to spare no expense and build the best graphics card they possibly could, using all of the tools at their disposal. As a result, in addition to a pair of NVIDIA GK104 GPUs and 4GB of GDDR5 RAM, the GeForce GTX 690 features laser-etched lighting, a magnesium fan housing, and a plated aluminum frame, along with a dual vapor chamber cooler with ducted airflow channels and a tuned axial fan. The sum total of these design enhancements is not only NVIDIA's fastest graphics card to date, but also one of its quietest. In the performance benchmarks, NVIDIA's new dual-GPU powerhouse is easily the fastest graphics card money can buy right now, but of course it's also the most expensive." The GeForce GTX 690 has been reviewed in lots of different places today; Tom's Hardware and AnandTech, to name a few.
Finally (Score:5, Funny)
Re:Finally (Score:4, Insightful)
Not to mention Minesweeper!
Re: (Score:3)
After looking at all the other charts, for a single card it is indeed the fastest. Interesting that for Skyrim the CrossFire setup places almost at the bottom, even below the non-CrossFire solution.
For the compute section, this is likely an architectural difference, demonstrating the same kinds of calculations that make AMD rock at bitcoin mining too.
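To make that architectural point concrete: the inner loop of SHA-256 (the hash bitcoin mines) leans on 32-bit rotates, which AMD's VLIW GPUs of this era reportedly execute as a single BIT_ALIGN_INT instruction, while pre-Kepler NVIDIA parts need a couple of shifts plus an OR. A minimal sketch of that operation follows; it is illustrative only, not taken from any actual miner, and the name rotr32 is made up for the example:

```cuda
#include <cstdio>

// The 32-bit right-rotate at the heart of SHA-256 (and hence bitcoin
// mining). On AMD's VLIW GPUs of this era it reportedly compiles to a
// single BIT_ALIGN_INT instruction, while pre-Kepler NVIDIA hardware
// needs two shifts plus an OR, a big part of AMD's hashing edge.
__host__ __device__ unsigned int rotr32(unsigned int x, unsigned int n) {
    return (x >> n) | (x << (32u - n));  // valid for n in 1..31
}

int main() {
    printf("rotr32(0x80000001, 1) = 0x%08x\n", rotr32(0x80000001u, 1));  // 0xc0000000
    return 0;
}
```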
Re: (Score:1)
You kid, but the shaders mod tanks the framerate. Even my 9600 GT can only run it at 10fps with decent settings.
Re: (Score:2)
It's 3 years old. That's not a really old, obsolete GPU. You're really talking out of your ass here, you know. I can run Skyrim decently (40fps or so) at high settings and turn everything up to ultra in Fallout: New Vegas without breaking a sweat. And it's not even 5 generations old, unless you're an idiot who considers every new line of nVidia cards a generation. It's actually only one generation behind: it's a DX10/Shader Model 4 card, the current standard being DX11/Shader Model 5.
Re: (Score:2)
Have you considered that your other components may not be up to spec?
Re: (Score:2)
Elaborating a bit: I can indeed play Skyrim on the 9600 GT, but I usually play connected to a fairly low resolution display (my 720p TV), I'm running TinyXP on a gaming partition, and I've got pretty decent specs otherwise (8GB DDR3-1600, Phenom II X4). Yes, it's not the greatest of the great, but it runs acceptably. Indeed, it outperforms the low end of the 500 series by a decent margin. The 9600 GT is not an outdated piece of shit.
Nitpick: the 400 and 500 series should be considered a single series; they
Re: (Score:2)
I'm just saying that it's a bit early to call the 9600 an obsolete piece of shit when, despite what you've convinced yourself (I doubt you own one), it runs newer games fine on medium settings (though admittedly, Gamebryo isn't that intensive). Besides obviously low end cards, I think it's a stretch to call anything in the DX10 generation obsolete.
WTF (Score:5, Interesting)
Tom's Hardware is showing the GTX beating ATI by 50-200% in every benchmark. AnandTech shows the opposite, with ATI still winning in the same games. Anyone else notice this?
Does Tom's Hardware or AnandTech get kickbacks from either company for biased reviews?
Re:WTF (Score:4, Interesting)
Part of it depends on what you choose to bench (Score:5, Interesting)
I don't care for Anand's benches much because they seem to like synthetic compute benchmarks. That is really all kinds of not useful information for a game card. I want to see in-game benchmarks. If any compute stuff is going to be benchmarked, let's have it be an actual program doing something useful (like Sony Vegas, which uses GPUs to accelerate a lot of what it does).
Personally I'm a HardOCP fan when it comes to benchmarks. Not only are they all about game benchmarks, but they are big on actual gameplay benchmarks. As in, they go and play the game; they don't run a canned benchmark file. This does mean it isn't a perfect, "each card sees precisely the same frames" situation, but it is far more realistic to the task the cards are actually asked to do, and it all averages out over a play session. I find that their claims match up well with what I experience when I buy a card.
Their 690 benchmark is at http://hardocp.com/article/2012/05/03/nvidia_geforce_gtx_690_dual_gpu_video_card_review [hardocp.com]. It's a selection of newer games, generally played in triple-head (the game displayed across three monitors at once) on a 690, two 680s in SLI, and two 7970s in CrossFire.
Re: (Score:2)
I don't care for Anand's benches much because they seem to like synthetic compute benchmarks. That is really all kinds of not useful information for a game card. I want to see in-game benchmarks. If any compute stuff is going to be benchmarked, let's have it be an actual program doing something useful (like Sony Vegas, which uses GPUs to accelerate a lot of what it does).
But... They tested with 10 games, 1 raytracer and 0 synthetic benchmarks. I don't know what they usually do but this article was very focused on real world performance.
Re: (Score:2)
Re: (Score:3)
Re: (Score:2)
Try running 6 EVE clients across two 24" 16:10 displays and you will notice the difference with the newer cards.
My GTX265 starts to choke on 4 clients running at full speed, but with ISBoxer I am able to run two large screens at 40FPS and four small screens at 15FPS.
Not everyone runs FPS games where you have one screen you are doing everything in; some games multi-client very well.
Re:WTF (Score:5, Informative)
I've seen it rumored in more than a few places that Tom's Hardware is very Intel and Nvidia, shall we say, "friendly".
That would explain why in their most recent Best Graphics Cards For The Money [tomshardware.com] AMD's cards only won 5 categories compared with Nvidia's massive win in 1 category (plus a tie in another and 3 categories with no winners). Basically, if you ignore all the times that they say good things about AMD, then it is obvious that they favour Intel and Nvidia.
As for the original poster claiming big differences in the rankings, I just don't see it. If you filter out the cards that are not tested on both sites you get the following rankings:
Battlefield 3
Toms: 680GTX-SLI, 690GTX, 7970CF, 6990, 590GTX, 680GTX, 7970, 580GTX
Anan: 680GTX-SLI, 690GTX, 7970CF, 6990, 590GTX, 680GTX, 7970, 580GTX
Skyrim
Toms: 680GTX-SLI, 690GTX, 7970CF, 590GTX, 6990, 680GTX, 7970, 580GTX
Anan: 680GTX-SLI, 690GTX, 590GTX, 680GTX, 7970, 580GTX, 6990, 7970CF
DiRT 3
Toms: 680GTX-SLI, 690GTX, 7970CF, 680GTX, 6990, 590GTX, 7970, 580GTX
Anan: 680GTX-SLI, 690GTX, 7970CF, 590GTX, 680GTX, 6990, 7970, 580GTX
Metro 2033
Toms: 7970CF, 680GTX-SLI, 690GTX, 6990, 590GTX, 7970, 680GTX, 580GTX
Anan: 7970CF, 680GTX-SLI, 690GTX, 6990, 590GTX, 7970, 680GTX, 580GTX
Only Skyrim seems to show any major differences, and that was probably due to driver issues, game version differences, or alternative testing methods.
Re: (Score:2)
Re: (Score:2)
And this is why you're the gadget guy.
He he. You made me think back to the days when I first came up with the Gadget moniker. I was probably using a TNT2 Ultra video card then. It was a far cry from the monster cards we are looking at today! Even so, my card had great TV input/output and included LCD 3D glasses. It seems the actual feature set of graphics cards hasn't improved a lot over the years.
Re: (Score:2)
I wouldn't say they've been bought and paid for by any one specific entity, but they do tend to come up with such completely and obviously false stats on occasion that it is difficult to believe the outlet is completely unbiased.
Well, we have already seen in this thread someone who claimed an example of bias where it has been demonstrated that there was none. I wonder how many other times people's evidence proves to be just as "reliable".
Re:WTF--- compute on linux... (Score:2)
Well, it depends on what you want it for.
Basically I don't really see much difference on the graphics OpenGL/DX11 side of things, but this was very interesting to me:
http://www.anandtech.com/show/5805/nvidia-geforce-gtx-690-review-ultra-expensive-ultra-rare-ultra-fast/15 [anandtech.com]
regards
John Jones
Re: (Score:2)
Well, when you test a gaming card by running GPGPU stuff on it, when nVidia specifically sells GPGPU cards, maybe you are running the wrong test.
Re: (Score:1)
I simply don't trust Tom's Hardware anymore. I bought an A7V mobo (1st gen Athlon) many years ago based on TH's glowing reviews. The board was buggy as all hell, and it came out months later that Asus had paid for that review, either directly or indirectly through advertising. I now take a fairly sceptical view of all reviews I read, based on that experience.
Re: (Score:2)
Re: (Score:2)
They seem to have merely omitted the games which favor AMD more strongly. Compare, for example, the Metro 2033 [anandtech.com] benchmarks [tomshardware.com] (or BF3, or Skyrim) and you can see that they are relatively similar. THG did not test Crysis or Total War: Shogun 2, which the AMD cards perform better on.
Re: (Score:1)
Tom's Whoreware? No, they've never taken a sweetener.
Re: (Score:2)
Tom's Hardware is a joke. They're moneyhatted by Nvidia and Intel all the time.
Re: (Score:2)
I don't know about that, but I do wonder why Tom's Hardware won't let Archive.org's Wayback Machine show their old pages.....
Big Kepler is still on the horizon... (Score:4, Interesting)
I'm still waiting for the GK110-based "Big Kepler" due out Q3. Considering how well the 680 and 690 have performed, the GK110 will be a monster: probably power hungry, but still a monster. Nvidia really hit gold with their latest generation; it is speculated that the current 680 was originally intended to be the 660, until it outperformed AMD's top offering. Can't wait to get my hands on a 4GB GK110.
Re: (Score:1)
But only if you do single-precision FP workloads. If you do integer workloads, the 680 can't even beat Nvidia's own 580. Pass.
Re:Big Kepler is still on the horizon... (Score:5, Insightful)
If you're doing serious GPGPU stuff, you shouldn't be relying on fickle consumer boards in the first place. This is a gaming card marketed to extreme gamers. I've fooled around with CUDA stuff like raytracing and H.264 encoding, mostly as a curiosity, but the reason I bought this quad-SLI setup years ago was for games and real-time 3D rendering. I couldn't care less about FP64 performance, and neither does Nvidia's target market for this product line.
GPGPU on consumer cards is still a novelty at this point. We're getting close to the tipping point, but for most users, as long as it plays their game and can handle 1080p video, they're content. If and when that balance tips in favour of OpenCL and CUDA, both GPU manufacturers will adjust their performance targets accordingly. Their #1 priority is still 3D gaming for now.
Re: (Score:1)
"This is a gaming card marketed to extreme gamers."
And since games are probably the most resource-intensive fucking thing, you should expect your GAMING CARD to kick major ass at everything else if it has the capability.
This is why nVidia is losing in the general-purpose GPU arena. AMD just keeps trucking along, upgrading EVERYTHING. NVidia? Gimps your shit.
Re: (Score:2)
"This is a gaming card marketed to extreme gamers."
"And since games are probably the most resource-intensive fucking thing, you should expect your GAMING CARD to kick major ass at everything else if it has the capability."
Did you not understand what he said?! It's a GAMING CARD - it's designed to kick ass when rendering games. Everything else is secondary - it's not a general purpose card, and it's not marketed as anything other than a high end gaming card. If it happens to kick ass as a more general purpos
Re: (Score:2)
"it's not a general purpose card"
THEN WHY FUCKING INCLUDE CUDA AT FUCKING ALL IN THE HARDWARE?
Derp, you're not thinking this morning. Go get yourself some coffee and think a little harder.
Hmmm, and what uses FP32 workloads? (Score:5, Insightful)
Oh that's right: Video games. You know, the thing it was made for.
The GTX series are nVidia's gaming cards. They are made for high performance when you wanna play 3D games. They aren't made for compute performance. That is not to say they cannot handle compute stuff, just that it isn't what they are primarily designed for. So the kind of compute stuff they are best at will be more related to what games want.
Their compute products will be the Teslas. They are made for heavy-hitting compute performance of all kinds. If you are after purely GPGPU stuff, they are what you want.
nVidia seems to be separating their designs for the two to an extent. Still a common overall design, but concentrating on making the desktop GPUs more efficient, at the expense of high end compute features (like integer and FP64 power), and making the workstation/compute cards good at everything, even if they need beefier power and are louder.
I'm ok with that. I buy a GeForce to play games, not to do high end GPGPU stuff. We buy Teslas at work for that.
Also, there's a shitload of other things out there GPGPU-wise that are FP32, and the 680 really is killer at that. It does a great job accelerating video encoding and the like.
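For a concrete picture of the kind of FP32 work being praised here, a minimal CUDA SAXPY kernel (y = a*x + y) is the textbook single-precision multiply-add workload. This is a generic sketch, not code from any of the benchmarks in the linked reviews, and the names are made up for the example:

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Canonical FP32 workload: y = a*x + y (SAXPY), one single-precision
// multiply-add per element, the operation class gaming GPUs are tuned for.
__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;
    size_t bytes = n * sizeof(float);
    float *hx = new float[n], *hy = new float[n];
    for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

    float *dx, *dy;
    cudaMalloc((void **)&dx, bytes);
    cudaMalloc((void **)&dy, bytes);
    cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, dx, dy);
    cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);

    printf("y[0] = %.1f (expect 4.0)\n", hy[0]);  // 2*1 + 2 = 4
    cudaFree(dx); cudaFree(dy);
    delete[] hx; delete[] hy;
    return 0;
}
```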
Re: (Score:2)
Re: (Score:2)
The reason for making an account is so that you're notified when you receive a reply. Further, when someone is considering whether or not to respond to you, they'll have some level of confidence that you'll actually see their response. Why go through the trouble of writing something useful to an AC who won't read what you're saying?
As for your issue: just get a new computer, all you need to know is how much money you have, and then follow one of these:
$650: http://www.tomshardware.com/reviews/build-gaming-p [tomshardware.com]
I think people need to stop being so hyped up (Score:5, Interesting)
There is zero actual evidence that there is going to be a "GK110" this year, or that if there is it will be a high end part (bigger numbers in their internal code names don't always mean higher end parts).
I see people all in a lather about the supposed amazin' graphic card that is up and coming, and lots of furious rumors, but nothing in the way of any proof. I also can see some fairly good arguments as to why nVidia would NOT be releasing a higher end card later on (excluding things like Teslas and Quadros, which are higher end in a manner of speaking).
Speaking of Teslas and Quadros, that may be all that it is: a version of the hardware with a redesigned shader setup to give higher FP64 speed. As it stands, the card is quite slow at FP64 calculations compared to FP32. It could be 50% of the speed in theory, but is more like 1/24 (rough numbers sketched below). Basically it seems to be missing the logic needed to link the 32-bit shaders together for 64-bit calculations on all but a fraction of the shaders. Maybe to protect their high end market, maybe to keep size and heat down (since that logic takes additional die area). Whatever the case, a Tesla/Quadro version with that in place would have much improved FP64 speed, and thus compute performance for certain things, but no improvement to gaming at all.
So I think maybe people need to settle down a bit and stop getting so excited about a product that may not even exist or be what they think, and may not launch when they think even if it is. Chill out, see what happens. Don't get this idea that nVidia has something way MOAR BETTAR that is Coming Soon(tm). You don't know that, and may be setting yourself up for a big disappointment.
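For rough scale on that FP32/FP64 gap, here's a back-of-envelope sketch assuming GK104's commonly cited figures (1536 CUDA cores at ~1006 MHz, one 2-FLOP fused multiply-add per core per clock, and a 1/24 consumer FP64 rate). These are ballpark peak numbers, not measurements:

```cuda
#include <cstdio>

// Back-of-envelope peak throughput for a GTX 680 (GK104).
// Assumptions: 1536 cores, 1.006 GHz, 2 FLOPs/core/clock (FMA), 1/24 FP64 rate.
int main() {
    double fp32 = 1536 * 1.006e9 * 2;  // ~3.09e12 FLOPS
    double fp64 = fp32 / 24.0;         // ~1.29e11 FLOPS
    printf("FP32 peak: ~%.2f TFLOPS\n", fp32 / 1e12);
    printf("FP64 peak: ~%.0f GFLOPS\n", fp64 / 1e9);
    return 0;
}
```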
Power consumption matters (Score:3)
I've been watching my UPS power load meter since I upgraded from a GTX 560 to a GTX 680. I'd estimate the 680 uses a bit less than half the power of the 560 when idle. At peak usage the 680 uses more, but only by a hair.
I was never happy with the 560 in general. The 3D performance was surprisingly glitchy at 1080p. Even though I wasn't too keen on trying NVIDIA again after that, I gotta admit they won me back with the 680.
Re: (Score:2)
Sorry, but NVIDIA have already hinted strongly that there is no "big Kepler" gaming card coming this year. At least nothing that will eclipse the 690. I'm seriously starting to wonder what happened to big Kepler, whether it was all just a rumor, or perhaps they are saving it for Quadro/Tesla.
Still, I can't wait for 680 prices to drop a bit so that I can replace my overclocked 570. AMD hardware is pretty decent these days, but every time I touch their drivers... I go back to NVIDIA. Hate to say it, but NVIDIA's driv
Re:Big Kepler is still on the horizon... (Score:4, Informative)
Re: (Score:2)
Unfortunately you won't be able to get one (Score:3, Informative)
According to SemiAccurate there's a mask design flaw in the GK104, which has caused poor yields. Fewer than 10,000 GTX 680s have shipped worldwide, even though it was released a month ago.
http://semiaccurate.com/2012/05/01/why-cant-nvidia-supply-keplergk104gtx680/ [semiaccurate.com]
Re:Unfortunately you won't be able to get one (Score:5, Interesting)
I would encourage people to look at the site's name before taking anything they say seriously. And then I'd encourage them to look in the archives (if they keep true and accurate archives of their past stuff, I've never checked) to see all the shit they get wrong (and there is a lot of it). Then maybe you understand that like most rumour sites, you don't want to take it too seriously.
For some overall perspective, consider that Charlie Demerjian, the guy who runs it, was given the boot from The Inquirer, which is not precisely what one would call a bastion of journalistic excellence.
As an example of one major made-up story from them: in February they claimed that the GTX680 would have a "PhysX block", basically either dedicated hardware to speed up PhysX, or special instructions/optimizations for it at the expense of other things. They said that its supposed edge in benchmarks was only because of that, and that the 7970 would outdo it in most games.
That is not at all the case, it turns out. The GTX680 has nothing particularly special for PhysX, other than a shit-ton of shaders, and it in fact outperforms the 7970 by a bit in nearly all games, including ones without PhysX. HardOCP (http://hardocp.com/article/2012/03/22/nvidia_kepler_gpu_geforce_gtx_680_video_card_review/) has them both tested with real gameplay, as usual.
So really, don't take anything that site says seriously. It is a blatant rumours site that just makes shit up.
Re: (Score:1)
I do follow that site and most of the stuff is spot on. It's true a few stories are wild speculation, but this story rings true.
Re: (Score:2)
"This story rings true"? In other words "I'm an AMD fan that wants to see nVidia fail, so this story sounds true to me because I like it."
who cares (Score:2, Interesting)
Who is going to pay $1000 for a piece of hardware with a half-life of maybe one year? This card is really worth about $400 at most, and the 680 should be $200. What games actually take advantage of this? There are hardly any PC games worth playing nowadays :\. It's too bad too, because I LIKE new graphics hardware. It's always fun to play with, but at $1000 I can't justify it.
Re: (Score:1)
I think what he means is that the performance of these cards will be the best, or close to it, for only about a year. In a year you can get a card this good for $150-200 instead of whatever it is right now. Combining that with what you wrote, we get the question of why anyone should buy these cards right now, when the old card still works and you can get these whizz-bang cards for cheap in a year.
Re: (Score:1)
There's no way in hell this card will have dropped down to $150-200 in a year. I own a 560 Ti and that's still going for about $250, as a reference.
Re:who cares (Score:5, Insightful)
Normally I'd have preordered two of these already, but it's too rich for my blood right now. This card is for us nutjobs who want quad-SLI and panoramic "3D Surround", with our custom-built driving cockpits and 3 large monitors, or the equally obsessive flight sim crowd. In my case, these displays run at 2560x1440 and that requires a ton of memory bandwidth on each card, just to push all those bits around.
For almost everyone else, a single $300 GPU is enough to run just about any game at 1920x1080 with very respectable settings.
As for your suggested prices, you're just talking out of your ass. If you're going to lowball the latest and greatest GPU on the market, maybe you should set games aside for a while and look at your income. Even though I agree the price is a bit high, spending $1000 on a hobby is nothing. You save up for that shit, and it lasts a very long time. My current cards are over 3 years old, so it works out to just over a dollar a day for kickass gaming graphics. Even if I played for just a few hours a week, it's still cheaper than any other form of modern entertainment. Cheaper than renting a movie, cheaper than a single pint at the pub, cheaper than basic cable TV, cheaper than bus fare to get to and from a free goddamned concert. For what I get out of it, having the highest end gaming hardware ends up being a sweet deal.
Re: (Score:3)
Meh, if there were a reasonable (no, a $36000 Eizo doesn't count) 4K/QFHD monitor I'd consider it. I don't like triple-screen setups with their bezels and odd aspect ratio with stretching and whatnot; I want it all on one screen. IMO the problem is not the price of the graphics card, it's having something useful to show it on. Even at 2560x1440 I'd have to pay more for a single monitor than for a 680 GTX, which is why I'm still on a good 1920x1200 IPS monitor. Of course it helps that I'm not an FPS junkie but
Re: (Score:2)
Even at 2560x1440 I'd have to pay more for a single monitor than for a 680 GTX
No you wouldn't. I mentioned this monitor [ebay.com] earlier in the discussion... I'm not trying to sell them, by the way; I was surprised at the cheapness. Most of the consumer feedback on them has been good, too.
Re: (Score:2)
Ah, nod. A brand I've never heard of, on a site I don't trust, with no UK suppliers providing it.
Most of the astroturfing has been good though, I agree :)
Given the resolution you can get on the iPad2 it's reasonable to expect 2560x1440 for $400, especially without the free iPad2 thrown in too. It has taken monitor providers several years to really push into that market though.
Even the one you're linking to.. 27 inches? Why not 22? Hell, I'll happily use a 15" 1920x1080 screen, so why not a 19" screen with 2
FP64 (Score:2)
I miss the days (Score:2)
Re: (Score:3, Insightful)
Mostly multihead gaming. While a $150 card is plenty at 1080p, at 5400x1920 or 4320x2560 it's a different story.
Awesome (Score:2)
One question though: if I can play Skyrim with all settings at max at 1920x1200 with a GTX 560, what is SLI of two GTX 690s needed for?
Re: (Score:2)
According to this page [techspot.com], a GTX 560 _averages_ 25fps at 1920x1200. That's not that good.
Re: (Score:2)
Hmm, plus I didn't use FSAA or other types of AA, so maybe it was not *all* settings to max :)
Re: (Score:1)
not the most expensive (Score:1)
Re:Slashvertisement (Score:5, Insightful)
Re: (Score:2)
My bad.
Re:Slashvertisement (Score:5, Insightful)
Yes, except the GTX 6xx series is slower at CUDA processing than its predecessors. This is a gaming product. Nvidia did this on purpose, sacrificing some compute speed in favour of rendering performance and power efficiency. They also artificially limit double-precision FP speed on consumer boards, to steer professional users toward the Quadro.
As a result of this hobbling, GPU computing hobbyists tend to gravitate toward the Radeon, which has outperformed the GeForce in OpenCL for a few years now, in both performance-per-dollar and performance-per-watt.
Re:Slashvertisement (Score:4, Interesting)
Re: (Score:2)
You mean @HOME projects, don't you? Nobody uses BitCoin.
Re: (Score:1)
Nvidia cripples GPGPU in Geforce GTX 680 [theinquirer.net]
Benchmark Results: Sandra 2012 [tomshardware.com]
NVIDIA GTX 680 Reviewed: A New Hope [brightsideofnews.com]
Re: (Score:2)
But if you're into HPC, this is not the card for you. You want a 685/695 which won't be released until 1st quarter next year.
Re: (Score:1)
Actually, high performance computing has created more demand. Nvidia GPUs are being used in massive supercomputers using OpenCL and CUDA (AMD GPUs support OpenCL). There are many more people interested in the latest and greatest GPU than you may think, specifically on a news-for-nerds site. So yeah, sweet article, and thanks for the heads up about the new benches, MojoKid.
They did not focus on the general purpose computing circuitry [anandtech.com] so that power and heat could be reduced. The main focus was on gaming. Separate cards will be created specifically for computing.
Re: (Score:3)
Re: (Score:2)
I got an AMD 6870 over a year ago ($150), and it's played everything I've thrown at it just fine with maxed graphics. Skyrim, Witcher 2, etc. play without any stutter and look wonderful. All on an AMD 965 (3.4 GHz X4) CPU from the year prior.
I'm just trying to figure out what I'm missing by not spending 5x that price.
Probably not much (Score:5, Insightful)
Re: (Score:2)
That's only true if you're running a 60Hz low-mid res display, say 1920x1200 (~2.3 megapixels) or less. Though even then, the actual retail price of such a card will, most of the time, probably be closer to $250 than $150.
If you want to run 120Hz, or run 2560x(1600|1440) (~3.7-4.1 megapixels), or run 3+ monitors in an eyefinity configuration (~4-24.5 megapixels), then you need all the power you can get.
Re:Probably not much (Score:5, Insightful)
That's only true if you're running a 60Hz low-mid res display, say 1920x1200 (~2.3 megapixels) or less. Though even then, the actual retail price of such a card will, most of the time, probably be closer to $250 than $150.
If you want to run 120Hz, or run 2560x(1600|1440) (~3.7-4.1 megapixels), or run 3+ monitors in an eyefinity configuration (~4-24.5 megapixels), then you need all the power you can get.
Yes, but the GGP was addressing someone complaining about the cost of the card, not someone who's running a fucking surround-video in their replica Cessna cockpit. Anyone who's dishing out for high end displays isn't going to (justifiably) complain about the price of the card(s) needed to drive them. For everyone else, like the OP, a $150 GPU will play almost any game on their standard 1920x1080 60Hz display with decent performance settings just fine.
Re: (Score:3)
Re: (Score:2)
Re: (Score:2)
It's all about the monitor. Are you running at least 1080p? HDTV has done a horrid thing by effectively keeping displays at or close to HD res: there are very few monitors that will do more than 1920x1080, though some pack in more vertical pixels at 1920x1200 with a 16:10 ratio. Anyway, it's rather easy for a few-year-old card to run maxed out if you're only running an average display. 1366x768 was the most predominant per http://www.w3schools.com/browsers/browsers_resolution_higher.asp [w3schools.com] and that's really not a
Re: (Score:2)
It sucks. Now that we have graphics cards capable of pushing >3 million pixels effectively, we don't have the high end CRT monitors with that kind of resolution any more.
Re: (Score:3)
This [ebay.com] is a 2560x1440 monitor for $320. The early ones had higher quality internals, and could actually run at 100 Hz at that resolution. They're shipped direct from Korea.
People saying they're running on maximum settings without mentioning the pixel count are being disingenuous. The above monitor pushes over 3.5 million pixels; 1366x768 is about 1 million.
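For scale, raw fill demand is just resolution times refresh rate; a quick sketch of that arithmetic, with illustrative resolutions only:

```cuda
#include <cstdio>

// Raw fill demand = horizontal pixels * vertical pixels * refresh rate.
int main() {
    printf("1366x768   @  60 Hz: %4.0f Mpix/s\n", 1366.0 * 768 * 60 / 1e6);    // ~63
    printf("1920x1080  @  60 Hz: %4.0f Mpix/s\n", 1920.0 * 1080 * 60 / 1e6);   // ~124
    printf("2560x1440  @  60 Hz: %4.0f Mpix/s\n", 2560.0 * 1440 * 60 / 1e6);   // ~221
    printf("2560x1440  @ 100 Hz: %4.0f Mpix/s\n", 2560.0 * 1440 * 100 / 1e6);  // ~369
    return 0;
}
```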
Re: (Score:2)
Re: (Score:1)