AMD's Radeon R9 290X Review
Billly Gates writes "AMD may have trouble in its CPU department, with Intel having superior fabrication plants, but in the GPU market AMD is doing well. AMD earned a very rare Elite award from Tom's Hardware for the fastest GPU available, with its top-end R9 card going for as little as $550. Nvidia has its top-end GPU cards going for $1,000, as it had little competition to worry about. Maximum PC also ran some benchmarks and crowned the AMD card the fastest and best-value card available. AMD/ATI has also introduced the Mantle API, which offers lower-level access than DirectX and is cross-platform. This may turn into a very important API, as AMD/ATI has its GPUs in the next-generation Sony and Microsoft consoles, giving game developers a large market share to target."
ATI drivers (Score:5, Informative)
Re:ATI drivers (Score:5, Informative)
148MB for the latest Nvidia driver.
Re: (Score:3)
148MB for the latest Nvidia driver.
*sigh*
Re:ATI drivers (Score:5, Insightful)
It's more than that by the time the package decompresses.
Just some data points from a single machine.
C:\NVIDIA folder
V197 (~2010) 85M
V320 (~2013) 182M
The vast majority of it appears to be the control panel and the PhysX package.
The display driver is just a few megs by comparison. If you skim off the HD audio, NV stereo, CUDA, OpenCL, and GL libraries you could probably get the whole shebang in under 10MB, and you could still play DirectX games.
I've been killing nView and the services for years. Never had a problem with the machine, but it always bothers me that they have a bunch of crap running that doesn't actually appear to do anything.
After all, the once or twice a year I actually manipulate my monitor settings, I am fully capable of finding my way into the Windows display control panel and adjusting things there, or opening the Nvidia control panel from the actual Control Panel.
It's quite possible I'm not getting the absolute best performance playing games, but frankly I would much rather adjust settings from within the games than have Nvidia overriding the game settings.
Re: (Score:2, Informative)
For Nvidia drivers, don't forget to remove the two "AppInit_DLLs => nvinitx.dll" entries in the registry. Preloading this DLL inserts nasty hooks for Optimus support. I hate that kind of trick.
Also remove the "updatus" user and its account/files.
In Services, after the card's parameters are configured as desired in the control panel, you can permanently turn off the Nvidia 3D profile update service and the driver support service.
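Here's a rough sketch of those cleanup steps in Python (run elevated). The AppInit_DLLs registry paths are the standard Windows locations; the nvinit DLL name comes from this post, and the service names passed to sc are placeholders, since they change between driver releases:

```python
# Sketch only: blank the Nvidia AppInit_DLLs hooks and disable a couple of
# services. Registry paths are standard; the service names are assumptions.
import subprocess
import winreg

APPINIT_KEYS = [
    r"SOFTWARE\Microsoft\Windows NT\CurrentVersion\Windows",              # 64-bit view
    r"SOFTWARE\Wow6432Node\Microsoft\Windows NT\CurrentVersion\Windows",  # 32-bit view
]

def clear_nv_appinit():
    """Blank any AppInit_DLLs value that points at the Nvidia init DLL."""
    for subkey in APPINIT_KEYS:
        try:
            with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, subkey, 0,
                                winreg.KEY_READ | winreg.KEY_SET_VALUE) as key:
                value, kind = winreg.QueryValueEx(key, "AppInit_DLLs")
                if "nvinit" in value.lower():
                    winreg.SetValueEx(key, "AppInit_DLLs", 0, kind, "")
        except FileNotFoundError:
            pass  # key or value not present on this install

def disable_services(names):
    """Set the given Windows services to 'disabled' via sc.exe."""
    for name in names:
        subprocess.run(["sc", "config", name, "start=", "disabled"], check=False)

if __name__ == "__main__":
    clear_nv_appinit()
    disable_services(["NvSvc", "nvUpdatusService"])  # placeholder names, check your own
```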
Once this is done the computer is more stable, less bloa
Re: (Score:3)
Re: (Score:2)
BTW: The original PowerVR cards from the late 1990s worked the same way. They rendered the frames and dumped them to the system video card.
The whole thing worked better than the VGA pass-through on the Voodoo boards I also own (because they are still in a PC in the attic). Most of the time I simply disconnected the Voodoo pass-through and plugged my monitors directly into either the system video card or the Voodoo. That is because the pass-through interface totally screwed up high resolution (1600x1200 at th
Re: (Score:2)
Re: (Score:2)
The original PowerVR cards did not really take over your bus; everything in the system would still work fine. But you have to remember that we're talking about a maximum resolution of 1024x768 for those cards, too. They weren't having to handle today's resolutions. Also, you need to remember that they were literally the slowest GPUs of their day. I had 'em all, more or less: TNT, TNT2, Voodoo, Voodoo 2, PowerVR, Permedia 2. The PowerVR was significantly slower than any of these, and it had lower visual qual
Re: (Score:2)
Re: (Score:2)
A pretty fast hard disk back then (say, an original ultra-scsi barracuda) would stream around 20MB/sec peak, most of us were still using modems or had moved up to ISDN... There was plenty of free bus bandwidth, even accounting for overhead. And even then there were machines with multiple PCI buses, though they remained rare throughout the dominance of that bus. Now it's common to have a PCI bus and a PCI-E bus, but that's not really the same thing. Then again, it's not the shared graphics-only bus, either.
Re: (Score:2)
I don't know about you, but I almost always had two cards on VLB in my 486's
I don't know about you, but I never saw that work without a whole lot of hassle, and I seldom saw it work properly. In theory, you could have three VLB cards. In practice, you could only be sure that they would work if you only had one of them. Since few people had a need for them (you could get pretty good throughput from one of those Adaptec ISA SCSI cards with its own Z80 running the show, basically any of the ones that work without drivers) few people found it worth the trouble.
So, no, VLB wasn't just a graphics bus.
In theory, no. In practic
Re: (Score:2)
Yes, a pair of Voodoo2 cards was the ultimate for quite some time. I never managed to get a pair, but I did have a 12MB card. Then I got the Permedia 2, which was a little slower than a single Voodoo 2 but which would do higher resolutions and which had significantly better visual quality. After that, I didn't upgrade until the gf2.
Re: (Score:2)
They were able to do it because the PowerVR chip processed a tile at a time and only wrote the final rendered frame over PCI. Since you never have to read the frame buffer back for more complex multi-pass effects (unlike all other cards at the time), you could get by with PCI throughput. The card had local memory to store textures and the scene draw-ordering buffer.
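Quick back-of-the-envelope on why that worked (the numbers below are era-typical assumptions, not PowerVR specs):

```python
# Writing only the finished frame over 32-bit/33 MHz PCI leaves plenty of headroom.
width, height = 1024, 768      # max resolution mentioned above
bytes_per_pixel = 2            # 16-bit colour, typical back then
fps = 30                       # assumed frame rate

frame_mb = width * height * bytes_per_pixel / 1e6      # ~1.6 MB per frame
traffic_mb_s = frame_mb * fps                          # ~47 MB/s of writes
pci_peak_mb_s = 133                                    # theoretical PCI peak

print(f"{traffic_mb_s:.0f} MB/s needed vs ~{pci_peak_mb_s} MB/s PCI peak")
```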
Re: (Score:2, Insightful)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
No, but it's a universal installer. One driver package for all supported operating systems, 32 and 64 bit variants, and all supported graphics cards. It's pretty impressive really.
Re: (Score:2)
You fucking nerds can't figure out that they're including legacy drivers in the package as well? LOTTA cards out there.
And you're too stupid to understand that only about 10 megs of it is the actual driver portion, even for that.
The real problem with the size is that they were very, very stupid about how they built their menus...
Re: (Score:2)
"You fucking nerds"
They're not nerds, otherwise they'd know. They're just users playing the cool game.
Re: (Score:2)
Don't use AMD's control panel (Score:4, Informative)
The drivers do NOT need .Net, or 90MB. The extremely crappy control panel, which has NOTHING to do with the drivers, uses the dreadful .Net API, and thus needs loads of HDD space. People in the know install third-party front-ends like 'Tray Tools' or the like.
Sadly, ATI loves to take significant pay-offs from companies like MS, acting as if THEY are the customer, not the person who purchased the graphics card. This, we can truly describe as ATI/AMD endlessly shooting themselves in the foot. Using .Net for the official control panel was a disgusting and despicable act, and a great example of the contempt the older version of ATI had for its users.
AMD/ATI is a much better company today; it was either improve or die, and after the longest possible time, AMD finally made the right choice. However, we get glimpses of the bad old ATI with issues like the fiasco over the recent release of 'new' GPU cards that are almost all just re-brands of older cards, with the free games removed (AND higher prices). This kick-in-the-teeth for customers was done simply so AMD can make a song and dance about free games with all their cards AFTER they finish releasing the new 290 family (the 290X is just the first of three 290 cards; the 'free' games won't be announced until after AMD launches all of them).
In truth, ATI/AMD customers need to be smarter than customers of Nvidia products. Nvidia prides itself on cards that 'just work'. With AMD, you frequently need to know what you are doing, at which point AMD rivals Nvidia, but 'out of the box' the AMD experience is usually worse. Nvidia supports its older graphics cards MUCH better than AMD, but older graphics cards from AMD tend to get faster with time as newer games exploit the more forward-looking architecture of ATI designs.
People have more problems with ATI cards in games, but this happens because uncommon settings in ATI's control panel (like the number of frames being rendered ahead) can cause terrible problems if not adjusted per game on the desktop. Again, informed ATI owners KNOW which settings to tweak, but for the average user, the ATI experience can be frustrating. This is entirely ATI's fault, because a PC game, with a tiny amount of code, can programmatically set the correct options, but many game developers do not know how to do this. Nvidia does a much better job helping developers set up their game code correctly for all usable generations of Nvidia graphics cards.
ATI has a nasty habit, as well, of disowning very recent cards that, on paper, had the features to support current games. ATI likes its shills to say things like 'jeez, your 4-year-old card is out-of-date junk', whereas Nvidia happily ensures every generation of its DX9-capable cards works as well as the hardware allows. In reality, ATI cards from the 2000, 3000 and 4000 series are effectively the same as everything up to the 6000 series (excluding the orphan architecture of the 6900 VLIW4 oddities). However, ATI pays technical sites to state the cards from the 5000 series and earlier are obsolete (technically this is completely untrue). In contrast, Nvidia is proud to support cards from the 8000 series and onwards, which is a similar timeframe to the 2000 series from ATI.
While it is true that 'cheap' current gen cards destroy premium cards from that far back, it is the principle that matters.
Re: (Score:2)
ATI has a nasty habit, as well, of disowning very recent cards that, on paper, had the features to support current games. ATI likes its shills to say things like 'jeez, your 4-year-old card is out-of-date junk', whereas Nvidia happily ensures every generation of its DX9-capable cards works as well as the hardware allows.
This... I've gone from a ti4200 to an 8800GT to a 460GTX, and that is all. I've had decent graphics throughout... not great, I've always been behind the times (save for my 8800 for a while),
Re: (Score:2)
In reality, ATI cards from the 2000, 3000 and 4000 series are effectively the same as everything up to the 6000 series (excluding the orphan architecture of the 6900 VLIW4 oddities). However, ATI pays technical sites to state the cards from the 5000 series and earlier are obsolete (technically this is completely untrue). In contrast, Nvidia is proud to support cards from the 8000 series and onwards, which is a similar timeframe to the 2000 series from ATI.
While it is true that 'cheap' current gen cards destroy premium cards from that far back, it is the principle that matters.
Fair comment, except that in Windows 8.1 I /cannot/ install any AMD-supplied driver on my HD 3870. It's a perfectly serviceable card, but it has now been rendered obsolete by the manufacturer abandoning it. The reason is that they won't supply WDDM 1.3 or 1.2 drivers for this card, and they won't supply updated WDDM 1.1 drivers for 8.1.
Certainly makes me think twice about buying another AMD card...
Re: (Score:3)
I just recently wrote an asm DLL to return the 128-bit result of the 64-bit multiply that the x64 processor produces for free every time it multiplies integers, for use in
Re: (Score:3)
Sounds dreadful.
Re:ATI drivers (Score:4, Insightful)
I installed fresh ATI graphics drivers today. 90MB for a driver. .Net 4.5 needed to be installed. GTFO.
As much as I find 'Catalyst Control Center' to be totally fucking useless, and would be pleased by a 'just the damn driver, the OS already has interfaces for changing monitor resolution and whatnot' edition, isn't using relevant vendor APIs for your application, rather than rolling your own or using real antiques, sort of what you are encouraged to do?
Its existence is obnoxious; but it would hardly be the better for depending on an older .NET version, or Qt, or some braindead AMD custom nonsense, would it?
No 90MB for ALL drivers (Score:5, Informative)
I installed fresh ATI graphics drivers today. 90MB for a driver. .Net 4.5 needed to be installed. GTFO.
You didn't download a 90MB driver. You downloaded a 90MB package which includes all drivers for all versions of Windows, for all architectures, for all ATI cards, and it came with a utility that automatically installs the correct thing for your situation.
I wish more companies did this. Take the guesswork out of the download screen. NVIDIA does it too.
Also what's wrong with .NET 4.5? Do you regularly judge applications solely by the framework their developers chose?
Re: (Score:3, Informative)
> Also what's wrong with .NET 4.5?
It's slow... so slow to open what is basically a dialog box.
Also, it's not cross platform - so they can't use it for Mac and Linux.
Re: (Score:3, Informative)
Nothing about drivers is cross platform. I highly doubt that even made it into a list of considerations.
As for .NET being slow, yes, it's slow for the end user. But how often do you use it? I don't think I've opened the NVIDIA control panel since I installed Windows a year ago. You know what .NET is fast at? Developing. Your complicated dialog box you likely never use was also likely very quick to throw together.
Re: (Score:2)
A great deal about drivers _is_ cross platform. Many of the same libraries, used by programs to manage the actual behavior of the card, are OpenGL-based, which is indeed cross-platform. The binary drivers do require _compilation_ for a particular graphical environment, and that does take thoughtful development to manage OS-specific function calls.
My experience of .NET is that it's very fast for developing programs that are unusably slow because they are bloated. This is not a good trade-off.
Re: (Score:2)
Don't want .Net for your drivers? Try Linux.
Don't like wasting 90MB? Well, maybe if they didn't have drastically fewer resources than their competition, they could spare someone to optimize for size. For now, perhaps you can spare the $0.01 of hard drive space. Sure, I agree 90MB is horribly bloated, but it's not anywhere near bad enough to care about when compared to actual driver bugs, or the price difference in GPUs.
I use Linux in work, Ubuntu 12.04 LTS. I had to reinstall it when I changed graphics card because it kept crashing (kernel too). I'm afraid to plug a second monitor in in case it causes the same thing to happen. I seriously considered installing Win7 on the machine and running Linux under VMWare. That's what I do on my laptop.
Re: (Score:2, Troll)
That sounds slightly odd. A reinstall would just install the exact same software again. If your kernel panics then installing the same kernel again won't fix it.
Personally I have 5 monitors connected to 2 ATI cards with the fglrx drivers.
Works pretty damn well. Of course the way I have it set up is utterly impossible in Windows.
Re: (Score:3)
Err there is no hardware detection for the graphics card. 'fglrx' is the driver for every ATI card so the driver didn't change at all.
When you reinstalled it would have been the exact same driver, exact same everything.
Like I said, your situation sounds a bit odd.
Re: (Score:2)
"I use Linux in work, Ubuntu 12.04 LTS. I had to reinstall it when I changed graphics card because it kept crashing (kernel too)."
Somehow these sort of problems always seem to involve Ubuntu. Might be a clue...
Re: (Score:2)
Re: (Score:2)
A reinstall would just waste your time and put you right back where you were. Linux is not like windows where a reinstall fixes things.
Re: ATI drivers (Score:5, Insightful)
When Windows breaks, it's because I installed an update that went horribly wrong.
When Linux breaks, it's because I installed an update that went horribly wrong.
Re: (Score:2)
Re: (Score:2)
And in every thread loads of people show why Linux isn't on loads of desktops by denying that anything is wrong and accusing the person of being stupid and/or a shill.
AMD - Can't help but be a fan.. (Score:5, Interesting)
After seeing AMD bet the farm on Athlon and beat a company with 10x the R&D budget, I cannot help but be a fan. The biggest reason for AMD being behind in CPUs today is a lack of R&D budget, stemming from unfair duopoly competition from Intel during the years when AMD was superior. Hopefully, AMD can make up for this missing R&D money by being superior in graphics for a while.
I do not believe there is a tech company pushing more innovation with fewer resources.
Re: (Score:3)
I like them simply because they are significantly cheaper for the same performance.
I prefer saving more money and getting 'pretty damn good' rather than 'ultimate system which is out of date a week after it arrives'.
Re: (Score:2)
Re: (Score:3)
Has AMD only cut the budget for CPU R&D and not GPUs?
They don't report R&D per division, only overall, so you'd need pretty detailed internal knowledge to say.
Re:AMD - Can't help but be a fan.. (Score:4, Informative)
Partly, but they never could match Intel on process technology, which meant Intel always had a cost advantage, even when its CPU designs were inferior. As for more recent events, AMD looks saved for a while: the division that includes consoles more than doubled last quarter and gave them an overall profit, so at least for the next year or so, with big console sales, they should be good. Still, with all their diversifying I'm worried that they simply don't want to step back into the ring with Intel, but instead focus on graphics cards, graphics-heavy APUs, heterogeneous computing, semi-custom designs, ARM microservers and so on.
The reason I say that is because their CPU sales are way down, still going down and losing money; they have to either really step it up or step out, and their roadmaps don't exactly indicate going on the offensive, just moderate revisions that might keep them from losing more ground. They have CPUs good enough to be "console-quality" for this generation of consoles, and those will sell for a good while, since many PC games will be console ports and so play well on that level of hardware, even if they give up competing with future Intel CPUs. It's not like they're competing very well on high performance or performance per watt today, just performance per dollar, and it's showing in AMD's financials.
Re: (Score:2)
I've been looking at getting a card that will run Star Citizen decently during the dogfighting module. Not looking to spend more than $100, as I'll buy a new desktop computer next year as originally scheduled. The biggest problem was the weak power supply in the existing PC. Well, the R7 240 only draws 30 watts of power for about $90. I know it's not the highest-performance card. Existing 7750s and 7770s beat its performance, but with the R7 240 I didn't have to worry about spending $50 and repla
Re: (Score:2)
That's good, if you pay attention to GDDR5 vs DDR3; the former is much better even on the lowest-end card. I.e., look only at 1GB cards with explicit GDDR5, even for the Radeon R7 240, 250, 7730, 7750.
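The difference is easy to put numbers on; the per-pin rates below are generic examples (not official R7 240 clocks), but the ratio is about right:

```python
# Memory bandwidth = per-pin data rate * bus width; GDDR5 roughly triples it here.
bus_width_bits = 128
ddr3_rate_gbps = 1.8       # per pin, typical budget DDR3 (assumed)
gddr5_rate_gbps = 4.5      # per pin, typical low-end GDDR5 (assumed)

ddr3_bw_gb_s = ddr3_rate_gbps * bus_width_bits / 8     # ~29 GB/s
gddr5_bw_gb_s = gddr5_rate_gbps * bus_width_bits / 8   # ~72 GB/s
print(ddr3_bw_gb_s, gddr5_bw_gb_s)
```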
Re: (Score:2)
I'm actually just as impressed with their business wins of late. They've gone from posting massive losses with no signs of anything on the horizon to getting all the processors for the next-gen consoles except the Wii U's CPU, plus a heavy feature in the new Mac Pro, plus a growing tablet side. And while Bulldozer still seems to be an overall failure, GCN is very competitive and Jaguar seems to be pretty powerful.
If they can fix the IPC problems with Bulldozer, or otherwise get a decently competitive deskto
Re: (Score:2)
I'm guessing the only reason they got CPU wins for the consoles is because they got the GPU wins for those consoles. And while volumes may be high, the margins will be minimal compared to a desktop CPU sale.
Re: (Score:2)
Yes, but at this point AMD needed the publicity win. I'm on a few gaming-oriented sites, and opinion on AMD has pretty much pulled a 180 from a year ago. The standard responses have gone from "they're doomed" and "they suck" to "they're winning in some areas" and genuine interest in their future plans. Bulldozer CPUs aren't popular, except for some video editors, but their GPUs are on the upswing (they're fast going from "OK but cheap" to "good and cheap") and there's growing interest in Jaguar chips in low
Re: (Score:2)
No kidding. Last time I mentioned AMD's Bobcat (predecessor to Jaguar) as viable in its segment against Atom I got downmodded here at Slashdot.
Re: (Score:2)
Now Jaguar gets all the major console wins. Which was pretty sweet.
Re: (Score:2)
accidentally moderated redundant so posting to clear moderation
Re: (Score:2)
Not to mention that they sold their designs to Qualcomm and then had the CEO jump over there as well. Now look how well Qualcomm is doing with AMD IP. Their biggest problem was a string of incompetent CEOs and upper management.
Re: (Score:2)
It wasn't a string of incompetent CEOs. It was just one. Hector Ruiz.
For as little? (Score:2)
Re: (Score:2, Interesting)
The R9 290X also has double-precision compute, for about half the price.
The R9 290X is a direct Titan competitor.
Re: (Score:2)
You do understand you have to compare with similar goods to know if the price is relatively cheap or not, right?
Sure, but for the purposes of this conversation, a used car is a "similar good" in that either can be used for entertainment purposes. And I can buy a running used car (though probably with back registration) for that. So I'd buy one across a state line, and only pay initial used car registration.
$550 is a lot of money. It'll cover most households' groceries for the month. Don't pretend it's cheap for something that will be outdated in a couple years anyway. If it's worth it to you, that's a separate subjec
amd crippled R9 double precision just like nvidia (Score:3, Interesting)
We've also come to learn that AMD changed the double-precision rate from 1/4 to 1/8 on the R9 290X, yielding a maximum of 0.7 TFLOPS. The FirePro version of this configuration will support full-speed (1/2 rate) DP compute, giving professional users an incentive to spring for Hawaii's professional implementation.
Lots of folks use GPGPU but don't have a "professional" budget to pay the extortion fee to have artificial limits lifted from the hardware they purchase.
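For reference, the quoted 0.7 TFLOPS falls straight out of the single-precision numbers (the shader count and clock below are the published 290X figures, treated as approximate):

```python
# FLOPS = shaders * 2 (FMA = 2 ops/clock) * clock; then apply the DP rate cap.
shaders = 2816         # stream processors on the R9 290X
clock_ghz = 1.0        # quoted boost clock, roughly

sp_tflops = shaders * 2 * clock_ghz / 1000     # ~5.6 TFLOPS single precision
dp_consumer = sp_tflops / 8                    # 1/8 rate on the consumer card, ~0.7
dp_firepro = sp_tflops / 2                     # 1/2 rate promised for the FirePro, ~2.8
print(sp_tflops, dp_consumer, dp_firepro)
```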
Re: (Score:2)
You can turn recent Nvidia Geforce cards into Quadros by removing and adding 1 or 2 resistors.
The resistors simply encode the PCI device ID.
http://www.eevblog.com/forum/chat/hacking-nvidia-cards-into-their-professional-counterparts/ [eevblog.com]
No Mantle for Xbox/PS (Score:2)
> AMD/ATI has also introduced the Mantle API, which offers lower-level access than DirectX and is cross-platform. This may turn into a very important API, as AMD/ATI has its GPUs in the next-generation Sony and Microsoft consoles, giving game developers a large market share to target.
I read somewhere that that's unfortunately not true; Mantle will not be available for the new Xbox or Playstation. My speculation is that Microsoft and Sony don't actually want to be THAT compatible as it would make porting too eas
Re: (Score:2)
I read somewhere that that's unfortunately not true; Mantle will not be available for the new Xbox or Playstation. My speculation is that Microsoft and Sony don't actually want to be THAT compatible as it would make porting too easy...
It's the other way around. Mantle on the PC is the equivalent of the console APIs, with all the lovely low-level access you get on consoles.
You will be able to take a console game and throw a little shim between the GFX calls to make it a PC Mantle game.
Maybe I'm just a lame "PC gamer"... (Score:3)
...but "for as little as $550 each" just blows my mind.
I thought I was crazy when I spent $400 on a graphics card once, but I (and I understand it's subjective) was perfectly happy with the performance on any game I played for the next 2 years. $500-$1000 (x2) Crossfire/SLI setups just seem to me to be about people with too much money and not enough creativity as to how to spend it...
Re:Maybe I'm just a lame "PC gamer"... (Score:5, Insightful)
I've always had the notion that if you just wait a year, you can get yesterday's models for a great price and instead play the games that now have been out long enough to be properly patched. This has the bonus effect of weeding out a lot of crap games.
obligatory xkcd (Score:5, Funny)
I've always had the notion that if you just wait a year, you can get yesterday's models for a great price and instead play the games that now have been out long enough to be properly patched. This has the bonus effect of weeding out a lot of crap games.
Which of course comes with some downsides. [xkcd.com]
Re: (Score:2)
I resemble this remark, but I don't think I'm the target audience of these graphics card vendors. The only game I currently play is DOTA on the Warcraft III engine, under Wine on Linux Mint 14, on an AMD64 system I assembled around 2005 that includes a Radeon 9500 or 9600.
I don't meet the system requirements for DOTA 2 on Steam, so I guess I need to upgrade some time in the next couple of years.
Re: (Score:2)
You have to consider performance against the current flagship: the ATI card is as fast as a Titan in most cases and significantly less costly. So if the highest-performing part (the Titan) is $200-400 more than the ATI card, then the ATI card (for that performance level) is in fact the lowest-cost option.
You always pay a premium for the bleeding edge. When the 8800 GTX/GTS came out it was the same, since it was the top performer, so they could charge premium prices. At the time the 8800 was so good they could cha
Re: (Score:2)
I made the mistake of buying BF3. My graphics card (not the top of the line, but not bad) played it fine. The problem was the game was mostly mindless repetitive crap.
Re: (Score:2)
Oh. "LOL".
Re:Maybe I'm just a lame "PC gamer"... (Score:5, Informative)
This stuff is just amazing to me. The bottom-end R7 260X card clocks 1.97 TFLOPS for $139. For that $550 you get up to 5.6 TFLOPS. It wasn't so long ago that you would expect to pay $2,000 for a desktop PC. In fact, you still can.
In June 1997, ASCI Red at Sandia Labs was the first supercomputer in the TOP500 to breach the 1 TFLOPS barrier. It had 7,264 cores in 104 cabinets or system racks, consuming a total of 1600 square feet of datacenter space. It required 850 kW of power, not including cooling. With upgrades it remained in the #1 spot on the supercomputer charts until 2000, and wasn't decommissioned until early 2006, when it still ranked #276 in the TOP500 with only 2.4 TFLOPS.
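A rough performance-per-watt comparison, using the ASCI Red numbers above and assuming roughly 300 W board power for the 290X (an assumption, not an official figure):

```python
# GFLOPS per watt, ASCI Red (1997) vs a single R9 290X (2013).
asci_red_gflops, asci_red_watts = 1000.0, 850000.0     # 1 TFLOPS, 850 kW
r9_290x_gflops, r9_290x_watts = 5600.0, 300.0          # 5.6 TFLOPS, ~300 W (assumed)

asci_eff = asci_red_gflops / asci_red_watts            # ~0.0012 GFLOPS/W
r9_eff = r9_290x_gflops / r9_290x_watts                # ~19 GFLOPS/W
print(f"about {r9_eff / asci_eff:,.0f}x the efficiency in roughly 16 years")
```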
Re: (Score:2)
Cross platform? (Score:2)
Right. Cross platform would be important, especially if the API appeared on the next-gen consoles.
However, I can't really see Microsoft implementing this API on their console. And I don't think Sony will do that either.
And then there's the fact that a game developer now needs to implement two APIs - and if "Mantle" is actually closer to the hardware then there won't be much portability between the two. Which makes this somewhat dead in the water.
Re: (Score:2)
Re: (Score:2)
Those are locked-down consoles. Might be a trifle difficult to install a direct competitor on such a device. You're pretty naive if you think that AMD can simply provide such tools against the wishes of the creators of the console.
Not to mention that I'd be wary because even if you got it to run, you as a developer might simply be hit by the banhammer for using an unofficial API.
rich people problems (Score:4, Informative)
"only" 550 dollars. Most people spend less than that on a whole computer, or don't HAVE 550 dollars.
Re: (Score:2)
Most people also don't have any use for a $550 graphics card; if you buy one, you're almost certainly a serious gamer who'll get many hours of use out of that card. Of course you're not doing that on minimum wage, but having a $1k gaming PC is hardly just for the excessively wealthy. Honestly, the cash investment is very low compared to many other hobbies; it's mostly time and effort. Just like WoW addiction is probably the cheapest addiction you can get, well, if you don't count losing your job over it. Perso
Re: (Score:2)
Re: (Score:3)
In economics there is a term called opportunity cost. For example, you could be working a second job instead of reading this reply, or going to school for a better career.
There are three people out of work for every available job. Many of the available jobs are bogus listings designed to justify hiring an H1-B who doesn't actually meet their bogus requirements either. Most of the available jobs are minimum wage, or near it, and even having two of them wouldn't let you feed a family.
My ex-guild leader is much richer and lost money, as he did not want to start a business again and make gobs of money, since that would mean no midnight runs. When things got rough, his sister told him to quit playing and get a real job or get his butt out of there. He did just that and is now making about six figures again.
Most of us have never made six figures. Why don't you talk about people to whom we can relate?
or earn more over a lifetime with $500 a credit hour for an advanced degree.
Many of the people who can't find a first job in the USA right now have an advanced degree. You're tal
Is it HDMI 2.0 or 1.4?! (Score:5, Interesting)
Has anyone else noticed that despite the endless 4K resolution marketing being put out there by AMD, there is not a peep on the specific type of HDMI port the card has?
There is a HUGE difference between HDMI 2.0 and 1.4, but it's always specified as just "HDMI" with no version number. No review mentions the HDMI version, even though one would think that a real journalist would put in some effort to research this and find out.
I suppose it's easier to run the card through a bunch of automated benchmarks, cut & paste 15 pages of results to maximise ad impressions, and call it a "review".
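To put numbers on that "HUGE difference": the TMDS clock ceilings below come from the HDMI specs, and the 4K60 pixel clock uses the standard CEA timing:

```python
# HDMI 1.4 tops out at a 340 MHz TMDS clock, which limits 4K to 30 Hz;
# HDMI 2.0 raises the ceiling to 600 MHz, enough for 4K at 60 Hz.
h_total, v_total, refresh_hz = 4400, 2250, 60   # CEA timing for 3840x2160 incl. blanking
pixel_clock_mhz = h_total * v_total * refresh_hz / 1e6   # = 594 MHz

hdmi_1_4_max_mhz = 340
hdmi_2_0_max_mhz = 600
print("HDMI 1.4 can do 4K60:", pixel_clock_mhz <= hdmi_1_4_max_mhz)   # False
print("HDMI 2.0 can do 4K60:", pixel_clock_mhz <= hdmi_2_0_max_mhz)   # True
```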
Re: (Score:2)
It will have DisplayPort, as will any monitors of 4k resolution.
Re: (Score:2)
That's great, but my 4K TV only has HDMI inputs, just like every other new TV out there.
Re: (Score:2)
That most probably means that those HDMI inputs are NOT HDMI 2.0 ports (especially since the HDMI 2.0 spec was only released this September).
Re: (Score:2)
You do realise that DisplayPort adapter cables don't support HDMI 2.0 or even dual link, so there's no way you can support 4K resolution over such an adapter.
I'm afraid not (Score:2)
AMD/ATI has also introduced the Mantle API, which offers lower-level access than DirectX and is cross-platform. This may turn into a very important API, as AMD/ATI has its GPUs in the next-generation Sony and Microsoft consoles, giving game developers a large market share to target.
MANTLE is not on any of the consoles. [tomshardware.com] This article mentions only the lack of Mantle on the Xbone, but since the PS4 GPU is the same architecture with bigger numbers, it's safe to say it's not on the PS4 either.
Anyway, the problem with Mantle is not Mantle itself, but the lack of games that will actually make good and innovative use of that tech. Sure, Frostbite 3 games will support Mantle, but for what? So that you can play console games with better graphics? Sorry, good graphics are great, but afte
Re: (Score:2)
Re: (Score:2)
It doesn't matter much on Linux which manufacturer is better. There is almost no need for GPU acceleration. Even if the GPU accelerates anything, how important is it to you, personally? Most Linux users are better off with a good quad-core CPU and >=4 GB RAM.
High-performance GPU acceleration on Linux is very important.
Re: (Score:2)
very important.
Every year there's some impractical demo or another about CPU rendering, and the next year CPU prices still haven't gone down enough to make it sensible for you to drop 16 top-of-the-line Intels into a machine just to play CPU-rendered Wolfenstein.
Re: (Score:2)
There were actually good games to play on the PC. But $550 for a pixel pumper just to play another CoD game. Not worth it.
Few to none of them need a $550 GPU, even at 1920x1080 and higher; but the PC is where all the good games are, aside from a few Xbox/PlayStation titles still in exclusivity periods, and anything Nintendo, if that's your thing.
I can't think of a platform-jealousy incident as a PC gamer since, what, Escape Velocity and Marathon?
Re: (Score:2)
Get Crysis 3 or Battlefield 4 on a 4K screen and your opinion will rapidly change. They can barely even make 40 fps!
Hence my weasel-wording of 'few to none'. Though, if somebody thinks that a $550 GPU is excessive, they probably aren't buying 4k screens, yet. People who use the term 'just to play another CoD game' with that tone of disdain probably aren't Crysis or Battlefield 4 poster children, either.
Re: (Score:2)
lol. Is there a console even capable of 4K?
Re: (Score:2)
The new consoles are, particularly the PS4.
Re: (Score:2)
Xbone is much better :)
Re: (Score:2)
Natural Selection 2 is a fantastic FPS. Pretty much brought NS1 back to life. There are some great RTS and 4X games out there. Not only that but there are so many indie games and mods to explore : ) You don't need a fancy computer to play great computer games.
Re: (Score:2)
Try Star Citizen; there's still room for the alpha coming out in December.
Re: (Score:2)
There's really no reason not to offer them for XP. The code is already written, so why not keep support in until XP really drops off the radar?
Re: (Score:2)
They couldn't test for dropped and runt frames, and said so. So, the tests tell me nothing I want to know, other than I'll still be sticking with NVidia. ;)
"That leaves us with Fraps. And of course, there’s no way for us to pick up dropped and runt frame using Fraps. So, we immediately shed the dual-GPU solutions from our charts."
Re: (Score:2)
That's great, as it should be... maybe the Triton will be less than a million dollars then ;)
Re: (Score:2)
Also, from what I remember, Triton was supposed to ship with a 512-bit memory bus... so maybe they will finally ship the card that Titan was meant to be, instead of one deliberately disabled because there was no need/competition.
Re: (Score:2)
Agreed, though I do wish they'd opted for a better cooling design...