How OpenGL Graphics Card Performance Has Evolved Over 10 Years (phoronix.com) 115
An anonymous reader writes: A new report at Phoronix looks at the OpenGL performance of 27 graphics cards from the GeForce 8 through GeForce 900 series. Various Ubuntu OpenGL games were tested on these graphics cards dating back to 2006, focusing on raw performance and power efficiency. From oldest to newest, there was a 72x increase in performance-per-Watt, and a 100x increase in raw performance. The NVIDIA Linux results arrive after doing a similar AMD comparison from R600 graphics cards through the R9 Fury. However, that analysis found that for many of the older graphics cards, their open-source driver support regressed into an unworkable state. For the cards that did work, the performance gains were not nearly as significant over time.
FOSS drivers (Score:2)
The FOSS drivers for Nvidia ("nouveau") and AMD ("amdgpu") have come a long way.
They are outperforming the proprietary drivers in some games already.
That is, at least for amdgpu with the latest pre-release kernel and Mesa 11.
Quirks are getting fewer and fewer. OpenCL support is progressing.
However the most important advantage is that the open drivers can support old hardware forever.
Re: (Score:1)
Sounds like they were studying the wrong thing (Score:1)
Re: (Score:1)
So we're using nearly 40% more power? (Score:5, Interesting)
While it's nice and all that we're getting 72x more performance per watt, since we're at 100x the performance, that means that we're using 100/72 ≈ 1.39 times the power that we used to.
I could start into a tirade about how this is contributing to global warming, but I'll leave that somewhat political stance for another time.
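That arithmetic checks out. A quick sketch, taking the 100x and 72x multipliers from the summary above:

```python
# Back-of-the-envelope check of the power-draw claim.
# Assumed inputs (from the article summary): raw performance grew 100x,
# performance-per-watt grew 72x over the same period.
perf_gain = 100.0
perf_per_watt_gain = 72.0

# perf = perf_per_watt * watts, so watts scale by perf / perf_per_watt.
power_ratio = perf_gain / perf_per_watt_gain
print(f"Power draw is now {power_ratio:.2f}x what it was")  # ~1.39x
```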
Re: (Score:2)
While it's nice and all that we're getting 72x more performance per watt, since we're at 100x the performance, that means that we're using 100/72 ≈ 1.39 times the power that we used to.
If we are getting that much of a performance boost, we are most likely not spending as much time maxing out the card's performance (and using that extra energy) as we were before.
Re: (Score:3)
Ah, but 10 years ago or so, we were really happy to get 1080p resolution working. Then we added to it - quad HD (2560x1440 - 4x 720p), and now ultra-HD/4K (3840x2160).
So a lot of performance improvements have gone to quadrupling the number of pixels we render out, including the increase in texture resolution and details.
So we're likely back to where we started.
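The pixel-count arithmetic behind that quadrupling claim, as a quick sketch (resolution names as commonly defined):

```python
# Pixel counts for the resolutions mentioned above.
resolutions = {
    "720p":   (1280, 720),
    "1080p":  (1920, 1080),
    "1440p":  (2560, 1440),   # "quad HD"
    "4K UHD": (3840, 2160),
}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

# 1440p is exactly 4x the pixels of 720p; 4K UHD is exactly 4x 1080p.
print(pixels["1440p"] / pixels["720p"])    # 4.0
print(pixels["4K UHD"] / pixels["1080p"])  # 4.0
```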
Re: (Score:1)
So we're likely back to where we started.
Yeah, because rendering the first Tomb Raider in 4K would have been so hard...
Also most people use 1080p _NOW_, ten years ago maybe they used 1280x1024 or something.
The work/complexity has increased a lot of course and will continue to do so with increasing capability for games.
For desktop & even video work it likely hasn't increased as fast as the capabilities of the cards.
Re: (Score:2)
Re: (Score:1)
800x600 / 1024x768 / 1280x1024 and so on were the standards of the CRT era, but the first LCD/TFT monitors ...
Here's a Tom's Hardware article from 2004:
http://www.tomshardware.com/re... [tomshardware.com]
"Getting hold of a suitable TFT panel presents no difficulties, as we said before: an old 15.1" flatscreen monitor makes the ideal basis for building a powerful projector. Most displays offer a physical resolution of 1024x768 (i.e. 786,432) pixels, for playing back high-quality DVD videos or displaying the Windows screen.
Re:So we're using nearly 40% more power? (Score:4)
Re: (Score:2)
Well, they still need to lower the power when using them too!
Re: (Score:1)
The old cards are completely choking for vram, so they are realistically idled waiting for texture data to swap in from system ram. Unfortunately the data just isn't that interesting, the way it was done.
unfortunately (Score:2, Insightful)
This article is not that interesting, primarily because they used high settings + 2560x1440 on old 256MB (!) cards, so most likely those cards are bottlenecked by GPU-to-system-RAM texture swapping, showing lower performance and lower power readings than if more reasonable settings had been used. We're not seeing GPU power compared, but VRAM bottlenecks instead.
Re: (Score:1)
What, you actually expected something out of a Phoronix article?
Hold my beer while I guffaw.
Is this of interest to anyone besides gamers? (Score:2)
The main reason I've favored Intel motherboards recently is that they've open-sourced their graphics drivers, which are good enough for me, so I don't have to worry about them. But then, I'm not a gamer. Are there folks out there who need the high-end graphics stuff for something besides games?
PS
Just for the record, I have ways of wasting my time that may not be any better than playing games so I'm not going to adopt a 'holier than thou' attitude towards gamers. And even I may benefit from the gamer world because gami
Re: (Score:2)
modern cad work /tv etc
video editing
multihead displays
simulators (and I dont mean flight sim)
physics calculations
3d rendering for art / film
Re: (Score:2)
Re: (Score:1)
that's about a 1.6x improvement year over year (Score:2)
Wolfram Alpha [wolframalpha.com] is great when you're out and all's you got is your phone.
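No phone needed, actually; the compound growth implied by the headline figure is a one-liner (a sketch, using the 100x-over-10-years number from the summary):

```python
# Compound annual growth rate implied by a 100x gain over 10 years.
total_gain = 100.0
years = 10
yearly = total_gain ** (1 / years)
print(f"{yearly:.2f}x per year")  # ~1.58x, close to the 1.6x in the title
```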
Re: (Score:1)
Can you just go away. People like you that make dumb comments like this add nothing to the discussion. Avoid the article and move on.
Re: (Score:1, Insightful)
Not really that interesting.
Summary: Older video cards are slower than newer video cards, but Linux is still a piece of shit.
Re: (Score:2)
Welcome to the club.
Re: (Score:3)
Linux has much less overhead and is often tailored to squeeze every ounce of power out of its hardware. MS is a little more loose in that it will cater to as much hardware as possible at the cost of performance.
Linux, if built for specific hardware, can achieve much better performance than Windows in its current state. I say current state because Xbox is an example where Windows could be optimized to perform for specific software and hence avoid all the extra validation that eats up the hardware.
Re: (Score:3)
Re: (Score:2)
Which part?
Re: (Score:3)
Re: (Score:3)
All of it. Most performance improvements are in the driver side, and drivers for Windows are more performant than Linux versions in general. Windows isn't as bloated as it used to be. The fact that you can run Windows 10 on a 2GB tablet with pretty good performance is a testament to this.
But the tablet version of Windows 10 is probably not the best example since it's been optimized for the hardware (drivers). And as I stated, until you've come across a sandboxing issue, you don't know how much validation is actually being done. This applies to Windows 10 as well.
You should choose Linux because it is Open, not for performance reasons.
I totally agree with this. It still doesn't put Linux behind Windows performance-wise. As a mainstream desktop OS, Linux simply hasn't lived up to the expectations of users and that's why Windows is still doing very well.
Re:My conclusion is that linux sucks for games (Score:4, Interesting)
Actually the grandparent is right. Windows has gained much more performance over the last few versions. With Windows 7, MATLAB ran about 20% faster on Linux than Windows. With Windows 8 the Windows version was very slightly faster, and with Windows 10 the difference is about 5% now in favor of Windows.
Overall I suspect it is nothing magical. Microsoft has just worked very hard to offload more work to the GPU and also to optimize many other aspects of their systems for power usage. I get about an hour more battery life on Windows vs Linux.
Re: (Score:2)
The speed of MATLAB should be completely independent of the OS.
The only place where the OS could be involved is swapping/paging. If an OS is slowing down an application from its max performance, there is a serious problem in the OS.
Re: (Score:1)
The difference between "should be" and "is" has some importance here. Perhaps there are some factors at work you are not aware of? Or have gotten the relative importance factors wrong?
Re: (Score:3)
How fast MATLAB runs will vary based on how much CPU time it can get. If Xorg+WM takes more CPU time than Windows does then MATLAB won't have as much CPU time to run with.
Anything that makes the OS offload more work or do work more intelligently will increase the speed of any CPU-bound operation. Just like you do the same total number of operations whether you use a triple nested loop to multiply two matrices or you use xGEMM. However the xGEMM version will run almost 100x faster since it uses the CPU FAR more efficiently.
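The xGEMM point can be sketched like so (an illustrative comparison, not a benchmark; numpy's `@` operator dispatches to an optimized BLAS GEMM, and the matrix size here is chosen arbitrarily):

```python
import numpy as np

def naive_matmul(a, b):
    """Triple nested loop: same operation count as xGEMM, far worse
    cache locality and no SIMD, hence the huge speed gap."""
    n, k = a.shape
    k2, m = b.shape
    assert k == k2
    out = np.zeros((n, m))
    for i in range(n):
        for j in range(m):
            for p in range(k):
                out[i, j] += a[i, p] * b[p, j]
    return out

rng = np.random.default_rng(0)
a = rng.random((64, 64))
b = rng.random((64, 64))

# Identical result either way; a @ b (BLAS GEMM) is orders of
# magnitude faster on large matrices thanks to blocking and SIMD.
assert np.allclose(naive_matmul(a, b), a @ b)
```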
Re: (Score:2)
From the moment an application like MATLAB is in the foreground, the OS should basically do nothing.
In other words only competing applications and the scheduling overhead of the OS should slow down any given application.
Re: (Score:1)
Trimming off services you don't need running in Windows will give you another 10-15% performance boost easily and no you don't need to run all of the services windows has on all the time by default. I suspect they set it up that way so 'everything just works' out of the box if a user requires some of those services. Yes, this can be done in *NIX of all kinds as well in their daemon processes too, but I just thought I'd put that point across too. I've been doing it for decades since the original Windows NT 3
Re: (Score:2)
But the tablet version of Windows 10 is probably not the best example since it's been optimized for the hardware (drivers).
The version running on those tablets is the same Windows 10 you run on desktops.
It still doesn't put Linux behind Windows performance wise.
Like anything you need to look at the specific use-case and hardware involved, there is no one answer that applies to all.
Re: (Score:2)
The version running on those tablets is the same Windows 10 you run on desktops.
If your desktop runs ARM then maybe, but I doubt it. The other big difference is that the tablet manufacturer will tune the OS to minimize potential failure points (by removing features) and will maximize performance in the process.
Like anything you need to look at the specific use-case and hardware involved, there is no one answer that applies to all.
Absolutely. Something many /. users overlook. Hardcore gamers will invest time in "tuning" their hardware and software. Although Windows partially caters to this, Linux caters more. Is that a valid argument for suggesting one is better than the other? I'd suggest the answer is no.
Re: (Score:2)
If your desktop runs ARM then maybe, but I doubt it.
Most Windows-based tablets are x86, not ARM.
Re: (Score:2)
How do you know?
Because they're both x86, so I just installed the same OS on both.
You have compiled them and/or have the buildchain to verify that the source code for both is the same?
That seems pretty moronic, I just used the same install media on both.
Windows isn't as bloated as it used to be. (Score:5, Informative)
You almost got me up to that statement. I did a VM install of Win10 over the weekend; it failed the first time, because I thought that a fixed 16GB for the test partition would do. The dynamic container is at 24,738,004,992 bytes now after the Threshold 2 update. Nothing else was installed - just Win10 + updates.
Give it a try, grab the iso and fire up a VM. No need for a Windows key, you can skip entering it just like the activation.
Threshold 2, which like all updates is not optional, as we all know, took >1 hour on a 4-core system with a decent SSD and ~2.5GB RAM for the VM. I wonder what you'd call a "bloated" OS.
Re: (Score:3)
Windows itself is fairly decent, I've even seen Windows 7 run on an Athlon XP with 768MB RAM just fine for offline duties.
Windows Update is a nightmare, easily the most demanding application on a Windows PC and seemingly non-deterministic. You can leave it for an hour, with the fake progress bar treading water and the CPU somewhat hammered and it fails to find updates forever. There was that WUReset script that helped, and now my VM fails to update. Also I have a physical installation with Windows 7 SP0 and
Re: (Score:2)
Ya, define "bloat." I have installed Ubuntu Desktop (14.04 and 15.10) about 300 times in the last 6 months (seems like, anyway), and Windows 10 Enterprise about 20 times. It also seems like I've spent about an equal amount of time on both tasks. Installing Windows 10 is definitely slower, more interactive (I call it "painful"), and requiring more system resources.
Re: (Score:2)
I wonder what you'd call a "bloated" OS.
An OS that runs slowly and poorly because of extra cruft that is loaded and doing something that is of no benefit to the user.
Using up diskspace is not bloat.
Using up diskspace for uninstall information of a major update and a system restore point is definitely not bloat.
And while each successive version of windows has gotten larger on the drive, they have also gotten progressively faster, more responsive, and better at managing memory, the opposite of bloat in my books.
Re: (Score:2)
I haven't used the Win10 install much, but just to nitpick:
bloated [wiktionary.org]
...
3. (computing, of software) Excessively overloaded with features (...)
In my book, both apply :-P
Without the rollback and backup stuff you mention, the frickin thing is still huge.
Don't get me wrong, but just the other day I was reminiscing with a friend about the times when our PCs ran at 4.77 MHz and how we dreamt of what those machines could do if they ran at 40 or even 400 MHz. Today, t
Re: (Score:2)
Yeah I understand the sentiment but the problem with Windows (if you call it a problem) is integration. Cortana search, speech recognition, handwriting recognition, they all seem like standalone systems but they are heavily ingrained and used in multiple places throughout the OS, each interdependent on another. Excessive feature overload depends entirely on if you use all those features and most people would, mind you I also don't consider an 11GB OS (Windows is only massive if you keep uninstall information
Re: (Score:2)
You can run Linux on an 8MB RAM system with as much performance as the processor likes to give you. ...
I bet you can scale it even down to 1MB.
Re: (Score:1)
The only thing that gets people to buy new computers these days is the fact that Windows can't be
upgraded in any reasonable amount of time. Seems to me it gets more bloated all the time.
Re: (Score:2)
Is it a guess? Or just cheering for your special toy?
Meanwhile back in reality if the CPU has less to do it performs better at the things it is doing. A stripped down MS system would also perform well for exactly the same reason.
Re: (Score:1)
Re:My conclusion is that linux sucks for games (Score:4, Interesting)
The big push to DirectX happened because OpenGL was just a royal pain in the ass to develop with.
Not quite. OpenGL was an alternative to the software rendering modes that most video games had in the early days. Once gamers saw Quake running in OpenGL in 1997, they ran out to the stores to buy OpenGL-compatible video cards. Microsoft didn't have an alternative API to compete with OpenGL. Hence, DirectX was born. It took a handful of years before DirectX stopped being a royal pain in the ass for most gamers.
Re: (Score:2)
Re: (Score:2)
DirectX first came out in late 1995, so your understanding is wrong in that respect.
I stand corrected. I didn't become serious about video games until I got a job as a video game tester in 1997. Direct3D at work didn't become a factor until the company decided to give up software rendering mode in 1998.
Re: (Score:2)
Thank you for the memories of upgrading my Pentium 200 with a 3DFX Voodoo card. At the time, a mind boggling feat to have a whopping 4 megs of ram just for 3D.
GLQuake.exe here we come!
Re: (Score:2)
Re: My conclusion is that linux sucks for games (Score:2)
You must be confusing it with Cyrix. Their CPUs were utter crap, but the K6-2 and especially the K6-3 were quite decent when it came to integer performance and bearable on the FPU side.
Re: (Score:2)
Re:My conclusion is that linux sucks for games (Score:5, Informative)
In 1995, when Quake made use of the floating-point and integer units of the Pentium CPU to do software texture mapping in a custom engine, SGI realized that they had to bring out a software version of OpenGL that would run on desktop PCs. Back then some bits of OpenGL would be implemented in hardware (the "fast path"), and other bits in software (the "slow path"). It was a pain in the ass for developers to try and divine which were slow and which were fast. Some combinations of vertex/color/normal attributes were fast and others were slow. Microsoft bought out a 3D game engine developer, pulled out the lower layers and created DirectX.
3Dfx brought out a piggy-back board that worked with desktop PCs. Then SGI engineers left to form Nvidia, and a great race began. First texture mapping was hardware accelerated, then both companies tried to outdo each other every quarter with new extensions. That led to a legal battle, with Nvidia winning.
Eventually, by 2001, we had the first truly full hardware-accelerated consumer 3D graphics for the PC. That's continued.
Re: (Score:2)
> Microsoft bought out a 3D game engine developer, pulled out the lower layers and created DirectX.
That was due to Microsoft's NIH syndrome; they bought RenderMorphics in February 1995 and transformed their shitty Reality Lab API into DirectX.
* https://en.wikipedia.org/wiki/... [wikipedia.org]
Re: (Score:2)
The OpenGL bits that "sucked" at the time were lighting (8 or 16 light sources max, built into the pipeline). This was great for CAD companies who just needed stuff to look like plastic, but it was awful for game developers. Also, OpenGL back then only used the begin/end model with display lists. The main vendors only optimized the paths that id Software used (various combinations of vertex, normal, color and texcoords). Programmers had to speed-test all the different paths to find out why there were speed
Re: (Score:2)
Do you even understand the phrase NIH at all ??
> Not Invented Here (NIH) is the philosophical principle of not using third party solutions to a problem because of their external origins.
The third-party OpenGL *already* existed and they refused to use *that*.
They *invented* DirectX by buying a company and bringing it in house.
Re: (Score:2)
Have you even *played* glQuake ?? Because you keep using this word "sucked" -- it doesn't mean what you think it means.
*At the time*, OpenGL was WAY easier to **developers** as Carmack pointed out in his plan how much *Direct3D sucked*. Why do you think Windows NT _supported_ OpenGL in the _first_ place??
I know because I was one of the many game developers who signed the original petition to Microsoft to support OpenGL *at the time* instead of that shit-fest Direct3D. The fact that Microsoft's politics o
Re: (Score:2)
Why do you think Windows NT _supported_ OpenGL in the _first_ place??
Not for games. Windows NT was the business operating system. OpenGL was there to support CAD and graphic users with high-end OpenGL cards as an alternative platform to Silicon Graphics.
Re: (Score:2)
Programming-wise, the royal pain in the ass is quite the other way around, and it is not over. I pity those who use DirectX instead of OpenGL.
Re: (Score:2)
Even if that is true, it is irrelevant in a comparison of video games, where a quality video card driver is almost exclusively the only relevant factor.
Re: (Score:2)
Linux has much less overhead and is often tailored to squeeze every ounce of power out of its hardware.
In what cases is it "tailored to squeeze every ounce of power out of its hardware"? In its most common forms (Android, Debian/Slackware/RedHat-based desktop and server distributions) it is about wide hardware compatibility. Even most embedded builds are just the stable kernel tree.
Linux if built for specific hardware can achieve much better performance than Windows in its current state.
That's true of anything. If you make a special version of software that targets specific hardware it is most likely going to run better than software that targets a whole range of hardware.
Re: (Score:2)
But there are distros of Linux, like Gentoo...where you can install it and custom compile your kernel, etc...for the hardware it is running on.
If for nothing else, this is nice for squeezing out the most of your older hardware you might be wanting to run things on rather than toss it in the trash.
Re: (Score:2)
But there are distros of Linux, like Gentoo...where you can install it and custom compile your kernel, etc...for the hardware it is running on.
Yes that is what I said, if you make a custom version of software to target specific hardware you're more than likely to get better performance than software that targets a range of hardware. But in reality the investment in the hardware driver is going to provide far more value than modifying the kernel.
If for nothing else, this is nice for squeezing out the most of your older hardware you might be wanting to run things on rather than toss it in the trash.
Yet as this article demonstrates, the open source drivers for older hardware often regress into an unworkable state, obviously the time spent on making this older hardware work is not worth the effort.
Re: (Score:2)
In what cases is it "tailored to squeeze every ounce of power out of its hardware"?
...in any case where the admin/owner decides to take a trip through make config, in extreme cases literally tear out unused bits of kernel from source (then recompile), or on a lesser scale, tweak up /etc/sysctl.
Now, if you mean tailoring from the OEM? That's most often found in Linux-based appliances, where the source itself is often hacked on to remove stuff the appliance would never use.
To be fair? Yeah you can twiddle with Windows Registry settings and turn off/disable excess services (I'd done both ba
Re: (Score:2)
In what cases is it "tailored to squeeze every ounce of power out of its hardware"?
...in any case where the admin/owner decides to take a trip through make config, in extreme cases literally tear out unused bits of kernel from source (then recompile), or on a lesser scale, tweak up /etc/sysctl.
Not really, that might reduce the footprint a bit but it's a far cry from tailoring it to "squeeze every ounce of power out of its hardware". I doubt many - if any - people actually do much outside of what you specified. The real hardware-specific performance gains are going to be achieved with good hardware drivers, not just by omitting things from the kernel.
Re: (Score:2)
In what cases is it "tailored to squeeze every ounce of power out of its hardware"?
Because of the community you get specialized versions of drivers that remove unneeded checks for specific applications. This isn't possible with Windows drivers as they are usually not open source. It all comes down to the open-source nature of Linux and its distributions.
That's true of anything. If you make a special version of software that targets specific hardware it is most likely going to run better than software that targets a whole range of hardware.
Linux in its base form is an empty shell you can fill. Windows more or less has an equivalent of this. Windows 10 IoT is probably the closest thing to it and I'm not sure how feature-rich it really is compared to the Linux kernel.
Re: (Score:2)
Because of the community you get specialized versions of drivers that remove unneeded checks for specific applications. This isn't possible with Windows drivers as they are usually not open source. It all comes down to the open-source nature of Linux and its distributions.
You're talking about open source rather than Linux. A great deal - the majority actually - of the hardware drivers for Linux are in fact closed source.
Re: (Score:2)
Windows performance, especially for games, is extremely good. Microsoft has now produced three games consoles and learned a lot about gaming performance from them. That's one of the reasons why the performance boost from Vista to Windows 7 was so big - they used the tools they developed for games to profile Windows code and fix the bottlenecks.
DirectX is also pretty damn good for games. OpenGL is lagging a bit, especially in terms of bringing together more traditional rendering and compute. Non-Microsoft co
Re: (Score:2)
Linux is just not very good for games. Windows has much better technology when it comes to computer graphics
Linux is weaker because it's a *relatively* unpredictable platform to target by comparison to Windows, and crucial driver performance optimizations lag behind, or simply aren't done by the vendors, due to the relative demand/market share and the fact that companies have limited resources.
this is why the xbox is based on windows technology
That's mostly because the Xbox is made by Microsoft.
and not free hand-me-down college project stuff.
PS4 Orbis is based on FreeBSD. (Remember what the B in BSD stands for? Hint... think 'names of colleges'.)
Troll grade: FAIL
Re: (Score:2)
Windows 10 is 10/10? Well, 10 divided by 10 equals one. But they did make the Xbox One. On the other hand, it means Xbox 1 is their third game console.
I really can't wait to see what stupid name they'll be using for their fourth console:
- Xbox
- Xbox 360
- Xbox One
- Xbox Delta?
Re: (Score:2)
Should be enough for anyone!
Re: My conclusion is that linux sucks for games (Score:2)
Windows 10 is 10/10
Is it easier type that with a straight face when you post anonymously? :)
Re: (Score:2)
I wish Valve had picked PC-BSD as their platform, instead of Ubuntu. I'd have loved to be able to play Civilization V on this PC-BSD laptop. Guess I'll have to wait until PC-BSD 11, when SteamOS jails will be included.
Re: (Score:2)
It's probably more about market-share than quality. Vendors of software and drivers target the biggest markets first because that's where the most revenue comes from, and Linux is not a big consumer market.
Same with Mac. Mac's are not cheap, but not a good game platform due to the same market-share forces.
If vendors want to go after smaller game markets such as Linux, Android, and Mac, then perhaps they should focus on strategy instead of flashy graphics. That way they don't
Re: (Score:2)
Human "apes" are drawn to the big shiny red ball, as usual. And fancy graphics are a status symbol: keeping up with the Jones'.
I think your problem is your inability to see anything beyond the "big shiny red ball". Go play tuxracer if you want but the latest Forza, for example, offers a *lot* more than just realistic graphics, now of course as an "ape" you can't see that can you? You're too distracted by the "shiny" to notice everything else. If you're going to try to create a realistic experience with physics, sound and AI then why would you not do the same with graphics?
Re: (Score:1)
I'm not quite following your criticism. There seems to be an unstated assumption. They could target Forza at other or multiple platforms and still have "full" physics. The physics calculations shouldn't be "lost" on other platforms, even if some graphics are, unless perhaps they are using the GPU for physics also?
Re: (Score:2)
They could target Forza at other or multiple platforms and still have "full" physics.
Why make a half-assed version for a niche platform that can't support proper graphics? Having a high resolution visual experience isn't some "status symbol" it is being able to express the vision of the art designer and to create an immersive experience for the gamer. Previously we had to expend considerable effort to scale down assets and remove detail because of hardware constraints, now the hardware is powerful enough that we waste much less time compromising the art design.
The physics calculations shouldn't be "lost" on other platforms, even if some graphics are, unless perhaps they are using the GPU for physics also?
Very often the GPU is used for
Re: (Score:2)
Are you talking about the XBox when you claim: "Why make a half-assed version for a niche platform that can't support proper graphics?"?
No, this article is about Linux graphics drivers and that while they have gotten better for brand new hardware they have gotten a lot worse for older hardware.
Here's the reason why: there's money on the table.
But there isn't, because the target is a niche.
There's an advantage too to writing multiplatform, as any programmer knows: if it works on several systems, it's less likely to have crashing bugs and more likely to work well.
Umm...no, there's no factual basis to support such a thing.
It even makes QC easier, since you don't necessarily have to test every permutation of game progress, since the differences in game play on different OSes can exercise different path orders
How does that result in less burden on QC? You still have to test everything and you have to test it on every target platform.
especially useful in a multithreaded program, where nondeterministic code paths is a serious cause for bad bugs in software.
And since every platform has different methods for handling multiple threads that increases the burden on QC. Just becaus