Intel Releases Ivy Bridge Programming Docs Under CC License 113
An anonymous reader writes "The Ivy Bridge graphics processor from Intel is now fully documented under a Creative Commons license. Intel released four volumes of documents (2400+ pages) covering their latest graphics core as a complete programming guide with register specifications. Also documented are their new execution unit and video engine."
Captain Obvious Here (Score:2)
Re:Captain Obvious Here (Score:5, Insightful)
And Broadcom too, while we're at it. It's not as if we're asking for the schematics to copy the chips; just some low-level API information would be nice for OSS driver development.
Re: (Score:2)
Re: (Score:2)
Those Broadcom chips are SDR, I'm pretty sure. Just leaving the code open like that and handing the hardware out might cost them somewhat, with things like FCC certifications.
Re: (Score:2)
Re: (Score:2)
They require firmware uploads; their chips are most definitely SDR.
Re: (Score:3)
Broadcom is a highly profitable and successful company. You could argue what they're doing is financially beneficial.
The argument "we must be doing it right because we're making money" works great right up until the point you realize somebody did it better and passed you by.
Re:Captain Obvious Here (Score:5, Informative)
Yup, to the tune of a couple hundred million: http://www.phoronix.com/scan.php?page=news_item&px=MTEyNTE [phoronix.com]
Re: (Score:2)
Can't wait for the Chinese version of this ATI chip to come out... it oughta drive down the prices of nVidia cards, which is always good for me.
Re: (Score:2)
Yes because the Chinese version of the CPU drove down Intel and AMD CPU prices... Oh, wait.
Yeah, I guess you're right. It's not like cheap MIPS chips coming out of China help keep the price of other chips down. Oh, wait...
Re: (Score:1)
The nV and ATI chips are already manufactured in China by Chinese companies.
When they want copies of the chips they just run extra real ones on the same production line.
The Chinese government already has copies of the designs.
Only end users are hurt by this sort of pseudo-secrecy.
Re: (Score:2)
ATI hasn't been around as a company since 2006 (when AMD bought them), and died as a brand two years ago.
Take that you morons at nVidia! (Score:3, Insightful)
If you won't allow us to write software for your crappy cards, then there'll be no software for your cards. I don't understand why these Microsoft-style closed source morons always think not allowing people to use what they sell will help them. They're letting their paranoia get in the way of good business.
Re:Take that you morons at nVidia! (Score:5, Interesting)
I don't understand why these Microsoft-style closed source morons always think not allowing people to use what they sell will help them.
You're asking a question about market behavior, but the problem isn't a market problem (what you say makes sense, in a free market). Since the expected market behaviors don't exist, you have to ask, "why is this market broken?"
The standard answer is that they're violating thousands of patents six ways to Sunday, and the more open they are about their hardware, the more they risk those violations being found out.
Of course all the manufacturers are doing it because the patent system is so screwed up and the product would be impractical otherwise. People get grants on the obvious and necessary techniques all the time. And it's not just the big three where they could cross-license - there are trolls out there who just want to be parasites on the successful shops.
As usual, this is social engineering run amok. Yes, the reason you can't have good video drivers for Linux is that the government has screwed up this market too. Take away this patent morass, and the vendors become interested in selling cards any way they can. Of course, the smartest-kids-in-the-room will now chime in and say that there simply wouldn't be any good video cards without the government getting involved. You decide which actually makes sense.
Re: (Score:3)
Fascinating idea. I thought the main issue was that some of their OpenGL code was licensed to them, and that license was non-transferable.
Re:Take that you morons at nVidia! (Score:4, Insightful)
They don't need OpenGL code to release hardware documentation... Given appropriate documentation, people could implement a clean-room version of OpenGL and replace the bits that can't be released. There are already several open source implementations of OpenGL which could be adapted.
Re: (Score:2)
Sort of like what Bert64 says: this has nothing to do with OpenGL implementations.
It's stuff like "poke an 0x1f into memory offset 0xWhatever then peek at offset 0xSomewhere to see the state of SOME_COOL_FUNCTIONALITY"
Re: (Score:3)
I agree. Let's ditch the patent system. It's the only rational, logical thing to do. I mean, if you think that the Patent System is good, then I call bullshit until you test the damn hypothesis already. It's not like we can't re-enact whatever crap laws we want.
I can hear it now: "If you get rid of copyrights and patents NO ONE WILL INNOVATE"... Well, please explain how the fashion and car industries are doing so well without these protections? [ted.com]
FUDsters gonna FUD...
It's time we did the damned expe
Re: (Score:2)
Well, I don't think it's good to abolish it altogether... But a big rewrite is definitely needed.
I agree that the fashion industry is a bit different... But the car-industry does use quite a bit of patents...
For copyright... Let's make it "until you cannot prove that you have made less than 5 times the invested money, or 5 years, whichever comes first." Commercial use of the content could be restricted to 15 years or something like that, but for all non-profit use it's free for all. Also let's make any work th
Re: (Score:2)
Trolls can just use legal discovery. The companies don't gain anything by keeping the specs secret, since they'd have to hand over whatever was asked for in a patent trial anyway. It may be sealed (non-public), but the companies can't hide it or refuse to provide it.
The standard answer is that they're violating thousands of patents six ways to Sunday, and the more open they are about their hardware, the more they risk those violations being found out.
I don't see how making it public would encourage trolls. If everyone already knows, then the trolls do too. Uncertainty about hardware specifications is not holding back the trolls; it's the millions of dollars for multi-year trials that have uncert
Re: (Score:2)
I don't see how making it public would encourage trolls. If everyone already knows, then the trolls do too.
The trolls don't know which patents are being 'violated'. You can't simply go fishing with discovery. "Your honor, it stands to reason that the defendant is probably using our patents, so we'd like to see all the source and design documents for their core products. Our evidence is that we suspect they owe us money." What judge is going to go along with that?
Re: (Score:2)
The trolls don't know which patents are being 'violated'.
They know which patents they own, and can make a reasonable guess as to which apply. That's all they need for discovery. Trolls don't have to present evidence prior to filing suit. Think of SCO or Oracle v. Google, all it takes is money and time, and then they're in discovery.
Using discovery as a 'fishing trip' is about finding new, unrelated information or issues, which trolls don't need to do.
Re: (Score:3)
Re: (Score:2)
Because it's a balance of interests - financial, engineering, IP, etc.
Let's say someone like nVidia or AMD open up their drivers completely - how much extra money would that make them?
Given that Intel is t
Good news! (Score:4, Insightful)
Re:Good news! (Score:4, Informative)
Nvidia's "half baked" support is actually better, since their drivers are backported to older stable distros. Stuff that requires a bleeding-edge kernel or X.org is a royal pain to make work on a box running an older distro.
Re: (Score:2)
AMD has released the specs of their chips and has even helped with the development of drivers. It is unfair to call their support half baked.
Re: (Score:2)
The article itself says they've lagged behind in handing out specs.
Collaboration (Score:2)
They might be slow, but:
- They *DO* end up releasing specs. (Although after a long time in legal checks before getting approved for release).
- They do *PAY* developers to write open source drivers for their GPUs.
This has ended up nicely [phoronix.com] for them: unlike Nvidia, they have a nice open source driver that can be ported to Chinese (MIPS-based) CPUs, bringing a lot of money to them. (Also, in five years, expect cheap Chinese clones of current Radeon GPUs :-P).
(And it's not the first time that the open source driver
Re: (Score:2)
Not to mention that Intel's GPUs are just not as good, or I bet as complex, as AMD's at this time.
Link to documentation (Score:5, Informative)
The documentation referenced is available from Intel Linux Graphics: Documentation [intellinuxgraphics.org].
That's a lot of information (Score:3, Interesting)
We've come a long way since the 47 registers and paltry documentation of the Commodore 64's 6567 video chip. My question is, who can actually master these modern systems before they are obsolete? No one person, I think, can gobble 2400 pages of documentation to work with a graphics system. Are people now merely specialists of one tiny subset of a system, never to understand what is going on overall? That might explain why we need 600M device drivers these days.
Re:That's a lot of information (Score:4, Informative)
That's what makes the release of Intel's documentation under a CC license so logical. A group, or even a confederation of groups, working to develop a good driver can really make use of the docs. This can also make far more sense for Intel, as they don't have to build a driver for every purpose their chips can be applied to. If anyone could afford to release documentation like this without worrying about exposure to patents (as one other poster noted), it would be Intel. They're big enough to defend against most suits without going bankrupt.
I'm looking forward to seeing what comes of this.
It's already separated! (Score:2)
It's not really obvious where you can divide a driver up into sections, so the division-of-labor thing doesn't work.
Well currently, Linux GPU drivers are *already* separated into several parts:
- there's the kernel module (DRM) which is in charge of initialising the card, modesetting, and saving/restoring the state for suspend/resume. Texture uploading/buffer object downloading is partly handled here, in cooperation with the in-kernel GPU memory management.
- then there's the Gallium3D back end which exposes the basic building blocks (shaders, etc.) to the Gallium3D modular API so front-ends can handle the APIs (OpenGL, Op
Is 2400 pages much? (Score:2)
Also, there are probably both high-level and low-level APIs, instructions, or modes on the video chip, and in order to write a video driver you probably don't need to understand the internal low-level instructions. If you want to write the o
Re: (Score:2)
We've come a long way since the 47 registers and paltry documentation of the Commodore 64's 6567 video chip
I wouldn't call the 6567 (VIC-II) documentation "paltry." Sure, it didn't cover every weird edge case, but the official C-64 Programmer's Reference Guide included full register details and everything you needed to get access to all the chip's features.
Re: (Score:2)
That might explain why we need 600M device drivers these days.
The more likely explanation is a combination of third-party bloatware attachments, feature creep, and lazy programmers who don't bother with (or don't care about) proper optimization.
Code too... (Score:2)
My question is, who can actually master these modern systems before they are obsolete? No one person, I think, can gobble 2400 pages of documentation to work with a graphics system.
The engineers who designed the chip know it best.
And the good thing is that with Intel, they collaborate with the driver-writing efforts early on. Drivers are written while the hardware is being designed.
At the end, not only do you get massive documentation, but you get pieces of (L)GPL code written by the very few people who have an understanding of how the system works.
But yes, nowadays the barrier to entry for writing drivers is rather high and requires collaboration between hardware and software
*Not* the first public release of information (Score:5, Informative)
This is a release of a large and very complete set of formal documents, but open source driver code (GPL'd and part of the mainline Linux kernel) has been released under a public development process since just after Sandy Bridge first came out in preparation for the Ivy Bridge launch. This code is written by paid Intel employees.
Incidentally, large portions of the DRM infrastructure in the kernel *and* the X server *and* the upcoming Wayland project are all being made by paid Intel employees. Note that this development work also has major benefits to the open-source AMD driver development and we would all be better off if AMD (not to mention Nvidia) adopted Intel's approach to paying people for open-source work.
In a similar manner, there is already 100% GPL'd code available for the next-generation Haswell graphics engines. Obviously at this stage it isn't complete, but things are not hidden behind closed doors and, just like Ivy Bridge, there should be solid launch-day support for the Haswell IGP. Considering the rumours going around about the extra resources that Haswell will offer for the GPU, this chip could provide very solid launch-day out-of-the-box graphics support in notebooks and other devices that don't require a discrete GPU.
Re:*Not* the first public release of information (Score:5, Informative)
P.S. --> To anyone who saw "DRM" in the previous post and had a heart attack... DRM here means Direct Rendering Manager [wikipedia.org] and is the Linux infrastructure that lets you access the GPU for graphics acceleration.
Re: (Score:2)
Thanks for the clarification. :)
Re: (Score:2)
Re: (Score:2)
Please elaborate. VAAPI has been open and working for a long time. Nobody uses it (are you listening, mplayer guys??? - idiots), but that's hardly because VAAPI isn't there. (actually vlc uses it)
[fnj@baldur FullDisc]$ rpm -qa|grep libva
libva-devel-1.0.15-1.puias6.x86_64
libva-1.0.15-1.puias6.x86_64
libva-freeworld-1.0.14-1.puias6.x86_64
libva-utils-1.0.15-1.puias6.x86_64
[fnj@baldur FullDisc]$
AFAIK AMD has FOSS devs on payroll... (Score:2)
I thought that AMD had a number of devs working on open graphics drivers and on other open stuff like Coreboot...right?
Here's their "Open Source Zone" [amd.com], and here's Kevin Tanguay's blog post [amd.com] of May 5, 2011 (emphasis mine):
Re: (Score:3)
That's a great press release, and AMD is better than Nvidia since AMD does publish at least partial documentation... *but*... support for newer AMD cards in the open source drivers has major issues. As of right now, the 7 series cards that have been out for over 6 months have basically no support under the open source drivers. The 6 series and older cards are improving, but Catalyst is still leagues ahead in OpenGL performance and support. The documentation for the AMD cards is not released until months aft
Re: (Score:2)
Big thank-you to AMD for supporting CoreBoot. Intel, on the other hand, seems uninterested in CoreBoot at best, actively against it at worst. You can't run CoreBoot with any Intel CPU made in the last decade, and Intel has thus far refused to share any specs on their chipsets.
Re: (Score:2)
Smaller team (Score:2)
Yes they do. Only they have very few developers on the graphics drivers (a couple of guys), whereas Intel had a whole company (Tungsten Graphics, now bought by VMware) writing their drivers.
They'll probably ramp up their open source team now after getting the Chinese deal.
Re: (Score:2)
we would all be better off if AMD (not to mention Nvidia) adopted Intel's approach to paying people for open-source work.
They do pay people, but those people are not a large part of their total driver team. To put it a bit cruelly, Intel doesn't have much to lose by being open source, as AMD (through ATI) and nVidia probably know way more about high-performance graphics than Intel does. AMD does open source, but they, like nVidia, consider their proprietary, highly optimized 3D engine their crown jewels. Intel would love a fully optimized OpenGL 4.2 engine, while they're working on Mesa, which is still on OpenGL 3.0 as far as I know. AMD h
Even worse... (Score:2)
Intel would love a fully optimized OpenGL 4.2 engine, while they're working on Mesa, which is still on OpenGL 3.0 as far as I know.
Even worse, for their older hardware, they still use the classic (non-modular) part of Mesa, so they don't even get full OpenGL 3.0.
At least Gallium3D (the modern modular part of Mesa) is fully OpenGL 3.0 and is slowly being developed onward. It's used by the official AMD open source drivers, by Nouveau, by the Intel official driver for the latest generation, and by some experimental drivers for older generations done by Tungsten when they started toying with the idea (and some further development done by Google
A good start... but Intel graphics still need work (Score:5, Insightful)
This is a good thing - it means that open-source drivers can now be written that will be adequate for most users. Unless you are doing heavy 3D gaming or HTPC, Intel's products are fine.
For HTPC, Intel would be a great choice if only they'd finally fix that lingering 23.976 FPS bug. They just don't seem to be taking it that seriously, though, since it's existed since the G45 days at least. Also, I don't know if this is supported through the registers (even the documents may not make it clear) but it would be great to have real YCbCr 4:2:2 output – AMD cards claim to do this, but they are actually converting the data from YCbCr (on DVD/Blu-Ray) to RGB and then back to YCbCr for output. Allowing source-direct YCbCr output (which currently only dedicated SoCs can do) and fixing the 23.976 FPS problems would make Intel-based HTPCs a viable option at the high end. (Advanced videophiles want to use a dedicated scaler device, which offers much better scaling and/or deinterlacing results than what software and average standalone players can do.)
Re: (Score:2)
I'm curious how many Stream processors Intel can fit into a dedicated chip.
Re: (Score:2)
YCbCr 4:2:2 output
What interconnect do you use for this? What kind of display devices will accept it as input?
Re: (Score:2)
Plain old RCA cables like you use for composite video, except you use three of them (strictly speaking, the analog form is called YPbPr). Most HDTVs have a set of component inputs for this. It was the analog standard used before DRM, and thus HDMI, was mandated by the content assholes.
More info:
https://en.wikipedia.org/wiki/Component_cable [wikipedia.org]
https://en.wikipedia.org/wiki/YCbCr [wikipedia.org]
Re: (Score:2)
HDMI and DisplayPort both support this, and lots of displays will accept it as input. For example, both my home theatre projector (Epson PowerLite 8345) and my desktop's computer monitor (Dell U2711) will accept YCbCr input.
In fact, dedicated media players (such as my PS3) are already outputting YCbCr, so I'm currently using this... Of course, ultimately, it gets converted back into RGB at display-time, since all display devices are RGB.
Re: (Score:2)
What interconnect do you use for this? What kind of display devices will accept it as input?
YCbCr 4:2:2 is part of the HDMI standard, so you would use a regular HDMI cable. Videophiles would run this through a stand-alone video scaler like the Lumagen or Crystalio, so that the scaler could see the exact decompressed data from the original video stream instead of having it already pre-processed into RGB (which is a lossy process on the 8-bit channel values used in digital video).
FWIW, my Sony TV takes YCbC
Re: (Score:2)
Fixed-function hardware can implement an algorithm much more efficiently than software. Reproducing what the dedicated scalers do in software may not be practical.
Re: (Score:2)
How does a dedicated scaler device offer much better results than what software can do? Does it use some magic process that isn't reproducible using a Turing machine?
I'm sure you could replicate it using shaders on a video card (or more slowly on the CPU) if you had access to the exact algorithms used. But the companies that produce dedicated scaler devices aren't going to make these algorithms publicly available, since keeping them a trade secret is in their financial best interest. And Intel, AMD, and n
The actual documents (Score:2, Informative)
Direct link without the Phoronix fluff:
http://intellinuxgraphics.org/documentation.html
Re: (Score:3)
Well, both Intel and Linux have come a long way since the Pentium 4 era, so I don't think it's fair to use a 10-year-old chipset as a reason to avoid Intel GPUs now. It also isn't fair to compare a 2002 iGPU to a 2007 iGPU. Of course the 7025 still works and is supported; it's much newer. My 2003 FX card, however, won't render anything GTK3 properly; it's completely abandoned by Nvidia and a very low priority for nouveau developers.
Re: (Score:2)
I compared them because it was a direct replacement. The same things are true of every other NVidia card I've got... My 8400, Geforce 4, etc., all similarly "just work" like they should, and always have. Intel is the one releasing hardware with all kinds of issues.
Re: (Score:2)
Except you're discussing ancient GPUs and extrapolating your experience with those into the assertion that Intel is currently releasing hardware "with all kinds of issues."
Modern Intel GPUs work rather well.
Re: (Score:1)
I doubt the issue is with the hardware itself. Issues with "hardware" are generally driver or firmware related.
Re: (Score:2)
Funny. I categorically won't even consider anything EXCEPT Intel graphics hardware for linux. It does a beautiful job for anything I need. Not only 10 year old stuff, but the latest. I've got both ancient 865 and 945, and two Sandy Bridge systems running PUIAS6 (free RHEL6 clone) and other distros - flawlessly.
I wouldn't use Nvidia and AMD power-hog crap even if completely capable open source Linux drivers WERE available. I don't need dozens of wasted watts to draw text and little pictures on my monitor. An
Re: (Score:2)
It's a fair question. My usual answer to most of this class of question is to say Lenovo. Check out the Lenovo 530 [lenovo.com]. Hit the customize button. You will be able to pick all the way up to Core i7-3520M, display type up to 15.6" 1920x1080 LED backlit anti-glare, and still pick Intel HD Graphics 4000. It takes a little diligence to see what they will let you build
Re: (Score:2)
Under RHEL6.x, it was a non-starter... 30 minutes of use or so, and the screen stops redrawing. You've got a mess on your screen, and (thanks to KVM) restarting X11 doesn't fix it... you have to completely reboot.
No, you've got a mess on your screen, and you have to completely reboot.
The Rest of the Story (Score:4, Funny)
CC-BY-ND (Score:1)
For those of us who don't want to waste time actually downloading the PDFs buried in links from the phoronix story, the license is CC-BY-ND, so you can access it freely, and use whatever you want, but god forbid you should fix a typo.
Phoronix (Score:3)
Man, it seems like every other sentence in that article is a link to another Phoronix article. I count 14 Phoronix links in there, and the actual link to the Intel docs is buried in the middle of that.
http://intellinuxgraphics.org/documentation.html [intellinuxgraphics.org]
You cannot share what you do not own (Score:3)
Re: (Score:2)
Lets see how much documentation Intel releases for the Atom smartphone chips with PowerVR GPUs.
I think we already know the answer to this through Poulsbo. Though in 2013 they're releasing Atom SoCs with Ivy Bridge graphics called Valley View, so hopefully PowerVR are on their way out.
Re:First Post (Score:5, Funny)
I would've got first post, but I'm still reading TFA. Only two thousand pages to go...
Re: (Score:2)
Maybe this isn't news to you, and you do make a valid point about this story not being news. But not everyone follows Intel as closely as you do. I was glad to read this article and to know that I could buy a computer without having to worry that the next Linux update or upgrade would leave my graphics performance slow and chunky. Of course, whether or not my graphics card is supported on Linux depends on the open source development community, but the odds of good support are much better with open do
Re: (Score:1)
I propose an amendment to slashdot whereby anyone posting a "Why is this news?" post has their account deleted and ip banned. I feel confident in saying that since I have been on slashdot, there has not been a single "Why is this news?" post that added anything of value to the conversation.
Re:why is this news? (Score:5, Informative)
Intel publishes (for free) nearly all their architecture documents. It's been their business model since the beginning... how else would the x86 platform exist?
Someone clearly fell asleep in the 1990s, when Intel was so terrified of the V86 extensions being copied by AMD that they wouldn't tell anyone except Microsoft how they worked. People actually reverse-engineered them and released their own documentation before Intel was willing to allow things like Linux DOSEMU to use them. This did not endear Intel to me back in the day.
Indeed, an interesting relic from that era is my Turbo Assembler 5 manual. It has a number of blank entries in it for Pentium instructions, e.g.
RDTSC ("Proprietary instruction. Contact Intel for more information.") - Turbo Assembler Quick Reference, p. 118
Re: (Score:2)
Re: (Score:2)
Read Data 'Till System Crashes?
Re: (Score:2)
I should note that my copy of the Intel Instruction Set Reference from around 2003 has documentation for RDTSC
Re: (Score:2)
I'm going from memory, but the facts were a bit different. RDTSC had gone through various incarnations in different versions of the x86 processor, and they were afraid that if they made it publicly available they would have to support it forever in the name of backwards compatibility.
This is not an unreasonable fear, given how the x86 ISA has survived all these many years.