Intel To Integrate DirectX 11 In Ivy Bridge Chips
angry tapir writes "Intel will integrate DirectX 11 graphics technology in its next generation of laptop and desktop chips based on the Ivy Bridge architecture, a company executive revealed at CES. AMD has already implemented DirectX 11 in its Fusion low-power chips. Intel expects to start shipping Ivy Bridge chips with DirectX 11 support to PC makers late this year. Ivy Bridge will succeed the recently announced Core i3, i5, and i7 chips, which are based on Intel's Sandy Bridge microarchitecture."
also includes DRM ? (Score:5, Insightful)
Does it still contain the DRM restriction capability?
Because Intel can forget all about CPU sales from us and from any of our customers until it's removed.
I don't care if it promises a free pony.
Contains DRM == no sale.
Period.
Re: (Score:2, Funny)
What the heck are you babbling about? Do you have the slightest idea?
Re:also includes DRM ? (Score:5, Informative)
What the heck are you babbling about? Do you have the slightest idea?
I believe he's babbling about this [techdirt.com]. Sandy Bridge will have DRM in it (though they don't call it that for some weird reason), and Sandy Bridge is directly related to Ivy Bridge [wikimedia.org], so Ivy Bridge could well inherit the DRM features of Sandy Bridge.
Disclaimer: I am a total n00b when it comes to discussing processor architectures, so I could be wrong about something.
Re:also includes DRM ? (Score:4, Interesting)
At least that is a coherent discussion, which I haven't seen elsewhere. But when idiots talk about DRM, they lose contact with reality. Content producers want true end-to-end DRM for obvious reasons. This just gives them a way to realize it. It can't encumber anything that presently exists; it just allows some new DRM'ed protocol to be developed, one that only works on recent Intel processors.
So what? If you don't like closed content, just don't use it!
Re:also includes DRM ? (Score:5, Insightful)
So what? If you don't like closed content, just don't use it!
Widespread deployment of systems that allow closed content is likely to encourage content providers who currently release content through unprotected or insecure systems to switch to a more secure closed system. This reduces the utility of open source software, which is almost universally unable to take advantage of this kind of system because the protection measures typically require signed, trusted code. Hence, it is something that should be discouraged.
That said, boycotting closed media is likely to be just as effective as boycotting hardware that supports it; probably more so, as it is somewhat more direct.
Re: (Score:2)
Almost all boycotts are quixotic.
and if... (Score:2)
So what? If you don't like closed content, just don't use it!
And if you don't like the CPUs that support the creation of the closed content, just don't buy them!
Re: (Score:2)
So what? If you don't like closed content, just don't use it!
That's exactly what he said he was going to do, so it seems you're the one who's babbling.
Re: (Score:3)
So what? If you don't like closed content, just don't use it!
That only works if you don't like closed content for purely selfish reasons.
If you believe, as many do, that the DRM is inherently bad for society in general, then it is important to go far beyond simply avoiding it yourself. It is necessary to convince as many others as possible about the problems DRM creates for us all.
Re: (Score:2)
He's babbling about DRM.
What that has to do with this Intel chip, I don't know. But at least I have a SLIGHT idea what he's ranting about.
Re: (Score:3)
Anything with an HDMI output has to support DRM so people can't record the signal.
(We have the master key so, yes, it's a waste of time but Intel is contractually bound to support the DRM if they want to have HDMI output)
Re: (Score:3)
It's ironic that no one ever had the slightest intention of trying to record a digital monitor signal anyway. The very idea is insane. HDMI is rated at 10.2 gigabits per second. That's 76.5 gigabytes per MINUTE! Anybody who has a clue is more interested in decrypting the Blu-ray files (quite a trick, but that genie is decidedly out of the bottle).
Or you can just attach an HDFury2 to the HDMI and pipe the resulting component video into a Hauppauge HD PVR.
Re: (Score:2)
Go read the Slashdot [slashdot.org] article on Sandy Bridge.
Re:also includes DRM ? (Score:4, Interesting)
I take the sentiment back.
http://www.techdirt.com/articles/20110107/10153912573/intel-claims-drmd-chip-is-not-drm-its-just-copy-protection.shtml [techdirt.com]
As someone upthread mentioned, it may have something other than TPM.
What the hell, Intel?
Other OSes ? (Score:5, Interesting)
Re:Other OSes ? (Score:4, Insightful)
Almost certainly. They want to sell hardware, and being a full generation or more behind their competitors, have no reason to hold back any secrets of their implementation.
Re: (Score:3)
Almost certainly. They want to sell hardware, and being a full generation or more behind their competitors, have no reason to hold back any secrets of their implementation.
sure, just like GMA 500
Re:Other OSes ? (Score:4, Insightful)
Yes. Assuming someone writes the driver. DX11 is a bit ahead of OGL in hardware requirements/capabilities, so full support for dx11 means it has everything OGL needs also.
Re:Other OSes ? (Score:5, Informative)
Better than that. In OpenGL, if you say "give me this vendor-specific feature," you get it. Programmers have used this to get at the latest features of chipsets long before they're standardized.
OpenGL programmers are always ahead of DirectX, even in this case where the hardware directly targets future DirectX specs.
It's like using -moz-border-radius, -webkit-border-radius and -khtml-border-radius to get CSS3 rounded borders long before CSS3 is officially released, and yet CSS3 won't be beholden to any one browser's implementation.
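To make the mechanism concrete, here's a minimal sketch (assuming a current GL context and the classic GL 2.x-era extension-string query; the extension name is just one example of a DX11-class feature exposed this way, and the helper name is made up for illustration):

    #include <cstring>
    #include <GL/gl.h>

    // Ask the driver which extensions it advertises and look for one by name.
    // (Naive substring match; fine for a sketch.)
    bool HasExtension(const char* name)
    {
        const char* all = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
        return all != nullptr && std::strstr(all, name) != nullptr;
    }

    // Usage: only take the vendor/ARB code path when the driver actually offers it,
    // long before the feature lands in a core OpenGL version.
    // if (HasExtension("GL_ARB_tessellation_shader")) { /* use hardware tessellation */ }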
Re: (Score:3)
You can get to the vendor specific features in directx also. But in either case, that's definitely the ugly way to write code.
Re: (Score:3)
lol. Some folks still don't get it. DirectX is 'vendor specific' no matter what manufacturer's chipset is supported. That's why the guys doing OpenGL (ES) can write for Android, and iPhone/iPad, and Linux, and Solaris, and Mac OS X, *AND* Windows.
Incidentally, your "DX11 is a bit ahead of OGL in hardware requirements/capabilities" is incorrect (used to be true for a while n
Re: (Score:3)
Yes. Assuming someone writes the driver. DX11 is a bit ahead of OGL in hardware requirements/capabilities, so full support for dx11 means it has everything OGL needs also.
Not true at all. OpenGL 4.1 incorporates pretty much everything in DX11 and more, not forgetting that OGL can then have extensions added, taking it even further ahead.
Linux will definitely be supported (Score:3)
Re: (Score:3)
It's worth noting that Linux now has a long tradition with Intel of receiving support first, because the code base is readily available for development, experimentation, and testing. So chances are, most any new feature is going to be implemented on Linux first.
Re:Other OSes ? (Score:5, Informative)
Direct X is a Microsoft product
DirectX isn't really a product (you can't buy it and never have been able to). DirectX is a set of interfaces supplied by Windows for various gaming-related things, most significantly these days 3D graphics.
These days each version of DirectX specifies a set of required features. A "DirectX 11 card" means a card that implements all the features required by DirectX 11. In this context it's perfectly reasonable to ask whether those features will be exposed to other operating systems.
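For what it's worth, this is also how an application finds out whether the feature set is there: ask D3D11CreateDevice for the highest feature level the hardware driver will grant. A minimal sketch, assuming the Windows SDK headers (the helper name is hypothetical):

    #include <windows.h>
    #include <d3d11.h>
    #pragma comment(lib, "d3d11.lib")

    // Hypothetical helper: report the highest Direct3D feature level the
    // default hardware adapter supports (11_0 on a "DirectX 11 card").
    D3D_FEATURE_LEVEL QueryMaxFeatureLevel()
    {
        const D3D_FEATURE_LEVEL requested[] = {
            D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_10_1,
            D3D_FEATURE_LEVEL_10_0, D3D_FEATURE_LEVEL_9_3,
        };
        D3D_FEATURE_LEVEL granted = D3D_FEATURE_LEVEL_9_1;
        ID3D11Device* device = nullptr;
        ID3D11DeviceContext* context = nullptr;

        HRESULT hr = D3D11CreateDevice(
            nullptr,                    // default adapter
            D3D_DRIVER_TYPE_HARDWARE,   // real hardware, not WARP or reference
            nullptr, 0,
            requested, ARRAYSIZE(requested),
            D3D11_SDK_VERSION,
            &device, &granted, &context);

        if (SUCCEEDED(hr)) { context->Release(); device->Release(); }
        return granted;
    }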
Interesting possibilities (Score:3)
This is kind of an interesting line of thought to follow. One would suppose that the DX11 chip will be proprietary hardware acceleration that will integrate with the API. Now, because this is being baked into chips by Intel, they w
Re: (Score:2)
And just how were you planning to write the drivers without documentation?
Re: (Score:3)
Re: (Score:3)
DirectX (Score:4, Funny)
Goes to 11!
(I'm sorry)
Intel integrated graphics (Score:2, Insightful)
I'd rather they made their integrated graphics fast than simply support new DirectX capabilities. I don't really see the point of supporting certain features if the whole thing is going to be slow. I suppose it's easier to implement something than it is to implement it well.
Re: (Score:2)
The main point of Intel graphics is that it's cheap. If you want a barebones, low-graphics computer you buy integrated, which Intel regularly develops, mostly for use in laptops (which add the bonus of power savings).
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
That's what "support" means when talking about graphics. Graphics processing is all about taking some piece of over-used software and putting it in hardware so that it consumes a few hundred picoseconds instead of a several dozen nanoseconds per iteration. It makes common algorithms run faster.
DirectX is a standard for a set of common algorithms. It makes sense to implement as many of them in hardware as you can. DirectX11 is merely the latest iteration of DirectX, and the first to get consideration as
Re: (Score:2)
I suppose it's easier to implement something than it is to implement it well.
80/20 rule.
Re: (Score:3)
I'd rather they made their integrated graphics fast than simply support new DirectX capabilities. I don't really see the point of supporting certain features if the whole thing is going to be slow. I suppose it's easier to implement something than it is to implement it well.
It will include DirectX 11 *and* theoretically be twice as fast as Sandy Bridge. Not much to complain about there.
P.S. By theoretically I mean it will have twice as many stream processors.
Intel integrated graphics at anandtech.com (Score:5, Informative)
You can find Sandy Bridge GPU benchmarks at http://www.anandtech.com/show/4083/the-sandy-bridge-review-intel-core-i7-2600k-i5-2500k-core-i3-2100-tested/11 [anandtech.com]
"Intel's HD Graphics 3000 makes today's $40-$50 discrete GPUs redundant. The problem there is we've never been happy with $40-$50 discrete GPUs for anything but HTPC use. What I really want to see from Ivy Bridge and beyond is the ability to compete with $70 GPUs. Give us that level of performance and then I'll be happy.
The HD Graphics 2000 is not as impressive. It's generally faster than what we had with Clarkdale, but it's not exactly moving the industry forward. Intel should just do away with the 6 EU version, or at least give more desktop SKUs the 3000 GPU. The lack of DX11 is acceptable for SNB consumers but it's—again—not moving the industry forward. I believe Intel does want to take graphics seriously, but I need to see more going forward."
Note: all Sandy Bridge laptop CPUs have Intel HD Graphics 3000.
Re: (Score:2)
The numbers look even worse for Intel if you grab an "off-the-shelf" dedicated GPU that's one generation older, e.g. a 1GB Radeon 4670 for ~$65.
AMD also has Hybrid graphics, first introduced with the Puma or Spider
Re:Intel integrated graphics (Score:4, Insightful)
I'd rather they made their integrated graphics fast than simply support new DirectX capabilities. I don't really see the point of supporting certain features if the whole thing is going to be slow. I suppose it's easier to implement something than it is to implement it well.
Have you seen performance numbers for Sandy Bridge's on chip graphics? The "Intel graphics are slow" meme is dead. Sandy Bridge's integrated gpu beats most discrete graphics cards under $50. The Ivy Bridge solution will be even faster.
http://www.anandtech.com/show/4083/the-sandy-bridge-review-intel-core-i7-2600k-i5-2500k-core-i3-2100-tested/11 [anandtech.com]
Re:Intel integrated graphics (Score:5, Insightful)
The "Intel graphics are slow" meme is dead.
For anyone who likes their games to run at 30fps at 1024x768 with low graphics settings. The rest of us find that kind of slow actually.
Re: (Score:3)
For anyone who likes their games to run at 30fps at 1024x768 with low graphics settings. The rest of us find that kind of slow actually.
Which is exactly what 95% of people are quite happy with if it means they save $50.
Re:Intel integrated graphics (Score:4, Informative)
Do the "rest of us" constantly carp that Nvidia IGP graphics are slow, AMD IGP graphics are slow, and AMD Fusion graphics (will be) slow? Because this is what the GP was referencing. Nobody expects "built in" graphics to be comparable to high end discrete graphics. Performance comparable to the lesser Nvidia and AMD chips, e.g., AMD 5400 series, Nvidia 410 and 420 (possibly 430) series, is not considered slow by anyone except high end gamers. High end gamers buy discrete graphics cards (or specialized notebooks), period. The "rest of us" is broader than that. The "rest of us" includes business users, HTPC users, and casual gamers.
GP didn't mention gamers. I'm not willing to pay more so that every CPU and/or motherboard is suitable for high end gaming. Your expectations are unrealistic. Good day.
Re: (Score:3)
Yes, yes, it's not exactly a gamer's GPU. It's not like Intel is going to include a top-end GPU on every CPU just in case you happen to need it either. However, what Intel delivers on their IGP chips is typically the low bar of performance, like what you might get if you tried playing a game on a work laptop which obviously wasn't bought for gaming. That low bar is still quite low, but it's a lot higher than it used to be. A lot more older games will run at good performance. A lot of newer games are playable e
Two Questions (Score:4, Interesting)
1. Will this in any way benefit OpenGL?
2. Will this hinder future versions of DirectX or are they backwards compatible in a way that there would be large chunks in hardware and new changes made as firmware revisions or software implementations?
Re: (Score:3)
The hardware has all the features necessary to support DX11. DX11 is generally a superset of what OpenGL can do. So yes, OpenGL should be fully supported, assuming someone writes the driver.
Re: (Score:2)
Re: (Score:2)
Yeah, that's exactly why I had to put in the qualifier about the driver, unfortunately.
Re: (Score:2)
First Intel CPU + GPU on die? (Score:3)
The Sandy Bridge chips are the first in which Intel has combined a graphics processor and CPU on a single piece of silicon.
I thought Intel already did this a while ago with the newer Atom chips:
http://en.wikipedia.org/wiki/Intel_atom#Second_generation_cores [wikipedia.org]
Re: (Score:2)
I'm sure the article was thinking of the mainstream x86 line, but failed to say it. Or more likely, it was written by someone who doesn't care about the platforms Atom is aimed at, and therefore didn't know.
Re: (Score:2)
They had. The news here is that (more of) the DirectX 11 API will be in hardware.
Great! (Score:5, Funny)
Those new texture mapping algorithms will really make Outlook load fast.
Re: (Score:2)
The 3D text mode in Outlook 2012 is pretty cool. The words are practically poking you in the eyeballs!
Re: (Score:3)
Cool! Using Outlook always felt like someone was poking me in the eye. Now maybe others will be able to relate.
Re: (Score:2)
I love the way it bump-mapped the bumped post on 4chan.
They actually may (Score:2)
They actually may, seeing that the entire GUI frontend of EVERYTHING in Vista and Windows 7 is basically a multithreaded version of Direct3D. Those "reflections" on the edges of the window frame? They're textures. And textures require mapping.
But will it improve Minecraft's graphics? (Score:5, Funny)
Re: (Score:3)
No. That's a problem in the minecraft client, not in the hardware that displays it.
Re: (Score:3)
Minecraft uses LWJGL, the lightweight Java game library, which in turn uses OpenGL.
A better graphics card, or better graphics driver, will render Minecraft better.
Re: (Score:2)
Not unless Minecraft improves the features it is using. It's a really primitive design; there's almost no way any existing card isn't rendering what Minecraft puts out at maximum quality.
Re: (Score:2)
So no, I don't think a new graphics card will help that much. (You should play Cube 2: Sauerbraten [sauerbraten.org], anyway.)
Re: (Score:2)
Re: (Score:3)
The blocks should be more blocky but look less blocky.
I want tessellated blocks. The entire Minecraft world should be a dynamic fractal, with the shape of each individual block mirroring the structure of the whole.
AMD has already implemented DirectX 11 in its F... (Score:2)
AMD has already implemented DirectX 11 in its Fusion low-power chips.
As has Nvidia in the GTX 400 series [wikipedia.org].
Re: (Score:3)
The GTX 400 isn't integrated onto a CPU, which I think was the point.
I hope .... (Score:2)
is this the best use of die space & RAM bandwidth? (Score:3)
The GPU on Sandy Bridge consumes die area approximately equivalent to two CPU cores [bit-tech.net].
Unified memory architecture is an elegant thing, but it does require storing the framebuffer in main memory. At 1920x1080 with 32-bit color, the framebuffer is about 8 MiB. Refreshing it at 60Hz requires roughly 0.5 GiB/s of memory bandwidth, and that bandwidth is consumed 100% of the time. Incidentally, on my old SGI O2 R10k it surprised me to find that algorithms touching only the CPU and memory ran a third slower at maximum resolution than at 800x600. That was not a happy discovery given that the machine cost $19,995 and was meant to excel at graphics.
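The back-of-the-envelope arithmetic, for reference (scan-out only; double-buffering and compositing would add to it):

    1920 \times 1080 \times 4\,\text{B} \approx 7.9\,\text{MiB}
    7.9\,\text{MiB} \times 60\,\text{Hz} \approx 475\,\text{MiB/s} \approx 0.46\,\text{GiB/s}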
I realize that Intel GMA is not meant to excel at anything at all save for ripping some additional cash from my hand, but there's no need to integrate brain-damaged graphics or wireless to achieve this. I would gladly pay for additional L3 cache or another CPU core or two.
Re: (Score:2)
And what use is it when a bug is found in DirectX? You can change software, but hardware?
Well, considering DX11 has been out for a while and has been generally tested for bugs already - the idea is that you won't HAVE a bug if it's in the hardware - there's nowhere for the variables to change values based on a different CPU build or other factors if the calculations are specifically designed to run on that piece of hardware. At least, that's the theory.
But yeah - this does nothing if you typically aren't running Windows. Though I'm more concerned about what this will do to the future of DirectX. Wh
DirectX isn't open (Score:2)
OpenGL has to please a large group with more uses than just games; it is developed with input from the wide range of developers that use it. It's open, more democratic.
The DirectX dictatorship is faster and likely more efficient (in a way), but it comes at a price that wiser people are not willing to pay.
I'll take slow freedom.
If they could do everything ass-backwards without a speed loss just to make it extremely hard to port to/from OpenGL, DirectX would do that. If they really just wanted to move faster, they co
Re: (Score:2)
the idea is that you won't HAVE a bug if it's in the hardware
I can tell you've never developed graphics hardware or drivers... I'm sure the people I know who do that will be glad to know that they won't have to work around chip bugs anymore.
Re: (Score:2)
I've worked with DirectX at a low level a bit, but no, I've never actually developed the hardware or the drivers for such devices.
What I was getting at is that if the chip is designed specifically for DirectX 11, you shouldn't have DirectX 11 bugs. Yes, chip bugs definitely do exist, but I would think (though I have no proof) that when a piece of hardware is designed for a specific task, it generally performs that one task better and has issues elsewhere.
Re: (Score:2)
It's not really hard-wired hardware these days. The graphics chip runs code which is uploaded when the machine boots. Fixing a bug is usually just a driver update.
Re: (Score:2)
ahhh... No.
DirectX has certain hardware requirements. They are not going to hardwire in DirectX but will instead support all the hardware features that DirectX 11 needs.
I hope they support OpenCL as well.
I am not a gamer, but I would love to see more programs use the GPU for transcoding and other non-gameplay uses.
DX11 does support GPGPU, but I use OS X, Linux, and Windows, so I want standards support.
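For what it's worth, the cross-platform check is small. A minimal sketch, assuming an OpenCL SDK is installed; the same few calls work on OS X, Linux, and Windows:

    #include <cstdio>
    #include <CL/cl.h>   // on Mac OS X this would be <OpenCL/opencl.h>

    // Minimal sketch: look for an OpenCL-capable GPU, the kind of check a
    // cross-platform transcoder could run before offloading work to the GPU.
    int main()
    {
        cl_platform_id platform;
        cl_uint numPlatforms = 0;
        if (clGetPlatformIDs(1, &platform, &numPlatforms) != CL_SUCCESS || numPlatforms == 0) {
            std::printf("No OpenCL platform found\n");
            return 1;
        }

        cl_device_id device;
        cl_uint numDevices = 0;
        if (clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, &numDevices) != CL_SUCCESS
            || numDevices == 0) {
            std::printf("No GPU exposed through OpenCL\n");
            return 1;
        }

        char name[256] = {0};
        clGetDeviceInfo(device, CL_DEVICE_NAME, sizeof(name), name, nullptr);
        std::printf("OpenCL GPU: %s\n", name);
        return 0;
    }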
Re: (Score:2)
It's not what you think. It's a built-in graphics card on the CPU. That graphics card has all the hardware necessary to support the DirectX 11 API. If they change the DirectX API, Intel changes the driver.
Re: (Score:3)
So why not do it generically? IBM Cell chips integrate a vector unit on the CPU. Intel and AMD both have video chips integrated into the CPU. So why not integrate a vector co-processor, like the old AltiVec of PPC?
Why not use a generic chip designed for that type of instruction set? That way you're not limited by software versions for your hardware.
Re:Hard-wired DirectX? (Score:5, Informative)
So why not do it generically? IBM Cell chips integrate a vector unit on the CPU. Intel and AMD both have video chips integrated into the CPU. So why not integrate a vector co-processor, like the old AltiVec of PPC?
Why not use a generic chip designed for that type of instruction set? That way you're not limited by software versions for your hardware.
Because sufficiently generic hardware is not sufficiently fast at the desired task, graphics computation. Even with the optimization Intel has put into this, they'll be MORE than an order of magnitude behind the dedicated solutions of their competitors in graphics performance.
Re: (Score:2)
The chip's instruction set will be designed around the shading languages used in 3D graphics, it won't be very generic.
Re: (Score:2)
Yeah exactly ... it wasn't at all clear how 'generic' the grandparent wanted ... so I actually replied twice depending on which level of generic they wanted.
Re: (Score:2)
Actually, on rereading your post ... I think it may actually meet your definition. It isn't hard-wired for dx11. There will be a driver. That driver can be modified/optimized later. The hardware is, in fact, generic graphics hardware, at least in the sense I think you mean.
Re: (Score:2)
because DirectX sounds cooler to marketing?
Re: (Score:2)
Use Linux?
http://intellinuxgraphics.org/ [intellinuxgraphics.org]
All Intel drivers are open source on Linux. I have no idea about code quality or upkeep, so I will say nothing except that I know they update them regularly.
What other kind of DirectX do you think there is? (Score:3)
Do you know some other way to do it? All graphics cards incorporate "hard-wired DirectX". If you are going to have graphics accelerators, they have to accelerate graphics. You can't meaningfully accelerate blits to frame buffers any faster than they already are. You have to accelerate higher level graphics abstractions. That's all DirectX is - an abstraction of higher level graphics operations. Any software, such as OpenGL, can (and does) tap into the more well chosen of those abstractions.
Re:Hard-wired DirectX? (Score:4, Insightful)
Worse, what happens when DirectX 12 comes along? Is the hardware useless? Can the hardware be upgraded?
1) The same thing that happens when you install DirectX 10 on a DX9 card: the DX9 subset of DX10 is hardware accelerated, the DX10 parts are run in software.
2) No. It's not useless. It will still accelerate everything it was accelerating before.
3) Probably not. But who cares? Either replace it, or live with a subset of current functionality.
Re: (Score:3)
What happens to your nvidia 580 card when dx 12 comes along? Exactly the same thing happens with these cpus. Either you live with the reduced functionality, or you put in a new video card, assuming your motherboard has a graphics card slot.
Re: (Score:2)
For the foreseeable future you can have your pick of ARM and x86.
On the plus side, x86 has been pretty much RISC internally for a long time now, and a lot of the ISA has been changed over too. Once they tack on one or two more ISA extensions, you'll be able to have 100% of your code avoid the x86 path.
Re: (Score:2)
Nvidia is making ARM CPUs.
The next version of Windows will run on ARM.
So, yes.
And if you're a Linux zealot, you can compile your kernel for whatever target hardware you want.
Re:RISC please (Score:5, Insightful)
POWER is fast and has an excellent power/performance ratio, but entry-level systems cost ~$3500 after discounts.
Itanium is fast, but expensive and power-hungry.
MIPS is fast and power-efficient, but none of the players in the high-performance MIPS market have any interest in anything but network processors.
SPARC gives you two options - SPARC64 (slow, expensive, power-inefficient) and SPARC T-series (fast, but only for throughput-driven workloads; expensive; fairly power-hungry)
ARM has good power and price characteristics, but is slow compared to any production x86 chip except the Atoms and ULV stuff.
Basically, I'm not seeing a credible alternative to x86 for the market that it thrives in. If you want to pay up and get a nice fast RISC system, they're out there; alternatively, if you want a somewhat slower one for cheap, ARM is always available.
Re: (Score:3)
It really depends on what you mean. If you mean strict RISC, it was too late the day the term was coined. If, OTOH, you mean a nearly orthogonal architecture that is general purpose (plus the ability to call on specialized functions from attached processor chips), that seems, to me, a real possibility.
Before you jump, though, you must decide on what is the longest word size your computer will address and what is the smallest unit it will address. The larger (and the smaller) you go, the harder the task wi
Re: (Score:2)
In what way do you mean?
Putting graphics processing in HW instead of doing it in SW is always better, and Intel currently rules in HW speed for mainstream chips.
So it's hard to tell what you're saying.
Re: (Score:2)
DX11 titles are so high-end that no one would find them playable with the capabilities of Intel HW. Intel HW indeed rules integrated graphics (until Fusion is on the street), but no one plays high-end DX10 titles, much less DX11 titles, on such hardware. So why bother implementing DX11 at all (instead of, for example, making DX10 faster, possibly enough faster to play high-end DX10 titles), when it won't be usably fast for any actual DX11 software? The answer of course is marketing.
Re: (Score:2)
If you're buying high-end software, why are you expecting to play it on low-end hardware?
Integrated GPU/CPU will always be lower performance than discrete. If you want bleeding-edge, open your wallet.
Re: (Score:2)
Precisely. So why is Intel bothering to support DX11? That's high-end only, and won't be playable on their hardware, even though it's "supported".
Re: (Score:2)
Because things change, and DX11 will soon enough be the low end.
Re: (Score:2)
Then (if it weren't for marketing) maybe it would make sense to implement DirectX 11 in the next generation, or the one after that, when they can actually make DirectX 11 content usable.
Re: (Score:2)
This is the next generation.
DX11 has been out for over a year.
Next year DX12 will be the meme.
Re: (Score:2)
If you're buying high-end software, why are you expecting to play it on low-end hardware?
What's the point of supporting DX11 if the game is unplayable?
My laptop's graphics card supports DX10, but if I enable the DX10 engine in any game I own that has one then the frame rate halves. So why bother?
Re: (Score:2)
You shouldn't bother paying for something that doesn't work for you. If you bought that laptop for the DX10 you should return it and get one that works.
Re:DirectX who? (Score:5, Informative)
Re: (Score:2)
None of these chips execute "Direct3D" or "OpenGL" directly; they remap the functions to an internal 3D API.
OpenGL and Direct3D do mostly the same things so it's not much of a hardship for the driver writers.