Intel To Integrate DirectX 11 In Ivy Bridge Chips

angry tapir writes "Intel will integrate DirectX 11 graphics technology in its next generation of laptop and desktop chips based on the Ivy Bridge architecture, a company executive revealed at CES. AMD has already implemented DirectX 11 in its Fusion low-power chips. Intel expects to start shipping Ivy Bridge chips with DirectX 11 support to PC makers late this year. Ivy Bridge will succeed the recently announced Core i3, i5, and i7 chips, which are based on Intel's Sandy Bridge microarchitecture."

  • by Anonymous Coward on Monday January 10, 2011 @12:19PM (#34824864)

    Does it still contain the DRM restriction capability?

    Because Intel can forget all about CPU sales from us and from any of our customers until it's removed.

    I don't care if it promises a free pony.
    Contains DRM == no sale.

    Period.

    • Re: (Score:2, Funny)

      by fnj ( 64210 )

      What the heck are you babbling about? Do you have the slightest idea?

      • by supersloshy ( 1273442 ) on Monday January 10, 2011 @12:46PM (#34825218)

        What the heck are you babbling about? Do you have the slightest idea?

        I believe he's babbling about this [techdirt.com]. Sandy Bridge will have DRM in it (though they don't call it that, for some weird reason), and Sandy Bridge is directly related to Ivy Bridge [wikimedia.org], so it could well inherit Sandy Bridge's DRM features.

        Disclaimer: I am a total n00b when it comes to discussing processor architectures, so I could be wrong about something.

        • by fnj ( 64210 ) on Monday January 10, 2011 @01:08PM (#34825472)

          At least that is a coherent discussion, which I haven't seen elsewhere. But when idiots talk about DRM, they lose contact with reality. Content producers want true end-to-end DRM, for obvious reasons, and this just gives them a way to realize it. It can't encumber anything that presently exists; it just allows some new DRM'ed protocol to be developed, one that only works on recent Intel processors.

          So what? If you don't like closed content, just don't use it!

          • by julesh ( 229690 ) on Monday January 10, 2011 @01:32PM (#34825792)

            So what? If you don't like closed content, just don't use it!

            Widespread deployment of systems that allow closed content is likely to encourage content providers who currently release content on unprotected or insecure systems to switch to a more secure closed system. This reduces the utility of open source software, which is almost universally unable to take advantage of this kind of system, because the protection measures typically require signed, trusted code. Hence, it is something that should be discouraged.

            That said, boycotting closed media is likely to be just as effective as boycotting hardware that supports it; probably more so, as it is somewhat more direct.

          • So what? If you don't like closed content, just don't use it!

            And if you don't like the CPUs that support the creation of the closed content, just don't buy them!

          • So what? If you don't like closed content, just don't use it!

            That's exactly what he said he was going to do, so it seems you're the one who's babbling.

          • So what? If you don't like closed content, just don't use it!

            That only works if you dislike closed content for purely selfish reasons.
            If you believe, as many do, that DRM is inherently bad for society in general, then it is important to go far beyond simply avoiding it yourself. It is necessary to convince as many others as possible of the problems DRM creates for us all.

      • He's babbling about DRM.

        What that has to do with this Intel chip, I don't know. But at least I have a SLIGHT idea what he's ranting about.

        • Anything with an HDMI output has to support DRM (HDCP) so people can't record the signal.

          (We have the master key, so, yes, it's a waste of time, but Intel is contractually bound to support the DRM if they want to have HDMI output.)

          • by fnj ( 64210 )

            It's ironic that no one ever had the slightest intention of trying to record a digital monitor signal anyway. The very idea is insane. HDMI is rated at 10.2 gigabits per second. That's 76.5 gigabytes per MINUTE! (Quick check below.) Anybody who has a clue is more interested in decrypting the Blu-ray files (quite a trick, but that genie is decidedly out of the bottle).

            Or you can just attach an HDFury2 to the HDMI and pipe the resulting component video into a Hauppauge HD PVR.
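
            For reference, a quick check of that arithmetic (a minimal Python sketch; the 10.2 Gbit/s figure is the HDMI rate quoted above):

                # Convert HDMI's quoted 10.2 Gbit/s maximum rate into gigabytes per minute.
                HDMI_GBIT_PER_S = 10.2

                bytes_per_second = HDMI_GBIT_PER_S * 1e9 / 8        # bits -> bytes
                gigabytes_per_minute = bytes_per_second * 60 / 1e9  # per minute, in GB

                print(f"{gigabytes_per_minute:.1f} GB per minute")  # prints: 76.5 GB per minute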

  • Other OSes? (Score:5, Interesting)

    by SirGeek ( 120712 ) <sirgeek-slashdot.mrsucko@org> on Monday January 10, 2011 @12:19PM (#34824866) Homepage
    Will Intel provide documentation so that other OSes will be able to make use of this feature?
    • Re:Other OSes? (Score:4, Insightful)

      by Surt ( 22457 ) on Monday January 10, 2011 @12:40PM (#34825144) Homepage Journal

      Almost certainly. They want to sell hardware, and, being a full generation or more behind their competitors, they have no reason to hold back any secrets of their implementation.

      • Almost certainly. They want to sell hardware, and, being a full generation or more behind their competitors, they have no reason to hold back any secrets of their implementation.

        Sure, just like the GMA 500.

    • Linux will get Ivy Bridge support directly from Intel, and with better timing than Sandy Bridge (whose Linux support was notoriously late): http://www.phoronix.com/scan.php?page=news_item&px=ODk3Nw [phoronix.com]
      • It's worth noting that Linux now has a long tradition of receiving Intel support first, because the code base is readily available for development, experimentation, and testing. So chances are, most any new feature is going to be implemented on Linux first.

  • DirectX (Score:4, Funny)

    by Anonymous Coward on Monday January 10, 2011 @12:21PM (#34824910)

    Goes to 11!

    (I'm sorry)

  • I'd rather they made their integrated graphics fast than simply supported new DirectX capabilities. I don't really see the point of supporting certain features if the whole thing is going to be slow. I suppose it's easier to implement something than it is to implement it well.

    • The main point of Intel graphics is that it's cheap. If you want a barebones, low-graphics computer you buy integrated, which Intel develops regularly, mostly for use in laptops (where it adds the bonus of power savings).

      • Yes, but my understanding is you don't get that choice for some models of the CPU. For the current mobile i3, i5, and i7 series, the Intel GPU is integrated into the processor package. So if you get a new i7 and a discrete GPU, the Intel GPU is just disabled. Apple has done some work so that both the Intel GPU and the discrete one operate, switching on demand with the video requirements. Ivy Bridge's GPU will be integrated into the CPU die itself, not just the package.
    • Then you don't want Intel graphics. The point of their hardware is to be cheap: low power usage and small die size. Features are just engineering time, and that's something Intel has a lot of.
    • by blair1q ( 305137 )

      That's what "support" means when talking about graphics. Graphics processing is all about taking some piece of over-used software and putting it in hardware so that it consumes a few hundred picoseconds instead of a several dozen nanoseconds per iteration. It makes common algorithms run faster.

      DirectX is a standard for a set of common algorithms. It makes sense to implement as many of them in hardware as you can. DirectX 11 is merely the latest iteration of DirectX, and the first to get consideration as...

    • I suppose it's easier to implement something than it is to implement it well.

      80/20 rule.

    • I'd rather they made their integrated graphics fast than simply supported new DirectX capabilities. I don't really see the point of supporting certain features if the whole thing is going to be slow. I suppose it's easier to implement something than it is to implement it well.

      It will include DirectX 11 *and* theoretically be twice as fast as Sandy Bridge. Not much to complain about there.

      P.S. By "theoretically" I mean it will have twice as many stream processors.

    • by IYagami ( 136831 ) on Monday January 10, 2011 @12:53PM (#34825300)

      You can find Sandy Bridge GPU benchmarks at http://www.anandtech.com/show/4083/the-sandy-bridge-review-intel-core-i7-2600k-i5-2500k-core-i3-2100-tested/11 [anandtech.com]

      "Intel's HD Graphics 3000 makes today's $40-$50 discrete GPUs redundant. The problem there is we've never been happy with $40-$50 discrete GPUs for anything but HTPC use. What I really want to see from Ivy Bridge and beyond is the ability to compete with $70 GPUs. Give us that level of performance and then I'll be happy.

      The HD Graphics 2000 is not as impressive. It's generally faster than what we had with Clarkdale, but it's not exactly moving the industry forward. Intel should just do away with the 6 EU version, or at least give more desktop SKUs the 3000 GPU. The lack of DX11 is acceptable for SNB consumers but it's—again—not moving the industry forward. I believe Intel does want to take graphics seriously, but I need to see more going forward."

      Note: all Sandy Bridge laptop CPUs have Intel HD Graphics 3000.

      • Yet, you still need an i7 + Intel integrated graphics and an i7-compatible motherboard to get the performance of a ~$50 dedicated GPU. Pricewise, you could go with an AMD solution and a dedicated GPU in the $75-$100 range from Nvidia or AMD and still pay half as much for better 3D performance.

        The numbers look even worse for Intel if you grab an "off-the-shelf" dedicated GPU that's one generation older, e.g. a 1GB Radeon 4670 for ~$65.

        AMD also has Hybrid Graphics, first introduced with the Puma or Spider platform.
    • by TheTyrannyOfForcedRe ( 1186313 ) on Monday January 10, 2011 @12:58PM (#34825368)

      I'd rather they made their integrated graphics fast than simply supported new DirectX capabilities. I don't really see the point of supporting certain features if the whole thing is going to be slow. I suppose it's easier to implement something than it is to implement it well.

      Have you seen performance numbers for Sandy Bridge's on-chip graphics? The "Intel graphics are slow" meme is dead. Sandy Bridge's integrated GPU beats most discrete graphics cards under $50. The Ivy Bridge solution will be even faster.

      http://www.anandtech.com/show/4083/the-sandy-bridge-review-intel-core-i7-2600k-i5-2500k-core-i3-2100-tested/11 [anandtech.com]

      • by 0123456 ( 636235 ) on Monday January 10, 2011 @01:38PM (#34825868)

        The "Intel graphics are slow" meme is dead.

        For anyone who likes their games to run at 30fps at 1024x768 with low graphics settings. The rest of us find that kind of slow actually.

        • For anyone who likes their games to run at 30fps at 1024x768 with low graphics settings. The rest of us find that kind of slow actually.

          Which is exactly what 95% of people are quite happy with if it means they save $50.

        • by DRJlaw ( 946416 ) on Monday January 10, 2011 @03:33PM (#34827700)

          The "Intel graphics are slow" meme is dead.

          For anyone who likes their games to run at 30fps at 1024x768 with low graphics settings. The rest of us find that kind of slow actually.

          Do the "rest of us" constantly carp that Nvidia IGP graphics are slow, AMD IGP graphics are slow, and AMD Fusion graphics (will be) slow? Because this is what the GP was referencing. Nobody expects "built in" graphics to be comparable to high end discrete graphics. Performance comparable to the lesser Nvidia and AMD chips, e.g., AMD 5400 series, Nvidia 410 and 420 (possibly 430) series, is not considered slow by anyone except high end gamers. High end gamers buy discrete graphics cards (or specialized notebooks), period. The "rest of us" is broader than that. The "rest of us" includes business users, HTPC users, and casual gamers.

          GP didn't mention gamers. I'm not willing to pay more so that every CPU and/or motherboard is suitable for high end gaming. Your expectations are unrealistic. Good day.

        • by Kjella ( 173770 )

          Yes, yes, it's not exactly a gamer's GPU. It's not like Intel is going to include a top-end GPU on every CPU just in case you happen to need it, either. What Intel delivers on their IGP chips is typically the low bar of performance, like what you might get if you tried playing a game on a work laptop that obviously wasn't bought for gaming. That low bar is still quite low, but it's a lot higher than it used to be. A lot more older games will run with good performance. A lot of newer games are playable...

  • Two Questions (Score:4, Interesting)

    by chill ( 34294 ) on Monday January 10, 2011 @12:24PM (#34824960) Journal

    1. Will this in any way benefit OpenGL?

    2. Will this hinder future versions of DirectX, or are they backward compatible in a way that keeps large chunks in hardware, with new changes delivered as firmware revisions or software implementations?

    • by Surt ( 22457 )

      The hardware has all the features necessary to support DX11, and DX11 is generally a superset of what OpenGL can do. So yes, OpenGL should be fully supported, assuming someone writes the driver.

      • I read that Intel's drivers are notoriously shite for OpenGL. Indeed, my own experimentation showed them to be shite at D3D as well. The device I was using claimed to support PS 3.0 (in its caps), but point-blank refused to run some of my shaders (they ran ok with ATI and NVIDIA cards). I won't be supporting Intel Graphics, that's for sure.
        • by Surt ( 22457 )

          Yeah, that's exactly why I had to put in the qualifier about the driver, unfortunately.

    • by Tr3vin ( 1220548 )
      In theory, OpenGL 4 could take advantage of the new hardware, but Intel would have to write good OpenGL drivers. Future versions of DirectX may require new hardware. We won't know until there is a spec. If it does require new hardware, then people would have to replace their DX11 cards anyway.
  • by TeknoHog ( 164938 ) on Monday January 10, 2011 @12:27PM (#34824992) Homepage Journal
    FTA:

    The Sandy Bridge chips are the first in which Intel has combined a graphics processor and CPU on a single piece of silicon.

    I thought Intel already did this a while ago with the newer Atom chips:

    http://en.wikipedia.org/wiki/Intel_atom#Second_generation_cores [wikipedia.org]

    • by Surt ( 22457 )

      I'm sure the article meant the mainstream x86 line but failed to say so. Or, more likely, it was written by someone who doesn't care about the platforms Atom is aimed at, and therefore didn't know.

    • by blair1q ( 305137 )

      They had. The news here is that (more of) the DirectX 11 API will be in hardware.

  • Great! (Score:5, Funny)

    by TechyImmigrant ( 175943 ) * on Monday January 10, 2011 @12:29PM (#34825000) Homepage Journal

    Those new texture mapping algorithms will really make Outlook load fast.

    • by Surt ( 22457 )

      The 3D text mode in Outlook 2012 is pretty cool. The words are practically poking you in the eyeballs!

      • Cool! Using Outlook always felt like someone was poking me in the eye. Now maybe others will be able to relate.

    • I love the way it bump-mapped the bumped post on 4chan.

    • They actually may, seeing that the entire GUI frontend of EVERYTHING in Vista and Windows 7 is basically drawn by a multithreaded version of Direct3D. Those "reflections" on the edges of the window frame? They're textures. And textures require mapping.

  • by digitaldc ( 879047 ) * on Monday January 10, 2011 @12:31PM (#34825032)
    That's what I am worried about: I want my Minecraft landscapes to be rendered better.
    • by Surt ( 22457 )

      No. That's a problem in the Minecraft client, not in the hardware that displays it.

      • by kyz ( 225372 )

        Minecraft uses LWJGL, the Lightweight Java Game Library, which in turn uses OpenGL.

        A better graphics card, or a better graphics driver, will render Minecraft better.

        • by Surt ( 22457 )

          Not unless Minecraft improves the features it is using. It's a really primitive design; there's almost no way any existing card isn't already rendering what Minecraft puts out at maximum quality.

        • I suspect that little optimization is done on the world geometry, so each cube gets two tris per (visible) face, even if it's part of a larger flat surface (see the rough count below).
          So no, I don't think a new graphics card will help that much. (You should play Cube 2: Sauerbraten [sauerbraten.org], anyway.)
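
          As a rough illustration of that point, a back-of-the-envelope count (a minimal Python sketch; the 16x16x128 chunk size is classic Minecraft's, and the no-merging worst case is a hypothetical assumption, not Minecraft's actual renderer):

              # Upper bound on triangles for just the outer hull of one solid
              # 16x16x128 chunk, if every visible cube face is drawn as two
              # triangles and coplanar faces are never merged.
              cx, cy, cz = 16, 128, 16  # chunk dimensions in blocks

              hull_faces = 2 * (cx * cy + cy * cz + cz * cx)  # surface squares
              triangles = 2 * hull_faces                      # two tris per face

              print(f"{triangles} triangles for one chunk's hull alone")  # 17408
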
  • AMD has already implemented DirectX 11 in its Fusion low-power chips.

    As has Nvidia in its GTX 400 series [wikipedia.org].

  • ... Intel will make it easy to re-flash the chipsets when DirectX 12 comes out. Or to install OpenGL firmware instead.
  • by hxnwix ( 652290 ) on Monday January 10, 2011 @03:53PM (#34827992) Journal

    The GPU on Sandy Bridge consumes die area approximately equivalent to two CPU cores [bit-tech.net].

    Unified memory architecture is an elegant thing, but it does require storing the framebuffer in main memory. At 1920x1080 with 32-bit color, the framebuffer is close to 8MiB (about 64 megabits). Refreshed at 60Hz, simply scanning it out consumes roughly 0.5GiB/s of memory bandwidth, 100% of the time (see the sketch below). Incidentally, I recall that on my old SGI O2 R10k, it surprised me to find that algorithms touching only the CPU and memory ran a third slower at maximum resolution than at 800x600. That was not a happy discovery given that the machine cost $19,995 and was meant to excel at graphics.

    I realize that Intel GMA is not meant to excel at anything at all save for ripping some additional cash from my hand, but there's no need to integrate brain-damaged graphics or wireless to achieve this. I would gladly pay for additional L3 cache or another CPU core or two.
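
    To make the arithmetic above concrete, a minimal Python sketch using the figures from this comment:

        # Framebuffer size and scan-out bandwidth for a unified memory design.
        width, height = 1920, 1080
        bytes_per_pixel = 4   # 32-bit color
        refresh_hz = 60

        frame_bytes = width * height * bytes_per_pixel
        scanout_bytes_per_s = frame_bytes * refresh_hz

        print(f"framebuffer: {frame_bytes / 2**20:.1f} MiB")           # ~7.9 MiB
        print(f"scan-out:   {scanout_bytes_per_s / 2**30:.2f} GiB/s")  # ~0.46 GiB/s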
