Intel Hardware

Intel Discloses Its Forthcoming Discrete GPU Strategy and Design Efforts (hothardware.com) 92

MojoKid writes: Intel has been uncharacteristically vocal about its most recent plans to enter the discrete GPU market. Over the last year or so, the company has disclosed a few morsels of information and made some high-profile hires in its bid to build up and flesh out its latest discrete GPU plans. This week, Intel sat down with HotHardware, offering the opportunity to chat with Ari Rauch, Vice President of the Core and Visual Computing Group at Intel, to discuss what makes this most recent endeavor different from the company's previous, now-discontinued attempts in the discrete GPU space. As a follow-up, HotHardware also solicited questions from its readership to put to Intel about its upcoming GPU plans, compiling the responses in a Q&A format.

In short, this isn't Larrabee 2.0, not by a long shot. Intel is gearing up for a traditional GPU architecture design, coupled with some of the company's own strategic IP, to help differentiate its products. Further, Rauch noted Intel "will bring discrete GPUs to both client and data center segments aiming at delivering the best quality and experiences across the board including gaming, content creation, and enterprise. These products will see first availability over a period of time, beginning in 2020."

When questioned about Intel's current silicon fabrication hiccups and delays, and how they might affect the company's ability to execute in this highly competitive space, Rauch noted, "we feel very confident about our product roadmap across software, architecture, and manufacturing." Based on some of the responses to product-positioning questions, it also appears Intel is gearing up to address all performance envelopes, from entry-level to midrange and high-end graphics cards.

This discussion has been archived. No new comments can be posted.

  • by mentil ( 1748130 ) on Sunday December 02, 2018 @05:02AM (#57735526)

    Skimmed the article. The opening material is almost all stuff that's been previously revealed, or is obvious. The Q&A session is painfully vague, noncommittal PR-speak.

    I want to know if it's going to support DXR (DirectX Raytracing) or how many generations of architectures they're committing to. If they buy up another promising game and then shut it down like they did when they cancelled Larrabee, I'll be peeved.

    • by _merlin ( 160982 ) on Sunday December 02, 2018 @05:06AM (#57735536) Homepage Journal

      I'll believe Intel can build a discrete GPU worth buying when I see it. Every attempt so far has been flawed (Real3D i740 starved for texture bandwidth), weak (Silicon Image GMA950 with terrible performance and even worse drivers on Windows), or vapourware (Larrabee). There's no indication that it'll be different this time.

      • Indeed. Every time I hear news about Intel and GPUs I think about this Santa comic [dvhardware.net]

      • There's no indication that it'll be different this time.

        This! Intel has nothing going for it right now. They have shown no ability to innovate in the CPU market, they have shown themselves capable only of buying up another company's technology and bringing it to market amid mixed messages and frankly broken promises (Optane), and their history in discrete graphics is a disaster.

        They don't deserve any benefit of the doubt. They deserve only skepticism.

        • What promises did Optane break? It seems to be exactly where everyone believed it would be. Much lower latency than SSD, at a price point somewhere between SSD and DRAM prices. Every benchmark I've seen shows that is exactly where it is.

          • by mentil ( 1748130 )

            They had to walk back their endurance claims by an order of magnitude, which put certain applications of the tech into question. The rollout was also much slower, at a lower density, than expected, even after delays. It was hyped/implied to be a replacement for DRAM and NAND but has drawbacks that don't let it completely replace either.

          • What promises did Optane break? It seems to be exactly where everyone believed it would be.

            It's exactly what everyone believed it would be (a fast technology for high-end SSDs). Just not what Intel said it would be (the end of DRAM as we know it) when Optane was in the same state as this GPU announcement. Hell they even market it as "Optane Memory". The fact that everyone called out their bullshit at the time and it has proven to be exactly what we thought doesn't change this.

            I didn't say it's a bad product without a purpose. I just said it's not what Intel promised in their useless marketing re

            • Can't believe I have to explain this on slashdot, but... Optane IS memory. So is NAND, your old spinning rust drive, digital tapes, CDs, and blurays. In fact, not only is it memory, but it's also RAM (random access memory). But so is everything listed above with the exception of digital tapes. If you believe the only thing that qualifies something to be memory is DRAM, then your definition is simply wrong.

              It really could replace DRAM -- if they can get the endurance back up -- it's not bad endurance, ju

              • Can't believe I have to explain this on slashdot, but... Optane IS memory.

                Maybe you should learn what marketing is and what people need to understand. Yes, Optane and NAND are memory. Now why do you think that NAND isn't sold as "memory" to consumers? Why do you think Optane is?

                Your splitting hairs over definitions is as dishonest as Intel's marketing division. It's the same kind of marketing decision that has almost directly caused the current trend of advertising laptops with 24GB of "memory" (8GB of RAM; I'll leave it as an exercise for you to guess the rest). There's a time to split hairs (Tes

                • Now why do you think that NAND isn't sold as "memory" to consumers?

                  Let's stop right there, because NAND *IS* sold as memory to consumers. For example, let's take one of the largest suppliers of consumer-facing NAND flash products today... Kingston. And here is an article by them: https://www.kingston.com/us/co... [kingston.com]. "Here's a quick primer on what you need to know about NAND Flash memory."

                  Here is the wikipedia article on NAND: https://en.wikipedia.org/wiki/... [wikipedia.org].

                  Cameras use "Memory Sticks" -- all based on flash memory.

                  I could sit here all day and google marketing pres

                  • Let me stop you right there again.
                    https://www.newegg.com/Product... [newegg.com]
                    https://www.newegg.com/Product... [newegg.com]

                    Or why not let the SIs speak for you: https://store.hp.com/us/en/cv/... [hp.com]

                    Considering that I understood the marketing and you did not

                    To channel my inner Trump: WRONG! You have clearly failed to understand the marketing. Good work finding a detailed description of NAND and ignoring the information that is most front and center to consumers. You're still splitting hairs trying to save your horrible interpretation of the situation while you continue to ignore the ACTUAL M

                    • *sigh*

                      Please read your own darn links. They refer to it as memory. If you aren't going to call it memory, which it is, what exactly would YOU call it? "thing that stores stuff for a computer, but isn't memory"?

                      Good work finding a detailed description of NAND and ignoring the information that is most front and center to consumers.

                      I don't have to when you send them to me.

                      You tried to discredit a promise Intel made in its marketing material (which didn't make sense and failed to deliver) with ... a promise made in Intel marketing material.

                      Well, it's not really just marketing material when you can buy it and do it yourself. Granted, with the P4800X it's a software hypervisor (by a 3rd party) running with your OS as a client, balancing requests between DRAM and Optane, but to the OS and the appl

                    • "Kingston A1000 M.2 2280 480GB PCI-Express 3.0 x2 3D TLC Internal Solid State Drive (SSD) SA1000M8/480G"
                      "Intel Optane M.2 2280 32GB PCIe NVMe 3.0 x2 Memory Module/System Accelerator MEMPEK1W032GAXT"

                      One has a product name (the single most important component of marketing) that calls it an SSD; the other has a product name that calls it a memory module.

                      I'm done. You've displayed enough ignorance for one day.

      • by Z80a ( 971949 )

        The Intel HD line was quite OK, especially the Iris Pro stuff.
        At least the Intel chips now manage to boot and run games, unlike the GMA line.

      • by Agripa ( 139780 )

        I'll believe Intel can build a discrete GPU worth buying when I see it. Every attempt so far has been flawed (Real3D i740 starved for texture bandwidth), weak (Silicon Image GMA950 with terrible performance and even worse drivers on Windows), or vapourware (Larrabee). There's no indication that it'll be different this time.

        Larrabee was not exactly vaporware, but it is worth considering why (or whether) it failed. I suspect the development of ISPC, detailed in the links below, may point to what Intel has in mind.

        https://pharr.org/matt/blog/20... [pharr.org]
        http://tomforsyth1000.github.i... [github.io]

    • Of course there are no details, because this is a vaporware press release to push up the stock price. They have developed no IP to do the things needed to compete with Nvidia or AMD in the next 2-3 years. This is laughable.
      • by Rockoon ( 1252108 ) on Sunday December 02, 2018 @07:15AM (#57735796)

        They have developed no IP to do the things needed to compete with Nvidia or AMD in the next 2-3 years.

        IP doesn't always have to be developed by a particular company. See Intel's latest deal with AMD for integrated graphics.

        I have been saying here for a couple of years now that Intel is in very serious trouble, especially after those layoffs and the PR announcement of a "cloud strategy." The first key point is that it's taken several years before it became obvious to most (even here) that Intel is in any trouble at all. The second key point is that Intel knew years ago that it was in big trouble.

        Intel's biggest problem is that their vertical integration has really constrained them. Silicon (not just CPUs) doesn't leave an Intel plant without being branded Intel. They have older fabs that sit idle because they won't sell time on them, and newer fabs that even at 100% capacity can't satisfy demand. The latter wouldn't be a problem if Intel were the only source for a particular component, since raising prices would reduce demand, but the reality is that their competitors in total have far more capacity than they do.

        It is because of all this that a company like AMD would trade some IP to Intel. It doesn't fix Intel's fundamental and now unfixable problem, which is that they will never be the market leader again, never steer the markets they partake in. From here on out they can only react to what other market players are doing.

        On the desktop processor side, Intel was blindsided by the economics of AMD's chiplets, and they are still at least several years from an effective design. It isn't just about small dies on a single processor package; it's about being wholly modular. The same chiplets that Threadripper uses are also used by AMD's low-end Ryzen APUs.

        Intel does have some "chiplet" experience, but that too was a reaction to being blindsided when their main competitor introduced multi-core to the consumer. It was a hack they should have explored further, but didn't.

        On the fabrication side, Intel is now dwarfed by the rent-a-fab market's capacity in its entirety, and even individual rent-a-fabs are now overtaking them in capacity.

        I've said it before and I'll say it again: sell your Intel stock. Even if you aren't directly in the stock market, check your 401(k)s and Roth IRAs. They might be able to avoid becoming a Motorola, but even if they do, it's still bad. Very bad. Intel is fucked.

          IP doesn't always have to be developed by a particular company.

          Pretty sure by "IP" they meant "actual designs," not just patents.

        • by Agripa ( 139780 )

          Intel's biggest problem is that their vertical integration has really constrained them.

          Based on things Bob Colwell has said in his book and elsewhere, I think Intel's problem is management, which was turning toxic before he left. What I have read about the failure of Larrabee and the i960 indicates the same thing.

          Intel did not need effective management while the x86 train was paying the bills, but when that train slows down, I expect a panic that the older Intel under Andy Grove, which moved from memory to microprocessors, could have handled.

          Having to rely on Microsoft does not help.

    • Raytracing for sure. That's still the hottest area of rendering after all these years, even though it's soooo computationally complex.
      • by mikael ( 484 )

        It's quite simple. You take your scene and slice and dice it into triangles -- even parametric surfaces like NURBS, the subdivision surfaces used by Pixar, and 3D models from 3ds Max, Maya, or Blender. All of that gets converted into textures, material shaders and a geometry mesh. The geometry mesh gets chopped up into a hierarchical bounding volume such as a kd-tree. All of this can be stored in a data format loaded straight into the GPU or CPU cache. It's all vectors, matrices and parametric coordinates. Separate processors ar
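
        As a rough, hypothetical illustration of the "it's all vectors and parametric coordinates" point above (not something from the article or the comment), here is a minimal Möller-Trumbore ray/triangle intersection test in Python. A real renderer would walk the bounding-volume hierarchy or kd-tree mentioned above rather than testing every triangle; all names here are made up for the sketch.

            import numpy as np

            def ray_triangle_intersect(origin, direction, v0, v1, v2, eps=1e-8):
                """Return the hit distance t along the ray, or None on a miss."""
                e1, e2 = v1 - v0, v2 - v0
                pvec = np.cross(direction, e2)
                det = np.dot(e1, pvec)
                if abs(det) < eps:              # ray is parallel to the triangle plane
                    return None
                inv_det = 1.0 / det
                tvec = origin - v0
                u = np.dot(tvec, pvec) * inv_det
                if u < 0.0 or u > 1.0:
                    return None
                qvec = np.cross(tvec, e1)
                v = np.dot(direction, qvec) * inv_det
                if v < 0.0 or u + v > 1.0:
                    return None
                t = np.dot(e2, qvec) * inv_det
                return t if t > eps else None

            # Unit triangle in the z=0 plane, ray shooting straight down at it.
            tri = (np.array([0.0, 0.0, 0.0]),
                   np.array([1.0, 0.0, 0.0]),
                   np.array([0.0, 1.0, 0.0]))
            print(ray_triangle_intersect(np.array([0.25, 0.25, 1.0]),
                                         np.array([0.0, 0.0, -1.0]), *tri))  # prints 1.0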

        • by Rockoon ( 1252108 ) on Sunday December 02, 2018 @03:32PM (#57737450)
          It's all fun and games when the scene data is static.

          What has prevented RTRT (real-time ray tracing) all this time has been dynamic scene data -- things moving around. The acceleration structures, like space-partitioning trees, are either too expensive to generate in real time or sacrifice too much trying to deal with it.

          Remember that GPUs have high memory bandwidth but absolutely terrible memory latency, unlike CPUs, where the opposite is true. Intel learned that the issue remained insurmountable with their Larrabee failure: they could build those acceleration structures quickly on Larrabee, but when it came time to render, the lack of memory bandwidth became the killer.

          As far as I know there is still no acceptable solution. nVidia is claiming they have it, but in practice they are still mainly rasterizing.
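
          To make the rebuild-versus-refit tradeoff concrete, here is a small hypothetical Python sketch (not Intel's, NVIDIA's, or anyone's actual implementation): a full BVH rebuild re-sorts the geometry every frame and is expensive, while a refit keeps the tree shape and only recomputes bounding boxes, which is cheap but lets the boxes grow loose as things move, so traversal culls less.

              import numpy as np

              class BVHNode:
                  def __init__(self, lo, hi, left=None, right=None, tri=None):
                      self.lo, self.hi = lo, hi        # AABB min/max corners
                      self.left, self.right = left, right
                      self.tri = tri                   # leaf: index of a triangle

              def build(tris, idx):
                  """Full rebuild: median split on the longest axis (costly)."""
                  lo = tris[idx].min(axis=(0, 1))
                  hi = tris[idx].max(axis=(0, 1))
                  if len(idx) == 1:
                      return BVHNode(lo, hi, tri=int(idx[0]))
                  centers = tris[idx].mean(axis=1)
                  axis = int(np.argmax(hi - lo))
                  order = idx[np.argsort(centers[:, axis])]
                  mid = len(order) // 2
                  return BVHNode(lo, hi, build(tris, order[:mid]), build(tris, order[mid:]))

              def refit(node, tris):
                  """Cheap per-frame update: keep topology, recompute boxes bottom-up."""
                  if node.tri is not None:
                      node.lo = tris[node.tri].min(axis=0)
                      node.hi = tris[node.tri].max(axis=0)
                  else:
                      refit(node.left, tris)
                      refit(node.right, tris)
                      node.lo = np.minimum(node.left.lo, node.right.lo)
                      node.hi = np.maximum(node.left.hi, node.right.hi)

              rng = np.random.default_rng(0)
              tris = rng.random((64, 3, 3))                    # 64 random triangles
              root = build(tris, np.arange(64))                # expensive: do this rarely
              tris += rng.normal(scale=0.05, size=tris.shape)  # the scene animates...
              refit(root, tris)                                # fast, but the boxes loosen
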
    • by gl4ss ( 559668 )

      it's just the same thing they've been bullish about or whatever crap for 15 years.

      they can't fucking come up with a good, cost-efficient cpu nowadays and now we should expect them to actually come up with a fast gpu?

      I mean for 15 years they've been touting the same line of "oh in a year you won't need an extra gpu, ours will be just sooo fast!" and then it comes to the market and is like a budget gpu from 5 years before. literally that's what they do.

      another alternative is that it'll cost even more than a 2080 and

    • > If they buy up another promising game and then shut it down

      I assume you are talking about the intriguing Project Offset [youtu.be]?

      I never did understand Intel's logic in that. They aren't a game dev studio nor publisher. Were they hoping to showcase Intel's CPU and/or Larrabee performance "advantage" and then when that completely FAILED (compared to regular discrete GPUs) they canceled it?

      Or were they hoping to leverage their 2007 purchase of Havok (the game physics engine) when they bought Project Offset in 2008 [wikipedia.org]?

    • by AHuxley ( 892839 )
      How to make existing CPU sell as a new GPU.

      Find a lot of working and well understood CPU product.
      Spread out a lot of CPU hardware over a long GPU looking card. A long card to fit more CPU all the way along.
      Add powerful cooling and a new look to the brand.
      Many working CPU with an easy to support open source driver get sold as a powerful new look GPU.

      Show the world a ray tracing demo.
      That existing CPU design is sold at a new GPU price.

      Start thinking of the next generation.
      Add more memory and
    • If its sole purpose is added realism in gaming, then I think ray tracing will be almost like VR -- cool, but few people will really care about it. Current games have much larger gaps in realism elsewhere than in graphics, for example in the changing state of objects and in sound generation, not to mention AI. And that's assuming realism is the most important thing for a game to sell well.

      If the purpose is something else, I'd be curious what that is.

  • Fuck nVidia (Score:2, Insightful)

    by KiloByte ( 825081 )

    At least we know the drivers will be ok from day one. It's not humanly possible to suck more than current GPU makers.

    nVidia not only doesn't provide documentation for its cards, but even actively interferes with nouveau on its new cards (encrypting and signing crap). On every card, it's random whether their proprietary drivers or nouveau will work without crashing. The proprietary drivers are useless if you even dabble in kernel development -- they get ported to current kernels 0-6 months after a

    • I just built a new machine with a Vega 56, and it works perfectly on Linux. You just need a distro with a recent enough kernel that includes the amdgpu driver.
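
      (Illustrative aside, not from the comment: on most Linux systems you can see which kernel driver is actually bound to each GPU by following the "driver" symlink under sysfs. A rough Python sketch, assuming the standard /sys/class/drm layout:)

          import glob, os

          # Each /sys/class/drm/cardN/device is the PCI device backing a GPU; its
          # "driver" symlink points at the bound kernel driver, e.g. amdgpu, i915
          # or nouveau. No symlink means no driver is bound.
          for dev in sorted(glob.glob("/sys/class/drm/card[0-9]/device")):
              link = os.path.join(dev, "driver")
              if os.path.exists(link):
                  print(dev, "->", os.path.basename(os.path.realpath(link)))
              else:
                  print(dev, "-> no driver bound")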

    • At least we know the drivers will be ok from day one.

      I'm sure they will be of the same high^W quality as Intel's MELTDOWN and SPECTRE patches.

    • Despite the card's age, it worked perfectly

      No man. *Because* of the card's age it worked perfectly. You slot an 8-year-old card from any manufacturer into a Linux box and it works perfectly.

        • Not for nVidia -- they have already dropped drivers for any generation up to GeForce 500. And nouveau works adequately only for some cards. With nVidia's hostility to independent driver writers, it's a wonder nouveau is even in this state.

        • Are you implying that you can't get Linux working on a card of that generation? That would be news to many. Just because someone drops support for a driver doesn't mean that it doesn't work. This is even more certain in the Linux world than anywhere else, and kind of my point: Linux has phenomenal hardware support, but only in the long run.

    • AMD really seems committed to providing an open-source driver, but while the amdgpu driver provides good performance and supports even the most exotic features and the newest hardware, it is still full of bugs, and system crashes are anything but infrequent.

      When you read the commit messages of Intel drivers, you get the feeling that those who write those drivers know what they are doing, and just need to follow a proper, written down hardware specification.

      In contrast, if you read the commit messages of the "amdgpu
  • by IYagami ( 136831 ) on Sunday December 02, 2018 @06:41AM (#57735732)

    From the interview:

    Q: Will Intel’s new GPU architecture eventually migrate down onto the CPU or will the discrete and integrated solutions remain separate architectures?
    A: Leveraging Intel’s broad portfolio of products is critical to building winning platforms: lots of performance, in compelling form factors, in compelling power envelopes. We’re excited by the opportunity to build technologies that will allow us to take experiences, features, and innovation to new and unique form factors, and to an install base of a billion screens around the world.

    Are they going to improve the integrated graphics in their CPUs (which is currently the weakest link in their offering; AMD Ryzen APUs have Vega GPU cores)? According to the interview... I don't know!

    I think there is WAY more progress on the AMD and ARM front

  • Intel has been uncharacteristically vocal about its most recent plans to enter the discrete GPU market.

    What? Like any prior time was secret? Each time has been accompanied by plenty of press releases, and each time so far has been an abject failure. But this time for sure!

  • Oh, man. It should be "will push ahead of..." They have lost already. The feminized Intel is on a downward spiral. You have to lead your target, not aim directly at it. The new product sounds like it will be merely current GPU technology made proprietary, with some Intel IP added to lock clients into Intel's also-ran GPU technology. And if they started doing that three years ago, they are already behind in technology. Not a great job, Intel.
  • by Artem S. Tashkinov ( 764309 ) on Sunday December 02, 2018 @11:23AM (#57736524) Homepage
    What a lousy "interview" -- I failed to see a single real answer to the first six questions and stopped reading at that point.
  • I hope they're really successful with high-performance video chipsets. Right now, I'd welcome additional competition in that space, no matter who is doing it.
    The current situation is pretty ridiculous -- every single person on the planet interested in 3D gaming, design/CAD/CAM, or animation/rendering work is stuck with what one of only two vendors has to offer them.

    Every time people come up with a new reason to buy fast video cards (like crypto-mining the latest e-coin), there's a massive shortage o

  • Some competent colleagues of mine left Intel right at the time when they were asked to work on software for Larrabee, because they knew right from the start that this project was bound to fail. The very concept of the Larrabee hardware was ridiculously flawed, as anyone with eyes could see. They of course told their supervisors so, but as usual, were not heard.

    Bad for Intel, good for us, as we were just hiring back then.
  • In TFA Intel states they "plan to use telemetry and machine learning, on a per-system and per-user basis" -- wow, that sounds like a solid threat to turn users into their product, like Facebook does.
  • There's probably a secret message encoded in Slashdot's delightful crop of fresh misspellings.
