Intel Discloses Its Forthcoming Discrete GPU Strategy and Design Efforts (hothardware.com)
MojoKid writes: Intel has been uncharacteristically vocal about its most recent plans to enter the discrete GPU market. Over the last year or so, the company has disclosed a few morsels of information and made some high-profile hires in its bid to build up and flesh out its latest discrete GPU plans. This week, Intel decided to have a sit-down with HotHardware, offering the opportunity to chat with Ari Rauch, Vice President of the Core and Visual Computing Group at Intel, to discuss what makes this most recent endeavor different from the company's previous, now-discontinued attempts in the discrete GPU space. As a follow-up, HotHardware also enlisted readership questions to engage with Intel about its upcoming GPU plans, compiling the responses in a Q&A format.
In short, this isn't Larrabee 2.0, not by a long shot. Intel is gearing up for a traditional GPU architecture design, coupled with some of the company's own strategic IP that it can bring to the table to help differentiate its products. Further, Rauch noted Intel "will bring discrete GPUs to both client and data center segments aiming at delivering the best quality and experiences across the board including gaming, content creation, and enterprise. These products will see first availability over a period of time, beginning in 2020."
When questioned about its current silicon fabrication hiccups and delays, and how they might affect Intel's ability to execute in this highly competitive space, Rauch noted, "we feel very confident about our product roadmap across software, architecture, and manufacturing." Based on some of the responses to product-positioning questions, it also appears Intel is gearing up to address all performance envelopes, from entry-level to midrange and high-end graphics cards.
Re: (Score:1)
These days I consider the Wall Street Journal center-left, the New York Times left, and the New Yorker far left.
Muffingtonpost? Tis a bad joke.
Re: (Score:2)
I must admit that I've never read the Huffington Post, but the rest of your comment tells me more about you than about them. WHY do you consider them whatever you think of as "left"?
Re: (Score:1)
"HuffPost (formerly The Huffington Post and sometimes abbreviated HuffPo)[2] is a politically left-leaning, American news and opinion website and blog"
https://en.wikipedia.org/wiki/... [wikipedia.org]
Re: (Score:2)
FWIW, here is how your expectations line up with mediabiasfactcheck.com:
Wall Street Journal: Highly Factual, Center-Right
https://mediabiasfactcheck.com... [mediabiasfactcheck.com]
New York Times: Highly Factual, Center-Left
https://mediabiasfactcheck.com... [mediabiasfactcheck.com]
New Yorker: High Factual, Left
https://mediabiasfactcheck.com... [mediabiasfactcheck.com]
Whether it's their bias or your bias, you seem to view all of those outlets one rank to the left of how they rate them.
Information-Free Article (Score:5, Informative)
Skimmed the article. The opening data is almost all stuff that's been previously revealed, or is obvious. The Q&A session is painful PR-speak noncommittal vagueness.
I want to know if it's going to support DXR (DirectX Raytracing), or how many generations of architectures they're committing to. If they buy up another promising game and then shut it down like they did when they cancelled Larrabee, I'll be peeved.
Re:Information-Free Article (Score:4, Insightful)
I'll believe Intel can build a discrete GPU worth buying when I see it. Every attempt so far has been flawed (Real3D i740 starved for texture bandwidth), weak (Silicon Image GMA950 with terrible performance and even worse drivers on Windows), or vapourware (Larrabee). There's no indication that it'll be different this time.
Re: (Score:2)
Indeed. Every time I hear news about Intel and GPUs I think about this Santa comic [dvhardware.net]
Re: (Score:3)
There's no indication that it'll be different this time.
This! Intel has nothing going for it right now. They have shown no ability to innovate in the CPU market, they have shown themselves capable only of buying up another company's technology and bringing it to market amid mixed messages and frankly broken promises (Optane), and their history in discrete graphics is a disaster.
They don't deserve any benefit of the doubt. They deserve only skepticism.
Re: (Score:3)
What promises did Optane break? It seems to be exactly where everyone believed it would be. Much lower latency than SSD, at a price point somewhere between SSD and DRAM prices. Every benchmark I've seen shows that is exactly where it is.
Re: (Score:2)
They had to walk back their endurance claims by an order of magnitude, which put certain applications of the tech into question. The rollout was also much slower, at a lower density, than expected, even after delays. It was hyped/implied to be a replacement for DRAM and NAND but has drawbacks that don't let it completely replace either.
Re: (Score:2)
What promises did Optane break? It seems to be exactly where everyone believed it would be.
It's exactly what everyone believed it would be (a fast technology for high-end SSDs). Just not what Intel said it would be (the end of DRAM as we know it) when Optane was in the same state as this GPU announcement. Hell they even market it as "Optane Memory". The fact that everyone called out their bullshit at the time and it has proven to be exactly what we thought doesn't change this.
I didn't say it's a bad product without a purpose. I just said it's not what Intel promised in their useless marketing re
Re: (Score:2)
Can't believe I have to explain this on slashdot, but... Optane IS memory. So is NAND, your old spinning rust drive, digital tapes, CDs, and Blu-rays. In fact, not only is it memory, but it's also RAM (random access memory). But so is everything listed above with the exception of digital tapes. If you believe the only thing that qualifies something to be memory is DRAM, then your definition is simply wrong.
It really could replace DRAM -- if they can get the endurance back up -- it's not bad endurance, ju
Re: (Score:2)
Can't believe I have to explain this on slashdot, but... Optane IS memory.
Maybe you should learn what marketing is and what people need to understand. Yes, Optane and NAND are both memory. Now why do you think that NAND isn't sold as "memory" to consumers? Why do you think Optane is?
Your splitting hairs on definition is as dishonest as Intel's marketing division. The same marketing decision which has almost directly caused the current trend of advertising laptops with 24GB of "memory" (8GB of RAM, I'll leave it as an exercise to you to guess the rest). There's a time to split hairs (Tes
Re: (Score:2)
Now why do you think that NAND isn't sold as "memory" to consumers?
Let's stop right there, because NAND *IS* sold as memory to consumers. For example, let's take one of the largest suppliers of NAND flash products that are consumer facing today... Kingston. And here is an article by them: https://www.kingston.com/us/co... [kingston.com]. "Here's a quick primer on what you need to know about NAND Flash memory."
Here is the wikipedia article on NAND: https://en.wikipedia.org/wiki/... [wikipedia.org].
Cameras use "Memory Sticks" -- all based on flash memory.
I could sit here all day and google marketing pres
Re: (Score:2)
Let me stop you right there again.
https://www.newegg.com/Product... [newegg.com]
https://www.newegg.com/Product... [newegg.com]
Or why not let the SIs speak for you: https://store.hp.com/us/en/cv/... [hp.com]
Considering that I understood the marketing and you did not
To channel my inner Trump: WRONG! You have clearly failed to understand the marketing. Good work finding a detailed description of NAND and ignoring the information that is most front and center to consumers. You're still splitting hairs trying to save your horrible interpretation of the situation while you continue to ignore the ACTUAL M
Re: (Score:2)
*sigh*
Please read your own darn links. They refer to it as memory. If you aren't going to call it memory, which it is, what exactly would YOU call it? "thing that stores stuff for a computer, but isn't memory"?
Good work finding a detailed description of NAND and ignoring the information that is most front and center to consumers.
I don't have to when you send them to me.
You tried to discredit a promise Intel made in its marketing material (which didn't make sense and failed to deliver) with ... a promise made in Intel marketing material.
Well, it's not really just marketing material when you can buy it and do it yourself. Granted with the P4800X, it's a software hypervisor (by a 3rd party) running with your OS as a client balancing requests between DRAM and Optane, but to the OS and the appl
Re: (Score:2)
"Kingston A1000 M.2 2280 480GB PCI-Express 3.0 x2 3D TLC Internal Solid State Drive (SSD) SA1000M8/480G"
"Intel Optane M.2 2280 32GB PCIe NVMe 3.0 x2 Memory Module/System Accelerator MEMPEK1W032GAXT"
One has a product name (the single most important component of marketing) that calls it an SSD; the other has a product name that calls it a memory module.
I'm done. You've displayed enough ignorance for one day.
Re: (Score:2)
The Intel HD line was quite OK, especially the Iris Pro stuff.
At least the Intel chips now manage to boot and run games, unlike the GMA line.
Re: (Score:2)
I'll believe Intel can build a discrete GPU worth buying when I see it. Every attempt so far has been flawed (Real3D i740 starved for texture bandwidth), weak (Silicon Image GMA950 with terrible performance and even worse drivers on Windows), or vapourware (Larrabee). There's no indication that it'll be different this time.
Larrabee was not exactly vaporware, but it is worth considering why it failed (or whether it really did). I suspect the development of ISPC, detailed below, may point to what Intel has in mind.
https://pharr.org/matt/blog/20... [pharr.org]
http://tomforsyth1000.github.i... [github.io]
Re:Information-Free Article (Score:5, Interesting)
They have developed no IP to do the things needed to compete with Nvidia or AMD in the next 2-3 years.
IP doesn't always have to be developed by a particular company. See Intel's latest deal with AMD for integrated graphics.
I have been saying for a couple of years now here that Intel is in very serious trouble, especially after those layoffs and the PR announcement of a "cloud strategy." The first key point here is that it's taken several years before it became obvious to most (even here) that Intel is in any trouble at all. The second key point is that Intel knew years ago that it was in big trouble.
Intel's biggest problem is that their vertical integration has really constrained them. Silicon (not just CPUs) doesn't leave an Intel plant without being branded Intel. They have older fabs sitting idle because they won't sell time on them, and newer fabs that even at 100% capacity can't satisfy demand. The latter wouldn't be a problem if Intel were the only source for a particular component, since it could raise prices to lower demand, but the reality is that its competitors in total have far more capacity than it does.
It is because of all this that a company like AMD would trade off some IP to Intel. It doesn't fix Intel's fundamental and now unfixable problem, which is that they will never be the market leader again, never steer the markets they partake in. From here on out they can only react to what the other market players are doing.
On the desktop processor side, Intel was blindsided by the economics of AMD's chiplets, and they are still at least several years from an effective design of their own. It isn't just about small dies on a single processor board, it's about being wholly modular. The same chiplets that Threadripper uses are also used by AMD's low-end Ryzen APUs.
Intel does have some "chiplet" experience, but that too was a reaction to being blindsided when their main competitor introduced multi-core to the consumer. It was a hack they didn't explore further but should have.
On the fabrication side, Intel is now dwarfed by the rent-a-fab market capacity in its entirety, and even individual rent-a-fabs are now overtaking them in capacity.
I've said it before and I'll say it again: sell your Intel stock. Even if you aren't directly in the stock market, check your 401(k)s and Roth IRAs. They might be able to avoid becoming another Motorola, but even if they do, it's still bad. Very bad. Intel is fucked.
Re: Information-Free Article (Score:2)
IP doesn't always have to be developed by a particular company.
Pretty sure by "IP" they meant "actual designs," not just patents.
Re: (Score:2)
Intel's biggest problem is that their vertical integration has really constrained them.
Based on things Bob Colwell has said in his book and elsewhere, I think Intel's problem is management, which was turning toxic before he left. What I have read about the failures of Larrabee and the i960 indicates the same thing.
Intel did not need effective management while the x86 train was paying the bills, but when that train slows down, I expect a panic, one that the older Intel under Andy Grove, which moved from memory to microprocessors, could have handled.
Having to rely on Microsoft does not help.
Re: (Score:2)
It's quite simple. You take your scene, slice and dice it into triangles, even parametric surfaces like NURBS, subdivision surfaces used by Pixar, 3D models from 3ds Max, Maya, Blender. All of that gets converted into textures, material shaders and a geometry mesh. The geometry mesh gets chopped up into a hierarchical bounding volume like a kd-tree. All of this can be stored in a data format loaded straight into the GPU or CPU cache. It's all vectors, matrices and parametric coordinates. Separate processors ar
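For readers unfamiliar with the pipeline that comment sketches, here is a minimal, illustrative C++ version of the classic Moller-Trumbore ray/triangle test, the kind of primitive intersection a ray tracer runs against that triangulated geometry once the scene has been broken down. The Vec3 helper type and function names are made up for the example, not taken from the article or any particular engine.

```cpp
// Illustrative sketch only: Moller-Trumbore ray/triangle intersection.
struct Vec3 { float x, y, z; };

static Vec3 sub(const Vec3& a, const Vec3& b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 cross(const Vec3& a, const Vec3& b) {
    return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
}
static float dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Returns true and writes the hit distance t if the ray (orig + t * dir)
// intersects triangle (v0, v1, v2) in front of the ray origin.
bool rayTriangle(const Vec3& orig, const Vec3& dir,
                 const Vec3& v0, const Vec3& v1, const Vec3& v2, float& t) {
    const float eps = 1e-7f;
    Vec3 e1 = sub(v1, v0), e2 = sub(v2, v0);
    Vec3 h = cross(dir, e2);
    float a = dot(e1, h);
    if (a > -eps && a < eps) return false;   // ray parallel to the triangle plane
    float f = 1.0f / a;
    Vec3 s = sub(orig, v0);
    float u = f * dot(s, h);
    if (u < 0.0f || u > 1.0f) return false;  // outside the barycentric range
    Vec3 q = cross(s, e1);
    float v = f * dot(dir, q);
    if (v < 0.0f || u + v > 1.0f) return false;
    t = f * dot(e2, q);
    return t > eps;
}
```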
Re:Information-Free Article (Score:4, Interesting)
What has prevented RTRT (real-time ray tracing) for all this time has been dynamic scene data, i.e. things moving around: acceleration structures like space-partitioning trees are either too expensive to generate in real time or sacrifice too much trying to deal with it.
Remember that GPUs have high memory bandwidth but absolutely terrible memory latency, unlike CPUs, where the opposite is true. Intel learned with its Larrabee failure that the issue remained insurmountable. They could build those acceleration structures quickly on Larrabee, but when it came time to render, the lack of memory bandwidth became the killer.
As far as I know, there is still no acceptable solution. nVidia is claiming they have it, but in practice they are still mainly rasterizing.
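To make the trade-off concrete, here is a small C++ sketch (an assumption-laden illustration, not any vendor's code) of "refitting" an existing BVH after objects move: the tree topology is kept and only the boxes are recomputed bottom-up, which is cheap enough to do per frame, but the boxes grow loose as things keep moving, until a full and much more expensive rebuild is needed.

```cpp
#include <algorithm>
#include <vector>

struct AABB {
    float lo[3], hi[3];
    void expand(const AABB& o) {               // grow this box to enclose o
        for (int i = 0; i < 3; ++i) {
            lo[i] = std::min(lo[i], o.lo[i]);
            hi[i] = std::max(hi[i], o.hi[i]);
        }
    }
};

struct BVHNode {
    AABB bounds;
    int left = -1, right = -1;   // indices of child nodes, -1 if this is a leaf
    int primitive = -1;          // leaf payload: index into the primitive list
};

// Recompute bounds bottom-up; primBounds holds the already-moved boxes of the
// primitives. The tree structure itself is left untouched.
AABB refit(std::vector<BVHNode>& nodes, const std::vector<AABB>& primBounds, int idx) {
    BVHNode& n = nodes[idx];
    if (n.primitive >= 0) {
        n.bounds = primBounds[n.primitive];              // leaf: copy its box
    } else {
        n.bounds = refit(nodes, primBounds, n.left);     // interior: merge children
        n.bounds.expand(refit(nodes, primBounds, n.right));
    }
    return n.bounds;
}
```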
Re: (Score:2)
There is a ray tracing extension for Vulkan and OpenGL that allows an optimized kd-tree to be stored in a buffer and interrogated using custom instructions.
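The actual Vulkan/OpenGL ray tracing extensions treat the acceleration structure as an opaque, driver-built object queried from shaders, so the following is only a CPU-side C++ analogue of the general idea the comment describes, with a made-up node layout: the tree flattened into one linear buffer and walked iteratively with a small stack.

```cpp
#include <algorithm>
#include <utility>
#include <vector>

// Hypothetical flattened node layout; real drivers use their own opaque format.
struct FlatNode {
    float lo[3], hi[3];   // axis-aligned bounds of this subtree
    int left, right;      // child indices into the same buffer, -1 if absent
    int primitive;        // leaf payload: triangle index, -1 for interior nodes
};

// Slab test: does the ray orig + t * dir (t >= 0) pass through the box?
// Assumes no component of dir is exactly zero (invDir = 1 / dir component-wise).
static bool hitBox(const FlatNode& n, const float orig[3], const float invDir[3]) {
    float tmin = 0.0f, tmax = 1e30f;
    for (int a = 0; a < 3; ++a) {
        float t0 = (n.lo[a] - orig[a]) * invDir[a];
        float t1 = (n.hi[a] - orig[a]) * invDir[a];
        if (t0 > t1) std::swap(t0, t1);
        tmin = std::max(tmin, t0);
        tmax = std::min(tmax, t1);
    }
    return tmin <= tmax;
}

// Walk the flattened tree (root at index 0) and collect the leaf primitives
// the ray might hit; an exact ray/triangle test (like the sketch earlier in
// the thread) would then run on just those candidates.
std::vector<int> traverse(const std::vector<FlatNode>& nodes,
                          const float orig[3], const float invDir[3]) {
    std::vector<int> hits;
    std::vector<int> stack{0};
    while (!stack.empty()) {
        const FlatNode& n = nodes[stack.back()];
        stack.pop_back();
        if (!hitBox(n, orig, invDir)) continue;
        if (n.primitive >= 0) hits.push_back(n.primitive);
        if (n.left >= 0) stack.push_back(n.left);
        if (n.right >= 0) stack.push_back(n.right);
    }
    return hits;
}
```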
Re: (Score:1)
The tensor ALUs are for "AI/machine-learning" inferencing. They're not as good as purpose-built crypto hardware, since they lack instructions for all the bitwise manipulation found in those algorithms.
Re: (Score:2)
It's just the same thing they've been bullish about, or whatever crap, for 15 years.
They can't fucking come up with a good, cost-efficient CPU nowadays, and now we should expect them to actually come up with a fast GPU?
I mean, for 15 years they've been touting the same line of "oh, in a year you don't need an extra GPU, ours will be just sooo fast!" and then it comes to market and it's like a budget GPU from 5 years before. Literally, that's what they do.
another alternative is that it'll cost even more than a 2080 and
Re: (Score:2)
> If they buy up another promising game and then shut it down
I assume you are talking about the intriguing Project Offset [youtu.be]?
I never did understand Intel's logic in that. They aren't a game dev studio or a publisher. Were they hoping to showcase Intel's CPU and/or Larrabee performance "advantage," and then, when that completely FAILED (compared to regular discrete GPUs), they canceled it?
Or were they hoping to leverage buying Havok (the game physics engine) in 2007 when they bought Project Offset in 2008 [wikipedia.org]?
Re: (Score:2)
Find a lot of working and well understood CPU product.
Spread out a lot of CPU hardware over a long GPU looking card. A long card to fit more CPU all the way along.
Add powerful cooling and a new look to the brand.
Many working CPU with an easy to support open source driver get sold as a powerful new look GPU.
Show the world a ray tracing demo.
That existing CPU design is sold at a new GPU price.
Start thinking of the next generation.
Add more memory and
Why ray tracing? (Score:2)
If its sole purpose is added realism in gaming, then I think ray tracing will be almost like VR -- cool, but few people will really care about it. Current games have much larger gaps in realism elsewhere than in graphics, for example the changing state of objects, sound generation, not to mention AI. And that's assuming realism is the most important thing for a game to sell well.
If the purpose is something else, I'd be curious what that is.
Fuck nVidia (Score:2, Insightful)
At least we know the drivers will be ok from day one. It's not humanly possible to suck more than current GPU makers.
nVidia not only doesn't provide documentation for its cards, but even actively interferes with nouveau on its new cards (encrypting and signing crap). On every card, it's random whether their proprietary drivers or nouveau will work without crashing. The proprietary drivers are useless if you even dabble in kernel development -- they get ported to current kernels 0-6 months after a
Re: (Score:1)
AMD did it before nVidia, by a good 5 years or so. I distinctly remember outright refusing to purchase Radeon graphics for this very reason.
The usual drivel of "But our DEVELOPMENT TIMES!!!!ELEVENTYONE!" is bullshit. Use a real compiler, not Visual Studio. Use the real Win API, not the .NET McInterfaces. Your product will be much faster and won't have absurd dependencies.
No, your raging vagina being hurt by this simple fact is not sufficient reason to foist shit onto end users.
Thanks.
Re: (Score:1)
I just built a new machine with a Vega 56, and it works perfectly on Linux. You just need a distro with a recent enough kernel that includes the amdgpu driver.
Re: (Score:2)
At least we know the drivers will be ok from day one.
I'm sure they will be of the same high^W quality as Intel's MELTDOWN and SPECTRE patches.
Re: (Score:2)
Yeah, the Direct3D drivers for GMA950 were absolutely terrible. Somehow the Windows OpenGL drivers actually support more features (e.g. Windows OpenGL driver supports non-power-of-two textures while Direct3D driver doesn't). They were pretty bad on Mac OS X as well. I had a white MacBook with one, and using an external monitor to extend the desktop would cause regular kernel panics. The way it stole RAM bandwidth from the CPU made performance suck, and that got worse with an external monitor connected,
Re: (Score:2)
Despite the card's age, it worked perfectly
No man. *Because* of the card's age it worked perfectly. Slot an 8-year-old card from any manufacturer into a Linux box and it works perfectly.
Re: (Score:2)
Nor for nVidia -- they have already dropped drivers for every generation up to the GeForce 500 series. And nouveau works adequately only for some cards. With nVidia's hostility to independent driver writers, it's a wonder nouveau is even in this state.
Re: (Score:2)
Are you implying that you can't get Linux working on a card of that generation? That would be news to many. Just because someone drops support for a driver doesn't mean that it doesn't work. This is even more certain in the Linux world than anywhere else, and kind of my point: Linux has phenomenal hardware support, but only in the long run.
"amdgpu" driver is all good intentions... (Score:2)
When you read the commit messages of Intel drivers, you get the feeling that those who write those drivers know what they are doing, and just need to follow a proper, written down hardware specification.
In contrast, if you read the commit messages of the "amdgpu
Re: (Score:2)
Most people are fine with Intel graphics for day-to-day work. But Intel is looking at VR support, at least basic support, and I think that's more their goal.
Good luck to them, then. They don't have any reasonable plan for becoming competitive in that space. Between MELTDOWN and their inability to drive down failure rates in their new process, Intel has lost their competitive advantage completely. Now they think they're going to suddenly become a credible GPU vendor, in spite of ample historical evidence to the contrary? Tee hee hee.
At this stage Intel should be focusing on developing a credible integrated GPU and worrying about discrete ones later. And on becomin
Article has no details at all, just wait til 2020! (Score:4, Interesting)
From the interview:
Q: Will Intel’s new GPU architecture eventually migrate down onto the CPU or will the discrete and integrated solutions remain separate architectures?
A: Leveraging Intel’s broad portfolio of products is critical to building winning platforms: lots of performance, in compelling form factors, in compelling power envelopes. We’re excited by the opportunity to build technologies that will allow us to take experiences, features, and innovation to new and unique form factors, and to an install base of a billion screens around the world.
Are they going to improve the integrated graphics in their CPUs (currently the weakest link in their offering; AMD's Ryzen APUs have Vega GPU cores)? According to the interview... I don't know!
I think there is WAY more progress on the AMD and ARM front
Uncharacteristically? (Score:2)
Intel has been uncharacteristically vocal about its most recent plans to enter the discrete GPU market.
What? Like any prior time was secret? Each time has been accompanied by plenty of press releases, and each time so far has been an abject failure. But this time for sure!
Re: (Score:2)
I've got to admit my first thought was "How is a GPU going to allow spyware to massively infect systems?" (My real thought was "malware", but only the spyware provides them with any benefit.)
Re: (Score:2)
Intel is out of the CPU biz now in all sectors
Lol, Intel and AMD are the only CPU vendors with any significant presence in the midrange.
"Will not lag behind..." (Score:2)
Re: (Score:2)
Because Intel isn't a perceived "market leader" in the graphics space the way nVidia is.
But yes, it is rather ironic.
Was this an interview? (Score:3)
Might be too little, too late? But .... (Score:2)
I hope they're really successful with high performance video chipsets. Right now, I'd welcome additional competition in that space, no matter who is doing it.
The current situation is pretty ridiculous -- every single person on the planet interested in 3D gaming, design/CAD/CAM, or animation/rendering work is stuck with what one of only two vendors has to offer them.
Every time people come up with a new reason to buy fast video cards (like crypto-mining the latest e-coin), there's a massive shortage o
Just in time for the bitcoin mining boom! (Score:2)
I have to thank Intel for failing with Larrabee (Score:2)
Bad for Intel, good for us, as we were just hiring back then.
"plan to use telemetry .. per-user"... WTF? (Score:2)
horomone and Larabee (Score:2)
There's probably a secret message encoded in Slashdot's delightful crop of fresh misspellings.