Intel Says Its First Discrete Graphics Chips Will Be Available in 2020 (marketwatch.com)
Ryan Shrout, reporting for MarketWatch: Intel CEO Brian Krzanich disclosed during an analyst event last week that the company will have its first discrete graphics chips available in 2020. This will mark the beginning of the chip giant's journey toward a portfolio of high-performance graphics products for various markets including gaming, data center and artificial intelligence (AI). Some previous rumors suggested that CES 2019 this coming January might be where Intel makes its graphics reveal, but that timeline was never adopted by the company; it would have been overly aggressive and unrealistic given the development process of a new silicon design. In November 2017 Intel brought on board Raja Koduri to lead the graphics and compute initiatives inside the company. Koduri was previously in charge of the graphics division at AMD, where he helped develop and grow the Radeon brand, and his departure to Intel was expected to have a significant impact on the industry.
i741 (Score:5, Funny)
liquid cooled and running at 50 MHz with an OverDrive chip
Re: (Score:2, Insightful)
liquid cooled and running at 50 MHz with an OverDrive chip
Or any of the many many others they made after that.
It's pretty obvious that Ryan Shrout just doesn't know what he's writing about.
Re: (Score:3)
liquid cooled and running at 50 MHz
Sorry, but you can't really push the 741 [st-andrews.ac.uk] far above maybe 10 kHz...
First discrete graphics? (Score:5, Informative)
What about this one?
https://en.wikipedia.org/wiki/... [wikipedia.org]
Re: (Score:2)
I came here to mention this. The article is wrong, though to be fair the i740 came about when Intel licensed the technology from the Real3D division of Lockheed Martin. Intel later purchased the intellectual property after Real3D was closed.
Re: (Score:3)
They're trying to pretend they didn't, since Starfighter was a P.O.S.
Re: (Score:2)
When you're doing a half-ass attempt, you can at worst expect half-shit results.
Re: (Score:2)
half-ass vs full-ass
Re:First discrete graphics? (Score:5, Informative)
The video cards from the era you've linked either used system RAM, or only did 2D graphics using a few MB of onboard RAM for the framebuffer. So they are analogous to today's integrated graphics. The need for the GPU to have gobs of its own high-speed VRAM didn't arise until 3D graphics began pushing frames faster than you could transfer needed data across the bus from system RAM to the video card. Most of that VRAM is taken up by textures used for 3D graphics, so only 3D graphics cards have large amounts of it. A framebuffer, found on both 3D and 2D graphics cards, is only 8 MB for 1080p 32-bit color. So there's no need for large amounts of VRAM in an integrated video card.
Back then, we called them a 3D video card vs a 2D video card. That nomenclature was abandoned once even low-end 2D video cards became capable of rudimentary 3D graphics. The distinction then shifted to whether it was a "serious" 3D graphics card with its own dedicated VRAM, or whether it was a 2D video card (commonly integrated into the motherboard) which could do 3D graphics in a pinch by borrowing system RAM to use as VRAM.
Re:First discrete graphics? (Score:4, Informative)
No, the i740 had its own, dedicated VRAM. Therefore, this is not Intel's first discrete graphics chipset.
Re: (Score:1)
From the very wiki article you seem to have skipped over:
A unique characteristic, which set the AGP version of the card apart from other similar devices on the market, was the use of on-board memory exclusively for the display frame buffer, with all textures being kept in the computer system's main RAM.
As he said, there is a minimal framebuffer on the card, the rest being in system memory. Hence, integrated.
Re: (Score:2)
Re: First discrete graphics? (Score:1)
Stop it!
The fact that it relied on AGP to use system memory as pretend video memory for textures doesn't make it "integrated". Call it "lousy execution of a card", "worst 3D video card ever made" or "Intel's ugliest 3D child" if you will... but integrated it isn't.
The i740 had a successor, the i810 if I remember right. That one was integrated.
Re: (Score:2)
The key is "the *growing* use of textures". (emphasis mine)
Intel loves deep pipelines and caching. The latter plays to their self-image as a manufacturing company first and a design company second; and the former looks good in the presence of the latter and cooperative workloads. In this case, the workloads changed faster than their GPU design.
Re: (Score:2)
"Intel also sold the i740 to 3rd party companies, and some PCI versions of the accelerator also were made. They used an AGP-to-PCI bridge chip and had more on-board memory for storing textures locally on the card, and were actually faster than their AGP counterparts in some performance tests."
Seems totally discrete to me.
Probably most of the uses of this new chip will be similar... sold to OEMs to integrate as they will.
Re: (Score:1)
Uh, all graphics cards, whether 2D or 3D, had their own RAM, and even integrated graphics (e.g. ATI Rage Pro, or even ISA graphics in old OEM PCs) had their own RAM soldered to the motherboard.
Even if you only had 4 MB on a graphics card that did both 2D and 3D, well, you had 4 MB for everything: framebuffer (double-buffered + Z buffer) and textures.
The original 3dfx Voodoo did only 3D but had its own framebuffer: 2 MB of framebuffer memory and a separate 2 MB for textures.
So that was not much but you'd run somet
Re: (Score:2)
Here's what John Carmack had to say about the i740... [fool.com]
Re: (Score:2)
They even have a more modern attempt based on many cores with an onboard OS: https://en.wikipedia.org/wiki/... [wikipedia.org]
nvidia on suicide watch (Score:1)
fuck you GeForce, I don't want to update the driver right now
Ok... (Score:2)
Since I use GPUs a lot for non-gaming applications, this is interesting.
Normally I'd not be interested, because it's Intel who will have to play catch-up. But with Raja involved this might actually have life.
Wait and see....
Re: (Score:2)
So same year as the new XBOX? (Score:2)
XBOX is AMD maybe PS5? (Score:2)
XBOX is AMD maybe PS5?
Re: (Score:2)
As Intel could never compete... (Score:1)
I hope this means that graphics will mature and the endless cycle of "faster" will come to an end. There is already a lot of evidence of a massive slowdown, only a few years after it happened with CPUs. Finally having mature tech here would be hugely beneficial.
Re: As Intel could never compete... (Score:2)
I'd love something to drive down (Score:3)
Diversification (Score:5, Funny)
Re:Diversification (Score:4)
I had an IHOP hamburger a few days ago. It was pretty good. Not sure I can say the same about Intel graphics chips.
Re: (Score:2)
Back in the desktop era, PC manufacturers didn't want Intel to have good graphics. Graphics card upgrades had much higher margins than the base PC. So, you can consider that Intel graphics have been deliberately handicapped.
I've been in this long enough to remember when the big graphics company was Silicon Graphics, and before that Evans and Sutherland. Intel just needs to hire good people.
Let's hope this try goes better than Intel's last attempt at making better graphics chips (which seemed to fizzle out).
Re: (Score:2)
Let's also hope they don't intentionally cripple Thunderbolt to bolster their graphics cards.
An Nvidia card over TB3 is still light-years better than anything Intel has produced -- and will likely stay that way for several years.
Re: (Score:2)
Wiat a min... I thought Intel was done... (Score:2)
Didn't we have a story last week about how Intel was on death's door because they couldn't get their yield on the new chips high enough? Wasn't AMD ready to pounce? Now this?
Re: (Score:2)
10 nanometer [wikipedia.org]
After you've been the 800-lb gorilla for four decades, death's door is merely running ab
Re: (Score:2)
Good luck to Intel with their GPUs. Intel won't be getting me back for the foreseeable future. Threadripper 2 for me this fall. Btw, it is said that losing Raja Koduri will actually speed up AMD's GPU evolution, because he changed direction too often.
Re: (Score:2)
Re: (Score:1)
That doesn't look like floating point to me.
Remember "FUD" (Score:2, Funny)
The best I can hope for out of this... (Score:1)
The best I can hope for out of this is that Intel will do OK, adopt FreeSync, and force Nvidia to get with the program and drop the stupid proprietary G-Sync.
Make it bad at mining please (Score:2)
Re: (Score:1)
Gaming and mining involve the same kind of math. Your suggestion is impossible. Government (over)regulation of cryptocurrency will be what does it in, not graphics card manufacturers cutting their own throats by intentionally gimping their products.
Competition (Score:1)
Discrete graphics? (Score:5, Funny)
So it's a GPU that won't tell anyone about the kind of porn you watch?
Re: (Score:2)
Re: (Score:2)
So NVIDIA doesn't have discrete graphics cards? https://www.neowin.net/news/nv... [neowin.net]
Time to branch out (Score:1)
Now that Intel has finally asserted and solidified its superiority and dominance over all its competitors in the CPU market, it only makes sense to branch out into other markets.
Intel probably doesn't have to try that hard anymore. Their lead is so big that they can probably just continue to profit indefinitely even without any real innovation on their part.
That said, I'm hearing that Intel has managed to get a 28-core chip running at 5 GHz on all cores. Their advancements in tablecloth technology has ma
Way, way behind. (Score:1)
Back in the day, Motorola had the 6845 in their 6800 processor family. I think even Zilog had a CRT controller of some kind. Intel has waited until now? Really?!?
If, unlike Nvidia, they provide open source drivers (Score:3)
Re: (Score:2)
I don't know, man. I've got a system with a Radeon HD 3450 in it (the crappy base discrete card Dell put in everything a while back) and it is absolutely rock stable. The uptime is... well, I did a kernel update last week, so it's a week, but this has never crashed once, except for one weird condition where accessing a file on a mounted SMB share caused a GPF and left the system in a weird state.
The older Radeon cards are extremely well supported and stable, although I sure wouldn't want to game on one...
Re: (Score:2)
It will do ray tracing for free (Score:2)
Want more? The app creator will have to tell all the CPUs on the Intel GPU what to do for their app.
cheap GPU ? (Score:1)
I came to this article thinking, "Ooh, maybe I don't have to pay an extra $400 for a GPU on a new computer!"
Is that wishful thinking?
Does this announcement make any handwavey motions in that direction, or am I way off course?