Intel Says Its First Discrete Graphics Chips Will Be Available in 2020 (marketwatch.com)

Ryan Shrout, reporting for MarketWatch: Intel CEO Brian Krzanich disclosed during an analyst event last week that the company will have its first discrete graphics chips available in 2020. This will mark the beginning of the chip giant's journey toward a portfolio of high-performance graphics products for various markets including gaming, data center and artificial intelligence (AI). Some previous rumors suggested a launch at CES 2019 this coming January might be where Intel makes its graphics reveal, but that timeline was never adopted by the company; it would have been overly aggressive and in no way reasonable given the development process of a new silicon design. In November 2017, Intel brought Raja Koduri on board to lead the graphics and compute initiatives inside the company. Koduri was previously in charge of the graphics division at AMD, helping to develop and grow the Radeon brand, and his departure to Intel was expected to have a significant impact on the industry.

  • i741 (Score:5, Funny)

    by mandark1967 ( 630856 ) on Tuesday June 12, 2018 @12:28PM (#56772566) Homepage Journal

    liquid cooled and running at 50 MHz with an overdrive chip

    • Re: (Score:2, Insightful)

      by kelemvor4 ( 1980226 )

      liquid cooled and running at 50 MHz with an overdrive chip

      Or any of the many many others they made after that.

      It's pretty obvious that Ryan Shrout just doesn't know what he's writing about.

    • liquid cooled and running at 50 MHz

      Sorry, but you can't really push the 741 [st-andrews.ac.uk] far above maybe 10 kHz...
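
      For what it's worth, a back-of-the-envelope check of that figure (a minimal sketch; the 0.5 V/µs slew rate is the typical 741 datasheet value, and the 10 V peak swing is an assumption):

          import math

          # Full-power bandwidth of an op-amp: FPBW = slew_rate / (2 * pi * V_peak)
          slew_rate = 0.5e6   # volts per second (~0.5 V/us, typical 741)
          v_peak = 10.0       # assumed peak output voltage swing
          fpbw = slew_rate / (2 * math.pi * v_peak)
          print(f"Full-power bandwidth: {fpbw / 1e3:.1f} kHz")  # ~8 kHz

      That lands right around the "maybe 10 kHz" ballpark for large signals; the 741's small-signal unity-gain bandwidth is closer to 1 MHz, but either way it's a long way from 50 MHz.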

  • by slew ( 2918 ) on Tuesday June 12, 2018 @12:28PM (#56772568)

    What about this one?

    https://en.wikipedia.org/wiki/... [wikipedia.org]

    • by kriston ( 7886 )

      I came here to mention this. The article is wrong, though to be fair the i740 came about when Intel licensed the technology from the Real3D division of Lockheed Martin. Intel later purchased the intellectual property after Real3D was closed.

    • They're trying to pretend they didn't, since Starfighter was a P.O.S.

    • by Solandri ( 704621 ) on Tuesday June 12, 2018 @01:35PM (#56772974)
      In modern terminology, the difference between "discrete" and "integrated" graphics is not whether it's a standalone plug-in card. It's the presence of VRAM - high-speed, high-bandwidth RAM dedicated for use by the video card for 3D rendering. Discrete GPUs come with their own VRAM. Integrated GPUs use system RAM (though they're increasingly showing up with their own small buffer of high-speed RAM that acts more like a cache), with a much smaller amount of dedicated RAM for framebuffers.

      The video cards from the era you've linked either used system RAM, or only did 2D graphics using a few MB of onboard RAM for the framebuffer. So they are analogous to today's integrated graphics. The need for the GPU to have gobs of its own high-speed VRAM didn't arise until 3D graphics began pushing frames faster than you could transfer needed data across the bus from system RAM to the video card. Most of that VRAM is taken up by textures used for 3D graphics, so only 3D graphics cards have large amounts of it. A framebuffer, found on both 3D and 2D graphics cards, is only 8 MB for 1080p 32-bit color. So there's no need for large amounts of VRAM in an integrated video card.

      Back then, we called them a 3D video card vs a 2D video card. That nomenclature was abandoned once even low-end 2D video cards became capable of rudimentary 3D graphics. The distinction then shifted to whether it was a "serious" 3D graphics card with its own dedicated VRAM, or whether it was a 2D video card (commonly integrated into the motherboard) which could do 3D graphics in a pinch by borrowing system RAM to use as VRAM.
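
      To put a number on that framebuffer point, a quick sanity check (a minimal sketch; nothing here beyond the parent comment's 1080p, 32-bit example):

          # Rough framebuffer sizing for the 1080p, 32-bit case mentioned above
          def framebuffer_bytes(width, height, bits_per_pixel):
              return width * height * bits_per_pixel // 8

          fb = framebuffer_bytes(1920, 1080, 32)
          print(f"1080p at 32-bit color: {fb / 2**20:.1f} MiB per buffer")  # ~7.9 MiB

      Even double- or triple-buffered that's tens of megabytes, which is why the gigabytes of VRAM on a discrete card are really there for textures and other 3D working data, not the framebuffer.
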
      • by kriston ( 7886 ) on Tuesday June 12, 2018 @02:05PM (#56773158) Homepage Journal

        No, the i740 had its own, dedicated VRAM. Therefore, this is not Intel's first discrete graphics chipset.

        • by Anonymous Coward

          From the very wiki article you seem to have skipped over:

            A unique characteristic, which set the AGP version of the card apart from other similar devices on the market, was the use of on-board memory exclusively for the display frame buffer, with all textures being kept in the computer system's main RAM.

          As he said, there is a minimal framebuffer on the card, the rest being in system memory. Hence, integrated.

          • Comment removed based on user account deletion
          • by Anonymous Coward

            Stop it!
            The fact that they relied on AGP to fake video memory for textures doesn't make it "integrated". Call it "lousy execution of a card", "worst 3D video card ever made" or "Intel's ugliest 3D child" if you will... but integrated it isn't.

            The 740 had a successor, the 810 if I remember right. That one was integrated.

          • "Intel also sold the i740 to 3rd party companies, and some PCI versions of the accelerator also were made. They used an AGP-to-PCI bridge chip and had more on-board memory for storing textures locally on the card, and were actually faster than their AGP counterparts in some performance tests."

            Seems totally discrete to me.

            Probably most of the uses of this new chip will be similar... sold to OEMs to integrate as they will.

      • by Anonymous Coward

        Uh, all graphics cards, whether 2D or 3D, had their own RAM, and even integrated graphics chips (e.g. the ATI Rage Pro, or even ISA graphics in old OEM PCs) had their own RAM soldered to the motherboard.

        Even if you only had 4 MB on a graphics card that did both 2D and 3D, you had that 4 MB for everything: framebuffer (double-buffered, plus Z buffer) and textures.
        The original 3dfx Voodoo did only 3D but had its own framebuffer: 2 MB of framebuffer memory and a separate 2 MB for textures.
        So that was not much, but you'd run somet…
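
        For what it's worth, a rough sketch of where memory like that went (the 640x480, 16-bit, double-buffered-plus-Z figures below are assumptions about a Voodoo 1-class setup, not numbers from the comment):

            # Back-of-the-envelope VRAM budget for an early 3D accelerator
            def buffer_bytes(width, height, bits_per_pixel):
                return width * height * bits_per_pixel // 8

            color = buffer_bytes(640, 480, 16)   # one 16-bit color buffer: 600 KiB
            z     = buffer_bytes(640, 480, 16)   # 16-bit Z buffer: 600 KiB
            total = 2 * color + z                # double-buffered color + Z
            print(f"Framebuffer use: {total / 2**20:.2f} MiB")  # ~1.76 MiB of the 2 MiB

        That eats most of a 2 MB framebuffer already, which is why textures needed their own 2 MB on the Voodoo, or had to live out in system RAM over AGP on the i740.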

      • Here's what John Carmack had to say about the i740 [fool.com]:

        Good throughput, good fillrate, good quality, good features.
        A very competent chip. I wish intel great success with the 740. I think that it firmly establishes the baseline that other companies (especially the ones that didn't even make this list) will be forced to come up to.
        Voodoo rendering quality, better than voodoo1 performance, good 3D on a desktop integration, and all textures come from AGP memory so there is no texture swapping at all.
        Lack of 24 bi…

    • They even have a more modern attempt based on many cores with an onboard OS: https://en.wikipedia.org/wiki/... [wikipedia.org]

  • by Anonymous Coward

    fuck you geforce i dont want to update driver right now

  • Since I use GPUs a lot for non-gaming applications, this is interesting.

    Normally I wouldn't be interested, because it's Intel who will have to play catch-up. But with Raja involved, this might actually have some life in it.

    Wait and see....

  • Coincidence? Yeah, probably...
  • I hope this means that graphics will finally mature and the endless cycle of "faster" will come to an end. There is already plenty of evidence of a massive slowdown, only a few years after the same happened to CPUs. Finally having mature tech here would be endlessly beneficial.

  • by rsilvergun ( 571051 ) on Tuesday June 12, 2018 @12:39PM (#56772654)
    video chip prices. It's been 2 years and a 1060 is still selling over MSRP.
  • by Bruce Perens ( 3872 ) <bruce@perens.com> on Tuesday June 12, 2018 @12:40PM (#56772666) Homepage Journal
    Intel is making graphics chips, and IHOP is making hamburgers.
    • by jwhyche ( 6192 ) on Tuesday June 12, 2018 @12:54PM (#56772764) Homepage

      I had an IHOP hamburger a few days ago. It was pretty good. Not sure I can say the same about Intel graphics chips.

      • Back in the desktop era, PC manufacturers didn't want Intel to have good graphics. Graphics card upgrades had much higher margins than the base PC. So, you can consider that Intel graphics have been deliberately handicapped.

        I've been in this long enough to remember when the big graphics company was Silicon Graphics, and before that Evans and Sutherland. Intel just needs to hire good people.

        Let's hope this try goes better than Intel's last attempt at making better graphics chips, which seemed to fizzle out.

        • Let's also hope they don't intentionally cripple Thunderbolt to bolster their graphics cards.

          An Nvidia card over TB3 is still light-years better than anything Intel has produced, and will likely stay that way for several years.

    • IHOP is cooking their hamburgers on Intel chips.
  • Didn't we have a story last week about how Intel was on death's door because they couldn't get their yield on the new chips high enough? Wasn't AMD ready to pounce? Now this?

    • by epine ( 68316 )

      Didn't we have a story last week about how Intel was on death's door because they couldn't get their yield on the new chips high enough?

      10 nanometer [wikipedia.org]

      Currently Intel's 10 nm process is denser than TSMC's 7 nm process and available in limited quantities, but volume production is delayed until 2019. However, TSMC's 7 nm process [in name only] is planned to be available soon in high-volume shipments and mass-produced devices.

      After you've been the 800-lb gorilla for four decades, death's door is merely running ab…

    • Good luck to Intel with their GPUs. Intel won't be getting me back for the foreseeable future. Threadripper 2 for me this fall. BTW, it is said that losing Raja Koduri will actually speed up AMD's GPU evolution, because he changed direction too often.

    • by AHuxley ( 892839 )
      Selling a lot of old CPUs on a card as a new expensive GPU product line will help with that.
  • by Anonymous Coward
    Wow. Intel has fallen back on its old tricks. Announce a product that is going to be "so awesome" way out in the future. I'm telling you, this news from Intel is going to make me put my graphics card purchases on hold! Gotta get me some of that "Intel Inside"!!
  • by Anonymous Coward

    The best I can hope for out of this is that Intel will do OK, adopt FreeSync, and force Nvidia to get with the program and drop the stupid proprietary G-Sync.

  • A GPU good at graphics but bad at mining is needed so people can start gaming and designing again. I hope Intel cripples mining capabilities in their card.
    • by Anonymous Coward

      Gaming and mining involve the same kind of math. Your suggestion is impossible. Government (over)regulation of cryptocurrency will be what does it in, not graphics card manufacturers cutting their own throats by intentionally gimping their products.

  • It will be interesting to see what this does to the duopoly Nvidia and AMD enjoy right now. It would be nice to see some price pressure on the market and new tech coming out. We've been in this two-horse race pretty much since the late '90s/early '00s.
  • by DontBeAMoran ( 4843879 ) on Tuesday June 12, 2018 @01:58PM (#56773134)

    So it's a GPU that won't tell anyone about the kind of porn you watch?

  • Now that Intel has finally asserted and solidified its superiority and dominance over all its competitors in the CPU market, it only makes sense to branch out into other markets.

    Intel probably doesn't have to try that hard anymore. Their lead is so big that they can probably just continue to profit indefinitely even without any real innovation on their part.

    That said, I'm hearing that Intel has managed to get a 28-core chip running at 5 GHz on all cores. Their advancements in tablecloth technology have ma…

  • Back in the day, Motorola had the 6845 in their 6800 processor family. I think even Zilog had a CRT controller of some kind. Intel has waited until now? Really?!?

  • by ffkom ( 3519199 ) on Tuesday June 12, 2018 @04:51PM (#56774042)
    ... and unlike AMD, they provide _stable_ open-source drivers, then I'm all ears. At this point in time, the Intel iGPU drivers are the only ones I can trust to run 24/365 in a Linux system.
    • I don't know, man. I've got a system with a Radeon HD 3450 in it (the crappy base discrete card Dell put in everything a while back) and it is absolutely rock stable. The uptime is... well, I did a kernel update last week, so it's a week, but this box has never crashed, except for one weird condition where accessing a file on a mounted SMB share caused a GPF and left the system in a weird state.

      The older Radeon cards are extremely well supported and stable, although I sure wouldn't want to game on one...

      • by ffkom ( 3519199 )
        I could live with the lesser 3D power of old ATI cards, but not without 4K 60 Hz displays, which only the newer ones support.
  • That will be the limit of the free software spread over the many Intel CPUs sold as a new GPU.
    Want more? The app creator will have to tell all the CPUs on the Intel GPU what to do for their app.
  • I'm a dev who knows nothing about HW.

    I came to this article thinking, "Ooh, maybe I don't have to pay an extra $400 for a GPU on a new computer!"

    Is that wishful thinking?

    Does this announcement make any handwavey motions in that direction, or am I way off course?
