Intel Hardware

Intel Enters the Laptop Discrete GPU Market With Xe Max (arstechnica.com) 32

An anonymous reader quotes a report from Ars Technica: This weekend, Intel released preliminary information on its newest laptop part -- the Xe Max discrete GPU, which functions alongside and in tandem with Tiger Lake's integrated Iris Xe GPU. We first heard about Xe Max at Acer's Next 2020 launch event, where it was listed as a part of the upcoming Swift 3x laptop -- which will only be available in China. The new GPU will also be available in the Asus VivoBook Flip TP470 and the Dell Inspiron 15 7000 2-in-1.

During an extended product briefing, Intel stressed to us that the Xe Max beats Nvidia's entry-level MX 350 chipset in just about every conceivable metric. In another year, this would have been exciting -- but the Xe Max is only slated to appear in systems that feature Tiger Lake processors, whose Iris Xe integrated GPUs already handily outperform the Nvidia MX 350 in both Intel's tests and our own. The confusion here largely springs from mainstream consumer expectations of a GPU versus what Intel's doing with the Xe Max. Our GPU tests largely revolve around gaming, using 3DMark's well-known benchmark suite, which includes gaming, fps-focused tests such as Time Spy and Night Raid. Intel's expectations for the Xe Max instead revolve, almost entirely, around content creation with a side of machine learning and video encoding.

Xe Max is, roughly speaking, the same 96 Execution Unit (EU) GPU to be found in the Tiger Lake i7-1185G7 CPU we've tested already this year -- the major difference, beyond not being on-die with the CPU, is a higher clock rate, dedicated RAM, and separate TDP budget. Tiger Lake's Iris Xe has a peak clock rate of 1.35GHz, and it shares the CPU's TDP constraints. Iris Xe Max has its own 25W TDP and a higher peak clock rate of 1.65GHz. It also has its own 4GiB of dedicated RAM -- though that RAM is the same LPDDR4X-4266 that Tiger Lake itself uses, which is something of a first for discrete graphics and might lead to better power efficiency.
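
For a rough sense of what that clock bump buys, here is a back-of-the-envelope sketch (not from the article; it assumes the commonly cited Xe-LP figure of 8 FP32 FMA lanes per EU) comparing the theoretical peak throughput of the two parts:

    # Rough, hedged estimate -- assumes each Xe-LP EU retires 8 FP32 FMAs per
    # clock (an assumption, not a figure from the article). Real throughput
    # depends on memory bandwidth, TDP headroom, and workload.
    def peak_tflops(eus, fp32_lanes_per_eu, clock_ghz):
        # An FMA counts as 2 floating-point operations per lane per clock.
        return eus * fp32_lanes_per_eu * 2 * clock_ghz / 1000.0

    iris_xe = peak_tflops(96, 8, 1.35)  # integrated Iris Xe (Tiger Lake)
    xe_max  = peak_tflops(96, 8, 1.65)  # discrete Iris Xe Max

    print(f"Iris Xe ~{iris_xe:.2f} TFLOPS")  # ~2.07 TFLOPS
    print(f"Xe Max  ~{xe_max:.2f} TFLOPS")   # ~2.53 TFLOPS, roughly 22% more

The roughly 22% gap simply tracks the clock difference; the separate 25W TDP budget and dedicated RAM are what should let the discrete part actually sustain that clock.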

This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Comment removed based on user account deletion
    • High end GPUs are sexy but the money is in the volume motherboard end.

    • by Entrope ( 68843 )

      Do you think Intel, now on their (IIRC) 10 nm++++ process, has not been trying to improve their in-house chip lithography? I am sure they would love to have a 7 or even 5 nm process to boast about, but they seem to be stuck.

    • by gtall ( 79522 )

      Yup. Here's Xi Jinping's calculus: every day that goes by without him taking Taiwan means a day closer to his death; he longs to be remembered as the Man Who Dismembered Taiwan.

    • They did that! I think they've failed at it for long enough to try failing at something else. ;)

  • Price included?

    • It's not even that, if my claim to fame was:

      the Xe Max beats Nvidia's entry-level MX 350 chipset in just about every conceivable metric

      in other words "I can beat a seven-year-old kid in almost every kind of sport", I'd be hiding behind a bush, not issuing a press release over it.

      • And "beat him" ... in sychronized swimming and stamp collecting at that!

        Truly the big alpha man that the girls love, that one!

  • Intel's expectations for the Xe Max instead revolve, almost entirely, around content creation with a side of machine learning and video encoding.

    This is the modern Intel, so naturally they'd aim this workstation tech at the mobile market.

  • by ffkom ( 3519199 ) on Wednesday November 04, 2020 @07:33PM (#60685396)
    A companion chip to some laptop CPU that already includes a GPU? No, thank you, that is just weird. So please release a decent PCIe graphics card; that is where the world could use some more competition.
    • Intel has tried to create that product several times, but hasn't actually released one because none of their attempts were worth selling.

      There's no reason to expect anything different from them in the future; they can't even fix their speculative execution bugs and are still selling CPUs with them. What makes anyone think they can produce a viable, competitive GPU?

    • by AmiMoJo ( 196126 )

      Their GPUs are not competitive with other manufacturers', so a PCIe card would be pointless.

  • And if you do, why wouldn't you just get something with a mobile 1650 or better integrated in?

    Or is this because they're getting slapped down in CPU benchmarks that revolve around video encoding and they want to use the new chip to catch up there?
  • by UnknownSoldier ( 67820 ) on Wednesday November 04, 2020 @08:44PM (#60685592)

    This is what, the 12th time now, that Intel has tried graphics? [wikipedia.org]

    1. i740
    2. Extreme Graphics
    3. GMA 900
    4. GMA 3000
    5. Intel HD Graphics
    6. HD Graphics 2000 / 3000
    7. HD Graphics 4000
    8. HD Graphics 400
    9. HD Graphics 500
    10. Iris Pro Graphics
    11. UHD Graphics / Iris Plus Graphics

    But this time it will be different, right? /s

    Every time I am reminded of this cartoon [wordpress.com]. Nvidia also poked fun with this other [zorinaq.com] cartoon.

    Keep trying Intel, I'm sure someday gamers will actually care about your GPUs: Integrated or Discrete. LOL.

    • This time they're also down in CPU share and have just started outsourcing manufacturing to TSMC because their manufacturing process is on the ropes... what could go wrong? Investing in a shrinking market is not the answer.
    • You forgot the i860. Not designed as a GPU per se, but it was intended for graphics.
      • Thanks! I knew I was missing one. I assumed it was the i740, but I had a nagging feeling that wasn't quite right and sadly didn't follow up. I was thinking of the i960, but that also didn't seem right.

        Technically Larrabee [wikipedia.org] probably also belongs on the list, but it is already long enough as is, plus I don't want to obscure it with "distractions."

    • by AmiMoJo ( 196126 )

      To be fair, I think most of the previous ones were not supposed to be very competitive; they were merely designed to give manufacturers a low-cost, highly integrated option that was okay for anything other than gaming.

      Nowadays the GPU is becoming more important as it gets used for things like video encoding and AI. Even stuff like Photoshop has AI upscaling filters now that use GPU acceleration. As Intel's CPUs slip behind AMD in all meaningful metrics, while AMD throws in Radeon GPUs that give Nvidia a run for their money...


    • Intel's GPU market isn't gaming or even HPC, but having something that can be integrated into laptops and low-end workstations at a price point the vendors and engineers like. They have to walk this balancing act between providing compelling value to their customers, and not succeeding to the degree that their GPUs threaten the CPU line.

      Nvidia - while they produce good GPUs - will always be an also-ran until they enter the CPU market. Intel doesn't have to beat them, just make something "good enough" t

    • Intel has shipped more desktop GPUs than any other company in the known universe.

  • If your product name includes "Max", and maxes out on the exxtreme "X"es, and it beats ... the lowest-end product of your competitor ... and *still* cheats in the benchmarks by implying unusual usages ...
