
Intel Abandons Discrete Graphics

Stoobalou writes with this excerpt from Thinq: "Paul Otellini may think there's still life in Intel's Larrabee discrete graphics project, but the other guys at Intel don't appear to share his optimism. Intel's director of product and technology media relations, Bill Kircos, has just written a blog about Intel's graphics strategy, revealing that any plans for a discrete graphics card have been shelved for at least the foreseeable future. 'We will not bring a discrete graphics product to market,' stated Kircos, 'at least in the short-term.' He added that Intel had 'missed some key product milestones' in the development of the discrete Larrabee product, and said that the company's graphics division is now 'focused on processor graphics.'"
This discussion has been archived. No new comments can be posted.


  • Groan (Score:4, Insightful)

    by Winckle ( 870180 ) <`ku.oc.elkcniw' `ta' `kram'> on Wednesday May 26, 2010 @02:23PM (#32351202) Homepage

    I hope they at least manage to incorporate some of what they've learnt into their integrated chips.

    Intel's integrated chips have been appallingly bad in the past, some incapable of decoding HD video with reasonable performance. Manufacturers using those Intel integrated chips in their consumer-level computers did a great deal of harm to the computer games industry.

  • by jtownatpunk.net ( 245670 ) on Wednesday May 26, 2010 @02:35PM (#32351324)

    A company that hasn't produced a discrete graphics card in over a decade (I'm pretty sure I remember seeing an Intel graphics card once. Back in the 90s.) is going to continue to not produce discrete graphics cards. Wow. Stop the presses. Has Ric Romero been alerted?

  • by TheRaven64 ( 641858 ) on Wednesday May 26, 2010 @02:40PM (#32351394) Journal

    CPUs have been "fast enough" for years, but GPUs have not.

    Really? I think you might want to take a look at what most people use their GPUs for. Unless you are a gamer, or want to watch 1080p H.264 on a slightly older CPU, a 4-5 generation old GPU is more than adequate. My current laptop is 3.5 years old, and I can't remember ever doing anything on it that the GPU couldn't handle. As long as you've got decent compositing speed and pixel shaders for a few GUI effects, pretty much any GPU from the last few years is fast enough for a typical user.

  • by Anonymous Coward on Wednesday May 26, 2010 @02:44PM (#32351450)

    A large, publicly announced project with a great deal of media hype that had the potential to shake up the industry was cancelled. So, yeah, stop the presses.

  • Re:I wonder (Score:1, Insightful)

    by Anonymous Coward on Wednesday May 26, 2010 @02:48PM (#32351498)

    Does this mean that they'll be focusing on continuous graphics instead?

    More directly, what the hell is "discrete graphics"? I've been working with computers since the TRS-80 days and this is the first time I've seen the term. The writeup makes it sound like "discrete graphics" is some revolutionary kind of new display method that will make our old idea of viewing "pixels" on a "screen" obsolete, but the Google tells me that "discrete graphics" just means a video card as opposed to an onboard chip. Call it a video card! (Or more precisely, the chip that would run on a video card as opposed to being integrated into the motherboard.)

  • Not really (Score:5, Insightful)

    by Sycraft-fu ( 314770 ) on Wednesday May 26, 2010 @02:50PM (#32351544)

    Everyone gets up on Intel integrated GPUs because they are slow, but they are looking at it from a gamer perspective. Yes, they suck ass for games; however, that is NOT what they are for. Their intended purpose is to be cheap solutions for basic video, including things like Aero. This they do quite well. A modern Intel GMA does a fine job of this. They are also extremely low power, especially the newest ones that you find right on the Core i5 line in laptops.

    Now what AMD may do well in is a budget gaming market. Perhaps they will roll out solutions that cost less than a discrete graphics card, but perform better than a GMA for games. That may be a market they could do well in. However, they aren't going to "kill" Intel by any stretch of the imagination. For low-power, non-gaming stuff, minimal power draw is the key, and the GMA chips are great at that. For the majority of gaming, a discrete solution isn't a problem ($100 gets you a very nice gaming card these days) and can be upgraded.

  • Re:Groan (Score:1, Insightful)

    by Anonymous Coward on Wednesday May 26, 2010 @03:08PM (#32351752)

    That isn't saying much considering that the 5430 is an extremely low end GPU.

  • Re:Not really (Score:1, Insightful)

    by Anonymous Coward on Wednesday May 26, 2010 @03:40PM (#32352162)

    Now what AMD may do well in is a budget gaming market. Perhaps they will roll out solutions that cost less than a discrete graphics card, but perform better than a GMA for games. That may be a market they could do well in. However, they aren't going to "kill" Intel by any stretch of the imagination.

    I think you're selling AMD short. Intel has always relied on its fabs to keep things competitive even when their designs weren't. Now that die shrinks are beginning to reach a point of diminishing returns, Intel cannot rely on their fabs as heavily. Also, Fusion certainly has the potential to break Intel's lock on the low power IGP market.

    Ever since they ran Ruiz out, AMD has been executing brilliantly. If AMD is able to come close to equaling Intel's CPU tech, and considering that ATI's GPU tech spanks Intel's, how much better do you all think Bulldozer will ultimately be than Larrabee? I'm guessing metric tons, considering applications (e.g. Photoshop) are now utilizing GPU acceleration for stream processing (not just HD accel. and Aero).

    If AMD can keep this pace up, Intel is in for some deep hurting... You can double the hurting if (and it is a big if) ARM starts moving up the hardware stack (e.g. iPad) in the next couple of years.

  • by TheRaven64 ( 641858 ) on Wednesday May 26, 2010 @03:52PM (#32352346) Journal

    From what I remember, all the cards I was using at the time Intel was trying to sell the i740 (Permedia-2, TNT, etc.) were on the AGP bus.

    Check the dates. The i740 was one of the very first cards to use AGP. Not sure about the Permedia-2, but the TNT was introduced six months after the i740 and cost significantly more (about four times as much, as I recall). It performed a lot better, but that wasn't really surprising.

  • Re:Not really (Score:5, Insightful)

    by forkazoo ( 138186 ) <<wrosecrans> <at> <gmail.com>> on Wednesday May 26, 2010 @05:09PM (#32353338) Homepage

    Everyone gets up on Intel integrated GPUs because they are slow, but they are looking at it from a gamer perspective. Yes, they suck ass for games; however, that is NOT what they are for. Their intended purpose is to be cheap solutions for basic video, including things like Aero. This they do quite well. A modern Intel GMA does a fine job of this. They are also extremely low power, especially the newest ones that you find right on the Core i5 line in laptops.

    Funny, at this point, I thought the purpose of Intel graphics was to try and make sure that OpenCL never becomes a viable solution. Seriously, Intel does everything in their power to make their terrible graphics chips universal. They've done some pretty shady dealing over the years to try and make it happen. At this point, they have even put their GPUs right on the CPUs of their current-generation laptop chips. Apple and nVidia had to come up with dual-GPU solutions that can't be as power efficient as an Intel-only solution, because they have to leave the Intel GPU running and burning power as well. Intel is trying to sue nVidia out of the integrated chipset market. The examples go on and on.

    Why? It isn't like Intel makes all that much money on their GPUs. It's nothing to sneeze at. Intel makes more money in a year on GPUs than I'll probably make in a lifetime, but that's peanuts on the scale of Intel. It's also not enough cash to justify the effort. But, if you look at it as a strategic move to make sure that the average consumer will never have a system that can run GPGPU code out of the box, it starts to make a little more sense. Intel is trying to compete on the sheer terribleness of their GPUs, because if the average consumer has an nVidia integrated GPU in their chipset, then developers will bother to learn how to take advantage of GPU computing, which will marginalize Intel's importance.

    I know it sounds kind of like a crazy conspiracy theory, but after the last several years of Intel-watching, it really does seem like quietly strangling GPGPU is a serious strategic goal for Intel.
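    [As a minimal sketch of what "running GPGPU code out of the box" means in practice, the C program below probes the machine for an OpenCL-capable GPU device. It is not from the thread; it assumes an OpenCL SDK and runtime are installed, and the file name probe_gpu.c is only for illustration.]

    /* Minimal sketch: does this machine expose any GPU to OpenCL?
     * Assumes an OpenCL SDK is installed; build with e.g.: cc probe_gpu.c -lOpenCL */
    #include <stdio.h>
    #include <CL/cl.h>

    int main(void)
    {
        cl_platform_id platforms[8];
        cl_uint num_platforms = 0;

        /* Enumerate the OpenCL platforms (vendor runtimes) present on the system. */
        if (clGetPlatformIDs(8, platforms, &num_platforms) != CL_SUCCESS || num_platforms == 0) {
            printf("No OpenCL platform at all -- no GPGPU out of the box.\n");
            return 1;
        }

        for (cl_uint i = 0; i < num_platforms; i++) {
            cl_device_id device;
            cl_uint num_devices = 0;

            /* Ask each platform specifically for GPU-class devices. */
            if (clGetDeviceIDs(platforms[i], CL_DEVICE_TYPE_GPU, 1, &device, &num_devices) == CL_SUCCESS
                && num_devices > 0) {
                char name[256];
                clGetDeviceInfo(device, CL_DEVICE_NAME, sizeof(name), name, NULL);
                printf("Found an OpenCL-capable GPU: %s\n", name);
                return 0;
            }
        }

        printf("OpenCL runtime present, but no GPU device exposes it.\n");
        return 1;
    }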

  • by Calinous ( 985536 ) on Thursday May 27, 2010 @02:54AM (#32358722)

    "Graphics cards with performance comparable to the best integrated graphics aren't exactly expensive"
    You can't find an expansion graphics card with performance comparable to (i.e. as low as) the current integrated graphics - integrated graphics are slower than anything else out there (less available memory bandwidth, fewer compute clusters, ...).
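    [A back-of-the-envelope sketch, not from the thread, of the memory-bandwidth gap being described here. The figures are illustrative round numbers for the era (dual-channel DDR3-1333 shared with the CPU versus a midrange card's dedicated 128-bit GDDR5), not any specific product's spec sheet.]

    /* Rough peak-bandwidth comparison: integrated graphics sharing system RAM
     * versus a discrete card with its own memory. Illustrative numbers only. */
    #include <stdio.h>

    /* Peak bandwidth in GB/s: (bus width in bits / 8) bytes per transfer,
     * times billions of transfers per second. */
    static double peak_gbps(double bus_width_bits, double gigatransfers_per_sec)
    {
        return bus_width_bits / 8.0 * gigatransfers_per_sec;
    }

    int main(void)
    {
        /* Integrated: dual-channel DDR3-1333 (2 x 64-bit, 1.333 GT/s), shared with the CPU. */
        double integrated = peak_gbps(128.0, 1.333);

        /* Discrete midrange (assumed): dedicated 128-bit GDDR5 at ~4 GT/s effective. */
        double discrete = peak_gbps(128.0, 4.0);

        printf("Integrated (shared with CPU): ~%.1f GB/s\n", integrated);   /* ~21.3 */
        printf("Discrete (dedicated):         ~%.1f GB/s\n", discrete);     /* ~64.0 */
        printf("Gap: ~%.1fx, before counting shader units at all\n", discrete / integrated);
        return 0;
    }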
