
Intel Graphics Division Shares Wild Futuristic GPU Concept Cards (hothardware.com)

MojoKid writes: What do you think graphics cards will look like a decade and a half from now? Intel wanted to know as well, so it commissioned designer Cristiano Siqueira to give us a taste of what graphics cards might look like in the year 2035. Siqueira, the talented designer who created the first set of Intel Odyssey GPU renders not long ago, focused primarily on fan and shroud designs and on what innovations could be fostered in the coming years. He was tasked with thinking far beyond the design conventions, materials, and cooling technologies of current-gen graphics cards, and with envisioning new designs that could employ technologies and materials not even invented yet. One concept, called Gemini, shows an ionic cooling system that isn't too far beyond the realm of feasibility. Another design, called Prometheus, showcases a top edge-mounted display readout that could be employed fairly easily with flexible OLED display technology. Intel also launched a new Graphics Command Center driver package today, which offers more customization, better control of power and cooling, and one-click game optimization for Intel GPU-enabled systems.
  • by Anonymous Coward on Wednesday May 29, 2019 @05:44PM (#58675186)

    Useless garbage to hide the fact that Intel graphics sucks

    • Yeah, when the best PR they can buy is that it "isn't too far beyond the realm of feasibility," I only hear that they're beyond the realm of feasibility. And that they know it.

      Adding an extra mini-monitor on top of a monitor doesn't impress me. And I'm not sure why that would tell us anything about the GPU.

      • by aliquis ( 678370 )

        Maybe a log of what data was leaked last.

        • LOL!

          Yeah, maybe add a JTAG emulator to run a boundary scan in a loop, and figure out what data was leaked.

          That will speed things up for sure!
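          Joking aside, a boundary-scan session really does start with the JTAG TAP state machine. Here is a minimal bit-banged sketch that reads a device's 32-bit IDCODE; the four pin helpers are hypothetical stubs standing in for whatever GPIO library you actually have:

          ```python
          # Bit-banged JTAG sketch: reset the TAP controller, then shift the 32-bit
          # IDCODE out of the data-register chain. Pin helpers are hypothetical stubs.

          def set_tms(level: int) -> None: pass   # drive the TMS pin (stub)
          def set_tdi(level: int) -> None: pass   # drive the TDI pin (stub)
          def pulse_tck() -> None: pass           # one TCK rising+falling edge (stub)
          def read_tdo() -> int: return 0         # sample the TDO pin (stub)

          def clock(tms: int, tdi: int = 0) -> int:
              """Advance the TAP one TCK cycle; return the TDO bit seen this cycle."""
              set_tms(tms)
              set_tdi(tdi)
              bit = read_tdo()
              pulse_tck()
              return bit

          def read_idcode() -> int:
              for _ in range(5):   # TMS high for 5 clocks: Test-Logic-Reset from any state
                  clock(tms=1)
              clock(tms=0)         # -> Run-Test/Idle
              clock(tms=1)         # -> Select-DR-Scan
              clock(tms=0)         # -> Capture-DR (most parts preload IDCODE here)
              clock(tms=0)         # -> Shift-DR
              idcode = 0
              for i in range(32):  # shift out 32 bits, LSB first
                  last = 1 if i == 31 else 0    # TMS high on the final bit exits Shift-DR
                  idcode |= clock(tms=last) << i
              clock(tms=1)         # Exit1-DR -> Update-DR
              return idcode
          ```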

  • Maybe I'm not normal (Score:4, Interesting)

    by Snotnose ( 212196 ) on Wednesday May 29, 2019 @05:46PM (#58675204)
    I love games, I love cutting-edge graphics. But if I need liquid-cooled whatevers, unless it's my Margarita, it's not happening. We can debate power requirements, and fan sizes vs. noise, all day long. But going from air to liquid cooling is a line in the sand for me.
    • by Bigbutt ( 65939 )

      What’s your aversion to liquid cooling? Currently I have a sealed CPU cooling unit that’s significantly easier to manage (clean) than the previous hundred-finned heat sink. And I don’t overclock or anything; I just wanted to give it a try, and I would do it again.

      [John]

      • by darkain ( 749283 )

        Yeah, I'm curious about this one, too. I personally do the majority of my heavy computing work on a pre-built HP Xeon workstation that contains a stock AIO liquid cooler for the CPU. Liquid isn't just for "OH EM GEEZ G@|\/|3Rz" anymore; it is quite common in serious compute workloads now.

        • Air coolers don't leak. Ever.

          EVER.

          Water coolers don't often leak.

          Not often...

          Now, if we had little heat pumps, and could cool our CPUs with refrigerant, that might actually be worth doing. But that would be at risk of leaking, too. HFC-134a though (for example) is quite innocuous stuff. I huff some regularly... in my asthma inhaler.

          • Actually, AIOs pretty much never leak unless you decide to puncture a component. I mean, seriously, leaking? Do you own a car? You see all those fluids sloshing around, vibrating and bumping, heating to extreme temperatures and pressurized to a few bar; notice how those don't leak? If you're worried about a pre-manufactured closed loop that doesn't see vibration, doesn't see thermal expansion, and runs at less than half a bar, I would suggest getting a bubble-wrap-lined helmet for all those concerns in your life.

            Now, if we had little heat pumps, and could cool our CPUs with refrigerant

            Why?
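            As for the "why": back-of-the-envelope, sub-ambient refrigerant cooling is mostly heat-pump arithmetic. A minimal sketch; every wattage and temperature below is an illustrative assumption, not a measurement:

            ```python
            # Rough Carnot-limited estimate of refrigerant (vapor-compression) CPU cooling.
            cpu_heat_w = 150.0   # assumed heat dumped into the cold plate (W)
            t_cold_k = 283.0     # evaporator at 10 degC -- below ambient
            t_hot_k = 323.0      # condenser at 50 degC

            carnot_cop = t_cold_k / (t_hot_k - t_cold_k)  # ideal coefficient of performance
            real_cop = 0.4 * carnot_cop                   # assume a small compressor hits ~40% of Carnot

            compressor_w = cpu_heat_w / real_cop          # electrical power the compressor draws
            rejected_w = cpu_heat_w + compressor_w        # heat the condenser/radiator must shed

            print(f"Carnot COP {carnot_cop:.1f}, assumed real COP {real_cop:.1f}")
            print(f"compressor draw ~{compressor_w:.0f} W, heat rejected ~{rejected_w:.0f} W")
            # ~7.1 ideal, ~2.8 assumed -> ~53 W extra draw and ~203 W into the room:
            # sub-ambient temperatures are possible, but you pay in power, noise and bulk.
            ```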

        • by Bigbutt ( 65939 )

          It might be the difference between a purchased liquid unit vs one that’s been assembled by the user. Mine is the former.

          [John]

      • Comment removed based on user account deletion
        • But when it comes to sustained thermal load, eventually the loop will become heat-soaked as well.

          If you're going to an AIO for sustained thermal load you're doing it wrong.

          TLDR: Water cooling requires more maintenance. It also makes upgrading a giant PITA.

          Short reply: If you're doing maintenance then you bought something really shit. If it impacts your upgrading, then maybe stick to the same socket in the future.
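          For what it's worth, "heat-soaked" is easy to put rough numbers on. A minimal sketch; the coolant mass and surplus heat are assumptions for illustration:

          ```python
          # How fast an AIO loop heat-soaks when the radiator can't shed the full load.
          coolant_kg = 0.25    # assumed coolant mass in a typical 240 mm AIO loop
          c_water = 4186.0     # specific heat of water, J/(kg*K)
          surplus_w = 50.0     # assumed heat going into the coolant, not out the radiator
          delta_t_k = 15.0     # coolant temperature rise we care about

          seconds = coolant_kg * c_water * delta_t_k / surplus_w
          print(f"~{seconds:.0f} s (~{seconds/60:.1f} min) to warm the loop by {delta_t_k:.0f} K")
          # ~314 s: the water buffers short bursts beautifully, but under a sustained
          # surplus the loop soaks in about five minutes -- after that, only radiator
          # capacity matters, which is the parent's point.
          ```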

    • I used to be in the same boat as you, but then I built a machine recently and decided to go for one of the AIO coolers, as the i9-9900 has been known to get quite hot when it's running in turbo mode. I really enjoy its cooling capabilities, smaller footprint, and just overall better aesthetics. It really is hassle-free.
    • But going from air to liquid cooling is a line in the sand for me.

      Why? Did you make similar complaints when going from aluminium to copper? Or when coolers started needing heatpipes to transfer heat more effectively? Liquid cooling these days is plug-and-play, with the bonus of not having to be careful selecting a cooler that doesn't hit your RAM chips or something like that.
      The additional effort is literally four screws to install a maintenance-free, assembly-free all-in-one liquid cooler.

  • by fbobraga ( 1612783 ) on Wednesday May 29, 2019 @05:50PM (#58675224) Homepage
    Mainly for gamers (I also call it "fear of AMD")
    • of x86-code-compatible ARM processors. The current gen can hang with an i5 8250U, uses 10 W instead of 15 W, and has a faster GPU.

      Not that AMD isn't squeezing them, but much as I love my RX 580, I've heard their Vega 11 drivers are kinda crap (lots of crashes) and they just lost a few driver programmers to Intel. There aren't a lot of those guys on earth, so that's gonna sting...
    • Mainly for gamers

      Well yes, the high end GPU market is absolutely dominated by these guys and it turns out computers are no longer beige boxes. What else would you suggest? Targeting corporate CTOs with gaming GPUs?

  • by cdsparrow ( 658739 ) on Wednesday May 29, 2019 @05:55PM (#58675252)

    Sure, I might care about what the stuff on my monitor or VR headset looks like, but the card itself? Form over function is what many, many BS technogadgets have given us.

    • by AHuxley ( 892839 )
      People have lights and clear sides on their computer cases.
      The look and color of everything inside the case matters, given the investment in design.
    • Oh man, found the guy who uses their computer for work. Seriously, you don't consider the look of the GPU before you buy it? Next you're going to tell me your computer isn't a glass box lit from the inside with rainbow-coloured LEDs.

      The look of GPUs is hugely important these days, especially since they are commonly mounted vertically against a glass side panel to show them off, with colour-matched cabling no less.

      Gamers have progressed beyond beige boxes, and beyond black boxes. ALL companies put a lot of effort into how their cards look.

      • No, I have a gaming system; in fact it actually lives out of sight, in many pieces. Power supply and several hard drives aren't even mounted in the case, just lying nearby...

        So yeah, I don't care what my computer looks like. Some of the BS I've bought for it does have LEDs, but I have physically disabled as many of those as possible. I had to take the power supply apart to cut the LED wires in its fan, but it was cheaper than a similar model without lights :(

  • Inspiring... (Score:4, Informative)

    by fuzzyfuzzyfungus ( 1223518 ) on Wednesday May 29, 2019 @06:02PM (#58675302) Journal
    If the purpose of a graphics card were to be looked at, rather than to draw things you actually care about looking at, this exercise might not have been completely worthless.

    As it is...
    • As it is...

      As it is, graphics cards are often selected on looks, and looks are hugely important in the gaming world. You may not have been to an exhibition recently, but there's not a single GPU card company out there that doesn't consider the look of the card important, often to the point of form over function (my previous GPU had a superfluous fan without heatsink fins under it).

      Gamers don't own black boxes. They own glass systems lit from the inside, often with careful thought put into the colour of everything.

  • I didn't see one, NOT ONE, with racing stripes and undercarriage lights. Didn't even have a spoiler, if you can believe it.

    Pfft, some designer.

  • ...and pink and blue lights flickering out of the back panel of my computer... under my desk... behind cables, UPS and wall-warts... which are telling me... what? Nothing of importance? Yeah... less of that.

  • Intel is trying to be wild and futuristic. You know what would be a wild future? One in which Intel made a graphics card with a useful level of performance.

  • Correct math functions in GLSL. https://github.com/AnalyticalG... [github.com]
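    The failure mode such projects address is easy to reproduce off-GPU by emulating shader precision with NumPy's float32. A minimal sketch; the naive reduction below is illustrative and is not code from the linked repository:

    ```python
    # Why GLSL math functions go wrong: shaders compute in float32, and a naive
    # mod-2*pi range reduction destroys a large angle before sin() is even called.
    import math
    import numpy as np

    x = 1.0e7                     # e.g. an ever-growing time value fed to sin()
    f32 = np.float32

    two_pi = f32(2.0 * math.pi)   # pi itself is already rounded to float32 here
    reduced = f32(x) - np.floor(f32(x) / two_pi) * two_pi  # naive shader-style mod

    exact = math.sin(x)               # float64 reference with proper reduction
    naive = math.sin(float(reduced))  # sin of the float32-reduced angle

    print(f"reference: {exact:+.6f}")
    print(f"naive f32: {naive:+.6f}  (error {abs(exact - naive):.2f})")
    # Near 1e7, adjacent float32 values are ~1.0 apart, so the reduced angle can
    # be off by a large fraction of a radian and the result is essentially noise.
    # Careful (extended-precision) range reduction in the shader fixes this.
    ```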
  • by belthize ( 990217 ) on Wednesday May 29, 2019 @06:45PM (#58675576)

    That's all well and good, but who decides what color they'll be?

  • When I glanced through the story, I first got the idea that it was about future graphics card performance and features: for instance, real-time ray tracing, physics processing direct to screen, multiple display support that enables multiple screens or display surfaces on a single small device, or casting to a VR headset, etc. (For instance, the Asus laptop mentioned a day or two ago here on Slashdot [ https://tech.slashdot.org/stor... [slashdot.org] ].) Upgrade models could have not just the main screen but an upper display as well.

    • by AHuxley ( 892839 )
      Re "real time ray tracing, physics processing direct to screen, multiple display support that enables multiple screens or display surfaces on a single small device, or casting to a VR headset, etc."
      With the amount and quality of staff working on the GPU project, all of that is going to be OK.

      A great new GPU project needs good design and looks too.
      Re "box could look like, for those who have a glass case and want to show off."
      That is the part Intel has to get right, given the existing brands and their support for good-looking designs.
    • No, yeah, this is just some concept art for graphics card design.

      But I think the subtext is more important here, in that Intel is possibly looking at dedicated GPUs again. They hired that AMD graphics guy a while ago, so this might be a hint at something new coming sooner or later.

  • The biggest chip fab in the world and just some new coolers is what they come up with?

    Apparently Intel believes the big determining factor in future GPU sales is what color the card's LEDs are.

    • The biggest chip fab in the world and just some new coolers is what they come up with?

      Only if you take this announcement in isolation and ignore all their others.

      Apparently Intel believes the big determining factor in future GPU sales is what color the card's LEDs are.

      Yes, they do. As does every other company on the market. GPU looks are a huge selling feature, often with more form than function the higher up the range you go, given that those higher-end cards are the ones likely to be mounted in a glass box on a vertical bracket.

  • I expect that by 2035, on-die GPUs on ARM chips will be able to do 144 Hz, fully immersive 8K VR. I don't want graphics cards 16 years from now to still look like they have built-in vacuum cleaners.
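    For scale, here is the raw arithmetic behind that wish list; per-eye resolution, bit depth and refresh rate are assumptions:

    ```python
    # Uncompressed bandwidth for 8K-per-eye, 144 Hz stereo VR.
    width, height = 7680, 4320   # assumed 8K UHD panel per eye
    eyes = 2
    bits_per_pixel = 24          # assumed 8-bit RGB, no HDR
    fps = 144

    gbit_per_s = width * height * eyes * bits_per_pixel * fps / 1e9
    print(f"~{gbit_per_s:.0f} Gbit/s uncompressed")
    # ~229 Gbit/s -- nearly 9x DisplayPort 1.4's ~26 Gbit/s payload, so stream
    # compression (and a lot of silicon) has to sit between any on-die GPU and a
    # display like that, whatever the card ends up looking like.
    ```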
  • Instead of actually thinking about revolutionizing GPU technology, they're banking on style? Looks like Intel has lost its vision.
    • Nope, looks like you don't have a clue. Style is a consideration for every GPU manufacturer selling high-end stuff. If your card doesn't have style, expect it not to do well.

      Mind you, reading one article on style, ignoring all the other articles on Intel's developments, and concluding they're "banking" on style just further reinforces that you don't have a clue.

  • by Z80a ( 971949 ) on Wednesday May 29, 2019 @10:16PM (#58676592)

    That's pretty much how video cards will look in the future.

  • Every decade or so, Intel takes another kick at the can.

    Every decade or so, they fail miserably.

    Intel is not a video card company. They don't have the internal culture to retain the talent required, and the talent they have is CPU-focused.

    This story will end up the same. Unless they acquire nVidia. That would be utterly terrible for everyone.

    • Actually, the talent they have right now is very GPU-focused. There have been many high-profile moves from both AMD and NVIDIA to Intel in the past two years. The question is, as you put it, whether they will retain that talent.

"The vast majority of successful major crimes against property are perpetrated by individuals abusing positions of trust." -- Lawrence Dalzell

Working...