Graphics Hardware Technology

Promised Platform-Independent GPU Tech Is Getting Real

Posted by kdawson
from the near-linear dept.
Vigile writes "Last year a small company called Lucid promised us GPU scaling across multiple GPU generations with near-linear performance gains without restrictions of SLI or CrossFire. The company has been silent for some time, but now it is not only ready to demonstrate the 2nd generation hardware, but also to show the first retail product that will be available with HYDRA technology. In this article there is a quick look at the MSI 'Big Bang' motherboard that sports the P55 chipset and HYDRA chip and also shows some demos of AMD HD 4890 and NVIDIA GTX 260 graphics cards working together for game rendering. Truly platform-independent GPU scaling is nearly here and the flexibility it will offer gamers could be impressive."
This discussion has been archived. No new comments can be posted.

  • If it is essentially just a load-balancer, why can't it be done in software?

    The article only mentions DirectX, no word about OpenGL, so it must not be a pure hardware solution. If all it does is re-route D3D calls, why can't the CPU do it?
    • by shentino (1139071) on Wednesday September 23, 2009 @12:18AM (#29512239)

      Crippleware is a common method of rent seeking, and copyrights, patents, and plain old obfuscation may obstruct genuine improvements.

      Case in point: old mainframes were deliberately given a "cripple-me" switch that only an expensive vendor-provided technician was authorized to switch off.

      • It ain't just the old ones. They do it remotely now.
      • by zefrer (729860)

        Uh not quite. Old mainframes were deliberately given a 'block remote access' switch that gives full control of the console to the person physically at the mainframe. That's a feature, not a cripple-me switch.

        • by shentino (1139071)

          The switch I speak of either capped the CPU speed or disabled part of the memory; I'm not sure which. But it definitely counted as crippleware.

      • by zemkai (568023)
        Ah... memories. Back in the day when I worked at Amdahl, this was called the "Mid Life Kicker"... components / performance designed and built in from the get go, but not available or activated until later and at a fee.
    • Re: (Score:3, Interesting)

      by Nyall (646782)

      That would require CPU time. Rendering a game at 60Hz not only requires a GPU that can render your imagery within 16.7ms, but also requires the software running on the CPU to issue its DirectX/OpenGL commands within 16.7ms.

    • The idea here is to improve performance. Many complex calculations need to be performed each frame to properly load balance the cards and achieve significant performance gains. If all that work were performed by the CPU, it's quite possible that the rendering process would become CPU-limited while the graphics cards sit waiting for the CPU to decide which card does what.
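A minimal sketch, in Python, of the kind of per-frame rebalancing decision described above. Lucid has not published its algorithm, so the policy here is purely hypothetical: given how long each card took on its share of the last frame, pick the split that should make both finish at the same time.

```python
def rebalance(share_a, time_a_ms, time_b_ms):
    """Return a new workload share (0..1) for GPU A, given its current
    share and the measured render times of both GPUs on the last frame.
    Hypothetical balancing policy, not Lucid's actual one."""
    rate_a = share_a / time_a_ms          # work units per ms, GPU A
    rate_b = (1.0 - share_a) / time_b_ms  # work units per ms, GPU B
    # Split the next frame proportionally to each card's throughput.
    return rate_a / (rate_a + rate_b)

# With a 50/50 split, a GPU that finishes twice as fast should be
# handed 2/3 of the next frame:
# rebalance(0.5, 10.0, 20.0) -> 0.666...
```

In practice this decision has to be made every 16.7ms alongside the game's own CPU work, which is exactly why offloading it to a dedicated chip can help.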
    • Re: (Score:2, Informative)

      by gedw99 (1597337)
      Yes, VirtualGL can do this easily. It can also attach to any window and then use GPUs on multiple machines at once. It's very stable and easy to use.
    • by jdb2 (800046) *

      If it is essentially just a load-balancer, why can't it be done in software? The article only mentions DirectX, no word about OpenGL, so it must not be a pure hardware solution. If all it does is re-route D3D calls, why can't the CPU do it?

      See my post concerning Chromium. [slashdot.org]

      jdb2

  • Lucid is offering up scaling between GPUs of any kind within a brand (only ATI with ATI, NVIDIA with NVIDIA)

    Strange... Is the difference between a 10-year-old NVIDIA card and a current NVIDIA card really smaller than the difference between a current ATI card and a current NVIDIA card?
    • Re: (Score:2, Insightful)

      by Anonymous Coward

      The cards of a brand share drivers across a few generations. So if this solution communicates with the drivers, you get the picture.

    • From the Anandtech article [anandtech.com] on the subject, it appears that multi-vendor GPU scaling has been implemented, as it was demoed with a GTX 260 and an HD 4890. The mixed-vendor implementation apparently requires Windows 7 to get the card's drivers to work properly (the article was light on details on this point), but it does work. And spare me the "M$ is teh suxorz" garbage. This is aimed at gamers, and like it or not, new games come out on Windows, not Linux, so that's where Lucid's priorities will be for the
      • by nxtw (866177)

        The mixed-vendor implementation apparently requires Windows 7 to get the card's drivers to work properly (the article was light on details on this point)

        Windows 7 includes a new version of the display driver model [wikipedia.org]. One of the new features: "Support multiple drivers in a multi-adapter, multi-monitor setup".

        In Vista, multiple adapters can only be used with Aero and the new graphics features if they all used the same driver. (The Windows XP drivers still support this, and you can still use them in Vista.)

  • by MarkRose (820682) on Wednesday September 23, 2009 @12:12AM (#29512215) Homepage
    Let me be the first to say awesome! Maybe I can finally get decent full screen Flash performance on Linux now!
    • FTA:

      The HYDRA technology also includes a unique software driver that rests between the DirectX architecture and the GPU vendor driver.

      The distribution engine as it is called is responsible for reading the information passed from the game or application to DirectX before it gets to the NVIDIA or AMD drivers.

      Looks like we'll have to keep waiting
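For what it's worth, a "driver that rests between DirectX and the vendor driver" is an API-interception shim. A toy Python sketch of the idea follows; the class, method names, and round-robin policy are all made up for illustration, since the real distribution engine is proprietary and operates on D3D call streams:

```python
class DistributionShim:
    """Toy stand-in for an interception layer: it pretends to be the
    graphics device and forwards each draw call to one of several
    backend 'GPUs' (modelled here as plain lists that record calls)."""

    def __init__(self, backends):
        self.backends = backends
        self.calls = 0

    def draw(self, mesh):
        # Trivial round-robin dispatch; a real engine would balance on
        # measured load, object cost, screen region, and so on.
        gpu = self.backends[self.calls % len(self.backends)]
        gpu.append(mesh)
        self.calls += 1

gpu0, gpu1 = [], []
shim = DistributionShim([gpu0, gpu1])
for mesh in ["terrain", "player", "sky", "ui"]:
    shim.draw(mesh)
# gpu0 == ["terrain", "sky"], gpu1 == ["player", "ui"]
```

Because the shim hooks the DirectX boundary specifically, nothing about it carries over to OpenGL or to Linux, which is the parent's point.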

    • by AmiMoJo (196126)

      Forget about performance, what about power saving? For a couple of years now we have been promised the ability to shut down a graphics card and rely just on the on-board chip for desktop use, with the card kicking in when a game is launched. No-one seems to have actually implemented it yet though.

      • Sony have done that. It required a reboot unfortunately.

        http://www.cnet.com.au/sony-vaio-vgn-sz483n-339284061.htm [cnet.com.au]

        Actually I think a better solution would be to put a PCI Express slot in a docking station and integrated graphics in the laptop. Then you could disable the integrated GFX when you dock and use discrete instead. Even better you could use a relatively cheap desktop card.

        Mind you Asus have tried that and it didn't exactly catch on

        http://www.techspot.com/news/24044-asus-introduces-xg-modular-laptop- [techspot.com]

      • by tlhIngan (30335)

        Forget about performance, what about power saving? For a couple of years now we have been promised the ability to shut down a graphics card and rely just on the on-board chip for desktop use, with the card kicking in when a game is launched. No-one seems to have actually implemented it yet though.

        I know nVidia does, and I believe ATI has similar technology. There's a weak GPU onboard the chipset, but it can then switch to the faster offboard GPU when you want the grunt (at the expense of battery life).

        Heck,

        • by AmiMoJo (196126)

          You are correct, it does exist but only seems to be used on laptops so far. I don't know of any desktop mobos that support it.

          • Of course it's only supported on laptops. Laptops are where there is a solid business case for the feature. I'm sure it will show up on desktop systems when it becomes more expensive to not do it.
    • Getting full screen flash to perform well on Linux is not the fault of your hardware, which is what this solution is. I have quite a mediocre PC running Windows, and Flash works full screen just fine.

      I also have a gaming PC with a more than capable second card not being used, but which would probably allow me the small performance boost I need to keep me from upgrading just yet. I think that I'm more the target market than you are.
      • I have an AMD Neo CPU (1.6GHz) and an on-board X1250, and fullscreen Flash works well in Debian Sid.

        • So what was the point of the OP?
          • Fullscreen flash on Linux is just one of those /. memes that gets recycled past its "best by" date. It probably still applies to some combinations of graphics hardware and driver, especially older stuff.
        • by MarkRose (820682)

          I have a GeForce 9500 GT, and a dual core Athlon 5050e, yet Flash fails abysmally at playing hidef YouTube on even a single 1680x1050 screen.

          • by Khyber (864651)

            Well, for one, Youtube HD isn't even HD at all. It's a 640x272 resolution video that's been heavily upsampled. I've fullscreened a Youtube HD video and paused it, and I've counted the blocks that represent a 'pixel.' I've already tested this on Youtube and Vimeo with my DXG 720HD camcorder. Vimeo keeps the true HD, Youtube shrinks the resolution for file size then upsamples the entire thing. Pretty easy to spot when you're using a nice 32" LCD that is only 4 feet from your face.

            That's why I just renewed my Vi

      • by PitaBred (632671)
        Just FYI, I picked up a pair of Radeon 4670's for ~$50 each open-box from Newegg and a bridge for $7, and in Crossfire they perform about like a 4850 from my benchmarking and basic testing. And they don't need a secondary power cable, either. It's a cheap, easy upgrade if you have two x8/x16 PCIe slots.
    • by JAlexoi (1085785)
      Hm... You have issues with that? Flash 10 works rather well on Linux in fullscreen mode. I have not seen any sluggishness for a long time now.
  • by prisma (1038806) on Wednesday September 23, 2009 @12:14AM (#29512221)
    Finally, we can have asynchronous GPU pairing? And cross-brands to boot? What's incredible is having heard nothing about this for so long, TFA now says a product may hit the market in the next 30 days. I take it that by sidestepping Crossfire and SLI technology, they won't have to pay any licensing fees to either AMD or NVIDIA. Hopefully the patent trolls won't be able to find any fodder that would prevent or delay commercial release.
    • Re: (Score:3, Informative)

      by ShakaUVM (157947)

      >>Finally, we can have asynchronous GPU pairing?

      I think NVIDIA has some sort of asymmetrical SLI mode available on its mobos with built-in video cards. It allows the weak built in card to help a little bit with the big video card installed in the main PCI-E slot.

      IIRC, it gives a 10% boost or so to performance.

      Ah, here it is...
      http://www.nvidia.com/object/hybrid_sli.html [nvidia.com]

      • Ah, here it is... http://www.nvidia.com/object/hybrid_sli.html [nvidia.com]

        I'm pretty sure that also got discontinued with the 9xxx generation of NVIDIA GPUs

        • A 10% boost is not really worth bothering about, to be honest. Discrete graphics is so much faster than integrated that you might as well turn off the integrated graphics completely.

          • Re: (Score:3, Insightful)

            by Trahloc (842734)
            10% isn't a big deal? There are people who go to crazy extremes just to tweak out an extra 1-3% with entire sub markets dedicated to them, so yeah 10% is worth it.
    • From TFA (emphasis mine):

      To accompany this ability to intelligently divide up the graphics workload, Lucid is offering up scaling between GPUs of any KIND within a brand (only ATI with ATI, NVIDIA with NVIDIA) and the ability to load balance GPUs based on performance and other criteria.

      So what is the deal? Is it cross-brand or not?

      Also, they are only planning to launch their chipset on one motherboard with one manufacturer. It all sounds like a short-lived gimmick to me.

  • From the article, "The HYDRA technology, as it is called, is a combination of hardware (in the form of a dedicated chip) and software (a driver that sits between the OS and DirectX)", I can't wait for this software technology to be available for GNU/Linux. But.. something tells me it will take a while, as newer ATI and NVIDIA chips cannot even do 3D using free software as of today and support seems to be years away. And yes, I know, there is some unstable proprietary binary blob available for my ATI card wh
    • Re: (Score:1, Insightful)

      by Slarty (11126)

      We live in a world where thousands of children starve to death every day, people are killed or imprisoned for expressing their beliefs, women/minorities/everybody are oppressed, and few people really care about any of it, because it's all someone else's problem. I find it kind of funny (and more than a little sad) that the use of a driver can be blithely written off as "immoral" just because you can't download the source.

      • Re: (Score:1, Insightful)

        by Anonymous Coward

        ... women(by women and some men)/men(by women)/minorities/everybody are oppressed, ...

        Fixed that for you

      • by Tom (822) on Wednesday September 23, 2009 @02:41AM (#29512777) Homepage Journal

        I hope you die young. Seriously. If we get world hunger solved, and peace eternal, people will start to complain about even less important stuff. People complain about things, it's part of human nature. Just because 500 people died in Africa today before I got out of bed doesn't mean I don't feel that particular idiot at work is a friggin' [censored].

        You can't deny people's feelings with a rational appeal to global standards.

      • by bcmm (768152) on Wednesday September 23, 2009 @04:45AM (#29513251)

        We live in a world where thousands of children starve to death every day, people are killed or imprisoned for expressing their beliefs, women/minorities/everybody are oppressed, and few people really care about any of it, because it's all someone else's problem. I find it kind of funny (and more than a little sad) that the use of a driver can be blithely written off as "immoral" just because you can't download the source.

        Some people rape children. How can you possibly think shoplifting is immoral?

        /me steals some stuff.

      • Cute, you are trying to pull a common rhetorical trick: sidestepping the issue by invoking a bigger one. I know grade schoolers like to do it:

        Kid: I don't want to do my homework, it's so stupid.
        Parent: For the last time kid, do your homework!
        Kid: You know that hurricane Jane killed 5000 people yesterday on the west coast? And you are upset about some measly homework? How can you, there are so many worse problems in the world.
        Parent: [...]

        Now who the hell modded parent up?

        • > Now who the hell modded parent up?

          /. mods suck bigtime these days. To get modded up, just write an emo post.
          • Either that, or write "This will get modded down because of [...], but" at the beginning of your post.

    • So you can't wait until approximately...20never? You won't see drivers for anything like this in Linux. You'll be lucky to get decent bog standard 3D drivers.

    • by dark_requiem (806308) on Wednesday September 23, 2009 @01:38AM (#29512553)
    "Immoral"? What, because it's proprietary? Are you serious? Get ready to throw out your whole computer, because the whole damn thing is proprietary. You don't have circuit diagrams for the CPU or GPU, you don't have firmware code, nothing. Before you start taking the "moral" high ground about proprietary components, look at what you're typing on. There's plenty of room in the world for proprietary and open source to coexist, RMS' rantings notwithstanding.
    • by Fred_A (10934)

      And yes, I know, there is some unstable proprietary binary blob available for my ATI card which can do 3D, but it is immoral to use that

      Now that you have confessed you shall say 2 our RMS and 3 hail Linus and all will be forgiven for the GNU is merciful. Go in peace, user.
      (duh)

      and it is actually so slow on 2D (which to me is more important) compared to the free "radeon" driver that it's ridiculous.

      Or you could get supported hardware from "the other company", or stop being anal about trivial issues nobody in his right mind cares about.

      Or just get a real SVGA card which is perfectly supported with completely open drivers. I hear Tseng ET3000 are a steal these days.

    • by PitaBred (632671)
      Keep an eye on the radeon development. They just pushed OpenGL 1.4 acceleration to the Radeon driver for all ATI cards (including the current r600/700 cards, the 2xxx/3xxx/4xxx series), and it's just getting better. It'll really fly once Gallium3D drops and allows GLSL and other improvements. Most distros should be including it when they get the 2.6.32 kernels shipping with them. So, Fedora 12 alphas have it running, Ubuntu 9.10 should have 3D without the KMS for the new radeons, and things in general are j
  • With their proprietary CUDA and FireStream technologies, I would think NVIDIA and AMD/ATI respectively would be able to make a daughter card that could add or increase GPU capability on their existing respective hardware, or open up 3rd party licensing to build this market segment.

    My ATI X1300 handles far more BOINC than it does games, and I have no real reason to upgrade right now. But if there was an add-on that ATI or an approved 3rd party manufacturer developed that was reasonably priced, I wouldn't he

    • There has been some noise from the big two GPU manufacturers for something similar. I don't have time atm to search for it, but I believe the article I read was on anandtech. Basically, the idea was to create a graphics card with basic functionality like 2D processing built in, but have the actual GPU chipset be user-replaceable by using a socket instead of hard soldering it to the board, so you could just plug in a new chip, and bam! instant 3D processing upgrade, without the unnecessary expense of repla
      • by RMingin (985478)

        What you're not seeing is that the PCIe card **IS** the user-replaceable GPU tech! WHY would you want/need to swap out the socketed GPU? All you're keeping by your method is the ram (which is probably slow and out of date by the time you swap GPUs) and the physical connectors, which cost roughly NOTHING. In exchange you've added a ton of connections to be loose or misconnected.

    • by PitaBred (632671)
      If you want a compatible upgrade, just check Newegg. An X1650pro [newegg.com] would do a lot more for BOINC than your current card, is supported by the exact same drivers, and only runs $54 if you do the free shipping option. A "daughterboard" or even just a new chip would require a heatsink, more power, and so on... a replacement just makes more sense, especially since the newest generations of cards are multiple times more powerful than your current one.
      • by Xin Jing (1587107)

        Thanks for the product recommendation. As you can tell, I'm not exactly operating on the bleeding edge of technology and that price range fits in nicely with my budget.

        • by PitaBred (632671)
          If you are using Windows and have a PCIe slot, you can pick up a 3xxx or 4xxx series ATI card for around the same price that will blow the 1650 out of the water. An open-box 4650 [newegg.com] is smoking fast for only $40. If you're limited by AGP, a 3450 [newegg.com] is still probably faster than the x1650, and definitely faster than your x1300. Lots of options available on a budget.
  • Truly platform-independent GPU scaling is nearly here and the flexibility it will offer gamers could be impressive.

    But this is not anywhere close enough.

  • Performance issue (Score:3, Insightful)

    by dark_requiem (806308) on Wednesday September 23, 2009 @01:22AM (#29512471)
    There are going to be some performance hits compared to native Crossfire/SLI implementations. There are three models of the Hydra 200 part, and they each differ in their PCIe lanes. The high-end model, which is going on the MSI motherboard, sports two x16 PCIe lanes from the chip to the graphics cards (configurable as 2x16, 1x16 + 2x8, or 4x8), but only a single x16 lane from the chip to the PCIe controller. So, where a good high-end Crossfire or SLI board will have two x16 PCIe lanes from the controller to the slots for the GPUs, this solution will be limited to one x16, limiting the bandwidth available to each graphics card. Exactly how much of a performance hit this incurs remains to be seen, and it probably depends on the cards being used (an older 8000-series GeForce doesn't need and won't use as much bandwidth as a GTX 295, for example), but I would expect that as GPUs grow more powerful and require more bandwidth to keep them fed and working, we will start to see performance deterioration compared to the native Crossfire and SLI implementations (although Lucid can always modify their design to keep pace).

    Incidentally, the two lower-end Hydra chips will sport, respectively, an x8 connection to the controller with two x8 connections to the cards, and an x16 connection to the controller with two x16 connections to the cards (strictly 2x16, not configurable in any other arrangement)
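For a rough sense of the numbers, using the standard per-lane figures (PCIe 1.x moves 250 MB/s per lane per direction, PCIe 2.0 moves 500 MB/s):

```python
def pcie_bandwidth_gbps(gen, lanes):
    """Peak bandwidth per direction, in GB/s, for a PCIe link.
    Per-lane figures: 250 MB/s (PCIe 1.x), 500 MB/s (PCIe 2.0)."""
    per_lane_mb = {1: 250, 2: 500}[gen]
    return per_lane_mb * lanes / 1000.0

# The single x16 PCIe 2.0 uplink everything funnels through:
# pcie_bandwidth_gbps(2, 16) -> 8.0 GB/s
# versus the aggregate of two full x16 links on a native dual-x16 board:
# 2 * pcie_bandwidth_gbps(2, 16) -> 16.0 GB/s
```

So two cards behind the Hydra chip share 8 GB/s of upstream bandwidth that a native dual-x16 board would provide twice over; whether that matters depends entirely on whether the cards can saturate it.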
    • by Tynin (634655)
      I don't think there will be a performance issue for some time, as video cards aren't even using enough bandwidth to saturate the now-older PCIe 1.1, which can do up to 4GB/s, and are obviously nowhere near touching PCIe 2.0's 8GB/s. There is a ~2% performance difference between running modern cards on 1.1 compared to 2.0, which is within the margin of error. So if you want to use this HYDRA setup with 2 cards, regardless of their speed, the x16 PCIe 2.0 line should be enough bandwidth to do the job righ
  • The distribution engine as it is called is responsible for reading the information passed from the game or application to DirectX before it gets to the NVIDIA or AMD drivers.

    So presumably it will work only in Windows, and only with DirectX games (e.g. not with OpenGL.) I'm guessing that supporting OpenGL would require a big programming effort so we won't see it soon if at all. I suspect there aren't many OpenGL games out there anyway, but I don't follow such things.

    Unless the OS market changes drastically,

  • It'll be interesting to see how much extra latency the chip adds to the rendering process. I don't imagine the hardcore gamers would be too happy about it if they sacrifice an extra 50 ms to gain some FPS.

    • Re: (Score:1, Funny)

      by Anonymous Coward

      Nonsense. Graphics beat gameplay, remember? It would follow that throughput beats latency.

  • Great, now I can have 2 buggy display drivers installed at the same time, each with their own quirks. And who helps me out when I have graphical problems in a game? Do you really think ATI or NVIDIA will give end-user support for this? What about game developer support? It is a support nightmare for all involved. No thanks. Sorry, this idea is brain-dead long before it hits the shelf.

    • by Dr. Spork (142693)
      Hey, if I could just plug a second card into my system that didn't have to match the first one, I'd be at newegg right after hitting "submit". If this gets big it will definitely sell more hardware because it will lead to more frequent upgrading and just more GPU buying. AMD and NVidia would be crazy to kill this goose.
    • by Nemyst (1383049)
      Like they give support for their cards right now. Sure, if it breaks and it's still under warranty, they'll replace it, but they rarely fix problems with game compatibility unless a majority of their users are experiencing the issue.

      If vendor support is so important for you, get a console.
  • It would make sense if they developed a spec with a common access API to the HW instead of using wrappers like OpenGL/DX on proprietary drivers. HW should expose a platform-independent API so the driver could be written by Microsoft or Apple or whoever. And by the way, why should I put two different vendors' cards in the same machine instead of using the native single-vendor solution? It is only useful when OpenGraphics comes out.
    • Re: (Score:2, Insightful)

      by Hal_Porter (817932)

      Back in the DOS days video hardware was originally a register-level standard. Then the accelerator companies all invented their own solutions to line drawing, BitBlts and so on. In DOS each programmer used VESA BIOS calls to get into high-res modes, but they had to write a driver themselves for anything more complex. Windows came along and acted like a software motherboard: application programmers wrote to a user-mode API and the graphics card manufacturers wrote drivers to a kernel-level API.

      At this poi

  • eh. SLI/Crossfire has always been a niche market. Buying 1 top-of-the-line NVIDIA or ATI card is always a stronger solution than buying two mid-level cards. So this would only make sense if you are buying 2 top-of-the-line cards, and honestly, while the charts make it look impressive, it's just that, a bragging right. You don't see human-detectable improvement in performance in most games/apps. It's a very small market. I believe that's why NVIDIA or ATI hasn't done any real development of their own p
  • First it was Nvidia, then Nvidia's control over Ageia (of PhysX chip fame)

    Now it's MSI's turn in their control over Hydra

    What this means is, there are only ATI and Intel out there seriously dabbling with graphics hardware who are not based in Taiwan!

    • Speaking of Phys-X,

      "also shows some demos of AMD HD 4890 and NVIDIA GTX 260 graphics cards working together for game rendering"

      That might cause a problem. Remember, NVIDIA disabled PhysX in their latest drivers when ATI video hardware is present (to prevent people from using cheap NVIDIA GPUs as a glorified PhysX PPU), so I hope these guys made their own "custom drivers" that work with both cards (and not just a software bridge between the two). This will eliminate that restriction as well as the need to

  • http://www.virtualgl.org/ [virtualgl.org] Will scale across many GPUs on the same board or across the world.
  • last time i heard of
    such magical product
    it was april fool

  • A software-based solution to the problem of aggregating a heterogeneous collection of parallel OpenGL command streams into one, compositing the output of several graphics cards into one image, or both, has been available for years: it's called Chromium [sourceforge.net].

    Although originally designed for a networked cluster with one gpu per machine, it can conceivably be adapted to one machine with multiple GPUs. Because Chromium's software based compositing would bog down a single processor system, a natural extension wo
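The screen-space ("sort-first") decomposition that Chromium supports can be sketched roughly as follows: each GPU owns a band of the viewport, and the rendered bands are stitched back into one image. This toy Python version has each render function fill pixels directly, whereas real Chromium filters the OpenGL command stream per tile; everything here is illustrative, not Chromium's API.

```python
def sort_first_composite(width, height, render_fns):
    """Split the viewport into horizontal bands, one per 'GPU' (a render
    function), then stitch the rendered bands back together.  Each render
    function takes (y0, y1, width) and returns rows y0..y1-1 of the image."""
    n = len(render_fns)
    band = height // n
    image = []
    for i, render in enumerate(render_fns):
        y0 = i * band
        y1 = height if i == n - 1 else y0 + band  # last band takes any remainder
        image.extend(render(y0, y1, width))
    return image

# Two 'GPUs' that each paint their band a solid colour:
red = lambda y0, y1, w: [["red"] * w for _ in range(y0, y1)]
blue = lambda y0, y1, w: [["blue"] * w for _ in range(y0, y1)]
frame = sort_first_composite(4, 6, [red, blue])
# frame has 6 rows: the top three red, the bottom three blue
```

The compositing step is where a single CPU bogs down, which is the motivation for doing it in dedicated hardware.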
  • Would this technology enable me to use the onboard IGP (Radeon HD3300), which is now doing absolutely nothing as I am using a separate Radeon HD3850?

  • Interesting. But let me guess: it's only compatible with Windows.
