Promised Platform-Independent GPU Tech Is Getting Real 102
Vigile writes "Last year a small company called Lucid promised us GPU scaling across multiple GPU generations with near-linear performance gains without restrictions of SLI or CrossFire. The company has been silent for some time, but now it is not only ready to demonstrate the 2nd generation hardware, but also to show the first retail product that will be available with HYDRA technology. In this article there is a quick look at the MSI 'Big Bang' motherboard that sports the P55 chipset and HYDRA chip and also shows some demos of AMD HD 4890 and NVIDIA GTX 260 graphics cards working together for game rendering. Truly platform-independent GPU scaling is nearly here and the flexibility it will offer gamers could be impressive."
can it be done in software? (Score:2, Insightful)
The article only mentions DirectX, with no word about OpenGL, so it can't be a purely hardware solution. If all it does is re-route D3D calls, why can't the CPU do it?
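In principle a software-only shim could sit between the application and the vendor drivers and do the routing itself. A toy sketch of such a call-forwarding interposer (all names hypothetical; a real D3D shim would be a proxy DLL exporting the same C API as d3d9.dll, not Python):

```python
# Sketch of a software interposer that re-routes draw calls between two
# back-end devices (stand-ins for two vendor drivers/GPUs). Everything
# here is illustrative, not Lucid's actual driver.

class BackendDevice:
    """Stand-in for one vendor driver/GPU."""
    def __init__(self, name):
        self.name = name
        self.calls = []

    def draw(self, batch):
        self.calls.append(batch)  # a real driver would render here

class Interposer:
    """Routes each draw batch to the least-loaded backend."""
    def __init__(self, backends):
        self.backends = backends

    def draw(self, batch):
        target = min(self.backends, key=lambda b: len(b.calls))
        target.draw(batch)

gpus = [BackendDevice("NVIDIA"), BackendDevice("AMD")]
shim = Interposer(gpus)
for i in range(10):
    shim.draw(f"batch-{i}")

print([len(g.calls) for g in gpus])  # -> [5, 5]
```

The catch, of course, is that inspecting and redistributing every call costs CPU time on the same core budget the game is already using.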
Re:can it be done in software? (Score:4, Insightful)
Crippleware is a common method of rent seeking, and copyrights, patents, and plain old obfuscation may obstruct genuine improvements.
Case in point: old mainframes were deliberately given a "cripple-me" switch that only an expensive vendor-provided technician was authorized to switch off.
Re: (Score:2)
Re: (Score:1)
Uh, not quite. Old mainframes were deliberately given a 'block remote access' switch that gives full control of the console to the person physically at the mainframe. That's a feature, not a cripple-me switch.
Re: (Score:2)
The switch I speak of either capped the CPU speed or disabled part of the memory; I'm not sure which. But it definitely counted as crippleware.
Re: (Score:1)
Re: (Score:3, Interesting)
That would require CPU time. Rendering a game at 60Hz not only requires a GPU that can render your imagery within 16.7ms, it also requires the software running on the CPU to issue its DirectX/OpenGL commands within 16.7ms.
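The arithmetic behind that 16.7ms figure: at 60 Hz each frame gets 1000/60 ms, and CPU-side command submission has to fit the same window as GPU rendering. A quick sketch (the per-frame timings are hypothetical):

```python
# Frame-budget arithmetic: at a target refresh rate, CPU submission time
# and GPU render time must each fit the per-frame window (they overlap
# in a pipelined renderer, but each must individually stay under budget).

def frame_budget_ms(hz):
    return 1000.0 / hz

budget = frame_budget_ms(60)
print(round(budget, 1))  # -> 16.7

# Hypothetical per-frame costs in milliseconds:
cpu_submit_ms = 12.0   # time to issue DirectX/OpenGL commands
gpu_render_ms = 15.0   # time for the GPU(s) to render

print(cpu_submit_ms <= budget and gpu_render_ms <= budget)  # -> True
```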
Re: (Score:2)
Re: (Score:2, Informative)
Re: (Score:2)
If it is essentially just a load-balancer, why can't it be done in software? The article only mentions DirectX, with no word about OpenGL, so it can't be a purely hardware solution. If all it does is re-route D3D calls, why can't the CPU do it?
See my post concerning Chromium. [slashdot.org]
jdb2
only ATI with ATI, NVIDIA with NVIDIA... (Score:1)
Strange... Is the difference between a 10-year-old NVIDIA card and a current NVIDIA card really smaller than the difference between a current ATI card and a current NVIDIA card?
Re: (Score:2, Insightful)
The cards of a brand share drivers across a few generations. So if this solution communicates with the drivers, you get the picture.
Re: (Score:2)
Re: (Score:2)
Windows 7 includes a new version of the display driver model [wikipedia.org]. One of the new features: "Support multiple drivers in a multi-adapter, multi-monitor setup".
In Vista, multiple adapters can only be used with Aero and the new graphics features if they all use the same driver. (The Windows XP drivers still support this, and you can still use them in Vista.)
Is it useful? (Score:5, Funny)
e om
Re: (Score:1)
The HYDRA technology also includes a unique software driver that rests between the DirectX architecture and the GPU vendor driver.
The distribution engine as it is called is responsible for reading the information passed from the game or application to DirectX before it gets to the NVIDIA or AMD drivers.
Looks like we'll have to keep waiting
Re: (Score:2)
Forget about performance, what about power saving? For a couple of years now we have been promised the ability to shut down a graphics card and rely just on the on-board chip for desktop use, with the card kicking in when a game is launched. No-one seems to have actually implemented it yet though.
Re: (Score:1)
Sony have done that. Unfortunately, it required a reboot.
http://www.cnet.com.au/sony-vaio-vgn-sz483n-339284061.htm [cnet.com.au]
Actually, I think a better solution would be to put a PCI Express slot in a docking station and integrated graphics in the laptop. Then you could disable the integrated GFX when you dock and use discrete graphics instead. Even better, you could use a relatively cheap desktop card.
Mind you, Asus have tried that and it didn't exactly catch on:
http://www.techspot.com/news/24044-asus-introduces-xg-modular-laptop- [techspot.com]
Re: (Score:2)
I know nVidia does, and I believe ATI has similar technology. There's a weak GPU onboard the chipset, but it can then switch to the faster offboard GPU when you want the grunt (at the expense of battery life).
Heck,
Re: (Score:2)
You are correct, it does exist but only seems to be used on laptops so far. I don't know of any desktop mobos that support it.
Re: (Score:2)
Re: (Score:2)
I also have a gaming PC with a more than capable second card not being used, but which would probably allow me the small performance boost I need to keep me from upgrading just yet. I think that I'm more the target market than you are.
Re: (Score:2)
I have an AMD Neo CPU (1.6GHz) and an on-board X1250, and fullscreen Flash works well in Debian Sid.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
I have a GeForce 9500 GT, and a dual core Athlon 5050e, yet Flash fails abysmally at playing hidef YouTube on even a single 1680x1050 screen.
Re: (Score:2)
Well, for one, YouTube HD isn't even HD at all. It's a 640x272 video that's been heavily upsampled. I've fullscreened a YouTube HD video, paused it, and counted the blocks that represent a 'pixel.' I've already tested this on YouTube and Vimeo with my DXG 720HD camcorder. Vimeo keeps the true HD; YouTube shrinks the resolution for file size then upsamples the entire thing. Pretty easy to spot when you're using a nice 32" LCD that is only 4 feet from your face.
That's why I just renewed my Vi
Re: (Score:2)
Re: (Score:2)
Suddenly, to be released to market in 30 days (Score:5, Interesting)
Re: (Score:3, Informative)
>>Finally, we can have asynchronous GPU pairing?
I think NVIDIA has some sort of asymmetrical SLI mode available on its mobos with built-in video cards. It allows the weak built-in card to help a little bit with the big video card installed in the main PCI-E slot.
IIRC, it gives a 10% boost or so to performance.
Ah, here it is...
http://www.nvidia.com/object/hybrid_sli.html [nvidia.com]
Re: (Score:1)
Ah, here it is... http://www.nvidia.com/object/hybrid_sli.html [nvidia.com]
I'm pretty sure that also got discontinued with the 9xxx generation of nVidia GPUs.
Re: (Score:1)
A 10% boost is not really worth bothering about, to be honest. Discrete graphics is so much faster than integrated that you might as well turn off the integrated graphics completely.
Re: (Score:3, Insightful)
Cross brand or not? I'm confused (Score:2)
From TFA (emphasis mine):
To accompany this ability to intelligently divide up the graphics workload, Lucid is offering up scaling between GPUs of any KIND within a brand (only ATI with ATI, NVIDIA with NVIDIA) and the ability to load balance GPUs based on performance and other criteria.
So what is the deal? Is it cross-brand or not?
Also, they are only planning to launch their chipset on one motherboard with one manufacturer. It all sounds like a short-lived gimmick to me.
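TFA's claim of load balancing "based on performance and other criteria" presumably means splitting work in proportion to each GPU's speed. A minimal sketch of a proportional split (the 3:2 speed ratings are made up for illustration):

```python
# Sketch of performance-weighted load balancing: divide N work units
# between GPUs in proportion to their relative speed ratings.

def split_work(total_units, speeds):
    """Return units assigned to each GPU, proportional to speed."""
    total_speed = sum(speeds)
    shares = [total_units * s // total_speed for s in speeds]
    # hand any rounding remainder to the fastest GPU
    shares[speeds.index(max(speeds))] += total_units - sum(shares)
    return shares

# Hypothetical 3:2 speed ratio between a fast and a slower card:
print(split_work(100, [3, 2]))  # -> [60, 40]
```

The hard part Lucid claims to solve isn't this arithmetic, of course; it's carving a D3D command stream into pieces that can actually be rendered independently.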
Great, can't wait until there's a Linux driver (Score:2)
Re: (Score:1, Insightful)
We live in a world where thousands of children starve to death every day, people are killed or imprisoned for expressing their beliefs, women/minorities/everybody are oppressed, and few people really care about any of it, because it's all someone else's problem. I find it kind of funny (and more than a little sad) that the use of a driver can be blithely written off as "immoral" just because you can't download the source.
Re: (Score:1, Insightful)
... women(by women and some men)/men(by women)/minorities/everybody are oppressed, ...
Fixed that for you
Re: (Score:2, Insightful)
Certainly. One is important, the other is trivial nerd rage bullshit.
Re:Great, can't wait until there's a Linux driver (Score:4, Insightful)
I hope you die young. Seriously. If we get world hunger solved, and peace eternal, people will start to complain about even less important stuff. People complain about things, it's part of human nature. Just because 500 people died in Africa today before I got out of bed doesn't mean I don't feel that particular idiot at work is a friggin' [censored].
You can't deny people's feelings with a rational appeal to global standards.
Re:Great, can't wait until there's a Linux driver (Score:5, Insightful)
Some people rape children. How can you possibly think shoplifting is immoral?
/me steals some stuff.
Re: (Score:2)
Cute, you are trying to pull a common rhetorical trick: sidestepping the issue by invoking a bigger one. Grade schoolers like to do it:
Kid: I don't want to do my homework, it's so stupid.
Parent: For the last time kid, do your homework!
Kid: You know that hurricane Jane killed 5000 people yesterday on the west coast? And you're upset about some measly homework? How can you be, when there are so many worse problems in the world?
Parent: [...]
Now who the hell modded parent up?
Re: (Score:1)
Re: (Score:1)
Either that, or write "This will get modded down because of [...], but" at the beginning of your post.
Re: (Score:2)
So you can't wait until approximately...20never? You won't see drivers for anything like this in Linux. You'll be lucky to get decent bog standard 3D drivers.
Re:Great, can't wait until there's a Linux driver (Score:5, Insightful)
Re: (Score:2, Funny)
It's a slippery slope though
1) You can't read the source code to your graphics driver.
2) ???
3) You are being herded into a gas chamber.
Re: (Score:1)
On second thoughts, I'm not going to go there.
Re: (Score:1)
Re: (Score:1)
I was on a slippery slope once. I got mud all over my pants.
Re: (Score:2)
And yes, I know, there is some unstable proprietary binary blob available for my ATI card which can do 3D, but it is immoral to use that
Now that you have confessed, you shall say two Our RMSes and three Hail Linuses, and all will be forgiven, for the GNU is merciful. Go in peace, user.
(duh)
and it is actually so slow on 2D (which to me is more important) compared to the free "radeon" driver that it's ridiculous.
Or you could get supported hardware from "the other company", or stop being anal about trivial issues nobody in his right mind cares about.
Or just get a real SVGA card which is perfectly supported with completely open drivers. I hear Tseng ET3000s are a steal these days.
Re: (Score:2)
Re: (Score:2)
The Z buffer shouldn't be an issue. Just render objects on each GPU (pretend they're different scenes), and then merge the images using their Z buffers as a key. You only need to exchange the Z buffer once for the final merge. It doesn't matter that objects will be drawn that would normally be hidden under objects drawn by the other GPU, because the right pixels will be chosen during the final merge based on the Z buffers.
There are certainly questions to be answered, but the fundamental idea of rendering di
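The merge described above can be sketched per pixel: each GPU produces a color buffer and a Z buffer, and the compositor keeps whichever sample is nearer. A toy version, with plain lists standing in for framebuffers:

```python
# Toy depth-compositing merge: two GPUs each render disjoint object sets
# into their own color and Z buffers; the final image takes, per pixel,
# the color from whichever buffer has the smaller (nearer) depth.

INF = float("inf")  # "no geometry here" depth

def z_merge(color_a, z_a, color_b, z_b):
    return [
        (ca if za <= zb else cb, min(za, zb))
        for (ca, za, cb, zb) in zip(color_a, z_a, color_b, z_b)
    ]

# 4-pixel example: GPU A drew pixels 0-1, GPU B drew pixels 1-3;
# at pixel 1 both drew something, and B's fragment is nearer.
color_a, z_a = ["red", "red", None, None], [1.0, 5.0, INF, INF]
color_b, z_b = [None, "blue", "blue", "blue"], [INF, 2.0, 3.0, 4.0]

merged = z_merge(color_a, z_a, color_b, z_b)
print([c for c, _ in merged])  # -> ['red', 'blue', 'blue', 'blue']
```

This glosses over transparency (alpha-blended fragments can't be merged by depth alone), which is one of those "questions to be answered."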
Re: (Score:1)
I think you could do tile-based rendering. If you look at the Larrabee paper by Intel, they managed to get very impressive scaling by doing this.
http://en.wikipedia.org/wiki/Larrabee_(GPU)#Preliminary_performance_data [wikipedia.org]
Of course you have to wonder about Larrabee. If it's as good as this, why haven't Intel launched it as a hybrid CPU/GPU? I think it's one of those ideas, like Itanium, which are great at the academic-paper level but seriously flawed in terms of real-world performance - i.e. there are a couple of use
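Tile-based splitting just partitions the screen into small tiles and deals them out across processors. A minimal sketch of the screen-space partition (round-robin assignment, hypothetical sizes; Larrabee additionally bins triangles per tile, which this omits):

```python
# Sketch of tile-based work division: carve the screen into fixed-size
# tiles and assign them round-robin across GPUs.

def assign_tiles(width, height, tile, n_gpus):
    assignment = {}  # (x, y) tile origin -> GPU index
    i = 0
    for y in range(0, height, tile):
        for x in range(0, width, tile):
            assignment[(x, y)] = i % n_gpus
            i += 1
    return assignment

a = assign_tiles(256, 128, 64, 2)   # 4x2 grid of 64-px tiles, 2 GPUs
counts = [list(a.values()).count(g) for g in range(2)]
print(counts)  # -> [4, 4]
```

Scaling is good because tiles are independent; the cost is an extra binning pass to figure out which triangles touch which tiles.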
Add-On GPU Daughterboard Hardware... (Score:1)
With their proprietary CUDA and FireStream technologies, I would think NVIDIA and AMD/ATI respectively would be able to make a daughter card that could add or increase GPU capability on their existing respective hardware, or open up 3rd-party licensing to build this market segment.
My ATI X1300 handles far more BOINC than it does games, and I have no real reason to upgrade right now. But if there was an add-on that ATI or an approved 3rd party manufacturer developed that was reasonably priced, I wouldn't he
Re: (Score:2)
Re: (Score:2)
What you're not seeing is that the PCIe card **IS** the user-replaceable GPU tech! WHY would you want/need to swap out the socketed GPU? All you're keeping by your method is the ram (which is probably slow and out of date by the time you swap GPUs) and the physical connectors, which cost roughly NOTHING. In exchange you've added a ton of connections to be loose or misconnected.
Re: (Score:2)
Re: (Score:1)
Thanks for the product recommendation. As you can tell, I'm not exactly operating on the bleeding edge of technology and that price range fits in nicely with my budget.
Re: (Score:2)
NOT Platform-independent (Score:1)
Truly platform-independent GPU scaling is nearly here and the flexibility it will offer gamers could be impressive.
But this is not anywhere close to that.
Performance issue (Score:3, Insightful)
Incidentally, of the two lower-end HYDRA chips, one will sport a x8 connection to the controller and two x8 connections to the cards, and the other a x16 connection to the controller and two x16 connections to the cards (strictly 2x16, not configurable in any other arrangement)
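For scale, a quick bandwidth sketch (assuming PCIe 2.0 signaling at roughly 500 MB/s per lane per direction, the standard of the P55 era):

```python
# Rough PCIe link-bandwidth arithmetic, assuming PCIe 2.0 at ~500 MB/s
# per lane per direction.
MB_PER_LANE = 500

for lanes in (8, 16):
    print(f"x{lanes}: {lanes * MB_PER_LANE / 1000} GB/s")
# -> x8: 4.0 GB/s
# -> x16: 8.0 GB/s
```

So the uplink to the controller, not the per-card links, is the pinch point on the lower-end parts.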
Re: (Score:1)
DirectX/Windows only (Score:2)
The distribution engine as it is called is responsible for reading the information passed from the game or application to DirectX before it gets to the NVIDIA or AMD drivers.
So presumably it will work only in Windows, and only with DirectX games (i.e. not with OpenGL). I'm guessing that supporting OpenGL would require a big programming effort, so we won't see it soon, if at all. I suspect there aren't many OpenGL games out there anyway, but I don't follow such things.
Unless the OS market changes drastically,
Latency (Score:2)
It'll be interesting to see how much extra latency the chip adds to the rendering process. I don't imagine the hardcore gamers would be too happy about it if they sacrifice an extra 50 ms to gain some FPS.
Re: (Score:1, Funny)
Nonsense. Graphics beat gameplay, remember? It would follow that throughput beats latency.
2 display drivers??? Come on... (Score:2, Insightful)
Great, now I can have 2 buggy display drivers installed at the same time, each with its own quirks. And who helps me out when I have graphical problems in a game? Do you really think ATI or NVIDIA will give end-user support for this? What about game developer support? It is a support nightmare for all involved. No thanks. Sorry, this idea is brain-dead long before it hits the shelf.
Re: (Score:2)
Re: (Score:2)
If vendor support is so important for you, get a console.
Useless tech (Score:1)
Re: (Score:2, Insightful)
Back in the DOS days, video hardware was originally a register-level standard. Then the accelerator companies all invented their own solutions to line drawing, BitBlts and so on. In DOS, each programmer used VESA BIOS calls to get into high-res modes, but they had to write a driver themselves for anything more complex. Windows came along and acted like a software motherboard: application programmers wrote to a user-mode API and the graphics card manufacturers wrote drivers to a kernel-level API.
At this poi
SLI/crossfire is a niche market (Score:1)
Look out for Taiwan! (Score:1)
First it was Nvidia, then Nvidia's control over Ageia (of PhysX chip fame).
Now it's MSI's turn with its control over Hydra.
What this means is, ATI and Intel are the only ones out there seriously dabbling in graphics hardware who are not based in Taiwan!
Re: (Score:1)
Speaking of Phys-X,
"also shows some demos of AMD HD 4890 and NVIDIA GTX 260 graphics cards working together for game rendering"
That might cause a problem. Remember, nVidia disabled PhysX in their latest drivers when ATI video hardware is present (to prevent people from using cheap nVidia GPUs as glorified PhysX PPUs), so I hope these guys made their own "custom drivers" that work with both cards (and not just a software bridge between the two). This will eliminate that restriction as well as the need to
VirtualGL does this NOW - http://www.virtualgl.org (Score:1)
wait and see (Score:2)
last time i heard of
such magical product
it was april fool
Hardware based ripoff of Chromium (Score:2)
Although originally designed for a networked cluster with one GPU per machine, it can conceivably be adapted to one machine with multiple GPUs. Because Chromium's software-based compositing would bog down a single-processor system, a natural extension wo
Hybrid Crossfire? (Score:2)
Would this technology enable me to use the onboard IGP (Radeon HD3300), which
is now doing absolutely nothing, as I am using a separate Radeon HD3850?
Interesting. (Score:1)
Interesting. But let me guess: it's only compatible with Windows.
Re:fuck you all (Score:4, Funny)
I know I speak anonymously when I say that this Anonymous Coward does not speak for the rest of us Anonymous Cowards.
So, parent poster, gargle our collective balls.
Why only on DirectX ? (Score:1)
Article after article I've read about Hydra says that it comes in both HW and SW parts, with the SW sitting between DirectX and the OS.
My point is: why only DirectX?
Wouldn't this be selling itself short, and leaving it dependent on Microsoft?
I know that in the world we live in now DirectX pwns, but if we don't offer others a chance, how are they gonna be popular enough to rival DirectX?
Re: (Score:1)
Re: (Score:2)
Yes? No? Yes? No!?
Dude it's about parallel processing, not quantum computers.