Tackling AGP 8X
EconolineCrush writes "AGP 8X is popping up in new chipsets and motherboards, and graphics cards are also starting to support the standard, but is there a major performance advantage over the older AGP 4X spec? According to this review of NVIDIA's latest AGP 8X-enabled graphics products, no. The review also covers some of AGP 8X's new functionality, which includes support for multiple AGP ports with multiple AGP devices per port. Whether future games and applications take advantage of AGP 8X's extra bandwidth remains to be seen, but more interesting should be what companies do with multiple AGP devices and ports."
lack of performance (Score:4, Funny)
Re:lack of performance (Score:3, Interesting)
Re:lack of performance (Score:3, Informative)
I have this card, and it works rather well. I guess the GeForce2 is getting kind of dated, but it works well w/ my 1.2GHz Athlon. Couple this w/ a PCI card (dual-head or not) or two, and you should be able to have as many monitors as you want.
Re:lack of performance (Score:2)
Only two heads????
So I have to put the other monitors on cheesy PCI cards, which are getting harder to find, especially if you want something with decent power.
Draw! (Score:5, Funny)
Oh, I get it!!! You meant Dual-Head!
Nevermind. My bad.
Re:Draw! (Score:2, Funny)
I wish you guys would stop doing this stuff to me when I have a low blood-caffeine level.
"Oh, I get it!!! You meant Dual-Head! "
Cut that out.
Re:lack of performance (Score:2)
AGP8 (Score:5, Insightful)
What (cool thing) could you do w/multiple devices (Score:2, Interesting)
Re:What (cool thing) could you do w/multiple devic (Score:3, Insightful)
Re:What (cool thing) could you do w/multiple devic (Score:2)
I did a similar thing with 2 computers and msvcmon. It only works for visual studio, but if that's what you want, then you can basically debug 3d games without worrying about focus issues or trying to catch transient visual problems. If you use something else, there may well be a similar tool that does the same thing.
Re:What (cool thing) could you do w/multiple devic (Score:2, Insightful)
Two AGP slots would be nice
--Jeremy
Why you sometimes can't "just switch shells" (Score:2)
But what's wrong with debugging with Emacs and just switching shells between the 3d game being debugged and Emacs?
In some environments (such as Microsoft Windows), a shell switch between the GDI windowed environment and a fullscreen environment forces the video card to clear its memory entirely. This can mask the very problem you're trying to isolate in the debugger.
Re:What (cool thing) could you do w/multiple devic (Score:2)
Re:What (cool thing) could you do w/multiple devic (Score:2, Interesting)
An ENTIRE 20 inch monitor just for email, and another one for just a web browser?
I won't claim to know what you're actually doing, but that sounds really wasteful....
Re:What (cool thing) could you do w/multiple devic (Score:5, Funny)
Or maybe he's just an American; he probably drives an SUV too.
Re:What (cool thing) could you do w/multiple devic (Score:2)
Re:What (cool thing) could you do w/multiple devic (Score:2)
Large Fonts (Score:3, Insightful)
Exactly how much email do you get that you need to dedicate a 20 inch monitor to it? Are you a spammer?
Exactly how do you jump from "uses a physically large display" to "sends an excessive amount of electronic mail"? For all we know, 4444444 could have vision problems and be running a 20-inch display either at a pixel count that most of us would associate with a 14-inch display or with the Mac equivalent of Windows's "Large Fonts".
Re:What (cool thing) could you do w/multiple devic (Score:5, Informative)
At work I leave Outlook and Visual Studio open on the left screen all the time, and an Internet Exploder window open on the right screen. That way, when I make changes in VS on the left, I can instantly refresh the IE window on the right without all the toggling-back crap.
I also used to do reports and presentations. Having dual monitors allowed me to have Excel/Access/whatever source program open on the left and PowerPoint on the right. I could drag a chart from Excel full size and drop it into PowerPoint without having to do cut/alt-tab/switch window/paste. Much easier, and it gives WYSIWYG some credence to its name.
I am running dual monitors on an NT4 box with two Matrox Millennium PCIs (I've had dual monitors on that one for four or five years now, I think). My other box has a Matrox G450 AGP and a Matrox Millennium PCI for dual capability (W2K).
IMHO, Matrox makes the best multi-display drivers/cards at a reasonable price, and has offered them for quite a while longer than the others. They have a quad-output card too, but it costs a bit more than the duals.
ngoy
how useful (Score:3, Funny)
Re:What (cool thing) could you do w/multiple devic (Score:3, Interesting)
Most of these don't require the added bandwidth of the AGP, though, but then again, few things do - CAD/CAM might, and games, of course. Which leads to another possible use for multiple AGPs.
However, even though multi-device gaming has been possible for a long time and has even been pimped by the graphics chipset industry recently, it never really took off.
Re:What (cool thing) could you do w/multiple devic (Score:5, Interesting)
I'm currently using a six-monitor configuration for music production. I have Sonar spread over four 19" monitors and I use two 17" monitors to display virtual instruments/effects and the MOTU console.
3D isn't a factor on this machine, but it's tricky to get three (one AGP, two PCI) dual-head displays to work side by side correctly.
Two AGP slots would permit me to use just two Parhelia (or competitors'--once they jump on the triple-head bandwagon) cards and free up PCI slots for more useful things like DSP cards.
Then, too, a configuration like that would make for a breathtaking multi-monitor gaming experience!
Re:What (cool thing) could you do w/multiple devic (Score:3, Insightful)
What I want is something that I can easily control from my MIDI instrument(s), not some useless knob on the screen to fiddle with. Alternatively, keyboard/mouse control would be useful, but turning knobs on the screen is completely useless... and they shouldn't use up screen space.
Re:What (cool thing) could you do w/multiple devic (Score:2, Interesting)
1. my current AIW 128 Pro, just to use the TV-tuner.
2. a Radeon 8500, without the TV tuner.
But unfortunately, my system doesn't allow two AGP cards.
Dave
Re:What (cool thing) could you do w/multiple devic (Score:2)
Re:What (cool thing) could you do w/multiple devic (Score:2)
In more practical terms, it makes a lot of things easier. I can, in a GUI environment, have Emacs or VIM running on the two different monitors, and if I have each session of the respective editor displaying two different files, that's four files I can be editing at the same time without having to flip through a bunch of desktops or windows sitting on top of each other, which can get pretty old pretty fast. Many times I just want to be able to look at a piece of code really quickly, instead of having to switch through a whole bunch of windows sitting on top of each other.
Re:What (cool thing) could you do w/multiple devic (Score:3, Interesting)
Nothing new here (Score:3, Funny)
Re:Nothing new here (Score:2, Interesting)
3GIO is a version of PCI that is electrically similar to InfiniBand, and it supposedly shares protocol elements with InfiniBand too. So far only 266MBps links have been discussed, but that can scale to 1 and 3GBps per port. Each port is switched like an Ethernet switch, giving each device its own 266MBps+ of bandwidth. AGP 8X is fast, but who knows what tomorrow's graphics cards will bring and what games will tax them.
Re:Nothing new here (Score:3, Informative)
As the silicon technology improves, the maximum speed of a lane will increase to 10Gb/s, for a total of 320Gb/s in the widest implementation, which is about 32GB/s of actual data once the 8b/10b encoding overhead is taken out.
Now, that's a lot of bandwidth.
Re:Nothing new here (Score:2)
PCI Express is what Intel has been pushing for a while and what will become the standard in mid-2004. It's also known as 3GIO.
Hmm (Score:4, Interesting)
Re:Hmm (Score:5, Informative)
The bus is very one-sided. It gives 2.1GB/s to the card, but nothing particularly special on the way back. After all, the intent was to let the graphics card stream textures out of main memory.
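For reference, the 2.1GB/s figure is just bus arithmetic: AGP is a 32-bit (4-byte) port clocked at 66MHz, and 8X signalling moves eight transfers per clock, so 4 bytes x 66MHz x 8 ≈ 2.1GB/s. AGP 4X is half that, about 1.06GB/s, and plain AGP 1x is roughly 266MB/s.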
Re:Hmm (Score:4, Insightful)
Not to say that you *couldn't* have an AGP disk controller. But I doubt the performance improvement would be sufficient to justify the hassle and the lost AGP slot.
PCI-X is starting to come close to the lower AGP speeds in performance, and is a much cleaner and more general standard.
-John
Re:Hmm (Score:3, Informative)
Two of my desktops have PCI-X (not available on normal desktop boards, only workstation boards) and it is great. PCI-X Gigabit networking and Fibre Channel. Very fast.
AGP wouldn't be as good as PCI-X is. It may have the data rate but the protocol is designed for a graphics card. You could put other cards on it but PCI-X/PCI is a much better choice! (To note: PCI66/64 will give you 0.515GB/s which is a really high data rate for a desktop system to sustain.)
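A quick sanity check on that parenthetical figure: 64 bits is 8 bytes per transfer, and 8 bytes x 66MHz = 528MB/s, which is where the quoted 0.515GB/s (528/1024) comes from.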
Re:Hmm (Score:3, Interesting)
1. AGP is unidirectional. The plugin card is always the bus master and the chipset is the slave. If the chipset wants to initiate a transaction, it has to do good ol' PCI.
2. No error correction/detection. AGP doesn't use parity/ECC because a flipped bit here and there in video data isn't that important. This could be very bad for more sensitive devices.
3. Only one device per bus. AGP is point-to-point.
Re:Hmm (Score:2)
Re:Hmm (Score:2)
Re:Hmm (Score:3, Informative)
AGP and PCI do not share bandwidth at all.
That is the whole point of AGP--to be a port where the I/O of other devices would not affect the performance of the video I/O.
Additionally, if the AGP port shared bandwidth with anything, it wouldn't be a port, it would be a bus.
Re:Hmm (Score:2)
Re:Why? (Score:2)
A Quick Commentary (Score:5, Interesting)
If multiple AGP is available for 8X then it's probably the greatest improvement possible. I ran two monitors at work, then got hooked. Now it's almost impossible for me to use one monitor. The problem is that you can't get multiple AGP slots as of now, so you have to use a crappy PCI card.
This will also be awesome for gaming! I can't wait until I can get a dual-AGP setup. I bet if they start making dual-AGP mobos then dual monitors will become very common.
The End.
Re:A Quick Commentary (Score:2)
What? (Score:2, Interesting)
I just checked, and the Radeon 8500 and 9700 both do the same thing.
Re:What? (Score:2)
Re:A Quick Commentary (Score:4, Informative)
three things:
1) Dual head AGP cards already exist, Matrox even has a triple head [matrox.com] AGP card.
2) What's wrong with PCI cards? If you use it for work (like you said in the first part of your comment), I don't see what's wrong with it. I'm using one AGP and one PCI right now and I'm happy with the way it is. Usually I use my main monitor, which has a higher resolution, for coding, while my second screen is cluttered with IRC, IM and online documentation.
3) I don't think dual-AGP-slotted mobos will become standard real soon: people have lots of PCI slots and that didn't encourage them to go dual/triple/... screen. I rather think that dual AGP will remain something for techies, geeks and professionals.
And remember kids: the more monitors you have, the larger your penis is!
Re:A Quick Commentary (Score:2)
snip from the xfree.log
(--) PCI: (0:12:0) BrookTree unknown chipset (0x036e) rev 2, Mem @ 0xea002000/12
(--) PCI:*(1:0:0) ATI unknown chipset (0x4e44) rev 0, Mem @ 0xd8000000/27, 0xe9000000/16, I/O @ 0xc000/8
(--) PCI: (1:0:1) ATI unknown chipset (0x4e64) rev 0, Mem @ 0xe0000000/27, 0xe9010000/16
Ok, now I'm confused. (Score:2, Funny)
[Looks up to see a single 15 inch]
[Looks down to see another 15 inches]
Yup. Now I'm confused.
Get a dual head Geforce (Score:2)
They aren't hard to find. I got this GeForce4 MX 460 two weeks ago... it has dual SVGA out, composite and S-Video out, AND composite and S-Video in. Only 130 dollars, and it runs UT2k3 like a charm :)
Re:A Quick Commentary (Score:2)
Re:A Quick Commentary (Score:2)
Confused (Score:4, Interesting)
which includes support for multiple AGP ports with multiple AGP devices per port.
I can't figure out why this would be good (this is not a troll, I just can't figure it out). Can you put two video cards in and have them work together, like Voodoo SLI-type things? Or is it just one card for a monitor, another to output to TV?
Re:Confused (Score:2)
Re:Confused (Score:2)
You have been able to have multiple video cards working together for forever and a day now... every Windows OS since Win98 has supported it, and XFree86 has supported it since 4.0. Multiple monitors give you an ultra-wide desktop. It is one of those things that, once you use it for a week, you can't live without. The problem is there is only one AGP port, so your secondary and tertiary, etc. cards have to be PCI (that, or get a dual-head AGP card, like I have).
Re:Confused (Score:2)
Re:Confused (Score:2)
Which is why you probably won't see it. The graphics companies are just like all other companies, they want as much of your money as they can get away with.
Re:Confused (Score:2)
Overkill? (Score:4, Interesting)
Interesting article (Score:2, Interesting)
SLI BACK AGAIN? (Score:4, Interesting)
halted its implementation in more modern cards. Now, with multiple AGP ports and multiple devices per port, SLI may soon be back.
Re:SLI BACK AGAIN? (Score:2)
Why would it come back?
It went away because performance got good enough in a single card that 2 cards weren't needed anymore.
Re:SLI BACK AGAIN? (Score:2)
Re:SLI BACK AGAIN? (Score:2)
Re:SLI BACK AGAIN? (Score:2, Informative)
Yeah, those were the days. But SLI only increased fill rate, not triangles per second. Granted, it was one of the best features around. Buy one card, get kick-ass graphics and speed. Then buy the second card, hook them up in SLI, and boom, you've effectively doubled your fill rate.
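To put rough (from-memory, ballpark) numbers on it: a single Voodoo2 had a fill rate of around 90 megapixels per second, and two boards in SLI took turns on alternating scanlines, so the pair gave you on the order of 180 megapixels per second of fill while triangle setup throughput stayed exactly the same.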
God no.. (Score:2)
If you want better-looking 3D acceleration, look into motion blur. Properly done, a blurred 25FPS render can look as good to the eye as a 120FPS static render with no blur. Don't believe me? Go watch a movie in a theatre. Each frame that captures a hand in motion blurs it slightly, and that tricks the eye into seeing the movement that much more "clearly" than a faster, sharper camera would manage.
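If anyone wants to play with the idea, the old-school way to fake it on consumer hardware is the OpenGL accumulation buffer: render several sub-frames spread across the shutter interval and average them. A minimal sketch in C follows; draw_scene(t) is a made-up stand-in for whatever renders your scene at time t, your pixel format needs an accumulation buffer, and of course you pay for it by rendering N sub-frames per displayed frame, which is exactly the fill rate people buy faster cards for.

    #include <GL/gl.h>

    void draw_scene(double t);   /* hypothetical: renders the scene at time t */

    /* Average 'samples' renders spread across the shutter interval into one frame. */
    void render_motion_blurred_frame(double frame_time, double shutter, int samples)
    {
        glClear(GL_ACCUM_BUFFER_BIT);
        for (int i = 0; i < samples; i++) {
            double t = frame_time + shutter * i / samples;
            glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
            draw_scene(t);
            glAccum(GL_ACCUM, 1.0f / samples);   /* add this sub-frame, equally weighted */
        }
        glAccum(GL_RETURN, 1.0f);                /* write the average back to the colour buffer */
    }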
What I want to see is... (Score:3, Funny)
Re:What I want to see is... (Score:2, Funny)
Benefits vs cost (Score:2, Insightful)
Multiple ports & devices (Score:2, Interesting)
Thank you for any insight.
Re:Multiple ports & devices (Score:3, Informative)
AGP isn't the barrier in this case, IIRC (Score:3, Insightful)
As such, if you get AGP 8x running up to speed, isn't it possible you're testing the limitations of the cards that are available now, and not of the bus? I would think you'd want to flood the bus with data, and then see how it holds up.
See the press release [yahoo.com]. The GeForce4 Ti4600 is the current king of the family, and it's nowhere to be found.
Somebody reply if I'm off in my thinking here.
AGP 4x on Radeon (Score:2)
Stop thinking graphics (Score:4, Interesting)
Personally, I would like to see that bandwidth used for other accelerators, such as the SSL acceleration nCipher provides. Or how about a Java non-virtual machine? I'm sure many games could benefit from a dedicated AI board, possibly using FPGAs (field-programmable gate arrays) so that some especially tricky AI functions could be offloaded from the CPU. To put it short: we already have stunning graphics, which will continue to evolve no matter what you think about the tweaks to AGP. What I hope the more imaginative of you are thinking is: what else could be done with this?
Re:Stop thinking graphics (Score:4, Insightful)
No, the place where the bandwidth has the most impact on the user experience remains the graphics. They look pretty nice nowadays, but until you see scenes generated on the fly at 60 fps or more that are indistinguishable from real life, graphics will always be lacking.
Re:Stop thinking graphics (Score:4, Interesting)
How about the way people move? Graphics cards will be able to render something that looks true to life frame by frame, on the fly, long before the PC will be able to feed them the correct movements to match.
The limit of our ability to model movement is painfully obvious in the Final Fantasy movie. Many stills were true to life at newspaper-quality colour depth and resolution, but most of the scenes were awkward. It's when the graphics get too good that the movement becomes more annoying; think Toy Story vs. Final Fantasy. Even the semi-realistic princess in Shrek was horribly awkward, but with simpler graphics or very unreal characters (Shrek himself, or anyone from Monsters, Inc.), the movement doesn't stand out as much.
Re:Stop thinking graphics (Score:2)
Absolutely right, stop thinking graphics. I think you're off in left field, though. Think of this instead -- a video-input card on a bus with enough bandwidth to handle an uncompressed HD video stream (like what you get out of a DirecTV STB, or a cable STB in the few places that get HD over cable). The advent of dual-AGP motherboards and such a video-input card would suddenly make non-OTA HD signals available to PVR applications (not everybody has OTA HD signals broadcast in their area, or an antenna on which to receive them). Right now, the PCI bus simply does not have the bandwidth to transfer uncompressed HD video. (Right, the "proper" solution would be to bypass the HD decoding in the STB, sending the mpeg2 stream directly to the computer and then decoding it there, but I've yet to see an STB that will do that.)
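A back-of-the-envelope check on that claim: 1920x1080 at 30 frames per second in 8-bit 4:2:2 (2 bytes per pixel) works out to 1920 x 1080 x 2 x 30 ≈ 124MB/s, and 24-bit RGB pushes it to roughly 187MB/s. Ordinary 32-bit/33MHz PCI tops out at 133MB/s theoretical, shared with everything else on the bus, so an uncompressed HD stream either saturates it outright or simply doesn't fit.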
Re:Stop thinking graphics (Score:2)
True enough, but I think AGP 8x would have a quicker consumer adoption rate than PCI64 (considering that 64-bit PCI has been around for quite a while and still isn't showing up too frequently in consumer-grade motherboards or cards), and would also have enough bandwidth for streaming uncompressed HD signals. That means that we'll be more likely to see an AGP 8x consumer-grade (< $1000) HD card before we'll see a PCI64 consumer-grade HD card.
Then again, the odds are in favor of the industry thinking that AGP is only for graphics, and so you won't see any AGP cards other than graphics cards, and second AGP slots on motherboards will only ever be used for multi-monitor displays.
8x write, what about read (Score:2)
Will 8x fix this problem?
Multiple AGPs.... (Score:5, Interesting)
Anyway, I too would like multiple AGP slots on my motherboard, but it would take more than a smart vendor to make it a reality. Intel designed AGP as a stopgap, temporary solution for the lowest common denominator. And it still works well if you only need one monitor.
Does AGP offer *any* advantage? (Score:4, Interesting)
At the time that AGP first came out, I was under the impression that its primary advantage was to allow a direct pipeline to system memory, if you ran out of on-board RAM.
Then RAM got really REALLY cheap, and we went from 4-8MB onboard to 32MB, almost overnight. Now you can get video cards with 64MB and even 128MB.
I can't imagine games using more than 128MB of texture RAM, and so I have to wonder why AGP is still being developed. What else does it offer?
Re:Does AGP offer *any* advantage? (Score:2, Informative)
There are colour, depth, stencil, alpha buffers (probably others.) Another big thing is vertex information. A static model can be loaded into video memory. Newer cards support programs (shaders) which are executed on the GPU. All this could add up.
RAM is cheap enough that adding more has a minimal effect on price, but could be useful for various things (even down the road.)
I vaguely recall some hack which let you use your video memory as a swap device. That's nice for a system which is only used as a gaming machine part of the time.
sh
Re:Does AGP offer *any* advantage? (Score:3, Informative)
Unreal Tournament 2003 has a textures directory of about 1.4 GB, and it's using relatively few stacked textures. Doom 3 might well use a dozen textures on one surface, so 128MB might not last that long into the future. Fortunately, texture compression helps a lot.
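Rough numbers on how much the compression buys you: a 1024x1024 texture at 32 bits per texel is 4MB raw, S3TC/DXT1 stores 4 bits per texel so the same texture drops to 512KB (8:1), and the alpha-carrying DXT3/DXT5 formats at 8 bits per texel still manage 4:1.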
Re:Does AGP offer *any* advantage? (Score:2)
The reason why AGP will never amount to much more than a separate bus (so it doesn't choke PCI) is that graphics vendors like NVIDIA and ATI will always put higher-performance RAM on the video card than is in the main system. Even if we had AGP 32X with a bandwidth of 10GB/s, there would be 50GB/s memory on the card, and memory too slow to even keep up with AGP on the motherboard. In the end, it may allow a developer to use a variety of textures, provided there aren't more than X MB of them in use at any given point in time, because you can fill the local memory with textures in under a second. A small glitch in video that doesn't occur often isn't going to annoy most gamers if the graphics are nice.
Video memory just needs to be different. It got cheaper along with normal memory, but it's not the same thing.
The scanout bandwidth to the monitor at 1280x1024, 32-bit colour, 85Hz refresh works out to roughly 0.5GB/s, and it comes straight out of the card's memory bandwidth. Subtract that from the memory bandwidth of the next video card you get before you check out the FPS numbers, and see how accurate they are.
Multihead (Score:2, Insightful)
Of course, 64MB is quite a lot of video memory. (Score:2, Insightful)
Even if a scene does have a lot of textures, clever memory management in the application can make sure that polygons using the same texture occur sequentially, meaning the load on the AGP bus is still quite small, only having to deal with new textures. Even if clever memory management is not used, a scene containing every texture in memory at every LOD will not happen in any real world situation.
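To make the "clever memory management" concrete, here is a minimal sketch (plain C against OpenGL, with a made-up Poly struct and draw_poly() helper) of the usual trick: sort the draw list by texture so each texture gets bound once per frame instead of once per polygon that happens to use it.

    #include <stdlib.h>
    #include <GL/gl.h>

    typedef struct {
        GLuint texture;              /* texture object this polygon uses */
        /* ... vertex data would live here ... */
    } Poly;

    void draw_poly(const Poly *p);   /* hypothetical helper that issues the vertices */

    static int by_texture(const void *a, const void *b)
    {
        const Poly *pa = a, *pb = b;
        return (pa->texture > pb->texture) - (pa->texture < pb->texture);
    }

    void draw_polys_sorted(Poly *polys, size_t n)
    {
        /* Group polygons sharing a texture so glBindTexture happens once per texture. */
        qsort(polys, n, sizeof(Poly), by_texture);

        GLuint bound = 0;            /* 0 = nothing bound yet */
        for (size_t i = 0; i < n; i++) {
            if (polys[i].texture != bound) {
                glBindTexture(GL_TEXTURE_2D, polys[i].texture);
                bound = polys[i].texture;
            }
            draw_poly(&polys[i]);
        }
    }

Real engines sort by more than the texture (shader, blend state, depth), but the AGP-relevant point is the same: textures already resident on the card never cross the bus again, and grouping by texture keeps the working set from thrashing.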
Nvidia X drivers support 16 displays (Score:3, Informative)
I can think of several applications for this, starting with the 3dfx approach to boosting 3D performance by having each card take turns drawing a scanline (SLI).
There is also the possibility that 3D displays on the horizon will require more information to draw the screen (twice as much, because the scene has to be drawn once for each eye).
Another possibility is for game-house use. Standard Counter-Strike game houses charge about $3/hr to rent a machine to play CS. If a player could rent a machine with a wider FOV from multiple monitors, the operator could charge more to cover the cost of the extra graphics cards. I would gladly pay $20/hr to be able to play Doom 3 in a pseudo-holodeck environment.
Well, that's my 2 cents into the fray.
Provided they are all the same, other limits... (Score:3, Insightful)
2D has been supported to varying degrees in X and Win98 for some time, allowing the desktop to span multiple cards from different vendors, with varying amounts of acceleration: blitting is easy, other features often fall back to software, and video overlay can be broken, degraded, or only work on the first monitor.
The situation is worse for 3D. Some dual-or-more setups will only 3D-accelerate the first monitor, or the monitors on the first card. FWIW, MS Flight Sim does three heads, but it's in 2D mode.
All support for 3D multi-head so far is pretty much driver-based, when graphics-library implementation support (OpenGL) is more appropriate.
The DRI [sourceforge.net] is hoping to implement a more general system where accelerated features are exposed on all heads at all times, span cards from different manufacturers, and can share/use/display multiple applications at the same time.
So is it any better to have faster AGP? (Score:4, Interesting)
The texture maps usually take up the most memory, and they can change depending on the position of the player and even which direction he is looking in.
The position of the objects is sent every frame but shows less variability.
But the texture maps need to be transferred into the graphics card's memory once before they can be rendered.
So this happens initially, when a texture first appears; after that it's in memory and doesn't need resending until it is flushed because it is no longer in view and something else needs the space.
But just occasionally new textures are needed. For example, sometimes in, say, Half-Life I used to spin around and the screen would stop updating for maybe 1/8 of a second. What was happening was that the wrong textures were in the graphics card, and the right ones were being pushed down the AGP 1x pipe as fast as it could take them, which wasn't really fast enough; I'd often get a rocket launcher up me while the screen had stopped updating for a moment.
Of course now the graphics cards have more memory, the software may be written better so that textures get preloaded before they are needed, and probably most or all of a level's textures fit into the card's buffer anyway. So all in all there's little or no waiting when spinning around; and AGP is now 4x as well, so instead of 1/8 of a second we are looking at 1/32 worst case, about 30 milliseconds, which for a one-off jitter isn't perceptible.
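Putting numbers on that: AGP 1x moves about 266MB/s, so pushing, say, 32MB of suddenly-needed textures across it takes roughly 120ms, much like the 1/8-second hitch described above. At AGP 4x (a little over 1GB/s) the same 32MB takes around 30ms, the 1/32-second worst case. The 32MB figure is only an illustration; the point is that the stall scales with the bus speed.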
John Carmack has talked about the idea of generating texture maps dynamically. If he were to implement this, then AGP bandwidth would be much more important. Right now, precalculated, fixed texture maps are much more common in games. Bottom line: who cares about AGP 8X? It's like ATA133; it makes no difference to nearly everyone.
Re:So is it any better to have faster AGP? (Score:3, Interesting)
I'm not sure if any of the Quake games so far (or maybe Doom 3) have used it, but the original Unreal had dynamic/procedural textures all over the place. Fire, fountains, and pseudo-particle effects were all handled by dynamic textures. Some of them were even fractals, IIRC...
Unreal worked fine on computers that more often than not did not have AGP, but then the areas with dynamic textures were small portions of the screen and rarely (if ever) larger than about 256x256 or so.
Screw graphics... use AGP for GigE networking!! (Score:2, Interesting)
As evidenced by the NetBSD support, AGP is essentially a PCI-bridge-plus-frills with only one PCI device on it: your graphics card. It also adds snazzy stuff like the command FIFO (which if you study I2O, you will note is generally useful and not just a graphics processor concept).
The electrical simplicity of supporting only one card plays a large part in allowing it to be so much faster than the normal PCI bus. It's only a matter of time before you end up wanting AGP speeds for:
Since the most PCI slots you normally want on a single bridge is four, you could have a reasonably balanced motherboard with 3 AGP + 4 PCI... assuming the expansion card vendors agree to make AGP versions of things.
VRAM gets small and fast (Score:3, Insightful)
Just a thought.... (Score:3, Interesting)
So do I
A) Believe the above, and think their email and tech support are liars, or
B) Believe the above is a load of crap and all those crashes I have with AGP 4X are a figment of my imagination; that when I set it to AGP 2X they go away, and 3DMark 2001 shows less than a 20-point difference between AGP 2X and AGP 4X.
Just think: with AGP 8X, I can finally cause a system seizure on more than one freakin' $399.99 card. And in more than one OS! Yeah!
Graphics Arrays? (Score:2, Informative)
However, when using AGP 3.0 (AGP 8X) it is possible to put more than one AGP device on a port, and thus massive SLI configurations can be made, realistically enabling heavy use of the new DX9 and OGL2 features. ATI or NVIDIA may design boards with 4 or 8 chips per board, all running off of one AGP slot (it would probably require an external power supply), that they could sell for a few grand apiece to companies wishing to get into the realtime, high-fidelity, near-realistic 3D graphics business.
It takes a while (Score:2, Insightful)
WireGL (Score:2)
WireGL comes to mind, but apparently it is now part of the Chromium project at http://sourceforge.net/projects/chromium/
I figure that would be a groovy way to make use of multiple ports.
Anyone want to donate $150,000 to me for researching how cool Quake 3 or UT2003 looks on 16 monitors? Uh... for purely scientific research, of course.
Dual Gigabit NICs (Score:2, Funny)
It's got 2 AGP ports...
And a PCI graphics card?!?!!
Dual Gigabit AGP NICs*...
Mmmm...Erotic Cakes....
-D
*Assuming someone's smart enough to make them
Use it as a special expansion slot (Score:2, Interesting)
Re:why not have 256 MB video ram ? (Score:3, Informative)
Textures can be kept in main memory and then zipped to the video card when needed. The main thing is video card memory bandwidth; ATI's Radeon 9700 does something like almost 20GBps. I remember a company called Bitboys that was working on embedding memory into the video processor silicon (as much as 24 MB) and using a 512-bit-wide bus running at core speed to provide 40+ GBps of bandwidth. The chip also had an external memory bus to hold up to 256MB of additional video memory. Sony's PS2 uses a similar setup, only with a 1024-bit bus and just 4 MB of memory, but it does nearly 50GBps.
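For scale, the arithmetic behind those embedded-memory figures: a 512-bit bus moves 64 bytes per clock, so at a core clock somewhere around 650MHz (my assumption, not a published Bitboys spec) you get 64 bytes x 650MHz ≈ 41.6GB/s, which is where a "40+ GBps" number comes from.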
It would be cool to see a video chip that takes the 2D, video and AGP controller off the 3D chip, leaving just a 3D chip with 32 MB of embedded RAM and an external memory controller. This would allow more transistors to be used for rendering purposes. The AGP/2D chip would act as a frame buffer with its own memory, to further free up bandwidth inside the 3D chip. The two chips could then be linked via a high-speed, low-pin-count bus like HyperTransport, and possibly the 2D unit could have 2 or 4 such interfaces for adding more rendering units to double or quadruple rendering power. Of course this is a costly solution, but the speed gain would be incredible.
Re:shouldn't be called AGP anymore then (Score:4, Funny)
You tell me which one is bigger.