Hardware

Tackling AGP 8X

EconolineCrush writes "AGP 8X is popping up in new chipsets and motherboards, and graphics cards are also starting to support the standard, but is there a major performance advantage over the older AGP 4X spec? According to this review of NVIDIA's latest AGP 8X-enabled graphics products, no. The review also covers some of AGP 8X's new functionality, which includes support for multiple AGP ports with multiple AGP devices per port. Whether future games and applications take advantage of AGP 8X's extra bandwidth remains to be seen, but more interesting should be what companies do with multiple AGP devices and ports."
  • by Zod000 ( 568383 ) on Monday October 21, 2002 @06:58PM (#4499921) Homepage
    I don't recall seeing much of an increase from AGP 2x to AGP 4x either, so I'm not surprised.
    • The ability to have multiple displays on the bus would be useful. There really is no good solution for multiple-head systems, particularly if you want as many monitors as I tend to. Basically, dual-headed cards tend to offer either TV-out or a second monitor, not both.
      • by jred ( 111898 )
        WinFast GeForce2 MX DH Pro [leadtek.com]

        I have this card, and it works rather well. I guess the GeForce2 is getting kind of dated, but it works well w/ my 1.2GHz Athlon. Couple this w/ a PCI card (dual-head or not) or two, and you should be able to have as many monitors as you want.
    • Because the current crop of games doesn't benefit from AGP 4x. Not sure about UT2k3 though; it might be modern enough to.
  • AGP8 (Score:5, Insightful)

    by Nonillion ( 266505 ) on Monday October 21, 2002 @06:58PM (#4499924)
    That would be cool, to have more than one AGP slot. I am sometimes disappointed that I cannot have two dual-headed AGP cards installed...
    • Okay, it might be cool to have more than one AGP slot, but what I really want to know is: what kind of applications would make this useful? I'm kind of curious; could someone please come up with some creative ideas here?
      • Debugging full-screen 3D games. Right now I use a Matrox dualhead card, but it would be nice to have two independent adapters.
        • I did a similar thing with 2 computers and msvcmon. It only works for Visual Studio, but if that's what you want, then you can basically debug 3D games without worrying about focus issues or trying to catch transient visual problems. If you use something else, there may well be a similar tool that does the same thing.

          • I've done this as well, and it does work. It's still much more convenient to be able to develop and debug on the same machine, though.

            Two AGP slots would be nice ... But I don't know if I really want two high-powered 3d accelerators in one machine. That would put out quite a bit of heat.

            --Jeremy
      • I have a G4 with three 20-inch monitors. I use one for email, one for the web and whatever else I'm doing at the moment, and the other for iTunes and any other windows I don't want in my way. But since I upgraded to Jaguar I can only use my main (AGP) card; the other PCI cards prevent the system from booting (I haven't tried everything yet to fix it). Once you get used to multiple monitors, it really sucks to be stuck with one.
      • by ngoy ( 551435 ) on Monday October 21, 2002 @07:16PM (#4500076)
        In gaming, you could use multiple POVs in flight simulators (I think M$ Flight Sim supports three monitors, IIRC) or racing (front, left, right). In desktop publishing it is useful for seeing two pages at once, or four pages at once, depending on what resolution you are using.

        At work I leave Outlook open on one monitor all the time, have Visual Studio open on that one too, and keep an Internet Exploder window open on the right screen. That way, when I make changes in VS on the left, I can instantly refresh the IE window on the right without all the toggling-back crap.

        I also used to do reports and presentations. Having dual monitors allowed me to have Excel/Access/whatever source program open on the left, and PowerPoint on the right. I could drag a chart from Excel full size and drop it into PowerPoint without having to do cut/alt-tab/switch window/paste. Much easier, and it gives WYSIWYG some credence.

        I am running dual monitors on an NT4 box with two Matrox Millennium PCIs (I've had dual monitors for four or five years now, I think, on that one). My other box has a Matrox G450 AGP and a Matrox Millennium PCI for dual capability (W2K).

        IMHO, Matrox makes the best multi-display drivers/cards at a reasonable price, and they have had them for quite a long time compared to the others. They have a quad-output card also, but it costs a bit more than the duals.

        ngoy
      • Multi-monitoring is already routinely used in a whole slew of applications - publishing, image processing, CAD/CAM to name a few ...
        Most of these don't require the added bandwidth of the AGP, though, but then again, few things do - CAD/CAM might, and games, of course. Which leads to another possible use for multiple AGPs.
        However, even though multi-device gaming has been possible for a long time and has even been pimped by the graphics chipset industry recently, it never really took off.
      • by DeComposer ( 551766 ) on Monday October 21, 2002 @07:34PM (#4500218) Journal

        I'm currently using a six-monitor configuration for music production. I have Sonar spread over four 19" monitors and I use two 17" monitors to display virtual instruments/effects and the MOTU console.

        3D isn't a factor on this machine, but it's tricky to get three (one AGP, two PCI) dual-head displays to work side by side correctly.

        Two AGP slots would permit me to use just two Parhelia (or competitors'--once they jump on the triple-head bandwagon) cards and free up PCI slots for more useful things like DSP cards.

        Then, too, a configuration like that would make for a breathtaking multi-monitor gaming experience!
        • Yeah. Being a hobbyist keyboardist, I've looked into some of these virtual instruments myself, to connect to my sequencer setup. What I can't for the life of me understand is why they all (1) have a GUI that (2) mimics the real instruments, and is therefore impossible to use, and (3) takes up a lot of screen space showing useless logos, VU meters, buttons that don't work, etc...

          What I want is something that I can easily control from my MIDI instrument(s), not some useless knob on the screen to fiddle with. Alternatively, keyboard/mouse control would be useful, but turning round knobs on the screen is completely useless... And they should not use up screen space.

      • I have an ATI All-in-Wonder (AIW) 128 Pro, so it has the TV tuner built in. I'd like to upgrade to a Radeon AND keep the TV tuner. But the Radeon AIW costs a lot more than the non-AIW Radeon. Therefore, I would ideally like to have:

        1. my current AIW 128 Pro, just to use the TV-tuner.
        2. a Radeon 8500, without the TV tuner.

        But unfortunately, my system doesn't allow two AGP cards.

        Dave
        • Try a standalone TV card, like the ATI TV Wonder VE I think it's called? Also, I think the Radeon 7500 AIW is an outstanding card if you're not totally into the 3D game thing. I personally have a 7500 AIW and it's nice. It has the analog tuner versus the digital one of the 8500 series. The 8500 AIW is also only like $150 now that the 9700 AIW is out (I think the 8500 128MB AIW is $150 after a rebate at CompUSA now). Saw that after I came home with my 7500. Oh well. At least the Radeon seems to like framebuffer stuff better than my old card; I could never get a framebuffer console to work on my old NVIDIA card, but with the Radeon I see it when I wasn't able to before. Hopefully I will get an entirely new system from Gateway (I am getting damn tired of building shit). The one I am looking at is the 500XL with the 18-inch LCD! Yummy! And it also has a DVD burner, all for only $1799! That one would be my primary Windows machine while this one will be my Linux machine. It's not too shabby now itself... Athlon XP 2000+, ASUS A7S333 (I know, DDR333 RAM is not worth the extra bucks, but now I have it for when I will need it), ATI Radeon AIW 7500, WD 40GB 7200RPM hard disk, Creative Live! 5.1, DVD-ROM and my little 8x CD-RW. Oh, and the AIW came with the Wonder Remote, which is AWESOME! It even has Linux support.
      • If you had the right kind of glasses, you could have real 3d perspectives in any application you choose. The two eye pieces would display a scene with the camera in two different positions, with each card rendering the scene from the two different cameras.
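
        A rough sketch of the camera math in Python (illustrative only; the eye-separation constant and vector layout are made up for the example, and each card would render one of the two views):

        # Offset one camera position into a left/right eye pair for stereo rendering.
        EYE_SEPARATION = 0.065  # metres; roughly human interpupillary distance

        def stereo_cameras(center, right_vector):
            half = EYE_SEPARATION / 2.0
            left_eye = [c - half * r for c, r in zip(center, right_vector)]
            right_eye = [c + half * r for c, r in zip(center, right_vector)]
            return left_eye, right_eye

        # e.g. a camera at head height, with +X as the camera's right vector:
        left, right = stereo_cameras([0.0, 1.7, 0.0], [1.0, 0.0, 0.0])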

        In more practical terms, it makes a lot of things easier. I can, in a GUI environment, have Emacs or VIM running on the two different monitors, and if I have each session of the respective editor displaying two different files, that's four files I can be editing at the same time without having to flip through a bunch of desktops or windows sitting on top of each other, which can get pretty old pretty fast. Many times I just want to be able to look at a piece of code really quickly, instead of having to switch through a whole bunch of windows sitting on top of each other.
      • Multiple monitor setups are often used in the financial world. I use 3 19" monitors as a daytrader. My screens are filled with real-time streaming charts and data all day long. I know other daytraders that use up to 14 monitors. Sure, 14 is excessive, but it is not uncommon for traders to run 6-8 monitors.
  • by gnillort ( 617577 ) <myslashdotemailaccount@yahoo.com> on Monday October 21, 2002 @06:59PM (#4499930) Journal
    AGP 8x == small bridge to PCI-X == waste of money
  • Hmm (Score:4, Interesting)

    by Maskirovka ( 255712 ) on Monday October 21, 2002 @07:00PM (#4499936)
    Is there anything preventing this new standard from being used for other peripherals like NICs and SCSI cards? If so, why not just phase out PCI completely?
    • Re:Hmm (Score:5, Informative)

      by Fulcrum of Evil ( 560260 ) on Monday October 21, 2002 @07:27PM (#4500173)

      The bus is very one-sided. It gives 2.1GB/s to the card, but nothing particularly special on the way back. After all, the intent was to let the card stream textures out of main memory.
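
      For reference, the 2.1GB/s figure falls straight out of the clocking (a back-of-the-envelope sketch in Python; theoretical peaks only):

      # AGP is 32 bits wide on a ~66MHz base clock; each generation doubles
      # the number of transfers per clock cycle.
      BASE_CLOCK_MHZ = 66.67
      BUS_WIDTH_BYTES = 4  # 32-bit

      for name, transfers_per_clock in [("1x", 1), ("2x", 2), ("4x", 4), ("8x", 8)]:
          gb_s = BASE_CLOCK_MHZ * transfers_per_clock * BUS_WIDTH_BYTES / 1000
          print(f"AGP {name}: {gb_s:.2f} GB/s")  # 8x -> ~2.13 GB/s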

    • Re:Hmm (Score:4, Insightful)

      by OverCode@work ( 196386 ) <.overcode. .at. .gmail.com.> on Monday October 21, 2002 @07:37PM (#4500242) Homepage
      AGP is the Accelerated Graphics Port, a very hackish specialization of the PCI bus for graphics devices. It is a master-to-target link, not a bus, per se. Its signalling rates are not appropriate for a general-purpose bus (mobo manufacturers have enough trouble getting it right on the short runs to a single AGP port), and its optimizations are slanted toward squeezing performance out of bus traffic typical of graphics devices, not random-access disk controllers and network devices.

      Not to say that you *couldn't* have an AGP disk controller. But I doubt the performance improvement would be sufficient to justify the hassle and the lost AGP slot.

      PCI-X is starting to come close to the lower AGP speeds in performance, and is a much cleaner and more general standard.

      -John
    • Re:Hmm (Score:3, Informative)

      by pqbon ( 7033 )
      PCI-X v2 is coming out soon... PCI-X v1 maxes out at about 1.07GB/s (theoretical) at 64 bits wide and 133MHz. PCI-X v2 is supposed to be 133MHz DDR with the option for QDR, so you will get about 4.27GB/s (QDR).

      Two of my desktops have PCI-X (not available in normal desktop boards, only workstation boards) and it is great. PCI-X Gigabit networking and Fibre Channel. Very fast.

      AGP wouldn't be as good as PCI-X. It may have the data rate, but the protocol is designed for a graphics card. You could put other cards on it, but PCI-X/PCI is a much better choice! (To note: 64-bit/66MHz PCI will give you about 0.53GB/s, which is already a really high data rate for a desktop system to sustain.)
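
      The same back-of-the-envelope arithmetic for the PCI family (a sketch; theoretical peaks, real buses sustain less):

      # (name, clock in MHz, bus width in bytes, data transfers per clock)
      buses = [
          ("PCI 33MHz/32-bit", 33.33, 4, 1),       # ~0.13 GB/s, shared
          ("PCI 66MHz/64-bit", 66.67, 8, 1),       # ~0.53 GB/s
          ("PCI-X 133MHz/64-bit", 133.33, 8, 1),   # ~1.07 GB/s
          ("PCI-X 2.0 QDR 64-bit", 133.33, 8, 4),  # quad data rate -> ~4.27 GB/s
      ]
      for name, mhz, width, rate in buses:
          print(f"{name}: {mhz * width * rate / 1000:.2f} GB/s")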

    • Re:Hmm (Score:3, Interesting)

      Several things...

      1. AGP is unidirectional. The plugin card is always the bus master and the chipset is the slave. If the chipset wants to initiate a transaction, it has to do good ol' PCI.

      2. No error correction/detection. AGP doesn't use parity/ECC because a flipped bit here and there in video data isn't that important. This could be very bad for more sensitive devices.

      3. Only one device per bus. AGP is point-to-point.
  • A Quick Commentary (Score:5, Interesting)

    by clinko ( 232501 ) on Monday October 21, 2002 @07:01PM (#4499946) Journal
    Granted, this is slightly off topic but worthy.

    If multiple AGP is available with 8x then it's probably the greatest improvement possible. I ran 2 monitors at work, then got hooked. Now it's almost impossible for me to use 1 monitor. The problem is that you can't get multiple AGP slots as of now, so you have to use a crappy PCI card.

    This will also be awesome for gaming! I can't wait until I can get a dual agp card. I bet if they start making dual agp mobos then dual monitors will become very common.

    The End.
    • This [pny.com] is the card you're looking for. The thing that I'm wondering is whether CRTs are completely on their way out as a display technology, or if we are going to start seeing CRTs with DVI inputs; from what I understand, the bandwidth of the old D-SUB connector is just about fully utilized.
    • What? (Score:2, Interesting)

      by MagPulse ( 316 )
      All the GeForce 4 Ti4x00 cards I've seen can drive two monitors at once with nview, as long as you have a DVI-analog converter for the second monitor. I'm not sure if you can go analog->DVI for two digital monitors.

      I just checked, and the Radeon 8500 and 9700 both do the same thing.
      • I have a 4200 and was wondering about hooking up the 2nd monitor. Any idea where I can get the converter you talk about? Is it a generic product or an nvidia thing?
    • by jedie ( 546466 ) on Monday October 21, 2002 @07:18PM (#4500097) Homepage
      I can't wait until I can get a dual agp card. I bet if they start making dual agp mobos then dual monitors will become very common.

      three things:
      1) Dual head AGP cards already exist, Matrox even has a triple head [matrox.com] AGP card.
      2) What's wrong with PCI cards? If you use it for work (like you said in the first part of your comment), I don't see what's wrong with it. I'm using 1 AGP and 1 PCI right now and I'm happy the way it is. Usually I use my main monitor, which has a higher resolution, for coding, and at the same time my second screen is cluttered with IRC, IM and online documentation.
      3) I don't think dual-AGP-slotted mobos will become standard real soon: people have lots of PCI slots and that didn't encourage people to go dual/triple/... screen. I rather think that dual AGP will remain something for techies, geeks and professionals.

      And remember kids: the more monitors you have, the larger your penis is!

      • The ATI 9700 shows up as 2 video cards (2 PCI devices); not sure, but I think the BrookTree is a 3rd, for TV out.

        snip from the xfree.log

        (--) PCI: (0:12:0) BrookTree unknown chipset (0x036e) rev 2, Mem @ 0xea002000/12
        (--) PCI:*(1:0:0) ATI unknown chipset (0x4e44) rev 0, Mem @ 0xd8000000/27, 0xe9000000/16, I/O @ 0xc000/8
        (--) PCI: (1:0:1) ATI unknown chipset (0x4e64) rev 0, Mem @ 0xe0000000/27, 0xe9010000/16
      • And remember kids: the more monitors you have, the larger your penis is!

        [Looks up to see a single 15 inch]

        [Looks down to see another 15 inches]

        Yup. Now I'm confused.
    • They aren't hard to find. I got this GeForce4 MX 460 two weeks ago... it has dual SVGA out, composite and S-Video out, AND composite and S-Video in. Only 130 dollars, and it runs UT2k3 like a charm :)

    • You can't get multiple AGP's, but there are several dual-head cards on the market.
    • I'm happily running dual-head at home from my geforce 4, and I know it's part of the geforce2 chipset as well (although most gf2s only have one monitor out plug)
  • Confused (Score:4, Interesting)

    by El Pollo Loco ( 562236 ) on Monday October 21, 2002 @07:01PM (#4499948)
    It's been a long weekend, but this part still confuses me.

    which includes support for multiple AGP ports with multiple AGP devices per port.

    I can't figure out why this would be good. (This is not a troll, I just can't figure it out.) Can you put two video cards in and have them work together, like Voodoo SLI-type things? Or is it just one card for a monitor, another to output to TV?
    • You could use it to have 2 monitors, and have a huge desktop. Having two AGP cards is a serious advantage in this situation, as there won't be a PCI card to hold you back.
    • You have been able to have multiple video cards working together forever and a day now... every Windows OS since Win98 has supported it, and XFree has supported it since 4.0. Multiple monitors allow you to have an ultra-wide desktop. It is one of those things that, once you use it for a week, you can't live without. The problem is that there is only one AGP port, so your secondary and tertiary, etc. cards have to be PCI (that, or get a dual-head AGP card, like I have).

    • Personally I would LOVE to see a return of SLI-like configs. Because of the highly parallel nature of rendering, you could have one card render the top half of the screen and another do the bottom, and get close to a 2x performance boost (see the sketch below). This would be especially useful in this age of new graphics cards coming out every 9 months: buy a top-of-the-line card, then pick up a second one on the cheap. Now instead of spending $400 every upgrade cycle you're spending $400+$100 every other cycle.
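
      The split is simple to picture in code (a sketch only; the cards list and the render_rows call are hypothetical, not any real driver API):

      # Split-frame rendering: each card renders a contiguous band of
      # scanlines, and a compositing step merges the bands for display.
      SCREEN_HEIGHT = 1024

      def render_frame(scene, cards):
          band = SCREEN_HEIGHT // len(cards)
          for i, card in enumerate(cards):
              top = i * band
              bottom = SCREEN_HEIGHT if i == len(cards) - 1 else top + band
              card.render_rows(scene, top, bottom)  # runs in parallel per card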
      • Now instead of spending $400 every upgrade cycle you're spending $400+100 every other cycle.


        Which is why you probably won't see it. The graphics companies are just like all other companies, they want as much of your money as they can get away with.
        • I dunno, there's a lot of money to be made with this on the people who would go out and spend $800 on two top-of-the-line cards. Probably enough profit potential there to offset those upgrading by buying a second older card.
  • Overkill? (Score:4, Interesting)

    by bogie ( 31020 ) on Monday October 21, 2002 @07:02PM (#4499957) Journal
    Call me when we actually need more than AGP 2X. I've seen a lot of tests which show only minor differences between AGP 2X and 4X. It's nice to know the bandwidth will be there, but this is one of those technologies, like Serial ATA, which really won't be showing its potential for a few years. Of course that won't stop the marketing gurus from telling people AGP 8X is a "must have".
  • Interesting article (Score:2, Interesting)

    by Anonymous Coward
    However, it fails to mention the higher northbridge temperatures when running 1.5v cards with the 64MB aperture. I have always been inclined to enable the DDR compression for the AGP slot so that the motherboard won't signal a failure. The newer BIOS revisions in the i845 chipsets allow for this, but it is sadly lacking on this board.
  • SLI BACK AGAIN? (Score:4, Interesting)

    by LoRdTAW ( 99712 ) on Monday October 21, 2002 @07:03PM (#4499964)
    Remember VOODOO 2's SLI feature that we all so loved? Well, it was AGP that halted its implementation in more modern cards. Now, with multiple AGP ports and multiple devices per port, SLI may soon be back.

    • Why would it come back?

      It went away because performance got good enough in a single card that 2 cards weren't needed anymore.
        • Incorrect; as far as GFX cards go, there's always a need for more performance. The original poster is correct in saying it was AGP that killed SLI; it made it impractical.
        • Actually, the Voodoo 4/5 cards used SLI on board. So it doesn't have to occur between two cards. The Voodoo 5 6000 even had it between 4 VSA-100s
    • Remember VOODOO 2's SLI feature that we all so loved?

      Yeah, those were the days. But SLI only increased fill rate, not triangles/second. Granted, it was one of the best features around. Buy one card, get kick-ass graphics and speed. Then buy the second card, hook it up in SLI, and boom, you've effectively doubled your fill rate.
    • I'd rather have slower progressive scan than faster interlaced any day. That's why I prefer monitors over TVs.

      If you want better-looking 3D acceleration, look into motion blur. Properly done, a blurred 25FPS render can look as good to the eye as a 120FPS static render with no blur. Don't believe me? Go watch a movie in a theatre. Each frame that captures a hand in motion tricks the eye into seeing it that much "clearer" than a faster camera would.
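
      The usual way to fake this is an accumulation buffer: render several sub-frames across each displayed frame's exposure window and average them. A sketch (render_at is a hypothetical callback returning a float image):

      import numpy as np

      SUBFRAMES = 5  # temporal samples per displayed frame

      def motion_blurred_frame(render_at, t0, t1, height=480, width=640):
          acc = np.zeros((height, width, 3), dtype=np.float64)
          for i in range(SUBFRAMES):
              t = t0 + (t1 - t0) * (i + 0.5) / SUBFRAMES  # sample mid-interval
              acc += render_at(t)
          return (acc / SUBFRAMES).astype(np.uint8)  # averaged = blurred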
  • by trevinofunk ( 576660 ) on Monday October 21, 2002 @07:03PM (#4499968)
    32X! Go Sega!!! Then I can play all 4 games I bought for it.
  • Benifits vs cost (Score:2, Insightful)

    by bl968 ( 190792 )
    It's simple: if AGP 8X offers a clear benefit for the cost to the user, such as an increase in the quality of the graphics or the screen's refresh rate, or new graphics features, then they will embrace it. If it doesn't, it will wither on the vine. People expect things to be much better than the items they are replacing when they buy new. I know I do.
  • As far as I know AGP has a higher bandwidth than e.g. PCI. So will there be AGP network interface cards since there can be multiple AGP ports?
    Thank you for any insight.
    • I don't see why there is a need for them. Gigabit Ethernet does a theoretical max of 125 MB/s, and I think PCI 2.1's max is 133 MB/s. And we all know that these things don't always operate at their maxes.
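
      The arithmetic, for what it's worth (theoretical peaks only):

      gigabit_mb_s = 1000 / 8  # 1 Gbit/s of payload -> 125 MB/s
      pci_mb_s = 33.33 * 4     # 33MHz x 32-bit PCI -> ~133 MB/s, shared by all devices
      print(gigabit_mb_s, pci_mb_s)  # 125.0 vs ~133.3: almost no headroom left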
  • If I remember the press release correctly from a few weeks back, nVidia introduced AGP 8X in some of their cards-- but inexplicably not in their top-of-the-line.

    As such, if you get AGP 8x running up to speed, isn't it possible you're testing the limitations of the cards that are available now, and not of the bus? I would think you'd want to flood the bus with data, and then see how it holds up.

    See the press release [yahoo.com]. The GeForce4 Ti4600 is current king of the family, and it's nowhere to be found.

    Somebody reply if I'm off in my thinking here.
  • It'd be nice if the drivers for my laptop's Radeon would allow me to set XFree86 at even AGP 4X without massive instability - I still get hard lockups if I run it for more than 30 seconds at higher than 2X mode.
  • by smugskii ( 518160 ) on Monday October 21, 2002 @07:20PM (#4500103)
    My wishlist (primarily as a server tech guy) does not concern squeezing a bit more graphics out of the bus.
    Personally, I would like to see that bandwidth used for other accelerators, such as SSL acceleration like nCipher provides. Or how about a Java non-virtual machine? I'm sure many games could benefit from a dedicated AI board, possibly using FPGAs (field-programmable gate arrays) so that some especially tricky AI functions could be offloaded from the CPU. In short: we already have stunning graphics, which will continue to evolve no matter what you think about the tweaks to AGP. What I hope the more imaginative of you are thinking is: what else could be done with this?
    • by Junta ( 36770 ) on Monday October 21, 2002 @07:38PM (#4500245)
      Why the hell would you need that sort of bandwidth for an SSL accelerator? Even if you were using it for hard disk I/O, hard disk I/O goes through the PCI bus, so it would not be saturated. For the most reasonable area of network usage, bandwidth on any PC equipment wouldn't go anywhere near such a need; for bigger needs, you probably need bigger equipment anyway. I don't think a dedicated Java processor would be a big seller, especially with the speedups in implementations seen of late. AI hardware may sound intriguing, but it is so unsexy in terms of consumer visibility. That, and AI methods change frequently.

      No, the place where the bandwidth has the most impact on the user experience remains the graphics. They look pretty nice nowadays, but until you see scenes generated on the fly at 60 fps or more that are indistinguishable from real life, graphics will always be lacking.
      • by AvitarX ( 172628 ) <me&brandywinehundred,org> on Monday October 21, 2002 @07:55PM (#4500345) Journal
        The barrier to ultra hyper realistic on the fly graphics at 60 fps will never be the graphics (IMHO). It will be the movement. Wasn't there an article here once about how cloth will be nearly impossible to realistically render?

        How about the way people move? Graphics cards will be able to render something on the fly that looks true to life frame by frame long before the PC will be able to feed the correct movements to match.

        The limits of our ability to model movement are painfully obvious in the Final Fantasy movie. Many stills were true to life in newspaper-quality color depth and resolution, but most of the scenes were awkward. It's when the graphics get too good that these movements become more annoying; think Toy Story vs. Final Fantasy. Or even the semi-realistic princess in Shrek: she was just horribly awkward, whereas with less realistic graphics (such as Shrek himself, or anything in Monsters, Inc.) the movement does not stand out as much.
    • Absolutely right, stop thinking graphics. I think you're off in left field, though. Think of this instead -- a video-input card on a bus with enough bandwidth to handle an uncompressed HD video stream (like what you get out of a DirecTV STB, or a cable STB in the few places that get HD over cable). The advent of dual-AGP motherboards and such a video-input card would suddenly make non-OTA HD signals available to PVR applications (not everybody has OTA HD signals broadcast in their area, or an antenna on which to receive them). Right now, the PCI bus simply does not have the bandwidth to transfer uncompressed HD video. (Right, the "proper" solution would be to bypass the HD decoding in the STB, sending the mpeg2 stream directly to the computer and then decoding it there, but I've yet to see an STB that will do that.)
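
      A quick sanity check on those numbers (a sketch; assumes uncompressed 4:2:2 video at 2 bytes/pixel):

      def stream_mb_s(width, height, frames_per_s, bytes_per_pixel=2):
          return width * height * bytes_per_pixel * frames_per_s / 1e6

      print(stream_mb_s(1280, 720, 60))   # 720p60 -> ~111 MB/s
      print(stream_mb_s(1920, 1080, 30))  # 1080i  -> ~124 MB/s
      # Plain 33MHz/32-bit PCI peaks at ~133 MB/s shared across every device
      # on the bus, so one uncompressed HD stream alone would saturate it.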

  • In other /. stories I've read that current AGP is very slow for input, thus limiting the ability to use graphics accelerators for non-screen rendering.

    Will 8x fix this problem?
  • Multiple AGPs.... (Score:5, Interesting)

    by Steveftoth ( 78419 ) on Monday October 21, 2002 @07:23PM (#4500127) Homepage
    Ok, first off, AGP is an Accelerated Graphics Port. Notice that it is a PORT, not a bus. This means that in order to have more than one AGP port, you would have to have more than one PCI bus, since all implementations of AGP share control functions with PCI. It would be very difficult to simply add more than one AGP port to the PC system; little things like the operating system would need to be updated, and it's not like a simple BIOS tweak can handle it. There are already many problems with the current AGP system. I'm sure some of you remember the whole fiasco with AMD and the AGP GART system tweak that was causing Linux to crash, but not Windows, because AMD told MS to shut it off.
    Anyway, I too would like multiple AGPs on my motherboard, but it would take more than a smart vendor to make it a reality. Intel designed AGP as a stopgap, temporary solution for the lowest common denominator. And it still works well if you only need one monitor.
  • by swordgeek ( 112599 ) on Monday October 21, 2002 @07:23PM (#4500132) Journal
    Please forgive my ignorance. This is an honest question.

    At the time that AGP first came out, I was under the impression that its primary advantage was to allow a direct pipeline to system memory, if you ran out of on-board RAM.

    Then RAM got really REALLY cheap, and we went from 4-8MB onboard to 32MB, almost overnight. Now you can get video cards with 64MB and even 128MB.

    I can't imagine games using more than 128MB of texture RAM, and so I have to wonder why AGP is still being developed. What else does it offer?
    • Games may not use 128MB of texture memory (or they may), but video memory is used by other stuff too.

      There are colour, depth, stencil, and alpha buffers (and probably others). Another big thing is vertex information: a static model can be loaded into video memory. Newer cards support programs (shaders) which are executed on the GPU. All this can add up.
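
      A rough tally of what just those buffers eat at one resolution (illustrative numbers only):

      width, height = 1600, 1200
      color_bytes = width * height * 4           # 32-bit colour buffer
      front_and_back = 2 * color_bytes           # double buffering
      depth_stencil = width * height * 4         # 24-bit depth + 8-bit stencil
      total_mb = (front_and_back + depth_stencil) / 2**20
      print(f"{total_mb:.1f} MB before a single texture is loaded")  # ~22 MB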

      RAM is cheap enough that adding more has a minimal effect on price, but could be useful for various things (even down the road.)

      I vaguely recall some hack which let you use your video memory as a swap device. That's nice for a system which is only used as a gaming machine part of the time.

      sh
    • I can't imagine games using more than 128MB of texture RAM

      Unreal Tournament 2003 has a textures directory of about 1.4GB, and it's using relatively few stacked textures. Doom 3 might well use a dozen textures on one surface, so 128MB might not last that long into the future. Fortunately, texture compression helps a lot.
    • Actually, I think it shows very well that AGP is not a great thing at all. If you look at the FPS of random cards and compare it to the memory bandwidth of the card (not AGP bandwidth), you'll probably come to the same conclusion that I have: memory bandwidth is more important than the GPU. The major improvements in the GPU have been in conserving memory bandwidth, or in widening the bus to the memory. The GeForce4 Ti4600 offers 10.4GB/s of bandwidth; AGP 8x is a mere 2.1GB/s. With this in mind, any time you access main memory, it is going to be 1/5th the speed of the on-board memory, meaning 1/5th the performance without speculative reading or caching. If your working set of textures is greater than the amount of memory on your board, you will suffer severe performance hits even at 8x. 166MHz DDR memory, 8 bytes wide (DDR333 DIMMs), is 2.6GB/s theoretical. System memory is not fit for decent graphics anyhow.

      The reason why AGP will never amount to much more than a separate bus (so it doesn't choke PCI) is that graphics vendors like NVIDIA and ATI will always put higher-performance RAM on the video card than is in the main system. Even if we had AGP 32X with a bandwidth of 10GB/s, there would be 50GB/s memory on the card, and memory too slow to even keep up with AGP on the motherboard. In the end, it may allow a developer to use a variety of textures, provided that there aren't more than X MB of them in use at any given point in time, because you can fill the local memory with textures in less than a second. A small glitch in video that doesn't occur often isn't going to annoy most gamers if the graphics are nice.

      Video memory just needs to be different. Video memory got cheaper along with normal memory, but it's not the same :) There are lots of reasons to want to do more than 128MB of textures, but there are none to use system memory for video :)

      The scan-out bandwidth to the monitor at 1280x1024x32bit and an 85Hz refresh is about 0.45GB/s at that resolution, bit depth, and refresh rate, and it comes straight out of the card's local memory bandwidth before any rendering happens. Calculate that into the memory bandwidth of the next video card you get before you check out the FPS. See how accurate it is :)
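
      The scan-out figure works out like this (theoretical; ignores blanking intervals):

      width, height, bytes_per_pixel, refresh_hz = 1280, 1024, 4, 85
      scanout_gb_s = width * height * bytes_per_pixel * refresh_hz / 1e9
      print(f"{scanout_gb_s:.2f} GB/s")  # ~0.45 GB/s off the top of local bandwidth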
  • Do you know how hard it is getting these days to find a decently modern PCI card for use in a multihead system? It's still possible to get fairly acceptable 2D cards for raw display, but for anything 3D you're generally screwed. This is why I'm looking forward to multiple AGP slots: not because of other devices that may make use of the slot (just look at the failure of VESA Local Bus for that mistake), but for added video capabilities without having to resort to a lackluster multihead card, an expensive-as-all-hell multihead card *cough*MatroxParhelia*cough*, or a lackluster PCI card. I will be able to buy about $200 worth of video cards to get decent multihead performance. This, above all else, is what looks to be really cool...
  • Most software is designed for equipment with considerably less video mem. In some cases, you could probably get the data across fast enough using PCI.

    Even if a scene does have a lot of textures, clever memory management in the application can make sure that polygons using the same texture are drawn sequentially, meaning the load on the AGP bus stays quite small, only having to deal with new textures (see the sketch below). Even if clever memory management is not used, a scene containing every texture in memory at every LOD will not happen in any real-world situation.
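
    The sketch: batch the draw list by texture so each texture is bound (and, worst case, uploaded over AGP) once per frame rather than once per polygon. The draw-call objects and callbacks here are hypothetical:

    from itertools import groupby

    def draw_sorted(draw_calls, bind_texture, draw):
        draw_calls.sort(key=lambda c: c.texture_id)
        for tex_id, batch in groupby(draw_calls, key=lambda c: c.texture_id):
            bind_texture(tex_id)  # any AGP upload happens here, once per texture
            for call in batch:
                draw(call)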
  • by t0qer ( 230538 ) on Monday October 21, 2002 @07:42PM (#4500275) Homepage Journal
    Somewhere on nvnews.net (the official, unofficial support site for nvidia X drivers) I read that the X drivers support 16 cards running at once.

    I can think of several applications for this, starting with the 3dfx approach to boosting 3D performance by having each card take turns drawing a scanline (SLI).

    There is also a possibility that 3D displays on the horizon will require more information to draw the screen (twice as much, because the scene has to be drawn once for each eye).

    Another possibility is for game house use. Standard Counter-Strike game houses charge about $3/hr to rent a machine to play CS. If a player could rent a machine with a wider FOV from multiple monitors, the operator could charge more to cover the costs of the extra graphics cards. I would gladly pay $20/hr to be able to play Doom 3 in a pseudo-holodeck environment.

    Well, that's my 2 cents into the fray.
    • Due to various driver and graphical resources and features needing to be present on all cards, the combinations allowed and the features supported (in multihead mode) are quite limited. Usually the driver supports a few models of card, and you end up with a very fixed system that you can't expand later.

      2D has been supported to varying degrees in X and Win98 for some time, allowing the desktop to span multiple cards from different vendors, with varying amounts of acceleration: blitting is easy, while other features often fall back to software. Video overlay can be broken, degraded, or only work on the first monitor.

      The situation is worse for 3D. Some dual-or-more setups will only 3D-accelerate the first monitor, or the monitors on the first card. FWIW, MS Flight Sim does 3 heads, but it's in 2D mode.

      All support for 3D multihead so far is pretty much driver-based, when graphics-library implementation support (OpenGL) would be more appropriate.

      The DRI [sourceforge.net] is hoping to implement a more general system where accelerated features are exposed on all heads, at all times, span cards from different manufacturers, and can share/use/display multiple applications at the same time.

  • by WolfWithoutAClause ( 162946 ) on Monday October 21, 2002 @07:45PM (#4500287) Homepage
    Well, a bit, sometimes; but mostly no. First you have to realise that modern graphics cards have a tonne of on-board memory. The on-board memory is used to store the current view, another view that the card is rendering (which is switched to only when complete to avoid flicker) and the distance that each pixel is from the user (the z-buffer). It also stores flattened out texture maps that the graphics card rotates into place on the rendering buffer.

    The texture maps usually take up the most memory, and they can change depending on the position of the player and even which direction he is looking in.

    The position of the objects is sent every frame but shows less variability.

    But the texture maps need to be transferred into the graphics card's memory once before they can be rendered.

    So this happens initially, when the texture first appears; after that it's in memory and doesn't need resending until it is flushed because it is no longer in view and something else needs the space.

    But just occasionally new textures are needed. For example, sometimes in, say, Half-Life, I used to spin around and the screen would stop updating for maybe 1/8 of a second. What was happening was that the wrong textures were in the graphics card, and the right ones were being pushed down the AGP 1x pipe as fast as it could take them, which was not really fast enough. I'd often get a rocket launcher up me while the screen had stopped updating for just a moment.

    Of course, now the graphics cards have more memory, the software may be written better so that textures get preloaded before they are needed, and probably most or all of a level's textures fit into the card's memory anyway. So all in all, little or no waiting when spinning around; and AGP is now 4x as well, so instead of 1/8 of a second we are looking at 1/32 worst case: only about 31 milliseconds, which for a one-off jitter isn't perceptible.

    John Carmack has talked about the idea of generating texture maps dynamically. If he were to implement this, then AGP would be much more important. Right now, precalculated, fixed texture maps are much more common in games. Bottom line: who cares about AGP 8x? Like ATA133, it makes no difference to nearly everyone.
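
    The residency scheme described above is essentially an LRU cache. A sketch (the capacity and the upload callback are hypothetical; assumes any single texture fits in card memory):

    from collections import OrderedDict

    class TextureCache:
        # Card-memory texture residency: evict the least recently used.
        def __init__(self, capacity_bytes):
            self.capacity = capacity_bytes
            self.used = 0
            self.resident = OrderedDict()  # texture_id -> size in bytes

        def touch(self, tex_id, size, upload):
            if tex_id in self.resident:
                self.resident.move_to_end(tex_id)  # on the card: no AGP traffic
                return
            while self.used + size > self.capacity:  # flush stale textures
                _, old_size = self.resident.popitem(last=False)
                self.used -= old_size
            upload(tex_id)  # the slow part: texels go over the AGP pipe
            self.resident[tex_id] = size
            self.used += size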

    • John Carmack has talked about the idea of generating texture maps dynamically.

      I'm not sure if any of the Quake games so far (or maybe Doom 3) have used it, but the original Unreal had dynamic/procedural textures all over the place. Fire, fountains, and pseudo-particle effects were all handled by dynamic textures. Some of them were even fractals, IIRC...

      Unreal worked fine on computers that more often than not did not have AGP, but then the areas with dynamic textures were small portions of the screen and rarely (if ever) larger than about 256x256 or so.
  • As evidenced by the NetBSD support, AGP is essentially a PCI-bridge-plus-frills with only one PCI device on it: your graphics card. It also adds snazzy stuff like the command FIFO (which if you study I2O, you will note is generally useful and not just a graphics processor concept).

    The electrical simplicity of supporting only one card plays a large part in allowing it to be so much faster than the normal PCI bus. It's only a matter of time before you end up wanting AGP speeds for:

    1. graphics
    2. disk
    3. network

    Since the most normal PCI slots you want on a single bridge is four, you could have a reasonably balanced motherboard with 3AGP+4PCI ... assuming the expansion card vendors agree to make AGP versions of things.

  • by Vegan Pagan ( 251984 ) <deanasNO@SPAMearthlink.net> on Monday October 21, 2002 @08:06PM (#4500424)
    The point of fast AGP is letting VRAM act less like RAM (big and slow) and more like cache (small and fast). However, games are currently programmed for the former setup, so AGP 8X won't improve performance yet, nor will cache-like VRAM.
  • Just a thought... (Score:3, Interesting)

    by zannox ( 173829 ) on Monday October 21, 2002 @08:28PM (#4500570)
    But how about we get AGP 4X working... Come on, of all the /. crowd, a good part of you are running AMD CPUs, and there is a good chance that the AMD CPU is running on a VIA chipset. Anyone that does will know EXACTLY what I'm talking about. Windows/Linux, doesn't matter: you can set your card to AGP 4X, but you "may run into instabilities or other irregularities"... They (NVIDIA/VIA/Microsoft/Linux) all say to put it at AGP 2X "because there is very little difference between the two and the frame loss is minute."

    So do I:

    A) Believe the above, and think their email and tech support are liars
    B) Believe the above is a load of crap and all those crashes I have with AGP4X are a figment of my imagination; that when I set it to AGP2X they go away, and 3DMark 2001 shows less than 20 points difference between AGP2X & AGP4X

    Just think: with AGP8X, I can finally cause a system seizure on more than one freakin' $399.99 card. And in more than one OS! Yeah!
  • Graphics Arrays? (Score:2, Informative)

    by yokem_55 ( 575428 )
    The upcoming specs for DX9 and OGL 2.0 have features (128-bit color, displacement mapping, much bigger shader program support) that can begin to render in real time stuff that used to be possible only on the massive render farms owned by folks like Pixar and DreamWorks SKG. However, the first chips that implement these features, the Radeon 9700 and nVidia's NV30, likely don't have sufficient performance to make heavy use of these features realistically using only one chip.

    However, when using AGP 3.0 (AGP 8x), it is possible to put more than one AGP device on a port, and thus massive SLI configurations can be made, realistically enabling heavy use of the new DX9 and OGL2 features. ATI or nVidia may design boards with 4 or 8 chips per board, all running off of one AGP slot (it would probably require an external power supply), that they could sell for a few grand apiece to companies wishing to get into the realtime, high-fidelity, near-realistic 3D graphics business.
  • It takes a while (Score:2, Insightful)

    There wasn't really a noticeable improvement from PCI to AGP, or AGP 1x to 2x. What you see is cards getting faster, and you assume it's the silicon. The fact is that the faster bus is required to support the faster cards. The bus itself won't do squat for you, but a GeForce 9 ain't running on AGP 4x.
  • How about being able to make smaller graphics clusters?

    WireGL comes to mind, but apparently it is now part of the Chromium project at http://sourceforge.net/projects/chromium/

    I figure that would be a groovy way to make use of multiple ports.

    Anyone want to donate $150,000 to me for researching how cool Quake 3 or UT2003 looks on 16 monitors? Uh... for purely scientific research, of course.

  • I'm envisioning a server....
    It's got 2 AGP ports...
    And a PCI graphics card?!?!!
    Dual Gigabit AGP NICs*...
    Mmmm...Erotic Cakes....

    -D

    *Assuming someone's smart enough to make them
  • How about game makers, or other makers of highly CPU-intensive programs, use it as a cartridge slot? Sure, it needs to be moved so that it is in a good position. Remember the old computers that could play Mega Drive games? Do this, but the cartridge is actually some sort of dedicated hardware for use by the application. Make them hot-swappable and it would be great. Hmm, need some more graphics power? Plug in my graphics cartridge. Need to do some DSP? Plug in a different cartridge. We really need a versatile port for something like this.
