AGP4X vs. AGP8X 181
An anonymous reader writes "With upcoming chipsets such as the SiS648 claiming support for the latest AGP8X standard, we asked ourselves if there were any performance benefits. We took the SiS648 and Xabre 400 reference boards, modified them and compared the results." I can't even get 4x stable under XP, so I figure 8x is half as likely to let me play NWN :)
Can't get AGP 4x stable? (Score:4, Funny)
Re:Can't get AGP 4x stable? (Score:1)
Re:Can't get AGP 4x stable? (Score:1)
Re:Can't get AGP 4x stable? (Score:2)
I believe you're knocking XFree86 here, and not Linux. But you're right; as a Linux user I can attest that XFree86 sucks when it comes to driver stability.
I'll get back to you when XDirectFB can run KDE apps...
Re:Can't get AGP 4x stable? (Score:1)
FWIW, I've found that dropping out of KDE (since you mentioned KDE, I assume you're running it) and into TWM for running games solves all of my stability problems. Quake3 will *always* lock up within 10 min. while KDE is running, but I can play for hours under TWM with no problem at all.
Re:Can't get AGP 4x stable? (Score:1)
You show me a better Linux alternative to XFree86 and I will agree with you. Currently that's like saying "you are bashing Windows Explorer, the API, hardware compatibility, and the desktop, not Windows." I don't think it would be practical to scrap XFree86, but a fork without support for remote desktop or backwards compatibility might be interesting.
Re:Can't get AGP 4x stable? (Score:1)
Accelerated X Display Server v6.0
Designed for desktop platforms, this X Window System
server and associated graphics drivers is the latest
in a long line of premium X servers for Linux and UNIX
system installations.
Supermicro boards? (Score:2)
Sometimes boards that have some of the features you want don't have all the features you want, and when you spend a lot on a good server/workstation board, you can't always jump to the newest standard on a whim.
Re:Supermicro boards? (Score:1)
Re:Supermicro boards? (Score:1)
Re:Supermicro boards? (Score:1)
Re:Supermicro boards? (Score:2)
What's AGP? (Score:1)
Re:Can't get AGP 4x stable? (Score:2)
But by now he is designing his own GeForce 6 card.
Re:Can't get AGP 4x stable? (Score:1)
Re:Can't get AGP 4x stable? (Score:2)
Re:Can't get AGP 4x stable? (Score:2)
AGP8X (Score:4, Insightful)
Re:AGP8X (Score:4, Funny)
Re:AGP8X (Score:1)
It's like doing a for i in *; do rm -f "$i"; done instead of just rm -f * (note the quotes around $i, or filenames with spaces will trip it up). The net result is the same, but the execution times are probably a bit different (depending on the number of files, etc.).
Re:AGP8X (Score:1)
what the hell are you talking about? bubble sort?
Actually, I've seen bubble sort being used for vertex sorting in a real game.
Re:AGP8X (Score:1)
Re:AGP8X (Score:1)
When you break it down into the amount of time spent transferring texture data per frame, you're looking at double-digit milliseconds. At that point your 64MB of on-card RAM can indeed become a bottleneck.
Re:AGP8X (Score:2)
That would be about 37MB of data per screen update, for a 60Hz refresh rate.
If you're using LOTS of high-resolution textures, you need that speed.
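(For reference, AGP8X's theoretical peak is roughly 2.1GB/s, and 2.1GB/s / 60 updates per second ≈ 35MB per update, which is where a figure in that ballpark comes from.)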
Simon
Re:AGP8X (Score:5, Funny)
The really weird part is that a few grams of wet meat at the back of your eye can actually process and perceive 2.18 gigabytes per second of information.
Then within a few milliseconds, more meat analyzes it, distills it into high-level representations, calculates 3-D trajectories, then moves meat-based servos to aim and fire weapons. All for no other reason than it seems fun.
Life is strange.
Re:AGP8X (Score:1)
Re:AGP8X (Score:3, Interesting)
--
Re:AGP8X (Score:2)
Then there are those of us who want shiny, pretty things with lots of stuff happening and the newest, most expensive hardware. Despite the fact that we're the minority, hardware companies are right there trying to keep us happy. Might have something to do with the fact that we pay 700 dollars for a video card that will be 200 bucks in 3 months.
Re:AGP8X (Score:1)
AGP 8X may seem like too much for anything mainstream to make good use of, but in a few years 8X and the NV30 will be in $500 systems. Developers will have had a few years to create better tools and learn to harness that power. Imagine then the games targeting what will by then be considered a dull, obsolete PC.
Re:AGP8X (Score:2)
I was pretty poor at the time and couldn't afford any other upgrades, so I made do. We're still going to have the UT2003s and Doom 3s to break ground (and piggy banks), but there's still going to be fun stuff coming out that uses last year's technology, or even the year before's. It's just a matter of seeing through the marketing hype machines to find the little treasures out there. Like Bejeweled... that game is like crack!
As for RCT, that game will run on practically anything, and it's great fun to boot.
Re:AGP8X (Score:1)
why only 8X?! I want at least 32X! now! (Score:1)
He thought: "let's not aim for the very best we can imagine or produce... instead, upgrade a little bit at a time here and there and make customers think they get something special when "4X" becomes "8X", etc."
I mean, don't they have the resources for more than a meager 8X? I understand that making things parallel is a bit costly, but still, they could try to make something significant instead of this... this... yuck!
It's a conspiracy, I say! Large manufacturers are the only ones who could make something like AGP32X happen, but they don't want to give their bleeding-edge knowhow away; they want to keep some room to maneuver in case something unexpected happens, like unknown little companies releasing something revolutionary.
Re:AGP8X (Score:2)
We don't own a part of any hardware company, but we ARE bundled with many higher-end video cards, both consumer level and higher. If we don't support their standards, we aren't bundled and we lose exposure to thousands of potential customers (never mind OEM revenue). We are the text generator in Final Cut Pro; Apple certainly expects us to keep up with the rest of the market.
stop slashdotting sites its not funny! (Score:4, Funny)
Thanks for slashdotting the server. Thanks a lot.
I was wondering why the sudden slowdown when it's 4am here (Singapore); I launch a new browser window, and the first thing I see is the AGP 4x vs 8x article, and the page linked is hardwarezone.
Re:stop slashdotting sites its not funny! (Score:1)
(Score:3, Funny)
hrmn. That don't seem right.
whats the deal with NWN for linux? (Score:1, Offtopic)
Re:whats the deal with NWN for linux? (Score:4, Funny)
"Don't hold your breath waiting for it. When we announced it we didn't realize that Linux owners are all cheap fucks who don't pay for games."
Re:whats the deal with NWN for linux? (Score:2)
The Linux dedicated server will be distributed freely online, as close to the game being available in stores as possible. The Linux client will follow shortly thereafter. Linux users will need to own a Windows copy of Neverwinter Nights, as the Linux executables must import certain resources from those Windows CDs. All users will need to register their CD Keys (Linux users register the Windows CD Keys) at the Neverwinter Nights community site (www.neverwinternights.com). The Macintosh version will be available later in the fall (BioWare is completing the Macintosh Neverwinter Nights client and server programs, MacSoft is completing the Toolset).
weeee (Score:2, Insightful)
What I get worried about with these upgrades is that they're going to come out with games that actually require them! And then I'm screwed.
Personally, I find it interesting that it continues to seem like every card *needs* more bandwidth, more power, etc. (and yes, I know these cards operate at lower voltages, but still...). Someday I'm going to need that special SOUNDBLASTER QUASIEXTAGYWITHCHERRIESONTOP made for the SPECIAL SOUND CARD BUS WITH MORE BANDWIDTH. I dread the day when I need a special slot for every type of card I want.
So go-go with the ultra-uber-bandwidth increases, but keep that backwards compatibility! I like PCI graphics cards sometimes!
Re:weeee (Score:1, Interesting)
I doubt 8X AGP is going to make a really big difference, at least not for a while to come. This comparison uses mostly synthetic benchmarks, not actual games, and these sorts of benchmarks usually show a more pronounced performance gap than real games do. Actually, if you look at a few of the benchmarks for Serious Sam and Quake3, 4X AGP is benchmarking nominally faster than 8X.
They should've tested with some more modern, more demanding games to give a clearer picture of whether 8X is actually much of a help. They're right, though: we're going to have to wait for the RADEON 9700 and NVIDIA's NV30 before we can tell whether it really is going to make a difference. I'm putting my money on "no."
Re:weeee (Score:2)
That article would have been about a million times more useful if they had bothered to show the performance of an AGP 2x system. From what I can see, there is no compelling performance increase with this new standard (5% better doesn't compel me to buy a new mobo).
Anyone have an idea as to how AGP 4x compared to 2x when it first came out, and how it stacks up now that the technology is mature?
Actually, I just got a new AGP 4x video card, and I've been thinking about dropping it into my old Dell workstation with 2x AGP to see how it does. But of course that computer has a PII-350 and my new one has an Athlon XP 1600+, so it wouldn't really be too useful a comparison...
Re:weeee (Score:1)
Re:weeee (Score:3, Insightful)
Since the only noted difference is at lower resolutions, the gfx core is the slowdown at any higher resolution, which means the gfx core has to get a lot faster before AGP bandwidth becomes the actual bottleneck. That means one or two gfx core generations until you'll need faster AGP.
So don't worry: there'll be no requirement for AGP 8x in any game that wants to sell more than a dozen copies for at least the next three years.
Feh (Score:2, Funny)
OT: SiS rocks (Score:5, Interesting)
Re:OT: SiS rocks (Score:3, Interesting)
But I had a MB based on the SiS 530 chipset and it was nasty. It was basically a cheapo bargain board. It sounds like they've improved substantially since then.
Re:OT: SiS sucks rocks (Score:3, Interesting)
As to another poster who says he likes SiS because of the "low heat/low power"
Re:OT: SiS sucks rocks (Score:2, Interesting)
Re:OT: SiS sucks rocks (Score:2)
Sometimes a company sucks for years, but suddenly gets better. Maybe SiS has done so while I wasn't looking. But when I go to a computer show and examine dozens of motherboards, and the ones that have clearly cut corners are mostly SiS-based, it doesn't produce a sense of confidence in their product.
And I wasn't trolling (I *never* troll). I've been building computers for 9 years (and make part of my living that way) and what I posted are my consistent observations over that timespan.
Re:OT: SiS rocks (Score:1)
OT: p4 License (Score:2)
One correction, though: last I heard, it was not a proven fact that Via doesn't have a "valid" P4 license. Via claims the license is valid because they purchased S3, and S3 had a license. Intel claims S3's license was not transferable. It seems the case is still up in the air, and the lawyers will have to sort it out. Via does seem to have a reasonable claim to the license, however.
Re:OT: SiS rocks (Score:1)
Re:OT: SiS rocks (Score:1)
Scott obviously doesn't know a whole lot about mass-producing hardware. One of the key factors that determines cost is the number of components: fewer components = less money. A single-chip solution will almost always be cheaper than two chips. Most likely there is some technical difficulty preventing SiS from making a single-chip P4 chipset.
Re: Scott (Score:1)
Re: (Score:1)
Are these good tests to be using? (Score:2, Insightful)
Video Card Limited!!! (Score:4, Informative)
I'd definitely take this with a grain of salt until someone can do a 4x/8x review with an NV30 or an ATI 9700.
What kind of hardware guy looks at this and doesn't say "WTF, Xabre 400?? What kind of video card is that to benchmark anything?"
Hopefully the
-- D3X
Re:Video Card Limited!!! (Score:1)
The moral of the story is _research_, children.
Re:Video Card Limited!!! (Score:1)
-- D3X
Re:Video Card Limited!!! (Score:2)
The kind of hardware guy who knows the Xabre was the first video card to support AGP8x and is still one of very few that do.
difference? (Score:1)
This guy is smoking crack. All of the charts are virtually identical. Maybe a different person wrote the writeup from the one who made the charts? Or he's on the payola. Either way, there was practically no difference.
AGP4x VS AGP8x. (Score:4, Insightful)
Well, when AGP 1x was out, people didn't find it very useful because it wasn't fast enough.
AGP 2x was okay to offload the PCI bus and do some basic stuff, but not fast enough for high-speed games and transferring large chunks of information.
AGP 4x seems to be okay for today's technology, and AGP 8X seems like overkill, but I personally think it's finally what it should have been from the start: a *VERY* fast graphics port on which the bandwidth bottleneck doesn't become an issue *at any resolution*, and one that helps cut costs in fields besides gaming. One example: uncompressed video editing at 1600x1200@24 bits (or more, for film and for newer cards with better colorspace) @60FPS. Right now you need exotic hardware for this, especially for uncompressed playback. Say you'd want to invest in a fast Ultra320 array (OK, you'll say that if you can do that, you can afford the exotic hardware as well, but the point here is actually CUTTING the price, and this is one way); well, now you could get way more drives for your system.
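(For scale, assuming plain 24-bit color: 1600 * 1200 pixels * 3 bytes * 60 frames/s ≈ 346MB/s of pixel data alone, before any readback or deeper bit depths; that's already about a third of AGP 4x's theoretical 1.06GB/s.)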
There are many more examples, but the main idea is that new features are coming to cards (bigger bit depth, better this and that) that are going to choke the bandwidth, and 256MB on a card won't be enough in the not-so-distant future. Using system memory at almost local-memory speed increases quality and possibilities tremendously, and while we don't see much use right now, I'm sure it won't take long after 8x is installed before we see a use for 12x or 16x.
Not fast enough? (Score:4, Informative)
AGP 2x was okay to offload the PCI bus and do some basic stuff, but not fast enough for high-speed games and transferring large chunks of information.
Not fast enough to be useful? What reviews were you reading?
Back when AGP 1x and 2x were rolled out, they were found to be marginally useful because the graphics card was the bottleneck. This is true even today. Fill rate is still almost invariably the bottleneck for performance, and CPU power for geometry and physics is usually second.
The original intent of AGP was to transfer textures across the bus, with the card's texture memory just a cache of this data. But this is a _bad_ thing to do - bandwidth and especially latency of a card's on-board memory is likely to be much better than AGP transfer bandwidth and latency, so nobody in their right mind writes games that require streaming textures from system memory. This isn't going to change - the memory in your PC is optimized for being big. The memory in your graphics card is optimized for being fast. Even with a zero-latency, infinite-bandwidth AGP port, local memory is better.
All AGP is used for now is to transfer geometry data, and it's plenty fast for that (cards are still generally fill-rate limited). With on-board transformation and lighting, and further folding-in of the graphics pipeline on the way, the amount of data that needs to be transferred per frame is going to get _smaller_, not larger.
Very high AGP transfer rates are a marketing bullet-point, and not much else.
Oh, and if you're editing a 1600x1200 movie on a PC, you're limited by your disk transfer rate. No way are you storing *any* significant chunk of that in a PC's RAM.
Re:Not fast enough? (Score:2)
Ever heard of PCI-X and aggregated (i.e., many in parallel) Ultra320 arrays? Add lossless compression that yields anywhere from 1:1 up to 4:1 depending on the data.
Plus, I was merely stating an example. Add some funky stuff to process on the graphics card or CPU before displaying, and you *MIGHT* need the extra bandwidth back and forth between the memory, gfx card, and CPU to process the information PRIOR to dumping it on a display. Of course you'd also want plenty of RAM to buffer the whole thing. You can add mathematically LOSSLESS compression (like a ZIP codec, for example) to the video stream coming from the array, effectively doubling (in most cases) the amount of data coming in (let's see, "double" PCI-X bandwidth... yep, that's a lot of data). Of course you need a quad-CPU system to do all of this in real time (or a very powerful dual system).
As I've stated, it's easy to blast ONE given scenario; I'm sure a lot of people here could give you many scenarios where 8X is welcome. In my case I'd have to break a (blah!) NDA to illustrate a very specific case in detail, but the point about the increasing complexity, bit depth, and quality/functionality of newer graphics cards still stands.
About 2x not being good enough: the latency and all is a big problem for GAMES, yes, and your specific example for GAMES is right. But for OTHER stuff, 2x was too SLOW; with or without the latency issues, the bandwidth was just too little. The numbers in theory were good, but in practice, with all the other processes going on, you had to cut the given numbers almost in half. Anyway, you're right about the gaming issues and the fact that these GAMING cards couldn't perform. I was thinking ASIDE from gaming: professional equipment, HDTV editing, framebuffers, etc.
Re:Not fast enough? (Score:2)
Ever heard of PCI-X and aggregated (i.e., many in parallel) Ultra320 arrays?
Of course you need a quad-CPU system to do all of this in real time (or a very powerful dual system).
Quite the "PC" there. *smirk*
I repeat: nothing that you're going to do real-time video editing on at that resolution will *have* an AGP bus (or cost less than about ten times what a home PC costs).
All you're doing is supporting my case.
Re:Not fast enough? (Score:2)
AGP Port ?...... (Score:1)
Re:AGP4x VS AGP8x. (Score:4, Interesting)
With texture memory creeping upwards in 3D cards we should eventually see a point where all textures can be stored on the card and sending textures over AGP should be rare.
However, sending geometry is usually done per-frame in most 3D games, and you'd be surprised how much all of those triangles can add up.
1M triangles, each with 3 vertices; each vertex carries a position, a texture coordinate, and a normal vector (sometimes more), with each vector comprising 4 floats and each float 4 bytes.
1,000,000 * (3 + 3 + 3) * 4 * 4 = 144,000,000
That's 144MB per frame. At 30 frames per second, that's about 4.3GB per second.
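A quick sanity check of that arithmetic as a standalone C snippet (the 30fps figure and the three-vectors-of-four-floats vertex layout are just the assumptions above):

    #include <stdio.h>

    int main(void) {
        /* assumptions from the post: 3 vertices per triangle, each carrying
           3 vectors (position, texcoord, normal) of 4 floats of 4 bytes */
        double bytes_per_tri   = 3.0 * 3.0 * 4.0 * 4.0;          /* 144 bytes */
        double tris_per_frame  = 1e6;
        double fps             = 30.0;
        double bytes_per_frame = tris_per_frame * bytes_per_tri;  /* 144,000,000 */
        printf("%.0f MB/frame, %.2f GB/s\n",
               bytes_per_frame / 1e6, bytes_per_frame * fps / 1e9);
        return 0;
    }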
Now, granted, 1M tris per frame is way high for today's games. Most current games push around 30k per frame, never more than 60k. My friend and I are doing closer to 300k and are already starting to become AGP-bandwidth-limited.
Anyway, you are right: you can't have too much bandwidth to your video card. I'd love to be able to push a full 1M tris/frame, and I'm sure I will be able to soon. Just not yet, and not even with AGP 8x, in all likelihood.
Justin Dubs
Re:AGP4x VS AGP8x. (Score:1)
Re:AGP4x VS AGP8x. (Score:2)
Also, at the ~300k tris/frame we are pushing, at the ~30 frames per second we are getting, that works out to about 1.3GB/s of bandwidth. And that's assuming perfect efficiency; we are probably using over 1.5GB/s.
The triangle data we are pushing is stored completely in RAM and is moved via DMA to the vid card. The scene itself is static, you are correct, but the camera can move, as that doesn't require us to change the geometry at all.
We just push the new Model-View and Projection matrices, then push the same 300k-tri scene and let the video card transform the triangles for us.
So, you are right: being AGP-bandwidth-limited isn't a certainty. In fact, it is likely that we are video-card limited. But regardless, at 1.5GB/s we are getting close to pushing AGP 4x to the max, and when the next generation of vid cards comes out that can push twice as many tris/sec, we will need AGP 8x to keep them saturated.
Justin Dubs
Re:AGP4x VS AGP8x. (Score:2)
'Cause no one else has been able to devise this algorithm. At least, not so that it will run in real time.
Also, the math for the triangles is done on the GPU (the processor on the vid card) rather than your CPU for almost everything. You just push the Model-View and Projection matrices onto the matrix stack, push the geometry, and OpenGL will make sure your video card does the transformations for you, which it can do in hardware much more efficiently than your CPU can.
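In classic fixed-function OpenGL, that pattern looks roughly like the sketch below. The calls are standard GL 1.x, but the function name, matrices, and vertex array are placeholders, not anyone's actual code:

    #include <GL/gl.h>

    void draw_frame(const GLfloat projection[16],  /* column-major 4x4 */
                    const GLfloat model_view[16],
                    const GLfloat *vertices,       /* 3 floats per vertex */
                    GLsizei vertex_count)
    {
        /* camera moved? just reload two matrices... */
        glMatrixMode(GL_PROJECTION);
        glLoadMatrixf(projection);
        glMatrixMode(GL_MODELVIEW);
        glLoadMatrixf(model_view);

        /* ...then push the same static geometry every frame;
           the card's GPU does the transform in hardware */
        glEnableClientState(GL_VERTEX_ARRAY);
        glVertexPointer(3, GL_FLOAT, 0, vertices);
        glDrawArrays(GL_TRIANGLES, 0, vertex_count);
        glDisableClientState(GL_VERTEX_ARRAY);
    }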
Justin Dubs
my Tribes2 + X (Score:3, Interesting)
Reading my X log, I noticed that DRI was using 1X mode for AGP. After some RTFM, I found the option to kick it into 4x.
Anyway, the point being, it didn't help speed up the game gfx (well, I didn't notice much difference).
In case you're wondering, for ATI cards the XF86Config option is:
Option "AGPMode" "4"
Also I noticed:
Option "AGPSize" "32"
But I can't tell if setting this bigger than the RAM on the card helps or not (maybe that's the buffer size option?); I was hoping to let the card borrow more of my system RAM (which is PC100, slow compared to a gfx card's DDR, but better than the hdd =)
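For context, both options live in the Device section of XF86Config; a full section might look something like this (the Identifier is just a placeholder, and as I understand it AGPSize is in MB and sets the AGP aperture the driver can DMA into, not the card's on-board RAM):
Section "Device"
    Identifier "ATI Radeon"
    Driver     "radeon"
    Option     "AGPMode" "4"
    Option     "AGPSize" "32"
EndSection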
Anyone know any other good opts to help eke out more speed?
Re:my Tribes2 + X (Score:2)
Also, regarding AGPSize (which I thought referred to the AGP aperture in the BIOS): I once read that setting it higher than 64MB in Linux was useless because of some X limitation. Perhaps someone with more experience can enlighten me as to why...?
Not for that video card. (Score:2)
His problem is (a) running Tribes 2 on an older Radeon, and (b) the Tribes 2 Garage Games engine being horribly unoptimized in that game.
Re:my Tribes2 + X (Score:1)
HTH,
jh
Re:my Tribes2 + X (Score:1)
CmdrTaco... (Score:2)
It's always about you, isn't it?
Re:CmdrTaco... (Score:2)
god, what are these people thinking? (Score:5, Insightful)
What are these people smoking? The vast majority of the tests are all but identical. The VERY BEST performance difference is 3DMark2001SE Pro at 800x600x16, and it shows a whopping 4.7% improvement.
Clue: In the current 3D world, AGP4X IS NOT a constraint. Even AGP2X is fine. Hell, there was an early version of the (TNT2 or GeForce 1, I forget which) that was *PCI*, for chrissake, and it was only a whisker slower than the AGP cards at the time.
Geometry transfer, it would appear, just isn't very bandwidth intensive. The only time the AGP rate is going to matter much is when doing very heavy texturing from main memory, but that just isn't happening. Instead, manufacturers are putting more and more RAM on the video card instead, and all the games are oriented around pre-loading all necessary textures in that specialized, super-high-speed RAM.
At the present 1.06GB/sec transfer rate of AGP 4X, the entire video RAM of a 128MB card can be filled in roughly 1/10th of a second. If you spend all the time, money, and effort to upgrade to AGP 8X, you can improve your load time by 1/20th of a second.
Just think...if you played 50 levels of some FPS a day, every day, you'd save over 15 minutes in your first year alone!
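(The math actually checks out: 1/20th of a second saved per load, times 50 loads a day, times 365 days, comes to about 913 seconds, or just over 15 minutes a year.)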
Obviously, this is a very important technology we should all rush out to buy. Thanks, hardwarezone.com! I'll trust you for all my technology reviews in future.
-----
AGP8X: Saving your time so efficiently, you won't even notice.
Oh come on! (Score:1)
The idea is that insanely high-resolution textures are supposed to be streamed from that gigabyte of DDR400 RAM you have backing up your 128MB GF4Ti4600. That's why we need more bus bandwidth: trying to stream hundreds of megs of textures before the next frame needs to be rendered requires absolutely insane amounts of it.
Re:Oh come on! (Score:2, Interesting)
It may help doing background loads of 'seamless transition' games, but even so.... unless you're trying to stream all these textures out every frame, it's not likely to help much. AGP 4x can fill a 128MB card in 1/10th second; 8X can do it in 1/20th. Unless you get to the point of multiple updates per second, it's just not going to matter very much. Developers will use good caching algorithms and reasonably careful level design to work around AGP speed issues.
Streaming textures IS a pretty cool idea, and I would like to see games that use them. Maybe Doom 3 will, but it hasn't sounded like Carmack is trying to do anything like this yet.
The reason I was so acerbic in my original comment was that the website was talking like it mattered NOW, for the apps we have TODAY (a whole 4.7% increase! in one benchmark! wow!).
In a nutshell: for everything out now and probably for another 18 months, AGP8X isn't going to matter a whit. Don't worry about it until 2004 sometime.
Re: (Score:1)
hypocrite (Score:1, Offtopic)
I don't run MS software. I run OmniWeb for a browser on Mac OS X. I don't use Office; I use AppleWorks, and if not, I could always run KOffice or even StarOffice.
Mods will probably mark this as off-topic, but c'mon, is CmdrTaco's offhand "Look, I run XP" comment on-topic? Of course, he's not as easy to moderate, I suppose.
Re:hypocrite (Score:1)
The minute an editor says something objective you jump on him for not being blindly pro-Linux?
If slashdot loses the few strands of objectivity it has left it will be of no value. They'd do no service to people by presenting propaganda. I'm quite sure the editors realize this, you'd do well to realize it yourself.
Re:hypocrite (Score:1)
Unstable in XP huh? thank god for linux (Score:1)
Status: Enabled
Driver: NVIDIA
AGP Rate: 4x
Fast Writes: Enabled
SBA: Enabled
uname -a
Linux daryl 2.4.19-gentoo-r5 #5 SMP Fri Jul 26 18:07:32 EDT 2002 i686 GenuineIntel
Nice and stable!
Re:Unstable in XP huh? thank god for linux (Score:2)
Fancy shit (Score:3, Insightful)
The whining and crying about AGP 8x is a bit premature, and the AGP 3.0 standard has been pretty much supplanted in usefulness by graphics card manufacturers. Having a dedicated high-speed port for graphics hooked up to the northbridge is a good design idea: it frees the traditionally low-bandwidth nb-sb connection from needing to carry lots of graphics data. The memory sharing available in AGP has become increasingly useless as worthwhile graphics cards have scads of local memory now. About the only thing an AGP aperture is good for is an i845G chipset board or some other cheap piece of shit HPaq sticks in their computers.
The AGP 2.0 spec isn't much of a bottleneck either. Case in point, replacing the TNT2 based video card in my dual P3 500 with a GF2GTS more than doubled the 3DMark2001 SE score from 926 to 2068. The board is an IWill DBD-100 with a 2x AGP port on it. The fillrate or poly rendering ability was not adversely affected by the AGP 2x port, the only thing keeping the 3DMark score down is the relatively slow processors (as 3DMark is single threaded) and the low FSB bandwidth.
The fillrate of an ATi R300 or nVidia NV30 isn't going to be affected much by an AGP bandwidth of ONLY 1GB/s. Most cards based on these chips will end up having >100MB of on-board memory. It won't be too terribly long before the video card in the PC has more and faster memory than the system's main memory. Even Doom3's 80MB of textures isn't going to really stress a 4x AGP card; it would take well under a tenth of a second to transfer all 80MB. Maybe AGP 8x will be on my upgrade path when loading a game's textures into the video card's local memory takes a perceptible amount of time.
Rob, it isn't Microsoft's fucking fault your AGP card doesn't work properly; you're probably stuck with some old VA Lin^H^H^HSoftware POS box. My system doesn't have any problems running reliably under Windows XP, and I don't think many other people running Windows 2000 or XP are having problems either. When do we get to mod the editors as -1 Troll?
Re:Fancy shit (Score:2)
To hit any kind of realistic graphics in complex scenery, they need to handle a minimum of a million triangles per frame. 3 or 4 million would be better (think individual points on the leaves of a maple tree).
The math is easy (points per frame * components per point * 4 bytes * fps).
Once we hit 96x AGP (and a GPU that can crunch it), we can start getting games that could be confused for a photo.
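(Sanity check, borrowing the ~144 bytes per triangle figure from earlier in the thread: 3 million triangles * 144 bytes * 60 fps ≈ 26GB/s, which really is right around 96 times AGP 1x's 266MB/s.)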
It's still pretty easy to tell them apart at a glance. Movies are certainly getting good, and stills we've pretty much mastered, depending on the artist; so maybe by 2010 or so, games will be able to start concentrating on physics hardware rather than graphics hardware.
Re:Fancy shit (Score:2)
Also, it isn't usually feasible, even with tons of processing power, to add more triangles to a scene than you need. A super-high-quality picture of a maple tree could easily be done by making a rectangle with a transparency map, bump map, and texture map fitted on top of it. A single maple leaf object can have as many instances as you need and only requires that one bit of memory for the model and maps. Wait until Doom3 and UT2003-based games hit the shelves: AthlonXP 3000+s with their GF6Pt will be outputting shit that looks like FF:TSW in realtime. Aki won't need a billion vertices, just some cool shader tricks and support for hardware transforms and patches.
Re:Fancy shit (Score:2)
Huh?? (Score:1)
Huh? The difference at 1024x768x32 and above is moot, or often non-existent. Are these guys looking at the same graphs I am?
No one plays at 800x600x16 anymore.
other uses (Score:1)
Anybody else notice... (Score:2)
No wonder they're calling a "4.7% increase" worthwhile... jesus...
utilization of new technology and APIs (Score:1)
Warning! This is probably stupid, but I am sleep-deprived and rambling...
Since many will say that one of the big issues with any large leap like 4x to 8x is utilization of that technology, I wonder about the fact that there seems to be no sign of these kinds of HW advances slowing down. In the face of this (as if that is a startling revelation), I wonder if APIs (and the drivers written to support them) would be best served by forward-compatible designs. For example, the DirectX design allows any release of DirectX to work with older versions called on it.
That is good; however, because of the WAY the calls are written (and among these annoyances is the inconsistency between versions), it is rarely an easy task to upgrade DirectX versions (or even sub-versions) within a program. It would seem silly, then, to go in and do an equivalent amount of work within a code base whose goal was to 'buff up' the pipeline and storage method.
Now, I am probably wrong... but as far as I know (I haven't honestly messed with any DirectX past 7), there is no 'bandwidth detection' that is truly open-ended, allowing maximal optimization of texture and object transfer based on the [usable] bandwidth. I know memory is checked (optionally), but what about bandwidth? Would an external library that acts as an API of APIs work, one in which you could store the algorithm implementations and constants so that, in the case of some great hardware advance, it would either recalculate them (with a config routine) or be patched to give those with the new HW toys something to play with? Would this significantly slow down the program with an added lookup layer (or more)?
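Something like the toy C sketch below, maybe (everything here is invented for illustration; actually measuring the usable bus rate and picking sensible thresholds is the hard part):

    #include <stddef.h>

    /* one function-pointer "slot" per tunable operation */
    typedef void (*upload_fn)(const void *data, size_t bytes);

    static void upload_streaming(const void *data, size_t bytes) {
        /* plenty of bus bandwidth: stream textures from system RAM each frame */
        (void)data; (void)bytes;
    }

    static void upload_resident(const void *data, size_t bytes) {
        /* scarce bus bandwidth: preload everything into on-card memory */
        (void)data; (void)bytes;
    }

    /* the "config routine": rebind the slot whenever measured bandwidth changes */
    static upload_fn pick_upload_path(double measured_mb_per_sec) {
        /* the 800MB/s threshold is an arbitrary stand-in */
        return (measured_mb_per_sec > 800.0) ? upload_streaming : upload_resident;
    }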
I only ask this because I am toying with a graphics rendering engine (toying being the key word) that, while it 'could' be used for gaming, will most likely be for rendering architecture crud. Because I am lazy, and for the sheer pleasure of seeing if it can be done, I would like to see an easier way to upgrade programs to make use of new technologies. Perhaps this could simply be a build-time-only API/tool that is a developing framework... ah, who knows?!
However (assuming this does not get modded down for [stupidity]), if anyone knows of such an existing process, toolset, or API, please respond.
Good Upgrade Path (Score:1)
Then, when your hardware starts to lag (for games), you can go out and get an 8X card that will have matured and become affordable. The performance gain from moving to 8X from 4X might be only a few percent, but couple that with a new ATI or NV card down the road and boom, you've got playable framerates again.
The impact of AGP speed (Score:4, Informative)
Bad benchmarks (Score:2)
They need to run some tests where the memory to GPU bandwidth dominates the problem. For example, open up 3DS Max, Maya, or Softimage XSI with a complicated textured scene that can't redraw at full frame rate, and see if it helps.
The big win for more AGP bandwidth should be when the board's texture memory is full and the textures spill into main memory. Typically, game textures are tuned to avoid this, but you hit it all the time with authoring tools.
A bottleneck on geometry feed from the main CPU is unlikely, since it's hard for the CPU to generate a gigabyte/second of geometry.
Re:Bad benchmarks (Score:2)
I agree; it's silly to benchmark AGP 8x on games designed to run on the current crop of video cards and at least one generation back. The ATI 9700 and the NV30 are really streaming processors with really slow memory access; games can compress textures and use low-polygon optimized models because they throw lots of programmers at the problem. Now, with floating point instead of bytes for the buffers, you'll need 4 times the bandwidth for the same performance (4 bytes per channel instead of 1), for better pictures of course. A good test would be to send a high-dynamic-range (floating point RGB) movie as a texture for a cube map, and see whether the frame rate isn't exactly twice that of AGP 4x. Or just send 10 nice 0.5-million-triangle subd surfaces; the frame rate will be dreadfully slow, even if the hardware accelerator can handle the load.
AGP for dummies (Score:1)
- Enhance AGP Performance
"Two options are available: Disabled or Enabled. The default setting is disabled. This item can improve your AGP display performance." How about that: the 'Enhance AGP Performance' option can in fact improve my performance. Who'd have guessed? No idea how or why it does it, nor why it's disabled by default. Also available are AGP Driving Controls (with a manual specification option involving settings in hex), Fast Write, Read Synchronization, and a few others. Can anyone point me to a site that might demystify some of this stuff for me? Guess-and-check plus a reboot for each combination isn't appealing...