ATI Radeon 256
snack writes "FINALLY! ATI has released info on their new graphics chip, built to take on both 3dfx and nVidia. Reading through the press release, it says that it has Windows, Linux and Mac support. There are no benchmarks yet on the Web site, but reading through the tech specs it seems that this chip will blow everything else away. It also says that over the summer, this will implement the MAXX technology. Two of these chips working in parallel... Oh, my God!"
Re:Graphics in X (Score:1)
Another way to make a contribution is to intelligently reward companies like ATI and 3dfx, who are supporting you as a Linux user by releasing register-level programming info and/or paying for quality drivers to be written. Conversely, don't fall for hype from companies who promise support and don't deliver. ATI, for example, went from honestly not supporting Linux to honestly supporting Linux and the Open Source community. Nvidia, by contrast, has taken many of you for a ride: promising support and basically stalling with crappy GLX modules, ignoring the DRI standard interface and XFree86 4.0, and in the end stonefacing all who raise these points in criticism. If you have a GeForce and you were hoping to use it in Linux, sell it to a WinD'ohs user while it's still worth something, get something else, and get on with your life. The wait is over, or infinite--which is the same thing.
History Repeats Itself (Score:1)
1. They don't support their cards. They can't even get the drivers right for Windows, for crying out loud.
2. Windows is the only OS they support. All that other bullshit you hear is a crock of shit. Supposedly they hired Precision Insight to do their 3D drivers, which were to be released this quarter. Still nothing.
3. From what I can tell, the XFree86 4.0 2D server was done by SuSE and not ATI. BTW, thanks for the specs, ATI. Why would I support a video card company that can't write its own drivers?
4. They talk about how the hardware is superb. Yes, that is nice, but I could have a supercomputer in my room, and without anywhere to plug it in, how useful is it? Makes a mighty fine paperweight, I'm sure.
5. "We are going to be the #1 Open Source leader in graphics cards." Are you serious, ATI? I like this reason not to run ATI: blatant lying. They haven't done one thing for any operating system besides Windows. Look at their record.
6. "The drivers are coming soon. We will fix this soon." Soon. Yeah, soon. It's been two years now, but soon... very soon.
It's not even worth continuing. However, I applaud the absolute idiot who helps ATI by posting a story about how their card is going to kick ass and support Linux and Windows and all that. He is an absolute dumbass.
Check the record before giving them showtime. Another reason why Slashdot has turned lame. Absolute stupidity.
-I'm a troll? yah? I_redwolf(zanee)
Re:Hmmm (Score:1)
Re:Let's play a little game... (Score:1)
None of these cards will come close to their theoretical peak performance, but they're still pretty damn fast.
Now all we need is AGP 8x, 1024-bit on-card memory buses, and 2 MB on-chip texture caches.
Re:Nice... (Score:1)
Discounting ALL online reviews is a mistake (Score:1)
The analogy anyone (meaning non-wired people) can understand is: would you trust one of those news magazines at the checkout counter that have stories about the return of Elvis on the cover? Of course not. Well, how about if your local Sun or Tribune started selling out? You'd find another one. So yeah, there are bad news sources online, but don't hurt the good ones by making blanket statements like that.
Re:Let's play a little game... (Score:1)
The PS2 only has 4 MB of video RAM.
This means that at 640x480 there isn't much room left for textures and stuff.
To avoid having to fetch textures from system RAM and thus slowing everything down a LOT, they are forced to lower the resolution to 640x240 or 320x480.
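A quick back-of-the-envelope calculation (a sketch in C; double-buffered 32-bit color and a 32-bit Z buffer are my assumptions, not Sony's published layout) shows how little is left:

    #include <stdio.h>

    /* Rough VRAM budget for a 4 MB card at 640x480, assuming
       double-buffered 32-bit color plus a 32-bit Z buffer. */
    int main(void) {
        long pixels = 640L * 480;
        long front = pixels * 4, back = pixels * 4, zbuf = pixels * 4;
        long vram = 4L * 1024 * 1024;
        long used = front + back + zbuf;
        printf("buffers: %ld KB of %ld KB; %ld KB left for textures\n",
               used / 1024, vram / 1024, (vram - used) / 1024);
        /* prints: buffers: 3600 KB of 4096 KB; 496 KB left for textures */
        return 0;
    }

Halving the vertical resolution to 640x240 frees roughly another 1800 KB for textures, which is exactly the trade-off described above.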
Considering how everyone slavers over the PS2 all the time, am I the only one who finds that unacceptable?
Re:Let's play a little game...(correction) (Score:1)
--
Re:*sigh* (Score:1)
I wish I had a nickel for every time someone said "Information wants to be free".
Re:Not So Overwhelming, After All... (Score:1)
Yes, as a long-time ATI person, I agree, their drivers exhibit much suckage. Which is why ATI will always suck no matter what they do with hardware.
Thanks for the tip on Rage Underground. I didn't know about that.
I wish I had a nickel for every time someone said "Information wants to be free".
Re:Who to trust? (Score:1)
Mankind has always dreamed of destroying the sun.
Re:HDTV? (Score:1)
Re:3dfx is basically gone (Score:1)
nVidia/SGI have a driver in the works. I've seen it working at one of SGI's "Linux University" shows. It was running a Performer demo, and it ran it very fast. :) The driver is supposed to be out in May, along with an SGI Linux workstation with their own version of an nVidia card in it.
The tech game (Score:1)
Multiple textures in 3D graphics? (Score:1)
Uwe Wolfgang Radu
Re:Dual Mode Graphics (Score:1)
hm, from the press release:
To ensure total performance dominance, the chip also includes support for ATI's patented MAXX(TM) multi-ASIC technology, enabling twin Radeon 256 chips on a single graphics card. The new chip will appear this summer in a range of board products.
I didn't see any mention of multiple boards, but if they can get the circuitry for two Radeons into one ASIC, I'm sure they could find a way to run them in parallel on one video card. Then again, if the Radeon lives beyond the hype, we shall see.
But for Linux, consider your alternatives: nVidia insists they will bring killer drivers to Linux (I have been waiting a long time now and am tired of their development efforts for Linux). Question for somebody else: hasn't ATI opened the specs for their older cards? Aren't they contracting Precision Insight to bring drivers to Linux (XFree86 4.0)?
Re:Geez, 128? (Score:1)
Bullcrap! Current GeForce cards actually smoke the "pro" OpenGL cards (like 3Dlabs and Evans & Sutherland), except maybe with regard to driver stability and "advanced" features like AA lines, hahahaha...
A lot of DCC professionals will testify to this.
ATI will never be able to make a professional OpenGL card, and neither will 3dfx.
Check out the facts before you start mumbling on Slashdot.
Here's another preview (Score:1)
Re:So Surprising... (Score:1)
I am using ATI All-In-Wonder AGP version for my RH 6.1: It's ok.
2D performance is a little low when you shift desktops; perhaps some more video RAM would have helped. TV is working, although the (third-party open source) TV program is somewhat rough. I can't tell you about 3D performance because I am not using any 3D programs. There are some not-so-active Linux projects working with ATI cards (GATOS), and some optimized ATI drivers are on their way together with XFree86 4.0.
The main reason for me to choose ATI was the ergonomic qualities of ATI products: refresh rate >= 100 Hz at 1024x768 resolution or better (I think nVidia does that too).
All-In-All: All-In-Wonder is an ok card. Recommended.
-Claus
Re:So Surprising... (Score:1)
Re:So Surprising... (Score:1)
Re: (Score:1)
Re:Let's play a little game... (Score:1)
But the average PS2 resolution is 640x240x32 (16-bit color, 16-bit Z), which is as high as NTSC televisions will go. There is also a 3.2 GB/s bi-directional bus between the GS and the EE. Technically, it would be possible to render a section of the framebuffer on the GS and then page it out to the EE. Then the EE can do video post-processing (non-photorealistic rendering and complex antialiasing come to mind as possible applications) and output a framebuffer to the GS for display.
There are plenty of techniques to work around the 4 MB of VRAM on the PS2; you just need to know about parallel processing, multithreading, and paging. Unfortunately, since most game developers are still self-taught, these aren't common knowledge in the industry. When Phil Harrison guest-lectured at a CS class of mine, he basically went so far as to say that over half of current game developers don't have nearly what it takes to get the most out of the PS2.
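For the paging part, here is a platform-neutral C sketch of the general idea (hypothetical names throughout; this is not actual PS2/GS code): keep all textures in main RAM, treat VRAM as a small cache, and evict the least-recently-used texture when a new one must be brought in.

    /* LRU texture paging: VRAM as a cache over main-RAM textures.
       Texture ids are assumed to start at 1 (0 means an empty slot). */
    #define VRAM_SLOTS 16                /* small on-card texture pool */

    typedef struct {
        int  tex_id;                     /* occupant (0 = free)        */
        long last_use;                   /* frame stamp for LRU        */
    } Slot;

    static Slot pool[VRAM_SLOTS];
    static long frame;

    /* Stand-in for the DMA transfer a real engine would issue. */
    static void dma_upload(int tex_id, int slot) { (void)tex_id; (void)slot; }

    /* Call before drawing with tex_id; returns its VRAM slot. */
    int touch_texture(int tex_id) {
        int i, victim = 0;
        for (i = 0; i < VRAM_SLOTS; i++) {
            if (pool[i].tex_id == tex_id) {   /* already resident */
                pool[i].last_use = frame;
                return i;
            }
            if (pool[i].last_use < pool[victim].last_use)
                victim = i;                   /* oldest slot so far */
        }
        dma_upload(tex_id, victim);           /* page the texture in */
        pool[victim].tex_id = tex_id;
        pool[victim].last_use = frame;
        return victim;
    }

On the PS2 the upload would be a GS DMA transfer overlapped with rendering, which is where the parallel-processing and multithreading knowledge comes in.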
Re:Let's play a little game... (Score:1)
Typically it will end up being 1 matrix per triangle list (to include animation and translation), which will probably drop the matrix read bandwidth requirements by a factor of 30-40.
Re:Not So Overwhelming, After All... (Score:1)
Re:Not So Overwhelming, After All... (Score:1)
--
Hardware HDTV (Score:1)
The only way HDTV is ever going to catch on in the US in the next few years is if we start using the only tubes currently in households that can handle even 50% of the resolution: your computer monitor.
3D Graphics boards (Score:1)
It started when I bought a Matrox G200, which was advertised as having a full OpenGL implementation. Thinking "woah, this'll mean I'll be able to run Lightwave at a great pace!" I bought one.
Turns out, as with a lot of hardware these days, that the OpenGL drivers weren't ready when they shipped the product, and they only actually released them when the G400 shipped. That's about a year later.
Suffice to say I'll never buy another Matrox product again.
The nVidia GeForce is now out, supposedly bringing the wonders of hardware T&L to the world. Well, I have yet to see anything, bar nVidia demos, that actually uses the geometry acceleration. Why? "Oh, the drivers that support it aren't ready yet."
And on my current TNT2 card, I have to use the 2.08 drivers (last time I looked, the drivers were up to release 3.58) because the later drivers break OpenGL 1.2 compliance.
How long does it take to get a decent set of drivers for a chipset???
And if I were a betting man, I'd put $100 on there being no way ATI ships a product with all features enabled and working in the first release.
Re:Dual Mode Graphics (Score:1)
Errrr... you'd need a motherboard with 4 AGP slots, right?
-
Ekapshi.
Re:So Surprising... (Score:1)
http://www.ultimatechaos.com/ucfx/
-
Ekapshi
Re:WoW! (Score:1)
Geez, 128? (Score:1)
Alright, I'm not an expert on video hardware. I always assumed that the amount of memory a card had was related to its max resolution (800x600, etc.); at least that's the way old cards were. I got into an argument with some guys in my office about whether a card with a ton of memory (say 64 MB) would increase the performance of non-realtime rendering (say Bryce or something). I keep wondering how video memory would improve something like that. I know you can cache textures in video memory, but what else is it good for?
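The resolution half of that assumption is easy to quantify with a throwaway C calculation (double buffering plus a 32-bit Z buffer assumed):

    #include <stdio.h>

    int main(void) {
        /* Display buffers alone: two 32-bit color buffers plus a
           32-bit Z buffer; everything beyond that is free for
           texture caching. */
        int res[][2] = { {800, 600}, {1024, 768}, {1600, 1200} };
        int i;
        for (i = 0; i < 3; i++) {
            double mb = 3.0 * res[i][0] * res[i][1] * 4 / (1024 * 1024);
            printf("%dx%d: %.1f MB of buffers, %.1f MB spare on a 64 MB card\n",
                   res[i][0], res[i][1], mb, 64 - mb);
        }
        return 0;
    }

Even at 1600x1200 the buffers only eat about 22 MB; the rest of a 64 MB card is texture cache, which is why extra video memory does little for a non-realtime renderer that keeps everything in system RAM anyway.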
And as for this card, I think the most impressive feature is that HDTV hardware. I'm guessing that it will act like a super-tuner-card or some such. Now you can watch PBS in super high rez.
And for the record, I'm using a 4 MB ATI card right now (Rage Pro 3D or something), and it seems to handle Quake 2 just fine.
Re:Radeon? sounds like a cross between (Score:1)
Yellow tigers crouched in jungles in her dark eyes.
Surprised (Score:1)
Not curious .... WinHec ... (Score:1)
HDTV? (Score:1)
Internet Appliances perhaps?
Linux support is a definite plus.
WoW! (Score:1)
30 million transistors, an HDTV hardware decoder, and a 1.5 GigaTexel/second rendering engine!
Charisma Engine? I kinda like the name... something like Velocity Engine. But Radeon is definitely a nice name!
I just find it hard to believe that ATI will overtake the Voodoo5 and GeForce 2! I mean, they have always been first out with the higher memory and stuff... BUT I have never liked the chips till now! Hmm, any idea about the G800s? Matrox is one company that's my bet at all times!
Hmm, Linux support would be nice... but I will wait till they deliver!
Re:overwhelming... (Score:1)
Re:overwhelming... (Score:1)
Arrggghhh... Got the G400... love the card, but it's kinda rough having to wait for Quake3 and UT support in Linux... having to reboot into WinDoz for UT is such a bore! And I completely agree with ATI not being able to deliver what they promise! The Fury was supposed to be some monster with 32 MB of RAM, and it just sucked for what it cost! Which is what I am thinking about this chip... it's probably going to cost a BUTT load!
False! (Score:1)
Re:Nice... (Score:1)
Anyone want to write a system to use spare texture memory as swap? Presumably it would be a bit speedier than a hard disk.
Re:So Surprising... (Score:1)
1. Every manufacturer tweaks its own in-house benchmarks, thus making them useless to consumers.
2. The benchmarks in commercial magazines often favor their biggest advertisers, thus making them useless to consumers.
Re:Hopefully (Score:1)
Someone mod this up 'funny'
deja vu? (Score:1)
My memory is a bit fuzzy, so if someone could correct me, I'd appreciate it.
Show nVidia who's the Man... (Score:1)
Now go and buy one of these newfangled cards - and don't forget to write nVidia a polite letter explaining why binary-only drivers just don't cut it anymore =)
Linux support-What no XFree86? (Score:1)
What 'custom linux' thing are they doing such that Linux is listed and not XFree86?
Shadow casting (was: Re:Hmmm) (Score:1)
It's not really hard to do shadow casting in hardware; in fact, standard OpenGL can do it with a bit of creative use. See nVidia's ShadowMap demo [nvidia.com] for an example. The source for a lot of the latest effects is Wolfgang Heidrich's thesis [mpi-sb.mpg.de]. Lots of really cool ideas; it needs a reasonable computer graphics background, though.
If you want to get higher precision and speed you'll need some extensions, but not a lot. Depth textures and copy-from-framebuffer are enough, and have been available on high-end SGIs for years, so the design is fairly stable.
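To make the two-pass structure concrete, here is a compressed C/OpenGL skeleton of the classic shadow-map algorithm (a sketch only: extension setup, the per-fragment depth compare, and the scene/matrix functions are left abstract, and the window size is an assumption):

    #include <GL/gl.h>

    #define SHADOW_SIZE 512

    extern void draw_scene(void);            /* left abstract */
    extern void load_light_matrices(void);   /* proj+modelview from the light */
    extern void load_camera_matrices(void);  /* the normal camera */

    void render_with_shadows(GLuint depth_tex) {
        /* Pass 1: render depth from the light's point of view. */
        glViewport(0, 0, SHADOW_SIZE, SHADOW_SIZE);
        glClear(GL_DEPTH_BUFFER_BIT);
        load_light_matrices();
        draw_scene();

        /* "Copy from framebuffer": grab the depth buffer into a texture. */
        glBindTexture(GL_TEXTURE_2D, depth_tex);
        glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0,
                            SHADOW_SIZE, SHADOW_SIZE);

        /* Pass 2: render from the camera, projecting the depth texture
           across the scene with eye-linear texgen. Fragments farther
           from the light than the stored depth are in shadow; the
           per-fragment comparison is what the depth-texture extensions
           provide. */
        glViewport(0, 0, 640, 480);
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        load_camera_matrices();
        glEnable(GL_TEXTURE_2D);
        glEnable(GL_TEXTURE_GEN_S);
        glEnable(GL_TEXTURE_GEN_T);
        glEnable(GL_TEXTURE_GEN_R);
        glEnable(GL_TEXTURE_GEN_Q);    /* projective texture lookup */
        draw_scene();
    }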
Moral: with a good API and some creative use you can get really cool effects. Hardware can make it fast, but we'll have to see if the ATI chip delivers on that part...
Try PowerDVD (Score:1)
As for your DVD issues, though: when I upgraded to a KX133-based motherboard (yes, non-Intel to another non-Intel), my DVD playback got messed up (jerky picture, flicker). I tried PowerDVD, and the playback is better than ever! Give it a try. -WD
Re:3dfx is basically gone (Score:1)
------
Re:Graphics in X (Score:1)
Nvidia released a GLX driver for XFree86 3.3.5, and while it has a few bugs, it does work and does support the GeForce. The only references I've been able to find about Nvidia *not* supporting DRI are on Slashdot. If you read the Nvidia site or the DRI developer mailing list, you will see that Nvidia said they weren't going to improve the 3.3.5 driver until XFree86 4.0, and then they would release a DRI driver. The last I read about the DRI driver was that it would be released in the first half of 2000.
Re:Graphics in X (Score:1)
Re:Let's be honest here. (Score:2)
I'm almost sure that the Linux drivers are done by Precision Insight (including Itanium), and I don't think that the Mac developers help the Windows driver developers at all.
Just my thought
First to break the gigapixel barrier!! (Score:2)
Bitboys Oy [bitboys.fi]
The one and two chip solutions will deliver the first one and two gigatexels per second performance in the 3D market, with an amazing feature set and low solution cost!
ATI [ati.com]
First graphics chip to break through the Gigatexel barrier with an awesome 1.5 Gigatexel per second rendering engine.
3DFX [3dfx.com]
Taking advantage of the revolutionary scalable architecture of the 3dfx VSA-100 chip, the Voodoo 5 6000 AGP features four processors working together to be the world's first 3D accelerator to break the Gigapixel barrier.
Ok, that last one says 'pixel', but 3DFX is probably referring to single-texture polys anyway.
Couldn't find an nVidia reference, can anyone else find one?
Re:*sigh* (Score:2)
A site like that would get a lot of hits. And maybe people would stop believing the same bullshit from the same liars day in and day out, business as usual.
I wish I had a nickel for every time someone said "Information wants to be free".
Re:HDTV? (Score:2)
I'm not sure. I lie about six feet from my 32" TV, but that's because that's about how tall I am. Sometimes I sit about nine feet from it, but that's where the couch is. I really don't have a good place for a 54" projection set. It's much deeper than my current TV, so I would have to put it closer, and I don't think it would stand up to dog slobber.
For me, the 28" HDTV Wega ultra-flat glass tube sounds better. I would lie about six feet from it, or sit about eight feet from it.
I have a very nice desk chair. To gloat a bit, it's an Aeron. It's not as comfy as my couch. After a day of sitting (at the office, or at home) it's not really as comfy as lying on the floor either.
My wife would also be a little upset if only one person could watch TV at once, especially if I kept kicking her off it to read Slashdot.
My car provides warmth, has a six-speaker stereo, and can haul 4 people and a little baggage at 130 MPH with the top down. It can muss my hair. I can cook a pizza on its engine block. It definitely surpasses an HDTV system at what it does.
Regrettably, the HDTV surpasses both my car and your PC at being an easy-to-use appliance that can be viewed by several people in the comfort of a typical living room.
Now, if you wanted a car, or a computer, then you're better off getting one of them than an HDTV. But if you want a TV, a PC is only a limited substitute. Under the right set of circumstances it is a quite acceptable substitute.
I would question the wisdom of asking a girlfriend to sit at your desk and watch a movie with you. Then again you'll both have to sit close together, so it may work out.
Geeks definitely watch the Super Bowl. But pretty much only for the commercials.
Re:HDTV? (Score:2)
Yep. The Sony 38" (or is it 36") ultra-flat glass Wega is quite pricey at $6000. The 50" projection sets are somewhat more so.
There are a lot of cheap monitors that won't do the more aggressive HDTV resolutions, but yes, any monitor over about $150 will do nicely. Of course, that's a bit unfair, because I expect that if Sony et al. thought the HDTV market would buy 19" HDTV sets, they could make them cost somewhat less than computers.
Find me a 32" PC monitor for cheap. How about a 38"? Or a 54"? Remember to discount the size to get it to the same aspect ratio as HDTV (or use that amazingly costly Apple Cinema Display).
It would be wonderful if I could make a $600 (or even $2000) PC system do the work of a $6000 HDTV, but it's just not possible yet.
Of course, if all you have for HDTV is $600, and 19" sounds good to you, go for it. I admit some HDTV probably beats no HDTV. Then again, Quake III might beat any HDTV just now, given the lack of much programming!
(I'm all for bargains and trade-offs; just know what you are trading off!)
Re:Hardware HDTV (Score:2)
Whatchew talkin bout Willis? A 1280x1024 VGA monitor should handle 720p quite nicely (1280x720), and my 1920x1200 monitor at work can do 1080p (which isn't even in broadcast equipment yet IIRC)... Remember one of the really nice things about HDTV is actually the noninterlaced (or 'progressive' in non-computergeek-speak) capability, and we've been accustomed to good-quality non-interlaced VGA for 10+ years...
Hell, even your 1024x768 set can do 480p, which is better than flickery NTSC junk...
And if ATI really supports XFree (writing DRI drivers and GLX support) out of the box I can definitely see $tnt2_adapters_on_ebay++
Your Working Boy,
Let's be honest here. (Score:2)
Now when somebody asks about butt-stomping 3D, how many people (outside of ATI's marketing drones) instantly think "ATI!"?
Also, their drivers are somewhat less than stellar.
And MAXX'ed? Didn't we already see that MAXX technology does undesirable things to one's latency? Who cares if you get 150 fps if you're a half-step behind everyone else?
True, ATI MAY have gotten their act together. I think I'll sit this generation of their cards out, though. They've got a bit to prove to me before I go and plunk down the dough for one of their cards.
Chas - The one, the only.
THANK GOD!!!
3D hardware for the Mac market... (Score:2)
Right now, 3dfx has Mac drivers [3dfxgamers.com] and a BIOS flash that will allow your standard PCI V3 for the PC to work with a Mac. The drivers are beta, but the card is cheap (since you don't have to pay the Mac hardware premium), and tests show it blowing away [xlr8yourmac.com] ATI's cards (in 3D; 2D is so-so). The only problem is that the V3 wasn't designed with Mac support in mind, so there are a few hardware issues that may never go away.
The V5, on the other hand, was designed to support the Mac from the beginning. The Mac version has already been demoed (rumor has it the Mac version was ready before the PC one). Here's an article from InsideMacGames [insidemacgames.com]. 3dfx is coming out with PCI only at first; AGP may come later.
nVidia has announced that they intend to bring out a Mac card sometime later this year, possibly the NV15 or a variant thereof. They've released little info on this, but they did hire some director or manager guy away from Apple a few months ago.
Re:Curious Timing? (Score:2)
You read The Register [theregister.co.uk], don't you?
Anyway, besides nVidia and ATI, 3dfx plans to show off a working V5-6000 (Their quad-chip card) at WinHec. There's a bunch of players that probably won't be there, though. S3 has been quiet lately (Not surprising, given that S3 is selling their graphics division to Via). BitBoys has pushed their expected release date back by an entire year. VideoLogic hasn't made any recent announcements, and STMicro's "GeForce-Killer" is still vapor.
Re:So Surprising... (Score:2)
But combining video capture, MPEG acceleration, etc onto a single card would certainly free up some slots in my computer. (Which is maxed out, what with sound, SCSI, NIC, FireWire, TV card, and VGA card). I've been considering buying the ATI All-In-Wonder card for just that reason when I put my next system together. (Just how well is that card - the AGP version - supported under Linux?).
Heck, how about putting the X server right on the card too?
Problems? (Score:2)
Then I started gaming. Wow, is this slow. Let's try this new ATI Rage Fury, first time out. Wow, it crashed Win98. No Linux support. Had to flash my BIOS. It crashes because it overheats. New driver? Cool. Installed it. Had to reinstall Win98 cuz it doesn't like the driver. Linux support? No way...
Let's try this Voodoo 3 3000. Wow, easy install. New driver? Not a problem. Linux support, playing Q3 and UT @ 1024x768. Never crashes. 3dfx is the sweetest thing on the planet. It may not be the fastest, but it's damn fast and it's trouble-free -- and compatible.
When the Voodoo 5 gets the same Linux support, count me in as a loyal customer. The ATI will have Linux support? Doesn't faze me a bit.
Re:Not So Overwhelming, After All... (Score:2)
> If you don't have an Intel processor/mobo, think twice before plonking down hard currency for anything made by ATI. I myself got an ATI All-in Wonder 128 card this Christmas, and it refuses to play well with my VIA based motherboard/K6-2 processor. It's not like the Super 7 platform is either too new or too old for ATI to have supported it in the Rage 128 based cards, or that the VIA MVP3-G chipset is so uncommon. ATI, quite frankly, just doesn't care about supporting non-Intel platforms, because they don't have to. They're the company of choice for Intel-based OEMs. So, they don't care about performance-loving AMD-using geeks like a lot of us here.
If you're performance loving, I can't understand why you would have ever bought an AMD CPU in the past (I certainly can now with the Athlon, however). The main reason why ATI cards have problems on Super7 is because Super7 is a bad, bad hack. If you actually look around, all of the cards were having problems around that time. The G200 had problems with the Super7 platform, as did the TNT, and just about anything else around at the time using AGP... Some companies were more responsive to the problems, admittedly, but the problem lay far more with the Super7 platform than the graphics card manufacturers.
> Funny then how the REALmagic Hollywood+ I got after the ATI's performance bit delivers flawless DVD performance on the same VIA chipset, with CPU usage averaging under 5%. Yeah, ATI, blame it on the mobo chipset instead of your own laziness when it comes to drivers...
And the REALMagic is a PCI card, right? And your ATI card is AGP? Again, this is more a problem with Super7 than the manufacturer of your graphics card.
Thank GOD (Score:2)
Anyway, more good gfx cards with good OpenGL support is always good (something 3dfx never seemed to grasp...). Oh well, I just hope they change the name again when they get their next chip. I don't want to be talking about the "ATI Radeon 512 II Pro Turbo Championship Edition" in 2007...
John Carmack (Score:2)
Actually, I've heard the exact opposite (from a direct quote as well): that JC likes voxels, and would prefer them to polys eventually, as opposed to ever more complex 3D-model-based systems.
He said that the situation was a lot like back in the day with vector graphics vs. pixel graphics. Vectors were great and didn't take up much RAM, whereas pixels needed bitdepth * screen size of RAM. Quite a bit when you're talking about boxes with 64K of RAM.
But as images got more and more complex, so did their vector data. Whereas pixels require the same amount of data no matter how visually complex the image is.
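The arithmetic is easy to check (a quick C illustration, using an 8-bit 320x200 mode as the example):

    #include <stdio.h>

    int main(void) {
        /* A raster framebuffer costs width * height * bytes-per-pixel,
           no matter how complex the image is. */
        long raster = 320L * 200 * 1;                 /* 8-bit color */
        printf("320x200x8 framebuffer: %ld bytes\n", raster);  /* 64000 */

        /* A vector display list costs per primitive: two endpoints of
           two 2-byte coordinates each per line segment, say. Cheap
           for simple images... */
        long lines = 500;
        printf("%ld vector lines: %ld bytes\n", lines, lines * 8);
        /* ...but that cost grows with scene complexity, while the
           framebuffer stays fixed at 64000 bytes -- just about all of
           a 64K machine's RAM. */
        return 0;
    }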
Eventually voxels will take over.
I don't think that they will for quite a while though...
Curious Timing? (Score:2)
Re:Hmmm (Score:2)
In fact, including mocap, keyframing is far and away the most popular technique in modern computer graphics. Not many people use procedural animation (which is possible, although not terribly realistic yet). Basically, if the actual motion is stored as discrete samples along the motion curve which are then interpolated, you've got keyframing.
Skeletal animation is preferred because you only need to keyframe (typically) 24-50 degrees of freedom for a human figure. This is much easier for artists than having to manually handle 10,000 NURBS surfaces, and also makes capturing the motion really easy. Skinning, such as is included in the Radeon, comes into play because the model is only defined once. If you look at your own skin as you bend your elbow, the skin on the outside stretches while the skin on the inside contracts. Since polygons are not soft, skinning by matrix interpolation is used to ensure that no seams emerge at any of the joints.
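A minimal C sketch of that matrix-blended skinning (two bones per vertex with a blend weight; the types and layout are my own for illustration, not the Radeon's actual vertex format):

    typedef struct { float m[16]; } Mat4;   /* column-major 4x4 */
    typedef struct { float x, y, z; } Vec3;

    /* Transform a point by the upper 3x4 of a column-major matrix. */
    static Vec3 xform(const Mat4 *t, Vec3 v) {
        Vec3 r;
        r.x = t->m[0]*v.x + t->m[4]*v.y + t->m[8]*v.z  + t->m[12];
        r.y = t->m[1]*v.x + t->m[5]*v.y + t->m[9]*v.z  + t->m[13];
        r.z = t->m[2]*v.x + t->m[6]*v.y + t->m[10]*v.z + t->m[14];
        return r;
    }

    /* v' = w * (bone0 * v) + (1 - w) * (bone1 * v).
       Blending the two transformed positions lets the skin near a
       joint follow both bones partially, so no seam opens up. */
    Vec3 skin_vertex(Vec3 v, const Mat4 *bone0, const Mat4 *bone1, float w) {
        Vec3 a = xform(bone0, v), b = xform(bone1, v), r;
        r.x = w*a.x + (1.0f - w)*b.x;
        r.y = w*a.y + (1.0f - w)*b.y;
        r.z = w*a.z + (1.0f - w)*b.z;
        return r;
    }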
Re:Going through the features... (Score:2)
But 1500 MTexels / 3 = 500 MPixels per second. If the chip can effectively commit one pixel per clock cycle, that would mean 500 MHz; I suspect it can commit two pixels per clock cycle, or even more! A detailed overview of how the internals of the chip work would be great; you see this for CPUs, but not for graphics chips...
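The guess is easy to parameterize (a quick C sketch; the pipe and TMU counts here are assumptions for illustration, not published Radeon specs):

    #include <stdio.h>

    int main(void) {
        double texel_rate = 1500e6;   /* 1.5 GTexel/s, from the PR    */
        int tmus_per_pipe = 3;        /* the "divide by 3" above      */
        double pixel_rate = texel_rate / tmus_per_pipe;  /* 500 MPix/s */
        int pipes;

        /* required core clock = pixel rate / number of pixel pipes */
        for (pipes = 1; pipes <= 4; pipes *= 2)
            printf("%d pipe(s) -> %.0f MHz core clock\n",
                   pipes, pixel_rate / pipes / 1e6);
        /* 1 pipe needs 500 MHz (implausible in 2000); 2 pipes, 250 MHz. */
        return 0;
    }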
Re:So Surprising... (Score:2)
Well, a healthy dose of skepticism is a good thing, but don't get paranoid. The media has always struggled to both advertise and maintain integrity and independence. I'd say most respectable sites are entirely fair and will burn the product of a company that advertises with them if it really sucks. The only reason people go to these sites is their credibility. Once they blow that, it's over. You can only do something stupid like that in the Windows propaganda magazines (which have all those stupid ads with some schmuck's shoe business).
Don't forget Matrox (Score:2)
HH
Yellow tigers crouched in jungles in her dark eyes.
Re:3dfx is basically gone (Score:2)
PR for ATI? (Score:2)
Does snack sound like he works for ATI PR to anyone else? If I were ATI, I'd submit every announcement I could to Slashdot.
I'm just waiting for real-time virt. reality (Score:2)
Re:Hmmm (Score:2)
The whole keyframe thing didn't impress me too much. Keyframes were used in Quake 2, but now most games use skeletal animation, which usually does not involve keyframes at all. But keyframes may come back into style once we start animating individual limbs more. Of course, the Radeon has skeletal animation acceleration in hardware as well, and it is better than nVidia's.
What impressed me was the shadow casting stuff. I have been wondering about how best to implement shadows for some time. It is really a lot harder than you'd expect. I am very happy to see it done in hardware.
------
Who to trust? (Score:2)
If you could have any of the three fancy new chips in your dream computer, which would it be?
Please reply with your answer. Thanks!
Re:overwhelming... (Score:2)
Uhm, you do know that the G400 does Q3 (and probably UT as well, though I haven't tried that) under Linux just fine, don't you? Get the GLX for it here [sourceforge.net].
Re:Show nVidia who's the Man... (Score:2)
Molog
So Linus, what are we doing tonight?
Re:So Surprising... (Score:2)
Definitely, since we moved to AGP cards there has been next to no discernible difference between the 2D performance of graphics cards under Windows. Under XFree86, however, the acceleration in the drivers meant that some cards sped ahead while others were left behind. I have no figures to confirm this, but I am sure that under Linux there must now be no real difference in 2D performance between a selection of 6-12 month old graphics cards from the Linux "supporting" manufacturers. So 2D is no longer an advertising campaign... what is?
3D still seems to be the marketeers' primary goal. They feel certain that by convincing us that their card can handle more 3D data (by throwing fill rates, bandwidth, texture memory and RAMDAC speeds our way) we will experience VR on a standard x86 machine... rubbish. The 3D question when buying a graphics card is: will it play the newest games at an acceptable speed in my system NOW (i.e. not if they ever get their drivers out)? If yes, move on to the next question; if no, move on to the next card. Speculating about whether its performance will be adequate for the next generation of games is futile, as no one knows how they will be written and you don't know what your system will be like in the future.
Video: MPEG acceleration, capture, and perhaps CSS decoding and PAL/NTSC out. These are all add-ons (like 3D used to be) that nobody seems to see as a marketing ploy. I was in two high-street computer shops in Dublin this weekend and noticed both carrying the non-ViVo version of the same TNT2 (with a nice big sticker on the box advertising the fact)... why? Apparently, to someone, not having video is a selling point. When buying a graphics card, how many people will not look at a card if it will not do significant hardware acceleration of MPEG, how many will look for TV out, and how many will spend those extra bucks for the ability to capture from their camcorder and do a bit of non-linear editing? Look at the ATI All-In-Wonder, or some of Asus' nVidia-based boards, and the G400, and you will find many of these features, but how many work outside of one or maybe two variants of M$ Winblows?
What does this all mean? Well, IMHO, the first manufacturer to produce a product that has an open source driver (for any platform; if you build it, we will hack) which actually delivers these features will clean up.
Who here on Slashdot would turn down a video-card/driver combination that provides these features because it will only play today's games at 85 and not 110 frames per second?
Radeon? sounds like a cross between (Score:2)
Re:HDTV? (Score:2)
Re:Let's play a little game... (Score:2)
You don't need a separate 4x4 matrix per-vertex. One will do fine for the whole scene -- unless you're trying to simulate a non-linear camera lens or something.
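In OpenGL terms, that one matrix is per-scene state loaded once, not per-vertex data (a minimal C sketch; draw_vertices() is a placeholder):

    #include <GL/gl.h>

    extern void draw_vertices(void);   /* the vertex stream, left abstract */

    /* Load the transform once; every vertex that follows is multiplied
       by it in the pipeline. Nothing matrix-sized travels with each
       vertex over the bus, so the gigabyte-per-second matrix term in
       the parent's math disappears. */
    void draw_frame(const GLfloat projection[16], const GLfloat modelview[16]) {
        glMatrixMode(GL_PROJECTION);
        glLoadMatrixf(projection);
        glMatrixMode(GL_MODELVIEW);
        glLoadMatrixf(modelview);
        draw_vertices();   /* positions, normals, texcoords only */
    }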
Re:Thank GOD (Score:2)
The ATI Rage 128 chip used a different core, and thus required new 2D XFree86 drivers.
Anyway, enough of this boring ATI history. The reference to the Linux drivers probably only means 2D support. :-(
Christmas '98, G3s, et al (Score:2)
reminds me of ATi's MacOS drivers (Score:2)
Moan and groan about OpenGL performance and compliance.
Dig up driver updates from Apple and ATi, install.
Note performance boosts in one area, slowdowns elsewhere, new bugs, few bug fixes.
Lather, rinse, repeat.
Re:Not So Overwhelming, After All... (Score:2)
It's such a shame that such beautiful hardware has to be hobbled by such awful software.
*LoL* Linux on x86 is beautiful software hobbled by awful hardware! The irony is killing me.
--// Hartsock
Driver support... (Score:3)
---
pb Reply or e-mail; don't vaguely moderate [152.7.41.11].
*sigh* (Score:3)
Seriously, I wish that Slashdot and all of the other sites that report on computer hardware would start dropping crap like this in the trashbin. Post reviews of released products, post reviews of soon-to-be-released products, but if I never see another review of a product that only exists on paper, I'll still have seen too many. Posting this kind of stuff just encourages these hardware companies to write more of it.
overwhelming... (Score:3)
3dfx is basically gone (Score:3)
Odd... I submitted this hours ago, yet my writeup was rejected...
Anyway, the ATI Radeon can do 1.5 gigatexels per second. The Voodoo 5 can only do 667 megatexels. So, the Radeon will far outperform a V5. And it has T&L! What a deal! The funny thing is that 3dfx is hyping the V5 based on its fill rate...
Now, on Wednesday, nVidia is going to announce the GeForce 2. It will have a fill rate of 1.6 gigatexels, just a bit higher than ATI's offering. On the geometry side, the GF2 will do 250 million triangles per second. I don't know how fast the Radeon is as far as geometry goes, but if anyone else knows, please share! It is also rumored that the GF2 will be in stores on Friday. As in, THIS Friday. Whoa.
Back on ATI's side, the Radeon looks like it will have more features than the GF2. As a game coder, I like that. :) Also, ATI is likely to have better Linux support. I also like that.
It looks like choosing between these two cards will be tough, but I'm leaning towards ATI right now. One thing that I know for sure, however, is that 3dfx is not in the running. Their only hope right now is to drop their prices very low. I would not like to be working at 3dfx right now.
Oh, here are some links:
Again, nVidia will be announcing the GF2 on Wednesday. Check their site then for details.
------
Multiple OS support (Score:3)
Things seem to be swinging back to the way that it was in the '80s with many different OS's. This time, however, we have standardized hardware (mostly).
Making Linux into an adequate gaming platform also depends upon immediate support when hardware is released. I think the ATI Radeon is a step in the right direction.
--
Re:Not So Overwhelming, After All... (Score:3)
> CPU in the past (I certainly can now with the Athlon, however).
First point: I bought an AMD K6-2 because, at the time, it was the only reasonable alternative to supporting the Intel monopoly, a monopoly I found just as odious then as I do now. The K7 Athlon was six months or more away, and I needed a computer sooner than that. And, I wasn't going to buy Intel on principle--they'd been handing us slight modifications of the original Pentium, without much true innovation, for far too long. Amazing how the underdog AMD, with comparatively few resources, was first to market with a true "786" processor core... But, back to the point, I bought AMD on principle, and their price/performance ratio at the time was very competitive with the Celeron (differing clockspeeds for equal performance, of course). It'll still make a damn fine file server when I build a new Athlon/Thunderbird system at the end of the year.
> The main reason why ATI cards have problems on Super7 is because
> Super7 is a bad, bad hack. If you actually look around, all of the
> cards were having problems around that time.
Yes, but you're missing the point! *ALL* of the other major manufacturers fixed their drivers to make their products work with Super 7, *EXCEPT* for ATI. A TNT2 will run well on Super 7, and has since a couple months after introduction when the drivers were fixed for VIA Super 7. Ditto for G200/G400. Even a shiny new GeForce runs well under VIA Super 7. But still, after an eternity, not the ATI Rage 128 based cards. *That's* the point. All other important players run well on VIA Super 7 chipsets now, except for ATI which has had well over a year to fix their driver support. So, the problem is with ATI's protracted laziness. Nowadays Super 7 may not be the best platform to bother with, but it was for most of the over a year in which ATI ignored driver dev for it.
> And the REALMagic is a PCI card, right? And your ATI card is
> AGP? Again, this is more a problem with Super7 than the manufacturer
> of your graphics card.
No, no, no, no. The video card is a standard PCI model; I have an AGP slot which I wanted to save for an nVidia or 3Dfx card when I could save the money to upgrade, then use the ATI strictly for its multimedia functions like Video Desktop and vid capping. Look towards the future, I always say, and I wasn't about to waste my AGP slot on a card featuring an ATI chip which was already a year old. So, it isn't a problem with VIA's AGP implementation; it's, I repeat, a problem with ATI's substandard hardware support/driver support. If the REALmagic card has no problem doing DVD over my PCI bus, the ATI card shouldn't either. It's that simple. REALmagic took the time to write drivers which would handle DVDs well on a Super 7 mobo, but ATI Multimedia didn't. Their driver support is consistently substandard when compared to nVidia, 3Dfx, and Matrox, and I mean even on Intel mobos; read the posts on the Rage Underground help boards if you doubt this. But the driver support is especially bad for AMD/VIA solutions. There is *ZERO* excuse for blaming a chipset instead of fixing your drivers to work with it, especially when every other important player in the graphics industry has managed to make their cards work quite well with it. And before you try again to lay the blame elsewhere: yes, the latest VIA drivers have been installed and configured properly.
And, a final note: especially since a PCI-based Rage 128-based card could not play well with X in standard SVGA mode, ATI has absolutely no business calling a VIA chipset non-standard. It's ATI's cards and drivers which are non-standard.
Dual Mode Graphics (Score:3)
As a Mac user... (Score:3)
Rant, Rant, Rant
Overlinked (Score:4)
You [mentaltempt.org] have [dictionary.com] overlinked [sfsu.edu] this [this.com] article [article.com]. Just because [justbecause.com] you [mentaltempt.org] know [notmuch.com] the [the.com] URL [url.com] of [msn.com] Linux [linux.org] doesn't [dbytes.com] mean [tony.ai] you need to [youneedto.com] use it. [useit.com]
Hmmm (Score:4)
Note, I'm no graphics professional. I am merely an interested individual. Repeat: I am no John Carmack!
The first thing that jumped out at me in the "new and cool" area (rather than just adding more horsepower to today's GPUs) is that ATI seems to be adding keyframe interpolation. Not *2D* interpolation, but *3D mesh* interpolation. The idea has a good illustration at the bottom of this [ati.com] page.
Voxels seem to be cropping up here [ati.com]. It's cool to see that they are adding support for them at the hardware level. I know that John Carmack has been skeptical about using voxels due to the sheer amount of processing power they need.
Most of the stuff I saw in the specs, however, is mostly just fluff covering various graphics technologies and what they do. While the specs suggest that the chip will have support for them, they don't do much more than hint at it.
Maybe there'll be more information soon...
Hopefully (Score:4)
Re:So Surprising... (Score:5)
Everyone is going to announce their latest chip with guns blazing, claiming that it will be the fastest thing ever, with the most features, blah, blah, blah.
And, I'm surprised slashdotters are falling for it. "Look at those specs!!! It must be good!!! I can't wait to buy it!!!" The marketing folks there must be already patting themselves on the back.
If anyone's suffered through ATI's past chips (and their [lack of] relationship with the Linux community) they will already know to stay away from anything ATI puts out in the future.
When shopping for hardware, especially video hardware where the competition is downright cutthroat, here are some do's and dont's:
Don't rely on online reviews.
Or at least hit reload every once in a while to make sure the site isn't financially supported by one of the card companies reviewed. Let's not forget the fiasco where a chip company's ads were running on "you know who's" Hardware Guide a few months back when they were trying to do an "unbiased" review.
Don't rely on benchmarks created by a graphics chip company.
Of course NVIDIA's card will run the NVIDIA tree demo faster than anyone else's!!! What unbiased information does this tell you? Nothing. I personally find any benchmark that is not part of an actual application totally useless. Quake is an OKAY benchmark if you're into gaming, and many CAD applications come with their own benchmarks. I'd put a little more trust in these.
Test all resolutions and color depths
Remember: low-resolution and/or high-poly tests gauge the driver's performance and efficiency, while high-resolution, low-poly tests gauge the card's fillrate. Don't trust a Quake benchmark that is only done at 640x480. Beta or low-quality drivers can make a card look bad at this low a resolution.
Test on multiple CPU's
Make sure the graphics chip's performance scales well with faster CPUs. Drivers can also be optimized for Pentium II- and III-class machines.
________________________________
Let's play a little game... (Score:5)
I've got a graphics card by some manufacturer (it really doesn't matter who) that has a 1.6 gigatexel fill rate. Now, given the propensity for developers to use 32-bit textures, that means that each and every one of the 1.6 billion texels I process every second must be accompanied by its own 4-byte read. Now, how much memory bandwidth does this require? And how much bandwidth is on the card?
Now, let's start expanding on this... 30 million triangles/second, given triangle lists, equates to about 16 million vertices. At 3 floats (x, y, z) per vertex and 4 bytes per float, that's another 192 MB/sec of bandwidth we don't have. Now, in order to actually use textures, each of those vertices also needs texture coordinates, which add another 2 floats, or 128 MB/sec. And then for lighting we need a normal vector at each vertex: that's another 3 floats, so 192 MB/sec. Now, in order to actually project these coordinates onto the screen, every vertex needs to be multiplied by a 4x4 matrix, or 16 more floats. Whoopee! That's another GIGABYTE of bandwidth down the tubes. Then to actually display this, since I have 2 texture units per pixel pipeline, my card delivers 800 megapixel fill rates, which at 8 bytes per pixel (24-bit RGB + 8-bit alpha, 24-bit Z, 8-bit stencil) is another 6.4 GB/sec of bandwidth.
So, when all is said and done, to reach the theoretical maximum of my card, I need 14.3GB/sec memory bandwidth minimum. Add in things like texture filtering (multiple texel reads per texel write) or alpha blending and you can break 20GB/sec easily.
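The whole budget reduces to a few lines of arithmetic; this C snippet just reproduces the numbers above:

    #include <stdio.h>

    int main(void) {
        double texels = 1.6e9, verts = 16e6, pixels = 800e6;

        double tex_reads = texels * 4;       /* 32-bit texel fetch      */
        double positions = verts * 3 * 4;    /* x, y, z floats          */
        double texcoords = verts * 2 * 4;    /* u, v floats             */
        double normals   = verts * 3 * 4;    /* nx, ny, nz floats       */
        double matrices  = verts * 16 * 4;   /* a 4x4 float per vertex  */
        double fb        = pixels * 8;       /* 32-bit color + 24-bit Z
                                                + 8-bit stencil         */
        double total = tex_reads + positions + texcoords
                     + normals + matrices + fb;

        printf("total: %.1f GB/sec\n", total / 1e9);   /* 14.3 GB/sec */
        return 0;
    }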
Multiply all this by about 10 for Microsoft's X-Box (which somehow claims to shovel 14.4 Gigatexel performance across a 6.4GB/sec unified bus), and you'll know why any and all paper specs for the X-Box are completely ridiculous.
There is only one architecture currently available or in production that actually has the bandwidth to support its theoretical maximums, and there's no way in hell it'll fit in an AGP slot. It's manufactured by Sony, can currently be bought in Japan for about $400, is slightly larger than a bread box, and provides 48GB/sec of bandwidth, albeit at a slight hit to the actual frame buffer size.
So, in the spirit of the industry, I'm announcing my new video card. It has 400Gigatexel performance, and can transform 100 billion triangles every second. Unfortunately, due to current memory and bus technologies, you'll never see more than about 500 megatexels and 2.5 million triangles, anyway.
Not So Overwhelming, After All... (Score:5)
This has always been ATI's main problem. Unlike nVidia and 3Dfx, ATI releases drivers slowly and never ever advertises them; in fact, its own driver download pages warn that the drivers are only supposed to be for people experiencing problems, etc., and might cause new problems. They go beyond a "standard disclaimer" and try to actively discourage driver updates--no wonder then that sites like "Rage Underground" are the center for the ATI guys into performance, sites which have their own *unofficial* performance-optimized drivers because ATI drivers suck.
So, I'm convinced that no matter the potential of ATI's new chips, they won't live up to them until it's too late. The other ATI problem is also driver-related: lack of hardware support. If you don't have an Intel processor/mobo, think twice before plonking down hard currency for anything made by ATI. I myself got an ATI All-in Wonder 128 card this Christmas, and it refuses to play well with my VIA based motherboard/K6-2 processor. It's not like the Super 7 platform is either too new or too old for ATI to have supported it in the Rage 128 based cards, or that the VIA MVP3-G chipset is so uncommon. ATI, quite frankly, just doesn't care about supporting non-Intel platforms, because they don't have to. They're the company of choice for Intel-based OEMs. So, they don't care about performance-loving AMD-using geeks like a lot of us here.
This is in stark contrast to nVidia and 3Dfx, which release new drivers all the time and which try to support every viable platform. When GeForce cards were having a problem on Athlon mainboards, nVidia released new drivers to fix the problem. Yet, ATI would probably have done the same thing they did a year ago with K6-2 and K6-3 platforms and the Rage 128 cards and blame the problem on the chipset vendors for being non-standard--i.e., non-Intel.
This is a serious attitude problem on ATI's behalf, and until they can prove that they'll provide adequate enough driver support at least for Windows, I'd recommend staying away from anything they offer because the drivers will kill it. Let alone Linux. I tried installing both Corel Linux 1.0 and Linux-Mandrake 6.0 with my A-i-W 128--based on the same year-old chip from the Rage Fury--and couldn't get it to work with X even in generic SVGA mode. ATI doesn't support all common platforms under Windows, so forget about decent Linux drivers.
I am satisfied somewhat with the multimedia features of my All-in-Wonder 128 under Windows--Video Desktop is a godsend--but even then DVD playback was unbearably awful. Of course, ATI blamed it on my VIA chipset. Funny then how the REALmagic Hollywood+ I got after the ATI's performance bit delivers flawless DVD performance on the same VIA chipset, with CPU usage averaging under 5%. Yeah, ATI, blame it on the mobo chipset instead of your own laziness when it comes to drivers...
As I said, I like the multimedia features of my A-i-W 128; even though DVD playback won't work because of shoddy drivers, the rest of it is great. Video capture is flawless, and Video Desktop for TV viewing always wows my guests and provides me with hours of entertainment during my long visits to pr0n--er, tech sites. But never, ever, ever buy an ATI card for its performance stats. It won't live up to them until the card is outdated, and even then it might never live up to them unless you have an Intel mobo and processor.
So Surprising... (Score:5)