AnandTech Reviews ATI's Mobility Radeon 9000 125
Mike Bouma writes "AnandTech has reviewed ATI's latest mobile graphics solution. According to the reviewer, this small, energy-efficient chip is the new king of mobile graphics chips for notebooks. John Carmack is also apparently very positive about the chip and has stated that Doom 3 will run smoothly on this new Radeon."
Will doom 3 run on any other card? (Score:2)
Re:Will doom 3 run on any other card? (Score:2, Funny)
Re:Will doom 3 run on any other card? (Score:2)
So GeForce 3 or an ATI 8500 at the minimum.
Re:Will doom 3 run on any other card? (Score:2)
GeForce 1 is the minimum card for Doom 3. Of course, don't expect it to look pretty on anything less than a GeForce 3.
Re:Will doom 3 run on any other card? (Score:2)
Anyway could someone tell me what a shader is?
Re:Will doom 3 run on any other card? (Score:2)
Re:Will doom 3 run on any other card? (Score:2)
Also, America's Army plays decently on a P3-700 with a GF2 at 800x600 (the lowest resolution the Unreal engine supports anyway).
Re:Will doom 3 run on any other card? (Score:2)
Re:Will doom 3 run on any other card? (Score:2)
Re:Will doom 3 run on any other card? (Score:2)
Hopefully we'll start to see UT2K3 benchmarks in reviews; I'm tired of Q3A, and I hope Doom replaces it as the standard benchmark.
Re:Will doom 3 run on any other card? (Score:1)
The presentation was also running the game on a 2.2GHz Pentium 4 with that lovely Radeon 9700, so it was no wonder it was moving smoothly. Both Tim Willits during our little talk and Carmack in his keynote later were quick to point out that the game could run on anything as low as a GeForce1, however. "The game running at full features is with a GeForce3 video card or higher. It'll run on anything down to a GeForce1 because of the hardware acceleration, but we feel that some of the graphical features would have to be turned down. But with the new products from ATI and nVidia coming out before the release of the game, we're sure that we'll have great penetration for the game full feature."
Re:Will doom 3 run on any other card? (Score:1)
Say it ain't so. (Score:1)
Is there a new Trek game out?
Re:stupid (Score:1)
Macs are not game machines, so it is no surprise that they perform poorly running games. Your comment makes as much sense as saying an Xbox sucks because it can't calculate a spreadsheet fast enough. I have often heard that IE is slow on OS X, but that doesn't mean all browsers are slow. FWIW, browsers can render a webpage in one of two ways: incrementally, or all at once. IE originally overtook NN 4.xx in perceived speed because IE was faster at showing something on the screen first: it showed the text of the document as it downloaded and rendered tables incrementally, while NN 4.xx did not. However, NN 4.xx will still download and display a complete webpage FASTER than IE 5, as will Opera, Mozilla, and Konqueror. IE is fast on Windows because it shares DLLs (libraries) with the OS; Konqueror performs similarly on a KDE system for much the same reason. Apple does need to wise up and create its own optimized, OS-embedded browser, but overall it's a small price to pay for a system with the power of BSD and the traditional usability of a Mac GUI.
If you are really dissatisfied with your Apple, then you have a responsibility to all computer users to take your complaint to Apple. Whether you believe it or not, Apple will read your complaint and take it very seriously. Running around spreading what amounts to FUD does not help improve gaming or browsing on Macs; it only hurts the situation, because with each lost sale there is less money for Apple to develop better apps and to spend on R&D.
Owning a non-mainstream computer comes with certain implied responsibilities, and one of them is to take any flaws and try to fix them or make them better. A strong and loyal userbase is the only way for any OS to compete with Windows. The only reason Apple and Linux have survived the Windows onslaught is the kookery of their user base and their faith that any flaws in the platform will eventually be fixed. If you feel a compulsive need to bash Apple, then you are not mature enough to graduate from the Windows platform.
Re:stupid (Score:1, Offtopic)
Uh, no. On Windows, that statement is an unproven rumor. On Linux, it's a physical impossibility, since the whole of the GUI is in USERSPACE!
Re:stupid (Score:2, Interesting)
No, I haven't. I took it out of the box, upgraded the RAM to 512, and immediately downloaded all the patches from Apple. It was slow, right away. What else, exactly, do you think the user should have to do?
You cannot BS anyone into believing you are playing any type of 3d games with decent fps or doing anything of a graphical nature on the machine.
As an example, I use Tony Hawk's Pro Skater 2. The Mac was 500MHz with OS X 10.1.5, an ATI Rage 128, and 512MB RAM. The PC is 700MHz with Windows 2000, 256MB RAM, and a (far inferior) ATI Rage Mobility (basically a Rage Pro). The PC runs the game smoothly; the Mac stutters and jerks.
Mozilla would be another good test. Runs fine as my everyday browser on the Windows machine. Under OS X, it lags at window sizing, launching new windows, text entry, and scrolling up and down lengthy web pages.
I have often heard that IE is slow on os X, but that doesn't mean that all browsers are slow and FWIW, browsers can render a webpage in one of two ways.
Every browser I've used on OS X (and I've used them all - Chimera, Mozilla, IE, Omniweb, et al) is slower than what's available on Windows - sometimes significantly slower.
Running around spreading what amounts to FUD does not help improve gaming or browsing on Macs
It's not FUD, it's true. And yes, hopefully complaining will make a difference! What kind of message does it send when OS X is slow, and all the users just go out and buy new Macs? Doesn't give Apple much incentive to fix it.
On the other hand, if the users were up in arms (as they should be), Apple would have to fix OS X in a hurry. The users' blind loyalty hurts them; they get less for their dollar than they would if they exercised their consumer choices.
Re:stupid (Score:2)
Everything launched quickly and was smooth. I did not run any benchmarks because this was a CompUSA demo machine, but it seemed a lot smoother than the G3 PowerBook with Mac OS X 10.1. Believe me when I say it's leaps and bounds quicker thanks to Jaguar.
Come to think of it, I never even saw the beachball spinning for anything.
However, if I had $1699, I'd prefer a dual Athlon or single P4 box running Gentoo Linux.
It's great but. . . (Score:3)
I really feel for everyone who will be playing this thing on their 1.2GHz P3 and GeForce3 Ti500.
"Wow John, you got above 15 frames a minute? That's incredible!"
Where is my demo! Bring me my demo!
Re:It's great but. . . (Score:1)
Re: When has this NOT been the case? (Score:1)
Games have always pushed computer systems...
I remember folks whining that their 3D shooter games wouldn't play fast on the 486SX-16, and that the designers were only writing them for the 'rich boys' with their 486DX-66s...
A year after release, almost all common computers being sold could run the game just fine. Point being: let the designers create a game with long-term staying power by putting it at the limit of today's hardware, and let the hardware designers push their new products by showing how well they can do the same job as the most expensive card out there (using games like this as a benchmark).
Re: When has this NOT been the case? (Score:2)
Re: When has this NOT been the case? (Score:1)
- HeXa
Re:It's great but. . . (Score:3, Insightful)
If nobody pushes the envelope, there will be no reason for consumers to buy new cards and graphics technology will stagnate, so lighten up.
Anyway, what id is doing now has worked out great for them throughout their history -- why would they change it now?
Re:It's great but. . . (Score:2)
Sure, we can't crank our monitors up to 1600x1200 with the latest video cards like we can under Quake III, but do we really need that?
Quake III had similarly high requirements when it first came out. Now even e-machines desktops can run it fluidly. 800x600 with 32-bit color is fine for me and my upcoming Radeon 9700 Pro.
Re:It's great but. . . (Score:2)
Re:It's great but. . . (Score:3, Interesting)
I think there's no way around needing the latest, greatest hardware to play the latest, greatest games; the two go hand in hand, always have. And I for one love it, since it means games get immersive and realistic a lot faster than they would if nobody was pushing the envelope.
Doom 3 will push lots of people to buy fancy graphics cards now rather than a year from now, which will prompt other developers to release their snazzy eye candy games a year from now rather than two years from now, which will cause enough content to be available that the non-Doom crowd will upgrade sooner rather than later, etc. Not good for people on a budget, but people on a budget are rarely on the cutting edge of any technology, so no reason to expect games to be any different.
Re:It's great but. . . (Score:1)
i'll believe it when i see it.... (Score:2, Informative)
/me is with ya (Score:1)
Re:i'll believe it when i see it.... (Score:2)
Why? (Score:3)
Nice, so I can play on my laptop, almost as great as playing a PS2 on a TV from 1950.
It's not really the screen size (some laptops have better-sized screens than my desktop) but the viewing angle.
Re:Why? (Score:1)
I usually play sitting just in front of my monitor. How about you?
Re:Why? (Score:2)
IMO the big problem with laptop gaming is the slow pixel response of your standard LCD; lots of nasty ghosting artifacts. But it's still nice to have the option, I guess. And you can always plug the laptop into a monitor for desktop use (yeah, this misses the point of the laptop, but you can still use it on the go for all your office stuff and then use it at home for games, avoiding the usual practice of having a dedicated desktop gaming rig).
Re:Why? (Score:3, Informative)
Re:Why? (Score:2)
Re:Why? (Score:2, Informative)
If you're going to get a laptop and you still want to play the higher end games, then there's really no excuse not to get an Inspiron with the Ultrasharp display. Of course, the biggest advantage with Inspirons is that you can upgrade the graphics card by calling Dell and having the new card shipped out. True, you don't get the wide range of choices like you would with a regular PC, but then there are trade-offs with laptops.
Re:Why? (Score:2)
Plus, there's talk that the Inspiron 8200's might be upgradable, video-wise :)
So, you own one, and you're still not sure if it's upgradeable or not? Or are you unsure about which videocard you can actually put in it?
Personally, that's one of the reasons I'm not likely to buy a laptop: you're not told up front what you can and cannot change once you've bought it. In a normal self-built tower, you know you can easily change or add any part (except what you chose to have integrated on the motherboard). And I'm still not sold on LCDs, even if some people like their experience with them.
(Last thing: thirty degrees of freedom in either direction? What is that supposed to mean in the context of a laptop?)
Re:Why? (Score:2)
As for the LCD, it's magnificent. 1600x1200 on a 15" screen gives 133 dpi, and with ClearType it's like reading a piece of paper.
As for the degrees of freedom, I meant that you can swing about thirty degrees to either side without losing the image on the LCD.
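For anyone checking the math, pixel density is just the diagonal resolution divided by the diagonal screen size. A quick sketch (the function name is mine; the 15" UXGA figures come from the post above):

```python
import math

def dpi(width_px, height_px, diagonal_in):
    """Pixel density: diagonal length in pixels divided by diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(dpi(1600, 1200, 15)))  # 133
```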
Re:Why? (Score:1)
I've recently changed the Mini-PCI card in my laptop from wired ethernet to wireless.
I could also change the screen, swap out the processor, change the video card, or get a different media bay option by finding the parts on Dell's website.
Did you know. -- Re:Damn you ATI (Score:1)
Or if you just didn't use their cards.... (Score:1)
Re:Or if you just didn't use their cards.... (Score:2, Interesting)
I have a Radeon AIW 8500DV, a Rage Fury Pro, and a Rage IIc card in my three home systems. I have 40 machines with Rage 128 boards in them at the office. Does this qualify me as having enough experience with ATI cards?
If you read the manual and install the drivers properly, you will have almost zero problems, or at least no more than you'd have with any other card.
The manual explicitly states you MUST remove the older drivers and install a standard VGA adapter driver first. If an issue comes up where a file being copied is older than one already on the machine, you MUST copy the older file. If you follow this procedure correctly every time, you will NOT have problems with ATI drivers. Removing the drivers from the hardware manager is not acceptable; you must uninstall them from the system.
To say that ATI drivers suck is plain wrong. They may not be as user-friendly to install as Nvidia's, where you can install updated drivers over the old ones, but there is nothing wrong with ATI's drivers themselves. If there were, ATI would NOT have succeeded in the corporate market or with OEMs, which are their bread and butter.
If you have ever installed an ATI driver over an existing ATI driver without first uninstalling the older one, you will have big problems with the system unless you install another physical video card and clean out all the ATI DLLs, registry entries, and INF files.
Blaming a product because you did not follow the directions is quite silly.
Re:Or if you just didn't use their cards.... (Score:2)
ATI drivers don't necessarily suck; IMO it is a matter of how they are installed. But the point is that the user should not have to jump through hoops to that extent to get a peripheral working.
I own three ATI cards (8MB AIW Pro, AIW Radeon, and AIW 8500DV, with the latter two still in use). I have successfully upgraded over older drivers, but it is still hit and miss whether everything the card can do will still work afterward. I have had other devices (sound card, motherboard) fail or misbehave after an upgrade, but those are usually fixed by reinstalling the older driver; in some cases I've had to uninstall the new driver first, and at least once I've had to restore the system from a backup image.
Only with the ATI cards has there consistently been such an expectation of user interaction. That's the real problem. If the standard instructions don't work or make sense, you can check forum threads at sites like www.rage3d.com for good advice on how and why to install the drivers in a particular order, and what caveats to watch out for.
The drivers in the past have been adequate, and the current release is actually quite good, but the install/uninstall process has serious flaws that force the user to do much of the decision-making that should be automated as part of the process. ATI still has to improve that process to the point where following the instructions amounts to running the setup program and letting it take care of dependencies and legacy issues. They have to deal with the fact that a lot of users don't care to follow instructions, particularly when those instructions have the reputation of an arcane ritual. When they do that, it will do much for the reputation of their drivers.
Fair (Score:1)
Re:Fair (Score:1)
Upgrading My Laptop (Score:1)
I just can't wait for the day when I can order a new video card for my laptop. I was very interested to read that you can already upgrade the video card on some Dell laptops from the GeForce2 Go to a GeForce4 card; maybe those days aren't far off for most new laptops.
Re:Upgrading My Laptop (Score:1)
Re:Speaking of laptops (Score:1)
ATI about more than just games (Score:1, Redundant)
They've given us some nice features for notebooks for a while now... that anti-aliased scaling at non-native resolutions, for one. Dunno if Nvidia et al. eventually got around to matching that one, but ATI had it first by a long shot.
I've often wondered why Carmack liked ATI .... (Score:1)
And you know what it comes down to? Apple. Apple has a deal to use ATI cards in its Macintoshes. They're STUCK using these crappy cards forever, and it's no wonder Carmack never disses them; he has always had a soft spot for Apple. Why, I have no idea, but I wish he'd call it like it really is. ATI just plain SUCKS. Their drivers suck, and their All-In-One-TV-OUT-DVD-Replay-Work-Your-Toaster SuperRAGEKillerXXXtreme 3D cards STINK.
Re:I've often wondered why Carmack liked ATI .... (Score:3, Interesting)
Carmack has noted in the past that Nvidia's drivers are far better ("gold standard" were the words he used). But you have to keep in mind that as a 3D games guy, it is not in his best interests for a monopoly to emerge in the consumer 3D video card world. Competition keeps the new features coming, which gives Carmack new hardware to write new games for.
Re:I've often wondered why Carmack liked ATI .... (Score:1)
Re:I've often wondered why Carmack liked ATI .... (Score:3, Informative)
Please don't buy the hype before seeing the reviews.
Re:I've often wondered why Carmack liked ATI .... (Score:3, Interesting)
This is one of the few advantages, if any, of being tied to a single software/hardware platform. Many Unix gurus prefer Sun over Lintel boxes for that very reason: quality and consistency. It is likely that Apple itself could have partially written the drivers if ATI's own didn't fit the bill, or if they preferred to pay Apple to do it instead.
Also remember that Windows (Windows 95 & 98 particularly) has a horrible, and I mean horrible, driver model and SDK. In fact, this is what caused all those infamous BSODs. Even NT4, which is supposed to be their server-line OS, has everything running in the kernel. Windows 2000 is improving, however.
Oddly enough, I am getting the newer cards because of better Linux support. Better Linux support... from ATI? Well, since ATI never releases Linux drivers but instead reveals its technical specs to the community, I can just wait for the right drivers to arrive with DRI XFree86 support. I have to rely on Nvidia for OpenGL under Linux, and I have observed that its driver will not compile properly with certain kernels and is very unstable with certain VIA Athlon chipsets. Not to mention Nvidia does not use the standard DRI architecture. With now-improved Windows drivers and superior open-source Linux ones, I will buy an ATI. If a problem is discovered in the Linux ATI drivers, it will be fixed, and I do not have to wait for a corporation to do it. It's rumored that ATI is even developing a unified driver model, similar to Nvidia's, that will upgrade all its drivers for all recent cards! Oh, and it's almost twice as fast as a GeForce4!
Re:I've often wondered why Carmack liked ATI .... (Score:1)
Re:I've often wondered why Carmack liked ATI .... (Score:2)
Thanks for warning me. This pisses me off. I think Nvidia has given the video card industry some nasty ideas. I will probably not buy it, since I do not even trust Nvidia to come out with good drivers for Linux, let alone ATI.
Re:I've often wondered why Carmack liked ATI .... (Score:2)
Nvidia's Linux drivers come from the exact same code base as their Windows drivers. This makes them just as fast and stable as their Windows counterparts. They aren't GPL, but they sure are damn good.
Re:I've often wondered why Carmack liked ATI .... (Score:2)
They claim "up to a 25% increase in performance," but just about the only place you see that is the Nature test in 3DMark. In _games_ (you know, the things we actually use these video cards for?) there is exactly _zero_ improvement! And I heard they changed the default filtering setting to one notch worse, so image quality suffers. These new drivers also have severe stability problems.
FYI re: Carmack, & Apple's use of ATI hardware (Score:1)
As for Carmack, he's very much known for giving his honest opinion. He's certainly never been inclined to hold back legitimate, often quite scathing criticisms of Apple, and it was only when Apple adopted OpenGL as its 3d API of choice that he really began offering praise.
Re:The 9000 is SLOWER than the 8500 (Score:1)
In fact, Carmack has stated that the Radeon 7500 and GeForce 256/2 (first-generation T&L chipsets) will be able to run Doom 3, even though they have no vertex/pixel shaders (the lack of pixel shaders just requires more passes for per-pixel lighting and shadowing). What makes you think that a card that runs DirectX 8/OpenGL games as well as a GF3 could be inferior to a GeForce 256? I shudder to think how many passes a GF2 must do when scenes with per-pixel lighting, stencil volume shadows, and dynamic light sources are displayed. A Radeon 9000 wouldn't have to do more than 2 or 3 passes on that same scene, especially since it supports pixel shader version 1.4 (like the 8500, it allows 6 textures in a single pass).
Besides, from what I've heard, Doom 3 is being targeted at second-generation T&L GPUs, and probably won't use the special features of chipsets like the Radeon 9700 and Nvidia's NV30.
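The multipass argument boils down to simple arithmetic: a surface that needs more texture lookups than the hardware can bind in one pass has to be drawn multiple times and blended. A sketch, where the lookup counts are illustrative assumptions rather than measured figures:

```python
import math

def passes_needed(texture_lookups, textures_per_pass):
    # Each rendering pass can bind only so many textures, so a surface
    # needing more lookups than the hardware allows per pass must be
    # drawn multiple times, with the results blended together.
    return math.ceil(texture_lookups / textures_per_pass)

# Suppose a per-pixel-lit surface needs 6 texture lookups:
print(passes_needed(6, 2))  # dual-texture GF2-class part: 3 passes
print(passes_needed(6, 6))  # pixel shader 1.4 (Radeon 8500/9000): 1 pass
```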
How long until a Ti PowerBook with one? (Score:1)
Re:How long until a Ti PowerBook with one? (Score:1)
Re:How long until a Ti PowerBook with one? (Score:1)
This should really speed things up (Score:1)
Geforce 4 2GO (Score:2)
Re:Geforce 4 2GO (Score:2)
As far as I know...they don't exist. Did you manage to get your hands on some super-special one-off engineering sample or something?
Re:Geforce 4 2GO (Score:2)
Re:Geforce 4 2GO (Score:1)
Makes me wonder: what do I have that you don't?
For sale (Score:1)
Bloody laptop video cards should be replaceable like their bigger brethren
laptop 3D chipsets have sucked for a while now... (Score:2)
I am in the market for a laptop, but the graphics chipsets suck horribly (Radeon Mobility? It's crappy and isn't Linux-friendly... so under Linux you get ZERO 3D acceleration) and are not powerful enough to do squat. I would pay a premium of upwards of $200.00 for high-end graphics in a laptop, but nobody wants to offer it.
Re:laptop 3D chipsets have sucked for a while now. (Score:1)
find 10,000 people who would, and you'll get your real laptop gaming chip.
No problems with ATI drivers (Score:1)
Re:Agree, Doom sucks (Score:1)
Who cares what he says? (Score:1)
Any bit of information about 3D hardware/gaming coming from the legendary programmer who brought us Wolfenstein 3D, Doom, and Quake (and its sequels), and who is responsible for plenty of other titles using his tech (Soldier of Fortune I & II, Jedi Knight 2, Return to Castle Wolfenstein, Medal of Honor: Allied Assault, and many others), is worthwhile and helpful in choosing a new graphics card.
To give a good example, he was one of the first to tell us, the gamers, (in one of his
Plus, regardless of whether you want to hear it or not, Doom 3 may be one of the most anticipated games ever. I wouldn't be surprised if it becomes the highest-grossing.